Science.gov

Sample records for airborne multispectral camera

  1. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  2. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters

    PubMed Central

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-01-01

    This paper describes an airborne high resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible band images and near-infrared band images in cases lacking manmade objects, we presented an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high quality multispectral images and the band-to-band alignment error of the composed multiple spectral images is less than 2.5 pixels. PMID:26205264
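
    As a rough illustration of the registration pipeline this abstract describes (SIFT keypoints, RANSAC-filtered matches, homography warp), the OpenCV sketch below aligns one band image to a reference band. The function name and parameter values are illustrative assumptions, not the authors' implementation, and it assumes the opencv-python package is available.

```python
import cv2                # assumes the opencv-python package
import numpy as np

def register_band(reference, moving, ratio=0.75):
    """Warp the 'moving' band onto 'reference' via SIFT matches, RANSAC and a homography."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference, None)
    kp_mov, des_mov = sift.detectAndCompute(moving, None)

    # Nearest-neighbour matching with Lowe's ratio test to discard weak matches
    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des_mov, des_ref, k=2)
    good = [m for m, n in matches if m.distance < ratio * n.distance]

    src = np.float32([kp_mov[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # RANSAC rejects outlier correspondences before the homography is estimated
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = reference.shape[:2]
    return cv2.warpPerspective(moving, H, (w, h))
```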

  3. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R{sup 2}) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R{sup 2}=0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
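
    A bivariate linear model of the kind described here (a band ratio plus true color as predictors) can be fit with ordinary least squares; the arrays below are synthetic placeholders, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-station values standing in for field measurements (not the study's data)
band_ratio = rng.uniform(0.8, 1.4, 30)   # e.g. a ratio of two bands within 625-710 nm
true_color = rng.uniform(20, 120, 30)    # true color, illustrative units
chl_a = 5 + 12 * band_ratio - 0.03 * true_color + rng.normal(0, 0.5, 30)

# Bivariate linear regression: Chl a ~ band ratio + true color
X = np.column_stack([np.ones_like(band_ratio), band_ratio, true_color])
coef, *_ = np.linalg.lstsq(X, chl_a, rcond=None)

pred = X @ coef
r2 = 1 - ((chl_a - pred) ** 2).sum() / ((chl_a - chl_a.mean()) ** 2).sum()
print("coefficients:", coef, "R^2:", round(r2, 3))
```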

  4. Highly Portable Airborne Multispectral Imaging System

    NASA Technical Reports Server (NTRS)

    Lehnemann, Robert; Mcnamee, Todd

    2001-01-01

    A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1.5 to 1 km. The system was developed especially for use in coastal environments and is well suited for performing remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.

  5. Study on airborne multispectral imaging fusion detection technology

    NASA Astrophysics Data System (ADS)

    Ding, Na; Gao, Jiaobo; Wang, Jun; Cheng, Juan; Gao, Meng; Gao, Fei; Fan, Zhe; Sun, Kefeng; Wu, Jun; Li, Junna; Gao, Zedong; Cheng, Gang

    2014-11-01

    An airborne multispectral imaging fusion detection technology is proposed in this paper. In this design scheme, the airborne multispectral imaging system consists of the multispectral camera, the image processing unit, and the stabilized platform. The multispectral camera operates in the spectral region from the visible to the near-infrared waveband (0.4-1.0 um); it has four identical and independent imaging channels, and sixteen different typical wavelengths can be selected according to the targets and backgrounds of interest. Related experiments were carried out with the airborne multispectral imaging system. In particular, camouflage targets were fused and detected in different complex environments, such as land vegetation backgrounds, hot desert backgrounds, and underwater scenes. Within the 0.4-1.0 um region, three characteristic wavelengths are selected from the sixteen typical spectral bands and combined according to the background and target. The spectral images corresponding to the three characteristic wavelengths are registered and fused by the image processing unit in real time, and a fusion video carrying the typical target signature is output. In these fusion images, the contrast between target and background is greatly increased. Experimental results confirm that the airborne multispectral imaging fusion detection technology can acquire multispectral fusion images with high contrast in real time and can detect and identify camouflaged objects against complex backgrounds, including targets underwater.
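
    A minimal sketch of producing a three-band false-color composite from co-registered characteristic-wavelength images is shown below; the percentile contrast stretch is an illustrative choice, not the paper's fusion algorithm.

```python
import numpy as np

def fuse_three_bands(b1, b2, b3, p_low=2, p_high=98):
    """Stretch three co-registered spectral bands and stack them as an RGB composite."""
    def stretch(band):
        lo, hi = np.percentile(band, (p_low, p_high))
        return np.clip((band.astype(np.float32) - lo) / (hi - lo + 1e-6), 0.0, 1.0)
    return np.dstack([stretch(b1), stretch(b2), stretch(b3)])
```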

  6. Airborne system for testing multispectral reconnaissance technologies

    NASA Astrophysics Data System (ADS)

    Schmitt, Dirk-Roger; Doergeloh, Heinrich; Keil, Heiko; Wetjen, Wilfried

    1999-07-01

    There is an increasing demand for future airborne reconnaissance systems to obtain aerial images for tactical or peacekeeping operations. Unmanned Aerial Vehicles (UAVs) equipped with multispectral sensor systems and real-time, jam-resistant data transmission capabilities are of particular interest. An airborne experimental platform has been developed as a testbed to investigate different reconnaissance system concepts before their application in UAVs. It is based on a Dornier DO 228 aircraft, which serves as the flying platform. Great care has been taken to allow testing of different kinds of multispectral sensors: the platform can be equipped with an IR sensor head, high-resolution aerial cameras covering the whole optical spectrum, and radar systems. The onboard equipment further includes systems for digital image processing, compression, coding, and storage. The data are RF-transmitted to the ground station using technologies with high jam resistance. The images, after merging with enhanced vision components, are delivered to the observer, who has an uplink data channel available to control flight and imaging parameters.

  7. Lens assemblies for multispectral camera

    NASA Astrophysics Data System (ADS)

    Lepretre, Francois

    1994-09-01

    In the framework of a contract with the Indian Space Research Organization (ISRO), MATRA DEFENSE - DOD/UAO developed, produced and tested 36 LISS 1 - LISS 2 type lenses and 12 LISS 3 lenses equipped with their interferential filters. These lenses are intended to form the optical systems of the multispectral imaging sensors aboard the Indian earth observation satellites IRS 1A, 1B, 1C, and 1D. It should be noted that the multispectral cameras of the IRS 1A - 1B satellites have been in operation for two years and have given very satisfactory results according to ISRO. Each of the multispectral LISS 3 cameras consists of lenses, each working in a different spectral band (B2: 520 - 590 nm; B3: 620 - 680 nm; B4: 770 - 860 nm; B5: 1550 - 1700 nm). In order to superimpose the images of each spectral band without digital processing, the image formats (60 mm) of the lenses are registered to better than 2 micrometers and remain so throughout all the environmental tests. Similarly, because there is no precise thermal control aboard the satellite, the lenses are designed to be as athermal as possible.

  8. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  9. ASPIS, A Flexible Multispectral System for Airborne Remote Sensing Environmental Applications

    PubMed Central

    Papale, Dario; Belli, Claudio; Gioli, Beniamino; Miglietta, Franco; Ronchi, Cesare; Vaccari, Francesco Primo; Valentini, Riccardo

    2008-01-01

    Airborne multispectral and hyperspectral remote sensing is a powerful tool for environmental monitoring applications. In this paper we describe a new system (ASPIS) composed of a 4-CCD spectral sensor, a thermal IR camera and a laser altimeter, mounted on a flexible Sky-Arrow airplane. A test application of the multispectral sensor to estimate durum wheat quality is also presented.

  10. Airborne system for multispectral, multiangle polarimetric imaging.

    PubMed

    Bowles, Jeffrey H; Korwan, Daniel R; Montes, Marcos J; Gray, Deric J; Gillis, David B; Lamela, Gia M; Miller, W David

    2015-11-01

    In this paper, we describe the design, fabrication, calibration, and deployment of an airborne multispectral polarimetric imager. The motivation for the development of this instrument was to explore its ability to provide information about water constituents, such as particle size and type. The instrument is based on four 16 MP cameras and uses wire grid polarizers (aligned at 0°, 45°, 90°, and 135°) to provide the separation of the polarization states. A five-position filter wheel provides for four narrow-band spectral filters (435, 550, 625, and 750 nm) and one blocked position for dark-level measurements. When flown, the instrument is mounted on a programmable stage that provides control of the view angles. View angles that range to ±65° from the nadir have been used. Data processing provides a measure of the polarimetric signature as a function of both the view zenith and view azimuth angles. As a validation of our initial results, we compare our measurements, over water, with the output of a Monte Carlo code, both of which show neutral points off the principal plane. The locations of the calculated and measured neutral points are compared. The random error level in the measured degree of linear polarization (8% at 435 nm) is shown to be better than 0.25%. PMID:26560615
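
    Given the four polarizer channels described above (0°, 45°, 90°, 135°), the linear Stokes parameters and degree of linear polarization can be computed per pixel with the standard relations; this sketch is an illustration of those relations, not the instrument's actual processing chain.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Per-pixel linear Stokes parameters, degree and angle of linear polarization
    from four co-registered polarizer-channel images (0, 45, 90, 135 degrees)."""
    i0, i45, i90, i135 = (x.astype(np.float64) for x in (i0, i45, i90, i135))
    s0 = 0.5 * (i0 + i45 + i90 + i135)        # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)
    aolp = 0.5 * np.arctan2(s2, s1)
    return s0, s1, s2, dolp, aolp
```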

  11. An airborne four-camera imaging system for agricultural applications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

  12. Airborne multispectral detection of regrowth cotton fields

    NASA Astrophysics Data System (ADS)

    Westbrook, John K.; Suh, Charles P.-C.; Yang, Chenghai; Lan, Yubin; Eyster, Ritchie S.

    2015-01-01

    Effective methods are needed for timely areawide detection of regrowth cotton plants because boll weevils (a quarantine pest) can feed and reproduce on these plants beyond the cotton production season. Airborne multispectral images of regrowth cotton plots were acquired on several dates after three shredding (i.e., stalk destruction) dates. Linear spectral unmixing (LSU) classification was applied to high-resolution airborne multispectral images of regrowth cotton plots to estimate the minimum detectable size and subsequent growth of plants. We found that regrowth cotton fields can be identified when the mean plant width is approximately 0.2 m for an image resolution of 0.1 m. LSU estimates of canopy cover of regrowth cotton plots correlated well (r2=0.81) with the ratio of mean plant width to row spacing, a surrogate measure of plant canopy cover. The height and width of regrowth plants were both well correlated (r2=0.94) with accumulated degree-days after shredding. The results will help boll weevil eradication program managers use airborne multispectral images to detect and monitor the regrowth of cotton plants after stalk destruction, and identify fields that may require further inspection and mitigation of boll weevil infestations.
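
    Linear spectral unmixing of the kind applied here estimates per-pixel abundances of endmember spectra (e.g., cotton canopy and bare soil). The sketch below uses a simple non-negative least-squares solver with a sum-to-one normalization; it is a generic illustration, not the study's exact LSU implementation.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(image, endmembers):
    """Per-pixel linear spectral unmixing with non-negative abundances.

    image:      (rows, cols, bands) multispectral image
    endmembers: (bands, n_endmembers) spectra, e.g. [cotton canopy, bare soil]
    """
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, bands).astype(np.float64)
    abundances = np.empty((pixels.shape[0], endmembers.shape[1]))
    for i, spectrum in enumerate(pixels):
        abundances[i], _ = nnls(endmembers, spectrum)
    # Normalize so abundances sum to one (approximate sum-to-one constraint)
    abundances /= np.maximum(abundances.sum(axis=1, keepdims=True), 1e-9)
    return abundances.reshape(rows, cols, -1)
```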

  13. Development of a multispectral camera system

    NASA Astrophysics Data System (ADS)

    Sugiura, Hiroaki; Kuno, Tetsuya; Watanabe, Norihiro; Matoba, Narihiro; Hayashi, Junichiro; Miyake, Yoichi

    2000-05-01

    A highly accurate multispectral camera and its application software have been developed as a practical system to capture digital images of artworks stored in galleries and museums. Instead of recording color data in the conventional three RGB primary colors, the newly developed camera and software carry out a pixel-wise estimation of spectral reflectance, the color data specific to the object, to enable practical multispectral imaging. To realize accurate multispectral imaging, the dynamic range of the camera is set to 14 bits or more and the output to 14 bits, so that capture is possible even when the difference in light quantity among the channels is large. Further, a small rotary color filter was developed in parallel to keep the camera at a practical size. We have also developed software capable of selecting the optimum combination of commercially available color filters: n color filters are selected from m candidates so as to minimize the Euclidean distance between actual and estimated spectral reflectance, or the color difference in CIELAB color space, for 147 oil paint samples.
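
    Pixel-wise spectral reflectance estimation from multi-channel camera responses is commonly done with a learned linear (pseudo-inverse) mapping. The sketch below uses synthetic stand-ins for the filter sensitivities and the 147 paint samples rather than the authors' data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data (placeholders, not the authors' filter set or paint database)
reflectances = rng.uniform(0.05, 0.9, (147, 61))   # 147 samples, 400-700 nm at 5 nm steps
sensitivities = rng.uniform(0.0, 1.0, (61, 6))     # 6 hypothetical filter channels
responses = reflectances @ sensitivities            # simulated camera responses

# Least-squares (pseudo-inverse) mapping from camera responses to reflectance spectra
W, *_ = np.linalg.lstsq(responses, reflectances, rcond=None)

# Pixel-wise estimation: a new response vector -> estimated reflectance spectrum
estimate = responses[0] @ W
print(estimate.shape)   # (61,) estimated reflectance samples
```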

  14. Classroom multispectral imaging using inexpensive digital cameras.

    NASA Astrophysics Data System (ADS)

    Fortes, A. D.

    2007-12-01

    The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360 - 1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation on either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.

  15. Astronaut Jack Lousma works at Multispectral camera experiment

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Astronaut Jack R. Lousma, Skylab 3 pilot, works at the S190A multispectral camera experiment in the Multiple Docking Adapter (MDA), seen from a color television transmission made by a TV camera aboard the Skylab space station cluster in Earth orbit. Lousma later used a small brush to clean the six lenses of the multispectral camera.

  16. Spatial Modeling and Variability Analysis for Modeling and Prediction of Soil and Crop Canopy Coverage Using Multispectral Imagery from an Airborne Remote Sensing System

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Based on a previous study on an airborne remote sensing system with automatic camera stabilization for crop management, multispectral imagery was acquired using the MS-4100 multispectral camera at different flight altitudes over a 115 ha cotton field. After the acquired images were geo-registered an...

  17. Multispectral Airborne Laser Scanning for Automated Map Updating

    NASA Astrophysics Data System (ADS)

    Matikainen, Leena; Hyyppä, Juha; Litkey, Paula

    2016-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with multispectral information from aerial images, has shown its high feasibility for automated mapping processes. Recently, the first multispectral airborne laser scanners have been launched, and multispectral information is for the first time directly available for 3D ALS point clouds. This article discusses the potential of this new single-sensor technology in map updating, especially in automated object detection and change detection. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from a random forests analysis suggest that the multispectral intensity information is useful for land cover classification, including ground surface objects and classes such as roads. An out-of-bag estimate for classification error was about 3% for separating the classes asphalt, gravel, rocky areas and low vegetation from each other. For buildings and trees, it was under 1%. According to feature importance analyses, multispectral features based on several channels were more useful than those based on one channel. Automatic change detection utilizing the new multispectral ALS data, an old digital surface model (DSM) and old building vectors was also demonstrated. Overall, our first analyses suggest that the new data are very promising for further increasing the automation level in mapping. The multispectral ALS technology is independent of external illumination conditions, and intensity images produced from the data do not include shadows. These are significant advantages when the development of automated classification and change detection procedures is considered.
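
    A random forests land cover classification with an out-of-bag error estimate and feature importances, as described above, can be sketched with scikit-learn; the features and labels below are synthetic placeholders, not the Optech Titan data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder feature matrix: per-point or per-segment multispectral intensity and
# height features derived from ALS point clouds (synthetic stand-ins for illustration)
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 12))
y = rng.integers(0, 6, 1000)   # e.g. asphalt, gravel, rocky, low vegetation, building, tree

clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
clf.fit(X, y)

print("out-of-bag error:", 1 - clf.oob_score_)
print("feature importances:", clf.feature_importances_)
```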

  18. A high-resolution airborne four-camera imaging system for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

  19. Citrus greening detection using airborne hyperspectral and multispectral imaging techniques

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hyperspectral imaging can provide unique spectral signatures for diseased vegetation. Airborne multispectral and hyperspectral imaging can be used to detect potentially infected trees over a large area for rapid detection of infected zones. This paper proposes a method to detect the citrus greening...

  20. Airborne multispectral detecting system for marine mammals survey

    NASA Astrophysics Data System (ADS)

    Podobna, Yuliya; Sofianos, James; Schoonmaker, Jon; Medeiros, Dustin; Boucher, Cynthia; Oakley, Daniel; Saggese, Steve

    2010-04-01

    This work presents an electro-optical multispectral capability that detects and monitors marine mammals. It is a continuation of the Whale Search Radar SBIR program funded by PMA-264 through NAVAIR. A lightweight, multispectral, turreted imaging system is designed for airborne and ship-based platforms to detect and monitor marine mammals. The system tests were conducted over the Humpback whale breeding and calving area in Maui, Hawaii. The results of the tests and the system description are presented. The development of an automatic whale detection algorithm is discussed, as well as the methodology used to turn raw survey data into quantifiable data products.

  1. Sandia Multispectral Airborne Lidar for UAV Deployment

    SciTech Connect

    Daniels, J.W.; Hargis, P.J., Jr.; Henson, T.D.; Jordan, J.D.; Lang, A.R.; Schmitt, R.L.

    1998-10-23

    Sandia National Laboratories has initiated the development of an airborne system for UV laser remote sensing measurements. System applications include the detection of effluents associated with the proliferation of weapons of mass destruction and the detection of biological weapon aerosols. This paper discusses the status of the conceptual design development and plans for both the airborne payload (pointing and tracking, laser transmitter, and telescope receiver) and the Altus unmanned aerospace vehicle platform. Hardware design constraints necessary to stay within the weight, power, and volume limitations of the flight platform are identified.

  2. Generic MSFA mosaicking and demosaicking for multispectral cameras

    NASA Astrophysics Data System (ADS)

    Miao, Lidan; Qi, Hairong; Ramanath, Rajeev

    2006-02-01

    In this paper, we investigate the potential application of multispectral filter array (MSFA) techniques to multispectral imaging, motivated by their low cost, exact registration, and strong robustness. In both human and many animal visual systems, different types of photoreceptors are organized into mosaic patterns. This behavior has been emulated in the industry to develop the so-called color filter array (CFA) in the manufacture of digital color cameras. In this way, only one color component is measured at each pixel, and the sensed image is a mosaic of different color bands. We extend this idea to multispectral imaging by developing generic mosaicking and demosaicking algorithms. The binary tree-driven MSFA design process guarantees that the pixel distributions of different spectral bands are uniform and highly correlated. These spatial features facilitate the design of the generic demosaicking algorithm based on the same binary tree, which considers three interrelated issues: band selection, pixel selection and interpolation. We evaluate the reconstructed images from two aspects: reconstruction quality and target classification performance. The experimental results demonstrate that the mosaicking and demosaicking process preserves the image quality effectively, which further supports the MSFA technique as a feasible solution for multispectral cameras.
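
    A minimal MSFA demosaicking sketch is given below: each band's sparse samples are interpolated by normalized convolution over a periodic filter-array tile. This is a generic illustration under assumed names, not the binary-tree-driven algorithm the paper proposes.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_msfa(mosaic, pattern):
    """Simple demosaicking of a periodic multispectral filter array (MSFA).

    mosaic:  (rows, cols) raw sensor image, one band sampled per pixel
    pattern: (pr, pc) array of band indices describing the periodic MSFA tile
    Returns: (rows, cols, n_bands) cube, missing samples filled by normalized convolution.
    """
    rows, cols = mosaic.shape
    n_bands = int(pattern.max()) + 1
    band_map = np.tile(pattern, (rows // pattern.shape[0] + 1,
                                 cols // pattern.shape[1] + 1))[:rows, :cols]
    # Kernel large enough to cover one full pattern tile around every pixel
    kernel = np.ones((2 * pattern.shape[0] - 1, 2 * pattern.shape[1] - 1))
    out = np.empty((rows, cols, n_bands))
    for b in range(n_bands):
        mask = (band_map == b).astype(float)
        samples = mosaic * mask
        out[..., b] = convolve(samples, kernel) / np.maximum(convolve(mask, kernel), 1e-9)
    return out

# Example: 2x2 MSFA tile with four bands
pattern = np.array([[0, 1], [2, 3]])
mosaic = np.random.default_rng(0).random((8, 8))
cube = demosaic_msfa(mosaic, pattern)   # shape (8, 8, 4)
```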

  3. Camera system for multispectral imaging of documents

    NASA Astrophysics Data System (ADS)

    Christens-Barry, William A.; Boydston, Kenneth; France, Fenella G.; Knox, Keith T.; Easton, Roger L., Jr.; Toth, Michael B.

    2009-02-01

    A spectral imaging system comprising a 39-Mpixel monochrome camera, LED-based narrowband illumination, and acquisition/control software has been designed for investigations of cultural heritage objects. Notable attributes of this system, referred to as EurekaVision, include: streamlined workflow, flexibility, provision of well-structured data and metadata for downstream processing, and illumination that is safer for the artifacts. The system design builds upon experience gained while imaging the Archimedes Palimpsest and has been used in studies of a number of important objects in the LOC collection. This paper describes practical issues that were considered by EurekaVision to address key research questions for the study of fragile and unique cultural objects over a range of spectral bands. The system is intended to capture important digital records for access by researchers, professionals, and the public. The system was first used for spectral imaging of the 1507 world map by Martin Waldseemueller, the first printed map to reference "America." It was also used to image sections of the Carta Marina 1516 map by the same cartographer for comparative purposes. An updated version of the system is now being utilized by the Preservation Research and Testing Division of the Library of Congress.

  4. The design and the development of a hyperspectral and multispectral airborne mapping system

    NASA Astrophysics Data System (ADS)

    Gorsevski, Pece V.; Gessler, Paul E.

    Flexible and cost-effective tools for rapid image acquisition and natural resource mapping are needed by land managers. This paper describes the hardware and software architecture of a low-cost system that can be deployed on a light aircraft for rapid data acquisition. The Hyperspectral and Multispectral Cameras for Airborne Mapping (HAMCAM) was designed and developed in the Geospatial Laboratory for Environmental Dynamics at the University of Idaho as a student-learning tool, and to enhance the existing curriculum currently offered. The system integrates a hyperspectral sensor with four multispectral cameras, an Inertial Navigation System (INS), a Wide Area Augmentation System (WAAS)-capable Global Positioning System (GPS), a data acquisition computer, and custom software for running the sensors in a variety of different modes. The outputs include very high resolution imagery obtained in four adjustable visible and near-infrared bands from the multispectral imager. The hyperspectral sensor acquires 240 spectral bands along 2.7 nm intervals within the 445-900 nm range. The INS provides aircraft pitch, roll and yaw information for rapid geo-registration of the imagery. This paper will discuss the challenges associated with the development of the system and the integration of components and software for implementation of this system for natural resource management applications. In addition, sample imagery acquired by the sensor will be presented.

  5. Spectra-view: A high performance, low-cost multispectral airborne imaging system

    SciTech Connect

    Helder, D.

    1996-11-01

    Although a variety of airborne platforms are available for collecting remote sensing data, a niche exists for a low cost, compact system capable of collecting accurate visible and infrared multispectral data in a digital format. To fill this void, an instrument known as Spectra-View was developed by Airborne Data Systems. Multispectral data are collected in the visible and near-infrared using an array of CCD cameras with appropriate spectral filtering; infrared imaging is accomplished using commercially available cameras. Although the current system images in five spectral bands, a modular design approach allows various configurations for imaging in the visible and infrared regions with up to 10 or more channels. It was built entirely through integration of readily available commercial components, is compact enough to fly in an aircraft as small as a Cessna 172, and can record imagery at airspeeds in excess of 150 knots. A GPS-based navigation system provides a course deviation indicator for the pilot to follow and allows for georeferencing of the data. To maintain precise pointing knowledge, and at the same time keep system cost low, attitude sensors are mounted directly with the cameras rather than on a stabilized mounting system. Aircraft/camera attitude along the yaw, pitch, and roll axes is recorded at the moment of camera firing. All data are collected in a digital format on a hard disk that is removable during flight, so that virtually unlimited amounts of data may be recorded. Following collection, imagery is readily available for viewing and incorporation into computer-based systems for analysis and reduction. Ground processing software has been developed to perform radiometric calibration and georeference the imagery. Since June 1995, the system has been collecting high-quality data for numerous customers in applications including agriculture, forestry, and global change research. Several examples will be presented.

  6. Deepwater Horizon oil spill monitoring using airborne multispectral infrared imagery

    NASA Astrophysics Data System (ADS)

    Shen, Sylvia S.; Lewis, Paul E.

    2011-06-01

    On April 28, 2010, the Environmental Protection Agency's (EPA) Airborne Spectral Photometric Environmental Collection Technology (ASPECT) aircraft was deployed to Gulfport, Mississippi to provide airborne remotely sensed air monitoring and situational awareness data and products in response to the Deepwater Horizon oil spill disaster. The ASPECT aircraft was released from service on August 9, 2010 after having flown over 85 missions that included over 325 hours of flight operation. This paper describes several advanced analysis capabilities specifically developed for the Deepwater Horizon mission to correctly locate, identify, characterize, and quantify surface oil using ASPECT's multispectral infrared data. The data products produced using these advanced analysis capabilities provided the Deepwater Horizon Incident Command with a capability that significantly increased the effectiveness of skimmer vessel oil recovery efforts directed by the U.S. Coast Guard, and were considered by the Incident Command as key situational awareness information.

  7. Development of a portable multispectral thermal infrared camera

    NASA Technical Reports Server (NTRS)

    Osterwisch, Frederick G.

    1991-01-01

    The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument designated AA465 has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0 - 13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5 micron wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display to be used by the operator to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument was performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The

  8. Application of multimode airborne digital camera system in Wenchuan earthquake disaster monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Xue; Li, Qingting; Fang, Junyong; Tong, Qingxi; Zheng, Lanfen

    2009-06-01

    Remote sensing, especially airborne remote sensing, can be an invaluable technique for quick response to natural disasters. Timely images acquired by airborne remote sensing can provide very important information for headquarters and decision makers to assess the disaster situation and make effective relief arrangements. The image acquisition and processing of the Multi-mode Airborne Digital Camera (MADC) system and its application in Wenchuan earthquake disaster monitoring are presented in this paper. The MADC system is a novel airborne digital camera developed by the Institute of Remote Sensing Applications, Chinese Academy of Sciences. This camera system can acquire high quality images in three modes, namely wide field, multi-spectral (hyper-spectral) and stereo configuration. The basic components and technical parameters of MADC are also presented. The MADC system played a very important role in the disaster monitoring of the Wenchuan earthquake; in particular, a map of the dammed lakes in the Jianjiang river area was produced and provided to the front-line headquarters. Analytical methods and information extraction techniques of MADC are introduced, and some typical analytical and imaging results are given. Suggestions for the design and configuration of the airborne sensors are discussed at the end of this paper.

  9. MEDUSA: an ultra-lightweight multi-spectral camera for a HALE UAV

    NASA Astrophysics Data System (ADS)

    Van Achteren, T.; Delauré, B.; Everaerts, J.; Beghuin, D.; Ligot, R.

    2007-10-01

    The ESA-PRODEX funded MEDUSA project aims to develop a lightweight, high-resolution multi-spectral earth observation instrument, which will be embarked on a solar-powered high altitude long endurance (HALE) UAV operated at stratospheric altitudes (15 to 18 km). The MEDUSA instrument is designed to fill the gap between traditional airborne and spaceborne instruments regarding resolution and coverage. It targets applications such as crisis management and cartography, requiring high resolution images with regional coverage, flexible flight patterns, high update rates and long mission lengths (weeks to months). The MEDUSA camera is designed to operate at a ground resolution of 30 cm at 18 km altitude in the visible spectrum (400-650 nm), with a swath of 3000 m. The central part of the payload is a focal plane assembly consisting of two frame sensors (PAN and RGB). The wide swath is realized with a custom designed, highly sensitive CMOS sensor of 10000 x 1200 pixels. A GPS receiver and Inertial Measurement Unit (IMU) provide accurate position and attitude information. A direct downlink allows near-real time data delivery to the user. The on-board data processing consists mainly of basic image corrections and data compression (JPEG2000). The challenge lies mainly in fulfilling the requirements within the extreme environmental and physical constraints of the HALE UAV. Compared to traditional airborne and spaceborne systems, the MEDUSA camera system is ultra-lightweight (about 2 kg) and is operated in a low pressure and low temperature environment. System modeling and simulation is used to make careful trade-offs between requirements and subsystem performances. On 27th November 2006 the phase C/D for the design, production and test of the camera started at VITO with the support of 9 industrial partners. The MEDUSA camera is expected to transmit its first images by the end of 2008.

  10. Overview of test and application of the multispectral camera on ZY-3 satellite

    NASA Astrophysics Data System (ADS)

    Cai, Weijun; Fan, Bin; Zhang, Xiaohong

    2015-10-01

    ZY-3 is the first high-precision stereo-mapping satellite for civilian purposes in China; it was launched on 9 Jun 2012. The multi-spectral camera mounted on the satellite can acquire 4 multi-spectral bands. It has served steadily on orbit for more than three years. By 22 Mar 2015 the satellite had produced more than 822730 images, including 212386 multi-spectral images, amounting to more than 133 TB of multi-spectral data. The telemetry data and the images show that the imaging quality and the stability of the interior orientation elements all satisfy the requirements of the mission.

  11. Multispectral light scattering imaging and multivariate analysis of airborne particulates

    NASA Astrophysics Data System (ADS)

    Holler, Stephen; Skelsey, Charles R.; Fuerstenau, Stephen D.

    2005-05-01

    Light scattering patterns from non-spherical particles and aggregates exhibit complex structure that is only revealed when observing in two angular dimensions. However, due to the varied shape and packing of such aerosols, the rich structure in the two-dimensional angular optical scattering (TAOS) pattern varies from particle to particle. We examine two-dimensional light scattering patterns obtained at multiple wavelengths using a single CCD camera with minimal cross talk between channels. The integration of the approach with a single CCD camera assures that data is acquired within the same solid angle and orientation. Since the optical size of the scattering particle is inversely proportional to the illuminating wavelength, the spectrally resolved scattering information provides characteristic information about the airborne particles simultaneously in two different scaling regimes. The simultaneous acquisition of data from airborne particulate matter at two different wavelengths allows for additional degrees of freedom in the analysis and characterization of the aerosols. Whereas our previous multivariate analyses of aerosol particles have relied solely on spatial frequency components, our present approach attempts to incorporate the relative symmetry of the particle-detector system while extracting information content from both spectral channels. In addition to single channel data, this current approach also examines relative metrics. Consequently, we have begun to employ multivariate techniques based on novel morphological descriptors in order to classify "unknown" particles within a database of TAOS patterns from known aerosols utilizing both spectral and spatial information acquired. A comparison is made among several different classification metrics, all of which show improved classification capabilities relative to our previous approaches.

  12. Michigan experimental multispectral mapping system: A description of the M7 airborne sensor and its performance

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.

    1974-01-01

    The development and characteristics of a multispectral band scanner for an airborne mapping system are discussed. The sensor operates in the ultraviolet, visual, and infrared frequencies. Any twelve of the bands may be selected for simultaneous, optically registered recording on a 14-track analog tape recorder. Multispectral imagery recorded on magnetic tape in the aircraft can be laboratory reproduced on film strips for visual analysis or optionally machine processed in analog and/or digital computers before display. The airborne system performance is analyzed.

  13. Employing airborne multispectral digital imagery to map Brazilian pepper infestation in south Texas.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A study was conducted in south Texas to determine the feasibility of using airborne multispectral digital imagery for differentiating the invasive plant Brazilian pepper (Schinus terebinthifolius) from other cover types. Imagery obtained in the visible, near infrared, and mid infrared regions of th...

  14. Comparison of different detection methods for citrus greening disease based on airborne multispectral and hyperspectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Citrus greening or Huanglongbing (HLB) is a devastating disease spread in many citrus groves since first found in 2005 in Florida. Multispectral (MS) and hyperspectral (HS) airborne images of citrus groves in Florida were taken to detect citrus greening infected trees in 2007 and 2010. Ground truthi...

  15. Remote identification of potential boll weevil host plants: Airborne multispectral detection of regrowth cotton

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Regrowth cotton plants can serve as potential hosts for boll weevils during and beyond the production season. Effective methods for timely areawide detection of these host plants are critically needed to expedite eradication in south Texas. We acquired airborne multispectral images of experimental...

  16. Using airborne multispectral imagery to monitor cotton root rot expansion within a growing season

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot is a serious and destructive disease that affects cotton production in the southwestern United States. Accurate delineation of cotton root rot infestations is important for cost-effective management of the disease. The objective of this study was to use airborne multispectral imagery...

  17. Television camera on RMS surveys insulation on Airborne Support Equipment

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The television camera on the end effector of the Canadian-built Remote Manipulator System (RMS) is seen surveying some of the insulation on the Airborne Support Equipment (ASE). Flight controllers called for the survey following the departure of the Advanced Communications Technology Satellite (ACTS) and its Transfer Orbit Stage (TOS).

  18. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    There are many methods to acquire multispectral images, but a dynamically band-selective, area-scan multispectral camera has not yet been developed. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame and an electric circuit for parallel image signal processing. In addition, firmware and application software have been developed. Remarkable improvements over a conventional 3CCD camera are its redesigned image splitter and filter-exchangeable frame. Computer simulation was required to visualize the ray paths inside the prism when redesigning the image splitter; the dimensions of the splitter were then determined by simulation with options of BK7 glass and non-dichroic coating, chosen so that rays across the full wavelength range reach all film planes. The image splitter was verified with two narrow-band line lasers. The filter-exchangeable frame is designed so that bandpass filters can be swapped without changing the position of the image sensors on the film plane. The developed 3CCD camera was evaluated for detecting scab and bruises on Fuji apples. As a result, the filter-exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that need to exchange bandpass filters.

  19. Comparison of Airborne Multispectral and Hyperspectral Imagery for Yield Estimation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Multispectral and hyperspectral imagery is being used to monitor crop conditions and map yield variability. However, limited research has been conducted to compare the differences between these two types of imagery for assessing crop growth and yield. The objective of this study was to compare airbo...

  20. Citrus greening disease detection using airborne multispectral and hyperspectral imaging

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hyperspectral imaging can provide unique spectral signatures for diseased vegetation. Airborne hyperspectral imaging can be used to detect potentially infected trees over a large area for rapid detection of infected zones. Ground inspection and management can be focused on these infected zones rath...

  1. Calibrated and geocoded clutter from an airborne multispectral scanner

    NASA Astrophysics Data System (ADS)

    Heuer, Markus; Bruehlmann, Ralph; John, Marc-Andre; Schmid, Konrad J.; Hueppi, Rudolph; Koenig, Reto

    1999-07-01

    Robustness of automatic target recognition (ATR) to varying observation conditions and countermeasures is substantially increased by the use of multispectral sensors. Assessment of such ATR systems is performed by captive flight tests and simulations (HWIL or complete modeling). Although the clutter components of a scene can be generated with specified statistics, clutter maps obtained directly from measurement are required for validation of a simulation. In addition, urban scenes have non-stationary characteristics and are difficult to simulate. The present paper describes a scanner, data acquisition and processing system used for the generation of realistic clutter maps incorporating infrared, passive and active millimeter wave channels. The sensors are mounted on a helicopter with coincident line-of-sight, enabling us to measure consistent clutter signatures under varying observation conditions. Position and attitude data from GPS and an inertial measurement unit, respectively, are used to geometrically correct the raw scanner data. After sensor calibration the original voltage signals are converted to physical units, i.e. temperatures and reflectivities, describing the clutter independently of the scanning sensor, thus allowing the clutter maps to be used in tests of a priori unknown multispectral sensors. The data correction procedures are described and results are presented.

  2. Determining density of maize canopy. 2: Airborne multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.; Cipra, J. E.

    1971-01-01

    Multispectral scanner data were collected in two flights over a cover plot with a light-colored soil background at an altitude of 305 m. Energy in eleven reflective wavelength bands from 0.45 to 2.6 microns was recorded. Four growth stages of maize (Zea mays L.) gave a wide range of canopy densities for each flight date. Leaf area index measurements were taken from the twelve subplots and were used as a measure of canopy density. Ratio techniques were used to relate uncalibrated scanner response to leaf area index. The ratios of scanner data values for the 0.72 to 0.92 micron wavelength band over the 0.61 to 0.70 micron wavelength band were calculated for each plot. The ratios related very well to leaf area index for a given flight date. The results indicated that spectral data from maize canopies could be of value in determining canopy density.
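
    The band-ratio analysis described here (the 0.72-0.92 um band divided by the 0.61-0.70 um band, related to leaf area index) amounts to a simple ratio followed by a linear fit; the numbers below are synthetic stand-ins for the twelve subplots, not the study's measurements.

```python
import numpy as np

# Synthetic per-plot scanner responses and leaf area index for twelve subplots
nir = np.array([52., 61., 70., 78., 55., 64., 73., 81., 50., 60., 72., 80.])  # 0.72-0.92 um band
red = np.array([40., 35., 30., 26., 41., 34., 29., 25., 42., 36., 28., 24.])  # 0.61-0.70 um band
lai = np.array([0.5, 1.2, 2.0, 2.9, 0.6, 1.3, 2.2, 3.1, 0.4, 1.1, 2.1, 3.0])  # leaf area index

ratio = nir / red                            # band ratio per plot
slope, intercept = np.polyfit(ratio, lai, 1)  # linear relation between ratio and LAI
r = np.corrcoef(ratio, lai)[0, 1]
print(f"LAI ~ {slope:.2f} * ratio + {intercept:.2f}, r = {r:.2f}")
```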

  3. Evaluation of eelgrass beds mapping using a high-resolution airborne multispectral scanner

    USGS Publications Warehouse

    Su, H.; Karna, D.; Fraim, E.; Fitzgerald, M.; Dominguez, R.; Myers, J.S.; Coffland, B.; Handley, L.R.; Mace, T.

    2006-01-01

    Eelgrass (Zostera marina) can provide vital ecological functions in stabilizing sediments, influencing current dynamics, and contributing significant amounts of biomass to numerous food webs in coastal ecosystems. Mapping eelgrass beds is important for coastal water and nearshore estuarine monitoring, management, and planning. This study demonstrated the possible use of a high spatial (approximately 5 m) and temporal (maximum low tide) resolution airborne multispectral scanner for mapping eelgrass beds in Northern Puget Sound, Washington. A combination of supervised and unsupervised classification approaches was applied to the multispectral scanner imagery. A normalized difference vegetation index (NDVI) derived from the red and near-infrared bands, together with ancillary spatial information, was used to extract and mask eelgrass beds and other submerged aquatic vegetation (SAV) in the study area. We evaluated the resulting thematic map (geocoded, classified image) against a conventional aerial photograph interpretation using 260 point locations randomly stratified over five defined classes from the thematic map. We achieved an overall accuracy of 92 percent with a Kappa coefficient of 0.92 in the study area. This study demonstrates that the airborne multispectral scanner can be useful for mapping eelgrass beds at a local or regional scale, especially in regions for which optical remote sensing from space is constrained by climatic and tidal conditions. © 2006 American Society for Photogrammetry and Remote Sensing.
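
    The NDVI used to extract and mask eelgrass and other SAV is computed from the red and near-infrared bands as sketched below; the threshold shown is illustrative only, since the study combined supervised and unsupervised classification with ancillary spatial data rather than a single cut-off.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and red bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)

# Toy 2 x 2 scene; the 0.2 threshold is purely illustrative
nir_band = np.array([[0.32, 0.05], [0.28, 0.04]])
red_band = np.array([[0.10, 0.04], [0.09, 0.05]])
sav_mask = ndvi(nir_band, red_band) > 0.2
print(sav_mask)
```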

  4. Towards Automatic Single-Sensor Mapping by Multispectral Airborne Laser Scanning

    NASA Astrophysics Data System (ADS)

    Ahokas, E.; Hyyppä, J.; Yu, X.; Liang, X.; Matikainen, L.; Karila, K.; Litkey, P.; Kukko, A.; Jaakkola, A.; Kaartinen, H.; Holopainen, M.; Vastaranta, M.

    2016-06-01

    This paper describes the possibilities of the Optech Titan multispectral airborne laser scanner in the fields of mapping and forestry. The investigation targeted six land cover classes. Multispectral laser scanner data can be used to distinguish land cover classes of the ground surface, including roads and separate road surface classes. For forest inventory using point cloud metrics and intensity features combined, a total accuracy of 93.5% was achieved for classification of the three main boreal tree species (pine, spruce and birch). When using intensity features - without point height metrics - a classification accuracy of 91% was achieved for these three tree species. It was also shown that deciduous trees can be further classified into more species. We propose that intensity-related features and waveform-type features be combined with point height metrics for forest attribute derivation in area-based prediction, which is an operationally applied forest inventory process in Scandinavia. It is expected that multispectral airborne laser scanning can provide highly valuable data for city and forest mapping and is a highly relevant data asset for national and local mapping agencies in the near future.

  5. Two-inch Return Beam Vidicon (RBV) multispectral three camera subsystem

    NASA Technical Reports Server (NTRS)

    Weinstein, O.

    1973-01-01

    A return beam vidicon multispectral three camera subsystem was developed and built as one of the two principal sensor payloads for the ERTS-A and -B missions. The performance of the cameras on ERTS-1 has been excellent, meeting or exceeding all expectations, especially in the area of geometric fidelity and stability. The three cameras are coaligned in the spacecraft to view the same square ground scene but in different spectral bands. When the separate images are processed and superimposed in their respective colors, they provide a single false color image containing the radiometric and cartographic information required for the ERTS system. The three spectral regions covered by the RBV subsystem are the blue-green, the red, and the near infrared. The three cameras are exposed simultaneously to facilitate registration of the three separate images into the final color composite.

  6. Urban land use monitoring from computer-implemented processing of airborne multispectral data

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.

    1976-01-01

    Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August, 1972. Computer analysis of these spectral data indicate that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses from analysis of type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.

  7. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  8. Development of a portable 3CCD camera system for multispectral imaging of biological samples.

    PubMed

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  9. Multi-spectral CCD camera system for ocean water color and seacoast observation

    NASA Astrophysics Data System (ADS)

    Zhu, Min; Chen, Shiping; Wu, Yanlin; Huang, Qiaolin; Jin, Weiqi

    2001-10-01

    The multi-spectral CCD camera system, one of the earth observing instruments on the HY-1 satellite to be launched in 2001, was developed by the Beijing Institute of Space Mechanics & Electricity (BISME), Chinese Academy of Space Technology (CAST). In a 798 km orbit, the system can provide images with 250 m ground resolution and a swath of 500 km. It is mainly used for coastal zone dynamic mapping and ocean water color monitoring, covering offshore and coastal zone pollution, plant cover, water color, ice, underwater terrain, suspended sediment, mudflats, soil and water vapor. The multi-spectral camera system is composed of four monochrome CCD cameras, which are line-array-based, 'push-broom' scanning cameras, each responding to one of four spectral bands. The camera system adopts view-field registration; that is, each camera scans the same region at the same moment. Each camera contains optics, a focal plane assembly, electrical circuits, an installation structure, a calibration system, thermal control and so on. The primary features of the camera system are: (1) offset of the central wavelength is better than 5 nm; (2) degree of polarization is less than 0.5%; (3) signal-to-noise ratio is about 1000; (4) dynamic range is better than 2000:1; (5) registration precision is better than 0.3 pixel; (6) quantization is 12 bit.

  10. Integration of multispectral face recognition and multi-PTZ camera automated surveillance for security applications

    NASA Astrophysics Data System (ADS)

    Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi

    2013-06-01

    Due to increasing security concerns, a complete security system should consist of two major components, a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high-resolution imagery for real-time behavior understanding, research in automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the intrinsic parameters of the PTZ cameras and their relative positions. Experimental results demonstrate that our proposed algorithm presents substantially

  11. Multispectral airborne laser scanning - a new trend in the development of LiDAR technology

    NASA Astrophysics Data System (ADS)

    Bakuła, K.

    2015-12-01

    Airborne laser scanning (ALS) is one of the most accurate remote sensing techniques for acquiring data on the terrain and its cover. Modern scanners have recently become able to scan in two or more channels (laser frequencies), which gives rise to the possibility of obtaining diverse information about an area based on the different spectral properties of objects. The paper presents an example of a multispectral ALS system - Titan by Optech - including an analysis of digital elevation model accuracy and data density. As a result of the study, the high relative accuracy of LiDAR acquisition in three spectral bands was proven. The mean differences between digital terrain models (DTMs) were less than 0.03 m. The data density analysis showed the influence of the laser wavelength: the point clouds that were tested had average densities of 25, 23 and 20 points per square metre for the green (G), near-infrared (NIR) and shortwave-infrared (SWIR) lasers, respectively. The paper also discusses the possibility of generating colour composites from orthoimages of laser intensity reflectance and their classification capabilities for land cover mapping using data from airborne multispectral laser scanning, compared with conventional photogrammetric techniques.

  12. Testing of Land Cover Classification from Multispectral Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Bakuła, K.; Kupidura, P.; Jełowicki, Ł.

    2016-06-01

    Multispectral Airborne Laser Scanning provides a new opportunity for airborne data collection. It provides high-density topographic surveying and is also a useful tool for land cover mapping. Use of a minimum of three intensity images from a multiwavelength laser scanner and 3D information included in the digital surface model has the potential for land cover/use classification and a discussion about the application of this type of data in land cover/use mapping has recently begun. In the test study, three laser reflectance intensity images (orthogonalized point cloud) acquired in green, near-infrared and short-wave infrared bands, together with a digital surface model, were used in land cover/use classification where six classes were distinguished: water, sand and gravel, concrete and asphalt, low vegetation, trees and buildings. In the tested methods, different approaches for classification were applied: spectral (based only on laser reflectance intensity images), spectral with elevation data as additional input data, and spectro-textural, using morphological granulometry as a method of texture analysis of both types of data: spectral images and the digital surface model. The method of generating the intensity raster was also tested in the experiment. Reference data were created based on visual interpretation of ALS data and traditional optical aerial and satellite images. The results have shown that multispectral ALS data are unlike typical multispectral optical images, and they have a major potential for land cover/use classification. An overall accuracy of classification over 90% was achieved. The fusion of multi-wavelength laser intensity images and elevation data, with the additional use of textural information derived from granulometric analysis of images, helped to improve the accuracy of classification significantly. The method of interpolation for the intensity raster was not very helpful, and using intensity rasters with both first and last return

  13. Combining multi-spectral proximal sensors and digital cameras for monitoring grazed tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, R. N.; Gobbett, D. L.; González, L. A.; Bishop-Hurley, G. J.; McGavin, S. L.

    2015-11-01

    Timely and accurate monitoring of pasture biomass and ground-cover is necessary in livestock production systems to ensure productive and sustainable management of forage for livestock. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since such sensors can return data in near real-time and have the potential to be deployed on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. However, there are unresolved challenges in developing calibrations to convert raw sensor data to quantitative biophysical values, such as pasture biomass or vegetation ground-cover, to allow meaningful interpretation of sensor data by livestock producers. We assessed the use of multiple proximal sensors for monitoring tropical pastures with a pilot deployment of sensors at two sites on Lansdown Research Station near Townsville, Australia. Each site was monitored by a Skye SKR four-band multi-spectral sensor (every 1 min), a digital camera (every 30 min) and a soil moisture sensor (every 1 min), each operated over 18 months. Raw data from each sensor were processed to calculate a number of multispectral vegetation indices. Visual observations of pasture characteristics, including above-ground standing biomass and ground cover, were made every 2 weeks. A methodology was developed to manage the sensor deployment and the quality control of the data collected. The data capture from the digital cameras was more reliable than that from the multi-spectral sensors, which had up to 63 % of data discarded after data cleaning and quality control. We found a strong relationship between sensor and pasture measurements during the wet season period of maximum pasture growth (January to April), especially when data from the multi-spectral sensors were combined with weather data. RatioNS34 (a simple band ratio between the near infrared (NIR) and lower shortwave infrared (SWIR) bands) and rainfall since 1
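
    RatioNS34 is described above only as a simple band ratio between the NIR and lower SWIR channels of the proximal sensor. The snippet below is a minimal sketch of how such a ratio (and a normalized-difference companion) might be computed from logged band readings; the band values and names are hypothetical placeholders, not data from the study.

        import numpy as np

        # Hypothetical 1-minute readings from a four-band proximal sensor
        # (values are placeholders, not the Skye instrument's actual output).
        nir = np.array([0.42, 0.44, 0.40, 0.47])    # near-infrared reflectance
        swir = np.array([0.21, 0.22, 0.20, 0.23])   # lower shortwave-infrared reflectance

        # Simple band ratio (the form the abstract calls RatioNS34).
        ratio_ns = nir / swir

        # A normalized-difference variant, often more robust to illumination changes.
        nd_ns = (nir - swir) / (nir + swir)

        print("ratio:", ratio_ns.round(3))
        print("normalized difference:", nd_ns.round(3))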

  14. Compact multispectral continuous zoom camera for color and SWIR vision with integrated laser range finder

    NASA Astrophysics Data System (ADS)

    Hübner, M.; Gerken, M.; Achtner, Bertram; Kraus, M.; Münzberg, M.

    2014-06-01

    In an electro-optical sensor suite for long-range surveillance tasks, the optics for the visible (450 nm - 700 nm) and the SWIR (900 nm - 1700 nm) spectral ranges are combined with the receiver optics of an integrated laser range finder (LRF). The incoming signal from the observed scene and the returned laser pulse are collected within the common entrance aperture of the optics. The common front part of the optics is a broadband-corrected lens design covering the 450 - 1700 nm wavelength range. The visible spectrum is split off by a dichroic beam splitter and focused on an HDTV CMOS camera. The returned laser pulse is spatially separated from the scene signal by a special prism and focused on the receiver diode of the integrated LRF. The achromatic lens design has a zoom factor of 14 and F/2.6 in the visible path. In the SWIR path the F-number is adapted to the corresponding chip dimensions. The alignment of the LRF with respect to the SWIR camera line of sight can be controlled by adjustable integrated wedges. The two images in the visible and the SWIR spectral range match in focus and field of view (FOV) over the full zoom range between 2° and 22° HFOV. The SWIR camera has a resolution of 640×512 pixels. The HDTV camera provides a resolution of 1920×1080. The design and performance parameters of the multispectral sensor suite are discussed.

  15. The color measurement system for spot color printing based on a multispectral camera

    NASA Astrophysics Data System (ADS)

    Liu, Nanbo; Jin, Weiqi; Huang, Qinmei; Song, Li

    2014-11-01

    Color measurement and control in printing has long been an important issue in computer vision technology. In the past, densitometers and spectrophotometers have been used to measure the color of printed products. For the color management of a four-color press, such meters can measure color data from a color bar printed at the edge of the sheet, which is then used for ink-key presetting; this approach is widely applied in the printing industry. However, it cannot be used to measure the color of spot-color printing or of the printed pattern directly. With the development of multispectral image acquisition, it has become possible to measure the color of any area of the printed pattern with a CCD camera that acquires the whole image of the sheet at high resolution. This paper presents a method to measure print color with a multispectral camera during the printing process. A 12-channel spectral camera with high-intensity white LED illumination, driven by a motor, scans the printed sheet. The resulting image contains color and print-quality information for each pixel; LAB and CMYK values of each pixel can be obtained by reconstructing the reflectance spectra of the printed image. With this data processing, the color of spot-color printing can be measured and controlled. Spot tests in a printing plant show that this approach yields not only color-bar density values but also the color values of regions of interest (ROI). These values can be used for ink-key presetting, making it possible to control spot color automatically with high precision.
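
    The abstract does not state how the reflectance spectra are reconstructed from the 12 camera channels. A common baseline for this step is a linear least-squares (pseudo-inverse) estimator trained on patches with known spectra; the sketch below assumes that approach and uses synthetic data, so only the matrix shapes, not the numbers, are meaningful.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic training set: 200 patches, reflectance sampled at 31 wavelengths
        # (e.g. 400-700 nm in 10 nm steps), imaged by a 12-channel camera.
        n_train, n_wl, n_ch = 200, 31, 12
        train_spectra = rng.uniform(0.05, 0.9, size=(n_train, n_wl))
        sensitivities = rng.uniform(0.0, 1.0, size=(n_wl, n_ch))   # camera spectral sensitivities
        train_responses = train_spectra @ sensitivities             # 12-channel camera signals

        # Least-squares mapping from camera responses back to reflectance spectra.
        W, *_ = np.linalg.lstsq(train_responses, train_spectra, rcond=None)

        # Reconstruct the spectrum of a new patch from its 12-channel response.
        new_spectrum = rng.uniform(0.05, 0.9, size=n_wl)
        estimate = (new_spectrum @ sensitivities) @ W
        print("max abs error:", np.abs(estimate - new_spectrum).max())

    For random synthetic spectra the residual is large; real reflectance spectra are smooth, which is what makes recovery from only 12 channels workable in practice.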

  16. Validation of a 2D multispectral camera: application to dermatology/cosmetology on a population covering five skin phototypes

    NASA Astrophysics Data System (ADS)

    Jolivot, Romuald; Nugroho, Hermawan; Vabres, Pierre; Ahmad Fadzil, M. H.; Marzani, Franck

    2011-07-01

    This paper presents the validation of a new multispectral camera developed specifically for dermatological applications, based on healthy participants from five different Skin PhotoTypes (SPT). The multispectral system provides images of the skin reflectance at different spectral bands, coupled with a neural-network-based algorithm that reconstructs a hyperspectral cube of cutaneous data from a multispectral image. The flexibility of the neural-network-based algorithm allows reconstruction over different wavelength ranges. The hyperspectral cube provides both high spectral and high spatial information. The study population involves 150 healthy participants. The participants are classified by skin phototype according to the Fitzpatrick scale, and the population covers five of the six types. Acquisition for each participant is performed at three body locations - two skin areas exposed to the sun (hand, face) and one area not exposed to the sun (lower back) - and each is reconstructed at three different wavelength ranges. The validation is performed by comparing data acquired with a commercial spectrophotometer against the reconstructed spectrum obtained by averaging the hyperspectral cube. The comparison is calculated between 430 and 740 nm due to the limit of the spectrophotometer used. The results reveal that the multispectral camera is able to reconstruct the hyperspectral cube with a goodness-of-fit coefficient greater than 0.997 for the average of all SPTs at each location. The study shows that the multispectral camera provides an accurate reconstruction of the hyperspectral cube, which can be used for analysis of the skin reflectance spectrum.
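
    The goodness-of-fit coefficient quoted above (>0.997) is, in the spectral-reconstruction literature, usually the normalized inner product between the measured and reconstructed spectra. A minimal sketch of that metric, assuming that definition and using made-up spectra, is:

        import numpy as np

        def gfc(measured, reconstructed):
            """Goodness-of-fit coefficient between two spectra (1.0 = identical shape)."""
            measured = np.asarray(measured, dtype=float)
            reconstructed = np.asarray(reconstructed, dtype=float)
            num = np.abs(np.dot(measured, reconstructed))
            den = np.linalg.norm(measured) * np.linalg.norm(reconstructed)
            return num / den

        # Illustrative spectra sampled between 430 and 740 nm (values are made up).
        wavelengths = np.arange(430, 750, 10)
        spectrophotometer = 0.3 + 0.2 * np.sin(wavelengths / 80.0)
        camera_estimate = spectrophotometer + np.random.default_rng(1).normal(0, 0.005, wavelengths.size)

        print(f"GFC = {gfc(spectrophotometer, camera_estimate):.4f}")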

  17. Simultaneous multispectral framing infrared camera using an embedded diffractive optical lenslet array

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele

    2011-06-01

    Recent advances in micro-optical element fabrication using gray-scale technology have opened up the opportunity to create simultaneous multi-spectral imaging with fine-structure diffractive lenses. This paper discusses an approach that uses diffractive optical lenses configured in an array (lenslet array) and placed in close proximity to the focal plane array, which enables a small, compact, simultaneous multispectral imaging camera [1]. The lenslet array is designed so that all lenslets have a common focal length, with each lenslet tuned for a different wavelength. The number of simultaneous spectral images is determined by the number of individually configured lenslets in the array. The number of spectral images can be increased by a factor of 2 when using the array with a dual-band focal plane array (MWIR/LWIR) by exploiting multiple diffraction orders. In addition, modulation of the focal length of the lenslet array with piezoelectric actuation will enable spectral bin fill-in, allowing additional spectral coverage while giving up simultaneity. Different lenslet array spectral imaging concept designs are presented in this paper along with a unique concept for prefiltering the radiation focused on the detector. This approach to spectral imaging has applications in the detection of chemical agents in both aerosolized form and as a liquid on a surface. It can also be applied to the detection of weaponized biological agents and of IEDs in various forms, from manufacturing to deployment and post-detection forensic analysis.

  18. Charon's Color: A View from the New Horizons Ralph/Multispectral Visible Imaging Camera

    NASA Astrophysics Data System (ADS)

    Olkin, C.; Howett, C.; Grundy, W. M.; Parker, A. H.; Ennico Smith, K.; Stern, S. A.; Binzel, R. P.; Cook, J. C.; Cruikshank, D. P.; Dalle Ore, C.; Earle, A. M.; Jennings, D. E.; Linscott, I.; Lunsford, A.; Parker, J. W.; Protopapa, S.; Reuter, D.; Singer, K. N.; Spencer, J. R.; Tsang, C.; Verbiscer, A.; Weaver, H. A., Jr.; Young, L. A.

    2015-12-01

    The Multispectral Visible Imaging Camera (MVIC; Reuter et al., 2008) is part of Ralph, an instrument on NASA's New Horizons spacecraft. MVIC is the color 'eyes' of New Horizons, observing objects using five bands from blue to infrared wavelengths. MVIC's images of Charon show it to be an intriguing place, a far cry from the grey heavily cratered world once postulated. Rather Charon is observed to have large surface areas free of craters, and a northern polar region that is much redder than its surroundings. This talk will describe these initial results in more detail, along with Charon's global geological color variations to put these results into their wider context. Finally possible surface coloration mechanisms due to global processes and/or seasonal cycles will be discussed.

  19. Land surface temperature retrieved from airborne multispectral scanner mid-infrared and thermal-infrared data.

    PubMed

    Qian, Yong-Gang; Wang, Ning; Ma, Ling-Ling; Liu, Yao-Kai; Wu, Hua; Tang, Bo-Hui; Tang, Ling-Li; Li, Chuan-Rong

    2016-01-25

    Land surface temperature (LST) is one of the key parameters in the physics of land surface processes at local and global scales. In this paper, an LST retrieval method is proposed for airborne multispectral scanner data using one mid-infrared (MIR) channel and one thermal infrared (TIR) channel, with the land surface emissivity given as a priori knowledge. To remove the influence of the direct solar radiance efficiently, a relationship was established between the direct solar radiance and the water vapor content, view zenith angle and solar zenith angle. LST could then be retrieved from the MIR/TIR data with a split-window algorithm. Finally, the proposed algorithm was applied to actual airborne flight data and validated with in situ measurements of land surface types at the Baotou site in China on 17 October 2014. The results demonstrate that the difference between the retrieved and in situ LST was less than 1.5 K. The bias, RMSE and standard deviation of the retrieved LST were 0.156 K, 0.883 K and 0.869 K, respectively, for the samples. PMID:26832579
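
    The bias, RMSE and standard deviation quoted for the retrieved LST are straightforward to reproduce for any validation set. The sketch below computes them for hypothetical retrieved vs. in situ temperature pairs; the split-window coefficients themselves are site- and sensor-specific and are not given in the abstract, so they are not reproduced here.

        import numpy as np

        # Hypothetical retrieved and in situ LST values in kelvin (not the Baotou data).
        retrieved = np.array([293.1, 294.8, 296.2, 291.7, 295.4])
        in_situ   = np.array([292.9, 294.0, 295.1, 291.5, 294.6])

        diff = retrieved - in_situ
        bias = diff.mean()                       # mean difference
        rmse = np.sqrt((diff ** 2).mean())       # root-mean-square error
        std  = diff.std(ddof=1)                  # sample standard deviation of the differences

        print(f"bias = {bias:.3f} K, RMSE = {rmse:.3f} K, std = {std:.3f} K")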

  20. Airborne Multispectral and Thermal Remote Sensing for Detecting the Onset of Crop Stress Caused by Multiple Factors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing technology has been developed and applied to provide spatiotemporal information on crop stress for precision management. A series of multispectral images over a field planted cotton, corn and soybean were obtained by a Geospatial Systems MS4100 camera mounted on an Air Tractor 402B ai...

  1. Multispectral thermal airborne TASI-600 data to study the Pompeii (IT) archaeological area

    NASA Astrophysics Data System (ADS)

    Palombo, Angelo; Pascucci, Simone; Pergola, Nicola; Pignatti, Stefano; Santini, Federico; Soldovieri, Francesco

    2016-04-01

    The management of archaeological areas involves the conservation of ruins and buildings and the eventual prospection of new areas with archaeological potential. In this framework, airborne remote sensing is a well-developed geophysical tool for supporting archaeological surveys of wide areas. The spectral regions applied in archaeological remote sensing span from the VNIR to the TIR. In particular, archaeological thermal imaging exploits the fact that materials absorb, emit, transmit and reflect thermal infrared radiation at different rates according to their composition, density and moisture content. Despite its potential, applications of thermal imaging in archaeology are scarce. Among them, noteworthy are those related to the use of Landsat and ASTER [1] and airborne remote sensing [2, 3, 4 and 5]. In view of this potential for Cultural Heritage applications, the present study analyses the usefulness of high spatial resolution thermal imaging over the Pompeii archaeological park. To this purpose, TASI-600 [6] airborne multispectral thermal imagery (32 channels from 8 to 11.5 μm, with a spectral resolution of 100 nm and a spatial resolution of 1 m/pixel) was acquired on 7 December 2015. The airborne survey was acquired to obtain useful information on the characteristics of the building materials (both ancient and used for consolidation) and, whenever possible, to retrieve quick indicators of their conservation status. The thermal images will, moreover, be processed to gain insight into the critical environmental issues impacting the structures (e.g. moisture). The proposed study shows the preliminary results of the airborne deployments, the pre-processing of the multispectral thermal imagery and the retrieval of accurate land surface temperatures (LST). The LST map will be analysed to describe the thermal pattern of the city of Pompeii and detect any thermal anomalies. The ongoing TASI-600 pre-processing includes: (a) radiometric

  2. Development of low-cost high-performance multispectral camera system at Banpil

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.

    2014-05-01

    Banpil Photonics (Banpil) has developed a low-cost, high-performance multispectral camera system for visible to short-wave infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512-pixel InGaAs uncooled camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity needing less than 100 electrons, high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and low power consumption below 1 W. These are practically all of the features highly desirable in military imaging applications to expand deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor with an innovation integrating advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g. the focal plane array (FPA) and read-out integrated circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high-performance imaging system and their forecast cost structure is presented.

  3. Application of combined Landsat thematic mapper and airborne thermal infrared multispectral scanner data to lithologic mapping in Nevada

    USGS Publications Warehouse

    Podwysocki, M.H.; Ehmann, W.J.; Brickey, D.W.

    1987-01-01

    Future Landsat satellites are to include the Thematic Mapper (TM) and also may incorporate additional multispectral scanners. One such scanner being considered for geologic and other applications is a four-channel thermal-infrared multispectral scanner having 60-m spatial resolution. This paper discusses the results of studies using combined Landsat TM and airborne Thermal Infrared Multispectral Scanner (TIMS) digital data for lithologic discrimination, identification, and geologic mapping in two areas within the Basin and Range province of Nevada. Field and laboratory reflectance spectra in the visible and reflective-infrared and laboratory spectra in the thermal-infrared parts of the spectrum were used to verify distinctions made between rock types in the image data sets.

  4. On-Orbit Calibration of a Multi-Spectral Satellite Sensor Using a High Altitude Airborne Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Green, R. O.; Shimada, M.

    1996-01-01

    Earth-looking satellites must be calibrated in order to quantitatively measure and monitor components of land, water and atmosphere of the Earth system. The inevitable change in performance due to the stress of satellite launch requires that the calibration of a satellite sensor be established and validated on-orbit. A new approach to on-orbit satellite sensor calibration has been developed using the flight of a high altitude calibrated airborne imaging spectrometer below a multi-spectral satellite sensor.

  5. Forest Stand Segmentation Using Airborne LIDAR Data and Very High Resolution Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet, Valérie; Hervieu, Alexandre

    2016-06-01

    Forest stands are the basic units for forest inventory and mapping. Stands are large forested areas (e.g., ≥ 2 ha) of homogeneous tree species composition. The accurate delineation of forest stands is usually performed by visual analysis of very high resolution (VHR) optical images by human operators. This work is highly time-consuming and should be automated for scalability purposes. In this paper, a method based on the fusion of airborne laser scanning (lidar) data and very high resolution multispectral imagery for automatic forest stand delineation and forest land-cover database updating is proposed. The multispectral images give access to the tree species, whereas the 3D lidar point clouds provide geometric information on the trees. Therefore, multi-modal features are computed, both at pixel and object levels. The objects are individual trees extracted from the lidar data. A supervised classification is performed at the object level on the computed features in order to coarsely discriminate the existing tree species in the area of interest. The analysis at tree level is particularly relevant since it significantly improves the tree species classification. A probability map is generated through the tree species classification and combined with the pixel-based feature map in an energy-minimization framework. The proposed energy is then minimized using a standard graph-cut method (namely QPBO with α-expansion) in order to produce a segmentation map with a controlled level of detail. Comparison with an existing forest land cover database shows that our method provides satisfactory results both in terms of stand labelling and delineation (matching ranges between 94% and 99%).

  6. Airborne multispectral and thermal remote sensing for detecting the onset of crop stress caused by multiple factors

    NASA Astrophysics Data System (ADS)

    Huang, Yanbo; Thomson, Steven J.

    2010-10-01

    Remote sensing technology has been developed and applied to provide spatiotemporal information on crop stress for precision management. A series of multispectral images over a field planted with cotton, corn and soybean was obtained by a Geospatial Systems MS4100 camera mounted on an Air Tractor 402B airplane, equipped with Camera Link in a Magma converter box and triggered by Terraverde Dragonfly® flight navigation and imaging control software. The field crops were intentionally stressed by applying glyphosate herbicide via aircraft and allowing it to drift near-field. Aerial multispectral images in the visible and near-infrared bands were processed to produce vegetation indices, which were used to quantify the onset of herbicide-induced crop stress. The normalized difference vegetation index (NDVI) and soil-adjusted vegetation index (SAVI) showed the ability to monitor crop response to herbicide-induced injury by revealing stress at different phenological stages. Two other fields were managed with irrigated versus non-irrigated treatments, and those fields were imaged with both the multispectral system and an Electrophysics PV-320T thermal imaging camera on board an Air Tractor 402B aircraft. Thermal imagery indicated water stress due to deficits in soil moisture, and a proposed method of determining crop cover percentage using thermal imagery was compared with a multispectral imaging method. Development of an image fusion scheme may be necessary to provide synergy and improve overall water stress detection ability.
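
    NDVI and SAVI are standard red/NIR indices; a minimal sketch of how they would be computed from co-registered red and near-infrared bands follows. The array values are placeholders, and the SAVI soil-adjustment factor L = 0.5 is the conventional default, not a value from the study.

        import numpy as np

        def ndvi(nir, red):
            nir, red = nir.astype(float), red.astype(float)
            return (nir - red) / (nir + red + 1e-9)

        def savi(nir, red, L=0.5):
            nir, red = nir.astype(float), red.astype(float)
            return (1.0 + L) * (nir - red) / (nir + red + L + 1e-9)

        # Tiny synthetic 2x3 reflectance "images".
        red = np.array([[0.08, 0.10, 0.30], [0.09, 0.25, 0.28]])
        nir = np.array([[0.45, 0.50, 0.32], [0.48, 0.30, 0.31]])

        print(np.round(ndvi(nir, red), 2))
        print(np.round(savi(nir, red), 2))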

  7. Mapping of hydrothermally altered rocks using airborne multispectral scanner data, Marysvale, Utah, mining district

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Jones, O.D.

    1983-01-01

    Multispectral data covering an area near Marysvale, Utah, collected with the airborne National Aeronautics and Space Administration (NASA) 24-channel Bendix multispectral scanner, were analyzed to detect areas of hydrothermally altered, potentially mineralized rocks. Spectral bands were selected for analysis that approximate those of the Landsat 4 Thematic Mapper and which are diagnostic of the presence of hydrothermally derived products. Hydrothermally altered rocks, particularly volcanic rocks affected by solutions rich in sulfuric acid, are commonly characterized by concentrations of argillic minerals such as alunite and kaolinite. These minerals are important for identifying hydrothermally altered rocks in multispectral images because they have intense absorption bands centered near a wavelength of 2.2 μm. Unaltered volcanic rocks commonly do not contain these minerals and hence do not have the absorption bands. A color-composite image was constructed using the following spectral band ratios: 1.6 μm/2.2 μm, 1.6 μm/0.48 μm, and 0.67 μm/1.0 μm. The particular bands were chosen to emphasize the spectral contrasts that exist for argillic versus non-argillic rocks, limonitic versus nonlimonitic rocks, and rocks versus vegetation, respectively. The color-ratio composite successfully distinguished most types of altered rocks from unaltered rocks. Some previously unrecognized areas of hydrothermal alteration were mapped. The altered rocks included those having high alunite and/or kaolinite content, siliceous rocks containing some kaolinite, and ash-fall tuffs containing zeolitic minerals. The color-ratio-composite image allowed further division of these rocks into limonitic and nonlimonitic phases. The image did not allow separation of highly siliceous or hematitically altered rocks containing no clays or alunite from unaltered rocks. A color-coded density slice image of the 1.6 μm/2.2 μm band ratio allowed further discrimination among the altered units. Areas
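
    The color-ratio composite described above assigns three band ratios to the red, green and blue display channels. A minimal sketch of that operation, assuming co-registered bands near 0.48, 0.67, 1.0, 1.6 and 2.2 μm and a simple 2-98% linear stretch for display (the band arrays below are synthetic, and the stretch parameters are assumptions):

        import numpy as np

        def stretch(band, lo_pct=2, hi_pct=98):
            """Linear percentile stretch of a ratio image to 0-255 for display."""
            lo, hi = np.percentile(band, [lo_pct, hi_pct])
            return np.clip((band - lo) / (hi - lo) * 255, 0, 255).astype(np.uint8)

        rng = np.random.default_rng(2)
        shape = (100, 100)
        b048, b067, b100 = (rng.uniform(0.05, 0.6, shape) for _ in range(3))
        b160, b220 = (rng.uniform(0.05, 0.6, shape) for _ in range(2))

        # Ratios named in the abstract: argillic, limonitic and vegetation contrasts.
        r = stretch(b160 / b220)   # 1.6/2.2 um  -> argillic (clay/alunite) rocks bright
        g = stretch(b160 / b048)   # 1.6/0.48 um -> limonitic rocks bright
        b = stretch(b067 / b100)   # 0.67/1.0 um -> vegetation dark

        composite = np.dstack([r, g, b])   # RGB color-ratio composite
        print(composite.shape, composite.dtype)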

  8. High Spatial Resolution Airborne Multispectral Thermal Infrared Remote Sensing Data for Analysis of Urban Landscape Characteristics

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Luvall, Jeffrey C.; Estes, Maurice G., Jr.; Arnold, James E. (Technical Monitor)

    2000-01-01

    We have used airborne multispectral thermal infrared (TIR) remote sensing data collected at a high spatial resolution (i.e., 10 m) over several cities in the United States to study thermal energy characteristics of the urban landscape. These TIR data provide a unique opportunity to quantify thermal responses from discrete surfaces typical of the urban landscape and to identify both the spatial arrangement and patterns of thermal processes across the city. The information obtained from these data is critical to understanding how urban surfaces drive or force development of the Urban Heat Island (UHI) effect, which exists as a dome of elevated air temperatures that presides over cities in contrast to surrounding non-urbanized areas. The UHI is most pronounced in the summertime, when urban surfaces such as rooftops and pavement store solar radiation throughout the day and release this stored energy slowly after sunset, creating air temperatures over the city that are 2-4 °C warmer than non-urban or rural air temperatures. The UHI can also exist as a daytime phenomenon, with surface temperatures in downtown areas of cities exceeding 38 °C. The implications of the UHI are significant, particularly as an additive source of thermal energy input that exacerbates the overall production of ground-level ozone over cities. We have used the Airborne Thermal and Land Applications Sensor (ATLAS), flown onboard a Lear 23 jet aircraft from the NASA Stennis Space Center, to acquire high spatial resolution multispectral TIR data (i.e., six bands between 8.2 and 12.2 μm) over Huntsville, Alabama; Atlanta, Georgia; Baton Rouge, Louisiana; Salt Lake City, Utah; and Sacramento, California. These TIR data have been used to produce maps and other products showing the spatial distribution of heating and cooling patterns over these cities to better understand how the morphology of the urban landscape affects development of the UHI. In turn, these data have been used

  9. Airborne Thermal Infrared Multispectral Scanner (TIMS) images over disseminated gold deposits, Osgood Mountains, Humboldt County, Nevada

    NASA Technical Reports Server (NTRS)

    Krohn, M. Dennis

    1986-01-01

    The U.S. Geological Survey (USGS) acquired airborne Thermal Infrared Multispectral Scanner (TIMS) images over several disseminated gold deposits in northern Nevada in 1983. The aerial surveys were flown to determine whether TIMS data could depict jasperoids (siliceous replacement bodies) associated with the gold deposits. The TIMS data were collected over the Pinson and Getchell Mines in the Osgood Mountains, the Carlin, Maggie Creek, Bootstrap, and other mines in the Tuscarora Mountains, and the Jerritt Canyon Mine in the Independence Mountains. The TIMS data seem to be a useful supplement to conventional geochemical exploration for disseminated gold deposits in the western United States. Siliceous outcrops are readily separable in the TIMS image from other types of host rocks. Different forms of silicification are not readily separable, yet, due to limitations of spatial resolution and spectral dynamic range. Features associated with the disseminated gold deposits, such as the large intrusive bodies and fault structures, are also resolvable on TIMS data. Inclusion of high-resolution thermal inertia data would be a useful supplement to the TIMS data.

  10. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

    The emergence of curved CCD detectors as individual devices, or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras, represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral, ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised around curved CCD applications, in conjunction with large-format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associated wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than

  12. New, Flexible Applications with the Multi-Spectral Titan Airborne Lidar

    NASA Astrophysics Data System (ADS)

    Swirski, A.; LaRocque, D. P.; Shaker, A.; Smith, B.

    2015-12-01

    Traditional lidar designs have been restricted to using a single laser channel operating at one particular wavelength. Single-channel systems excel at collecting high-precision spatial (XYZ) data, with accuracies down to a few centimeters. However, target classification is difficult with spatial data alone, and single-wavelength systems are limited to the strengths and weaknesses of the wavelength they use. To resolve these limitations in lidar design, Teledyne Optech developed the Titan, the world's first multispectral lidar system, which uses three independent laser channels operating at 532, 1064, and 1550 nm. Since Titan collects 12 bit intensity returns for each wavelength separately, users can compare how strongly targets in the survey area reflect each wavelength. Materials such as soil, rock and foliage all reflect the wavelengths differently, enabling post-processing algorithms to identify the material of targets easily and automatically. Based on field tests in Canada, automated classification algorithms have combined this with elevation data to classify targets into six basic types with 78% accuracy. Even greater accuracy is possible with further algorithm enhancement and the use of an in-sensor passive imager such as a thermal, multispectral, CIR or RGB camera. Titan therefore presents an important new tool for applications such as land-cover classification and environmental modeling while maintaining lidar's traditional strengths: high 3D accuracy and day/night operation. Multispectral channels also enable a single lidar to handle both topographic and bathymetric surveying efficiently, which previously required separate specialized lidar systems operating at different wavelengths. On land, Titan can survey efficiently from 2000 m AGL with a 900 kHz PRF (300 kHz per channel), or up to 2500 m if only the infrared 1064 and 1550 nm channels are used. Over water, the 532 nm green channel penetrates water to collect seafloor returns while the infrared

  13. Web camera as low cost multispectral sensor for quantification of chlorophyll in soybean leaves

    NASA Astrophysics Data System (ADS)

    Adhiwibawa, Marcelinus A.; Setiawan, Yonathan E.; Prilianti, Kestrilia R.; Brotosudarmo, Tatas H. P.

    2015-01-01

    Soybean is one of the main crops in Indonesia, but the demand for soybeans has not been matched by an increase in national production. One limiting factor is the availability of fertile cultivation area for soybean plantations. Indonesian farmers usually grow soybeans in marginal cultivation areas, which requires soybean varieties that are tolerant of environmental stresses such as drought, nutrient limitation, pests and diseases. The chlorophyll content of the leaf is one plant-health indicator that can be used to identify stress-tolerant soybean varieties. However, soybean breeding research is hampered by manual data acquisition, which is time-consuming and labour-intensive. In this paper the authors propose an automatic system for quantifying soybean leaf area and chlorophyll, based on a low-cost multispectral sensor built from a web camera, as an indicator of soybean tolerance to environmental stress, particularly drought stress. The system acquires an image of the plant, placed in an acquisition box, from above. The image is segmented using the NDVI (Normalized Difference Vegetation Index) and quantified to yield an average NDVI value and the leaf area. The acquired NDVI values show a strong relationship with SPAD readings, with an r-squared of 0.70, while the leaf-area prediction has an error of 18.41%. Thus the automated system can quantify plant data with good results.
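
    The reported r-squared of 0.70 between image NDVI and SPAD readings corresponds to an ordinary least-squares fit between the two measurements. A minimal sketch of that comparison is shown below; the paired values are made up for illustration and are not data from the paper.

        import numpy as np

        # Hypothetical paired measurements: image-derived mean NDVI vs. SPAD chlorophyll readings.
        ndvi = np.array([0.42, 0.48, 0.55, 0.60, 0.63, 0.70, 0.74, 0.80])
        spad = np.array([21.0, 24.5, 28.0, 30.5, 33.0, 36.5, 37.0, 42.0])

        # Ordinary least-squares linear fit: SPAD = a * NDVI + b.
        a, b = np.polyfit(ndvi, spad, deg=1)
        predicted = a * ndvi + b

        # Coefficient of determination (r-squared).
        ss_res = np.sum((spad - predicted) ** 2)
        ss_tot = np.sum((spad - spad.mean()) ** 2)
        r2 = 1.0 - ss_res / ss_tot

        print(f"SPAD ≈ {a:.1f} * NDVI + {b:.1f},  r^2 = {r2:.2f}")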

  14. Pluto's Global Color Variability as Seen by the New Horizons Multispectral Visible Imaging Camera

    NASA Astrophysics Data System (ADS)

    Binzel, R. P.; Stern, A.; Weaver, H. A., Jr.; Young, L. A.; Olkin, C.; Grundy, W. M.; Earle, A. M.

    2015-12-01

    While variability in Pluto's albedo, color, and methane distribution had been previously discerned from ground-based and Hubble Space Telescope observations [e.g. 1,2], the sharp juxtaposition of contrasting units forms one of the greatest surprises returned (to date) from the New Horizons mission. Here we present a global analysis of the color distribution of Pluto's surface factoring in both seasonal and large scale geologic processes. We will also explore the possible role of long-term (million year) precession cycles [3] in shaping the surface morphology and the distribution of volatiles. We utilize data returned by the New Horizons Multispectral Visible Imaging Camera (MVIC) operating as part of the Ralph instrument [4]. MVIC captures images over five wavelength bands from blue to the near-infrared, including a broad panchromatic band and a narrow band centered on the 0.89-micron methane absorption feature. References: [1] Young, E. F., Binzel, R. P., Crane, K. 2001; Astron. J. 121, 552-561. [2] Grundy, W.M., Olkin, C.B., Young, L.A., Buie, M. W., Young, E. F. 2013; Icarus 223, 710-721. [3] Earle, A. M., Binzel, R. P. 2015; Icarus 250, 405-412. [4] Reuter, D.C., Stern, S.A., Scherrer, J., et al. 2008; Space Science Reviews, 140, 129-154.

  15. Performance analysis of a multispectral framing camera for detecting mines in the littoral zone and beach zone

    NASA Astrophysics Data System (ADS)

    Louchard, Eric; Farm, Brian; Acker, Andrew

    2008-04-01

    BAE Systems Sensor Systems Identification & Surveillance (IS) has developed, under contract with the Office of Naval Research, a multispectral airborne sensor system and processing algorithms capable of detecting mine-like objects in the surf zone and land mines in the beach zone. BAE Systems has used this system in a blind test at a test range established by the Naval Surface Warfare Center - Panama City Division (NSWC-PCD) at Eglin Air Force Base. The airborne and ground subsystems used in this test are described, with graphical illustrations of the detection algorithms. We report on the performance of the system configured to operate with a human operator analyzing data on a ground station. A subsurface (underwater bottom proud mine in the surf zone and moored mine in shallow water) mine detection capability is demonstrated in the surf zone. Surface float detection and proud land mine detection capability is also demonstrated. Our analysis shows that this BAE Systems-developed multispectral airborne sensor provides a robust technical foundation for a viable system for mine counter-measures, and would be a valuable asset for use prior to an amphibious assault.

  16. Biooptical variability in the Greenland Sea observed with the Multispectral Airborne Radiometer System (MARS)

    NASA Technical Reports Server (NTRS)

    Mueller, James L.; Trees, Charles C.

    1989-01-01

    A site-specific ocean color remote sensing algorithm was developed and used to convert Multispectral Airborne Radiometer System (MARS) spectral radiance measurements to chlorophyll-a concentration profiles along aircraft tracklines in the Greenland Sea. The analysis is described and the results are given in graphical or tabular form. Section 2 describes the salient characteristics and development history of the MARS instrument. Section 3 describes the analyses of MARS flight segments over consolidated sea ice, resulting in a set of altitude-dependent ratios used (over water) to estimate the radiance reflected by the surface and atmosphere from the total radiance measured. Section 4 presents optically weighted pigment concentrations calculated from profile data, and spectral reflectances measured in situ in the top meter of the water column; these data were analyzed to develop an algorithm relating chlorophyll-a concentration to the ratio of radiance reflectances at 441 and 550 nm (with a selection of coefficients dependent upon whether a significant Gelbstoff presence is implied by a low ratio of reflectances at 410 and 550 nm). Section 5 describes the scaling adjustments which were derived to reconcile the MARS upwelled radiance ratios at 410:550 nm and 441:550 nm with the in situ reflectance ratios measured simultaneously at the surface. Section 6 graphically presents the locations of the MARS data tracklines and the positions of the surface monitoring R/V. Section 7 presents stick-plots of MARS tracklines selected to illustrate two-dimensional spatial variability within the box covered by each day's flight. Section 8 presents curves of chlorophyll-a concentration profiles derived from MARS data along survey tracklines. Significant results are summarized in Section 1.
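
    The abstract describes the structure of the algorithm but not its coefficients. A common empirical form for such ratio algorithms is a log-log regression on the blue-green reflectance ratio; the sketch below assumes that form only, with placeholder coefficients and a placeholder 410:550 switch point, since the paper's values are not given in the abstract.

        import numpy as np

        # Placeholder coefficient sets for log10(chl) = a + b * log10(R441/R550).
        COEFFS_CLEAR = (0.30, -2.50)      # assumed: low-Gelbstoff waters
        COEFFS_GELBSTOFF = (0.10, -1.80)  # assumed: Gelbstoff-dominated waters
        GELBSTOFF_SWITCH = 0.85           # assumed threshold on R410/R550

        def chlorophyll_a(r410, r441, r550):
            """Estimate chlorophyll-a (mg m^-3) from reflectance ratios (illustrative coefficients)."""
            a, b = COEFFS_GELBSTOFF if (r410 / r550) < GELBSTOFF_SWITCH else COEFFS_CLEAR
            return 10.0 ** (a + b * np.log10(r441 / r550))

        print(chlorophyll_a(r410=0.006, r441=0.012, r550=0.009))   # low 410:550 -> Gelbstoff branch
        print(chlorophyll_a(r410=0.015, r441=0.014, r550=0.009))   # high 410:550 -> clear-water branch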

  17. A new method of building footprints detection using airborne laser scanning data and multispectral image

    NASA Astrophysics Data System (ADS)

    Luo, Yiping; Jiang, Ting; Gao, Shengli; Wang, Xin

    2010-10-01

    This paper presents a new approach for detecting building footprints from a combination of a registered aerial image with multispectral bands and airborne laser scanning data, obtained synchronously by a Leica Geosystems ALS40 and an Applanix DACS-301 on the same platform. A two-step method for building detection is presented, consisting of selecting 'building' candidate points and then classifying the candidate points. A digital surface model (DSM) derived from last-pulse laser scanning data was first filtered and the laser points were classified into the classes 'ground' and 'building or tree' based on a mathematical morphological filter. Then, the 'ground' points were resampled into a digital elevation model (DEM), and a normalized DSM (nDSM) was generated from the DEM and DSM. The candidate points were selected from the 'building or tree' points by height value and area thresholds in the nDSM. The candidate points were further classified into building points and tree points using the support vector machine (SVM) classification method. Two classification tests were carried out, using features only from laser scanning data and using associated features from the two input data sources. The features included height, height finite difference, RGB band values, and so on. The RGB values of points were acquired by matching the laser scanning data and the image using the collinearity equations. The features of the training points were used as input data for the SVM classification method, and cross-validation was used to select the best classification parameters. The decision function was constructed from the classification parameters and the class of each candidate point was determined by this function. The results showed that associated features from the two input data sources were superior to features only from laser scanning data. An accuracy of more than 90% was achieved for buildings with the first feature set.
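
    A minimal sketch of the first step described above: derive a normalized DSM as the difference between the DSM and the interpolated DEM, and keep 'building or tree' candidates above a height threshold. The 2.5 m threshold and the synthetic grids are assumptions for illustration; the morphological filtering, area threshold and SVM stage of the paper are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(4)

        # Synthetic 50x50 grids at 1 m resolution (placeholders for the real DEM/DSM rasters).
        dem = 100.0 + rng.normal(0.0, 0.2, (50, 50))     # bare-earth model from 'ground' points
        dsm = dem + rng.uniform(0.0, 12.0, (50, 50))     # surface model from last-pulse returns

        ndsm = dsm - dem                                 # normalized DSM: object heights above ground

        height_threshold = 2.5                           # assumed minimum height for buildings/trees
        candidate_mask = ndsm > height_threshold

        print(f"{candidate_mask.sum()} candidate cells out of {candidate_mask.size}")

    In the paper, such candidates are then described by height, height differences and RGB values and separated into buildings and trees with an SVM; that classification stage is omitted from the sketch.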

  18. Monitoring Ephemeral Streams Using Airborne Very High Resolution Multispectral Remote Sensing in Arid Environments

    NASA Astrophysics Data System (ADS)

    Hamada, Y.; O'Connor, B. L.

    2012-12-01

    Development in arid environments often results in the loss and degradation of the ephemeral streams that provide habitat and critical ecosystem functions such as water delivery, sediment transport, and groundwater recharge. Quantification of these ecosystem functions is challenging because of the episodic nature of runoff events in desert landscapes and the large spatial scale of watersheds that potentially can be impacted by large-scale development. Low-impact development guidelines and regulatory protection of ephemeral streams are often lacking due to the difficulty of accurately mapping and quantifying the critical functions of ephemeral streams at scales larger than individual reaches. Renewable energy development in arid regions has the potential to disturb ephemeral streams at the watershed scale, and it is necessary to develop environmental monitoring applications for ephemeral streams to help inform land management and regulatory actions aimed at protecting and mitigating for impacts related to large-scale land disturbances. This study focuses on developing remote sensing methodologies to identify and monitor impacts on ephemeral streams resulting from the land disturbance associated with utility-scale solar energy development in the desert southwest of the United States. Airborne very high resolution (VHR) multispectral imagery is used to produce stereoscopic, three-dimensional landscape models that can be used to (1) identify and map ephemeral stream channel networks, and (2) support analyses and models of hydrologic and sediment transport processes that pertain to the critical functionality of ephemeral streams. Spectral and statistical analyses are being developed to extract information about ephemeral channel location and extent, micro-topography, riparian vegetation, and soil moisture characteristics. This presentation will demonstrate initial results and provide a framework for future work associated with this project, for developing the necessary

  19. Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera

    NASA Astrophysics Data System (ADS)

    Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

    1999-10-01

    We present a multispectral photometric study of the Moon between solar phase angles of 0° and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes of the lunar opposition surge, and (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard, which further appear significantly higher than those seen in ground-based observations. By comparing Clementine observations with prior ground-based observations of 15 sites on the Moon we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0° and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge which includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ~8°. However, for the brightest regions (the highlands at 0.75-1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ~1/6 to 1/4 that of the shadow-hiding surge is

  1. Effectiveness of airborne multispectral thermal data for karst groundwater resources recognition in coastal areas

    NASA Astrophysics Data System (ADS)

    Pignatti, Stefano; Fusilli, Lorenzo; Palombo, Angelo; Santini, Federico; Pascucci, Simone

    2013-04-01

    Currently the detection, use and management of groundwater in karst regions can be considered one of the most significant approaches to solving water scarcity problems during periods of low rainfall, because groundwater resources from karst aquifers play a key role in the water supply of karst areas worldwide [1]. In many countries of the Mediterranean area, where karst is widespread, groundwater resources are still underexploited, while surface waters are generally preferred [2]. Furthermore, carbonate aquifers constitute a crucial thermal water resource outside of volcanic areas, even if there is no detailed and reliable global assessment of thermal water resources. The composite hydrogeological characteristics of karst, particularly the directions and zones of groundwater distribution, have not yet been adequately explained [3]. In view of the above, the present study analyses the capability of high spatial resolution thermal remote sensing to detect karst water resources in coastal areas, in order to obtain useful information on karst spring flow and on different characteristics of these environments. To this purpose, MIVIS [4, 5] and TASI-600 [6] airborne multispectral thermal imagery (see sensors' characteristics in Table 1), acquired over two coastal areas of the Mediterranean affected by karst activity, one located in Montenegro and one in Italy, were used. One study area is located in the Kotor Bay, a winding bay on the Adriatic Sea surrounded by high mountains in south-western Montenegro and characterized by many subaerial and submarine coastal springs related to deep karstic channels. The other study area is located in Santa Cesarea (Italy), encompassing coastal cold springs, the main local source of high-quality water, and also a noticeable thermal groundwater outflow. The proposed study shows the preliminary results of the two airborne deployments over these areas. The preprocessing of the multispectral thermal imagery

  2. Airborne Multispectral LIDAR Data for Land-Cover Classification and Land/water Mapping Using Different Spectral Indexes

    NASA Astrophysics Data System (ADS)

    Morsy, S.; Shaker, A.; El-Rabbany, A.; LaRocque, P. E.

    2016-06-01

    Airborne Light Detection And Ranging (LiDAR) data are widely used in remote sensing applications, such as topographic and land/water mapping. Recently, airborne multispectral LiDAR sensors, which acquire data at different wavelengths, have become available, thus allowing a diversity of intensity values to be recorded from different land features. In this study, three normalized difference feature indexes (NDFIs) for vegetation, water, and built-up area mapping were evaluated. The NDFIs, namely NDFI_G-NIR, NDFI_G-MIR, and NDFI_NIR-MIR, were calculated using data collected at three wavelengths (green: 532 nm, near-infrared (NIR): 1064 nm, and mid-infrared (MIR): 1550 nm) by the world's first airborne multispectral LiDAR sensor, the Optech Titan. The Jenks natural breaks optimization method was used to determine the threshold value for each NDFI, in order to cluster the 3D point data into two classes (water and land, or vegetation and built-up area). Two sites at Scarborough, Ontario, Canada were used to evaluate the performance of the NDFIs for land/water, vegetation, and built-up area mapping. The use of the three NDFIs succeeded in discriminating vegetation from built-up areas with an overall accuracy of 92.51%. Based on the classification results, it is suggested to use NDFI_G-MIR and NDFI_NIR-MIR for vegetation and built-up area extraction, respectively. The clustering results show that the direct use of the NDFIs for land/water mapping has low performance. Therefore, the clustered classes based on the NDFIs were constrained by the recorded number of returns from the different wavelengths, improving the overall accuracy to 96.98%.
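    The short sketch below (Python; names, data and the brute-force break search are mine, not from the paper) illustrates the two building blocks described above: a normalized difference feature index computed from two intensity channels, and a two-class Jenks-style natural break used as the clustering threshold.

```python
# Minimal sketch (not the authors' code): per-point spectral index from two
# intensity channels and a two-class Jenks-style natural break.
import numpy as np

def ndfi(channel_a, channel_b):
    """Normalized difference feature index, e.g. green vs. NIR intensities."""
    a = np.asarray(channel_a, dtype=float)
    b = np.asarray(channel_b, dtype=float)
    return (a - b) / np.clip(a + b, 1e-9, None)

def two_class_break(values, n_candidates=256):
    """Brute-force break that minimizes within-class variance (2-class Jenks)."""
    v = np.sort(np.asarray(values, dtype=float))
    candidates = np.linspace(v[0], v[-1], n_candidates)[1:-1]
    best_break, best_cost = candidates[0], np.inf
    for c in candidates:
        lo, hi = v[v <= c], v[v > c]
        if lo.size == 0 or hi.size == 0:
            continue
        cost = lo.var() * lo.size + hi.var() * hi.size
        if cost < best_cost:
            best_break, best_cost = c, cost
    return best_break

# Example: separate two clusters of simulated index values.
rng = np.random.default_rng(0)
index = np.concatenate([rng.normal(-0.4, 0.1, 500), rng.normal(0.5, 0.1, 500)])
threshold = two_class_break(index)
labels = index > threshold   # e.g. land vs. water, per point
```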

  3. A simple method for vignette correction of airborne digital camera data

    SciTech Connect

    Nguyen, A.T.; Stow, D.A.; Hope, A.S.

    1996-11-01

    Airborne digital camera systems have gained popularity in recent years due to their flexibility, high geometric fidelity and spatial resolution, and fast data turn-around time. However, a common problem that plagues these framing systems is vignetting, which causes a falloff in image brightness away from the principal (nadir) point. This paper presents a simple method for vignetting correction that utilizes laboratory images of a uniform illumination source. Multiple lab images are averaged and inverted to create digital correction templates, which are then applied to actual airborne data. The vignette correction was effective in removing the systematic falloff in spectral values. We have shown that vignette correction is a necessary part of the preprocessing of raw digital airborne remote sensing data. The consequences of not correcting for these effects are demonstrated in the context of monitoring of salt marsh habitat. 4 refs.
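    A minimal sketch of the described correction idea, with all variable names assumed: average several laboratory frames of a uniform source, invert the mean into a multiplicative gain map, and apply it to airborne frames.

```python
# Minimal sketch (assumptions mine): build a vignetting template from repeated
# lab frames of a uniform source, then flatten airborne frames with it.
import numpy as np

def vignette_template(lab_frames):
    """Average lab images of a uniform target and invert into a gain map."""
    mean_frame = np.mean(np.asarray(lab_frames, dtype=float), axis=0)
    # Normalize so the brightest (least vignetted) region has gain 1.0.
    return mean_frame.max() / np.clip(mean_frame, 1e-6, None)

def correct_vignetting(image, gain_map):
    """Apply the multiplicative correction to a raw airborne frame."""
    return np.asarray(image, dtype=float) * gain_map

# Example with synthetic 100x100 frames.
rng = np.random.default_rng(1)
lab = [rng.normal(200, 2, (100, 100)) for _ in range(10)]
template = vignette_template(lab)
corrected = correct_vignetting(lab[0], template)
```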

  4. Airborne particle monitoring with urban closed-circuit television camera networks and a chromatic technique

    NASA Astrophysics Data System (ADS)

    Kolupula, Y. R.; Aceves-Fernandez, M. A.; Jones, G. R.; Deakin, A. G.; Spencer, J. W.

    2010-11-01

    An economic approach for the preliminary assessment of 2-10 µm sized (PM10) airborne particle levels in urban areas is described. It uses existing urban closed-circuit television (CCTV) surveillance camera networks in combination with particle accumulating units and chromatic quantification of polychromatic light scattered by the captured particles. Methods for accommodating extraneous light effects are discussed and test results obtained from real urban sites are presented to illustrate the potential of the approach.

  5. GIS Meets Airborne MSS: Geospatial Applications of High-Resolution Multispectral Data

    SciTech Connect

    Albert Guber

    1999-07-27

    Bechtel Nevada operates and flies Daedalus multispectral scanners for funded project tasks at the Department of Energy's Remote Sensing Laboratory. Historically, processing and analysis of multispectral data have afforded scientists the opportunity to see natural phenomena not visible to the naked eye. However, only recently has a system, more specifically a Geometric Correction System, existed to automatically geo-reference these data directly into a Geographic Information System (GIS) database. Now, analyses performed previously in a nongeospatial environment are integrated directly into an Arc/Info GIS. This technology is of direct benefit to environmental and emergency response applications.

  6. Comparison of airborne multispectral and hyperspectral imagery for estimating grain sorghum yield

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Both multispectral and hyperspectral images are being used to monitor crop conditions and map yield variability, but limited research has been conducted to compare the differences between these two types of imagery for assessing crop growth and yields. The objective of this study was to compare airb...

  7. An algorithm for the estimation of water temperatures from thermal multispectral airborne remotely sensed data

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Quattrochi, D.; Baskin, R.

    1992-01-01

    A method for water temperature estimation on the basis of thermal data is presented and tested against NASA's Thermal IR Multispectral Scanner. Using realistic bounds on emissivities, temperature bounds are calculated and refined to estimate a tighter bound on the emissivity of the source. The method is useful only when a realistic set of bounds can be obtained for the emissivities of the data.
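    The abstract gives no formulas, so the sketch below only illustrates the general idea of mapping emissivity bounds to temperature bounds for a single thermal band, assuming a graybody measurement and neglecting atmospheric and reflected-sky terms; the constants are the standard Planck radiation constants, and the function names are mine.

```python
# Illustrative sketch only: how emissivity bounds map to temperature bounds for
# a single thermal band, assuming L_measured ~ emissivity * B(lambda, T) and
# neglecting atmospheric and reflected-sky terms (the paper's method is richer).
import numpy as np

C1 = 1.191042e-16   # 2*h*c^2  [W m^2 / sr]
C2 = 1.438777e-2    # h*c/k    [m K]

def planck_radiance(wavelength_m, temp_k):
    return C1 / (wavelength_m**5 * (np.exp(C2 / (wavelength_m * temp_k)) - 1.0))

def inverse_planck(wavelength_m, radiance):
    return C2 / (wavelength_m * np.log(1.0 + C1 / (wavelength_m**5 * radiance)))

def temperature_bounds(measured_radiance, wavelength_m, eps_min, eps_max):
    # Lower emissivity implies a hotter source for the same measured radiance.
    t_high = inverse_planck(wavelength_m, measured_radiance / eps_min)
    t_low = inverse_planck(wavelength_m, measured_radiance / eps_max)
    return t_low, t_high

# Example: 10.5 um band, water-like emissivity bounds.
wav = 10.5e-6
L = 0.97 * planck_radiance(wav, 290.0)      # simulate a 290 K surface
print(temperature_bounds(L, wav, 0.95, 0.99))
```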

  8. Development of an airborne remote sensing system for aerial applicators

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An airborne remote sensing system was developed and tested for recording aerial images of field crops, which were analyzed for variations of crop health or pest infestation. The multicomponent system consists of a multi-spectral camera system, a camera control system, and a radiometer for normalizi...

  9. A comparison between satellite and airborne multispectral data for the assessment of Mangrove areas in the eastern Caribbean

    SciTech Connect

    Green, E.P.; Edwards, A.J.; Mumby, P.J.

    1997-06-01

    Satellite (SPOT XS and Landsat TM) and airborne multispectral (CASI) imagery was acquired from the Turks and Caicos Islands, British West Indies. The descriptive resolution and accuracy of each image type is compared for two applications: mangrove habitat mapping and the measurement of mangrove canopy characteristics (leaf area index and canopy closure). Mangroves could be separated from non-mangrove vegetation to an accuracy of only 57% with SPOT XS data but better discrimination could be achieved with either Landsat TM or CASI (in both cases accuracy was >90%). CASI data permitted a more accurate classification of different mangrove habitats than was possible using Landsat TM. Nine mangrove habitats could be mapped to an accuracy of 85% with the high-resolution airborne data compared to 31% obtained with TM. A maximum of three mangrove habitats were separable with Landsat TM: the accuracy of this classification was 83%. Measurement of mangrove canopy characteristics is achieved more accurately with CASI than with either satellite sensor, but high costs probably make it a less cost-effective option. The cost-effectiveness of each sensor is discussed for each application.

  10. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
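    As a loosely related illustration (the normalization scheme below is assumed, not taken from the study), the sketch scales each band against a pseudo-invariant target before computing a normalized difference vegetation index.

```python
# Hedged sketch (normalization scheme assumed, not taken from the paper):
# scale each date's bands against a pseudo-invariant target, then compute NDVI.
import numpy as np

def normalize_to_reference(band, invariant_mask, reference_value):
    """Scale a band so the mean over an invariant target equals a fixed value."""
    gain = reference_value / np.mean(band[invariant_mask])
    return band * gain

def ndvi(nir, red):
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / np.clip(nir + red, 1e-9, None)

# Example with synthetic 50x50 bands and an invariant patch in one corner.
rng = np.random.default_rng(2)
red_t1, nir_t1 = rng.uniform(40, 80, (50, 50)), rng.uniform(90, 160, (50, 50))
mask = np.zeros((50, 50), dtype=bool); mask[:5, :5] = True
red_n = normalize_to_reference(red_t1, mask, 60.0)
nir_n = normalize_to_reference(nir_t1, mask, 120.0)
vi = ndvi(nir_n, red_n)
```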

  11. Optical design of multi-spectral optical system for infrared camera

    NASA Astrophysics Data System (ADS)

    Tang, Tianjin

    2015-08-01

    This paper studies multi-spectral imaging systems and describes the design of a dual-channel mirror-lens optical system with a wide field of view for a multi-spectral sensor. Combined with secondary imaging technology, it achieves one hundred percent cold stop efficiency. Off-axis three-mirror reflective optics is adopted to provide an unobstructed field of view and high spatial resolution over the wide field, which is shared by the two channels. Independent relay lenses are employed not only to extract the real exit pupil matched with the cold shield, but also to adjust the magnification factors for the infrared. A dichroic mirror and filters subdivide the wide spectral range into four bands, including a mid-wavelength band and a long-wavelength band, each corresponding to its respective field. The results show that the modulation transfer function of each band at its respective field is near the diffraction limit, which satisfies the needs of practical applications. The wavefront of the off-axis three-mirror reflective optics is also satisfactory, which is beneficial to later alignment and measurement.

  12. Imaging and radiometric performance simulation for a new high-performance dual-band airborne reconnaissance camera

    NASA Astrophysics Data System (ADS)

    Seong, Sehyun; Yu, Jinhee; Ryu, Dongok; Hong, Jinsuk; Yoon, Jee-Yeon; Kim, Sug-Whan; Lee, Jun-Ho; Shin, Myung-Jin

    2009-05-01

    In recent years, high performance visible and IR cameras have been used widely for tactical airborne reconnaissance. Improving the process of efficiently discriminating and analyzing complex target information from active battlefields requires simultaneous multi-band measurement from airborne platforms at various altitudes. We report a new dual-band airborne camera designed for simultaneous registration of both visible and IR imagery from mid-altitude ranges. The camera design uses a common front-end optical telescope of around 0.3 m entrance aperture and several relay optical sub-systems capable of delivering both high spatial resolution visible and IR images to the detectors. The design benefits from the use of several optical channels packaged in a compact space and the associated freedom to choose between wide (~3 degrees) and narrow (~1 degree) fields of view. In order to investigate both the imaging and radiometric performance of the camera, we generated an array of target scenes with optical properties such as reflection, refraction, scattering, transmission and emission. We then combined the target scenes and the camera optical system into an integrated ray tracing simulation environment utilizing the Monte Carlo computation technique. Taking realistic atmospheric radiative transfer characteristics into account, both imaging and radiometric performances were then investigated. The simulation results demonstrate successfully that the camera design satisfies the NIIRS 7 detection criterion. The camera concept, details of the performance simulation computation and the resulting performances are discussed together with the future development plan.

  13. Evaluating airborne multispectral digital video to differentiate giant Salvinia from other features in northeast Texas

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Giant salvinia is one of the world’s most noxious aquatic weeds. Researchers employed airborne digital video imagery and an unsupervised computer analysis to derive a map showing giant salvinia and other aquatic and terrestrial features within a study site located in northeast Texas. The map had a...

  14. Recent advances in airborne terrestrial remote sensing with the NASA airborne visible/infrared imaging spectrometer (AVIRIS), airborne synthetic aperture radar (SAR), and thermal infrared multispectral scanner (TIMS)

    NASA Technical Reports Server (NTRS)

    Vane, Gregg; Evans, Diane L.; Kahle, Anne B.

    1989-01-01

    Significant progress in terrestrial remote sensing from the air has been made with three NASA-developed sensors that collectively cover the solar-reflected, thermal infrared, and microwave regions of the electromagnetic spectrum. These sensors are the airborne visible/infrared imaging spectrometer (AVIRIS), the thermal infrared multispectral scanner (TIMS) and the airborne synthetic aperture radar (SAR), respectively. AVIRIS and SAR underwent extensive in-flight engineering testing in 1987 and 1988 and are scheduled to become operational in 1989. TIMS has been in operation for several years. These sensors are described.

  15. Multispectral calibration to enhance the metrology performance of C-mount camera systems

    NASA Astrophysics Data System (ADS)

    Robson, S.; MacDonald, L.; Kyle, S. A.; Shortis, M. R.

    2014-06-01

    Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same `C-mount' wide angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.

  16. Active/passive scanning. [airborne multispectral laser scanners for agricultural and water resources applications

    NASA Technical Reports Server (NTRS)

    Woodfill, J. R.; Thomson, F. J.

    1979-01-01

    The paper deals with the design, construction, and applications of an active/passive multispectral scanner combining lasers with conventional passive remote sensors. An application investigation was first undertaken to identify remote sensing applications where active/passive scanners (APS) would provide improvement over current means. Calibration techniques and instrument sensitivity are evaluated to provide predictions of the APS's capability to meet user needs. A preliminary instrument design was developed from the initial conceptual scheme. A design review settled the issues of worthwhile applications, calibration approach, hardware design, and laser complement. Next, a detailed mechanical design was drafted and construction of the APS commenced. The completed APS was tested and calibrated in the laboratory, then installed in a C-47 aircraft and ground tested. Several flight tests completed the test program.

  17. Capturing the Green River -- Multispectral airborne videography to evaluate the environmental impacts of hydropower operations

    SciTech Connect

    Snider, M.A.; Hayse, J.W.; Hlohowskyj, I.; LaGory, K.E.

    1996-02-01

    The 500-mile long Green River is the largest tributary of the Colorado River. From its origin in the Wind River Range mountains of western Wyoming to its confluence with the Colorado River in southeastern Utah, the Green River is vital to the arid region through which it flows. Large portions of the area remain near-wilderness with the river providing a source of recreation in the form of fishing and rafting, irrigation for farming and ranching, and hydroelectric power. In the late 1950s and early 1960s hydroelectric facilities were built on the river. One of these, Flaming Gorge Dam, is located just south of the Utah-Wyoming border near the town of Dutch John, Utah. Hydropower operations result in hourly and daily fluctuations in the releases of water from the dam that alter the natural stream flow below the dam and affect natural resources in and along the river corridor. In the present study, the authors were interested in evaluating the potential impacts of hydropower operations at Flaming Gorge Dam on the downstream natural resources. Considering the size of the area affected by the daily pattern of water release at the dam as well as the difficult terrain and limited accessibility of many reaches of the river, evaluating these impacts using standard field study methods was virtually impossible. Instead an approach was developed that used multispectral aerial videography to determine changes in the affected parameters at different flows, hydrologic modeling to predict flow conditions for various hydropower operating scenarios, and ecological information on the biological resources of concern to assign impacts.

  18. Georeferencing airborne images from a multiple digital camera system by GPS/INS

    NASA Astrophysics Data System (ADS)

    Mostafa, Mohamed Mohamed Rashad

    2000-10-01

    In this thesis, the development and testing of an airborne fully digital multi-sensor system for kinematic mapping is presented. The system acquires two streams of data, namely navigation data and imaging data. The navigation data are obtained by integrating an accurate strapdown Inertial Navigation System with two GPS receivers. The imaging data are acquired by two digital cameras, configured so as to reduce their geometric limitations. The two digital cameras capture strips of overlapping nadir and oblique images. The INS/GPS-derived trajectory contains the full translational and rotational motion of the carrier aircraft. Thus, image exterior orientation information is extracted from the trajectory during postprocessing. This approach eliminates the need for ground control when computing 3D positions of objects that appear in the field of view of the system imaging component. Test flights were conducted over the campus of The University of Calgary. Two approaches for calibrating the system are presented, namely pre-mission calibration and in-flight calibration. Testing the system in flight showed that the best ground point positioning accuracy at 1:12000 average image scale is 0.2 m (RMS) in easting and northing and 0.3 m (RMS) in height. Preliminary results indicate that major applications of such a system in the future are in the field of digital mapping, at scales of 1:10000 and smaller, and the generation of digital elevation models for engineering applications.
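    A conceptual sketch of the direct georeferencing step is given below; the frames, lever arm and omitted boresight rotation are simplifications of the thesis approach, and all names are illustrative.

```python
# Conceptual sketch of direct georeferencing with GPS/INS exterior orientation
# (the thesis covers calibration in far more detail; names and frames are mine).
import numpy as np

def rotation_from_rpy(roll, pitch, yaw):
    """Body-to-mapping-frame rotation from roll, pitch, yaw (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def image_to_ground(gps_position, rpy, lever_arm, image_vector, scale):
    """Map an image-space ray (x, y, -f) to mapping-frame coordinates."""
    r_bm = rotation_from_rpy(*rpy)
    return np.asarray(gps_position) + r_bm @ (np.asarray(lever_arm)
                                              + scale * np.asarray(image_vector))

# Example: nadir-looking ray from about 1000 m above ground, 50 mm focal length.
ground = image_to_ground(gps_position=[500000.0, 4300000.0, 1200.0],
                         rpy=[0.0, 0.0, 0.0],
                         lever_arm=[0.0, 0.0, -0.5],
                         image_vector=[0.002, -0.001, -0.05],
                         scale=20000.0)   # ~1000 m range / 0.05 m focal length
```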

  19. Control design for image tracking with an inertially stabilized airborne camera platform

    NASA Astrophysics Data System (ADS)

    Hurák, Zdenek; Rezáč, Martin

    2010-04-01

    The paper reports on a few control engineering issues related to the design and implementation of an image-based pointing and tracking system for an inertially stabilized airborne camera platform. A medium-sized platform has been developed by the authors and a few more team members within a joint governmental project coordinated by the Czech Air Force Research Institute. The resulting experimental platform is based on a common double gimbal configuration with two direct drive motors and off-the-shelf MEMS gyros. An automatic vision-based tracking system is built on top of the inertial stabilization. The choice of a suitable control configuration is discussed first, because the decoupled structure used for the inner inertial rate controllers does not extend easily to the outer image-based pointing and tracking loop. It appears that the pointing and tracking controller can benefit greatly from the availability of measurements of the inertial rate of the camera around its optical axis. The proposed pointing and tracking controller relies on feedback linearization, well known in image-based visual servoing. A simple compensation of the one-sample delay introduced into the (slow) visual pointing and tracking loop by the computer vision system is proposed. It relies on a simple modification of the well-known Smith predictor scheme, where the prediction takes advantage of the availability of the (fast and undelayed) inertial rate measurements.
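    The toy simulation below illustrates the delay-compensation idea only: a one-sample, Smith-predictor-like correction of the delayed vision error using the last commanded rate. The gains, sampling period and idealized inner loop are illustrative and not taken from the paper.

```python
# Toy sketch of the delay-compensation idea (one-sample Smith-predictor-like
# correction of the vision error using the undelayed rate information); gains,
# models and sampling are illustrative, not taken from the paper.
import numpy as np

T = 0.04          # vision-loop sample period [s]
kp = 4.0          # proportional pointing gain [1/s]
n_steps = 200

angle_err = 1.0               # true pointing error [deg]
prev_err = angle_err          # what the (delayed) camera will report next step
rate_cmd_hist = [0.0]

errs = []
for k in range(n_steps):
    measured_err = prev_err                  # vision sample delayed by one step
    # Predict the current error by adding the motion commanded during the delay,
    # which in the real system would come from the fast gyro measurements.
    predicted_err = measured_err - T * rate_cmd_hist[-1]
    rate_cmd = kp * predicted_err            # rate command to the inner loop
    prev_err = angle_err                     # error captured now, reported later
    angle_err -= T * rate_cmd                # ideal inner loop follows command
    rate_cmd_hist.append(rate_cmd)
    errs.append(angle_err)

print(f"error after {n_steps} steps: {errs[-1]:.2e} deg")
```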

  20. Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise

    NASA Astrophysics Data System (ADS)

    Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.

    2015-04-01

    In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities, as well as traffic authorities, when dealing with disasters and large public events. One focus lies on the acquisition of high resolution aerial imagery, its fully automatic processing and analysis, and its near real-time provision to decision makers in emergency situations. For this purpose a camera system was developed to be operated from a helicopter, with light-weight processing units and a microwave link for fast data transfer. In order to meet end-users' requirements, DLR works closely with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which is deployed nationwide in case of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.

  1. Airborne multispectral and hyperspectral remote sensing: Examples of applications to the study of environmental and engineering problems

    SciTech Connect

    Bianchi, R.; Marino, C.M.

    1997-10-01

    The availability of a new aerial survey capability operated by CNR/LARA (National Research Council - Airborne Laboratory for Environmental Research), based on the new AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) spectroradiometer on board a CASA 212/200 aircraft, enables scientists to obtain innovative data sets for different approaches to defining and understanding a variety of environmental and engineering problems. The spectral bandwidths of the 102 MIVIS channels were chosen to meet the needs of scientific research for advanced applications of remote sensing data. In this configuration MIVIS can offer significant contributions to problem solving in sectors as broad as geologic exploration, agricultural crop studies, forestry, land use mapping, hydrogeology, oceanography and others. In 1994-96 LARA was active over different test sites in joint ventures with JPL, Pasadena, various European institutions, and Italian universities and research institutes. These aerial surveys allow the national and international scientific community to approach the use of hyperspectral remote sensing in environmental problems of very large interest. The sites surveyed in Italy, France and Germany include a variety of targets such as quarries, landfills, karst cavity areas, landslides, coastlines, geothermal areas, etc. The deployments have gathered, up to now, more than 300 GBytes of MIVIS data in more than 30 hours of VLDS data recording. The purpose of this work is to present and comment on the procedures and results, at research and operational level, of the past campaigns, with special reference to the study of environmental and engineering problems.

  2. Hydrological characterization of a riparian vegetation zone using high resolution multi-spectral airborne imagery

    NASA Astrophysics Data System (ADS)

    Akasheh, Osama Z.

    The Middle Rio Grande River (MRGR) is the main source of fresh water for the state of New Mexico. Because the river lies in an arid area with scarce local water resources, extensive diversions of river water have been made to supply the high demand from municipalities and irrigated agriculture. These extensive water diversions over the last few decades have affected the composition of the native riparian vegetation by decreasing the area of cottonwood and coyote willow and increasing the spread of invasive species such as Tamarisk and Russian Olive, which are harmful to the river's aquatic system due to their high transpiration rates. Studying the river's hydrological processes and their relation to its health is important for preserving the river ecosystem. To this end, a detailed vegetation map was produced using a Utah State University airborne remote sensing system for 286 km of river reach. A groundwater model was also built in an ArcGIS environment with the ability to estimate soil water potential in the root zone and above the modeled water table. The modified Penman-Monteith empirical equation was used in the ArcGIS environment to estimate riparian vegetation ET, taking advantage of the detailed vegetation map and spatial soil water potential layers. Vegetation water use per linear river reach was estimated to help decision makers better manage and release the amount of water that keeps a sound river ecosystem and supports agricultural activities.

  3. Multispectral Terrain Background Simulation Techniques For Use In Airborne Sensor Evaluation

    NASA Astrophysics Data System (ADS)

    Weinberg, Michael; Wohlers, Ronald; Conant, John; Powers, Edward

    1988-08-01

    A background simulation code developed at Aerodyne Research, Inc., called AERIE is designed to reflect the major sources of clutter that are of concern to staring and scanning sensors of the type being considered for various airborne threat warning (both aircraft and missiles) sensors. The code is a first principles model that could be used to produce a consistent image of the terrain for various spectral bands, i.e., provide the proper scene correlation both spectrally and spatially. The code utilizes both topographic and cultural features to model terrain, typically from DMA data, with a statistical overlay of the critical underlying surface properties (reflectance, emittance, and thermal factors) to simulate the resulting texture in the scene. Strong solar scattering from water surfaces is included with allowance for wind driven surface roughness. Clouds can be superimposed on the scene using physical cloud models and an analytical representation of the reflectivity obtained from scattering off spherical particles. The scene generator is augmented by collateral codes that allow for the generation of images at finer resolution. These codes provide interpolation of the basic DMA databases using fractal procedures that preserve the high frequency power spectral density behavior of the original scene. Scenes are presented illustrating variations in altitude, radiance, resolution, material, thermal factors, and emissivities. The basic models utilized for simulation of the various scene components and various "engineering level" approximations are incorporated to reduce the computational complexity of the simulation.

  4. Multispectral Photography

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Model II Multispectral Camera is an advanced aerial camera that provides optimum enhancement of a scene by recording spectral signatures of ground objects only in narrow, preselected bands of the electromagnetic spectrum. Its photos have applications in such areas as agriculture, forestry, water pollution investigations, soil analysis, geologic exploration, water depth studies and camouflage detection. The target scene is simultaneously photographed in four separate spectral bands. Using a multispectral viewer, such as their Model 75, Spectral Data creates a color image from the black and white positives taken by the camera. With this optical image analysis unit, all four bands are superimposed in accurate registration and illuminated with combinations of blue, green, red, and white light. The best color combination for displaying the target object is selected and printed. Spectral Data Corporation produces several types of remote sensing equipment and also provides aerial survey, image processing and analysis, and a number of other remote sensing services.

  5. Photointerpretation of Skylab 2 multispectral camera (S-190A) data: Advance report of significant results

    NASA Technical Reports Server (NTRS)

    Jensen, M. L. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. A significant and possibly major economic example of the practical value of Skylab photographs was provided by locating, on Skylab Camera Station Number 4, frame 010, SL-2, an area of exposures of limestone rocks which were thought to be completely covered by volcanic rocks based upon prior mapping. The area is located less than 12 miles north of the Ruth porphyry copper deposit, White Pine County, Nevada. This is a major copper producing open pit mine owned by Kennecott Copper Corporation. Geophysical maps consisting of gravity and aeromagnetic studies have been published indicating three large positive magnetic anomalies located at the Ruth ore deposits, at Ward Mountain (not a mineralized area), and in the area previously thought to be completely covered by post-ore volcanics. Skylab photos indicate, however, that erosion has removed the volcanic cover at specific sites sufficiently to expose the underlying older rocks, suggesting therefore that the volcanic rocks may not be the cause of the aeromagnetic anomaly. Field studies have verified the initial interpretations made from the Skylab photos. The potential significance of this study is that the large positive aeromagnetic anomaly suggests the presence of cooled and solidified magma below the anomalies, from which ore-bearing solutions may have been derived, possibly forming large ore deposits.

  6. Preliminary investigation of multispectral retinal tissue oximetry mapping using a hyperspectral retinal camera.

    PubMed

    Desjardins, Michèle; Sylvestre, Jean-Philippe; Jafari, Reza; Kulasekara, Susith; Rose, Kalpana; Trussart, Rachel; Arbour, Jean Daniel; Hudson, Chris; Lesage, Frédéric

    2016-05-01

    Oximetry measurement of principal retinal vessels represents a first step towards understanding retinal metabolism, but the technique could be significantly enhanced by spectral imaging of the fundus outside of the main vessels. In this study, a recently developed Hyperspectral Retinal Camera was used to measure relative oximetric (SatO2) and total hemoglobin (HbT) maps of the retina, outside of large vessels, in healthy volunteers at baseline (N = 7) and during systemic hypoxia (N = 11), as well as in patients with glaucoma (N = 2). Images of the retina, over a field of view of ∼30°, were acquired between 500 and 600 nm with 2 and 5 nm steps, in under 3 s. The reflectance spectrum from each pixel was fitted to a model having oxy- and deoxyhemoglobin as the main absorbers and scattering modeled by a power law, yielding estimates of relative SatO2 and HbT over the fundus. Average optic nerve head (ONH) saturation over 8 eyes was 68 ± 5%. During systemic hypoxia, mean ONH saturation decreased by 12.5% on average. Upon further development and validation, the relative SatO2 and HbT maps of the microvasculature obtained with this imaging system could ultimately contribute to the diagnosis and management of diseases affecting the ONH and retina. PMID:27060375

  7. Potential of Uav-Based Laser Scanner and Multispectral Camera Data in Building Inspection

    NASA Astrophysics Data System (ADS)

    Mader, D.; Blaskow, R.; Westfeld, P.; Weller, C.

    2016-06-01

    Conventional building inspection of bridges, dams or large constructions in general is rather time consuming and often expensive due to traffic closures and the need for special heavy vehicles such as under-bridge inspection units or other large lifting platforms. In view of this, an unmanned aerial vehicle (UAV) can be more reliable and efficient, as well as less expensive and simpler to operate. The utility of UAVs as an assisting tool in building inspections is obvious. Furthermore, light-weight special sensors such as infrared and thermal cameras as well as laser scanners are available and predestined for use on unmanned aircraft systems. Such a flexible low-cost system is realized in the ADFEX project with the goal of time-efficient object exploration, monitoring and damage detection. For this purpose, a fleet of UAVs equipped with several sensors for navigation, obstacle avoidance and 3D object-data acquisition has been developed and constructed. This contribution deals with the potential of UAV-based data in building inspection. Therefore, an overview of the ADFEX project, sensor specifications and the requirements of building inspections in general is given. On the basis of results achieved in practical studies, the applicability and potential of the UAV system in building inspection are presented and discussed.

  8. Ground-based multispectral measurements for airborne data verification in non-operating open pit mine "Kremikovtsi"

    NASA Astrophysics Data System (ADS)

    Borisova, Denitsa; Nikolov, Hristo; Petkov, Doyno

    2013-10-01

    The impact of the mining industry and metal production on the environment is present all over the world. In our research we focus on the impact of the now non-operating ferrous "Kremikovtsi" open pit mine and related waste dumps and tailings, which we consider to be the major factor responsible for pollution of one densely populated region in Bulgaria. The approach adopted is based on correct estimation of the distribution of iron oxides inside the open pit mine and the neighboring regions, considered in this case to be the key issue for assessing the ecological state of soils, vegetation and water. For this study the foremost source of data is of airborne origin, combined with ground-based in-situ and laboratory data used for verification of the environmental variables and thus in the process of assessing the present environmental status influenced by previous mining activities. The percentage of iron content was selected as the main indicator of the presence of metal pollution, since it can be reliably identified from the multispectral data used in this study and also because iron compounds are widely spread in most minerals, rocks and soils. In our research the number of samples from every source (air, field, lab) was chosen so as to be statistically sound and confident. In order to establish a relationship between the degree of pollution of the soil and the multispectral data, 40 soil samples were collected during a field campaign in the study area, together with GPS measurements, for two types of laboratory measurements: first, chemical and mineralogical analysis, and second, non-destructive spectroscopy. In this work, multispectral satellite data from the Landsat TM/ETM+ and ALI/OLI (Operational Land Imager) instruments were used for verification of environmental variables over large areas. Ground-based (laboratory and in-situ) spectrometric measurements were performed using the designed and constructed in Remote

  9. Land cover/use classification of Cairns, Queensland, Australia: A remote sensing study involving the conjunctive use of the airborne imaging spectrometer, the large format camera and the thematic mapper simulator

    NASA Technical Reports Server (NTRS)

    Heric, Matthew; Cox, William; Gordon, Daniel K.

    1987-01-01

    In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) imagery, Large Format Camera color infrared photography, and black and white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data were tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques are not the basis for this study. Instead, the discussion presents the spectral findings resulting from the image-to-image registrations. Problems associated with the AIS-1/TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, the research described herein provides a formulation for future Earth Observation Station related multisensor studies.

  10. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit.

    PubMed

    Virlet, Nicolas; Costes, Evelyne; Martinez, Sébastien; Kelner, Jean-Jacques; Regnard, Jean-Luc

    2015-09-01

    Genetic studies of response to water deficit in adult trees are limited by low throughput of the usual phenotyping methods in the field. Here, we aimed at overcoming this bottleneck, applying a new methodology using airborne multispectral imagery and in planta measurements to compare a high number of individuals. An apple tree population, grafted on the same rootstock, was submitted to contrasting summer water regimes over two years. Aerial images acquired in visible, near- and thermal-infrared at three dates each year allowed calculation of vegetation and water stress indices. Tree vigour and fruit production were also assessed. Linear mixed models were built accounting for date and year effects on several variables and including the differential response of genotypes between control and drought conditions. Broad-sense heritability of most variables was high and 18 quantitative trait loci (QTLs) independent of the dates were detected on nine linkage groups of the consensus apple genetic map. For vegetation and stress indices, QTLs were related to the means, the intra-crown heterogeneity, and differences induced by water regimes. Most QTLs explained 15-20% of variance. Airborne multispectral imaging proved relevant to acquire simultaneous information on a whole tree population and to decipher genetic determinisms involved in response to water deficit. PMID:26208644

  11. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit

    PubMed Central

    Virlet, Nicolas; Costes, Evelyne; Martinez, Sébastien; Kelner, Jean-Jacques; Regnard, Jean-Luc

    2015-01-01

    Genetic studies of response to water deficit in adult trees are limited by low throughput of the usual phenotyping methods in the field. Here, we aimed at overcoming this bottleneck, applying a new methodology using airborne multispectral imagery and in planta measurements to compare a high number of individuals. An apple tree population, grafted on the same rootstock, was submitted to contrasting summer water regimes over two years. Aerial images acquired in visible, near- and thermal-infrared at three dates each year allowed calculation of vegetation and water stress indices. Tree vigour and fruit production were also assessed. Linear mixed models were built accounting for date and year effects on several variables and including the differential response of genotypes between control and drought conditions. Broad-sense heritability of most variables was high and 18 quantitative trait loci (QTLs) independent of the dates were detected on nine linkage groups of the consensus apple genetic map. For vegetation and stress indices, QTLs were related to the means, the intra-crown heterogeneity, and differences induced by water regimes. Most QTLs explained 15−20% of variance. Airborne multispectral imaging proved relevant to acquire simultaneous information on a whole tree population and to decipher genetic determinisms involved in response to water deficit. PMID:26208644

  12. Non-invasive skin oxygenation imaging using a multi-spectral camera system: effectiveness of various concentration algorithms applied on human skin

    NASA Astrophysics Data System (ADS)

    Klaessens, John H. G. M.; Noordmans, Herke Jan; de Roode, Rowland; Verdaasdonk, Rudolf M.

    2009-02-01

    This study describes noninvasive noncontact methods to acquire and analyze functional information from the skin. Multispectral images at several selected wavelengths in the visible and near infrared region are collected and used in mathematical methods to calculate concentrations of different chromophores in the epidermis and dermis of the skin. This is based on the continuous wave Near Infrared Spectroscopy method, which is a well known non-invasive technique for measuring oxygenation changes in the brain and in muscle tissue. Concentration changes of hemoglobin (dO2Hb, dHHb and dtHb) can be calculated from light attenuations using the modified Lambert Beer equation. We applied this technique on multi-spectral images taken from the skin surface using different algorithms for calculating changes in O2Hb, HHb and tHb. In clinical settings, the imaging of local oxygenation variations and/or blood perfusion in the skin can be useful for e.g. detection of skin cancer, detection of early inflammation, checking the level of peripheral nerve block anesthesia, study of wound healing and tissue viability by skin flap transplantations. Images from the skin are obtained with a multi-spectral imaging system consisting of a 12-bit CCD camera in combination with a Liquid Crystal Tunable Filter. The skin is illuminated with either a broad band light source or a tunable multi wavelength LED light source. A polarization filter is used to block the direct reflected light. The collected multi-spectral imaging data are images of the skin surface radiance; each pixel contains either the full spectrum (420 - 730 nm) or a set of selected wavelengths. These images were converted to reflectance spectra. The algorithms were validated during skin oxygen saturation changes induced by temporary arm clamping and applied to some clinical examples. The initial results with the multi-spectral skin imaging system show good results for detecting dynamic changes in oxygen concentration. However, the
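    The sketch below shows only the modified Lambert-Beer step mentioned above, recovering concentration changes from attenuation changes by least squares; the extinction-coefficient numbers are placeholders that would have to be replaced by tabulated values for O2Hb and HHb, and the pathlength term is left as an arbitrary constant.

```python
# Sketch of the modified Lambert-Beer step (concentration changes from
# attenuation changes); the extinction-coefficient numbers below are
# placeholders and must be replaced by tabulated values for O2Hb and HHb.
import numpy as np

wavelengths_nm = np.array([500, 520, 540, 560, 580, 600])
# Placeholder extinction matrix E: rows = wavelengths, cols = (O2Hb, HHb).
E = np.array([[1.0, 1.2],
              [1.5, 1.4],
              [2.2, 1.8],
              [1.8, 2.4],
              [2.0, 1.6],
              [0.3, 0.8]])
pathlength = 1.0        # effective optical pathlength (d * DPF), arbitrary units

def concentration_changes(delta_attenuation):
    """Least-squares solve dA = E @ dC * pathlength for dC = (dO2Hb, dHHb)."""
    dC, *_ = np.linalg.lstsq(E * pathlength, delta_attenuation, rcond=None)
    return dC

# Example: attenuation changes synthesized from a known concentration change.
true_dC = np.array([0.02, -0.01])
dA = E @ true_dC * pathlength
print(concentration_changes(dA))   # recovers approximately (0.02, -0.01)
```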

  13. Multispectral photography for earth resources

    NASA Technical Reports Server (NTRS)

    Wenderoth, S.; Yost, E.; Kalia, R.; Anderson, R.

    1972-01-01

    A guide for producing accurate multispectral results for earth resource applications is presented along with theoretical and analytical concepts of color and multispectral photography. Topics discussed include: capabilities and limitations of color and color infrared films; image color measurements; methods of relating ground phenomena to film density and color measurement; sensitometry; considerations in the selection of multispectral cameras and components; and mission planning.

  14. An Assessment Of Meso-Scale Hydraulic And Vegetation Characteristics Of The Middle Rio Grande River Using High Resolution Multispectral Airborne Imagery

    NASA Astrophysics Data System (ADS)

    Akasheh, O. Z.; Neale, C. M.

    2004-12-01

    The Middle Rio Grande River (MRGR) is the main source of fresh water for the population of New Mexico as well as for irrigated agriculture. Extensive water diversion over the last few decades has affected the composition of the native riparian vegetation, such as the cottonwood population, and enhanced the spread of introduced species harmful to the river system, like Tamarisk and Russian Olive. High resolution airborne remote sensing is a powerful technique for riparian vegetation mapping and monitoring. Airborne multispectral digital images were acquired over the riparian corridor of the MRGR, New Mexico in June 1999 and July 2001, using the Utah State University (USU) airborne digital imaging system. The imagery was corrected for vignetting effects and geometric lens distortions, rectified to a map base, mosaicked, verified in the field, classified and checked for accuracy. Areas of the vegetation classes and in-stream features were extracted and presented per reach of the river. In this paper a relationship was developed between the total surface water area mapped and both the river water flow rate and water table readings. The consequence of this relationship for riparian vegetation distribution along the river was studied and graphically demonstrated. A strong relationship was found between the total surface water area and the water flow rate. In addition, the reduction in surface water area resulted in a reduction of native trees downstream.

  15. Russian multispectral-hyperspectral airborne scanner for geological and environmental investigations - "Vesuvius-EC"

    SciTech Connect

    Yassinsky, G.I.; Shilin, B.V.

    1996-07-01

    Small variations of spectral characteristics in the 0.3-14 micron band are of great significance in geological and environmental investigations. A multipurpose multispectral digital scanner with a narrow field of view, high spectral resolution and radiometric calibration was designed in Russia. Changeable modules permit the parameters of the device to be adapted for practical use.

  16. In vivo multispectral imaging of the absorption and scattering properties of exposed brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Ishizuka, Tomohiro; Mizushima, Chiharu; Nishidate, Izumi; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-04-01

    To evaluate multi-spectral images of the absorption and scattering properties in the cerebral cortex of rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital red-green-blue camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters. The spectral images of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters. We performed in vivo experiments on exposed rat brain to confirm the feasibility of this method. The estimated images of the absorption coefficients were dominated by hemoglobin spectra. The estimated images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature.
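    A hedged sketch of Wiener estimation as it is commonly formulated; the training statistics and the camera model below are synthetic placeholders, not the authors' data or sensitivities.

```python
# Hedged sketch of Wiener estimation of spectral reflectance from RGB responses
# (training statistics and the camera model here are synthetic placeholders).
import numpy as np

rng = np.random.default_rng(3)
n_bands, n_train = 31, 200               # e.g. 500-600 nm sampled every ~3 nm

# Synthetic training reflectances (smooth random spectra) and a fake 3-channel
# camera sensitivity matrix S (3 x n_bands); real data would replace both.
train_r = np.cumsum(rng.normal(0, 0.02, (n_train, n_bands)), axis=1) + 0.5
train_r = np.clip(train_r, 0, 1)
S = np.abs(rng.normal(0, 1, (3, n_bands)))
noise_var = 1e-4

V = train_r @ S.T + rng.normal(0, np.sqrt(noise_var), (n_train, 3))  # RGB values

# Wiener matrix from training correlations: W = R_rv @ inv(R_vv).
R_rv = train_r.T @ V / n_train
R_vv = V.T @ V / n_train
W = R_rv @ np.linalg.inv(R_vv)

# Estimate the spectrum of a new RGB observation.
test_r = np.clip(np.cumsum(rng.normal(0, 0.02, n_bands)) + 0.5, 0, 1)
test_v = S @ test_r
estimate = W @ test_v                     # estimated reflectance spectrum
```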

  17. Remote Sensing of Liquid Water and Ice Cloud Optical Thickness and Effective Radius in the Arctic: Application of Airborne Multispectral MAS Data

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Yang, Ping; Arnold, G. Thomas; Gray, Mark A.; Riedi, Jerome C.; Ackerman, Steven A.; Liou, Kuo-Nan

    2003-01-01

    A multispectral scanning spectrometer was used to obtain measurements of the reflection function and brightness temperature of clouds, sea ice, snow, and tundra surfaces at 50 discrete wavelengths between 0.47 and 14.0 microns. These observations were obtained from the NASA ER-2 aircraft as part of the FIRE Arctic Clouds Experiment, conducted over a 1600 x 500 km region of the north slope of Alaska and surrounding Beaufort and Chukchi Seas between 18 May and 6 June 1998. Multispectral images of the reflection function and brightness temperature in 11 distinct bands of the MODIS Airborne Simulator (MAS) were used to derive a confidence in clear sky (or alternatively the probability of cloud), shadow, and heavy aerosol over five different ecosystems. Based on the results of individual tests run as part of the cloud mask, an algorithm was developed to estimate the phase of the clouds (water, ice, or undetermined phase). Finally, the cloud optical thickness and effective radius were derived for both water and ice clouds that were detected during one flight line on 4 June. This analysis shows that the cloud mask developed for operational use on MODIS, and tested using MAS data in Alaska, is quite capable of distinguishing clouds from bright sea ice surfaces during daytime conditions in the high Arctic. Results of individual tests, however, make it difficult to distinguish ice clouds over snow and sea ice surfaces, so additional tests were added to enhance the confidence in the thermodynamic phase of clouds over the Beaufort Sea. The cloud optical thickness and effective radius retrievals used 3 distinct bands of the MAS, with the newly developed 1.62 and 2.13 micron bands being used quite successfully over snow and sea ice surfaces. These results are contrasted with a MODIS-based algorithm that relies on spectral reflectance at 0.87 and 2.13 micron.

  18. Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt

    NASA Technical Reports Server (NTRS)

    Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.

    1977-01-01

    Remote sensing airborne surveys of the Suez Canal Zone were conducted, as part of the plan of rehabilitation, using an I2S multispectral camera and a Bendix LN-3 infrared passive scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near infrared bands. The scanner was operated in the 8 to 14 micron band, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were utilized in the construction of new geological, structural lineation and drainage maps for the Suez Canal Zone at a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.

  19. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Images captured from airborne imaging systems have the advantages of relatively low cost, high spatial resolution, and real/near-real-time availability. Multiple images taken from one or more flight lines could be used to generate a high-resolution mosaic image, which could be useful for diverse rem...

  20. A fully-automated approach to land cover mapping with airborne LiDAR and high resolution multispectral imagery in a forested suburban landscape

    NASA Astrophysics Data System (ADS)

    Parent, Jason R.; Volin, John C.; Civco, Daniel L.

    2015-06-01

    Information on land cover is essential for guiding land management decisions and supporting landscape-level ecological research. In recent years, airborne light detection and ranging (LiDAR) and high resolution aerial imagery have become more readily available in many areas. These data have great potential to enable the generation of land cover at a fine scale and across large areas by leveraging 3-dimensional structure and multispectral information. LiDAR and other high resolution datasets must be processed in relatively small subsets due to their large volumes; however, conventional classification techniques cannot be fully automated and thus are unlikely to be feasible options when processing large high-resolution datasets. In this paper, we propose a fully automated rule-based algorithm to develop a 1 m resolution land cover classification from LiDAR data and multispectral imagery. The algorithm we propose uses a series of pixel- and object-based rules to identify eight vegetated and non-vegetated land cover features (deciduous and coniferous tall vegetation, medium vegetation, low vegetation, water, riparian wetlands, buildings, low impervious cover). The rules leverage both structural and spectral properties including height, LiDAR return characteristics, brightness in visible and near-infrared wavelengths, and normalized difference vegetation index (NDVI). Pixel-based properties were used initially to classify each land cover class while minimizing omission error; a series of object-based tests were then used to remove errors of commission. These tests used conservative thresholds, based on diverse test areas, to help avoid over-fitting the algorithm to the test areas. The accuracy assessment of the classification results included a stratified random sample of 3198 validation points distributed across 30 1 × 1 km tiles in eastern Connecticut, USA. The sample tiles were selected in a stratified random manner from locations representing the full range of
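    As a simplified illustration of the pixel-level rule idea only (the thresholds and class list below are illustrative, and the published algorithm adds object-based checks and many more rules):

```python
# Simplified sketch of pixel-based land cover rules from NDVI, normalized
# height and NIR brightness; thresholds are illustrative only.
import numpy as np

def pixel_rules(ndvi, height_m, nir_brightness):
    """Return an integer class map from NDVI, normalized height and NIR."""
    classes = np.zeros(ndvi.shape, dtype=np.uint8)            # 0 = unclassified
    vegetated = ndvi > 0.3
    classes[vegetated & (height_m >= 3.0)] = 1                # tall vegetation
    classes[vegetated & (height_m >= 0.5) & (height_m < 3.0)] = 2  # medium veg
    classes[vegetated & (height_m < 0.5)] = 3                 # low vegetation
    classes[~vegetated & (height_m >= 3.0)] = 4               # buildings
    classes[~vegetated & (height_m < 3.0) & (nir_brightness < 0.1)] = 5  # water
    classes[classes == 0] = 6                                 # other impervious
    return classes

# Example on random rasters.
rng = np.random.default_rng(4)
shape = (100, 100)
cmap = pixel_rules(rng.uniform(-0.2, 0.9, shape),
                   rng.uniform(0.0, 20.0, shape),
                   rng.uniform(0.0, 0.5, shape))
```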

  1. Reflectance Data Processing of High Resolution Multispectral Data Acquired with an Autonomous Unmanned Aerial Vehicle AggieairTM

    NASA Astrophysics Data System (ADS)

    Zaman, B.; Jensen, A.; McKee, M.

    2012-12-01

    In this study, the performance and accuracy of a method for converting airborne multispectral data to reflectance data are characterized. Spectral reflectance is the ratio of reflected to incident radiant flux and may take values only in the interval 0-1, inclusive. Reflectance is a key physical property of a surface and is empirically derived from on-ground observations. The paper presents a method for processing multispectral data acquired by an unmanned aerial vehicle (UAV) platform, called AggieAirTM, and a process for converting raw digital numbers to calibrated reflectance values. Imagery is acquired by two identical sets of cameras. One set is aboard the UAV and the other is over a barium sulfate reference panel. The cameras have identical settings. The major steps for producing the reflectance data involve the calibration of the reference panel, calibration of the multispectral UAV cameras, zenith angle calculations and image processing. The method converts airborne multispectral data by ratioing against reference values linearly interpolated between the pre- and post-flight reference panel readings. The flight interval is typically approximately 30 minutes and the imagery is acquired around local solar noon. The UAV is typically flown at low altitudes to reduce atmospheric effects to a negligible level. Data acquired over wetlands near Great Salt Lake, Utah are used to illustrate ground data and processed imagery. The spatial resolution of the multispectral data is 25 cm. The paper discusses the accuracy issues and errors associated with the proposed method.
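    A minimal sketch of the reference-panel conversion as described above; the variable names and the exact ratio form are assumptions made for illustration.

```python
# Minimal sketch (variable names assumed): interpolate pre- and post-flight
# panel readings to the image time, then ratio image digital numbers against
# the panel reading to obtain reflectance.
import numpy as np

def panel_dn_at(t_image, t_pre, dn_pre, t_post, dn_post):
    """Linear interpolation of the panel digital number to the image time."""
    w = (t_image - t_pre) / (t_post - t_pre)
    return (1.0 - w) * dn_pre + w * dn_post

def to_reflectance(image_dn, panel_dn, panel_reflectance=1.0):
    """Reflectance = (image DN / panel DN) * known panel reflectance."""
    rho = np.asarray(image_dn, dtype=float) / panel_dn * panel_reflectance
    return np.clip(rho, 0.0, 1.0)

# Example: image at 12:15, panel read at 12:00 and 12:30 (times in minutes).
panel_dn = panel_dn_at(t_image=15, t_pre=0, dn_pre=2400, t_post=30, dn_post=2500)
rng = np.random.default_rng(5)
reflectance = to_reflectance(rng.uniform(200, 2000, (64, 64)), panel_dn, 0.99)
```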

  2. Verification of sensitivity enhancement of SWIR imager technology in advanced multispectral SWIR/VIS zoom cameras with constant and variable F-number

    NASA Astrophysics Data System (ADS)

    Hübner, M.; Achtner, B.; Kraus, M.; Siemens, C.; Münzberg, M.

    2016-05-01

    Current designs of combined VIS-color/SWIR camera optics use a constant F-number over the full field of view (FOV) range. Especially in the SWIR, limited space for camera integration in existing system volumes and relatively large pixel pitches of 15 μm or even 20 μm force the use of relatively high F-numbers to accomplish narrow fields of view of less than 2.0° with reasonable resolution for long-range observation and targeting applications. Constant F-number designs have already been reported and considered [1] for submarine applications. The comparison of electro-optical performance was based on the detector noise performance and sensitivity data given by the detector manufacturer [1] and further modelling of the imaging chain within linear MTF system theory. The visible channel provides limited twilight capability at F/2.6, but in the SWIR the twilight capability is degraded due to the relatively high F-number of F/7 or F/5.25 for 20 μm and 15 μm pitch, respectively. Differences between prediction and experimental verification of sensitivity in terms of noise equivalent irradiance (NEI) and scenery-based limiting illumination levels are shown for the visible and the SWIR spectral range. Within this context, currently developed improvements using optical zoom designs for the multispectral SWIR/VIS camera optics with continuously variable F-number are discussed, offering increased low-light-level capability at wide and medium fields of view while still enabling a NFOV < 2° with superior long-range targeting capability under limited atmospheric visibility at daytime.

  3. Use of reflectance spectra of native plant species for interpreting airborne multispectral scanner data in the East Tintic Mountains, Utah.

    USGS Publications Warehouse

    Milton, N.M.

    1983-01-01

    Analysis of in situ reflectance spectra of native vegetation was used to interpret airborne MSS data. Representative spectra from three plant species in the E Tintic Mountains, Utah, were used to interpret the color components on a color ratio composite image made from MSS data in the visible and near-infrared regions. A map of plant communities was made from the color ratio composite image and field checked. -from Author

  4. Analysis of testbed airborne multispectral scanner data from Superflux II. [Chesapeake Bay plume and James Shelf data

    NASA Technical Reports Server (NTRS)

    Bowker, D. E.; Hardesty, C. A.; Jobson, D. J.; Bahn, G. S.

    1981-01-01

    A test bed aircraft multispectral scanner (TBAMS) was flown during the James Shelf, Plume Scan, and Chesapeake Bay missions as part of the Superflux 2 experiment. Excellent correlations were obtained between water sample measurements of chlorophyll and sediment and TBAMS radiance data. The three-band algorithms used were insensitive to aircraft altitude and varying atmospheric conditions. This was particularly fortunate due to the hazy conditions during most of the experiments. A contour map of sediment, and also chlorophyll, was derived for the Chesapeake Bay plume along the southern Virginia-Carolina coastline. A sediment maximum occurs about 5 nautical miles off the Virginia Beach coast with a chlorophyll maximum slightly shoreward of this. During the James Shelf mission, a thermal anomaly (or front) was encountered about 50 miles from the coast. There was a minor variation in chlorophyll and sediment across the boundary. During the Chesapeake Bay mission, the Sun elevation increased from 50 degrees to over 70 degrees, interfering with the generation of data products.

  5. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    NASA Astrophysics Data System (ADS)

    Song, Huaibo; Yang, Chenghai; Zhang, Jian; Hoffmann, Wesley Clint; He, Dongjian; Thomasson, J. Alex

    2016-01-01

    Images captured from airborne imaging systems can be mosaicked for diverse remote sensing applications. The objective of this study was to identify appropriate mosaicking techniques and software to generate mosaicked images for use by aerial applicators and other users. Three software packages (Photoshop CC, Autostitch, and Pix4Dmapper) were selected for mosaicking airborne images acquired from a large cropping area. Ground control points were collected for georeferencing the mosaicked images and for evaluating the accuracy of eight mosaicking techniques. Analysis and accuracy assessment showed that Pix4Dmapper can be the first choice if georeferenced imagery with high accuracy is required. The spherical method in Photoshop CC can be an alternative for cost considerations, and Autostitch can be used to quickly mosaic images with reduced spatial resolution. The results also showed that the accuracy of image mosaicking techniques could be greatly affected by the size of the imaging area or the number of the images and that the accuracy would be higher for a small area than for a large area. The results from this study will provide useful information for the selection of image mosaicking software and techniques for aerial applicators and other users.
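
    As a rough illustration of the feature-based mosaicking such packages perform, the sketch below stitches a few overlapping frames with OpenCV's high-level stitcher. The file names are hypothetical, and this is not the workflow of any of the three evaluated packages.

      import cv2  # OpenCV

      paths = ["frame_001.jpg", "frame_002.jpg", "frame_003.jpg"]   # hypothetical frames
      images = [cv2.imread(p) for p in paths]

      stitcher = cv2.Stitcher_create()      # feature matching, bundle adjustment, blending
      status, mosaic = stitcher.stitch(images)
      if status == 0:                       # 0 corresponds to Stitcher_OK
          cv2.imwrite("mosaic.jpg", mosaic)
      else:
          print("Stitching failed with status", status)

    Georeferencing against ground control points, as done in the study, would be applied to the finished mosaic as a separate step.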

  6. Long-Term Tracking of a Specific Vehicle Using Airborne Optical Camera Systems

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Rosenbaum, D.; Runge, H.; Cerra, D.; Mattyus, G.; Reinartz, P.

    2016-06-01

    In this paper we present two low cost, airborne sensor systems capable of long-term vehicle tracking. Based on the properties of the sensors, a method for automatic real-time, long-term tracking of individual vehicles is presented. This combines the detection and tracking of the vehicle in low frame rate image sequences and applies the lagged Cell Transmission Model (CTM) to handle longer tracking outages occurring in complex traffic situations, e.g. tunnels. The CTM model uses the traffic conditions in the proximities of the target vehicle and estimates its motion to predict the position where it reappears. The method is validated on an airborne image sequence acquired from a helicopter. Several reference vehicles are tracked within a range of 500m in a complex urban traffic situation. An artificial tracking outage of 240m is simulated, which is handled by the CTM. For this, all the vehicles in the close proximity are automatically detected and tracked to estimate the basic density-flow relations of the CTM model. Finally, the real and simulated trajectories of the reference vehicles in the outage are compared showing good correspondence also in congested traffic situations.
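
    A minimal sketch of a cell transmission model update of the kind used to bridge tracking outages is shown below, assuming a triangular fundamental diagram; the free-flow speed, wave speed, capacity, jam density and cell layout are illustrative assumptions, not the paper's calibrated values.

      import numpy as np

      v_free, w_back = 15.0, 5.0      # free-flow speed and congestion wave speed, m/s
      q_max, rho_jam = 0.5, 0.15      # capacity (veh/s) and jam density (veh/m)
      dx, dt = 50.0, 2.0              # cell length (m) and time step (s), dt <= dx / v_free

      rho = np.array([0.02, 0.05, 0.12, 0.03])    # initial cell densities, veh/m

      def ctm_step(rho, inflow=0.1):
          send = np.minimum(v_free * rho, q_max)               # what each cell can send
          recv = np.minimum(q_max, w_back * (rho_jam - rho))   # what each cell can receive
          flow = np.minimum(send[:-1], recv[1:])               # flows across interior boundaries
          q_in = np.concatenate(([min(inflow, recv[0])], flow))
          q_out = np.concatenate((flow, [send[-1]]))           # unrestricted outflow downstream
          return rho + dt / dx * (q_in - q_out)

      for _ in range(5):
          rho = ctm_step(rho)
      print(rho)

    The cell speeds implied by the modelled flows and densities can then be integrated over the outage length to predict where the target vehicle should reappear.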

  7. Multispectral imaging and image processing

    NASA Astrophysics Data System (ADS)

    Klein, Julie

    2014-02-01

    The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview is given of several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by the filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics, like stereo multispectral imaging and goniometric multispectral measurements, that are further explored with his expertise will also be presented in this work.

  8. FireMapper 2.0: a multispectral uncooled infrared imaging system for airborne wildfire mapping and remote sensing

    NASA Astrophysics Data System (ADS)

    Hoffman, James W.; Riggan, Philip J.; Griffin, Stephanie A.; Grush, Ronald C.; Grush, William H.; Pena, James

    2003-11-01

    FireMapper® 2.0 is a second-generation airborne system developed specifically for wildfire mapping and remote sensing. Its design is based on lessons learned from two years of flight-testing of a research FireMapper® system by the Pacific Southwest Research Station of the USDA Forest Service. The new, operational design features greater coverage and improved performance with a rugged sensor that is less than one third the size and weight of the original research sensor. The sensor obtains thermal infrared images in two narrow spectral bands and one wide spectral band with the use of a single uncooled microbolometer detector array. The dynamic range of the sensor is designed to accurately measure scene temperatures from normal backgrounds, for remote sensing and disaster management applications, up to flaming fronts without saturating. All three channels are extremely linear and are calibrated in-flight with a highly accurate absolute calibration system. Airborne testing of the research system has led to improved displays and simplified operator interfaces. These features facilitate the operational use of the FireMapper® 2.0 system on both fixed-wing aircraft and helicopters with minimal operator inputs. The operating system features custom software to display and zoom in on the images in real time as they are obtained. Selected images can also be saved and recalled for detailed study. All images are tagged with GPS date, time, latitude, longitude, altitude, and heading and can be recorded on a portable USB hard drive upon operator command. The operating system can also be used to replay previously recorded image sequences. The FireMapper® 2.0 was designed and fabricated by Space Instruments, Inc. as part of a Research Joint Venture with the USDA Forest Service.

  9. Airborne imaging for heritage documentation using the Fotokite tethered flying camera

    NASA Astrophysics Data System (ADS)

    Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

    2014-05-01

    Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost and increasing functionality and stability of ready-to-fly multi-copter systems have proliferated their use among non-hobbyists. As such, they have become a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs from excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft. As such, the

  10. CCD image acquisition for multispectral teledetection

    NASA Astrophysics Data System (ADS)

    Peralta-Fabi, R.; Peralta, A.; Prado, Jorge M.; Vicente, Esau; Navarette, M.

    1992-08-01

    A low-cost, high-reliability multispectral video system has been developed for airborne remote sensing. Three low-weight CCD cameras are mounted together with a photographic camera in a Kevlar composite self-contained structure. The CCD cameras are remotely controlled, have spectral filters (80 nm at 50% T) placed in front of their optical systems, and all cameras are aligned to capture the same image field. The filters may be changed so as to adjust the spectral bands according to the object's reflectance properties, but a set of bands common to most remote sensing aircraft and satellites is usually used, covering the visible and near IR. This paper presents results obtained with this system and some comparisons as to the cost, resolution and atmospheric correction advantages with respect to other more costly devices. A brief description is also given of the Remotely Piloted Vehicle (RPV) project in which the camera system will be mounted. The images so obtained replace the costlier ones obtained by satellites in several specific applications. Other applications under development include fire monitoring, identification of vegetation in the field and in the laboratory, discrimination of objects by color for industrial applications, and geological and engineering surveys.

  11. <5 cm Ground Resolution DEMs for the Atacama Fault System (Chile), Acquired With the Modular Airborne Camera System (MACS)

    NASA Astrophysics Data System (ADS)

    Zielke, O.; Victor, P.; Oncken, O.; Bucher, T. U.; Lehmann, F.

    2011-12-01

    A primary step towards assessing the time and size of future earthquakes is the identification of earthquake recurrence patterns in the existing seismic record. Geologic and geomorphic data are commonly analyzed for this purpose, given the lack of sufficiently long historical or instrumental seismic data sets. Until recently, those geomorphic data sets encompassed field observations, local total station surveys, and aerial photography. Over the last decade, LiDAR-based high-resolution topographic data sets have become an additional powerful means, contributing distinctly to a better understanding of earthquake rupture characteristics (e.g., single-event along-fault slip distribution, along-fault slip accumulation pattern) and their relation to fault geometric complexities. Typical shot densities of such data sets (e.g., airborne LiDAR data along the San Andreas Fault) permit generation of digital elevation models (DEM) with <50 cm ground resolution, sufficient for depiction of meter-scale tectonic landforms. Identification of submeter-scale features is, however, prevented by this resolution limit. Here, we present a high-resolution topographic and visual data set from the Atacama fault system near Antofagasta, Chile. Data were acquired with the Modular Airborne Camera System (MACS), developed by the DLR (German Aerospace Center) in Berlin, Germany. The photogrammetrically derived DEM and true ortho images with <5 cm ground resolution permit identification of very small-scale geomorphic features, thus enabling fault zone and earthquake rupture characterization in unprecedented detail. Compared to typical LiDAR DEMs, ground resolution is increased by an order of magnitude while the spatial extent of these data sets is essentially the same. Here, we present examples of the <5 cm resolution data set (DEM and visual results) and further explore its resolution capabilities and potential with regard to the aforementioned tectono-geomorphic questions.

  12. Airborne multispectral remote sensing data to estimate several oenological parameters in vineyard production. A case study of application of remote sensing data to precision viticulture in central Italy.

    NASA Astrophysics Data System (ADS)

    Tramontana, Gianluca; Girard, Filippo; Belli, Claudio; Comandini, Maria Cristina; Pietromarchi, Paolo; Tiberi, Domenico; Papale, Dario

    2010-05-01

    It is widely recognized that environmental differences within the vineyard, with respect to soils, microclimate, and topography, can influence grape characteristics and crop yields. Moreover, the landscape of central Italy is characterized by a high level of fragmentation and heterogeneity, which imposes stringent remote sensing requirements in terms of spectral, geometric and temporal resolution for applications aimed at supporting precision viticulture. In response to the needs of the Italian grape and wine industry for an evaluation of precision viticulture technologies, the DISAFRI (University of Tuscia) and the Agricultural Research Council - Oenological Research Unit (ENC-CRA) jointly carried out an experimental study during the year 2008. The study was carried out on two areas located in the town of Velletri, near Rome; for each area, two varieties (one red and one white grape) were studied: Nero d'Avola and Sauvignon blanc in the first area, and Merlot and Sauvignon blanc in the second. Remote sensing data were acquired in different periods using a low-cost multisensor airborne remote sensing platform developed by DISAFRI (ASPIS-2, Advanced Spectroscopic Imager System). ASPIS-2, an evolution of the ASPIS sensor (Papale et al 2008, Sensors), is a multispectral sensor based on 4 CCDs and 3 interferential filters per CCD. The filters are user-selectable during the flight, so that ASPIS is able to acquire data in 12 bands in the visible and near-infrared regions with a bandwidth of 10 or 20 nm. For the purposes of this study, 7 spectral bands were acquired and 15 vegetation indices calculated. During the ripening period several vegetative and oenochemical parameters were monitored. ANOVA tests showed that several oenochemical variables, such as sugars, total acidity, polyphenols and anthocyanins, differ according to the variety taken into consideration. In order to evaluate the time autocorrelation of several oenological parameter values, a simple linear regression between

  13. An algorithm for the estimation of bounds on the emissivity and temperatures from thermal multispectral airborne remotely sensed data

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Quattrochi, D.; Baskin, R.

    1992-01-01

    The effective flux incident upon the detectors of a thermal sensor, after it has been corrected for atmospheric effects, is a function of a non-linear combination of the emissivity of the target for that channel and the temperature of the target. The sensor system cannot separate the contributions of the emissivity and the temperature that constitute the flux value. A method that estimates the bounds on these temperatures and emissivities from thermal data is described. This method is then tested with remotely sensed data obtained from NASA's Thermal Infrared Multispectral Scanner (TIMS), a 6-channel thermal sensor. Since this is an under-determined set of equations, i.e. there are 7 unknowns (6 emissivities and 1 temperature) and 6 equations (corresponding to the 6 channel fluxes), there exists theoretically an infinite number of combinations of emissivities and temperature that satisfy these equations. Using some realistic bounds on the emissivities, bounds on the temperature are calculated. These bounds on the temperature are then refined to estimate a tighter bound on the emissivity of the source. An error analysis is also carried out to quantitatively determine the extent of uncertainty introduced in the estimates of these parameters. This method is useful only when a realistic set of bounds can be obtained for the emissivities of the data. In the case of water, the lower and upper bounds were set at 0.97 and 1.00, respectively. Five flights were flown in succession at altitudes of 2 km (low), 6 km (mid), 12 km (high), and then back again at 6 km and 2 km. The area selected was the Ross Barnett Reservoir near Jackson, Mississippi. The mission was flown during the predawn hours of 1 Feb. 1992. Radiosonde data were collected for that duration to profile the characteristics of the atmosphere. Ground truth temperatures using thermometers and radiometers were also obtained over an area of the reservoir. The results of two independent runs of the radiometer data averaged
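
    A minimal sketch of the bounding idea follows. It inverts the Planck function per channel for the assumed emissivity bounds (0.97-1.00, as used for water) and intersects the per-channel temperature intervals; the band-centre wavelengths and radiances are synthetic placeholders, not TIMS data.

      import numpy as np

      H, C, K = 6.626e-34, 2.998e8, 1.381e-23      # Planck, speed of light, Boltzmann (SI)

      def planck(lam, T):
          """Spectral radiance B(lam, T) in W m^-2 sr^-1 m^-1."""
          return 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * K * T)) - 1.0)

      def brightness_temp(lam, L):
          """Closed-form inversion of the Planck function."""
          return H * C / (lam * K) / np.log(2 * H * C**2 / (lam**5 * L) + 1.0)

      lam = np.array([8.4e-6, 9.1e-6, 10.7e-6, 11.3e-6])   # hypothetical band centres, m
      T_true, eps_true = 288.0, 0.985                       # synthetic "truth" for the demo
      L = eps_true * planck(lam, T_true)                    # atmosphere-corrected radiances

      eps_lo, eps_hi = 0.97, 1.00                 # assumed emissivity bounds
      T_low = brightness_temp(lam, L / eps_hi)    # assuming high emissivity -> cooler target
      T_high = brightness_temp(lam, L / eps_lo)   # assuming low emissivity -> warmer target
      print("T bounded by [%.2f, %.2f] K" % (T_low.max(), T_high.min()))   # intersect channels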

  14. Retrieval of cloud optical properties using airborne hyperspectral cameras during the VOCALS campaign.

    NASA Astrophysics Data System (ADS)

    Labrador, L.; Vaughan, G.

    2009-09-01

    A set of two hyperspectral imaging sensors was used to analyze the optical properties of stratocumulus cloud off the coast of northern Chile within the framework of the VAMOS Ocean Clouds Atmosphere Land Study (VOCALS) during September-October 2008. The SPECIM Aisa Eagle & Hawk are tandem pushbroom-type hyperspectral imagers scanning the 400-970 and 970-2500 nm ranges, respectively. The instruments were mounted onboard the Natural Environment Research Council's (NERC) Dornier DO-228 aircraft, based in Arica, northern Chile, during the campaign. An area of approximately 600 x 200 km was surveyed off the northern coast of Chile, and a total of 14 science flights were carried out in which hyperspectral data were successfully collected over the stratocumulus deck at altitudes varying between 10000 and 15000 ft. Cloud optical properties, such as cloud optical thickness, cloud effective radius and liquid water path, can be retrieved and then compared with retrievals from space-borne hyperspectral imagers. Atmospheric corrections have been applied to enable the comparison between the different types of sensors, and the analysis requires, amongst other things, solving the back-scattering problems associated with off-nadir views. The high resolution, both spatial and temporal, of these airborne sensors makes them ideal for validating satellite retrievals of cloud optical properties.

  15. [In-flight absolute radiometric calibration of UAV multispectral sensor].

    PubMed

    Chen, Wei; Yan, Lei; Gou, Zhi-Yang; Zhao, Hong-Ying; Liu, Da-Ping; Duan, Yi-Ni

    2012-12-01

    Based on data from the scientific experiment at the Urad Front Banner site of the UAV Remote Sensing Load Calibration Field project, and with the help of 6 hyperspectral radiometric targets with good Lambertian properties, the wide-view multispectral camera on the UAV was calibrated using the reflectance-based method. The results reveal that for the green, red and infrared channels, whose images were successfully captured, the linear correlation coefficients between DN and radiance are all larger than 99%. In the final analysis, the comprehensive error is no more than 6%. The calibration results demonstrate that the hyperspectral targets deployed at the calibration field are well suited to in-flight calibration of airborne multispectral payloads. The calibration result is reliable and can be used in the retrieval of geophysical parameters. PMID:23427528
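
    A minimal sketch of the per-band fit implied by the reflectance-based calibration is given below; the digital numbers and target radiances are placeholders, not the experiment's data.

      import numpy as np

      dn = np.array([310.0, 620.0, 980.0, 1450.0, 1890.0, 2300.0])   # DNs over six targets
      radiance = np.array([12.0, 24.5, 39.0, 58.0, 75.5, 92.0])      # W m^-2 sr^-1 um^-1

      gain, offset = np.polyfit(dn, radiance, 1)     # linear DN -> radiance model
      r = np.corrcoef(dn, radiance)[0, 1]            # linear correlation coefficient
      print("L = %.4f * DN + %.3f, r = %.4f" % (gain, offset, r))
      print("Radiance at DN = 1200:", gain * 1200 + offset)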

  16. Retrieval of water quality algorithms from airborne HySpex camera for oxbow lakes in north-eastern Poland

    NASA Astrophysics Data System (ADS)

    Slapinska, Malgorzata; Berezowski, Tomasz; Frąk, Magdalena; Chormański, Jarosław

    2016-04-01

    The aim of this study was to retrieve empirical formulas for the water quality of oxbow lakes in the Lower Biebrza Basin (a river basin located in NE Poland) using the HySpex airborne imaging spectrometer. The Biebrza River is one of the biggest wetlands in Europe. It is characterised by a low contamination level and little human influence. Because of those characteristics, the Biebrza River can be treated as a reference area for other floodplain and fen ecosystems in Europe. Oxbow lakes are an important part of the Lower Biebrza Basin due to their retention and habitat functions. Hyperspectral remote sensing data were acquired by the HySpex sensor (which covers the range of 400-2500 nm) on 01-02.08.2015, with the ground measurement campaign conducted on 03-04.08.2015. The ground measurements consisted of two parts. The first part included spectral reflectance sampling with an ASD FieldSpec 3 spectroradiometer, which covers the wavelength range of 350-2500 nm at 1 nm intervals. In situ data were collected both for water and for specific objects within the area. The second part of the campaign included water parameters such as Secchi disc depth (SDD), electric conductivity (EC), pH, temperature and phytoplankton. The measured reflectance enabled an empirical line atmospheric correction, which was applied to the HySpex data. Our results indicated that proper atmospheric correction is very important for further data analysis. The empirical formulas for the water parameters were retrieved from the reflectance data. This study confirmed the applicability of the HySpex camera for retrieving water quality.
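
    A minimal sketch of an empirical line correction of the kind applied to the HySpex data is shown below for a single band, assuming hypothetical at-sensor values for one bright and one dark field target together with their ASD-measured ground reflectances; in practice the gain and offset are fitted per band over all reference spectra.

      import numpy as np

      at_sensor = np.array([0.42, 0.08])     # at-sensor signal for bright and dark targets
      ground_ref = np.array([0.55, 0.05])    # ASD-measured ground reflectance of the targets

      gain, offset = np.polyfit(at_sensor, ground_ref, 1)   # empirical line for this band

      band = np.array([[0.10, 0.30], [0.25, 0.40]])         # hypothetical image band
      reflectance = np.clip(gain * band + offset, 0.0, 1.0)
      print(reflectance)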

  17. Development of a computer-aided alignment simulator for an EO/IR dual-band airborne camera

    NASA Astrophysics Data System (ADS)

    Lee, Jun Ho; Ryoo, Seungyeol; Park, Kwang-Woo; Lee, Haeng Bok

    2012-10-01

    An airborne sensor is developed for remote sensing on an unmanned aerial vehicle (UAV). The sensor is an optical payload for an electro-optical/infrared (EO/IR) dual-band camera that combines visible and IR imaging capabilities in a compact and lightweight manner. It adopts a Ritchey-Chrétien telescope for the common front-end optics, with several relay optics that divide and deliver the EO and IR bands to a charge-coupled device (CCD) and an IR detector, respectively. To ease the assembly of such complicated optics, a computer-aided alignment program (herein called the simulator) was developed. The simulator first estimates the details of the misalignments, such as their locations, types, and amounts, from test results such as the modulation transfer function (MTF), Zernike polynomial coefficients, and RMS wavefront errors at different field positions. It then recommends the compensator movement(s) together with the estimated optical performance. The simulator is coded in MATLAB with a hidden connection to the optical analysis/design software Zemax. By interfacing ZEMAX and MATLAB, the GUI-based alignment simulator will help even those not familiar with the two programs to obtain accurate results more easily and quickly.

  18. Multispectral imaging of absorption and scattering properties of in vivo exposed rat brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Nishidate, Izumi; Ishizuka, Tomohiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-05-01

    In order to estimate multispectral images of the absorption and scattering properties in the cerebral cortex of in vivo rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital RGB camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated hemoglobin and that of deoxygenated hemoglobin were estimated as the absorption parameters, whereas the coefficient a and the exponent b of the reduced scattering coefficient spectrum approximated by a power law function were estimated as the scattering parameters. The spectra of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters, and the spectral images of absorption and reduced scattering coefficients were then estimated. In order to confirm the feasibility of this method, we performed in vivo experiments on exposed rat brain. The estimated images of the absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. The changes in the estimated absorption and scattering parameters during normoxia, hyperoxia, and anoxia indicate the potential applicability of the method by which to evaluate the pathophysiological conditions of in vivo brain due to the loss of tissue viability.
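
    A minimal sketch of Wiener estimation of spectral reflectance from RGB camera values follows; the camera sensitivities, training spectra and noise covariance are random placeholders used only to show the algebra, not the study's measured quantities.

      import numpy as np

      rng = np.random.default_rng(0)
      n_bands = 31                                   # e.g. 400-700 nm in 10 nm steps

      S = rng.random((3, n_bands))                   # camera spectral sensitivities (placeholder)
      training = rng.random((200, n_bands))          # training reflectance spectra (placeholder)
      R_rr = training.T @ training / len(training)   # reflectance autocorrelation matrix
      R_nn = 1e-4 * np.eye(3)                        # assumed sensor noise covariance

      # Wiener matrix: maps a camera response (3,) to an estimated spectrum (n_bands,).
      W = R_rr @ S.T @ np.linalg.inv(S @ R_rr @ S.T + R_nn)

      r_true = training[0]
      v = S @ r_true                                 # simulated RGB response
      r_hat = W @ v
      print("RMSE:", np.sqrt(np.mean((r_hat - r_true) ** 2)))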

  19. Geo-Referenced Mapping Using an Airborne 3D Time-of-Flight Camera

    NASA Astrophysics Data System (ADS)

    Kohoutek, T. K.; Nitsche, M.; Eisenbeiss, H.

    2011-09-01

    This paper presents the first experience of close-range bird's eye view photogrammetry with range imaging (RIM) sensors for the real-time generation of high-resolution geo-referenced 3D surface models. The aim of this study was to develop a mobile, versatile and less costly outdoor survey methodology for measuring natural surfaces compared to terrestrial laser scanning (TLS). Two commercial RIM cameras (SR4000 by MESA Imaging AG and CamCube 2.0 by PMDTechnologies GmbH) were mounted on a lightweight crane and on an unmanned aerial vehicle (UAV). The field experiments revealed various challenges in the real-time deployment of the two state-of-the-art RIM systems, e.g. processing of the large data volume. The acquisition strategy, data processing and first measurements are presented. The precision of the measured distances is better than 1 cm under good conditions. However, the measurement precision degraded under the test conditions due to direct sunlight, strong illumination contrasts and helicopter vibrations.

  20. Improved Airborne System for Sensing Wildfires

    NASA Technical Reports Server (NTRS)

    McKeown, Donald; Richardson, Michael

    2008-01-01

    The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.

  1. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO), and use of this camera for observations of star-formation regions 1. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  2. Michigan experimental multispectral scanner system

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.

    1972-01-01

    A functional description of a multispectral airborne scanner system that provides spectral bands along a single optical line of sight is reported. The airborne scanner consists of an optical telescope for scanning a plane perpendicular to the longitudinal axis of the aircraft and radiation detectors for converting radiation to electrical signals. The system makes a linear transformation of input radiation to voltage recorded on analog magnetic tape.

  3. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC)and Submillemeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  4. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications have not been well documented in related ...

  5. Remote sensing techniques applied to multispectral recognition of the Aranjuez pilot zone

    NASA Technical Reports Server (NTRS)

    Lemos, G. L.; Salinas, J.; Rebollo, M.

    1977-01-01

    A rectangular (7 x 14 km) area 40 km S of Madrid was remote-sensed with a three-stage recognition process. Ground truth was established in the first phase, airborne sensing with a multispectral scanner and photographic cameras were used in the second phase, and Landsat satellite data were obtained in the third phase. Agronomic and hydrological photointerpretation problems are discussed. Color, black/white, and labeled areas are displayed for crop recognition in the land-use survey; turbidity, concentrations of pollutants and natural chemicals, and densitometry of the water are considered in the evaluation of water resources.

  6. UAV-based multi-spectral environmental monitoring

    NASA Astrophysics Data System (ADS)

    Arnold, Thomas; De Biasio, Martin; Fritz, Andreas; Frank, Albert; Leitner, Raimund

    2012-06-01

    This paper describes an airborne multi-spectral imaging system which is able to simultaneously capture three visible (400-670 nm at 50% FWHM) and three near-infrared channels (670-1000 nm at 50% FWHM). The first prototype was integrated in a Schiebel CAMCOPTER® S-100 VTOL (Vertical Take-Off and Landing) UAV (Unmanned Aerial Vehicle) for initial test flights in spring 2010. The UAV was flown over land containing various types of vegetation. A miniaturized version of the initial multi-spectral imaging system was developed in 2011 to fit into a more compact UAV. The imaging system captured six bands with a minimal spatial resolution of approx. 10 cm x 10 cm (depending on altitude). Results show that the system is able to resist the high vibration level during flight and that the actively stabilized camera gimbal compensates for rapid roll/tilt movements of the UAV. After image registration the acquired images are stitched together for land cover mapping and flight path validation. Moreover, the system is able to distinguish between different types of vegetation and soil. Future work will include the use of spectral imaging techniques to identify spectral features that are related to water stress, nutrient deficiency and pest infestation. Once these bands have been identified, narrowband filters will be incorporated into the airborne system.

  7. Optical design of high resolution and large format CCD airborne remote sensing camera on unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Qian, Yixian; Cheng, Xiaowei; Shao, Jie

    2010-11-01

    Unmanned aerial vehicle remote sensing (UAVRS) is low in cost, flexible in task arrangement, and automatic and intelligent in application; it has been used widely for mapping, surveillance, reconnaissance and city planning. Airborne remote sensing missions require sensors with both high resolution and large fields of view, and large-format CCD digital airborne imaging systems are now a reality. A refractive system was designed to meet these requirements with the help of Code V software. It has a focal length of 150 mm, an F-number of 5.6, a waveband of 0.45-0.7 μm, and a field of view of 20°. It is shown that the modulation transfer function is higher than 0.5 at 55 lp/mm, distortion is less than 0.1%, and the image quality reaches the diffraction limit. The system, with its large-format CCD and wide field, can satisfy the demands of wide ground coverage and high resolution. The optical system, with its simple structure, small size and light weight, can be used in airborne remote sensing.

  8. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC 'Pop-up' Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  9. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.; Simpson, A. D. (Technical Monitor)

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC "Pop-Up" Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(Registered Trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  10. Cucumber disease diagnosis using multispectral images

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Li, Hongning; Shi, Junsheng; Yang, Weiping; Liao, Ningfang

    2009-07-01

    In this paper, a multispectral imaging technique for plant disease diagnosis is presented. Firstly, a multispectral imaging system is designed. This system utilizes 15 narrow-band filters, a panchromatic band, a monochrome CCD camera, and a standard illumination observing environment. The spectral reflectance and color of 8 Macbeth color patches are reproduced between 400 nm and 700 nm in the process. In addition, the spectral reflectance angle and color difference are obtained through measurements and analysis of the color patches using a spectrometer and the multispectral imaging system. The results show that the 16 narrow-band multispectral imaging system achieves good accuracy in spectral reflectance and color reproduction. Secondly, the familiar diseases of cucumber, a horticultural plant, are the research objects. 210 multispectral samples are obtained with the multispectral system and classified by a BP artificial neural network. The classification accuracies for Sphaerotheca fuliginea, Corynespora cassiicola and Pseudoperonospora cubensis are 100%, while those for Trichothecium roseum and Cladosporium cucumerinum are 96.67% and 90.00%, respectively. It is confirmed that the multispectral imaging system achieves good accuracy in cucumber disease diagnosis.
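
    A minimal sketch of the classification stage is given below, using scikit-learn's multilayer perceptron (a back-propagation-trained network) on randomly generated stand-ins for the 210 multispectral samples; the feature dimension, labels and network size are illustrative assumptions, so the printed accuracy only demonstrates the workflow.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      X = rng.random((210, 16))          # 210 samples x 16 spectral bands (placeholder)
      y = rng.integers(0, 5, size=210)   # 5 disease classes (placeholder labels)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
      clf.fit(X_tr, y_tr)
      print("Test accuracy:", clf.score(X_te, y_te))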

  11. Replacing 16 mm film cameras with high definition digital cameras

    SciTech Connect

    Balch, K.S.

    1995-12-31

    For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, airborne gun cameras, range tracking and other hazardous environments. The companies and government agencies using these cameras are in need of replacing them with a more cost effective solution. Film-based cameras still produce the best resolving capability, however, film development time, chemical disposal, recurring media cost, and faster digital analysis are factors influencing the desire for a 16 mm film camera replacement. This paper will describe a new camera from Kodak that has been designed to replace 16 mm high speed film cameras.

  12. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely-piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000 acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management. Few studies have examined the usefulness of high resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 x 10^6 pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).

  13. Commercial Applications Multispectral Sensor System

    NASA Technical Reports Server (NTRS)

    Birk, Ronald J.; Spiering, Bruce

    1993-01-01

    NASA's Office of Commercial Programs is funding a multispectral sensor system to be used in the development of remote sensing applications. The Airborne Terrestrial Applications Sensor (ATLAS) is designed to provide versatility in acquiring spectral and spatial information. The ATLAS system will be a test bed for the development of specifications for airborne and spaceborne remote sensing instrumentation for dedicated applications. This objective requires spectral coverage from the visible through thermal infrared wavelengths, variable spatial resolution from 2-25 meters; high geometric and geo-location accuracy; on-board radiometric calibration; digital recording; and optimized performance for minimized cost, size, and weight. ATLAS is scheduled to be available in 3rd quarter 1992 for acquisition of data for applications such as environmental monitoring, facilities management, geographic information systems data base development, and mineral exploration.

  14. Multispectral Photography: the obscure becomes the obvious

    ERIC Educational Resources Information Center

    Polgrean, John

    1974-01-01

    Commonly used in map making, real estate zoning, and highway route location, aerial photography planes equipped with multispectral cameras may, among many environmental applications, now be used to locate mineral deposits, define marshland boundaries, study water pollution, and detect diseases in crops and forests. (KM)

  15. Processing Of Multispectral Data For Identification Of Rocks

    NASA Technical Reports Server (NTRS)

    Evans, Diane L.

    1990-01-01

    Linear discriminant analysis and supervised classification evaluated. Report discusses processing of multispectral remote-sensing imagery to identify kinds of sedimentary rocks by spectral signatures in geological and geographical contexts. Raw image data are spectra of picture elements in images of seven sedimentary rock units exposed on margin of Wind River Basin in Wyoming. Data acquired by Landsat Thematic Mapper (TM), Thermal Infrared Multispectral Scanner (TIMS), and NASA/JPL airborne synthetic-aperture radar (SAR).

  16. Multi-spectral image dissector camera system

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The image dissector sensor for the Earth Resources Program is evaluated using contrast and reflectance data. The ground resolution obtainable for low contrast at the targeted signal to noise ratio of 1.8 was defined. It is concluded that the system is capable of achieving the detection of small, low contrast ground targets from satellites.

  17. Analysis of vegetation indices derived from aerial multispectral and ground hyperspectral data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aerial multispectral images are a good source of crop, soil, and ground coverage information. Spectral reflectance indices provide a useful tool for monitoring crop growing status. A series of aerial images were acquired by an airborne MS4100 multispectral imaging system on the cotton and soybean f...

  18. 3D Land Cover Classification Based on Multispectral LiDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    A multispectral LiDAR system can emit simultaneous laser pulses at different wavelengths. The reflected multispectral energy is captured through a receiver of the sensor, and the return signal, together with the position and orientation information of the sensor, is recorded. These recorded data are combined with GNSS/IMU data in further post-processing, forming high-density multispectral 3D point clouds. As the first commercial multispectral airborne LiDAR sensor, the Optech Titan system is capable of collecting point cloud data in all three channels: at 532 nm in the visible (green), at 1064 nm in the near infrared (NIR) and at 1550 nm in the intermediate infrared (IR). It has become a new source of data for 3D land cover classification. The paper presents an Object-Based Image Analysis (OBIA) approach that uses only multispectral LiDAR point cloud datasets for 3D land cover classification. The approach consists of three steps. Firstly, multispectral intensity images are segmented into image objects on the basis of multi-resolution segmentation integrating different scale parameters. Secondly, intensity objects are classified into nine categories by using customized classification indexes and a combination of the multispectral reflectance with the vertical distribution of object features. Finally, accuracy assessment is conducted by comparing random reference sample points from Google imagery tiles with the classification results. The classification results show high overall accuracy for most of the land cover types. An overall accuracy of over 90% is achieved by using multispectral LiDAR point clouds for 3D land cover classification.
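
    One example of a customized classification index that can be derived from the Titan channels is an NDVI-like ratio of the 1064 nm (NIR) and 532 nm (green) intensities; the sketch below computes it for hypothetical rasterized intensity grids, with an illustrative threshold.

      import numpy as np

      nir = np.array([[120.0, 40.0], [200.0, 15.0]])     # rasterized 1064 nm intensity (placeholder)
      green = np.array([[30.0, 35.0], [60.0, 14.0]])     # rasterized 532 nm intensity (placeholder)

      ndvi_like = (nir - green) / (nir + green + 1e-9)   # small epsilon avoids division by zero
      vegetation_mask = ndvi_like > 0.3                  # illustrative threshold
      print(ndvi_like)
      print(vegetation_mask)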

  19. Application of multispectral systems for the diagnosis of plant diseases

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Liao, Ningfang; Wang, Guolong; Luo, Yongdao; Liang, Minyong

    2008-03-01

    Multispectral imaging combines spatial imaging and spectral detection; it can obtain the spectral information and image information of an object at the same time. Based on this concept, a new method using a multispectral camera system is proposed for diagnosing plant diseases. In this paper, a multispectral camera is used as the image capturing device. It consists of a monochrome CCD camera and 16 narrow-band filters. In the experiment, multispectral images of the 24 Macbeth color patches are captured under the illumination of an incandescent lamp. The spectral reflectances of each color patch are calculated at 64 points from 400 to 700 nm using spline interpolation, and the color of the object is reproduced from the estimated spectral reflectance. The reproduction result is compared with the color signal measured using an X-Rite PULSE spectrophotometer. The average and maximum ΔE*ab are 9.23 and 12.81. It is confirmed that the multispectral system achieves color reproduction of plant diseases from narrow-band multispectral images.
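
    A minimal sketch of the two numerical steps mentioned above, spline interpolation of a sparsely sampled reflectance curve onto a 64-point 400-700 nm grid and the CIE 1976 ΔE*ab colour difference, is shown below; the reflectance samples and Lab coordinates are placeholders.

      import numpy as np
      from scipy.interpolate import CubicSpline

      wl_sparse = np.linspace(400, 700, 16)               # 16 narrow-band sample wavelengths, nm
      refl_sparse = 0.5 + 0.3 * np.sin(wl_sparse / 60.0)  # placeholder reflectance samples

      wl_dense = np.linspace(400, 700, 64)                # 64-point reconstruction grid
      refl_dense = CubicSpline(wl_sparse, refl_sparse)(wl_dense)

      def delta_e_ab(lab1, lab2):
          """CIE 1976 colour difference between two Lab triples."""
          return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

      print(refl_dense[:5])
      print("dE*ab:", delta_e_ab([52.0, 10.0, -3.0], [55.0, 14.0, 1.0]))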

  20. Multispectral observations of marine mammals

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon; Dirbas, Joseph; Podobna, Yuliya; Wells, Tami; Boucher, Cynthia; Oakley, Daniel

    2008-10-01

    Multispectral visible and infrared observations of various species of whales were made in the St. Lawrence Seaway near Quebec, Canada, and at Papawai Point in Maui, Hawaii. The Multi-mission Adaptable Narrowband Imaging System (MANTIS) was deployed in two configurations: airborne looking down, and bluff-mounted looking at low grazing angles. An infrared (IR) sensor was also deployed in the bluff-mounted configuration. These systems detected submerged and surfaced mammals at ranges up to 8 miles. Automatic detection algorithms are being explored to detect, track and monitor the behavior of individual whales and pods. This effort is part of a United States Navy program to ensure that marine mammals are not injured during the testing of the Navy's acoustic Anti-Submarine Warfare (ASW) systems.

  1. Leica ADS40 Sensor for Coastal Multispectral Imaging

    NASA Technical Reports Server (NTRS)

    Craig, John C.

    2007-01-01

    The Leica ADS40 Sensor as it is used for coastal multispectral imaging is presented. The contents include: 1) Project Area Overview; 2) Leica ADS40 Sensor; 3) Focal Plate Arrangements; 4) Trichroid Filter; 5) Gradient Correction; 6) Image Acquisition; 7) Remote Sensing and ADS40; 8) Band comparisons of Satellite and Airborne Sensors; 9) Impervious Surface Extraction; and 10) Impervious Surface Details.

  2. Multispectral microwave imaging radar for remote sensing applications

    NASA Technical Reports Server (NTRS)

    Larson, R. W.; Rawson, R.; Ausherman, D.; Bryan, L.; Porcello, L.

    1974-01-01

    A multispectral airborne microwave radar imaging system, capable of obtaining four images simultaneously is described. The system has been successfully demonstrated in several experiments and one example of results obtained, fresh water ice, is given. Consideration of the digitization of the imagery is given and an image digitizing system described briefly. Preliminary results of digitization experiments are included.

  3. Multispectral imaging system for contaminant detection

    NASA Technical Reports Server (NTRS)

    Poole, Gavin H. (Inventor)

    2003-01-01

    An automated inspection system for detecting digestive contaminants on food items as they are being processed for consumption includes a conveyor for transporting the food items, a light sealed enclosure which surrounds a portion of the conveyor, with a light source and a multispectral or hyperspectral digital imaging camera disposed within the enclosure. Operation of the conveyor, light source and camera are controlled by a central computer unit. Light reflected by the food items within the enclosure is detected in predetermined wavelength bands, and detected intensity values are analyzed to detect the presence of digestive contamination.

  4. Multispectral imaging using a single bucket detector

    PubMed Central

    Bian, Liheng; Suo, Jinli; Situ, Guohai; Li, Ziwei; Fan, Jingtao; Chen, Feng; Dai, Qionghai

    2016-01-01

    Existing multispectral imagers mostly use available array sensors to separately measure 2D data slices in a 3D spatial-spectral data cube. Thus they suffer from low photon efficiency, limited spectrum range and high cost. To address these issues, we propose to conduct multispectral imaging using a single bucket detector, to take full advantage of its high sensitivity, wide spectrum range, low cost, small size and light weight. Technically, utilizing the detector’s fast response, a scene’s 3D spatial-spectral information is multiplexed into a dense 1D measurement sequence and then demultiplexed computationally under the single pixel imaging scheme. A proof-of-concept setup is built to capture multispectral data of 64 pixels × 64 pixels × 10 wavelength bands ranging from 450 nm to 650 nm, with the acquisition time being 1 minute. The imaging scheme holds great potentials for various low light and airborne applications, and can be easily manufactured as production-volume portable multispectral imagers. PMID:27103168

  5. Multispectral imaging using a single bucket detector.

    PubMed

    Bian, Liheng; Suo, Jinli; Situ, Guohai; Li, Ziwei; Fan, Jingtao; Chen, Feng; Dai, Qionghai

    2016-01-01

    Existing multispectral imagers mostly use available array sensors to separately measure 2D data slices in a 3D spatial-spectral data cube. Thus they suffer from low photon efficiency, limited spectrum range and high cost. To address these issues, we propose to conduct multispectral imaging using a single bucket detector, to take full advantage of its high sensitivity, wide spectrum range, low cost, small size and light weight. Technically, utilizing the detector's fast response, a scene's 3D spatial-spectral information is multiplexed into a dense 1D measurement sequence and then demultiplexed computationally under the single pixel imaging scheme. A proof-of-concept setup is built to capture multispectral data of 64 pixels × 64 pixels × 10 wavelength bands ranging from 450 nm to 650 nm, with the acquisition time being 1 minute. The imaging scheme holds great potentials for various low light and airborne applications, and can be easily manufactured as production-volume portable multispectral imagers. PMID:27103168
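
    The reconstruction behind the scheme can be sketched per wavelength band as solving y = A x, where the rows of A are the spatial modulation patterns and y is the bucket-detector sequence. The toy example below uses random binary patterns, a tiny 8 x 8 scene and a plain least-squares solve; practical systems use compressive-sensing solvers and much larger scenes.

      import numpy as np

      rng = np.random.default_rng(1)
      n_pix = 8 * 8                       # tiny 8 x 8 scene for one spectral band
      n_meas = 2 * n_pix                  # oversampled so plain least squares suffices

      scene = rng.random(n_pix)                                       # placeholder scene
      A = rng.integers(0, 2, size=(n_meas, n_pix)).astype(float)      # binary patterns
      y = A @ scene                                                   # bucket measurements
      x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)                   # computational demultiplexing
      print("Max reconstruction error:", np.abs(x_hat - scene).max())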

  6. Compositional and Mineralogic Interpretation of MSL Curiosity Rover Mastcam Multispectral Measurements in Gale Crater

    NASA Astrophysics Data System (ADS)

    Wellington, D. F.; Bell, J. F.; Godber, A.; Johnson, J. R.; Rice, M. S.; Kinch, K. M.; MSL Science Team

    2014-07-01

    The MSL Mast Cameras can produce multispectral (445-1013 nm) images, which, together with frequent calibration target imaging, allow calibration to relative reflectance. Since landing, distinct spectral classes of surface materials have been observed.

  7. Novel instrumentation of multispectral imaging technology for detecting tissue abnormity

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua

    2012-10-01

    Multispectral imaging is becoming a powerful tool in a wide range of biological and clinical studies by adding spectral, spatial and temporal dimensions to visualize tissue abnormalities and the underlying biological processes. A conventional spectral imaging system includes two physically separated major components: a band-pass selection device (such as a liquid crystal tunable filter or a diffraction grating) and a scientific-grade monochromatic camera, and is expensive and bulky. Recently, a micro-arrayed narrow-band optical mosaic filter was invented and successfully fabricated to reduce the size and cost of multispectral imaging devices in order to meet the clinical requirements for medical diagnostic imaging applications. However, the challenging issue of how to integrate and place the micro filter mosaic chip onto the target focal plane, i.e., the imaging sensor, of an off-the-shelf CMOS/CCD camera has not been reported. This paper presents the methods and results of integrating such a miniaturized filter with off-the-shelf CMOS imaging sensors to produce handheld real-time multispectral imaging devices for the application of early-stage pressure ulcer (ESPU) detection. Unlike conventional multispectral imaging devices, which are bulky and expensive, the resulting handheld real-time multispectral ESPU detector can produce multiple images at different center wavelengths with a single shot, thereby eliminating the image registration procedure required by traditional multispectral imaging technologies.

  8. Remote sensing of shorelines using data fusion of hyperspectral and multispectral imagery acquired from mobile and fixed platforms

    NASA Astrophysics Data System (ADS)

    Bostater, Charles R.; Frystacky, Heather

    2012-06-01

    An optimized data fusion methodology is presented that makes use of airborne and vessel-mounted hyperspectral and multispectral imagery acquired at littoral zones in Florida and the northern Gulf of Mexico. The results demonstrate the use of hyperspectral-multispectral data fusion for anomaly detection along shorelines and in surface and subsurface waters. Hyperspectral imagery utilized in the data fusion analysis was collected using a temperature-stabilized sensing system with 64-1024 channels and a 1376-pixel swath width, an integrated inertial motion unit, and differential GPS. The imaging system is calibrated using dual 18 inch calibration spheres, spectral line sources, and custom line targets. Simultaneously collected three-band multispectral imagery used in the data fusion analysis was derived from either a 12 inch focal length large-format camera using 9 inch high-speed AGFA color negative film, a 12.3 megapixel digital camera, or dual high-speed full-definition video cameras. Pushbroom sensor imagery is corrected using Kalman filtering and smoothing in order to correct images for the motions of the airborne platform or small vessel. Custom software developed for the hyperspectral system and the optimized data fusion process allows for post-processing using atmospherically corrected and georeferenced reflectance imagery. The optimized data fusion approach allows for detecting spectral anomalies in the resolution-enhanced data cubes. Spectral-spatial anomaly detection is demonstrated using simulated embedded targets in actual imagery. The approach allows one to utilize spectral signature anomalies to identify features and targets that would otherwise not be detectable. The optimized data fusion techniques and software have been developed in order to perform sensitivity analysis of the synthetic images, optimizing the singular value decomposition model-building process and the 2-D Butterworth cutoff frequency selection process using the concept of user-defined "feature

  9. Low SWaP multispectral sensors using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4-band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and scalable production.
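
    As an illustration of the de-mosaicing step described above, the sketch below splits a raw frame with an assumed 2x2 RGB + NIR mosaic into four per-band images by simple sub-sampling; production pipelines would normally interpolate the missing pixels, much as in Bayer demosaicing, and the mosaic layout here is an assumption.

```python
import numpy as np

def demosaic_rgbn(raw: np.ndarray) -> dict:
    """Split a raw frame with an assumed 2x2 mosaic
         R   G
         NIR B
    into four quarter-resolution band images (no interpolation)."""
    return {
        "R":   raw[0::2, 0::2],
        "G":   raw[0::2, 1::2],
        "NIR": raw[1::2, 0::2],
        "B":   raw[1::2, 1::2],
    }

# Placeholder 12-bit raw frame standing in for the mosaic sensor output
raw = np.random.default_rng(1).integers(0, 4096, size=(480, 640), dtype=np.uint16)
bands = demosaic_rgbn(raw)
for name, img in bands.items():
    print(name, img.shape)   # each band: (240, 320)
```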

  10. Remote Multispectral Imaging of Wildland Fires (Invited)

    NASA Astrophysics Data System (ADS)

    Vodacek, A.; Kremens, R.

    2010-12-01

    Wildland fires produce a variety of signal phenomenologies that are remotely observable. These signals span a large portion of the electromagnetic spectrum and can be related to a variety of properties of wildland fires as they propagate. The deployment of multispectral sensors from aircraft provides a unique perspective on the fire and its interactions with the environment by repeated imaging over time. We describe a set of airborne imaging experiments, image processing methodologies and a workflow system for near real-time extraction of information on the fire and the immediate environment.

  11. A real-time multispectral imaging system for low- or mid-altitude remote sensing

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua

    2012-10-01

    Multispectral imaging is a powerful tool in remote sensing applications. Recently a micro-arrayed narrow-band optical mosaic filter was invented and successfully fabricated to reduce the size and cost of multispectral imaging devices in order to meet the requirements for low- or mid-altitude remote sensing. Such a filter with four narrow bands is integrated with an off-the-shelf CCD camera, resulting in an economical and lightweight multispectral imaging camera capable of producing multiple images at different center wavelengths with a single shot. The multispectral imaging camera is then integrated with a wireless transmitter and battery to produce a remote sensing multispectral imaging system. The design and some preliminary results of a prototyped multispectral imaging system with potential for remote sensing applications, weighing only 200 grams, are reported. The prototyped multispectral imaging system eliminates the image registration procedure required by traditional multispectral imaging technologies. In addition, it has other advantages such as low cost, light weight, and a compact design.

  12. Development and application of multispectral algorithms for defect apple inspection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This research developed and evaluated the multispectral algorithm derived from a hyperspectral line-scan imaging system equipped with an electron-multiplying-charge-coupled-device camera and an imaging spectrograph for the detection of defective Red Delicious apples. The algorithm utilized the fluo...

  13. Replacing 16-mm film cameras with high-definition digital cameras

    NASA Astrophysics Data System (ADS)

    Balch, Kris S.

    1995-09-01

    For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, in airborne gun cameras, for range tracking, and in other hazardous environments. The companies and government agencies using these cameras need to replace them with a more cost-effective solution. Film-based cameras still produce the best resolving capability; however, film development time, chemical disposal, recurring media cost, and faster digital analysis are factors influencing the desire for a 16 mm film camera replacement. This paper will describe a new camera from Kodak that has been designed to replace 16 mm high-speed film cameras.

  14. Real-time aerial multispectral imaging solutions using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Chandler, Eric V.; Fish, David E.

    2014-06-01

    The next generation of multispectral sensors and cameras needs to deliver significant improvements in size, weight, portability, and spectral band customization to support widespread commercial deployment for a variety of purpose-built aerial, unmanned, and scientific applications. The benefits of multispectral imaging are well established for applications including machine vision, biomedical, authentication, and remote sensing environments - but many aerial and OEM solutions require more compact, robust, and cost-effective production cameras to realize these benefits. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color camera image processing, individual spectral channels are de-mosaiced with each channel providing an image of the field of view. We demonstrate recent results of 4-9 band dichroic filter arrays in multispectral cameras using a variety of sensors including linear, area, silicon, and InGaAs. Specific implementations range from hybrid RGB + NIR sensors to custom sensors with application-specific VIS, NIR, and SWIR spectral bands. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and development path. Finally, we report on the wafer-level fabrication of dichroic filter arrays on imaging sensors for scalable production of multispectral sensors and cameras.

  15. Real-time compact multispectral imaging solutions using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Chandler, Eric V.; Fish, David E.

    2014-03-01

    The next generation of multispectral sensors and cameras will need to deliver significant improvements in size, weight, portability, and spectral band customization to support widespread commercial deployment. The benefits of multispectral imaging are well established for applications including machine vision, biomedical, authentication, and aerial remote sensing environments - but many OEM solutions require more compact, robust, and cost-effective production cameras to realize these benefits. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color camera image processing, individual spectral channels are de-mosaiced with each channel providing an image of the field of view. We demonstrate recent results of 4-9 band dichroic filter arrays in multispectral cameras using a variety of sensors including linear, area, silicon, and InGaAs. Specific implementations range from hybrid RGB + NIR sensors to custom sensors with application-specific VIS, NIR, and SWIR spectral bands. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and development path. Finally, we report on the wafer-level fabrication of dichroic filter arrays on imaging sensors for scalable production of multispectral sensors and cameras.

  16. Active and passive multispectral scanner for earth resources applications: An advanced applications flight experiment

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.; Peterson, L. M.; Thomson, F. J.; Work, E. A.; Kriegler, F. J.

    1977-01-01

    The development of an experimental airborne multispectral scanner to provide both active (laser-illuminated) and passive (solar-illuminated) data from a commonly registered surface scene is discussed. The system was constructed according to specifications derived in an initial program design study. The system was installed in an aircraft and test flown to produce illustrative active and passive multispectral imagery. However, data were not collected or analyzed for any specific application.

  17. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems. PMID:27410361

  18. An Approach to Application Validation of Multispectral Sensors Using AVIRIS

    NASA Technical Reports Server (NTRS)

    Warner, Amanda; Blonski, Slawomir; Gasser, Gerald; Ryan, Robert; Zanoni, Vicki

    2001-01-01

    High-resolution multispectral data are becoming widely available for commercial and scientific use. For specific applications, such as agriculture studies, there is a need to quantify the performance of such systems. In many cases, parameters such as ground sample distance (GSD) and signal-to-noise ratio (SNR) can be optimized. Data sets with varying GSDs for the Landsat ETM+ bands were produced to evaluate the effects of GSD on various algorithms and transformations, such as NDVI, principal component analysis, unsupervised classification, and mixture analysis. By showing that AVIRIS data can be used to simulate spaceborne and airborne multispectral platforms over a wide range of GSDs, this research can be used to assist in band selection and spatial resolution specifications for new sensors and in optimization of acquisition strategies for existing multispectral systems.
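
    The kind of simulation described above can be sketched as follows, under the assumption of a hyperspectral cube with a known center-wavelength list: narrow bands are averaged into a broad multispectral band (a flat spectral response is assumed for simplicity) and then block-averaged to coarser GSDs before computing an index such as NDVI. The band limits and array names are illustrative, not the study's configuration.

```python
import numpy as np

def synthesize_band(cube, wavelengths_nm, lo_nm, hi_nm):
    """Average the hyperspectral bands falling inside [lo_nm, hi_nm]
    to emulate one broad multispectral band (flat spectral response assumed)."""
    sel = (wavelengths_nm >= lo_nm) & (wavelengths_nm <= hi_nm)
    return cube[..., sel].mean(axis=-1)

def degrade_gsd(img, factor):
    """Block-average to simulate a coarser ground sample distance."""
    h, w = (img.shape[0] // factor) * factor, (img.shape[1] // factor) * factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Illustrative stand-in for an AVIRIS reflectance cube (rows, cols, bands)
rng = np.random.default_rng(2)
wavelengths_nm = np.linspace(400, 2500, 224)
cube = rng.random((120, 120, wavelengths_nm.size))

red = synthesize_band(cube, wavelengths_nm, 630, 690)    # roughly ETM+ band 3
nir = synthesize_band(cube, wavelengths_nm, 760, 900)    # roughly ETM+ band 4

for factor in (1, 2, 4):                                  # increasingly coarse GSD
    r, n = degrade_gsd(red, factor), degrade_gsd(nir, factor)
    ndvi = (n - r) / (n + r + 1e-9)
    print(f"GSD factor {factor}: NDVI mean = {ndvi.mean():.3f}")
```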

  19. Galileo multispectral imaging of Earth.

    PubMed

    Geissler, P; Thompson, W R; Greenberg, R; Moersch, J; McEwen, A; Sagan, C

    1995-08-25

    Nearly 6000 multispectral images of Earth were acquired by the Galileo spacecraft during its two flybys. The Galileo images offer a unique perspective on our home planet through the spectral capability made possible by four narrowband near-infrared filters, intended for observations of methane in Jupiter's atmosphere, which are not incorporated in any of the currently operating Earth orbital remote sensing systems. Spectral variations due to mineralogy, vegetative cover, and condensed water are effectively mapped by the visible and near-infrared multispectral imagery, showing a wide variety of biological, meteorological, and geological phenomena. Global tectonic and volcanic processes are clearly illustrated by these images, providing a useful basis for comparative planetary geology. Differences between plant species are detected through the narrowband IR filters on Galileo, allowing regional measurements of variation in the "red edge" of chlorophyll and the depth of the 1-micrometer water band, which is diagnostic of leaf moisture content. Although evidence of life is widespread in the Galileo data set, only a single image (at approximately 2 km/pixel) shows geometrization plausibly attributable to our technical civilization. Water vapor can be uniquely imaged in the Galileo 0.73-micrometer band, permitting spectral discrimination of moist and dry clouds with otherwise similar albedo. Surface snow and ice can be readily distinguished from cloud cover by narrowband imaging within the sensitivity range of Galileo's silicon CCD camera. Ice grain size variations can be mapped using the weak H2O absorption at 1 micrometer, a technique which may find important applications in the exploration of the moons of Jupiter. The Galileo images have the potential to make unique contributions to Earth science in the areas of geological, meteorological and biological remote sensing, due to the inclusion of previously untried narrowband IR filters. The vast scale and near global

  20. Land use classification utilizing remote multispectral scanner data and computer analysis techniques

    NASA Technical Reports Server (NTRS)

    Leblanc, P. N.; Johannsen, C. J.; Yanner, J. E.

    1973-01-01

    An airborne multispectral scanner was used to collect the visible and reflective infrared data. A small subdivision near Lafayette, Indiana was selected as the test site for the urban land use study. Multispectral scanner data were collected over the subdivision on May 1, 1970 from an altitude of 915 meters. The data were collected in twelve wavelength bands from 0.40 to 1.00 micrometers by the scanner. The results indicated that computer analysis of multispectral data can be very accurate in classifying and estimating the natural and man-made materials that characterize land uses in an urban scene.

  1. Multispectral imaging probe

    SciTech Connect

    Sandison, David R.; Platzbecker, Mark R.; Descour, Michael R.; Armour, David L.; Craig, Marcus J.; Richards-Kortum, Rebecca

    1999-01-01

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector.

  2. Multispectral imaging probe

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Descour, M.R.; Armour, D.L.; Craig, M.J.; Richards-Kortum, R.

    1999-07-27

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector. 8 figs.

  3. Fusion of remotely sensed data from airborne and ground-based sensors for cotton regrowth study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The study investigated the use of aerial multispectral imagery and ground-based hyperspectral data for the discrimination of different crop types and timely detection of cotton plants over large areas. Airborne multispectral imagery and ground-based spectral reflectance data were acquired at the sa...

  4. Remote online processing of multispectral image data

    NASA Astrophysics Data System (ADS)

    Groh, Christine; Rothe, Hendrik

    2005-10-01

    Within the scope of this paper, a compact and economical data acquisition system for multispectral images is described. It consists of a CCD camera and a liquid crystal tunable filter, in combination with an associated concept for data processing. Despite their limited functionality (e.g., regarding calibration) in comparison with commercial systems such as AVIRIS, the use of these upcoming compact multispectral camera systems can be advantageous in many applications. Additional benefit can be derived by adding online data processing. In order to maintain the system's low weight and price, this work proposes to separate the data acquisition and processing modules and transmit pre-processed camera data online to a stationary high-performance computer for further processing. The inevitable data transmission has to be optimized because of bandwidth limitations. All mentioned considerations hold especially for applications involving mini-unmanned-aerial-vehicles (mini-UAVs). Due to their limited internal payload, the use of a lightweight, compact camera system is of particular importance. This work emphasizes the optimal software interface between pre-processed data (from the camera system), transmitted data (regarding the small bandwidth), and post-processed data (on the high-performance computer). Discussed parameters are pre-processing algorithms, channel bandwidth, and the resulting accuracy in the classification of multispectral image data. The benchmarked pre-processing algorithms include diagnostic statistics and tests of internal determination coefficients, as well as lossless and lossy data compression methods. The resulting classification precision is computed in comparison to a classification performed with the original image dataset.

  5. Simultaneous multispectral imaging using lenslet arrays

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Jensen, James

    2013-03-01

    There is a need for small, compact multispectral and hyperspectral imaging systems that simultaneously image in many spectral bands across the infrared spectral region, from short-wave to long-wave infrared. This is a challenge for conventional optics and usually requires large, costly and complex optical systems. However, with the advances in materials and photolithographic technology, micro-opto-electro-mechanical systems (MOEMS) can meet these goals. In this paper Pacific Advanced Technology and ECBC present the work being done under an SBIR contract with the US Army using a MOEMS-based diffractive optical lenslet array to perform simultaneous multispectral and hyperspectral imaging with relatively high spatial resolution. Under this program we will develop a proof-of-concept system that demonstrates how a diffractive optical (DO) lenslet array can image 1024 x 1024 pixels in 16 colors every frame of the camera. Each color image has a spatial resolution of 256 x 256 pixels with an IFOV of 1.7 mrads and a FOV of 25 degrees. The purpose of this work is to simultaneously image multiple colors each frame and reduce the temporal changes between colors that are apparent in sequential multispectral imaging. Translating the lenslet array will collect hyperspectral image data cubes, as will be explained later in this paper. Because the optics are integrated with the detector, the entire multispectral/hyperspectral system can be contained in a miniature package. The spectral images are collected simultaneously, providing high-resolution spectral-spatial-temporal information in each frame of the camera and thus enabling the implementation of spectral-temporal-spatial algorithms in real time with high sensitivity for the detection of weak signals in a high-background-clutter environment and low sensitivity to camera motion. Using MOEMS actuation the DO lenslet array is translated along the optical axis to complete the full hyperspectral data cube in just a few frames of the

  6. SWNT Imaging Using Multispectral Image Processing

    NASA Astrophysics Data System (ADS)

    Blades, Michael; Pirbhai, Massooma; Rotkin, Slava V.

    2012-02-01

    A flexible optical system was developed to image carbon single-wall nanotube (SWNT) photoluminescence using the multispectral capabilities of a typical CCD camcorder. The built-in Bayer filter of the CCD camera was utilized, using OpenCV C++ libraries for image processing, to decompose the image generated in a high-magnification epifluorescence microscope setup into three pseudo-color channels. By carefully calibrating the filter beforehand, it was possible to extract spectral data from these channels, and effectively isolate the SWNT signals from the background.

  7. Automated Data Production for a Novel Airborne Multiangle Spectropolarimetric Imager (airmspi)

    NASA Astrophysics Data System (ADS)

    Jovanovic, V. M.; Bull, M.; Diner, D. J.; Geier, S.; Rheingans, B.

    2012-07-01

    A novel polarimetric imaging technique making use of rapid retardance modulation has been developed by JPL as a part of NASA's Instrument Incubator Program. It has been built into the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) under NASA's Airborne Instrument Technology Transition Program, and is aimed primarily at remote sensing of the amounts and microphysical properties of aerosols and clouds. AirMSPI includes an 8-band (355, 380, 445, 470, 555, 660, 865, 935 nm) pushbroom camera that measures polarization in a subset of the bands (470, 660, and 865 nm). The camera is mounted on a gimbal and acquires imagery in a configurable set of along-track viewing angles ranging between +67° and -67° relative to nadir. As a result, near simultaneous multi-angle, multi-spectral, and polarimetric measurements of the targeted areas at a spatial resolution ranging from 7 m to 20 m (depending on the viewing angle) can be derived. An automated data production system is being built to support high data acquisition rate in concert with co-registration and orthorectified mapping requirements. To date, a number of successful engineering checkout flights were conducted in October 2010, August-September 2011, and January 2012. Data products resulting from these flights will be presented.

  8. Automated Data Production For A Novel Airborne Multiangle Spectropolarimetric Imager (AIRMSPI)

    NASA Technical Reports Server (NTRS)

    Jovanovic, V .M.; Bull, M.; Diner, D. J.; Geier, S.; Rheingans, B.

    2012-01-01

    A novel polarimetric imaging technique making use of rapid retardance modulation has been developed by JPL as a part of NASA's Instrument Incubator Program. It has been built into the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) under NASA's Airborne Instrument Technology Transition Program, and is aimed primarily at remote sensing of the amounts and microphysical properties of aerosols and clouds. AirMSPI includes an 8-band (355, 380, 445, 470, 555, 660, 865, 935 nm) pushbroom camera that measures polarization in a subset of the bands (470, 660, and 865 nm). The camera is mounted on a gimbal and acquires imagery in a configurable set of along-track viewing angles ranging between +67 deg and -67 deg relative to nadir. As a result, near simultaneous multi-angle, multi-spectral, and polarimetric measurements of the targeted areas at a spatial resolution ranging from 7 m to 20 m (depending on the viewing angle) can be derived. An automated data production system is being built to support high data acquisition rate in concert with co-registration and orthorectified mapping requirements. To date, a number of successful engineering checkout flights were conducted in October 2010, August-September 2011, and January 2012. Data products resulting from these flights will be presented.

  9. Characterization method for a multispectral high-dynamic-range imaging system

    NASA Astrophysics Data System (ADS)

    Kim, Duck Bong; Lee, Kwan H.

    2014-07-01

    An accurate characterization method for a multispectral high-dynamic-range (HDR) imaging system is proposed by combining multispectral and HDR imaging technologies. The multispectral HDR imaging system, which can acquire the visible spectrum at many wavelength bands, can provide accurate color reproduction and physical radiance information of real objects. An HDR camera is used to capture an HDR image without multiple exposures and a liquid crystal tunable filter (LCTF) is used to generate multispectral images. To address several limitations of the multispectral HDR imaging system, a carefully designed and innovative characterization algorithm is presented that considers the logarithmic camera response of the HDR camera and the differing spectral transmittance of the LCTF. The proposed method efficiently and accurately recovers the full spectrum from the multispectral HDR images using a transformation matrix and provides device-independent color information (e.g., CIEXYZ and CIELAB). The transformation matrix is estimated by relating the estimated sensor responses from the multispectral HDR imaging system to the reflectance measurements from a spectroradiometer using the Moore-Penrose pseudoinverse matrix.
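
    The transformation-matrix step lends itself to a compact sketch: training camera responses and matching spectroradiometer reflectances are related by a linear mapping fit with the Moore-Penrose pseudoinverse, which is then applied to new measurements. The dimensions and variable names below are assumptions, not the authors' exact configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

N_CHANNELS = 16     # assumed number of LCTF channel responses per sample
N_WAVEL = 36        # assumed reflectance samples, e.g. 400-750 nm at 10 nm steps
N_TRAIN = 120       # training patches measured with the spectroradiometer

# Training data: camera responses R (N_TRAIN x N_CHANNELS) and
# reference reflectances S (N_TRAIN x N_WAVEL)
R_train = rng.random((N_TRAIN, N_CHANNELS))
W_true = rng.random((N_CHANNELS, N_WAVEL))          # unknown in practice
S_train = R_train @ W_true + 0.01 * rng.standard_normal((N_TRAIN, N_WAVEL))

# Fit the transformation matrix with the Moore-Penrose pseudoinverse
W = np.linalg.pinv(R_train) @ S_train                # (N_CHANNELS x N_WAVEL)

# Recover the full spectrum for a new multispectral measurement
r_new = rng.random((1, N_CHANNELS))
s_est = r_new @ W
print(s_est.shape)   # (1, 36): estimated reflectance spectrum
```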

  10. Development of the second generation Hyperspectral Airborne Terrestrial Imager (HATI): HATI - 2500

    NASA Astrophysics Data System (ADS)

    Sandor-Leahy, S.; Thordarson, S.; Baldauf, B.; Figueroa, M.; Helmlinger, M.; Miller, H.; Reynolds, T.; Shepanski, J.

    2010-08-01

    Northrop Grumman Aerospace Systems (NGAS) has a long legacy developing and fielding hyperspectral sensors, including airborne and space based systems covering the visible through Long Wave Infrared (LWIR) wavelength ranges. Most recently NGAS has developed the Hyperspectral Airborne Terrestrial Instrument (HATI) family of hyperspectral sensors, which are compact airborne hyperspectral imagers designed to fly on a variety of platforms and be integrated with other sensors in NGAS's instrument suite. The current sensor under development is the HATI-2500, a full range Visible Near Infrared (VNIR) through Short Wave Infrared (SWIR) instrument covering the 0.4-2.5 μm wavelength range with high spectral resolution (3 nm). The system includes a framing camera integrated with a GPS/INS to provide high-resolution multispectral imagery and precision geolocation. Its compact size and flexible acquisition parameters allow HATI-2500 to be integrated on a large variety of aerial platforms. This paper describes the HATI-2500 sensor and subsystems and its expected performance specifications.

  11. Multispectral Mapping of the Moon by Clementine

    NASA Technical Reports Server (NTRS)

    Eliason, Eric M.; McEwen, Alfred S.; Robinson, M.; Lucey, Paul G.; Duxbury, T.; Malaret, E.; Pieters, Carle; Becker, T.; Isbell, C.; Lee, E.

    1998-01-01

    One of the chief scientific objectives of the Clementine mission at the Moon was to acquire global multispectral mapping. A global digital map of the Moon in 11 spectral bandpasses and at a scale of 100 m/pixel is being produced at the U.S. Geological Survey in Flagstaff, Arizona. Near-global coverage was acquired with the UVVIS camera (central wavelengths of 415, 750, 900, 950, and 1000 nm) and the NIR camera (1102, 1248, 1499, 1996, 2620, and 2792 nm). We expect to complete processing of the UVVIS mosaics before the fall of 1998, and to complete the NIR mosaics a year later. The purpose of this poster is to provide an update on the processing and to show examples of the products or perhaps even a wall-sized display of color products from the UVVIS mosaics.

  12. Predicting beef tenderness using color and multispectral image texture features.

    PubMed

    Sun, X; Chen, K J; Maddock-Carlin, K R; Anderson, V L; Lepper, A N; Schwartz, C A; Keller, W L; Ilse, B R; Magolski, J D; Berg, E P

    2012-12-01

    The objective of this study was to investigate the usefulness of raw meat surface characteristics (texture) in predicting cooked beef tenderness. Color and multispectral texture features, including 4 different wavelengths and 217 image texture features, were extracted from 2 laboratory-based multispectral camera imaging systems. Steaks were segregated into tough and tender classification groups based on Warner-Bratzler shear force. The texture features were submitted to STEPWISE multiple regression and support vector machine (SVM) analyses to establish prediction models for beef tenderness. A subsample (80%) of the tender- or tough-classified steaks was used to train models, which were then validated on the remaining (20%) test steaks. For color images, the SVM model correctly identified tender steaks with 100% accuracy while the STEPWISE equation identified 94.9% of the tender steaks correctly. For multispectral images, the SVM model achieved 91% and the STEPWISE model 87% average accuracy in predicting beef tenderness. PMID:22647652
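
    A minimal sketch of the classification protocol described (an 80/20 train/test split with an SVM on texture features) is shown below; the synthetic feature values and the scikit-learn defaults are placeholders rather than the study's actual data or settings.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)

# Placeholder data: 100 steaks x 217 image texture features,
# labels 1 = tender, 0 = tough (derived from Warner-Bratzler shear force)
X = rng.random((100, 217))
y = rng.integers(0, 2, size=100)

# 80% training / 20% test split, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.2f}")
```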

  13. [Nitrogen stress measurement of canola based on multi-spectral charged coupled device imaging sensor].

    PubMed

    Feng, Lei; Fang, Hui; Zhou, Wei-Jun; Huang, Min; He, Yong

    2006-09-01

    Site-specific variable nitrogen application is one of the major precision crop production management operations. Obtaining sufficient crop nitrogen stress information is essential for achieving effective site-specific nitrogen applications. The present paper describes the development of a multi-spectral nitrogen deficiency sensor, which uses three channels (green, red, near-infrared) of crop images to determine the nitrogen level of canola. This sensor assesses nitrogen stress by means of the estimated SPAD value of the canola, based on canopy reflectance sensed using the three channels (green, red, near-infrared) of the multi-spectral camera. The core of this investigation is the calibration method relating the multi-spectral references to the crop nitrogen levels measured using a SPAD-502 chlorophyll meter. Based on the results obtained from this study, it can be concluded that a multi-spectral CCD camera can provide sufficient information to perform reasonable SPAD value estimation during field operations. PMID:17112062
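
    The band-reflectance-to-SPAD calibration can be sketched as a simple least-squares regression, as below; the vegetation index, the synthetic data, and the fitted coefficients are assumptions for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(10)

# Placeholder plot-level data: NIR and red canopy reflectance plus SPAD readings
nir = rng.uniform(0.3, 0.6, 60)
red = rng.uniform(0.03, 0.15, 60)
ndvi = (nir - red) / (nir + red)
spad = 25 + 35 * ndvi + rng.normal(0, 1.5, 60)    # synthetic SPAD-502 readings

# Least-squares calibration: SPAD ~ a * NDVI + b
a, b = np.polyfit(ndvi, spad, deg=1)
spad_pred = a * ndvi + b
r2 = 1 - np.sum((spad - spad_pred) ** 2) / np.sum((spad - spad.mean()) ** 2)
print(f"SPAD ~ {a:.1f} * NDVI + {b:.1f}   (R^2 = {r2:.2f})")
```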

  14. Developing unmanned airship onboard multispectral imagery system for quick-response to drinking water pollution

    NASA Astrophysics Data System (ADS)

    Liu, Zhigang; Wu, Jun; Yang, Haisheng; Li, Bo; Zhang, Yun; Yang, Shengtian

    2009-10-01

    Satellite multispectral imagery is usually limited by low spatial resolution, long revisit cycles, or high cost. This paper presents our ongoing research on developing a cost-effective unmanned-airship-borne multispectral imagery system to acquire high-resolution multispectral imagery for quick response to drinking water pollution issues. First, the overall architecture of the developed system is described. After that, system integration, including CCD camera coupling, GPS/INS synchronization, stabilized platform control, and wireless communication, is discussed in detail. Next, system calibration is carried out radiometrically and geometrically. An adaptive calibration method is developed to obtain absolute radiance, and the classic homography principle is employed to relate the CCD cameras to each other geometrically. Finally, flight experiments are conducted to acquire high-resolution multispectral imagery along a river, and the imagery is carefully calibrated for the estimation of water quality. Conclusions are also presented.
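
    The geometric step (relating the CCD cameras through a homography) can be sketched with OpenCV as follows; in practice the point correspondences would come from feature matching or calibration targets, and the values below are placeholders.

```python
import numpy as np
import cv2

# Placeholder correspondences between a reference band and another band's image
# (in practice these come from feature matching or calibration targets).
pts_ref = np.array([[10, 10], [620, 15], [615, 470], [12, 465], [320, 240]],
                   dtype=np.float32)
pts_band = pts_ref + np.array([[3, -2]], dtype=np.float32)   # assumed small offset

# Estimate the homography mapping the band image into the reference frame
H, _ = cv2.findHomography(pts_band, pts_ref, cv2.RANSAC)

band_img = np.random.default_rng(5).integers(0, 255, (480, 640), dtype=np.uint8)
registered = cv2.warpPerspective(band_img, H, (640, 480))    # band-to-band alignment
print(H.round(3))
```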

  15. Multispectral metamaterial absorber.

    PubMed

    Grant, J; McCrindle, I J H; Li, C; Cumming, D R S

    2014-03-01

    We present the simulation, implementation, and measurement of a multispectral metamaterial absorber (MSMMA) and show that we can realize a simple absorber structure that operates in the mid-IR and terahertz (THz) bands. By embedding an IR metamaterial absorber layer into a standard THz metamaterial absorber stack, a narrowband resonance is induced at a wavelength of 4.3 μm. This resonance is in addition to the THz metamaterial absorption resonance at 109 μm (2.75 THz). We demonstrate the inherent scalability and versatility of our MSMMA by describing a second device whereby the MM-induced IR absorption peak frequency is tuned by varying the IR absorber geometry. Such a MSMMA could be coupled with a suitable sensor and formed into a focal plane array, enabling multispectral imaging. PMID:24690713

  16. Multispectral Image Feature Points

    PubMed Central

    Aguilera, Cristhian; Barrera, Fernando; Lumbreras, Felipe; Sappa, Angel D.; Toledo, Ricardo

    2012-01-01

    This paper presents a novel feature point descriptor for the multispectral image case: far-infrared and visible spectrum images. It allows matching interest points on images of the same scene but acquired in different spectral bands. Initially, points of interest are detected on both images through a SIFT-like scale space representation. Then, these points are characterized using an Edge Oriented Histogram (EOH) descriptor. Finally, points of interest from multispectral images are matched by finding nearest couples using the information from the descriptor. The provided experimental results and comparisons with similar methods show both the validity of the proposed approach and the improvements it offers with respect to the current state of the art.

  17. Color enhancement in multispectral image of human skin

    NASA Astrophysics Data System (ADS)

    Mitsui, Masanori; Murakami, Yuri; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2003-07-01

    Multispectral imaging is receiving attention in medical color imaging, as high-fidelity color information can be acquired by multispectral image capture. On the other hand, as color enhancement in medical color images is effective for distinguishing lesions from normal tissue, we apply a new technique for color enhancement using multispectral images to enhance the features contained in a certain spectral band without changing the average color distribution of the original image. In this method, to keep the average color distribution, the KL transform is applied to the spectral data, and only high-order KL coefficients are amplified in the enhancement. Multispectral images of the human skin of a bruised arm are captured by a 16-band multispectral camera, and the proposed color enhancement is applied. The resultant images are compared with the color images reproduced assuming the CIE D65 illuminant (obtained by a natural color reproduction technique). As a result, the proposed technique successfully visualizes unclear bruised lesions, which are almost invisible in the natural color images. The proposed technique will provide a support tool for diagnosis in dermatology, visual examination in internal medicine, nursing care for preventing bedsores, and so on.
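
    The enhancement described above can be sketched as a KL (principal component) transform of the per-pixel spectra, amplification of only the higher-order coefficients, and an inverse transform, which leaves the dominant color structure largely unchanged. The gain, the number of preserved components, and the band count below are assumptions.

```python
import numpy as np

def kl_enhance(cube, gain=3.0, keep=3):
    """Amplify high-order KL (PCA) coefficients of a multispectral cube
    of shape (rows, cols, bands); the first `keep` components are untouched."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean

    # KL transform: eigenvectors of the band covariance matrix
    cov = np.cov(Xc, rowvar=False)
    _, vecs = np.linalg.eigh(cov)
    vecs = vecs[:, ::-1]                      # sort components by decreasing variance

    coeffs = Xc @ vecs
    coeffs[:, keep:] *= gain                  # boost only the high-order components
    X_enh = coeffs @ vecs.T + mean
    return X_enh.reshape(rows, cols, bands)

cube = np.random.default_rng(6).random((64, 64, 16))   # stand-in for a 16-band skin image
enhanced = kl_enhance(cube)
print(enhanced.shape)
```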

  18. Polarimetric Multispectral Imaging Technology

    NASA Technical Reports Server (NTRS)

    Cheng, L.-J.; Chao, T.-H.; Dowdy, M.; Mahoney, C.; Reyes, G.

    1993-01-01

    The Jet Propulsion Laboratory is developing a remote sensing technology on which a new generation of compact, lightweight, high-resolution, low-power, reliable, versatile, programmable scientific polarimetric multispectral imaging instruments can be built to meet the challenge of future planetary exploration missions. The instrument is based on the fast programmable acousto-optic tunable filter (AOTF) of tellurium dioxide (TeO2) that operates in the wavelength range of 0.4-5 microns. Basically, the AOTF multispectral imaging instrument measures incoming light intensity as a function of spatial coordinates, wavelength, and polarization. Its operation can be in either sequential, random access, or multiwavelength mode as required. This provides observation flexibility, allowing real-time alternation among desired observations, collecting needed data only, minimizing data transmission, and permitting implementation of new experiments. These will result in optimization of the mission performance with minimal resources. Recently we completed a polarimetric multispectral imaging prototype instrument and performed outdoor field experiments for evaluating application potentials of the technology. We also investigated potential improvements on AOTF performance to strengthen technology readiness for applications. This paper will give a status report on the technology and a prospect toward future planetary exploration.

  19. Wetlands mapping with spot multispectral scanner data

    SciTech Connect

    Mackey, H.E. Jr. ); Jensen, J.R. . Dept. of Geography)

    1989-01-01

    Government facilities such as the US Department of Energy's Savannah River Plant (SRP) near Aiken, South Carolina, often use remote sensing data to assist in environmental management. Airborne multispectral scanner (MSS) data have been acquired at SRP since 1981. Various types of remote sensing data have been used to map and characterize wetlands. Regional Landsat MSS and TM satellite data have been used for wetlands mapping by various government agencies and private organizations. Furthermore, SPOT MSS data are becoming available and provide opportunities for increased spatial resolution and temporal coverage for wetlands mapping. This paper summarizes the initial results from using five dates of SPOT MSS data from April through October 1987 as a means to monitor seasonal changes in freshwater wetlands of the SRP. 11 refs., 4 figs.

  20. A preliminary report of multispectral scanner data from the Cleveland harbor study

    NASA Technical Reports Server (NTRS)

    Shook, D.; Raquet, C.; Svehla, R.; Wachter, D.; Salzman, J.; Coney, T.; Gedney, D.

    1975-01-01

    Imagery obtained from an airborne multispectral scanner is presented. A synoptic view of the entire study area is shown for a number of time periods and for a number of spectral bands. Using several bands, sediment distributions, thermal plumes, and Rhodamine B dye distributions are shown.

  1. Aerial multispectral imaging for cotton yield estimation under different irrigation and nitrogen treatments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton yield varied spatially within a field. The variability can be caused by various production inputs such as soil properties, water management, and fertilizer application. Airborne multispectral imaging is capable of providing data and information to study effects of the inputs on the yield qualit...

  2. Estimation of cotton yield with varied irrigation and nitrogen treatments using aerial multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton yield varies spatially within a field. The variability can be caused by various production inputs such as soil properties, water management, and fertilizer application. Airborne multispectral imaging is capable of providing data and information to study effects of the inputs on yield qualitat...

  3. Use of multispectral scanner images for assessment of hydrothermal alteration in the Marysvale, Utah, mining area.

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Abrams, M.J.

    1983-01-01

    Airborne multispectral scanner data were used to assess hydrothermal alteration in the Marysvale, Utah, mining area. A color composite image was constructed using the following spectral band ratios: 1.6/2.2 μm, 1.6/0.48 μm, and 0.67/1.0 μm. The color ratio composite successfully distinguished most types of altered rocks from unaltered rocks and permitted further division of the altered rocks into ferric oxide-rich and ferric oxide-poor types.
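
    Building the color-ratio composite described above can be sketched as follows, with the three ratios mapped to the red, green, and blue display channels; the input band arrays and the percentile stretch are placeholders.

```python
import numpy as np

def stretch(img, lo=2, hi=98):
    """Percentile stretch to the 0-1 range for display (an assumed, typical choice)."""
    a, b = np.percentile(img, (lo, hi))
    return np.clip((img - a) / (b - a + 1e-9), 0, 1)

rng = np.random.default_rng(7)
shape = (256, 256)
# Stand-ins for the scanner bands used in the ratios (reflectance-like values)
b0_48, b0_67, b1_0, b1_6, b2_2 = (rng.random(shape) + 0.05 for _ in range(5))

composite = np.dstack([
    stretch(b1_6 / b2_2),    # red:   1.6 / 2.2 um ratio (clay/alteration minerals)
    stretch(b1_6 / b0_48),   # green: 1.6 / 0.48 um ratio
    stretch(b0_67 / b1_0),   # blue:  0.67 / 1.0 um ratio (ferric oxide sensitivity)
])
print(composite.shape)       # (256, 256, 3) color-ratio composite
```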

  4. TACMSI: a novel multi-look multispectral imager for maritime mine detection

    NASA Astrophysics Data System (ADS)

    Leonard, Carrie L.; Chan, Chong Wai; Cottis, Tamara; DeWeert, Michael; Dichner, Michael; Farm, Brian; Kokubun, Dan; Louchard, Eric; Noguchi, Reid; Topping, Miles; Wong, Timothy; Yoon, Dugan

    2008-04-01

    Airborne EO imagery, including wideband, hyperspectral, and multispectral modalities, has greatly enhanced the ability of the ISR community to detect and classify various targets of interest from long standoff distances and with large area coverage rates. The surf zone is a dynamic environment that presents physical and operational challenges to effective remote sensing with optical systems. In response to these challenges, BAE Systems has developed the Tactical Multi-spectral Imager (TACMSI) system. The system includes a VNIR six-band multispectral sensor and all other hardware used to acquire, store and process imagery, navigation, and supporting metadata on the airborne platform. In conjunction with the hardware, BAE Systems has developed innovative data processing methods that exploit the inherent capabilities of multi-look framing imagery to essentially remove the overlying clutter or obscuration to enable EO visualization of the objects of interest.

  5. Multispectral observations of the surf zone

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon S.; Dirbas, Joseph; Gilbert, Gary

    2003-09-01

    Airborne multispectral imagery was collected over various targets on the beach and in the water in an attempt to characterize the surf zone environment with respect to electro-optical system capabilities and to assess the utility of very low cost, small multispectral systems in mine countermeasures (MCM) and intelligence, surveillance and reconnaissance applications. The data were collected by PAR Government Systems Corporation (PGSC) at the Army Corps of Engineers Field Research Facility at Duck, North Carolina and on the beaches of Camp Pendleton Marine Corps Base in Southern California. PGSC flew the first two of its MANTIS (Mission Adaptable Narrowband Tunable Imaging Sensor) systems. Both MANTIS systems were flown in an IR - red - green - blue (700, 600, 550, 480 nm) configuration from altitudes ranging from 200 to 700 meters. The data collected have received a preliminary analysis, and a surf zone index (SZI) has been defined and calculated. This index allows mine hunting system performance measurements in the surf zone to be normalized by environmental conditions. The SZI takes into account water clarity, wave energy, and foam persistence.

  6. Multispectral imaging with type II superlattice detectors

    NASA Astrophysics Data System (ADS)

    Ariyawansa, Gamini; Duran, Joshua M.; Grupen, Matt; Scheihing, John E.; Nelson, Thomas R.; Eismann, Michael T.

    2012-06-01

    Infrared (IR) focal plane arrays (FPAs) with multispectral detector elements promise significant advantages for airborne threat warning, surveillance, and targeting applications. At present, the use of type II superlattice (T2SL) structures based on the 6.1Å-family materials (InAs, GaSb, and AlSb) has become an area of interest for developing IR detectors and their FPAs. The ability to vary the bandgap in the IR range, suppression of Auger processes, prospective reduction of Shockley-Read-Hall centers by improved material growth capabilities, and the material stability are a few reasons for the predicted dominance of the T2SL technology over presently leading HgCdTe and quantum well technologies. The focus of the work reported here is on the development of T2SL-based dual-band IR detectors and their applicability for multispectral imaging. A new NpBPN detector designed for the detection of IR in the 3-5 and 8-12 μm atmospheric windows is presented, and its advantages over other T2SL-based approaches are compared. One of the key challenges of the T2SL dual-band detectors is the spectral crosstalk associated with the LWIR band. The properties of the state-of-the-art T2SLs (i.e., absorption coefficient, minority carrier lifetime and mobility, etc.) and the present growth limitations that impact spectral crosstalk are discussed.

  7. Automated oil spill detection with multispectral imagery

    NASA Astrophysics Data System (ADS)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
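
    The tail end of such a pipeline can be sketched as below, assuming an oil-enhancing band combination has already been computed: threshold to a raster mask, label contiguous regions, and convert the pixel count to area using the ground sample distance. The threshold, minimum region size, and GSD are placeholders, not the authors' values.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(8)

# Stand-in for an oil-enhanced band combination (higher values = more oil-like)
oil_index = rng.random((1000, 1000))

mask = oil_index > 0.995                       # placeholder threshold
labels, n_regions = ndimage.label(mask)        # contiguous oil regions
sizes = ndimage.sum(mask, labels, index=np.arange(1, n_regions + 1))
mask_clean = np.isin(labels, 1 + np.flatnonzero(sizes >= 5))   # drop tiny specks

GSD_M = 0.5                                    # assumed ground sample distance (m/pixel)
area_m2 = mask_clean.sum() * GSD_M ** 2
print(f"{n_regions} candidate regions, estimated oil coverage: {area_m2:.1f} m^2")
```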

  8. Fusion of remotely sensed data from airborne and ground-based sensors to enhance detection of cotton plants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The study investigated the use of aerial multispectral imagery and ground-based hyperspectral data for the discrimination of different crop types and timely detection of cotton plants over large areas. Airborne multispectral imagery and ground-based spectral reflectance data were acquired at the sa...

  9. Airborne imaging sensors for environmental monitoring & surveillance in support of oil spills & recovery efforts

    NASA Astrophysics Data System (ADS)

    Bostater, Charles R.; Jones, James; Frystacky, Heather; Coppin, Gaelle; Leavaux, Florian; Neyt, Xavier

    2011-11-01

    Collection of pushbroom sensor imagery from a mobile platform requires corrections using inertial measurement units (IMUs) and DGPS in order to create usable imagery for environmental monitoring and surveillance of shorelines in freshwater systems, coastal littoral zones, and harbor areas. This paper describes a suite of imaging systems used during collection of hyperspectral imagery in airborne missions over the northern Florida panhandle and the Gulf of Mexico to detect weathered oil in coastal littoral zones. The underlying concepts of pushbroom imagery, the corrections needed for directional changes using DGPS and for platform yaw, pitch, and roll using IMU data, and the development and application of optimal bands and spectral regions associated with weathered oil are described. Pushbroom sensor and frame camera data collected in response to the recent Gulf of Mexico oil spill disaster are presented as the scenario documenting environmental monitoring and surveillance techniques using mobile sensing platforms. Data were acquired during the months of February, March, April, and May of 2011. The low-altitude airborne systems include a temperature-stabilized hyperspectral imaging system capable of up to 1024 spectral channels and 1376 across-track spatial pixels, flown from altitudes of 3,000 to 4,500 feet. The hyperspectral imaging system is collocated with a full-resolution high-definition video recorder for simultaneous HD video imagery, a 12.3 megapixel digital camera, and a mapping camera using 9 inch film that yields scanned aerial imagery of approximately 22,200 by 22,200 pixels (~255 megapixel RGB multispectral images) used for spectral-spatial sharpening of fused multispectral and hyperspectral imagery. Two solid-state spectrographs with high spectral (252 channels) and radiometric sensitivity are used for collecting upwelling radiance (sub-meter pixels), with a fiber optic attachment for downwelling irradiance. These sensors are utilized for

  10. Dual multispectral and 3D structured light laparoscope

    NASA Astrophysics Data System (ADS)

    Clancy, Neil T.; Lin, Jianyu; Arya, Shobhit; Hanna, George B.; Elson, Daniel S.

    2015-03-01

    Intraoperative feedback on tissue function, such as blood volume and oxygenation, would be useful to the surgeon in cases where current clinical practice relies on subjective measures, such as identification of ischaemic bowel or tissue viability during anastomosis formation. Also, tissue surface profiling may be used to detect and identify certain pathologies, as well as diagnosing aspects of tissue health such as gut motility. In this paper a dual-modality laparoscopic system is presented that combines multispectral reflectance and 3D surface imaging. White light illumination from a xenon source is detected by a laparoscope-mounted fast filter wheel camera to assemble a multispectral image (MSI) cube. Surface shape is then calculated using a spectrally-encoded structured light (SL) pattern detected by the same camera and triangulated using an active stereo technique. Images of porcine small bowel were acquired during open surgery. Tissue reflectance spectra were acquired and blood volume was calculated at each spatial pixel across the bowel wall and mesentery. SL features were segmented and identified using a 'normalised cut' algorithm and the colour vector of each spot. Using the 3D geometry defined by the camera coordinate system, the multispectral data could be overlaid onto the surface mesh. Dual MSI and SL imaging has the potential to provide augmented views to the surgeon, supplying diagnostic information related to blood supply health and organ function. Future work on this system will include filter optimisation to reduce noise in tissue optical property measurement, and minimising spot identification errors in the SL pattern.

  11. Video rate multispectral imaging for camouflaged target detection

    NASA Astrophysics Data System (ADS)

    Henry, Sam

    2015-05-01

    The ability to detect and identify camouflaged targets is critical in combat environments. Hyperspectral and multispectral cameras allow a soldier to identify threats more effectively than traditional RGB cameras due to both increased color resolution and the ability to see beyond visible light. Static imagers have proven successful; however, the development of video-rate imagers allows for continuous real-time target identification and tracking. This paper presents an analysis of existing anomaly detection algorithms and how they can be adapted to video rates, and presents a general-purpose semi-supervised real-time anomaly detection algorithm using multiple frame sampling.
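
    The paper's specific algorithm is not reproduced here, but one widely used detector of this class is the Reed-Xiaoli (RX) detector, sketched below on a single multispectral frame; running it frame by frame (with incrementally updated background statistics) is one plausible route toward video rate.

```python
import numpy as np

def rx_scores(frame):
    """Global RX anomaly detector: Mahalanobis distance of every pixel
    spectrum from the frame's background mean/covariance.
    frame: (rows, cols, bands) -> (rows, cols) anomaly scores."""
    rows, cols, bands = frame.shape
    X = frame.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(bands)   # regularized covariance
    inv_cov = np.linalg.inv(cov)
    d = X - mu
    scores = np.einsum("ij,jk,ik->i", d, inv_cov, d)
    return scores.reshape(rows, cols)

frame = np.random.default_rng(9).random((120, 160, 6))     # stand-in 6-band frame
scores = rx_scores(frame)
anomalies = scores > np.percentile(scores, 99.5)            # flag the top 0.5%
print(anomalies.sum(), "pixels flagged")
```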

  12. Remote sensing operations (multispectral scanner and photographic) in the New York Bight, 22 September 1975

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Hall, J. B., Jr.

    1977-01-01

    Ocean dumping of waste materials is a significant environmental concern in the New York Bight. One of these waste materials, sewage sludge, was monitored in an experiment conducted in the New York Bight on September 22, 1975. Remote sensing over controlled sewage sludge dumping included an 11-band multispectral scanner, five multispectral cameras and one mapping camera. Concurrent in situ water samples were taken and acoustical measurements were made of the sewage sludge plumes. Data were obtained for sewage sludge plumes resulting from line (moving barge) and spot (stationary barge) dumps. Multiple aircraft overpasses were made to evaluate temporal effects on the plume signature.

  13. MULTISPECTRAL THERMAL IMAGER - OVERVIEW

    SciTech Connect

    P. WEBER

    2001-03-01

    The Multispectral Thermal Imager satellite fills a new and important role in advancing the state of the art in remote sensing sciences. Initial results with the full calibration system operating indicate that the system was already close to achieving the very ambitious goals which we laid out in 1993, and we are confident of reaching all of these goals as we continue our research and improve our analyses. In addition to the DOE interests, the satellite is tasked about one-third of the time with requests from other users supporting research ranging from volcanology to atmospheric sciences.

  14. Multispectral thermal imaging

    SciTech Connect

    Weber, P.G.; Bender, S.C.; Borel, C.C.; Clodius, W.B.; Smith, B.W.; Garrett, A.; Pendergast, M.M.; Kay, R.R.

    1998-12-01

    Many remote sensing applications rely on imaging spectrometry. Here the authors use imaging spectrometry for thermal and multispectral signatures measured from a satellite platform enhanced with a combination of accurate calibrations and on-board data for correcting atmospheric distortions. The approach is supported by physics-based end-to-end modeling and analysis, which permits a cost-effective balance between various hardware and software aspects. The goal is to develop and demonstrate advanced technologies and analysis tools toward meeting the needs of the customer; at the same time, the attributes of this system can address other applications in such areas as environmental change, agriculture, and volcanology.

  15. LISS-4 camera for Resourcesat

    NASA Astrophysics Data System (ADS)

    Paul, Sandip; Dave, Himanshu; Dewan, Chirag; Kumar, Pradeep; Sansowa, Satwinder Singh; Dave, Amit; Sharma, B. N.; Verma, Anurag

    2006-12-01

The Indian Remote Sensing Satellites use indigenously developed high-resolution cameras for generating data related to vegetation, landform/geomorphic and geological boundaries. Data from this camera are used for producing maps at 1:12500 scale for national-level policy development in town planning, vegetation monitoring, etc. The LISS-4 camera was launched onboard the Resourcesat-1 satellite by ISRO in 2003. LISS-4 is a high-resolution multi-spectral camera with three spectral bands, a resolution of 5.8 m, and a swath of 23 km from an altitude of 817 km. The panchromatic mode provides a swath of 70 km and a 5-day revisit. This paper briefly discusses the configuration of the LISS-4 camera of Resourcesat-1, its onboard performance, and the changes in the camera being developed for Resourcesat-2. The LISS-4 camera images the Earth in push-broom mode. It is designed around a three-mirror unobscured telescope, three linear 12-K CCDs, and associated electronics for each band. The three spectral bands are realized by splitting the focal plane in the along-track direction using an isosceles prism. High-speed camera electronics is designed for each detector with 12-bit digitization and digital double sampling of the video. Seven bits of data, selected from the 10 MSBs by telecommand, are transmitted. The total dynamic range of the sensor covers up to 100% albedo. The camera structure has heritage from IRS-1C/D. The optical elements are precisely glued to specially designed flexure mounts. The camera is assembled onto a rotating deck on the spacecraft to facilitate +/- 26° steering in the pitch-yaw plane. The camera is held on the spacecraft in a stowed condition before deployment. The excellent imagery from the LISS-4 camera onboard Resourcesat-1 is routinely used worldwide. A second such camera, with similar performance, is being developed for the Resourcesat-2 launch in 2007. Its camera electronics are optimized and miniaturized: size and weight are reduced to one third, and power to half, of the Resourcesat-1 values.

  16. Remote detection of past habitability at Mars-analogue hydrothermal alteration terrains using an ExoMars Panoramic Camera emulator

    NASA Astrophysics Data System (ADS)

    Harris, J. K.; Cousins, C. R.; Gunn, M.; Grindrod, P. M.; Barnes, D.; Crawford, I. A.; Cross, R. E.; Coates, A. J.

    2015-05-01

    A major scientific goal of the European Space Agency's ExoMars 2018 rover is to identify evidence of life within the martian rock record. Key to this objective is the remote detection of geological substrates that are indicative of past habitable environments, which will rely on visual (stereo wide-angle, and high resolution images) and multispectral (440-1000 nm) data produced by the Panoramic Camera (PanCam) instrument. We deployed a PanCam emulator at four hydrothermal sites in the Námafjall volcanic region of Iceland, a Mars-analogue hydrothermal alteration terrain. At these sites, sustained acidic-neutral aqueous interaction with basaltic substrates (crystalline and sedimentary) has produced phyllosilicate, ferric oxide, and sulfate-rich alteration soils, and secondary mineral deposits including gypsum veins and zeolite amygdales. PanCam emulator datasets from these sites were complemented with (i) NERC Airborne Research and Survey Facility aerial hyperspectral images of the study area; (ii) in situ reflectance spectroscopy (400-1000 nm) of PanCam spectral targets; (iii) laboratory X-ray Diffraction, and (iv) laboratory VNIR (350-2500 nm) spectroscopy of target samples to identify their bulk mineralogy and spectral properties. The mineral assemblages and palaeoenvironments characterised here are analogous to neutral-acidic alteration terrains on Mars, such as at Mawrth Vallis and Gusev Crater. Combined multispectral and High Resolution Camera datasets were found to be effective at capturing features of astrobiological importance, such as secondary gypsum and zeolite mineral veins, and phyllosilicate-rich substrates. Our field observations with the PanCam emulator also uncovered stray light problems which are most significant in the NIR wavelengths and investigations are being undertaken to ensure that the flight model PanCam cameras are not similarly affected.

  17. Retinal oxygen saturation evaluation by multi-spectral fundus imaging

    NASA Astrophysics Data System (ADS)

    Khoobehi, Bahram; Ning, Jinfeng; Puissegur, Elise; Bordeaux, Kimberly; Balasubramanian, Madhusudhanan; Beach, James

    2007-03-01

    Purpose: To develop a multi-spectral method to measure oxygen saturation of the retina in the human eye. Methods: Five Cynomolgus monkeys with normal eyes were anesthetized with intramuscular ketamine/xylazine and intravenous pentobarbital. Multi-spectral fundus imaging was performed in five monkeys with a commercial fundus camera equipped with a liquid crystal tuned filter in the illumination light path and a 16-bit digital camera. Recording parameters were controlled with software written specifically for the application. Seven images at successively longer oxygen-sensing wavelengths were recorded within 4 seconds. Individual images for each wavelength were captured in less than 100 msec of flash illumination. Slightly misaligned images of separate wavelengths due to slight eye motion were registered and corrected by translational and rotational image registration prior to analysis. Numerical values of relative oxygen saturation of retinal arteries and veins and the underlying tissue in between the artery/vein pairs were evaluated by an algorithm previously described, but which is now corrected for blood volume from averaged pixels (n > 1000). Color saturation maps were constructed by applying the algorithm at each image pixel using a Matlab script. Results: Both the numerical values of relative oxygen saturation and the saturation maps correspond to the physiological condition, that is, in a normal retina, the artery is more saturated than the tissue and the tissue is more saturated than the vein. With the multi-spectral fundus camera and proper registration of the multi-wavelength images, we were able to determine oxygen saturation in the primate retinal structures on a tolerable time scale which is applicable to human subjects. Conclusions: Seven wavelength multi-spectral imagery can be used to measure oxygen saturation in retinal artery, vein, and tissue (microcirculation). This technique is safe and can be used to monitor oxygen uptake in humans. This work
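
    The per-pixel algorithm is only referenced above, not reproduced. As a generic illustration, two-wavelength retinal oximetry is often expressed as a linear function of the optical density ratio between an oxygen-sensitive and an isosbestic wavelength; the sketch below assumes that formulation, and its calibration constants are purely hypothetical placeholders.

      import numpy as np

      def optical_density(vessel_intensity, background_intensity):
          # OD = -log10(I_vessel / I_background); epsilon guards against log(0).
          return -np.log10((vessel_intensity + 1e-9) / (background_intensity + 1e-9))

      def relative_so2(i_sens_vessel, i_sens_bg, i_iso_vessel, i_iso_bg, a=1.3, b=-1.2):
          """Two-wavelength optical-density-ratio oximetry. `a` and `b` are
          hypothetical calibration constants; real values must be determined
          empirically for a given camera, filter set, and vessel size."""
          odr = optical_density(i_sens_vessel, i_sens_bg) / \
                optical_density(i_iso_vessel, i_iso_bg)
          return a + b * odr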

  18. Using Google Earth for Rapid Dissemination of Airborne Remote Sensing Lidar and Photography

    NASA Astrophysics Data System (ADS)

    Wright, C. W.; Nayegandhi, A.; Brock, J. C.

    2006-12-01

In order to visualize and disseminate vast amounts of lidar and digital photography data, we present a unique method that makes these data layers available via the Google Earth interface. The NASA Experimental Advanced Airborne Research Lidar (EAARL) provides unprecedented capabilities to survey coral reefs, nearshore benthic habitats, coastal vegetation, and sandy beaches. The EAARL sensor suite includes a water-penetrating lidar that provides high-resolution topographic information, a down-looking color digital camera, a down-looking high-resolution color-infrared (CIR) digital camera, and precision kinematic GPS receivers which provide for sub-meter geo-referencing of each laser and multispectral sample. Google Earth "kml" files are created for each EAARL multispectral and processed lidar image. A hierarchical structure of network links allows the user to download high-resolution images within the region of interest. The first network link (kmz file) downloaded by the user contains a color-coded flight path and "minute marker" icons along the flight path. Each "minute" icon provides access to the image overlays, and additional network links for each second along the flight path as well as flight navigation information. Layers of false-color-coded lidar Digital Elevation Model (DEM) data are made available in 2 km by 2 km tiles. These layers include canopy-top, bare-Earth, submerged topography, and links to any other lidar products. The user has the option to download the x,y,z ASCII point data or a DEM in the GeoTIFF file format for each tile. The NASA EAARL project captured roughly 250,000 digital photographs in five flights conducted a few days after Hurricane Katrina made landfall along the Gulf Coast in 2005. All of the photos and DEM layers are georeferenced and viewable online using Google Earth.
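
    The dissemination scheme rests on small KML files that Google Earth loads through network links. Below is a minimal sketch of one georeferenced image tile; the element names follow the public KML schema, the file and tile names are hypothetical, and the hierarchical network links described above would simply point at files like this one.

      def write_tile_kml(path, name, image_href, north, south, east, west):
          """Write a minimal KML GroundOverlay for one georeferenced image tile.
          Bounding-box values are decimal degrees (WGS84)."""
          kml = f"""<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <GroundOverlay>
          <name>{name}</name>
          <Icon><href>{image_href}</href></Icon>
          <LatLonBox>
            <north>{north}</north><south>{south}</south>
            <east>{east}</east><west>{west}</west>
          </LatLonBox>
        </GroundOverlay>
      </kml>
      """
          with open(path, "w", encoding="utf-8") as f:
              f.write(kml)

      # Hypothetical usage for one 2 km DEM tile rendered to PNG:
      # write_tile_kml("tile_001.kml", "bare_earth_001", "tile_001.png",
      #                30.31, 30.29, -89.59, -89.61)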

  19. Multispectral image alignment using a three channel endoscope in vivo during minimally invasive surgery

    PubMed Central

    Clancy, Neil T.; Stoyanov, Danail; James, David R. C.; Di Marco, Aimee; Sauvage, Vincent; Clark, James; Yang, Guang-Zhong; Elson, Daniel S.

    2012-01-01

Sequential multispectral imaging is an acquisition technique that involves collecting images of a target at different wavelengths, to compile a spectrum for each pixel. In surgical applications it suffers from low illumination levels and motion artefacts. A three-channel rigid endoscope system has been developed that allows simultaneous recording of stereoscopic and multispectral images. Salient features on the tissue surface may be tracked during the acquisition in the stereo cameras and, using multiple camera triangulation techniques, this information is used to align the multispectral images automatically even though the tissue or camera is moving. This paper describes a detailed validation of the set-up in a controlled experiment before presenting the first in vivo use of the device in a porcine minimally invasive surgical procedure. Multispectral images of the large bowel were acquired and used to extract the relative concentration of haemoglobin in the tissue despite motion due to breathing during the acquisition. Using the stereoscopic information it was also possible to overlay the multispectral information on the reconstructed 3D surface. This experiment demonstrates the ability of this system to measure blood perfusion changes in the tissue during surgery and its potential use as a platform for other sequential imaging modalities. PMID:23082296

  20. Multispectral image alignment using a three channel endoscope in vivo during minimally invasive surgery.

    PubMed

    Clancy, Neil T; Stoyanov, Danail; James, David R C; Di Marco, Aimee; Sauvage, Vincent; Clark, James; Yang, Guang-Zhong; Elson, Daniel S

    2012-10-01

Sequential multispectral imaging is an acquisition technique that involves collecting images of a target at different wavelengths, to compile a spectrum for each pixel. In surgical applications it suffers from low illumination levels and motion artefacts. A three-channel rigid endoscope system has been developed that allows simultaneous recording of stereoscopic and multispectral images. Salient features on the tissue surface may be tracked during the acquisition in the stereo cameras and, using multiple camera triangulation techniques, this information is used to align the multispectral images automatically even though the tissue or camera is moving. This paper describes a detailed validation of the set-up in a controlled experiment before presenting the first in vivo use of the device in a porcine minimally invasive surgical procedure. Multispectral images of the large bowel were acquired and used to extract the relative concentration of haemoglobin in the tissue despite motion due to breathing during the acquisition. Using the stereoscopic information it was also possible to overlay the multispectral information on the reconstructed 3D surface. This experiment demonstrates the ability of this system to measure blood perfusion changes in the tissue during surgery and its potential use as a platform for other sequential imaging modalities. PMID:23082296
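
    The core geometric step, triangulating tracked surface features from the calibrated stereo pair, can be sketched with OpenCV as below. Feature detection, tracking, calibration, and the mapping from the recovered 3D motion back to per-band image alignment are not shown; the projection matrices and point arrays are assumed inputs rather than the authors' code.

      import numpy as np
      import cv2

      def triangulate_tracked_features(P1, P2, pts1, pts2):
          """Triangulate matched feature tracks from a calibrated stereo pair.
          P1, P2     : 3x4 projection matrices of the two cameras.
          pts1, pts2 : (N, 2) arrays of corresponding image points.
          Returns an (N, 3) array of 3D points in the reference camera frame."""
          X_h = cv2.triangulatePoints(P1, P2,
                                      pts1.T.astype(np.float64),
                                      pts2.T.astype(np.float64))
          return (X_h[:3] / X_h[3]).T   # from homogeneous to Euclidean coordinates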

  1. Multispectral remote sensing as stratigraphic and structural tool, Wind River Basin and Big Horn Basin areas, Wyoming

    SciTech Connect

    Lang, H.R.; Adams, S.L.; Conel, J.E.; Mcguffie, B.A.; Paylor, E.D.; Walker, R.E.

    1987-04-01

    The use of Landsat TM, Airborne Imaging Spectrometer, and airborne Thermal IR Multispectral Scanner data in the geological evaluation of two sites in central Wyoming is described and illustrated with diagrams, maps, photographs, sample images, and tables of numerical data. The value of the remotely sensed information on the areal variation of attitude, sequence, thickness, and lithology of exposed strata is demonstrated; details of the data analysis are given; and the specialized software packages employed are briefly characterized. 46 references.

  2. Multispectral Resource Sampler Workshop

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The utility of the multispectral resource sampler (MRS) was examined by users in the following disciplines: agriculture, atmospheric studies, engineering, forestry, geology, hydrology/oceanography, land use, and rangelands/soils. Modifications to the sensor design were recommended and the desired types of products and number of scenes required per month were indicated. The history, design, capabilities, and limitations of the MRS are discussed as well as the multilinear spectral array technology which it uses. Designed for small area inventory, the MRS can provide increased temporal, spectral, and spatial resolution, facilitate polarization measurement and atmospheric correction, and test onboard data compression techniques. The advantages of using it along with the thematic mapper are considered.

  3. Multispectral imaging axicons.

    PubMed

    Bialic, Emilie; de la Tocnaye, Jean-Louis de Bougrenet

    2011-07-10

Large-aperture linear diffractive axicons are optical devices providing achromatic nondiffracting beams with an extended depth of focus when illuminated by white light sources. Annular apertures introduce chromatic foci separation, making chromatic imaging possible despite significant radiometric losses. Recently, a new type of diffractive axicon has been introduced, by multiplexing concentric annular axicons with appropriate sizes and periods, called a multiple annular linear diffractive axicon (MALDA). This new family of conical optics combines multiple annular axicons in different ways to optimize color foci recombination, separation, or interleaving. We present different types of MALDA, give an experimental illustration of the use of these devices, and describe the manufacturing issues related to their fabrication to provide color imaging systems with long focal depths and good diffraction efficiency. Application to multispectral image analysis is discussed. PMID:21743576

  4. Multispectral scanner optical system

    NASA Technical Reports Server (NTRS)

    Stokes, R. C.; Koch, N. G. (Inventor)

    1980-01-01

    An optical system for use in a multispectral scanner of the type used in video imaging devices is disclosed. Electromagnetic radiation reflected by a rotating scan mirror is focused by a concave primary telescope mirror and collimated by a second concave mirror. The collimated beam is split by a dichroic filter which transmits radiant energy in the infrared spectrum and reflects visible and near infrared energy. The long wavelength beam is filtered and focused on an infrared detector positioned in a cryogenic environment. The short wavelength beam is dispersed by a pair of prisms, then projected on an array of detectors also mounted in a cryogenic environment and oriented at an angle relative to the optical path of the dispersed short wavelength beam.

  5. Scene segmentation from motion in multispectral imagery to aid automatic human gait recognition

    NASA Astrophysics Data System (ADS)

    Pearce, Daniel; Harvey, Christophe; Day, Simon; Goffredo, Michela

    2007-10-01

Primarily focused on military and security environments where there is a need to identify humans covertly and remotely, this paper outlines how recovering human gait biometrics from a multi-spectral imaging system can overcome the failings of traditional biometrics to fulfil those needs. With the intention of aiding single-camera human gait recognition, an algorithm was developed to accurately segment a walking human from multi-spectral imagery. Sixteen-band imagery from the image replicating imaging spectrometer (IRIS) camera system is used to overcome some of the common problems associated with standard change detection techniques. Fusing the concepts of scene segmentation by spectral characterisation and background subtraction by image differencing gives a uniquely robust approach. This paper presents the results of real trials with human subjects and a prototype IRIS camera system, and compares performance to typical broadband camera systems.
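
    The fusion of spectral characterisation with background subtraction is not spelled out above; one simple realisation, sketched below under that assumption, flags pixels whose spectrum deviates from a per-pixel background model by more than a Mahalanobis-distance threshold.

      import numpy as np

      def spectral_foreground_mask(frame, bg_mean, bg_cov_inv, threshold):
          """Flag pixels whose spectrum departs from the background model.
          frame, bg_mean : (H, W, B) current frame and per-pixel background mean.
          bg_cov_inv     : (B, B) inverse covariance of background spectra.
          threshold      : Mahalanobis-distance cutoff, chosen empirically."""
          d = frame - bg_mean
          dist_sq = np.einsum('hwb,bc,hwc->hw', d, bg_cov_inv, d)
          return dist_sq > threshold ** 2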

  6. Multispectral Microimager for Astrobiology

    NASA Technical Reports Server (NTRS)

    Sellar, R. Glenn; Farmer, Jack D.; Kieta, Andrew; Huang, Julie

    2006-01-01

    A primary goal of the astrobiology program is the search for fossil records. The astrobiology exploration strategy calls for the location and return of samples indicative of environments conducive to life, and that best capture and preserve biomarkers. Successfully returning samples from environments conducive to life requires two primary capabilities: (1) in situ mapping of the mineralogy in order to determine whether the desired minerals are present; and (2) nondestructive screening of samples for additional in-situ testing and/or selection for return to laboratories for more in-depth examination. Two of the most powerful identification techniques are micro-imaging and visible/infrared spectroscopy. The design and test results are presented from a compact rugged instrument that combines micro-imaging and spectroscopic capability to provide in-situ analysis, mapping, and sample screening capabilities. Accurate reflectance spectra should be a measure of reflectance as a function of wavelength only. Other compact multispectral microimagers use separate LEDs (light-emitting diodes) for each wavelength and therefore vary the angles of illumination when changing wavelengths. When observing a specularly-reflecting sample, this produces grossly inaccurate spectra due to the variation in the angle of illumination. An advanced design and test results are presented for a multispectral microimager which demonstrates two key advances relative to previous LED-based microimagers: (i) acquisition of actual reflectance spectra in which the flux is a function of wavelength only, rather than a function of both wavelength and illumination geometry; and (ii) increase in the number of spectral bands to eight bands covering a spectral range of 468 to 975 nm.

  7. Time-resolved multispectral imaging of combustion reaction

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Fréderick

    2015-05-01

Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization to be performed in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber, or even to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge of the spectral emissivity. This information is not directly available from broadband images. However, spectral information is available using spectral filters. In this work, combustion analysis was carried out using the Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of the combustion products of a candle in which black powder had been burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.

  8. Time-resolved multispectral imaging of combustion reactions

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Frédérick

    2015-10-01

Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization to be performed in both the longwave (LWIR) and midwave (MWIR) infrared spectral range. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber or even to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge of spectral emissivity. This information is not directly available from broadband images. However, spectral information is available using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of combustion products of a candle in which black powder has been burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.
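
    The temperature estimation step can be illustrated, under a grey-body assumption with wavelength-independent emissivity, by fitting Planck's law to the radiances measured through the narrow filters. The sketch below is such an illustrative fit, not the vendor's processing chain, and treats each filter as monochromatic at its centre wavelength.

      import numpy as np
      from scipy.optimize import least_squares

      H_PLANCK = 6.62607015e-34   # J s
      C_LIGHT = 2.99792458e8      # m/s
      K_BOLTZ = 1.380649e-23      # J/K

      def planck_radiance(wavelength_m, temp_k):
          """Blackbody spectral radiance (W sr^-1 m^-3)."""
          a = 2.0 * H_PLANCK * C_LIGHT**2 / wavelength_m**5
          b = H_PLANCK * C_LIGHT / (wavelength_m * K_BOLTZ * temp_k)
          return a / np.expm1(b)

      def fit_temperature(band_centres_um, band_radiances, t0=1500.0):
          """Least-squares fit of temperature and a grey-body scale factor to
          radiances measured in narrow bands (band centres in micrometres)."""
          wl = np.asarray(band_centres_um, dtype=float) * 1e-6
          L = np.asarray(band_radiances, dtype=float)

          def residuals(p):
              scale, temp = p
              return scale * planck_radiance(wl, temp) - L

          fit = least_squares(residuals, x0=[1.0, t0],
                              bounds=([0.0, 300.0], [np.inf, 5000.0]))
          return fit.x[1]   # fitted temperature in kelvin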

  9. MSS D Multispectral Scanner System

    NASA Technical Reports Server (NTRS)

    Lauletta, A. M.; Johnson, R. L.; Brinkman, K. L. (Principal Investigator)

    1982-01-01

The development and acceptance testing of the 4-band Multispectral Scanners to be flown on the LANDSAT D and LANDSAT D' Earth resources satellites are summarized. Emphasis is placed on the acceptance test phase of the program. Test history and acceptance test algorithms are discussed. Trend data of all the key performance parameters are included and discussed separately for each of the two multispectral scanner instruments. Anomalies encountered and their resolutions are included.

  10. Red to far-red multispectral fluorescence image fusion for detection of fecal contamination on apples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This research developed a multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet/blue LED excitation for detection of fecal contamination on Golden Delicious apples. Using a hyperspectral line-scan imaging system consisting of an EMCCD camera, spectrograph, an...

  11. MULTISPECTRAL IMAGING SYSTEM FOR FECAL AND INGESTA DETECTION ON POULTRY CARCASSES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A multispectral imaging system including a common aperture camera with three optical trim filters (515.4, 566.4 and 631 nm), which were selected by visible/NIR spectroscopy and validated by a hyperspectral imaging system, was developed for a real-time, on-line poultry inspection application. The al...

  12. Real-time multispectral imaging system for online poultry fecal inspection using UML

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A prototype real-time multispectral imaging system for fecal and ingesta contaminant detection on broiler carcasses was developed and tested. The prototype system includes a common aperture camera with three optical trim filters (517, 565 and 802-nm wavelength), which were selected and validated by...

  13. Real-Time Multispectral Imaging System for Online Poultry Fecal Inspection using Unified Modeling Language.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A prototype real-time multispectral imaging system for fecal detection on broiler carcasses has been developed. The prototype system included a common aperture camera with three optical trim filters (517, 565 and 802-nm wavelength), which were selected by visible/NIR spectroscopy and validated by a...

  14. Space Camera

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

  15. Fourth Airborne Geoscience Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The focus of the workshop was on how the airborne community can assist in achieving the goals of the Global Change Research Program. The many activities that employ airborne platforms and sensors were discussed: platforms and instrument development; airborne oceanography; lidar research; SAR measurements; Doppler radar; laser measurements; cloud physics; airborne experiments; airborne microwave measurements; and airborne data collection.

  16. Uniqueness in multispectral constant-wave epi-illumination imaging.

    PubMed

    Garcia-Allende, P B; Radrich, K; Symvoulidis, P; Glatz, J; Koch, M; Jentoft, K M; Ripoll, J; Ntziachristos, V

    2016-07-01

Multispectral tissue imaging based on optical cameras and continuous-wave tissue illumination is commonly used in medicine and biology. Surprisingly, there is a characteristic absence of a critical look at the quantities that can be uniquely characterized from optically diffuse matter by multispectral imaging. Here, we investigate the fundamental question of uniqueness in epi-illumination measurements from turbid media obtained at multiple wavelengths. By utilizing an analytical model, tissue-mimicking phantoms, and an in vivo imaging experiment, we show that, independent of the bands employed, spectral measurements cannot uniquely retrieve absorption and scattering coefficients. We also establish that it is, nevertheless, possible to uniquely quantify oxygen saturation and the Mie scattering power, a previously undocumented uniqueness condition. PMID:27367111

  17. Multispectral Remote Sensing at the Savannah River Plant

    SciTech Connect

    Shines, J.E.; Tinney, L.R.; Hawley, D.L.

    1984-01-01

Aerial Measurements Operations (AMO) is the remote sensing arm of the Department of Energy (DOE). The purpose of AMO is to provide timely, accurate, and cost-effective remote sensing data on a non-interference basis over DOE facilities located around the country. One of the programs administered by AMO is the Comprehensive Integrated Remote Sensing (CIRS) program, which involves the use of a wide range of data acquisition systems - aerial cameras, multispectral and infrared scanners, and nuclear detectors - to acquire data at DOE sites. The data are then processed, analyzed and interpreted to provide useful information, which is then catalogued into a data base for future use. This report describes some of the data acquisition and analysis capabilities of the Multispectral Remote Sensing Department (MRSD) as they relate to the CIRS program. 3 tables.

  18. Airborne laser

    NASA Astrophysics Data System (ADS)

    Lamberson, Steven E.

    2002-06-01

    The US Air Force Airborne Laser (ABL) is an airborne, megawatt-class laser system with a state-of-the-art atmospheric compensation system to destroy enemy ballistic missiles at long ranges. This system will provide both deterrence and defense against the use of such weapons during conflicts. This paper provides an overview of the ABL weapon system including: the notional operational concept, the development approach and schedule, the overall aircraft configuration, the technologies being incorporated in the ABL, and the risk reduction approach being utilized to ensure program success.

  19. Mapping crop ground cover using airborne multispectral digital imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Empirical relationships between remotely sensed vegetation indices and density information, such as leaf area index or ground cover (GC), are commonly used to derive spatial information in many precision farming operations. In this study, we modified an existing methodology that does not depend on e...

  20. EVALUATION OF COTTON DEFOLIATION STRATEGIES USING AIRBORNE MULTISPECTRAL IMAGERY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Visual observations and ground measurements are commonly used to evaluate cotton (Gossypium hirsutum L.) harvest aids for defoliation, boll opening, and re-growth control. This paper presents a remote sensing-based method for evaluating the effectiveness of different defoliation treatments. Field ...

  1. An interactive lake survey program. [airborne multispectral sensor image processing

    NASA Technical Reports Server (NTRS)

    Smith, A. Y.

    1977-01-01

    Consideration is given to the development and operation of the interactive lake survey program developed by the Jet Propulsion Laboratory and the Environmental Protection Agency. The program makes it possible to locate, isolate, and store any number of water bodies on the basis of a given digital image. The stored information may be used to generate statistical analyses of each body of water including the lake surface area and the shoreline perimeter. The hardware includes a 360/65 host computer, a Ramtek G100B display controller, and a trackball cursor. The system is illustrated by the LAKELOC operation as it would be applied to a Landsat scene, noting the FARINA and STATUS programs. The water detection algorithm, which increases the accuracy with which water and land data may be separated, is discussed.

  2. Multispectral Thermal Infrared Mapping of Sulfur Dioxide Plumes: A Case Study from the East Rift Zone of Kilauea Volcano, Hawaii

    NASA Technical Reports Server (NTRS)

    Realmuto, V. J.; Sutton, A. J.; Elias, T.

    1996-01-01

    The synoptic perspective and rapid mode of data acquisition provided by remote sensing are well-suited for the study of volcanic SO2 plumes. In this paper we describe a plume-mapping procedure that is based on image data acquired with NASA's airborne Thermal Infrared Multispectral Scanner (TIMS).

  3. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  4. Multispectral inspection station detects defects on apples

    NASA Astrophysics Data System (ADS)

    Throop, James A.; Aneshansley, Daniel J.; Anger, Bill

    2000-12-01

    The performance of a multi-spectral apple inspection station capable of orienting some cultivars, conveying, and presenting apples to a camera at five apples per second is described. Apples are pre-sized and hand placed on the conveying devices to rotate about an axis passing through both the stem and calyx of each apple. An image of each apple is captured at four different wavelengths through a common aperture. Special optics and filters allow simultaneous image capture of apple reflectance for wavelength bands of 540 nm, 650 nm, 750 nm, and 950 nm, each with a bandwidth of approximately 60 nm. As each apple is conveyed laterally and rotated through the camera's field of view, 6 regions of interest representing most of the apple's surface at each wavelength band are captured. The images are processed to segment each defect from the surrounding undamaged tissue and the area of each defect is recorded. Typical defects such as new bruises, bruises on stored apples, scab, sooty blotch, corking, rot, russet, and insect damage are detected. Data is shown quantifying the ability of the inspection station to sort damaged apples into appropriate grades for correct pricing in the processing industry.

  5. Summaries of the Seventh JPL Airborne Earth Science Workshop January 12-16, 1998. Volume 1; AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1998-01-01

    This publication contains the summaries for the Seventh JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 12-16, 1998. The main workshop is divided into three smaller workshops, and each workshop has a volume as follows: (1) Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) Workshop; (2) Airborne Synthetic Aperture Radar (AIRSAR) Workshop; and (3) Thermal Infrared Multispectral Scanner (TIMS) Workshop. This Volume 1 publication contains 58 papers taken from the AVIRIS workshop.

  6. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays in infrared photodetectors known as quantum well infrared photo detectors (QWIPS). QWIPS were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  7. A practical one-shot multispectral imaging system using a single image sensor.

    PubMed

    Monno, Yusuke; Kikuchi, Sunao; Tanaka, Masayuki; Okutomi, Masatoshi

    2015-10-01

Single-sensor imaging using the Bayer color filter array (CFA) and demosaicking is well established for current compact and low-cost color digital cameras. An extension from the CFA to a multispectral filter array (MSFA) enables us to acquire a multispectral image in one shot without increased size or cost. However, multispectral demosaicking for the MSFA has been a challenging problem because of very sparse sampling of each spectral band in the MSFA. In this paper, we propose a high-performance multispectral demosaicking algorithm, and at the same time, a novel MSFA pattern that is suitable for our proposed algorithm. Our key idea is the use of the guided filter to interpolate each spectral band. To generate an effective guide image, in our proposed MSFA pattern, we maintain the sampling density of the G-band as high as the Bayer CFA, and we array each spectral band so that an adaptive kernel can be estimated directly from raw MSFA data. Given these two advantages, we effectively generate the guide image from the most densely sampled G-band using the adaptive kernel. In the experiments, we demonstrate that our proposed algorithm with our proposed MSFA pattern outperforms existing algorithms and provides better color fidelity compared with a conventional color imaging system with the Bayer CFA. We also show some real applications using a multispectral camera prototype we built. PMID:26011882
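
    The guided filter at the heart of the proposed demosaicking is a standard published operator; a generic single-channel implementation is sketched below, where the densely sampled (pre-interpolated) G band would serve as the guide for each sparsely sampled spectral band. The paper's adaptive-kernel estimation from raw MSFA data is not reproduced here.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def guided_filter(guide, src, radius=4, eps=1e-4):
          """Guided filter: smooth `src` while following the edges of `guide`.
          guide, src : 2D float arrays of the same shape."""
          size = 2 * radius + 1
          mean_I = uniform_filter(guide, size)
          mean_p = uniform_filter(src, size)
          corr_Ip = uniform_filter(guide * src, size)
          corr_II = uniform_filter(guide * guide, size)

          cov_Ip = corr_Ip - mean_I * mean_p
          var_I = corr_II - mean_I * mean_I

          a = cov_Ip / (var_I + eps)     # local linear coefficient
          b = mean_p - a * mean_I        # local offset

          return uniform_filter(a, size) * guide + uniform_filter(b, size)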

  8. Synthesis of Multispectral Bands from Hyperspectral Data: Validation Based on Images Acquired by AVIRIS, Hyperion, ALI, and ETM+

    NASA Technical Reports Server (NTRS)

    Blonksi, Slawomir; Gasser, Gerald; Russell, Jeffrey; Ryan, Robert; Terrie, Greg; Zanoni, Vicki

    2001-01-01

Multispectral data requirements for Earth science applications are not always rigorously studied before a new remote sensing system is designed. A study of the spatial resolution, spectral bandpasses, and radiometric sensitivity requirements of real-world applications would focus the design on providing maximum benefits to the end-user community. To support systematic studies of multispectral data requirements, the Applications Research Toolbox (ART) has been developed at NASA's Stennis Space Center. The ART software allows users to create and assess simulated datasets while varying a wide range of system parameters. The simulations are based on data acquired by existing multispectral and hyperspectral instruments. The produced datasets can be further evaluated for specific end-user applications. Spectral synthesis of multispectral images from hyperspectral data is a key part of the ART software. In this process, hyperspectral image cubes are transformed into multispectral imagery without changes in spatial sampling and resolution. The transformation algorithm takes into account spectral responses of both the synthesized, broad, multispectral bands and the utilized, narrow, hyperspectral bands. To validate the spectral synthesis algorithm, simulated multispectral images are compared with images collected near-coincidentally by the Landsat 7 ETM+ and the EO-1 ALI instruments. Hyperspectral images acquired with the airborne AVIRIS instrument and with the Hyperion instrument onboard the EO-1 satellite were used as input data to the presented simulations.
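
    Spectral synthesis of a broad multispectral band from narrow hyperspectral bands reduces to a response-weighted average over the contributing channels. The sketch below assumes the broad band's relative spectral response is available as a callable; in practice a measured response curve would be interpolated to the hyperspectral band centres.

      import numpy as np

      def synthesize_band(hyper_cube, hyper_wavelengths_nm, band_response):
          """Synthesize one broad band from a hyperspectral radiance cube.
          hyper_cube           : (H, W, N) radiance cube.
          hyper_wavelengths_nm : (N,) band-centre wavelengths in nanometres.
          band_response        : callable returning the broad band's relative
                                 spectral response at a given wavelength."""
          weights = np.array([band_response(w) for w in hyper_wavelengths_nm])
          weights = weights / weights.sum()   # normalize so radiance is preserved
          return np.tensordot(hyper_cube, weights, axes=([2], [0]))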

  9. Multi-spectral imaging with infrared sensitive organic light emitting diode.

    PubMed

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R; So, Franky

    2014-01-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxial grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589

  10. Multi-spectral imaging with infrared sensitive organic light emitting diode

    NASA Astrophysics Data System (ADS)

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-08-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxial grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions.

  11. Nikon Camera

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Nikon FM compact has simplification feature derived from cameras designed for easy, yet accurate use in a weightless environment. Innovation is a plastic-cushioned advance lever which advances the film and simultaneously switches on a built in light meter. With a turn of the lens aperture ring, a glowing signal in viewfinder confirms correct exposure.

  12. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation eminating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  13. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  14. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Vargo, T.D.; Lockhart, R.R.; Descour, M.R.; Richards-Kortum, R.

    1999-07-06

    A multispectral imaging method and apparatus are described which are adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging. 5 figs.

  15. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, David R.; Platzbecker, Mark R.; Vargo, Timothy D.; Lockhart, Randal R.; Descour, Michael R.; Richards-Kortum, Rebecca

    1999-01-01

A multispectral imaging method and apparatus adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging.

  16. New uses for the Zeiss KS-153A camera system

    NASA Astrophysics Data System (ADS)

    Spiller, Rudolf H.

    1995-09-01

The Zeiss KS-153A aerial reconnaissance framing camera complements satellite, mapping, and remote sensor data with imagery that is geometrically correct. KS-153A imagery is in a format for tactical 3-D mapping, targeting, and high-resolution intelligence data collection. This system is based upon rugged microprocessor technology that allows a wide variety of mission parameters. Geometrically correct horizon-to-horizon photography, multi-spectral mine detection, stand-off photography, NIRS nine high speed, and very low altitude anti-terrorist surveillance are KS-153A capabilities that have been proven in tests and actual missions. Civilian use of the KS-153A has ranged from measuring flood levels to forest infestations. These are everyday tasks for the KS-153A throughout the world. Zeiss optics have superb spectral response and resolution. Surprisingly effective haze penetration was shown on a day when the pilot himself could not see the terrain. Tests with CCD arrays have also produced outstanding results. This superb spectral response can be used for camouflage detection in wartime, or used for effective environmental control in peacetime, with its ability to detect subtle changes in the signature of vegetation, calling attention to man-induced stress such as disease, drought, and pollution. One serious man-induced problem in many parts of the world deserves even more attention in these times: the locating and safe removal of mines. The KS-153A is currently configured with four different optics. High-acuity horizon-to-horizon Pentalens and Multi-spectral Lens (MUC) modules have been added to the basic KS-153A with Trilens and Telelens. This modular concept nearly meets all of today's airborne reconnaissance requirements. Modern recce programs, for example German Air Force Recce Tornado (GAF Recce), have selected the KS-153A. By simply adding additional focal length lens assemblies to an existing KS-153A configuration, the user can instantly and economically adapt

  17. Multispectral Analysis of Indigenous Rock Art Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Skoog, B.; Helmholz, P.; Belton, D.

    2016-06-01

Multispectral analysis is a widely used technique in the photogrammetric and remote sensing industry. The use of Terrestrial Laser Scanning (TLS) in combination with imagery is becoming increasingly common, with its applications spreading to a wider range of fields. Both systems benefit from being non-contact techniques that can accurately capture data about the target surface. Although multispectral analysis is actively performed within the spatial sciences field, its application within an archaeological context has been limited. This study aims to apply commonly used multispectral techniques to a remote Indigenous site that contains an extensive gallery of aging rock art. The ultimate goal of this research is the development of a systematic procedure that could be applied to numerous similar sites for the purpose of heritage preservation and research. The study consisted of extensive data capture of the rock art gallery using two different TLS systems and a digital SLR camera. The data were combined into a common 2D reference frame that allowed standard image processing to be applied. An unsupervised k-means classifier was applied to the multiband images to detect the different types of rock art present. The result was unsatisfactory, as the subsequent classification accuracy was relatively low. The procedure and technique do, however, show potential, and further testing with different classification algorithms could improve the result significantly.
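
    The classification step, unsupervised k-means applied to the co-registered multiband image, can be sketched as follows; the number of classes is a hypothetical choice that would be tuned to the panel being studied.

      import numpy as np
      from sklearn.cluster import KMeans

      def classify_multiband(image, n_classes=5, random_state=0):
          """Unsupervised k-means classification of a co-registered multiband image.
          image : (H, W, B) array of intensity or reflectance values."""
          H, W, B = image.shape
          pixels = image.reshape(-1, B).astype(np.float64)
          labels = KMeans(n_clusters=n_classes, n_init=10,
                          random_state=random_state).fit_predict(pixels)
          return labels.reshape(H, W)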

  18. Real-time multispectral imaging application for poultry safety inspection

    NASA Astrophysics Data System (ADS)

    Park, Bosoon; Lawrence, Kurt C.; Windham, William R.; Snead, Matthew P.

    2006-02-01

The ARS imaging research group in Athens, Georgia has developed a real-time multispectral imaging system for fecal and ingesta contaminant detection on broiler carcasses for the poultry industry. The industrial-scale system includes a common aperture camera with three visible-wavelength optical trim filters. This paper demonstrates calibration of the common aperture multispectral imaging hardware and the real-time image processing software. A Unified Modeling Language (UML) design approach was used to develop the real-time image processing software for on-line application; the UML models, including class, object, activity, sequence, and collaboration diagrams, are presented. Both hardware and software for real-time fecal and ingesta contaminant detection were tested on a pilot-scale poultry processing line. The test results of the industrial-scale real-time system showed that the multispectral imaging technique performed well for detecting fecal contaminants at a commercial processing speed (currently 140 birds per minute). The accuracy for the detection of fecal and ingesta contaminants was approximately 96%.
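
    The decision rule is not given above; work of this kind commonly applies a two-band ratio followed by a threshold within the carcass mask, which the sketch below assumes, with a purely hypothetical threshold value.

      import numpy as np

      def detect_contaminants(band_a, band_b, carcass_mask, threshold=1.05):
          """Flag pixels whose band ratio exceeds a threshold inside the carcass.
          band_a, band_b : co-registered single-band images (e.g. 565 nm and 517 nm).
          threshold      : hypothetical value, to be tuned on calibration data."""
          ratio = band_a.astype(np.float64) / (band_b.astype(np.float64) + 1e-6)
          return (ratio > threshold) & carcass_mask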

  19. Multispectral slice of APXS

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Portions of Sojourner's Alpha Proton X-Ray Spectrometer (APXS), a deployment spring, and the rock Barnacle Bill are visible in this color image. The image was taken by Sojourner's rear camera, and shows that the APXS made good contact with Barnacle Bill.

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is an operating division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  20. Multispectral scanner (MSS), ERTS-1

    NASA Technical Reports Server (NTRS)

    Arlauskas, J.

    1973-01-01

    The multispectral scanner onboard ERTS-A spacecraft provides simultaneous images in three visible bands and one near infrared band. The instrument employs fiber optics to transfer optical images to the detectors and photomultiplier tubes. Detector outputs are digitized and multiplexed for transmission from the spacecraft by analog to digital processor.

  1. Multispectral Landsat images of Antarctica

    SciTech Connect

    Lucchitta, B.K.; Bowell, J.A.; Edwards, K.L.; Eliason, E.M.; Fergurson, H.M.

    1988-01-01

The U.S. Geological Survey has a program to map Antarctica by using colored, digitally enhanced Landsat multispectral scanner images to increase existing map coverage and to improve upon previously published Landsat maps. This report is a compilation of images and image mosaics that cover four complete and two partial 1:250,000-scale quadrangles of the McMurdo Sound region.

  2. Application of multispectral color photography to flame flow visualization

    NASA Technical Reports Server (NTRS)

    Stoffers, G.

    1979-01-01

For flames of short duration and low radiation intensity, spectroscopic flame diagnostics are difficult. In order to find other means of extracting information about the flame structure from its radiation, the feasibility of using multispectral color photography was successfully evaluated. Since the flame photographs are close-ups, there is considerable parallax between the individual images when several cameras are used, and additive color viewing is not possible. Because each image must be analyzed individually, it is advisable to use color film in all cameras. One can either use color films of different spectral sensitivities or color films of the same type with different color filters. Sharp-cutting filters are recommended.

  3. A comparison of digital multi-spectral imagery versus conventional photography for mapping seagrass in Indian River Lagoon, Florida

    SciTech Connect

    Virnstein, R.; Tepera, M.; Beazley, L.

    1997-06-01

A pilot study is very briefly summarized in the article. The study tested the potential of multi-spectral digital imagery for discrimination of seagrass densities and species, algae, and bottom types. Imagery was obtained with the Compact Airborne Spectral Imager (casi), with two flight lines flown in hyperspectral mode. The photogrammetric method used allowed interpretation of the highest quality product, eliminating limitations caused by outdated or poor quality base maps and the errors associated with transfer of polygons. Initial image analysis indicates that the multi-spectral imagery has several advantages, including sophisticated spectral signature recognition and classification, ease of geo-referencing, and rapid mosaicking.

  4. A two-camera imaging system for pest detection and aerial application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This presentation reports on the design and testing of an airborne two-camera imaging system for pest detection and aerial application assessment. The system consists of two digital cameras with 5616 x 3744 effective pixels. One camera captures normal color images with blue, green and red bands, whi...

  5. Multispectral processing without spectra

    NASA Astrophysics Data System (ADS)

    Drew, Mark S.; Finlayson, Graham D.

    2003-07-01

It is often the case that multiplications of whole spectra, component by component, must be carried out, for example when light reflects from or is transmitted through materials. This leads to particularly taxing calculations, especially in spectrally based ray tracing or radiosity in graphics, making a full-spectrum method prohibitively expensive. Nevertheless, using full spectra is attractive because of the many important phenomena that can be modeled only by using all the physics at hand. We apply to the task of spectral multiplication a method previously used in modeling RGB-based light propagation. We show that we can often multiply spectra without carrying out spectral multiplication. In previous work [J. Opt. Soc. Am. A 11, 1553 (1994)] we developed a method called spectral sharpening, which took camera RGBs to a special sharp basis that was designed to render illuminant change simple to model. Specifically, in the new basis, one can effectively model illuminant change by using a diagonal matrix rather than the 3 x 3 linear transform that results from a three-component finite-dimensional model [G. Healey and D. Slater, J. Opt. Soc. Am. A 11, 3003 (1994)]. We apply this idea of sharpening to the set of principal components vectors derived from a representative set of spectra that might reasonably be encountered in a given application. With respect to the sharp spectral basis, we show that spectral multiplications can be modeled as the multiplication of the basis coefficients. These new product coefficients applied to the sharp basis serve to accurately reconstruct the spectral product. Although the method is quite general, we show how to use spectral modeling by taking advantage of metameric surfaces, ones that match under one light but not another, for tasks such as volume rendering. The use of metamers allows a user to pick out or merge different volume structures in real time simply by changing the lighting. © 2003 Optical Society of America

  6. Multispectral processing without spectra.

    PubMed

    Drew, Mark S; Finlayson, Graham D

    2003-07-01

    It is often the case that multiplications of whole spectra, component by component, must be carried out, for example when light reflects from or is transmitted through materials. This leads to particularly taxing calculations, especially in spectrally based ray tracing or radiosity in graphics, making a full-spectrum method prohibitively expensive. Nevertheless, using full spectra is attractive because of the many important phenomena that can be modeled only by using all the physics at hand. We apply to the task of spectral multiplication a method previously used in modeling RGB-based light propagation. We show that we can often multiply spectra without carrying out spectral multiplication. In previous work [J. Opt. Soc. Am. A 11, 1553 (1994)] we developed a method called spectral sharpening, which took camera RGBs to a special sharp basis that was designed to render illuminant change simple to model. Specifically, in the new basis, one can effectively model illuminant change by using a diagonal matrix rather than the 3 x 3 linear transform that results from a three-component finite-dimensional model [G. Healey and D. Slater, J. Opt. Soc. Am. A 11, 3003 (1994)]. We apply this idea of sharpening to the set of principal components vectors derived from a representative set of spectra that might reasonably be encountered in a given application. With respect to the sharp spectral basis, we show that spectral multiplications can be modeled as the multiplication of the basis coefficients. These new product coefficients applied to the sharp basis serve to accurately reconstruct the spectral product. Although the method is quite general, we show how to use spectral modeling by taking advantage of metameric surfaces, ones that match under one light but not another, for tasks such as volume rendering. The use of metamers allows a user to pick out or merge different volume structures in real time simply by changing the lighting. PMID:12868625

  7. Spectral difference analysis and airborne imaging classification for citrus greening infected trees

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Citrus greening, also called Huanglongbing (HLB), became a devastating disease spread through citrus groves in Florida, since it was first found in 2005. Multispectral (MS) and hyperspectral (HS) airborne images of citrus groves in Florida were acquired to detect citrus greening infected trees in 20...

  8. Mapping giant reed along the Rio Grande using airborne and satellite imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Giant reed (Arundo donax L.) is a perennial invasive weed that presents a severe threat to agroecosystems and riparian areas in the Texas and Mexican portions of the Rio Grande Basin. The objective of this presentation is to give an overview on the use of aerial photography, airborne multispectral a...

  9. Daily evapotranspiration estimates from extrapolating instantaneous airborne remote sensing ET values

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, six extrapolation methods have been compared for their ability to estimate daily crop evapotranspiration (ETd) from instantaneous latent heat flux estimates derived from digital airborne multispectral remote sensing imagery. Data used in this study were collected during an experiment...

  10. Multispectral Analysis of NMR Imagery

    NASA Technical Reports Server (NTRS)

    Butterfield, R. L.; Vannier, M. W. And Associates; Jordan, D.

    1985-01-01

    Conference paper discusses initial efforts to adapt multispectral satellite-image analysis to nuclear magnetic resonance (NMR) scans of human body. Flexibility of these techniques makes it possible to present NMR data in variety of formats, including pseudocolor composite images of pathological internal features. Techniques do not have to be greatly modified from form in which used to produce satellite maps of such Earth features as water, rock, or foliage.

  11. Acquisition of multi-spectral flash image using optimization method via weight map

    NASA Astrophysics Data System (ADS)

    Choi, Bong-Seok; Kim, Dae-Chul; Kwon, Oh-Seol; Ha, Yeong-Ho

    2013-02-01

    To acquire images in low-light environments, it is usually necessary to adopt long exposure times or to resort to flashes. Flashes, however, often induce color distortion, cause the red-eye effect, and can be disturbing to the subjects. On the other hand, long-exposure shots are susceptible to subject motion, as well as motion blur due to camera shake when performed with a hand-held camera. A recently introduced technique to overcome the limitations of traditional low-light photography is the use of the multi-spectral flash. Multi-spectral flash images are a combination of UV/IR and visible spectrum information. The general idea is to retrieve the details from the UV/IR spectrum and the color from the visible spectrum. Multi-spectral flash images, however, are themselves subject to color distortion and noise. In this work, a method of computing multi-spectral flash images so as to reduce the noise and to improve the color accuracy is presented. The proposed method builds on an existing optimization method, improved by introducing a weight map used to discriminate the uniform regions from the detail regions. The optimization target function takes into account the output likelihood with respect to the ambient light image, the sparsity of image gradients, and the spectral constraints for the IR-red and UV-blue channels. The performance of the proposed method was objectively evaluated using long-exposure shots as references.

  12. A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data.

    PubMed

    Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem

    2016-01-01

    The main objective of this study is to assess the value of a machine vision approach for the classification of five types of land cover data: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover types accurately. Multispectral data of these land covers were acquired with a handheld multispectral radiometer in five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features per image. The 30 most discriminant features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). The clustering of the selected texture data was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN: n-class). Using an 80-20 cross-validation split, accuracies of 91.332% for texture data and 96.40% for multispectral data were obtained. PMID:27376088
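
    A minimal sketch of the pipeline this abstract describes (select the most discriminant features, then classify with a neural network under an 80-20 split) is given below. Synthetic data stand in for the 229 texture features per image and the five land-cover classes, and a single univariate ANOVA F-score ranking is used as a stand-in for the composite F + PA + MI selection.

```python
# Hypothetical stand-in for the texture-feature classification pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# 500 samples x 229 texture features, 5 classes (synthetic placeholder data).
X, y = make_classification(n_samples=500, n_features=229, n_informative=40,
                           n_classes=5, random_state=0)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.20, stratify=y, random_state=0)   # 80-20 split

model = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=30),        # keep the 30 most discriminant features
    MLPClassifier(hidden_layer_sizes=(50,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
```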

  13. A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data

    PubMed Central

    Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem

    2016-01-01

    The main objective of this study is to assess the value of a machine vision approach for the classification of five types of land cover data: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover types accurately. Multispectral data of these land covers were acquired with a handheld multispectral radiometer in five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features per image. The 30 most discriminant features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). The clustering of the selected texture data was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN: n-class). Using an 80-20 cross-validation split, accuracies of 91.332% for texture data and 96.40% for multispectral data were obtained. PMID:27376088

  14. Multispectral Microscopic Imager (MMI): Multispectral Imaging of Geological Materials at a Handlens Scale

    NASA Astrophysics Data System (ADS)

    Farmer, J. D.; Nunez, J. I.; Sellar, R. G.; Gardner, P. B.; Manatt, K. S.; Dingizian, A.; Dudik, M. J.; McDonnell, G.; Le, T.; Thomas, J. A.; Chu, K.

    2011-12-01

    The Multispectral Microscopic Imager (MMI) is a prototype instrument presently under development for future astrobiological missions to Mars. The MMI is designed to be an arm-mounted rover instrument for use in characterizing the microtexture and mineralogy of materials along geological traverses [1,2,3]. Such geological information is regarded as essential for interpreting petrogenesis and geological history, and when acquired in near real-time, can support hypothesis-driven exploration and optimize science return. Correlated microtexture and mineralogy also provide essential data for selecting samples for analysis with onboard lab instruments, and for prioritizing samples for potential Earth return. The MMI design employs multispectral light-emitting diodes (LEDs) and an uncooled focal plane array to achieve the low mass (<1 kg), low cost, and high reliability (no moving parts) required for an arm-mounted instrument on a planetary rover [2,3]. The MMI acquires multispectral reflectance images at 62 μm/pixel, in which each image pixel comprises a 21-band VNIR spectrum (0.46 to 1.73 μm). This capability enables the MMI to discriminate and resolve the spatial distribution of minerals and textures at the microscale [2,3]. By extending the spectral range into the infrared and increasing the number of spectral bands, the MMI exceeds the capabilities of current microimagers, including the MER Microscopic Imager (MI) [4], the Phoenix mission Robotic Arm Camera (RAC) [5], and the Mars Science Laboratory's Mars Hand Lens Imager (MAHLI) [6]. In this report we review the capabilities of the MMI by highlighting recent lab and field applications, including: 1) glove box deployments in the Astromaterials lab at Johnson Space Center to analyze Apollo lunar samples; 2) GeoLab glove box deployments during the 2011 Desert RATS field trials in northern AZ to characterize analog materials collected by astronauts during simulated EVAs; 3) field deployments on Mauna Kea

  15. Remote sensing technology - The 24-channel multispectral scanner.

    NASA Technical Reports Server (NTRS)

    Hayre, H. S.; Richard, R. R.

    1973-01-01

    The multispectral scanner system installed in the NASA C-130 aircraft for use in the Earth Observations Aircraft Program constitutes a 24-channel imaging spectrometer that senses electromagnetic energy in the spectral interval from .34 to 13 microns. Energy reflected or emitted from the terrain and system calibration sources is collected by a scan mirror, reflected into collecting optics, and brought to a focus in a plane containing a 0.09 inch square aperture. A dichroic optical element splits the energy which passes through the aperture into two wavelength bands (wavelengths above and below 2 microns). These wavelength bands are then dispersed spectrally into 24 distinct spectral bands by two grating spectrometers. The spectral intervals are transformed into electrical signals by separate detector-preamplifier combinations, and the signals are used as inputs to a video processor in an airborne electronics console. System operation and performance are described.

  16. Improved capabilities of the Multispectral Atmospheric Mapping Sensor (MAMS)

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Batson, K. Bryan; Atkinson, Robert J.; Moeller, Chris C.; Menzel, W. Paul; James, Mark W.

    1989-01-01

    The Multispectral Atmospheric Mapping Sensor (MAMS) is an airborne instrument being investigated as part of NASA's high altitude research program. Findings from work on this and other instruments have been important as the scientific justification of new instrumentation for the Earth Observing System (EOS). This report discusses changes to the instrument which have led to new capabilities, improved data quality, and more accurate calibration methods. In order to provide a summary of the data collected with MAMS, a complete list of flight dates and locations is provided. For many applications, registration of MAMS imagery with landmarks is required. The navigation of this data on the Man-computer Interactive Data Access System (McIDAS) is discussed. Finally, research applications of the data are discussed and specific examples are presented to show the applicability of these measurements to NASA's Earth System Science (ESS) objectives.

  17. Multispectral determination of vegetative cover in corn crop canopy

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.

    1972-01-01

    The relationship between different amounts of vegetative ground cover and the energy reflected by corn canopies was investigated. Low-altitude photography and an airborne multispectral scanner were used to measure this reflected energy. Field plots were laid out, representing four growth stages of corn. Two plot locations were chosen: one on a very dark and one on a very light surface soil. Color and color-infrared photographs were taken from a vertical distance of 10 m. Estimates of ground cover were made from these photographs and were related to field measurements of leaf area index. Ground cover could be predicted from leaf area index measurements by a second-order equation. Microdensitometry and digitization of the three separated dye layers of the color-infrared film showed that the near-infrared dye layer is most valuable in ground cover determinations. Computer analysis of the digitized photography provided an accurate method of determining percent ground cover.

  18. Gimbaled multispectral imaging system and method

    DOEpatents

    Brown, Kevin H.; Crollett, Seferino; Henson, Tammy D.; Napier, Matthew; Stromberg, Peter G.

    2016-01-26

    A gimbaled multispectral imaging system and method is described herein. In a general embodiment, the gimbaled multispectral imaging system has a cross support that defines a first gimbal axis and a second gimbal axis, wherein the cross support is rotatable about the first gimbal axis. The gimbaled multispectral imaging system comprises a telescope that is fixed to an upper end of the cross support, such that rotation of the cross support about the first gimbal axis causes the tilt of the telescope to change. The gimbaled multispectral imaging system includes optics that facilitate on-gimbal detection of visible light and off-gimbal detection of infrared light.

  19. Registration of 3D and multispectral data for the study of cultural heritage surfaces.

    PubMed

    Chane, Camille Simon; Schütze, Rainer; Boochs, Frank; Marzani, Franck S

    2013-01-01

    We present a technique for the multi-sensor registration of featureless datasets based on the photogrammetric tracking of the acquisition systems in use. This method is developed for the in situ study of cultural heritage objects and is tested by digitizing a small canvas successively with a 3D digitization system and a multispectral camera while simultaneously tracking the acquisition systems with four cameras and using a cubic target frame with a side length of 500 mm. The achieved tracking accuracy is better than 0.03 mm spatially and 0.150 mrad angularly. This allows us to seamlessly register the 3D acquisitions and to project the multispectral acquisitions on the 3D model. PMID:23322103

  20. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1995-01-01

    This publication is the first of three containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in this volume; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in Volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  1. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5. The summaries are contained in Volumes 1, 2, and 3, respectively.

  2. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1995-01-01

    This publication is the second volume of the summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop on January 25-26. The summaries for this workshop appear in volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop on January 26. The summaries for this workshop appear in this volume.

  3. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  4. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1995-01-01

    This publication is the third containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in this volume; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  5. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Spectrometer (AVIRIS) workshop, on October 25-26, whose summaries appear in Volume 1; The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27, whose summaries appear in Volume 2; and The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29, whose summaries appear in this volume, Volume 3.

  6. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  7. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1993-01-01

    This is volume 2 of a three volume set of publications that contain the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on October 25-26. The summaries for this workshop appear in Volume 1. The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27. The summaries for this workshop appear in Volume 2. The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29. The summaries for this workshop appear in Volume 3.

  8. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C., on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, October 25-26 (the summaries for this workshop appear in this volume, Volume 1); The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27 (the summaries for this workshop appear in Volume 2); and The Airborne Synthetic Aperture Radar (AIRSAR) workshop, October 28-29 (the summaries for this workshop appear in Volume 3).

  9. Introducing a Low-Cost Mini-UAV for Thermal- and Multispectral-Imaging

    NASA Astrophysics Data System (ADS)

    Bendig, J.; Bolten, A.; Bareth, G.

    2012-07-01

    The trend to minimize electronic devices also applies to Unmanned Airborne Vehicles (UAVs) as well as to sensor technologies and imaging devices. Consequently, it is not surprising that UAVs are already part of our daily life, and the current pace of development will increase civil applications. A well-known and already widespread example is the so-called flying video game based on Parrot's AR.Drone, which is remotely controlled by an iPod, iPhone, or iPad (http://ardrone.parrot.com). The latter can be considered a low-weight and low-cost Mini-UAV. In this contribution a Mini-UAV is considered to weigh less than 5 kg and to be able to carry 0.2 kg to 1.5 kg of sensor payload. While up to now Mini-UAVs like Parrot's AR.Drone have mainly been equipped with RGB cameras for videotaping or imaging, the development of such carrier systems is clearly also moving toward multi-sensor platforms like the ones introduced for larger UAVs (5 to 20 kg) by Jaakkolla et al. (2010) for forestry applications or by Berni et al. (2009) for agricultural applications. The problem when designing a Mini-UAV for multi-sensor imaging is the limitation of payload of up to 1.5 kg and a total weight of the whole system below 5 kg. Consequently, the Mini-UAV without sensors but including navigation system and GPS sensors must weigh less than 3.5 kg. A Mini-UAV system with these characteristics is HiSystems' MK-Okto (www.mikrokopter.de). Total weight including battery without sensors is less than 2.5 kg. Payload of an MK-Okto is approx. 1 kg and maximum speed is around 30 km/h. The MK-Okto can be operated up to a wind speed of less than 19 km/h, which corresponds to Beaufort scale number 3. In our study, the MK-Okto is equipped with a handheld low-weight NEC F30IS thermal imaging system. The F30IS, which was developed for veterinary applications, covers 8 to 13 μm and weighs only 300 g

  10. High resolution multispectral photogrammetric imagery: enhancement, interpretation and evaluations

    NASA Astrophysics Data System (ADS)

    Roberts, Arthur; Haefele, Martin; Bostater, Charles; Becker, Thomas

    2007-10-01

    A variety of aerial mapping cameras were adapted and developed into simulated multiband digital photogrammetric mapping systems. Direct digital multispectral, two multiband cameras (IIS 4-band and Itek 9-band), and paired mapping and reconnaissance cameras were evaluated for digital spectral performance and photogrammetric mapping accuracy in an aquatic environment. Aerial films (24 cm x 24 cm format) tested were Agfa color negative and extended-red (visible and near-infrared) panchromatic, and Kodak color-infrared and B&W (visible and near-infrared) infrared. All films were negative processed to published standards and digitally converted at either 16 (color) or 10 (B&W) microns. Excellent precision in the digital conversions was obtained, with scanning errors of less than one micron. Radiometric data conversion was undertaken using linear density conversion and centered 8-bit histogram exposure. This resulted in multiple 8-bit spectral image bands that were unaltered (not radiometrically enhanced) "optical count" conversions of film density. This provided the best film density conversion to a digital product while retaining the original film density characteristics. Data covering water depth, water quality, surface roughness, and bottom substrate were acquired using different measurement techniques as well as different techniques to locate sampling points on the imagery. Despite extensive efforts to obtain accurate ground truth data, location errors, measurement errors, and variations in the correlation between water depth and remotely sensed signal persisted. These errors must be considered endemic and may not be removed through even the most elaborate sampling setup. Results indicate that multispectral photogrammetric systems offer improved feature mapping capability.

  11. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  12. A multispectral scanner survey of the Tonopah Test Range, Nevada. Date of survey: August 1993

    SciTech Connect

    Brewster, S.B. Jr.; Howard, M.E.; Shines, J.E.

    1994-08-01

    The Multispectral Remote Sensing Department of the Remote Sensing Laboratory conducted an airborne multispectral scanner survey of a portion of the Tonopah Test Range, Nevada. The survey was conducted on August 21 and 22, 1993, using a Daedalus AADS1268 scanner and coincident aerial color photography. Flight altitudes were 5,000 feet (1,524 meters) above ground level for systematic coverage and 1,000 feet (304 meters) for selected areas of special interest. The multispectral scanner survey was initiated as part of an interim and limited investigation conducted to gather preliminary information regarding historical hazardous material release sites which could have environmental impacts. The overall investigation also includes an inventory of environmental restoration sites, a ground-based geophysical survey, and an aerial radiological survey. The multispectral scanner imagery and coincident aerial photography were analyzed for the detection, identification, and mapping of man-made soil disturbances. Several standard image enhancement techniques were applied to the data to assist image interpretation. A geologic ratio enhancement and a color composite consisting of AADS1268 channels 10, 7, and 9 (mid-infrared, red, and near-infrared spectral bands) proved most useful for detecting soil disturbances. A total of 358 disturbance sites were identified on the imagery and mapped using a geographic information system. Of these sites, 326 were located within the Tonopah Test Range while the remaining sites were present on the imagery but outside the site boundary. The mapped site locations are being used to support ongoing field investigations.
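
    The two enhancements mentioned in this record, a band-ratio image and a three-channel false-color composite, can be illustrated with a short sketch. The scanner cube below is a synthetic array, and the channel numbering (1-based channels 10, 7 and 9, as in the abstract) is an assumption about how such a cube would be ordered.

```python
# Illustrative sketch of a ratio enhancement and a false-color composite
# from a multi-channel scanner cube (synthetic placeholder data).
import numpy as np

rng = np.random.default_rng(1)
cube = rng.uniform(0.05, 1.0, size=(12, 256, 256))   # (channels, rows, cols)

def stretch(band):
    # Simple 2-98 percentile linear stretch to 0..1 for display.
    lo, hi = np.percentile(band, (2, 98))
    return np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0)

# Ratio enhancement: dividing bands suppresses illumination/albedo effects and
# highlights spectral differences such as disturbed vs. undisturbed soil.
ratio = cube[9] / (cube[6] + 1e-9)          # channel 10 / channel 7 (0-based indices)

# False-color composite from channels 10 (R), 7 (G), 9 (B).
composite = np.dstack([stretch(cube[9]), stretch(cube[6]), stretch(cube[8])])
print(ratio.shape, composite.shape)
```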

  13. Spectral stratigraphy: multispectral remote sensing as a stratigraphic tool, Wind River/Big Horn basin, Wyoming

    SciTech Connect

    Lang, H.R.; Paylor, E.D.

    1987-05-01

    Stratigraphic and structural analyses of the Wind River and Big Horn basin areas of central Wyoming are in progress. One result has been the development of a new approach to stratigraphic and structural analysis that uses photogeologic and spectral interpretation of multispectral image data to remotely characterize the attitude, thickness, and lithology of strata. New multispectral systems that have only been available since 1982 are used with topographic data to map upper Paleozoic and Mesozoic strata exposed on the southern margin of the Bighorn Mountains. Thematic Mapper (TM) satellite data together with topographic data are used to map lithologic contacts, measure dip and strike, and develop a stratigraphic column that is correlated with conventional surface and subsurface sections. Aircraft-acquired Airborne Imaging Spectrometer and Thermal Infrared Multispectral Scanner data add mineralogical information to the TM column, including the stratigraphic distribution of quartz, calcite, dolomite, montmorillonite, and gypsum. Results illustrate an approach that has general applicability in other geologic investigations that could benefit from remotely acquired information about areal variations in attitude, sequence, thickness, and lithology of strata exposed at the Earth's surface. Application of these methods elsewhere is limited primarily by the availability of multispectral and topographic data and the quality of bedrock exposures.

  14. Joint spatio-spectral based edge detection for multispectral infrared imagery.

    SciTech Connect

    Krishna, Sanjay; Hayat, Majeed M.; Bender, Steven C.; Sharma, Yagya D.; Jang, Woo-Yong; Paskalva, Biliana S.

    2010-06-01

    Image segmentation is one of the most important and difficult tasks in digital image processing. It represents a key stage of automated image analysis and interpretation. Segmentation algorithms for gray-scale images utilize basic properties of intensity values such as discontinuity and similarity. However, it is possible to enhance edge-detection capability by means of spectral information provided by multispectral (MS) or hyperspectral (HS) imagery. In this paper we consider image segmentation algorithms for multispectral images with particular emphasis on the detection of multi-color or multispectral edges. More specifically, we report on an algorithm for joint spatio-spectral (JSS) edge detection. By joint we mean simultaneous utilization of the spatial and spectral characteristics of a given MS or HS image. The JSS-based edge-detection approach, termed the Spectral Ratio Contrast (SRC) edge-detection algorithm, utilizes the novel concept of matching edge signatures. The edge signature represents a combination of spectral ratios calculated using bands that enhance the spectral contrast between the two materials. In conjunction with a spatial mask, the edge signature gives rise to a multispectral operator that can be viewed as a three-dimensional extension of the mask. In the extended mask, the third (spectral) dimension of each hyper-pixel can be chosen independently. The SRC is verified using MS and HS imagery from a quantum-dots-in-a-well infrared (IR) focal plane array and the Airborne Hyperspectral Imager.
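
    A rough sketch of the spectral-ratio-contrast idea is shown below: the edge signature is taken as the vector of band ratios between two materials, and the edge response is high where the ratio of neighbouring pixels matches that signature. The band choice, the simple pairwise mask, and the synthetic two-material scene are all assumptions made for illustration, not the SRC algorithm as published.

```python
# Toy spectral-ratio edge detector on a synthetic two-material scene.
import numpy as np

rng = np.random.default_rng(2)
bands, H, W = 4, 64, 64
mat_a = np.array([0.2, 0.5, 0.7, 0.3])
mat_b = np.array([0.4, 0.25, 0.35, 0.6])

# Synthetic scene: material A on the left half, material B on the right.
img = np.empty((H, W, bands))
img[:, : W // 2] = mat_a
img[:, W // 2 :] = mat_b
img += rng.normal(0, 0.01, img.shape)

signature = mat_a / mat_b                      # spectral-ratio edge signature

# Horizontal edge response: compare each pixel with its right-hand neighbour.
ratios = img[:, :-1, :] / (img[:, 1:, :] + 1e-9)          # (H, W-1, bands)
response = -np.linalg.norm(np.log(np.abs(ratios) + 1e-9) - np.log(signature), axis=2)

col = np.argmax(response.mean(axis=0))
print(f"strongest vertical edge detected between columns {col} and {col + 1}")
```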

  15. A multispectral scanner survey of the United States Department of Energy's Paducah Gaseous Diffusion Plant

    SciTech Connect

    Not Available

    1991-06-01

    Airborne multispectral scanner data of the Paducah Gaseous Diffusion Plant (PGDP) and surrounding area were acquired during late spring 1990. This survey was conducted by the Remote Sensing Laboratory (RSL) which is operated by EG&G Energy Measurements (EG&G/EM) for the US Department of Energy (DOE) Nevada Operations Office. It was requested by the DOE Environmental Audit Team which was reviewing environmental conditions at the facility. The objectives of this survey were to: (1) Acquire 12-channel, multispectral scanner data of the PGDP from an altitude of 3000 feet above ground level (AGL); (2) Acquire predawn, digital thermal infrared (TIR) data of the site from the same altitude; (3) Collect color and color-infrared (CIR) aerial photographs over the facilities; and (4) Illustrate how the analyses of these data could benefit environmental monitoring at the PGDP. This report summarizes the two multispectral scanner and aerial photographic missions at the Paducah Gaseous Diffusion Plant. Selected examples of the multispectral data are presented to illustrate its potential for aiding environmental management at the site. 4 refs., 1 fig., 2 tabs.

  16. PORTABLE MULTISPECTRAL IMAGING INSTRUMENT FOR FOOD INDUSTRY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this paper is to design and fabricate a hand-held multispectral instrument for real-time contaminant detection. Specifically, the protocol to develop a portable multispectral instrument including optical sensor design, fabrication, calibration, data collection, analysis and algorith...

  17. A multispectral sorting device for wheat kernels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A low-cost multispectral sorting device was constructed using three visible and three near-infrared light-emitting diodes (LED) with peak emission wavelengths of 470 nm (blue), 527 nm (green), 624 nm (red), 850 nm, 940 nm, and 1070 nm. The multispectral data were collected by rapidly (~12 kHz) blin...

  18. Multispectral Image Processing for Plants

    NASA Technical Reports Server (NTRS)

    Miles, Gaines E.

    1991-01-01

    The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.

  19. Estimation of thermal flux and emissivity of the land surface from multispectral aircraft data

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.

    1989-01-01

    In order to evaluate the importance of surface thermal flux and emissivity variations on surface and boundary layer processes, a technique that uses thermal data from an airborne multispectral scanner to determine the surface skin temperature and thermal emissivity over a regional area has been developed. These values are used to estimate the total flux density emanating from the surface and at the top of the atmosphere. Data from the multispectral atmospheric mapping sensor (MAMS) collected during the First ISLSCP Field Experiment (FIFE) are used to develop the technique, and to show the time and space variability of the flux values. The ground truth data available during FIFE provide a unique resource to evaluate this technique.
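
    One small piece of the flux estimation described above, converting a retrieved skin temperature and broadband emissivity into an emitted flux density, can be written directly from the Stefan-Boltzmann law. This is only a minimal sketch with placeholder values, not the MAMS/FIFE processing itself, which also accounts for atmospheric and reflected terms.

```python
# Surface-emitted flux density from skin temperature and emissivity
# (Stefan-Boltzmann law); the sample numbers are illustrative only.
SIGMA = 5.670374419e-8        # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_flux(skin_temp_k: float, emissivity: float) -> float:
    """Surface-emitted flux density in W m^-2."""
    return emissivity * SIGMA * skin_temp_k ** 4

print(f"{emitted_flux(305.0, 0.97):.1f} W/m^2")   # ~476 W/m^2 for a 305 K surface
```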

  20. Multispectral thermal infrared mapping of the 1 October 1988 Kupaianaha flow field, Kilauea volcano, Hawaii

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J.; Hon, Ken; Kahle, Anne B.; Abbott, Elsa A.; Pieri, David C.

    1992-01-01

    Multispectral thermal infrared radiance measurements of the Kupaianaha flow field were acquired with the NASA airborne Thermal Infrared Multispectral Scanner (TIMS) on the morning of 1 October 1988. The TIMS data were used to map both the temperature and emissivity of the surface of the flow field. The temperature map depicted the underground storage and transport of lava. The presence of molten lava in a tube or tumulus resulted in surface temperatures that were at least 10 C above ambient. The temperature map also clearly defined the boundaries of hydrothermal plumes which resulted from the entry of lava into the ocean. The emissivity map revealed the boundaries between individual flow units within the Kupaianaha field. Distinct spectral anomalies, indicative of silica-rich surface materials, were mapped near fumaroles and ocean entry sites. This apparent enrichment in silica may have resulted from an acid-induced leaching of cations from the surfaces of glassy flows.

  1. Monitoring of maize chlorophyll content based on multispectral vegetation indices

    NASA Astrophysics Data System (ADS)

    Sun, Hong; Li, Minzan; Zheng, Lihua; Zhang, Yane; Zhang, Yajing

    2012-11-01

    In order to estimate the nutrient status of maize, multi-spectral images were used to monitor the chlorophyll content in the field. The experiments were conducted under three different fertilizer treatments (High, Normal and Low). A multispectral CCD camera was used to collect ground-based images of the maize canopy in green (G, 520~600nm), red (R, 630~690nm) and near-infrared (NIR, 760~900nm) bands. Leaves of maize were randomly sampled to determine the chlorophyll content with a UV-Vis spectrophotometer. The images were processed in three steps: image preprocessing, canopy segmentation and parameter calculation. First, median filtering was used to improve the visual contrast of the image. Second, the leaves of the maize canopy were segmented in the NIR image. Third, the average gray values (GIA, RIA and NIRIA) and the vegetation indices (DVI, RVI, NDVI, etc.) widely used in remote sensing were calculated. A new vegetation index, the combination of normalized difference vegetation index (CNDVI), was developed. After correlation analysis between the image parameters and chlorophyll content, six parameters (GIA, RIA, NIRIA, GRVI, GNDVI and CNDVI) were selected to estimate chlorophyll content at the shooting and trumpet stages, respectively. The results of the MLR prediction models showed that R2 was 0.88 and the adjusted R2 was 0.64 at the shooting stage; R2 was 0.77 and the adjusted R2 was 0.31 at the trumpet stage. This indicates that vegetation indices derived from multispectral images can be used to monitor chlorophyll content, providing a feasible method for chlorophyll content detection.
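
    The image-parameter step described above can be sketched in a few lines: segment the canopy on the NIR band, then compute average grey values and standard vegetation indices from the G, R and NIR channels. The arrays are synthetic placeholders, the GRVI definition used here (NIR/G) is an assumption, and the paper's new CNDVI is not reproduced because its definition is not given in the abstract.

```python
# Canopy segmentation and vegetation-index calculation from G/R/NIR bands
# (synthetic placeholder imagery).
import numpy as np

rng = np.random.default_rng(3)
G = rng.uniform(0.05, 0.4, (100, 100))
R = rng.uniform(0.05, 0.3, (100, 100))
NIR = rng.uniform(0.2, 0.9, (100, 100))

canopy = NIR > 0.5                       # crude canopy segmentation on the NIR band

GIA, RIA, NIRIA = (band[canopy].mean() for band in (G, R, NIR))   # average grey values

DVI = NIRIA - RIA
RVI = NIRIA / RIA
NDVI = (NIRIA - RIA) / (NIRIA + RIA)
GNDVI = (NIRIA - GIA) / (NIRIA + GIA)
GRVI = NIRIA / GIA                       # assumed green ratio vegetation index

print(f"NDVI={NDVI:.3f}  GNDVI={GNDVI:.3f}  RVI={RVI:.2f}  GRVI={GRVI:.2f}")
```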

  2. Large Multispectral and Albedo Panoramas Acquired by the Pancam Instruments on the Mars Exploration Rovers Spirit and Opportunity

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Arneson, H. M.; Farrand, W. H.; Goetz, W.; Hayes, A. G.; Herkenhoff, K.; Johnson, M. J.; Johnson, J. R.; Joseph, J.; Kinch, K.

    2005-01-01

    Introduction. The panoramic camera (Pancam) multispectral, stereoscopic imaging systems on the Mars Exploration Rovers Spirit and Opportunity [1] have acquired and downlinked more than 45,000 images (35 Gbits of data) over more than 700 combined sols of operation on Mars as of early January 2005. A large subset of these images was acquired as part of 26 large multispectral and/or broadband "albedo" panoramas (15 on Spirit, 11 on Opportunity) covering large ranges of azimuth (12 spanning 360°) and designed to characterize major regional color and albedo characteristics of the landing sites and various points along both rover traverses.

  3. Proactive PTZ Camera Control

    NASA Astrophysics Data System (ADS)

    Qureshi, Faisal Z.; Terzopoulos, Demetri

    We present a visual sensor network—comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras—capable of automatically capturing closeup video of selected pedestrians in a designated area. The passive cameras can track multiple pedestrians simultaneously and any PTZ camera can observe a single pedestrian at a time. We propose a strategy for proactive PTZ camera control where cameras plan ahead to select optimal camera assignment and handoff with respect to predefined observational goals. The passive cameras supply tracking information that is used to control the PTZ cameras.

  4. Time-of-Flight Microwave Camera

    NASA Astrophysics Data System (ADS)

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.

  5. Time-of-Flight Microwave Camera

    PubMed Central

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598

  6. Time-of-Flight Microwave Camera.

    PubMed

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598

  7. Classification of emerald based on multispectral image and PCA

    NASA Astrophysics Data System (ADS)

    Yang, Weiping; Zhao, Dazun; Huang, Qingmei; Ren, Pengyuan; Feng, Jie; Zhang, Xiaoyan

    2005-02-01

    Traditionally, the grade discrimination and classification of bowlders (emeralds) are implemented using methods based on people's experience. In our previous work, a method based on the NCS (Natural Color System) color system and sRGB color space conversion was employed for a coarse grade classification of emeralds. However, it is well known that the color match of two colors is not a true "match" unless their spectra are the same. Because metameric colors cannot be differentiated by a three-channel (RGB) camera, a multispectral camera (MSC) is used as the image capturing device in this paper. It consists of a trichromatic digital camera and a set of wide-band filters. The spectra are obtained by measuring a series of natural bowlder (emerald) samples. The principal component analysis (PCA) method is employed to obtain spectral eigenvectors. During the fine classification, the color difference and the RMS of the spectrum difference between the estimated and original spectra are used as criteria. It has been shown that 6 eigenvectors are enough to reconstruct the reflection spectra of the testing samples.
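
    A minimal sketch of the fine-classification criterion described in this abstract is given below: reconstruct a test reflectance spectrum from the first six principal components of the sample set and score it by the RMS spectral difference. The spectra are synthetic placeholders and the colour-difference term is omitted.

```python
# Reconstruct a reflectance spectrum from six spectral eigenvectors (PCA)
# and score the reconstruction by its RMS difference from the measurement.
import numpy as np

rng = np.random.default_rng(4)
wl = np.linspace(400, 700, 61)
samples = np.array([0.3 + 0.2 * np.sin(wl / 50.0 + p) for p in rng.uniform(0, 3, 80)])

mean = samples.mean(axis=0)
_, _, Vt = np.linalg.svd(samples - mean, full_matrices=False)
basis = Vt[:6]                                   # six spectral eigenvectors

test = 0.3 + 0.2 * np.sin(wl / 50.0 + 1.2) + rng.normal(0, 0.005, wl.size)
coeffs = basis @ (test - mean)                   # project onto the eigenvectors
estimate = mean + coeffs @ basis                 # reconstructed spectrum

rms = np.sqrt(np.mean((estimate - test) ** 2))
print(f"RMS spectral difference: {rms:.4f}")
```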

  8. Comparative performance between compressed and uncompressed airborne imagery

    NASA Astrophysics Data System (ADS)

    Phan, Chung; Rupp, Ronald; Agarwal, Sanjeev; Trang, Anh; Nair, Sumesh

    2008-04-01

    The US Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD), Countermine Division is evaluating the compressibility of airborne multi-spectral imagery for mine and minefield detection applications. Of particular interest is assessing the highest image data compression rate that can be afforded without loss of image quality for war fighters in the loop or degradation of near-real-time mine detection algorithm performance. The JPEG-2000 compression standard is used to perform data compression. Both lossless and lossy compression are considered. A multi-spectral anomaly detector such as RX (Reed & Xiaoli), which is widely used as a core baseline algorithm in airborne mine and minefield detection on different mine types, minefields, and terrains to identify potential individual targets, is used to compare mine detection performance. This paper presents the compression scheme and compares detection performance between compressed and uncompressed imagery for various levels of compression. The compression efficiency is evaluated, and its dependence upon different backgrounds and other factors is documented and presented using multi-spectral data.
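
    The evaluation loop sketched below scores every pixel of a multi-band image with the RX (Reed-Xiaoli) anomaly detector and repeats the scoring on a degraded copy, mirroring the compressed-versus-uncompressed comparison described above. Coarse quantisation is used here only as a stand-in for JPEG-2000, and the scene with one implanted anomaly is synthetic.

```python
# RX anomaly scores on an original and a crudely degraded multi-band image.
import numpy as np

rng = np.random.default_rng(5)
H, W, B = 64, 64, 6
scene = rng.multivariate_normal(np.full(B, 0.3), 0.001 * np.eye(B), size=(H, W))
scene[32, 32] += 0.15                            # implanted anomalous pixel

def rx_scores(cube):
    # Mahalanobis distance of every pixel from the global background statistics.
    X = cube.reshape(-1, cube.shape[-1])
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d).reshape(cube.shape[:2])

def quantise(cube, levels=32):
    # Crude lossy degradation standing in for heavy JPEG-2000 compression.
    return np.round(cube * levels) / levels

for name, data in [("original", scene), ("degraded", quantise(scene))]:
    scores = rx_scores(data)
    print(f"{name}: peak RX score at {np.unravel_index(scores.argmax(), scores.shape)}")
```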

  9. Computer simulator for training operators of thermal cameras

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof; Krupski, Marcin

    2004-08-01

    A PC-based image generator SIMTERM developed for training operators of non-airborne military thermal imaging systems is presented in this paper. SIMTERM allows its users to generate images closely resembling thermal images of many military type targets at different scenarios obtained with the simulated thermal camera. High fidelity of simulation was achieved due to use of measurable parameters of thermal camera as input data. Two modified versions of this computer simulator developed for designers and test teams are presented, too.

  10. On-board multispectral classification study

    NASA Technical Reports Server (NTRS)

    Ewalt, D.

    1979-01-01

    The factors relating to onboard multispectral classification were investigated. The functions implemented in ground-based processing systems for current Earth observation sensors were reviewed. The Multispectral Scanner, Thematic Mapper, Return Beam Vidicon, and Heat Capacity Mapper were studied. The concept of classification was reviewed and extended from the ground-based image processing functions to an onboard system capable of multispectral classification. Eight different onboard configurations, each with varying amounts of ground-spacecraft interaction, were evaluated. Each configuration was evaluated in terms of turnaround time, onboard processing and storage requirements, geometric and classification accuracy, onboard complexity, and ancillary data required from the ground.

  11. An operational multispectral scanner for bathymetric surveys - The ABS NORDA scanner

    NASA Technical Reports Server (NTRS)

    Haimbach, Stephen P.; Joy, Richard T.; Hickman, G. Daniel

    1987-01-01

    The Naval Ocean Research and Development Activity (NORDA) is developing the Airborne Bathymetric Survey (ABS) system, which will take shallow-water depth soundings from a Navy P-3 aircraft. The system combines active and passive sensors to obtain optical measurements of water depth. The ABS NORDA Scanner is the system's passive multispectral scanner, whose design goal is to provide 100 percent coverage of the seafloor to depths of 20 m in average coastal waters. The ABS NORDA Scanner hardware and operational environment are discussed in detail. The optical model providing the basis for depth extraction is reviewed, and the proposed data processing routine is discussed.

  12. Assessment of Pen Branch delta and corridor vegetation changes using multispectral scanner data 1992--1994

    SciTech Connect

    1996-01-01

    Airborne multispectral scanner data were used to monitor natural succession of wetland vegetation species over a three-year period from 1992 through 1994 for Pen Branch on the Savannah River Site in South Carolina. Image processing techniques were used to identify and measure wetland vegetation communities in the lower portion of the Pen Branch corridor and delta. The study provided a reliable means for monitoring medium- and large-scale changes in a diverse environment. Findings from the study will be used to support decisions regarding remediation efforts following the cessation of cooling water discharge from K reactor at the Department of Energy's Savannah River Site in South Carolina.

  13. Combining transverse field detectors and color filter arrays to improve multispectral imaging systems.

    PubMed

    Martínez, Miguel A; Valero, Eva M; Hernández-Andrés, Javier; Romero, Javier; Langfelder, Giacomo

    2014-05-01

    This work focuses on the improvement of a multispectral imaging sensor based on transverse field detectors (TFDs). We aimed to achieve a higher color and spectral accuracy in the estimation of spectral reflectances from sensor responses. Such an improvement was done by combining these recently developed silicon-based sensors with color filter arrays (CFAs). Consequently, we sacrificed the filter-less full spatial resolution property of TFDs to narrow down the spectrally broad sensitivities of these sensors. We designed and performed several experiments to test the influence of different design features on the estimation quality (type of sensor, tunability, interleaved polarization, use of CFAs, type of CFAs, number of shots), some of which are exclusive to TFDs. We compared systems that use a TFD with systems that use normal monochrome sensors, both combined with multispectral CFAs as well as common RGB filters present in commercial digital color cameras. Results showed that a system that combines TFDs and CFAs performs better than systems with the same type of multispectral CFA and other sensors, or even the same TFDs combined with different kinds of filters used in common imaging systems. We propose CFA+TFD-based systems with one or two shots, depending on the possibility of using longer capturing times or not. Improved TFD systems thus emerge as an interesting possibility for multispectral acquisition, which overcomes the limited accuracy found in previous studies. PMID:24921886
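
    The estimation step this study evaluates, recovering a reflectance spectrum from a small number of sensor responses, can be sketched with a simple linear (pseudoinverse) estimator trained on known patches. The channel sensitivities and reflectances below are synthetic placeholders; real TFD/CFA responses and a Wiener or regularized estimator would replace them in practice.

```python
# Linear spectral-reflectance estimation from multi-channel sensor responses.
import numpy as np

rng = np.random.default_rng(6)
n_wl, n_ch = 31, 6                               # wavelengths, sensor channels

sensitivities = np.abs(rng.normal(size=(n_ch, n_wl)))     # broad channel responses
train_refl = rng.uniform(0.05, 0.95, size=(200, n_wl))    # training reflectances
test_refl = rng.uniform(0.05, 0.95, size=(20, n_wl))

def responses(refl):
    # Camera model: channel response = sensitivity . reflectance (illuminant folded in).
    return refl @ sensitivities.T

# Least-squares (pseudoinverse) estimator mapping responses back to spectra.
W = np.linalg.lstsq(responses(train_refl), train_refl, rcond=None)[0]

est = responses(test_refl) @ W
rmse = np.sqrt(np.mean((est - test_refl) ** 2))
print(f"spectral RMSE on held-out patches: {rmse:.3f}")
```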

  14. Uav Multispectral Survey to Map Soil and Crop for Precision Farming Applications

    NASA Astrophysics Data System (ADS)

    Sonaa, Giovanna; Passoni, Daniele; Pinto, Livio; Pagliari, Diana; Masseroni, Daniele; Ortuani, Bianca; Facchi, Arianna

    2016-06-01

    New sensors mounted on UAVs and optimal procedures for survey, data acquisition and analysis are continuously developed and tested for applications in precision farming. Procedures to integrate multispectral aerial data about soil and crop with ground-based proximal geophysical data are a recent research topic aimed at delineating homogeneous zones for the management of agricultural inputs (i.e., water, nutrients). Multispectral and multitemporal orthomosaics were produced over a test field (a 100 m x 200 m plot within a maize field) to map vegetation and soil indices, as well as crop heights, with suitable ground resolution. UAV flights were performed at two moments during the crop season: before sowing on bare soil, and just before flowering when maize was nearly at its maximum height. Two cameras, for color (RGB) and false color (NIR-RG) images, were used. The images were processed in Agisoft Photoscan to produce Digital Surface Models (DSM) of bare soil and crop, and multispectral orthophotos. To overcome some difficulties in the automatic search for matching points during block adjustment of the crop images, the scientific software developed by the Politecnico of Milan was also used to enhance image orientation. The surveys and image processing are described, as well as results of the classification of the multispectral-multitemporal orthophotos and soil indices.

  15. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven-color push-frame imager with a 90° field of view in monochrome mode and a 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength-dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
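
    The wavelength-dependent radial distortion mentioned above can be illustrated with a one-term radial model in which each band carries its own coefficient. The coefficients, band labels and focal-plane coordinates below are made-up values for illustration, not the published LROC camera model.

```python
# Per-band one-term radial distortion model applied to normalized
# focal-plane coordinates (hypothetical coefficients).
def apply_radial(x, y, k1):
    """Apply a one-term radial distortion model to normalized coordinates."""
    r2 = x ** 2 + y ** 2
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

band_k1 = {"415nm": -2.1e-2, "566nm": -1.8e-2, "604nm": -1.7e-2}   # hypothetical

x, y = 0.30, -0.12                       # a detector sample in normalized units
for band, k1 in band_k1.items():
    xb, yb = apply_radial(x, y, k1)
    print(f"{band}: ({xb:.4f}, {yb:.4f})")
```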

  16. Multispectral Scanner for Monitoring Plants

    NASA Technical Reports Server (NTRS)

    Gat, Nahum

    2004-01-01

    A multispectral scanner has been adapted to capture spectral images of living plants under various types of illumination for purposes of monitoring the health of, or monitoring the transfer of genes into, the plants. In a health-monitoring application, the plants are illuminated with full-spectrum visible and near-infrared light and the scanner is used to acquire a reflected-light spectral signature known to be indicative of the health of the plants. In a gene-transfer-monitoring application, the plants are illuminated with blue or ultraviolet light and the scanner is used to capture fluorescence images from a green fluorescent protein (GFP) that is expressed as a result of the gene transfer. The choice of the illumination wavelength and of the fluorescence wavelength to be monitored depends on the specific GFP.

  17. Multispectral sensing of moisture stress

    NASA Technical Reports Server (NTRS)

    Olson, C. E., Jr.

    1970-01-01

    Laboratory reflectance data and field tests with multispectral remote sensors support the hypothesis that differences in moisture content and water deficits are closely related to foliar reflectance from woody plants. When these relationships are taken into account, automatic recognition techniques become more powerful than when they are ignored. Evidence is increasing that moisture relationships inside plant foliage are much more closely related to foliar reflectance characteristics than are external variables such as soil moisture, wind, and air temperature. Short-term changes in water deficits seem to have little influence on foliar reflectance, however. This is in distinct contrast to the significant short-term changes in foliar emittance from the same plants with changing wind, air temperature, incident radiation, or water deficit conditions.

  18. Cinematic camera emulation using two-dimensional color transforms

    NASA Astrophysics Data System (ADS)

    McElvain, Jon S.; Gish, Walter

    2015-02-01

    For cinematic and episodic productions, on-set look management is an important component of the creative process, and involves iterative adjustments of the set, actors, lighting and camera configuration. Instead of using the professional cinematic camera to establish a particular look, a smaller-form-factor DSLR is considered for this purpose because of its greater agility. Because the spectral response characteristics differ between the two camera systems, a camera emulation transform is needed to approximate the behavior of the destination camera. Recently, two-dimensional transforms have been shown to provide high-accuracy conversion of raw camera signals to a defined colorimetric state. In this study, the same formalism is used for camera emulation, whereby a Canon 5D Mark III DSLR is used to approximate the behavior of a Red Epic cinematic camera. The spectral response characteristics of both cameras were measured and used to build 2D as well as 3x3 matrix emulation transforms. When tested on multispectral image databases, the 2D emulation transforms outperform their matrix counterparts, particularly for images containing highly chromatic content.
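
    As a rough illustration of the 3x3 matrix baseline against which the 2D transforms are compared, the sketch below fits a linear emulation matrix from paired raw RGB samples of a source and a destination camera by least squares. The function names, data layout, and plain least-squares fit are assumptions for illustration, not the paper's actual procedure.

```python
# Hedged sketch: fit a 3x3 matrix that maps source-camera raw RGB to
# destination-camera raw RGB from paired training samples.
import numpy as np

def fit_matrix_emulation(src_rgb, dst_rgb):
    """src_rgb, dst_rgb: (n_samples, 3) paired raw responses to the same scenes."""
    X, *_ = np.linalg.lstsq(src_rgb, dst_rgb, rcond=None)   # solves src @ X = dst
    return X.T                                               # so that dst ≈ M @ src

def apply_emulation(M, image):
    """image: (rows, cols, 3) raw RGB from the source camera."""
    return np.einsum('ij,rcj->rci', M, image)
```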

  19. Experimental Advanced Airborne Research Lidar (EAARL) Data Processing Manual

    USGS Publications Warehouse

    Bonisteel, Jamie M.; Nayegandhi, Amar; Wright, C. Wayne; Brock, John C.; Nagle, David

    2009-01-01

    Each pulse is focused into an illumination area with a radius of about 20 centimeters on the ground. The pulse-repetition frequency of the EAARL transmitter varies along each across-track scan to produce equal cross-track sample spacing and near-uniform density (Nayegandhi and others, 2006). Targets can have varying physical and optical characteristics that cause extreme fluctuations in laser backscatter complexity and signal strength. To accommodate this dynamic range, EAARL has the real-time ability to detect, capture, and automatically adapt to each laser return backscatter. The backscattered energy is collected by an array of four high-speed waveform digitizers connected to an array of four sub-nanosecond photodetectors. Each of the four photodetectors receives a finite range of the returning laser backscatter photons: the most sensitive channel receives 90% of the photons, the least sensitive receives 0.9%, and the middle channel receives 9% (Wright and Brock, 2002). The fourth channel is available for detection but is not currently being utilized. All four channels are digitized simultaneously into 65,536 samples for every laser pulse. The receiver optics consist of a 15-centimeter-diameter dielectric-coated Newtonian telescope, a computer-driven raster scanning mirror oscillating at 12.5 hertz (25 rasters per second), and an array of sub-nanosecond photodetectors. The backscattered signal from the pulsed laser transmitter is collected by the optical telescope receiver, and a photomultiplier tube (PMT) then converts the optical energy into electrical impulses (Nayegandhi and others, 2006). In addition to the full-waveform resolving laser, the EAARL sensor suite includes a down-looking 70-centimeter-resolution Red-Green-Blue (RGB) digital network camera, a high-resolution (14-centimeter) color-infrared (CIR) multispectral camera, two precision dual-frequency kinematic carrier-phase global positioning system (GPS) receivers, and an

  20. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification (gain) of a CCD or CMOS camera when only the raw images are available. However, the equation provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem, but the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation and a test of the method.
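
    A minimal sketch of a Janesick-style photon-transfer gain estimate extended with dark-frame terms, in the spirit of the adjustment described above. The frame naming, the mean-variance formulation, and the specific correction are common-practice assumptions rather than the article's published equation.

```python
# Hypothetical sketch of a photon-transfer gain estimate corrected for the
# dark-current contribution, so it can be applied to room-temperature cameras.
import numpy as np

def estimate_gain(flat1, flat2, dark1, dark2):
    """Return gain in electrons per digital number (e-/DN).

    flat1, flat2: two raw frames under identical, uniform illumination.
    dark1, dark2: two raw dark frames with the same exposure time.
    """
    flat1, flat2, dark1, dark2 = (np.asarray(f, dtype=np.float64)
                                  for f in (flat1, flat2, dark1, dark2))
    # Signal attributable to light alone: mean flat level minus mean dark level.
    signal = (flat1.mean() + flat2.mean() - dark1.mean() - dark2.mean()) / 2.0
    # Differencing identical frames removes fixed-pattern noise; the variance of
    # the difference is twice the temporal noise variance of a single frame.
    var_flat = np.var(flat1 - flat2) / 2.0
    var_dark = np.var(dark1 - dark2) / 2.0
    # Shot-noise-limited relation: variance (DN^2) = signal (DN) / gain (e-/DN),
    # after subtracting the dark-current and read-noise contribution.
    return signal / (var_flat - var_dark)
```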

  1. Color image reproduction based on multispectral and multiprimary imaging: experimental evaluation

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Masahiro; Teraji, Taishi; Ohsawa, Kenro; Uchiyama, Toshio; Motomura, Hideto; Murakami, Yuri; Ohyama, Nagaaki

    2001-12-01

    Multispectral imaging is a significant technology for the acquisition and display of accurate color information. Natural color reproduction under arbitrary illumination becomes possible using spectral information about both the image and the illuminating light. In addition, multiprimary color display, i.e., display using more than three primary colors, has also been developed to reproduce an expanded color gamut and to reduce observer metamerism. In this paper, we present the concept of multispectral data interchange for natural color reproduction, and experimental results using a 16-band multispectral camera and a 6-primary color display. In the experiment, the accuracy of color reproduction is evaluated in CIE ΔE*ab for both the image capture and display systems. The average and maximum ΔE*ab were 1.0 and 2.1 for the 16-band multispectral camera system, using the 24 Macbeth color patches. For the six-primary color projection display, the average and maximum ΔE*ab were 1.3 and 2.7 over 30 test colors inside the display gamut. Moreover, color reproductions with different spectral distributions but the same CIE tristimulus values were compared visually, and it was confirmed that the 6-primary display gives improved agreement between the original and reproduced colors.
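
    For reference, the error metric quoted above is the CIE 1976 color difference; a minimal sketch of its computation over a set of patches is given below, assuming the reference and reproduced colors are already expressed in CIELAB coordinates.

```python
# Minimal sketch of the CIE 1976 color-difference metric (Delta E*ab).
import numpy as np

def delta_e_ab(lab_ref, lab_test):
    """CIE 1976 Delta E*ab: Euclidean distance between two Lab colors."""
    lab_ref = np.asarray(lab_ref, dtype=np.float64)
    lab_test = np.asarray(lab_test, dtype=np.float64)
    return np.sqrt(np.sum((lab_ref - lab_test) ** 2, axis=-1))

# Example usage over a patch chart (arrays of shape (n_patches, 3)):
# errors = delta_e_ab(measured_lab, reproduced_lab)
# print(errors.mean(), errors.max())
```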

  2. Use of a Multispectral UAV Photogrammetry for Detection and Tracking of Forest Disturbance Dynamics

    NASA Astrophysics Data System (ADS)

    Minařík, R.; Langhammer, J.

    2016-06-01

    This study presents a new methodological approach for assessing the spatial and qualitative aspects of forest disturbance, based on UAV photogrammetry with a multispectral imaging camera. We used the miniaturized multispectral sensor Tetracam Micro Multiple Camera Array (μ-MCA) Snap 6 on a multirotor imaging platform to acquire multispectral imagery with high spatial resolution. The study area is located in the Sumava Mountains, Central Europe, heavily affected by windstorms followed by extensive and repeated bark beetle (Ips typographus [L.]) outbreaks over the past 20 years. After two decades, forest disturbance is still spreading, while forest vegetation is rapidly regenerating, with associated changes in species composition and diversity. To test the suggested methodology, we carried out an imaging campaign over an experimental site covering various stages of forest disturbance and regeneration. The high spatial and spectral resolution of the imagery enabled analysis of the inner structure and dynamics of these processes. The most informative bands for detecting tree stress caused by bark beetle infestation are band 2 (650 nm) and band 3 (700 nm), followed by band 4 (800 nm), from the red-edge and NIR parts of the spectrum. We identified only three indices that seem able to correctly detect the different forest disturbance categories under the complex conditions of mixed categories: the Normalized Difference Vegetation Index (NDVI), the 800/650 simple ratio (pigment-specific simple ratio b1), and the Red-edge Index.
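
    The sketch below computes per-pixel versions of the three indices named above from an assumed band layout (650 nm red, 700 nm red edge, 800 nm NIR); the band assignments and the exact form of the red-edge index are illustrative assumptions, not the study's code.

```python
# Illustrative per-pixel computation of the disturbance-sensitive indices.
import numpy as np

def disturbance_indices(red_650, red_edge_700, nir_800):
    red, edge, nir = (np.asarray(b, dtype=np.float64)
                      for b in (red_650, red_edge_700, nir_800))
    eps = 1e-9                                       # guard against division by zero
    ndvi = (nir - red) / (nir + red + eps)           # Normalized Difference Vegetation Index
    pssr_b1 = nir / (red + eps)                      # 800/650 simple ratio (PSSRb1)
    red_edge_index = nir / (edge + eps)              # simple red-edge ratio (assumed form)
    return ndvi, pssr_b1, red_edge_index
```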

  3. Single sensor that outputs narrowband multispectral images

    NASA Astrophysics Data System (ADS)

    Kong, Linghua; Yi, Dingrong; Sprigle, Stephen; Wang, Fengtao; Wang, Chao; Liu, Fuhan; Adibi, Ali; Tummala, Rao

    2010-01-01

    We report the development of a hand-held (miniaturized), low-cost, stand-alone, real-time, narrow-bandwidth multispectral imaging device for the detection of early-stage pressure ulcers.

  4. Multispectral imaging with vertical silicon nanowires

    PubMed Central

    Park, Hyunsung; Crozier, Kenneth B.

    2013-01-01

    Multispectral imaging is a powerful tool that extends the capabilities of the human eye. However, multispectral imaging systems generally are expensive and bulky, and multiple exposures are needed. Here, we report the demonstration of a compact multispectral imaging system that uses vertical silicon nanowires to realize a filter array. Multiple filter functions covering visible to near-infrared (NIR) wavelengths are simultaneously defined in a single lithography step using a single material (silicon). Nanowires are then etched and embedded into polydimethylsiloxane (PDMS), thereby realizing a device with eight filter functions. By attaching it to a monochrome silicon image sensor, we successfully realize an all-silicon multispectral imaging system. We demonstrate visible and NIR imaging. We show that the latter is highly sensitive to vegetation and furthermore enables imaging through objects opaque to the eye. PMID:23955156

  5. Study on multispectral imaging detection and recognition

    NASA Astrophysics Data System (ADS)

    Jun, Wang; Na, Ding; Gao, Jiaobo; Yu, Hu; Jun, Wu; Li, Junna; Zheng, Yawei; Fei, Gao; Sun, Kefeng

    2009-07-01

    Multispectral imaging detection uses the spectral and spatial distribution of target radiation, and the relationship between spectrum and image, to detect targets and perform remote sensing measurements. Its characteristics are multiple channels, narrow bandwidths, a large amount of information, and high accuracy. It improves the ability to detect targets under conditions of clutter, camouflage, concealment and deception. At present, spectral imaging technology in both the multispectral and hyperspectral ranges is developing rapidly. Multispectral imaging equipment on unmanned aerial vehicles can be used for mine detection and for intelligence, surveillance and reconnaissance. Imaging spectrometers operating in the MWIR and LWIR have already been applied in the remote sensing and military fields in advanced countries. This paper presents a multispectral imaging technology that enhances the reflectance, scattering and radiation contrast of artificial targets against natural backgrounds, so that targets in complex backgrounds and camouflaged or stealthy targets can be effectively identified. Experimental results and spectral imaging data are presented.

  6. Simultaneous denoising and compression of multispectral images

    NASA Astrophysics Data System (ADS)

    Hagag, Ahmed; Amin, Mohamed; Abd El-Samie, Fathi E.

    2013-01-01

    A new technique for denoising and compression of multispectral satellite images, designed to remove the effect of noise on the compression process, is presented. One type of multispectral image is considered: Landsat Enhanced Thematic Mapper Plus (ETM+). The discrete wavelet transform (DWT), the dual-tree DWT, and a simple Huffman coder are used in the compression process. Simulation results show that the proposed technique is more effective than traditional compression-only techniques.
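
    A rough sketch of the denoise-then-compress idea for a single spectral band is shown below; PyWavelets is assumed for the DWT, the universal soft threshold is a common stand-in for the paper's denoising rule, and zlib replaces the hand-written Huffman coder.

```python
# Illustrative denoise-then-compress pipeline for one band of a multispectral image.
import numpy as np
import pywt
import zlib

def denoise_band(band, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(band.astype(np.float64), wavelet, level=level)
    # Universal threshold estimated from the finest-scale diagonal detail.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(band.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, thresh, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

def compress_band(band, step=1.0):
    quantized = np.round(band / step).astype(np.int16)
    return zlib.compress(quantized.tobytes())   # stands in for the Huffman coder
```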

  7. Multispectral Palmprint Recognition Using a Quaternion Matrix

    PubMed Central

    Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng

    2012-01-01

    Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, thus they can achieve better recognition accuracy. Previously, multispectral palmprint images were taken as a kind of multi-modal biometrics, and the fusion scheme on the image level or matching score level was used. However, some spectral information will be lost during image level or matching score level fusion. In this study, we propose a new method for multispectral images based on a quaternion model which could fully utilize the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix, then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied respectively on the matrix to extract palmprint features. After that, Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of two distances and the nearest neighborhood classifier were employed for recognition decision. Experimental results showed that using the quaternion matrix can achieve a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%. PMID:22666049

  8. Pancam: A Multispectral Imaging Investigation on the NASA 2003 Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.

    2003-01-01

    One of the six science payload elements carried on each of the NASA Mars Exploration Rovers (MER; Figure 1) is the Panoramic Camera System, or Pancam. Pancam consists of three major components: a pair of digital CCD cameras, the Pancam Mast Assembly (PMA), and a radiometric calibration target. The PMA provides the azimuth and elevation actuation for the cameras as well as a 1.5-meter-high vantage point from which to image. The calibration target provides a set of reference color and grayscale standards for calibration validation, and a shadow post for quantification of the direct vs. diffuse illumination of the scene. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover in up to 12 unique wavelengths. The major characteristics of Pancam are summarized.

  9. MAPPING GRAIN SORGHUM YIELD VARIABILITY USING AIRBORNE DIGITAL VIDEOGRAPHY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Mapping crop yield variability is one important aspect of precision agriculture. This study was designed to assess airborne digital videography as a tool for mapping grain sorghum yields for precision farming. Color-infrared (CIR) imagery was acquired with a three-camera digital video imaging sys...

  10. Estimating atmospheric parameters and reducing noise for multispectral imaging

    DOEpatents

    Conger, James Lynn

    2014-02-25

    A method and system for estimating atmospheric radiance and transmittance. An atmospheric estimation system is divided into a first phase and a second phase. The first phase inputs an observed multispectral image and an initial estimate of the atmospheric radiance and transmittance for each spectral band and calculates the atmospheric radiance and transmittance for each spectral band, which can be used to generate a "corrected" multispectral image that is an estimate of the surface multispectral image. The second phase inputs the observed multispectral image and the surface multispectral image that was generated by the first phase and removes noise from the surface multispectral image by smoothing out change in average deviations of temperatures.
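
    An illustrative sketch of the first-phase correction implied above: given per-band estimates of atmospheric (path) radiance and transmittance, an estimate of the surface-leaving signal is recovered from the observed multispectral image. The variable names and the simple linear at-sensor model are assumptions for illustration, not the patented method.

```python
# Per-band atmospheric correction under a simple linear at-sensor model.
import numpy as np

def correct_multispectral(observed, path_radiance, transmittance):
    """observed: (bands, rows, cols); path_radiance, transmittance: (bands,)."""
    observed = np.asarray(observed, dtype=np.float64)
    L_path = np.asarray(path_radiance, dtype=np.float64)[:, None, None]
    tau = np.asarray(transmittance, dtype=np.float64)[:, None, None]
    # At-sensor model: L_obs = tau * L_surface + L_path  =>  invert for L_surface.
    return (observed - L_path) / np.clip(tau, 1e-6, None)
```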

  11. MESSENGER multispectral observations of Mercury (Invited)

    NASA Astrophysics Data System (ADS)

    Denevi, B. W.; Robinson, M. S.; Solomon, S. C.; Murchie, S. L.; Blewett, D. T.; Domingue, D. L.; McCoy, T. J.; Ernst, C. M.; Head, J. W.; Watters, T. R.; Chabot, N. L.

    2009-12-01

    MESSENGER’s first two flybys of Mercury provide new insights into Mercury’s dynamic past and reveal a planet rich in color and compositional diversity. Including images from Mariner 10, over 90% of Mercury has now been observed at resolutions >2 km/pixel, and 80% of the planet has been observed in the 11 colors of the Mercury Dual Imaging System’s wide-angle camera (WAC). The multispectral WAC images confirm the existence of color variations correlated to geologic terrains such as smooth plains deposits and crater and basin ejecta, as well as more diffuse variations that, in some cases, have not yet been linked to particular geologic features. Smooth plains, many of which have been interpreted to be of volcanic origin, cover nearly 40% of the mapped surface. What fraction of the smooth plains formed through volcanism, as opposed to originating during impact events (as impact melt or basin ejecta), is not yet known. Globally, smooth plains do not appear to have a single color signature but instead show a range of color and reflectance nearly as large as that observed on Mercury as a whole, and on par with the contrast variations observed among the lunar maria. This observation suggests that the smooth plains have a range of compositions. Reflectance spectra are consistent with the presence of low-FeO silicates, as well as a spectrally neutral opaque component in varying abundances. Intercrater plains are similar in color and reflectance to the intermediate- to low-reflectance smooth plains, perhaps indicating a similar composition and/or origin. Color, and likely compositional, end-members include pyroclastic deposits (relatively high in reflectance with a steeper spectral slope) and low-reflectance material (LRM, with a shallower spectral slope). Pyroclastic materials, which on other bodies can originate in the mantle, provide insight into the composition of the source regions. LRM is typically concentrated in crater and basin ejecta excavated from depths as

  12. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

    This paper describes image evaluation techniques used to standardize camera system characterizations. The authors' group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity-focused diode (PFD) image intensifiers, electrostatic image tubes, or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.

  13. Multispectral fundus imaging for early detection of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Beach, James M.; Tiedeman, James S.; Hopkins, Mark F.; Sabharwal, Yashvinder S.

    1999-04-01

    Functional imaging of the retina and associated structures may provide information for early assessment of the risk of developing retinopathy in diabetic patients. Here we show results of retinal oximetry performed using multispectral reflectance imaging techniques to assess hemoglobin (Hb) oxygen saturation (OS) in blood vessels of the inner retina and oxygen utilization at the optic nerve in diabetic patients without retinopathy or with early disease, during experimental hyperglycemia. Retinal images were obtained through a fundus camera and recorded simultaneously at up to four wavelengths using image-splitting modules coupled to a digital camera. Changes in OS in large retinal vessels, in average OS in disk tissue, and in the reduced state of cytochrome oxidase (CO) at the disk were determined from changes in reflectance associated with the oxidation/reduction states of Hb and CO. A step increase in blood glucose lowered venous oxygen saturation to a degree dependent on disease duration. A moderate increase in glucose produced higher levels of reduced CO in both the disk and surrounding tissue without a detectable change in average tissue OS. The results suggest that regulation of retinal blood supply and oxygen consumption are altered by hyperglycemia and that such functional changes are present before clinical signs of retinopathy.

  14. Airborne oceanographic lidar system

    NASA Technical Reports Server (NTRS)

    Bressel, C.; Itzkan, I.; Nunes, J. E.; Hoge, F.

    1977-01-01

    The characteristics of an Airborne Oceanographic Lidar (AOL) are given. The AOL system is described and its potential for various measurement applications including bathymetry and fluorosensing is discussed.

  15. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  16. Novel fundus camera design

    NASA Astrophysics Data System (ADS)

    Dehoog, Edward A.

    A fundus camera is a complex optical system that makes use of the principle of reflex-free indirect ophthalmoscopy to image the retina. Despite being in existence since the early 1900s, little has changed in the design of the fundus camera, and there is minimal information about the design principles utilized. Parameters and specifications involved in the design of a fundus camera are determined, and their effect on system performance is discussed. Fundus cameras incorporating different design methods are modeled, and a performance evaluation based on design parameters is used to determine the effectiveness of each design strategy. By determining the design principles involved in the fundus camera, new cameras can be designed to include specific imaging modalities such as optical coherence tomography, imaging spectroscopy and imaging polarimetry to gather additional information about the properties and structure of the retina. Design principles used to incorporate such modalities into fundus camera systems are discussed. Design, implementation and testing of a snapshot polarimeter fundus camera are demonstrated.

  17. Advanced camera for surveys

    NASA Astrophysics Data System (ADS)

    Clampin, Mark; Ford, Holland C.; Bartko, Frank; Bely, Pierre Y.; Broadhurst, Tom; Burrows, Christopher J.; Cheng, Edward S.; Crocker, James H.; Franx, Marijn; Feldman, Paul D.; Golimowski, David A.; Hartig, George F.; Illingworth, Garth; Kimble, Randy A.; Lesser, Michael P.; Miley, George H.; Postman, Marc; Rafal, Marc D.; Rosati, Piero; Sparks, William B.; Tsvetanov, Zlatan; White, Richard L.; Sullivan, Pamela; Volmer, Paul; LaJeunesse, Tom

    2000-07-01

    The Advanced Camera for Surveys (ACS) is a third generation instrument for the Hubble Space Telescope (HST). It is currently planned for installation in HST during the fourth servicing mission in Summer 2001. The ACS will have three cameras.

  18. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  19. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  20. Adaptation of the Camera Link Interface for Flight-Instrument Applications

    NASA Technical Reports Server (NTRS)

    Randall, David P.; Mahoney, John C.

    2010-01-01

    COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.

  1. Multispectral Imaging from Mars PATHFINDER

    NASA Technical Reports Server (NTRS)

    Farrand, William H.; Bell, James F., III; Johnson, Jeffrey R.; Bishop, Janice L.; Morris, Richard V.

    2007-01-01

    The Imager for Mars Pathfinder (IMP) was a mast-mounted instrument on the Mars Pathfinder lander, which landed on Mars' Ares Vallis floodplain on July 4, 1997. During the 83 sols of Mars Pathfinder's landed operations, the IMP collected over 16,600 images. Multispectral images were collected using twelve narrowband filters at wavelengths between 400 and 1000 nm in the visible and near-infrared (VNIR) range. The IMP provided VNIR spectra of the materials surrounding the lander, including rocks, bright soils, dark soils, and atmospheric observations. During the primary mission, only a single primary rock spectral class, Gray Rock, was recognized; since then, a second class, Black Rock, has been identified. Black Rock spectra have a stronger absorption at longer wavelengths than Gray Rock spectra. A number of coated rocks have also been described, the Red and Maroon Rock classes, and perhaps indurated soils in the form of the Pink Rock class. A number of different soil types were also recognized, the primary ones being Bright Red Drift, Dark Soil, Brown Soil, and Disturbed Soil. Examination of spectral parameter plots indicated two trends, which were interpreted as representing alteration products formed in at least two different environmental epochs of the Ares Vallis area. Subsequent analysis of the data and comparison with terrestrial analogs have supported the interpretation that the rock coatings provide evidence of earlier martian environments. However, the presence of relatively uncoated examples of the Gray and Black rock classes indicates that relatively unweathered materials can persist on the martian surface.

  2. Pedestrian detection by multispectral fusion

    NASA Astrophysics Data System (ADS)

    Ma, Yunqian; Wang, Zheng; Bazakos, Mike

    2006-04-01

    Security systems increasingly rely on Automated Video Surveillance (AVS) technology. In particular, digital video lends itself to internet and local communications, remote monitoring, and computer processing. AVS systems can perform many tedious and repetitive tasks currently performed by trained security personnel. AVS technology has already made significant steps towards automating basic security functions such as motion detection, object tracking and event-based video recording. However, there are still many problems associated with these automated functions that need to be addressed further. Examples include the high false-alarm rate and the loss of track under total or partial occlusion when systems are used over a wide range of operational parameters (day, night, sunshine, cloud, fog, range, viewing angle, clutter, etc.). Current surveillance systems work well only under a narrow range of operational parameters and therefore need to be hardened against a wider range of conditions. In this paper, we present a multispectral fusion approach to perform accurate pedestrian segmentation under varying operational parameters. Our fusion method combines the best detection results from the visible images with the best from the thermal images. Commonly, motion detection results in visible images are easily affected by noise and shadows. Objects in the thermal images are relatively stable, but parts of the objects may be missing because they thermally blend with the background. Our method makes use of the best object components from each modality and de-emphasizes the rest.

  3. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side by side comparison of three nanosecond frame cameras, examining them for both performance and operational characteristics. The cameras include; Micro-Channel Plate/CCD, Image Diode/CCD and Image Diode/Film; combinations of gating/data recording. The advantages and disadvantages of each device will be discussed.

  4. Plenoptic processing methods for distributed camera arrays

    NASA Astrophysics Data System (ADS)

    Boyle, Frank A.; Yancey, Jerry W.; Maleh, Ray; Deignan, Paul

    2011-05-01

    Recent advances in digital photography have enabled the development and demonstration of plenoptic cameras with impressive capabilities. They function by recording sub-aperture images that can be combined to refocus images or to generate stereoscopic pairs. Plenoptic methods are being explored for fusing images from distributed arrays of cameras, with a view toward applications in which hardware resources are limited (e.g., size, weight and power constraints). Through computer simulation and experimental studies, the influence of non-idealities such as camera position uncertainty is being considered, and component image rescaling and balancing methods are being explored to compensate. Of interest is the impact on precision passive ranging and super-resolution. In a preliminary experiment, a set of images from a camera array was recorded and merged to form a 3D representation of a scene. Conventional plenoptic refocusing was demonstrated and techniques were explored for balancing the images. Nonlinear methods for combining the images were explored to limit the ghosting caused by sub-sampling. Plenoptic processing was also explored as a means for determining 3D information from airborne video: successive frames were processed as camera array elements to extract the heights of structures. Practical means were considered for rendering the 3D information in color.
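
    A conceptual sketch of plenoptic-style refocusing for a distributed array is shown below: each sub-aperture image is shifted according to its baseline and a chosen focal depth, then averaged. The pinhole disparity model, the use of scipy.ndimage.shift, and the parameter names are illustrative assumptions, not the study's implementation.

```python
# Shift-and-add refocusing over a set of sub-aperture (camera-array) images.
import numpy as np
from scipy.ndimage import shift

def refocus(images, baselines_m, focal_px, depth_m):
    """images: list of (rows, cols) arrays; baselines_m: (N, 2) camera offsets in meters."""
    acc = np.zeros_like(images[0], dtype=np.float64)
    for img, (bx, by) in zip(images, baselines_m):
        # Parallax of a point at depth_m relative to the reference camera.
        dx, dy = focal_px * bx / depth_m, focal_px * by / depth_m
        acc += shift(img.astype(np.float64), (dy, dx), order=1)
    return acc / len(images)   # objects at depth_m align; others blur
```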

  5. Land mine detection using multispectral image fusion

    SciTech Connect

    Clark, G.A.; Sengupta, S.K.; Aimonetti, W.D.; Roeske, F.; Donetti, J.G.; Fields, D.J.; Sherwood, R.J.; Schaich, P.C.

    1995-03-29

    Our system fuses information contained in registered images from multiple sensors to reduce the effects of clutter and improve the ability to detect surface and buried land mines. The sensor suite currently consists of a camera that acquires images in six bands (400 nm, 500 nm, 600 nm, 700 nm, 800 nm and 900 nm). Past research has shown that it is extremely difficult to distinguish land mines from background clutter in images obtained from a single sensor. It is hypothesized, however, that information fused from a suite of various sensors is likely to provide better detection reliability, because the suite of sensors detects a variety of physical properties that are more separable in feature space. The materials surrounding the mines can include natural materials (soil, rocks, foliage, water, etc.) and some artifacts. We use a supervised learning pattern recognition approach to detecting the metal and plastic land mines. The overall process consists of four main parts: preprocessing, feature extraction, feature selection, and classification. These parts are used in a two-step process to classify a subimage. We extract features from the images, and use feature selection algorithms to select only the most important features according to their contribution to correct detections. This allows us to save computational complexity and determine which of the spectral bands add value to the detection system. The most important features from the various sensors are fused using a supervised learning pattern classifier (the probabilistic neural network). We present results of experiments to detect land mines from real data collected from an airborne platform, and evaluate the usefulness of fusing feature information from multiple spectral bands.

  6. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  7. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

    This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components, including microchannel plate or proximity-focused diode image intensifiers, electrostatic image tubes, and electron-bombarded CCDs, affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  8. Harpicon camera for HDTV

    NASA Astrophysics Data System (ADS)

    Tanada, Jun

    1992-08-01

    Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision") cameras, the specifications were different from those of present-day system cameras, and cameras using all kinds of components, with different arrangements of components and different appearances, were developed into products, with time spent on experimentation, design, fabrication, adjustment, and inspection. But recently the know-how built up thus far in components, printed circuit boards, and wiring methods has been incorporated into camera fabrication, making it possible to make HDTV cameras by methods similar to those of the present system. In addition, more-efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanical parts, and software for both HDTV cameras and cameras that operate by the present system.

  9. Airborne sensor systems under development at the NASA/NSTL/Earth Resources Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, James E.; Meeks, Gerald R.

    1988-01-01

    The operational characteristics of the Airborne Bathymetric System (ABS) MSS and the Airborne Multispectral Pushbroom Scanner (AMPS), which are currently being developed at NASA's Earth Resources Laboratory (ERL), are described. The ABS MSS scans through a swath of ±40° from nadir, and the sensor incorporates onboard calibration references for the visible and short-wavelength IR channels. The AMPS uses five separate f/1.8 refractive telecentric lens systems, each incorporating nine optical elements and a replaceable fixed-bandwidth filter.

  10. Research on airborne infrared leakage detection of natural gas pipeline

    NASA Astrophysics Data System (ADS)

    Tan, Dongjie; Xu, Bin; Xu, Xu; Wang, Hongchao; Yu, Dongliang; Tian, Shengjie

    2011-12-01

    An airborne laser remote sensing technology is proposed to detect natural gas pipeline leakage from a helicopter carrying a detector that can map traces of methane on the ground at high spatial resolution. The principle of the airborne laser remote sensing system is based on tunable diode laser absorption spectroscopy (TDLAS). The system consists of an optical unit containing the laser, a camera, a helicopter mount, an electronic unit with a DGPS antenna, a notebook computer and a pilot monitor, and it is mounted on a helicopter. The principle and the architecture of the airborne laser remote sensing system are presented. Field tests were carried out on the West-East Natural Gas Pipeline of China, and the results show that the airborne detection method is suitable for detecting pipeline gas leaks over plains, deserts and hills, but not for areas with large variations in altitude.

  11. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  12. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  13. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  14. Rapid multispectral endoscopic imaging system for near real-time mapping of the mucosa blood supply in the lung

    PubMed Central

    Fawzy, Yasser; Lam, Stephen; Zeng, Haishan

    2015-01-01

    We have developed a fast multispectral endoscopic imaging system that is capable of acquiring images in 18 optimized spectral bands spanning 400-760 nm by combining a customized light source with six triple-band filters and a standard color CCD camera. A method is developed to calibrate the spectral response of the CCD camera. Imaging speed of 15 spectral image cubes/second is achieved. A spectral analysis algorithm based on a linear matrix inversion approach is developed and implemented in a graphics processing unit (GPU) to map the mucosa blood supply in the lung in vivo. Clinical measurements on human lung patients are demonstrated. PMID:26309761
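
    A hedged sketch of the kind of linear spectral analysis described above: each pixel's measured attenuation is modeled as a linear mix of chromophore spectra and inverted by least squares to map the blood supply. The extinction matrix E, the two-chromophore (oxy/deoxy-hemoglobin) model, and the CPU implementation are illustrative assumptions; the authors' GPU implementation and calibrated spectra are not reproduced.

```python
# Least-squares spectral unmixing of a multispectral cube into chromophore maps.
import numpy as np

def unmix_hemoglobin(attenuation, E):
    """attenuation: (bands, rows, cols); E: (bands, n_chromophores) extinction matrix."""
    bands, rows, cols = attenuation.shape
    A = attenuation.reshape(bands, -1)                    # one column per pixel
    conc, *_ = np.linalg.lstsq(E, A, rcond=None)          # pseudo-inverse solve
    conc = conc.reshape(-1, rows, cols)
    hbo2, hb = conc[0], conc[1]                           # assumes columns 0, 1 of E
    saturation = hbo2 / np.clip(hbo2 + hb, 1e-9, None)    # oxygen-saturation map
    return conc, saturation
```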

  15. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as to give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  16. Applying Neural Networks to Hyperspectral and Multispectral Field Data for Discrimination of Cruciferous Weeds in Winter Crops

    PubMed Central

    de Castro, Ana-Isabel; Jurado-Expósito, Montserrat; Gómez-Casero, María-Teresa; López-Granados, Francisca

    2012-01-01

    In the context of detection of weeds in crops for site-specific weed control, on-ground spectral reflectance measurements are the first step in determining the potential of remote spectral data to classify weeds and crops. Field studies were conducted over four years at different locations in Spain. We aimed to distinguish cruciferous weeds in wheat and broad bean crops, using hyperspectral and multispectral readings in the visible and near-infrared spectrum. To identify differences in reflectance between cruciferous weeds, we applied three classification methods: stepwise discriminant (STEPDISC) analysis and two neural networks, specifically a multilayer perceptron (MLP) and a radial basis function (RBF) network. Hyperspectral and multispectral signatures of cruciferous weeds, wheat and broad bean crops can be classified using STEPDISC analysis and the MLP and RBF neural networks with varying success, with the MLP model being the most accurate, achieving 100% (or at least 98.1%) classification performance in all years. Classification accuracy from hyperspectral signatures was similar to that from multispectral signatures and spectral indices, suggesting that little advantage would be obtained by using more expensive airborne hyperspectral imagery. Therefore, for future investigations, we recommend using multispectral remote imagery to explore whether it can discriminate these weeds and crops. PMID:22629171
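
    A minimal sketch of the MLP classification step on per-sample spectral features is given below; scikit-learn's MLPClassifier stands in for the networks used in the study, and the feature layout, train/test split, and network size are assumptions.

```python
# Illustrative MLP classification of band reflectances (crop vs. cruciferous weed).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def classify_spectra(X, y):
    """X: (n_samples, n_bands) reflectances or spectral indices; y: class labels."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    return model, model.score(X_te, y_te)   # held-out classification accuracy
```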

  17. Geometric calibration and accuracy assessment of a multispectral imager on UAVs

    NASA Astrophysics Data System (ADS)

    Zheng, Fengjie; Yu, Tao; Chen, Xingfeng; Chen, Jiping; Yuan, Guoti

    2012-11-01

    The increasing development of Unmanned Aerial Vehicle (UAV) platforms and associated sensing technologies has widely promoted UAV remote sensing applications. UAVs, especially low-cost UAVs, limit the sensor payload in weight and dimension. The cameras flown on UAVs are mostly panoramic, fisheye-lens, or small-format planar CCD array cameras; unknown intrinsic parameters and lens distortion cause serious image aberrations, leading to ground positioning errors of a few meters to tens of meters per pixel. The high spatial resolution, however, makes accurate geolocation all the more critical for UAV quantitative remote sensing research. A calibration method for the MCC4-12F Multispectral Imager designed for UAVs has been developed and implemented. A multi-image space resection algorithm was used to estimate geometric calibration parameters from images taken at arbitrary positions and different photogrammetric altitudes in a 3D test field, an approach suitable for multispectral cameras. Both theoretical and practical accuracy assessments were carried out. The theoretical assessment, which resolves differences between object-space and image-point coordinates by space intersection, showed object-space RMSEs of 0.2 and 0.14 pixels in the X and Y directions, and image-space RMSEs better than 0.5 pixels. To verify the accuracy and reliability of the calibration parameters, a practical study was carried out in UAV flight experiments over Tianjin; the corrected accuracy, validated with ground checkpoints, was better than 0.3 m. Typical surface reflectance retrieved from the geo-rectified data was compared with ground ASD measurements, showing a 4% discrepancy. Hence, the approach presented here is suitable for UAV multispectral imagers.

  18. Hyperspectral and multispectral bioluminescence optical tomography for small animal imaging.

    PubMed

    Chaudhari, Abhijit J; Darvas, Felix; Bading, James R; Moats, Rex A; Conti, Peter S; Smith, Desmond J; Cherry, Simon R; Leahy, Richard M

    2005-12-01

    For bioluminescence imaging studies in small animals, it is important to be able to accurately localize the three-dimensional (3D) distribution of the underlying bioluminescent source. The spectrum of light produced by the source that escapes the subject varies with the depth of the emission source because of the wavelength-dependence of the optical properties of tissue. Consequently, multispectral or hyperspectral data acquisition should help in the 3D localization of deep sources. In this paper, we describe a framework for fully 3D bioluminescence tomographic image acquisition and reconstruction that exploits spectral information. We describe regularized tomographic reconstruction techniques that use semi-infinite slab or FEM-based diffusion approximations of photon transport through turbid media. Singular value decomposition analysis was used for data dimensionality reduction and to illustrate the advantage of using hyperspectral rather than achromatic data. Simulation studies in an atlas-mouse geometry indicated that sub-millimeter resolution may be attainable given accurate knowledge of the optical properties of the animal. A fixed arrangement of mirrors and a single CCD camera were used for simultaneous acquisition of multispectral imaging data over most of the surface of the animal. Phantom studies conducted using this system demonstrated our ability to accurately localize deep point-like sources and show that a resolution of 1.5 to 2.2 mm for depths up to 6 mm can be achieved. We also include an in vivo study of a mouse with a brain tumour expressing firefly luciferase. Co-registration of the reconstructed 3D bioluminescent image with magnetic resonance images indicated good anatomical localization of the tumour. PMID:16306643

  19. Hyperspectral and multispectral bioluminescence optical tomography for small animal imaging

    NASA Astrophysics Data System (ADS)

    Chaudhari, Abhijit J.; Darvas, Felix; Bading, James R.; Moats, Rex A.; Conti, Peter S.; Smith, Desmond J.; Cherry, Simon R.; Leahy, Richard M.

    2005-12-01

    For bioluminescence imaging studies in small animals, it is important to be able to accurately localize the three-dimensional (3D) distribution of the underlying bioluminescent source. The spectrum of light produced by the source that escapes the subject varies with the depth of the emission source because of the wavelength-dependence of the optical properties of tissue. Consequently, multispectral or hyperspectral data acquisition should help in the 3D localization of deep sources. In this paper, we describe a framework for fully 3D bioluminescence tomographic image acquisition and reconstruction that exploits spectral information. We describe regularized tomographic reconstruction techniques that use semi-infinite slab or FEM-based diffusion approximations of photon transport through turbid media. Singular value decomposition analysis was used for data dimensionality reduction and to illustrate the advantage of using hyperspectral rather than achromatic data. Simulation studies in an atlas-mouse geometry indicated that sub-millimeter resolution may be attainable given accurate knowledge of the optical properties of the animal. A fixed arrangement of mirrors and a single CCD camera were used for simultaneous acquisition of multispectral imaging data over most of the surface of the animal. Phantom studies conducted using this system demonstrated our ability to accurately localize deep point-like sources and show that a resolution of 1.5 to 2.2 mm for depths up to 6 mm can be achieved. We also include an in vivo study of a mouse with a brain tumour expressing firefly luciferase. Co-registration of the reconstructed 3D bioluminescent image with magnetic resonance images indicated good anatomical localization of the tumour.
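
    A minimal sketch of a regularized (Tikhonov) reconstruction step of the kind used in such tomographic inversions: given a forward matrix A mapping source voxels to multispectral surface measurements b, the damped least-squares problem is solved directly. The forward model itself (the diffusion approximation) and the regularization strength are not reproduced here and are assumptions.

```python
# Tikhonov-regularized linear reconstruction of a bioluminescent source distribution.
import numpy as np

def tikhonov_reconstruct(A, b, lam=1e-3):
    """Solve min ||A x - b||^2 + lam * ||x||^2 for the source vector x."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```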

  20. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera s image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target s fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
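
    A hedged sketch of the calibration-data pipeline that ACAL automates: detect the fiducial marks of a target in each image, pair them with their known 3D locations, and fit a camera model. OpenCV's chessboard detector and pinhole model stand in for the mission-specific targets and camera models; the pattern size and square size are assumptions.

```python
# Illustrative camera-model calibration from target images using OpenCV.
import cv2
import numpy as np

def calibrate_from_images(images, pattern=(9, 6), square_size=0.02):
    # Known 3D fiducial locations on the planar target (Z = 0 plane).
    obj = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    obj[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_size
    obj_points, img_points = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:                          # measured 2D fiducial locations
            obj_points.append(obj)
            img_points.append(corners)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    return rms, K, dist                    # reprojection error, intrinsics, distortion
```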

  1. A new lossless compression algorithm for satellite earth science multi-spectral imagers

    NASA Astrophysics Data System (ADS)

    Gladkova, Irina; Gottipati, Srikanth; Grossberg, Michael

    2007-09-01

    Multispectral imaging is becoming an increasingly important tool for monitoring the earth and its environment from spaceborne and airborne platforms. Multispectral imaging data consist of visible and IR measurements of a scene across space and spectrum. Growing data rates resulting from faster scanning and finer spatial and spectral resolution make compression an increasingly critical tool for reducing data volume for transmission and archiving. Examples of multispectral sensors we consider include the NASA 36-band MODIS imager, the Meteosat Second Generation 12-band SEVIRI imager, the GOES-R series 16-band ABI imager, the current-generation GOES 5-band imager, and Japan's 5-band MTSAT imager. Conventional lossless compression algorithms are not able to reach satisfactory compression ratios, nor are they near the upper limits for lossless compression of imager data as estimated from the Shannon entropy. We introduce a new lossless compression algorithm developed for the NOAA-NESDIS satellite-based Earth science multispectral imagers. The algorithm is based on capturing spectral correlations using spectral prediction, and spatial correlations with a linear transform encoder. Our results are evaluated by comparison with current satellite compression algorithms such as the new CCSDS standard compression algorithm and JPEG2000. The algorithm as presented has been designed to work with NOAA's scientific data and so is purely lossless, but lossy modes can be supported. The compression algorithm also structures the data in a way that makes it easy to incorporate robust error correction using FEC coding methods such as TPC and LDPC for satellite use. This research was funded by NOAA-NESDIS for its Earth observing satellite program and NOAA goals.

  2. Multispectral satellite observations for arid land studies

    NASA Technical Reports Server (NTRS)

    Choudhury, Bhaskar J.

    1992-01-01

    Multispectral satellite data when properly calibrated and standardized can be used synergistically for a quantitative analysis of processes and surface characteristics, and for quantifying land surface change. Relationships among multispectral satellite data (visible reflectance, surface temperature and polarization difference of microwave emission at 37 GHz frequency) have been used to develop hypotheses concerning the relative sensitivity of these data to varied land surface characteristics, which needs to be verified by field observations. Radiative transfer models have also been developed to understand these multispectral data. Interannual variations of visible reflectance and polarization difference for the period 1982-1986 over the Sahel and the Sudan zones of Africa show a lagged response with respect to the rainfall deficit during recovery from drought, which needs to be understood in terms of biophysical parameters.

  3. [Cucumber diseases diagnosis using multispectral imaging technique].

    PubMed

    Feng, Jie; Liao, Ning-Fang; Zhao, Bo; Luo, Yong-Dao; Li, Bao-Ju

    2009-02-01

    For reliable diagnosis of plant diseases and insect pests, spectroscopic analysis and multispectral imaging techniques are proposed to diagnose five cucumber diseases, namely Trichothecium roseum, Sphaerotheca fuliginea, Cladosporium cucumerinum, Corynespora cassiicola and Pseudoperonospora cubensis. In the experiment, multispectral images of the cucumbers were captured in 14 visible-light channels, a near-infrared channel and a panchromatic channel using a narrow-band multispectral imaging system under a standard observation environment. The five cucumber diseases, healthy leaves and a reference white were then classified using their multispectral information with distance, angle and correlation measures. The discrimination of Trichothecium roseum, Sphaerotheca fuliginea, Cladosporium cucumerinum, and the reference white was 100%, and that of Pseudoperonospora cubensis and healthy leaves was 80% and 93.33%, respectively. The mean correct discrimination of the diseases was 81.90% when distance and correlation were used together. The results show that the method achieves good accuracy in cucumber disease diagnosis. PMID:19445229

  4. Unsupervised classification of remote multispectral sensing data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The new unsupervised classification technique for classifying multispectral remote sensing data, which can come either from a multispectral scanner or from digitized color-separation aerial photographs, consists of two parts: (a) a sequential statistical clustering, which is a one-pass sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. Applications of the technique using an IBM-7094 computer on multispectral data sets over Purdue's Flight Line C-1 and the Yellowstone National Park test site have been accomplished. Comparisons between the classification maps produced by the unsupervised technique and the supervised maximum likelihood technique indicate that the classification accuracies are in agreement.
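
    As a rough illustration of the two-stage scheme described above, the sketch below seeds cluster centers with a one-pass sequential rule and refines them with K-means; the distance threshold and synthetic data are assumptions for illustration, not the 1972 implementation.

```python
import numpy as np

def sequential_seed(pixels, threshold):
    """One-pass seeding: start a new cluster whenever a pixel is farther than
    the threshold from every existing center (an assumed, simplified rule)."""
    centers = [pixels[0]]
    for p in pixels[1:]:
        if np.linalg.norm(np.asarray(centers) - p, axis=1).min() > threshold:
            centers.append(p)
    return np.asarray(centers, dtype=float)

def kmeans_refine(pixels, centers, iters=10):
    """Generalized K-means refinement of the seeded centers."""
    for _ in range(iters):
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centers = []
        for k in range(len(centers)):
            members = pixels[labels == k]
            new_centers.append(members.mean(axis=0) if len(members) else centers[k])
        centers = np.asarray(new_centers)
    return labels, centers

rng = np.random.default_rng(1)
pixels = rng.normal(size=(500, 4)) + 5.0 * rng.integers(0, 3, size=(500, 1))
labels, centers = kmeans_refine(pixels, sequential_seed(pixels, threshold=4.0))
print("seeded clusters:", len(centers), "| non-empty after refinement:", np.unique(labels).size)
```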

  5. Radiometric Characterization of Hyperspectral Imagers using Multispectral Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Kurt, Thome; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-01-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal of the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands, as well as similar agreement between results that employ the different MODIS sensors as a reference.

  6. Radiometric characterization of hyperspectral imagers using multispectral sensors

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel; Thome, Kurt; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-08-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands as well as similar agreement between results that employ the different MODIS sensors as a reference.
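
    The band-averaging comparison described in this and the preceding record can be illustrated by weighting the fine-resolution spectrum with the broad band's relative spectral response; the Gaussian response and toy spectrum below are placeholders, not the actual Hyperion or MODIS curves.

```python
import numpy as np

def band_average(wavelengths, radiance, rsr):
    """Simulated broadband radiance: sum of L(lambda)*RSR(lambda), normalized by sum of RSR."""
    return float(np.sum(radiance * rsr) / np.sum(rsr))

wl = np.arange(400.0, 2500.0, 10.0)                       # nm, ~hyperspectral sampling
radiance = 100.0 * np.exp(-((wl - 860.0) / 400.0) ** 2)   # toy scene spectrum
rsr = np.exp(-0.5 * ((wl - 858.5) / 15.0) ** 2)           # toy Gaussian broadband response
print("band-averaged radiance:", round(band_average(wl, radiance, rsr), 2))
# A cross-calibration gain would then follow from the ratio of the reference
# sensor's measured band radiance to this band-averaged value over a stable site.
```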

  7. High-speed multispectral confocal biomedical imaging

    PubMed Central

    Carver, Gary E.; Locknar, Sarah A.; Morrison, William A.; Krishnan Ramanujan, V.; Farkas, Daniel L.

    2014-01-01

    Abstract. A new approach for generating high-speed multispectral confocal images has been developed. The central concept is that spectra can be acquired for each pixel in a confocal spatial scan by using a fast spectrometer based on optical fiber delay lines. This approach merges fast spectroscopy with standard spatial scanning to create datacubes in real time. The spectrometer is based on a serial array of reflecting spectral elements, delay lines between these elements, and a single element detector. The spatial, spectral, and temporal resolution of the instrument is described and illustrated by multispectral images of laser-induced autofluorescence in biological tissues. PMID:24658777

  8. Narrowband multispectral liquid crystal tunable filter.

    PubMed

    Abuleil, Marwan; Abdulhalim, Ibrahim

    2016-05-01

    Multispectral tunable filters with high performance are desirable components in various biomedical and industrial applications. In this Letter, we present a new narrowband multispectral tunable filter with high throughput over a wide dynamic range. It is composed of a wideband, large-dynamic-range liquid crystal tunable filter combined with a multiple-narrowband spectral filter made of two stacks of photonic crystals with a cavity layer in between. The filter tunes between nine spectral bands covering the range 450-1000 nm with bandwidth <10 nm and throughput >80%. PMID:27128048

  9. Coastal and estuarine applications of multispectral photography

    NASA Technical Reports Server (NTRS)

    Yost, E.; Wenderoth, S.

    1972-01-01

    An evaluation of multispectral photographic techniques for optical penetration of water in the northeastern United States and the Gulf of Mexico coastal waters is presented. The spectral band from 493 to 543 nanometers, when exposed to place the water mass at about unit density on the photographic emulsion, was found to provide the best water penetration, independent of altitude or time of day, as long as solar glitter from the surface of the water is avoided. An isoluminous color technique was perfected, which eliminates the dimension of brightness from a multispectral color presentation.

  10. Airborne gravity is here

    SciTech Connect

    Hammer, S.

    1982-01-11

    After 20 years of development efforts, the airborne gravity survey has finally become a practical exploration method. Besides gravity data, the airborne survey can also collect simultaneous, continuous records of high-precision magnetic-field data as well as terrain clearance; these provide a topographic contour map useful in calculating terrain conditions and in subsequent planning and engineering. Compared with a seismic survey, the airborne gravity method can cover the same area much more quickly and cheaply; a seismograph could then detail the interesting spots.

  11. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  12. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  13. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  14. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  15. Hemodynamic and morphologic responses in mouse brain during acute head injury imaged by multispectral structured illumination

    NASA Astrophysics Data System (ADS)

    Volkov, Boris; Mathews, Marlon S.; Abookasis, David

    2015-03-01

    Multispectral imaging has received significant attention over the last decade as it integrates spectroscopy, imaging, and tomographic analysis concurrently to acquire both spatial and spectral information from biological tissue. In the present study, a multispectral setup based on the projection of structured illumination at several near-infrared wavelengths and at different spatial frequencies is applied to quantitatively assess brain function before, during, and after the onset of traumatic brain injury in an intact mouse brain (n=5). To produce the head injury, we used the weight-drop method, in which a cylindrical metallic rod falling along a metal tube strikes the mouse's head. Structured light was projected onto the scalp surface and diffusely reflected light was recorded by a CCD camera positioned perpendicular to the mouse head. Following data analysis, we were able to concurrently show a series of hemodynamic and morphologic changes over time, including higher deoxyhemoglobin, reduced oxygen saturation, and cell swelling, in comparison with baseline measurements. Overall, the results demonstrate the capability of multispectral imaging with structured illumination to detect and map brain tissue optical and physiological properties following brain injury in a simple, noninvasive and noncontact manner.
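
    The hemodynamic quantities mentioned (deoxyhemoglobin, oxygen saturation) are commonly recovered by a linear spectral unmixing of absorption measured at several wavelengths; the sketch below shows that generic step with illustrative extinction coefficients, not the study's calibration.

```python
import numpy as np

# Rows: wavelengths; columns: [HbO2, Hb] extinction coefficients (illustrative units).
E = np.array([[1.0, 3.0],
              [1.5, 2.0],
              [2.5, 1.2],
              [3.0, 0.8]])
mua = np.array([2.2, 2.1, 2.3, 2.5])      # measured absorption at each wavelength

# Least-squares unmixing: solve E @ [HbO2, Hb] ~= mua for the two concentrations.
conc, *_ = np.linalg.lstsq(E, mua, rcond=None)
hbo2, hb = conc
print("HbO2:", round(hbo2, 3), "Hb:", round(hb, 3),
      "StO2:", round(hbo2 / (hbo2 + hb), 3))
```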

  16. Multispectral and hyperspectral measurements of smoke candles and soldier's camouflage equipment

    NASA Astrophysics Data System (ADS)

    Lagueux, Philippe; Gagnon, Marc-André; Kastek, Mariusz; Piątkowski, Tadeusz; Dulski, Rafał; Trzaskawka, Piotr

    2012-09-01

    The emergence of new infrared camouflage and countermeasure technologies in the context of military operations has paved the way to enhanced detection capabilities. Camouflage devices such as smoke candles (or smoke bombs) and flares are developed to generate either large-area or localized screens with very high absorption in the infrared. Similarly, soldiers' camouflage devices such as clothing have evolved in design to blend their infrared characteristics with those of the background. In all cases, the analysis of the targets' infrared images needs to be conducted in both the multispectral and hyperspectral domains to assess their capability to efficiently provide visible and infrared camouflage. The Military University of Technology has conducted several intensive field campaigns in which various types of smoke candles and camouflage uniforms were deployed in different conditions and were measured in both the multispectral and hyperspectral domains. Cooled broadband infrared cameras were used for the multispectral analysis, whereas the high spectral, spatial and temporal resolution acquisition of these thermodynamic events was recorded with the Telops Hyper-Cam sensor. This paper presents the test campaign concept and the analysis of the recorded measurements.

  17. Non-contact assessment of melanin distribution via multispectral temporal illumination coding

    NASA Astrophysics Data System (ADS)

    Amelard, Robert; Scharfenberger, Christian; Wong, Alexander; Clausi, David A.

    2015-03-01

    Melanin is a pigment that is highly absorptive in the UV and visible electromagnetic spectra. It is responsible for perceived skin tone, and protects against harmful UV effects. Abnormal melanin distribution is often an indicator of melanoma. We propose a novel approach for non-contact assessment of melanin distribution via multispectral temporal illumination coding, estimating the two-dimensional melanin distribution based on its absorptive characteristics. In the proposed system, a novel multispectral, cross-polarized, temporally-coded illumination sequence is synchronized with a camera to measure reflectance under both multispectral and ambient illumination. This allows us to eliminate the ambient illumination contribution from the acquired reflectance measurements, and also to determine the melanin distribution in an observed region based on the spectral properties of melanin using the Beer-Lambert law. Using this information, melanin distribution maps can be generated for objective, quantitative assessment of an individual's skin type. We show that the melanin distribution map correctly identifies areas with high melanin densities (e.g., nevi).
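
    A minimal sketch of a Beer-Lambert style melanin index from two-band reflectance follows; the wavelengths, absorption coefficients, and two-band formulation are illustrative assumptions rather than the authors' exact model.

```python
import numpy as np

def melanin_index(refl_lo, refl_hi, eps_lo, eps_hi):
    """Relative melanin optical density from reflectance at two wavelengths.
    Beer-Lambert: -log(R) ~ eps * c * d, so a difference of optical densities
    scaled by the extinction difference cancels the path length to first order."""
    od_lo = -np.log(np.clip(refl_lo, 1e-6, 1.0))
    od_hi = -np.log(np.clip(refl_hi, 1e-6, 1.0))
    return (od_lo - od_hi) / (eps_lo - eps_hi)

refl_660 = np.array([[0.45, 0.20], [0.50, 0.18]])   # toy 2x2 reflectance images
refl_880 = np.array([[0.55, 0.40], [0.58, 0.38]])
print(np.round(melanin_index(refl_660, refl_880, eps_lo=3.0, eps_hi=1.0), 3))
```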

  18. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included.

  19. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  20. Polarization encoded color camera.

    PubMed

    Schonbrun, Ethan; Möller, Guðfríður; Di Caprio, Giuseppe

    2014-03-15

    Digital cameras would be colorblind if they did not have pixelated color filters integrated into their image sensors. Integration of conventional fixed filters, however, comes at the expense of an inability to modify the camera's spectral properties. Instead, we demonstrate a micropolarizer-based camera that can reconfigure its spectral response. Color is encoded into a linear polarization state by a chiral dispersive element and then read out in a single exposure. The polarization encoded color camera is capable of capturing three-color images at wavelengths spanning the visible to the near infrared. PMID:24690806

  1. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  2. Multispectral Dual-Aperture Schmidt Objective

    NASA Technical Reports Server (NTRS)

    Minott, P. O.

    1983-01-01

    Off-axis focal planes make room for beam splitters. The system includes two off-axis primary spherical reflectors, each concentric with a refractive corrector at the aperture. The off-axis design assures the large aperture required for adequate spatial resolution. The separate images have precise registration, as needed for multispectral resource mapping or remote sensing.

  3. Estimating noise and information for multispectral imagery

    NASA Astrophysics Data System (ADS)

    Aiazzi, Bruno; Alparone, Luciano; Barducci, Alessandro; Baronti, Stefano; Pippi, Ivan

    2002-03-01

    We focus on reliably estimating the information conveyed to a user by multispectral image data. The goal is to establish the extent to which an increase in spectral resolution can increase the amount of usable information. As a matter of fact, a trade-off exists between spatial and spectral resolution, due to physical constraints of sensors imaging with a prefixed SNR. After describing some methods developed for automatically estimating the variance of the noise introduced by multispectral imagers, lossless data compression is exploited to measure the useful information content of the multispectral data. In fact, the bit rate achieved by the reversible compression process takes into account both the contribution of the 'observation' noise, i.e., information regarded as statistical uncertainty, whose relevance is null to a user, and the intrinsic information of hypothetically noise-free multispectral data. An entropic model of the image source is defined and, once the standard deviation of the noise, assumed to be white and Gaussian, has been preliminarily estimated, this model is inverted to yield an estimate of the information content of the noise-free source from the code rate. Results of both noise and information assessment are reported and discussed for synthetic noisy images and for Landsat Thematic Mapper (TM) data.
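
    The noise-subtraction idea can be sketched as follows: estimate the noise standard deviation from a high-pass residual, then remove the entropy of quantized Gaussian noise from the achieved code rate. The particular estimator and entropy model below are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def estimate_noise_std(img):
    """Robust noise estimate from horizontal first differences (MAD-based)."""
    d = np.diff(img.astype(float), axis=1)
    return 1.4826 * np.median(np.abs(d - np.median(d))) / np.sqrt(2.0)

def useful_rate(code_rate_bits, noise_std):
    """Subtract the entropy of quantized Gaussian noise from the lossless code rate."""
    noise_bits = 0.5 * np.log2(2 * np.pi * np.e * noise_std ** 2)
    return max(code_rate_bits - noise_bits, 0.0)

rng = np.random.default_rng(2)
clean = np.tile(np.linspace(100.0, 200.0, 128), (128, 1))   # smooth synthetic scene
noisy = clean + rng.normal(scale=2.0, size=clean.shape)
sigma = estimate_noise_std(noisy)
print("estimated noise sigma:", round(float(sigma), 2))
print("useful bits/pixel at a 6.0 bpp code rate:", round(useful_rate(6.0, sigma), 2))
```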

  4. Multispectral laser imaging for advanced food analysis

    NASA Astrophysics Data System (ADS)

    Senni, L.; Burrascano, P.; Ricci, M.

    2016-07-01

    A hardware-software apparatus for food inspection capable of realizing multispectral NIR laser imaging at four different wavelengths is herein discussed. The system was designed to operate in a through-transmission configuration to detect the presence of unwanted foreign bodies inside samples, whether packed or unpacked. A modified Lock-In technique was employed to counterbalance the significant signal intensity attenuation due to transmission across the sample and to extract the multispectral information more efficiently. The NIR laser wavelengths used to acquire the multispectral images can be varied to deal with different materials and to focus on specific aspects. In the present work the wavelengths were selected after a preliminary analysis to enhance the image contrast between foreign bodies and food in the sample, thus identifying the location and nature of the defects. Experimental results obtained from several specimens, with and without packaging, are presented and the multispectral image processing as well as the achievable spatial resolution of the system are discussed.

  5. Toolsets for Airborne Data

    Atmospheric Science Data Center

    2015-04-02

    Toolsets for Airborne Data (TAD) ... limit of detection values. Prior to accessing the TAD Web Application (https://tad.larc.nasa.gov) for the first time, users must ...

  6. The airborne laser

    NASA Astrophysics Data System (ADS)

    Lamberson, Steven; Schall, Harold; Shattuck, Paul

    2007-05-01

    The Airborne Laser (ABL) is an airborne, megawatt-class laser system with a state-of-the-art atmospheric compensation system to destroy enemy ballistic missiles at long ranges. This system will provide both deterrence and defense against the use of such weapons during conflicts. This paper provides an overview of the ABL weapon system including: the notional operational concept, the development approach and schedule, the overall aircraft configuration, the technologies being incorporated in the ABL, and the current program status.

  7. Preliminary Evaluation Of Charge-Coupled Device (CCD) Multispectral Analysis In Ophthalmology

    NASA Astrophysics Data System (ADS)

    Launay, Francoise; Fauconnier, Thierry; Fort, Bernard; Cailloux, Mireille; Bonnin, Paul; Bloch-Michel, Etienne

    1986-05-01

    The work described was originally aimed at providing a new diagnostic technique for the early detection of malignant ocular tumors through their spectral signature. The instrument developed comprises a modified fundus camera, a charge-coupled device (CCD) camera and a 16-bit microcomputer equipped with floppy disk drives and a 512 x 512 x 8-bit display device. The system allows the recording of digitized fundus or iris reflectance pictures in eight spectral bands between 500 and 1100 nm. After calibration and preprocessing of the data, a multispectral analysis is performed by means of a VAX computer. The image processing methods are described and their ability to characterize pigmented lesions or other ocular anatomical features through their spectral signature is evaluated.

  8. Vein visualization using a smart phone with multispectral Wiener estimation for point-of-care applications.

    PubMed

    Song, Jae Hee; Kim, Choye; Yoo, Yangmo

    2015-03-01

    Effective vein visualization is clinically important for various point-of-care applications, such as needle insertion. It can be achieved by utilizing ultrasound imaging or by applying infrared laser excitation and monitoring its absorption. However, while these approaches can be used for vein visualization, they are not suitable for point-of-care applications because of their cost, time, and accessibility. In this paper, a new vein visualization method based on multispectral Wiener estimation is proposed and its real-time implementation on a smart phone is presented. In the proposed method, a conventional RGB camera on a commercial smart phone (i.e., Galaxy Note 2, Samsung Electronics Inc., Suwon, Korea) is used to acquire reflectance information from veins. Wiener estimation is then applied to extract the multispectral information from the veins. To evaluate the performance of the proposed method, an experiment was conducted using a color calibration chart (ColorChecker Classic, X-rite, Grand Rapids, MI, USA) and an average root-mean-square error of 12.0% was obtained. In addition, an in vivo subcutaneous vein imaging experiment was performed to explore the clinical performance of the smart phone-based Wiener estimation. From the in vivo experiment, the veins at various sites were successfully localized using the reconstructed multispectral images and these results were confirmed by ultrasound B-mode and color Doppler images. These results indicate that the presented multispectral Wiener estimation method can be used for visualizing veins using a commercial smart phone for point-of-care applications (e.g., vein puncture guidance). PMID:24691170
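
    A data-driven, regularized Wiener-style estimator that maps RGB responses to an n-band reflectance estimate can be sketched as below; the training data, band count, and regularization are illustrative assumptions, not the paper's calibrated system.

```python
import numpy as np

def fit_wiener_matrix(train_rgb, train_spectra, reg=1e-3):
    """W maps a 3-vector RGB response to an n-band reflectance estimate:
    W = S^T R (R^T R + reg*I)^(-1), a regularized cross-correlation form."""
    rtr = train_rgb.T @ train_rgb + reg * np.eye(3)
    return train_spectra.T @ train_rgb @ np.linalg.inv(rtr)

rng = np.random.default_rng(3)
true_map = rng.random((8, 3))                    # toy linear relation between RGB and 8 bands
train_rgb = rng.random((200, 3))
train_spectra = train_rgb @ true_map.T + 0.01 * rng.normal(size=(200, 8))
W = fit_wiener_matrix(train_rgb, train_spectra)
pixel_rgb = np.array([0.3, 0.5, 0.2])
print("estimated 8-band reflectance:", np.round(W @ pixel_rgb, 3))
```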

  9. Recent Multispectral Imaging Results from the Pancam Instruments on the Mars Exploration Rovers Spirit and Opportunity

    NASA Astrophysics Data System (ADS)

    Bell, J. F.

    2006-12-01

    As of early September 2006, the Mars Exploration Rover Panoramic Camera (Pancam) instruments have acquired more than 57,000 and 52,000 multispectral images, respectively, from the rovers' landing sites and traverse paths within Gusev crater and Meridiani Planum. These observations include more than 950 and 600 full multispectral imaging observations, respectively, in the 11 distinct near-UV to near-IR wavelengths sampled by the Pancams. Both rovers have faced challenges in their exploration activities during 2006 because of power restrictions imposed by the low-Sun, southern hemisphere winter conditions. Still, major science campaigns have been conducted at both landing sites. At Gusev, Pancam imaging documented major geomorphic and color units during the Spirit rover's traverse down the south side of Husband Hill and across the Southern Basin to the possible volcanic or impact feature known as Home Plate. More recently, at Spirit's "Winter Haven" stationary location in Gusev crater, a full-resolution, 360 degree, low-compression panorama (the "McMurdo" panorama) has been obtained using all of Pancam's filters. In Meridiani, the geology and color properties of the terrain during Opportunity's traverse south from Erebus crater to its current location at the rim of Victoria crater have been documented in detail by Pancam multispectral imaging, including a number of albedo measurements and other coordinated observations with orbiting NASA and ESA spacecraft designed to enhance surface-orbital "ground truth" connections. Panoramas, mosaics, and multispectral analysis results from these recent Pancam data sets will be summarized and discussed in terms of their geologic context and complementarity to other MER remote sensing and in situ investigations and results obtained during this past year.

  10. Real-time air quality monitoring by using internet video surveillance camera

    NASA Astrophysics Data System (ADS)

    Wong, C. J.; Lim, H. S.; MatJafri, M. Z.; Abdullah, K.; Low, K. L.

    2007-04-01

    Nowadays, internet video surveillance cameras are widely used in security monitoring, and the number of installed cameras keeps growing. This paper reports that internet video surveillance cameras can be applied as remote sensors for monitoring the concentration of particulate matter smaller than 10 microns (PM10), so that real-time air quality can be monitored at multiple locations simultaneously. An algorithm was developed based on regression analysis of the relationship between the measured reflectance components from a surface material and the atmosphere. This algorithm converts multispectral image pixel values acquired from these cameras into quantitative values of PM10 concentration. These computed PM10 values were compared to standard values measured by a DustTrak meter. The correlation results showed that the newly developed algorithm produced a high degree of accuracy, as indicated by high correlation coefficient (R2) and low root-mean-square error (RMS) values. The preliminary results showed that the accuracy produced by this internet video surveillance camera is slightly better than that from the internet protocol (IP) camera. Basically, the spatial resolution of images acquired by the IP camera was poorer compared to the internet video surveillance camera, because the images acquired by the IP camera had been compressed whereas the images from the internet video surveillance camera were not.
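
    The regression step can be sketched generically: regress ground-measured PM10 against per-band atmospheric reflectance derived from the camera images. The toy calibration values and linear form below are assumptions for illustration only.

```python
import numpy as np

# Toy calibration set: per-sample (R, G, B) atmospheric reflectance vs. measured PM10.
atm_refl = np.array([[0.02, 0.03, 0.05],
                     [0.04, 0.05, 0.08],
                     [0.06, 0.08, 0.12],
                     [0.09, 0.11, 0.16]])
pm10 = np.array([20.0, 45.0, 70.0, 105.0])       # ug/m3 from a reference dust meter

# Multiband linear regression with an intercept term.
X = np.column_stack([atm_refl, np.ones(len(pm10))])
coeffs, *_ = np.linalg.lstsq(X, pm10, rcond=None)

new_sample = np.array([0.05, 0.07, 0.10, 1.0])   # new image-derived reflectances + intercept
print("predicted PM10:", round(float(new_sample @ coeffs), 1))
```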

  11. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  12. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex Cameras (DSLR) have appeared. As a consequence we can buy cameras with higher and higher pixel counts, and mass production has resulted in a great reduction in prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allow images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLRs. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these for their own circumstances.

  13. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  14. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. A microscope is integrated with a low-noise charge-coupled-device (CCD) camera to produce a new instrument for analyzing the performance and failures of microelectronics devices that emit infrared light during operation. The CCD camera is also used to identify very clearly the parts that have failed where luminescence is typically found.

  15. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  16. The PanCam Calibration Target (PCT) and multispectral image processing for the ExoMars 2018 mission

    NASA Astrophysics Data System (ADS)

    Barnes, D.; Wilding, M.; Gunn, M.; Tyler, L.; Pugh, S.; Coates, A.; Griffiths, A.; Cousins, C.; Schmitz, N.; Paar, G.

    2011-10-01

    The Panoramic Camera (PanCam) instrument for the ESA/NASA 2018 ExoMars mission is designed to be the 'eyes' of the Mars rover and is equipped with two wide angle multispectral cameras (WACs) from MSSL, and a focusable High Resolution Camera (HRC) from DLR. To achieve its science role within the ExoMars mission, the PanCam will generate terrain reflectance spectra to help identify the mineralogy of the Martian surface, and generate true-colour images of the Martian environment. The PanCam Calibration Target (PCT) is an essential component for the science operations of the PanCam instrument. Its purpose is to allow radiometric calibration and to support geometric calibration check-out of the PanCam instrument during the mission. Unlike other camera calibration targets flown to Mars, the PCT target regions are being made from stained glass. The paper describes the work undertaken during the early build and testing of the PCT, together with results from the baseline algorithms that have been designed and implemented to process the multispectral PanCam images.

  17. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in areas of diverse sciences such as computing, mechanics, thermal physics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  18. Chromaffin cell calcium signal and morphology study based on multispectral images

    NASA Astrophysics Data System (ADS)

    Wu, Hongxiu; Wei, Shunhui; Qu, Anlian; Zhou, Zhuan

    1998-09-01

    Increasing or decreasing the internal calcium concentration can promote or prevent programmed cell death (PCD). We therefore performed a Ca2+ imaging study using the Ca2+ indicator dye fura-2 and a sensitive cooled-CCD camera with 12-bit resolution. Monochromatic beams of light at wavelengths of 345 and 380 nm were isolated from light emitted by a xenon lamp using a monochromator. The concentration of free calcium can be calculated directly from the ratio of two fluorescence values taken at the two appropriately selected wavelengths. Fluorescent light emitted from the cells was captured using a camera system. The cell morphology study is based on multispectral scanning, with smear images provided as three monochromatic images by illumination with light at 610, 535 and 470 nm wavelengths. The nuclear characteristic parameters extracted from individual nuclei by the system are nuclear area, nuclear diameter, and a nuclear density vector. The results of the restoration of images and the performance of a primitive logic for the detection of nuclei with PCD proved the usefulness of the system and the advantages of using multispectral images in the restoration and detection procedures.
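
    The ratiometric calcium calculation referred to here is conventionally done with the Grynkiewicz calibration; the constants in the sketch below (Kd, Rmin, Rmax, and the 380-nm scaling factor) are illustrative placeholders, not values from this study.

```python
import numpy as np

def free_calcium_nM(F345, F380, Kd=224.0, Rmin=0.2, Rmax=8.0, beta=6.0):
    """Grynkiewicz-style ratiometric calibration:
    [Ca2+] = Kd * beta * (R - Rmin) / (Rmax - R), with R = F345 / F380."""
    R = F345 / F380
    return Kd * beta * (R - Rmin) / (Rmax - R)

print(round(free_calcium_nM(F345=1500.0, F380=1000.0), 1), "nM")
```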

  19. Multispectral fluorescence imaging techniques for nondestructive food safety inspection

    NASA Astrophysics Data System (ADS)

    Kim, Moon S.; Lefcourt, Alan M.; Chen, Yud-Ren

    2004-03-01

    The use of spectral sensing has gained acceptance as a rapid means for nondestructive inspection of postharvest food produce. Current technologies generally use color or a single wavelength camera technology. The applicability and sensitivity of these techniques can be expanded through the use of multiple wavelengths. Reflectance in the Vis/NIR is the prevalent spectral technique. Fluorescence, compared to reflectance, is regarded as a more sensitive technique due to its dynamic responses to subtle changes in biological entities. Our laboratory has been exploring fluorescence as a potential means for detection of quality and wholesomeness of food products. Applications of fluorescence sensing require an understanding of the spectral characteristics emanating from constituents and potential contaminants. A number of factors affecting fluorescence emission characteristics are discussed. Because of relatively low fluorescence quantum yield from biological samples, a system with a powerful pulse light source such as a laser coupled with a gated detection device is used to harvest fluorescence, in the presence of ambient light. Several fluorescence sensor platforms developed in our laboratory, including hyperspectral imaging, and laser-induced fluorescence (LIF) and steady-state fluorescence imaging systems with multispectral capabilities are presented. We demonstrate the potential uses of recently developed fluorescence imaging platforms in food safety inspection of apples contaminated with animal feces.

  20. Demonstration of clutter reduction and aircraft detection in multispectral data

    NASA Astrophysics Data System (ADS)

    Hoff, Lawrence E.; Winter, Edwin M.

    1992-08-01

    The DARPA multi-spectral infrared camera (MUSIC) was used for a series of experiments in Australia and Maui, Hawaii in 1991. The Maui experiments, conducted from a high mountain, concentrated on the detection of aircraft. The detection of air vehicles without the use of temporal motion (as in the case of a head-on approaching air vehicle) is a challenging problem when background clutter is present. The technique investigated was not dependent upon either the angular motion or the spectral signature of the aircraft. This approach exploits the differential transmission of the atmosphere in neighboring long-wave infrared bands. This differential transmission between the target and background 'colors' the background relative to the target and allows its removal. The technique was demonstrated on many examples of MUSIC data collected in Maui, Hawaii. Targets approaching the sensor head-on were successfully detected against clouds and other backgrounds using spectral as well as spatial techniques. Several different algorithms were investigated and results are compared.
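
    The differential-transmission idea can be sketched as a background regression between two neighboring LWIR bands followed by subtraction, so that clutter cancels while a target with a different band ratio survives; the synthetic scene and regression form below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
background = rng.normal(300.0, 5.0, size=(64, 64))            # toy cloud clutter field
band_a = 1.00 * background + rng.normal(0.0, 0.2, size=background.shape)
band_b = 0.85 * background + rng.normal(0.0, 0.2, size=background.shape)
band_a[32, 32] += 3.0                                          # target present only in band A

# Fit band A against band B over the scene (dominated by background) and subtract.
gain, offset = np.polyfit(band_b.ravel(), band_a.ravel(), 1)
residual = band_a - (gain * band_b + offset)                   # clutter-suppressed image
print("residual std:", round(float(residual.std()), 3),
      "| target residual:", round(float(residual[32, 32]), 3))
```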

  1. Active multispectral near-IR detection of small surface targets

    NASA Astrophysics Data System (ADS)

    de Jong, Arie N.; Winkel, Hans; Roos, Marco J. J.

    2001-10-01

    The detection and identification of small surface targets with electro-optical (EO) sensors is seriously hampered by ground clutter, leading to false alarms and reduced detection probabilities. Active ground illumination can improve the detection performance of EO sensors compared to passive skylight illumination because the illumination level and its temporal stability are known; sun and sky cannot provide this due to weather variability. In addition, multispectral sensors with carefully chosen spectral bands ranging from the visual into the near-IR, from 400 to 2500 nm wavelength, can take advantage of a variety of cheap active light sources, ranging from lasers to xenon or halogen lamps. Results are presented, obtained with a two-color laser scanner with one wavelength in the chlorophyll absorption dip. Another active scanner is described operating in 4 wavebands between 1400 and 2300 nm, using tungsten halogen lamps. Finally, a simple TV camera was used with either a set of narrow-band spectral filters or polarization filters in front of the lamps. The targets consisted of an array of mixed objects, most of them real mines. The results show great promise in enhancing the detection and identification probabilities of EO sensors against small surface targets.

  2. Optical Communications Link to Airborne Transceiver

    NASA Technical Reports Server (NTRS)

    Regehr, Martin W.; Kovalik, Joseph M.; Biswas, Abhijit

    2011-01-01

    An optical link from Earth to an aircraft demonstrates the ability to establish a link from a ground platform to a transceiver moving overhead. An airplane presents a challenging disturbance environment, including airframe vibrations and occasional abrupt changes in attitude during flight. These disturbances make it difficult to maintain pointing lock in an optical transceiver on an airplane. Acquisition can also be challenging; in the case of the aircraft link, the ground station initially has no precise knowledge of the aircraft's location. An airborne pointing system has been designed, built, and demonstrated using direct-drive brushless DC motors for passive isolation of pointing disturbances and for high-bandwidth control feedback. The airborne transceiver uses a GPS-INS system to determine the aircraft's position and attitude, and to then illuminate the ground station initially for acquisition. The ground transceiver participates in link-pointing acquisition by first using a wide-field camera to detect initial illumination from the airborne beacon and to perform coarse pointing. It then transfers control to a high-precision pointing detector. Using this scheme, live video was successfully streamed from the ground to the aircraft at 270 Mb/s while simultaneously downlinking a 50 kb/s data stream from the aircraft to the ground.

  3. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  4. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  5. Near-infrared camera for the Clementine mission

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultraviolet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The near-infrared (NIR) multispectral camera, one of two workhorse lunar mapping cameras (the other being the UV/visible camera), provided approximately 200 m spatial resolution at 400 km periselene, and a 39 km across-track swath. This 1.9 kg infrared camera, using a 256 x 256 InSb FPA, viewed reflected solar illumination from the lunar surface and lunar horizon in the 1 to 3 µm wavelength region, extending lunar imagery and mineralogy studies into the near infrared. A description of this light-weight, low power NIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective of flight qualifying the sensors for future Department of Defense flights.

  6. Air Pollution Determination Using a Surveillance Internet Protocol Camera Images

    NASA Astrophysics Data System (ADS)

    Chow Jeng, C. J.; Hwee San, Hslim; Matjafri, M. Z.; Abdullah, Abdul, K.

    Air pollution has long been a problem in the industrial nations of the West. It has now become an increasing source of environmental degradation in the developing nations of east Asia. The Malaysian government has built a network to monitor air pollution, but the cost of these networks is high and limits the knowledge of pollutant concentration to specific points of the cities. A methodology based on a surveillance internet protocol (IP) camera for the determination of air pollution concentrations is presented in this study. The objective of this study was to test the feasibility of using IP camera data for estimating real-time particulate matter of size less than 10 micron (PM10) on the campus of USM. The proposed PM10 retrieval algorithm, derived from the atmospheric optical properties, was employed in the present study. In situ data sets of PM10 measurements and sun radiation measurements at the ground surface were collected simultaneously with the IP camera images using a DustTrak meter and a handheld spectroradiometer, respectively. The digital images were separated into three bands, namely red, green and blue, for multispectral algorithm calibration. The digital numbers (DN) of the IP camera images were converted into radiance and reflectance values. After that, the reflectance recorded by the digital camera was subtracted by the reflectance of the known surface, and we obtained the reflectance caused by the atmospheric components. The atmospheric reflectance values were used for regression analysis. A regression technique was employed to determine suitable

  7. Sun and aureole spectrometer for airborne measurements to derive aerosol optical properties.

    PubMed

    Asseng, Hagen; Ruhtz, Thomas; Fischer, Jürgen

    2004-04-01

    We have designed an airborne spectrometer system for the simultaneous measurement of the direct Sun irradiance and aureole radiance. The instrument is based on diffraction grating spectrometers with linear image sensors. It is robust, lightweight, compact, and reliable, characteristics that are important for airborne applications. The multispectral radiation measurements are used to derive optical properties of tropospheric aerosols. We extract the altitude dependence of the aerosol volume scattering function and of the aerosol optical depth by using flight patterns with descents and ascents ranging from the surface level to the top of the boundary layer. The extinction coefficient and the product of single scattering albedo and phase function of separate layers can be derived from the airborne measurements. PMID:15074425

  8. Spectral Characterization of a Prototype SFA Camera for Joint Visible and NIR Acquisition

    PubMed Central

    Thomas, Jean-Baptiste; Lapray, Pierre-Jean; Gouton, Pierre; Clerc, Cédric

    2016-01-01

    Multispectral acquisition improves machine vision since it permits capturing more information on object surface properties than color imaging. The concept of spectral filter arrays has been developed recently and allows multispectral single shot acquisition with a compact camera design. Due to filter manufacturing difficulties, there was, up to recently, no system available for a large span of spectrum, i.e., visible and Near Infra-Red acquisition. This article presents the achievement of a prototype of camera that captures seven visible and one near infra-red bands on the same sensor chip. A calibration is proposed to characterize the sensor, and images are captured. Data are provided as supplementary material for further analysis and simulations. This opens a new range of applications in security, robotics, automotive and medical fields. PMID:27367690

  9. Spectral Characterization of a Prototype SFA Camera for Joint Visible and NIR Acquisition.

    PubMed

    Thomas, Jean-Baptiste; Lapray, Pierre-Jean; Gouton, Pierre; Clerc, Cédric

    2016-01-01

    Multispectral acquisition improves machine vision since it permits capturing more information on object surface properties than color imaging. The concept of spectral filter arrays has been developed recently and allows multispectral single shot acquisition with a compact camera design. Due to filter manufacturing difficulties, there was, up to recently, no system available for a large span of spectrum, i.e., visible and Near Infra-Red acquisition. This article presents the achievement of a prototype of camera that captures seven visible and one near infra-red bands on the same sensor chip. A calibration is proposed to characterize the sensor, and images are captured. Data are provided as supplementary material for further analysis and simulations. This opens a new range of applications in security, robotics, automotive and medical fields. PMID:27367690

  10. UV/visible camera for the Clementine mission

    SciTech Connect

    Kordas, J.F.; Lewis, I.T.; Priest, R.E.

    1995-04-01

    This article describes the Clementine UV/Visible (UV/Vis) multispectral camera, discusses design goals and preliminary estimates of on-orbit performance, and summarizes lessons learned in building and using the sensor. While the primary objective of the Clementine Program was to qualify a suite of 6 light-weight, low power imagers for future Department of Defense flights, the mission also provided the first systematic mapping of the complete lunar surface in the visible and near-infrared spectral regions. The 410 g, 4.65 W UV/Vis camera uses a 384 x 288 frame-transfer silicon CCD FPA and operates at 6 user-selectable wavelength bands between 0.4 and 1.1 µm. It has yielded lunar imagery and mineralogy data with up to 120 m spatial resolution (band dependent) at 400 km periselene along a 39 km cross-track swath.

  11. Integrated radar-camera security system: experimental results

    NASA Astrophysics Data System (ADS)

    Zyczkowski, M.; Palka, N.; Trzcinski, T.; Dulski, R.; Kastek, M.; Trzaskawka, P.

    2011-06-01

    The nature of the recent military conflicts and terrorist attacks along with the necessity to protect bases, convoys and patrols have made a serious impact on the development of more effective security systems. Current widely-used perimeter protection systems with zone sensors will soon be replaced with multi-sensor systems. Multi-sensor systems can utilize day/night cameras, IR uncooled thermal cameras, and millimeter-wave radars which detect radiation reflected from targets. Ranges of detection, recognition and identification for all targets depend on the parameters of the sensors used and of the observed scene itself. In this paper two essential issues connected with multispectral systems are described. We will focus on describing the autonomous method of the system regarding object detection, tracking, identification, localization and alarm notifications. We will also present the possibility of configuring the system as a stationary, mobile or portable device as in our experimental results.

  12. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  13. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  14. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera being designed through the joint effort of the Institute of Radioelectronics and the Institute of Optoelectronics (both large units of the Warsaw University of Technology within the Faculty of Electronics and Information Technology) combines various contemporary hardware and software technologies. In hardware, it integrates a high-speed stripe projector and a stripe camera together with a standard high-definition video camera. In software, it is supported by sophisticated calibration techniques which enable the development of advanced applications such as a real-time 3D viewer of moving objects with a free viewpoint, or a 3D modeller for still objects.

  15. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  16. A comparison of real and simulated airborne multisensor imagery

    NASA Astrophysics Data System (ADS)

    Bloechl, Kevin; De Angelis, Chris; Gartley, Michael; Kerekes, John; Nance, C. Eric

    2014-06-01

    This paper presents a methodology and results for the comparison of simulated imagery to real imagery acquired with multiple sensors hosted on an airborne platform. The dataset includes aerial multi- and hyperspectral imagery with spatial resolutions of one meter or less. The multispectral imagery includes data from an airborne sensor with three-band visible color and calibrated radiance imagery in the long-, mid-, and short-wave infrared. The airborne hyperspectral imagery includes 360 bands of calibrated radiance and reflectance data spanning 400 to 2450 nm in wavelength. Collected in September 2012, the imagery is of a park in Avon, NY, and includes a dirt track and areas of grass, gravel, forest, and agricultural fields. A number of artificial targets were deployed in the scene prior to collection for purposes of target detection, subpixel detection, spectral unmixing, and 3D object recognition. A synthetic reconstruction of the collection site was created in DIRSIG, an image generation and modeling tool developed by the Rochester Institute of Technology, based on ground-measured reflectance data, ground photography, and previous airborne imagery. Simulated airborne images were generated using the scene model, time of observation, estimates of the atmospheric conditions, and approximations of the sensor characteristics. The paper provides a comparison between the empirical and simulated images, including a comparison of achieved performance for classification, detection and unmixing applications. It was found that several differences exist due to the way the image is generated, including finite sampling and incomplete knowledge of the scene, atmospheric conditions and sensor characteristics. The lessons learned from this effort can be used in constructing future simulated scenes and further comparisons between real and simulated imagery.

  17. Atmospheric effects in multispectral remote sensor data

    NASA Technical Reports Server (NTRS)

    Turner, R. E.

    1975-01-01

    The problem of radiometric variations in multispectral remote sensing data which occur as a result of a change in geometric and environmental factors is studied. The case of spatially varying atmospheres is considered and the effect of atmospheric scattering is analyzed for realistic conditions. Emphasis is placed upon a simulation of LANDSAT spectral data for agricultural investigations over the United States. The effect of the target-background interaction is thoroughly analyzed in terms of various atmospheric states, geometric parameters, and target-background materials. Results clearly demonstrate that variable atmospheres can alter the classification accuracy and that the presence of various backgrounds can change the effective target radiance by a significant amount. A failure to include these effects in multispectral data analysis will result in a decrease in the classification accuracy.

  18. Information extraction techniques for multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Crane, R. B.; Turner, R. E.

    1972-01-01

    The applicability of recognition-processing procedures, programmed on multispectral scanner data from particular areas and measurement conditions, to data from different areas viewed under different conditions was studied. The reflective spectral region of approximately 0.3 to 3.0 micrometers is considered. A potential application of such techniques is in conducting area surveys. Work in three general areas is reported: (1) the nature of sources of systematic variation in multispectral scanner radiation signals; (2) an investigation of various techniques for overcoming systematic variations in scanner data; (3) the use of decision rules based upon empirical distributions of scanner signals rather than upon the usually assumed multivariate normal (Gaussian) signal distributions.

  19. Multispectral device for help in diagnosis

    NASA Astrophysics Data System (ADS)

    Delporte, Céline; Ben Chouikha, Mohamed; Sautrot, Sylvie; Viénot, Françoise; Alquié, Georges

    2012-03-01

    To build a database of the spectral characteristics of biological tissues for use in a multispectral imaging system, a tissue optical characterization bench was developed and validated. Several biological tissue types, such as beef, turkey, and pork muscle and beef liver, have been characterized in vitro and ex vivo with this device. The multispectral images obtained were analyzed to study the dispersion of the spectral luminance factor of biological tissues. Inhomogeneity of the internal tissue structure was identified as a phenomenon contributing to this dispersion, and the dispersion of the spectral luminance factor could itself be a characteristic of the tissue. A method based on an envelope technique was developed to identify and differentiate biological tissues in the same scene. Applied to pork tissue containing muscle and fat, this method gives detection rates of 59% for pork muscle and 14% for pork fat.

  20. Investigation related to multispectral imaging systems

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F.; Erickson, J. D.

    1974-01-01

    A summary of technical progress made during a five year research program directed toward the development of operational information systems based on multispectral sensing and the use of these systems in earth-resource survey applications is presented. Efforts were undertaken during this program to: (1) improve the basic understanding of the many facets of multispectral remote sensing, (2) develop methods for improving the accuracy of information generated by remote sensing systems, (3) improve the efficiency of data processing and information extraction techniques to enhance the cost-effectiveness of remote sensing systems, (4) investigate additional problems having potential remote sensing solutions, and (5) apply the existing and developing technology for specific users and document and transfer that technology to the remote sensing community.

  1. JORNEX: An airborne campaign to quantify rangeland vegetation change and plant community-atmospheric interactions

    SciTech Connect

    Ritchie, J.C.; Rango, A.; Kustas, W.P.

    1996-11-01

    The Jornada Experimental Range in New Mexico provides a unique opportunity to integrate hydrologic-atmospheric fluxes and surface states, vegetation types, cover, and distribution, and vegetation response to changes in hydrologic states and atmospheric driving forces. The Jornada Range is the site of a long-term ecological research program to investigate the processes leading to desertification. In concert with ongoing ground measurements, remotely sensed data are being collected from ground, airborne, and satellite platforms during JORNEX (the JORNada Experiment) to provide the spatial and temporal distribution of vegetation state, using laser altimeter and multispectral aircraft and satellite data, and surface energy balance estimates from a combination of parameters and state variables derived from remotely sensed data. These measurements will be used as inputs to models to quantify the hydrologic budget and the plant response to changes in components of the water and energy balance. Intensive three-day ground and airborne study campaigns were conducted in May 1995 (dry season), September 1995 (wet season), and February 1996 (winter), and are planned for the wet and dry seasons of 1996. An airborne platform is being used to collect thermal, multispectral, 3-band video, and laser altimetry profile data. Bowen ratio-energy balance stations were established in shrub and grass communities in May 1995 and are collecting data continuously. Additional energy flux measurements were made using eddy correlation techniques during the September 1995 campaign. Ground-based measurements during the intensive campaigns include thermal and multispectral measurements made using yoke-based platforms and hand-held instruments, LAI, and other vegetation data. Ground and aircraft measurements are acquired during Landsat overpasses so the effect of scale on measurements can be studied. This paper discusses preliminary results from the 1995 airborne campaign. 24 refs., 13 figs., 1 tab.

  2. Investigations in adaptive processing of multispectral data

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Horwitz, H. M.

    1973-01-01

    Adaptive data processing procedures are applied to the problem of classifying objects in a scene scanned by a multispectral sensor. These procedures show a performance improvement over standard nonadaptive techniques. Some sources of error in classification are identified, and those correctable by adaptive processing are discussed. Experiments in the adaptation of signature means by decision-directed methods are described. Some of these methods assume correlation between the trajectories of different signature means; for others this assumption is not made.

  3. Case studies of aerosol remote sensing with the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI)

    NASA Astrophysics Data System (ADS)

    Diner, D. J.; Xu, F.; Garay, M. J.; Martonchik, J. V.; Kalashnikova, O. V.; Davis, A. B.; Rheingans, B.; Geier, S.; Jovanovic, V.; Bull, M.

    2012-12-01

    The Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) is an 8-band (355, 380, 445, 470, 555, 660, 865, 935 nm) pushbroom camera, measuring polarization in the 470, 660, and 865 nm bands, mounted on a gimbal to acquire multiangular observations over a ±67° along-track range with 10-m spatial resolution across an 11-km wide swath. Among the instrument objectives are exploration of methodologies for combining multiangle, multispectral, polarimetric, and imaging observations to retrieve the optical depth and microphysical properties of tropospheric aerosols. AirMSPI was integrated on NASA's ER-2 high-altitude aircraft in 2010 and has successfully completed a number of flights over land and ocean targets in the Southern California vicinity. In this paper, we present case studies of AirMSPI imagery, interpreted using vector radiative transfer theory. AirMSPI observations over California's Central Valley are compared with model calculations using aerosol properties reported by the Fresno AERONET sunphotometer. Because determination of the radiative impact of different types of aerosols requires accurate attribution of the source of the reflected light along with characterization of the aerosol optical and microphysical properties, we explore the sensitivity of the Fresno measurements to variations in different aerosol properties, demonstrating the value of combining intensity and polarimetry at multiple view angles and spectral bands for constraining particle microphysical properties. Images over ocean to be presented include scenes over nearly cloud-free skies and scenes containing scattered clouds. It is well known that imperfect cloud screening confounds the determination of aerosol impact on radiation; it is perhaps less well appreciated that the effect of cloud reflections in the water can also be problematic. We calculate the magnitude of this effect in intensity and polarization and discuss its potential impact on aerosol retrievals, underscoring the value

  4. Multi-spectral materials: hybridisation of optical plasmonic filters, a mid infrared metamaterial absorber and a terahertz metamaterial absorber.

    PubMed

    Grant, James; McCrindle, Iain J H; Cumming, David R S

    2016-02-22

    Multi-spectral imaging systems typically require the cumbersome integration of disparate filtering materials and detectors in order to operate simultaneously in multiple spectral regions. Each distinct waveband must be detected at different spatial locations on a single chip or by separate chips optimised for each band. Here, we report on a single component that optically multiplexes visible, Mid Infrared (4.5 μm) and Terahertz (126 μm) radiation, thereby maximising the spectral information density. We hybridise plasmonic and metamaterial structures to form a device capable of simultaneously filtering 15 visible wavelengths and absorbing Mid Infrared and Terahertz radiation. Our synthetic multi-spectral component could be integrated with silicon complementary metal-oxide semiconductor technology where Si photodiodes are available to detect the visible radiation and micro-bolometers available to detect the Infrared/Terahertz and render an inexpensive, mass-producible camera capable of forming coaxial visible, Infrared and Terahertz images. PMID:26907004

  5. Interferometry based multispectral photon-limited 2D and 3D integral image encryption employing the Hartley transform.

    PubMed

    Muniraj, Inbarasan; Guo, Changliang; Lee, Byung-Geun; Sheridan, John T

    2015-06-15

    We present a method of securing multispectral 3D photon-counted integral imaging (PCII) using classical Hartley Transform (HT) based encryption by employing optical interferometry. This method simultaneously minimizes complexity, by eliminating the need for holographic recording, and addresses the phase sensitivity problem encountered when using digital cameras. These advantages, together with single-channel multispectral 3D data compactness and the inherent properties of the classical photon counting detection model, i.e. sparse sensing and the capability for nonlinear transformation, permit better authentication of the retrieved 3D scene at various depth cues. Furthermore, the proposed technique works for both spatially and temporally incoherent illumination. To validate the proposed technique, simulations were carried out for both the 2D and 3D cases. Experimental data were processed and the results support the feasibility of the encryption method. PMID:26193568
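
    The interferometric and photon-counting stages are specific to the paper, but the Hartley transform at the core of the scheme is standard; a minimal sketch of a 2D discrete Hartley transform (DHT) computed from an FFT is shown below. This is a generic identity, not the authors' optical implementation.

```python
# Minimal sketch: 2D discrete Hartley transform via the identity
# DHT(x) = Re(FFT(x)) - Im(FFT(x)); the cas kernel is cos + sin.
# Models only the mathematical transform, not the optical
# interferometry or photon-counting steps of the paper.
import numpy as np

def dht2(image):
    F = np.fft.fft2(image)
    return F.real - F.imag

def idht2(H):
    # up to a 1/N scaling, the DHT is its own inverse
    return dht2(H) / H.size
```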

  6. Prediction of apple fruit firmness and soluble solids content using characteristics of multispectral scattering images

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Multispectral scattering is a promising technique for nondestructive sensing of multiple quality attributes of apple fruit. This research developed new, improved methods for processing and analyzing multispectral scattering profiles in order to design and build a better multispectral imaging system ...

  7. A multispectral imaging approach for diagnostics of skin pathologies

    NASA Astrophysics Data System (ADS)

    Lihacova, Ilze; Derjabo, Aleksandrs; Spigulis, Janis

    2013-06-01

    A noninvasive multispectral imaging method was applied to the diagnostics of different skin pathologies such as nevus, basal cell carcinoma, and melanoma. A melanoma diagnostic parameter, based on three spectral bands (540 nm, 650 nm, and 950 nm), was developed and calculated for nevus, melanoma, and basal cell carcinoma. A simple multispectral diagnostic device was built and applied to skin assessment. The development and application of the multispectral diagnostic method are described further in this article.

  8. Image processing of underwater multispectral imagery

    USGS Publications Warehouse

    Zawada, D.G.

    2003-01-01

    Capturing in situ fluorescence images of marine organisms presents many technical challenges. The effects of the medium, as well as the particles and organisms within it, are intermixed with the desired signal. Methods for extracting and preparing the imagery for analysis are discussed in reference to a novel underwater imaging system called the low-light-level underwater multispectral imaging system (LUMIS). The instrument supports both uni- and multispectral collections, each of which is discussed in the context of an experimental application. In unispectral mode, LUMIS was used to investigate the spatial distribution of phytoplankton. A thin sheet of laser light (532 nm) induced chlorophyll fluorescence in the phytoplankton, which was recorded by LUMIS. Inhomogeneities in the light sheet led to the development of a beam-pattern-correction algorithm. Separating individual phytoplankton cells from a weak background fluorescence field required a two-step procedure consisting of edge detection followed by a series of binary morphological operations. In multispectral mode, LUMIS was used to investigate the bio-assay potential of fluorescent pigments in corals. Problems with the commercial optical-splitting device produced nonlinear distortions in the imagery. A tessellation algorithm, including an automated tie-point-selection procedure, was developed to correct the distortions. Only pixels corresponding to coral polyps were of interest for further analysis. Extraction of these pixels was performed by a dynamic global-thresholding algorithm.
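
    As a rough illustration of the two-step cell-extraction idea described above (edge detection followed by binary morphological operations), a minimal sketch in Python/OpenCV is given below. The parameter values and function names are placeholders, not the LUMIS processing chain.

```python
# Hedged sketch: isolate bright fluorescing cells from a weak background
# using edge detection followed by binary morphology. Thresholds and
# kernel sizes are illustrative only; assumes an 8-bit grayscale frame.
import cv2

def extract_cells(frame_8bit):
    blurred = cv2.GaussianBlur(frame_8bit, (5, 5), 0)          # suppress shot noise
    edges = cv2.Canny(blurred, 30, 90)                          # step 1: edge detection
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)     # close cell outlines
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)       # drop isolated edge pixels
    return mask
```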

  9. Hyperspectral and multispectral sensors for remote sensing

    NASA Astrophysics Data System (ADS)

    Miller, James; Kullar, Sukhbir; Cochrane, David; O, Nixon; Lomako, Andrey; Draijer, Cees

    2010-11-01

    Remote hyperspectral and multispectral sensors have been developed using modern CCD and CMOS fabrication techniques combined with advanced dichroic filters. The resulting sensors are more cost effective while maintaining the high performance needed in remote sensing applications. A single device can contain multiple imaging areas tailored to different multispectral bandwidths in a highly cost effective and reliable package. This paper discusses a five band visible to near IR scanning sensor. By bonding advanced dichroic filters onto the cover glass and directly in the imaging path, a highly efficient multispectral sensor is achieved. Up to 12,000 linear pixel arrays are possible with this advanced filter technology approach. Individual imaging areas on the device are designed to have unique pixel sizes and clocking to enable tailored imaging performance for the individual spectral bands. Individual elements are also based on high resolution Time Delay and Integration (TDI) technology to maximize sensitivity and throughput. Additionally for hyperspectral imagers, a split frame CCD design is discussed using high sensitivity back side illuminated (BSI) processes that can achieve high quantum efficiency. As these sensors are used in remote sensing applications, device robustness and radiation tolerance were required.

  10. Estimating the proportions of objects within a single resolution element of a multispectral scanner.

    NASA Technical Reports Server (NTRS)

    Horwitz, H. M.; Nalepka, R. F.; Hyde, P. D.; Morgenstern, J. P.

    1971-01-01

    Description of a procedure designed to estimate the proportions of objects and materials contained in the instantaneous field of view (IFOV) of an airborne multispectral device. A mathematical model is derived to relate the signature of a combination of materials in a resolution cell to the signatures of the individual materials considered. Estimation algorithms are generated and digital computer programs are prepared to apply the algorithms in the description of the effects which are observed when several objects are viewed simultaneously. The maximum likelihood estimate of the proportions of various individual materials in an IFOV is discussed. A simulation program is proposed for such estimates. A procedure for analyzing the geometric relations of signatures which affect the accuracy of estimates is set forth.
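
    The linear mixing idea underlying such proportion estimates can be stated compactly; the notation below is generic and is not taken from the report itself.

```latex
% Generic linear mixing model for one resolution cell (IFOV):
% the observed multispectral signature x is a proportion-weighted
% combination of the pure material signatures s_i plus noise.
\[
  \mathbf{x} = \sum_{i=1}^{m} p_i\,\mathbf{s}_i + \boldsymbol{\varepsilon},
  \qquad p_i \ge 0, \qquad \sum_{i=1}^{m} p_i = 1 .
\]
```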

  11. Comparison of multispectral remote-sensing techniques for monitoring subsurface drain conditions. [Imperial Valley, California

    NASA Technical Reports Server (NTRS)

    Goettelman, R. C.; Grass, L. B.; Millard, J. P.; Nixon, P. R.

    1983-01-01

    The following multispectral remote-sensing techniques were compared to determine the most suitable method for routinely monitoring agricultural subsurface drain conditions: airborne scanning, covering the visible through thermal-infrared (IR) portions of the spectrum; color-IR photography; and natural-color photography. Color-IR photography was determined to be the best approach, from the standpoint of both cost and information content. Aerial monitoring of drain conditions for early warning of tile malfunction appears practical. With careful selection of season and rain-induced soil-moisture conditions, extensive regional surveys are possible. Certain locations, such as the Imperial Valley, Calif., are precluded from regional monitoring because of year-round crop rotations and soil stratification conditions. Here, farms with similar crops could time local coverage for bare-field and saturated-soil conditions.

  12. Targetless Camera Calibration

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.; Mussio, L.; Remondino, F.; Scaioni, M.

    2011-09-01

    In photogrammetry a camera is considered calibrated if its interior orientation parameters are known. These encompass the principal distance, the principal point position and some Additional Parameters used to model possible systematic errors. The current state of the art for automated camera calibration relies on the use of coded targets to accurately determine the image correspondences. This paper presents a new methodology for the efficient and rigorous photogrammetric calibration of digital cameras which no longer requires the use of targets. A set of images depicting a well-textured scene is sufficient for the extraction of natural corresponding image points. These are automatically matched with feature-based approaches and robust estimation techniques. The subsequent photogrammetric bundle adjustment retrieves the unknown camera parameters and their theoretical accuracies. Examples, considerations and comparisons with real data and different case studies are illustrated to show the potential of the proposed methodology.

  13. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  14. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting, micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  15. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, with a brief description of the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  16. Comparison of Hyperspectral and Multispectral Satellites for Discriminating Land Cover in Northern California

    NASA Astrophysics Data System (ADS)

    Clark, M. L.; Kilham, N. E.

    2015-12-01

    Land-cover maps are important science products needed for natural resource and ecosystem service management, biodiversity conservation planning, and assessing human-induced and natural drivers of land change. Most land-cover maps at regional to global scales are produced with remote sensing techniques applied to multispectral satellite imagery with 30-500 m pixel sizes (e.g., Landsat, MODIS). Hyperspectral, or imaging spectrometer, imagery measuring the visible to shortwave infrared regions (VSWIR) of the spectrum have shown impressive capacity to map plant species and coarser land-cover associations, yet techniques have not been widely tested at regional and greater spatial scales. The Hyperspectral Infrared Imager (HyspIRI) mission is a VSWIR hyperspectral and thermal satellite being considered for development by NASA. The goal of this study was to assess multi-temporal, HyspIRI-like satellite imagery for improved land cover mapping relative to multispectral satellites. We mapped FAO Land Cover Classification System (LCCS) classes over 22,500 km2 in the San Francisco Bay Area, California using 30-m HyspIRI, Landsat 8 and Sentinel-2 imagery simulated from data acquired by NASA's AVIRIS airborne sensor. Random Forests (RF) and Multiple-Endmember Spectral Mixture Analysis (MESMA) classifiers were applied to the simulated images and accuracies were compared to those from real Landsat 8 images. The RF classifier was superior to MESMA, and multi-temporal data yielded higher accuracy than summer-only data. With RF, hyperspectral data had overall accuracy of 72.2% and 85.1% with full 20-class and reduced 12-class schemes, respectively. Multispectral imagery had lower accuracy. For example, simulated and real Landsat data had 7.5% and 4.6% lower accuracy than HyspIRI data with 12 classes, respectively. In summary, our results indicate increased mapping accuracy using HyspIRI multi-temporal imagery, particularly in discriminating different natural vegetation types, such as
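
    A minimal sketch of per-pixel Random Forests classification of the kind evaluated above is shown below; the array names, band count, and classifier settings are placeholders, not the study's configuration.

```python
# Hedged sketch: per-pixel land-cover classification with Random Forests.
# train_spectra: (n_samples, n_bands) labeled pixel spectra
# image_cube:    (rows, cols, n_bands) image to classify
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def classify_pixels(train_spectra, train_labels, image_cube):
    rf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
    rf.fit(train_spectra, train_labels)
    rows, cols, bands = image_cube.shape
    labels = rf.predict(image_cube.reshape(-1, bands))
    return labels.reshape(rows, cols)
```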

  17. The Multispectral Imaging Science Working Group. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Cox, S. C. (Editor)

    1982-01-01

    The status and technology requirements for using multispectral sensor imagery in geographic, hydrologic, and geologic applications are examined. Critical issues in image and information science are identified.

  18. Gamma ray camera

    SciTech Connect

    Robbins, C.D.; Wang, S.

    1980-09-09

    An Anger gamma ray camera is improved by the substitution of a gamma-ray-sensitive, proximity type image intensifier tube for the scintillator screen in the Anger camera, the image intensifier tube having a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded, flat output phosphor display screen, all of the same dimension (unity image magnification) and all within a grounded metallic tube envelope, and having a metallic, inwardly concave input window between the scintillator screen and the collimator.

  19. The Airborne Laser

    NASA Astrophysics Data System (ADS)

    Lamberson, Steven E.

    2002-09-01

    The US Air Force Airborne Laser (ABL) is an airborne, megawatt-class laser system with a state-of-the-art atmospheric compensation system to destroy enemy ballistic missiles at long ranges. This system will provide both deterrence and defense against the use of such weapons during conflicts. This paper provides an overview of the ABL weapon system including: the notional operational concept, the development approach and schedule, the overall aircraft configuration, the technologies being incorporated in the ABL, and the risk reduction approach being utilized to ensure program success.

  20. Airborne oceanographic lidar system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Specifications and preliminary design of an Airborne Oceanographic Lidar (AOL) system, which is to be constructed for installation and used on a NASA Wallops Flight Center (WFC) C-54 research aircraft, are reported. The AOL system is to provide an airborne facility for use by various government agencies to demonstrate the utility and practicality of hardware of this type in the wide area collection of oceanographic data on an operational basis. System measurement and performance requirements are presented, followed by a description of the conceptual system approach and the considerations attendant to its development. System performance calculations are addressed, and the system specifications and preliminary design are presented and discussed.

  1. Monitoring human and vehicle activities using airborne video

    NASA Astrophysics Data System (ADS)

    Cutler, Ross; Shekhar, Chandra S.; Burns, B.; Chellappa, Rama; Bolles, Robert C.; Davis, Larry S.

    2000-05-01

    Ongoing work in Activity Monitoring (AM) for the Airborne Video Surveillance (AVS) project is described. The goal of AM is to recognize activities of interest involving humans and vehicles using airborne video. AM consists of three major components: (1) moving object detection, tracking, and classification; (2) image to site-model registration; (3) activity recognition. Detecting and tracking humans and vehicles from airborne video is a challenging problem due to image noise, low GSD, poor contrast, motion parallax, motion blur, camera blur, and camera jitter. We use frame-to-frame affine-warping stabilization and temporally integrated intensity differences to detect independent motion. Moving objects are initially tracked using nearest-neighbor correspondence, followed by a greedy method that favors long track lengths and assumes locally constant velocity. Object classification is based on object size, velocity, and periodicity of motion. Site-model registration uses GPS information and camera/airplane orientations to provide an initial geolocation with +/- 100 m accuracy at an elevation of 1000 m. A semi-automatic procedure is utilized to improve the accuracy to +/- 5 m. The activity recognition component uses the geolocated tracked objects and the site-model to detect pre-specified activities, such as people entering a forbidden area or a group of vehicles leaving a staging area.
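
    A simplified sketch of the stabilization-and-differencing step for independent-motion detection is given below; it uses single-frame differencing rather than the temporally integrated differences described above, and all parameter values are illustrative rather than the AVS project's settings.

```python
# Hedged sketch: frame-to-frame affine stabilization followed by
# differencing to expose independently moving objects. Assumes 8-bit
# grayscale frames; thresholds and feature parameters are placeholders.
import cv2
import numpy as np

def motion_mask(prev, curr, diff_thresh=25):
    pts_prev = cv2.goodFeaturesToTrack(prev, maxCorners=400,
                                       qualityLevel=0.01, minDistance=8)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts_prev, None)
    good = status.ravel() == 1
    # estimate camera-induced motion and warp the previous frame onto the current one
    A, _ = cv2.estimateAffinePartial2D(pts_prev[good], pts_curr[good], method=cv2.RANSAC)
    stabilized_prev = cv2.warpAffine(prev, A, (curr.shape[1], curr.shape[0]))
    # residual differences after stabilization indicate independently moving objects
    diff = cv2.absdiff(curr, stabilized_prev)
    return (diff > diff_thresh).astype(np.uint8) * 255
```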

  2. CNR LARA project, Italy: Airborne laboratory for environmental research

    NASA Technical Reports Server (NTRS)

    Bianchi, R.; Cavalli, R. M.; Fiumi, L.; Marino, C. M.; Pignatti, S.

    1995-01-01

    Increasing interest in environmental problems and in the study of the environmental impact of anthropic activity has led to an expansion of remote sensing applications. The Italian National Research Council (CNR) established a new laboratory for airborne hyperspectral imaging, the LARA Project (Laboratorio Aero per Ricerche Ambientali - Airborne Laboratory for Environmental Research), equipping its airborne laboratory, a CASA-212, mainly with the Daedalus AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) instrument. MIVIS's channels, spectral bandwidths, and locations are chosen to meet the needs of scientific research for advanced applications of remote sensing data. MIVIS can make significant contributions to solving problems in many diverse areas such as geologic exploration, land use studies, mineralogy, agricultural crop studies, energy loss analysis, pollution assessment, volcanology, forest fire management and others. The broad spectral range and the many discrete narrow channels of MIVIS provide a fine quantization of spectral information that permits accurate definition of absorption features from a variety of materials, allowing the extraction of chemical and physical information about our environment. The availability of such a hyperspectral imager, which will operate mainly in the Mediterranean area, at present represents a unique opportunity for those involved in environmental studies and land management to systematically collect large-scale, high spectral-spatial resolution data of this part of the world. Nevertheless, MIVIS deployments will touch other parts of the world, where a major interest from the international scientific community is present.

  3. Camera Edge Response

    NASA Astrophysics Data System (ADS)

    Zisk, Stanley H.; Wittels, Norman

    1988-02-01

    Edge location is an important machine vision task. Machine vision systems perform mathematical operations on rectangular arrays of numbers that are intended to faithfully represent the spatial distribution of scene luminance. The numbers are produced by periodic sampling and quantization of the camera's video output. This sequence can cause artifacts to appear in the data with a noise spectrum that is high in power at high spatial frequencies. This is a problem because most edge detection algorithms are preferentially sensitive to the high-frequency content in an image. Solid state cameras can introduce errors because of the spatial periodicity of their sensor elements. This can result in problems when image edges are aligned with camera pixel boundaries: (a) some cameras introduce transients into the video signal while switching between sensor elements; (b) most cameras use analog low-pass filters to minimize sampling artifacts and these introduce video phase delays that shift the locations of edges. The problems compound when the vision system samples asynchronously with the camera's pixel rate. Moire patterns (analogous to beat frequencies) can result. In this paper, we examine and model quantization effects in a machine vision system with particular emphasis on edge detection performance. We also compare our models with experimental measurements.

  4. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  5. Multispectral thermal infrared mapping of the 1 October 1988 Kupaianaha flow field, Kilauea volcano, Hawaii

    USGS Publications Warehouse

    Realmuto, V.J.; Hon, K.; Kahle, A.B.; Abbott, E.A.; Pieri, D.C.

    1992-01-01

    Multispectral thermal infrared radiance measurements of the Kupaianaha flow field were acquired with the NASA airborne Thermal Infrared Multispectral Scanner (TIMS) on the morning of 1 October 1988. The TIMS data were used to map both the temperature and emissivity of the surface of the flow field. The temperature map depicted the underground storage and transport of lava. The presence of molten lava in a tube or tumulus resulted in surface temperatures that were at least 10 °C above ambient. The temperature map also clearly defined the boundaries of hydrothermal plumes which resulted from the entry of lava into the ocean. The emissivity map revealed the boundaries between individual flow units within the Kupaianaha field. In general, the emissivity of the flows varied systematically with age but the relationship between age and emissivity was not unique. Distinct spectral anomalies, indicative of silica-rich surface materials, were mapped near fumaroles and ocean entry sites. This apparent enrichment in silica may have resulted from an acid-induced leaching of cations from the surfaces of glassy flows. Such incipient alteration may have been the cause for virtually all of the emissivity variations observed on the flow field, the spectral anomalies representing areas where the acid attack was most intense. © 1992 Springer-Verlag.

  6. A multispectral automatic target recognition application for maritime surveillance, search, and rescue

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon; Reed, Scott; Podobna, Yuliya; Vazquez, Jose; Boucher, Cynthia

    2010-04-01

    Due to increased security concerns, the commitment to monitor and maintain security in the maritime environment is increasingly a priority. A country's coast is the most vulnerable area for the incursion of illegal immigrants, terrorists and contraband. This work illustrates the ability of a low-cost, light-weight, multi-spectral, multi-channel imaging system to handle the environment and see under difficult marine conditions. The system and its implemented detecting and tracking technologies should be organic to the maritime homeland security community for search and rescue, fisheries, defense, and law enforcement. It is tailored for airborne and ship based platforms to detect, track and monitor suspected objects (such as semi-submerged targets like marine mammals, vessels in distress, and drug smugglers). In this system, automated detection and tracking technology is used to detect, classify and localize potential threats or objects of interest within the imagery provided by the multi-spectral system. These algorithms process the sensor data in real time, thereby providing immediate feedback when features of interest have been detected. A supervised detection system based on Haar features and Cascade Classifiers is presented and results are provided on real data. The system is shown to be extendable and reusable for a variety of different applications.
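
    For illustration, a minimal sketch of running a trained Haar cascade detector on a video frame is shown below; the model file name is hypothetical, and the authors' actual training data and classifier are not reproduced.

```python
# Hedged sketch: supervised detection with Haar features and a cascade
# classifier using OpenCV. "vessel_cascade.xml" is a hypothetical model
# file standing in for the authors' trained detector.
import cv2

def detect_targets(frame_gray, model_path="vessel_cascade.xml"):
    cascade = cv2.CascadeClassifier(model_path)
    # returns (x, y, w, h) boxes around candidate objects of interest
    return cascade.detectMultiScale(frame_gray, scaleFactor=1.1,
                                    minNeighbors=4, minSize=(24, 24))
```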

  7. Monitoring Geothermal Features in Yellowstone National Park with ATLAS Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Berglund, Judith

    2000-01-01

    The National Park Service (NPS) must produce an Environmental Impact Statement for each proposed development in the vicinity of known geothermal resource areas (KGRAs) in Yellowstone National Park. In addition, the NPS monitors indicator KGRAs for environmental quality and is still in the process of mapping many geothermal areas. The NPS currently maps geothermal features with field survey techniques. High resolution aerial multispectral remote sensing in the visible, NIR, SWIR, and thermal spectral regions could enable YNP geothermal features to be mapped more quickly and in greater detail. In response, Yellowstone Ecosystems Studies, in partnership with NASA's Commercial Remote Sensing Program, is conducting a study on the use of Airborne Terrestrial Applications Sensor (ATLAS) multispectral data for monitoring geothermal features in the Upper Geyser Basin. ATLAS data were acquired at 2.5 meter resolution on August 17, 2000. These data were processed into land cover classifications and relative temperature maps. For sufficiently large features, the ATLAS data can map geothermal areas in terms of geyser pools and hot springs, plus multiple categories of geothermal runoff that are apparently indicative of temperature gradients and microbial matting communities. In addition, the ATLAS maps clearly identify geyserite areas. The thermal bands contributed to classification success and to the computation of relative temperature. With masking techniques, one can assess the influence of geothermal features on the Firehole River. Preliminary results appear to confirm ATLAS data utility for mapping and monitoring geothermal features. Future work will include classification refinement and additional validation.

  8. Combination of multispectral remote sensing, variable rate technology and environmental modeling for citrus pest management.

    PubMed

    Du, Qian; Chang, Ni-Bin; Yang, Chenghai; Srilakshmi, Kanth R

    2008-01-01

    The Lower Rio Grande Valley (LRGV) of south Texas is an agriculturally rich area supporting intensive production of vegetables, fruits, grain sorghum, and cotton. Modern agricultural practices involve the combined use of irrigation with the application of large amounts of agrochemicals to maximize crop yields. Intensive agricultural activities in past decades might have caused potential contamination of soil, surface water, and groundwater due to leaching of pesticides in the vadose zone. In an effort to promote precision farming in citrus production, this paper aims at developing an airborne multispectral technique for identifying tree health problems in a citrus grove that can be combined with variable rate technology (VRT) for required pesticide application and environmental modeling for assessment of pollution prevention. An unsupervised linear unmixing method was applied to classify the image for the grove and quantify the symptom severity for appropriate infection control. The PRZM-3 model was used to estimate environmental impacts that contribute to nonpoint source pollution with and without the use of multispectral remote sensing and VRT. Research findings using site-specific environmental assessment clearly indicate that combination of remote sensing and VRT may result in benefit to the environment by reducing the nonpoint source pollution by 92.15%. Overall, this study demonstrates the potential of precision farming for citrus production in the nexus of industrial ecology and agricultural sustainability. PMID:17222960
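
    A minimal sketch of per-pixel linear spectral unmixing in the spirit of the method named above is given below; it assumes the endmember spectra are already known, whereas the study derived them with an unsupervised method, and the constraint handling is simplified.

```python
# Hedged sketch: constrained linear unmixing of one multispectral pixel.
# endmembers: (n_bands, n_endmembers) matrix of known pure spectra.
# Abundances are forced non-negative and renormalized to sum to one.
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel, endmembers):
    abundances, _ = nnls(endmembers, pixel)
    total = abundances.sum()
    return abundances / total if total > 0 else abundances
```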

  9. Genetic refinement of cloud-masking algorithms for the multi-spectral thermal imager (MTI)

    SciTech Connect

    Hirsch, K. L.; Davis, A. B.; Harvey, N. R.; Rohde, C. A.; Brumby, Steven P.

    2001-01-01

    The Multi-spectral Thermal Imager (MTI) is a high-performance remote-sensing satellite designed, owned and operated by the U.S. Department of Energy, with a dual mission in environmental studies and in nonproliferation. It has enhanced spatial and radiometric resolutions and state-of-the-art calibration capabilities. This instrumental development puts a new burden on retrieval algorithm developers to pass this accuracy on to the inferred geophysical parameters. In particular, the atmospheric correction scheme assumes the intervening atmosphere will be modeled as a plane-parallel horizontally-homogeneous medium. A single dense-enough cloud in view of the ground target can easily offset reality from the calculations, hence the need for a reliable cloud-masking algorithm. Pixel-scale cloud detection relies on the simple facts that clouds are generally whiter, brighter, and colder than the ground below; spatially, dense clouds are generally large on some scale. This is a good basis for searching multispectral datacubes for cloud signatures. However, the resulting cloud mask can be very sensitive to the choice of thresholds in whiteness, brightness, temperature, and connectivity. We have used a genetic algorithm trained on (MODIS Airborne Simulator-based) simulated MTI data to design a cloud-mask. Its performance is compared quantitatively to hand-drawn training data and to the EOS/Terra MODIS cloud mask.
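
    The pixel-scale tests named above (whiteness, brightness, temperature, connectivity) can be sketched as simple thresholds; the band names and threshold values below are placeholders for illustration, not the tuned MTI parameters.

```python
# Hedged sketch: threshold-based cloud mask of the kind the genetic
# algorithm refines. All thresholds are illustrative placeholders.
import numpy as np
from scipy import ndimage

def cloud_mask(blue, red, nir, brightness_temp_K,
               bright_thresh=0.3, white_thresh=0.1,
               temp_thresh=280.0, min_pixels=9):
    bright = (blue + red + nir) / 3.0 > bright_thresh                   # clouds are bright
    white = np.std(np.stack([blue, red, nir]), axis=0) < white_thresh   # and spectrally flat
    cold = brightness_temp_K < temp_thresh                              # and colder than ground
    candidate = bright & white & cold
    # keep only connected regions of at least min_pixels (clouds are extended)
    labels, n = ndimage.label(candidate)
    sizes = ndimage.sum(candidate, labels, range(1, n + 1))
    return np.isin(labels, np.nonzero(sizes >= min_pixels)[0] + 1)
```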

  10. Analysis of multispectral signatures and investigation of multi-aspect remote sensing techniques

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Hieber, R. H.; Sarno, J. E.

    1974-01-01

    Two major aspects of remote sensing with multispectral scanners (MSS) are investigated. The first, multispectral signature analysis, includes the effects on classification performance of systematic variations found in the average signals received from various ground covers as well as the prediction of these variations with theoretical models of physical processes. The foremost effects studied are those associated with the time of day at which airborne MSS data are collected. Six data collection runs made over the same flight line in a period of five hours are analyzed; it is found that the time span significantly affects classification performance. Variations associated with scan angle also are studied. The second major topic of discussion is multi-aspect remote sensing, a new concept in remote sensing with scanners. Here, data are collected on multiple passes by a scanner that can be tilted to scan forward of the aircraft at different angles on different passes. The use of such spatially registered data to achieve improved classification of agricultural scenes is investigated and found promising. Also considered are the possibilities of extracting from multi-aspect data, information on the condition of corn canopies and the stand characteristics of forests.

  11. NASA Airborne Lidar July 1991

    Atmospheric Science Data Center

    2016-05-26

    Data from the 1991 NASA Langley Airborne Lidar flights following the eruption of Pinatubo in July ... and Osborn [1992a, 1992b]. Project Title: NASA Airborne Lidar. Discipline: Field Campaigns.

  12. NASA Airborne Lidar May 1992

    Atmospheric Science Data Center

    2016-05-26

    An airborne Nd:YAG (532 nm) lidar was operated by the NASA Langley Research Center about a year following the June 1991 eruption of ... Osborn [1992a, 1992b]. Project Title: NASA Airborne Lidar. Discipline: Field Campaigns.

  13. Performance evaluation of the two-inch return-beam vidicon three-camera subsystem.

    NASA Technical Reports Server (NTRS)

    Miller, B. P.; Beck, G. A.; Barletta, J. M.

    1972-01-01

    A very-high resolution multispectral television camera system is being developed for NASA for use on the Earth Resources Technology Satellite (ERTS) program. There are three cameras in the system, each viewing the same area but operating in the blue-green, red and near-infrared spectral bands. In the laboratory the cameras' limiting resolution is 4500 TV lines over the 25 x 25-mm image format of the Return Beam Vidicon (RBV). Analysis of typical ERTS scenes shows that actual contrast ratios will be much lower than those of laboratory test targets. A model was developed to predict the resolving power performance of the RBV camera under realistic conditions. The methods used are applicable to all types of imaging systems. To verify the model, tests were conducted using the RBV camera, a laser-beam image reproducer and a series of AF tribar test patterns of known values of contrast. As a more graphic demonstration, simulated multispectral images were generated using color-IR photographs from Apollo 9.

  14. 9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ON RAILROAD TRACK AND FIXED CAMERA STATION 1400 (BUILDING NO. 42021) ABOVE, ADJACENT TO STATE HIGHWAY 39, LOOKING WEST, March 23, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  15. 1. VARIABLEANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. A low-volume array of such penetrator cameras could be deployed from an

  17. Multispectral imaging of the ocular fundus using light emitting diode illumination

    NASA Astrophysics Data System (ADS)

    Everdell, N. L.; Styles, I. B.; Calcagni, A.; Gibson, J.; Hebden, J.; Claridge, E.

    2010-09-01

    We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.

  18. Excitation spectroscopy in multispectral optical fluorescence tomography: methodology, feasibility, and computer simulation studies

    PubMed Central

    Chaudhari, Abhijit J; Ahn, Sangtae; Levenson, Richard; Badawi, Ramsey D; Cherry, Simon R; Leahy, Richard M

    2009-01-01

    Molecular probes used for in vivo Optical Fluorescence Tomography (OFT) studies in small animals are typically chosen such that their emission spectra lie in the 680–850 nm wavelength range. This is because tissue attenuation in this spectral band is relatively low, allowing optical photons even from deep sites in tissue to reach the animal surface, and consequently be detected by a CCD camera. The wavelength dependence of tissue optical properties within the 680–850 nm band can be exploited for emitted light by measuring fluorescent data via multispectral approaches and incorporating the spectral dependence of these optical properties into the OFT inverse problem - that of reconstructing underlying 3D fluorescent probe distributions from optical data collected on the animal surface. However, in the aforementioned spectral band, due to only small variations in the tissue optical properties, multispectral emission data, though superior for image reconstruction compared to achromatic data, tend to be somewhat redundant. A different spectral approach for OFT is to capitalize on the larger variations in the optical properties of tissue for excitation photons than for the emission photons by using excitation at multiple wavelengths as a means of decoding source depth in tissue. The full potential of spectral approaches in OFT can be realized by a synergistic combination of these two approaches, that is, exciting the underlying fluorescent probe at multiple wavelengths and measuring emission data multispectrally. In this paper, we describe a method that incorporates both excitation as well as emission spectral information into the OFT inverse problem. We describe a linear algebraic formulation of the multiple wavelength illumination - multispectral detection (MWI-MD) forward model for OFT and compare it to models that use only excitation at multiple wavelengths or those that use only multispectral detection techniques. This study is carried out in a realistic
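
    One generic way to write the stacked forward model for multiple-wavelength illumination with multispectral detection is shown below; the notation is ours for illustration and does not reproduce the paper's exact formulation.

```latex
% Illustrative stacked linear forward model: for each excitation wavelength
% lambda_k, a sensitivity block A_k maps the unknown fluorophore
% distribution x to the multispectrally detected surface data y_k.
\[
  \begin{pmatrix} \mathbf{y}_{1} \\ \vdots \\ \mathbf{y}_{K} \end{pmatrix}
  =
  \begin{pmatrix} \mathbf{A}_{1} \\ \vdots \\ \mathbf{A}_{K} \end{pmatrix}\mathbf{x}
  + \mathbf{n},
\]
% where K is the number of excitation wavelengths and n is measurement noise.
```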

  19. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the Vector Quantization algorithm was further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 195:1 (0.41 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
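
    A minimal sketch of the vector-quantization stage (one vector per ground location, one component per channel) is shown below; the codebook training here uses k-means for illustration, and the difference-mapped shift-extended Huffman stage is not reproduced.

```python
# Hedged sketch: vector quantization of multispectral pixel vectors.
# cube: (rows, cols, n_channels) image; codebook_size is illustrative.
import numpy as np
from sklearn.cluster import KMeans

def vq_compress(cube, codebook_size=256):
    rows, cols, ch = cube.shape
    vectors = cube.reshape(-1, ch).astype(np.float32)
    km = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(vectors)
    indices = km.labels_.reshape(rows, cols)   # one codebook index per pixel
    codebook = km.cluster_centers_             # (codebook_size, n_channels)
    return indices, codebook

def vq_reconstruct(indices, codebook):
    return codebook[indices]                   # (rows, cols, n_channels)
```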

  20. Airborne antenna pattern calculations

    NASA Technical Reports Server (NTRS)

    Knerr, T. J.; Schaffner, P. R.; Mielke, R. R.; Gilreath, M. C.

    1980-01-01

    A procedure for numerically calculating radiation patterns of fuselage-mounted airborne antennas using the Volumetric Pattern Analysis Program is presented. Special attention is given to aircraft modeling. An actual case study involving a large commercial aircraft is included to illustrate the analysis procedure.

  1. Recognizing Airborne Hazards.

    ERIC Educational Resources Information Center

    Schneider, Christian M.

    1990-01-01

    The heating, ventilating, and air conditioning (HVAC) systems in older buildings often do not adequately handle airborne contaminants. Outlines a three-stage Indoor Air Quality (IAQ) assessment and describes a case in point at a Pittsburgh, Pennsylvania, school. (MLF)

  2. Airborne Fraunhofer Line Discriminator

    NASA Technical Reports Server (NTRS)

    Gabriel, F. C.; Markle, D. A.

    1969-01-01

    The Airborne Fraunhofer Line Discriminator enables prospecting for fluorescent materials, hydrography with fluorescent dyes, and plant studies based on the fluorescence of chlorophyll. The optical unit design exploits the coincidence of Fraunhofer lines in the solar spectrum with the characteristic wavelengths of some fluorescent materials.

  3. Airborne Remote Sensing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA imaging technology has provided the basis for a commercial agricultural reconnaissance service. AG-RECON furnishes information from airborne sensors, aerial photographs and satellite and ground databases to farmers, foresters, geologists, etc. This service produces color "maps" of Earth conditions, which enable clients to detect crop color changes or temperature changes that may indicate fire damage or pest stress problems.

  4. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions to applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have enabled highly reliable, low-cost, hand-held instruments. Initially, these instruments displayed only relative temperature differences, which limited their usefulness in applications such as Thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR Thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperature ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high resolution video imaging creates some unique challenges when using uncooled detectors. A temperature controlled, field-of-view limiting aperture (cold shield) is not typically included in the small volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the Germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.

  5. Multispectral IR detection modules and applications

    NASA Astrophysics Data System (ADS)

    Münzberg, M.; Breiter, R.; Cabanski, W.; Lutz, H.; Wendler, J.; Ziegler, J.; Rehm, R.; Walther, M.

    2006-05-01

This paper first presents the current status at AIM of quantum well (QWIP) and antimonide superlattice (SL) detection modules for multispectral ground and airborne applications in the high-performance range, i.e., for missile approach warning systems, and second presents possibilities with long linear arrays, i.e., 576x7 MCT, for spectrally selective measurement in the 2-11 μm wavelength range. QWIP and antimonide based superlattice (SL) modules are developed and produced in a work share between AIM and the Fraunhofer Institute for Applied Solid State Physics (IAF). The sensitive layers are manufactured by the IAF, then hybridized and integrated to IDCA or camera level by AIM. In the case of MCT based modules, all steps are done by AIM. QWIP dual band or dual color detectors provide good resolution as long as integration times on the order of 5-10 ms can be tolerated. This is acceptable for all applications where no fast motions of the platform or the targets are to be expected. For spectrally selective detection, a QWIP detector combining 3-5 μm (MWIR) and 8-10 μm (LWIR) detection in each pixel with coincident integration has been developed in a 384x288x2 format with 40 μm pitch. Excellent thermal resolution with NETD < 30 mK @ F/2, 6.8 ms for both peak wavelengths (4.8 μm and 8.0 μm) has been achieved. Thanks to the well-established QWIP technology, the pixel outage rates even in these complex structures are well below 0.5% in both bands. The spectral cross talk between the two wavelength bands is equal to or less than 1%. In this case the substrate on the sensitive layer of the FPA was completely removed and, as a consequence, the optical crosstalk usually observed in QWIP arrays, which results in low MTF values, was suppressed, giving a sharp image impression. For rapidly changing scenes - like e.g. in case of missile warning applications for airborne platforms - a material system with higher quantum efficiency is required to limit integration times to

  6. International Symposium on Airborne Geophysics

    NASA Astrophysics Data System (ADS)

    Mogi, Toru; Ito, Hisatoshi; Kaieda, Hideshi; Kusunoki, Kenichiro; Saltus, Richard W.; Fitterman, David V.; Okuma, Shigeo; Nakatsuka, Tadashi

    2006-05-01

    Airborne geophysics can be defined as the measurement of Earth properties from sensors in the sky. The airborne measurement platform is usually a traditional fixed-wing airplane or helicopter, but could also include lighter-than-air craft, unmanned drones, or other specialty craft. The earliest history of airborne geophysics includes kite and hot-air balloon experiments. However, modern airborne geophysics dates from the mid-1940s when military submarine-hunting magnetometers were first used to map variations in the Earth's magnetic field. The current gamut of airborne geophysical techniques spans a broad range, including potential fields (both gravity and magnetics), electromagnetics (EM), radiometrics, spectral imaging, and thermal imaging.

  7. KATE-140 and MKF-6M space cameras

    NASA Astrophysics Data System (ADS)

    Kuchumov, V.

    1982-07-01

    The KATE-140 large-format topographic camera made it possible to obtain photographs suitable for precision photographic-survey processing. It has a field of vision of 85 deg which makes it possible for a single frame to contain an image of a 450x450-km segment of the Earth's surface from orbital altitude. Precise measurement of the linear dimensions of objects and their mutual positions can be carried out on the photographs. Moreover, the camera's high resolution meets imagery-interpretation requirements. A guidance device provides for both single and strip photographs with a given interval. A punching machine is used to separate the strips. Before being used on the Salyut-6 station, the other fixed camera, the MKF-6M, underwent thorough testing onboard the Soyuz-22 spacecraft. The multispectral MKF-6M camera is designed to obtain information about the spectral characteristics of natural objects in order to increase the reliability of their interpretation. It has six spectral channels, four of which encompass the visible spectra and two of which are in the near-infrared.

  8. The CAMCAO infrared camera

    NASA Astrophysics Data System (ADS)

    Amorim, Antonio; Melo, Antonio; Alves, Joao; Rebordao, Jose; Pinhao, Jose; Bonfait, Gregoire; Lima, Jorge; Barros, Rui; Fernandes, Rui; Catarino, Isabel; Carvalho, Marta; Marques, Rui; Poncet, Jean-Marc; Duarte Santos, Filipe; Finger, Gert; Hubin, Norbert; Huster, Gotthard; Koch, Franz; Lizon, Jean-Louis; Marchetti, Enrico

    2004-09-01

The CAMCAO instrument is a high resolution near infrared (NIR) camera conceived to operate together with the new ESO Multi-conjugate Adaptive optics Demonstrator (MAD) with the goal of evaluating the feasibility of Multi-Conjugate Adaptive Optics techniques (MCAO) on the sky. It is a high-resolution wide field of view (FoV) camera that is optimized to use the extended correction of the atmospheric turbulence provided by MCAO. While the first purpose of this camera is sky observation in the MAD setup, to validate the MCAO technology, in a second phase the CAMCAO camera is planned to be attached directly to the VLT for scientific astrophysical studies. The camera is based on the 2kx2k HAWAII2 infrared detector controlled by an ESO external IRACE system and includes standard IR band filters mounted on a positional filter wheel. The CAMCAO design requires that the optical components and the IR detector be kept at low temperatures in order to avoid emitting radiation and to lower detector noise in the analysis region. The cryogenic system includes an LN2 tank and a specially developed pulse tube cryocooler. Field and pupil cold stops are implemented to reduce the infrared background and the stray light. The CAMCAO optics provide diffraction limited performance down to J Band, but the detector sampling fulfills the Nyquist criterion for the K band (2.2 μm).

  9. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  10. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel-1. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  11. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66 ) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  12. Reproducible high-resolution multispectral image acquisition in dermatology

    NASA Astrophysics Data System (ADS)

    Duliu, Alexandru; Gardiazabal, José; Lasser, Tobias; Navab, Nassir

    2015-07-01

    Multispectral image acquisitions are increasingly popular in dermatology, due to their improved spectral resolution which enables better tissue discrimination. Most applications however focus on restricted regions of interest, imaging only small lesions. In this work we present and discuss an imaging framework for high-resolution multispectral imaging on large regions of interest.

  13. A multispectral method of determining sea surface temperatures

    NASA Technical Reports Server (NTRS)

    Shenk, W. E.

    1972-01-01

    A multispectral method for determining sea surface temperatures is discussed. The specifications of the equipment and the atmospheric conditions required for successful multispectral data acquisition are described. Examples of data obtained in the North Atlantic Ocean are presented. The differences between the actual sea surface temperatures and the equivalent blackbody temperatures as determined by a radiometer are plotted.

  14. Multispectral data compression through transform coding and block quantization

    NASA Technical Reports Server (NTRS)

    Ready, P. J.; Wintz, P. A.

    1972-01-01

    Transform coding and block quantization techniques are applied to multispectral aircraft scanner data, and digitized satellite imagery. The multispectral source is defined and an appropriate mathematical model proposed. The Karhunen-Loeve, Fourier, and Hadamard encoders are considered and are compared to the rate distortion function for the equivalent Gaussian source and to the performance of the single sample PCM encoder.
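
    A minimal sketch of the Karhunen-Loeve (principal-component) transform across spectral bands is shown below, assuming the multispectral source is held as a NumPy cube of shape (bands, rows, cols). Retaining only the leading components before quantization is how such encoders exploit inter-band correlation; the function names and the choice of three retained components are assumptions of the example, not the paper's configuration.

    ```python
    import numpy as np

    def kl_transform_code(cube, keep=3):
        """Karhunen-Loeve transform across spectral bands; keep leading components."""
        b, r, w = cube.shape
        x = cube.reshape(b, -1).astype(np.float64)
        mean = x.mean(axis=1, keepdims=True)
        xc = x - mean
        cov = xc @ xc.T / xc.shape[1]              # band-to-band covariance matrix
        evals, evecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
        order = np.argsort(evals)[::-1][:keep]
        basis = evecs[:, order]                    # (bands, keep)
        coeffs = basis.T @ xc                      # decorrelated component images
        return coeffs.reshape(keep, r, w), basis, mean

    def kl_reconstruct(coeffs, basis, mean):
        """Invert the truncated transform back to an approximate band cube."""
        keep, r, w = coeffs.shape
        x = basis @ coeffs.reshape(keep, -1) + mean
        return x.reshape(-1, r, w)
    ```

    In a complete encoder the retained component images would then be quantized block by block, which is where the block-quantization step of the abstract would apply.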

  15. Two mirror objective design for multispectral remote sensing

    NASA Technical Reports Server (NTRS)

    Clark, P. P.

    1982-01-01

    A two mirror flat field anastigmatic telescope was designed for multispectral sensing. The design was adapted to prism-type beamsplitting arrangements without loss of multispectral image quality by the addition of one refractive element. In addition to being relatively simple and mechanically insensitive, the design is immune to focus shift caused by index of refraction variation with temperature.

  16. Extraction of topographic and spectral albedo information from multispectral images.

    USGS Publications Warehouse

    Eliason, P.T.; Soderblom, L.A.; Chavez, P.A., Jr.

    1981-01-01

A technique has been developed to separate and extract spectral-reflectivity variations and topographic information from multispectral images. The process is a completely closed system employing only the image data and can be applied to any digital multispectral data set. -from Authors

  17. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  18. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  19. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. The camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirrors and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera with such a short resolution time becomes possible.

  20. Multispectral rock-type separation and classification.

    SciTech Connect

    Moya, Mary M.; Fogler, Robert Joseph; Paskaleva, Biliana; Hayat, Majeed M.

    2004-06-01

This paper explores the possibility of separating and classifying remotely-sensed multispectral data from rocks and minerals into seven geological rock-type groups. These groups are extracted from the general categories of metamorphic, igneous and sedimentary rocks. The study is performed under ideal conditions for which the data is generated according to laboratory hyperspectral data for the members, which are, in turn, passed through the Multi-spectral Thermal Imager (MTI) filters yielding 15 bands. The main challenge in separability is the small size of the training data sets, which initially did not permit direct application of Bayesian decision theory. To enable Bayesian classification, the original training data is linearly perturbed with the addition of minerals, vegetation, soil, water and other valid impurities. As a result, the size of the training data is significantly increased and accurate estimates of the covariance matrices are achieved. In addition, a set of reduced (five) linearly-extracted canonical features that are optimal in providing the most important information about the data is determined. An alternative nonlinear feature-selection method is also employed based on spectral indices comprising a small subset of all possible ratios between bands. By applying three optimization strategies, combinations of two and three ratios are found that provide reliable separability and classification between all seven groups according to the Bhattacharyya distance. To set a benchmark to which the MTI capability in rock classification can be compared, an optimization strategy is performed for the selection of optimal multispectral filters, other than the MTI filters, and an improvement in classification is predicted.
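
    The ratio-feature separability analysis mentioned above can be sketched as follows: form ratio features from chosen band pairs and score a pair of rock-type groups with the Bhattacharyya distance between their Gaussian models. The band-pair indices and array shapes are assumptions for the example, not the study's selections.

    ```python
    import numpy as np

    def bhattacharyya(mu1, cov1, mu2, cov2):
        """Bhattacharyya distance between two multivariate Gaussian class models."""
        cov = 0.5 * (cov1 + cov2)
        diff = mu1 - mu2
        term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
        _, logdet = np.linalg.slogdet(cov)
        _, logdet1 = np.linalg.slogdet(cov1)
        _, logdet2 = np.linalg.slogdet(cov2)
        term2 = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
        return term1 + term2

    def band_ratios(samples, pairs):
        """Nonlinear features: ratios between selected band pairs (i, j)."""
        return np.stack([samples[:, i] / samples[:, j] for i, j in pairs], axis=1)
    ```

    For two groups with ratio-feature matrices X1 and X2 of shape (samples, features), the separability score would be bhattacharyya(X1.mean(0), np.cov(X1.T), X2.mean(0), np.cov(X2.T)); a search over candidate band pairs then keeps the combinations with the largest distances.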

  1. Multispectral fingerprint imaging for spoof detection

    NASA Astrophysics Data System (ADS)

    Nixon, Kristin A.; Rowe, Robert K.

    2005-03-01

    Fingerprint systems are the most widespread form of biometric authentication. Used in locations such as airports and in PDA's and laptops, fingerprint readers are becoming more common in everyday use. As they become more familiar, the security weaknesses of fingerprint sensors are becoming better known. Numerous websites now exist describing in detail how to create a fake fingerprint usable for spoofing a biometric system from both a cooperative user and from latent prints. While many commercial fingerprint readers claim to have some degree of spoof detection incorporated, they are still generally susceptible to spoof attempts using various artificial fingerprint samples made from gelatin or silicone or other materials and methods commonly available on the web. This paper describes a multispectral sensor that has been developed to collect data for spoof detection. The sensor has been designed to work in conjunction with a conventional optical fingerprint reader such that all images are collected during a single placement of the finger on the sensor. The multispectral imaging device captures sub-surface information about the finger that makes it very difficult to spoof. Four attributes of the finger that are collected with the multispectral imager will be described and demonstrated in this paper: spectral qualities of live skin, chromatic texture of skin, sub-surface image of live skin, and blanching on contact. Each of these attributes is well suited to discriminating against particular kinds of spoofing samples. A series of experiments was conducted to demonstrate the capabilities of the individual attributes as well as the collective spoof detection performance.

  2. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has eight-bit resolution to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  3. A Field Evaluation of Airborne Techniques for Detection of Unexploded Ordnance

    SciTech Connect

    Bell, D.; Doll, W.E.; Hamlett, P.; Holladay, J.S.; Nyquist, J.E.; Smyre, J.; Gamey, T.J.

    1999-03-14

US Defense Department estimates indicate that as many as 11 million acres of government land in the U.S. may contain unexploded ordnance (UXO), with the cost of identifying and disposing of this material estimated at nearly $500 billion. The size and character of the ordnance, types of interference, vegetation, geology, and topography vary from site to site. Because of size or composition, some ordnance is difficult to detect with any geophysical method, even under favorable soil and cultural interference conditions. For some sites, airborne methods may provide the most time- and cost-effective means for detection of UXO. Airborne methods offer lower risk to field crews from proximity to unstable ordnance, and less disturbance of sites that may be environmentally sensitive. Data were acquired over a test site at Edwards AFB, CA using airborne magnetic, electromagnetic, multispectral and thermal sensors. Survey areas included sites where trenches might occur, and a test site in which we placed deactivated ordnance, ranging in size from small 'bomblets' to large bombs. Magnetic data were acquired with the Aerodat HM-3 system, which consists of three cesium magnetometers within booms extending to the front and sides of the helicopter, and mounted such that the helicopter can be flown within 3 m of the surface. Electromagnetic data were acquired with an Aerodat 5-frequency coplanar induction system deployed as a sling load from a helicopter, with a sensor altitude of 15 m. Surface data, acquired at selected sites, provide a comparison with airborne data. Multispectral and thermal data were acquired with a Daedelus AADS 1268 system. Preliminary analysis of the test data demonstrates the value of airborne systems for UXO detection and provides insight into improvements that might make the systems even more effective.

  4. An airborne real-time hyperspectral target detection system

    NASA Astrophysics Data System (ADS)

    Skauli, Torbjorn; Haavardsholm, Trym V.; Kåsen, Ingebjørg; Arisholm, Gunnar; Kavara, Amela; Opsahl, Thomas Olsvik; Skaugen, Atle

    2010-04-01

    An airborne system for hyperspectral target detection is described. The main sensor is a HySpex pushbroom hyperspectral imager for the visible and near-infrared spectral range with 1600 pixels across track, supplemented by a panchromatic line imager. An optional third sensor can be added, either a SWIR hyperspectral camera or a thermal camera. In real time, the system performs radiometric calibration and georeferencing of the images, followed by image processing for target detection and visualization. The current version of the system implements only spectral anomaly detection, based on normal mixture models. Image processing runs on a PC with a multicore Intel processor and an Nvidia graphics processing unit (GPU). The processing runs in a software framework optimized for large sustained data rates. The platform is a Cessna 172 aircraft based close to FFI, modified with a camera port in the floor.
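
    The detection stage described above uses normal mixture models; as a simplified stand-in, the sketch below computes a single-Gaussian (RX-style) anomaly score per pixel, which is the one-component special case of such a model. The cube layout and the regularization constant are assumptions of the example, not details of the described system.

    ```python
    import numpy as np

    def rx_anomaly_score(cube):
        """Per-pixel Mahalanobis distance to a single global background Gaussian.

        cube: ndarray of shape (bands, rows, cols). A one-component background is a
        simplification of the normal mixture model mentioned in the abstract.
        """
        b, r, w = cube.shape
        x = cube.reshape(b, -1).astype(np.float64)
        mu = x.mean(axis=1, keepdims=True)
        xc = x - mu
        cov = xc @ xc.T / xc.shape[1]
        cov += 1e-6 * np.trace(cov) / b * np.eye(b)   # regularize for numerical stability
        scores = np.einsum('ij,ij->j', xc, np.linalg.solve(cov, xc))
        return scores.reshape(r, w)
    ```

    Pixels with scores above a chosen percentile of the scene would be flagged as spectral anomalies; a mixture model generalizes this by taking the minimum distance over several background components.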

  5. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  6. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Experiments conducted in the Atlantic coastal zone indicated that plumes resulting from ocean dumping of acid wastes and sewage sludge have unique spectral characteristics. Remotely sensed wide area synoptic coverage provided information on these pollution features that was not readily available from other sources. Aircraft remotely sensed photographic and multispectral scanner data were interpreted by two methods. First, qualitative analyses in which pollution features were located, mapped, and identified without concurrent sea truth and, second, quantitative analyses in which concurrently collected sea truth was used to calibrate the remotely sensed data and to determine quantitative distributions of one or more parameters in a plume.

  7. Vector anisotropic filter for multispectral image denoising

    NASA Astrophysics Data System (ADS)

    Ben Said, Ahmed; Foufou, Sebti; Hadjidj, Rachid

    2015-04-01

In this paper, we propose an approach to extend the application of anisotropic Gaussian filtering to multispectral image denoising. We study the case of images corrupted with additive Gaussian noise and use the sparse matrix transform for noise covariance matrix estimation. Specifically, we show that if an image has a low local variability, we can make the assumption that in the noisy image, the local variability originates from the noise variance only. We apply the proposed approach to the denoising of multispectral images corrupted by noise and compare it with some existing methods. Results demonstrate an improvement in the denoising performance.
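
    The low-local-variability assumption stated above can be illustrated with a simple per-band sketch in which smoothing is strong where the local variance is explained by the estimated noise variance and weak where image structure dominates. This scalar illustration is not the paper's vector anisotropic filter; the window size, smoothing sigma, and noise-variance input are assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, uniform_filter

    def variance_adaptive_smooth(band, noise_var, win=5, sigma=1.5):
        """Blend a smoothed band with the original according to local variance.

        Where the local variance is close to the noise variance, the smoothed value
        dominates; where it is much larger (edges, texture), the original is kept.
        """
        band = band.astype(np.float64)
        local_mean = uniform_filter(band, win)
        local_var = uniform_filter(band**2, win) - local_mean**2
        # Weight in [0, 1]: 1 where local variability is explained by noise alone.
        weight = np.clip(noise_var / np.maximum(local_var, 1e-12), 0.0, 1.0)
        smoothed = gaussian_filter(band, sigma)
        return weight * smoothed + (1.0 - weight) * band
    ```

    The paper's method operates on the full band vector at each pixel and steers the filter anisotropically; the weighting idea above is only the scalar intuition behind it.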

  8. Multispectral scanner imagery for plant community classification.

    NASA Technical Reports Server (NTRS)

    Driscoll, R. S.; Spencer, M. M.

    1973-01-01

    Optimum channel selection among 12 channels of multispectral scanner imagery identified six as providing the best information for computerized classification of 11 plant communities and two nonvegetation classes. Intensive preprocessing of the spectral data was required to eliminate bidirectional reflectance effects of the spectral imagery caused by scanner view angle and varying geometry of the plant canopy. Generalized plant community types - forest, grassland, and hydrophytic systems - were acceptably classified based on ecological analysis. Serious, but soluble, errors occurred with attempts to classify specific community types within the grassland system. However, special clustering analyses provided for improved classification of specific grassland communities.

  9. Multi-Spectral Solar Telescope Array

    NASA Technical Reports Server (NTRS)

    Walker, Arthur B. C., Jr.; Lindblom, Joakim F.; O'Neal, Ray H.; Allen, Maxwell J.; Barbee, Troy W., Jr.; Hoover, Richard B.

    1990-01-01

This paper describes the design and the characteristics of the Multispectral Solar Telescope Array (MSSTA), a new rocket spectroheliograph to be launched in August 1990. The MSSTA includes five multilayer Ritchey-Chretien telescopes covering the spectral range 150-300 A and eight multilayer Herschelian telescopes covering the spectral range 40-1550 A, making it possible to obtain spectroheliograms over the soft X-ray/extreme UV/FUV spectral range. The MSSTA is expected to obtain information regarding the structure and dynamics of the solar atmosphere in the temperature range 10^4 to 10^7 K.

  10. Multispectral-image fusion using neural networks

    NASA Astrophysics Data System (ADS)

    Kagel, Joseph H.; Platt, C. A.; Donaven, T. W.; Samstad, Eric A.

    1990-08-01

A prototype system is being developed to demonstrate the use of neural network hardware to fuse multispectral imagery. This system consists of a neural network IC on a motherboard, a circuit card assembly, and a set of software routines hosted by a PC-class computer. Research in support of this consists of neural network simulations fusing 4 to 7 bands of Landsat imagery and fusing (separately) multiple bands of synthetic imagery. The simulation results and a description of the prototype system are presented.

  11. Multispectral image fusion using neural networks

    NASA Technical Reports Server (NTRS)

    Kagel, J. H.; Platt, C. A.; Donaven, T. W.; Samstad, E. A.

    1990-01-01

    A prototype system is being developed to demonstrate the use of neural network hardware to fuse multispectral imagery. This system consists of a neural network IC on a motherboard, a circuit card assembly, and a set of software routines hosted by a PC-class computer. Research in support of this consists of neural network simulations fusing 4 to 7 bands of Landsat imagery and fusing (separately) multiple bands of synthetic imagery. The simulations, results, and a description of the prototype system are presented.

  12. Fusion of multisensor, multispectral, and defocused images

    NASA Astrophysics Data System (ADS)

Shahid, Mohd.; Gupta, Sumana

    2005-10-01

Fusion is essentially the extraction of the best of the inputs and its conveyance to the output. In this paper, we present an image fusion technique using the concept of perceptual information across the bands. The algorithm is based on visual sensitivity and is tested by merging multisensor, multispectral, and defocused images. Fusion is achieved through the formation of one fused pyramid using the DWT coefficients from the decomposed pyramids of the source images. The fused image is obtained through the conventional discrete wavelet transform (DWT) reconstruction process. Results obtained using the proposed method show a significant reduction of distortion artifacts and a large preservation of spectral information.
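
    A minimal one-level sketch of DWT-based fusion using PyWavelets is shown below: approximation coefficients are averaged and, in each detail subband, the coefficient with the larger magnitude is kept before inverse reconstruction. The max-absolute selection rule and the db2 wavelet are common simplifications assumed for the example, not necessarily the perceptual rule the authors apply across their decomposed pyramids.

    ```python
    import numpy as np
    import pywt

    def dwt_fuse(img_a, img_b, wavelet='db2'):
        """One-level DWT fusion: average approximations, keep the stronger details."""
        ca_a, (ch_a, cv_a, cd_a) = pywt.dwt2(img_a, wavelet)
        ca_b, (ch_b, cv_b, cd_b) = pywt.dwt2(img_b, wavelet)

        def pick(x, y):
            # Keep whichever coefficient has the larger magnitude (sharper detail).
            return np.where(np.abs(x) >= np.abs(y), x, y)

        fused = (0.5 * (ca_a + ca_b),
                 (pick(ch_a, ch_b), pick(cv_a, cv_b), pick(cd_a, cd_b)))
        return pywt.idwt2(fused, wavelet)
    ```

    In practice the decomposition is carried to several levels and the selection rule can be weighted by a visual-sensitivity measure, which is the direction the abstract describes.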

  13. A multisensor system for airborne surveillance of oil pollution

    NASA Technical Reports Server (NTRS)

    Edgerton, A. T.; Ketchal, R.; Catoe, C.

    1973-01-01

    The U.S. Coast Guard is developing a prototype airborne oil surveillance system for use in its Marine Environmental Protection Program. The prototype system utilizes an X-band side-looking radar, a 37-GHz imaging microwave radiometer, a multichannel line scanner, and a multispectral low light level system. The system is geared to detecting and mapping oil spills and potential pollution violators anywhere within a 25 nmi range of the aircraft flight track under all but extreme weather conditions. The system provides for false target discrimination and maximum identification of spilled materials. The system also provides an automated detection alarm, as well as a color display to achieve maximum coupling between the sensor data and the equipment operator.

  14. High spectral resolution airborne short wave infrared hyperspectral imager

    NASA Astrophysics Data System (ADS)

    Wei, Liqing; Yuan, Liyin; Wang, Yueming; Zhuang, Xiaoqiong

    2016-05-01

Short-wave infrared (SWIR) spectral imagers are good at detecting differences between materials and at penetrating fog and mist. High spectral resolution SWIR hyperspectral imagers play a key role in developing earth observing technology. A hyperspectral data cube can support band selection, which is very important for multispectral imager design. Up to now, the spectral resolution of many SWIR hyperspectral imagers has been about 10 nm. A high sensitivity airborne SWIR hyperspectral imager with narrower spectral bands will be presented. The system consists of a TMA telescope, slit, spectrometer with a planar blazed grating, and a high sensitivity MCT FPA. The spectral sampling interval is about 3 nm. The IFOV is 0.5 mrad. To eliminate the influence of the thermal background, a cold shield is designed in the dewar. The pixel number in the spatial dimension is 640. Performance measurement in the laboratory and image analysis for the flight test will also be presented.

  15. Multiresolution processing for fractal analysis of airborne remotely sensed data

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Quattrochi, D.; Lam, N.

    1992-01-01

    Images acquired by NASA's Calibrated Airborne Multispectral Scanner are used to compute the fractal dimension as a function of spatial resolution. Three methods are used to determine the fractal dimension: Shelberg's (1982, 1983) line-divider method, the variogram method, and the triangular prism method. A description of these methods and the result of applying these methods to a remotely-sensed image is also presented. The scanner data was acquired over western Puerto Rico in January, 1990 over land and water. The aim is to study impacts of man-induced changes on land that affect sedimentation into the near-shore environment. The data were obtained over the same area at three different pixel sizes: 10 m, 20 m, and 30 m.
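
    Of the three methods listed, the variogram method is compact enough to sketch: the variogram gamma(h) of a gridded surface scales as h^(2H), so the fractal dimension follows from the log-log slope as D = 3 - H. The maximum lag and the use of only horizontal and vertical pixel pairs are simplifying assumptions of the example, not the authors' exact procedure.

    ```python
    import numpy as np

    def variogram_fractal_dimension(z, max_lag=10):
        """Estimate the fractal dimension of a gridded surface via the variogram method."""
        lags, gammas = [], []
        for h in range(1, max_lag + 1):
            dz_x = z[:, h:] - z[:, :-h]          # horizontal pixel pairs at lag h
            dz_y = z[h:, :] - z[:-h, :]          # vertical pixel pairs at lag h
            gamma = 0.5 * np.mean(np.concatenate([dz_x.ravel()**2, dz_y.ravel()**2]))
            lags.append(h)
            gammas.append(gamma)
        slope, _ = np.polyfit(np.log(lags), np.log(gammas), 1)
        H = slope / 2.0
        return 3.0 - H
    ```

    Applying such an estimator to the same scene resampled at 10 m, 20 m, and 30 m pixels is how the dependence of fractal dimension on spatial resolution described in the abstract would be examined.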

16. Modeling of estuarine chlorophyll a from an airborne scanner

    USGS Publications Warehouse

    Khorram, Siamak; Catts, Glenn P.; Cloern, James E.; Knight, Allen W.

    1987-01-01

Near-simultaneous collection of 34 surface water samples and airborne multispectral scanner data provided input for regression models developed to predict surface concentrations of estuarine chlorophyll a. Two wavelength ratios were employed in model development. The ratios were chosen to capitalize on the spectral characteristics of chlorophyll a, while minimizing atmospheric influences. Models were then applied to data previously acquired over the study area three years earlier. Results are in the form of color-coded displays of predicted chlorophyll a concentrations and comparisons of the agreement between measured surface samples and predictions based on coincident remotely sensed data. The influence of large variations in fresh-water inflow to the estuary is clearly apparent in the results. The synoptic view provided by remote sensing is another method of examining important estuarine dynamics that are difficult to observe from in situ sampling alone.
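
    The regression step described above amounts to fitting a linear model of measured chlorophyll a against the radiance ratio of two wavelengths at the ground-truth stations and then applying the fit to whole-scene band images. The sketch below assumes NumPy arrays of station radiances and measured concentrations; the specific band pair, units, and functional form are not taken from the paper.

    ```python
    import numpy as np

    def fit_chla_model(band_num, band_den, chla):
        """Least-squares fit of chlorophyll a as a linear function of a band ratio.

        band_num, band_den: station radiances at the two chosen wavelengths;
        chla: coincident measured chlorophyll a concentrations.
        """
        ratio = band_num / band_den
        slope, intercept = np.polyfit(ratio, chla, 1)
        return slope, intercept

    def predict_chla(scene_num, scene_den, slope, intercept):
        """Apply the fitted model to whole-scene band images to map chlorophyll a."""
        return slope * (scene_num / scene_den) + intercept
    ```

    The resulting prediction image is what would be color-coded for display and compared against the surface samples, as the abstract describes.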

  17. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to inspire from human vision bio-mechanics to improve robotic capabilities for tasks such as objects detection and tracking. This work describes first the bio-mechanical discrepancies between human vision and classic cameras and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles on a 3-camera optical, mechanical and software model of the human eyes and associated bio-inspired attention model.

  18. Bandpass filter arrays patterned by photolithography for multispectral remote sensing

    NASA Astrophysics Data System (ADS)

    Bauer, T.; Thome, Heidi; Eisenhammer, Thomas

    2014-10-01

Optical remote sensing of the earth from air and space typically utilizes several channels from the visible (VIS) and near infrared (NIR) up to the short wave infrared (SWIR) spectral region. Thin-film optical filters are applied to select these channels. Filter wheels and arrays of discrete stripe filters are standard configurations. To achieve compact and lightweight camera designs, multi-channel filter plates or assemblies can be mounted close to the electronic detectors. Optics Balzers has implemented a micro-structuring process based on a sequence of multiple coatings and photolithography on the same substrate. High-performance bandpass filters are applied by plasma assisted evaporation (plasma IAD) with advanced plasma source (APS) technology and optical broad-band monitoring (BBM). This technology has already been proven for various multispectral imager (MSI) configurations on fused silica, sapphire and other substrates for remote sensing applications. The optical filter design and performance are limited by the maximum coating thickness that can be micro-structured by photolithographic lift-off processes and by the thermal and radiation load on the photoresist mask during the process. Recent progress in image resolution and sensor selectivity requires improvements of optical filter performance. Blocking in the UV and NIR and in between the spectral channels, in-band transmission, and filter edge steepness are subjects of current development. Technological limits of the IAD coating accuracy can be overcome by more precise coating technologies like plasma assisted reactive magnetron sputtering (PARMS) in combination with optical broadband monitoring (BBM). We present an overview of concepts and technologies for bandpass filter arrays for multispectral imaging at Optics Balzers. Recent performance improvements of filter arrays made by micro-structuring will be presented.

  19. Rigorous Georeferencing of ALSAT-2A Panchromatic and Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Boukerch, I.; Hadeid, M.; Mahmoudi, R.; Takarli, B.; Hasni, K.

    2013-04-01

The exploitation of the full geometric capabilities of High-Resolution Satellite Imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; generally there are two categories of geometric models: physical and empirical models. Based on the analysis of the metadata provided with ALSAT-2A, a rigorous pushbroom camera model can be developed. This model has been successfully applied to many very high resolution imagery systems. The relation between image and ground coordinates through the time-dependent collinearity equations, involving several coordinate systems, has been tested. The interior orientation parameters must be integrated in the model; they can be estimated from the viewing angles corresponding to the pointing directions of any detector, and these values are derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements with 33 unknowns. Approximate values of the 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or they are simply fixed to zero; the condition equation is then linearized and solved using SVD in a least-squares sense in order to correct the initial values using a suitable number of well-distributed GCPs. Using ALSAT-2A images over the town of Toulouse in the southwest of France, three experiments are done. The first concerns 2D accuracy analysis using several sets of parameters. The second concerns GCP number and distribution. The third experiment concerns georeferencing the multispectral image by applying the model calculated from the panchromatic image.
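
    The adjustment outlined above (linearize the collinearity condition equations, solve in a least-squares sense with SVD, and iterate against GCP observations) can be sketched generically as a Gauss-Newton loop with an SVD pseudo-inverse. The residual and Jacobian callables stand in for the 33-parameter pushbroom model and are assumptions of the example, not the paper's implementation.

    ```python
    import numpy as np

    def svd_least_squares_adjust(params, residual_fn, jacobian_fn, iters=10, tol=1e-8):
        """Iteratively refine orientation parameters from GCP observations.

        residual_fn(params) -> misclosure vector of the condition equations;
        jacobian_fn(params) -> design matrix of partial derivatives w.r.t. the parameters.
        """
        p = np.asarray(params, dtype=float)
        for _ in range(iters):
            r = residual_fn(p)
            A = jacobian_fn(p)
            u, s, vt = np.linalg.svd(A, full_matrices=False)
            s_inv = np.where(s > s.max() * 1e-12, 1.0 / s, 0.0)   # guard tiny singular values
            dp = vt.T @ (s_inv * (u.T @ -r))                      # SVD pseudo-inverse solution
            p = p + dp
            if np.linalg.norm(dp) < tol:
                break
        return p
    ```

    Discarding near-zero singular values is what keeps the solution stable when some of the 33 parameters are weakly determined by the available GCPs.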

  20. Real-time multispectral imaging system for online poultry fecal inspection using UML

    NASA Astrophysics Data System (ADS)

    Park, Bosoon; Kise, Michio; Lawrence, Kurt C.; Windham, William R.; Smith, Douglas P.; Thai, Chi N.

    2006-10-01

A prototype real-time multispectral imaging system for fecal and ingesta contaminant detection on broiler carcasses has been developed. The prototype system includes a common aperture camera with three optical trim filters (517, 565 and 802-nm wavelength), which were selected by visible/NIR spectroscopy and validated by a hyperspectral imaging system with a decision tree algorithm. The on-line testing results showed that the multispectral imaging technique can be used effectively for detecting feces (from duodenum, ceca, and colon) and ingesta on the surface of poultry carcasses with a processing speed of 140 birds per minute. This paper demonstrates both the multispectral imaging hardware and the real-time image processing software. For the software development, the Unified Modeling Language (UML) design approach was used for the on-line application. The UML models included class, object, activity, sequence, and collaboration diagrams. The user interface model included seventeen inputs and six outputs. The window-based real-time image processing software was composed of eleven components, which represented class, architecture, and activity. Both hardware and software for real-time fecal detection were tested at the pilot-scale poultry processing plant. The run-time of the software, including online calibration, was fast enough to inspect carcasses on-line at the industry-required rate. Based on the preliminary test at the pilot-scale processing line, the system was able to acquire poultry images in real time. According to the test results, the imaging system is reliable in harsh processing environments, and the UML-based image processing software is flexible and easy to update when additional parameters are needed for in-plant trials.
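
    As a purely hypothetical illustration of a per-pixel rule over the three listed wavelengths, the sketch below flags pixels whose 565/517 nm ratio exceeds a threshold while the 802 nm band shows sufficient brightness. The thresholds, the ratio choice, and the function name are invented for illustration; the actual system applies a decision tree derived from hyperspectral measurements.

    ```python
    import numpy as np

    def fecal_mask(b517, b565, b802, ratio_thresh=1.05, brightness_thresh=0.02):
        """Flag candidate contaminant pixels from three co-registered band images.

        The ratio and brightness thresholds are placeholders, not the values
        used by the authors' trained decision tree.
        """
        ratio = b565 / np.maximum(b517, 1e-6)       # avoid division by zero
        bright_enough = b802 > brightness_thresh    # suppress dark background pixels
        return (ratio > ratio_thresh) & bright_enough
    ```

    In a real-time pipeline such a per-pixel test would run on each frame after calibration, with the binary mask overlaid on the carcass image for the inspector or rejection mechanism.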