Science.gov

Sample records for airborne multispectral camera

  1. Design and implementation of digital airborne multispectral camera system

    NASA Astrophysics Data System (ADS)

    Lin, Zhaorong; Zhang, Xuguo; Wang, Li; Pan, Deai

    2012-10-01

    Multispectral imaging equipment is a new generation of remote sensor that can obtain the target image and its spectral information simultaneously. A digital airborne multispectral camera system using the discrete-filter method was designed and implemented for unmanned aerial vehicle (UAV) and manned aircraft platforms. The system offers a larger frame, higher resolution, and both panchromatic and multispectral imaging, and it has great potential in environmental and agricultural monitoring and in target detection and discrimination. To enhance the precision and accuracy of position and orientation measurement, an Inertial Measurement Unit (IMU) is integrated in the camera. A Temperature Control Unit (TCU) keeps the camera operating normally at different altitudes, avoiding the window fogging and frosting that would greatly degrade imaging quality. Finally, flight experiments were conducted to demonstrate the functionality and performance of the digital airborne multispectral camera; its resolution capability, positioning accuracy, and classification and recognition ability were validated.

  2. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  3. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters.

    PubMed

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-07-20

    This paper describes an airborne high-resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible band images and near-infrared band images in cases lacking manmade objects, we presented an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high quality multispectral images and the band-to-band alignment error of the composed multispectral images is less than 2.5 pixels.
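    The registration pipeline summarized above (SIFT feature matching, RANSAC homography estimation, warping of each band onto a reference band) can be sketched in a few lines of Python with OpenCV. The file names, the ratio-test threshold and the RANSAC reprojection tolerance below are illustrative assumptions, not values from the paper.

      # Sketch: align a near-infrared band image to a reference visible band image
      # with SIFT features and a RANSAC-estimated homography (OpenCV >= 4.4).
      import cv2
      import numpy as np

      ref = cv2.imread("band_green.tif", cv2.IMREAD_GRAYSCALE)  # hypothetical file names
      mov = cv2.imread("band_nir.tif", cv2.IMREAD_GRAYSCALE)

      sift = cv2.SIFT_create()
      kp_ref, des_ref = sift.detectAndCompute(ref, None)
      kp_mov, des_mov = sift.detectAndCompute(mov, None)

      # Match descriptors and keep good matches via Lowe's ratio test.
      matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_mov, des_ref, k=2)
      good = [m for m, n in matches if m.distance < 0.75 * n.distance]

      src = np.float32([kp_mov[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
      dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

      # Robustly estimate the band-to-band homography and warp the moving band.
      H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
      registered = cv2.warpPerspective(mov, H, (ref.shape[1], ref.shape[0]))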

  4. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters

    PubMed Central

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-01-01

    This paper describes an airborne high-resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible band images and near-infrared band images in cases lacking manmade objects, we presented an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high quality multispectral images and the band-to-band alignment error of the composed multispectral images is less than 2.5 pixels. PMID:26205264

  5. Airborne multispectral identification of individual cotton plants using consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although multispectral remote sensing using consumer-grade cameras has successfully identified fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants. The imaging sensors of consumer-grade cameras are based on a Bayer patter...

  6. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
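    The bivariate model described above amounts to ordinary least squares with two predictors (ground-truthed Chl a and true color) and upwelling spectral energy as the response, with R² as the figure of merit. A minimal sketch of that calculation is given below; all numbers are placeholders, not data from the study.

      # Sketch: bivariate linear regression of upwelling spectral energy on
      # chlorophyll a and true color, reporting the coefficient of determination.
      import numpy as np

      chl_a      = np.array([4.0, 6.5, 7.2, 10.1, 12.3])      # hypothetical Chl a (ug/L)
      true_color = np.array([25.0, 30.0, 28.0, 40.0, 45.0])   # hypothetical color values
      upwelling  = np.array([1.10, 1.25, 1.32, 1.45, 1.60])   # hypothetical radiances

      # Design matrix with an intercept column; solve by ordinary least squares.
      X = np.column_stack([np.ones_like(chl_a), chl_a, true_color])
      coef, *_ = np.linalg.lstsq(X, upwelling, rcond=None)

      pred = X @ coef
      r2 = 1 - np.sum((upwelling - pred) ** 2) / np.sum((upwelling - upwelling.mean()) ** 2)
      print(coef, r2)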

  7. Highly Portable Airborne Multispectral Imaging System

    NASA Technical Reports Server (NTRS)

    Lehnemann, Robert; Mcnamee, Todd

    2001-01-01

    A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1.5 to 1 km. The system was developed especially for use in coastal environments and is well suited for performing remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.

  8. An integrated compact airborne multispectral imaging system using embedded computer

    NASA Astrophysics Data System (ADS)

    Zhang, Yuedong; Wang, Li; Zhang, Xuguo

    2015-08-01

    An integrated compact airborne multispectral imaging system with an embedded-computer-based control system was developed for multispectral imaging from small aircraft. The system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system) and an embedded computer. The embedded computer offers excellent universality and expandability and has advantages in volume and weight for an airborne platform, so it can meet the requirements of the control system of the integrated airborne multispectral imaging system. It controls the camera parameter settings, the operation of the filter wheel and stabilized platform, and the acquisition and storage of image and POS data. Peripheral devices can be connected through the ports of the embedded computer, which makes system operation and management of the stored image data easy. This airborne multispectral imaging system has the advantages of small volume, multiple functions and good expandability. Imaging experiment results show that the system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.

  9. Airborne system for testing multispectral reconnaissance technologies

    NASA Astrophysics Data System (ADS)

    Schmitt, Dirk-Roger; Doergeloh, Heinrich; Keil, Heiko; Wetjen, Wilfried

    1999-07-01

    There is an increasing demand for future airborne reconnaissance systems that can obtain aerial images for tactical or peacekeeping operations. Unmanned Aerial Vehicles (UAVs) equipped with multispectral sensor systems and real-time, jam-resistant data transmission capabilities are of particular interest. An airborne experimental platform has been developed as a testbed to investigate different reconnaissance system concepts before their application in UAVs. It is based on a Dornier DO 228 aircraft, which serves as the flying platform. Great care has been taken to allow testing of different kinds of multispectral sensors; the platform can be equipped with an IR sensor head, high-resolution aerial cameras covering the whole optical spectrum, and radar systems. The onboard equipment further includes systems for digital image processing, compression, coding, and storage. The data are RF transmitted to the ground station using highly jam-resistant technologies. The images, after merging with enhanced vision components, are delivered to the observer, who has an uplink data channel available to control flight and imaging parameters.

  10. Airborne system for multispectral, multiangle polarimetric imaging.

    PubMed

    Bowles, Jeffrey H; Korwan, Daniel R; Montes, Marcos J; Gray, Deric J; Gillis, David B; Lamela, Gia M; Miller, W David

    2015-11-01

    In this paper, we describe the design, fabrication, calibration, and deployment of an airborne multispectral polarimetric imager. The motivation for the development of this instrument was to explore its ability to provide information about water constituents, such as particle size and type. The instrument is based on four 16 MP cameras and uses wire grid polarizers (aligned at 0°, 45°, 90°, and 135°) to provide the separation of the polarization states. A five-position filter wheel provides for four narrow-band spectral filters (435, 550, 625, and 750 nm) and one blocked position for dark-level measurements. When flown, the instrument is mounted on a programmable stage that provides control of the view angles. View angles that range to ±65° from the nadir have been used. Data processing provides a measure of the polarimetric signature as a function of both the view zenith and view azimuth angles. As a validation of our initial results, we compare our measurements, over water, with the output of a Monte Carlo code, both of which show neutral points off the principal plane. The locations of the calculated and measured neutral points are compared. The random error level in the measured degree of linear polarization (8% at 435 nm) is shown to be better than 0.25%.
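    With the four polarizer orientations listed above (0°, 45°, 90° and 135°), the linear Stokes parameters and the degree of linear polarization follow directly from the per-pixel intensities. A minimal sketch, assuming the four channel images are already co-registered and radiometrically matched, is shown below.

      # Sketch: degree and angle of linear polarization from four co-registered
      # images taken behind 0/45/90/135 degree wire-grid polarizers.
      import numpy as np

      def linear_polarization(i0, i45, i90, i135):
          """Return (DoLP, AoLP) computed per pixel from the four channels."""
          s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
          s1 = i0 - i90                        # Stokes Q
          s2 = i45 - i135                      # Stokes U
          dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
          aolp = 0.5 * np.arctan2(s2, s1)
          return dolp, aolp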

  11. An airborne four-camera imaging system for agricultural applications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

  12. Airborne multispectral detection of regrowth cotton fields

    NASA Astrophysics Data System (ADS)

    Westbrook, John K.; Suh, Charles P.-C.; Yang, Chenghai; Lan, Yubin; Eyster, Ritchie S.

    2015-01-01

    Effective methods are needed for timely areawide detection of regrowth cotton plants because boll weevils (a quarantine pest) can feed and reproduce on these plants beyond the cotton production season. Airborne multispectral images of regrowth cotton plots were acquired on several dates after three shredding (i.e., stalk destruction) dates. Linear spectral unmixing (LSU) classification was applied to high-resolution airborne multispectral images of regrowth cotton plots to estimate the minimum detectable size and subsequent growth of plants. We found that regrowth cotton fields can be identified when the mean plant width is ˜0.2 m for an image resolution of 0.1 m. LSU estimates of canopy cover of regrowth cotton plots correlated well (r2=0.81) with the ratio of mean plant width to row spacing, a surrogate measure of plant canopy cover. The height and width of regrowth plants were both well correlated (r2=0.94) with accumulated degree-days after shredding. The results will help boll weevil eradication program managers use airborne multispectral images to detect and monitor the regrowth of cotton plants after stalk destruction, and identify fields that may require further inspection and mitigation of boll weevil infestations.
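    Linear spectral unmixing treats each pixel spectrum as a non-negative mixture of endmember spectra (for example, cotton canopy and bare soil) and solves for their abundances. A minimal per-pixel sketch with SciPy is given below; the endmember spectra are assumed inputs, not values from the study.

      # Sketch: linear spectral unmixing of a multispectral image by
      # nonnegative least squares, one pixel at a time.
      import numpy as np
      from scipy.optimize import nnls

      def unmix(image, endmembers):
          """image: (rows, cols, bands); endmembers: (bands, n_endmembers).
          Returns abundance maps with shape (rows, cols, n_endmembers)."""
          rows, cols, bands = image.shape
          pixels = image.reshape(-1, bands)
          abundances = np.empty((pixels.shape[0], endmembers.shape[1]))
          for i, spectrum in enumerate(pixels):
              abundances[i], _ = nnls(endmembers, spectrum)
          return abundances.reshape(rows, cols, -1)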

  13. Astronaut Jack Lousma works at Multispectral camera experiment

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Astronaut Jack R. Lousma, Skylab 3 pilot, works at the S190A multispectral camera experiment in the Multiple Docking Adapter (MDA), seen from a color television transmission made by a TV camera aboard the Skylab space station cluster in Earth orbit. Lousma later used a small brush to clean the six lenses of the multispectral camera.

  14. ASPIS, A Flexible Multispectral System for Airborne Remote Sensing Environmental Applications

    PubMed Central

    Papale, Dario; Belli, Claudio; Gioli, Beniamino; Miglietta, Franco; Ronchi, Cesare; Vaccari, Francesco Primo; Valentini, Riccardo

    2008-01-01

    Airborne multispectral and hyperspectral remote sensing is a powerful tool for environmental monitoring applications. In this paper we describe a new system (ASPIS) composed of a 4-CCD spectral sensor, a thermal IR camera and a laser altimeter that is mounted on a flexible Sky-Arrow airplane. A test application of the multispectral sensor to estimate durum wheat quality is also presented. PMID:27879875

  15. Spatial Modeling and Variability Analysis for Modeling and Prediction of Soil and Crop Canopy Coverage Using Multispectral Imagery from an Airborne Remote Sensing System

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Based on a previous study on an airborne remote sensing system with automatic camera stabilization for crop management, multispectral imagery was acquired using the MS-4100 multispectral camera at different flight altitudes over a 115 ha cotton field. After the acquired images were geo-registered an...

  16. Aerosol Remote Sensing Applications for Airborne Multiangle, Multispectral Shortwave Radiometers

    NASA Astrophysics Data System (ADS)

    von Bismarck, Jonas; Ruhtz, Thomas; Starace, Marco; Hollstein, André; Preusker, René; Fischer, Jürgen

    2010-05-01

    Aerosol particles have an important impact on the surface net radiation budget by direct scattering and absorption (direct aerosol effect) of solar radiation, and also by influencing cloud formation processes (semi-direct and indirect aerosol effects). To study the former, a number of multispectral sky- and sunphotometers have been developed at the Institute for Space Sciences of the Free University of Berlin in the past two decades. The latest operational developments were the multispectral aureole- and sunphotometer FUBISS-ASA2, the zenith radiometer FUBISS-ZENITH, and the nadir polarimeter AMSSP-EM, all designed for flexible use on moving platforms like aircraft or ships. Currently the multiangle, multispectral radiometer URMS/AMSSP (Universal Radiation Measurement System/Airborne Multispectral Sunphotometer and Polarimeter) is under construction for a Wing-Pod of the high altitude research aircraft HALO operated by DLR. The system is expected to have its first mission on HALO in 2011. The algorithms for the retrieval of aerosol and trace gas properties from the recorded multidirectional, multispectral radiation measurements allow more than the derivation of standard products such as the aerosol optical depth and the Angstrom exponent. The radiation measured in the solar aureole contains information about the aerosol phase function and therefore allows conclusions about the particle type. Furthermore, airborne instrument operation allows vertically resolved measurements. An inversion algorithm, based on radiative transfer simulations and additionally including measured vertical zenith-radiance profiles, allows conclusions about the aerosol single scattering albedo and the relative soot fraction in aerosol layers. Ozone column retrieval is performed by evaluating measurements from pixels in the Chappuis absorption band. A retrieval algorithm to derive the water-vapor column from the sunphotometer measurements is currently under development. Of the various airborne
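    Among the standard products mentioned above, the Angstrom exponent follows directly from aerosol optical depth measured at two wavelengths. A short worked example is given below; the optical depth values are placeholders.

      # Sketch: Angstrom exponent from aerosol optical depth at two wavelengths.
      import numpy as np

      def angstrom_exponent(tau1, tau2, lambda1_nm, lambda2_nm):
          """alpha = -ln(tau1 / tau2) / ln(lambda1 / lambda2)."""
          return -np.log(tau1 / tau2) / np.log(lambda1_nm / lambda2_nm)

      # Placeholder values: AOD 0.25 at 440 nm and 0.15 at 870 nm.
      print(angstrom_exponent(0.25, 0.15, 440.0, 870.0))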

  17. Multispectral Airborne Laser Scanning for Automated Map Updating

    NASA Astrophysics Data System (ADS)

    Matikainen, Leena; Hyyppä, Juha; Litkey, Paula

    2016-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with multispectral information from aerial images, has shown its high feasibility for automated mapping processes. Recently, the first multispectral airborne laser scanners have been launched, and multispectral information is for the first time directly available for 3D ALS point clouds. This article discusses the potential of this new single-sensor technology in map updating, especially in automated object detection and change detection. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from a random forests analysis suggest that the multispectral intensity information is useful for land cover classification, also when considering ground surface objects and classes, such as roads. An out-of-bag estimate for classification error was about 3% for separating the classes asphalt, gravel, rocky areas and low vegetation from each other. For buildings and trees, it was under 1%. According to feature importance analyses, multispectral features based on several channels were more useful than those based on one channel. Automatic change detection utilizing the new multispectral ALS data, an old digital surface model (DSM) and old building vectors was also demonstrated. Overall, our first analyses suggest that the new data are very promising for further increasing the automation level in mapping. The multispectral ALS technology is independent of external illumination conditions, and intensity images produced from the data do not include shadows. These are significant advantages when the development of automated classification and change detection procedures is considered.
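    A random forests analysis with an out-of-bag error estimate, as used above for the multispectral intensity and height features, is straightforward to reproduce with scikit-learn. The feature and label arrays below are random placeholders standing in for per-point or per-cell attributes.

      # Sketch: land cover classification with a random forest and an
      # out-of-bag (OOB) error estimate, plus per-feature importances.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      X = np.random.rand(1000, 6)             # hypothetical features (channel intensities, height, ...)
      y = np.random.randint(0, 6, size=1000)  # hypothetical labels (asphalt, gravel, rock, ...)

      rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
      rf.fit(X, y)

      print("OOB error:", 1.0 - rf.oob_score_)
      print("feature importances:", rf.feature_importances_)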

  18. A high-resolution airborne four-camera imaging system for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

  19. A tiny VIS-NIR snapshot multispectral camera

    NASA Astrophysics Data System (ADS)

    Geelen, Bert; Blanch, Carolina; Gonzalez, Pilar; Tack, Nicolaas; Lambrechts, Andy

    2015-03-01

    Spectral imaging can reveal a lot of hidden details about the world around us, but is currently confined to laboratory environments due to the need for complex, costly and bulky cameras. Imec has developed a unique spectral sensor concept in which the spectral unit is monolithically integrated on top of a standard CMOS image sensor at wafer level, hence enabling the design of compact, low cost and high acquisition speed spectral cameras with a high design flexibility. This flexibility has previously been demonstrated by Imec in the form of three spectral camera architectures: firstly a high spatial and spectral resolution scanning camera, secondly a multichannel snapshot multispectral camera and thirdly a per-pixel mosaic snapshot spectral camera. These snapshot spectral cameras sense an entire multispectral data cube at one discrete point in time, extending the domain of spectral imaging towards dynamic, video-rate applications. This paper describes the integration of our per-pixel mosaic snapshot spectral sensors inside a tiny, portable and extremely user-friendly camera. Our prototype demonstrator cameras can acquire multispectral image cubes, either of 272 × 512 pixels over 16 bands in the VIS (470-620 nm) or of 217 × 409 pixels over 25 bands in the VNIR (600-900 nm), at 170 cubes per second for normal machine vision illumination levels. The cameras themselves are extremely compact, based on Ximea xiQ cameras measuring only 26 × 26 × 30 mm, and can be operated from a laptop-based USB3 connection, making them easily deployable in very diverse environments.

  20. Trajectory association across multiple airborne cameras.

    PubMed

    Sheikh, Yaser Ajmal; Shah, Mubarak

    2008-02-01

    A camera mounted on an aerial vehicle provides an excellent means for monitoring large areas of a scene. Utilizing several such cameras on different aerial vehicles allows further flexibility, in terms of increased visual scope and in the pursuit of multiple targets. In this paper, we address the problem of associating objects across multiple airborne cameras. Since the cameras are moving and often widely separated, direct appearance-based or proximity-based constraints cannot be used. Instead, we exploit geometric constraints on the relationship between the motion of each object across cameras, to test multiple association hypotheses, without assuming any prior calibration information. Given our scene model, we propose a likelihood function for evaluating a hypothesized association between observations in multiple cameras that is geometrically motivated. Since multiple cameras exist, ensuring coherency in association is an essential requirement, e.g. that transitive closure is maintained between more than two cameras. To ensure such coherency we pose the problem of maximizing the likelihood function as a k-dimensional matching and use an approximation to find the optimal assignment of association. Using the proposed error function, canonical trajectories of each object and optimal estimates of inter-camera transformations (in a maximum likelihood sense) are computed. Finally, we show that as a result of associating objects across the cameras, a concurrent visualization of multiple aerial video streams is possible and that, under special conditions, trajectories interrupted due to occlusion or missing detections can be repaired. Results are shown on a number of real and controlled scenarios with multiple objects observed by multiple cameras, validating our qualitative models, and through simulation quantitative performance is also reported.
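    For the two-camera case, the maximum-likelihood association described above reduces to a bipartite assignment over a matrix of pairwise likelihoods, which can be solved optimally with the Hungarian algorithm; the general k-camera matching needs the approximation the authors describe. The sketch below uses a hypothetical log-likelihood matrix, not values from the paper.

      # Sketch: optimal two-camera track association as a linear assignment
      # over (negated) log-likelihoods of hypothesized correspondences.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      log_likelihood = np.array([   # log P(track i in camera A <-> track j in camera B)
          [-1.2, -5.0, -4.1],
          [-4.8, -0.9, -3.7],
          [-3.9, -4.4, -1.1],
      ])

      # Maximizing total log-likelihood = minimizing its negation.
      rows, cols = linear_sum_assignment(-log_likelihood)
      print(list(zip(rows.tolist(), cols.tolist())))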

  1. Camera system for multispectral imaging of documents

    NASA Astrophysics Data System (ADS)

    Christens-Barry, William A.; Boydston, Kenneth; France, Fenella G.; Knox, Keith T.; Easton, Roger L., Jr.; Toth, Michael B.

    2009-02-01

    A spectral imaging system comprising a 39-Mpixel monochrome camera, LED-based narrowband illumination, and acquisition/control software has been designed for investigations of cultural heritage objects. Notable attributes of this system, referred to as EurekaVision, include: streamlined workflow, flexibility, provision of well-structured data and metadata for downstream processing, and illumination that is safer for the artifacts. The system design builds upon experience gained while imaging the Archimedes Palimpsest and has been used in studies of a number of important objects in the LOC collection. This paper describes practical issues that were considered by EurekaVision to address key research questions for the study of fragile and unique cultural objects over a range of spectral bands. The system is intended to capture important digital records for access by researchers, professionals, and the public. The system was first used for spectral imaging of the 1507 world map by Martin Waldseemueller, the first printed map to reference "America." It was also used to image sections of the Carta Marina 1516 map by the same cartographer for comparative purposes. An updated version of the system is now being utilized by the Preservation Research and Testing Division of the Library of Congress.

  2. Multispectral synthesis of daylight using a commercial digital CCD camera.

    PubMed

    Nieves, Juan L; Valero, Eva M; Nascimento, Sérgio M C; Hernández-Andrés, Javier; Romero, Javier

    2005-09-20

    The performance of multispectral devices in recovering spectral data has been intensively investigated in some applications, such as the spectral characterization of art paintings, but has received little attention in the context of spectral characterization of natural illumination. This study investigated the quality of the spectral estimation of daylight-type illuminants using a commercial digital CCD camera and a set of broadband colored filters. Several recovery algorithms were tested that require neither information about the spectral sensitivities of the camera sensors nor eigenvectors to describe the spectra. Tests were carried out both with virtual data, using simulated camera responses, and with real data obtained from actual measurements. It was found that it is possible to recover daylight spectra with high spectral and colorimetric accuracy with a reduced number of three to nine spectral bands.
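    One family of recovery algorithms that needs neither sensor sensitivities nor eigenvectors is a direct linear mapping from camera responses to spectra, learned from a training set by pseudoinverse. The sketch below illustrates that idea with random placeholder data; it is not necessarily the specific algorithm evaluated in the paper.

      # Sketch: learned linear recovery of illuminant spectra from multi-filter
      # camera responses, using a training-set pseudoinverse.
      import numpy as np

      # Training pairs: camera responses (n_samples, n_channels) and the
      # corresponding measured spectra (n_samples, n_wavelengths).
      train_responses = np.random.rand(200, 9)
      train_spectra = np.random.rand(200, 161)   # e.g. 380-780 nm sampled every 2.5 nm

      # Recovery matrix W such that spectra ~ responses @ W.
      W = np.linalg.pinv(train_responses) @ train_spectra

      # Recover the spectrum behind a new response vector.
      test_response = np.random.rand(1, 9)
      recovered_spectrum = test_response @ W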

  3. Airborne multisensor pod system (AMPS) data: Multispectral data integration and processing hints

    SciTech Connect

    Leary, T.J.; Lamb, A.

    1996-11-01

    The Department of Energy's Office of Arms Control and Non-Proliferation (NN-20) has developed a suite of airborne remote sensing systems that simultaneously collect coincident data from a US Navy P-3 aircraft. The primary objective of the Airborne Multisensor Pod System (AMPS) Program is "to collect multisensor data that can be used for data research, both to reduce interpretation problems associated with data overload and to develop information products more complete than can be obtained from any single sensor." The sensors are housed in wing-mounted pods and include: a Ku-Band Synthetic Aperture Radar; a CASI Hyperspectral Imager; a Daedalus 3600 Airborne Multispectral Scanner; a Wild Heerbrugg RC-30 motion compensated large format camera; various high resolution, light intensified and thermal video cameras; and several experimental sensors (e.g. the Portable Hyperspectral Imager for Low-Light Spectroscopy (PHILLS)). Over the past year or so, the Coastal Marine Resource Assessment (CAMRA) group at the Florida Department of Environmental Protection's Marine Research Institute (FMRI) has been working with the Department of Energy through the Naval Research Laboratory to develop applications and products from existing data. Considerable effort has been spent identifying image formats and integration parameters. 2 refs., 3 figs., 2 tabs.

  4. Michigan experimental multispectral mapping system: A description of the M7 airborne sensor and its performance

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.

    1974-01-01

    The development and characteristics of a multispectral band scanner for an airborne mapping system are discussed. The sensor operates in the ultraviolet, visual, and infrared frequencies. Any twelve of the bands may be selected for simultaneous, optically registered recording on a 14-track analog tape recorder. Multispectral imagery recorded on magnetic tape in the aircraft can be laboratory reproduced on film strips for visual analysis or optionally machine processed in analog and/or digital computers before display. The airborne system performance is analyzed.

  5. Comparison of different detection methods for citrus greening disease based on airborne multispectral and hyperspectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Citrus greening or Huanglongbing (HLB) is a devastating disease spread in many citrus groves since first found in 2005 in Florida. Multispectral (MS) and hyperspectral (HS) airborne images of citrus groves in Florida were taken to detect citrus greening infected trees in 2007 and 2010. Ground truthi...

  6. Remote identification of potential boll weevil host plants: Airborne multispectral detection of regrowth cotton

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Regrowth cotton plants can serve as potential hosts for boll weevils during and beyond the production season. Effective methods for timely areawide detection of these host plants are critically needed to expedite eradication in south Texas. We acquired airborne multispectral images of experimental...

  7. Using airborne multispectral imagery to monitor cotton root rot expansion within a growing season

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot is a serious and destructive disease that affects cotton production in the southwestern United States. Accurate delineation of cotton root rot infestations is important for cost-effective management of the disease. The objective of this study was to use airborne multispectral imagery...

  8. Television camera on RMS surveys insulation on Airborne Support Equipment

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The television camera on the end effector of the Canadian-built Remote Manipulator System (RMS) is seen surveying some of the insulation on the Airborne Support Equipment (ASE). Flight controllers called for the survey following the departure of the Advanced Communications Technology Satellite (ACTS) and its Transfer Orbit Stage (TOS).

  9. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    There are many methods of acquiring multispectral images, but a dynamically band-selective, area-scan multispectral camera had not yet been developed. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame and an electric circuit for parallel image signal processing. In addition, firmware and application software were developed. Remarkable improvements compared to a conventional 3CCD camera are the redesigned image splitter and the filter-exchangeable frame. Computer simulation was required to visualize the ray path inside the prism when redesigning the image splitter; the dimensions of the splitter were then determined by simulation with the options of BK7 glass and non-dichroic coating, chosen to obtain full-wavelength rays on all film planes. The image splitter was verified with two narrow-waveband line lasers. The filter-exchangeable frame is designed so that bandpass filters can be swapped without changing the position of the image sensors on the film plane. The developed 3CCD camera was evaluated in an application detecting scab and bruises on Fuji apples. As a result, the filter-exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that need to exchange bandpass filters.

  10. Evaluating the Potential of Multispectral Airborne LIDAR for Topographic Mapping and Land Cover Classification

    NASA Astrophysics Data System (ADS)

    Wichmann, V.; Bremer, M.; Lindenberger, J.; Rutzinger, M.; Georges, C.; Petrini-Monteferri, F.

    2015-08-01

    Recently, multispectral LiDAR has become a promising research field for enhanced LiDAR classification workflows and, for example, the assessment of vegetation health. Current analyses of multispectral LiDAR are mainly based on experimental setups, which are often of limited transferability to operational tasks. In late 2014 Optech Inc. announced the first commercially available multispectral LiDAR system for airborne topographic mapping. The combined system makes synchronous multispectral LiDAR measurements possible, solving the time-shift problems of experimental acquisitions. This paper presents an explorative analysis of the first airborne collected data with a focus on class-specific spectral signatures. Spectral patterns are used for a classification approach, which is evaluated against a manual reference classification. Typical spectral patterns comparable to optical imagery could be observed for homogeneous and planar surfaces. For rough and volumetric objects such as trees, the spectral signature is biased by signal modification due to multi-return effects. However, we show that this first flight data set is suitable for conventional geometrical classification and mapping procedures. Additional classes such as sealed and unsealed ground can be separated with high classification accuracies. For vegetation classification, the distinction of species and health classes is possible.

  11. Evaluation of eelgrass beds mapping using a high-resolution airborne multispectral scanner

    USGS Publications Warehouse

    Su, H.; Karna, D.; Fraim, E.; Fitzgerald, M.; Dominguez, R.; Myers, J.S.; Coffland, B.; Handley, L.R.; Mace, T.

    2006-01-01

    Eelgrass (Zostera marina) can provide vital ecological functions in stabilizing sediments, influencing current dynamics, and contributing significant amounts of biomass to numerous food webs in coastal ecosystems. Mapping eelgrass beds is important for coastal water and nearshore estuarine monitoring, management, and planning. This study demonstrated the possible use of a high spatial (approximately 5 m) and temporal (maximum low tide) resolution airborne multispectral scanner for mapping eelgrass beds in Northern Puget Sound, Washington. A combination of supervised and unsupervised classification approaches was applied to the multispectral scanner imagery. A normalized difference vegetation index (NDVI) derived from the red and near-infrared bands, together with ancillary spatial information, was used to extract and mask eelgrass beds and other submerged aquatic vegetation (SAV) in the study area. We evaluated the resulting thematic map (geocoded, classified image) against a conventional aerial photograph interpretation using 260 point locations randomly stratified over five defined classes from the thematic map. We achieved an overall accuracy of 92 percent with a 0.92 Kappa Coefficient in the study area. This study demonstrates that the airborne multispectral scanner can be useful for mapping eelgrass beds at a local or regional scale, especially in regions for which optical remote sensing from space is constrained by climatic and tidal conditions. © 2006 American Society for Photogrammetry and Remote Sensing.
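    The NDVI used above to extract eelgrass and other submerged aquatic vegetation is computed band-wise from the red and near-infrared channels and then thresholded. A minimal sketch follows; the rasters and the threshold value are placeholders, not those used in the study.

      # Sketch: NDVI from red and near-infrared bands, thresholded into a
      # submerged aquatic vegetation (SAV) mask.
      import numpy as np

      def ndvi(red, nir):
          return (nir - red) / np.maximum(nir + red, 1e-12)

      red = np.random.rand(512, 512)   # placeholder reflectance rasters
      nir = np.random.rand(512, 512)
      sav_mask = ndvi(red, nir) > 0.2  # hypothetical threshold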

  12. Towards Automatic Single-Sensor Mapping by Multispectral Airborne Laser Scanning

    NASA Astrophysics Data System (ADS)

    Ahokas, E.; Hyyppä, J.; Yu, X.; Liang, X.; Matikainen, L.; Karila, K.; Litkey, P.; Kukko, A.; Jaakkola, A.; Kaartinen, H.; Holopainen, M.; Vastaranta, M.

    2016-06-01

    This paper describes the possibilities of the Optech Titan multispectral airborne laser scanner in the fields of mapping and forestry. Investigation was targeted to six land cover classes. Multispectral laser scanner data can be used to distinguish land cover classes of the ground surface, including roads and separate road surface classes. For forest inventory using point cloud metrics and intensity features combined, a total accuracy of 93.5% was achieved for classification of three main boreal tree species (pine, spruce and birch). When using intensity features alone, without point height metrics, a classification accuracy of 91% was achieved for these three tree species. It was also shown that deciduous trees can be further classified into more species. We propose that intensity-related features and waveform-type features be combined with point height metrics for forest attribute derivation in area-based prediction, which is an operationally applied forest inventory process in Scandinavia. It is expected that multispectral airborne laser scanning can provide highly valuable data for city and forest mapping and is a highly relevant data asset for national and local mapping agencies in the near future.

  13. Urban land use monitoring from computer-implemented processing of airborne multispectral data

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.

    1976-01-01

    Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August 1972. Computer analysis of these spectral data indicates that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses by analysis of the type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.

  14. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  15. Multispectral integral imaging acquisition and processing using a monochrome camera and a liquid crystal tunable filter.

    PubMed

    Latorre-Carmona, Pedro; Sánchez-Ortiga, Emilio; Xiao, Xiao; Pla, Filiberto; Martínez-Corral, Manuel; Navarro, Héctor; Saavedra, Genaro; Javidi, Bahram

    2012-11-05

    This paper presents an acquisition system and a procedure to capture 3D scenes in different spectral bands. The acquisition system is formed by a monochrome camera and a Liquid Crystal Tunable Filter (LCTF) that allows images to be acquired in different spectral bands within the [480, 680] nm wavelength interval. The Synthetic Aperture Integral Imaging acquisition technique is used to obtain the elemental images for each wavelength. These elemental images are used to computationally obtain the reconstruction planes of the 3D scene at different depth planes. The 3D profile of the acquired scene is also obtained using a minimization of the variance of the contribution of the elemental images at each image pixel. Experimental results show the viability of recovering the 3D multispectral information of the scene. Integration of 3D and multispectral information could have important benefits in different areas, including skin cancer detection, remote sensing and pattern recognition, among others.

  16. Testing of Land Cover Classification from Multispectral Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Bakuła, K.; Kupidura, P.; Jełowicki, Ł.

    2016-06-01

    Multispectral Airborne Laser Scanning provides a new opportunity for airborne data collection. It provides high-density topographic surveying and is also a useful tool for land cover mapping. Use of a minimum of three intensity images from a multiwavelength laser scanner and 3D information included in the digital surface model has the potential for land cover/use classification and a discussion about the application of this type of data in land cover/use mapping has recently begun. In the test study, three laser reflectance intensity images (orthogonalized point cloud) acquired in green, near-infrared and short-wave infrared bands, together with a digital surface model, were used in land cover/use classification where six classes were distinguished: water, sand and gravel, concrete and asphalt, low vegetation, trees and buildings. In the tested methods, different approaches for classification were applied: spectral (based only on laser reflectance intensity images), spectral with elevation data as additional input data, and spectro-textural, using morphological granulometry as a method of texture analysis of both types of data: spectral images and the digital surface model. The method of generating the intensity raster was also tested in the experiment. Reference data were created based on visual interpretation of ALS data and traditional optical aerial and satellite images. The results have shown that multispectral ALS data are unlike typical multispectral optical images, and they have a major potential for land cover/use classification. An overall accuracy of classification over 90% was achieved. The fusion of multi-wavelength laser intensity images and elevation data, with the additional use of textural information derived from granulometric analysis of images, helped to improve the accuracy of classification significantly. The method of interpolation for the intensity raster was not very helpful, and using intensity rasters with both first and last return

  17. Combining multi-spectral proximal sensors and digital cameras for monitoring grazed tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, R. N.; Gobbett, D. L.; González, L. A.; Bishop-Hurley, G. J.; McGavin, S. L.

    2015-11-01

    Timely and accurate monitoring of pasture biomass and ground-cover is necessary in livestock production systems to ensure productive and sustainable management of forage for livestock. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since such sensors can return data in near real-time, and have the potential to be deployed on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. However, there are unresolved challenges in developing calibrations to convert raw sensor data to quantitative biophysical values, such as pasture biomass or vegetation ground-cover, to allow meaningful interpretation of sensor data by livestock producers. We assessed the use of multiple proximal sensors for monitoring tropical pastures with a pilot deployment of sensors at two sites on Lansdown Research Station near Townsville, Australia. Each site was monitored by a Skye SKR-four-band multi-spectral sensor (every 1 min), a digital camera (every 30 min), and a soil moisture sensor (every 1 min), each operated over 18 months. Raw data from each sensor were processed to calculate a number of multispectral vegetation indices. Visual observations of pasture characteristics, including above-ground standing biomass and ground cover, were made every 2 weeks. A methodology was developed to manage the sensor deployment and the quality control of the data collected. The data capture from the digital cameras was more reliable than the multi-spectral sensors, which had up to 63 % of data discarded after data cleaning and quality control. We found a strong relationship between sensor and pasture measurements during the wet season period of maximum pasture growth (January to April), especially when data from the multi-spectral sensors were combined with weather data. RatioNS34 (a simple band ratio between the near infrared (NIR) and lower shortwave infrared (SWIR) bands) and rainfall since 1

  18. Identification of landslides in clay terrains using Airborne Thematic Mapper (ATM) multispectral imagery

    NASA Astrophysics Data System (ADS)

    Whitworth, Malcolm; Giles, David; Murphy, William

    2002-01-01

    The slopes of the Cotswolds Escarpment in the United Kingdom are mantled by extensive landslide deposits, including both relict and active features. These landslides pose a significant threat to engineering projects and have been the focus of research into the use of airborne remote sensing data sets for landslide mapping. Due to the availability of extensive ground investigation data, a test site was chosen on the slopes of the Cotswolds Escarpment above the village of Broadway, Worcestershire, United Kingdom. Daedalus Airborne Thematic Mapper (ATM) imagery was subsequently acquired by the UK Natural Environment Research Council (NERC) to provide high-resolution multispectral imagery of the Broadway site. This paper assesses the textural enhancement of ATM imagery as an image processing technique for landslide mapping at the Broadway site. Results of three kernel based textural measures, variance, mean euclidean distance (MEUC) and grey level co-occurrence matrix (GLCM) entropy are presented. Problems encountered during textural analysis, associated with the presence of dense woodland within the project area, are discussed and a solution using Principal Component Analysis (PCA) is described. Landslide features in clay dominated terrains can be identified through textural enhancement of airborne multispectral imagery. The kernel based textural measures tested in the current study were all able to enhance areas of slope instability within ATM imagery. Additionally, results from supervised classification of the combined texture-principal component dataset show that texture based image classification can accurately classify landslide regions and that by including a Principal Component image, woodland and landslide classes can be differentiated successfully during the classification process.
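    Of the kernel-based texture measures compared above, the variance texture is the simplest: the per-pixel variance within a sliding window. A short sketch using SciPy is given below; the window size is an assumption for illustration.

      # Sketch: kernel-based variance texture of a single-band image,
      # computed with sliding-window mean filters.
      import numpy as np
      from scipy.ndimage import uniform_filter

      def local_variance(band, window=7):
          """Per-pixel variance over a window x window neighbourhood."""
          band = band.astype(float)
          mean = uniform_filter(band, size=window)
          mean_sq = uniform_filter(band ** 2, size=window)
          return mean_sq - mean ** 2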

  19. Traffic monitoring with serial images from airborne cameras

    NASA Astrophysics Data System (ADS)

    Reinartz, Peter; Lachaise, Marie; Schmeer, Elisabeth; Krauss, Thomas; Runge, Hartmut

    The classical means to measure traffic density and velocity depend on local measurements from induction loops and other on site instruments. This information does not give the whole picture of the two-dimensional traffic situation. In order to obtain precise knowledge about the traffic flow of a large area, only airborne cameras or cameras positioned at very high locations (towers, etc.) can provide an up-to-date image of all roads covered. The paper aims at showing the potential of using image time series from these cameras to derive traffic parameters on the basis of single car measurements. To be able to determine precise velocities and other parameters from an image time series, exact geocoding is one of the first requirements for the acquired image data. The methods presented here for determining several traffic parameters for single vehicles and vehicle groups involve recording and evaluating a number of digital or analog aerial images from high altitude and with a large total field of view. Visual and automatic methods for the interpretation of images are compared. It turns out that the recording frequency of the individual images should be at least 1/3 Hz (visual interpretation), but is preferably 3 Hz or more, especially for automatic vehicle tracking. The accuracy and potentials of the methods are analyzed and presented, as well as the usage of a digital road database for improving the tracking algorithm and for integrating the results for further traffic applications. Shortcomings of the methods are given as well as possible improvements regarding methodology and sensor platform.

  20. The color measurement system for spot color printing basing multispectral camera

    NASA Astrophysics Data System (ADS)

    Liu, Nanbo; Jin, Weiqi; Huang, Qinmei; Song, Li

    2014-11-01

    Color measurement and control of printing has been an important issue in computer vision technology. In the past, densitometers and spectrophotometers have been used to measure the color of printed products. For the color management of four-color presses, such meters can measure color data from the color bar printed at the edge of the sheet, which is then used for ink key presetting; this approach is widely applied in the printing field. However, it cannot be used to measure the color of spot-color printing or of the printed pattern directly. With the development of multispectral image acquisition, it has become possible to measure the color of the printed pattern in any area of the pattern with a CCD camera that can acquire the whole image of the sheet at high resolution. This paper presents a way to measure the color of printing with a multispectral camera during the printing process. A 12-channel spectral camera with high-intensity white LED illumination, driven by a motor, scans the printed sheet. The resulting image contains color and print quality information for each pixel; the LAB and CMYK values of each pixel can be obtained by reconstructing the reflectance spectra of the printed image. With this data processing, the color of spot-color printing can be measured and controlled. Spot tests in a printing plant show that this approach yields not only the color bar density values but also the color values of regions of interest (ROI); with these values, ink key presetting can be performed and spot color can be controlled automatically with high precision.

  1. Validation of a 2D multispectral camera: application to dermatology/cosmetology on a population covering five skin phototypes

    NASA Astrophysics Data System (ADS)

    Jolivot, Romuald; Nugroho, Hermawan; Vabres, Pierre; Ahmad Fadzil, M. H.; Marzani, Franck

    2011-07-01

    This paper presents the validation of a new multispectral camera specifically developed for dermatological applications, based on healthy participants of five different Skin PhotoTypes (SPT). The multispectral system provides images of the skin reflectance at different spectral bands, coupled with a neural network-based algorithm that reconstructs a hyperspectral cube of cutaneous data from a multispectral image. The flexibility of the neural network-based algorithm allows reconstruction over different wavelength ranges. The hyperspectral cube provides both high spectral and spatial information. The study population involves 150 healthy participants. The participants are classified by skin phototype according to the Fitzpatrick scale, and the population covers five of the six types. Each participant is acquired at three body locations: two skin areas exposed to the sun (hand, face) and one not exposed to the sun (lower back), and each is reconstructed over three different wavelength ranges. The validation is performed by comparing data acquired with a commercial spectrophotometer against the reconstructed spectrum obtained by averaging the hyperspectral cube. The comparison is calculated between 430 and 740 nm due to the limit of the spectrophotometer used. The results reveal that the multispectral camera is able to reconstruct the hyperspectral cube with a goodness of fit coefficient greater than 0.997 for the average of all SPTs at each location. The study reveals that the multispectral camera provides accurate reconstruction of the hyperspectral cube, which can be used for analysis of the skin reflectance spectrum.
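    The goodness-of-fit coefficient quoted above compares the reconstructed spectrum with the spectrophotometer measurement as a normalized inner product (1.0 indicates a perfect spectral match). A minimal sketch of that figure of merit is shown below.

      # Sketch: goodness-of-fit coefficient (GFC) between a measured and a
      # reconstructed reflectance spectrum sampled at the same wavelengths.
      import numpy as np

      def gfc(measured, reconstructed):
          """GFC = |<m, r>| / (||m|| * ||r||)."""
          num = np.abs(np.dot(measured, reconstructed))
          den = np.linalg.norm(measured) * np.linalg.norm(reconstructed)
          return num / den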

  2. A Combined Texture-principal Component Image Classification Technique For Landslide Identification Using Airborne Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Whitworth, M.; Giles, D.; Murphy, W.

    The Jurassic strata of the Cotswolds escarpment of southern central United Kingdom are associated with extensive mass movement activity, including mudslide systems, rotational and translational landslides. These mass movements can pose a significant engineering risk and have been the focus of research into the use of remote sensing techniques as a tool for landslide identification and delineation on clay slopes. The study has utilised a field site on the Cotswold escarpment above the village of Broadway, Worcestershire, UK. Geomorphological investigation was initially undertaken at the site in order to establish ground control on landslides and other landforms present at the site. Subsequent to this, Airborne Thematic Mapper (ATM) imagery and colour stereo photography were acquired by the UK Natural Environment Research Council (NERC) for further analysis and interpretation. This paper describes the textural enhancement of the airborne imagery undertaken using both mean euclidean distance (MEUC) and grey level co-occurrence matrix entropy (GLCM) together with a combined texture-principal component based supervised image classification that was adopted as the method for landslide identification. The study highlights the importance of image texture for discriminating mass movements within multispectral imagery and demonstrates that by adopting a combined texture-principal component image classification we have been able to achieve classification accuracy of 84 % with a Kappa statistic of 0.838 for landslide classes. This paper also highlights the potential problems that can be encountered when using high-resolution multispectral imagery, such as the presence of dense variable woodland present within the image, and presents a solution using principal component analysis.

  3. Multispectral Airborne Mapping LiDAR Observations of the McMurdo Dry Valleys

    NASA Astrophysics Data System (ADS)

    Fernandez Diaz, J. C.; Fountain, A. G.; Morin, P. J.; Singhania, A.; Hauser, D.; Obryk, M.; Shrestha, R. L.; Carter, W. E.; Sartori, M. P.

    2015-12-01

    Field observations have documented dramatic changes over the past decade in the McMurdo Dry Valleys of Antarctica: extreme river incisions, significant glacier loss, and the appearance of numerous thermokarst slumps. To date these observations have been sporadic and localized, and have not been able to capture change on a valley-wide scale. During the 2014-2015 Antarctic summer season, specifically between December 4th, 2014 and January 19th, 2015, we undertook a wide-scale airborne laser mapping campaign to collect a baseline digital elevation model for a 3500 km2 area of the Dry Valleys and other areas of interest. The airborne LiDAR observations were acquired with a novel multispectral LiDAR sensor with active laser observations at three wavelengths (532 nm, 1064 nm, and 1550 nm) simultaneously, which not only allowed the generation of a high resolution elevation model of the area but also provides multispectral signatures for observed terrain features. In addition to the LiDAR data, high resolution (5-15 cm pixel) digital color images were collected. During the six-week survey campaign of the Dry Valleys a total of 30 flights were performed, in which about 20 billion LiDAR returns and 21,000 60-Mpixel images were collected. The primary objective of this project is to perform a topographic change detection analysis by comparing the recently acquired dataset to a lower resolution dataset collected by NASA in the 2001-2002 season. This presentation will describe the processing and analysis of this significant mapping dataset and will provide some initial observations from the high resolution topography acquired.

  4. Simultaneous multispectral framing infrared camera using an embedded diffractive optical lenslet array

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele

    2011-06-01

    Recent advances in micro-optical element fabrication using gray scale technology have opened up the opportunity to create simultaneous multispectral imaging with fine-structure diffractive lenses. This paper discusses an approach that uses diffractive optical lenses configured in an array (lenslet array) and placed in close proximity to the focal plane array, which enables a small, compact, simultaneous multispectral imaging camera [1]. The lenslet array is designed so that all lenslets have a common focal length, with each lenslet tuned for a different wavelength. The number of simultaneous spectral images is determined by the number of individually configured lenslets in the array, and can be increased by a factor of 2 when the array is used with a dual-band focal plane array (MWIR/LWIR) by exploiting multiple diffraction orders. In addition, modulation of the focal length of the lenslet array with piezoelectric actuation enables spectral bin fill-in, allowing additional spectral coverage at the expense of simultaneity. Different lenslet array spectral imaging concept designs are presented in this paper along with a unique concept for prefiltering the radiation focused on the detector. This approach to spectral imaging has applications in the detection of chemical agents in both aerosolized form and as a liquid on a surface. It can also be applied to the detection of weaponized biological agents and to IED detection in various forms, from manufacturing to deployment and post-detection during forensic analysis.

  5. Charon's Color: A view from New Horizons Ralph/Multispectral Visible Imaging Camera

    NASA Astrophysics Data System (ADS)

    Olkin, C.; Howett, C.; Grundy, W. M.; Parker, A. H.; Ennico Smith, K.; Stern, S. A.; Binzel, R. P.; Cook, J. C.; Cruikshank, D. P.; Dalle Ore, C.; Earle, A. M.; Jennings, D. E.; Linscott, I.; Lunsford, A.; Parker, J. W.; Protopapa, S.; Reuter, D.; Singer, K. N.; Spencer, J. R.; Tsang, C.; Verbiscer, A.; Weaver, H. A., Jr.; Young, L. A.

    2015-12-01

    The Multispectral Visible Imaging Camera (MVIC; Reuter et al., 2008) is part of Ralph, an instrument on NASA's New Horizons spacecraft. MVIC is the color 'eyes' of New Horizons, observing objects using five bands from blue to infrared wavelengths. MVIC's images of Charon show it to be an intriguing place, a far cry from the grey, heavily cratered world once postulated. Rather, Charon is observed to have large surface areas free of craters, and a northern polar region that is much redder than its surroundings. This talk will describe these initial results in more detail, along with Charon's global geological color variations, to put these results into their wider context. Finally, possible surface coloration mechanisms due to global processes and/or seasonal cycles will be discussed.

  6. Statistical correction of lidar-derived digital elevation models with multispectral airborne imagery in tidal marshes

    USGS Publications Warehouse

    Buffington, Kevin J.; Dugger, Bruce D.; Thorne, Karen M.; Takekawa, John

    2016-01-01

    Airborne light detection and ranging (lidar) is a valuable tool for collecting large amounts of elevation data across large areas; however, the limited ability to penetrate dense vegetation with lidar hinders its usefulness for measuring tidal marsh platforms. Methods to correct lidar elevation data are available, but a reliable method that requires limited field work and maintains spatial resolution is lacking. We present a novel method, the Lidar Elevation Adjustment with NDVI (LEAN), to correct lidar digital elevation models (DEMs) with vegetation indices from readily available multispectral airborne imagery (NAIP) and RTK-GPS surveys. Using 17 study sites along the Pacific coast of the U.S., we achieved an average root mean squared error (RMSE) of 0.072 m, with a 40–75% improvement in accuracy over the lidar bare-earth DEM. Results from our method compared favorably with results from three other methods (minimum-bin gridding, mean error correction, and vegetation correction factors), and a power analysis using our extensive RTK-GPS dataset showed that on average 118 points were necessary to calibrate a site-specific correction model for tidal marshes along the Pacific coast. By using available imagery and with minimal field surveys, we showed that lidar-derived DEMs can be adjusted for greater accuracy while maintaining high (1 m) resolution.
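
    One plausible reading of a LEAN-style workflow is a per-pixel bias model in which lidar error at the RTK-GPS calibration points is regressed on NDVI (and elevation) and the predicted bias is then subtracted from the DEM; the sketch below illustrates that idea with a simple linear model and synthetic numbers, and is not the authors' published implementation.

        # Hedged sketch: linear bias model relating lidar DEM error to NDVI (assumed form).
        import numpy as np

        def fit_correction(dem_at_gps, ndvi_at_gps, gps_elev):
            """Fit error = a*NDVI + b*DEM + c by ordinary least squares."""
            error = dem_at_gps - gps_elev                       # positive bias over dense vegetation
            X = np.column_stack([ndvi_at_gps, dem_at_gps, np.ones_like(ndvi_at_gps)])
            coeffs, *_ = np.linalg.lstsq(X, error, rcond=None)
            return coeffs

        def apply_correction(dem, ndvi, coeffs):
            a, b, c = coeffs
            return dem - (a * ndvi + b * dem + c)               # subtract the predicted bias

        # Synthetic calibration points for demonstration only.
        dem_pts  = np.array([1.42, 1.55, 1.60, 1.38, 1.51])
        ndvi_pts = np.array([0.35, 0.62, 0.70, 0.30, 0.55])
        gps_pts  = np.array([1.30, 1.32, 1.33, 1.29, 1.31])
        coeffs = fit_correction(dem_pts, ndvi_pts, gps_pts)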

  7. Development of low-cost high-performance multispectral camera system at Banpil

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.

    2014-05-01

    Banpil Photonics (Banpil) has developed a low-cost, high-performance multispectral camera system for visible to short-wave infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512-pixel InGaAs uncooled camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity requiring less than 100 electrons, high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and low power consumption below 1 W. These are practically all the features highly desirable in military imaging applications, enabling deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor and an innovation integrating advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g., the focal plane array (FPA) and read-out integrated circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high-performance imaging system, and their forecast cost structure, is presented.

  8. Ground-based analysis of volcanic ash plumes using a new multispectral thermal infrared camera approach

    NASA Astrophysics Data System (ADS)

    Williams, D.; Ramsey, M. S.

    2015-12-01

    Volcanic plumes are complex mixtures of mineral, lithic and glass fragments of varying size, together with multiple gas species. These plumes vary in size depending on a number of factors, including vent diameter, magma composition and the quantity of volatiles within a melt. However, determining the chemical and mineralogical properties of a volcanic plume immediately after an eruption is a great challenge. Thermal infrared (TIR) satellite remote sensing of these plumes is routinely used to calculate volcanic ash particle size variations and sulfur dioxide concentration. These analyses are commonly performed using high temporal, low spatial resolution satellites, which can only reveal large scale trends. What is lacking is a high spatial resolution study specifically of the properties of the proximal plumes. Using the emissive properties of volcanic ash, a new method has been developed to determine the plume's particle size and petrology in spaceborne and ground-based TIR data. A multispectral adaptation of a FLIR TIR camera has been developed that simulates the TIR channels found on several current orbital instruments. Using this instrument, data of volcanic plumes from Fuego and Santiaguito volcanoes in Guatemala were recently obtained. Preliminary results indicate that the camera is capable of detecting silicate absorption features in the emissivity spectra over the TIR wavelength range, which can be linked to both mineral chemistry and particle size. It is hoped that this technique can be expanded to isolate different volcanic species within a plume, validate the orbital data, and ultimately to use the results to better inform eruption dynamics modelling.

  9. Forest Stand Segmentation Using Airborne LIDAR Data and Very High Resolution Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet, Valérie; Hervieu, Alexandre

    2016-06-01

    Forest stands are the basic units for forest inventory and mapping. Stands are large forested areas (e.g., ≥ 2 ha) of homogeneous tree species composition. The accurate delineation of forest stands is usually performed by visual analysis of human operators on very high resolution (VHR) optical images. This work is highly time consuming and should be automated for scalability purposes. In this paper, a method based on the fusion of airborne laser scanning data (or lidar) and very high resolution multispectral imagery for automatic forest stand delineation and forest land-cover database update is proposed. The multispectral images give access to the tree species whereas 3D lidar point clouds provide geometric information on the trees. Therefore, multi-modal features are computed, both at pixel and object levels. The objects are individual trees extracted from lidar data. A supervised classification is performed at the object level on the computed features in order to coarsely discriminate the existing tree species in the area of interest. The analysis at tree level is particularly relevant since it significantly improves the tree species classification. A probability map is generated from the tree species classification and combined with the pixel-based feature map in an energy framework. The proposed energy is then minimized using a standard graph-cut method (namely QPBO with α-expansion) in order to produce a segmentation map with a controlled level of detail. Comparison with an existing forest land cover database shows that our method provides satisfactory results both in terms of stand labelling and delineation (matching ranges between 94% and 99%).

  10. Airborne multispectral and thermal remote sensing for detecting the onset of crop stress caused by multiple factors

    NASA Astrophysics Data System (ADS)

    Huang, Yanbo; Thomson, Steven J.

    2010-10-01

    Remote sensing technology has been developed and applied to provide spatiotemporal information on crop stress for precision management. A series of multispectral images over a field planted with cotton, corn and soybean were obtained by a Geospatial Systems MS4100 camera mounted on an Air Tractor 402B airplane, equipped with Camera Link in a Magma converter box and triggered by Terraverde Dragonfly® flight navigation and imaging control software. The field crops were intentionally stressed by applying glyphosate herbicide via aircraft and allowing it to drift near-field. Aerial multispectral images in the visible and near-infrared bands were manipulated to produce vegetation indices, which were used to quantify the onset of herbicide-induced crop stress. The normalized difference vegetation index (NDVI) and soil-adjusted vegetation index (SAVI) showed the ability to monitor crop response to herbicide-induced injury by revealing stress at different phenological stages. Two other fields were managed with irrigated versus nonirrigated treatments, and those fields were imaged with both the multispectral system and an Electrophysics PV-320T thermal imaging camera on board an Air Tractor 402B aircraft. Thermal imagery indicated water stress due to deficits in soil moisture, and a proposed method of determining crop cover percentage using thermal imagery was compared with a multispectral imaging method. Development of an image fusion scheme may be necessary to provide synergy and improve overall water stress detection ability.
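
    For reference, the two indices named above are simple per-pixel band combinations; the sketch below computes them from red and near-infrared reflectance arrays, assuming the commonly used soil-adjustment factor L = 0.5.

        # Minimal sketch: NDVI and SAVI from red and near-infrared reflectance arrays.
        import numpy as np

        def ndvi(nir, red):
            return (nir - red) / (nir + red + 1e-10)

        def savi(nir, red, L=0.5):
            return (1.0 + L) * (nir - red) / (nir + red + L + 1e-10)

        nir = np.array([[0.45, 0.50], [0.30, 0.48]])
        red = np.array([[0.08, 0.07], [0.15, 0.09]])
        print(ndvi(nir, red))
        print(savi(nir, red))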

  11. Mapping of hydrothermally altered rocks using airborne multispectral scanner data, Marysvale, Utah, mining district

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Jones, O.D.

    1983-01-01

    Multispectral data covering an area near Marysvale, Utah, collected with the airborne National Aeronautics and Space Administration (NASA) 24-channel Bendix multispectral scanner, were analyzed to detect areas of hydrothermally altered, potentially mineralized rocks. Spectral bands were selected for analysis that approximate those of the Landsat 4 Thematic Mapper and which are diagnostic of the presence of hydrothermally derived products. Hydrothermally altered rocks, particularly volcanic rocks affected by solutions rich in sulfuric acid, are commonly characterized by concentrations of argillic minerals such as alunite and kaolinite. These minerals are important for identifying hydrothermally altered rocks in multispectral images because they have intense absorption bands centered near a wavelength of 2.2 µm. Unaltered volcanic rocks commonly do not contain these minerals and hence do not have the absorption bands. A color-composite image was constructed using the following spectral band ratios: 1.6 µm/2.2 µm, 1.6 µm/0.48 µm, and 0.67 µm/1.0 µm. The particular bands were chosen to emphasize the spectral contrasts that exist for argillic versus non-argillic rocks, limonitic versus nonlimonitic rocks, and rocks versus vegetation, respectively. The color-ratio composite successfully distinguished most types of altered rocks from unaltered rocks. Some previously unrecognized areas of hydrothermal alteration were mapped. The altered rocks included those having high alunite and/or kaolinite content, siliceous rocks containing some kaolinite, and ash-fall tuffs containing zeolitic minerals. The color-ratio-composite image allowed further division of these rocks into limonitic and nonlimonitic phases. The image did not allow separation of highly siliceous or hematitically altered rocks containing no clays or alunite from unaltered rocks. A color-coded density slice image of the 1.6 µm/2.2 µm band ratio allowed further discrimination among the altered units. Areas
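
    A color-ratio composite of this kind can be approximated as three contrast-stretched band-ratio images assigned to the red, green and blue display channels; the sketch below assumes the bands are available as reflectance arrays with hypothetical names and does not reproduce the scanner's actual band numbering or calibration.

        # Hedged sketch: band-ratio color composite (R = 1.6/2.2 um, G = 1.6/0.48 um, B = 0.67/1.0 um).
        import numpy as np

        def stretch(img):
            lo, hi = np.percentile(img, (2, 98))
            return np.clip((img - lo) / (hi - lo + 1e-10), 0.0, 1.0)

        def color_ratio_composite(b048, b067, b10, b16, b22):
            eps = 1e-10
            r = stretch(b16 / (b22 + eps))     # emphasizes argillic (clay/alunite-bearing) rocks
            g = stretch(b16 / (b048 + eps))    # emphasizes limonitic rocks
            b = stretch(b067 / (b10 + eps))    # separates rocks from vegetation
            return np.dstack([r, g, b])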

  12. High Spatial Resolution Airborne Multispectral Thermal Infrared Remote Sensing Data for Analysis of Urban Landscape Characteristics

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Luvall, Jeffrey C.; Estes, Maurice G., Jr.; Arnold, James E. (Technical Monitor)

    2000-01-01

    We have used airborne multispectral thermal infrared (TIR) remote sensing data collected at a high spatial resolution (i.e., 10 m) over several cities in the United States to study thermal energy characteristics of the urban landscape. These TIR data provide a unique opportunity to quantify thermal responses from discrete surfaces typical of the urban landscape and to identify both the spatial arrangement and patterns of thermal processes across the city. The information obtained from these data is critical to understanding how urban surfaces drive or force development of the Urban Heat Island (UHI) effect, which exists as a dome of elevated air temperatures that presides over cities in contrast to surrounding non-urbanized areas. The UHI is most pronounced in the summertime, when urban surfaces, such as rooftops and pavement, store solar radiation throughout the day and release this stored energy slowly after sunset, creating air temperatures over the city that are in excess of 2-4 °C warmer in contrast with non-urban or rural air temperatures. The UHI can also exist as a daytime phenomenon, with surface temperatures in downtown areas of cities exceeding 38 °C. The implications of the UHI are significant, particularly as an additive source of thermal energy input that exacerbates the overall production of ground level ozone over cities. We have used the Airborne Thermal and Land Applications Sensor (ATLAS), flown onboard a Lear 23 jet aircraft from the NASA Stennis Space Center, to acquire high spatial resolution multispectral TIR data (i.e., 6 bandwidths between 8.2 and 12.2 µm) over Huntsville, Alabama, Atlanta, Georgia, Baton Rouge, Louisiana, Salt Lake City, Utah, and Sacramento, California. These TIR data have been used to produce maps and other products showing the spatial distribution of heating and cooling patterns over these cities to better understand how the morphology of the urban landscape affects development of the UHI. In turn, these data have been used

  13. Airborne Thermal Infrared Multispectral Scanner (TIMS) images over disseminated gold deposits, Osgood Mountains, Humboldt County, Nevada

    NASA Technical Reports Server (NTRS)

    Krohn, M. Dennis

    1986-01-01

    The U.S. Geological Survey (USGS) acquired airborne Thermal Infrared Multispectral Scanner (TIMS) images over several disseminated gold deposits in northern Nevada in 1983. The aerial surveys were flown to determine whether TIMS data could depict jasperoids (siliceous replacement bodies) associated with the gold deposits. The TIMS data were collected over the Pinson and Getchell Mines in the Osgood Mountains, the Carlin, Maggie Creek, Bootstrap, and other mines in the Tuscarora Mountains, and the Jerritt Canyon Mine in the Independence Mountains. The TIMS data seem to be a useful supplement to conventional geochemical exploration for disseminated gold deposits in the western United States. Siliceous outcrops are readily separable in the TIMS image from other types of host rocks. Different forms of silicification are not yet readily separable, due to limitations of spatial resolution and spectral dynamic range. Features associated with the disseminated gold deposits, such as the large intrusive bodies and fault structures, are also resolvable on TIMS data. Inclusion of high-resolution thermal inertia data would be a useful supplement to the TIMS data.

  14. A pilot project combining multispectral proximal sensors and digital cameras for monitoring tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, Rebecca N.; Gobbett, D. L.; González, Luciano A.; Bishop-Hurley, Greg J.; McGavin, Sharon L.

    2016-08-01

    Timely and accurate monitoring of pasture biomass and ground cover is necessary in livestock production systems to ensure productive and sustainable management. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since data can be returned in near real time. Proximal sensors have the potential for deployment on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. There are unresolved challenges in gathering reliable sensor data and in calibrating raw sensor data to values such as pasture biomass or vegetation ground cover, which allow meaningful interpretation of sensor data by livestock producers. Our goal was to assess whether a combination of proximal sensors could be reliably deployed to monitor tropical pasture status in an operational beef production system, as a precursor to designing a full sensor deployment. We use this pilot project to (1) illustrate practical issues around sensor deployment, (2) develop the methods necessary for the quality control of the sensor data, and (3) assess the strength of the relationships between vegetation indices derived from the proximal sensors and field observations across the wet and dry seasons. Proximal sensors were deployed at two sites in a tropical pasture on a beef production property near Townsville, Australia. Each site was monitored by a Skye SKR four-band multispectral sensor (sampling every 1 min), a digital camera (every 30 min), and a soil moisture sensor (every 1 min), each of which was operated over 18 months. Raw data from each sensor were processed to calculate multispectral vegetation indices. The data capture from the digital cameras was more reliable than that from the multispectral sensors, which had up to 67 % of data discarded after data cleaning and quality control for technical issues related to the sensor design, as well as environmental issues such as water incursion and insect infestations. We recommend

  15. Web camera as low cost multispectral sensor for quantification of chlorophyll in soybean leaves

    NASA Astrophysics Data System (ADS)

    Adhiwibawa, Marcelinus A.; Setiawan, Yonathan E.; Prilianti, Kestrilia R.; Brotosudarmo, Tatas H. P.

    2015-01-01

    Soybean is one of the main crops in Indonesia, but the demand for soybeans has not been matched by an increase in national production. One limiting factor is the availability of fertile cultivation area for soybean plantations. Indonesian farmers usually grow soybeans in marginal cultivation areas, which requires soybean varieties that are tolerant of environmental stresses such as drought, nutrient limitation, pests, and disease. Leaf chlorophyll content is one plant health indicator that can be used to identify stress-tolerant soybean varieties. However, soybean breeding research is hampered by manual data acquisition, which is time-consuming and labour-intensive. In this paper the authors propose an automatic system for quantifying soybean leaf area and chlorophyll, based on a low-cost multispectral sensor built from a web camera, as an indicator of soybean plant tolerance to environmental stress, particularly drought stress. The system acquires an image of the plant, placed in an acquisition box, from above. The image is segmented using the Normalized Difference Vegetation Index (NDVI) and quantified to yield an average NDVI value and leaf area. The acquired NDVI values showed a strong relationship with SPAD values (R-squared of 0.70), while the leaf area prediction had an error of 18.41%. Thus the automated system can quantify plant data with good results.

  16. Multispectral thermal airborne TASI-600 data to study the Pompeii (IT) archaeological area

    NASA Astrophysics Data System (ADS)

    Palombo, Angelo; Pascucci, Simone; Pergola, Nicola; Pignatti, Stefano; Santini, Federico; Soldovieri, Francesco

    2016-04-01

    The management of archaeological areas refers to the conservation of the ruins/buildings and the eventual prospection of new areas having an archaeological potential. In this framework, airborne remote sensing is a well-developed geophysical tool for supporting the archaeological surveys of wide areas. The spectral regions applied in archaeological remote sensing span from the VNIR to the TIR. In particular, archaeological thermal imaging exploits the fact that materials absorb, emit, transmit, and reflect thermal infrared radiation at different rates according to their composition, density and moisture content. Despite its potential, applications of thermal imaging in archaeology are scarce. Among them, noteworthy are the ones related to the use of Landsat and ASTER [1] and airborne remote sensing [2, 3, 4 and 5]. In view of this potential for Cultural Heritage applications, the present study aims at analysing the usefulness of high spatial resolution thermal imaging on the Pompeii archaeological park. To this purpose TASI-600 [6] airborne multispectral thermal imagery (32 channels from 8 to 11.5 µm with a spectral resolution of 100 nm and a spatial resolution of 1 m/pixel) was acquired on December 7th, 2015. The airborne survey was acquired to obtain useful information on the characteristics of the building materials (both ancient and of consolidation) and, whenever possible, to retrieve quick indicators of their conservation status. Thermal images will, moreover, be processed to give an insight into the critical environmental issues impacting the structures (e.g. moisture). The proposed study shows the preliminary results of the airborne deployments, the pre-processing of the multispectral thermal imagery and the retrieval of accurate land surface temperatures (LST). The LST map will be analysed to describe the thermal pattern of the city of Pompeii and detect any thermal anomalies. As far as the ongoing TASI-600 sensor pre-processing is concerned, it will include: (a) radiometric

  17. New, Flexible Applications with the Multi-Spectral Titan Airborne Lidar

    NASA Astrophysics Data System (ADS)

    Swirski, A.; LaRocque, D. P.; Shaker, A.; Smith, B.

    2015-12-01

    Traditional lidar designs have been restricted to using a single laser channel operating at one particular wavelength. Single-channel systems excel at collecting high-precision spatial (XYZ) data, with accuracies down to a few centimeters. However, target classification is difficult with spatial data alone, and single-wavelength systems are limited to the strengths and weaknesses of the wavelength they use. To resolve these limitations in lidar design, Teledyne Optech developed the Titan, the world's first multispectral lidar system, which uses three independent laser channels operating at 532, 1064, and 1550 nm. Since Titan collects 12 bit intensity returns for each wavelength separately, users can compare how strongly targets in the survey area reflect each wavelength. Materials such as soil, rock and foliage all reflect the wavelengths differently, enabling post-processing algorithms to identify the material of targets easily and automatically. Based on field tests in Canada, automated classification algorithms have combined this with elevation data to classify targets into six basic types with 78% accuracy. Even greater accuracy is possible with further algorithm enhancement and the use of an in-sensor passive imager such as a thermal, multispectral, CIR or RGB camera. Titan therefore presents an important new tool for applications such as land-cover classification and environmental modeling while maintaining lidar's traditional strengths: high 3D accuracy and day/night operation. Multispectral channels also enable a single lidar to handle both topographic and bathymetric surveying efficiently, which previously required separate specialized lidar systems operating at different wavelengths. On land, Titan can survey efficiently from 2000 m AGL with a 900 kHz PRF (300 kHz per channel), or up to 2500 m if only the infrared 1064 and 1550 nm channels are used. Over water, the 532 nm green channel penetrates water to collect seafloor returns while the infrared

  18. Biooptical variability in the Greenland Sea observed with the Multispectral Airborne Radiometer System (MARS)

    NASA Technical Reports Server (NTRS)

    Mueller, James L.; Trees, Charles C.

    1989-01-01

    A site-specific ocean color remote sensing algorithm was developed and used to convert Multispectral Airborne Radiometer System (MARS) spectral radiance measurements to chlorophyll-a concentration profiles along aircraft tracklines in the Greenland Sea. The analysis is described and the results given in graphical or tabular form. Section 2 describes the salient characteristics and history of development of the MARS instrument. Section 3 describes the analyses of MARS flight segments over consolidated sea ice, resulting in a set of altitude-dependent ratios used (over water) to estimate radiance reflected by the surface and atmosphere from total radiance measured. Section 4 presents optically weighted pigment concentrations calculated from profile data, and spectral reflectances measured in situ from the top meter of the water column; these data were analyzed to develop an algorithm relating chlorophyll-a concentrations to the ratio of radiance reflectances at 441 and 550 nm (with a selection of coefficients dependent upon whether significant gilvin presence is implied by a low ratio of reflectances at 410 and 550 nm). Section 5 describes the scaling adjustments which were derived to reconcile the MARS upwelled radiance ratios at 410:550 nm and 441:550 nm with in situ reflectance ratios measured simultaneously at the surface. Section 6 graphically presents the locations of MARS data tracklines and positions of the surface monitoring R/V. Section 7 presents stick-plots of MARS tracklines selected to illustrate two-dimensional spatial variability within the box covered by each day's flight. Section 8 presents curves of chlorophyll-a concentration profiles derived from MARS data along survey tracklines. Significant results are summarized in Section 1.
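
    Ratio algorithms of the type described here typically take the form of a power law in the 441/550 nm reflectance ratio, with the coefficient set switched when a low 410/550 nm ratio implies significant gilvin; the sketch below only illustrates that structure, and its numeric coefficients are placeholders rather than the values derived in the report.

        # Hedged sketch: two-coefficient-set ratio algorithm for chlorophyll-a (placeholder coefficients).
        COEFFS_CLEAR  = (1.9, -1.8)   # (A, B) placeholders for low-gilvin waters
        COEFFS_GILVIN = (2.6, -2.2)   # (A, B) placeholders for gilvin-rich waters

        def chlorophyll(r441, r550, r410, gilvin_threshold=0.8):
            A, B = COEFFS_GILVIN if (r410 / r550) < gilvin_threshold else COEFFS_CLEAR
            return A * (r441 / r550) ** B     # mg m^-3

        print(chlorophyll(r441=0.012, r550=0.010, r410=0.011))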

  19. Use of airborne multispectral scanner data to map alteration related to roll-front uranium migration

    SciTech Connect

    Peters, D.C.

    1983-06-01

    Computer-enhanced airborne multispectral scanner (MSS) images have been used to detect and map red oxidized alteration related to roll-front uranium migration in the southern Powder River basin, Wyoming. Information in the 0.4- to 1.1-µm spectral region was used to produce a color ratio composite image, upon which the red-altered areas can be differentiated. The red-altered and incipiently altered sandstones result from the migration of a roll-front (or geochemical cell) through the sandstone in the direction of the hydrologic gradient. Most uranium deposits in the Powder River basin occur at the boundary between this oxidized sandstone and reduced sandstone. Therefore, the ability to detect and map this alteration reliably can provide important information about the potential for uranium mineralization down gradient from the altered areas, at the surface in an area of interest. Spectral reflectance studies indicate that a shift in the absorption band edge from 0.52 µm (for goethitic sandstone) to 0.58 µm (for hematitic sandstone) and an intensification of an absorption band at 0.85 µm (for hematitic sandstone) are the bases for identifying the red-altered sandstone as green anomalous areas on the color ratio composite image. Some of the incipiently altered sandstone also appears green, whereas unaltered material and white-altered sandstone appear as blue to cyan colors. Therefore, the composite image is useful in discriminating hematitic sandstone from goethitic sandstone. At high densities (>65%), vegetation masks the sandstones on the color ratio composite image. Artemisia tridentata (sage) and Stipa comata (grass) are the species that have the greatest individual effect on the image.

  20. SPLASSH: Open source software for camera-based high-speed, multispectral in-vivo optical image acquisition.

    PubMed

    Sun, Ryan; Bouchard, Matthew B; Hillman, Elizabeth M C

    2010-08-02

    Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software's framework and provide details to guide users with development of this and similar software.

  1. Effectiveness of airborne multispectral thermal data for karst groundwater resources recognition in coastal areas

    NASA Astrophysics Data System (ADS)

    Pignatti, Stefano; Fusilli, Lorenzo; Palombo, Angelo; Santini, Federico; Pascucci, Simone

    2013-04-01

    Currently the detection, use and management of groundwater in karst regions can be considered one of the most significant procedures for solving water scarcity problems during periods of low rainfall, because groundwater resources from karst aquifers play a key role in the water supply in karst areas worldwide [1]. In many countries of the Mediterranean area, where karst is widespread, groundwater resources are still underexploited, while surface waters are generally preferred [2]. Furthermore, carbonate aquifers constitute a crucial thermal water resource outside of volcanic areas, even if there is no detailed and reliable global assessment of thermal water resources. The composite hydrogeological characteristics of karst, particularly the directions and zones of groundwater distribution, have not yet been adequately explained [3]. In view of the abovementioned reasons, the present study aims at analyzing the detection capability of high spatial resolution thermal remote sensing for karst water resources in coastal areas, in order to get useful information on karst spring flow and on different characteristics of these environments. To this purpose MIVIS [4, 5] and TASI-600 [6] airborne multispectral thermal imagery (see sensors' characteristics in Table 1) acquired over two coastal areas of the Mediterranean affected by karst activity, one located in Montenegro and one in Italy, were used. One study area is located in the Kotor Bay, a winding bay on the Adriatic Sea surrounded by high mountains in south-western Montenegro and characterized by many subaerial and submarine coastal springs related to deep karstic channels. The other study area is located in Santa Cesarea (Italy), encompassing coastal cold springs, the main local source of high quality water, and also a noticeable thermal groundwater outflow. The proposed study shows the preliminary results of the two airborne deployments on these areas. The preprocessing of the multispectral thermal imagery

  2. Estimating forest canopy attributes via airborne, high-resolution, multispectral imagery in midwest forest types

    NASA Astrophysics Data System (ADS)

    Gatziolis, Demetrios

    An investigation of the utility of high spatial resolution (sub-meter), 16-bit, multispectral, airborne digital imagery for forest land cover mapping in the heterogeneous and structurally complex forested landscapes of northern Michigan is presented. Imagery frame registration and georeferencing issues are presented, and a novel approach for bi-directional reflectance distribution function (BRDF) effects correction and between-frame brightness normalization is introduced. Maximum likelihood classification of five cover type classes is performed over various geographic aggregates of 34 plots established in the study area that were designed according to the Forest Inventory and Analysis protocol. Classification accuracy estimates show that although band registration, BRDF corrections and brightness normalization provide an approximately 5% improvement over the raw imagery data, overall classification accuracy remains relatively low, barely exceeding 50%. Computed kappa coefficients reveal no statistical differences among classification trials. Classification results appear to be independent of geographic aggregations of sampling plots. Estimation of forest stand canopy parameters (stem density, canopy closure, and mean crown diameter) is based on quantifying the spatial autocorrelation among pixel digital numbers (DN) using variogram analysis and slope break analysis, an alternative non-parametric approach. Parameter estimation and cover type classification proceed from the identification of tree apexes. Parameter accuracy is assessed via comparison with a spatially precise set of field observations. In general, slope-break-based parameter estimates are superior to those obtained using variograms. Estimated root mean square errors at the plot level for the former average 6.5% for stem density, 3.5% for canopy closure and 2.5% for mean crown diameter, which are less than or equal to error rates obtained via traditional forest stand

  3. Airborne Multispectral LIDAR Data for Land-Cover Classification and Land/water Mapping Using Different Spectral Indexes

    NASA Astrophysics Data System (ADS)

    Morsy, S.; Shaker, A.; El-Rabbany, A.; LaRocque, P. E.

    2016-06-01

    Airborne Light Detection And Ranging (LiDAR) data are widely used in remote sensing applications, such as topographic and land/water mapping. Recently, airborne multispectral LiDAR sensors, which acquire data at different wavelengths, have become available, allowing a diversity of intensity values to be recorded from different land features. In this study, three normalized difference feature indexes (NDFIs), for vegetation, water, and built-up area mapping, were evaluated. The NDFIs, namely NDFI(G-NIR), NDFI(G-MIR), and NDFI(NIR-MIR), were calculated using data collected at three wavelengths: green (532 nm), near-infrared (NIR, 1064 nm), and mid-infrared (MIR, 1550 nm) by the world's first airborne multispectral LiDAR sensor "Optech Titan". The Jenks natural breaks optimization method was used to determine the threshold values for each NDFI, in order to cluster the 3D point data into two classes (water and land, or vegetation and built-up area). Two sites at Scarborough, Ontario, Canada were tested to evaluate the performance of the NDFIs for land-water, vegetation, and built-up area mapping. The use of the three NDFIs succeeded in discriminating vegetation from built-up areas with an overall accuracy of 92.51%. Based on the classification results, it is suggested to use NDFI(G-MIR) and NDFI(NIR-MIR) for vegetation and built-up area extraction, respectively. The clustering results show that the direct use of the NDFIs for land-water mapping has low performance. Therefore, the clustered classes based on the NDFIs are constrained by the recorded number of returns from the different wavelengths, and thus the overall accuracy is improved to 96.98%.
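
    The index-and-threshold workflow can be illustrated compactly: a normalized difference of two per-point intensity channels followed by a two-class natural-breaks split; the brute-force Jenks search below is a stand-in written for readability, not the implementation used in the study, and it is only practical for modest sample sizes.

        # Hedged sketch: NDFI from two intensity channels plus a two-class Jenks-style threshold.
        import numpy as np

        def ndfi(i_a, i_b):
            return (i_a - i_b) / (i_a + i_b + 1e-10)

        def jenks_two_class_threshold(values):
            """Return the break minimizing the summed within-class squared deviations."""
            v = np.sort(values)
            best_cost, best_break = np.inf, v[0]
            for k in range(1, len(v)):
                left, right = v[:k], v[k:]
                cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
                if cost < best_cost:
                    best_cost, best_break = cost, v[k]
            return best_break

        green = np.array([120.0, 80.0, 30.0, 25.0, 140.0, 20.0])   # synthetic per-point intensities
        nir   = np.array([ 60.0, 50.0, 90.0, 85.0,  70.0, 95.0])
        index = ndfi(green, nir)
        threshold = jenks_two_class_threshold(index)
        labels = index >= threshold        # two clusters under the assumed sign convention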

  4. A simple method for vignette correction of airborne digital camera data

    SciTech Connect

    Nguyen, A.T.; Stow, D.A.; Hope, A.S.

    1996-11-01

    Airborne digital camera systems have gained popularity in recent years due to their flexibility, high geometric fidelity and spatial resolution, and fast data turn-around time. However, a common problem that plagues these types of framing systems is vignetting, which causes a falloff in image brightness away from the principal point. This paper presents a simple method for vignetting correction by utilizing laboratory images of a uniform illumination source. Multiple lab images are averaged and inverted to create digital correction templates which are then applied to actual airborne data. The vignette correction was effective in removing the systematic falloff in spectral values. We have shown that the vignette correction is a necessary part of the preprocessing of raw digital airborne remote sensing data. The consequences of not correcting for these effects are demonstrated in the context of monitoring salt marsh habitat. 4 refs.
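
    The correction described above amounts to building a multiplicative gain template from averaged flat-field frames and applying it to each airborne frame; the sketch below illustrates that idea, with hypothetical file names standing in for the laboratory and airborne imagery.

        # Hedged sketch: vignetting correction from averaged, inverted flat-field frames.
        import numpy as np

        def build_vignette_template(lab_frames):
            mean_frame = np.mean(np.stack(lab_frames, axis=0), axis=0)
            normalized = mean_frame / mean_frame.max()     # 1.0 at the brightest pixel
            return 1.0 / normalized                        # multiplicative gain map

        def correct_frame(raw_frame, template):
            return raw_frame.astype(np.float64) * template

        lab_frames = [np.load(f"flatfield_{i}.npy") for i in range(10)]   # hypothetical lab images
        template = build_vignette_template(lab_frames)
        corrected = correct_frame(np.load("airborne_frame.npy"), template)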

  5. GIS Meets Airborne MSS: Geospatial Applications of High-Resolution Multispectral Data

    SciTech Connect

    Albert Guber

    1999-07-27

    Bechtel Nevada operates and flies Daedalus multispectral scanners for funded project tasks at the Department of Energy's Remote Sensing Laboratory. Historically, processing and analysis of multispectral data have afforded scientists the opportunity to see natural phenomena not visible to the naked eye. However, only recently has a system, more specifically a Geometric Correction System, existed to automatically geo-reference these data directly into a Geographic Information System (GIS) database. Now, analyses performed previously in a nongeospatial environment are integrated directly into an Arc/Info GIS. This technology is of direct benefit to environmental and emergency response applications.

  6. Estimating Evapotranspiration over Heterogeneously Vegetated Surfaces using Large Aperture Scintillometer, LiDAR, and Airborne Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Geli, H. M.; Neale, C. M.; Pack, R. T.; Watts, D. R.; Osterberg, J.

    2011-12-01

    Estimating evapotranspiration (ET) over heterogeneous areas is challenging, especially in water-limited, sparsely vegetated environments. New techniques such as airborne full-waveform LiDAR (Light Detection and Ranging) and high resolution multispectral and thermal imagery can provide enough detail of sparse canopies to improve energy balance model estimates as well as footprint analysis of scintillometer data. The objectives of this study were to estimate ET over such areas and to develop methodologies for the use of these airborne data technologies. Because of the associated heterogeneity, this study was conducted over the Cibola National Wildlife Refuge, southern California, on an area dominated by tamarisk (salt cedar) forest (90%) interspersed with arrowweed and bare soil (10%). A set of two large aperture scintillometers (LASs) was deployed over the area to provide estimates of sensible heat flux (HLAS). The LASs were distributed over the area in a way that allowed capturing different surface spatial heterogeneity. Bowen ratio systems were used to provide hydrometeorological variables and surface energy balance flux (SEBF) measurements (i.e. Rn, G, H, and LE). Scintillometer-based estimates of HLAS were improved by considering the effect of the corresponding 3D footprint and the associated displacement height (d) and roughness length (z0), following Geli et al. (2011). The LiDAR data were acquired using the LASSI LiDAR developed at Utah State University (USU). The data were used to obtain 1-m spatial resolution DEMs and vegetation canopy heights to improve the HLAS estimates. The BR measurements of Rn and G were combined with the LAS estimates, HLAS, to provide estimates of LELAS as a residual of the energy balance equation. A thermal remote sensing model, namely the two-source energy balance (TSEB) model of Norman et al. (1995), was applied to provide spatial estimates of SEBF. Four airborne images at 1-4 meter spatial resolution acquired using the USU airborne

  7. Building detection by fusion of airborne laser scanner data and multi-spectral images: Performance evaluation and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Rottensteiner, Franz; Trinder, John; Clode, Simon; Kubik, Kurt

    In this paper, we describe the evaluation of a method for building detection by the Dempster-Shafer fusion of airborne laser scanner (ALS) data and multi-spectral images. For this purpose, ground truth was digitised for two test sites with quite different characteristics. Using these data sets, the heuristic models for the probability mass assignments are validated and improved, and rules for tuning the parameters are discussed. The sensitivity of the results to the most important control parameters of the method is assessed. Further, we evaluate the contributions of the individual cues used in the classification process to determine the quality of the results. Applying our method with a standard set of parameters on two different ALS data sets with a spacing of about 1 point/m², 95% of all buildings larger than 70 m² could be detected, and 95% of all detected buildings larger than 70 m² were correct in both cases. Buildings smaller than 30 m² could not be detected. The parameters used in the method have to be appropriately defined, but all except one (which must be determined in a training phase) can be determined from meaningful physical entities. Our research also shows that adding the multi-spectral images to the classification process improves the correctness of the results for small residential buildings by up to 20%.
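
    The fusion step relies on Dempster's rule of combination; the toy sketch below combines two illustrative mass assignments (a height-above-terrain cue and a multispectral NDVI cue) over a two-element frame of discernment, and the numeric masses are examples rather than the heuristic models validated in the paper.

        # Hedged sketch: Dempster's rule of combination over the frame {building, not-building}.
        from itertools import product

        def combine(m1, m2):
            """Combine two mass functions whose focal sets are given as frozensets."""
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
            return {k: v / (1.0 - conflict) for k, v in combined.items()}

        B, N = frozenset({"B"}), frozenset({"N"})
        THETA = frozenset({"B", "N"})                 # full frame expresses ignorance
        m_height = {B: 0.7, N: 0.1, THETA: 0.2}       # cue: point lies well above the terrain
        m_ndvi   = {B: 0.6, N: 0.2, THETA: 0.2}       # cue: low NDVI, so unlikely vegetation
        print(combine(m_height, m_ndvi))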

  8. Semantic segmentation of forest stands of pure species combining airborne lidar data and very high resolution multispectral imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet-Brunet, Valérie

    2017-04-01

    Forest stands are the basic units for forest inventory and mapping. Stands are defined as large forested areas (e.g., ⩾ 2 ha) of homogeneous tree species composition and age. Their accurate delineation is usually performed by human operators through visual analysis of very high resolution (VHR) infra-red images. This task is tedious, highly time consuming, and should be automated for scalability and efficient updating purposes. In this paper, a method based on the fusion of airborne lidar data and VHR multispectral images is proposed for the automatic delineation of forest stands containing one dominant species (purity greater than 75%). This is the key preliminary task for forest land-cover database updating. The multispectral images give information about the tree species whereas 3D lidar point clouds provide geometric information on the trees and allow their individual extraction. Multi-modal features are computed, both at pixel and object levels: the objects are individual trees extracted from lidar data. A supervised classification is then performed at the object level in order to coarsely discriminate the existing tree species in each area of interest. The classification results are further processed to obtain homogeneous areas with smooth borders by employing an energy minimization framework, in which additional constraints are combined to form the energy function. The experimental results show that the proposed method provides very satisfactory results both in terms of stand labeling and delineation (overall accuracy ranges between 84% and 99%).

  9. A comparison between satellite and airborne multispectral data for the assessment of Mangrove areas in the eastern Caribbean

    SciTech Connect

    Green, E.P.; Edwards, A.J.; Mumby, P.J.

    1997-06-01

    Satellite (SPOT XS and Landsat TM) and airborne multispectral (CASI) imagery was acquired from the Turks and Caicos Islands, British West Indies. The descriptive resolution and accuracy of each image type is compared for two applications: mangrove habitat mapping and the measurement of mangrove canopy characteristics (leaf area index and canopy closure). Mangroves could be separated from non-mangrove vegetation to an accuracy of only 57% with SPOT XS data but better discrimination could be achieved with either Landsat TM or CASI (in both cases accuracy was >90%). CASI data permitted a more accurate classification of different mangrove habitats than was possible using Landsat TM. Nine mangrove habitats could be mapped to an accuracy of 85% with the high-resolution airborne data compared to 31% obtained with TM. A maximum of three mangrove habitats were separable with Landsat TM: the accuracy of this classification was 83%. Measurement of mangrove canopy characteristics is achieved more accurately with CASI than with either satellite sensor, but high costs probably make it a less cost-effective option. The cost-effectiveness of each sensor is discussed for each application.

  10. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces. PMID:27873930

  11. Remote classification from an airborne camera using image super-resolution.

    PubMed

    Woods, Matthew; Katsaggelos, Aggelos

    2017-02-01

    The image processing technique known as super-resolution (SR), which attempts to increase the effective pixel sampling density of a digital imager, has gained rapid popularity over the last decade. The majority of literature focuses on its ability to provide results that are visually pleasing to a human observer. In this paper, we instead examine the ability of SR to improve the resolution-critical capability of an imaging system to perform a classification task from a remote location, specifically from an airborne camera. In order to focus the scope of the study, we address and quantify results for the narrow case of text classification. However, we expect the results generalize to a large set of related, remote classification tasks. We generate theoretical results through simulation, which are corroborated by experiments with a camera mounted on a DJI Phantom 3 quadcopter.

  12. Imaging and radiometric performance simulation for a new high-performance dual-band airborne reconnaissance camera

    NASA Astrophysics Data System (ADS)

    Seong, Sehyun; Yu, Jinhee; Ryu, Dongok; Hong, Jinsuk; Yoon, Jee-Yeon; Kim, Sug-Whan; Lee, Jun-Ho; Shin, Myung-Jin

    2009-05-01

    In recent years, high performance visible and IR cameras have been used widely for tactical airborne reconnaissance. Improving the efficiency with which complex target information from active battlefields is discriminated and analysed requires simultaneous multi-band measurements from airborne platforms at various altitudes. We report a new dual-band airborne camera designed for simultaneous registration of both visible and IR imagery from mid-altitude ranges. The camera design uses a common front-end optical telescope of around 0.3 m in entrance aperture and several relay optical sub-systems capable of delivering both high spatial resolution visible and IR images to the detectors. The camera design benefits from the use of several optical channels packaged in a compact space and the associated freedom to choose between wide (~3 degrees) and narrow (~1 degree) fields of view. In order to investigate both the imaging and radiometric performance of the camera, we generated an array of target scenes with optical properties such as reflection, refraction, scattering, transmission and emission. We then combined the target scenes and the camera optical system into an integrated ray tracing simulation environment utilizing a Monte Carlo computation technique. Taking realistic atmospheric radiative transfer characteristics into account, both imaging and radiometric performances were then investigated. The simulation results demonstrate that the camera design satisfies the NIIRS 7 detection criterion. The camera concept, details of the performance simulation computation and the resulting performance are discussed together with the future development plan.

  13. Capturing the Green River -- Multispectral airborne videography to evaluate the environmental impacts of hydropower operations

    SciTech Connect

    Snider, M.A.; Hayse, J.W.; Hlohowskyj, I.; LaGory, K.E.

    1996-02-01

    The 500-mile-long Green River is the largest tributary of the Colorado River. From its origin in the Wind River Range mountains of western Wyoming to its confluence with the Colorado River in southeastern Utah, the Green River is vital to the arid region through which it flows. Large portions of the area remain near-wilderness, with the river providing a source of recreation in the form of fishing and rafting, irrigation for farming and ranching, and hydroelectric power. In the late 1950s and early 1960s hydroelectric facilities were built on the river. One of these, Flaming Gorge Dam, is located just south of the Utah-Wyoming border near the town of Dutch John, Utah. Hydropower operations result in hourly and daily fluctuations in the releases of water from the dam that alter the natural stream flow below the dam and affect natural resources in and along the river corridor. In the present study, the authors were interested in evaluating the potential impacts of hydropower operations at Flaming Gorge Dam on the downstream natural resources. Considering the size of the area affected by the daily pattern of water release at the dam, as well as the difficult terrain and limited accessibility of many reaches of the river, evaluating these impacts using standard field study methods was virtually impossible. Instead, an approach was developed that used multispectral aerial videography to determine changes in the affected parameters at different flows, hydrologic modeling to predict flow conditions for various hydropower operating scenarios, and ecological information on the biological resources of concern to assign impacts.

  14. Multispectral Photography

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Model II Multispectral Camera is an advanced aerial camera that provides optimum enhancement of a scene by recording spectral signatures of ground objects only in narrow, preselected bands of the electromagnetic spectrum. Its photos have applications in such areas as agriculture, forestry, water pollution investigations, soil analysis, geologic exploration, water depth studies and camouflage detection. The target scene is simultaneously photographed in four separate spectral bands. Using a multispectral viewer, such as their Model 75, Spectral Data creates a color image from the black and white positives taken by the camera. With this optical image analysis unit, all four bands are superimposed in accurate registration and illuminated with combinations of blue, green, red, and white light. The best color combination for displaying the target object is selected and printed. Spectral Data Corporation produces several types of remote sensing equipment and also provides aerial survey, image processing and analysis, and a number of other remote sensing services.

  15. Potential of Uav-Based Laser Scanner and Multispectral Camera Data in Building Inspection

    NASA Astrophysics Data System (ADS)

    Mader, D.; Blaskow, R.; Westfeld, P.; Weller, C.

    2016-06-01

    Conventional building inspection of bridges, dams or large constructions in general is rather time-consuming and often expensive due to traffic closures and the need for special heavy vehicles such as under-bridge inspection units or other large lifting platforms. In view of this, an unmanned aerial vehicle (UAV) can be more reliable and efficient as well as less expensive and simpler to operate. The utility of UAVs as an assisting tool in building inspections is obvious. Furthermore, light-weight special sensors such as infrared and thermal cameras as well as laser scanners are available and well suited for use on unmanned aircraft systems. Such a flexible low-cost system is realized in the ADFEX project with the goal of time-efficient object exploration, monitoring and damage detection. For this purpose, a fleet of UAVs, equipped with several sensors for navigation, obstacle avoidance and 3D object-data acquisition, has been developed and constructed. This contribution deals with the potential of UAV-based data in building inspection. Therefore, an overview of the ADFEX project, sensor specifications and the requirements of building inspections in general is given. On the basis of results achieved in practical studies, the applicability and potential of the UAV system in building inspection will be presented and discussed.

  16. Preliminary investigation of multispectral retinal tissue oximetry mapping using a hyperspectral retinal camera.

    PubMed

    Desjardins, Michèle; Sylvestre, Jean-Philippe; Jafari, Reza; Kulasekara, Susith; Rose, Kalpana; Trussart, Rachel; Arbour, Jean Daniel; Hudson, Chris; Lesage, Frédéric

    2016-05-01

    Oximetry measurement of principal retinal vessels represents a first step towards understanding retinal metabolism, but the technique could be significantly enhanced by spectral imaging of the fundus outside of the main vessels. In this study, a recently developed Hyperspectral Retinal Camera was used to measure relative oximetric (SatO2) and total hemoglobin (HbT) maps of the retina, outside of large vessels, in healthy volunteers at baseline (N = 7) and during systemic hypoxia (N = 11), as well as in patients with glaucoma (N = 2). Images of the retina, on a field of view of ∼30°, were acquired between 500 and 600 nm with 2 and 5 nm steps, in under 3 s. The reflectance spectrum from each pixel was fitted to a model having oxy- and deoxyhemoglobin as the main absorbers and scattering modeled by a power law, yielding estimates of relative SatO2 and HbT over the fundus. Average optic nerve head (ONH) saturation over 8 eyes was 68 ± 5%. During systemic hypoxia, mean ONH saturation decreased by 12.5% on average. Upon further development and validation, the relative SatO2 and HbT maps of the microvasculature obtained with this imaging system could ultimately contribute to the diagnosis and management of diseases affecting the ONH and retina.
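
    The per-pixel fit described above can be sketched as a nonlinear least-squares problem in which optical density is modeled as a weighted sum of oxy- and deoxyhemoglobin extinction spectra plus a power-law scattering term; the extinction curves and parameter values below are synthetic placeholders, and tabulated hemoglobin spectra would be substituted in practice.

        # Hedged sketch: per-pixel spectral fit yielding relative SatO2 and HbT (synthetic inputs).
        import numpy as np
        from scipy.optimize import curve_fit

        wavelengths = np.arange(500.0, 601.0, 5.0)                  # nm
        eps_hbo2 = 1.0 + 0.5 * np.sin(wavelengths / 20.0)           # placeholder extinction spectra
        eps_hb   = 1.0 + 0.5 * np.cos(wavelengths / 15.0)

        def model(lam, c_hbo2, c_hb, a, b):
            scattering = a * (lam / 500.0) ** (-b)                  # normalized power-law scattering
            return c_hbo2 * eps_hbo2 + c_hb * eps_hb + scattering

        def fit_pixel(optical_density):
            p0 = (0.5, 0.5, 0.1, 1.0)
            (c_hbo2, c_hb, a, b), _ = curve_fit(model, wavelengths, optical_density,
                                                p0=p0, maxfev=10000)
            hbt = c_hbo2 + c_hb
            return c_hbo2 / hbt, hbt                                # (relative SatO2, relative HbT)

        synthetic_od = model(wavelengths, 0.7, 0.3, 0.2, 1.2)
        synthetic_od += 0.01 * np.random.default_rng(0).normal(size=wavelengths.size)
        print(fit_pixel(synthetic_od))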

  17. Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise

    NASA Astrophysics Data System (ADS)

    Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.

    2015-04-01

    In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities, as well as traffic authorities, when dealing with disasters and large public events. One focus lies on the acquisition of high resolution aerial imagery, its fully automatic processing and analysis, and its near real-time provision to decision makers in emergency situations. For this purpose a camera system was developed to be operated from a helicopter, with light-weight processing units and a microwave link for fast data transfer. In order to meet end-users' requirements, DLR works closely with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which is deployed nationwide in case of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.

  18. Airborne multispectral and hyperspectral remote sensing: Examples of applications to the study of environmental and engineering problems

    SciTech Connect

    Bianchi, R.; Marino, C.M.

    1997-10-01

    The availability of a new aerial survey capability operated by CNR/LARA (National Research Council - Airborne Laboratory for Environmental Research), based on the AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) spectroradiometer on board a CASA 212/200 aircraft, enables scientists to obtain innovative data sets for different approaches to the definition and understanding of a variety of environmental and engineering problems. The spectral bandwidths of the 102 MIVIS channels are chosen to meet the needs of scientific research for advanced applications of remote sensing data. In this configuration MIVIS can offer significant contributions to problem solving in sectors as wide-ranging as geologic exploration, agricultural crop studies, forestry, land use mapping, hydrogeology, oceanography and others. In 1994-96 LARA was active over different test sites in joint ventures with JPL (Pasadena), various European institutions, and Italian universities and research institutes. These aerial surveys allow the national and international scientific community to approach the use of hyperspectral remote sensing in environmental problems of very broad interest. The sites surveyed in Italy, France and Germany include a variety of targets such as quarries, landfills, karst cavity areas, landslides, coastlines, geothermal areas, etc. The deployments have so far gathered more than 300 GBytes of MIVIS data in more than 30 hours of VLDS data recording. The purpose of this work is to present and comment on the procedures and results, at research and operational level, of the past campaigns, with special reference to the study of environmental and engineering problems.

  19. Ground-based multispectral measurements for airborne data verification in non-operating open pit mine "Kremikovtsi"

    NASA Astrophysics Data System (ADS)

    Borisova, Denitsa; Nikolov, Hristo; Petkov, Doyno

    2013-10-01

    The impact of the mining industry and metal production on the environment is evident all over the world. In our research we focus on the impact of the already non-operating ferrous "Kremikovtsi" open pit mine and the related waste dumps and tailings, which we consider to be the major factor responsible for pollution of a densely populated region in Bulgaria. The approach adopted is based on correct estimation of the distribution of iron oxides inside open pit mines and the neighboring regions, considered in this case to be the key issue for the assessment of the ecological state of soils, vegetation and water. For this study the foremost source of data is of airborne origin; combined with ground-based in-situ and laboratory data, it was used for verification of the environmental variables and thus in the assessment of the present environmental status influenced by previous mining activities. The percentage of iron content was selected as the main indicator of metal pollution, since it can be reliably identified from the multispectral data used in this study and because iron compounds are widespread in most minerals, rocks and soils. In our research the number of samples from every source (air, field, lab) was chosen so as to be statistically sound. In order to establish a relationship between the degree of soil pollution and the multispectral data, 40 soil samples were collected during a field campaign in the study area, together with GPS measurements, for two types of laboratory analysis: chemical and mineralogical analysis, and non-destructive spectroscopy. In this work, multispectral satellite data from the Landsat TM/ETM+ instruments and from ALI/OLI (Operational Land Imager) were used for verification of environmental variables over large areas. Ground-based (laboratory and in-situ) spectrometric measurements were performed using the designed and constructed in Remote

  20. Mastcam-Z: Designing a Geologic, Stereoscopic, and Multispectral Pair of Zoom Cameras for the NASA Mars 2020 Rover

    NASA Astrophysics Data System (ADS)

    Bell, J. F.; Maki, J. N.; Mehall, G. L.; Ravine, M. A.; Caplinger, M. A.; Mastcam-Z Team

    2016-10-01

    Mastcam-Z is a stereoscopic, multispectral imaging investigation selected for flight on the Mars 2020 rover mission. In this presentation we review our science goals and requirements and describe our CDR-level design and operational plans.

  1. Land cover/use classification of Cairns, Queensland, Australia: A remote sensing study involving the conjunctive use of the airborne imaging spectrometer, the large format camera and the thematic mapper simulator

    NASA Technical Reports Server (NTRS)

    Heric, Matthew; Cox, William; Gordon, Daniel K.

    1987-01-01

    In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) data, Large Format Camera color infrared photography, and black-and-white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data were tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques are not the basis for this study. Instead, the discussion presents the spectral findings resulting from the image-to-image registrations. Problems associated with the AIS-1/TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, the research described herein provides a formulation for future Earth Observation Station related multisensor studies.

  2. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit

    PubMed Central

    Virlet, Nicolas; Costes, Evelyne; Martinez, Sébastien; Kelner, Jean-Jacques; Regnard, Jean-Luc

    2015-01-01

    Genetic studies of response to water deficit in adult trees are limited by low throughput of the usual phenotyping methods in the field. Here, we aimed at overcoming this bottleneck, applying a new methodology using airborne multispectral imagery and in planta measurements to compare a high number of individuals. An apple tree population, grafted on the same rootstock, was submitted to contrasting summer water regimes over two years. Aerial images acquired in visible, near- and thermal-infrared at three dates each year allowed calculation of vegetation and water stress indices. Tree vigour and fruit production were also assessed. Linear mixed models were built accounting for date and year effects on several variables and including the differential response of genotypes between control and drought conditions. Broad-sense heritability of most variables was high and 18 quantitative trait loci (QTLs) independent of the dates were detected on nine linkage groups of the consensus apple genetic map. For vegetation and stress indices, QTLs were related to the means, the intra-crown heterogeneity, and differences induced by water regimes. Most QTLs explained 15−20% of variance. Airborne multispectral imaging proved relevant to acquire simultaneous information on a whole tree population and to decipher genetic determinisms involved in response to water deficit. PMID:26208644

  3. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit.

    PubMed

    Virlet, Nicolas; Costes, Evelyne; Martinez, Sébastien; Kelner, Jean-Jacques; Regnard, Jean-Luc

    2015-09-01

    Genetic studies of response to water deficit in adult trees are limited by low throughput of the usual phenotyping methods in the field. Here, we aimed at overcoming this bottleneck, applying a new methodology using airborne multispectral imagery and in planta measurements to compare a high number of individuals. An apple tree population, grafted on the same rootstock, was submitted to contrasting summer water regimes over two years. Aerial images acquired in visible, near- and thermal-infrared at three dates each year allowed calculation of vegetation and water stress indices. Tree vigour and fruit production were also assessed. Linear mixed models were built accounting for date and year effects on several variables and including the differential response of genotypes between control and drought conditions. Broad-sense heritability of most variables was high and 18 quantitative trait loci (QTLs) independent of the dates were detected on nine linkage groups of the consensus apple genetic map. For vegetation and stress indices, QTLs were related to the means, the intra-crown heterogeneity, and differences induced by water regimes. Most QTLs explained 15-20% of variance. Airborne multispectral imaging proved relevant to acquire simultaneous information on a whole tree population and to decipher genetic determinisms involved in response to water deficit.

  4. Non-invasive skin oxygenation imaging using a multi-spectral camera system: effectiveness of various concentration algorithms applied on human skin

    NASA Astrophysics Data System (ADS)

    Klaessens, John H. G. M.; Noordmans, Herke Jan; de Roode, Rowland; Verdaasdonk, Rudolf M.

    2009-02-01

    This study describes noninvasive, noncontact methods to acquire and analyze functional information from the skin. Multispectral images at several selected wavelengths in the visible and near infrared region are collected and used in mathematical methods to calculate concentrations of different chromophores in the epidermis and dermis of the skin. This is based on the continuous wave Near Infrared Spectroscopy method, which is a well-known non-invasive technique for measuring oxygenation changes in the brain and in muscle tissue. Concentration changes of hemoglobin (dO2Hb, dHHb and dtHb) can be calculated from light attenuations using the modified Lambert Beer equation. We applied this technique to multi-spectral images taken from the skin surface, using different algorithms for calculating changes in O2Hb, HHb and tHb. In clinical settings, the imaging of local oxygenation variations and/or blood perfusion in the skin can be useful for e.g. detection of skin cancer, detection of early inflammation, checking the level of peripheral nerve block anesthesia, and the study of wound healing and tissue viability in skin flap transplantations. Images of the skin are obtained with a multi-spectral imaging system consisting of a 12-bit CCD camera in combination with a Liquid Crystal Tunable Filter. The skin is illuminated with either a broad band light source or a tunable multi-wavelength LED light source. A polarization filter is used to block the directly reflected light. The collected multi-spectral imaging data are images of the skin surface radiance; each pixel contains either the full spectrum (420 - 730 nm) or a set of selected wavelengths. These images were converted to reflectance spectra. The algorithms were validated during skin oxygen saturation changes induced by temporary arm clamping and applied to some clinical examples. The initial results with the multi-spectral skin imaging system demonstrate good detection of dynamic changes in oxygen concentration. However, the
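
    The modified Lambert-Beer step mentioned above can be sketched in a few lines. In the sketch below, attenuation changes measured at a handful of wavelengths are inverted for dO2Hb and dHHb by least squares; the wavelengths, extinction coefficients and pathlength factor are invented placeholders, not the values used in the study.

```python
# Hedged sketch of the modified Lambert-Beer inversion: attenuation changes at a
# few wavelengths are solved for dO2Hb and dHHb by least squares. The wavelengths,
# extinction coefficients and pathlength are invented placeholders.
import numpy as np

eps = np.array([[0.32, 3.20],    # columns: [eps_O2Hb, eps_HHb] at 660 nm (placeholder)
                [0.60, 1.10],    # at 730 nm (placeholder)
                [1.10, 0.78]])   # at 850 nm (placeholder)
pathlength = 1.0                 # effective optical pathlength factor (arbitrary)

def concentration_change(delta_attenuation):
    """Solve dA = (eps * L) @ [dO2Hb, dHHb] for the concentration changes."""
    dA = np.asarray(delta_attenuation, dtype=float)
    dC, *_ = np.linalg.lstsq(eps * pathlength, dA, rcond=None)
    d_o2hb, d_hhb = dC
    return d_o2hb, d_hhb, d_o2hb + d_hhb               # dO2Hb, dHHb, dtHb

# Example: an attenuation increase dominated by deoxyhemoglobin (e.g. arm clamping).
print(concentration_change([0.05, 0.03, 0.02]))
```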

  5. Multispectral photography for earth resources

    NASA Technical Reports Server (NTRS)

    Wenderoth, S.; Yost, E.; Kalia, R.; Anderson, R.

    1972-01-01

    A guide for producing accurate multispectral results for earth resource applications is presented along with theoretical and analytical concepts of color and multispectral photography. Topics discussed include: capabilities and limitations of color and color infrared films; image color measurements; methods of relating ground phenomena to film density and color measurement; sensitometry; considerations in the selection of multispectral cameras and components; and mission planning.

  6. Russian multispectral-hyperspectral airborne scanner for geological and environmental investigations - "Vesuvius-EC"

    SciTech Connect

    Yassinsky, G.I.; Shilin, B.V.

    1996-07-01

    Small variations of spectral characteristics in the 0.3-14 micron band are of great significance in geological and environmental investigations. A multipurpose multispectral digital scanner with a narrow field of view, high spectral resolution and radiometric calibration has been designed in Russia. Interchangeable modules allow the parameters of the device to be adapted for practical use.

  7. In vivo multispectral imaging of the absorption and scattering properties of exposed brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Ishizuka, Tomohiro; Mizushima, Chiharu; Nishidate, Izumi; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-04-01

    To evaluate multi-spectral images of the absorption and scattering properties in the cerebral cortex of rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital red-green-blue camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters. The spectral images of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters. We performed in vivo experiments on exposed rat brain to confirm the feasibility of this method. The estimated images of the absorption coefficients were dominated by hemoglobin spectra. The estimated images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature.
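
    As a rough illustration of the Wiener estimation step referenced above, the following sketch reconstructs a reflectance spectrum from an RGB triplet using a prior correlation matrix built from training spectra. The camera sensitivities, training spectra and noise level are synthetic stand-ins, not the quantities used by the authors.

```python
# Hedged sketch of Wiener estimation: reconstruct a reflectance spectrum from an
# RGB triplet using a prior correlation matrix of training spectra. The camera
# sensitivities, training spectra and noise level are synthetic stand-ins.
import numpy as np

wl = np.arange(500, 761, 10)                           # nm, spanning the nine analysis bands
n = wl.size

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

S = np.vstack([gauss(600, 40), gauss(550, 35), gauss(500, 30)])   # fake R, G, B sensitivities

# Smooth random training reflectance spectra -> prior autocorrelation matrix R_ss.
rng = np.random.default_rng(0)
train = np.cumsum(rng.normal(size=(200, n)), axis=1)
train -= train.min(axis=1, keepdims=True)
train /= train.max(axis=1, keepdims=True) + 1e-9
R_ss = train.T @ train / train.shape[0]

noise_var = 1e-4
W = R_ss @ S.T @ np.linalg.inv(S @ R_ss @ S.T + noise_var * np.eye(3))  # Wiener matrix

rgb = S @ train[0]                                     # simulated camera response to one spectrum
estimated_spectrum = W @ rgb                           # estimated reflectance at each wavelength
print(estimated_spectrum.round(2))
```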

  8. Remote Sensing of Liquid Water and Ice Cloud Optical Thickness and Effective Radius in the Arctic: Application of Airborne Multispectral MAS Data

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Yang, Ping; Arnold, G. Thomas; Gray, Mark A.; Riedi, Jerome C.; Ackerman, Steven A.; Liou, Kuo-Nan

    2003-01-01

    A multispectral scanning spectrometer was used to obtain measurements of the reflection function and brightness temperature of clouds, sea ice, snow, and tundra surfaces at 50 discrete wavelengths between 0.47 and 14.0 microns. These observations were obtained from the NASA ER-2 aircraft as part of the FIRE Arctic Clouds Experiment, conducted over a 1600 x 500 km region of the north slope of Alaska and surrounding Beaufort and Chukchi Seas between 18 May and 6 June 1998. Multispectral images of the reflection function and brightness temperature in 11 distinct bands of the MODIS Airborne Simulator (MAS) were used to derive a confidence in clear sky (or alternatively the probability of cloud), shadow, and heavy aerosol over five different ecosystems. Based on the results of individual tests run as part of the cloud mask, an algorithm was developed to estimate the phase of the clouds (water, ice, or undetermined phase). Finally, the cloud optical thickness and effective radius were derived for both water and ice clouds that were detected during one flight line on 4 June. This analysis shows that the cloud mask developed for operational use on MODIS, and tested using MAS data in Alaska, is quite capable of distinguishing clouds from bright sea ice surfaces during daytime conditions in the high Arctic. Results of individual tests, however, make it difficult to distinguish ice clouds over snow and sea ice surfaces, so additional tests were added to enhance the confidence in the thermodynamic phase of clouds over the Beaufort Sea. The cloud optical thickness and effective radius retrievals used 3 distinct bands of the MAS, with the newly developed 1.62 and 2.13 micron bands being used quite successfully over snow and sea ice surfaces. These results are contrasted with a MODIS-based algorithm that relies on spectral reflectance at 0.87 and 2.13 micron.

  9. A fully-automated approach to land cover mapping with airborne LiDAR and high resolution multispectral imagery in a forested suburban landscape

    NASA Astrophysics Data System (ADS)

    Parent, Jason R.; Volin, John C.; Civco, Daniel L.

    2015-06-01

    Information on land cover is essential for guiding land management decisions and supporting landscape-level ecological research. In recent years, airborne light detection and ranging (LiDAR) and high resolution aerial imagery have become more readily available in many areas. These data have great potential to enable the generation of land cover at a fine scale and across large areas by leveraging 3-dimensional structure and multispectral information. LiDAR and other high resolution datasets must be processed in relatively small subsets due to their large volumes; however, conventional classification techniques cannot be fully automated and thus are unlikely to be feasible options when processing large high-resolution datasets. In this paper, we propose a fully automated rule-based algorithm to develop a 1 m resolution land cover classification from LiDAR data and multispectral imagery. The algorithm we propose uses a series of pixel- and object-based rules to identify eight vegetated and non-vegetated land cover features (deciduous and coniferous tall vegetation, medium vegetation, low vegetation, water, riparian wetlands, buildings, low impervious cover). The rules leverage both structural and spectral properties including height, LiDAR return characteristics, brightness in visible and near-infrared wavelengths, and normalized difference vegetation index (NDVI). Pixel-based properties were used initially to classify each land cover class while minimizing omission error; a series of object-based tests were then used to remove errors of commission. These tests used conservative thresholds, based on diverse test areas, to help avoid over-fitting the algorithm to the test areas. The accuracy assessment of the classification results included a stratified random sample of 3198 validation points distributed across 30 1 × 1 km tiles in eastern Connecticut, USA. The sample tiles were selected in a stratified random manner from locations representing the full range of
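
    A heavily simplified flavour of such a pixel-based rule cascade is sketched below; the class set is reduced and every threshold is an illustrative placeholder rather than a value from the paper.

```python
# Hedged sketch of a per-pixel rule cascade in the spirit of the algorithm above.
# The class set is reduced and every threshold is an illustrative placeholder.
import numpy as np

def classify_pixels(height, ndvi, nir, red):
    """Assign coarse land-cover labels from LiDAR-derived height and spectral bands."""
    labels = np.full(height.shape, "low_impervious", dtype=object)
    vegetated = ndvi > 0.3                                     # NDVI separates vegetation
    labels[vegetated & (height > 3.0)] = "tall_vegetation"     # trees (nDSM height in m)
    labels[vegetated & (height > 0.5) & (height <= 3.0)] = "medium_vegetation"
    labels[vegetated & (height <= 0.5)] = "low_vegetation"
    labels[~vegetated & (nir < 0.08) & (red < 0.10)] = "water"  # dark in NIR and red
    labels[~vegetated & (height > 2.5) & (nir >= 0.08)] = "building"  # elevated, non-vegetated
    return labels

# Tiny example: four pixels standing in for a tree, a lawn, a roof and a pond.
height = np.array([12.0, 0.1, 6.0, 0.0])
ndvi = np.array([0.7, 0.5, 0.1, -0.2])
nir = np.array([0.45, 0.40, 0.30, 0.05])
red = np.array([0.05, 0.08, 0.25, 0.06])
print(classify_pixels(height, ndvi, nir, red))
```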

  10. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Images captured from airborne imaging systems have the advantages of relatively low cost, high spatial resolution, and real/near-real-time availability. Multiple images taken from one or more flight lines could be used to generate a high-resolution mosaic image, which could be useful for diverse rem...

  11. Verification of sensitivity enhancement of SWIR imager technology in advanced multispectral SWIR/VIS zoom cameras with constant and variable F-number

    NASA Astrophysics Data System (ADS)

    Hübner, M.; Achtner, B.; Kraus, M.; Siemens, C.; Münzberg, M.

    2016-05-01

    Current designs of combined VIS-color/SWIR camera optics use a constant F-number over the full field of view (FOV) range. Especially in the SWIR, limited space for camera integration in existing system volumes and relatively large pixel pitches of 15 μm or even 20 μm force the use of relatively high F-numbers to accomplish narrow fields of view of less than 2.0° with reasonable resolution for long range observation and targeting applications. Constant F-number designs have already been reported and considered [1] for submarine applications. The comparison of electro-optical performance was based on the detector noise performance and sensitivity data given by the detector manufacturer [1] and further modelling of the imaging chain within linear MTF system theory. The visible channel provides limited twilight capability at F/2.6, but in the SWIR the twilight capability is degraded due to the relatively high F-number of F/7 or F/5.25 for 20 μm and 15 μm pitch, respectively. Differences between prediction and experimental verification of sensitivity in terms of noise equivalent irradiance (NEI) and scene-based limiting illumination levels are shown for the visible and the SWIR spectral range. Within this context, currently developed improvements using optical zoom designs for the multispectral SWIR/VIS camera optics with continuously variable F-number are discussed, offering increased low light level capabilities at wide and medium fields of view while still enabling an NFOV < 2° with superior long range targeting capabilities under limited atmospheric visibility conditions at daytime.
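
    The sensitivity argument above essentially rests on focal-plane irradiance for an extended scene scaling roughly with 1/F²; the short sketch below only evaluates that squared ratio for the F-numbers quoted in the abstract and is not a model of the actual cameras.

```python
# Hedged sketch: for an extended scene, focal-plane irradiance scales roughly with
# 1/F^2, so the achievable NEI improves by the squared F-number ratio. The function
# only evaluates that ratio for the F-numbers quoted in the text.
def irradiance_gain(f_old, f_new):
    """Relative focal-plane irradiance gain when moving from f_old to f_new."""
    return (f_old / f_new) ** 2

print(irradiance_gain(7.0, 5.25))   # SWIR, 20 um -> 15 um pitch design: about 1.8x
print(irradiance_gain(7.0, 2.6))    # SWIR at F/7 versus the visible channel at F/2.6: about 7.2x
```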

  12. Use of reflectance spectra of native plant species for interpreting airborne multispectral scanner data in the East Tintic Mountains, Utah.

    USGS Publications Warehouse

    Milton, N.M.

    1983-01-01

    Analysis of in situ reflectance spectra of native vegetation was used to interpret airborne MSS data. Representative spectra from three plant species in the E Tintic Mountains, Utah, were used to interpret the color components on a color ratio composite image made from MSS data in the visible and near-infrared regions. A map of plant communities was made from the color ratio composite image and field checked. -from Author

  13. Feasibility of an airborne TV camera as a size spectrometer for cloud droplets in daylight.

    PubMed

    Roscoe, H K; Lachlan-Cope, T A; Roscoe, J

    1999-01-20

    Photographs of clouds taken with a camera with a large aperture ratio must have a short depth of focus to resolve small droplets. Hence the sampling volume is small, which limits the number of droplets and gives rise to a large statistical error on the number counted. However, useful signals can be obtained with a small aperture ratio, which allows for a sample volume large enough for counting cloud droplets at aircraft speeds with useful spatial resolution. The signal is sufficient to discriminate against noise from a sunlit cloud as background, provided the bandwidth of the light source and camera are restricted, and against readout noise. Hence, in principle, an instrument to sample the size distribution of cloud droplets from aircraft in daylight can be constructed from a simple TV camera and an array of laser diodes, without any components or screens external to the aircraft window.

  14. Visibility through the gaseous smoke in airborne remote sensing using a DSLR camera

    NASA Astrophysics Data System (ADS)

    Chabok, Mirahmad; Millington, Andrew; Hacker, Jorg M.; McGrath, Andrew J.

    2016-08-01

    Visibility and clarity of remotely sensed images acquired by consumer grade DSLR cameras, mounted on an unmanned aerial vehicle or a manned aircraft, are critical factors in obtaining accurate and detailed information from any area of interest. The presence of substantial haze, fog or gaseous smoke particles; caused, for example, by an active bushfire at the time of data capture, will dramatically reduce image visibility and quality. Although most modern hyperspectral imaging sensors are capable of capturing a large number of narrow range bands of the shortwave and thermal infrared spectral range, which have the potential to penetrate smoke and haze, the resulting images do not contain sufficient spatial detail to enable locating important objects or assist search and rescue or similar applications which require high resolution information. We introduce a new method for penetrating gaseous smoke without compromising spatial resolution using a single modified DSLR camera in conjunction with image processing techniques which effectively improves the visibility of objects in the captured images. This is achieved by modifying a DSLR camera and adding a custom optical filter to enable it to capture wavelengths from 480-1200nm (R, G and Near Infrared) instead of the standard RGB bands (400-700nm). With this modified camera mounted on an aircraft, images were acquired over an area polluted by gaseous smoke from an active bushfire. Processed data using our proposed method shows significant visibility improvements compared with other existing solutions.

  15. Application of phase matching autofocus in airborne long-range oblique photography camera

    NASA Astrophysics Data System (ADS)

    Petrushevsky, Vladimir; Guberman, Asaf

    2014-06-01

    The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, by day and night. The front Ritchey-Chretien optics is made of highly stable materials. However, the camera temperature varies considerably in flight conditions. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease of the structure length. The temperature and humidity effects change the distance between the mirrors by just a few microns. The distance change is small, but it nevertheless alters the camera's infinity focus setpoint significantly, especially in the EO band. To realize the optics' resolution potential, optimal focus must be constantly maintained. In-flight best-focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To obtain even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of an existing beam-sharer prism FPA arrangement, where an aperture partition exists inherently in the area of overlap between adjacent detectors. The defocus is proportional to the image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content. The closed-loop control converges robustly to the precise focus position. The algorithm uses the temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly in various conditions of scene illumination and contrast.
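
    The phase-matching idea described above (a shift between two sub-aperture views of the overlap region, low-pass filtered and fed back to the focus drive) can be caricatured in a few lines. The 1-D phase-correlation shift estimator, the filter and the gains below are assumptions for illustration, not the algorithm implemented in the Condor2.

```python
# Hedged sketch of the phase-matching idea: estimate the shift between two
# sub-aperture views of the overlap region by phase correlation, low-pass filter it,
# and feed it back to the focus mechanism. The 1-D treatment, gains and sign
# convention are assumptions for illustration.
import numpy as np

def phase_shift_1d(a, b):
    """Relative shift between two 1-D signals from the normalised cross-power spectrum."""
    cross = np.fft.fft(a) * np.conj(np.fft.fft(b))
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft(cross).real
    k = int(np.argmax(corr))
    return k if k <= a.size // 2 else k - a.size       # wrap to a signed shift

class PhaseMatchingAutofocus:
    def __init__(self, gain=0.2, smoothing=0.8, focus0=0.0):
        self.gain, self.smoothing = gain, smoothing
        self.focus, self.filtered_shift = focus0, 0.0

    def update(self, overlap_a, overlap_b):
        raw = phase_shift_1d(overlap_a, overlap_b)             # defocus-proportional shift
        self.filtered_shift = (self.smoothing * self.filtered_shift
                               + (1.0 - self.smoothing) * raw) # low-pass against scene noise
        self.focus -= self.gain * self.filtered_shift          # closed-loop focus correction
        return self.focus

# Example: signal b is a shifted copy of a, standing in for the overlap imagery.
x = np.linspace(0.0, 8.0 * np.pi, 256)
a = np.sin(x) + 0.3 * np.sin(3.0 * x)
b = np.roll(a, 4)
autofocus = PhaseMatchingAutofocus()
print(autofocus.update(a, b))
```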

  16. Multispectral imaging and image processing

    NASA Astrophysics Data System (ADS)

    Klein, Julie

    2014-02-01

    The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given over several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics like stereo multispectral imaging and goniometric multispectral measurements that are further explored with his expertise will also be presented in this work.

  17. Analysis of testbed airborne multispectral scanner data from Superflux II. [Chesapeake Bay plume and James Shelf data

    NASA Technical Reports Server (NTRS)

    Bowker, D. E.; Hardesty, C. A.; Jobson, D. J.; Bahn, G. S.

    1981-01-01

    A test bed aircraft multispectral scanner (TBAMS) was flown during the James Shelf, Plume Scan, and Chesapeake Bay missions as part of the Superflux 2 experiment. Excellent correlations were obtained between water sample measurements of chlorophyll and sediment and TBAMS radiance data. The three-band algorithms used were insensitive to aircraft altitude and varying atmospheric conditions. This was particularly fortunate due to the hazy conditions during most of the experiments. A contour map of sediment, and also chlorophyll, was derived for the Chesapeake Bay plume along the southern Virginia-Carolina coastline. A sediment maximum occurs about 5 nautical miles off the Virginia Beach coast with a chlorophyll maximum slightly shoreward of this. During the James Shelf mission, a thermal anomaly (or front) was encountered about 50 miles from the coast. There was a minor variation in chlorophyll and sediment across the boundary. During the Chesapeake Bay mission, the Sun elevation increased from 50 degrees to over 70 degrees, interfering with the generation of data products.

  18. Long-Term Tracking of a Specific Vehicle Using Airborne Optical Camera Systems

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Rosenbaum, D.; Runge, H.; Cerra, D.; Mattyus, G.; Reinartz, P.

    2016-06-01

    In this paper we present two low cost, airborne sensor systems capable of long-term vehicle tracking. Based on the properties of the sensors, a method for automatic real-time, long-term tracking of individual vehicles is presented. This combines the detection and tracking of the vehicle in low frame rate image sequences and applies the lagged Cell Transmission Model (CTM) to handle longer tracking outages occurring in complex traffic situations, e.g. tunnels. The CTM model uses the traffic conditions in the proximities of the target vehicle and estimates its motion to predict the position where it reappears. The method is validated on an airborne image sequence acquired from a helicopter. Several reference vehicles are tracked within a range of 500m in a complex urban traffic situation. An artificial tracking outage of 240m is simulated, which is handled by the CTM. For this, all the vehicles in the close proximity are automatically detected and tracked to estimate the basic density-flow relations of the CTM model. Finally, the real and simulated trajectories of the reference vehicles in the outage are compared showing good correspondence also in congested traffic situations.
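
    As a much simplified stand-in for the lagged CTM used by the authors, the sketch below predicts when a target should reappear at the far end of an outage from a triangular density-flow relation; the fundamental-diagram parameters and the density value are invented for illustration.

```python
# Hedged sketch, much simplified relative to the lagged CTM used by the authors:
# the time until the target reappears at the far end of an outage is predicted
# from a triangular density-flow relation estimated around the tracked neighbours.
# All fundamental-diagram parameters and the density value are invented.
def triangular_flow(density, free_speed=14.0, jam_density=0.15, wave_speed=5.0):
    """Flow (veh/s) from density (veh/m) under a triangular fundamental diagram."""
    critical = jam_density * wave_speed / (free_speed + wave_speed)
    if density <= critical:
        return free_speed * density
    return wave_speed * (jam_density - density)

def predicted_reappearance_time(outage_length_m, local_density):
    """Travel time through the outage at the speed implied by the local traffic state."""
    flow = triangular_flow(local_density)
    speed = flow / max(local_density, 1e-6)             # mean speed = flow / density
    return outage_length_m / max(speed, 0.1)

# Example: the 240 m simulated outage from the text, under moderate congestion.
print(f"{predicted_reappearance_time(240.0, 0.06):.1f} s")
```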

  19. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    NASA Astrophysics Data System (ADS)

    Song, Huaibo; Yang, Chenghai; Zhang, Jian; Hoffmann, Wesley Clint; He, Dongjian; Thomasson, J. Alex

    2016-01-01

    Images captured from airborne imaging systems can be mosaicked for diverse remote sensing applications. The objective of this study was to identify appropriate mosaicking techniques and software to generate mosaicked images for use by aerial applicators and other users. Three software packages (Photoshop CC, Autostitch, and Pix4Dmapper) were selected for mosaicking airborne images acquired from a large cropping area. Ground control points were collected for georeferencing the mosaicked images and for evaluating the accuracy of eight mosaicking techniques. Analysis and accuracy assessment showed that Pix4Dmapper can be the first choice if georeferenced imagery with high accuracy is required. The spherical method in Photoshop CC can be an alternative for cost considerations, and Autostitch can be used to quickly mosaic images with reduced spatial resolution. The results also showed that the accuracy of image mosaicking techniques could be greatly affected by the size of the imaging area or the number of the images and that the accuracy would be higher for a small area than for a large area. The results from this study will provide useful information for the selection of image mosaicking software and techniques for aerial applicators and other users.

  20. Airborne imaging for heritage documentation using the Fotokite tethered flying camera

    NASA Astrophysics Data System (ADS)

    Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

    2014-05-01

    Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost and increasing functionality and stability of ready-to-fly multi-copter systems have led to their proliferation among non-hobbyists. As such, they became a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs from excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft. As such, the

  1. Airborne Imagery

    NASA Technical Reports Server (NTRS)

    1983-01-01

    ATM (Airborne Thematic Mapper) was developed for NSTL (National Space Technology Laboratories) by the Daedalus Company. It offers expanded capabilities for timely, accurate and cost effective identification of areas with prospecting potential. A related system is TIMS, the Thermal Infrared Multispectral Scanner. Originating from Landsat 4, it is also used for agricultural studies, etc.

  2. Airborne multispectral remote sensing data to estimate several oenological parameters in vineyard production. A case study of application of remote sensing data to precision viticulture in central Italy.

    NASA Astrophysics Data System (ADS)

    Tramontana, Gianluca; Girard, Filippo; Belli, Claudio; Comandini, Maria Cristina; Pietromarchi, Paolo; Tiberi, Domenico; Papale, Dario

    2010-05-01

    It is widely recognized that environmental differences within the vineyard, with respect to soils, microclimate, and topography, can influence grape characteristics and crop yields. Moreover, the landscape of central Italy is characterized by a high level of fragmentation and heterogeneity, which requires stringent remote sensing technical features in terms of spectral, geometric and temporal resolution to support applications for precision viticulture. In response to the needs of the Italian grape and wine industry for an evaluation of precision viticulture technologies, the DISAFRI (University of Tuscia) and the Agricultural Research Council - Oenological Research Unit (ENC-CRA) jointly carried out an experimental study during the year 2008. The study was carried out on two areas located in the town of Velletri, near Rome; for each area, two varieties (red and white grape) were studied: Nero d'Avola and Sauvignon blanc in the first area, Merlot and Sauvignon blanc in the second. Remote sensing data were acquired in different periods using a low-cost multisensor airborne remote sensing platform developed by DISAFRI (ASPIS-2, Advanced Spectroscopic Imager System). ASPIS-2, an evolution of the ASPIS sensor (Papale et al 2008, Sensors), is a multispectral sensor based on 4 CCDs with 3 interferential filters per CCD. The filters are user selectable during the flight, so that ASPIS-2 is able to acquire data in 12 bands in the visible and near infrared regions with a bandwidth of 10 or 20 nm. For the purposes of this study, 7 spectral bands were acquired and 15 vegetation indices calculated. During the ripening period several vegetative and oenochemical parameters were monitored. ANOVA tests showed that several oenochemical variables, such as sugars, total acidity, polyphenols and anthocyanins, differ according to the variety taken into consideration. In order to evaluate the temporal autocorrelation of several oenological parameter values, a simple linear regression between

  3. [In-flight absolute radiometric calibration of UAV multispectral sensor].

    PubMed

    Chen, Wei; Yan, Lei; Gou, Zhi-Yang; Zhao, Hong-Ying; Liu, Da-Ping; Duan, Yi-Ni

    2012-12-01

    Based on data from the scientific experiment in Urad Front Banner for the UAV Remote Sensing Load Calibration Field project, and with the help of 6 hyperspectral radiometric targets with good Lambertian properties, the wide-view multispectral camera on the UAV was calibrated using the reflectance-based method. The results reveal that for the green, red and infrared channels, whose images were successfully captured, the linear correlation coefficients between DN and radiance are all larger than 99%. In the final analysis, the comprehensive error is no more than 6%. The calibration results demonstrate that the hyperspectral targets deployed at the calibration field are well suited to in-flight calibration of airborne multispectral payloads. The calibration result is reliable and can be used in the retrieval of geophysical parameters.
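
    The per-channel calibration step amounts to a straight-line fit between image digital numbers over the targets and the radiance predicted from their reflectance. The sketch below shows such a fit and its correlation coefficient with made-up numbers; none of the values correspond to the experiment.

```python
# Hedged sketch of the per-channel step: a straight line is fitted between the mean
# digital numbers over the calibration targets and the radiance predicted from their
# measured reflectance. Every number below is made up for illustration.
import numpy as np

dn = np.array([312.0, 845.0, 1620.0, 2410.0, 3150.0, 3890.0])   # mean DN over six targets
radiance = np.array([12.1, 33.5, 64.8, 96.0, 125.9, 155.2])     # predicted at-sensor radiance

gain, offset = np.polyfit(dn, radiance, 1)            # radiance = gain * DN + offset
r = np.corrcoef(dn, radiance)[0, 1]
print(f"gain={gain:.4f}, offset={offset:.2f}, linear correlation r={r:.4f}")

# Applying the calibration to a whole band is then a linear rescaling of the image.
band_dn = np.array([[500.0, 1500.0], [2500.0, 3500.0]])
print(gain * band_dn + offset)
```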

  4. Multispectral imaging of absorption and scattering properties of in vivo exposed rat brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Nishidate, Izumi; Ishizuka, Tomohiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-05-01

    In order to estimate multispectral images of the absorption and scattering properties in the cerebral cortex of in vivo rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital RGB camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated hemoglobin and that of deoxygenated hemoglobin were estimated as the absorption parameters, whereas the coefficient a and the exponent b of the reduced scattering coefficient spectrum approximated by a power law function were estimated as the scattering parameters. The spectra of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters, and the spectral images of absorption and reduced scattering coefficients were then estimated. In order to confirm the feasibility of this method, we performed in vivo experiments on exposed rat brain. The estimated images of the absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. The changes in the estimated absorption and scattering parameters during normoxia, hyperoxia, and anoxia indicate the potential applicability of the method by which to evaluate the pathophysiological conditions of in vivo brain due to the loss of tissue viability.

  5. Multispectral imaging of absorption and scattering properties of in vivo exposed rat brain using a digital red-green-blue camera.

    PubMed

    Yoshida, Keiichiro; Nishidate, Izumi; Ishizuka, Tomohiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-05-01

    In order to estimate multispectral images of the absorption and scattering properties in the cerebral cortex of in vivo rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital RGB camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated hemoglobin and that of deoxygenated hemoglobin were estimated as the absorption parameters, whereas the coefficient a and the exponent b of the reduced scattering coefficient spectrum approximated by a power law function were estimated as the scattering parameters. The spectra of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters, and the spectral images of absorption and reduced scattering coefficients were then estimated. In order to confirm the feasibility of this method, we performed in vivo experiments on exposed rat brain. The estimated images of the absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. The changes in the estimated absorption and scattering parameters during normoxia, hyperoxia, and anoxia indicate the potential applicability of the method by which to evaluate the pathophysiological conditions of in vivo brain due to the loss of tissue viability.

  6. Retrieval of water quality algorithms from airborne HySpex camera for oxbow lakes in north-eastern Poland

    NASA Astrophysics Data System (ADS)

    Slapinska, Malgorzata; Berezowski, Tomasz; Frąk, Magdalena; Chormański, Jarosław

    2016-04-01

    The aim of this study was to retrieve empirical formulas for the water quality of oxbow lakes in the Lower Biebrza Basin (NE Poland) using the HySpex airborne imaging spectrometer. The Biebrza River valley is one of the biggest wetlands in Europe. It is characterised by a low contamination level and little human influence. Because of those characteristics the Biebrza River can be treated as a reference area for other floodplain and fen ecosystems in Europe. Oxbow lakes are an important part of the Lower Biebrza Basin due to their retention and habitat functions. Hyperspectral remote sensing data were acquired by the HySpex sensor (which covers the range of 400-2500 nm) on 01-02.08.2015, with the ground measurement campaign conducted on 03-04.08.2015. The ground measurements consisted of two parts. The first part included spectral reflectance sampling with the ASD FieldSpec 3 spectroradiometer, which covers the wavelength range of 350-2500 nm at 1 nm intervals. In situ data were collected both for water and for specific objects within the area. The second part of the campaign included water parameters such as Secchi disc depth (SDD), electric conductivity (EC), pH, temperature and phytoplankton. The measured reflectance enabled an empirical line atmospheric correction, which was applied to the HySpex data. Our results indicated that proper atmospheric correction was very important for further data analysis. The empirical formulas for our water parameters were retrieved based on the reflectance data. This study confirmed the applicability of the HySpex camera for retrieving water quality.
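
    The empirical line correction mentioned above can be sketched as a per-band linear regression between image values over reference targets and their ground-measured reflectance, applied to the whole band. All target values in the sketch are invented for illustration.

```python
# Hedged sketch of an empirical line correction: for each band, ground-measured
# reflectance of reference targets is regressed against the image values over those
# targets, and the fitted line converts the whole band to reflectance. The target
# values below are invented for illustration.
import numpy as np

def empirical_line(image_band, target_image_values, target_ground_reflectance):
    """Per-band empirical line: reflectance = slope * image_value + intercept."""
    slope, intercept = np.polyfit(target_image_values, target_ground_reflectance, 1)
    return slope * image_band + intercept

at_sensor = np.array([0.045, 0.120, 0.310])     # image values over dark/grey/bright targets
asd_reflectance = np.array([0.03, 0.18, 0.52])  # field-spectrometer reflectance of the same targets

band = np.array([[0.05, 0.10], [0.20, 0.30]])   # a tiny subset of one image band
print(empirical_line(band, at_sensor, asd_reflectance))
```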

  7. A Web-GIS Procedure Based on Satellite Multi-Spectral and Airborne LIDAR Data to Map the Road Blockage Due to Seismic Damages of Built-Up Urban Areas

    NASA Astrophysics Data System (ADS)

    Costanzo, Antonio; Montuori, Antonio; Silva, Juan Pablo; Silvestri, Malvina; Musacchio, Massimo; Buongiorno, Maria Fabrizia; Stramondo, Salvatore

    2016-08-01

    In this work, a web-GIS procedure to map the risk of road blockage in urban environments through the combined use of space-borne and airborne remote sensing sensors is presented. The methodology involves (1) the provision of a geo-database through the integration of space-borne multispectral images and airborne LiDAR data products; (2) the modeling of building vulnerability, based on the corresponding 3D geometry and construction time information; and (3) the GIS-based mapping of road closure due to seismic-related building collapses, based on the characteristic building height and the width of the road. Experimental results, gathered for the Cosenza urban area, demonstrate the benefits of both the proposed approach and the GIS-based integration of multi-platform remote sensing sensors and techniques for seismic road assessment purposes.
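
    One plausible form of the road-closure rule in step (3) is sketched below: a street segment is flagged when the assumed debris extent of an adjacent collapsed building, taken as a fraction of its height scaled by a vulnerability class, exceeds the road width. The fractions and classes are assumptions, not the model used in the paper.

```python
# Hedged sketch of one plausible road-closure rule: a street is flagged when the
# assumed debris extent of an adjacent collapsed building (a fraction of its height,
# scaled by a vulnerability class) exceeds the free road width. Fractions and classes
# are assumptions, not the model used in the paper.
def debris_width(building_height_m, vulnerability):
    """Very rough debris footprint beyond the facade of a collapsed building."""
    fraction = {"low": 0.3, "medium": 0.5, "high": 0.7}[vulnerability]
    return fraction * building_height_m

def road_blocked(road_width_m, building_height_m, vulnerability):
    return debris_width(building_height_m, vulnerability) >= road_width_m

# Example: a 15 m tall building next to an 8 m wide street.
print(road_blocked(8.0, 15.0, "high"))   # True  -> potential blockage
print(road_blocked(8.0, 15.0, "low"))    # False -> street assumed passable
```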

  8. Geo-Referenced Mapping Using an Airborne 3D Time-of-Flight Camera

    NASA Astrophysics Data System (ADS)

    Kohoutek, T. K.; Nitsche, M.; Eisenbeiss, H.

    2011-09-01

    This paper presents the first experience of close-range bird's-eye-view photogrammetry with range imaging (RIM) sensors for the real-time generation of high resolution geo-referenced 3D surface models. The aim of this study was to develop a mobile, versatile and less costly outdoor survey methodology for measuring natural surfaces, compared to terrestrial laser scanning (TLS). Two commercial RIM cameras (an SR4000 by MESA Imaging AG and a CamCube 2.0 by PMDTechnologies GmbH) were mounted on a lightweight crane and on an unmanned aerial vehicle (UAV). The field experiments revealed various challenges in the real-time deployment of the two state-of-the-art RIM systems, e.g. processing of the large data volume. The acquisition strategy, data processing and first measurements are presented. The precision of the measured distances is less than 1 cm under good conditions. However, the measurement precision degraded under the test conditions due to direct sunlight, strong illumination contrasts and helicopter vibrations.

  9. Improved Airborne System for Sensing Wildfires

    NASA Technical Reports Server (NTRS)

    McKeown, Donald; Richardson, Michael

    2008-01-01

    The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.

  10. Estimation of the Spectral Sensitivity Functions of Un-Modified and Modified Commercial Off-The-Shelf Digital Cameras to Enable Their Use as a Multispectral Imaging System for UAVs

    NASA Astrophysics Data System (ADS)

    Berra, E.; Gibson-Poole, S.; MacArthur, A.; Gaulton, R.; Hamilton, A.

    2015-08-01

    Commercial off-the-shelf (COTS) digital cameras on-board unmanned aerial vehicles (UAVs) have the potential to be used as multispectral imaging systems; however, their spectral sensitivity is usually unknown and needs to be either measured or estimated. This paper details a step-by-step methodology for identifying the spectral sensitivity of modified (to be responsive to near infra-red wavelengths) and un-modified COTS digital cameras, showing the results of its application for three different models of camera. Six digital still cameras, which are being used as imaging systems on-board different UAVs, were selected to have their spectral sensitivities measured with a monochromator. Each camera was exposed to monochromatic light ranging from 370 nm to 1100 nm in 10 nm steps, with images of each step recorded in RAW format. The RAW images were converted linearly into TIFF images using DCRaw, an open-source program, before being batch processed through ImageJ (also open-source), which calculated the mean and standard deviation values for each of the red-green-blue (RGB) channels over a fixed central region within each image. These mean values were then related to the relative spectral radiance from the monochromator and its integrating sphere, in order to obtain the relative spectral response (RSR) for each of the cameras' colour channels. It was found that different un-modified camera models present very different RSRs in some channels, and one of the modified cameras showed a response that was unexpected. This highlights the need to determine the RSR of a camera before using it for any quantitative studies.
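
    The final normalisation step described above reduces to dividing the mean linear DN per channel at each monochromator wavelength by the relative spectral radiance of the source and scaling to a unit peak. The sketch below does exactly that with synthetic stand-ins for the measured arrays.

```python
# Hedged sketch of the final normalisation: mean linear DN per channel at each
# monochromator step is divided by the relative spectral radiance of the source and
# scaled to a unit peak, giving a relative spectral response. The arrays below are
# synthetic stand-ins for the measured values.
import numpy as np

wavelengths = np.arange(370, 1101, 10)               # nm, 10 nm monochromator steps
mean_dn = {                                          # mean of the central ROI per channel
    "R": np.exp(-0.5 * ((wavelengths - 600) / 60.0) ** 2),
    "G": np.exp(-0.5 * ((wavelengths - 540) / 50.0) ** 2),
    "B": np.exp(-0.5 * ((wavelengths - 470) / 45.0) ** 2),
}
sphere_radiance = 0.5 + 0.5 * (wavelengths - 370) / (1100 - 370)   # relative source spectrum

def relative_spectral_response(dn, radiance):
    rsr = dn / radiance                              # remove the source spectrum
    return rsr / rsr.max()                           # normalise to a unit peak

rsr = {ch: relative_spectral_response(v, sphere_radiance) for ch, v in mean_dn.items()}
print({ch: int(wavelengths[np.argmax(r)]) for ch, r in rsr.items()})   # peak wavelengths (nm)
```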

  11. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO), and use of this camera for observations of star-formation regions. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  12. Airborne Network Camera Standard

    DTIC Science & Technology

    2015-06-01

    The purpose of this standard is primarily to cover terminology included in or consistent with the GigE Vision (GEV) standard and the IRIG 106-13 Chapter 10 standard for command and control.

  13. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC) and Submillimeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  14. Remote sensing techniques applied to multispectral recognition of the Aranjuez pilot zone

    NASA Technical Reports Server (NTRS)

    Lemos, G. L.; Salinas, J.; Rebollo, M.

    1977-01-01

    A rectangular (7 x 14 km) area 40 km S of Madrid was remote-sensed with a three-stage recognition process. Ground truth was established in the first phase, airborne sensing with a multispectral scanner and photographic cameras were used in the second phase, and Landsat satellite data were obtained in the third phase. Agronomic and hydrological photointerpretation problems are discussed. Color, black/white, and labeled areas are displayed for crop recognition in the land-use survey; turbidity, concentrations of pollutants and natural chemicals, and densitometry of the water are considered in the evaluation of water resources.

  15. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications has not been well documented in related

  16. Multispectral Internet imaging

    NASA Astrophysics Data System (ADS)

    Brettel, Hans; Schmitt, Francis J. M.

    2000-12-01

    We present a system for multispectral image acquisition which is accessible via an Internet connection. The system includes an electronically tunable spectral filter and a monochrome digital camera, both controlled from a PC-type computer acting as a Web server. In contrast to the three fixed color channels of an ordinary WebCam, our system provides a virtually unlimited number of spectral channels. To allow for interactive use of this multispectral image acquisition system through the network, we developed a set of Java servlets which provide access to the system through HyperText Transfer Protocol (HTTP) requests. Since only the standard Common Gateway Interface (CGI) mechanisms for client-server communication are used, the system is accessible from any Web browser.

  17. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.; Simpson, A. D. (Technical Monitor)

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC "Pop-Up" Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(Registered Trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC-developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  18. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC 'Pop-up' Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  19. Multispectral photometric stereo for acquiring high-fidelity surface normals.

    PubMed

    Nam, Giljoo; Kim, Min H

    2014-01-01

    Multispectral imaging and photometric stereo are common in 3D imaging but rarely have been combined. Reconstructing a 3D object's shape using photometric stereo is challenging owing to indirect illumination, specular reflection, and self-shadows, and removing interreflection in photometric stereo is problematic. A new multispectral photometric-stereo method removes interreflection on diffuse materials using multispectral-reflectance information and reconstructs 3D shapes with high accuracy. You can integrate this method into photometric-stereo systems by simply substituting the original camera with a multispectral camera.
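
    The core computation in (Lambertian) photometric stereo, which this record builds on, is a per-pixel least-squares solve of intensity against known light directions. The sketch below is a minimal generic version of that step, assuming a diffuse surface and ignoring the multispectral interreflection removal that is the paper's actual contribution.

    ```python
    # Minimal Lambertian photometric-stereo sketch (not the authors' multispectral method):
    # recover per-pixel surface normals from images under m known light directions.
    import numpy as np

    def photometric_stereo(images: np.ndarray, lights: np.ndarray) -> np.ndarray:
        """images: (m, h, w) intensities under m lights; lights: (m, 3) unit direction vectors."""
        m, h, w = images.shape
        I = images.reshape(m, -1)                        # (m, h*w) stacked intensities
        G, *_ = np.linalg.lstsq(lights, I, rcond=None)   # solve lights @ G = I; G = albedo * normal
        albedo = np.linalg.norm(G, axis=0)               # per-pixel albedo, (h*w,)
        normals = G / np.maximum(albedo, 1e-8)           # unit normals, (3, h*w)
        return normals.T.reshape(h, w, 3)
    ```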

  20. Airborne hyperspectral surface and cloud bi-directional reflectivity observations in the Arctic using a commercial, digital camera

    NASA Astrophysics Data System (ADS)

    Ehrlich, A.; Bierwirth, E.; Wendisch, M.; Herber, A.; Gayet, J.-F.

    2011-09-01

    Spectral radiance measurements by a digital single-lens reflex camera were used to derive the bi-directional reflectivity of clouds and different surfaces in the Arctic. The camera has been calibrated radiometrically and spectrally to provide accurate radiance measurements with high angular resolution. A comparison with spectral radiance measurements from the SMART-Albedometer showed an agreement within the uncertainties of both instruments. The bi-directional reflectivity in terms of the hemispherical directional reflectance factor HDRF was obtained for sea ice, ice-free ocean and clouds. The sea ice, with an albedo of ρ = 0.96, showed an almost isotropic HDRF, while sun glint was observed for the ocean HDRF (ρ = 0.12). For the cloud observations with ρ = 0.62, the fog bow - a backscatter feature typical of scattering by liquid water droplets - was covered by the camera. For measurements above heterogeneous stratocumulus clouds, the number of images required to obtain a mean HDRF which clearly exhibits the fog bow has been estimated at about 50 images (10 min flight time). A representation of the HDRF as a function of the scattering angle only reduces the image number to about 10 (2 min flight time). The measured cloud and ocean HDRF have been compared to radiative transfer simulations. The ocean HDRF simulated with the observed surface wind speed of 9 m s^-1 agreed best with the measurements. For the cloud HDRF, the best agreement was obtained by a broad and weak fog bow simulated with a cloud droplet effective radius of Reff = 4 μm. This value agrees with the particle sizes from in situ measurements and retrieved from the spectral radiance of the SMART-Albedometer.
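
    Under simplifying assumptions, the hemispherical directional reflectance factor discussed above can be approximated per pixel as π times the measured upward radiance divided by the downward irradiance. The sketch below illustrates only that normalization step, not the paper's full radiometric calibration chain; the variable names and example numbers are assumptions.

    ```python
    # Illustrative HDRF normalization under simplifying assumptions:
    # HDRF ~ pi * (upward radiance per pixel) / (downward irradiance).
    import numpy as np

    def hdrf(radiance: np.ndarray, downward_irradiance: float) -> np.ndarray:
        """radiance: calibrated upward radiance per pixel; downward_irradiance: matched band irradiance."""
        return np.pi * radiance / downward_irradiance

    # A Lambertian surface with albedo 0.96 gives HDRF ~0.96 at every viewing angle,
    # matching the nearly isotropic sea-ice case described above.
    lambertian_radiance = 0.96 * 500.0 / np.pi * np.ones((100, 100))  # assumed irradiance of 500
    snow_like_hdrf = hdrf(lambertian_radiance, downward_irradiance=500.0)
    ```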

  1. Close-up multispectral images of the surface of comet 67P/Churyumov-Gerasimenko by the ROLIS camera onboard the Rosetta Philae lander

    NASA Astrophysics Data System (ADS)

    Schroeder, S.; Mottola, S.; Arnold, G.; Grothues, H. G.; Jaumann, R.; Michaelis, H.; Neukum, G.; Pelivan, I.; Bibring, J. P.

    2014-12-01

    In November 2014 the Philae lander onboard Rosetta is scheduled to land on the surface of comet 67P/Churyumov-Gerasimenko. The ROLIS camera will provide the ground truth for the Rosetta OSIRIS camera. ROLIS will acquire images both during the descent and after landing. In this paper we concentrate on the post-landing images. The close-up images will enable us to characterize the morphology and texture of the surface, and the shape, albedo, and size distribution of the particles on scales as small as 0.3 mm per pixel. We may see evidence for a dust mantle, a refractory crust, and exposed ice. In addition, we hope to identify features such as pores, cracks, or vents that allow volatiles to escape the surface. We will not only image the surface during the day but also the night, when LEDs will illuminate the surface in four different colors (blue, green, red, near-IR). This will characterize the spectral properties and heterogeneity of the surface, helping us to identify its composition. Although the ROLIS spectral range and resolution are too limited to allow an exact mineralogical characterization, a study of the spectral slope and albedo will allow a broad classification of the solid surface phases. We expect to be able to distinguish between organic material, silicates and ices. By repeated imaging over the course of the mission ROLIS may detect long term changes associated with cometary activity.

  2. Novel x-ray multispectral imaging of ultraintense laser plasmas by a single-photon charge coupled device based pinhole camera

    SciTech Connect

    Labate, L.; Giulietti, A.; Giulietti, D.; Koester, P.; Levato, T.; Gizzi, L. A.; Zamponi, F.; Luebcke, A.; Kaempfer, T.; Uschmann, I.; Foerster, E.

    2007-10-15

    Spectrally resolved two-dimensional imaging of ultrashort laser-produced plasmas is described, obtained by means of an advanced technique. The technique has been tested with microplasmas produced by ultrashort relativistic laser pulses. The technique is based on the use of a pinhole camera equipped with a charge coupled device detector operating in the single-photon regime. The spectral resolution is about 150 eV in the 4-10 keV range, and images in any selected photon energy range have a spatial resolution of 5 {mu}m. The potential of the technique to study fast electron propagation in ultraintense laser interaction with multilayer targets is discussed and some preliminary results are shown.

  3. Cucumber disease diagnosis using multispectral images

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Li, Hongning; Shi, Junsheng; Yang, Weiping; Liao, Ningfang

    2009-07-01

    In this paper, a multispectral imaging technique for plant disease diagnosis is presented. Firstly, a multispectral imaging system is designed. This system utilizes 15 narrow-band filters, a panchromatic band, a monochrome CCD camera, and a standard illumination observing environment. The spectral reflectance and color of 8 Macbeth color patches are reproduced between 400 nm and 700 nm in the process. In addition, the spectral reflectance angle and color difference are obtained through measurements and analysis of the color patches using a spectrometer and the multispectral imaging system. The result shows that the 16 narrow-band multispectral imaging system achieves good accuracy in spectral reflectance and color reproduction. Secondly, the familiar diseases of cucumber, a horticultural plant, are the research objects. 210 multispectral samples are obtained by the multispectral imaging system and are classified by a BP artificial neural network. The classification accuracies of Sphaerotheca fuliginea, Corynespora cassiicola and Pseudoperonospora cubensis are 100%, while those of Trichothecium roseum and Cladosporium cucumerinum are 96.67% and 90.00%, respectively. It is confirmed that the multispectral imaging system achieves good accuracy in cucumber disease diagnosis.

  4. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight-demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely-piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000 acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management. Few studies have examined the usefulness of high resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 x 10^6 pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).

  5. Commercial Applications Multispectral Sensor System

    NASA Technical Reports Server (NTRS)

    Birk, Ronald J.; Spiering, Bruce

    1993-01-01

    NASA's Office of Commercial Programs is funding a multispectral sensor system to be used in the development of remote sensing applications. The Airborne Terrestrial Applications Sensor (ATLAS) is designed to provide versatility in acquiring spectral and spatial information. The ATLAS system will be a test bed for the development of specifications for airborne and spaceborne remote sensing instrumentation for dedicated applications. This objective requires spectral coverage from the visible through thermal infrared wavelengths, variable spatial resolution from 2-25 meters; high geometric and geo-location accuracy; on-board radiometric calibration; digital recording; and optimized performance for minimized cost, size, and weight. ATLAS is scheduled to be available in 3rd quarter 1992 for acquisition of data for applications such as environmental monitoring, facilities management, geographic information systems data base development, and mineral exploration.

  6. Multispectral Photography: the obscure becomes the obvious

    ERIC Educational Resources Information Center

    Polgrean, John

    1974-01-01

    Commonly used in map making, real estate zoning, and highway route location, aerial photography planes equipped with multispectral cameras may, among many environmental applications, now be used to locate mineral deposits, define marshland boundaries, study water pollution, and detect diseases in crops and forests. (KM)

  7. Processing Of Multispectral Data For Identification Of Rocks

    NASA Technical Reports Server (NTRS)

    Evans, Diane L.

    1990-01-01

    Linear discriminant analysis and supervised classification evaluated. Report discusses processing of multispectral remote-sensing imagery to identify kinds of sedimentary rocks by spectral signatures in geological and geographical contexts. Raw image data are spectra of picture elements in images of seven sedimentary rock units exposed on margin of Wind River Basin in Wyoming. Data acquired by Landsat Thematic Mapper (TM), Thermal Infrared Multispectral Scanner (TIMS), and NASA/JPL airborne synthetic-aperture radar (SAR).

  8. Airborne multispectral detection of regrowth cotton fields

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Regrowth of cotton, Gossypium hirsutum L., can provide boll weevils, Anthonomus grandis Boheman, with an extended opportunity to feed and reproduce beyond the production season. Effective methods for timely areawide detection of these potential host plants are critically needed to achieve eradicati...

  9. Airborne Multi-Spectral Minefield Survey

    DTIC Science & Technology

    2005-05-01

    Quality Control. An important methodological aspect of the selected approach is the pyramidal information structure, which is reflected in the use of ... the image interpreter to manually assign ground control points. After AGM processing for each individual image the results are stored in GEOTIFF file ...

  10. 3D Land Cover Classification Based on Multispectral LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    A multispectral Lidar system can emit simultaneous laser pulses at different wavelengths. The reflected multispectral energy is captured through a receiver of the sensor, and the return signal together with the position and orientation information of the sensor is recorded. These recorded data are processed with GNSS/IMU data in further post-processing, forming high density multispectral 3D point clouds. As the first commercial multispectral airborne Lidar sensor, the Optech Titan system is capable of collecting point cloud data from all three channels: at 532 nm visible (green), at 1064 nm near infrared (NIR) and at 1550 nm intermediate infrared (IR). It has become a new source of data for 3D land cover classification. The paper presents an Object Based Image Analysis (OBIA) approach that uses only multispectral Lidar point cloud datasets for 3D land cover classification. The approach consists of three steps. Firstly, multispectral intensity images are segmented into image objects on the basis of multi-resolution segmentation integrating different scale parameters. Secondly, intensity objects are classified into nine categories by using the customized features of classification indexes and a combination of the multispectral reflectance with the vertical distribution of object features. Finally, accuracy assessment is conducted by comparing random reference sample points from Google imagery tiles with the classification results. The classification results show high overall accuracy for most of the land cover types. Over 90% overall accuracy is achieved by using multispectral Lidar point clouds for 3D land cover classification.

  11. Miniature snapshot multispectral imager

    NASA Astrophysics Data System (ADS)

    Gupta, Neelam; Ashe, Philip R.; Tan, Songsheng

    2011-03-01

    We present a miniature snapshot multispectral imager based on using a monolithic filter array that operates in the short wavelength infrared spectral region and has a number of defense and commercial applications. The system is low-weight, portable with a miniature platform, and requires low power. The imager uses a 4×4 Fabry-Pérot filter array operating from 1487 to 1769 nm with a spectral bandpass ~10 nm. The design of the filters is based on using a shadow mask technique to fabricate an array of Fabry-Pérot etalons with two multilayer dielectric mirrors. The filter array is installed in a commercial handheld InGaAs camera, replacing the imaging lens with a custom designed 4×4 microlens assembly with telecentric imaging performance in each of the 16 subimaging channels. We imaged several indoor and outdoor scenes. The microlens assembly and filter design is quite flexible and can be tailored for any wavelength region from the ultraviolet to the longwave infrared, and the spectral bandpass can also be customized to meet sensing requirements. In this paper we discuss the design and characterization of the filter array, the microlens optical assembly, and imager and present imaging results.

  12. Application of multispectral systems for the diagnosis of plant diseases

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Liao, Ningfang; Wang, Guolong; Luo, Yongdao; Liang, Minyong

    2008-03-01

    Multispectral imaging technique combines space imaging and spectral detecting. It can obtain the spectral information and image information of an object at the same time. Based on this concept, a new method using a multispectral camera system is proposed to diagnose plant diseases. In this paper, a multispectral camera was used as the image capturing device. It consists of a monochrome CCD camera and 16 narrow-band filters. The multispectral images of the Macbeth 24 color patches are captured under the illumination of an incandescent lamp in this experiment. The 64 spectral reflectances of each color patch are calculated using spline interpolation from 400 to 700 nm in the process, and the color of the object is reproduced from the estimated spectral reflectance. The result of the reproduction is compared with the color signal measured using an X-rite PULSE spectrophotometer. The average and maximum ΔE*ab are 9.23 and 12.81. It is confirmed that the multispectral system realizes the color reproduction of plant diseases from narrow-band multispectral images.
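
    Two of the processing steps mentioned above, interpolating 16 narrow-band samples to a dense reflectance curve and scoring colour reproduction with ΔE*ab, can be sketched as follows. The filter centre wavelengths and the CIE76 formula are assumptions for illustration, not the authors' exact processing.

    ```python
    # Sketch: cubic-spline interpolation of 16 narrow-band reflectance samples to 64 wavelengths,
    # plus a CIE76 colour difference between two CIELAB triplets. Band centres are assumed.
    import numpy as np
    from scipy.interpolate import CubicSpline

    band_centers_nm = np.linspace(400, 700, 16)      # assumed filter centre wavelengths

    def interpolate_reflectance(samples_16: np.ndarray) -> np.ndarray:
        """Return reflectance resampled at 64 wavelengths between 400 and 700 nm."""
        spline = CubicSpline(band_centers_nm, samples_16)
        return spline(np.linspace(400, 700, 64))

    def delta_e_ab(lab1, lab2) -> float:
        """CIE76 colour difference between two CIELAB triplets."""
        return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))
    ```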

  13. Multispectral microwave imaging radar for remote sensing applications

    NASA Technical Reports Server (NTRS)

    Larson, R. W.; Rawson, R.; Ausherman, D.; Bryan, L.; Porcello, L.

    1974-01-01

    A multispectral airborne microwave radar imaging system, capable of obtaining four images simultaneously is described. The system has been successfully demonstrated in several experiments and one example of results obtained, fresh water ice, is given. Consideration of the digitization of the imagery is given and an image digitizing system described briefly. Preliminary results of digitization experiments are included.

  14. Leica ADS40 Sensor for Coastal Multispectral Imaging

    NASA Technical Reports Server (NTRS)

    Craig, John C.

    2007-01-01

    The Leica ADS40 Sensor as it is used for coastal multispectral imaging is presented. The contents include: 1) Project Area Overview; 2) Leica ADS40 Sensor; 3) Focal Plate Arrangements; 4) Trichroid Filter; 5) Gradient Correction; 6) Image Acquisition; 7) Remote Sensing and ADS40; 8) Band comparisons of Satellite and Airborne Sensors; 9) Impervious Surface Extraction; and 10) Impervious Surface Details.

  15. Fourier multispectral imaging.

    PubMed

    Jia, Jie; Ni, Chuan; Sarangan, Andrew; Hirakawa, Keigo

    2015-08-24

    Current multispectral imaging systems use narrowband filters to capture the spectral content of a scene, which necessitates different filters to be designed for each application. In this paper, we demonstrate the concept of Fourier multispectral imaging which uses filters with sinusoidally varying transmittance. We designed and built these filters employing a single-cavity resonance, and made spectral measurements with a multispectral LED array. The measurements show that spectral features such as transmission and absorption peaks are preserved with this technique, which makes it a more versatile technique than narrowband filters for a wide range of multispectral imaging applications.
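
    Because each filter's transmittance varies sinusoidally with wavelength, every measurement behaves roughly like a Fourier-type projection of the pixel's spectrum, and the spectrum can be estimated by a regularized least-squares inversion. The toy sketch below illustrates that idea with assumed modulation periods and phases; it is not the authors' reconstruction pipeline.

    ```python
    # Toy Fourier-multispectral reconstruction: measurements through sinusoidal-transmittance
    # filters are inverted with regularized least squares. Periods and phases are assumptions.
    import numpy as np

    wavelengths = np.linspace(450, 650, 101)          # nm grid for the spectral estimate
    periods_nm = [400, 200, 133, 100, 80]             # assumed transmittance modulation periods
    phases = [0.0, np.pi / 2]

    # Sensing matrix: one row per (period, phase) filter, T[i, j] = transmittance at wavelength j.
    T = np.array([0.5 * (1 + np.cos(2 * np.pi * wavelengths / p + ph))
                  for p in periods_nm for ph in phases])

    def reconstruct_spectrum(measurements: np.ndarray, ridge: float = 1e-3) -> np.ndarray:
        """Minimize ||T s - m||^2 + ridge * ||s||^2 to get a smooth spectral estimate s."""
        A = T.T @ T + ridge * np.eye(T.shape[1])
        return np.linalg.solve(A, T.T @ measurements)
    ```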

  16. Multispectral imaging system for contaminant detection

    NASA Technical Reports Server (NTRS)

    Poole, Gavin H. (Inventor)

    2003-01-01

    An automated inspection system for detecting digestive contaminants on food items as they are being processed for consumption includes a conveyor for transporting the food items, a light sealed enclosure which surrounds a portion of the conveyor, with a light source and a multispectral or hyperspectral digital imaging camera disposed within the enclosure. Operation of the conveyor, light source and camera are controlled by a central computer unit. Light reflected by the food items within the enclosure is detected in predetermined wavelength bands, and detected intensity values are analyzed to detect the presence of digestive contamination.

  17. Multispectral imaging using a single bucket detector

    NASA Astrophysics Data System (ADS)

    Bian, Liheng; Suo, Jinli; Situ, Guohai; Li, Ziwei; Fan, Jingtao; Chen, Feng; Dai, Qionghai

    2016-04-01

    Existing multispectral imagers mostly use available array sensors to separately measure 2D data slices in a 3D spatial-spectral data cube. Thus they suffer from low photon efficiency, limited spectrum range and high cost. To address these issues, we propose to conduct multispectral imaging using a single bucket detector, to take full advantage of its high sensitivity, wide spectrum range, low cost, small size and light weight. Technically, utilizing the detector’s fast response, a scene’s 3D spatial-spectral information is multiplexed into a dense 1D measurement sequence and then demultiplexed computationally under the single pixel imaging scheme. A proof-of-concept setup is built to capture multispectral data of 64 pixels × 64 pixels × 10 wavelength bands ranging from 450 nm to 650 nm, with the acquisition time being 1 minute. The imaging scheme holds great potentials for various low light and airborne applications, and can be easily manufactured as production-volume portable multispectral imagers.
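
    The single-pixel principle behind this scheme is that a sequence of known modulation patterns turns a 2D (or 3D spatial-spectral) scene into a 1D bucket-detector signal that can be demultiplexed computationally. The sketch below shows a generic, tiny-scale version using random binary patterns and a least-squares solve; the pattern choice and solver are illustrative assumptions, not the authors' method.

    ```python
    # Generic single-pixel imaging sketch for one spectral band: modulate the scene with known
    # random binary patterns, record one bucket value per pattern, recover by least squares.
    import numpy as np

    rng = np.random.default_rng(0)
    h = w = 8                                    # tiny image for illustration
    n_pixels, n_patterns = h * w, 2 * h * w      # oversample 2x for a stable solve
    patterns = rng.integers(0, 2, size=(n_patterns, n_pixels)).astype(float)

    def measure(scene: np.ndarray) -> np.ndarray:
        """Bucket detector output: total light passed by each pattern."""
        return patterns @ scene.ravel()

    def reconstruct(bucket_values: np.ndarray) -> np.ndarray:
        image, *_ = np.linalg.lstsq(patterns, bucket_values, rcond=None)
        return image.reshape(h, w)

    # For a multispectral cube, repeat per band (or interleave patterns in time and demultiplex):
    # cube = np.stack([reconstruct(measure(band)) for band in bands])
    ```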

  18. Multispectral imaging using a single bucket detector

    PubMed Central

    Bian, Liheng; Suo, Jinli; Situ, Guohai; Li, Ziwei; Fan, Jingtao; Chen, Feng; Dai, Qionghai

    2016-01-01

    Existing multispectral imagers mostly use available array sensors to separately measure 2D data slices in a 3D spatial-spectral data cube. Thus they suffer from low photon efficiency, limited spectrum range and high cost. To address these issues, we propose to conduct multispectral imaging using a single bucket detector, to take full advantage of its high sensitivity, wide spectrum range, low cost, small size and light weight. Technically, utilizing the detector’s fast response, a scene’s 3D spatial-spectral information is multiplexed into a dense 1D measurement sequence and then demultiplexed computationally under the single pixel imaging scheme. A proof-of-concept setup is built to capture multispectral data of 64 pixels × 64 pixels × 10 wavelength bands ranging from 450 nm to 650 nm, with the acquisition time being 1 minute. The imaging scheme holds great potentials for various low light and airborne applications, and can be easily manufactured as production-volume portable multispectral imagers. PMID:27103168

  19. Multispectral imaging using a single bucket detector.

    PubMed

    Bian, Liheng; Suo, Jinli; Situ, Guohai; Li, Ziwei; Fan, Jingtao; Chen, Feng; Dai, Qionghai

    2016-04-22

    Existing multispectral imagers mostly use available array sensors to separately measure 2D data slices in a 3D spatial-spectral data cube. Thus they suffer from low photon efficiency, limited spectrum range and high cost. To address these issues, we propose to conduct multispectral imaging using a single bucket detector, to take full advantage of its high sensitivity, wide spectrum range, low cost, small size and light weight. Technically, utilizing the detector's fast response, a scene's 3D spatial-spectral information is multiplexed into a dense 1D measurement sequence and then demultiplexed computationally under the single pixel imaging scheme. A proof-of-concept setup is built to capture multispectral data of 64 pixels × 64 pixels × 10 wavelength bands ranging from 450 nm to 650 nm, with the acquisition time being 1 minute. The imaging scheme holds great potentials for various low light and airborne applications, and can be easily manufactured as production-volume portable multispectral imagers.

  20. Analysis of multispectral signatures of the shot

    NASA Astrophysics Data System (ADS)

    Kastek, Mariusz; Dulski, Rafał; Piątkowski, Tadeusz; Madura, Henryk; Bareła, Jarosław; Polakowski, Henryk

    2011-06-01

    The paper presents some practical aspects of sniper IR signature measurements. Descriptions of particular signatures for a sniper shot in typical scenarios are presented. We take into consideration sniper activities in the open area as well as in an urban environment. The measurements were made at a field test ground. High precision laboratory measurements were also performed. Several infrared cameras were used during the measurements to cover all measurement assumptions. Some of the cameras are measurement-class devices with high accuracy and frame rates. The registrations were made simultaneously in the UV, NWIR, SWIR and LWIR spectral bands. The infrared cameras can be fitted with optical filters for multispectral measurement. An ultra fast visual camera was also used for visible spectrum registration. Exemplary sniper IR signatures for typical situations are presented. The LWIR imaging spectroradiometer HyperCam was also used during the laboratory measurements and field experiments. The signatures collected by HyperCam were useful for the determination of the spectral characteristics of the shot.

  1. Tracking Using Peer-to-Peer Smart Infrared Cameras

    DTIC Science & Technology

    2008-11-05

    calibration and gesture recognition from multi-spectral camera setups, including infrared and visible cameras. Result: We developed new object models for...work on single-camera gesture recognition. We partnered with Yokogawa Electric to develop new architectures for embedded computer vision. We developed

  2. Novel instrumentation of multispectral imaging technology for detecting tissue abnormity

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua

    2012-10-01

    Multispectral imaging is becoming a powerful tool in a wide range of biological and clinical studies by adding spectral, spatial and temporal dimensions to visualize tissue abnormity and the underlying biological processes. A conventional spectral imaging system includes two physically separated major components: a band-passing selection device (such as a liquid crystal tunable filter or diffraction grating) and a scientific-grade monochromatic camera, and is expensive and bulky. Recently a micro-arrayed narrow-band optical mosaic filter was invented and successfully fabricated to reduce the size and cost of multispectral imaging devices in order to meet the clinical requirement for medical diagnostic imaging applications. However, the challenging issue of how to integrate and place the micro filter mosaic chip onto the target focal plane, i.e., the imaging sensor, of an off-the-shelf CMOS/CCD camera has not been reported anywhere. This paper presents the methods and results of integrating such a miniaturized filter with off-the-shelf CMOS imaging sensors to produce handheld real-time multispectral imaging devices for the application of early stage pressure ulcer (ESPU) detection. Unlike conventional multispectral imaging devices, which are bulky and expensive, the resulting handheld real-time multispectral ESPU detector can produce multiple images at different center wavelengths with a single shot, therefore eliminating the image registration procedure required by traditional multispectral imaging technologies.

  3. Remote sensing of shorelines using data fusion of hyperspectral and multispectral imagery acquired from mobile and fixed platforms

    NASA Astrophysics Data System (ADS)

    Bostater, Charles R.; Frystacky, Heather

    2012-06-01

    An optimized data fusion methodology is presented and makes use of airborne and vessel mounted hyperspectral and multispectral imagery acquired at littoral zones in Florida and the northern Gulf of Mexico. The results demonstrate the use of hyperspectral-multispectral data fusion anomaly detection along shorelines and in surface and subsurface waters. Hyperspectral imagery utilized in the data fusion analysis was collected using a 64-1024 channel, 1376 pixel swath width, temperature stabilized sensing system; an integrated inertial motion unit; and differential GPS. The imaging system is calibrated using dual 18 inch calibration spheres, spectral line sources, and custom line targets. Simultaneously collected multispectral three-band imagery used in the data fusion analysis was derived from either a 12 inch focal length large format camera using 9 inch high speed AGFA color negative film, a 12.3 megapixel digital camera or dual high speed full definition video cameras. Pushbroom sensor imagery is corrected using Kalman filtering and smoothing in order to correct images for airborne platform motions or motions of a small vessel. Custom software developed for the hyperspectral system and the optimized data fusion process allows for post processing using atmospherically corrected and georeferenced reflectance imagery. The optimized data fusion approach allows for detecting spectral anomalies in the resolution enhanced data cubes. Spectral-spatial anomaly detection is demonstrated using simulated embedded targets in actual imagery. The approach allows one to utilize spectral signature anomalies to identify features and targets that would otherwise not be possible. The optimized data fusion techniques and software have been developed in order to perform sensitivity analysis of the synthetic images in order to optimize the singular value decomposition model building process and the 2-D Butterworth cutoff frequency selection process, using the concept of user defined "feature

  4. Low SWaP multispectral sensors using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4 band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and scalable production.
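
    De-mosaicing a dichroic filter array works like colour demosaicing: each band is sub-sampled on a regular grid and then interpolated back to full resolution. The sketch below shows the simplest possible version for an assumed 2x2 RGB + NIR layout with nearest-neighbour fill; real pipelines use better interpolation.

    ```python
    # Sketch: split a 2x2 filter-array mosaic (assumed RGB + NIR layout) into four
    # full-resolution band images using sub-sampling and nearest-neighbour upsampling.
    import numpy as np

    LAYOUT = {(0, 0): "red", (0, 1): "green", (1, 0): "blue", (1, 1): "nir"}  # assumed layout

    def demosaic_2x2(raw: np.ndarray) -> dict:
        """raw: (H, W) mosaic frame with H, W even. Returns a dict of per-band images."""
        h, w = raw.shape
        bands = {}
        for (dy, dx), name in LAYOUT.items():
            sub = raw[dy::2, dx::2].astype(float)                     # (H/2, W/2) samples
            up = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)      # nearest-neighbour fill
            bands[name] = up[:h, :w]
        return bands
    ```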

  5. Optimization of multispectral sensors for bathymetry applications

    NASA Technical Reports Server (NTRS)

    Tanis, F. J.; Byrnes, H. J.

    1986-01-01

    The Naval Oceanographic Office has proposed to augment current capabilities with an airborne MSS system capable of conducting hydrographic surveys of shallow and clear oceanic waters for purposes of determining ocean depth and identifying marine hazards. Recent efforts have concentrated on development of an active/passive system, where the active system will be used to calibrate a passive multispectral sensor. In this paper, parameters which influence collection-system design and depth-extraction techniques have been used to describe the practical bounds to which MSS technology can support coastal bathymetric surveying. Performance is estimated in terms of expected S/N and depth-extraction errors.

  6. Development and application of multispectral algorithms for defect apple inspection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This research developed and evaluated the multispectral algorithm derived from a hyperspectral line-scan imaging system, which was equipped with an electron-multiplying charge-coupled-device camera and an imaging spectrograph, for the detection of defective Red Delicious apples. The algorithm utilized the fluo...

  7. Real-time aerial multispectral imaging solutions using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Chandler, Eric V.; Fish, David E.

    2014-06-01

    The next generation of multispectral sensors and cameras needs to deliver significant improvements in size, weight, portability, and spectral band customization to support widespread commercial deployment for a variety of purpose-built aerial, unmanned, and scientific applications. The benefits of multispectral imaging are well established for applications including machine vision, biomedical, authentication, and remote sensing environments - but many aerial and OEM solutions require more compact, robust, and cost-effective production cameras to realize these benefits. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color camera image processing, individual spectral channels are de-mosaiced with each channel providing an image of the field of view. We demonstrate recent results of 4-9 band dichroic filter arrays in multispectral cameras using a variety of sensors including linear, area, silicon, and InGaAs. Specific implementations range from hybrid RGB + NIR sensors to custom sensors with application-specific VIS, NIR, and SWIR spectral bands. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and development path. Finally, we report on the wafer-level fabrication of dichroic filter arrays on imaging sensors for scalable production of multispectral sensors and cameras.

  8. Real-time compact multispectral imaging solutions using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Chandler, Eric V.; Fish, David E.

    2014-03-01

    The next generation of multispectral sensors and cameras will need to deliver significant improvements in size, weight, portability, and spectral band customization to support widespread commercial deployment. The benefits of multispectral imaging are well established for applications including machine vision, biomedical, authentication, and aerial remote sensing environments - but many OEM solutions require more compact, robust, and cost-effective production cameras to realize these benefits. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color camera image processing, individual spectral channels are de-mosaiced with each channel providing an image of the field of view. We demonstrate recent results of 4-9 band dichroic filter arrays in multispectral cameras using a variety of sensors including linear, area, silicon, and InGaAs. Specific implementations range from hybrid RGB + NIR sensors to custom sensors with application-specific VIS, NIR, and SWIR spectral bands. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and development path. Finally, we report on the wafer-level fabrication of dichroic filter arrays on imaging sensors for scalable production of multispectral sensors and cameras.

  9. Active and passive multispectral scanner for earth resources applications: An advanced applications flight experiment

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.; Peterson, L. M.; Thomson, F. J.; Work, E. A.; Kriegler, F. J.

    1977-01-01

    The development of an experimental airborne multispectral scanner to provide both active (laser illuminated) and passive (solar illuminated) data from a commonly registered surface scene is discussed. The system was constructed according to specifications derived in an initial program design study. The system was installed in an aircraft and test flown to produce illustrative active and passive multispectral imagery. However, data were neither collected nor analyzed for any specific application.

  10. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
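
    The quoted dynamic-range figures follow from the standard logarithmic definition, as the quick check below shows (the 51.3 dB and 82.06 dB values are taken from the abstract; the conversion itself is generic).

    ```python
    # Dynamic range in decibels: DR_dB = 20 * log10(brightest resolvable signal / dimmest one).
    import math

    def dynamic_range_db(i_max: float, i_min: float) -> float:
        return 20.0 * math.log10(i_max / i_min)

    # Converting the quoted figures back to intensity ratios:
    ratio_cmos = 10 ** (51.3 / 20)    # ~367:1 for the CMOS sensor alone
    ratio_caos = 10 ** (82.06 / 20)   # ~12,700:1 for the CAOS-CMOS camera
    ```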

  11. An Approach to Application Validation of Multispectral Sensors Using AVIRIS

    NASA Technical Reports Server (NTRS)

    Warner, Amanda; Blonski, Slawomir; Gasser, Gerald; Ryan, Robert; Zanoni, Vicki

    2001-01-01

    High-resolution multispectral data are becoming widely available for commercial and scientific use. For specific applications, such as agriculture studies, there is a need to quantify the performance of such systems. In many cases, parameters such as GSD and SNR can be optimized. Data sets with varying GSD's for the Landsat ETM+ bands were produced to evaluate the effects of GSD on various algorithms and transformations, such as NDVI, principal component analysis, unsupervised classification, and mixture analysis. By showing that AVIRIS data can be used to simulate spaceborne and airborne multispectral platforms over a wide range of GSD, this research can be used to assist in band selection and spatial resolution specifications for new sensors and in optimization of acquisition strategies for existing multispectral systems.
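
    The spectral half of simulating a multispectral sensor from AVIRIS-like data amounts to averaging the narrow hyperspectral channels that fall within each target band, after which indices such as NDVI can be computed from the synthesized bands. The sketch below uses a simple boxcar response and nominal Landsat ETM+ red/NIR band edges as assumptions; spatial GSD degradation would be a separate resampling step.

    ```python
    # Sketch: synthesize broad bands from a hyperspectral cube by averaging the channels
    # inside each target band, then compute NDVI. Band edges and cube layout are assumptions.
    import numpy as np

    def synthesize_band(cube: np.ndarray, wavelengths_nm: np.ndarray,
                        lo_nm: float, hi_nm: float) -> np.ndarray:
        """cube: (bands, h, w) hyperspectral radiance; returns the (h, w) mean broad band."""
        mask = (wavelengths_nm >= lo_nm) & (wavelengths_nm <= hi_nm)
        return cube[mask].mean(axis=0)

    def ndvi(cube: np.ndarray, wavelengths_nm: np.ndarray) -> np.ndarray:
        red = synthesize_band(cube, wavelengths_nm, 630, 690)   # nominal ETM+ red band
        nir = synthesize_band(cube, wavelengths_nm, 775, 900)   # nominal ETM+ NIR band
        return (nir - red) / (nir + red + 1e-8)
    ```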

  12. Multispectral Image Capturing with Foveon Sensors

    NASA Astrophysics Data System (ADS)

    Gehrke, R.; Greiwe, A.

    2013-08-01

    This article describes a specific image quality problem using a UAV and the commercially available multispectral camera Tetracam ADC Lite. The tests were carried out with the commercially available Multirotor MR-X 8 UAV under normal use and conditions. The ADC Lite shows a remarkable rolling shutter effect caused by the movement and vibrations of the UAV and the slow readout speed of the sensor. Based on these studies, the current state of a sensor development is presented, which is composed of two compact cameras with Foveon sensors. These cameras allow high quality image data to be recorded without motion blur or rolling shutter effects. One camera captures the normal colour range; the second camera is modified for the near infrared. The moving parts of both cameras are glued to ensure that a geometric camera calibration is valid over a longer period of time. The success of the gluing procedure has been proven by multiple calibrations. For matching the colour and infrared images, the usability of calibrated relative orientation parameters between the two cameras was tested. Despite absolutely synchronous triggering of the cameras by an electrical signal, a time delay of up to 3/100 s can be found between the images. This time delay, in combination with the movement and rotation of the UAV while taking the photos, results in a significant error in the previously calibrated relative orientation. These parameters should not be used in further processing. This article concludes with a first result of a 4-channel image and an outlook on the following investigations.

  13. Galileo multispectral imaging of Earth.

    PubMed

    Geissler, P; Thompson, W R; Greenberg, R; Moersch, J; McEwen, A; Sagan, C

    1995-08-25

    Nearly 6000 multispectral images of Earth were acquired by the Galileo spacecraft during its two flybys. The Galileo images offer a unique perspective on our home planet through the spectral capability made possible by four narrowband near-infrared filters, intended for observations of methane in Jupiter's atmosphere, which are not incorporated in any of the currently operating Earth orbital remote sensing systems. Spectral variations due to mineralogy, vegetative cover, and condensed water are effectively mapped by the visible and near-infrared multispectral imagery, showing a wide variety of biological, meteorological, and geological phenomena. Global tectonic and volcanic processes are clearly illustrated by these images, providing a useful basis for comparative planetary geology. Differences between plant species are detected through the narrowband IR filters on Galileo, allowing regional measurements of variation in the "red edge" of chlorophyll and the depth of the 1-micrometer water band, which is diagnostic of leaf moisture content. Although evidence of life is widespread in the Galileo data set, only a single image (at approximately 2 km/pixel) shows geometrization plausibly attributable to our technical civilization. Water vapor can be uniquely imaged in the Galileo 0.73-micrometer band, permitting spectral discrimination of moist and dry clouds with otherwise similar albedo. Surface snow and ice can be readily distinguished from cloud cover by narrowband imaging within the sensitivity range of Galileo's silicon CCD camera. Ice grain size variations can be mapped using the weak H2O absorption at 1 micrometer, a technique which may find important applications in the exploration of the moons of Jupiter. The Galileo images have the potential to make unique contributions to Earth science in the areas of geological, meteorological and biological remote sensing, due to the inclusion of previously untried narrowband IR filters. The vast scale and near global

  14. Land use classification utilizing remote multispectral scanner data and computer analysis techniques

    NASA Technical Reports Server (NTRS)

    Leblanc, P. N.; Johannsen, C. J.; Yanner, J. E.

    1973-01-01

    An airborne multispectral scanner was used to collect the visible and reflective infrared data. A small subdivision near Lafayette, Indiana was selected as the test site for the urban land use study. Multispectral scanner data were collected over the subdivision on May 1, 1970 from an altitude of 915 meters. The data were collected in twelve wavelength bands from 0.40 to 1.00 micrometers by the scanner. The results indicated that computer analysis of multispectral data can be very accurate in classifying and estimating the natural and man-made materials that characterize land uses in an urban scene.

  15. Multispectral imaging probe

    SciTech Connect

    Sandison, David R.; Platzbecker, Mark R.; Descour, Michael R.; Armour, David L.; Craig, Marcus J.; Richards-Kortum, Rebecca

    1999-01-01

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector.

  16. Multispectral imaging probe

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Descour, M.R.; Armour, D.L.; Craig, M.J.; Richards-Kortum, R.

    1999-07-27

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector. 8 figs.

  17. SWNT Imaging Using Multispectral Image Processing

    NASA Astrophysics Data System (ADS)

    Blades, Michael; Pirbhai, Massooma; Rotkin, Slava V.

    2012-02-01

    A flexible optical system was developed to image carbon single-wall nanotube (SWNT) photoluminescence using the multispectral capabilities of a typical CCD camcorder. The built-in Bayer filter of the CCD camera was utilized, using OpenCV C++ libraries for image processing, to decompose the image generated in a high magnification epifluorescence microscope setup into three pseudo-color channels. By carefully calibrating the filter beforehand, it was possible to extract spectral data from these channels, and effectively isolate the SWNT signals from the background.
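
    The decomposition step described above reduces to splitting the colour frame into its three pseudo-colour channels and applying a pre-calibrated unmixing matrix. The sketch below illustrates that idea in Python with a placeholder 3x3 matrix; the actual calibration values and the OpenCV C++ pipeline used in the work are not reproduced here.

    ```python
    # Sketch: unmix the three Bayer-derived pseudo-colour channels with a pre-calibrated
    # 3x3 matrix to isolate a narrowband emission from background. Matrix values are placeholders.
    import numpy as np

    CALIBRATION = np.array([[ 1.2, -0.1, -0.1],
                            [-0.2,  1.1, -0.1],
                            [-0.1, -0.2,  1.3]])   # placeholder spectral unmixing matrix

    def unmix(frame_bgr: np.ndarray) -> np.ndarray:
        """frame_bgr: (h, w, 3) camera frame; returns (h, w, 3) unmixed channel images."""
        pixels = frame_bgr.reshape(-1, 3).astype(float)
        return (pixels @ CALIBRATION.T).reshape(frame_bgr.shape)
    ```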

  18. Automated Data Production For A Novel Airborne Multiangle Spectropolarimetric Imager (AIRMSPI)

    NASA Technical Reports Server (NTRS)

    Jovanovic, V .M.; Bull, M.; Diner, D. J.; Geier, S.; Rheingans, B.

    2012-01-01

    A novel polarimetric imaging technique making use of rapid retardance modulation has been developed by JPL as a part of NASA's Instrument Incubator Program. It has been built into the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) under NASA's Airborne Instrument Technology Transition Program, and is aimed primarily at remote sensing of the amounts and microphysical properties of aerosols and clouds. AirMSPI includes an 8-band (355, 380, 445, 470, 555, 660, 865, 935 nm) pushbroom camera that measures polarization in a subset of the bands (470, 660, and 865 nm). The camera is mounted on a gimbal and acquires imagery in a configurable set of along-track viewing angles ranging between +67 deg and -67 deg relative to nadir. As a result, near simultaneous multi-angle, multi-spectral, and polarimetric measurements of the targeted areas at a spatial resolution ranging from 7 m to 20 m (depending on the viewing angle) can be derived. An automated data production system is being built to support high data acquisition rate in concert with co-registration and orthorectified mapping requirements. To date, a number of successful engineering checkout flights were conducted in October 2010, August-September 2011, and January 2012. Data products resulting from these flights will be presented.

  19. Multispectral Mapping of the Moon by Clementine

    NASA Technical Reports Server (NTRS)

    Eliason, Eric M.; McEwen, Alfred S.; Robinson, M.; Lucey, Paul G.; Duxbury, T.; Malaret, E.; Pieters, Carle; Becker, T.; Isbell, C.; Lee, E.

    1998-01-01

    One of the chief scientific objectives of the Clementine mission at the Moon was to acquire global multispectral mapping. A global digital map of the Moon in 11 spectral bandpasses and at a scale of 100 m/pixel is being produced at the U.S. Geological Survey in Flagstaff, Arizona. Near-global coverage was acquired with the UVVIS camera (central wavelengths of 415, 750, 900, 950, and 1000 nm) and the NIR camera (1102, 1248, 1499, 1996, 2620, and 2792 nm). We expect to complete processing of the UVVIS mosaics before the fall of 1998, and to complete the NIR mosaics a year later. The purpose of this poster is to provide an update on the processing and to show examples of the products or perhaps even a wall-sized display of color products from the UVVIS mosaics.

  20. Experimental Results of Ground Disturbance Detection Using Uncooled Infrared Imagers in Wideband and Multispectral Modes

    DTIC Science & Technology

    2012-02-01

    imaging for ground disturbance detection. We performed experiments to study ground disturbance detection using multispectral imaging. Multispectral ... were investigated and experimentally validated on buried mine signatures using MWIR and LWIR cameras [2-4]. As the performance of low cost, uncooled ...

  1. Multispectral metamaterial absorber.

    PubMed

    Grant, J; McCrindle, I J H; Li, C; Cumming, D R S

    2014-03-01

    We present the simulation, implementation, and measurement of a multispectral metamaterial absorber (MSMMA) and show that we can realize a simple absorber structure that operates in the mid-IR and terahertz (THz) bands. By embedding an IR metamaterial absorber layer into a standard THz metamaterial absorber stack, a narrowband resonance is induced at a wavelength of 4.3 μm. This resonance is in addition to the THz metamaterial absorption resonance at 109 μm (2.75 THz). We demonstrate the inherent scalability and versatility of our MSMMA by describing a second device whereby the MM-induced IR absorption peak frequency is tuned by varying the IR absorber geometry. Such a MSMMA could be coupled with a suitable sensor and formed into a focal plane array, enabling multispectral imaging.

  2. Color enhancement in multispectral image of human skin

    NASA Astrophysics Data System (ADS)

    Mitsui, Masanori; Murakami, Yuri; Obi, Takashi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    2003-07-01

    Multispectral imaging is receiving attention in medical color imaging, as high-fidelity color information can be acquired by multispectral image capture. On the other hand, as color enhancement in medical color images is effective for distinguishing lesions from normal tissue, we apply a new technique for color enhancement using multispectral images to enhance the features contained in a certain spectral band, without changing the average color distribution of the original image. In this method, to keep the average color distribution, a KL transform is applied to the spectral data, and only high-order KL coefficients are amplified in the enhancement. Multispectral images of the human skin of a bruised arm are captured by a 16-band multispectral camera, and the proposed color enhancement is applied. The resultant images are compared with the color images reproduced assuming a CIE D65 illuminant (obtained by a natural color reproduction technique). As a result, the proposed technique successfully visualizes unclear bruised lesions, which are almost invisible in natural color images. The proposed technique will provide a support tool for diagnosis in dermatology, visual examination in internal medicine, nursing care for preventing bedsores, and so on.
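
    The enhancement idea, amplifying only the high-order KL (principal-component) coefficients so the dominant colour distribution is preserved, can be sketched as below. The number of components left unchanged and the gain factor are assumptions for illustration, not the paper's parameters.

    ```python
    # Sketch of KL-transform enhancement: project per-pixel spectra onto KL (PCA) components,
    # amplify only the high-order coefficients, and transform back. keep/gain are assumptions.
    import numpy as np

    def kl_enhance(cube: np.ndarray, keep: int = 3, gain: float = 3.0) -> np.ndarray:
        """cube: (h, w, bands) multispectral image; returns the enhanced cube."""
        h, w, b = cube.shape
        X = cube.reshape(-1, b).astype(float)
        mean = X.mean(axis=0)
        Xc = X - mean
        eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
        V = eigvecs[:, np.argsort(eigvals)[::-1]]       # components sorted by decreasing variance
        coeffs = Xc @ V
        coeffs[:, keep:] *= gain                        # boost only high-order KL coefficients
        return (coeffs @ V.T + mean).reshape(h, w, b)   # average colour distribution preserved
    ```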

  3. Polarimetric Multispectral Imaging Technology

    NASA Technical Reports Server (NTRS)

    Cheng, L.-J.; Chao, T.-H.; Dowdy, M.; Mahoney, C.; Reyes, G.

    1993-01-01

    The Jet Propulsion Laboratory is developing a remote sensing technology on which a new generation of compact, lightweight, high-resolution, low-power, reliable, versatile, programmable scientific polarimetric multispectral imaging instruments can be built to meet the challenge of future planetary exploration missions. The instrument is based on the fast programmable acousto-optic tunable filter (AOTF) of tellurium dioxide (TeO2) that operates in the wavelength range of 0.4-5 microns. Basically, the AOTF multispectral imaging instrument measures incoming light intensity as a function of spatial coordinates, wavelength, and polarization. Its operation can be in either sequential, random access, or multiwavelength mode as required. This provides observation flexibility, allowing real-time alternation among desired observations, collecting needed data only, minimizing data transmission, and permitting implementation of new experiments. These will result in optimization of the mission performance with minimal resources. Recently we completed a polarimetric multispectral imaging prototype instrument and performed outdoor field experiments for evaluating application potentials of the technology. We also investigated potential improvements on AOTF performance to strengthen technology readiness for applications. This paper will give a status report on the technology and a prospect toward future planetary exploration.

  4. Estimation of cotton yield with varied irrigation and nitrogen treatments using aerial multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton yield varies spatially within a field. The variability can be caused by various production inputs such as soil properties, water management, and fertilizer application. Airborne multispectral imaging is capable of providing data and information to study effects of the inputs on yield qualitat...

  5. Use of multispectral scanner images for assessment of hydrothermal alteration in the Marysvale, Utah, mining area.

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Abrams, M.J.

    1983-01-01

    Airborne multispectral scanner data were analyzed. A color composite image was constructed using the following spectral band ratios: 1.6/2.2 μm, 1.6/0.48 μm, and 0.67/1.0 μm. The color ratio composite successfully distinguished most types of altered rocks from unaltered rocks and permitted further division of altered rocks into ferric oxide-rich and -poor types.
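
    As an illustration of how such a band-ratio color composite can be assembled, the sketch below divides pairs of bands and stretches the ratios for display. The band indices, the epsilon guard, and the simple min-max stretch are assumptions for this sketch, not details from the study.

```python
# Hedged sketch of a ratio composite (R = 1.6/2.2, G = 1.6/0.48, B = 0.67/1.0 um).
import numpy as np

def ratio_composite(cube, idx_048, idx_067, idx_10, idx_16, idx_22, eps=1e-6):
    """cube: (H, W, B) calibrated scanner cube; idx_* are hypothetical band indices."""
    r = cube[..., idx_16] / (cube[..., idx_22] + eps)
    g = cube[..., idx_16] / (cube[..., idx_048] + eps)
    b = cube[..., idx_067] / (cube[..., idx_10] + eps)
    rgb = np.stack([r, g, b], axis=-1).astype(float)
    # per-channel stretch to 0-1 for display
    rgb -= rgb.min(axis=(0, 1), keepdims=True)
    rgb /= rgb.max(axis=(0, 1), keepdims=True) + eps
    return rgb
```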

  6. A preliminary report of multispectral scanner data from the Cleveland harbor study

    NASA Technical Reports Server (NTRS)

    Shook, D.; Raquet, C.; Svehla, R.; Wachter, D.; Salzman, J.; Coney, T.; Gedney, D.

    1975-01-01

    Imagery obtained from an airborne multispectral scanner is presented. A synoptic view of the entire study area is shown for a number of time periods and for a number of spectral bands. Using several bands, sediment distributions, thermal plumes, and Rhodamine B dye distributions are shown.

  7. Multi-Spectral Cloud Property Retrieval

    NASA Technical Reports Server (NTRS)

    Carlson, Barbara E.; Lynch, R

    1999-01-01

    Despite numerous studies to retrieve cloud properties using infrared measurements, the information content of the data has not yet been fully exploited. In an effort to more fully utilize the information content of infrared measurements, we have developed a multi-spectral technique for retrieving effective cloud particle size, optical depth, and effective cloud temperature. While applicable to all cloud types, we begin by validating our retrieval technique through analysis of MS spectral radiances obtained during the SUCCESS field campaign over the ARM SGP CART facility, and compare our retrieval product with lidar and MODIS Airborne Simulator (MAS) measurement results. The technique is then applied to the Nimbus-4 MS infrared spectral measurements to obtain global cloud information.

  8. Galileo multispectral imaging of Earth

    NASA Astrophysics Data System (ADS)

    Geissler, Paul; Thompson, W. Reid; Greenberg, Richard; Moersch, Jeff; McEwen, Alfred; Sagan, Carl

    Nearly 6000 multispectral images of Earth were acquired by the Galileo spacecraft during its two flybys. The Galileo images offer a unique perspective on our home planet through the spectral capability made possible by four narrowband near-infrared filters, intended for observations of methane in Jupiter's atmosphere, which are not incorporated in any of the currently operating Earth orbital remote sensing systems. Spectral variations due to mineralogy, vegetative cover, and condensed water are effectively mapped by the visible and near-infrared multispectral imagery, showing a wide variety of biological, meteorological, and geological phenomena. Global tectonic and volcanic processes are clearly illustrated by these images, providing a useful basis for comparative planetary geology. Differences between plant species are detected through the narrowband IR filters on Galileo, allowing regional measurements of variation in the ``red edge'' of chlorophyll and the depth of the 1-μm water band, which is diagnostic of leaf moisture content. Although evidence of life is widespread in the Galileo data set, only a single image (at ~2 km/pixel) shows geometrization plausibly attributable to our technical civilization. Water vapor can be uniquely imaged in the Galileo 0.73-μm band, permitting spectral discrimination of moist and dry clouds with otherwise similar albedo. Surface snow and ice can be readily distinguished from cloud cover by narrowband imaging within the sensitivity range of Galileo's silicon CCD camera. Ice grain size variations can be mapped using the weak H2O absorption at 1 μm, a technique which may find important applications in the exploration of the moons of Jupiter. The Galileo images have the potential to make unique contributions to Earth science in the areas of geological, meteorological and biological remote sensing, due to the inclusion of previously untried narrowband IR filters. The vast scale and near global coverage of the Galileo data set

  9. Automated oil spill detection with multispectral imagery

    NASA Astrophysics Data System (ADS)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
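
    A rough sketch of the kind of pipeline described above is given below: a band combination to emphasize the oil signature, a contrast stretch standing in for histogram warping, thresholding, connected-component filtering, and an area estimate from the ground sample distance. The band weighting, threshold, and minimum region size are placeholders, not the authors' values.

```python
# Hedged sketch of an RGB-based oil-mask pipeline under assumed parameters.
import numpy as np
from scipy import ndimage

def detect_oil(rgb, gsd_m=None, thresh=0.6, min_pixels=50):
    """Return a boolean oil mask and, if gsd_m is given, coverage in m^2."""
    r, g, b = (rgb[..., k].astype(float) for k in range(3))
    feature = (r + g) / (b + 1.0)                  # assumed band combination
    lo, hi = np.percentile(feature, [2, 98])       # contrast stretch
    feature = np.clip((feature - lo) / (hi - lo + 1e-9), 0, 1)
    mask = feature > thresh
    labels, n = ndimage.label(mask)                # contiguous regions
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))
    area = keep.sum() * gsd_m ** 2 if gsd_m is not None else None
    return keep, area
```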

  10. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.…

  11. Airborne imaging sensors for environmental monitoring & surveillance in support of oil spills & recovery efforts

    NASA Astrophysics Data System (ADS)

    Bostater, Charles R.; Jones, James; Frystacky, Heather; Coppin, Gaelle; Leavaux, Florian; Neyt, Xavier

    2011-11-01

    Collection of pushbroom sensor imagery from a mobile platform requires corrections using inertial measurement units (IMUs) and DGPS in order to create useable imagery for environmental monitoring and surveillance of shorelines in freshwater systems, coastal littoral zones and harbor areas. This paper describes a suite of imaging systems used during collection of hyperspectral imagery in northern Florida panhandle and Gulf of Mexico airborne missions to detect weathered oil in coastal littoral zones. Underlying concepts of pushbroom imagery, the needed corrections for directional changes using DGPS, and corrections for platform yaw, pitch, and roll using IMU data are described, as well as the development and application of optimal bands and spectral regions associated with weathered oil. Pushbroom sensor and frame camera data collected in response to the recent Gulf of Mexico oil spill disaster are presented as the scenario documenting environmental monitoring and surveillance techniques using mobile sensing platforms. Data were acquired during the months of February, March, April and May of 2011. The low altitude airborne systems include a temperature stabilized hyperspectral imaging system capable of up to 1024 spectral channels and 1376 spatial across-track pixels, flown from 3,000 to 4,500 feet altitudes. The hyperspectral imaging system is collocated with a full resolution high definition video recorder for simultaneous HD video imagery, a 12.3 megapixel digital camera, and a mapping camera using 9 inch film that yields scanned aerial imagery of approximately 22,200 by 22,200 pixels (~255 megapixel RGB multispectral images) used for spectral-spatial sharpening of fused multispectral and hyperspectral imagery. Two solid state spectrographs with high spectral (252 channels) and radiometric sensitivity are used for collecting upwelling radiance (sub-meter pixels) with a downwelling irradiance fiber optic attachment. These sensors are utilized for

  12. Dual multispectral and 3D structured light laparoscope

    NASA Astrophysics Data System (ADS)

    Clancy, Neil T.; Lin, Jianyu; Arya, Shobhit; Hanna, George B.; Elson, Daniel S.

    2015-03-01

    Intraoperative feedback on tissue function, such as blood volume and oxygenation, would be useful to the surgeon in cases where current clinical practice relies on subjective measures, such as identification of ischaemic bowel or tissue viability during anastomosis formation. Also, tissue surface profiling may be used to detect and identify certain pathologies, as well as diagnosing aspects of tissue health such as gut motility. In this paper a dual modality laparoscopic system is presented that combines multispectral reflectance and 3D surface imaging. White light illumination from a xenon source is detected by a laparoscope-mounted fast filter wheel camera to assemble a multispectral image (MSI) cube. Surface shape is then calculated using a spectrally-encoded structured light (SL) pattern detected by the same camera and triangulated using an active stereo technique. Images of porcine small bowel were acquired during open surgery. Tissue reflectance spectra were acquired and blood volume was calculated at each spatial pixel across the bowel wall and mesentery. SL features were segmented and identified using a `normalised cut' algorithm and the colour vector of each spot. Using the 3D geometry defined by the camera coordinate system, the multispectral data could be overlaid onto the surface mesh. Dual MSI and SL imaging has the potential to provide augmented views to the surgeon supplying diagnostic information related to blood supply health and organ function. Future work on this system will include filter optimisation to reduce noise in tissue optical property measurement, and minimise spot identification errors in the SL pattern.

  13. High Speed Method for in Situ Multispectral Image Registration

    SciTech Connect

    Perrine, Kenneth A.; Lamarche, Brian L.; Hopkins, Derek F.; Budge, Scott E.; Opresko, Lee; Wiley, H. S.; Sowa, Marianne B.

    2007-01-29

    Multispectral confocal spinning disk microscopy provides a high resolution method for real-time live cell imaging. However, optical distortions and the physical misalignments introduced by the use of multiple acquisition cameras can obscure spatial information contained in the captured images. In this manuscript, we describe a multispectral method for real-time image registration whereby the image from one camera is warped onto the image from a second camera via a polynomial correction. This method provides a real-time pixel-for-pixel match between images obtained over physically distinct optical paths. Using an in situ calibration method, the polynomial is characterized by a set of coefficients using a least squares solver. Error analysis demonstrates optimal performance results from the use of cubic polynomials. High-speed evaluation of the warp is then performed through forward differencing with fixed-point data types. Image reconstruction errors are reduced through bilinear interpolation. The registration techniques described here allow for successful registration of multispectral images in real time (exceeding 15 frames/sec) and have a broad applicability to imaging methods requiring pixel matching over multiple data channels.
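
    A simplified sketch of polynomial image registration in this spirit is shown below: cubic warp coefficients are fit by least squares from calibration point pairs, and the image is resampled with bilinear interpolation. The cubic term set and the function names are assumptions; the fixed-point forward-differencing optimization described in the abstract is omitted.

```python
# Hedged sketch: fit a cubic polynomial warp and apply it with bilinear sampling.
import numpy as np

def poly_terms(x, y):
    # full cubic basis in (x, y); an illustrative choice of terms
    return np.stack([np.ones_like(x), x, y, x*y, x**2, y**2,
                     x**2*y, x*y**2, x**3, y**3], axis=-1)

def fit_warp(src_pts, dst_pts):
    """Least-squares fit of coefficients mapping destination grid -> source coords."""
    A = poly_terms(dst_pts[:, 0], dst_pts[:, 1])
    cx, *_ = np.linalg.lstsq(A, src_pts[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, src_pts[:, 1], rcond=None)
    return cx, cy

def warp_image(img, cx, cy):
    """Resample a grayscale image onto the destination grid (bilinear)."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    A = poly_terms(xx.ravel(), yy.ravel())
    sx, sy = (A @ cx).reshape(h, w), (A @ cy).reshape(h, w)
    x0 = np.clip(np.floor(sx), 0, w - 2).astype(int)
    y0 = np.clip(np.floor(sy), 0, h - 2).astype(int)
    fx, fy = np.clip(sx - x0, 0, 1), np.clip(sy - y0, 0, 1)
    return (img[y0, x0] * (1 - fx) * (1 - fy) + img[y0, x0 + 1] * fx * (1 - fy)
            + img[y0 + 1, x0] * (1 - fx) * fy + img[y0 + 1, x0 + 1] * fx * fy)
```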

  14. Remote sensing operations (multispectral scanner and photographic) in the New York Bight, 22 September 1975

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Hall, J. B., Jr.

    1977-01-01

    Ocean dumping of waste materials is a significant environmental concern in the New York Bight. One of these waste materials, sewage sludge, was monitored in an experiment conducted in the New York Bight on September 22, 1975. Remote sensing over controlled sewage sludge dumping included an 11-band multispectral scanner, five multispectral cameras, and one mapping camera. Concurrent in situ water samples were taken and acoustical measurements were made of the sewage sludge plumes. Data were obtained for sewage sludge plumes resulting from line (moving barge) and spot (stationary barge) dumps. Multiple aircraft overpasses were made to evaluate temporal effects on the plume signature.

  15. MULTISPECTRAL THERMAL IMAGER - OVERVIEW

    SciTech Connect

    P. WEBER

    2001-03-01

    The Multispectral Thermal Imager satellite fills a new and important role in advancing the state of the art in remote sensing sciences. Initial results with the full calibration system operating indicate that the system was already close to achieving the very ambitious goals which we laid out in 1993, and we are confident of reaching all of these goals as we continue our research and improve our analyses. In addition to the DOE interests, the satellite is tasked about one-third of the time with requests from other users supporting research ranging from volcanology to atmospheric sciences.

  16. Multispectral thermal imaging

    SciTech Connect

    Weber, P.G.; Bender, S.C.; Borel, C.C.; Clodius, W.B.; Smith, B.W.; Garrett, A.; Pendergast, M.M.; Kay, R.R.

    1998-12-01

    Many remote sensing applications rely on imaging spectrometry. Here the authors use imaging spectrometry for thermal and multispectral signatures measured from a satellite platform enhanced with a combination of accurate calibrations and on-board data for correcting atmospheric distortions. The approach is supported by physics-based end-to-end modeling and analysis, which permits a cost-effective balance between various hardware and software aspects. The goal is to develop and demonstrate advanced technologies and analysis tools toward meeting the needs of the customer; at the same time, the attributes of this system can address other applications in such areas as environmental change, agriculture, and volcanology.

  17. Remote detection of past habitability at Mars-analogue hydrothermal alteration terrains using an ExoMars Panoramic Camera emulator

    NASA Astrophysics Data System (ADS)

    Harris, J. K.; Cousins, C. R.; Gunn, M.; Grindrod, P. M.; Barnes, D.; Crawford, I. A.; Cross, R. E.; Coates, A. J.

    2015-05-01

    A major scientific goal of the European Space Agency's ExoMars 2018 rover is to identify evidence of life within the martian rock record. Key to this objective is the remote detection of geological substrates that are indicative of past habitable environments, which will rely on visual (stereo wide-angle, and high resolution images) and multispectral (440-1000 nm) data produced by the Panoramic Camera (PanCam) instrument. We deployed a PanCam emulator at four hydrothermal sites in the Námafjall volcanic region of Iceland, a Mars-analogue hydrothermal alteration terrain. At these sites, sustained acidic-neutral aqueous interaction with basaltic substrates (crystalline and sedimentary) has produced phyllosilicate, ferric oxide, and sulfate-rich alteration soils, and secondary mineral deposits including gypsum veins and zeolite amygdales. PanCam emulator datasets from these sites were complemented with (i) NERC Airborne Research and Survey Facility aerial hyperspectral images of the study area; (ii) in situ reflectance spectroscopy (400-1000 nm) of PanCam spectral targets; (iii) laboratory X-ray Diffraction; and (iv) laboratory VNIR (350-2500 nm) spectroscopy of target samples to identify their bulk mineralogy and spectral properties. The mineral assemblages and palaeoenvironments characterised here are analogous to neutral-acidic alteration terrains on Mars, such as at Mawrth Vallis and Gusev Crater. Combined multispectral and High Resolution Camera datasets were found to be effective at capturing features of astrobiological importance, such as secondary gypsum and zeolite mineral veins, and phyllosilicate-rich substrates. Our field observations with the PanCam emulator also uncovered stray light problems, which are most significant at NIR wavelengths, and investigations are being undertaken to ensure that the flight model PanCam cameras are not similarly affected.

  18. Retinal oxygen saturation evaluation by multi-spectral fundus imaging

    NASA Astrophysics Data System (ADS)

    Khoobehi, Bahram; Ning, Jinfeng; Puissegur, Elise; Bordeaux, Kimberly; Balasubramanian, Madhusudhanan; Beach, James

    2007-03-01

    Purpose: To develop a multi-spectral method to measure oxygen saturation of the retina in the human eye. Methods: Five Cynomolgus monkeys with normal eyes were anesthetized with intramuscular ketamine/xylazine and intravenous pentobarbital. Multi-spectral fundus imaging was performed in five monkeys with a commercial fundus camera equipped with a liquid crystal tuned filter in the illumination light path and a 16-bit digital camera. Recording parameters were controlled with software written specifically for the application. Seven images at successively longer oxygen-sensing wavelengths were recorded within 4 seconds. Individual images for each wavelength were captured in less than 100 msec of flash illumination. Slightly misaligned images of separate wavelengths due to slight eye motion were registered and corrected by translational and rotational image registration prior to analysis. Numerical values of relative oxygen saturation of retinal arteries and veins and the underlying tissue in between the artery/vein pairs were evaluated by an algorithm previously described, but which is now corrected for blood volume from averaged pixels (n > 1000). Color saturation maps were constructed by applying the algorithm at each image pixel using a Matlab script. Results: Both the numerical values of relative oxygen saturation and the saturation maps correspond to the physiological condition, that is, in a normal retina, the artery is more saturated than the tissue and the tissue is more saturated than the vein. With the multi-spectral fundus camera and proper registration of the multi-wavelength images, we were able to determine oxygen saturation in the primate retinal structures on a tolerable time scale which is applicable to human subjects. Conclusions: Seven wavelength multi-spectral imagery can be used to measure oxygen saturation in retinal artery, vein, and tissue (microcirculation). This technique is safe and can be used to monitor oxygen uptake in humans. This work
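
    The retrieval algorithm itself is referenced elsewhere by the authors, so the sketch below only illustrates the classic two-wavelength optical-density-ratio idea on which such retinal oximetry methods commonly build; the choice of an isosbestic and an oxygen-sensitive band, and the linear calibration to saturation, are assumptions rather than the paper's method.

```python
# Hedged sketch of two-wavelength optical-density-ratio oximetry (not the
# authors' algorithm). Inputs are mean vessel and adjacent-background
# intensities at an isosbestic and an oxygen-sensitive wavelength.
import numpy as np

def od_ratio(i_vessel_iso, i_bg_iso, i_vessel_sens, i_bg_sens):
    """Optical-density ratio; a linear calibration maps it to relative SO2."""
    od_iso = np.log10(i_bg_iso / i_vessel_iso)     # isosbestic band
    od_sens = np.log10(i_bg_sens / i_vessel_sens)  # oxygen-sensitive band
    return od_sens / od_iso
```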

  19. Multispectral image alignment using a three channel endoscope in vivo during minimally invasive surgery.

    PubMed

    Clancy, Neil T; Stoyanov, Danail; James, David R C; Di Marco, Aimee; Sauvage, Vincent; Clark, James; Yang, Guang-Zhong; Elson, Daniel S

    2012-10-01

    Sequential multispectral imaging is an acquisition technique that involves collecting images of a target at different wavelengths, to compile a spectrum for each pixel. In surgical applications it suffers from low illumination levels and motion artefacts. A three-channel rigid endoscope system has been developed that allows simultaneous recording of stereoscopic and multispectral images. Salient features on the tissue surface may be tracked during the acquisition in the stereo cameras and, using multiple camera triangulation techniques, this information used to align the multispectral images automatically even though the tissue or camera is moving. This paper describes a detailed validation of the set-up in a controlled experiment before presenting the first in vivo use of the device in a porcine minimally invasive surgical procedure. Multispectral images of the large bowel were acquired and used to extract the relative concentration of haemoglobin in the tissue despite motion due to breathing during the acquisition. Using the stereoscopic information it was also possible to overlay the multispectral information on the reconstructed 3D surface. This experiment demonstrates the ability of this system for measuring blood perfusion changes in the tissue during surgery and its potential use as a platform for other sequential imaging modalities.

  20. Multispectral information hiding in RGB image using bit-plane-based watermarking and its application

    NASA Astrophysics Data System (ADS)

    Shinoda, Kazuma; Watanabe, Aya; Hasegawa, Madoka; Kato, Shigeo

    2015-06-01

    Although multispectral images were expected to be adopted in many applications, such as remote sensing and medical imaging, their use has not become widespread in these fields. The development of a compact multispectral camera and display will be needed for practical use, but format compatibility between multispectral and RGB images is also important for reducing introduction costs and ensuring high usability. We propose a method of embedding the spectral information into an RGB image by watermarking. The RGB image is calculated from the multispectral image, and then, the original multispectral image is estimated from the RGB image using Wiener estimation. The residual data between the original and the estimated multispectral image are compressed and embedded in the lower bit planes of the RGB image. The experimental results show that, as compared with Wiener estimation, the proposed method leads to more than a 10 dB gain in the peak signal-to-noise ratio of the reconstructed multispectral image, while there are almost no significant perceptual differences in the watermarked RGB image.
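
    The bit-plane embedding step can be sketched as below for the simplest case of a single least-significant bit plane of an 8-bit RGB image; the Wiener-estimation and residual-compression stages described above are omitted, and the capacity check and bit ordering are assumptions for this sketch.

```python
# Hedged sketch: pack a byte payload (e.g., a compressed residual) into the
# lowest bit plane of an 8-bit RGB image, and recover it again.
import numpy as np

def embed_lsb(rgb8, payload: bytes):
    """Hide payload bits in the least-significant bit plane of rgb8 (uint8)."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = rgb8.reshape(-1).copy()
    if bits.size > flat.size:
        raise ValueError("payload too large for one bit plane")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(rgb8.shape)

def extract_lsb(rgb8, n_bytes: int):
    bits = rgb8.reshape(-1)[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()
```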

  1. Time-of-Flight Microwave Camera.

    PubMed

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-05

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
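
    As context for the depth-resolution figures quoted above, the standard FMCW relations (only the quoted numbers come from the paper) are:

```latex
\Delta d = c\,\Delta t \approx (3\times 10^{8}\ \mathrm{m\,s^{-1}})(200\ \mathrm{ps}) \approx 6\ \mathrm{cm},
\qquad
\Delta r = \frac{c}{2B} \approx \frac{3\times 10^{8}\ \mathrm{m\,s^{-1}}}{2\times 4\ \mathrm{GHz}} \approx 3.75\ \mathrm{cm}.
```

    The first relation reproduces the quoted 6 cm free-space path per 200 ps; the second is the usual two-way range-resolution bound implied by the 8 GHz-12 GHz (4 GHz) sweep.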

  2. Multispectral scanner optical system

    NASA Technical Reports Server (NTRS)

    Stokes, R. C.; Koch, N. G. (Inventor)

    1980-01-01

    An optical system for use in a multispectral scanner of the type used in video imaging devices is disclosed. Electromagnetic radiation reflected by a rotating scan mirror is focused by a concave primary telescope mirror and collimated by a second concave mirror. The collimated beam is split by a dichroic filter which transmits radiant energy in the infrared spectrum and reflects visible and near infrared energy. The long wavelength beam is filtered and focused on an infrared detector positioned in a cryogenic environment. The short wavelength beam is dispersed by a pair of prisms, then projected on an array of detectors also mounted in a cryogenic environment and oriented at an angle relative to the optical path of the dispersed short wavelength beam.

  3. Multispectral Resource Sampler Workshop

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The utility of the multispectral resource sampler (MRS) was examined by users in the following disciplines: agriculture, atmospheric studies, engineering, forestry, geology, hydrology/oceanography, land use, and rangelands/soils. Modifications to the sensor design were recommended and the desired types of products and number of scenes required per month were indicated. The history, design, capabilities, and limitations of the MRS are discussed as well as the multilinear spectral array technology which it uses. Designed for small area inventory, the MRS can provide increased temporal, spectral, and spatial resolution, facilitate polarization measurement and atmospheric correction, and test onboard data compression techniques. The advantages of using it along with the thematic mapper are considered.

  4. Multispectral imaging radar

    NASA Technical Reports Server (NTRS)

    Porcello, L. J.; Rendleman, R. A.

    1972-01-01

    A side-looking radar, installed in a C-46 aircraft, was modified to provide it with an initial multispectral imaging capability. The radar is capable of radiating at either of two wavelengths, these being approximately 3 cm and 30 cm, with either horizontal or vertical polarization on each wavelength. Both the horizontally- and vertically-polarized components of the reflected signal can be observed for each wavelength/polarization transmitter configuration. At present, two-wavelength observation of a terrain region can be accomplished within the same day, but not with truly simultaneous observation on both wavelengths. A multiplex circuit to permit this simultaneous observation has been designed. A brief description of the modified radar system and its operating parameters is presented. Emphasis is then placed on initial flight test data and preliminary interpretation. Some considerations pertinent to the calibration of such radars are presented in passing.

  5. A multispectral scanner survey of the Salmon Site and surrounding area, Lamar County, Mississippi

    SciTech Connect

    Blohm, J.D.; Brewster, S.B. Jr.; Shines, J.E.

    1994-06-01

    An airborne multispectral scanner survey was conducted over the Salmon Site and the surrounding area in Lamar County, Mississippi, on May 8, 1992. Twelve-channel daytime multispectral data were collected from altitudes of 2,000 feet, 4,000 feet, and 6,000 feet above ground level. Large-scale color photography was acquired simultaneously with the scanner data. Three different composite images have been prepared to demonstrate the digital image enhancement techniques that can be applied to the data. The data that were acquired offer opportunity for further standard and customized analysis based on any specific environmental characterization issues associated with this site.

  6. Multispectral Microimager for Astrobiology

    NASA Technical Reports Server (NTRS)

    Sellar, R. Glenn; Farmer, Jack D.; Kieta, Andrew; Huang, Julie

    2006-01-01

    A primary goal of the astrobiology program is the search for fossil records. The astrobiology exploration strategy calls for the location and return of samples indicative of environments conducive to life, and that best capture and preserve biomarkers. Successfully returning samples from environments conducive to life requires two primary capabilities: (1) in situ mapping of the mineralogy in order to determine whether the desired minerals are present; and (2) nondestructive screening of samples for additional in-situ testing and/or selection for return to laboratories for more in-depth examination. Two of the most powerful identification techniques are micro-imaging and visible/infrared spectroscopy. The design and test results are presented from a compact rugged instrument that combines micro-imaging and spectroscopic capability to provide in-situ analysis, mapping, and sample screening capabilities. Accurate reflectance spectra should be a measure of reflectance as a function of wavelength only. Other compact multispectral microimagers use separate LEDs (light-emitting diodes) for each wavelength and therefore vary the angles of illumination when changing wavelengths. When observing a specularly-reflecting sample, this produces grossly inaccurate spectra due to the variation in the angle of illumination. An advanced design and test results are presented for a multispectral microimager which demonstrates two key advances relative to previous LED-based microimagers: (i) acquisition of actual reflectance spectra in which the flux is a function of wavelength only, rather than a function of both wavelength and illumination geometry; and (ii) increase in the number of spectral bands to eight bands covering a spectral range of 468 to 975 nm.

  7. An LED-based lighting system for acquiring multispectral scenes

    NASA Astrophysics Data System (ADS)

    Parmar, Manu; Lansel, Steven; Farrell, Joyce

    2012-01-01

    The availability of multispectral scene data makes it possible to simulate a complete imaging pipeline for digital cameras, beginning with a physically accurate radiometric description of the original scene followed by optical transformations to irradiance signals, models for sensor transduction, and image processing for display. Certain scenes with animate subjects, e.g., humans, pets, etc., are of particular interest to consumer camera manufacturers because of their ubiquity in common images, and the importance of maintaining colorimetric fidelity for skin. Typical multispectral acquisition methods rely on techniques that use multiple acquisitions of a scene with a number of different optical filters or illuminants. Such schemes require long acquisition times and are best suited for static scenes. In scenes where animate objects are present, movement leads to problems with registration and methods with shorter acquisition times are needed. To address the need for shorter image acquisition times, we developed a multispectral imaging system that captures multiple acquisitions during a rapid sequence of differently colored LED lights. In this paper, we describe the design of the LED-based lighting system and report results of our experiments capturing scenes with human subjects.

  8. Time-resolved multispectral imaging of combustion reaction

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Fréderick

    2015-05-01

    Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization to be performed in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber or even to observe gases produced by a flare or smokestack. Most combustion gases such as carbon dioxide (CO2) selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge about spectral emissivity. This information is not directly available from broadband images. However, spectral information is available using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of combustion products of a candle in which black powder has been burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.

  9. Time-resolved multispectral imaging of combustion reactions

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Frédérick

    2015-10-01

    Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization to be performed in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber or even to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge of spectral emissivity. This information is not directly available from broadband images. However, spectral information is available using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of combustion products of a candle in which black powder has been burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.
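
    The temperature-estimation idea described in these two records can be illustrated by fitting a grey-body Planck model to the per-band radiances; the sketch below uses a simple least-squares fit and is not the authors' retrieval. The band wavelengths, radiances, temperature bounds, and emissivity model are placeholders.

```python
# Hedged sketch: fit a grey-body Planck spectral-radiance model to band radiances.
import numpy as np
from scipy.optimize import curve_fit

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)

def planck(wavelength_m, temperature_k, emissivity=1.0):
    """Grey-body spectral radiance (W sr^-1 m^-3)."""
    x = H * C / (wavelength_m * KB * temperature_k)
    return emissivity * 2 * H * C**2 / wavelength_m**5 / np.expm1(x)

def fit_temperature(band_wavelengths_m, band_radiances, t0=1500.0):
    """Return (temperature_k, emissivity) from a least-squares fit."""
    popt, _ = curve_fit(lambda lam, T, eps: planck(lam, T, eps),
                        band_wavelengths_m, band_radiances,
                        p0=[t0, 0.5], bounds=([300.0, 0.0], [4000.0, 1.0]))
    return popt
```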

  10. MSS D Multispectral Scanner System

    NASA Technical Reports Server (NTRS)

    Lauletta, A. M.; Johnson, R. L.; Brinkman, K. L. (Principal Investigator)

    1982-01-01

    The development and acceptance testing of the 4-band Multispectral Scanners to be flown on LANDSAT D and LANDSAT D Earth resources satellites are summarized. Emphasis is placed on the acceptance test phase of the program. Test history and acceptance test algorithms are discussed. Trend data of all the key performance parameters are included and discussed separately for each of the two multispectral scanner instruments. Anomalies encountered and their resolutions are included.

  11. Polarimetric sensor systems for airborne ISR

    NASA Astrophysics Data System (ADS)

    Chenault, David; Foster, Joseph; Pezzaniti, Joseph; Harchanko, John; Aycock, Todd; Clark, Alex

    2014-06-01

    Over the last decade, polarimetric imaging technologies have undergone significant advancements that have led to the development of small, low-power polarimetric cameras capable of meeting current airborne ISR mission requirements. In this paper, we describe the design and development of a compact, real-time, infrared imaging polarimeter, provide preliminary results demonstrating the enhanced contrast possible with such a system, and discuss ways in which this technology can be integrated with existing manned and unmanned airborne platforms.

  12. Cardiac cameras.

    PubMed

    Travin, Mark I

    2011-05-01

    Cardiac imaging with radiotracers plays an important role in patient evaluation, and the development of suitable imaging instruments has been crucial. While initially performed with the rectilinear scanner that slowly transmitted, in a row-by-row fashion, cardiac count distributions onto various printing media, the Anger scintillation camera allowed electronic determination of tracer energies and of the distribution of radioactive counts in 2D space. Increased sophistication of cardiac cameras and development of powerful computers to analyze, display, and quantify data has been essential to making radionuclide cardiac imaging a key component of the cardiac work-up. Newer processing algorithms and solid state cameras, fundamentally different from the Anger camera, show promise to provide higher counting efficiency and resolution, leading to better image quality, more patient comfort and potentially lower radiation exposure. While the focus has been on myocardial perfusion imaging with single-photon emission computed tomography, increased use of positron emission tomography is broadening the field to include molecular imaging of the myocardium and of the coronary vasculature. Further advances may require integrating cardiac nuclear cameras with other imaging devices, i.e., hybrid imaging cameras. The goal is to image the heart and its physiological processes as accurately as possible, to prevent and cure disease processes.

  13. Multispectral Thermal Infrared Mapping of Sulfur Dioxide Plumes: A Case Study from the East Rift Zone of Kilauea Volcano, Hawaii

    NASA Technical Reports Server (NTRS)

    Realmuto, V. J.; Sutton, A. J.; Elias, T.

    1996-01-01

    The synoptic perspective and rapid mode of data acquisition provided by remote sensing are well-suited for the study of volcanic SO2 plumes. In this paper we describe a plume-mapping procedure that is based on image data acquired with NASA's airborne Thermal Infrared Multispectral Scanner (TIMS).

  14. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  15. Concept for an airborne real-time ISR system with multi-sensor 3D data acquisition

    NASA Astrophysics Data System (ADS)

    Haraké, Laura; Schilling, Hendrik; Blohm, Christian; Hillemann, Markus; Lenz, Andreas; Becker, Merlin; Keskin, Göksu; Middelmann, Wolfgang

    2016-10-01

    In modern aerial Intelligence, Surveillance and Reconnaissance operations, precise 3D information is indispensable for increased situation awareness. In particular, object geometries represented by texturized digital surface models constitute an alternative to a pure evaluation of radiometric measurements. Besides the 3D data's level of detail, its timely availability is critical for making quick decisions. Expanding the concept of our preceding remote sensing platform developed together with OHB System AG and Geosystems GmbH, in this paper we present an airborne multi-sensor system based on a motor glider equipped with two wing pods; one carries the sensors, whereas the second pod downlinks sensor data to a connected ground control station by using the Aerial Reconnaissance Data System of OHB. An uplink is created to receive remote commands from the manned mobile ground control station, which in turn processes and evaluates incoming sensor data. The system allows the integration of efficient image processing and machine learning algorithms. In this work, we introduce a near real-time approach for the acquisition of a texturized 3D data model with the help of an airborne laser scanner and four high-resolution multi-spectral (RGB, near-infrared) cameras. Image sequences from nadir and off-nadir cameras permit the generation of dense point clouds and the texturing of building facades as well. The ground control station distributes processed 3D data over a linked geoinformation system with web capabilities to off-site decision-makers. As the accurate acquisition of sensor data requires boresight-calibrated sensors, we additionally examine the first steps of a camera calibration workflow.

  16. Fourth Airborne Geoscience Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The focus of the workshop was on how the airborne community can assist in achieving the goals of the Global Change Research Program. The many activities that employ airborne platforms and sensors were discussed: platforms and instrument development; airborne oceanography; lidar research; SAR measurements; Doppler radar; laser measurements; cloud physics; airborne experiments; airborne microwave measurements; and airborne data collection.

  17. Mapping crop ground cover using airborne multispectral digital imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Empirical relationships between remotely sensed vegetation indices and density information, such as leaf area index or ground cover (GC), are commonly used to derive spatial information in many precision farming operations. In this study, we modified an existing methodology that does not depend on e...

  18. Summaries of the Seventh JPL Airborne Earth Science Workshop January 12-16, 1998. Volume 1; AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1998-01-01

    This publication contains the summaries for the Seventh JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 12-16, 1998. The main workshop is divided into three smaller workshops, and each workshop has a volume as follows: (1) Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) Workshop; (2) Airborne Synthetic Aperture Radar (AIRSAR) Workshop; and (3) Thermal Infrared Multispectral Scanner (TIMS) Workshop. This Volume 1 publication contains 58 papers taken from the AVIRIS workshop.

  19. A practical one-shot multispectral imaging system using a single image sensor.

    PubMed

    Monno, Yusuke; Kikuchi, Sunao; Tanaka, Masayuki; Okutomi, Masatoshi

    2015-10-01

    Single-sensor imaging using the Bayer color filter array (CFA) and demosaicking is well established for current compact and low-cost color digital cameras. An extension from the CFA to a multispectral filter array (MSFA) enables us to acquire a multispectral image in one shot without increased size or cost. However, multispectral demosaicking for the MSFA has been a challenging problem because of very sparse sampling of each spectral band in the MSFA. In this paper, we propose a high-performance multispectral demosaicking algorithm, and at the same time, a novel MSFA pattern that is suitable for our proposed algorithm. Our key idea is the use of the guided filter to interpolate each spectral band. To generate an effective guide image, in our proposed MSFA pattern, we maintain the sampling density of the G-band as high as the Bayer CFA, and we array each spectral band so that an adaptive kernel can be estimated directly from raw MSFA data. Given these two advantages, we effectively generate the guide image from the most densely sampled G-band using the adaptive kernel. In the experiments, we demonstrate that our proposed algorithm with our proposed MSFA pattern outperforms existing algorithms and provides better color fidelity compared with a conventional color imaging system with the Bayer CFA. We also show some real applications using a multispectral camera prototype we built.
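
    The guided-filter interpolation at the heart of the method can be sketched as below, using a densely sampled G band as the guide for a sparsely sampled (coarsely pre-interpolated) spectral band. The box-filter radius and regularization epsilon are arbitrary choices, and the MSFA pattern handling and adaptive-kernel estimation described above are omitted.

```python
# Hedged sketch of the standard guided filter, with the G band as the guide.
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, target, radius=4, eps=1e-3):
    """Edge-preserving filtering of `target` guided by `guide` (float, HxW each)."""
    size = 2 * radius + 1
    mean_g = uniform_filter(guide, size)
    mean_t = uniform_filter(target, size)
    corr_gg = uniform_filter(guide * guide, size)
    corr_gt = uniform_filter(guide * target, size)
    var_g = corr_gg - mean_g * mean_g           # local variance of the guide
    cov_gt = corr_gt - mean_g * mean_t          # local covariance guide/target
    a = cov_gt / (var_g + eps)                  # local linear model: t ~ a*g + b
    b = mean_t - a * mean_g
    return uniform_filter(a, size) * guide + uniform_filter(b, size)
```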

  20. Synthesis of Multispectral Bands from Hyperspectral Data: Validation Based on Images Acquired by AVIRIS, Hyperion, ALI, and ETM+

    NASA Technical Reports Server (NTRS)

    Blonksi, Slawomir; Gasser, Gerald; Russell, Jeffrey; Ryan, Robert; Terrie, Greg; Zanoni, Vicki

    2001-01-01

    Multispectral data requirements for Earth science applications are not always rigorously studied before a new remote sensing system is designed. A study of the spatial resolution, spectral bandpasses, and radiometric sensitivity requirements of real-world applications would focus the design onto providing maximum benefits to the end-user community. To support systematic studies of multispectral data requirements, the Applications Research Toolbox (ART) has been developed at NASA's Stennis Space Center. The ART software allows users to create and assess simulated datasets while varying a wide range of system parameters. The simulations are based on data acquired by existing multispectral and hyperspectral instruments. The produced datasets can be further evaluated for specific end-user applications. Spectral synthesis of multispectral images from hyperspectral data is a key part of the ART software. In this process, hyperspectral image cubes are transformed into multispectral imagery without changes in spatial sampling and resolution. The transformation algorithm takes into account spectral responses of both the synthesized, broad, multispectral bands and the utilized, narrow, hyperspectral bands. To validate the spectral synthesis algorithm, simulated multispectral images are compared with images collected near-coincidentally by the Landsat 7 ETM+ and the EO-1 ALI instruments. Hyperspectral images acquired with the airborne AVIRIS instrument and with the Hyperion instrument onboard the EO-1 satellite were used as input data to the presented simulations.
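
    The band-synthesis step can be sketched as a response-weighted average over the narrow hyperspectral bands, as below. This simplification treats each hyperspectral band as a delta function (whereas the ART algorithm also accounts for the narrow-band responses), and the wavelength grid and response callable are placeholders.

```python
# Hedged sketch: synthesize one broad multispectral band from a hyperspectral cube.
import numpy as np

def synthesize_band(hsi_cube, hsi_wavelengths, band_response):
    """
    hsi_cube        : (H, W, N) hyperspectral radiance cube
    hsi_wavelengths : (N,) band-centre wavelengths of the cube
    band_response   : callable giving the broad band's relative response
                      at a given wavelength (placeholder)
    """
    w = np.array([band_response(lam) for lam in hsi_wavelengths], dtype=float)
    w /= w.sum()                                    # normalise the response weights
    return np.tensordot(hsi_cube, w, axes=([2], [0]))
```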

  1. Airborne Particles.

    ERIC Educational Resources Information Center

    Ojala, Carl F.; Ojala, Eric J.

    1987-01-01

    Describes an activity in which students collect airborne particles using a common vacuum cleaner. Suggests ways for the students to convert their data into information related to air pollution and human health. Urges consideration of weather patterns when analyzing the results of the investigation. (TW)

  2. Optimization of system parameters for a complete multispectral polarimeter

    SciTech Connect

    Hollstein, Andre; Ruhtz, Thomas; Fischer, Juergen; Preusker, Rene

    2009-08-20

    We optimize a general class of complete multispectral polarimeters with respect to signal-to-noise ratio, stability against alignment errors, and the minimization of errors regarding a given set of polarization states. The class of polarimeters that are dealt with consists of at least four polarization optics each with a multispectral detector. A polarization optic is made of an azimuthal oriented wave plate and a polarizing filter. A general, but not unique, analytic solution that minimizes signal-to-noise ratio is introduced for a polarimeter that incorporates four simultaneous measurements with four independent optics. The optics consist of four sufficient wave plates, where at least one is a quarter-wave plate. The solution is stable with respect to the retardance of the quarter-wave plate; therefore, it can be applied to real-world cases where the retardance deviates from λ/4. The solution is a set of seven rotational parameters that depends on the given retardances of the wave plates. It can be applied to a broad range of real world cases. A numerical method for the optimization of arbitrary polarimeters of the type discussed is also presented and applied for two cases. First, the class of polarimeters that were analytically dealt with are further optimized with respect to stability and error performance with respect to linear polarized states. Then a multispectral case for a polarimeter that consists of four optics with real achromatic wave plates is presented. This case was used as the theoretical background for the development of the Airborne Multi-Spectral Sunphoto- and Polarimeter (AMSSP), which is an instrument for the German research aircraft HALO.

  3. Optimization of system parameters for a complete multispectral polarimeter.

    PubMed

    Hollstein, André; Ruhtz, Thomas; Fischer, Jürgen; Preusker, René

    2009-08-20

    We optimize a general class of complete multispectral polarimeters with respect to signal-to-noise ratio, stability against alignment errors, and the minimization of errors regarding a given set of polarization states. The class of polarimeters that are dealt with consists of at least four polarization optics each with a multispectral detector. A polarization optic is made of an azimuthal oriented wave plate and a polarizing filter. A general, but not unique, analytic solution that minimizes signal-to-noise ratio is introduced for a polarimeter that incorporates four simultaneous measurements with four independent optics. The optics consist of four sufficient wave plates, where at least one is a quarter-wave plate. The solution is stable with respect to the retardance of the quarter-wave plate; therefore, it can be applied to real-world cases where the retardance deviates from lambda/4. The solution is a set of seven rotational parameters that depends on the given retardances of the wave plates. It can be applied to a broad range of real world cases. A numerical method for the optimization of arbitrary polarimeters of the type discussed is also presented and applied for two cases. First, the class of polarimeters that were analytically dealt with are further optimized with respect to stability and error performance with respect to linear polarized states. Then a multispectral case for a polarimeter that consists of four optics with real achromatic wave plates is presented. This case was used as the theoretical background for the development of the Airborne Multi-Spectral Sunphoto- and Polarimeter (AMSSP), which is an instrument for the German research aircraft HALO.
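
    A numerical sketch related to this optimization is given below: Mueller calculus for a wave plate followed by a polarizer yields each channel's analyzer vector, the vectors stack into the measurement matrix mapping the Stokes vector to detected intensities, and its condition number is the kind of quantity such optimizations trade off against noise. The retardances and angles here are free parameters, not the paper's solution, and the function names are illustrative.

```python
# Hedged sketch: measurement matrix of a four-channel division-of-amplitude
# polarimeter (wave plate + polarizer per channel) via Mueller calculus.
import numpy as np

def rot(theta):
    """Mueller rotation matrix (Stokes-vector frame rotation by theta)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[1, 0, 0, 0], [0, c, s, 0], [0, -s, c, 0], [0, 0, 0, 1]])

def waveplate(delta, theta):
    """Linear retarder with retardance delta, fast axis at angle theta."""
    m = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, np.cos(delta), np.sin(delta)],
                  [0, 0, -np.sin(delta), np.cos(delta)]])
    return rot(-theta) @ m @ rot(theta)

def polarizer(theta):
    """Ideal linear polarizer with transmission axis at angle theta."""
    m = 0.5 * np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
    return rot(-theta) @ m @ rot(theta)

def measurement_matrix(retardances, wp_angles, pol_angles):
    rows = []
    for d, a, p in zip(retardances, wp_angles, pol_angles):
        mueller = polarizer(p) @ waveplate(d, a)   # light: wave plate, then polarizer
        rows.append(mueller[0])                    # detector sees total intensity
    return np.array(rows)                          # (N, 4): Stokes -> intensities

# figure of merit often minimized: np.linalg.cond(measurement_matrix(...))
```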

  4. Multi-spectral imaging with infrared sensitive organic light emitting diode

    PubMed Central

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-01-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxial grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589

  5. Classification of human carcinoma cells using multispectral imagery

    NASA Astrophysics Data System (ADS)

    Çinar, Umut; Y. Çetin, Yasemin; Çetin-Atalay, Rengul; Çetin, Enis

    2016-03-01

    In this paper, we present a technique for automatically classifying human carcinoma cell images using textural features. An image dataset containing microscopy biopsy images from different patients for 14 distinct cancer cell line types is studied. The images are captured using an RGB camera attached to an inverted microscopy device. Texture-based Gabor features are extracted from multispectral input images. An SVM classifier is used to generate a descriptive model for the purpose of cell line classification. The experimental results depict satisfactory performance, and the proposed method is versatile for various microscopy magnification options.
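
    The texture pipeline described above can be sketched as a Gabor filter-bank feature extractor followed by an SVM; the filter frequencies, orientations, and the mean/standard-deviation statistics below are illustrative choices rather than the authors' settings.

```python
# Hedged sketch: Gabor filter-bank features per image, then an SVM classifier.
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def gabor_features(gray, freqs=(0.1, 0.2, 0.3),
                   thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Return mean/std of Gabor magnitude responses for a grayscale image."""
    feats = []
    for f in freqs:
        for t in thetas:
            real, imag = gabor(gray, frequency=f, theta=t)
            mag = np.hypot(real, imag)
            feats += [mag.mean(), mag.std()]
    return np.array(feats)

# usage sketch (images/labels are placeholders):
#   X = np.array([gabor_features(img) for img in images])
#   clf = SVC(kernel="rbf").fit(X, labels)
```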

  6. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation eminating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  7. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  8. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Vargo, T.D.; Lockhart, R.R.; Descour, M.R.; Richards-Kortum, R.

    1999-07-06

    A multispectral imaging method and apparatus are described which are adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging. 5 figs.

  9. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, David R.; Platzbecker, Mark R.; Vargo, Timothy D.; Lockhart, Randal R.; Descour, Michael R.; Richards-Kortum, Rebecca

    1999-01-01

    A multispectral imaging method and apparatus adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging.

  10. New uses for the Zeiss KS-153A camera system

    NASA Astrophysics Data System (ADS)

    Spiller, Rudolf H.

    1995-09-01

    The Zeiss KS-153A aerial reconnaissance framing camera complements satellite, mapping, and remote sensor data with imagery that is geometrically correct. KS-153A imagery is in a format for tactical 3-D mapping, targeting, and high-resolution intelligence data collection. This system is based upon rugged microprocessor technology that allows a wide variety of mission parameters. Geometrically correct horizon-to-horizon photography, multi-spectral mine detection, stand-off photography, NIRS nine high speed, and very low altitude anti-terrorist surveillance are KS-153A capabilities that have been proven in tests and actual missions. Civilian use of the KS-153A has ranged from measuring flood levels to forest infestations. These are everyday tasks for the KS-153A throughout the world. Zeiss optics have superb spectral response and resolution. Surprisingly effective haze penetration was shown on a day when the pilot himself could not see the terrain. Tests with CCD arrays have also produced outstanding results. This superb spectral response can be used for camouflage detection in wartime, or used for effective environmental control in peacetime, with its ability to detect subtle changes in the signature of vegetation, calling attention to man induced stress such as disease, drought, and pollution. One serious man-induced problem in many parts of the world deserves even more attention in these times: the locating and safe removal of mines. The KS-153A is currently configured with four different optics. High acuity horizon-to-horizon Pentalens and Multi-spectral Lens (MUC) modules have been added to the basic KS-153A with Trilens and Telelens. This modular concept nearly meets all of today's airborne reconnaissance requirements. Modern recce programs, for example German Air Force Recce Tornado (GAF Recce), have selected the KS-153A. By simply adding additional focal length lens assemblies to an existing KS-153A configuration, the user can instantly and economically adapt

  11. Multispectral Image Out-of-Focus Deblurring Using Interchannel Correlation.

    PubMed

    Chen, Shu-Jie; Shen, Hui-Liang

    2015-11-01

    Out-of-focus blur occurs frequently in multispectral imaging systems when the camera is well focused at a specific (reference) imaging channel. As the effective focal lengths of the lens are wavelength dependent, the blurriness levels of the images at individual channels are different. This paper proposes a multispectral image deblurring framework to restore out-of-focus spectral images based on the characteristic of interchannel correlation (ICC). The ICC is investigated based on the fact that a high-dimensional color spectrum can be linearly approximated using rather few intrinsic spectra. In the method, the spectral images are classified into an out-of-focus set and a well-focused set via blurriness computation. For each out-of-focus image, a guiding image is derived from the well-focused spectral images and is used as the image prior in the deblurring framework. The out-of-focus blur is modeled as a Gaussian point spread function, which is further employed as the blur kernel prior. The regularization parameters in the image deblurring framework are determined using generalized cross validation, and thus the proposed method does not need any parameter tuning. The experimental results validate that the method performs well on multispectral image deblurring and outperforms the state of the art.
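
    A minimal sketch of the blur model described above: an out-of-focus channel is treated as a sharp image convolved with a Gaussian point spread function and then deconvolved. This is not the guided-prior/generalized-cross-validation method of the record; it uses plain Wiener deconvolution, and the kernel size, sigma, and regularization balance are hypothetical values chosen for illustration.

```python
# Sketch: deblur one out-of-focus spectral channel assuming a Gaussian PSF,
# using Wiener deconvolution as a stand-in for the guided deblurring framework.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import restoration

def gaussian_psf(sigma, size=15):
    """Build a normalized 2-D Gaussian kernel modeling out-of-focus blur."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

# Synthetic example: a sharp channel blurred with sigma = 2 pixels.
rng = np.random.default_rng(0)
sharp = rng.random((128, 128))
blurred = gaussian_filter(sharp, sigma=2.0)

psf = gaussian_psf(sigma=2.0)
deblurred = restoration.wiener(blurred, psf, balance=0.01)
```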

  12. Multispectral Analysis of Indigenous Rock Art Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Skoog, B.; Helmholz, P.; Belton, D.

    2016-06-01

    Multispectral analysis is a widely used technique in the photogrammetric and remote sensing industry. The use of Terrestrial Laser Scanning (TLS) in combination with imagery is becoming increasingly common, with its applications spreading to a wider range of fields. Both systems benefit from being non-contact techniques that can be used to accurately capture data about the target surface. Although multispectral analysis is actively performed within the spatial sciences, its application within an archaeological context has been limited. This study aims to apply commonly used multispectral techniques to a remote Indigenous site that contains an extensive gallery of aging rock art. The ultimate goal of this research is the development of a systematic procedure that could be applied to numerous similar sites for the purpose of heritage preservation and research. The study consisted of extensive data capture of the rock art gallery using two different TLS systems and a digital SLR camera. The data were combined into a common 2D reference frame that allowed standard image processing to be applied. An unsupervised k-means classifier was applied to the multiband images to detect the different types of rock art present. The result was unsatisfactory, as the subsequent classification accuracy was relatively low. The procedure and technique do, however, show potential, and further testing with different classification algorithms could possibly improve the result significantly.
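
    A minimal sketch of the unsupervised classification step mentioned above: co-registered bands are stacked into a cube, reshaped to pixel vectors, and clustered with k-means. The band count and number of clusters are illustrative assumptions, not the study's values, and the data here are synthetic.

```python
# Sketch: unsupervised k-means classification of a multiband image cube.
import numpy as np
from sklearn.cluster import KMeans

def kmeans_classify(cube, n_clusters=4, random_state=0):
    """cube: (rows, cols, bands) array of co-registered image bands."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=random_state).fit_predict(pixels)
    return labels.reshape(rows, cols)

# Example with random data standing in for the registered multiband imagery.
cube = np.random.rand(200, 200, 5)
class_map = kmeans_classify(cube, n_clusters=4)
```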

  13. Multispectral slice of APXS

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Portions of Sojourner's Alpha Proton X-Ray Spectrometer (APXS), a deployment spring, and the rock Barnacle Bill are visible in this color image. The image was taken by Sojourner's rear camera, and shows that the APXS made good contact with Barnacle Bill.

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is an operating division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  14. Application of multispectral color photography to flame flow visualization

    NASA Technical Reports Server (NTRS)

    Stoffers, G.

    1979-01-01

    For flames of short duration and low radiation intensity, spectroscopic flame diagnostics are difficult. In order to find some other means of extracting information about the flame structure from its radiation, the feasibility of using multispectral color photography was successfully evaluated. Since the flame photographs are close-ups, there is considerable parallax between the single images when several cameras are used, and additive color viewing is not possible. Each image must be analyzed individually, so it is advisable to use color film in all cameras. One can either use color films of different spectral sensitivities or color films of the same type with different color filters. Sharp cutting filters are recommended.

  15. Progressive piecewise registration of orthophotos and airborne scanner images

    NASA Astrophysics Data System (ADS)

    Chen, Lin-Chi; Yang, T. T.

    1994-08-01

    From the image-to-image registration point of view, we propose a scheme to iteratively register airborne multi-spectral imagery onto its counterpart, i.e., orthographic photography. The required registration control point pairs are first augmented automatically. Then a local registration procedure is applied according to the generated registration control point pairs. The coordinate transformation uses the thin plate spline function. Through a consistency check, if the disparities between the reference image and the transformed airborne multi-spectral image are too large to accept, the next iteration is performed. During the second iteration, some of the best-matched feature points used in the consistency check of the first iteration are appended to the existing registration control points. This iterative procedure continues until the disparities are small enough. Experimental results indicate that the output image attains an excellent geometrical similarity with respect to the reference image. The rms of the disparities is less than 0.5 pixels.
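
    A hedged sketch of the thin-plate-spline mapping step: given matched control point pairs, warp source-image coordinates onto the reference (orthophoto) frame and report the RMS disparity at check points. The iterative point augmentation and consistency check described in the record are not reproduced; all coordinates below are synthetic.

```python
# Sketch: thin-plate-spline coordinate transform from registration control points.
import numpy as np
from scipy.interpolate import RBFInterpolator

# Control point pairs: (x, y) in the scanner image -> (x, y) in the orthophoto.
src_pts = np.array([[10, 12], [200, 30], [50, 180], [220, 210], [120, 100]], float)
ref_pts = src_pts + np.array([2.5, -1.8])          # synthetic known offset

tps = RBFInterpolator(src_pts, ref_pts, kernel='thin_plate_spline')

# Transform independent check points and measure residual disparities.
check_src = np.array([[80, 60], [150, 150]], float)
check_ref = check_src + np.array([2.5, -1.8])
residuals = tps(check_src) - check_ref
rms = np.sqrt(np.mean(np.sum(residuals**2, axis=1)))
```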

  16. A two-camera imaging system for pest detection and aerial application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This presentation reports on the design and testing of an airborne two-camera imaging system for pest detection and aerial application assessment. The system consists of two digital cameras with 5616 x 3744 effective pixels. One camera captures normal color images with blue, green and red bands, whi...

  17. Airborne Crowd Density Estimation

    NASA Astrophysics Data System (ADS)

    Meynberg, O.; Kuschk, G.

    2013-10-01

    This paper proposes a new method for estimating human crowd densities from aerial imagery. Applications benefiting from an accurate crowd monitoring system are mainly found in the security sector. Normally crowd density estimation is done through in-situ camera systems mounted on high locations, although this is not appropriate in the case of very large crowds with thousands of people. Using airborne camera systems in these scenarios is a new research topic. Our method uses a preliminary filtering of the whole image space by suitable and fast interest point detection, resulting in a number of image regions possibly containing human crowds. Validation of these candidates is done by transforming the corresponding image patches into a low-dimensional and discriminative feature space and classifying the results using a support vector machine (SVM). The feature space is spanned by texture features computed by applying a Gabor filter bank with varying scale and orientation to the image patches. For evaluation, we use 5 different image datasets acquired by the 3K+ aerial camera system of the German Aerospace Center during real mass events like concerts or football games. To evaluate the robustness and generality of our method, these datasets are taken from different flight heights between 800 m and 1500 m above ground (keeping a fixed focal length) and varying daylight and shadow conditions. The results of our crowd density estimation are evaluated against a reference data set obtained by manually labeling tens of thousands of individual persons in the corresponding datasets, and show that our method is able to estimate human crowd densities in challenging realistic scenarios.
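
    An illustrative sketch of the texture-feature stage described above: a Gabor filter bank at several scales and orientations is applied to each candidate patch, each response is summarized by its mean and variance, and the resulting feature vector is classified with an SVM. The filter parameters, patch size, and tiny training set are made up for demonstration and are not taken from the paper.

```python
# Sketch: Gabor filter bank texture features plus an SVM patch classifier.
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def gabor_features(patch, frequencies=(0.1, 0.2, 0.3), n_orient=4):
    """Mean and variance of the real Gabor response at each scale/orientation."""
    feats = []
    for f in frequencies:
        for theta in np.linspace(0, np.pi, n_orient, endpoint=False):
            real, _ = gabor(patch, frequency=f, theta=theta)
            feats += [real.mean(), real.var()]
    return np.array(feats)

rng = np.random.default_rng(1)
patches = rng.random((20, 32, 32))          # stand-ins for candidate image regions
labels = np.tile([0, 1], 10)                # 1 = crowd, 0 = background (synthetic)

X = np.array([gabor_features(p) for p in patches])
clf = SVC(kernel='rbf').fit(X, labels)
prediction = clf.predict(gabor_features(patches[0]).reshape(1, -1))
```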

  18. The use of optical microscope equipped with multispectral detector to distinguish different types of acute lymphoblastic leukemia

    NASA Astrophysics Data System (ADS)

    Pronichev, A. N.; Polyakov, E. V.; Tupitsyn, N. N.; Frenkel, M. A.; Mozhenkova, A. V.

    2017-01-01

    The article describes the use of computer optical microscopy with a multispectral camera to characterize the texture of bone marrow blasts of patients with different variants of acute lymphoblastic leukemia: B- and T-types. Specific characteristics of the chromatin of the nuclei of blasts for different types of acute lymphoblastic leukemia were obtained.

  19. Spectral difference analysis and airborne imaging classification for citrus greening infected trees

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Citrus greening, also called Huanglongbing (HLB), became a devastating disease spread through citrus groves in Florida, since it was first found in 2005. Multispectral (MS) and hyperspectral (HS) airborne images of citrus groves in Florida were acquired to detect citrus greening infected trees in 20...

  20. Mapping giant reed along the Rio Grande using airborne and satellite imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Giant reed (Arundo donax L.) is a perennial invasive weed that presents a severe threat to agroecosystems and riparian areas in the Texas and Mexican portions of the Rio Grande Basin. The objective of this presentation is to give an overview on the use of aerial photography, airborne multispectral a...

  1. Daily evapotranspiration estimates from extrapolating instantaneous airborne remote sensing ET values

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, six extrapolation methods have been compared for their ability to estimate daily crop evapotranspiration (ETd) from instantaneous latent heat flux estimates derived from digital airborne multispectral remote sensing imagery. Data used in this study were collected during an experiment...

  2. Implementation of a neural network for multispectral luminescence imaging of lake pigment paints.

    PubMed

    Chane, Camille Simon; Thoury, Mathieu; Tournié, Aurélie; Echard, Jean-Philippe

    2015-04-01

    Luminescence multispectral imaging is a developing and promising technique in the fields of conservation science and cultural heritage studies. In this article, we present a new methodology for recording the spatially resolved luminescence properties of objects. This methodology relies on the development of a lab-made multispectral camera setup optimized to collect low-yield luminescence images. In addition to a classic data preprocessing procedure to reduce noise on the data, we present an innovative method, based on a neural network algorithm, that allows us to obtain radiometrically calibrated luminescence spectra with increased spectral resolution from the low-spectral resolution acquisitions. After preliminary corrections, a neural network is trained using the 15-band multispectral luminescence acquisitions and corresponding spot spectroscopy luminescence data. This neural network is then used to retrieve a megapixel multispectral cube between 460 and 710 nm with a 5 nm resolution from a low-spectral-resolution multispectral acquisition. The resulting data are independent from the detection chain of the imaging system (filter transmittance, spectral sensitivity of the lens and optics, etc.). As a result, the image cube provides radiometrically calibrated emission spectra with increased spectral resolution. For each pixel, we can thus retrieve a spectrum comparable to those obtained with conventional luminescence spectroscopy. We apply this method to a panel of lake pigment paints and discuss the pertinence and perspectives of this new approach.
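
    A conceptual sketch of the spectral-reconstruction step: a small neural network maps a 15-band luminescence measurement to a 51-point emission spectrum (460 to 710 nm sampled every 5 nm). The network size and the synthetic training pairs are assumptions; the record describes training on spot-spectroscopy reference measurements, not random data.

```python
# Sketch: train a regressor that maps multispectral band values to a finer spectrum.
import numpy as np
from sklearn.neural_network import MLPRegressor

n_bands, n_wavelengths = 15, 51            # 460-710 nm at 5 nm resolution
rng = np.random.default_rng(0)

# Placeholder training set: band measurements and corresponding reference spectra.
X_train = rng.random((500, n_bands))
Y_train = rng.random((500, n_wavelengths))

net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
net.fit(X_train, Y_train)

# Reconstruct spectra for every pixel of a multispectral cube.
cube = rng.random((100, 100, n_bands))
spectra = net.predict(cube.reshape(-1, n_bands)).reshape(100, 100, n_wavelengths)
```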

  3. A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data

    PubMed Central

    Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem

    2016-01-01

    The main objective of this study is to determine the importance of a machine vision approach for the classification of five types of land cover data: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover data types accurately. Multispectral data of these land covers were acquired using a handheld device, a multispectral radiometer, in the form of five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features for each image. The most discriminant 30 features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). Selected texture data clustering was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN: n-class). By implementing a cross validation method (80-20), we obtained an accuracy of 91.332% for the texture data and 96.40% for the multispectral data, respectively. PMID:27376088
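
    A minimal sketch of the classification stage described above: an 80-20 train/test split and a multilayer perceptron over per-sample multispectral features (five bands). The Fisher / Probability-of-Error / Mutual-Information feature selection mentioned in the abstract is not reproduced, and the data below are synthetic placeholders.

```python
# Sketch: 80-20 split and ANN classification of five-band multispectral samples.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((500, 5))                    # blue, green, red, NIR, SWIR values
y = rng.integers(0, 5, 500)                 # five land cover classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
accuracy = accuracy_score(y_te, clf.predict(X_te))
```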

  4. A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data.

    PubMed

    Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem

    2016-01-01

    The main objective of this study is to determine the importance of a machine vision approach for the classification of five types of land cover data: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover data types accurately. Multispectral data of these land covers were acquired using a handheld device, a multispectral radiometer, in the form of five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features for each image. The most discriminant 30 features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). Selected texture data clustering was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN: n-class). By implementing a cross validation method (80-20), we obtained an accuracy of 91.332% for the texture data and 96.40% for the multispectral data, respectively.

  5. Multispectral Analysis of NMR Imagery

    NASA Technical Reports Server (NTRS)

    Butterfield, R. L.; Vannier, M. W. And Associates; Jordan, D.

    1985-01-01

    Conference paper discusses initial efforts to adapt multispectral satellite-image analysis to nuclear magnetic resonance (NMR) scans of human body. Flexibility of these techniques makes it possible to present NMR data in variety of formats, including pseudocolor composite images of pathological internal features. Techniques do not have to be greatly modified from form in which used to produce satellite maps of such Earth features as water, rock, or foliage.

  6. Multispectral Microscopic Imager (MMI): Multispectral Imaging of Geological Materials at a Handlens Scale

    NASA Astrophysics Data System (ADS)

    Farmer, J. D.; Nunez, J. I.; Sellar, R. G.; Gardner, P. B.; Manatt, K. S.; Dingizian, A.; Dudik, M. J.; McDonnell, G.; Le, T.; Thomas, J. A.; Chu, K.

    2011-12-01

    The Multispectral Microscopic Imager (MMI) is a prototype instrument presently under development for future astrobiological missions to Mars. The MMI is designed to be an arm-mounted rover instrument for use in characterizing the microtexture and mineralogy of materials along geological traverses [1,2,3]. Such geological information is regarded as essential for interpreting petrogenesis and geological history, and when acquired in near real-time, can support hypothesis-driven exploration and optimize science return. Correlated microtexture and mineralogy also provide essential data for selecting samples for analysis with onboard lab instruments, and for prioritizing samples for potential Earth return. The MMI design employs multispectral light-emitting diodes (LEDs) and an uncooled focal plane array to achieve the low mass (<1 kg), low cost, and high reliability (no moving parts) required for an arm-mounted instrument on a planetary rover [2,3]. The MMI acquires multispectral, reflectance images at 62 μm/pixel, in which each image pixel comprises a 21-band VNIR spectrum (0.46 to 1.73 μm). This capability enables the MMI to discriminate and resolve the spatial distribution of minerals and textures at the microscale [2,3]. By extending the spectral range into the infrared, and increasing the number of spectral bands, the MMI exceeds the capabilities of current microimagers, including the MER Microscopic Imager (MI) [4], the Phoenix mission Robotic Arm Camera (RAC) [5], and the Mars Science Laboratory's Mars Hand Lens Imager (MAHLI) [6]. In this report we review the capabilities of the MMI by highlighting recent lab and field applications, including: 1) glove box deployments in the Astromaterials lab at Johnson Space Center to analyze Apollo lunar samples; 2) GeoLab glove box deployments during the 2011 Desert RATS field trials in northern AZ to characterize analog materials collected by astronauts during simulated EVAs; 3) field deployments on Mauna Kea

  7. Improved capabilities of the Multispectral Atmospheric Mapping Sensor (MAMS)

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Batson, K. Bryan; Atkinson, Robert J.; Moeller, Chris C.; Menzel, W. Paul; James, Mark W.

    1989-01-01

    The Multispectral Atmospheric Mapping Sensor (MAMS) is an airborne instrument being investigated as part of NASA's high altitude research program. Findings from work on this and other instruments have been important as the scientific justification of new instrumentation for the Earth Observing System (EOS). This report discusses changes to the instrument which have led to new capabilities, improved data quality, and more accurate calibration methods. In order to provide a summary of the data collected with MAMS, a complete list of flight dates and locations is provided. For many applications, registration of MAMS imagery with landmarks is required. The navigation of this data on the Man-computer Interactive Data Access System (McIDAS) is discussed. Finally, research applications of the data are discussed and specific examples are presented to show the applicability of these measurements to NASA's Earth System Science (ESS) objectives.

  8. Multispectral determination of vegetative cover in corn crop canopy

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.

    1972-01-01

    The relationship between different amounts of vegetative ground cover and the energy reflected by corn canopies was investigated. Low altitude photography and an airborne multispectral scanner were used to measure this reflected energy. Field plots were laid out, representing four growth stages of corn. Two plot locations were chosen, on a very dark and a very light surface soil. Color and color infrared photographs were taken from a vertical distance of 10 m. Estimates of ground cover were made from these photographs and were related to field measurements of leaf area index. Ground cover could be predicted from leaf area index measurements by a second order equation. Microdensitometry and digitization of the three separated dye layers of color infrared film showed that the near infrared dye layer is most valuable in ground cover determinations. Computer analysis of the digitized photography provided an accurate method of determining percent ground cover.

  9. Gimbaled multispectral imaging system and method

    DOEpatents

    Brown, Kevin H.; Crollett, Seferino; Henson, Tammy D.; Napier, Matthew; Stromberg, Peter G.

    2016-01-26

    A gimbaled multispectral imaging system and method is described herein. In a general embodiment, the gimbaled multispectral imaging system has a cross support that defines a first gimbal axis and a second gimbal axis, wherein the cross support is rotatable about the first gimbal axis. The gimbaled multispectral imaging system comprises a telescope that is fixed to an upper end of the cross support, such that rotation of the cross support about the first gimbal axis alters the tilt of the telescope. The gimbaled multispectral imaging system includes optics that facilitate on-gimbal detection of visible light and off-gimbal detection of infrared light.

  10. Introducing a Low-Cost Mini-UAV for Thermal- and Multispectral-Imaging

    NASA Astrophysics Data System (ADS)

    Bendig, J.; Bolten, A.; Bareth, G.

    2012-07-01

    The trend to minimize electronic devices also applies to Unmanned Airborne Vehicles (UAVs) as well as to sensor technologies and imaging devices. Consequently, it is not surprising that UAVs are already part of our daily life and the current pace of development will increase civil applications. A well known and already widespread example is the so-called flying video game based on Parrot's AR.Drone, which is remotely controlled by an iPod, iPhone, or iPad (http://ardrone.parrot.com). The latter can be considered a low-weight and low-cost Mini-UAV. In this contribution a Mini-UAV is considered to weigh less than 5 kg and to be able to carry 0.2 kg to 1.5 kg of sensor payload. While up to now Mini-UAVs like Parrot's AR.Drone are mainly equipped with RGB cameras for videotaping or imaging, the development of such carriage systems is clearly also moving toward multi-sensor platforms like the ones introduced for larger UAVs (5 to 20 kg) by Jaakkolla et al. (2010) for forestry applications or by Berni et al. (2009) for agricultural applications. The problem when designing a Mini-UAV for multi-sensor imaging is the limitation of payload of up to 1.5 kg and a total weight of the whole system below 5 kg. Consequently, the Mini-UAV without sensors but including navigation system and GPS sensors must weigh less than 3.5 kg. A Mini-UAV system with these characteristics is HiSystems' MK-Okto (www.mikrokopter.de). Total weight including battery without sensors is less than 2.5 kg. Payload of a MK-Okto is approx. 1 kg and maximum speed is around 30 km/h. The MK-Okto can be operated up to a wind speed of less than 19 km/h, which corresponds to Beaufort scale number 3. In our study, the MK-Okto is equipped with a handheld low-weight NEC F30IS thermal imaging system. The F30IS, which was developed for veterinary applications, covers 8 to 13 μm and weighs only 300 g

  11. High resolution multispectral photogrammetric imagery: enhancement, interpretation and evaluations

    NASA Astrophysics Data System (ADS)

    Roberts, Arthur; Haefele, Martin; Bostater, Charles; Becker, Thomas

    2007-10-01

    A variety of aerial mapping cameras were adapted and developed into simulated multiband digital photogrammetric mapping systems. Direct digital multispectral, two multiband cameras (IIS 4-band and Itek 9-band), and paired mapping and reconnaissance cameras were evaluated for digital spectral performance and photogrammetric mapping accuracy in an aquatic environment. Aerial films (24 cm x 24 cm format) tested were: Agfa color negative and extended red (visible and near infrared) panchromatic, and Kodak color infrared and B&W (visible and near infrared) infrared. All films were negative processed to published standards and digitally converted at either 16 (color) or 10 (B&W) microns. Excellent precision in the digital conversions was obtained, with scanning errors of less than one micron. Radiometric data conversion was undertaken using linear density conversion and centered 8-bit histogram exposure. This resulted in multiple 8-bit spectral image bands that were unaltered (not radiometrically enhanced) "optical count" conversions of film density. This provided the best film density conversion to a digital product while retaining the original film density characteristics. Data covering water depth, water quality, surface roughness, and bottom substrate were acquired using different measurement techniques as well as different techniques to locate sampling points on the imagery. Despite extensive efforts to obtain accurate ground truth data, location errors, measurement errors, and variations in the correlation between water depth and remotely sensed signal persisted. These errors must be considered endemic and may not be removed through even the most elaborate sampling set-up. Results indicate that multispectral photogrammetric systems offer improved feature mapping capability.

  12. Multispectral comparison of water ice deposits observed on cometary nuclei

    NASA Astrophysics Data System (ADS)

    Oklay Vincent, Nilda; Sunshine, Jessica M.; Pajola, Maurizio; Pommerol, Antoine; Vincent, Jean-Baptiste; Sierks, Holger; OSIRIS Team

    2016-10-01

    Cometary missions Deep Impact, EPOXI and Rosetta investigated the nuclei of comets 9P/Tempel 1, 103P/Hartley 2 and 67P/Churyumov-Gerasimenko, respectively. Each of these three missions was equipped with multispectral cameras, allowing imaging at various wavelengths from NUV to NIR. In this spectral range, water ice-rich features display bluer spectral slopes than the average surface and some have very flat spectra. Features enriched in water ice are bright in the monochromatic images and are blue in the RGB color composites generated using images taken at NUV, visible and NIR wavelengths. Using these properties, water ice-rich features were detected on the nuclei of comets 9P [1], 103P [2] and 67P [3] via multispectral imaging cameras. Moreover, there were visual detections of jets and outbursts associated with some of these water ice-rich features when the right observing conditions were fulfilled [4, 5]. We analyzed multispectral properties of different types of water ice-rich features [3] observed via OSIRIS NAC on comet 67P in the wavelength range of 260 nm to 1000 nm and then compared them with those observed on comets 9P and 103P. Our multispectral analysis shows that the water ice deposits observed on comet 9P are very similar to the large bright blue clusters observed on comet 67P, while the large water ice deposit observed on comet 103P is similar to the large isolated water ice-rich features observed on comet 67P. The ice-rich deposits on comet 103P are the bluest of any comet, which indicates that the deposits on 103P contain more water ice than the ones observed on comets 9P and 67P [6]. [1] Sunshine et al. 2006, Science; [2] Sunshine et al. 2011, LPSC; [3] Pommerol et al. 2015, A&A; [4] Oklay et al. 2016, A&A; [5] Vincent et al. 2016, A&A; [6] Oklay et al. 2016, submitted.

  13. A snapshot multispectral imager with integrated tiled filters and optical duplication

    NASA Astrophysics Data System (ADS)

    Geelen, Bert; Tack, Nicolaas; Lambrechts, Andy

    2013-03-01

    Although the potential of spectral imaging has been demonstrated in research environments, its adoption by industry has so far been limited due to the lack of high speed, low cost and compact spectral cameras. We have previously presented work to overcome this limitation by monolithically integrating optical interference filters on top of standard CMOS image sensors for high resolution pushbroom hyperspectral cameras. These cameras require a scanning of the scene and therefore introduce operator complexity due to the need for synchronization and alignment of the scanning to the camera. This typically leads to problems with motion blur, reduced SNR in high speed applications and detection latency and overall restricts the types of applications that can use this system. This paper introduces a novel snapshot multispectral imager concept based on optical filters monolithically integrated on top of a standard CMOS image sensor. By using monolithic integration for the dedicated, high quality spectral filters at its core, it enables the use of mass-produced fore-optics, reducing the total system cost. It overcomes the problems mentioned for scanning applications by snapshot acquisition, where an entire multispectral data cube is sensed at one discrete point in time. This is achieved by applying a novel, tiled filter layout and an optical sub-system which simultaneously duplicates the scene onto each filter tile. Through the use of monolithically integrated optical filters it retains the qualities of compactness, low cost and high acquisition speed, differentiating it from other snapshot spectral cameras based on heterogeneously integrated custom optics. Moreover, thanks to a simple cube assembly process, it enables real-time, low-latency operation. Our prototype camera can acquire multispectral image cubes of 256x256 pixels over 32 bands in the spectral range of 600-1000nm at a speed of about 30 cubes per second at daylight conditions up to 340 cubes per second at higher

  14. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1995-01-01

    This publication is the third containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in this volume; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  15. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1995-01-01

    This publication is the second volume of the summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop on January 25-26. The summaries for this workshop appear in volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop on January 26. The summaries for this workshop appear in this volume.

  16. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5. The summaries are contained in Volumes 1, 2, and 3, respectively.

  17. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  18. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1995-01-01

    This publication is the first of three containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in this volume; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in Volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  19. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  20. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C., October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, October 25-26 (the summaries for this workshop appear in this volume, Volume 1); The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27 (the summaries for this workshop appear in Volume 2); and The Airborne Synthetic Aperture Radar (AIRSAR) workshop, October 28-29 (the summaries for this workshop appear in Volume 3).

  1. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1993-01-01

    This is volume 2 of a three volume set of publications that contain the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on October 25-26. The summaries for this workshop appear in Volume 1. The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27. The summaries for this workshop appear in Volume 2. The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29. The summaries for this workshop appear in Volume 3.

  2. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Spectrometer (AVIRIS) workshop, on October 25-26, whose summaries appear in Volume 1; The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27, whose summaries appear in Volume 2; and The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29, whose summaries appear in this volume, Volume 3.

  3. Joint spatio-spectral based edge detection for multispectral infrared imagery.

    SciTech Connect

    Krishna, Sanjay; Hayat, Majeed M.; Bender, Steven C.; Sharma, Yagya D.; Jang, Woo-Yong; Paskalva, Biliana S.

    2010-06-01

    Image segmentation is one of the most important and difficult tasks in digital image processing. It represents a key stage of automated image analysis and interpretation. Segmentation algorithms for gray-scale images utilize basic properties of intensity values such as discontinuity and similarity. However, it is possible to enhance edge-detection capability by means of spectral information provided by multispectral (MS) or hyperspectral (HS) imagery. In this paper we consider image segmentation algorithms for multispectral images with particular emphasis on detection of multi-color or multispectral edges. More specifically, we report on an algorithm for joint spatio-spectral (JSS) edge detection. By joint we mean simultaneous utilization of spatial and spectral characteristics of a given MS or HS image. The JSS-based edge-detection approach, termed the Spectral Ratio Contrast (SRC) edge-detection algorithm, utilizes the novel concept of matching edge signatures. The edge signature represents a combination of spectral ratios calculated using bands that enhance the spectral contrast between the two materials. In conjunction with a spatial mask, the edge signature gives rise to a multispectral operator that can be viewed as a three-dimensional extension of the mask. In the extended mask, the third (spectral) dimension of each hyper-pixel can be chosen independently. The SRC is verified using MS and HS imagery from a quantum-dot-in-a-well infrared (IR) focal plane array and the Airborne Hyperspectral Imager.
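
    A loose illustration of the idea of exploiting spectral ratios for edge detection: form a ratio of two bands chosen to maximize contrast between two materials, then apply a standard spatial gradient operator to the ratio image. This is a deliberate simplification; the SRC algorithm described above matches edge signatures with a three-dimensional mask rather than running a plain Sobel filter, and the band choices below are arbitrary.

```python
# Sketch: band-ratio image followed by a spatial gradient, as a crude
# stand-in for joint spatio-spectral edge detection.
import numpy as np
from scipy import ndimage

def spectral_ratio_edges(cube, band_a, band_b, eps=1e-6):
    """cube: (rows, cols, bands); returns gradient magnitude of the band ratio."""
    ratio = cube[:, :, band_a] / (cube[:, :, band_b] + eps)
    gx = ndimage.sobel(ratio, axis=1)
    gy = ndimage.sobel(ratio, axis=0)
    return np.hypot(gx, gy)

cube = np.random.rand(128, 128, 8)          # synthetic 8-band image
edges = spectral_ratio_edges(cube, band_a=4, band_b=2)
```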

  4. A multispectral scanner survey of the United States Department of Energy's Paducah Gaseous Diffusion Plant

    SciTech Connect

    Not Available

    1991-06-01

    Airborne multispectral scanner data of the Paducah Gaseous Diffusion Plant (PGDP) and surrounding area were acquired during late spring 1990. This survey was conducted by the Remote Sensing Laboratory (RSL), which is operated by EG&G Energy Measurements (EG&G/EM) for the US Department of Energy (DOE) Nevada Operations Office. It was requested by the US Department of Energy (DOE) Environmental Audit Team, which was reviewing environmental conditions at the facility. The objectives of this survey were to: (1) acquire 12-channel, multispectral scanner data of the PGDP from an altitude of 3000 feet above ground level (AGL); (2) acquire predawn, digital thermal infrared (TIR) data of the site from the same altitude; (3) collect color and color-infrared (CIR) aerial photographs over the facilities; and (4) illustrate how the analyses of these data could benefit environmental monitoring at the PGDP. This report summarizes the two multispectral scanner and aerial photographic missions at the Paducah Gaseous Diffusion Plant. Selected examples of the multispectral data are presented to illustrate its potential for aiding environmental management at the site. 4 refs., 1 fig., 2 tabs.

  5. A multispectral scanner survey of the Tonopah Test Range, Nevada. Date of survey: August 1993

    SciTech Connect

    Brewster, S.B. Jr.; Howard, M.E.; Shines, J.E.

    1994-08-01

    The Multispectral Remote Sensing Department of the Remote Sensing Laboratory conducted an airborne multispectral scanner survey of a portion of the Tonopah Test Range, Nevada. The survey was conducted on August 21 and 22, 1993, using a Daedalus AADS1268 scanner and coincident aerial color photography. Flight altitudes were 5,000 feet (1,524 meters) above ground level for systematic coverage and 1,000 feet (304 meters) for selected areas of special interest. The multispectral scanner survey was initiated as part of an interim and limited investigation conducted to gather preliminary information regarding historical hazardous material release sites which could have environmental impacts. The overall investigation also includes an inventory of environmental restoration sites, a ground-based geophysical survey, and an aerial radiological survey. The multispectral scanner imagery and coincident aerial photography were analyzed for the detection, identification, and mapping of man-made soil disturbances. Several standard image enhancement techniques were applied to the data to assist image interpretation. A geologic ratio enhancement and a color composite consisting of AADS1268 channels 10, 7, and 9 (mid-infrared, red, and near-infrared spectral bands) proved most useful for detecting soil disturbances. A total of 358 disturbance sites were identified on the imagery and mapped using a geographic information system. Of these sites, 326 were located within the Tonopah Test Range while the remaining sites were present on the imagery but outside the site boundary. The mapped site locations are being used to support ongoing field investigations.

  6. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  7. Sandia multispectral analyst remote sensing toolkit (SMART).

    SciTech Connect

    Post, Brian Nelson; Smith, Jody Lynn; Geib, Peter L.; Nandy, Prabal; Wang, Nancy Nairong

    2003-03-01

    This remote sensing science and exploitation work focused on exploitation algorithms and methods targeted at the analyst. SMART is a 'plug-in' to commercial remote sensing software that provides algorithms to enhance the utility of the Multispectral Thermal Imager (MTI) and other multispectral satellite data. This toolkit has been licensed to 22 government organizations.

  8. A multispectral sorting device for wheat kernels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A low-cost multispectral sorting device was constructed using three visible and three near-infrared light-emitting diodes (LED) with peak emission wavelengths of 470 nm (blue), 527 nm (green), 624 nm (red), 850 nm, 940 nm, and 1070 nm. The multispectral data were collected by rapidly (~12 kHz) blin...

  9. PORTABLE MULTISPECTRAL IMAGING INSTRUMENT FOR FOOD INDUSTRY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this paper is to design and fabricate a hand-held multispectral instrument for real-time contaminant detection. Specifically, the protocol to develop a portable multispectral instrument including optical sensor design, fabrication, calibration, data collection, analysis and algorith...

  10. Multispectral Image Processing for Plants

    NASA Technical Reports Server (NTRS)

    Miles, Gaines E.

    1991-01-01

    The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.

  11. Multispectral thermal infrared mapping of the 1 October 1988 Kupaianaha flow field, Kilauea volcano, Hawaii

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J.; Hon, Ken; Kahle, Anne B.; Abbott, Elsa A.; Pieri, David C.

    1992-01-01

    Multispectral thermal infrared radiance measurements of the Kupaianaha flow field were acquired with the NASA airborne Thermal Infrared Multispectral Scanner (TIMS) on the morning of 1 October 1988. The TIMS data were used to map both the temperature and emissivity of the surface of the flow field. The temperature map depicted the underground storage and transport of lava. The presence of molten lava in a tube or tumulus resulted in surface temperatures that were at least 10 C above ambient. The temperature map also clearly defined the boundaries of hydrothermal plumes which resulted from the entry of lava into the ocean. The emissivity map revealed the boundaries between individual flow units within the Kupaianaha field. Distinct spectral anomalies, indicative of silica-rich surface materials, were mapped near fumaroles and ocean entry sites. This apparent enrichment in silica may have resulted from an acid-induced leaching of cations from the surfaces of glassy flows.

  12. Monitoring of maize chlorophyll content based on multispectral vegetation indices

    NASA Astrophysics Data System (ADS)

    Sun, Hong; Li, Minzan; Zheng, Lihua; Zhang, Yane; Zhang, Yajing

    2012-11-01

    In order to estimate the nutrient status of maize, multi-spectral images were used to monitor the chlorophyll content in the field. The experiments were conducted under three different fertilizer treatments (High, Normal and Low). A multispectral CCD camera was used to collect ground-based images of the maize canopy in the green (G, 520~600 nm), red (R, 630~690 nm) and near-infrared (NIR, 760~900 nm) bands. Leaves of maize were randomly sampled to determine the chlorophyll content with a UV-Vis spectrophotometer. The images were processed through image preprocessing, canopy segmentation and parameter calculation: first, median filtering was used to improve the visual contrast of the image; second, the leaves of the maize canopy were segmented in the NIR image; third, the average gray values (GIA, RIA and NIRIA) and the vegetation indices (DVI, RVI, NDVI, etc.) widely used in remote sensing were calculated. A new vegetation index, the combination of normalized difference vegetation index (CNDVI), was developed. After the correlation analysis between image parameters and chlorophyll content, six parameters (GIA, RIA, NIRIA, GRVI, GNDVI and CNDVI) were selected to estimate chlorophyll content at the shooting and trumpet stages, respectively. The results of the MLR predicting models showed that R2 was 0.88 and the adjusted R2 was 0.64 at the shooting stage, while R2 was 0.77 and the adjusted R2 was 0.31 at the trumpet stage. It was indicated that vegetation indices derived from multispectral images could be used to monitor the chlorophyll content, providing a feasible method for chlorophyll content detection.
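
    A small sketch of the vegetation-index step: NDVI and GNDVI are computed from the green, red and NIR band images. The abstract does not give the formula for its new CNDVI index, so only the standard indices are shown, and the arrays below are placeholders for the segmented canopy images.

```python
# Sketch: compute NDVI and GNDVI maps from co-registered band images.
import numpy as np

def ndvi(nir, red, eps=1e-6):
    return (nir - red) / (nir + red + eps)

def gndvi(nir, green, eps=1e-6):
    return (nir - green) / (nir + green + eps)

g   = np.random.rand(480, 640)   # green band (520-600 nm), placeholder data
r   = np.random.rand(480, 640)   # red band (630-690 nm)
nir = np.random.rand(480, 640)   # near-infrared band (760-900 nm)

ndvi_map, gndvi_map = ndvi(nir, r), gndvi(nir, g)
mean_indices = [float(m.mean()) for m in (ndvi_map, gndvi_map)]
```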

  13. Large Multispectral and Albedo Panoramas Acquired by the Pancam Instruments on the Mars Exploration Rovers Spirit and Opportunity

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Arneson, H. M.; Farrand, W. H.; Goetz, W.; Hayes, A. G.; Herkenhoff, K.; Johnson, M. J.; Johnson, J. R.; Joseph, J.; Kinch, K.

    2005-01-01

    Introduction. The panoramic camera (Pancam) multispectral, stereoscopic imaging systems on the Mars Exploration Rovers Spirit and Opportunity [1] have acquired and downlinked more than 45,000 images (35 Gbits of data) over more than 700 combined sols of operation on Mars as of early January 2005. A large subset of these images was acquired as part of 26 large multispectral and/or broadband "albedo" panoramas (15 on Spirit, 11 on Opportunity) covering large ranges of azimuth (12 spanning 360°) and designed to characterize major regional color and albedo characteristics of the landing sites and various points along both rover traverses.

  14. Classification of emerald based on multispectral image and PCA

    NASA Astrophysics Data System (ADS)

    Yang, Weiping; Zhao, Dazun; Huang, Qingmei; Ren, Pengyuan; Feng, Jie; Zhang, Xiaoyan

    2005-02-01

    Traditionally, the grade discrimination and classification of bowlders (emeralds) are implemented using methods based on people's experience. In our previous work, a method based on the NCS (Natural Color System) color system and sRGB color space conversion was employed for a coarse grade classification of emeralds. However, it is well known that the color match of two colors is not a true "match" unless their spectra are the same. Because metameric colors cannot be differentiated by a three-channel (RGB) camera, a multispectral camera (MSC) is used as the image capturing device in this paper. It consists of a trichromatic digital camera and a set of wide-band filters. The spectra are obtained by measuring a series of natural bowlder (emerald) samples. Principal component analysis (PCA) is employed to obtain spectral eigenvectors. During the fine classification, the color difference and the RMS of the spectrum difference between estimated and original spectra are used as criteria. It has been shown that 6 eigenvectors are enough to reconstruct the reflection spectra of the testing samples.
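
    A minimal sketch of the PCA step: six principal components are fitted to a set of measured reflectance spectra, a test spectrum is reconstructed from them, and the fit is scored by the RMS of the spectral difference. The spectra below are random placeholders for the measured emerald samples, and the wavelength sampling is an assumption.

```python
# Sketch: 6-component PCA reconstruction of reflectance spectra with RMS error.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
spectra = rng.random((60, 101))             # 60 samples x 101 wavelength points

pca = PCA(n_components=6).fit(spectra)

test = rng.random((1, 101))
reconstructed = pca.inverse_transform(pca.transform(test))
rms_error = float(np.sqrt(np.mean((test - reconstructed) ** 2)))
```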

  15. Time-of-Flight Microwave Camera

    PubMed Central

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598

  16. Time-of-Flight Microwave Camera

    NASA Astrophysics Data System (ADS)

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.

  17. Multispectral vegetative canopy parameter retrieval

    NASA Astrophysics Data System (ADS)

    Borel, Christoph C.; Bunker, David J.

    2011-11-01

    Precision agriculture, forestry and environmental remote sensing are applications uniquely suited to the 8 bands that DigitalGlobe's WorldView-2 provides. At the fine spatial resolution of 0.5 m (panchromatic) and 2 m (multispectral), individual trees can be readily resolved. Recent research [1] has shown that it is possible to invert plant reflectance spectra from hyperspectral data and estimate nitrogen content, leaf water content, leaf structure, canopy leaf area index and, for sparse canopies, also soil reflectance. The retrieval is based on inverting the SAIL (Scattering by Arbitrary Inclined Leaves) vegetation radiative transfer model for the canopy structure and the reflectance model PROSPECT4/5 for the leaf reflectance. Work on that paper [1] confirmed that a limited number of adjacent bands covering just the visible and near infrared can retrieve the parameters as well, opening up the possibility that this method can be used to analyze multispectral WV-2 data. Thus it seems possible to create WV-2-specific inversions using 8 bands and apply them to imagery of various vegetation-covered surfaces of agricultural and environmental interest. The capability of retrieving leaf water content and nitrogen content has important applications in determining the health of vegetation, e.g. plant growth status, disease mapping, quantitative drought assessment, nitrogen deficiency, plant vigor, yield, etc.
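
    A generic sketch of canopy-parameter retrieval by model inversion: parameters are adjusted until a forward reflectance model reproduces the observed 8-band reflectances. The toy_forward_model below is a made-up stand-in; a real retrieval would call a PROSPECT/SAIL implementation in its place, and the parameter set, bounds, and noise level are all hypothetical.

```python
# Sketch: least-squares inversion of a (toy) forward reflectance model.
import numpy as np
from scipy.optimize import least_squares

BAND_IDX = np.arange(8)                     # the eight multispectral bands

def toy_forward_model(params, bands=BAND_IDX):
    lai, chlorophyll = params
    # Hypothetical smooth dependence of reflectance on LAI and chlorophyll.
    return 0.05 + 0.4 * (1 - np.exp(-0.5 * lai)) * np.exp(-0.002 * chlorophyll * bands)

# Synthetic "observation": model output for known parameters plus noise.
observed = toy_forward_model([3.0, 40.0]) + np.random.default_rng(0).normal(0, 0.005, 8)

def residuals(params):
    return toy_forward_model(params) - observed

fit = least_squares(residuals, x0=[1.0, 20.0], bounds=([0.0, 0.0], [8.0, 100.0]))
lai_est, chl_est = fit.x
```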

  18. Classification by Using Multispectral Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Liao, C. T.; Huang, H. H.

    2012-07-01

    Remote sensing images are generally recorded in a two-dimensional format containing multispectral information. The semantic information is clearly visualized, so ground features can be readily recognized and classified via supervised or unsupervised classification methods. Nevertheless, the shortcomings of multispectral images are that they depend highly on light conditions and that the classification results lack three-dimensional semantic information. On the other hand, LiDAR has become a main technology for acquiring high-accuracy point cloud data. The advantages of LiDAR are a high data acquisition rate, independence from light conditions, and the ability to directly produce three-dimensional coordinates. However, compared with multispectral images, the disadvantage is the shortage of multispectral information, which remains a challenge in ground feature classification through massive point cloud data. Consequently, by combining the advantages of both LiDAR and multispectral images, point cloud data with three-dimensional coordinates and multispectral information can provide an integrated solution for point cloud classification. Therefore, this research acquires visible-light and near-infrared images via close-range photogrammetry and matches the images automatically through a free online service to generate a multispectral point cloud. A three-dimensional affine coordinate transformation is then used to compare the data increment. Finally, given thresholds on height and color information are used for classification.
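
    A simple sketch of threshold-based classification of a multispectral point cloud: each point carries x, y, z plus red and near-infrared values and is labeled by a height threshold combined with an NDVI-like color criterion. The threshold values and class names are illustrative assumptions, not those used in the study, and the points are synthetic.

```python
# Sketch: classify multispectral point cloud points with height and color thresholds.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
xyz = rng.random((n, 3)) * [50.0, 50.0, 10.0]      # coordinates in metres
red, nir = rng.random(n), rng.random(n)            # per-point spectral values

ndvi = (nir - red) / (nir + red + 1e-6)
labels = np.full(n, "ground", dtype=object)
labels[(xyz[:, 2] > 2.0) & (ndvi > 0.3)] = "vegetation"
labels[(xyz[:, 2] > 2.0) & (ndvi <= 0.3)] = "building"
```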

  19. Adaptive ladar receiver for multispectral imaging

    NASA Astrophysics Data System (ADS)

    Johnson, Kenneth; Vaidyanathan, Mohan; Xue, Song; Tennant, William E.; Kozlowski, Lester J.; Hughes, Gary W.; Smith, Duane D.

    2001-09-01

    We are developing a novel 2D focal plane array (FPA) with a read-out integrated circuit (ROIC) on a single chip for 3D laser radar imaging. The ladar will provide high-resolution range and range-resolved intensity images for detection and identification of difficult targets. The initial full imaging-camera-on-a-chip system will be a 64 by 64 element, 100-micrometer pixel-size detector array that is directly bump bonded to a low-noise 64 by 64 array silicon CMOS-based ROIC. The architecture is scalable to 256 by 256 or larger arrays depending on the system application. The system will provide all the required electronic processing at the pixel level, and the smart FPA enables the 3D or 4D format data to be captured directly with a single laser pulse. The detector arrays are made of uncooled InGaAs PIN devices for SWIR imaging at the 1.5 micrometer wavelength and cooled HgCdTe PIN devices for MWIR imaging at the 3.8 micrometer wavelength. We are also investigating concepts using multi-color detector arrays for simultaneous imaging at multiple wavelengths that would provide additional spectral dimension capability for enhanced detection and identification of deep-hide targets. The system is suited for flash ladar imaging, combat identification of ground targets from airborne platforms, flash-ladar imaging seekers, and autonomous robotic/automotive vehicle navigation and collision avoidance applications.

  20. On-board multispectral classification study

    NASA Technical Reports Server (NTRS)

    Ewalt, D.

    1979-01-01

    The factors relating to onboard multispectral classification were investigated. The functions implemented in ground-based processing systems for current Earth observation sensors were reviewed. The Multispectral Scanner, Thematic Mapper, Return Beam Vidicon, and Heat Capacity Mapper were studied. The concept of classification was reviewed and extended from the ground-based image processing functions to an onboard system capable of multispectral classification. Eight different onboard configurations, each with varying amounts of ground-spacecraft interaction, were evaluated. Each configuration was evaluated in terms of turnaround time, onboard processing and storage requirements, geometric and classification accuracy, onboard complexity, and ancillary data required from the ground.

  1. US open-skies follow-on evaluation program, multispectral hyperspectral (MSHS) sensor survey

    SciTech Connect

    Ryan, R.; Del Guidice, P.; Smith, L.; Soel, M.

    1996-10-01

    The Follow-On Sensor Evaluation Program (FOSEP) has evaluated the potential benefits of hyperspectral and multispectral sensor additions to the existing sensor suite on the U.S. Open Skies aircraft and recommends a multispectral sensor for implementation in the Open Skies program. Potential enhancements to the Open Skies missions include environmental monitoring and improved treaty verification capabilities. A previous study indicated the inadequacy of the current U.S. Open Skies aircraft sensor suite, with or without modifications to existing sensors, in the performance of environmental monitoring. The most beneficial modifications identified in this previous report were multispectral modifications of the existing sensor suite. However, even with these modifications significant inadequacies for many environmental and other missions would still remain. That study concluded that enhancement of Open Skies missions could be achieved with the addition of sensors specifically designed for multispectral imagery. In this current study, we examined and compared a wide range of commercially available airborne imaging spectrometers for a host of Open Skies missions, including both environmental and military applications. A set of tables and figures was developed ensuring that the evaluated sensors matched the Open Skies parameters of interest and flight profiles. These tables provide a common basis for comparing commercially available sensor systems for their suitability to Open Skies missions. Several figures of merit were used to compare different sensors, including ground sample distance (GSD), number of spectral bands, signal-to-noise ratio and exportability. This methodology was applied to potential Open Skies multispectral and hyperspectral sensors. Rankings were developed using these figures of merit and constraints imposed by the existing and future platforms. 6 refs., 2 figs., 4 tabs.

  2. An operational multispectral scanner for bathymetric surveys - The ABS NORDA scanner

    NASA Technical Reports Server (NTRS)

    Haimbach, Stephen P.; Joy, Richard T.; Hickman, G. Daniel

    1987-01-01

    The Naval Ocean Research and Development Activity (NORDA) is developing the Airborne Bathymetric Survey (ABS) system, which will take shallow water depth soundings from a Navy P-3 aircraft. The system combines active and passive sensors to obtain optical measurements of water depth. The ABS NORDA Scanner is the system's passive multispectral scanner, whose design goal is to provide 100 percent coverage of the seafloor to depths of 20 m in average coastal waters. The ABS NORDA Scanner hardware and operational environment are discussed in detail. The optical model providing the basis for depth extraction is reviewed and the proposed data processing routine is discussed.

  3. Assessment of Pen Branch delta and corridor vegetation changes using multispectral scanner data 1992--1994

    SciTech Connect

    1996-01-01

    Airborne multispectral scanner data were used to monitor natural succession of wetland vegetation species over a three-year period from 1992 through 1994 for Pen Branch on the Savannah River Site in South Carolina. Image processing techniques were used to identify and measure wetland vegetation communities in the lower portion of the Pen Branch corridor and delta. The study provided a reliable means for monitoring medium- and large-scale changes in a diverse environment. Findings from the study will be used to support decisions regarding remediation efforts following the cessation of cooling water discharge from K reactor at the Department of Energy's Savannah River Site in South Carolina.

  4. Uav Multispectral Survey to Map Soil and Crop for Precision Farming Applications

    NASA Astrophysics Data System (ADS)

    Sonaa, Giovanna; Passoni, Daniele; Pinto, Livio; Pagliari, Diana; Masseroni, Daniele; Ortuani, Bianca; Facchi, Arianna

    2016-06-01

    New sensors mounted on UAVs and optimal procedures for survey, data acquisition and analysis are continuously being developed and tested for applications in precision farming. Procedures to integrate multispectral aerial data on soil and crop with ground-based proximal geophysical data are a recent research topic aimed at delineating homogeneous zones for the management of agricultural inputs (i.e., water, nutrients). Multispectral and multitemporal orthomosaics were produced over a test field (a 100 m x 200 m plot within a maize field) to map vegetation and soil indices, as well as crop heights, with suitable ground resolution. UAV flights were performed at two times during the crop season: before sowing on bare soil, and just before flowering when maize was nearly at its maximum height. Two cameras, for color (RGB) and false color (NIR-RG) images, were used. The images were processed in Agisoft Photoscan to produce Digital Surface Models (DSMs) of bare soil and crop, and multispectral orthophotos. To overcome difficulties in the automatic search for matching points in the block adjustment of the crop images, scientific software developed by Politecnico di Milano was also used to improve image orientation. The surveys and image processing are described, as well as results of the classification of the multispectral-multitemporal orthophotos and of the soil indices.
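    As an example of how a vegetation index map can be derived from such orthomosaics, the sketch below computes NDVI from co-registered red and NIR band rasters; the file names are placeholders and rasterio is simply one common choice of geospatial raster library, not necessarily the toolchain used in the study.

    import rasterio  # assumed raster I/O library

    with rasterio.open("ortho_red.tif") as red_src, rasterio.open("ortho_nir.tif") as nir_src:
        red = red_src.read(1).astype("float32")
        nir = nir_src.read(1).astype("float32")
        profile = red_src.profile

    ndvi = (nir - red) / (nir + red + 1e-6)   # NDVI, guarded against division by zero

    profile.update(dtype="float32", count=1)
    with rasterio.open("ndvi.tif", "w", **profile) as dst:
        dst.write(ndvi, 1)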

  5. Combining transverse field detectors and color filter arrays to improve multispectral imaging systems.

    PubMed

    Martínez, Miguel A; Valero, Eva M; Hernández-Andrés, Javier; Romero, Javier; Langfelder, Giacomo

    2014-05-01

    This work focuses on the improvement of a multispectral imaging sensor based on transverse field detectors (TFDs). We aimed to achieve higher color and spectral accuracy in the estimation of spectral reflectances from sensor responses. This improvement was achieved by combining these recently developed silicon-based sensors with color filter arrays (CFAs). Consequently, we sacrificed the filter-less full spatial resolution property of TFDs to narrow down the spectrally broad sensitivities of these sensors. We designed and performed several experiments to test the influence of different design features on the estimation quality (type of sensor, tunability, interleaved polarization, use of CFAs, type of CFAs, number of shots), some of which are exclusive to TFDs. We compared systems that use a TFD with systems that use normal monochrome sensors, both combined with multispectral CFAs as well as the common RGB filters present in commercial digital color cameras. Results showed that a system combining TFDs and CFAs performs better than systems with the same type of multispectral CFA and other sensors, or even the same TFDs combined with different kinds of filters used in common imaging systems. We propose CFA+TFD-based systems with one or two shots, depending on whether longer capture times are possible. Improved TFD systems thus emerge as an interesting possibility for multispectral acquisition, overcoming the limited accuracy found in previous studies.

  6. Future Planetary Surface Imager Development by the Beagle 2 Stereo Camera System Team

    NASA Astrophysics Data System (ADS)

    Griffiths, A. D.; Coates, A. J.; Josset, J.-L.; Paar, G.

    2004-03-01

    The Stereo Camera System provided Beagle 2 with wide-angle multi-spectral stereo imaging (IFOV=0.043°). The SCS team plans to build on this design heritage to provide improved stereo capabilities to the Pasteur payload of the Aurora ExoMars rover.

  7. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90° field of view in monochrome mode and 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.

  8. Multispectral Scanner for Monitoring Plants

    NASA Technical Reports Server (NTRS)

    Gat, Nahum

    2004-01-01

    A multispectral scanner has been adapted to capture spectral images of living plants under various types of illumination for purposes of monitoring the health of, or monitoring the transfer of genes into, the plants. In a health-monitoring application, the plants are illuminated with full-spectrum visible and near infrared light and the scanner is used to acquire a reflected-light spectral signature known to be indicative of the health of the plants. In a gene-transfer-monitoring application, the plants are illuminated with blue or ultraviolet light and the scanner is used to capture fluorescence images from a green fluorescent protein (GFP) that is expressed as a result of the gene transfer. The choice of the wavelength of the illumination and the wavelength of the fluorescence to be monitored depends on the specific GFP.

  9. Multispectral sensing of moisture stress

    NASA Technical Reports Server (NTRS)

    Olson, C. E., Jr.

    1970-01-01

    Laboratory reflectance data and field tests with multispectral remote sensors support the hypothesis that differences in moisture content and water deficits are closely related to foliar reflectance from woody plants. When these relationships are taken into account, automatic recognition techniques become more powerful than when they are ignored. Evidence is increasing that moisture relationships inside plant foliage are much more closely related to foliar reflectance characteristics than are external variables such as soil moisture, wind, and air temperature. Short-term changes in water deficits seem to have little influence on foliar reflectance, however. This is in distinct contrast to the significant short-term changes in foliar emittance from the same plants with changing wind, air temperature, incident radiation, or water deficit conditions.

  10. Cinematic camera emulation using two-dimensional color transforms

    NASA Astrophysics Data System (ADS)

    McElvain, Jon S.; Gish, Walter

    2015-02-01

    For cinematic and episodic productions, on-set look management is an important component of the creative process, and involves iterative adjustments of the set, actors, lighting and camera configuration. Instead of using the professional motion capture device to establish a particular look, the use of a smaller form factor DSLR is considered for this purpose due to its increased agility. Because the spectral response characteristics differ between the two camera systems, a camera emulation transform is needed to approximate the behavior of the destination camera. Recently, two-dimensional transforms have been shown to provide high-accuracy conversion of raw camera signals to a defined colorimetric state. In this study, the same formalism is used for camera emulation, whereby a Canon 5D Mark III DSLR is used to approximate the behavior of a Red Epic cinematic camera. The spectral response characteristics of both cameras were measured and used to build 2D as well as 3x3 matrix emulation transforms. When tested on multispectral image databases, the 2D emulation transforms outperform their matrix counterparts, particularly for images containing highly chromatic content.
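    For context, the 3x3 baseline against which the 2D transforms are compared can be fitted by simple least squares from paired raw responses of the two cameras (e.g. color-chart patches shot on both). The sketch below shows only that baseline idea on synthetic data; it is not the authors' 2D method, and the variable names are illustrative.

    import numpy as np

    def fit_emulation_matrix(src_rgb, dst_rgb):
        # src_rgb, dst_rgb: (N, 3) linear raw responses of source and target cameras
        M, *_ = np.linalg.lstsq(src_rgb, dst_rgb, rcond=None)   # solves src @ M ~ dst
        return M.T                                              # so that dst ~ src @ M.T

    def apply_emulation(image, M):
        # image: (H, W, 3) linear image from the source camera
        return (image.reshape(-1, 3) @ M.T).reshape(image.shape)

    rng = np.random.default_rng(0)
    src = rng.random((24, 3))                                   # e.g. 24 chart patches
    true_M = np.array([[1.10, -0.05, 0.02], [0.03, 0.95, 0.01], [-0.02, 0.10, 1.05]])
    M = fit_emulation_matrix(src, src @ true_M.T)
    print(np.allclose(M, true_M))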

  11. Color image reproduction based on multispectral and multiprimary imaging: experimental evaluation

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Masahiro; Teraji, Taishi; Ohsawa, Kenro; Uchiyama, Toshio; Motomura, Hideto; Murakami, Yuri; Ohyama, Nagaaki

    2001-12-01

    Multispectral imaging is a significant technology for the acquisition and display of accurate color information. Natural color reproduction under arbitrary illumination becomes possible using spectral information of both the image and the illumination light. In addition, multiprimary color display, i.e., using more than three primary colors, has also been developed for the reproduction of an expanded color gamut and for discounting observer metamerism. In this paper, we present the concept of multispectral data interchange for natural color reproduction, and experimental results using a 16-band multispectral camera and a 6-primary color display. In the experiment, the accuracy of color reproduction is evaluated in CIE ΔE*ab for both the image capture and display systems. The average and maximum ΔE*ab are 1.0 and 2.1 for the 16-band multispectral camera system, using the Macbeth 24 color patches. In the six-primary color projection display, the average and maximum ΔE*ab are 1.3 and 2.7 with 30 test colors inside the display gamut. Moreover, the color reproduction results with different spectral distributions but the same CIE tristimulus values are visually compared, and it is confirmed that the 6-primary display gives improved agreement between the original and reproduced colors.

  12. Use of a Multispectral Uav Photogrammetry for Detection and Tracking of Forest Disturbance Dynamics

    NASA Astrophysics Data System (ADS)

    Minařík, R.; Langhammer, J.

    2016-06-01

    This study presents a new methodological approach for assessment of spatial and qualitative aspects of forest disturbance based on the use of a multispectral imaging camera with UAV photogrammetry. We used the miniaturized multispectral sensor Tetracam Micro Multiple Camera Array (μ-MCA) Snap 6 with a multirotor imaging platform to obtain multispectral imagery with high spatial resolution. The study area is located in the Sumava Mountains, Central Europe, heavily affected by windstorms followed by extensive and repeated bark beetle (Ips typographus [L.]) outbreaks over the past 20 years. After two decades, there is an apparent continuing spread of forest disturbance as well as rapid regeneration of forest vegetation, related to changes in species composition and diversity. To test the suggested methodology, we launched an imaging campaign at an experimental site covering various stages of forest disturbance and regeneration. The imagery of high spatial and spectral resolution enabled analysis of the inner structure and dynamics of the processes. The most informative bands for detecting tree stress caused by bark beetle infestation are band 2 (650 nm) and band 3 (700 nm), followed by band 4 (800 nm), from the red-edge and NIR parts of the spectrum. We identified only three indices that appear able to correctly detect the different forest disturbance categories in the complex conditions of mixed categories: the Normalized Difference Vegetation Index (NDVI), the simple 800/650 ratio (pigment specific simple ratio B1), and the Red-edge Index.

  13. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; however, the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation, and a test of this method.
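    A minimal sketch of the corrected mean-variance (photon transfer) estimate is shown below; it assumes two flat-field frames and two dark frames at the same exposure, which is one common way to remove the dark-current contribution, and is not necessarily the exact form of Janesick's adjusted equation.

    import numpy as np

    def camera_gain(flat1, flat2, dark1, dark2):
        # Returns the conversion gain in electrons per digital number (e-/DN).
        flat1, flat2, dark1, dark2 = (f.astype("float64") for f in (flat1, flat2, dark1, dark2))
        signal = flat1.mean() + flat2.mean() - dark1.mean() - dark2.mean()
        # Differencing two frames cancels fixed-pattern noise; differencing the
        # darks removes the read/dark-noise contribution to the variance.
        noise_var = (flat1 - flat2).var() - (dark1 - dark2).var()
        return signal / noise_var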

  14. Multispectral imaging with vertical silicon nanowires

    PubMed Central

    Park, Hyunsung; Crozier, Kenneth B.

    2013-01-01

    Multispectral imaging is a powerful tool that extends the capabilities of the human eye. However, multispectral imaging systems generally are expensive and bulky, and multiple exposures are needed. Here, we report the demonstration of a compact multispectral imaging system that uses vertical silicon nanowires to realize a filter array. Multiple filter functions covering visible to near-infrared (NIR) wavelengths are simultaneously defined in a single lithography step using a single material (silicon). Nanowires are then etched and embedded into polydimethylsiloxane (PDMS), thereby realizing a device with eight filter functions. By attaching it to a monochrome silicon image sensor, we successfully realize an all-silicon multispectral imaging system. We demonstrate visible and NIR imaging. We show that the latter is highly sensitive to vegetation and furthermore enables imaging through objects opaque to the eye. PMID:23955156

  15. Toward Multispectral Imaging with Colloidal Metasurface Pixels.

    PubMed

    Stewart, Jon W; Akselrod, Gleb M; Smith, David R; Mikkelsen, Maiken H

    2017-02-01

    Multispectral colloidal metasurfaces are fabricated that exhibit greater than 85% absorption and ≈100 nm linewidths by patterning film-coupled nanocubes in pixels using a fusion of bottom-up and top-down fabrication techniques over wafer-scale areas. With this technique, the authors realize a multispectral pixel array consisting of six resonances between 580 and 1125 nm and reconstruct an RGB image with 9261 color combinations.

  16. Simultaneous denoising and compression of multispectral images

    NASA Astrophysics Data System (ADS)

    Hagag, Ahmed; Amin, Mohamed; Abd El-Samie, Fathi E.

    2013-01-01

    A new technique for denoising and compression of multispectral satellite images, to remove the effect of noise on the compression process, is presented. One type of multispectral image is considered: Landsat Enhanced Thematic Mapper Plus. The discrete wavelet transform (DWT), the dual-tree DWT, and a simple Huffman coder are used in the compression process. Simulation results show that the proposed technique is more effective than traditional compression-only techniques.
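    The denoising step can be illustrated with a soft threshold on the detail coefficients of a 2-D DWT, as sketched below for a single band; the wavelet, level and threshold are arbitrary placeholder choices, and the quantised coefficients would then feed an entropy coder such as a Huffman coder.

    import pywt  # PyWavelets

    def dwt_denoise(band, wavelet="db4", level=3, thr=10.0):
        coeffs = pywt.wavedec2(band.astype("float32"), wavelet, level=level)
        out = [coeffs[0]]                           # keep the approximation sub-band
        for cH, cV, cD in coeffs[1:]:
            out.append(tuple(pywt.threshold(c, thr, mode="soft") for c in (cH, cV, cD)))
        return pywt.waverec2(out, wavelet)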

  17. Multispectral Image Analysis of Hurricane Gilbert

    DTIC Science & Technology

    1989-05-19

    Kleespies, Thomas J. (GL/LYS). Multispectral image analysis of Hurricane Gilbert: one spectral component of the image is assigned to the red display channel, and similarly for the green and blue channels, to support interpretation of features such as cloud top height.

  18. Experimental Advanced Airborne Research Lidar (EAARL) Data Processing Manual

    USGS Publications Warehouse

    Bonisteel, Jamie M.; Nayegandhi, Amar; Wright, C. Wayne; Brock, John C.; Nagle, David

    2009-01-01

    Each pulse is focused into an illumination area that has a radius of about 20 centimeters on the ground. The pulse-repetition frequency of the EAARL transmitter varies along each across-track scan to produce equal cross-track sample spacing and near uniform density (Nayegandhi and others, 2006). Targets can have varying physical and optical characteristics that cause extreme fluctuations in laser backscatter complexity and signal strength. To accommodate this dynamic range, EAARL has the real-time ability to detect, capture, and automatically adapt to each laser return backscatter. The backscattered energy is collected by an array of four high-speed waveform digitizers connected to an array of four sub-nanosecond photodetectors. Each of the four photodetectors receives a finite range of the returning laser backscatter photons. The most sensitive channel receives 90% of the photons, the least sensitive receives 0.9%, and the middle channel receives 9% (Wright and Brock, 2002). The fourth channel is available for detection but is not currently being utilized. All four channels are digitized simultaneously into 65,536 samples for every laser pulse. The receiver optics consist of a 15-centimeter-diameter dielectric-coated Newtonian telescope, a computer-driven raster scanning mirror oscillating at 12.5 hertz (25 rasters per second), and an array of sub-nanosecond photodetectors. The signal emitted by the pulsed laser transmitter is amplified as backscatter by the optical telescope receiver. The photomultiplier tube (PMT) then converts the optical energy into electrical impulses (Nayegandhi and others, 2006). In addition to the full-waveform resolving laser, the EAARL sensor suite includes a down-looking 70-centimeter-resolution Red-Green-Blue (RGB) digital network camera, a high-resolution color infrared (CIR) multispectral camera (14-centimeter-resolution), two precision dual-frequency kinematic carrier-phase global positioning system (GPS) receivers, and an

  19. Long-distance eye-safe laser TOF camera design

    NASA Astrophysics Data System (ADS)

    Kovalev, Anton V.; Polyakov, Vadim M.; Buchenkov, Vyacheslav A.

    2016-04-01

    We present a new TOF camera design based on a compact actively Q-switched diode-pumped solid-state laser operating in the 1.5 μm range and a receiver system based on a short-wave infrared InGaAs PIN diode focal plane array with an image intensifier and a special readout integration circuit. The compact camera is capable of depth imaging at ranges up to 4 kilometers, at 10 frames/s, with a 1.2 m range error. The camera could be applied to airborne and space geodesy, location and navigation.

  20. Multispectral palmprint recognition using a quaternion matrix.

    PubMed

    Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng

    2012-01-01

    Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, thus they can achieve better recognition accuracy. Previously, multispectral palmprint images were taken as a kind of multi-modal biometrics, and the fusion scheme on the image level or matching score level was used. However, some spectral information will be lost during image level or matching score level fusion. In this study, we propose a new method for multispectral images based on a quaternion model which could fully utilize the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix, then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied respectively on the matrix to extract palmprint features. After that, Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of two distances and the nearest neighborhood classifier were employed for recognition decision. Experimental results showed that using the quaternion matrix can achieve a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%.

  1. Multispectral Palmprint Recognition Using a Quaternion Matrix

    PubMed Central

    Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng

    2012-01-01

    Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, thus they can achieve better recognition accuracy. Previously, multispectral palmprint images were taken as a kind of multi-modal biometrics, and the fusion scheme on the image level or matching score level was used. However, some spectral information will be lost during image level or matching score level fusion. In this study, we propose a new method for multispectral images based on a quaternion model which could fully utilize the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix, then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied respectively on the matrix to extract palmprint features. After that, Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of two distances and the nearest neighborhood classifier were employed for recognition decision. Experimental results showed that using the quaternion matrix can achieve a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%. PMID:22666049

  2. Pancam: A Multispectral Imaging Investigation on the NASA 2003 Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.

    2003-01-01

    One of the six science payload elements carried on each of the NASA Mars Exploration Rovers (MER; Figure 1) is the Panoramic Camera System, or Pancam. Pancam consists of three major components: a pair of digital CCD cameras, the Pancam Mast Assembly (PMA), and a radiometric calibration target. The PMA provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. The calibration target provides a set of reference color and grayscale standards for calibration validation, and a shadow post for quantification of the direct vs. diffuse illumination of the scene. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover in up to 12 unique wavelengths. The major characteristics of Pancam are summarized.

  3. Angioscopic image-enhanced observation of atherosclerotic plaque phantom by near-infrared multispectral imaging at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Ishii, K.; Nagao, R.; Matsui, D.; Awazu, K.

    2015-02-01

    Spectroscopic techniques have been researched for intravascular diagnostic imaging of atherosclerotic plaque. Near-infrared (NIR) light efficiently penetrates biological tissues, and the NIR region contains the characteristic absorption range of lipid-rich plaques. The objective of this study is to observe atherosclerotic plaque using NIR multispectral angioscopic imaging. Atherosclerotic plaque phantoms were prepared using a biological tissue model and bovine fat. For the study, we developed an NIR multispectral angioscopic imaging system with a halogen light source, a mercury-cadmium-telluride camera, band-pass filters and an image fiber. Apparent spectral absorbance was obtained at three wavelengths: 1150, 1200 and 1300 nm. Multispectral images of the phantom were constructed using the spectral angle mapper algorithm. As a result, the lipid area, which was difficult to observe in a visible image, could be clearly observed in the multispectral image. Our results show that image-enhanced observation and quantification of atherosclerotic plaque by NIR multispectral imaging at wavelengths around 1200 nm is a promising angioscopic technique with the potential to identify lipid-rich plaques.
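    The spectral angle mapper step is a simple per-pixel angle comparison against reference spectra; a generic sketch is given below, with the image cube and the reference spectra (e.g. lipid vs. normal tissue absorbance at the three bands) as placeholder inputs rather than the study's actual data.

    import numpy as np

    def spectral_angle_map(cube, references):
        # cube: (H, W, B) spectral image; references: (C, B) reference spectra.
        # Returns an (H, W) map of the index of the smallest-angle reference.
        pixels = cube.reshape(-1, cube.shape[-1]).astype("float64")
        pixels /= np.linalg.norm(pixels, axis=1, keepdims=True) + 1e-12
        refs = references / (np.linalg.norm(references, axis=1, keepdims=True) + 1e-12)
        angles = np.arccos(np.clip(pixels @ refs.T, -1.0, 1.0))
        return angles.argmin(axis=1).reshape(cube.shape[:2])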

  4. Estimating atmospheric parameters and reducing noise for multispectral imaging

    DOEpatents

    Conger, James Lynn

    2014-02-25

    A method and system for estimating atmospheric radiance and transmittance. An atmospheric estimation system is divided into a first phase and a second phase. The first phase inputs an observed multispectral image and an initial estimate of the atmospheric radiance and transmittance for each spectral band and calculates the atmospheric radiance and transmittance for each spectral band, which can be used to generate a "corrected" multispectral image that is an estimate of the surface multispectral image. The second phase inputs the observed multispectral image and the surface multispectral image that was generated by the first phase and removes noise from the surface multispectral image by smoothing out change in average deviations of temperatures.

  5. Multispectral fundus imaging for early detection of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Beach, James M.; Tiedeman, James S.; Hopkins, Mark F.; Sabharwal, Yashvinder S.

    1999-04-01

    Functional imaging of the retina and associated structures may provide information for early assessment of the risk of developing retinopathy in diabetic patients. Here we show results of retinal oximetry performed using multispectral reflectance imaging techniques to assess hemoglobin (Hb) oxygen saturation (OS) in blood vessels of the inner retina and oxygen utilization at the optic nerve in diabetic patients without retinopathy and early disease during experimental hyperglycemia. Retinal images were obtained through a fundus camera and simultaneously recorded at up to four wavelengths using image-splitting modules coupled to a digital camera. Changes in OS in large retinal vessels, in average OS in disk tissue, and in the reduced state of cytochrome oxidase (CO) at the disk were determined from changes in reflectance associated with the oxidation/reduction states of Hb and CO. A step to high blood sugar lowered venous oxygen saturation to a degree dependent on disease duration. A moderate increase in blood sugar produced higher levels of reduced CO in both the disk and the surrounding tissue without a detectable change in average tissue OS. Results suggest that regulation of retinal blood supply and oxygen consumption is altered by hyperglycemia and that such functional changes are present before clinical signs of retinopathy.

  6. Geologic mapping in Death Valley, California/Nevada using NASA/JPL airborne systems (AVIRIS, TIMS, and AIRSAR)

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dietz, John B.; Kierein-Young, Kathryn S.

    1991-01-01

    A multi-sensor aircraft campaign called the Geologic Remote Sensing Field Experiment (GRSFE), conducted during 1989, resulted in acquisition of high quality multispectral images in the visible, near infrared, shortwave infrared, thermal infrared, and microwave regions of the electromagnetic spectrum. The airborne data sets include the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), the Thermal Infrared Multispectral Scanner (TIMS), and the Airborne Synthetic Aperture Radar (AIRSAR). Ancillary data include Landsat Thematic Mapper, laboratory and field spectral measurements, and traditional geologic mapping. The GRSFE data for a site in the northern Death Valley (California and Nevada) region were calibrated to physical units and geometrically registered to a map base. Various aspects of this experiment are briefly discussed.

  7. Adaptation of the Camera Link Interface for Flight-Instrument Applications

    NASA Technical Reports Server (NTRS)

    Randall, David P.; Mahoney, John C.

    2010-01-01

    COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.

  8. The analysis of multispectral image data with self-organizing feature maps

    SciTech Connect

    Schaale, M.

    1996-11-01

    The analysis of multispectral sceneries is still a challenging task although many different algorithms for a mathematical analysis exist. Most classification algorithms work in a supervised mode only, i.e. they need to know the possible land usage classes prior to the calculation. In this step many simplifications and assumptions have to be made which directly influence the result. Kohonen's self-organizing feature maps, which are based on a biological model, provide a powerful tool to describe the multispectral scenery under consideration with a limited number of reference vectors, the so-called codebook. The resulting codebook, generated with an unsupervised learning scheme, is a compressed description of the multi-dimensional data in terms of non-linear principal components, which thus overcomes the problems of a linear principal component analysis. Using this codebook as a basis for a laterally fully interconnected network and introducing a non-linear activity flow between radial-basis functions located at the positions of the reference vectors results in an unsupervised clustering scheme. This method has been successfully adapted to multispectral sceneries recorded by casi (compact airborne spectrographic imager). 16 refs., 8 figs., 1 tab.
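    The core of the codebook generation is the standard Kohonen learning rule: for each multispectral pixel, find the best-matching codebook vector and pull it, together with its grid neighbours, toward that pixel. The sketch below is a generic, unoptimised version with arbitrary grid size and decay schedules; it assumes the scene has already been reshaped to an (N, bands) pixel array and is not the casi-specific implementation of the paper.

    import numpy as np

    def train_som(pixels, grid=(10, 10), epochs=10, lr0=0.5, sigma0=3.0):
        rng = np.random.default_rng(0)
        n_nodes = grid[0] * grid[1]
        codebook = pixels[rng.choice(len(pixels), n_nodes)].astype("float64")  # init from the data
        coords = np.indices(grid).reshape(2, -1).T                             # node positions on the map
        t, t_max = 0, epochs * len(pixels)
        for _ in range(epochs):
            for x in pixels[rng.permutation(len(pixels))]:
                bmu = np.argmin(((codebook - x) ** 2).sum(axis=1))     # best-matching unit
                lr = lr0 * (1.0 - t / t_max)
                sigma = sigma0 * (1.0 - t / t_max) + 1e-3
                d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
                h = np.exp(-d2 / (2.0 * sigma ** 2))                   # neighbourhood kernel
                codebook += lr * h[:, None] * (x - codebook)
                t += 1
        return codebook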

  9. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  10. Novel fundus camera design

    NASA Astrophysics Data System (ADS)

    Dehoog, Edward A.

    A fundus camera is a complex optical system that makes use of the principle of reflex-free indirect ophthalmoscopy to image the retina. Despite being in existence since as early as the 1900s, little has changed in the design of a fundus camera and there is minimal information about the design principles utilized. Parameters and specifications involved in the design of a fundus camera are determined and their effect on system performance is discussed. Fundus cameras incorporating different design methods are modeled and a performance evaluation based on design parameters is used to determine the effectiveness of each design strategy. By determining the design principles involved in the fundus camera, new cameras can be designed to include specific imaging modalities such as optical coherence tomography, imaging spectroscopy and imaging polarimetry to gather additional information about the properties and structure of the retina. Design principles utilized to incorporate such modalities into fundus camera systems are discussed. Design, implementation and testing of a snapshot polarimeter fundus camera are demonstrated.

  11. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  12. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  13. Buried archaeological structures detection using MIVIS hyperspectral airborne data

    NASA Astrophysics Data System (ADS)

    Merola, P.; Allegrini, A.; Guglietta, D.; Sampieri, S.

    2006-08-01

    The identification of buried archaeological structures using remote sensing technologies (aerial photographs or satellite and airborne images) is based on the analysis of changes in surface spectral features overlying buried terrain units, which are located on the basis of texture variations, humidity and vegetation cover. The study of these anomalies in MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) hyperspectral data is the main goal of a research project that the CNR-IIA has carried out over different archaeological test sites. The major archaeological information was gathered by data analysis in the VIS and NIR spectral regions and by use of the apparent thermal inertia image and various vegetation indices.

  14. Multispectral Imaging from Mars PATHFINDER

    NASA Technical Reports Server (NTRS)

    Ferrand, William H.; Bell, James F., III; Johnson, Jeffrey R.; Bishop, Janice L.; Morris, Richard V.

    2007-01-01

    The Imager for Mars Pathfinder (IMP) was a mast-mounted instrument on the Mars Pathfinder lander, which landed on Mars' Ares Vallis floodplain on July 4, 1997. During the 83 sols of Mars Pathfinder's landed operations, the IMP collected over 16,600 images. Multispectral images were collected using twelve narrowband filters at wavelengths between 400 and 1000 nm in the visible and near infrared (VNIR) range. The IMP provided VNIR spectra of the materials surrounding the lander including rocks, bright soils, dark soils, and atmospheric observations. During the primary mission, only a single primary rock spectral class, Gray Rock, was recognized; since then, a second class, Black Rock, has been identified. The Black Rock spectra have a stronger absorption at longer wavelengths than do Gray Rock spectra. A number of coated rocks have also been described, the Red and Maroon Rock classes, and perhaps indurated soils in the form of the Pink Rock class. A number of different soil types were also recognized, the primary ones being Bright Red Drift, Dark Soil, Brown Soil, and Disturbed Soil. Examination of spectral parameter plots indicated two trends which were interpreted as representing alteration products formed in at least two different environmental epochs of the Ares Vallis area. Subsequent analysis of the data and comparison with terrestrial analogs have supported the interpretation that the rock coatings provide evidence of earlier martian environments. However, the presence of relatively uncoated examples of the Gray and Black rock classes indicates that relatively unweathered materials can persist on the martian surface.

  15. Integration of visible-through microwave-range multispectral image data sets for geologic mapping

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dietz, John B.

    1991-01-01

    Multispectral remote sensing data sets collected during the Geologic Remote Sensing Field Experiment (GRSFE) conducted during 1989 in the southwestern U.S. were used to produce thematic image maps showing details of the surface geology. LANDSAT TM (Thematic Mapper) images were used to map the distribution of clays, carbonates, and iron oxides. AVIRIS (Airborne Visible/Infrared Imaging Spectrometer) data were used to identify and map calcite, dolomite, sericite, hematite, and goethite, including mixtures. TIMS (Thermal Infrared Multispectral Scanner) data were used to map the distribution of igneous rock phases and carbonates based on their silica contents. AIRSAR (Airborne Synthetic Aperture Radar) data were used to map surface textures related to the scale of surface roughness. The AIRSAR also allowed identification of previously unmapped fault segments and structural control of lithology and mineralogy. Because all of the above data sets were geographically referenced, combination of different data types and direct comparison of the results with conventional field and laboratory data sets allowed improved geologic mapping of the test site.

  16. Land mine detection using multispectral image fusion

    SciTech Connect

    Clark, G.A.; Sengupta, S.K.; Aimonetti, W.D.; Roeske, F.; Donetti, J.G.; Fields, D.J.; Sherwood, R.J.; Schaich, P.C.

    1995-03-29

    Our system fuses information contained in registered images from multiple sensors to reduce the effects of clutter and improve the ability to detect surface and buried land mines. The sensor suite currently consists of a camera that acquires images in six bands (400nm, 500nm, 600nm, 700nm, 800nm and 900nm). Past research has shown that it is extremely difficult to distinguish land mines from background clutter in images obtained from a single sensor. It is hypothesized, however, that information fused from a suite of various sensors is likely to provide better detection reliability, because the suite of sensors detects a variety of physical properties that are more separable in feature space. The materials surrounding the mines can include natural materials (soil, rocks, foliage, water, etc.) and some artifacts. We use a supervised learning pattern recognition approach to detecting the metal and plastic land mines. The overall process consists of four main parts: Preprocessing, feature extraction, feature selection, and classification. These parts are used in a two step process to classify a subimage. We extract features from the images, and use feature selection algorithms to select only the most important features according to their contribution to correct detections. This allows us to save computational complexity and determine which of the spectral bands add value to the detection system. The most important features from the various sensors are fused using a supervised learning pattern classifier (the probabilistic neural network). We present results of experiments to detect land mines from real data collected from an airborne platform, and evaluate the usefulness of fusing feature information from multiple spectral bands.
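    The probabilistic neural network used for the final decision is essentially a Parzen-window (Gaussian-kernel) classifier over the selected, fused features. The sketch below shows only that generic classifier; the feature arrays, labels and kernel width are hypothetical stand-ins for the mine/clutter data.

    import numpy as np

    def pnn_classify(train_x, train_y, test_x, sigma=0.1):
        # Gaussian-kernel class-conditional density estimate for each test sample.
        classes = np.unique(train_y)
        scores = []
        for c in classes:
            xc = train_x[train_y == c]                                  # (Nc, D) training features of class c
            d2 = ((test_x[:, None, :] - xc[None, :, :]) ** 2).sum(-1)   # (M, Nc) squared distances
            scores.append(np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1))
        return classes[np.argmax(np.stack(scores, axis=1), axis=1)]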

  17. Those Nifty Digital Cameras!

    ERIC Educational Resources Information Center

    Ekhaml, Leticia

    1996-01-01

    Describes digital photography--an electronic imaging technology that merges computer capabilities with traditional photography--and its uses in education. Discusses how a filmless camera works, types of filmless cameras, advantages and disadvantages, and educational applications of the consumer digital cameras. (AEF)

  18. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  19. Radiometric performance of the Viking Mars lander cameras

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Taylor, E. J.; Wall, S. D.

    1975-01-01

    The Viking lander cameras feature an array of 12 silicon photodiodes for electronic focus selection and multispectral imaging. Comparisons of absolute radiometric calibrations of the four cameras selected for the mission to Mars with performance predictions based on their design data revealed minor discrepancies. These discrepancies were caused primarily by the method used to calibrate the photosensor array and apparently also from light reflections internal to the array. The sensitivity and dynamic range of all camera channels are found to be sufficient for high quality pictures, providing that the commandable gains and offsets can be optimized for the scene radiance; otherwise, the quantization noise may be too high or the dynamic range too low for an adequate characterization of the scene.

  20. Rapid multispectral endoscopic imaging system for near real-time mapping of the mucosa blood supply in the lung

    PubMed Central

    Fawzy, Yasser; Lam, Stephen; Zeng, Haishan

    2015-01-01

    We have developed a fast multispectral endoscopic imaging system that is capable of acquiring images in 18 optimized spectral bands spanning 400-760 nm by combining a customized light source with six triple-band filters and a standard color CCD camera. A method is developed to calibrate the spectral response of the CCD camera. Imaging speed of 15 spectral image cubes/second is achieved. A spectral analysis algorithm based on a linear matrix inversion approach is developed and implemented in a graphics processing unit (GPU) to map the mucosa blood supply in the lung in vivo. Clinical measurements on human lung patients are demonstrated. PMID:26309761
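    The linear matrix inversion amounts to a per-pixel least-squares unmixing of the measured spectra into chromophore abundances (e.g. oxy- and deoxy-hemoglobin). The CPU sketch below shows only that step with placeholder inputs; the published system implements the analysis on a GPU.

    import numpy as np

    def unmix(absorbance_cube, extinction):
        # absorbance_cube: (H, W, B) per-pixel spectra; extinction: (B, C) chromophore spectra.
        # Returns (H, W, C) abundance maps from a per-pixel least-squares fit.
        h, w, b = absorbance_cube.shape
        pixels = absorbance_cube.reshape(-1, b).T                   # (B, H*W)
        abundances, *_ = np.linalg.lstsq(extinction, pixels, rcond=None)
        return abundances.T.reshape(h, w, extinction.shape[1])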

  1. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  2. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  3. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  4. Airborne sensor systems under development at the NASA/NSTL/Earth Resources Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, James E.; Meeks, Gerald R.

    1988-01-01

    The operational characteristics of the Airborne Bathymetric System (ABS) MSS and the Airborne Multispectral Pushbroom Scanner (AMPS), which are currently being developed at NASA's Earth Resources Laboratory (ERL), are described. The ABS MSS system scans through a swath width of ±40° from nadir and the sensor incorporates onboard calibration references for the visible and short-wavelength IR channels. The AMPS uses five separate f/1.8 refractive telecentric lens systems, each incorporating nine optical elements, and a replaceable fixed bandwidth filter.

  5. Applying Neural Networks to Hyperspectral and Multispectral Field Data for Discrimination of Cruciferous Weeds in Winter Crops

    PubMed Central

    de Castro, Ana-Isabel; Jurado-Expósito, Montserrat; Gómez-Casero, María-Teresa; López-Granados, Francisca

    2012-01-01

    In the context of detection of weeds in crops for site-specific weed control, on-ground spectral reflectance measurements are the first step to determine the potential of remote spectral data to classify weeds and crops. Field studies were conducted for four years at different locations in Spain. We aimed to distinguish cruciferous weeds in wheat and broad bean crops, using hyperspectral and multispectral readings in the visible and near-infrared spectrum. To identify differences in reflectance between cruciferous weeds, we applied three classification methods: stepwise discriminant (STEPDISC) analysis and two neural networks, specifically, multilayer perceptron (MLP) and radial basis function (RBF). Hyperspectral and multispectral signatures of cruciferous weeds, and wheat and broad bean crops, can be classified using STEPDISC analysis and MLP and RBF neural networks with varying success, the MLP model being the most accurate with classification performance of 100%, or above 98.1%, for all years. Classification accuracy from hyperspectral signatures was similar to that from multispectral signatures and spectral indices, suggesting that little advantage would be obtained by using more expensive airborne hyperspectral imagery. Therefore, for future investigations, we recommend using multispectral remote imagery to explore whether it can potentially discriminate these weeds and crops. PMID:22629171

  6. Applying neural networks to hyperspectral and multispectral field data for discrimination of cruciferous weeds in winter crops.

    PubMed

    de Castro, Ana-Isabel; Jurado-Expósito, Montserrat; Gómez-Casero, María-Teresa; López-Granados, Francisca

    2012-01-01

    In the context of detection of weeds in crops for site-specific weed control, on-ground spectral reflectance measurements are the first step to determine the potential of remote spectral data to classify weeds and crops. Field studies were conducted for four years at different locations in Spain. We aimed to distinguish cruciferous weeds in wheat and broad bean crops, using hyperspectral and multispectral readings in the visible and near-infrared spectrum. To identify differences in reflectance between cruciferous weeds, we applied three classification methods: stepwise discriminant (STEPDISC) analysis and two neural networks, specifically, multilayer perceptron (MLP) and radial basis function (RBF). Hyperspectral and multispectral signatures of cruciferous weeds, and wheat and broad bean crops, can be classified using STEPDISC analysis and MLP and RBF neural networks with varying success, the MLP model being the most accurate with classification performance of 100%, or above 98.1%, for all years. Classification accuracy from hyperspectral signatures was similar to that from multispectral signatures and spectral indices, suggesting that little advantage would be obtained by using more expensive airborne hyperspectral imagery. Therefore, for future investigations, we recommend using multispectral remote imagery to explore whether it can potentially discriminate these weeds and crops.
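    A minimal sketch of the MLP classification step is given below, using scikit-learn on hypothetical arrays of band reflectances and weed/crop labels; the network size and file names are illustrative and not the configuration reported in the study.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X = np.load("signatures.npy")   # (N, n_bands) reflectance readings (placeholder file)
    y = np.load("labels.npy")       # 0 = crop, 1 = cruciferous weed (placeholder file)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    print("classification accuracy:", clf.score(X_test, y_test))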

  7. Geometric calibration and accuracy assessment of a multispectral imager on UAVs

    NASA Astrophysics Data System (ADS)

    Zheng, Fengjie; Yu, Tao; Chen, Xingfeng; Chen, Jiping; Yuan, Guoti

    2012-11-01

    The increasing development of Unmanned Aerial Vehicle (UAV) platforms and associated sensing technologies has widely promoted UAV remote sensing applications. UAVs, especially low-cost UAVs, limit the sensor payload in weight and dimension. Cameras on UAVs are mostly panoramic or fisheye-lens, small-format CCD planar array cameras; unknown intrinsic parameters and lens optical distortion cause serious image aberrations, leading to errors of a few meters to tens of meters on the ground per pixel. The high spatial resolution, however, makes accurate geolocation all the more critical for UAV quantitative remote sensing research. A geometric calibration method for the MCC4-12F Multispectral Imager, designed to be carried on UAVs, has been developed and implemented. A multi-image space resection algorithm is used to estimate the calibration parameters from images taken at random positions and different photogrammetric altitudes in a 3D test field, an approach suitable for multispectral cameras. Both theoretical and practical accuracy assessments were carried out. The theoretical assessment, which resolves object-space and image-point coordinate differences by space intersection, showed object-space RMSEs of 0.2 and 0.14 pixels in the X and Y directions, and image-space RMSEs better than 0.5 pixels. To verify the accuracy and reliability of the calibration parameters, a practical study was carried out in UAV flight experiments in Tianjin; the corrected accuracy, validated with ground checkpoints, was better than 0.3 m. Typical surface reflectance retrieved from the geo-rectified data was compared with ground ASD measurements, with a 4% discrepancy. Hence, the approach presented here is suitable for UAV multispectral imagers.
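    A simplified stand-in for such a calibration, using OpenCV rather than the authors' multi-image space resection, is sketched below; the per-view control-point files and the image size are placeholders.

    import numpy as np
    import cv2

    # obj_points: list of (N, 3) float32 target coordinates per view (placeholder files)
    # img_points: list of (N, 2) float32 measured image coordinates per view
    obj_points = [np.load(f"obj_{i}.npy") for i in range(12)]
    img_points = [np.load(f"img_{i}.npy") for i in range(12)]
    image_size = (4008, 2672)   # sensor width, height in pixels (placeholder)

    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
    print("reprojection RMS (pixels):", rms)
    print("camera matrix:\n", K)
    print("distortion coefficients:", dist.ravel())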

  8. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as to give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  9. Multispectral iris fusion for enhancement, interoperability, and cross wavelength matching

    NASA Astrophysics Data System (ADS)

    Burge, Mark J.; Monaco, Matthew K.

    2009-05-01

    Traditionally, only a narrow band of the Near-Infrared (NIR) spectrum (700-900nm) is utilized for iris recognition since this alleviates any physical discomfort from illumination, reduces specular reflections and increases the amount of texture captured for some iris colors. However, previous research has shown that matching performance is not invariant to iris color and can be improved by imaging outside of the NIR spectrum. Building on this research, we demonstrate that iris texture increases with the frequency of the illumination for lighter colored sections of the iris and decreases for darker sections. Using registered visible light and NIR iris images captured using a single-lens multispectral camera, we illustrate how physiological properties of the iris (e.g., the amount and distribution of melanin) impact the transmission, absorbance, and reflectance of different portions of the electromagnetic spectrum and consequently affect the quality of the imaged iris texture. We introduce a novel iris code, Multispectral Enhanced irisCode (MEC), which uses pixel-level fusion algorithms to exploit texture variations elicited by illuminating the iris at different frequencies, to improve iris matcher performance and reduce Failure-To-Enroll (FTE) rates. Finally, we present a model for approximating an NIR iris image using features derived from the color and structure of a visible light iris image. The simulated NIR images generated by this model are designed to improve the interoperability between legacy NIR iris images and those acquired under visible light by enabling cross wavelength matching of NIR and visible light iris images.

  10. Light, shadows and surface characteristics: the multispectral Portable Light Dome

    NASA Astrophysics Data System (ADS)

    Watteeuw, Lieve; Hameeuw, Hendrik; Vandermeulen, Bruno; Van der Perre, Athena; Boschloos, Vanessa; Delvaux, Luc; Proesmans, Marc; Van Bos, Marina; Van Gool, Luc

    2016-11-01

    A multispectral, multidirectional, portable and dome-shaped acquisition system is developed within the framework of the research projects RICH (KU Leuven) and EES (RMAH, Brussels) in collaboration with the ESAT-VISICS research group (KU Leuven). The multispectral Portable Light Dome (MS PLD) consists of a hemispherical structure, an overhead camera and LEDs emitting in five parts of the electromagnetic spectrum regularly covering the dome's inside surface. With the associated software solution, virtual relighting and enhancements can be applied in a real-time, interactive manner. The system extracts genuine 3D and shading information based on a photometric stereo algorithm. This innovative approach allows for instantaneous alternations between the computations in the infrared, red, green, blue and ultraviolet spectra. The MS PLD system has been tested for research ranging from medieval manuscript illuminations to ancient Egyptian artefacts. Preliminary results have shown that it documents and measures the 3D surface structure of objects, re-visualises underdrawings, faded pigments and inscriptions, and examines the MS results in combination with the actual relief characteristics of the physical object. Newly developed features are reflection maps and histograms, analytic visualisations of the reflection properties of all separate LEDs or selected areas. In its capacity as imaging technology, the system acts as a tool for the analysis of surface materials (e.g. identification of blue pigments, gold and metallic surfaces). Besides offering support in answering questions of attribution and monitoring changes and decay of materials, the PLD also contributes to the identification of materials, all essential factors when making decisions in the conservation protocol.

  11. Application of airborne remote sensing to the ancient Pompeii site

    NASA Astrophysics Data System (ADS)

    Vitiello, Fausto; Giordano, Antonio; Borfecchia, Flavio; Martini, Sandro; De Cecco, Luigi

    1996-12-01

    The ancient Pompeii site lies in the Sarno Valley, an area of about 400 km2 in the south of Italy near Naples that has been settled since ancient times (thousands of years ago). At present the valley is under critical environmental stress because of substantial industrial development. ENEA is conducting various studies and research in the valley, employing historical research, ground campaigns, cartography and up-to-date airborne multispectral remote sensing technologies to build a geographical information system. Airborne remote sensing technologies are very well suited to situations such as that of the Sarno Valley. The paper describes the archaeological application of the research in progress regarding the ancient site of Pompeii and its fluvial port.

  12. Traffic camera system development

    NASA Astrophysics Data System (ADS)

    Hori, Toshi

    1997-04-01

    The intelligent transportation system has generated a strong need for the development of intelligent camera systems to meet the requirements of sophisticated applications, such as electronic toll collection (ETC), traffic violation detection and automatic parking lot control. In order to achieve the highest levels of accuracy in detection, these cameras must have high speed electronic shutters, high resolution, high frame rate, and communication capabilities. A progressive scan interline transfer CCD camera, with its high speed electronic shutter and resolution capabilities, provides the basic functions to meet the requirements of a traffic camera system. Unlike most industrial video imaging applications, traffic cameras must deal with harsh environmental conditions and an extremely wide range of light. Optical character recognition is a critical function of a modern traffic camera system, with detection and accuracy heavily dependent on the camera function. In order to operate under demanding conditions, communication and functional optimization is implemented to control cameras from a roadside computer. The camera operates with a shutter speed faster than 1/2000 sec. to capture highway traffic both day and night. Consequently camera gain, pedestal level, shutter speed and gamma functions are controlled by a look-up table containing various parameters based on environmental conditions, particularly lighting. Lighting conditions are studied carefully, to focus only on the critical license plate surface. A unique light sensor permits accurate reading under a variety of conditions, such as a sunny day, evening, twilight, storms, etc. These camera systems are being deployed successfully in major ETC projects throughout the world.

  13. Miniaturized fundus camera

    NASA Astrophysics Data System (ADS)

    Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.

    2003-07-01

    We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, as well as in other applications, a small, lightweight digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems. The electronics is truly portable, fitting in a standard boardcase. The camera is designed to be offered at a competitive price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project screening for ROP. Telemedicine is a perfect application for this camera system, exploiting both of its advantages: portability and digital imaging.

  14. Unsupervised classification of remote multispectral sensing data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The new unsupervised classification technique for classifying multispectral remote sensing data, which can come either from a multispectral scanner or from digitized color-separation aerial photographs, consists of two parts: (a) a sequential statistical clustering, which is a one-pass sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. Applications of the technique using an IBM-7094 computer on multispectral data sets over Purdue's Flight Line C-1 and the Yellowstone National Park test site have been accomplished. Comparisons between the classification maps produced by the unsupervised technique and by the supervised maximum likelihood technique indicate that the classification accuracies are in agreement.
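    As a rough illustration of the composite clustering idea in this record (a one-pass sequential clustering to form initial clusters, followed by iterative K-means refinement), the sketch below seeds clusters with a simple distance threshold and hands the seeds to scikit-learn's K-means. The threshold value and the use of scikit-learn are assumptions for illustration, not the original IBM-7094 implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def sequential_seed(pixels, threshold):
    """One-pass clustering: open a new cluster whenever a pixel lies farther
    than `threshold` from every existing cluster mean."""
    means, counts = [], []
    for x in pixels:
        if means:
            d = np.linalg.norm(np.asarray(means) - x, axis=1)
            j = int(np.argmin(d))
            if d[j] < threshold:
                counts[j] += 1
                means[j] += (x - means[j]) / counts[j]  # running-mean update
                continue
        means.append(x.astype(float).copy())
        counts.append(1)
    return np.asarray(means)

def composite_cluster(pixels, threshold=0.1):
    """Stage (a): sequential seeding; stage (b): generalized K-means refinement."""
    seeds = sequential_seed(pixels, threshold)
    km = KMeans(n_clusters=len(seeds), init=seeds, n_init=1).fit(pixels)
    return km.labels_, km.cluster_centers_
```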

  15. Multispectral image segmentation of breast pathology

    NASA Astrophysics Data System (ADS)

    Hornak, Joseph P.; Blaakman, Andre; Rubens, Deborah; Totterman, Saara

    1991-06-01

    The signal intensity in a magnetic resonance image is not only a function of imaging parameters but also of several intrinsic tissue properties. Therefore, unlike other medical imaging modalities, magnetic resonance imaging (MRI) allows the imaging scientist to locate pathology using multispectral image segmentation. Multispectral image segmentation works best when orthogonal spectral regions are employed. In MRI, possible spectral regions are spin density ρ, spin-lattice relaxation time T1, spin-spin relaxation time T2, and texture for each nucleus type and chemical shift. This study examines the ability of multispectral image segmentation to locate breast pathology using the total hydrogen T1, T2, and ρ. The preliminary results indicate that our technique can locate cysts and fibroadenoma breast lesions with a minimum number of false-positives and false-negatives. Results, the T1, T2, and ρ algorithms, and segmentation techniques are presented.

  16. Radiometric Characterization of Hyperspectral Imagers using Multispectral Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Kurt, Thome; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-01-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands, as well as similar agreement between results that employ the different MODIS sensors as a reference.
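    The band-averaging step mentioned above (convolving Hyperion's fine-resolution spectrum with the relative spectral response of a MODIS band) reduces to a response-weighted mean. A minimal sketch, with the shared wavelength grid, resampled Hyperion radiances and MODIS band response assumed as inputs:

```python
import numpy as np

def band_average(wavelengths, hyperspectral_radiance, relative_response):
    """Simulate a broad multispectral band from a hyperspectral spectrum.

    wavelengths:            (N,) common wavelength grid, e.g. in nm
    hyperspectral_radiance: (N,) Hyperion radiance resampled to that grid
    relative_response:      (N,) MODIS-band relative spectral response on the grid
    """
    w = np.asarray(relative_response, dtype=float)
    L = np.asarray(hyperspectral_radiance, dtype=float)
    x = np.asarray(wavelengths, dtype=float)
    return np.trapz(w * L, x) / np.trapz(w, x)
```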

  17. Sensitivity Analysis of an Automated Calibration Routine for Airborne Cameras

    DTIC Science & Technology

    2013-03-01

    Excerpt (acronyms defined in the report include DTED, Digital Terrain Elevation Data, and GNSS, Global Navigation Satellite System): ... the instant in time at which an image was captured. The SPAN featured a tight integration of a NovAtel GNSS receiver and the IMU. The SPAN provided continuous navigation information, using an Inertial Navigation System (INS), to bridge short GNSS outages.

  18. Sampling for Airborne Radioactivity

    DTIC Science & Technology

    2007-10-01

    Excerpt: Airborne radioactive particles may emit alpha, beta, gamma or neutron radiation, depending on which radioisotope is present. For an airborne radioactivity detection system, it is most important to be able to detect alpha particles ... compared to betas, gammas and neutrons. From a health perspective ...

  19. High-speed multispectral confocal biomedical imaging

    PubMed Central

    Carver, Gary E.; Locknar, Sarah A.; Morrison, William A.; Krishnan Ramanujan, V.; Farkas, Daniel L.

    2014-01-01

    Abstract. A new approach for generating high-speed multispectral confocal images has been developed. The central concept is that spectra can be acquired for each pixel in a confocal spatial scan by using a fast spectrometer based on optical fiber delay lines. This approach merges fast spectroscopy with standard spatial scanning to create datacubes in real time. The spectrometer is based on a serial array of reflecting spectral elements, delay lines between these elements, and a single element detector. The spatial, spectral, and temporal resolution of the instrument is described and illustrated by multispectral images of laser-induced autofluorescence in biological tissues. PMID:24658777

  20. Bathymetric mapping with passive multispectral imagery.

    PubMed

    Philpot, W D

    1989-04-15

    Bathymetric mapping will be most straightforward where water quality and atmospheric conditions are invariant over the scene. Under these conditions, both depth and an effective attenuation coefficient of the water over several different bottom types may be retrieved from passive, multispectral imagery. As scenes become more complex-with changing water type and variable atmospheric conditions-it is probable that a strictly spectral analysis will no longer be sufficient to extract depth from multispectral imagery. In these cases an independent source of information will be required. The most likely sources for such information are spatial and temporal variations in image data.

  1. Multispectral imaging fluorescence microscopy for living cells.

    PubMed

    Hiraoka, Yasushi; Shimi, Takeshi; Haraguchi, Tokuko

    2002-10-01

    Multispectral imaging technologies have been widely used in fields of astronomy and remote sensing. Interdisciplinary approaches developed in, for example, the National Aeronautics and Space Administration (NASA, USA), the Jet Propulsion Laboratory (JPL, USA), or the Communications Research Laboratory (CRL, Japan) have extended the application areas of these technologies from planetary systems to cellular systems. Here we overview multispectral imaging systems that have been devised for microscope applications. We introduce these systems with particular interest in live cell imaging. Finally we demonstrate examples of spectral imaging of living cells using commercially available systems with no need for user engineering.

  2. Data Requirements For The Water Directive - Role of Remote Sensing Using The Hrsc-ax Camera Illustrated By The Blue City Project

    NASA Astrophysics Data System (ADS)

    Martin, J.; O'Kane, J. P.

    A completely digital aerial survey of Cork City and the Lee Catchment, flown in May 2001 at an altitude of 3000 m, covered an area of 325 sq km. The campaign used the German Aerospace Centre's HRSC-AX system. The -AX (Airborne, Extended-Generation) version of the HRSC (High Resolution Stereo Camera), in commercial operation since late 2000, produces very high resolution Digital Elevation Models (DEMs) and multi-spectral ortho-rectified images. The system contains nine CCD line sensors, mounted in parallel behind a single optic. Operating on a push-broom principle, these acquire nine superimposed image tracks simultaneously (along-track). Five of the nine CCD lines are panchromatic sensors arranged at specific viewing angles to provide the multiple-stereo and photogrammetric capabilities of the instrument. The other four are covered with colour filters (red, green, blue, near infrared) to acquire multi-spectral images. The fully automatic photogrammetric and cartographic processing system, developed at the DLR Institute of Space Sensor Technology and Planetary Exploration (in co-operation with the Technical University of Berlin), yields mean absolute accuracies of 15-20 cm in both the horizontal and vertical directions for all products. For the Blue City Project, the system yielded a DEM with a resolution of 100 cm, a nadir (panchromatic) ortho-mosaic with a 15 cm resolution, and true-colour (R, G, B and nIR) ortho-mosaics with a 30 cm resolution. Joining the colour bands yielded true-colour (RGB) and false-colour (nIR, G, B) ortho-image mosaics of the full area. The combination of these high resolution products allows detailed topographic and multi-spectral analysis of the study area. This data set will provide the foundation for a multi-purpose geographic information system, linked to hydrodynamic, hydraulic and hydrologic computer models of surface and subsurface flow, including water quality, throughout the catchment of the Lee and the City of Cork.

  3. Non-contact assessment of melanin distribution via multispectral temporal illumination coding

    NASA Astrophysics Data System (ADS)

    Amelard, Robert; Scharfenberger, Christian; Wong, Alexander; Clausi, David A.

    2015-03-01

    Melanin is a pigment that is highly absorptive in the UV and visible electromagnetic spectra. It is responsible for perceived skin tone, and protects against harmful UV effects. Abnormal melanin distribution is often an indicator of melanoma. We propose a novel approach for non-contact assessment of melanin distribution via multispectral temporal illumination coding, estimating the two-dimensional melanin distribution from its absorptive characteristics. In the proposed system, a novel multispectral, cross-polarized, temporally-coded illumination sequence is synchronized with a camera to measure reflectance under both multispectral and ambient illumination. This allows us to eliminate the ambient illumination contribution from the acquired reflectance measurements, and also to determine the melanin distribution in an observed region based on the spectral properties of melanin using the Beer-Lambert law. Using this information, melanin distribution maps can be generated for objective, quantitative assessment of the skin type of individuals. We show that the melanin distribution map correctly identifies areas with high melanin densities (e.g., nevi).
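    The melanin estimate above rests on the Beer-Lambert law: at each wavelength the measured reflectance is attenuated exponentially by the absorbing chromophores. The sketch below is a heavily simplified, hypothetical version of that idea (scattering and hemoglobin are ignored, and the melanin extinction spectrum and effective path length are assumed inputs); it is not the authors' calibrated pipeline.

```python
import numpy as np

def melanin_map(reflectance_stack, eps_melanin, path_length=1.0):
    """Per-pixel melanin estimate from ambient-corrected multispectral reflectance.

    reflectance_stack: (H, W, B) reflectance per band
    eps_melanin:       (B,) assumed melanin extinction coefficients for those bands
    path_length:       assumed effective optical path length (arbitrary units)

    Beer-Lambert: -log10(R) ~ eps * c * l, solved per pixel by least squares.
    """
    R = np.clip(reflectance_stack, 1e-6, 1.0)
    absorbance = -np.log10(R).reshape(-1, R.shape[-1])           # (H*W, B)
    A = (np.asarray(eps_melanin, float) * path_length)[:, None]  # (B, 1)
    conc, *_ = np.linalg.lstsq(A, absorbance.T, rcond=None)      # (1, H*W)
    return conc.reshape(R.shape[:2])
```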

  4. Landslide Identification and Information Extraction Based on Optical and Multispectral UAV Remote Sensing Imagery

    NASA Astrophysics Data System (ADS)

    Lin, Jiayuan; Wang, Meimei; Yang, Jia; Yang, Qingxia

    2017-02-01

    Landslides are among the most serious natural disasters, causing enormous economic losses and casualties around the world. Fast and accurate identification of newly occurred landslides and extraction of relevant information are the premise and foundation of landslide disaster assessment and relief. The places where landslides occur are often inaccessible for field observation because of temporary failures in transportation and communication. UAV remote sensing can therefore be adopted to collect landslide information efficiently and quickly, with the advantages of low cost, flexible launch and landing, safety, under-cloud flying, and hyperspatial image resolution. Newly occurred landslides are usually accompanied by phenomena such as vegetation burial and bedrock or bare-soil exposure, which can be easily detected in optical or multispectral UAV images. Taking one typical landslide that occurred in the Wenchuan Earthquake stricken area in 2010 as an example, this paper demonstrates the integration of a multispectral camera with a UAV platform, NDVI generation from multispectral UAV images, three-dimensional terrain and orthophoto generation from optical UAV images, and identification and extraction of landslide information such as location, impacted area, and earthwork volume.
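    The NDVI step mentioned in this abstract is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red); fresh landslides tend to show low NDVI where vegetation has been buried and bare soil or bedrock is exposed. A minimal sketch (the 0.2 threshold is an illustrative assumption, not a value from the paper):

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from co-registered band arrays."""
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-6)

def candidate_landslide_mask(red, nir, threshold=0.2):
    """Flag pixels whose NDVI falls below an (assumed) bare-surface threshold."""
    return ndvi(red, nir) < threshold
```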

  5. Multispectral and hyperspectral measurements of smoke candles and soldier's camouflage equipment

    NASA Astrophysics Data System (ADS)

    Lagueux, Philippe; Gagnon, Marc-André; Kastek, Mariusz; Piątkowski, Tadeusz; Dulski, Rafał; Trzaskawka, Piotr

    2012-09-01

    The emergence of new infrared camouflage and countermeasure technologies in the context of military operations has paved the way to enhanced detection capabilities. Camouflage devices such as candles (or smoke bombs) and flares are developed to generate either large-area or localized screens with very high absorption in the infrared. Similarly, soldiers' camouflage devices such as clothing have evolved in design to blend their infrared characteristics with those of the background. In all cases, the analysis of the targets' infrared images needs to be conducted in both the multispectral and hyperspectral domains to assess their capability to efficiently provide visible and infrared camouflage. The Military University of Technology has conducted several intensive field campaigns in which various types of smoke candles and camouflage uniforms were deployed in different conditions and measured in both the multispectral and hyperspectral domains. Cooled broadband infrared cameras were used for the multispectral analysis, whereas the high spectral, spatial and temporal resolution acquisition of these thermodynamic events was recorded with the Telops Hyper-Cam sensor. This paper presents the test campaign concept and the analysis of the recorded measurements.

  6. Hemodynamic and morphologic responses in mouse brain during acute head injury imaged by multispectral structured illumination

    NASA Astrophysics Data System (ADS)

    Volkov, Boris; Mathews, Marlon S.; Abookasis, David

    2015-03-01

    Multispectral imaging has received significant attention over the last decade as it concurrently integrates spectroscopy, imaging, and tomographic analysis to acquire both spatial and spectral information from biological tissue. In the present study, a multispectral setup based on projection of structured illumination at several near-infrared wavelengths and at different spatial frequencies is applied to quantitatively assess brain function before, during, and after the onset of traumatic brain injury in an intact mouse brain (n=5). For the production of head injury, we used the weight drop method, in which a cylindrical metallic rod falling along a metal tube strikes the mouse's head. Structured light was projected onto the scalp surface and diffusely reflected light was recorded by a CCD camera positioned perpendicular to the mouse head. Following data analysis, we were able to concurrently show a series of hemodynamic and morphologic changes over time, including higher deoxyhemoglobin, reduction in oxygen saturation, cell swelling, etc., in comparison with baseline measurements. Overall, the results demonstrate the capability of multispectral imaging based on structured illumination to detect and map brain tissue optical and physiological properties following brain injury in a simple, noninvasive and noncontact manner.

  7. Comparative study of water ice exposures on cometary nuclei using multispectral imaging data

    NASA Astrophysics Data System (ADS)

    Oklay, N.; Sunshine, J. M.; Pajola, M.; Pommerol, A.; Vincent, J.-B.; Mottola, S.; Sierks, H.; Fornasier, S.; Barucci, M. A.; Preusker, F.; Scholten, F.; Lara, L. M.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; A'Hearn, M. F.; Bertaux, J.-L.; Bertini, I.; Bodewits, D.; Cremonese, G.; Da Deppo, V.; Davidsson, B. J. R.; Debei, S.; De Cecco, M.; Deller, J.; Fulle, M.; Gicquel, A.; Groussin, O.; Gutiérrez, P. J.; Güttler, C.; Hall, I.; Hofmann, M.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lazzarin, M.; Lin, Z.-Y.; Lopez Moreno, J. J.; Marzari, F.; Naletto, G.; Shi, X.; Thomas, N.; Tubiana, C.

    2016-11-01

    Deep Impact, EPOXI and Rosetta missions visited comets 9P/Tempel 1, 103P/Hartley 2 and 67P/Churyumov-Gerasimenko, respectively. Each of these three missions was equipped with both multispectral imagers and infrared spectrometers. Bright blue features containing water ice were detected in each of these comet nuclei. We analysed multispectral properties of enriched water ice features observed via Optical, Spectroscopic, and Infrared Remote Imaging System narrow angle camera on comet 67P in the wavelength range of 260-1000 nm and then compared with multispectral data of water ice deposits observed on comets 9P and 103P. We characterize the UV/VIS properties of water-ice-rich features observed on the nuclei of these three comets. When compared to the average surface of each comet, our analysis shows that the water ice deposits seen on comet 9P are similar to the clustered water-ice-rich features seen on comet 67P, while the water ice deposit seen on comet 103P is more akin to two large isolated water-ice-rich features seen on comet 67P. Our results indicate that the water ice deposit observed on comet 103P contains more water ice than the water-ice-rich features observed on comets 9P and 67P, proportionally to the average surface of each nucleus.

  8. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 KeV x-rays.

  9. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  10. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  11. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 KeV x-rays. 3 figs.

  12. Airborne remote sensing for Deepwater Horizon oil spill emergency response

    NASA Astrophysics Data System (ADS)

    Kroutil, Robert T.; Shen, Sylvia S.; Lewis, Paul E.; Miller, David P.; Cardarelli, John; Thomas, Mark; Curry, Timothy; Kudaraskus, Paul

    2010-08-01

    On April 28, 2010, the Environmental Protection Agency's (EPA) Airborne Spectral Photometric Environmental Collection Technology (ASPECT) aircraft was deployed to Gulfport, Mississippi to provide airborne remotely sensed air monitoring and situational awareness data and products in response to the Deepwater Horizon oil rig disaster. The ASPECT aircraft was released from service on August 9, 2010 after having flown over 75 missions that included over 250 hours of flight operation. ASPECT's initial mission responsibility was to provide air quality monitoring (i.e., identification of vapor species) during various oil burning operations. The ASPECT airborne wide-area infrared remote sensing spectral data was used to evaluate the hazard potential of vapors being produced from open water oil burns near the Deepwater Horizon rig site. Other significant remote sensing data products and innovations included the development of an advanced capability to correctly identify, locate, characterize, and quantify surface oil that could reach beaches and wetland areas. This advanced identification product provided the Incident Command an improved capability to locate surface oil in order to improve the effectiveness of oil skimmer vessel recovery efforts directed by the US Coast Guard. This paper discusses the application of infrared spectroscopy and multispectral infrared imagery to address significant issues associated with this national crisis. More specifically, this paper addresses the airborne remote sensing capabilities, technology, and data analysis products developed specifically to optimize the resources and capabilities of the Deepwater Horizon Incident Command structure personnel and their remediation efforts.

  13. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included.

  14. Streak camera meeting summary

    SciTech Connect

    Dolan, Daniel H.; Bliss, David E.

    2014-09-01

    Streak cameras are important for high-speed data acquisition in single event experiments, where the total recorded information (I) is shared between the number of measurements (M) and the number of samples (S). Topics of this meeting included: streak camera use at the national laboratories; current streak camera production; new tube developments and alternative technologies; and future planning. Each topic is summarized in the following sections.

  15. Summary of Michigan multispectral investigations program

    NASA Technical Reports Server (NTRS)

    Legault, R. R.

    1970-01-01

    The development of techniques to extend spectral signatures in space and time is reported. Signatures that were valid for 30 miles have been extended for 129 miles using transformation and sun sensor data so that a complicated multispectral recognition problem that required 219 learning sets can now be done with 13 learning sets.

  16. Blast investigation by fast multispectral radiometric analysis

    NASA Astrophysics Data System (ADS)

    Devir, A. D.; Bushlin, Y.; Mendelewicz, I.; Lessin, A. B.; Engel, M.

    2011-06-01

    Knowledge of the processes involved in blasts and detonations is required in various applications, e.g. missile interception, blasts of high-explosive materials, final ballistics and IED identification. Blasts release a large amount of energy over a short time. Part of this energy is released as intense radiation in the optical spectral bands. This paper proposes to measure the blast radiation with a fast multispectral radiometer. The measurement is made simultaneously in appropriately chosen spectral bands. These spectral bands provide extensive information on the physical and chemical processes that govern the blast through the time dependence of the molecular and aerosol contributions to the detonation products. Multi-spectral blast measurements are performed in the visible, SWIR and MWIR spectral bands. Analysis of the cross-correlation between the measured multi-spectral signals gives the time dependence of the temperature, aerosol and gas composition of the blast. Further analysis of the development of these quantities in time may indicate the order of the detonation and the amount and type of explosive materials. Examples of analysis of measured explosions are presented to demonstrate the power of the suggested fast multispectral radiometric analysis approach.

  17. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  18. Digital Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, Samuel D.; Yeates, Herbert D.

    1993-01-01

    Digital electronic still camera part of electronic recording, processing, transmitting, and displaying system. Removable hard-disk drive in camera serves as digital electronic equivalent of photographic film. Images viewed, analyzed, or transmitted quickly. Camera takes images of nearly photographic quality and stores them in digital form. Portable, hand-held, battery-powered unit designed for scientific use. Camera used in conjunction with playback unit also serving as transmitting unit if images sent to remote station. Remote station equipped to store, process, and display images. Digital image data encoded with error-correcting code at playback/transmitting unit for error-free transmission to remote station.

  19. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  20. Digital camera simulation.

    PubMed

    Farrell, Joyce E; Catrysse, Peter B; Wandell, Brian A

    2012-02-01

    We describe a simulation of the complete image processing pipeline of a digital camera, beginning with a radiometric description of the scene captured by the camera and ending with a radiometric description of the image rendered on a display. We show that there is a good correspondence between measured and simulated sensor performance. Through the use of simulation, we can quantify the effects of individual digital camera components on system performance and image quality. This computational approach can be helpful for both camera design and image quality assessment.

  1. Airborne gravity is here

    SciTech Connect

    Hammer, S.

    1982-01-11

    After 20 years of development efforts, the airborne gravity survey has finally become a practical exploration method. Besides gravity data, the airborne survey can also collect simultaneous, continuous records of high-precision magnetic-field data as well as terrain clearance; these provide a topographic contour map useful in calculating terrain corrections and in subsequent planning and engineering. Compared with a seismic survey, the airborne gravity method can cover the same area much more quickly and cheaply; a seismograph could then detail the interesting spots.

  2. Software defined multi-spectral imaging for Arctic sensor networks

    NASA Astrophysics Data System (ADS)

    Siewert, Sam; Angoth, Vivek; Krishnamurthy, Ramnarayan; Mani, Karthikeyan; Mock, Kenrick; Singh, Surjith B.; Srivistava, Saurav; Wagner, Chris; Claus, Ryan; Vis, Matthew Demi

    2016-05-01

    Availability of off-the-shelf infrared sensors combined with high definition visible cameras has made possible the construction of a Software Defined Multi-Spectral Imager (SDMSI) combining long-wave, near-infrared and visible imaging. The SDMSI requires a real-time embedded processor to fuse images and to create real-time depth maps for opportunistic uplink in sensor networks. Researchers at Embry Riddle Aeronautical University working with University of Alaska Anchorage at the Arctic Domain Awareness Center and the University of Colorado Boulder have built several versions of a low-cost drop-in-place SDMSI to test alternatives for power efficient image fusion. The SDMSI is intended for use in field applications including marine security, search and rescue operations and environmental surveys in the Arctic region. Based on Arctic marine sensor network mission goals, the team has designed the SDMSI to include features to rank images based on saliency and to provide on camera fusion and depth mapping. A major challenge has been the design of the camera computing system to operate within a 10 to 20 Watt power budget. This paper presents a power analysis of three options: 1) multi-core, 2) field programmable gate array with multi-core, and 3) graphics processing units with multi-core. For each test, power consumed for common fusion workloads has been measured at a range of frame rates and resolutions. Detailed analyses from our power efficiency comparison for workloads specific to stereo depth mapping and sensor fusion are summarized. Preliminary mission feasibility results from testing with off-the-shelf long-wave infrared and visible cameras in Alaska and Arizona are also summarized to demonstrate the value of the SDMSI for applications such as ice tracking, ocean color, soil moisture, animal and marine vessel detection and tracking. The goal is to select the most power efficient solution for the SDMSI for use on UAVs (Unoccupied Aerial Vehicles) and other drop

  3. Vein visualization using a smart phone with multispectral Wiener estimation for point-of-care applications.

    PubMed

    Song, Jae Hee; Kim, Choye; Yoo, Yangmo

    2015-03-01

    Effective vein visualization is clinically important for various point-of-care applications, such as needle insertion. It can be achieved by utilizing ultrasound imaging or by applying infrared laser excitation and monitoring its absorption. However, while these approaches can be used for vein visualization, they are not suitable for point-of-care applications because of their cost, time, and accessibility. In this paper, a new vein visualization method based on multispectral Wiener estimation is proposed and its real-time implementation on a smart phone is presented. In the proposed method, a conventional RGB camera on a commercial smart phone (i.e., Galaxy Note 2, Samsung Electronics Inc., Suwon, Korea) is used to acquire reflectance information from veins. Wiener estimation is then applied to extract the multispectral information from the veins. To evaluate the performance of the proposed method, an experiment was conducted using a color calibration chart (ColorChecker Classic, X-rite, Grand Rapids, MI, USA) and an average root-mean-square error of 12.0% was obtained. In addition, an in vivo subcutaneous vein imaging experiment was performed to explore the clinical performance of the smart phone-based Wiener estimation. From the in vivo experiment, the veins at various sites were successfully localized using the reconstructed multispectral images and these results were confirmed by ultrasound B-mode and color Doppler images. These results indicate that the presented multispectral Wiener estimation method can be used for visualizing veins using a commercial smart phone for point-of-care applications (e.g., vein puncture guidance).
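    Wiener estimation reconstructs a spectrum from camera responses with a linear operator built from training data, W = R_sc R_cc^{-1}, where R_sc is the cross-correlation between training spectra and camera responses and R_cc the autocorrelation of the responses. A small sketch under those standard definitions, with training matrices as assumed inputs (not the authors' exact calibration):

```python
import numpy as np

def wiener_matrix(train_spectra, train_responses, reg=1e-6):
    """W = R_sc @ inv(R_cc), estimated from training pairs.

    train_spectra:   (N, S) reflectance spectra of training patches
    train_responses: (N, C) corresponding camera responses (e.g., RGB)
    reg:             small ridge term for numerical stability (assumption)
    """
    R_sc = train_spectra.T @ train_responses        # (S, C)
    R_cc = train_responses.T @ train_responses      # (C, C)
    return R_sc @ np.linalg.inv(R_cc + reg * np.eye(R_cc.shape[0]))

def estimate_spectra(W, responses):
    """Recover spectra for new camera responses: (M, C) -> (M, S)."""
    return responses @ W.T
```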

  4. Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images

    NASA Astrophysics Data System (ADS)

    Awumah, Anna; Mahanti, Prasun; Robinson, Mark

    2016-10-01

    Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion for planetary images are rare, although image fusion is well-known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm results in a high-spatial quality product while the Wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-Wavelet image fusion algorithm when applied to LROC MS images. The hybrid method provides the best HRMS product - both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and J. L. Van Genderen, "Review article: Multisensor image fusion in remote sensing: concepts, methods and applications," International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Y., "Understanding image fusion," Photogrammetric Engineering & Remote Sensing 70.6 (2004): 657-661. [3] Mahanti, P., et al., "Enhancement of spatial resolution of the LROC Wide Angle Camera images," XXIII ISPRS Congress Archives (2016).
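    For background on the IHS component of the hybrid method, classical IHS-style pan-sharpening replaces the intensity of the upsampled MS image with a histogram-matched Pan image and injects the difference into every band. A generic sketch of that substitution (not the authors' hybrid IHS-Wavelet code):

```python
import numpy as np

def ihs_fuse(ms_upsampled, pan):
    """Simple IHS-style pan-sharpening.

    ms_upsampled: (H, W, B) multispectral image resampled to the Pan grid
    pan:          (H, W) panchromatic image
    """
    ms = ms_upsampled.astype(float)
    intensity = ms.mean(axis=2)
    # Match the Pan image to the intensity component's mean and spread.
    pan_matched = (pan - pan.mean()) * (intensity.std() / pan.std()) + intensity.mean()
    # Add the spatial detail to every band.
    return ms + (pan_matched - intensity)[:, :, None]
```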

  5. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated with low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronic devices that emit infrared light during operation. CCD camera also used to identify very clearly parts that have failed, where luminescence typically found.

  6. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  7. Compact Solar Camera.

    ERIC Educational Resources Information Center

    Juergens, Albert

    1980-01-01

    Describes a compact solar camera built as a one-semester student project. This camera is used for taking pictures of the sun and moon and for direct observation of the image of the sun on a screen. (Author/HM)

  8. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex Cameras (DSLR) have appeared. As a consequence we can buy cameras of higher and higher pixel number, and mass production has resulted in the great reduction of prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way—intelligent programs are created for after-processing and other supplementary works. Nowadays we can find a digital camera in almost every household, most of these cameras are DSLR ones. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative for the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these for their own circumstances.

  9. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  10. Multispectral and hyperspectral imaging with AOTF for object recognition

    NASA Astrophysics Data System (ADS)

    Gupta, Neelam; Dahmani, Rachid

    1999-01-01

    Acousto-optic tunable-filter (AOTF) technology has been used in the design of a no-moving-parts, compact, lightweight, field-portable, automated, adaptive spectral imaging system when combined with a high-sensitivity imaging detector array. Such a system can detect spectral signatures of targets and/or background, which contain polarization information and can be digitally processed by a variety of algorithms. At the Army Research Laboratory, we have developed and used a number of AOTF imaging systems and are also carrying out the development of such imagers at longer wavelengths. We have carried out hyperspectral and multispectral imaging using AOTF systems covering the spectral range from the visible to the mid-IR. One of the imagers uses a two-cascaded collinear-architecture AOTF cell in the visible-to-near-IR range with a digital Si charge-coupled device camera as the detector. The images obtained with this system showed no color blurring or image shift due to the angular deviation of different colors as a result of diffraction, and the digital images are stored and processed with great ease. The spatial resolution of the filter was evaluated by means of the lines of a target chart. We have also obtained and processed images from another, noncollinear, visible-to-near-IR AOTF imager with a digital camera, and used hyperspectral image processing software to enhance object recognition against cluttered backgrounds. We are presently working on a mid-IR AOTF imaging system that uses a high-performance InSb focal plane array and image acquisition and processing software. We describe our hyperspectral imaging program and present results from our imaging experiments.

  11. Chromaffin cell calcium signal and morphology study based on multispectral images

    NASA Astrophysics Data System (ADS)

    Wu, Hongxiu; Wei, Shunhui; Qu, Anlian; Zhou, Zhuan

    1998-09-01

    Increasing or decreasing the internal calcium concentration can promote or prevent programmed cell death (PCD). We therefore performed a Ca2+ imaging study using the Ca2+ indicator dye fura-2 and a sensitive cooled-CCD camera with 12-bit resolution. Monochromatic beams of light with wavelengths of 345 and 380 nm were isolated from the light emitted by a xenon lamp using a monochromator. The concentration of free calcium can be directly calculated from the ratio of two fluorescence values taken at two appropriately selected wavelengths. Fluorescent light emitted from the cells was captured using a camera system. The cell morphology study is based on multispectral scanning, with smear images provided as three monochromatic images by illumination with light of 610, 535 and 470 nm wavelengths. The nuclear characteristic parameters extracted from individual nuclei by the system are nuclear area, nuclear diameter, and nuclear density vector. The results of the restoration of images and the performance of a primitive logic for the detection of nuclei with PCD proved the usefulness of the system and the advantages of using multispectral images in the restoration and detection procedures.
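    The ratiometric conversion mentioned above is conventionally done with the Grynkiewicz equation, [Ca2+] = Kd * beta * (R - Rmin) / (Rmax - R), where R is the 345/380 nm fluorescence ratio, Rmin and Rmax are the ratios at zero and saturating calcium, and beta is the ratio of the 380 nm signals under those two conditions. A small sketch with the calibration constants as assumed inputs (not values from the paper):

```python
import numpy as np

def free_calcium(f_345, f_380, kd, r_min, r_max, beta):
    """Grynkiewicz ratiometric estimate of [Ca2+] from fura-2 image pairs.

    f_345, f_380: background-corrected fluorescence at 345 and 380 nm excitation
    kd:           fura-2 dissociation constant (sets the units of the result)
    r_min, r_max: ratios measured at zero and saturating calcium
    beta:         F380(zero Ca) / F380(saturating Ca)
    """
    r = f_345 / np.maximum(f_380, 1e-9)
    return kd * beta * (r - r_min) / np.maximum(r_max - r, 1e-9)
```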

  12. Dry imaging cameras.

    PubMed

    Indrajit, Ik; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-04-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in areas of diverse sciences like computing, mechanics, thermal physics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technologies. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow.

  13. Air Pollution Determination Using a Surveillance Internet Protocol Camera Images

    NASA Astrophysics Data System (ADS)

    Chow Jeng, C. J.; Lim, Hwee San; Matjafri, M. Z.; Abdullah, K.

    Air pollution has long been a problem in the industrial nations of the West. It has now become an increasing source of environmental degradation in the developing nations of east Asia. The Malaysian government has built a network to monitor air pollution, but the cost of these networks is high and limits knowledge of pollutant concentrations to specific points in the cities. A methodology based on a surveillance internet protocol (IP) camera for the determination of air pollution concentrations is presented in this study. The objective of this study was to test the feasibility of using IP camera data for estimating real-time particulate matter of size less than 10 micron (PM10) on the campus of USM. The proposed PM10 retrieval algorithm, derived from atmospheric optical properties, was employed in the present study. In situ data sets of PM10 measurements and sun radiation measurements at the ground surface were collected simultaneously with the IP camera images using a DustTrak meter and a handheld spectroradiometer, respectively. The digital images were separated into three bands, namely red, green and blue, for multispectral algorithm calibration. The digital numbers (DN) of the IP camera images were converted into radiance and reflectance values. After that, the reflectance recorded by the digital camera was subtracted by the reflectance of the known surface to obtain the reflectance caused by the atmospheric components. The atmospheric reflectance values were used for regression analysis. Regression technique was employed to determine suitable
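    The processing chain in this record (digital numbers to radiance and reflectance, subtraction of the known surface's reflectance, then regression of PM10 against the atmospheric reflectance) can be sketched as below. The linear radiometric calibration and the single-band linear fit are assumptions for illustration; the study's actual calibration coefficients and multispectral regression form are not given here.

```python
import numpy as np

def atmospheric_reflectance(dn, gain, offset, irradiance, surface_reflectance):
    """Convert digital numbers to reflectance and remove the known surface term."""
    radiance = gain * dn.astype(float) + offset        # assumed linear calibration
    total_reflectance = np.pi * radiance / irradiance
    return total_reflectance - surface_reflectance

def fit_pm10(atm_reflectance_samples, pm10_samples):
    """Least-squares fit PM10 = a * reflectance + b over the in situ samples."""
    a, b = np.polyfit(atm_reflectance_samples, pm10_samples, 1)
    return a, b
```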

  14. Near-infrared camera for the Clementine mission

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultraviolet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The near-infrared (NIR) multi-spectral camera, one of two workhorse lunar mapping cameras (the other being the UV/visible camera), provided approximately 200 m spatial resolution at 400 km periselene, and a 39 km across-track swath. This 1.9 kg infrared camera, using a 256 x 256 InSb FPA, viewed reflected solar illumination from the lunar surface and lunar horizon in the 1 to 3 µm wavelength region, extending lunar imagery and mineralogy studies into the near infrared. A description of this light-weight, low power NIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective of flight qualifying the sensors for future Department of Defense flights.

  15. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. Spectral Characterization of a Prototype SFA Camera for Joint Visible and NIR Acquisition

    PubMed Central

    Thomas, Jean-Baptiste; Lapray, Pierre-Jean; Gouton, Pierre; Clerc, Cédric

    2016-01-01

    Multispectral acquisition improves machine vision since it permits capturing more information on object surface properties than color imaging. The concept of spectral filter arrays has been developed recently and allows multispectral single shot acquisition with a compact camera design. Due to filter manufacturing difficulties, there was, up to recently, no system available for a large span of spectrum, i.e., visible and Near Infra-Red acquisition. This article presents the achievement of a prototype of camera that captures seven visible and one near infra-red bands on the same sensor chip. A calibration is proposed to characterize the sensor, and images are captured. Data are provided as supplementary material for further analysis and simulations. This opens a new range of applications in security, robotics, automotive and medical fields. PMID:27367690

  18. UV/visible camera for the Clementine mission

    SciTech Connect

    Kordas, J.F.; Lewis, I.T.; Priest, R.E.

    1995-04-01

    This article describes the Clementine UV/Visible (UV/Vis) multispectral camera, discusses design goals and preliminary estimates of on-orbit performance, and summarizes lessons learned in building and using the sensor. While the primary objective of the Clementine Program was to qualify a suite of six light-weight, low power imagers for future Department of Defense flights, the mission also provided the first systematic mapping of the complete lunar surface in the visible and near-infrared spectral regions. The 410 g, 4.65 W UV/Vis camera uses a 384 x 288 frame-transfer silicon CCD FPA and operates at six user-selectable wavelength bands between 0.4 and 1.1 µm. It has yielded lunar imagery and mineralogy data with up to 120 m spatial resolution (band dependent) at 400 km periselene along a 39 km cross-track swath.

  19. Optical Communications Link to Airborne Transceiver

    NASA Technical Reports Server (NTRS)

    Regehr, Martin W.; Kovalik, Joseph M.; Biswas, Abhijit

    2011-01-01

    An optical link from Earth to an aircraft demonstrates the ability to establish a link from a ground platform to a transceiver moving overhead. An airplane has a challenging disturbance environment including airframe vibrations and occasional abrupt changes in attitude during flight. These disturbances make it difficult to maintain pointing lock in an optical transceiver in an airplane. Acquisition can also be challenging. In the case of the aircraft link, the ground station initially has no precise knowledge of the aircraft's location. An airborne pointing system has been designed, built, and demonstrated using direct-drive brushless DC motors for passive isolation of pointing disturbances and for high-bandwidth control feedback. The airborne transceiver uses a GPS-INS system to determine the aircraft's position and attitude, and to then illuminate the ground station initially for acquisition. The ground transceiver participates in link-pointing acquisition by first using a wide-field camera to detect initial illumination from the airborne beacon, and to perform coarse pointing. It then transfers control to a high-precision pointing detector. Using this scheme, live video was successfully streamed from the ground to the aircraft at 270 Mb/s while simultaneously downlinking a 50 kb/s data stream from the aircraft to the ground.

  20. Information extraction techniques for multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Crane, R. B.; Turner, R. E.

    1972-01-01

    This study examined how well recognition-processing procedures, trained on multispectral scanner data from particular areas and measurement conditions, transfer to data from different areas viewed under different conditions. The reflective spectral region of approximately 0.3 to 3.0 micrometers is considered. A potential application of such techniques is in conducting area surveys. Work in three general areas is reported: (1) the nature of sources of systematic variation in multispectral scanner radiation signals; (2) an investigation of various techniques for overcoming systematic variations in scanner data; and (3) the use of decision rules based upon empirical distributions of scanner signals rather than upon the usually assumed multivariate normal (Gaussian) signal distributions.
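
    To make the contrast concrete, the sketch below compares a Gaussian maximum-likelihood rule with a decision rule built from empirical (histogram) signal distributions. The band count, class names, training statistics, and histogram binning are illustrative assumptions, not details taken from the report.

```python
# Sketch: per-class Gaussian ML rule vs. an empirical-histogram rule
# for multispectral scanner pixels. Band count and classes are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_bands = 4

# Toy training signals (pixels x bands) for each class.
train = {
    "crop": rng.normal([40, 60, 30, 80], 5, size=(500, n_bands)),
    "soil": rng.normal([70, 65, 55, 50], 8, size=(500, n_bands)),
}

def gaussian_ml(pixel):
    """Assign the class whose multivariate normal fit gives the highest density."""
    best, best_ll = None, -np.inf
    for c, X in train.items():
        mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
        d = pixel - mu
        ll = -0.5 * (d @ np.linalg.solve(cov, d) + np.log(np.linalg.det(cov)))
        if ll > best_ll:
            best, best_ll = c, ll
    return best

def empirical_rule(pixel, bins=16, lo=0.0, hi=120.0):
    """Assign the class with the largest product of per-band histogram frequencies."""
    best, best_score = None, -np.inf
    edges = np.linspace(lo, hi, bins + 1)
    for c, X in train.items():
        score = 0.0
        for b in range(n_bands):
            h, _ = np.histogram(X[:, b], bins=edges, density=True)
            idx = np.clip(np.searchsorted(edges, pixel[b]) - 1, 0, bins - 1)
            score += np.log(h[idx] + 1e-9)
        if score > best_score:
            best, best_score = c, score
    return best

pixel = np.array([45.0, 62.0, 33.0, 75.0])
print(gaussian_ml(pixel), empirical_rule(pixel))
```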

  1. Investigation related to multispectral imaging systems

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F.; Erickson, J. D.

    1974-01-01

    A summary of technical progress made during a five year research program directed toward the development of operational information systems based on multispectral sensing and the use of these systems in earth-resource survey applications is presented. Efforts were undertaken during this program to: (1) improve the basic understanding of the many facets of multispectral remote sensing, (2) develop methods for improving the accuracy of information generated by remote sensing systems, (3) improve the efficiency of data processing and information extraction techniques to enhance the cost-effectiveness of remote sensing systems, (4) investigate additional problems having potential remote sensing solutions, and (5) apply the existing and developing technology for specific users and document and transfer that technology to the remote sensing community.

  2. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  3. The Mars observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Soulanille, T.; Ravine, M.

    1987-01-01

    A camera designed to operate under the extreme constraints of the Mars Observer Mission was selected by NASA in April, 1986. Contingent upon final confirmation in mid-November, the Mars Observer Camera (MOC) will begin acquiring images of the surface and atmosphere of Mars in September-October 1991. The MOC incorporates both a wide angle system for low resolution global monitoring and intermediate resolution regional targeting, and a narrow angle system for high resolution selective surveys. Camera electronics provide control of image clocking and on-board, internal editing and buffering to match whatever spacecraft data system capabilities are allocated to the experiment. The objectives of the MOC experiment follow.

  4. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  5. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  6. Interferometry based multispectral photon-limited 2D and 3D integral image encryption employing the Hartley transform.

    PubMed

    Muniraj, Inbarasan; Guo, Changliang; Lee, Byung-Geun; Sheridan, John T

    2015-06-15

    We present a method of securing multispectral 3D photon-counted integral imaging (PCII) using classical Hartley Transform (HT) based encryption employing optical interferometry. This method has the simultaneous advantages of minimizing complexity, by eliminating the need for holography recording, and of addressing the phase sensitivity problem encountered when using digital cameras. These advantages, together with single-channel multispectral 3D data compactness and the inherent properties of the classical photon-counting detection model, i.e., sparse sensing and the capability for nonlinear transformation, permit better authentication of the retrieved 3D scene at various depth cues. Furthermore, the proposed technique works for both spatially and temporally incoherent illumination. To validate the proposed technique, simulations were carried out for both the 2D and 3D cases. Experimental data are processed and the results support the feasibility of the encryption method.
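
    The Hartley-domain step can be illustrated with a minimal sketch, assuming a toy intensity image, Poisson photon counting, and a random amplitude mask; this is not the paper's interferometric PCII pipeline, only the transform mechanics.

```python
# Minimal sketch (not the paper's interferometric scheme): photon-counted
# acquisition modeled as a Poisson process, followed by amplitude-mask
# encryption in the Hartley domain. The 2D discrete Hartley transform is
# computed from the FFT as Re(F) - Im(F) and is its own inverse up to 1/(M*N).
import numpy as np

def dht2(x):
    F = np.fft.fft2(x)
    return F.real - F.imag

rng = np.random.default_rng(1)
scene = rng.random((64, 64)) * 5.0               # toy intensity image
photon_img = rng.poisson(scene).astype(float)    # photon-limited measurement

mask = rng.random(photon_img.shape) + 0.5        # random amplitude key (kept secret)
cipher = dht2(photon_img * mask)                 # encrypt

decrypted = dht2(cipher) / cipher.size / mask    # DHT applied twice returns M*N times the input
print(np.allclose(decrypted, photon_img))        # True
```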

  7. Development of a Miniature Snapshot Multispectral Imager

    DTIC Science & Technology

    2010-09-01

    imaging results. A main motivation behind development of such a compact imager is to be able to detect chemicals used in improvised explosive devices (IEDs). Subject terms: Fabry-Perot filter, multispectral, SWIR, microlens optics, IED detection. ...or prism, a filter wheel, a diffractive optic lens, a Fabry-Perot (F-P) etalon, or a tunable filter. All of these optical devices are used with a

  8. Multi-spectral photoacoustic elasticity tomography

    PubMed Central

    Liu, Yubin; Yuan, Zhen

    2016-01-01

    The goal of this work was to develop and validate a spectrally resolved photoacoustic imaging method, namely multi-spectral photoacoustic elasticity tomography (PAET) for quantifying the physiological parameters and elastic modulus of biological tissues. We theoretically and experimentally examined the PAET imaging method using simulations and in vitro experimental tests. Our simulation and in vitro experimental results indicated that the reconstructions were quantitatively accurate in terms of sizes, the physiological and elastic properties of the targets. PMID:27699101

  9. Material Characterization using Passive Multispectral Polarimetric Imagery

    DTIC Science & Technology

    2013-03-01

    wavelength, due to the tendency of all materials to polarize scattered light very weakly in that regime. The derivative would be near zero for metals and...

  10. Do speed cameras reduce collisions?

    PubMed

    Skubic, Jeffrey; Johnson, Steven B; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

    2013-01-01

    We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods: before cameras were placed, while cameras were in place, and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not independently affect the incidence of motor vehicle collisions.

  11. A multispectral imaging approach for diagnostics of skin pathologies

    NASA Astrophysics Data System (ADS)

    Lihacova, Ilze; Derjabo, Aleksandrs; Spigulis, Janis

    2013-06-01

    A noninvasive multispectral imaging method was applied to the diagnostics of different skin pathologies such as nevus, basal cell carcinoma, and melanoma. A melanoma diagnostic parameter using three spectral bands (540 nm, 650 nm, and 950 nm) was developed and calculated for nevus, melanoma, and basal cell carcinoma. A simple multispectral diagnostic device was built and applied for skin assessment. The development and application of the multispectral diagnostic method are described further in this article.

  12. Image processing of underwater multispectral imagery

    USGS Publications Warehouse

    Zawada, D. G.

    2003-01-01

    Capturing in situ fluorescence images of marine organisms presents many technical challenges. The effects of the medium, as well as the particles and organisms within it, are intermixed with the desired signal. Methods for extracting and preparing the imagery for analysis are discussed in reference to a novel underwater imaging system called the low-light-level underwater multispectral imaging system (LUMIS). The instrument supports both uni- and multispectral collections, each of which is discussed in the context of an experimental application. In unispectral mode, LUMIS was used to investigate the spatial distribution of phytoplankton. A thin sheet of laser light (532 nm) induced chlorophyll fluorescence in the phytoplankton, which was recorded by LUMIS. Inhomogeneities in the light sheet led to the development of a beam-pattern-correction algorithm. Separating individual phytoplankton cells from a weak background fluorescence field required a two-step procedure consisting of edge detection followed by a series of binary morphological operations. In multispectral mode, LUMIS was used to investigate the bio-assay potential of fluorescent pigments in corals. Problems with the commercial optical-splitting device produced nonlinear distortions in the imagery. A tessellation algorithm, including an automated tie-point-selection procedure, was developed to correct the distortions. Only pixels corresponding to coral polyps were of interest for further analysis. Extraction of these pixels was performed by a dynamic global-thresholding algorithm.
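
    A generic version of the two-step cell-extraction procedure (edge detection followed by binary morphological operations) is sketched below with scikit-image; the filter, structuring element, and size threshold are assumed values for illustration, not the LUMIS processing chain.

```python
# Generic sketch of the two-step extraction described above: edge detection
# followed by binary morphology, then labeling of the surviving regions.
# Filter, structuring element, and size threshold are illustrative choices.
import numpy as np
from skimage import filters, morphology, measure

rng = np.random.default_rng(2)
frame = rng.normal(0.05, 0.01, (256, 256))           # weak background fluorescence
frame[100:110, 120:130] += 0.5                        # a bright "cell"

edges = filters.sobel(frame)                          # step 1: edge detection
edge_mask = edges > filters.threshold_otsu(edges)

closed = morphology.binary_closing(edge_mask, morphology.disk(2))   # step 2: morphology
cleaned = morphology.remove_small_objects(closed, min_size=20)

labels = measure.label(cleaned)
print("regions found:", labels.max())
```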

  13. Comparison of multispectral remote-sensing techniques for monitoring subsurface drain conditions. [Imperial Valley, California

    NASA Technical Reports Server (NTRS)

    Goettelman, R. C.; Grass, L. B.; Millard, J. P.; Nixon, P. R.

    1983-01-01

    The following multispectral remote-sensing techniques were compared to determine the most suitable method for routinely monitoring agricultural subsurface drain conditions: airborne scanning, covering the visible through thermal-infrared (IR) portions of the spectrum; color-IR photography; and natural-color photography. Color-IR photography was determined to be the best approach, from the standpoint of both cost and information content. Aerial monitoring of drain conditions for early warning of tile malfunction appears practical. With careful selection of season and rain-induced soil-moisture conditions, extensive regional surveys are possible. Certain locations, such as the Imperial Valley, Calif., are precluded from regional monitoring because of year-round crop rotations and soil stratification conditions. Here, farms with similar crops could time local coverage for bare-field and saturated-soil conditions.

  14. JORNEX: An airborne campaign to quantify rangeland vegetation change and plant community-atmospheric interactions

    SciTech Connect

    Ritchie, J.C.; Rango, A.; Kustas, W.P.

    1996-11-01

    The Jornada Experimental Range in New Mexico provides a unique opportunity to integrate hydrologic-atmospheric fluxes and surface states, vegetation types, cover, and distribution, and vegetation response to changes in hydrologic states and atmospheric driving forces. The Jornada Range is the site of a long-term ecological research program to investigate the processes leading to desertification. In concert with ongoing ground measurements, remotely sensed data are being collected from ground, airborne, and satellite platforms during JORNEX (the JORNada Experiment) to provide the spatial and temporal distribution of vegetation state, using laser altimeter and multispectral aircraft and satellite data, and surface energy balance estimates from a combination of parameters and state variables derived from remotely sensed data. These measurements will be used as inputs to models to quantify the hydrologic budget and the plant response to changes in components in the water and energy balance. Intensive three-day study periods for ground and airborne campaigns were conducted in May 1995 (dry season), September 1995 (wet season), and February 1996 (winter), and are planned for the wet and dry seasons of 1996. An airborne platform is being used to collect thermal, multispectral, 3-band video, and laser altimetry profile data. Bowen ratio-energy balance stations were established in shrub and grass communities in May 1995 and are collecting data continuously. Additional energy flux measurements were made using eddy correlation techniques during the September 1995 campaign. Ground-based measurements during the intensive campaigns include thermal and multispectral measurements made using yoke-based platforms and hand-held instruments, LAI, and other vegetation data. Ground and aircraft measurements are acquired during Landsat overpasses so the effect of scale on measurements can be studied. This paper discusses preliminary results from the 1995 airborne campaign. 24 refs., 13 figs., 1 tab.

  15. A comparison of real and simulated airborne multisensor imagery

    NASA Astrophysics Data System (ADS)

    Bloechl, Kevin; De Angelis, Chris; Gartley, Michael; Kerekes, John; Nance, C. Eric

    2014-06-01

    This paper presents a methodology and results for the comparison of simulated imagery to real imagery acquired with multiple sensors hosted on an airborne platform. The dataset includes aerial multi- and hyperspectral imagery with spatial resolutions of one meter or less. The multispectral imagery includes data from an airborne sensor with three-band visible color and calibrated radiance imagery in the long-, mid-, and short-wave infrared. The airborne hyperspectral imagery includes 360 bands of calibrated radiance and reflectance data spanning 400 to 2450 nm in wavelength. Collected in September 2012, the imagery is of a park in Avon, NY, and includes a dirt track and areas of grass, gravel, forest, and agricultural fields. A number of artificial targets were deployed in the scene prior to collection for purposes of target detection, subpixel detection, spectral unmixing, and 3D object recognition. A synthetic reconstruction of the collection site was created in DIRSIG, an image generation and modeling tool developed by the Rochester Institute of Technology, based on ground-measured reflectance data, ground photography, and previous airborne imagery. Simulated airborne images were generated using the scene model, time of observation, estimates of the atmospheric conditions, and approximations of the sensor characteristics. The paper provides a comparison between the empirical and simulated images, including a comparison of achieved performance for classification, detection and unmixing applications. It was found that several differences exist due to the way the image is generated, including finite sampling and incomplete knowledge of the scene, atmospheric conditions and sensor characteristics. The lessons learned from this effort can be used in constructing future simulated scenes and further comparisons between real and simulated imagery.

  16. Comparison of Hyperspectral and Multispectral Satellites for Discriminating Land Cover in Northern California

    NASA Astrophysics Data System (ADS)

    Clark, M. L.; Kilham, N. E.

    2015-12-01

    Land-cover maps are important science products needed for natural resource and ecosystem service management, biodiversity conservation planning, and assessing human-induced and natural drivers of land change. Most land-cover maps at regional to global scales are produced with remote sensing techniques applied to multispectral satellite imagery with 30-500 m pixel sizes (e.g., Landsat, MODIS). Hyperspectral, or imaging spectrometer, imagery measuring the visible to shortwave infrared regions (VSWIR) of the spectrum have shown impressive capacity to map plant species and coarser land-cover associations, yet techniques have not been widely tested at regional and greater spatial scales. The Hyperspectral Infrared Imager (HyspIRI) mission is a VSWIR hyperspectral and thermal satellite being considered for development by NASA. The goal of this study was to assess multi-temporal, HyspIRI-like satellite imagery for improved land cover mapping relative to multispectral satellites. We mapped FAO Land Cover Classification System (LCCS) classes over 22,500 km2 in the San Francisco Bay Area, California using 30-m HyspIRI, Landsat 8 and Sentinel-2 imagery simulated from data acquired by NASA's AVIRIS airborne sensor. Random Forests (RF) and Multiple-Endmember Spectral Mixture Analysis (MESMA) classifiers were applied to the simulated images and accuracies were compared to those from real Landsat 8 images. The RF classifier was superior to MESMA, and multi-temporal data yielded higher accuracy than summer-only data. With RF, hyperspectral data had overall accuracy of 72.2% and 85.1% with full 20-class and reduced 12-class schemes, respectively. Multispectral imagery had lower accuracy. For example, simulated and real Landsat data had 7.5% and 4.6% lower accuracy than HyspIRI data with 12 classes, respectively. In summary, our results indicate increased mapping accuracy using HyspIRI multi-temporal imagery, particularly in discriminating different natural vegetation types, such as
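
    The per-pixel Random Forests step can be sketched with scikit-learn; the band count, class list, forest size, and synthetic spectra below are placeholders rather than the study's configuration.

```python
# Illustrative per-pixel Random Forest classification of (simulated)
# multispectral/hyperspectral data; band count and classes are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
n_bands = 30
classes = ["forest", "grass", "urban", "water"]

# Toy spectra: each class gets a distinct mean spectrum plus noise.
X = np.vstack([rng.normal(loc=i * 10 + 20, scale=5, size=(300, n_bands))
               for i, _ in enumerate(classes)])
y = np.repeat(classes, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", accuracy_score(y_te, rf.predict(X_te)))
```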

  17. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, describing briefly the hardware involved and the data obtained in their various applications. The Advanced Development Group, Defense Sciences Engineering Division, has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  18. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting, micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  19. The Complementary Pinhole Camera.

    ERIC Educational Resources Information Center

    Bissonnette, D.; And Others

    1991-01-01

    Presents an experiment based on the principles of rectilinear motion of light operating in a pinhole camera that projects the image of an illuminated object through a small hole in a sheet to an image screen. (MDH)

  20. Development of a multispectral imagery device devoted to weed detection

    NASA Astrophysics Data System (ADS)

    Vioix, Jean-Baptiste; Douzals, Jean-Paul; Truchetet, Frederic; Navar, Pierre

    2003-04-01

    Multispectral imagery is a large domain with a number of practical applications: thermography, quality control in industry, food science, agronomy, etc. The main interest is to obtain spectral information on objects whose reflectance signal can be associated with physical, chemical and/or biological properties. Agronomic applications of multispectral imagery generally involve the acquisition of several images at visible and near-infrared wavelengths. This paper first presents the different kinds of multispectral devices used for agronomic issues and then introduces an original multispectral design based on a single CCD. Finally, early results obtained for weed detection are presented.
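
    The paper's detection algorithm is not spelled out in the abstract; as an assumed illustration, a common first step with visible and near-infrared acquisitions is a vegetation index such as NDVI to separate plants (crop and weeds) from soil before further discrimination.

```python
# Common first step when visible and near-infrared images are available
# (an assumption here, not necessarily the authors' algorithm): compute NDVI
# to separate vegetation (crop + weeds) from the soil background.
import numpy as np

rng = np.random.default_rng(4)
red = rng.uniform(0.05, 0.30, (200, 200))    # red-band reflectance (toy)
nir = rng.uniform(0.10, 0.60, (200, 200))    # NIR-band reflectance (toy)

ndvi = (nir - red) / (nir + red + 1e-9)
vegetation_mask = ndvi > 0.4                 # illustrative threshold

print("vegetation fraction:", vegetation_mask.mean())
```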

  1. The Multispectral Imaging Science Working Group. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Cox, S. C. (Editor)

    1982-01-01

    The status and technology requirements for using multispectral sensor imagery in geographic, hydrologic, and geologic applications are examined. Critical issues in image and information science are identified.

  2. Airborne Next: Rethinking Airborne Organization and Applying New Concepts

    DTIC Science & Technology

    2015-06-01

    structures since its employment on a large scale during World War II. It is puzzling to consider how little airborne organizational structures and employment...future potential of airborne concepts by rethinking traditional airborne organizational structures and employment concepts. Using a holistic approach in... structures of airborne forces to model a “small and many” approach over a “large and few” approach, while incorporating a “swarming” concept. Utilizing

  3. Combination of multispectral remote sensing, variable rate technology and environmental modeling for citrus pest management.

    PubMed

    Du, Qian; Chang, Ni-Bin; Yang, Chenghai; Srilakshmi, Kanth R

    2008-01-01

    The Lower Rio Grande Valley (LRGV) of south Texas is an agriculturally rich area supporting intensive production of vegetables, fruits, grain sorghum, and cotton. Modern agricultural practices involve the combined use of irrigation with the application of large amounts of agrochemicals to maximize crop yields. Intensive agricultural activities in past decades may have contaminated soil, surface water, and groundwater through leaching of pesticides in the vadose zone. In an effort to promote precision farming in citrus production, this paper aims at developing an airborne multispectral technique for identifying tree health problems in a citrus grove that can be combined with variable rate technology (VRT) for required pesticide application and environmental modeling for assessment of pollution prevention. An unsupervised linear unmixing method was applied to classify the image for the grove and quantify the symptom severity for appropriate infection control. The PRZM-3 model was used to estimate environmental impacts that contribute to nonpoint source pollution with and without the use of multispectral remote sensing and VRT. Research findings using site-specific environmental assessment clearly indicate that combining remote sensing and VRT may benefit the environment by reducing nonpoint source pollution by 92.15%. Overall, this study demonstrates the potential of precision farming for citrus production in the nexus of industrial ecology and agricultural sustainability.
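
    The per-pixel linear unmixing idea can be sketched with non-negative least squares; the endmember spectra and abundances below are synthetic placeholders, and the study's own method was unsupervised rather than the supervised form shown.

```python
# Per-pixel linear spectral unmixing sketch using non-negative least squares.
# Endmember spectra ("healthy canopy", "stressed canopy", "soil") are synthetic.
import numpy as np
from scipy.optimize import nnls

endmembers = np.array([
    [0.05, 0.08, 0.04, 0.45, 0.50, 0.48],   # healthy canopy
    [0.10, 0.12, 0.10, 0.30, 0.32, 0.30],   # stressed canopy
    [0.20, 0.25, 0.28, 0.33, 0.35, 0.36],   # bare soil
]).T                                         # shape: (bands, endmembers)

# A mixed pixel: 60% healthy, 30% stressed, 10% soil, plus a small offset.
pixel = endmembers @ np.array([0.6, 0.3, 0.1]) + 0.005

abundances, residual = nnls(endmembers, pixel)
print("estimated abundances:", np.round(abundances / abundances.sum(), 2))
```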

  4. A multispectral automatic target recognition application for maritime surveillance, search, and rescue

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon; Reed, Scott; Podobna, Yuliya; Vazquez, Jose; Boucher, Cynthia

    2010-04-01

    Due to increased security concerns, the commitment to monitor and maintain security in the maritime environment is increasingly a priority. A country's coast is the most vulnerable area for the incursion of illegal immigrants, terrorists and contraband. This work illustrates the ability of a low-cost, light-weight, multi-spectral, multi-channel imaging system to handle the environment and see under difficult marine conditions. The system and its implemented detecting and tracking technologies should be organic to the maritime homeland security community for search and rescue, fisheries, defense, and law enforcement. It is tailored for airborne and ship based platforms to detect, track and monitor suspected objects (such as semi-submerged targets like marine mammals, vessels in distress, and drug smugglers). In this system, automated detection and tracking technology is used to detect, classify and localize potential threats or objects of interest within the imagery provided by the multi-spectral system. These algorithms process the sensor data in real time, thereby providing immediate feedback when features of interest have been detected. A supervised detection system based on Haar features and Cascade Classifiers is presented and results are provided on real data. The system is shown to be extendable and reusable for a variety of different applications.
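
    Detection with Haar features and cascade classifiers is commonly run through OpenCV; in the sketch below the bundled face cascade stands in for a maritime-target cascade (which would be trained separately, e.g., with opencv_traincascade), and the input frame is synthetic.

```python
# Sketch of the Haar-feature cascade detection step with OpenCV. A cascade
# trained on the maritime targets of interest would be loaded in practice;
# the bundled face cascade is used only so the snippet runs as-is.
import cv2
import numpy as np

cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
cascade = cv2.CascadeClassifier(cascade_path)      # stand-in for a maritime-target cascade

frame = np.zeros((480, 640, 3), dtype=np.uint8)    # placeholder for one imaging channel
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
for (x, y, w, h) in detections:                    # draw boxes around detections
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
print("detections:", len(detections))
```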

  5. Analysis of multispectral signatures and investigation of multi-aspect remote sensing techniques

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Hieber, R. H.; Sarno, J. E.

    1974-01-01

    Two major aspects of remote sensing with multispectral scanners (MSS) are investigated. The first, multispectral signature analysis, includes the effects on classification performance of systematic variations found in the average signals received from various ground covers as well as the prediction of these variations with theoretical models of physical processes. The foremost effects studied are those associated with the time of day at which airborne MSS data are collected. Six data collection runs made over the same flight line in a period of five hours are analyzed; it is found that the time span significantly affects classification performance. Variations associated with scan angle also are studied. The second major topic of discussion is multi-aspect remote sensing, a new concept in remote sensing with scanners. Here, data are collected on multiple passes by a scanner that can be tilted to scan forward of the aircraft at different angles on different passes. The use of such spatially registered data to achieve improved classification of agricultural scenes is investigated and found promising. Also considered are the possibilities of extracting, from multi-aspect data, information on the condition of corn canopies and the stand characteristics of forests.

  6. Monitoring Geothermal Features in Yellowstone National Park with ATLAS Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Berglund, Judith

    2000-01-01

    The National Park Service (NPS) must produce an Environmental Impact Statement for each proposed development in the vicinity of known geothermal resource areas (KGRAs) in Yellowstone National Park. In addition, the NPS monitors indicator KGRAs for environmental quality and is still in the process of mapping many geothermal areas. The NPS currently maps geothermal features with field survey techniques. High resolution aerial multispectral remote sensing in the visible, NIR, SWIR, and thermal spectral regions could enable YNP geothermal features to be mapped more quickly and in greater detail. In response, Yellowstone Ecosystems Studies, in partnership with NASA's Commercial Remote Sensing Program, is conducting a study on the use of Airborne Terrestrial Applications Sensor (ATLAS) multispectral data for monitoring geothermal features in the Upper Geyser Basin. ATLAS data were acquired at 2.5 meter resolution on August 17, 2000. These data were processed into land cover classifications and relative temperature maps. For sufficiently large features, the ATLAS data can map geothermal areas in terms of geyser pools and hot springs, plus multiple categories of geothermal runoff that are apparently indicative of temperature gradients and microbial matting communities. In addition, the ATLAS maps clearly identify geyserite areas. The thermal bands contributed to classification success and to the computation of relative temperature. With masking techniques, one can assess the influence of geothermal features on the Firehole River. Preliminary results appear to confirm ATLAS data utility for mapping and monitoring geothermal features. Future work will include classification refinement and additional validation.

  7. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  8. Neutron cameras for ITER

    SciTech Connect

    Johnson, L.C.; Barnes, C.W.; Batistoni, P.

    1998-12-31

    Neutron cameras with horizontal and vertical views have been designed for ITER, based on systems used on JET and TFTR. The cameras consist of fan-shaped arrays of collimated flight tubes, with suitably chosen detectors situated outside the biological shield. The sight lines view the ITER plasma through slots in the shield blanket and penetrate the vacuum vessel, cryostat, and biological shield through stainless steel windows. This paper analyzes the expected performance of several neutron camera arrangements for ITER. In addition to the reference designs, the authors examine proposed compact cameras, in which neutron fluxes are inferred from ¹⁶N decay gammas in dedicated flowing water loops, and conventional cameras with fewer sight lines and more limited fields of view than in the reference designs. It is shown that the spatial sampling provided by the reference designs is sufficient to satisfy target measurement requirements and that some reduction in field of view may be permissible. The accuracy of measurements with ¹⁶N-based compact cameras is not yet established, and they fail to satisfy requirements for parameter range and time resolution by large margins.

  9. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  10. Multispectral imaging of the ocular fundus using light emitting diode illumination.

    PubMed

    Everdell, N L; Styles, I B; Calcagni, A; Gibson, J; Hebden, J; Claridge, E

    2010-09-01

    We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.

  11. Multispectral imaging of the ocular fundus using light emitting diode illumination

    NASA Astrophysics Data System (ADS)

    Everdell, N. L.; Styles, I. B.; Calcagni, A.; Gibson, J.; Hebden, J.; Claridge, E.

    2010-09-01

    We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.

  12. Computationally efficient target classification in multispectral image data with Deep Neural Networks

    NASA Astrophysics Data System (ADS)

    Cavigelli, Lukas; Bernath, Dominic; Magno, Michele; Benini, Luca

    2016-10-01

    Detecting and classifying targets in video streams from surveillance cameras is a cumbersome, error-prone and expensive task. Often, the incurred costs are prohibitive for real-time monitoring. This leads to data being stored locally or transmitted to a central storage site for post-incident examination. The required communication links and archiving of the video data are still expensive and this setup excludes preemptive actions to respond to imminent threats. An effective way to overcome these limitations is to build a smart camera that analyzes the data on-site, close to the sensor, and transmits alerts when relevant video sequences are detected. Deep neural networks (DNNs) have come to outperform humans in visual classification tasks and are also performing exceptionally well on other computer vision tasks. The concept of DNNs and Convolutional Networks (ConvNets) can easily be extended to make use of higher-dimensional input data such as multispectral data. We explore this opportunity in terms of achievable accuracy and required computational effort. To analyze the precision of DNNs for scene labeling in an urban surveillance scenario we have created a dataset with 8 classes obtained in a field experiment. We combine an RGB camera with a 25-channel VIS-NIR snapshot sensor to assess the potential of multispectral image data for target classification. We evaluate several new DNNs, showing that the spectral information fused together with the RGB frames can be used to improve the accuracy of the system or to achieve similar accuracy with a 3x smaller computation effort. We achieve a very high per-pixel accuracy of 99.1%. Even for scarcely occurring, but particularly interesting classes, such as cars, 75% of the pixels are labeled correctly with errors occurring only around the border of the objects. This high accuracy was obtained with a training set of only 30 labeled images, paving the way for fast adaptation to various application scenarios.
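
    Extending a ConvNet to the fused 3 + 25 = 28 input channels amounts to widening the first convolution; the minimal PyTorch sketch below does per-pixel labeling over 8 classes, with layer widths that are illustrative rather than the networks evaluated in the paper.

```python
# Minimal per-pixel scene-labeling ConvNet taking 28 input channels
# (3 RGB + 25 VIS-NIR); layer widths are illustrative only.
import torch
import torch.nn as nn

class MultispectralSegNet(nn.Module):
    def __init__(self, in_channels=28, n_classes=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Conv2d(64, n_classes, kernel_size=1)  # per-pixel logits

    def forward(self, x):
        return self.classifier(self.features(x))

net = MultispectralSegNet()
frames = torch.randn(2, 28, 128, 128)          # batch of fused RGB + VIS-NIR tiles
logits = net(frames)                           # (2, 8, 128, 128)
labels = logits.argmax(dim=1)                  # per-pixel class map
print(labels.shape)
```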

  13. CNR LARA project, Italy: Airborne laboratory for environmental research

    NASA Technical Reports Server (NTRS)

    Bianchi, R.; Cavalli, R. M.; Fiumi, L.; Marino, C. M.; Pignatti, S.

    1995-01-01

    The increasing interest in environmental problems and in the study of the environmental impact of anthropic activity has expanded remote sensing applications. The Italian National Research Council (CNR) established a new laboratory for airborne hyperspectral imaging, the LARA Project (Laboratorio Aero per Ricerche Ambientali - Airborne Laboratory for Environmental Research), equipping its airborne laboratory, a CASA-212, mainly with the Daedalus AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) instrument. MIVIS's channels, spectral bandwidths, and locations are chosen to meet the needs of scientific research for advanced applications of remote sensing data. MIVIS can make significant contributions to solving problems in many diverse areas such as geologic exploration, land use studies, mineralogy, agricultural crop studies, energy loss analysis, pollution assessment, volcanology, forest fire management and others. The broad spectral range and the many discrete narrow channels of MIVIS provide a fine quantization of spectral information that permits accurate definition of absorption features from a variety of materials, allowing the extraction of chemical and physical information of our environment. The availability of such a hyperspectral imager, that will operate mainly in the Mediterranean area, at present represents a unique opportunity for those who are involved in environmental studies and land-management to collect systematically large-scale and high spectral-spatial resolution data of this part of the world. Nevertheless, MIVIS deployments will touch other parts of the world, where a major interest from the international scientific community is present.

  14. Excitation spectroscopy in multispectral optical fluorescence tomography: methodology, feasibility and computer simulation studies

    NASA Astrophysics Data System (ADS)

    Chaudhari, Abhijit J.; Ahn, Sangtae; Levenson, Richard; Badawi, Ramsey D.; Cherry, Simon R.; Leahy, Richard M.

    2009-08-01

    Molecular probes used for in vivo optical fluorescence tomography (OFT) studies in small animals are typically chosen such that their emission spectra lie in the 680-850 nm wavelength range. This is because tissue attenuation in this spectral band is relatively low, allowing optical photons even from deep sites in tissue to reach the animal surface and consequently be detected by a CCD camera. The wavelength dependence of tissue optical properties within the 680-850 nm band can be exploited for emitted light by measuring fluorescent data via multispectral approaches and incorporating the spectral dependence of these optical properties into the OFT inverse problem—that of reconstructing underlying 3D fluorescent probe distributions from optical data collected on the animal surface. However, in the aforementioned spectral band, due to only small variations in the tissue optical properties, multispectral emission data, though superior for image reconstruction compared to achromatic data, tend to be somewhat redundant. A different spectral approach for OFT is to capitalize on the larger variations in the optical properties of tissue for excitation photons than for the emission photons by using excitation at multiple wavelengths as a means of decoding source depth in tissue. The full potential of spectral approaches in OFT can be realized by a synergistic combination of these two approaches, that is, exciting the underlying fluorescent probe at multiple wavelengths and measuring emission data multispectrally. In this paper, we describe a method that incorporates both excitation and emission spectral information into the OFT inverse problem. We describe a linear algebraic formulation of the multiple wavelength illumination-multispectral detection forward model for OFT and compare it to models that use only excitation at multiple wavelengths or those that use only multispectral detection techniques. This study is carried out in a realistic inhomogeneous mouse atlas
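
    The combined spectral forward model can be pictured as one sensitivity block per excitation-wavelength/emission-band pair stacked into a single linear system; the sketch below uses random placeholder matrices and plain Tikhonov-regularized least squares, not a real photon-propagation model or the authors' reconstruction algorithm.

```python
# Schematic of the combined spectral forward model: one sensitivity block per
# (excitation wavelength, emission band) pair, stacked into a single linear
# system A x = b and solved here with Tikhonov-regularized least squares.
# All matrices are random placeholders, not a real photon-propagation model.
import numpy as np

rng = np.random.default_rng(5)
n_voxels = 500
n_detectors = 120
excitations = 3          # number of excitation wavelengths
emission_bands = 4       # number of multispectral detection bands

blocks, data = [], []
for i in range(excitations):
    for j in range(emission_bands):
        A_ij = rng.random((n_detectors, n_voxels)) * 1e-3   # placeholder sensitivity block
        blocks.append(A_ij)
        data.append(A_ij @ np.ones(n_voxels))               # synthetic measurements

A = np.vstack(blocks)                      # (excitations * bands * detectors, voxels)
b = np.concatenate(data)

lam = 1e-3                                 # Tikhonov regularization weight
x = np.linalg.solve(A.T @ A + lam * np.eye(n_voxels), A.T @ b)
print(A.shape, x.shape)
```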

  15. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the Vector Quantization algorithm was further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS), with an RMS error of 15.8 pixels was 195:1 (0.041 bpp) and with an RMS error of 3.6 pixels was 18:1 (0.447 bpp). The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
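
    The core vector-quantization step, in which each vector stacks the co-located pixel values from all channels, can be sketched with SciPy's clustering routines; the codebook size, channel count, and synthetic data below are illustrative, and the Huffman stage is omitted.

```python
# Sketch of multispectral vector quantization: each "vector" is the stack of
# co-located pixel values from all channels; a k-means codebook replaces every
# vector by its nearest codeword. Codebook size and channel count are illustrative.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(6)
channels, rows, cols = 7, 64, 64
cube = rng.random((channels, rows, cols))

vectors = cube.reshape(channels, -1).T                 # (rows*cols, channels)
codebook, codes = kmeans2(vectors, 64, minit='++')     # 64-entry codebook

reconstructed = codebook[codes].T.reshape(channels, rows, cols)
rms = np.sqrt(np.mean((cube - reconstructed) ** 2))
print("codebook:", codebook.shape, "RMS error:", round(rms, 4))
```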

  16. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking at a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm3. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. A low-volume array of such penetrator cameras could be deployed from an

  17. The VISTA IR camera

    NASA Astrophysics Data System (ADS)

    Dalton, Gavin B.; Caldwell, Martin; Ward, Kim; Whalley, Martin S.; Burke, Kevin; Lucas, John M.; Richards, Tony; Ferlet, Marc; Edeson, Ruben L.; Tye, Daniel; Shaughnessy, Bryan M.; Strachan, Mel; Atad-Ettedgui, Eli; Leclerc, Melanie R.; Gallie, Angus; Bezawada, Nagaraja N.; Clark, Paul; Bissonauth, Nirmal; Luke, Peter; Dipper, Nigel A.; Berry, Paul; Sutherland, Will; Emerson, Jim

    2004-09-01

    The VISTA IR Camera has now completed its detailed design phase and is on schedule for delivery to ESO's Cerro Paranal Observatory in 2006. The camera consists of 16 Raytheon VIRGO 2048x2048 HgCdTe arrays in a sparse focal plane sampling a 1.65 degree field of view. A 1.4m diameter filter wheel provides slots for 7 distinct science filters, each comprising 16 individual filter panes. The camera also provides autoguiding and curvature sensing information for the VISTA telescope, and relies on tight tolerancing to meet the demanding requirements of the f/1 telescope design. The VISTA IR camera is unusual in that it contains no cold pupil-stop, but rather relies on a series of nested cold baffles to constrain the light reaching the focal plane to the science beam. In this paper we present a complete overview of the status of the final IR Camera design, its interaction with the VISTA telescope, and a summary of the predicted performance of the system.

  18. THE DARK ENERGY CAMERA

    SciTech Connect

    Flaugher, B.; Diehl, H. T.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Buckley-Geer, E. J.; Honscheid, K.; Abbott, T. M. C.; Bonati, M.; Antonik, M.; Brooks, D.; Ballester, O.; Cardiel-Sas, L.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Boprie, D.; Campa, J.; Castander, F. J.; Collaboration: DES Collaboration; and others

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  19. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  20. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66 ) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  1. The Dark Energy Camera

    DOE PAGES

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  2. Neutron counting with cameras

    SciTech Connect

    Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo

    2015-07-01

    A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transits smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, as well as increased frame rate up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, as such allowing for discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras doesn't allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is next confronted with the first experimental results. (authors)
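
    The counting mode described, individual conversion events sitting well above the camera noise floor, can be mimicked with a simple threshold-plus-connected-components pass over each frame; the noise level, event brightness, and threshold below are invented for illustration.

```python
# Toy illustration of counting individual neutron events in a camera frame:
# threshold well above the noise floor, then count connected bright regions.
# Noise level, event brightness, and threshold are invented for illustration.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
frame = rng.normal(100.0, 5.0, (512, 512))        # camera noise floor (ADU)

for _ in range(25):                               # deposit 25 bright "events"
    r, c = rng.integers(5, 507, size=2)
    frame[r - 2:r + 3, c - 2:c + 3] += 400.0      # scintillation light blob

event_mask = frame > 250.0                        # threshold far above the noise
labeled, n_events = ndimage.label(event_mask)
print("events counted:", n_events)
```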

  3. Reproducible high-resolution multispectral image acquisition in dermatology

    NASA Astrophysics Data System (ADS)

    Duliu, Alexandru; Gardiazabal, José; Lasser, Tobias; Navab, Nassir

    2015-07-01

    Multispectral image acquisitions are increasingly popular in dermatology, due to their improved spectral resolution which enables better tissue discrimination. Most applications however focus on restricted regions of interest, imaging only small lesions. In this work we present and discuss an imaging framework for high-resolution multispectral imaging on large regions of interest.

  4. Multispectral data compression through transform coding and block quantization

    NASA Technical Reports Server (NTRS)

    Ready, P. J.; Wintz, P. A.

    1972-01-01

    Transform coding and block quantization techniques are applied to multispectral aircraft scanner data and digitized satellite imagery. The multispectral source is defined and an appropriate mathematical model proposed. The Karhunen-Loeve, Fourier, and Hadamard encoders are considered and are compared to the rate distortion function for the equivalent Gaussian source and to the performance of the single sample PCM encoder.
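
    A minimal sketch of block transform coding is shown below, using an 8x8 Hadamard transform with uniform coefficient quantization as a stand-in for the bit-allocation schemes compared in the paper; block size and step size are illustrative.

```python
# Minimal sketch of block transform coding: 8x8 Hadamard transform followed by
# uniform quantization of the coefficients (block size and step are illustrative).
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(8)
band = rng.random((64, 64)) * 255.0                  # one channel of a scene (toy)
H = hadamard(8).astype(float)                        # symmetric, H @ H = 8 * I
step = 16.0                                          # uniform quantizer step size

recon = np.zeros_like(band)
for i in range(0, 64, 8):
    for j in range(0, 64, 8):
        block = band[i:i + 8, j:j + 8]
        coeffs = H @ block @ H / 8                   # forward 2D Hadamard transform
        quantized = np.round(coeffs / step) * step   # uniform block quantization
        recon[i:i + 8, j:j + 8] = H @ quantized @ H / 8   # inverse transform

print("RMS reconstruction error:", round(np.sqrt(np.mean((band - recon) ** 2)), 2))
```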

  5. A multispectral method of determining sea surface temperatures

    NASA Technical Reports Server (NTRS)

    Shenk, W. E.

    1972-01-01

    A multispectral method for determining sea surface temperatures is discussed. The specifications of the equipment and the atmospheric conditions required for successful multispectral data acquisition are described. Examples of data obtained in the North Atlantic Ocean are presented. The differences between the actual sea surface temperatures and the equivalent blackbody temperatures as determined by a radiometer are plotted.

  6. International Symposium on Airborne Geophysics

    NASA Astrophysics Data System (ADS)

    Mogi, Toru; Ito, Hisatoshi; Kaieda, Hideshi; Kusunoki, Kenichiro; Saltus, Richard W.; Fitterman, David V.; Okuma, Shigeo; Nakatsuka, Tadashi

    2006-05-01

    Airborne geophysics can be defined as the measurement of Earth properties from sensors in the sky. The airborne measurement platform is usually a traditional fixed-wing airplane or helicopter, but could also include lighter-than-air craft, unmanned drones, or other specialty craft. The earliest history of airborne geophysics includes kite and hot-air balloon experiments. However, modern airborne geophysics dates from the mid-1940s when military submarine-hunting magnetometers were first used to map variations in the Earth's magnetic field. The current gamut of airborne geophysical techniques spans a broad range, including potential fields (both gravity and magnetics), electromagnetics (EM), radiometrics, spectral imaging, and thermal imaging.

  7. Airborne Remote Sensing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA imaging technology has provided the basis for a commercial agricultural reconnaissance service. AG-RECON furnishes information from airborne sensors, aerial photographs and satellite and ground databases to farmers, foresters, geologists, etc. This service produces color "maps" of Earth conditions, which enable clients to detect crop color changes or temperature changes that may indicate fire damage or pest stress problems.

  8. Recognizing Airborne Hazards.

    ERIC Educational Resources Information Center

    Schneider, Christian M.

    1990-01-01

    The heating, ventilating, and air conditioning (HVAC) systems in older buildings often do not adequately handle air-borne contaminants. Outlines a three-stage Indoor Air Quality (IAQ) assessment and describes a case in point at a Pittsburgh, Pennsylvania, school. (MLF)

  9. Airborne asbestos in buildings.

    PubMed

    Lee, R J; Van Orden, D R

    2008-03-01

    The concentration of airborne asbestos in buildings nationwide is reported in this study. A total of 3978 indoor samples from 752 buildings, representing nearly 32 man-years of sampling, have been analyzed by transmission electron microscopy. The buildings that were surveyed were the subject of litigation related to suits alleging the general building occupants were exposed to a potential health hazard as a result of the presence of asbestos-containing materials (ACM). The average concentration of all airborne asbestos structures was 0.01 structures/ml (s/ml) and the average concentration of airborne asbestos ≥ 5 μm long was 0.00012 fibers/ml (f/ml). For all samples, 99.9% of the samples were <0.01 f/ml for fibers longer than 5 μm; no building averaged above 0.004 f/ml for fibers longer than 5 μm. No asbestos was detected in 27% of the buildings and in 90% of the buildings no asbestos was detected that would have been seen optically (≥ 5 μm long and ≥ 0.25 μm wide). Background outdoor concentrations have been reported at 0.0003 f/ml for fibers ≥ 5 μm. These results indicate that in-place ACM does not result in elevated airborne asbestos in building atmospheres approaching regulatory levels and that it does not result in a significantly increased risk to building occupants.

  10. Multispectral rock-type separation and classification.

    SciTech Connect

    Moya, Mary M.; Fogler, Robert Joseph; Paskaleva, Biliana; Hayat, Majeed M.

    2004-06-01

    This paper explores the possibility of separating and classifying remotely-sensed multispectral data from rocks and minerals into seven geological rock-type groups. These groups are extracted from the general categories of metamorphic, igneous and sedimentary rocks. The study is performed under ideal conditions for which the data is generated according to laboratory hyperspectral data for the members, which are, in turn, passed through the Multi-spectral Thermal Imager (MTI) filters yielding 15 bands. The main challenge in separability is the small size of the training data sets, which initially did not permit direct application of Bayesian decision theory. To enable Bayesian classification, the original training data is linearly perturbed with the addition of minerals, vegetation, soil, water and other valid impurities. As a result, the size of the training data is significantly increased and accurate estimates of the covariance matrices are achieved. In addition, a set of reduced (five) linearly-extracted canonical features that are optimal in providing the most important information about the data is determined. An alternative nonlinear feature-selection method is also employed based on spectral indices comprising a small subset of all possible ratios between bands. By applying three optimization strategies, combinations of two and three ratios are found that provide reliable separability and classification between all seven groups according to the Bhattacharyya distance. To set a benchmark to which the MTI capability in rock classification can be compared, an optimization strategy is performed for the selection of optimal multispectral filters, other than the MTI filters, and an improvement in classification is predicted.
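
    Since the abstract ranks band-ratio combinations by the Bhattacharyya distance between class models, a short hedged sketch of that distance for two multivariate Gaussian classes may help; the formula is the standard one and is not taken from the paper itself.

        # Minimal sketch: Bhattacharyya distance between two Gaussian classes,
        # usable to score the separability of a candidate feature combination.
        import numpy as np

        def bhattacharyya_distance(mu1, cov1, mu2, cov2):
            cov = 0.5 * (cov1 + cov2)
            diff = mu1 - mu2
            term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
            _, logdet = np.linalg.slogdet(cov)
            _, logdet1 = np.linalg.slogdet(cov1)
            _, logdet2 = np.linalg.slogdet(cov2)
            term2 = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
            return term1 + term2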

  11. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. This camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirrors and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, making a camera with such a short resolution time possible.

  12. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  13. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  14. Electronic still camera

    NASA Astrophysics Data System (ADS)

    Holland, S. Douglas

    1992-09-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  15. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  16. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Experiments conducted in the Atlantic coastal zone indicated that plumes resulting from ocean dumping of acid wastes and sewage sludge have unique spectral characteristics. Remotely sensed wide area synoptic coverage provided information on these pollution features that was not readily available from other sources. Aircraft remotely sensed photographic and multispectral scanner data were interpreted by two methods. First, qualitative analyses in which pollution features were located, mapped, and identified without concurrent sea truth and, second, quantitative analyses in which concurrently collected sea truth was used to calibrate the remotely sensed data and to determine quantitative distributions of one or more parameters in a plume.

  17. Mapping soil types from multispectral scanner data.

    NASA Technical Reports Server (NTRS)

    Kristof, S. J.; Zachary, A. L.

    1971-01-01

    Multispectral remote sensing and computer-implemented pattern recognition techniques were used for automatic 'mapping' of soil types. This approach involves subjective selection of a set of reference samples from a gray-level display of spectral variations which was generated by a computer. Each resolution element is then classified using a maximum likelihood ratio. Output is a computer printout on which the researcher assigns a different symbol to each class. Four soil test areas in Indiana were experimentally examined using this approach, and partially successful results were obtained.
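
    For readers unfamiliar with the maximum likelihood ratio step, the hedged Python sketch below classifies each multispectral resolution element against per-class Gaussian models estimated from reference samples; it illustrates the general approach rather than the paper's original implementation.

        # Minimal sketch: Gaussian maximum-likelihood classification of pixels
        # from a small set of labelled reference samples per class.
        import numpy as np

        def train_models(samples_by_class):
            """samples_by_class: dict class_name -> (N, B) training pixel array."""
            models = {}
            for name, s in samples_by_class.items():
                cov = np.cov(s, rowvar=False)
                models[name] = (s.mean(axis=0), np.linalg.inv(cov),
                                np.linalg.slogdet(cov)[1])
            return models

        def classify_pixel(pixel, models):
            """Assign the class with the largest Gaussian log-likelihood."""
            scores = {}
            for name, (mu, inv_cov, logdet) in models.items():
                d = pixel - mu
                scores[name] = -0.5 * (logdet + d @ inv_cov @ d)
            return max(scores, key=scores.get)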

  18. Multispectral image processing for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Carlotto, Mark J.; Lazaroff, Mark B.; Brennan, Mark W.

    1993-03-01

    New techniques are described for detecting environmental anomalies and changes using multispectral imagery. Environmental anomalies are areas that do not exhibit normal signatures due to man-made activities and include phenomena such as effluent discharges, smoke plumes, stressed vegetation, and deforestation. A new region-based processing technique is described for detecting these phenomena using Landsat TM imagery. Another algorithm that can detect the appearance or disappearance of environmental phenomena is also described and an example illustrating its use in detecting urban changes using SPOT imagery is presented.

  19. Multispectral image fusion using neural networks

    NASA Technical Reports Server (NTRS)

    Kagel, J. H.; Platt, C. A.; Donaven, T. W.; Samstad, E. A.

    1990-01-01

    A prototype system is being developed to demonstrate the use of neural network hardware to fuse multispectral imagery. This system consists of a neural network IC on a motherboard, a circuit card assembly, and a set of software routines hosted by a PC-class computer. Research in support of this consists of neural network simulations fusing 4 to 7 bands of Landsat imagery and fusing (separately) multiple bands of synthetic imagery. The simulations, results, and a description of the prototype system are presented.

  20. Multispectral-image fusion using neural networks

    NASA Astrophysics Data System (ADS)

    Kagel, Joseph H.; Platt, C. A.; Donaven, T. W.; Samstad, Eric A.

    1990-08-01

    A prototype system is being developed to demonstrate the use of neural network hardware to fuse multispectral imagery. This system consists of a neural network IC on a motherboard, a circuit card assembly, and a set of software routines hosted by a PC-class computer. Research in support of this consists of neural network simulations fusing 4 to 7 bands of Landsat imagery and fusing (separately) multiple bands of synthetic imagery. The simulations, results, and a description of the prototype system are presented.

  1. Multispectral scanner imagery for plant community classification.

    NASA Technical Reports Server (NTRS)

    Driscoll, R. S.; Spencer, M. M.

    1973-01-01

    Optimum channel selection among 12 channels of multispectral scanner imagery identified six as providing the best information for computerized classification of 11 plant communities and two nonvegetation classes. Intensive preprocessing of the spectral data was required to eliminate bidirectional reflectance effects of the spectral imagery caused by scanner view angle and varying geometry of the plant canopy. Generalized plant community types - forest, grassland, and hydrophytic systems - were acceptably classified based on ecological analysis. Serious, but soluble, errors occurred with attempts to classify specific community types within the grassland system. However, special clustering analyses provided for improved classification of specific grassland communities.

  2. Photoreactivation in Airborne Mycobacterium parafortuitum

    PubMed Central

    Peccia, Jordan; Hernandez, Mark

    2001-01-01

    Photoreactivation was observed in airborne Mycobacterium parafortuitum exposed concurrently to UV radiation (254 nm) and visible light. Photoreactivation rates of airborne cells increased with increasing relative humidity (RH) and decreased with increasing UV dose. Under a constant UV dose with visible light absent, the UV inactivation rate of airborne M. parafortuitum cells decreased by a factor of 4 as RH increased from 40 to 95%; however, under identical conditions with visible light present, the UV inactivation rate of airborne cells decreased only by a factor of 2. When irradiated in the absence of visible light, cellular cyclobutane thymine dimer content of UV-irradiated airborne M. parafortuitum and Serratia marcescens increased in response to RH increases. Results suggest that, unlike in waterborne bacteria, cyclobutane thymine dimers are not the most significant form of UV-induced DNA damage incurred by airborne bacteria and that the distribution of DNA photoproducts incorporated into UV-irradiated airborne cells is a function of RH. PMID:11526027

  3. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to draw inspiration from human vision bio-mechanics to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the bio-mechanical discrepancies between human vision and classic cameras and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles on a 3-camera optical, mechanical and software model of the human eyes and an associated bio-inspired attention model.

  4. Laser Range Camera Modeling

    SciTech Connect

    Storjohann, K.

    1990-01-01

    This paper describes an imaging model that was derived for use with a laser range camera (LRC) developed by the Advanced Intelligent Machines Division of Odetics. However, this model could be applied to any comparable imaging system. Both the derivation of the model and the determination of the LRC's intrinsic parameters are explained. For the purpose of evaluating the LRC's extrinsic parameters, i.e., its external orientation, a transformation of the LRC's imaging model into a standard camera's (SC) pinhole model is derived. By virtue of this transformation, the evaluation of the LRC's external orientation can be found by applying any SC calibration technique.

  5. Airborne Dust Monitoring Activities at the National Environmental Satellite, Data and Information Service

    NASA Astrophysics Data System (ADS)

    Stephens, G.; McNamara, D.; Taylor, J.

    2002-12-01

    Wind-blown dust can be a hazard to transportation, industrial, and military operations, and much work has been devoted to its analysis and prediction from a meteorological viewpoint. The detection and forecasting of dust outbreaks in near real time is difficult, particularly in remote desert areas with sparse observation networks. The Regional Haze Regulation, passed by Congress in 1999, mandates a reduction in man-made inputs to haze in 156 Class I areas (national parks and wilderness areas). Studies have demonstrated that satellite data can be useful in detection and tracking of dust storms. Environmental satellites offer frequent coverage of large geographic areas. The National Environmental Satellite, Data, and Information Service (NESDIS) of the U.S. National Oceanic and Atmospheric Administration (NOAA) operates a system of polar orbiting and geostationary environmental satellites, which sense data in two visible and three infrared channels. Promising results in the detection of airborne dust have been obtained using multispectral techniques to combine information from two or more channels to detect subtle spectral differences. One technique, using a ratio of two thermal channels, detects the presence of airborne dust, and discriminates it from both underlying ground and meteorological clouds. In addition, NESDIS accesses and is investigating for operational use data from several other satellites. The Total Ozone Mapping Spectrometer on board NASA's Earth Probe mission provides an aerosol index product which can detect dust and smoke, and the Moderate Resolution Imaging Spectroradiometer on NASA's Terra and Aqua satellites provides several channels which can detect aerosols in multispectral channel combinations. NESDIS, in cooperation with NOAA's Air Resources Laboratory, produces a daily smoke transport forecast, combining satellite derived smoke source points with a mathematical transport prediction model; such a scheme could be applied to other aerosol
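
    As a rough illustration of the two-channel ratio technique mentioned above, the hedged sketch below flags likely dust pixels when the ratio of two thermal-channel brightness temperatures drops below a cut-off; the channel pairing, the direction of the test, and the threshold value are placeholders rather than the operational NESDIS settings.

        # Minimal sketch: dust flag from the ratio of two thermal channels.
        import numpy as np

        def dust_mask(bt_chan_a, bt_chan_b, ratio_threshold=0.98):
            """bt_chan_a, bt_chan_b: brightness-temperature images (same shape)."""
            ratio = bt_chan_a / np.maximum(bt_chan_b, 1e-6)
            return ratio < ratio_threshold   # placeholder test for dusty pixels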

  6. A Field Evaluation of Airborne Techniques for Detection of Unexploded Ordnance

    SciTech Connect

    Bell, D.; Doll, W.E.; Hamlett, P.; Holladay, J.S.; Nyquist, J.E.; Smyre, J.; Gamey, T.J.

    1999-03-14

    US Defense Department estimates indicate that as many as 11 million acres of government land in the U. S. may contain unexploded ordnance (UXO), with the cost of identifying and disposing of this material estimated at nearly $500 billion. The size and character of the ordnance, types of interference, vegetation, geology, and topography vary from site to site. Because of size or composition, some ordnance is difficult to detect with any geophysical method, even under favorable soil and cultural interference conditions. For some sites, airborne methods may provide the most time- and cost-effective means for detection of UXO. Airborne methods offer lower risk to field crews from proximity to unstable ordnance, and less disturbance of sites that may be environmentally sensitive. Data were acquired over a test site at Edwards AFB, CA using airborne magnetic, electromagnetic, multispectral and thermal sensors. Survey areas included sites where trenches might occur, and a test site in which we placed deactivated ordnance, ranging in size from small "bomblets" to large bombs. Magnetic data were then acquired with the Aerodat HM-3 system, which consists of three cesium magnetometers within booms extending to the front and sides of the helicopter, and mounted such that the helicopter can be flown within 3 m of the surface. Electromagnetic data were acquired with an Aerodat 5-frequency coplanar induction system deployed as a sling load from a helicopter, with a sensor altitude of 15 m. Surface data, acquired at selected sites, provide a comparison with airborne data. Multispectral and thermal data were acquired with a Daedelus AADS 1268 system. Preliminary analysis of the test data demonstrates the value of airborne systems for UXO detection and provides insight into improvements that might make the systems even more effective.

  7. Pantir - a Dual Camera Setup for Precise Georeferencing and Mosaicing of Thermal Aerial Images

    NASA Astrophysics Data System (ADS)

    Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.

    2015-03-01

    Research and monitoring in fields like hydrology and agriculture are applications of airborne thermal infrared (TIR) cameras, which suffer from low spatial resolution and low quality lenses. Common ground control points (GCPs), lacking thermal activity and being relatively small in size, cannot be used in TIR images. Precise georeferencing and mosaicing however is necessary for data analysis. Adding a high resolution visible light camera (VIS) with a high quality lens very close to the TIR camera, in the same stabilized rig, allows us to do accurate geoprocessing with standard GCPs after fusing both images (VIS+TIR) using standard image registration methods.

  8. A multisensor system for airborne surveillance of oil pollution

    NASA Technical Reports Server (NTRS)

    Edgerton, A. T.; Ketchal, R.; Catoe, C.

    1973-01-01

    The U.S. Coast Guard is developing a prototype airborne oil surveillance system for use in its Marine Environmental Protection Program. The prototype system utilizes an X-band side-looking radar, a 37-GHz imaging microwave radiometer, a multichannel line scanner, and a multispectral low light level system. The system is geared to detecting and mapping oil spills and potential pollution violators anywhere within a 25 nmi range of the aircraft flight track under all but extreme weather conditions. The system provides for false target discrimination and maximum identification of spilled materials. The system also provides an automated detection alarm, as well as a color display to achieve maximum coupling between the sensor data and the equipment operator.

  9. Modeling of estuarine chlorophyll a from an airborne scanner

    USGS Publications Warehouse

    Khorram, Siamak; Catts, Glenn P.; Cloern, James E.; Knight, Allen W.

    1987-01-01

    Near-simultaneous collection of 34 surface water samples and airborne multispectral scanner data provided input for regression models developed to predict surface concentrations of estuarine chlorophyll a. Two wavelength ratios were employed in model development. The ratios were chosen to capitalize on the spectral characteristics of chlorophyll a, while minimizing atmospheric influences. Models were then applied to data previously acquired over the study area three years earlier. Results are in the form of color-coded displays of predicted chlorophyll a concentrations and comparisons of the agreement among measured surface samples and predictions based on coincident remotely sensed data. The influence of large variations in fresh-water inflow to the estuary is clearly apparent in the results. The synoptic view provided by remote sensing is another method of examining important estuarine dynamics difficult to observe from in situ sampling alone.
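
    The regression step described above can be illustrated with a short hedged sketch that fits surface chlorophyll a to two scanner wavelength ratios by least squares; the ratio choices and data layout are assumptions made for the example, not the published model.

        # Minimal sketch: linear regression of chlorophyll a on two band ratios.
        import numpy as np

        def fit_chla_model(ratio1, ratio2, chla):
            """ratio1, ratio2, chla: 1-D arrays from coincident surface samples."""
            X = np.column_stack([np.ones_like(ratio1), ratio1, ratio2])
            coef, *_ = np.linalg.lstsq(X, chla, rcond=None)
            return coef                       # intercept and two slopes

        def predict_chla(coef, ratio1, ratio2):
            return coef[0] + coef[1] * ratio1 + coef[2] * ratio2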

  10. Use of multispectral satellite imagery and hyperspectral endmember libraries for urban land cover mapping at the metropolitan scale

    NASA Astrophysics Data System (ADS)

    Priem, Frederik; Okujeni, Akpona; van der Linden, Sebastian; Canters, Frank

    2016-10-01

    The value of characteristic reflectance features for mapping urban materials has been demonstrated in many experiments with airborne imaging spectrometry. Analysis of larger areas requires satellite-based multispectral imagery, which typically lacks the spatial and spectral detail of airborne data. Consequently the need arises to develop mapping methods that exploit the complementary strengths of both data sources. In this paper a workflow for sub-pixel quantification of Vegetation-Impervious-Soil urban land cover is presented, using medium resolution multispectral satellite imagery, hyperspectral endmember libraries and Support Vector Regression. A Landsat 8 Operational Land Imager surface reflectance image covering the greater metropolitan area of Brussels is selected for mapping. Two spectral libraries developed for the cities of Brussels and Berlin based on airborne hyperspectral APEX and HyMap data are used. First the combined endmember library is resampled to match the spectral response of the Landsat sensor. The library is then optimized to avoid spectral redundancy and confusion. Subsequently the spectra of the endmember library are synthetically mixed to produce training data for unmixing. Mapping is carried out using Support Vector Regression models trained with spectra selected through stratified sampling of the mixed library. Validation on building block level (mean size = 46.8 Landsat pixels) yields an overall good fit between reference data and estimation with Mean Absolute Errors of 0.06, 0.06 and 0.08 for vegetation, impervious and soil respectively. Findings of this work may contribute to the use of universal spectral libraries for regional scale land cover fraction mapping using regression approaches.
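
    A hedged sketch of the synthetic-mixing and Support Vector Regression training step follows; the library handling, mixture generation, and hyperparameters are simplified assumptions, not the workflow's actual settings.

        # Minimal sketch: mix pairs of library spectra synthetically and train an
        # SVR to predict the fraction of one target class (e.g. impervious).
        import numpy as np
        from sklearn.svm import SVR

        def synthetic_mixtures(library, is_target, n_mix=5000, seed=0):
            """library: (N, B) resampled spectra; is_target: (N,) 0/1 labels."""
            rng = np.random.default_rng(seed)
            idx = rng.integers(0, len(library), size=(n_mix, 2))
            w = rng.random(n_mix)[:, None]               # mixing fractions
            mixed = w * library[idx[:, 0]] + (1 - w) * library[idx[:, 1]]
            frac = (w[:, 0] * is_target[idx[:, 0]]
                    + (1 - w[:, 0]) * is_target[idx[:, 1]])
            return mixed, frac

        def train_fraction_model(library, is_target):
            X, y = synthetic_mixtures(library, np.asarray(is_target, float))
            return SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)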

  11. Underwater camera with depth measurement

    NASA Astrophysics Data System (ADS)

    Wang, Wei-Chih; Lin, Keng-Ren; Tsui, Chi L.; Schipf, David; Leang, Jonathan

    2016-04-01

    The objective of this study is to develop an RGB-D (video + depth) camera that provides three-dimensional image data for use in the haptic feedback of a robotic underwater ordnance recovery system. Two camera systems were developed and studied. The first depth camera relies on structured light (as used by the Microsoft Kinect), where the displacement of an object is determined by variations of the geometry of a projected pattern. The other camera system is based on a Time of Flight (ToF) depth camera. The results of the structured-light camera system show that the camera system requires a stronger light source with a similar operating wavelength and bandwidth to achieve a desirable working distance in water. This approach might not be robust enough for our proposed underwater RGB-D camera system, as it will require a complete re-design of the light source component. The ToF camera system, instead, allows an arbitrary placement of light source and camera. The intensity output of the broadband LED light source in the ToF camera system can be increased by arranging the LEDs in an array configuration, and the LEDs can be modulated comfortably with any waveform and frequencies required by the ToF camera. In this paper, both cameras were evaluated and experiments were conducted to demonstrate the versatility of the ToF camera.

  12. Development of multi-spectral three-dimensional micro particle tracking velocimetry

    NASA Astrophysics Data System (ADS)

    Tien, Wei-Hsin

    2016-08-01

    The color-coded 3D micro particle tracking velocimetry system (CC3DμPTV) is a volumetric velocimetry technique that uses the defocusing digital particle image velocimetry (DDPIV) approach to reconstruct the 3D location of the particle. It is suited for microscopic flow visualization because of the single-camera configuration. However, several factors limit the performance of the system. In this study, the limitations of the CC3DμPTV are discussed in detail and a new concept of a multi-camera 3D μ-PTV system is proposed to improve the performance of the original CC3DμPTV system. The system utilizes two dichroic beam splitters to separate the incoming light into 3 spectral ranges and to image with three monochrome image sensors. The use of a color-matched light source, off-center individual pinholes, and monochrome image sensors allows the system to achieve better sensitivity and optical resolution. The use of coherent laser light sources with high-speed cameras improves the velocity measurement dynamic range. The performance of the proposed multi-spectral system is first evaluated with a simulation model based on the finite element method (FEM). The performance is also compared numerically with the CC3DμPTV system. The test results show significant improvements in the signal-to-noise ratio and optical resolution. Originally presented at the 11th International Symposium on Particle Image Velocimetry, Santa Barbara, California, September 14-16, 2015.

  13. Self-sustainability of optical fibers in airborne communications

    NASA Astrophysics Data System (ADS)

    Dwivedi, Anurag; Finnegan, Eric J.

    2005-05-01

    A large number of communications technologies co-exist today in both civilian and military space with their relative strengths and weaknesses. The information carrying capacity of optical fiber communication, however, surpasses any other communications technology in use today. Additionally, optical fiber is immune to environmental effects and detection, and can be designed to be resistant to exploitation and jamming. However, fiber-optic communication applications are usually limited to static, pre-deployed cable systems. Enabling the fiber applications in dynamically deployed and ad-hoc conditions will open up a large number of communication possibilities in terrestrial, aerial, and oceanic environments. Of particular relevance are bandwidth intensive data, video and voice applications such as airborne imagery, multispectral and hyperspectral imaging, surveillance and communications disaster recovery through surveillance platforms like Airships (also called balloons, aerostats or blimps) and Unmanned Aerial Vehicles (UAVs). Two major considerations in the implementation of airborne fiber communications are (a) mechanical sustainability of optical fibers, and (b) variation in optical transmission characteristics of fiber in dynamic deployment condition. This paper focuses on the mechanical aspects of airborne optical fiber and examines the ability of un-cabled optical fiber to sustain its own weight and wind drag in airborne communications applications. Since optical fiber is made of silica glass, the material fracture characteristics, sub-critical crack growth, strength distribution and proof stress are the key parameters that determine the self-sustainability of optical fiber. Results are presented in terms of maximum self-sustainable altitudes for three types of optical fibers, namely silica-clad, Titania-doped Silica-clad, and carbon-coated hermetic fibers, for short and long service periods and a range of wind profiles and fiber dimensions.

  14. Quantum dot-based image sensors for cutting-edge commercial multispectral cameras

    NASA Astrophysics Data System (ADS)

    Mandelli, Emanuele; Beiley, Zach M.; Kolli, Naveen; Pattantyus-Abraham, Andras G.

    2016-09-01

    This work presents the development of a quantum dot-based photosensitive film engineered to be integrated on standard CMOS process wafers. It enables the design of exceptionally high performance, reliable image sensors. Quantum dot solids absorb light much more rapidly than typical silicon-based photodiodes do, and with the ability to tune the effective material bandgap, quantum dot-based imagers enable higher quantum efficiency over extended spectral bands, both in the Visible and IR regions of the spectrum. Moreover, a quantum dot-based image sensor enables desirable functions such as ultra-small pixels with low crosstalk, high full well capacity, global shutter and wide dynamic range at a relatively low manufacturing cost. At InVisage, we have optimized the manufacturing process flow and are now able to produce high-end image sensors for both visible and NIR in quantity.

  15. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. © 1984.

  16. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  17. Anger Camera Firmware

    SciTech Connect

    2010-11-19

    The firmware is responsible for the operation of Anger Camera Electronics, calculation of position, time of flight and digital communications. It provides a first stage analysis of 48 signals from 48 analog signals that have been converted to digital values using A/D convertors.

  18. Mars Observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Ravine, M. A.; Soulanille, T. A.

    1992-01-01

    The Mars Observer camera (MOC) is a three-component system (one narrow-angle and two wide-angle cameras) designed to take high spatial resolution pictures of the surface of Mars and to obtain lower spatial resolution, synoptic coverage of the planet's surface and atmosphere. The cameras are based on the 'push broom' technique; that is, they do not take 'frames' but rather build pictures, one line at a time, as the spacecraft moves around the planet in its orbit. MOC is primarily a telescope for taking extremely high resolution pictures of selected locations on Mars. Using the narrow-angle camera, areas ranging from 2.8 km x 2.8 km to 2.8 km x 25.2 km (depending on available internal digital buffer memory) can be photographed at about 1.4 m/pixel. Additionally, lower-resolution pictures (to a lowest resolution of about 11 m/pixel) can be acquired by pixel averaging; these images can be much longer, ranging up to 2.8 x 500 km at 11 m/pixel. High-resolution data will be used to study sediments and sedimentary processes, polar processes and deposits, volcanism, and other geologic/geomorphic processes.

  19. Snapshot polarimeter fundus camera.

    PubMed

    DeHoog, Edward; Luo, Haitao; Oka, Kazuhiko; Dereniak, Eustace; Schwiegerling, James

    2009-03-20

    A snapshot imaging polarimeter utilizing Savart plates is integrated into a fundus camera for retinal imaging. Acquired retinal images can be processed to reconstruct Stokes vector images, giving insight into the polarization properties of the retina. Results for images from a normal healthy retina and retinas with pathology are examined and compared.

  20. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  1. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  2. Advanced Virgo phase cameras

    NASA Astrophysics Data System (ADS)

    van der Schaaf, L.; Agatsuma, K.; van Beuzekom, M.; Gebyehu, M.; van den Brand, J.

    2016-05-01

    A century after the prediction of gravitational waves, detectors have reached the sensitivity needed to prove their existence. One of them, the Virgo interferometer in Pisa, is presently being upgraded to Advanced Virgo (AdV) and will come into operation in 2016. The power stored in the interferometer arms rises from 20 to 700 kW. This increase is expected to introduce higher order modes in the beam, which could reduce the circulating power in the interferometer, limiting the sensitivity of the instrument. To suppress these higher-order modes, the core optics of Advanced Virgo is equipped with a thermal compensation system. Phase cameras, monitoring the real-time status of the beam, constitute a critical component of this compensation system. These cameras measure the phases and amplitudes of the laser-light fields at the frequencies selected to control the interferometer. The measurement combines heterodyne detection with a scan of the wave front over a photodetector with a pin-hole aperture. Three cameras observe the phase front of these laser sidebands. Two of them monitor the input and output of the interferometer arms and the third one is used in the control of the aberrations introduced by the power recycling cavity. In this paper the working principle of the phase cameras is explained and some characteristic parameters are described.

  3. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  4. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 °C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  5. Ultraminiature television camera

    NASA Technical Reports Server (NTRS)

    Deterville, R. J.; Drago, N.

    1967-01-01

    Ultraminiature television camera with a total volume of 20.25 cubic inches, requires 28 vdc power, operates on UHF and accommodates standard 8-mm optics. It uses microelectronic assembly packaging techniques and contains a magnetically deflected and electrostatically focused vidicon, automatic gain control circuit, power supply, and transmitter.

  6. Radiometric Characterization of IKONOS Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Kelly, Michelle; Holekamp, Kara; Zanoni, Vicki; Thome, Kurtis; Schiller, Stephen

    2002-01-01

    A radiometric characterization of Space Imaging's IKONOS 4-m multispectral imagery has been performed by a NASA-funded team from the John C. Stennis Space Center (SSC), the University of Arizona Remote Sensing Group (UARSG), and South Dakota State University (SDSU). Both intrinsic radiometry and the effects of Space Imaging processing on radiometry were investigated. Relative radiometry was examined with uniform Antarctic and Saharan sites. Absolute radiometric calibration was performed using reflectance-based vicarious calibration methods on several uniform sites imaged by IKONOS, coincident with ground-based surface and atmospheric measurements. Ground-based data and the IKONOS spectral response function served as input to radiative transfer codes to generate a Top-of-Atmosphere radiance estimate. Calibration coefficients derived from each vicarious calibration were combined to generate an IKONOS radiometric gain coefficient for each multispectral band assuming a linear response over the full dynamic range of the instrument. These calibration coefficients were made available to Space Imaging, which subsequently adopted them by updating its initial set of calibration coefficients. IKONOS imagery procured through the NASA Scientific Data Purchase program is processed with or without a Modulation Transfer Function Compensation kernel. The radiometric effects of this kernel on various scene types were also investigated. All imagery characterized was procured through the NASA Scientific Data Purchase program.

  7. Multispectral multisensor image fusion using wavelet transforms

    USGS Publications Warehouse

    Lemeshewsky, George P.

    1999-01-01

    Fusion techniques can be applied to multispectral and higher spatial resolution panchromatic images to create a composite image that is easier to interpret than the individual images. Wavelet transform-based multisensor, multiresolution fusion (a type of band sharpening) was applied to Landsat thematic mapper (TM) multispectral and coregistered higher resolution SPOT panchromatic images. The objective was to obtain increased spatial resolution, false color composite products to support the interpretation of land cover types wherein the spectral characteristics of the imagery are preserved to provide the spectral clues needed for interpretation. Since the fusion process should not introduce artifacts, a shift invariant implementation of the discrete wavelet transform (SIDWT) was used. These results were compared with those using the shift variant, discrete wavelet transform (DWT). Overall, the process includes a hue, saturation, and value color space transform to minimize color changes, and a reported point-wise maximum selection rule to combine transform coefficients. The performance of fusion based on the SIDWT and DWT was evaluated with a simulated TM 30-m spatial resolution test image and a higher resolution reference. Simulated imagery was made by blurring higher resolution color-infrared photography with the TM sensors' point spread function. The SIDWT based technique produced imagery with fewer artifacts and lower error between fused images and the full resolution reference. Image examples with TM and SPOT 10-m panchromatic illustrate the reduction in artifacts due to the SIDWT based fusion.
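
    To make the coefficient-combination step concrete, here is a hedged Python sketch of wavelet-based band sharpening that keeps the multispectral approximation and takes, at each position, the detail coefficient with the larger magnitude; it uses a plain discrete wavelet transform from PyWavelets rather than the shift-invariant transform evaluated in the paper, and it omits the hue-saturation-value colour-space step.

        # Minimal sketch: wavelet fusion of a resampled multispectral band with a
        # higher-resolution panchromatic band using a max-magnitude detail rule.
        import numpy as np
        import pywt

        def fuse_band(ms_band, pan, wavelet="db2", levels=2):
            """ms_band, pan: 2-D arrays of identical shape (MS already resampled)."""
            c_ms = pywt.wavedec2(ms_band, wavelet, level=levels)
            c_pan = pywt.wavedec2(pan, wavelet, level=levels)
            fused = [c_ms[0]]                                # keep MS approximation
            for d_ms, d_pan in zip(c_ms[1:], c_pan[1:]):     # per-level detail triplets
                fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                                   for a, b in zip(d_ms, d_pan)))
            return pywt.waverec2(fused, wavelet)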

  8. Multi-spectral imaging of oxygen saturation

    NASA Astrophysics Data System (ADS)

    Savelieva, Tatiana A.; Stratonnikov, Aleksander A.; Loschenov, Victor B.

    2008-06-01

    The system of multi-spectral imaging of oxygen saturation is an instrument that can record both spectral and spatial information about a sample. In this project, the spectral imaging technique is used for monitoring of oxygen saturation of hemoglobin in human tissues. This system can be used for monitoring spatial distribution of oxygen saturation in photodynamic therapy, surgery or sports medicine. Diffuse reflectance spectroscopy in the visible range is an effective and extensively used technique for the non-invasive study and characterization of various biological tissues. In this article, a short review of modeling techniques being currently in use for diffuse reflection from semi-infinite turbid media is presented. A simple and practical model for use with a real-time imaging system is proposed. This model is based on linear approximation of the dependence of the diffuse reflectance coefficient on relation between absorbance and reduced scattering coefficient. This dependence was obtained with the Monte Carlo simulation of photon propagation in turbid media. Spectra of the oxygenated and deoxygenated forms of hemoglobin differ mostly in the red area (520 - 600 nm) and have several characteristic points there. Thus four band-pass filters were used for multi-spectral imaging. After having measured the reflectance, the data obtained are used for fitting the concentration of oxygenated and free hemoglobin, and hemoglobin oxygen saturation.
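
    A hedged sketch of the per-pixel fitting step follows: band absorbances are decomposed into oxygenated and deoxygenated haemoglobin contributions by non-negative least squares and the oxygen saturation is taken as their ratio; the extinction coefficients are placeholders, not the calibration used by the instrument described above.

        # Minimal sketch: haemoglobin oxygen saturation from four band absorbances.
        import numpy as np
        from scipy.optimize import nnls

        EPS_HBO2 = np.array([0.9, 1.2, 0.6, 0.4])   # placeholder extinction values
        EPS_HB   = np.array([1.1, 0.8, 0.9, 0.5])   # one entry per spectral band

        def oxygen_saturation(reflectance):
            """reflectance: length-4 vector for one pixel, values in (0, 1]."""
            absorbance = -np.log(np.clip(reflectance, 1e-6, None))
            A = np.column_stack([EPS_HBO2, EPS_HB])
            (c_oxy, c_deoxy), _ = nnls(A, absorbance)
            total = c_oxy + c_deoxy
            return c_oxy / total if total > 0 else np.nan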

  9. Efficient lossless compression scheme for multispectral images

    NASA Astrophysics Data System (ADS)

    Benazza-Benyahia, Amel; Hamdi, Mohamed; Pesquet, Jean-Christophe

    2001-12-01

    Huge amounts of data are generated thanks to the continuous improvement of remote sensing systems. Archiving this tremendous volume of data is a real challenge which requires lossless compression techniques. Furthermore, progressive coding constitutes a desirable feature for telebrowsing. For this purpose, a compact and pyramidal representation of the input image has to be generated. Separable multiresolution decompositions have already been proposed for multicomponent images allowing each band to be decomposed separately. It seems, however, more appropriate to also exploit the spectral correlations. For hyperspectral images, the solution is to apply a 3D decomposition according to the spatial and to the spectral dimensions. This approach is not appropriate for multispectral images because of the reduced number of spectral bands. In recent works, we have proposed a nonlinear subband decomposition scheme with perfect reconstruction which exploits efficiently both the spatial and the spectral redundancies contained in multispectral images. In this paper, the problem of coding the coefficients of the resulting subband decomposition is addressed. More precisely, we propose an extension to the vector case of Shapiro's embedded zerotrees of wavelet coefficients (V-EZW), which achieves further savings in the bit stream. Simulations carried out on SPOT images indicate the superior performance of the proposed global compression scheme.

  10. Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery

    NASA Astrophysics Data System (ADS)

    Kwoh, L. K.; Huang, X.; Tan, W. J.

    2012-07-01

    XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun synchronous orbit. The satellite carries a multispectral camera IRIS with three spectral bands - 0.52~0.60 μm for Green, 0.63~0.69 μm for Red and 0.76~0.89 μm for NIR at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, Red and Green). These CCDs were physically separated in the focal plane and their first pixels not absolutely aligned. The micro-satellite platform was also not stable enough to allow for co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd to 4th order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, the camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, as well as the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs red and for green vs red CCDs respectively). The cross-track alignments were 0.05 pixels and 5.9 pixels for the NIR vs red and green vs red CCDs respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower operating temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
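
    The platform-stability compensation described above boils down to modelling each attitude angle as a low-order polynomial of the image line index. A hedged sketch of that fitting step is given below; in practice the residuals would come from tie-point or ground-control measurements, and the data layout here is an assumption made for illustration.

        # Minimal sketch: fit a 3rd-order polynomial to roll (or pitch/yaw)
        # residuals as a function of scan-line index, then evaluate it per line.
        import numpy as np

        def fit_attitude_polynomial(line_index, angle_residual, degree=3):
            coeffs = np.polynomial.polynomial.polyfit(line_index, angle_residual, degree)
            return lambda lines: np.polynomial.polynomial.polyval(lines, coeffs)

        # Usage sketch: correction = fit_attitude_polynomial(lines, roll_residuals)
        #               roll_corrected = roll_measured + correction(lines)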

  11. Low-Cost Optical Camera System for Disaster Monitoring

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Meynberg, O.; Rosenbaum, D.; Türmer, S.; Reinartz, P.; Schroeder, M.

    2012-07-01

    Real-time monitoring of natural disasters, mass events, and large accidents with airborne optical sensors is an ongoing topic in research and development. Airborne monitoring is used as a complementary data source with the advantage of flexible data acquisition and higher spatial resolution compared to optical satellite data. In cases of disasters or mass events, optical high resolution image data received directly after acquisition are highly welcomed by security-related organizations like police and rescue forces. Low-cost optical camera systems are suitable for real-time applications as the accuracy requirements can be lowered in return for faster processing times. In this paper, the performance of low-cost camera systems for real-time mapping applications is exemplarily evaluated based on already existing sensor systems operated at the German Aerospace Center (DLR). In addition to geometric and radiometric performance, the focus lies on the real-time processing chain, which includes image processors, thematic processors for automatic traffic extraction and automatic person tracking, data downlink to the ground station, and further processing and distribution on the ground. Finally, a concept for a national airborne rapid mapping service based on the low-cost hardware is proposed.

  12. Skin parameter map retrieval from a dedicated multispectral imaging system applied to dermatology/cosmetology.

    PubMed

    Jolivot, Romuald; Benezeth, Yannick; Marzani, Franck

    2013-01-01

    In vivo quantitative assessment of skin lesions is an important step in the evaluation of skin condition. An objective measurement device can serve as a valuable tool for skin analysis. We propose an explorative new multispectral camera specifically developed for dermatology/cosmetology applications. The multispectral imaging system provides images of skin reflectance at different wavebands covering the visible and near-infrared domains. It is coupled with a neural network-based algorithm for the reconstruction of a reflectance cube of cutaneous data. This cube contains only the skin optical reflectance spectrum in each pixel of the bidimensional spatial information. The reflectance cube is analyzed by an algorithm based on a Kubelka-Munk model combined with an evolutionary algorithm. The technique allows quantitative measurement of cutaneous tissue and retrieves five skin parameter maps: melanin concentration, epidermis/dermis thickness, haemoglobin concentration, and oxygenated haemoglobin. The results retrieved on healthy participants by the algorithm are in good accordance with the data from the literature. The usefulness of the developed technique was proved during two experiments: a clinical study based on vitiligo and melasma skin lesions and a skin oxygenation experiment (induced ischemia) with a healthy participant where normal tissues are recorded at normal state and when temporary ischemia is induced.

  13. Multispectral imaging system on tethered balloons for optical remote sensing education and outreach

    NASA Astrophysics Data System (ADS)

    Shaw, Joseph A.; Nugent, Paul W.; Kaufman, Nathan; Pust, Nathan J.; Mikes, Devin; Knierim, Cassie; Faulconer, Nathan; Larimer, Randal; DesJardins, Angela; Knighton, Berk

    2012-10-01

    A set of low-cost, compact multispectral imaging systems have been developed for deployment on tethered balloons for education and outreach based on basic principles of optical remote sensing. The imagers use tiny CMOS cameras with low-cost optical filters to obtain images in red and near-infrared bands, and a more recent version includes a blue band. The red and near-infrared bands are used primarily for identifying and monitoring vegetation through the Normalized Difference Vegetation Index (NDVI), while the blue band is used for studying water turbidity, identifying water and ice, and so forth. The imagers are designed to be carried by tethered balloons at altitudes up to approximately 50 m. Engineering and physics students at Montana State University-Bozeman gained hands-on experience during the early stages of designing and building the imagers, and a wide variety of university and college students are using the imagers for a broad range of applications to learn about multispectral imaging, remote sensing, and applications typically involving some aspect of environmental science.
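
    Since the red and near-infrared bands are combined into the NDVI, a one-function hedged sketch of that computation is included here; it assumes the two frames are already co-registered and is an illustration rather than the imagers' onboard software.

        # Minimal sketch: NDVI from co-registered red and near-infrared frames.
        import numpy as np

        def ndvi(red, nir):
            """red, nir: 2-D reflectance (or digital-number) arrays, same shape."""
            red = red.astype(float)
            nir = nir.astype(float)
            return (nir - red) / np.maximum(nir + red, 1e-6)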

  14. A multispectral study of an extratropical cyclone with Nimbus 3 medium resolution infrared radiometer data

    NASA Technical Reports Server (NTRS)

    Holub, R.; Shenk, W. E.

    1973-01-01

    Four registered channels (0.2 to 4, 6.5 to 7, 10 to 11, and 20 to 23 microns) of the Nimbus 3 Medium Resolution Infrared Radiometer (MRIR) were used to study 24-hr changes in the structure of an extratropical cyclone during a 6-day period in May 1969. Use of a stereographic-horizon map projection insured that the storm was mapped with a single perspective throughout the series and allowed the convenient preparation of 24-hr difference maps of the infrared radiation fields. Single-channel and multispectral analysis techniques were employed to establish the positions and vertical slopes of jetstreams, large cloud systems, and major features of middle and upper tropospheric circulation. Use of these techniques plus the difference maps and continuity of observation allowed the early detection of secondary cyclones developing within the circulation of the primary cyclone. An automated, multispectral cloud-type identification technique was developed, and comparisons that were made with conventional ship reports and with high-resolution visual data from the image dissector camera system showed good agreement.

  15. Thresholding for biological material detection in real-time multispectral imaging

    NASA Astrophysics Data System (ADS)

    Yoon, Seung Chul; Park, Bosoon; Lawrence, Kurt C.; Windham, William R.

    2005-09-01

    Recently, hyperspectral image analysis has proved successful for target detection problems encountered in remote sensing as well as in near-range sensing utilizing in situ instrumentation. Conventional global bi-level thresholding for target detection, such as the clustering-based Otsu method, has been inadequate for detecting biologically harmful material on foods, which has a large degree of variability in size, location, color, shape, texture, and occurrence time. This paper presents a multistep-like thresholding approach based on kernel density estimation for the real-time detection of harmful contaminants on a food product presented in multispectral images. We are particularly concerned with the detection of fecal contaminants on poultry carcasses in real time. In the past, we identified two optimal wavelength bands and developed a real-time multispectral imaging system using a common-aperture camera and a globally optimized thresholding method applied to a ratio of the optimal bands. This work extends our previous study by introducing a new decision rule to detect fecal contaminants at the single-bird level. The underlying idea is to search for statistical separability along the two directions defined by the global optimal threshold vector and its orthogonal vector. Experimental results with real birds and fecal samples in different amounts are provided.
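
    As a rough illustration of the band-ratio thresholding that this work builds on, the sketch below forms a ratio of two bands and applies a global Otsu-style threshold; the kernel-density estimation and the two-direction decision rule of the paper are not reproduced, and the synthetic band data are assumptions.

      # Minimal sketch of band-ratio thresholding: ratio the two selected bands and
      # apply a global (Otsu-style) threshold to flag candidate contaminant pixels.
      import numpy as np

      def otsu_threshold(values: np.ndarray, bins: int = 256) -> float:
          """Return the threshold maximizing between-class variance."""
          hist, edges = np.histogram(values, bins=bins)
          centers = (edges[:-1] + edges[1:]) / 2
          w0 = np.cumsum(hist)                       # pixels at or below each candidate
          w1 = w0[-1] - w0                           # pixels above
          m0 = np.cumsum(hist * centers)             # cumulative intensity sums
          mu0 = m0 / np.maximum(w0, 1)
          mu1 = (m0[-1] - m0) / np.maximum(w1, 1)
          between = w0 * w1 * (mu0 - mu1) ** 2
          return centers[int(np.argmax(between))]

      band_a = np.random.rand(240, 320) + 0.1        # stand-ins for the two bands
      band_b = np.random.rand(240, 320) + 0.1
      ratio = band_a / band_b
      mask = ratio > otsu_threshold(ratio.ravel())   # candidate contaminant pixels
      print(mask.mean())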

  16. Multispectral imaging systems on tethered balloons for optical remote sensing education and research

    NASA Astrophysics Data System (ADS)

    Shaw, Joseph A.; Nugent, Paul W.; Kaufman, Nathan A.; Pust, Nathan J.; Mikes, Devin; Knierim, Cassie; Faulconer, Nathan; Larimer, Randal M.; DesJardins, Angela C.; Knighton, W. Berk

    2012-01-01

    A set of low-cost, compact multispectral imaging systems have been developed for deployment on tethered balloons for education and outreach based on basic principles of optical remote sensing. They have proven to be sufficiently capable, and they are now being used in research as well. The imagers use tiny complementary metal-oxide semiconductor cameras with low-cost optical filters to obtain images in red and near-infrared bands, and a more recent version includes a blue band. The red and near-infrared bands are used primarily for identifying and monitoring vegetation through the normalized difference vegetation index (NDVI), while the blue band can be used for studying water turbidity and so forth. The imagers are designed to be carried by tethered balloons to altitudes currently up to approximately 50 m. These undergraduate-student-built imaging systems are being used by university and college students for a broad range of applications in multispectral imaging, remote sensing, and environmental science.

  17. Absolute airborne gravimetry

    NASA Astrophysics Data System (ADS)

    Baumann, Henri

    This work consists of a feasibility study of a first-stage prototype airborne absolute gravimeter system. In contrast to relative systems, which use spring gravimeters, the measurements acquired by absolute systems are uncorrelated and the instrument does not suffer from problems such as instrumental drift, the frequency response of the spring, and possible variation of the calibration factor. The major problem we had to resolve was to reduce the influence of the non-gravitational accelerations included in the measurements. We studied two different approaches to resolve it: direct mechanical filtering and post-processing digital compensation. The first part of the work describes in detail the different passive mechanical vibration filters, which were studied and tested in the laboratory and later in a small truck in motion. For these tests, as well as for the airborne measurements, an FG5-L absolute gravimeter from Micro-G Ltd was used together with a Litton-200 inertial navigation system, an EpiSensor vertical accelerometer, and GPS receivers for positioning. These tests showed that only the use of an optical table gives acceptable results; however, it is unable to compensate for the effects of the accelerations of the drag-free chamber. The second part describes the data-processing strategy, which is based on modeling the perturbing accelerations by means of GPS, EpiSensor, and INS data. In the third part the airborne experiment is described in detail, from the mounting in the aircraft and data processing to the different problems encountered during the evaluation of the quality and accuracy of the results. In the data-processing part, the different steps conducted from the raw apparent gravity data and the trajectories to the estimation of the true gravity are explained. A comparison between the estimated airborne data and those obtained by ground upward continuation at flight altitude allows us to state that airborne absolute gravimetry is feasible and

  18. The PAU Camera

    NASA Astrophysics Data System (ADS)

    Casas, R.; Ballester, O.; Cardiel-Sas, L.; Carretero, J.; Castander, F. J.; Castilla, J.; Crocce, M.; de Vicente, J.; Delfino, M.; Fernández, E.; Fosalba, P.; García-Bellido, J.; Gaztañaga, E.; Grañena, F.; Jiménez, J.; Madrid, F.; Maiorino, M.; Martí, P.; Miquel, R.; Neissner, C.; Ponce, R.; Sánchez, E.; Serrano, S.; Sevilla, I.; Tonello, N.; Troyano, I.

    2011-11-01

    The PAU Camera (PAUCam) is a wide-field camera designed to be mounted at the William Herschel Telescope (WHT) prime focus, located at the Observatorio del Roque de los Muchachos on the island of La Palma (Canary Islands). Its primary function is to carry out a cosmological survey, the PAU Survey, covering an area of several hundred square degrees of sky. Its purpose is to determine positions and distances using photometric redshift techniques. To achieve accurate photo-z's, PAUCam will be equipped with 40 narrow-band filters covering the range from 450 to 850 nm, and six broad-band filters, those of the SDSS system plus the Y band. To fully cover the focal plane delivered by the telescope optics, 18 2k x 4k CCDs are needed. The pixels are square, 15 μm on a side. The optical characteristics of the prime focus corrector deliver a field of view in which eight of these CCDs will have an illumination of more than 95%, covering a field of 40 arc minutes. The rest of the CCDs will occupy the vignetted region, extending the field diameter to one degree. Two of the CCDs will be devoted to auto-guiding. This camera has some innovative features. Firstly, both the broad-band and the narrow-band filters will be placed in mobile trays, hosting at most 16 such filters. These trays are located inside the cryostat, a few millimeters in front of the CCDs, when observing. Secondly, a pressurized liquid nitrogen tank outside the camera will feed a boiler inside the cryostat with a controlled mass flow. The read-out electronics will use the Monsoon architecture, originally developed by NOAO and modified and manufactured by our team in the frame of the DECam project (the camera used in the DES Survey). PAUCam will also be available to the astronomical community of the WHT.

  19. [Soil Salinity Estimation Based on Near-Ground Multispectral Imagery in Typical Area of the Yellow River Delta].

    PubMed

    Zhang, Tong-rui; Zhao, Geng-xing; Gao, Ming-xiu; Wang, Zhuo-ran; Jia, Ji-chao; Li, Ping; An, De-yu

    2016-01-01

    This study chooses the core demonstration area of the 'Bohai Barn' project, located in Wudi, Shandong Province, as the study area. We first collected near-ground multispectral images and surface soil salinity data using an ADC portable multispectral camera and an EC110 portable salinometer. Three vegetation indices, namely NDVI, SAVI and GNDVI, were then used to build 18 models against the measured soil salinity. These models include linear, exponential, logarithmic, power, quadratic and cubic functions, from which the best model for soil salinity estimation was selected and used to invert and analyze the soil salinity status of the study area. Results indicated that all of the models could estimate soil salinity effectively, and that the models built from SAVI were more effective than the others. Among the SAVI models, the linear model (Y = -0.524x + 0.663, n = 70) is the best: its F value of 141.347 is the highest and passes the significance test, with an estimation R2 of 0.797 and an accuracy of 93.36%. Soil salinity in the study area is mainly around 2.5 per thousand to 3.5 per thousand and gradually increases from southwest to northeast. The study has explored soil salinity estimation methods based on near-ground multispectral data and will provide a quick and effective soil salinity estimation approach for the coastal saline soil of the study area and the whole Yellow River Delta.
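
    As an illustration of how the reported linear SAVI model can be applied per pixel, the sketch below computes SAVI from red/NIR reflectance and evaluates Y = -0.524x + 0.663 from the abstract; the soil-adjustment factor L = 0.5 is the conventional value and an assumption here, as are the synthetic reflectance arrays.

      # Minimal sketch: SAVI from red/NIR reflectance, then the linear salinity
      # model reported in the abstract (Y = -0.524x + 0.663). L = 0.5 is assumed.
      import numpy as np

      def savi(red: np.ndarray, nir: np.ndarray, L: float = 0.5) -> np.ndarray:
          return (1.0 + L) * (nir - red) / (nir + red + L)

      def salinity_from_savi(savi_map: np.ndarray) -> np.ndarray:
          # Linear inversion model from the abstract: Y = -0.524 * SAVI + 0.663
          return -0.524 * savi_map + 0.663

      red = np.random.rand(200, 200) * 0.3          # illustrative reflectance values
      nir = np.random.rand(200, 200) * 0.5
      salinity = salinity_from_savi(savi(red, nir))
      print(float(salinity.mean()))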

  20. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases such as myocardial infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that can evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm was developed for quantitative evaluation of atherosclerotic plaques. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, three optical bandpass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber bundle consisting of seven multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% lipid were prepared and measured with the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in the atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of the atherosclerotic plaque phantoms based on their lipid volume fractions was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
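
    A minimal sketch of the spectral angle mapper step applied to a three-band cube is given below; the reference lipid spectrum and the angle threshold are illustrative assumptions rather than values from the study.

      # Minimal sketch of the spectral angle mapper (SAM) classification applied to a
      # three-band (1150/1200/1300 nm) cube. Reference spectrum and threshold are
      # hypothetical placeholders, not values from the paper.
      import numpy as np

      def spectral_angle(cube: np.ndarray, reference: np.ndarray) -> np.ndarray:
          """cube: (H, W, B) image cube; reference: (B,) spectrum. Returns angles in radians."""
          num = np.tensordot(cube, reference, axes=([2], [0]))
          denom = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference) + 1e-12
          return np.arccos(np.clip(num / denom, -1.0, 1.0))

      cube = np.random.rand(128, 128, 3)              # stand-in for the band images
      lipid_reference = np.array([0.42, 0.28, 0.55])  # hypothetical lipid spectrum
      angles = spectral_angle(cube, lipid_reference)
      lipid_mask = angles < 0.1                       # small angle = close spectral match
      print(lipid_mask.sum())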

  1. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  2. Do Speed Cameras Reduce Collisions?

    PubMed Central

    Skubic, Jeffrey; Johnson, Steven B.; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

    2013-01-01

    We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods – before cameras were placed, while cameras were in place, and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not independently affect the incidence of motor vehicle collisions. PMID:24406979

  3. Combination of RGB and multispectral imagery for discrimination of cabernet sauvignon grapevine elements.

    PubMed

    Fernández, Roemi; Montes, Héctor; Salinas, Carlota; Sarria, Javier; Armada, Manuel

    2013-06-19

    This paper proposes a sequential masking algorithm based on the K-means method that combines RGB and multispectral imagery for discrimination of Cabernet Sauvignon grapevine elements in unstructured natural environments, without placing any screen behind the canopy and without any previous preparation of the vineyard. In this way, image pixels are classified into five clusters corresponding to leaves, stems, branches, fruit, and background. A custom-made sensory rig that integrates a CCD camera and a servo-controlled filter wheel was specially designed and manufactured for image acquisition during the experimental stage. The proposed algorithm is extremely simple and efficient, and provides a satisfactory rate of classification success. All these features make the proposed algorithm an appropriate candidate for numerous precision viticulture tasks, such as yield estimation, estimation of water and nutrient needs, spraying, and harvesting.
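
    To illustrate the clustering step only, the sketch below runs plain K-means with five clusters on per-pixel features built from RGB plus additional bands; the sequential masking order, the actual band set, and any feature scaling used by the authors are not reproduced and are assumptions here.

      # Minimal sketch of the K-means step behind the sequential masking algorithm:
      # pixels described by RGB plus multispectral values are grouped into five
      # clusters (leaves, stems, branches, fruit, background).
      import numpy as np
      from sklearn.cluster import KMeans

      h, w = 120, 160
      rgb = np.random.rand(h, w, 3)                 # stand-in RGB image
      extra_bands = np.random.rand(h, w, 2)         # stand-in multispectral bands
      features = np.concatenate([rgb, extra_bands], axis=2).reshape(-1, 5)

      labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
      label_image = labels.reshape(h, w)            # per-pixel cluster assignments
      print(np.bincount(labels))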

  4. Direct, trans-irradiation and multispectral infrared imaging of a Titian canvas

    NASA Astrophysics Data System (ADS)

    Daffara, Claudia; Monti, Francesca; Fontana, Raffaella; Artoni, Paola; Salvadori, Ornella

    2013-05-01

    Near-infrared imaging is a powerful technique for the analysis of ancient paintings, allowing the nondestructive examination of features underneath the pictorial surface. Beyond the unique nature of the artwork (materials and layer stratigraphy), the effectiveness of the technique in detecting any painting feature is determined by the device performance (spectral sensitivity, acquisition band narrowness, spatial resolution) as well as by the irradiation setup. We performed multi-modal infrared imaging on a sixteenth-century masterpiece by Titian using an InGaAs camera and different measurement setups. Acquisition was carried out in conventional reflection geometry and in trans-irradiation mode, as well as in wideband and multispectral modes. Preliminary results are presented and the potential of such infrared analysis is discussed.

  5. A Near-Infrared (NIR) Global Multispectral Map of the Moon from Clementine

    NASA Technical Reports Server (NTRS)

    Eliason, E. M.; Lee, E. M.; Becker, T. L.; Weller, L. A.; Isbell, C. E.; Staid, M. I.; Gaddis, L. R.; McEwen, A. S.; Robinson, M. S.; Duxbury, T.

    2003-01-01

    In May and June of 1994, the NASA/DoD Clementine Mission acquired global, 11- band, multispectral observations of the lunar surface using the ultraviolet-visible (UVVIS) and near-infrared (NIR) camera systems. The global 5-band UVVIS Digital Image Model (DIM) of the Moon at 100 m/pixel was released to the Planetary Data System (PDS) in 2000. The corresponding NIR DIM has been compiled by the U.S. Geological Survey for distribution to the lunar science community. The recently released NIR DIM has six spectral bands (1100, 1250, 1500, 2000, 2600, and 2780 nm) and is delivered in 996 quads at 100 m/pixel (303 pixels/degree). The NIR data were radiometrically corrected, geometrically controlled, and photometrically normalized to form seamless, uniformly illuminated mosaics of the lunar surface.

  6. Multispectral imaging approach for simplified non-invasive in-vivo evaluation of gingival erythema

    NASA Astrophysics Data System (ADS)

    Eckhard, Timo; Valero, Eva M.; Nieves, Juan L.; Gallegos-Rueda, José M.; Mesa, Francisco

    2012-03-01

    Erythema is a common visual sign of gingivitis. In this work, a new and simple low-cost image capture and analysis method for erythema assessment is proposed. The method is based on digital still images of gingivae and applied on a pixel-by-pixel basis. Multispectral images are acquired with a conventional digital camera and multiplexed LED illumination panels at 460nm and 630nm peak wavelength. An automatic work-flow segments teeth from gingiva regions in the images and creates a map of local blood oxygenation levels, which relates to the presence of erythema. The map is computed from the ratio of the two spectral images. An advantage of the proposed approach is that the whole process is easy to manage by dental health care professionals in clinical environment.
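
    A minimal sketch of the per-pixel two-band ratio map is given below; which band forms the numerator, and the tooth/gingiva segmentation step, are assumptions not taken from the abstract.

      # Minimal sketch: per-pixel ratio of the two spectral images (630 nm and 460 nm
      # LED bands) used to form the erythema-related map. Band order is assumed.
      import numpy as np

      def ratio_map(img_630: np.ndarray, img_460: np.ndarray) -> np.ndarray:
          return img_630.astype(np.float64) / (img_460.astype(np.float64) + 1e-6)

      img_460 = np.random.randint(1, 256, (300, 400)).astype(np.uint8)
      img_630 = np.random.randint(1, 256, (300, 400)).astype(np.uint8)
      erythema_map = ratio_map(img_630, img_460)
      print(float(erythema_map.mean()))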

  7. Airborne Intercept Monitoring

    DTIC Science & Technology

    2006-04-01

    Primary mirror of Zerodur with a Pilkington 747 coating; FOV = 0.104 degrees. The STS is a 0.75 meter aperture Mersenne Cassegrain telescope and the SAT is a 0.34 meter aperture 3-mirror anastigmat telescope, operated together with a pointing system (SPS). Air flow is used to mitigate thermal "seeing" effects, and a light-weighted primary mirror reduces mass.

  8. Airborne forest fire research

    NASA Technical Reports Server (NTRS)

    Mattingly, G. S.

    1974-01-01

    The research relating to airborne fire-fighting systems is reviewed to provide NASA/Langley Research Center with current information on the use of aircraft in forest fire operations and to identify research requirements for future operations. A literature survey, interviews with forest fire service personnel, analysis and synthesis of data from research reports, independent conclusions, and recommendations for future NASA-LRC programs are included.

  9. Airborne Infrared Astronomical Telescopes

    NASA Astrophysics Data System (ADS)

    Erickson, Edwin F.

    2017-01-01

    A unique program of infrared astronomical observations from aircraft evolved at NASA’s Ames Research Center, beginning in the 1960s. Telescopes were flown on a Convair 990, a Lear Jet, and a Lockheed C-141 - the Kuiper Airborne Observatory (KAO) - leading to the planning and development of SOFIA: a 2.7 m telescope now flying on a Boeing 747SP. The poster describes these telescopes and highlights of some of the scientific results obtained from them.

  10. Innovative Airborne Sensors for Disaster Management

    NASA Astrophysics Data System (ADS)

    Altan, M. O.; Kemper, G.

    2016-06-01

    Disaster management benefits from analyzing changes in the DSM before and after the "event". An advantage of lidar is that, apart from rain and clouds, no other weather conditions limit its use, and as an active sensor it can also be operated at night. New mid-format cameras that use CMOS sensors (e.g., the Phase One IXU1000) can capture data even under poor and difficult light conditions and may become the first choice for remotely sensed data acquisition from aircraft and UAVs. UAVs will surely become an ever larger part of disaster management at the detailed level. Equipped today with live video cameras in RGB and thermal IR, they assist in looking inside and behind buildings. Thus, they can continue the aerial survey where airborne anomalies have been detected.

  11. Airborne wireless communication systems, airborne communication methods, and communication methods

    DOEpatents

    Deaton, Juan D [Menan, ID; Schmitt, Michael J [Idaho Falls, ID; Jones, Warren F [Idaho Falls, ID

    2011-12-13

    An airborne wireless communication system includes circuitry configured to access information describing a configuration of a terrestrial wireless communication base station that has become disabled. The terrestrial base station is configured to implement wireless communication between wireless devices located within a geographical area and a network when the terrestrial base station is not disabled. The circuitry is further configured, based on the information, to configure the airborne station to have the configuration of the terrestrial base station. An airborne communication method includes answering a 911 call from a terrestrial cellular wireless phone using an airborne wireless communication system.

  12. Airborne seeker evaluation and test system

    NASA Astrophysics Data System (ADS)

    Jollie, William B.

    1991-08-01

    The Airborne Seeker Evaluation Test System (ASETS) is an airborne platform for development, test, and evaluation of air-to-ground seekers and sensors. ASETS consists of approximately 10,000 pounds of equipment, including sixteen racks of control, display, and recording electronics, and a very large stabilized airborne turret, all carried by a modified C-130A aircraft. The turret measures 50 in. in diameter and extends over 50 in. below the aircraft. Because of the low ground clearance of the C-130, a unique retractor mechanism was designed to raise the turret inside the aircraft for take-offs and landings, and deploy the turret outside the aircraft for testing. The turret has over 7 cubic feet of payload space and can accommodate up to 300 pounds of instrumentation, including missile seekers, thermal imagers, infrared mapping systems, laser systems, millimeter wave radar units, television cameras, and laser rangers. It contains a 5-axis gyro-stabilized gimbal system that will maintain a line of sight in the pitch, roll, and yaw axes to an accuracy better than ±125 μrad. The rack-mounted electronics in the aircraft cargo bay can be interchanged to operate any type of sensor and record the data. Six microcomputer subsystems operate and maintain all of the system components during a test mission. ASETS is capable of flying at altitudes between 200 and 20,000 feet, and at airspeeds ranging from 100 to 250 knots. Mission scenarios can include air-to-surface seeker testing, terrain mapping, surface target measurement, air-to-air testing, atmospheric transmission studies, weather data collection, aircraft or missile tracking, background signature measurements, and surveillance. ASETS is fully developed and available to support test programs.

  13. Airborne multispectral remote sensing with ground truth for areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scientists and engineers in areawide pest management programs have been developing, integrating, and evaluating multiple strategies and technologies into a systems approach for management of field crop insect pests. Remote sensing along with global positioning systems, geographic information system...

  14. Assessment of soybean injury from glyphosate using airborne multispectral remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    BACKGROUND: Glyphosate drift onto off-target sensitive crops can reduce growth and yield, and is of great concern to growers and pesticide applicators. Detection of herbicide injury using biological responses is tedious, so more convenient and rapid detection methods are needed. The objective of thi...

  15. Evaluating spectral measures derived from airborne multispectral imagery for detecting cotton root rot

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot, caused by the soilborne fungus Phymatotrichopsis omnivore, is one of the most destructive plant diseases occurring throughout the southwestern United States. This disease has plagued the cotton industry for more than 100 years, but effective practices for its control are still lacki...

  16. Change detection of cotton root rot infection over a 10-year interval using airborne multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot is a very serious and destructive disease of cotton grown in the southwestern and south central United States. Accurate information regarding the spatial and temporal infections of the disease within fields is important for effective management and control of the disease. The objecti...

  17. Mapping cotton root rot infestations over a 10-year interval with airborne multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot, caused by the pathogen Phymatotrichopsis omnivora, is a very serious and destructive disease of cotton grown in the southwestern and south central U.S. Accurate information regarding temporal changes of cotton root rot infestations within fields is important for the management and c...

  18. Monitoring cotton root rot progression within a growing season using airborne multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot, caused by the fungus Phymatotrichopsis omnivora, is a serious and destructive disease affecting cotton production in the southwestern United States. Accurate delineation of cotton root rot infections is important for cost-effective management of the disease. The objective of this st...

  19. Mapping forest stand complexity for woodland caribou habitat assessment using multispectral airborne imagery

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Hu, B.; Woods, M.

    2014-11-01

    The decline of the woodland caribou population is a result of habitat loss. To conserve the habitat of the woodland caribou and protect the species from extinction, it is critical to accurately characterize and monitor its habitat. Conventionally, products derived from low to medium spatial resolution remote sensing data, such as land cover classifications and vegetation indices, are used for wildlife habitat assessment. These products fail to provide information on the structural complexity of forest canopies, which reflects important characteristics of caribou habitat. Recent studies have employed LiDAR (Light Detection And Ranging) systems to directly retrieve three-dimensional forest attributes. Although promising results have been achieved, the acquisition cost of LiDAR data is very high. In this study, the use of very high spatial resolution imagery to characterize the structural development of forest canopies was explored. A stand-based image texture analysis was performed to predict forest succession stages, and the results were shown to be consistent with those derived from LiDAR data.
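
    As one example of the kind of texture measure used in stand-based image texture analysis, the sketch below computes the gray-level co-occurrence contrast of an image patch; the 8-level quantization, the one-pixel horizontal offset, and the choice of contrast as the feature are assumptions, not the study's actual feature set.

      # Minimal sketch: gray-level co-occurrence contrast for a 1-pixel horizontal
      # offset, a common texture measure for very high resolution imagery.
      import numpy as np

      def glcm_contrast(gray: np.ndarray, levels: int = 8) -> float:
          """Contrast of the co-occurrence matrix for a 1-pixel horizontal offset."""
          q = np.floor(gray / gray.max() * (levels - 1)).astype(int)   # quantize
          glcm = np.zeros((levels, levels), dtype=np.float64)
          left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
          np.add.at(glcm, (left, right), 1.0)                          # count pixel pairs
          glcm /= glcm.sum()
          i, j = np.indices(glcm.shape)
          return float(np.sum(glcm * (i - j) ** 2))

      patch = np.random.rand(64, 64)   # stand-in image patch
      print(glcm_contrast(patch))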

  20. Airborne multi-spectral remote sensing with ground truth for areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scientists and researchers have been developing, integrating, and evaluating multiple strategies and technologies into a systems approach for management of field crop insect pests. Remote sensing along with Global Positioning Systems, Geographic Information Systems, and variable rate technology are...

  1. Use of Airborne Multi-Spectral Imagery in Pest Management Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scientists and researchers have been developing, integrating, and evaluating multiple strategies and technologies into a systems approach for management of field crop insect pests. Remote sensing along with Global Positioning Systems, Geographic Information Systems, and variable rate technology are...

  2. Multispectral Imaging Systems for Airborne Remote Sensing to Support Agricultural Production Management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing has shown promise as a tool for managing agricultural application and production. Earth-observing satellite systems have an advantage for large-scale analysis at regional levels but are limited in spatial resolution. High-resolution satellite systems have been available in recent year...

  3. Multispectral hypercolorimetry and automatic guided pigment identification: some masterpieces case studies

    NASA Astrophysics Data System (ADS)

    Melis, Marcello; Miccoli, Matteo; Quarta, Donato

    2013-05-01

    A couple of years ago we proposed, in this same session, an extension to standard colorimetry (CIE '31) that we called hypercolorimetry. It was based on an even sampling of the 300-1000 nm wavelength range, with the definition of 7 hypercolor matching functions optimally shaped to minimize metamerism. Since then we have consolidated the approach through a large number of multispectral analyses and specialized the system for non-invasive diagnosis of paintings and frescoes. In this paper we describe the whole process, from the multispectral image acquisition to the final 7-band computation, and we show the results on paintings by masters of colour. We describe and propose a systematic approach to non-invasive diagnosis that is able to turn a subjective analysis into a repeatable measurement independent of the specific lighting conditions and of the specific acquisition system. Along with hypercolorimetry and its consolidation in the field of non-invasive diagnosis, we have also developed a standard spectral reflectance database of pure pigments and of pigments painted with different binders. As we will see, this database can be compared to the reflectances of the painting to help the diagnostician identify the proper material. We used a Nikon D800FR (Full Range) camera, a 36-megapixel reflex camera modified under a Nikon/Profilocolore joint project to achieve a 300-1000 nm sensitivity range. The large amount of data allowed us to perform very accurate pixel comparisons based on their spectral reflectance. All the original pigments and their binders were provided by the Opificio delle Pietre Dure, Firenze, Italy, while the analyzed masterpieces belong to the collection of the Pinacoteca Nazionale of Bologna, Italy.

  4. Airborne field strength monitoring

    NASA Astrophysics Data System (ADS)

    Bredemeyer, J.; Kleine-Ostmann, T.; Schrader, T.; Münter, K.; Ritter, J.

    2007-06-01

    In civil and military aviation, ground-based navigation aids (NAVAIDS) are still crucial for flight guidance even though the acceptance of satellite-based systems (GNSS) is increasing. Part of the calibration process for NAVAIDS (ILS, DME, VOR) is to perform a flight inspection according to the methods specified in a document (DOC8071, 2000) by the International Civil Aviation Organization (ICAO). One major task is to determine the coverage or, in other words, the true signal-in-space field strength of a ground transmitter. This has always been a challenge for flight inspection because, especially in the L-band (DME, 1 GHz), the installed antenna performance was known only with an uncertainty of 10 dB or even more. In order to meet ICAO's required accuracy of ±3 dB, it is necessary to have a precise 3-D antenna factor of the receiving antenna operating on the airborne platform, including all losses and impedance mismatching. Introducing precise, effective antenna factors into flight inspection to achieve the required accuracy is new and has not yet been published in the relevant literature. The authors try to establish a new, balanced procedure between simulation and validation by airborne and ground measurements. This involves the interpretation of measured scattering parameters obtained both on the ground and airborne in comparison with numerical results obtained by the multilevel fast multipole algorithm (MLFMA) accelerated method of moments (MoM) using a complex geometric model of the aircraft. First results are presented in this paper.
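
    As a minimal sketch of how a calibrated antenna factor enters the field-strength determination, the function below applies the standard relation E[dBuV/m] = U[dBuV] + AF[dB/m] + cable loss; the numerical values are illustrative assumptions, not measurements from the paper.

      # Minimal sketch: converting a measured receiver voltage to field strength
      # using a calibrated antenna factor and cable loss (values are illustrative).
      def field_strength_dbuv_per_m(u_dbuv: float, antenna_factor_db: float,
                                    cable_loss_db: float = 0.0) -> float:
          return u_dbuv + antenna_factor_db + cable_loss_db

      print(field_strength_dbuv_per_m(u_dbuv=47.0, antenna_factor_db=22.5,
                                      cable_loss_db=1.8))   # -> 71.3 dBuV/m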

  5. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation need to be implemented for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ in Visual Studio 2010. Experimental results show that the system realizes acquisition and display for both cameras.
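
    A minimal sketch of a coordinate transformation with sub-pixel (bilinear) interpolation of the kind needed to display a non-rectangular pixel layout on a regular grid is given below; the log-polar source geometry is assumed purely for illustration and does not describe the actual retina-like sensor layout, and the MIL/OpenCV-based implementation of the paper is not reproduced.

      # Minimal sketch: remap an image through a coordinate transform using bilinear
      # (sub-pixel) interpolation. A log-polar mapping is assumed for illustration.
      import numpy as np

      def bilinear_sample(img: np.ndarray, x: np.ndarray, y: np.ndarray) -> np.ndarray:
          """Sample img at floating-point (x, y) positions with bilinear weights."""
          x0 = np.clip(np.floor(x).astype(int), 0, img.shape[1] - 2)
          y0 = np.clip(np.floor(y).astype(int), 0, img.shape[0] - 2)
          dx, dy = x - x0, y - y0
          return ((1 - dx) * (1 - dy) * img[y0, x0] + dx * (1 - dy) * img[y0, x0 + 1]
                  + (1 - dx) * dy * img[y0 + 1, x0] + dx * dy * img[y0 + 1, x0 + 1])

      src = np.random.rand(256, 256)                      # stand-in source image
      rows, cols = np.indices((200, 200), dtype=float)
      r = np.exp(rows / 199 * np.log(127.0))              # log-polar radius, 1..127
      theta = cols / 200 * 2 * np.pi
      x, y = 128 + r * np.cos(theta), 128 + r * np.sin(theta)
      display = bilinear_sample(src, x, y)                # resampled display image
      print(display.shape)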

  6. Multispectral Filter Arrays: Recent Advances and Practical Implementation

    PubMed Central

    Lapray, Pierre-Jean; Wang, Xingbo; Thomas, Jean-Baptiste; Gouton, Pierre

    2014-01-01

    Thanks to technical progress in interference filter design based on different technologies, we can finally successfully implement the concept of multispectral filter array-based sensors. This article provides the relevant state of the art for multispectral imaging systems and presents the characteristics of the elements of our multispectral sensor as a case study. The spectral characteristics are based on two different spatial arrangements that distribute eight different bandpass filters in the visible and near-infrared area of the spectrum. We demonstrate that the system is viable and evaluate its performance through sensor spectral simulation. PMID:25407904

  7. Lattice algebra approach to multispectral analysis of ancient documents.

    PubMed

    Valdiviezo-N, Juan C; Urcid, Gonzalo

    2013-02-01

    This paper introduces a lattice algebra procedure that can be used for the multispectral analysis of historical documents and artworks. Assuming the presence of linearly mixed spectral pixels captured in a multispectral scene, the proposed method computes the scaled min- and max-lattice associative memories to determine the purest pixels that best represent the spectra of single pigments. The estimation of fractional proportions of pure spectra at each image pixel is used to build pigment abundance maps that can be used for subsequent restoration of damaged parts. Application examples include multispectral images acquired from the Archimedes Palimpsest and a Mexican pre-Hispanic codex.
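
    To illustrate the abundance-estimation step only, the sketch below solves a per-pixel non-negative least squares unmixing problem given candidate endmember spectra; the lattice associative memory endmember extraction described in the paper is not reproduced, and the solver choice and synthetic spectra are assumptions.

      # Minimal sketch of per-pixel abundance estimation once candidate pure
      # (endmember) spectra are known, using non-negative least squares.
      import numpy as np
      from scipy.optimize import nnls

      bands, n_endmembers = 8, 3
      endmembers = np.random.rand(bands, n_endmembers)     # columns = pure spectra
      pixel = endmembers @ np.array([0.6, 0.3, 0.1]) + 0.01 * np.random.rand(bands)

      abundances, _ = nnls(endmembers, pixel)              # non-negative fractions
      abundances /= abundances.sum()                       # optional sum-to-one step
      print(abundances)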

  8. Eliminate background interference from latent fingerprints using ultraviolet multispectral imaging

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Xu, Xiaojing; Wang, Guiqiang

    2014-02-01

    Fingerprints are the most important evidence at a crime scene, and the development of latent fingerprints is one of the most active research areas in forensic science. Recently, multispectral imaging, which has shown great capability in fingerprint development, questioned document examination, and trace evidence examination, has been used in detecting material evidence. This paper studied how to eliminate background interference from latent fingerprints on non-porous and porous surfaces using rotating-filter-wheel ultraviolet multispectral imaging. The results confirmed that background interference can be clearly removed from latent fingerprints using multispectral imaging in the ultraviolet band.

  9. Crop, soil, and geological mapping from digitized multispectral satellite photography.

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.; Kristof, S. J.; Levandowski, D. W.; Phillips, T. L.; Macdonald, R. B.

    1971-01-01

    An experimental study was conducted of digitized multispectral satellite photography to seek answers to the following two questions: what are the data handling problems and requirements of converting photographic density measurements to a usable digital form, and what surface features can be distinguished using multispectral data taken at satellite altitudes. Results include the digitization of three multiband black and white photographs and a color infrared photograph, the conversion of the results of digitization to a useful digital form, and several data analysis experiments. As a whole, they encourage the use of multiband photography as a multispectral data collection instrument.

  10. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

  11. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  12. Airborne Submillimeter Spectroscopy

    NASA Technical Reports Server (NTRS)

    Zmuidzinas, J.

    1998-01-01

    This is the final technical report for NASA-Ames grant NAG2-1068 to Caltech, entitled "Airborne Submillimeter Spectroscopy", which extended over the period May 1, 1996 through January 31, 1998. The grant was funded by the NASA airborne astronomy program, during a period of time after the Kuiper Airborne Observatory was no longer operational. Instead, this funding program was intended to help develop instrument concepts and technology for the upcoming SOFIA (Stratospheric Observatory for Infrared Astronomy) project. SOFIA, which is funded by NASA and is now being carried out by a consortium led by USRA (Universities Space Research Association), will be a 747 aircraft carrying a 2.5 meter diameter telescope. The purpose of our grant was to fund the ongoing development of sensitive heterodyne receivers for the submillimeter band (500-1200 GHz), using sensitive superconducting (SIS) detectors. In 1997 July we submitted a proposal to USRA to construct a heterodyne instrument for SOFIA. Our proposal was successful [1], and we are now continuing our airborne astronomy effort with funding from USRA. A secondary purpose of the NAG2-1068 grant was to continue the analysis of astronomical data collected with an earlier instrument which was flown on the NASA Kuiper Airborne Observatory (KAO). The KAO instrument and the astronomical studies which were carried out with it were supported primarily under another grant, NAG2-744, which extended over October 1, 1991 through January 31, 1997. For a complete description of the astronomical data and its analysis, we refer the reader to the final technical report for NAG2-744, which was submitted to NASA on December 1, 1997. Here we report on the SIS detector development effort for SOFIA carried out under NAG2-1068. The main result of this effort has been the demonstration of SIS mixers using a new superconducting material, niobium titanium nitride (NbTiN), which promises to deliver dramatic improvements in sensitivity in the 700

  13. LSST Camera Optics

    SciTech Connect

    Olivier, S S; Seppala, L; Gilmore, K; Hale, L; Whistler, W

    2006-06-05

    The Large Synoptic Survey Telescope (LSST) is a unique, three-mirror, modified Paul-Baker design with an 8.4m primary, a 3.4m secondary, and a 5.0m tertiary feeding a camera system that includes corrector optics to produce a 3.5 degree field of view with excellent image quality (<0.3 arcsecond 80% encircled diffracted energy) over the entire field from blue to near infra-red wavelengths. We describe the design of the LSST camera optics, consisting of three refractive lenses with diameters of 1.6m, 1.0m and 0.7m, along with a set of interchangeable, broad-band, interference filters with diameters of 0.75m. We also describe current plans for fabricating, coating, mounting and testing these lenses and filters.

  14. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  15. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  16. Airborne reconnaissance VIII; Proceedings of the meeting, San Diego, CA, August 21, 22, 1984

    SciTech Connect

    Henkel, P.; Lagesse, F.R.

    1984-01-01

    Various papers on sensors and ancillary equipment, technological advances, development and testing, and intelligence extraction and exploitation in airborne reconnaissance are presented. The topics discussed include: the CA-810 modern trilens camera, PC-183B standoff imaging system, ruggedized MMW radiometer sensor for surveillance applications, application of biocular viewers to airborne reconnaissance, KA-102 film/EO standoff system, KS-146A camera development and flight test results, electrooptical imaging for film cameras, and new generation advanced IR linescan sensor system. Also addressed are: evolution of real time airborne reconnaissance, computer-controlled operation of reconnaissance cameras, miniature focus sensor, microprocessor-controller autofocus system, camera flight tests and image evaluation, LM-230A cost-effective test system, information management for tactical reconnaissance, performance modeling of infrared linescanners and FLIRs, USAF tactical reconnaissance - Grenada, sensor control and film annotation for long-range standoff reconnaissance, laser beam recording on film, meteorological effects on image quality, and optimization of photographic information transfer by CRT.

  17. Hemispherical Laue camera

    DOEpatents

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

  18. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type and intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  19. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type and intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  20. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.