Lunar Reconnaissance Orbiter Camera (LROC) instrument overview
Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.
2010-01-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a seven-color push-frame camera (100 m/pixel visible and 400 m/pixel UV), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and permanently sunlit regions, meter-scale mapping of the polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.
Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images
NASA Astrophysics Data System (ADS)
Awumah, Anna; Mahanti, Prasun; Robinson, Mark
2016-10-01
Image fusion enhances the sharpness of a multispectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Applications of image fusion to planetary images are rare, although the technique is well established in Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performance was verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multispectral (HRMS) product from LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm yields the product with the highest spatial quality, while the wavelet-based algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-wavelet image fusion algorithm applied to LROC MS images. The hybrid method provides the best HRMS product, both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and J. L. Van Genderen. "Review article: Multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Y. "Understanding image fusion." Photogrammetric Engineering & Remote Sensing 70.6 (2004): 657-661. [3] Mahanti, P., et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." XXIII ISPRS Congress Archives (2016).
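For readers unfamiliar with the building block named above, the following minimal Python sketch shows the additive form of IHS pansharpening that both the plain and hybrid methods start from. It is an illustrative sketch assuming a 3-band float MS image already resampled to the Pan grid, not the authors' exact pipeline.

```python
# Minimal additive-IHS pansharpening sketch. Illustrative only: assumes a 3-band
# float MS image in [0, 1] already co-registered and resampled to the Pan grid.
import numpy as np

def ihs_fuse(ms, pan):
    """ms: (H, W, 3) upsampled multispectral image; pan: (H, W) panchromatic image."""
    intensity = ms.mean(axis=2)                       # I component of a simple IHS transform
    # Match the Pan statistics to the intensity so substitution does not shift brightness.
    pan_matched = (pan - pan.mean()) * (intensity.std() / pan.std()) + intensity.mean()
    detail = pan_matched - intensity                  # spatial detail to inject
    return np.clip(ms + detail[..., None], 0.0, 1.0)  # additive form of IHS substitution
```

The hybrid approach keeps this substitution structure but draws the injected detail from a wavelet decomposition of the Pan/intensity pair, which is what helps preserve the spectral content.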
Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera data
Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.
2012-01-01
We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide Angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.
The Panoramic Camera (PanCam) Instrument for the ESA ExoMars Rover
NASA Astrophysics Data System (ADS)
Griffiths, A.; Coates, A.; Jaumann, R.; Michaelis, H.; Paar, G.; Barnes, D.; Josset, J.
The recently approved ExoMars rover is the first element of the ESA Aurora programme and is slated to deliver the Pasteur exobiology payload to Mars by 2013. The 0.7 kg Panoramic Camera will provide multispectral stereo images with a 65° field-of-view (1.1 mrad/pixel) and high-resolution (85 µrad/pixel) monoscopic "zoom" images with a 5° field-of-view. The stereo Wide Angle Cameras (WACs) are based on Beagle 2 Stereo Camera System heritage. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission, as well as providing multispectral geological imaging, colour and stereo panoramic images, solar images for water vapour abundance and dust optical depth measurements, and observations of retrieved subsurface samples before ingestion into the rest of the Pasteur payload. Additionally, the High Resolution Camera (HRC) can be used for high-resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls.
Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera
NASA Astrophysics Data System (ADS)
Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven-color push-frame imager with a 90° field of view in monochrome mode and a 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength-dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters, and sub-pixel precision and accuracy when orthorectifying WAC images.
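The wavelength-dependent radial distortion model mentioned above can be pictured as a standard Brown-style radial model whose coefficients are looked up per filter. The sketch below is a hedged illustration; the coefficient values and band list are placeholders, not the published LROC WAC calibration.

```python
# Hedged sketch of a wavelength-dependent radial distortion model. The Brown-style
# form is standard; the per-band coefficients below are placeholders only.
import numpy as np

def apply_radial(x, y, k1, k2):
    """Map ideal focal-plane coordinates to distorted ones with a radial model."""
    r2 = x**2 + y**2
    scale = 1.0 + k1 * r2 + k2 * r2**2
    return x * scale, y * scale

# Hypothetical coefficients keyed by filter wavelength (nm): a wavelength-dependent
# model simply selects the radial coefficients for the band being projected.
K_BY_BAND_NM = {415: (-0.021, 8e-4), 566: (-0.019, 7e-4), 689: (-0.018, 7e-4)}

def apply_radial_band(x, y, band_nm):
    k1, k2 = K_BY_BAND_NM[band_nm]
    return apply_radial(x, y, k1, k2)
```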
Geometric Calibration and Radiometric Correction of the Maia Multispectral Camera
NASA Astrophysics Data System (ADS)
Nocerino, E.; Dubbini, M.; Menna, F.; Remondino, F.; Gattelli, M.; Covi, D.
2017-10-01
Multispectral imaging is a widely used remote sensing technique, whose applications range from agriculture to environmental monitoring, and from food quality checks to cultural heritage diagnostics. A variety of multispectral imaging sensors are available on the market, many of them designed to be mounted on different platforms, especially small drones. This work focuses on the geometric and radiometric characterization of a brand-new, lightweight, low-cost multispectral camera, called MAIA. The MAIA camera is equipped with nine sensors, allowing for the acquisition of images in the visible and near-infrared parts of the electromagnetic spectrum. Two versions are available, characterised by different sets of band-pass filters, inspired by the sensors mounted on the WorldView-2 and Sentinel-2 satellites, respectively. The camera details and the developed procedures for geometric calibration and radiometric correction are presented in the paper.
The Beagle 2 Stereo Camera System: Scientific Objectives and Design Characteristics
NASA Astrophysics Data System (ADS)
Griffiths, A.; Coates, A.; Josset, J.; Paar, G.; Sims, M.
2003-04-01
The Stereo Camera System (SCS) will provide wide-angle (48 degree) multispectral stereo imaging of the Beagle 2 landing site in Isidis Planitia with an angular resolution of 0.75 milliradians. Based on the Space-X Modular Micro-Imager, the SCS is composed of twin cameras (each with a 1024 by 1024 pixel frame-transfer CCD) and twin filter wheel units (with a combined total of 24 filters). The primary mission objective is to construct a digital elevation model of the area within reach of the lander's robot arm. The SCS specifications and the following baseline studies are described: panoramic RGB colour imaging of the landing site, and panoramic multispectral imaging at 12 distinct wavelengths to study the mineralogy of the landing site; solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged are multispectral observations of Phobos and Deimos (observations of the moons relative to background stars will be used to determine the lander's location and orientation relative to the Martian surface), monitoring of the landing site to detect temporal changes, observation of the actions and effects of the other PAW experiments (including rock texture studies with a close-up lens), and collaborative observations with the Mars Express orbiter instrument teams. Due to be launched in May of this year, the total system mass is 360 g, the required volume envelope is 747 cm^3, and the average power consumption is 1.8 W. A 10 Mbit/s RS422 bus connects each camera to the lander common electronics.
NASA Astrophysics Data System (ADS)
Neukum, Gerhard; Jaumann, Ralf; Scholten, Frank; Gwinner, Klaus
2017-11-01
At the Institute of Space Sensor Technology and Planetary Exploration of the German Aerospace Center (DLR), the High Resolution Stereo Camera (HRSC) has been designed for international missions to the planet Mars. For more than three years an airborne version of this camera, the HRSC-A, has been successfully used in many flight campaigns and in a variety of different applications. It combines 3D capabilities and high resolution with multispectral data acquisition. Variable resolutions can be generated, depending on the camera control settings. A high-end GPS/INS system, in combination with the multi-angle image information, yields precise, high-frequency orientation data for the acquired image lines. To handle these data, a completely automated photogrammetric processing system has been developed, which makes it possible to generate multispectral 3D image products for large areas, with planimetric and height accuracies in the decimeter range. This accuracy has been confirmed by detailed investigations.
Objective for monitoring the corona discharge
NASA Astrophysics Data System (ADS)
Obrezkov, Andrey; Rodionov, Andrey Yu.; Pisarev, Viktor N.; Chivanov, Alexsey N.; Baranov, Yuri P.; Korotaev, Valery V.
2016-04-01
Remote optoelectronic probing is one of the most pressing aspects of overhead power line maintenance. By installing such systems on a helicopter, for example, it becomes possible to monitor the status of overhead transmission lines and to search for damaged parts of the lines. Thermal and UV cameras are used for more effective diagnostics. UV systems are fitted with filters that attenuate the visible spectrum, which is an undesired signal, and they have a wide view angle for better coverage and proper diagnostics. Effectiveness improves further when several spectral channels, such as UV and IR, are used together; such spectral selection provides good noise reduction. Experimental results on the spectral parameters of a wide-view-angle multispectral objective for such systems are provided in this report, together with point spread function data, UV and IR scattering index data, and technical requirements for detectors.
COBRA ATD multispectral camera response model
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of the five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data fitting techniques were applied to these measured response curves to obtain nonlinear expressions that estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. The COBRA camera response model proved to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. This practical camera model was subsequently incorporated into the COBRA toolbox of sensor performance evaluation and computational tools for research analysis, in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
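As a hedged illustration of the kind of data fitting described above, the sketch below fits a nonlinear response function DN = f(E, G, t) to synthetic laboratory-style measurements with SciPy. The functional form and all numbers are assumptions for demonstration, not the published COBRA model.

```python
# Hedged sketch of response-curve fitting: DN = a * (E*t)**gamma * exp(b*G) + dn0,
# where E is irradiance, G intensifier gain, and t exposure time. The form and the
# numbers are illustrative assumptions, not the published COBRA model.
import numpy as np
from scipy.optimize import curve_fit

def response(X, a, gamma, b, dn0):
    E, G, t = X
    return a * (E * t) ** gamma * np.exp(b * G) + dn0  # analytically invertible in E

# Synthetic "laboratory" data generated from the model itself, with noise added.
rng = np.random.default_rng(0)
E = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
G = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])
t = np.full(6, 1e-3)
dn = response((E, G, t), 8e4, 0.95, 1.1, 12.0) + rng.normal(0.0, 2.0, 6)

params, _ = curve_fit(response, (E, G, t), dn, p0=(1e5, 1.0, 1.0, 10.0))
```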
Photometric Observations of Soils and Rocks at the Mars Exploration Rover Landing Sites
NASA Technical Reports Server (NTRS)
Johnson, J. R.; Arvidson, R. A.; Bell, J. F., III; Farrand, W.; Guinness, E.; Johnson, M.; Herkenhoff, K. E.; Lemmon, M.; Morris, R. V.; Seelos, F., IV
2005-01-01
The Panoramic Cameras (Pancam) on the Spirit and Opportunity Mars Exploration Rovers have acquired multispectral reflectance observations of rocks and soils at different incidence, emission, and phase angles that will be used for photometric modeling of surface materials. Phase angle coverage at both sites extends from approx. 0 deg. to approx. 155 deg.
Adaptive illumination source for multispectral vision system applied to material discrimination
NASA Astrophysics Data System (ADS)
Conde, Olga M.; Cobo, Adolfo; Cantero, Paulino; Conde, David; Mirapeix, Jesús; Cubillas, Ana M.; López-Higuera, José M.
2008-04-01
A multispectral system based on a monochrome camera and an adaptive illumination source is presented in this paper. Its preliminary application is focused on material discrimination for the food and beverage industries, where monochrome, color, and infrared imaging have been successfully applied for this task. This work proposes a different approach, in which the relevant wavelengths for the required discrimination task are selected in advance using a Sequential Forward Floating Selection (SFFS) algorithm. A light source based on light-emitting diodes (LEDs) at these wavelengths then sequentially illuminates the material under analysis, and the resulting images are captured by a CCD camera with spectral response across the entire range of the selected wavelengths. Finally, the several multispectral planes obtained are processed using a Spectral Angle Mapping (SAM) algorithm, whose output is the desired material classification. Among other advantages, this approach of controlled, specific illumination produces multispectral imaging with a simple monochrome camera and cold illumination restricted to the relevant wavelengths, which is desirable in the food and beverage industry. The proposed system has been tested successfully for the automatic detection of foreign objects in the tobacco processing industry.
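The SAM step lends itself to a compact implementation: each pixel spectrum is compared with each reference spectrum by the angle between them, and the smallest angle wins. The following Python sketch assumes an (H, W, B) cube of multispectral planes and a (C, B) array of reference spectra; the shapes and names are illustrative.

```python
# Minimal Spectral Angle Mapping sketch: each pixel is assigned the class of the
# reference spectrum with the smallest spectral angle.
import numpy as np

def sam_classify(cube, refs):
    """cube: (H, W, B) multispectral planes; refs: (C, B) reference spectra."""
    pixels = cube.reshape(-1, cube.shape[-1]).astype(float)
    pn = pixels / (np.linalg.norm(pixels, axis=1, keepdims=True) + 1e-12)
    rn = refs / (np.linalg.norm(refs, axis=1, keepdims=True) + 1e-12)
    angles = np.arccos(np.clip(pn @ rn.T, -1.0, 1.0))   # (N, C) spectral angles
    return angles.argmin(axis=1).reshape(cube.shape[:2])
```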
NASA Astrophysics Data System (ADS)
Schmitz, Nicole; Jaumann, Ralf; Coates, Andrew; Griffiths, Andrew; Hauber, Ernst; Trauthan, Frank; Paar, Gerhard; Barnes, Dave; Bauer, Arnold; Cousins, Claire
2010-05-01
Geologic context, as a combination of orbital imaging and surface vision, including range, resolution, stereo, and multispectral imaging, is commonly regarded as a basic requirement for remote robotic geology and forms the first tier of any multi-instrument strategy for investigating and eventually understanding the geology of a region from a robotic platform. Missions with objectives beyond a pure geologic survey, e.g. exobiology objectives, require goal-oriented operational procedures, where the iterative process of scientific observation, hypothesis, testing, and synthesis, performed via a sol-by-sol data exchange with a remote robot, is supported by a powerful vision system. Beyond allowing a thorough geological mapping of the surface (soil, rocks, and outcrops) in 3D using wide angle stereo imagery, such a system needs to be able to provide detailed visual information on targets of interest at high resolution, thereby enabling the selection of science targets and samples for further analysis with a specialized in-situ instrument suite. Surface vision for ESA's upcoming ExoMars rover will come from a dedicated Panoramic Camera System (PanCam). As an integral part of the Pasteur payload package, the PanCam is designed to support the search for evidence of biological processes by obtaining wide angle multispectral stereoscopic panoramic images and high resolution RGB images from the mast of the rover [1]. The camera system will consist of two identical wide-angle cameras (WACs), arranged on a common pan-tilt mechanism with a fixed stereo base length of 50 cm. The WACs are complemented by a High Resolution Camera (HRC), mounted between the WACs, which allows magnification of selected targets by a factor of ~8 with respect to the wide-angle optics. The high-resolution images, together with the multispectral and stereo capabilities of the camera, will be of unprecedented quality for the identification of water-related surface features (such as sedimentary rocks) and form one key to a successful implementation of ESA's multi-level strategy for the ExoMars Reference Surface Mission. A dedicated PanCam Science Implementation Strategy is under development, which connects the PanCam science objectives and the needs of the ExoMars Surface Mission with the required investigations, the planned measurement approach and sequence, and the connected mission requirements. The first step of this strategy is obtaining geological context to enable the decision of where to send the rover. PanCam (in combination with WISDOM) will be used to obtain ground truth through a thorough geomorphologic mapping of the ExoMars rover's surroundings at near and far range, in the form of (1) RGB or monochromatic full (i.e. 360°) or partial stereo panoramas for morphologic and textural information and stereo ranging, (2) mosaics or single images with partial or full multispectral coverage to assess the mineralogy of surface materials as well as their weathering state and possible past or present alteration processes, and (3) small-scale high-resolution information on targets/features of interest and on distant or inaccessible sites. This general survey phase will lead to the identification of surface features such as outcrops, ridges, and troughs, and the characterization of different rock and surface units based on their morphology, distribution, and spectral and physical properties.
Evidence of water-bearing minerals, water-altered rocks, or even water-lain sediments seen in the large-scale wide angle images will then allow preselection of those targets/features considered relevant for detailed analysis and definition of their geologic context. Detailed characterization and, subsequently, selection of those preselected targets/features for further analysis will then be enabled by color high-resolution imagery, followed by the next tier of contact instruments, to enable a decision on whether or not to acquire samples for further analysis. During the following drill/analysis phase, PanCam's High Resolution Camera will characterize the sample in the sample tray and observe the sample discharge into the Core Sample Transfer Mechanism. Key parts of this science strategy have been tested under laboratory conditions in two geology blind tests [2] and during two field test campaigns in Svalbard, using simulated mission conditions, an ExoMars-representative payload (ExoMars and MSL instrument breadboards), and Mars analog settings [3, 4]. The experiences gained are being translated into operational sequences and, together with the science implementation strategy, form a first version of a PanCam Surface Operations plan. References: [1] Griffiths, A.D. et al. (2006) International Journal of Astrobiology 5 (3): 269-275, doi:10.1017/S1473550406003387. [2] Pullan, D. et al. (2009) EPSC Abstracts, Vol. 4, EPSC2009-514. [3] Schmitz, N. et al. (2009) Geophysical Research Abstracts, Vol. 11, EGU2009-10621-2. [4] Cousins, C. et al. (2009) EPSC Abstracts, Vol. 4, EPSC2009-813.
Field Test of the ExoMars Panoramic Camera in the High Arctic - First Results and Lessons Learned
NASA Astrophysics Data System (ADS)
Schmitz, N.; Barnes, D.; Coates, A.; Griffiths, A.; Hauber, E.; Jaumann, R.; Michaelis, H.; Mosebach, H.; Paar, G.; Reissaus, P.; Trauthan, F.
2009-04-01
The ExoMars mission, as the first element of the ESA Aurora program, is scheduled to be launched to Mars in 2016. Part of the Pasteur exobiology payload onboard the ExoMars rover is a Panoramic Camera System ('PanCam') designed to obtain high-resolution color and wide-angle multispectral stereoscopic panoramic images from the mast of the ExoMars rover. The PanCam instrument consists of two wide-angle cameras (WACs), which will provide multispectral stereo images with a 34° field of view (FOV), and a High Resolution RGB Channel (HRC) to provide close-up images with a 5° field of view. For field testing of the PanCam breadboard in a representative environment, the ExoMars PanCam team joined the 6th Arctic Mars Analogue Svalbard Expedition (AMASE) in 2008. The expedition took place from 4-17 August 2008 in the Svalbard archipelago, Norway, which is considered an excellent analogue to ancient Mars. 31 scientists and engineers involved in Mars exploration (among them the ExoMars WISDOM, MIMA, and Raman-LIBS teams as well as several NASA MSL teams) combined their knowledge, instruments, and techniques to study the geology, geophysics, biosignatures, and life forms that can be found in volcanic complexes, warm springs, subsurface ice, and sedimentary deposits. This work was carried out using instruments, a rover (NASA's CliffBot), and techniques that will or may be used in future planetary missions, thereby providing the capability to simulate a full mission environment in a Mars analogue terrain. Besides demonstrating PanCam's general functionality in a field environment, a main objective was to test and verify the interpretability of PanCam data for in-situ geological context determination and scientific target selection. To process the collected data, a first version of the preliminary PanCam 3D reconstruction processing and visualization chain was used. Other objectives included testing and refining the operational scenario (based on the ExoMars Rover Reference Surface Mission), investigating data commonalities and data fusion potential with respect to other instruments, and collecting representative image data to evaluate various influences, such as viewing distance, surface structure, and the availability of structures at "infinity" (e.g., resolution, focus quality, and the associated accuracy of the 3D reconstruction). Airborne images with the HRSC-AX camera (an airborne camera with heritage from the Mars Express High Resolution Stereo Camera HRSC), collected during a flight campaign over Svalbard in June 2008, provided large-scale geological context information for all field sites.
NASA Technical Reports Server (NTRS)
2005-01-01
During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles). This image is a narrow-angle clear-filter image which was processed to enhance the contrast in brightness and sharpness of visible features. Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of this image. This image was obtained when the Cassini spacecraft was above 25 degrees south, 134 degrees west latitude and longitude. The Sun-Mimas-spacecraft angle was 45 degrees and north is at the top. The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo. For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .
Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.
2010-01-01
Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high-resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 FireWire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light-emitting diodes, which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software's framework and provide details to guide users in the development of this and similar software. PMID:21258475
Robust and adaptive band-to-band image transform of UAS miniature multi-lens multispectral camera
NASA Astrophysics Data System (ADS)
Jhan, Jyun-Ping; Rau, Jiann-Yeou; Haala, Norbert
2018-03-01
Utilizing miniature multispectral (MS) or hyperspectral (HS) cameras mounted on an Unmanned Aerial System (UAS) has the benefits of convenience and flexibility for collecting remote sensing imagery for precision agriculture, vegetation monitoring, and environmental investigation applications. Most miniature MS cameras adopt a multi-lens structure to record discrete MS bands of visible and invisible information. The differences in lens distortion, mounting positions, and viewing angles among lenses mean that the acquired original MS images have significant band misregistration errors. We have developed a Robust and Adaptive Band-to-Band Image Transform (RABBIT) method for dealing with the band co-registration of various types of miniature multi-lens multispectral cameras (Mini-MSCs), to obtain band co-registered MS imagery for remote sensing applications. RABBIT utilizes a modified projective transformation (MPT) to transfer the multiple image geometries of a multi-lens imaging system to one sensor geometry, and combines this with a robust and adaptive correction (RAC) procedure to correct several systematic errors and obtain sub-pixel accuracy. This study applies three state-of-the-art Mini-MSCs to evaluate the RABBIT method's performance: the Tetracam Miniature Multiple Camera Array (MiniMCA), the Micasense RedEdge, and the Parrot Sequoia. Six MS datasets, acquired at different target distances, dates, and locations, are also used to prove its reliability and applicability. The results prove that RABBIT is feasible for different types of Mini-MSCs, with accurate, robust, and rapid image processing.
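The core geometric idea, transferring every lens's image into one master band's sensor geometry, reduces in the simplest case to warping each slave band by a 3x3 projective transform. The OpenCV sketch below shows that final warping step under an assumed, placeholder homography; RABBIT's modified projective transformation and its robust correction stage go well beyond this.

```python
# Sketch of the final co-registration step: once a band-to-master transform is
# known, the slave band is warped onto the master band's geometry. The homography
# values are hypothetical placeholders, not a calibrated transform.
import cv2
import numpy as np

H_band = np.array([[1.001, 0.002, -3.1],
                   [-0.001, 0.999, 2.4],
                   [1e-6, -2e-6, 1.0]])  # hypothetical NIR-to-green band transform

def coregister(slave_band, master_shape, H):
    h, w = master_shape
    return cv2.warpPerspective(slave_band, H, (w, h), flags=cv2.INTER_LINEAR)
```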
In-Flight performance of MESSENGER's Mercury dual imaging system
Hawkins, S.E.; Murchie, S.L.; Becker, K.J.; Selby, C.M.; Turner, F.S.; Noble, M.W.; Chabot, N.L.; Choo, T.H.; Darlington, E.H.; Denevi, B.W.; Domingue, D.L.; Ernst, C.M.; Holsclaw, G.M.; Laslo, N.R.; Mcclintock, W.E.; Prockter, L.M.; Robinson, M.S.; Solomon, S.C.; Sterner, R.E.
2009-01-01
The Mercury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft, launched in August 2004 and planned for insertion into orbit around Mercury in 2011, has already completed two flybys of the innermost planet. The Mercury Dual Imaging System (MDIS) acquired nearly 2500 images from the first two flybys and viewed portions of Mercury's surface not viewed by Mariner 10 in 1974-1975. Mercury's proximity to the Sun and its slow rotation present challenges to the thermal design for a camera on an orbital mission around Mercury. In addition, strict limitations on spacecraft pointing and the highly elliptical orbit create challenges in attaining coverage at desired geometries and relatively uniform spatial resolution. The instrument designed to meet these challenges consists of dual imagers, a monochrome narrow-angle camera (NAC) with a 1.5° field of view (FOV) and a multispectral wide-angle camera (WAC) with a 10.5° FOV, co-aligned on a pivoting platform. The focal-plane electronics of each camera are identical and use a 1024 × 1024 charge-coupled device detector. The cameras are passively cooled but use diode heat pipes and phase-change-material thermal reservoirs to maintain the thermal configuration during the hot portions of the orbit. Here we present an overview of the instrument design and how the design meets its technical challenges. We also review results from the first two flybys, discuss the quality of MDIS data from the initial periods of data acquisition and how that compares with requirements, and summarize how in-flight tests are being used to improve the quality of the instrument calibration. © 2009 SPIE.
Novel instrumentation of multispectral imaging technology for detecting tissue abnormity
NASA Astrophysics Data System (ADS)
Yi, Dingrong; Kong, Linghua
2012-10-01
Multispectral imaging is becoming a powerful tool in a wide range of biological and clinical studies by adding spectral, spatial, and temporal dimensions to visualize tissue abnormalities and the underlying biological processes. A conventional spectral imaging system includes two physically separated major components, a band-pass selection device (such as a liquid crystal tunable filter or a diffraction grating) and a scientific-grade monochromatic camera, and is expensive and bulky. Recently, micro-arrayed narrow-band optical mosaic filters were invented and successfully fabricated to reduce the size and cost of multispectral imaging devices, in order to meet the clinical requirements of medical diagnostic imaging applications. However, the challenging issue of how to integrate and place the micro filter mosaic chip on the targeting focal plane, i.e., the imaging sensor, of an off-the-shelf CMOS/CCD camera has not been reported anywhere. This paper presents the methods and results of integrating such a miniaturized filter with off-the-shelf CMOS imaging sensors to produce handheld real-time multispectral imaging devices for the application of early stage pressure ulcer (ESPU) detection. Unlike conventional multispectral imaging devices, which are bulky and expensive, the resulting handheld real-time multispectral ESPU detector can produce multiple images at different center wavelengths with a single shot, eliminating the image registration procedure required by traditional multispectral imaging technologies.
Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System
NASA Astrophysics Data System (ADS)
Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki
In this paper, we present an automatic vision-based traffic sign recognition system that can detect and classify traffic signs at long distance under different lighting conditions. The recognition is performed with an originally proposed dual-focal active camera system, in which a telephoto camera is equipped as an assistant to a wide angle camera. The telephoto camera can capture a high-resolution image of an object of interest in the field of view of the wide angle camera, providing enough information for recognition when the resolution of the traffic sign in the wide angle image is too low. In the proposed system, traffic sign detection and classification are processed separately on the images from the wide angle and telephoto cameras. In addition, to detect traffic signs against complex backgrounds under different lighting conditions, we propose a color transformation that is invariant to lighting changes; it highlights the pattern of traffic signs by reducing the complexity of the background. Based on this color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the wide angle image. After detection, the system actively captures a high-resolution image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on information from the wide angle camera. In classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-resolution telephoto image. Finally, a set of traffic sign recognition experiments based on the proposed system is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution under different lighting conditions.
Zheng, Haijing; Bai, Tingzhu; Wang, Quanxi; Cao, Fengmei; Shao, Long; Sun, Zhaotian
2018-01-01
This study experimentally investigates the multispectral characteristics of an unmanned aerial vehicle (UAV) at different observation angles. The UAV and its engine are tested on the ground in the cruise state. Spectral radiation intensities at different observation angles are obtained in the infrared band of 0.9–15 μm by a spectral radiometer. Meanwhile, infrared images are captured separately by long-wavelength infrared (LWIR), mid-wavelength infrared (MWIR), and short-wavelength infrared (SWIR) cameras. Additionally, orientation maps of the radiation area and radiance are obtained. The results suggest that the spectral radiation intensity of the UAV is determined by its exhaust plume and that the main infrared emission bands occur at 2.7 μm and 4.3 μm. At observation angles in the range of 0°–90°, the radiation area of the UAV in the MWIR band is greatest; at angles greater than 90°, the radiation area in the SWIR band is greatest. In addition, the radiance of the UAV at an angle of 0° is strongest. These conclusions can guide the development of IR stealth techniques for UAVs. PMID:29389880
Memoris, A Wide Angle Camera For Bepicolombo
NASA Astrophysics Data System (ADS)
Cremonese, G.; Memoris Team
In response to the ESA Announcement of Opportunity for the BepiColombo payload, we are working on a wide angle camera concept named MEMORIS (MErcury MOderate Resolution Imaging System). MEMORIS will perform stereoscopic imaging of the whole Mercury surface using two different channels at +/- 20 degrees from the nadir point. It will achieve a spatial resolution of 50 m per pixel at 400 km from the surface (peri-Herm), corresponding to a vertical resolution of about 75 m in stereo. The scientific objectives to be addressed by MEMORIS may be identified as follows: estimation of surface ages based on crater counting; crater morphology and degradation; stratigraphic sequence of geological units; identification of volcanic features and related deposits; origin of plain units from morphological observations; distribution and type of the tectonic structures; determination of relative ages among the structures based on cross-cutting relationships; 3D tectonics; global mineralogical mapping of the main geological units; and identification of weathering products. The last two items will come from the multispectral capabilities of the camera, utilizing 8 to 12 (TBD) broad-band filters. MEMORIS will be equipped with a further channel devoted to observations of the tenuous exosphere. It will look at the limb on a given arc of the BepiColombo orbit, thereby observing the exosphere above a surface latitude range of 25-75 degrees in the northern hemisphere. The exosphere images will be obtained above the surface just observed by the other two channels, in an attempt to find possible relationships, as ground-based observations suggest. The exospheric channel will have four narrow-band filters centered on the sodium and potassium emissions and the adjacent continua.
Determining fast orientation changes of multi-spectral line cameras from the primary images
NASA Astrophysics Data System (ADS)
Wohlfeil, Jürgen
2012-01-01
Fast orientation changes of airborne and spaceborne line cameras cannot always be avoided. In such cases it is essential to measure them with high accuracy to ensure a good quality of the resulting imagery products. Several approaches exist to support the orientation measurement using optical information received through the main objective/telescope. In this article an approach is proposed that allows the determination of non-systematic orientation changes between every captured line, requiring no additional camera hardware or onboard processing capabilities, only the payload images and a rough estimate of the camera's trajectory. The approach takes advantage of the typical geometry of multispectral line cameras, which have a set of linear sensor arrays for different spectral bands on the focal plane. First, homologous points are detected within the heavily distorted images of the different spectral bands. With their help a connected network of geometrical correspondences is built up, and this network is used to calculate the orientation changes of the camera at its full temporal and angular resolution. The approach was tested with an extensive set of aerial surveys covering a wide range of conditions, and achieved precise and reliable results.
Development of a portable multispectral thermal infrared camera
NASA Technical Reports Server (NTRS)
Osterwisch, Frederick G.
1991-01-01
The purpose of this research and development effort was to design and build a prototype instrument designated the Thermal Infrared Multispectral Camera (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument, designated the AA465, has applications in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0-13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5-micron-wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display, which the operator uses to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental capability has a wide variety of commercial and research applications. While ideally suited for two-person operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single-exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) exposure times; as such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument were performed from June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The instrument was delivered to NASA in March 1991. The primary commercial and research use for this instrument is as a field geologist's exploration tool. Other applications have been suggested but not investigated in depth, such as process control measurements in commercial materials processing and quality control functions that require information on surface heterogeneity.
NASA Technical Reports Server (NTRS)
Ivanov, Anton B.
2003-01-01
The Mars Orbiter Camera (MOC) has been operating on board the Mars Global Surveyor (MGS) spacecraft since 1998. It consists of three cameras: Red and Blue Wide Angle cameras (FOV = 140 deg.) and a Narrow Angle camera (FOV = 0.44 deg.). The Wide Angle cameras allow surface resolutions down to 230 m/pixel, and the Narrow Angle camera down to 1.5 m/pixel. This work is a continuation of a project we have reported on previously. Since then we have refined and improved our stereo correlation algorithm and processed many more stereo pairs. We will discuss the results of our analysis of stereo pairs located at the Mars Exploration Rover (MER) landing sites, and address the feasibility of recovering topography from stereo pairs (especially in the polar regions) taken during the MGS 'Relay-16' mode.
Multispectral image dissector camera flight test
NASA Technical Reports Server (NTRS)
Johnson, B. L.
1973-01-01
It was demonstrated that the multispectral image dissector camera is able to provide composite pictures of the Earth's surface from high-altitude overflights. An electronic deflection feature was used to inject the gyro error signal into the camera to correct for aircraft motion.
NASA Astrophysics Data System (ADS)
Griffiths, Andrew; Coates, Andrew; Muller, Jan-Peter; Jaumann, Ralf; Josset, Jean-Luc; Paar, Gerhard; Barnes, David
2010-05-01
The ExoMars mission has evolved into a joint European-US mission to deliver a trace gas orbiter and a pair of rovers to Mars in 2016 and 2018, respectively. The European rover will carry the Pasteur exobiology payload, including the 1.56 kg Panoramic Camera. PanCam will provide multispectral stereo images from Wide-Angle Cameras (WACs) with a 34 deg horizontal field of view (580 microrad/pixel) and colour monoscopic "zoom" images from a High Resolution Camera (HRC) with a 5 deg horizontal field of view (83 microrad/pixel). The stereo WACs are based on Beagle 2 Stereo Camera System heritage [1]. Integrated with the WACs and HRC into the PanCam optical bench (which helps the instrument meet its planetary protection requirements) is the PanCam interface unit (PIU), which provides image storage, a SpaceWire interface to the rover, and DC-DC power conversion. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission [2] as well as providing multispectral geological imaging, colour and stereo panoramic images, and solar images for water vapour abundance and dust optical depth measurements. The HRC can be used for high resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls. Additionally, the HRC will be used to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. In short, PanCam provides the overview and context for the ExoMars experiment locations, required to enable the exobiology aims of the mission. In addition to these baseline capabilities, further enhancements to PanCam are possible to increase its effectiveness for astrobiology and planetary exploration: 1. a Rover Inspection Mirror (RIM); 2. Organics Detection by Fluorescence Excitation (ODFE) LEDs [3-6]; 3. a UVIS broadband UV Flux and Opacity Determination (UVFOD) photodiode. This paper will discuss the scientific objectives and resource impacts of these enhancements. References: 1. Griffiths, A.D., Coates, A.J., Josset, J.-L., Paar, G., Hofmann, B., Pullan, D., Ruffer, P., Sims, M.R., Pillinger, C.T., The Beagle 2 stereo camera system, Planet. Space Sci. 53, 1466-1488, 2005. 2. Paar, G., Oberst, J., Barnes, D.P., Griffiths, A.D., Jaumann, R., Coates, A.J., Muller, J.-P., Gao, Y., Li, R., Requirements and solutions for ExoMars rover panoramic camera 3D vision processing, abstract submitted to the EGU meeting, Vienna, 2007. 3. Storrie-Lombardi, M.C., Hug, W.F., McDonald, G.D., Tsapin, A.I., and Nealson, K.H. 2001. Hollow cathode ion lasers for deep ultraviolet Raman spectroscopy and fluorescence imaging. Rev. Sci. Ins., 72 (12), 4452-4459. 4. Nealson, K.H., Tsapin, A., and Storrie-Lombardi, M. 2002. Searching for life in the universe: unconventional methods for an unconventional problem. International Microbiology, 5, 223-230. 5. Mormile, M.R. and Storrie-Lombardi, M.C. 2005. The use of ultraviolet excitation of native fluorescence for identifying biomarkers in halite crystals. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 246-253. 6. Storrie-Lombardi, M.C. 2005. Post-Bayesian strategies to optimize astrobiology instrument suites: lessons from Antarctica and the Pilbara. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 288-301.
NASA Astrophysics Data System (ADS)
Zambon, F.; De Sanctis, M. C.; Capaccioni, F.; Filacchione, G.; Carli, C.; Ammanito, E.; Friggeri, A.
2011-10-01
During the first two MESSENGER flybys (14 January 2008 and 6 October 2008), the Mercury Dual Imaging System (MDIS) extended the coverage of Mercury's surface beyond that obtained by Mariner 10; images now exist for about 90% of the surface [1]. MDIS is equipped with a Narrow Angle Camera (NAC) and a Wide Angle Camera (WAC). The NAC uses an off-axis reflective design with a 1.5° field of view (FOV) centered at 747 nm. The WAC has a refractive design with a 10.5° FOV and a 12-position filter wheel covering the 395-1040 nm spectral range [2]. The color images can be used to infer information on surface composition, and classification methods are an interesting technique for multispectral image analysis that can be applied to the study of planetary surfaces. Classification methods are based on clustering algorithms and can be divided into two categories: unsupervised and supervised. Unsupervised classifiers do not require analyst feedback; the algorithm automatically organizes pixel values into classes. In the supervised method, instead, the analyst must choose "training areas" that define the pixel values of a given class [3]. Here we describe the classification into different compositional units of the region near the Rudaki crater on Mercury.
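As a concrete picture of the unsupervised case described above, the sketch below clusters the pixels of a multispectral cube into a small number of spectral units with k-means; the band count, class count, and function names are illustrative assumptions, not the authors' setup.

```python
# Hedged sketch of unsupervised classification: k-means clustering of calibrated
# multispectral pixels into spectral units.
import numpy as np
from sklearn.cluster import KMeans

def unsupervised_units(cube, n_units=5, seed=0):
    """cube: (H, W, B) calibrated multispectral reflectance; returns (H, W) unit labels."""
    pixels = cube.reshape(-1, cube.shape[-1])
    labels = KMeans(n_clusters=n_units, n_init=10, random_state=seed).fit_predict(pixels)
    return labels.reshape(cube.shape[:2])
```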
Airborne multispectral identification of individual cotton plants using consumer-grade cameras
USDA-ARS?s Scientific Manuscript database
Although multispectral remote sensing using consumer-grade cameras has successfully identified fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants. The imaging sensors of consumer-grade cameras are based on a Bayer patter...
Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing
2015-01-01
This paper describes an airborne high-resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) algorithms were used to generate matching points. For the difficult problem of registering visible-band images to near-infrared-band images in scenes lacking manmade objects, we present an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high quality multispectral images, and the band-to-band alignment error of the composed multispectral images is less than 2.5 pixels. PMID:26205264
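A minimal version of the SIFT-plus-RANSAC step named above can be written directly with OpenCV, as sketched below; the ratio-test threshold and reprojection tolerance are common illustrative choices, not the paper's values.

```python
# Minimal SIFT + RANSAC homography sketch for band-to-band matching.
import cv2
import numpy as np

def band_homography(img_ref, img_slave):
    """Estimate the 3x3 homography mapping img_slave onto img_ref (8-bit grayscale)."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img_ref, None)
    k2, d2 = sift.detectAndCompute(img_slave, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d2, d1, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test
    src = np.float32([k2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # RANSAC rejects outliers
    return H
```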
NASA Astrophysics Data System (ADS)
Kelly, M. A.; Boldt, J.; Wilson, J. P.; Yee, J. H.; Stoffler, R.
2017-12-01
The multi-spectral STereo Atmospheric Remote Sensing (STARS) concept aims to provide high-spatial- and high-temporal-resolution observations of 3D cloud structures related to hurricane development and other severe weather events. The rapid evolution of severe weather demonstrates a critical need for mesoscale observations of severe weather dynamics, but such observations are rare, particularly over the ocean, where extratropical and tropical cyclones can undergo explosive development. Coincident space-based measurements of wind velocity and cloud properties at the mesoscale remain a great challenge, but are critically needed to improve the understanding and prediction of severe weather and cyclogenesis. STARS employs a mature stereoscopic imaging technique on two satellites (e.g. two CubeSats or two hosted payloads) to simultaneously retrieve cloud motion vectors (CMVs), cloud-top temperatures (CTTs), and cloud geometric heights (CGHs) from multi-angle, multi-spectral observations of cloud features. STARS is a pushbroom system based on separate wide-field-of-view, co-boresighted multi-spectral cameras in the visible, midwave infrared (MWIR), and longwave infrared (LWIR), with high spatial resolution (better than 1 km). The visible system is based on a panchromatic low-light imager able to resolve cloud structures under nighttime illumination down to ¼ moon. The MWIR instrument, which is being developed as a NASA ESTO Instrument Incubator Program (IIP) project, is based on recent advances in MWIR detector technology that require only modest cooling. The STARS payload provides flexible options for spaceflight due to its low size, weight, and power (SWaP) and very modest cooling requirements. STARS also meets Air Force operational requirements for cloud characterization and theater weather imagery. In this paper, an overview of the STARS concept, including the high-level sensor design, the concept of operations, and the measurement capability, is presented.
A wide-angle camera module for disposable endoscopy
NASA Astrophysics Data System (ADS)
Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee
2016-08-01
A wide-angle miniaturized camera module for disposable endoscopes is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and an LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm^3. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype disposable endoscope is implemented to perform pre-clinical animal testing, in which the esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.
USDA-ARS?s Scientific Manuscript database
This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...
An Overview of the CBERS-2 Satellite and Comparison of the CBERS-2 CCD Data with the L5 TM Data
NASA Technical Reports Server (NTRS)
Chandler, Gyanesh
2007-01-01
The CBERS satellite carries a multi-sensor payload with different spatial resolutions and collection frequencies: the High Resolution CCD Camera (HRCCD), the Infrared Multispectral Scanner (IRMSS), and the Wide-Field Imager (WFI). The CCD and WFI cameras operate in the VNIR region, while the IRMSS operates in the SWIR and thermal regions. In addition to the imaging payload, the satellite carries a Data Collection System (DCS) and a Space Environment Monitor (SEM).
Liao, Jun; Wang, Zhe; Zhang, Zibang; Bian, Zichao; Guo, Kaikai; Nambiar, Aparna; Jiang, Yutong; Jiang, Shaowei; Zhong, Jingang; Choma, Michael; Zheng, Guoan
2018-02-01
We report the development of a multichannel microscope for whole-slide multiplane, multispectral and phase imaging. We use trinocular heads to split the beam path into 6 independent channels and employ a camera array for parallel data acquisition, achieving a maximum data throughput of approximately 1 gigapixel per second. To perform single-frame rapid autofocusing, we place 2 near-infrared light-emitting diodes (LEDs) at the back focal plane of the condenser lens to illuminate the sample from 2 different incident angles. A hot mirror directs the near-infrared light to an autofocusing camera. For multiplane whole-slide imaging (WSI), we acquire 6 different focal planes of a thick specimen simultaneously. For multispectral WSI, we relay the 6 independent image planes to the same focal position and simultaneously acquire information in 6 spectral bands. For whole-slide phase imaging, we acquire images at 3 focal positions simultaneously and use the transport-of-intensity equation to recover the phase information. We also provide an open-source design to further increase the number of channels from 6 to 15. The reported platform provides a simple solution for multiplexed fluorescence imaging and multimodal WSI. Acquiring an instant focal stack without z-scanning may also enable fast 3-dimensional dynamic tracking of various biological samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
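For the phase imaging mode, the transport-of-intensity equation relates the through-focus intensity derivative to the Laplacian of the phase; under a near-uniform in-focus intensity it can be inverted with an FFT-based Poisson solver. The Python sketch below illustrates that standard simplification and is not necessarily the authors' solver.

```python
# Hedged sketch of transport-of-intensity phase recovery from a three-plane focal
# stack, assuming a nearly uniform in-focus intensity i0. The regularized FFT-based
# inverse Laplacian is a standard simplification.
import numpy as np

def tie_phase(i_minus, i_plus, i0, dz, wavelength, pixel):
    """Recover phase from defocused images at -dz and +dz (SI units throughout)."""
    k = 2.0 * np.pi / wavelength
    didz = (i_plus - i_minus) / (2.0 * dz)            # axial intensity derivative
    fy = np.fft.fftfreq(didz.shape[0], d=pixel)
    fx = np.fft.fftfreq(didz.shape[1], d=pixel)
    f2 = np.add.outer(fy**2, fx**2)
    inv_lap = np.where(f2 > 0, -1.0 / (4.0 * np.pi**2 * f2 + 1e-12), 0.0)
    rhs = -k * didz / i0                              # TIE with uniform intensity
    return np.real(np.fft.ifft2(np.fft.fft2(rhs) * inv_lap))
```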
NASA Astrophysics Data System (ADS)
Awumah, A.; Mahanti, P.; Robinson, M. S.
2017-12-01
Image fusion is often used in Earth-based remote sensing applications to merge spatial details from a high-resolution panchromatic (Pan) image with the color information from a lower-resolution multi-spectral (MS) image, resulting in a high-resolution multi-spectral image (HRMS). Previously, the performance of six well-known image fusion methods was compared using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images (1). Results showed the Intensity-Hue-Saturation (IHS) method provided the best spatial performance, but deteriorated the spectral content. In general, there was a trade-off between spatial enhancement and spectral fidelity in the fusion process; the more spatial detail from the Pan fused with the MS image, the more spectrally distorted the final HRMS. In this work, we control the amount of spatial detail fused (from LROC NAC images into WAC images) using a controlled IHS method (2) to investigate the spatial variation in spectral distortion on fresh crater ejecta. In the controlled IHS method (2), the percentage of the Pan component merged with the MS is governed by a control parameter whose value may be varied between 1 (no Pan utilized) and infinity (entire Pan utilized). An HRMS color composite image (red = 415 nm, green = 321/415 nm, blue = 321/360 nm (3)) was used to assess performance (via visual inspection and metric-based evaluations) at each tested value of the control parameter (from 1 to 10, in 0.01 increments; beyond 10 the spectral distortion saturates) within three regions: crater interiors, ejecta blankets, and the background material surrounding the craters. Increasing the control parameter introduced increased spatial sharpness and spectral distortion in all regions, but to varying degrees. Crater interiors suffered the most color distortion, while ejecta experienced less color distortion. The controlled IHS method is therefore desirable for resolution enhancement of fresh crater ejecta; larger values of the control parameter may be used to sharpen MS images of ejecta patterns with less impact from color distortion than in the uncontrolled IHS fusion process. References: (1) Mahanti et al. (2016) ISPRS. (2) Choi, Myungjin (2006) IEEE. (3) Denevi et al. (2014) JGR.
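A hedged sketch of such a controlled injection is given below: writing the fused product as MS + alpha*(Pan - I) with alpha = 1 - 1/t reproduces the behavior described above (t = 1 injects no Pan detail; t approaching infinity approaches full IHS substitution). The alpha mapping is our assumption for illustration, not a formula quoted from (2).

```python
# Sketch of a controlled IHS update with trade-off parameter t. The mapping
# alpha = 1 - 1/t is an illustrative assumption consistent with the abstract:
# t = 1 injects no Pan detail; t -> infinity approaches full IHS substitution.
import numpy as np

def controlled_ihs(ms, pan, t):
    """ms: (H, W, B) upsampled MS; pan: (H, W) co-registered Pan; t >= 1."""
    alpha = 1.0 - 1.0 / t
    intensity = ms.mean(axis=2)
    return ms + alpha * (pan - intensity)[..., None]
```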
The Wide Angle Camera of the ROSETTA Mission
NASA Astrophysics Data System (ADS)
Barbieri, C.; Fornasier, S.; Verani, S.; Bertini, I.; Lazzarin, M.; Rampazzi, F.; Cremonese, G.; Ragazzoni, R.; Marzari, F.; Angrilli, F.; Bianchini, G. A.; Debei, S.; Dececco, M.; Guizzo, G.; Parzianello, G.; Ramous, P.; Saggin, B.; Zaccariotto, M.; Da Deppo, V.; Naletto, G.; Nicolosi, G.; Pelizzo, M. G.; Tondello, G.; Brunello, P.; Peron, F.
This paper gives a brief description of the Wide Angle Camera (WAC), built by the Centro Servizi e Attività Spaziali (CISAS) of the University of Padova for the ESA ROSETTA Mission to comet 46P/Wirtanen and asteroids 4979 Otawara and 140 Siwa. The WAC is part of the OSIRIS imaging system, which also comprises a Narrow Angle Camera (NAC) built by the Laboratoire d'Astrophysique Spatiale (LAS) of Marseille. CISAS also had the responsibility to build the shutter and the front cover mechanism for the NAC. The flight model of the WAC was delivered in December 2001 and has already been integrated on ROSETTA.
The High Resolution Stereo Camera (HRSC): 10 Years of Imaging Mars
NASA Astrophysics Data System (ADS)
Jaumann, R.; Neukum, G.; Tirsch, D.; Hoffmann, H.
2014-04-01
The HRSC Experiment: Imagery is the major source of our current understanding of the geologic evolution of Mars in qualitative and quantitative terms. Imaging is required to enhance our knowledge of Mars with respect to geological processes occurring on local, regional and global scales and is an essential prerequisite for detailed surface exploration. The High Resolution Stereo Camera (HRSC) of ESA's Mars Express Mission (MEx) is designed to simultaneously map the morphology, topography, structure and geologic context of the surface of Mars as well as atmospheric phenomena [1]. The HRSC directly addresses two of the main scientific goals of the Mars Express mission: (1) high-resolution three-dimensional photogeologic surface exploration and (2) the investigation of surface-atmosphere interactions over time; it also significantly supports (3) the study of atmospheric phenomena by multi-angle coverage and limb sounding as well as (4) multispectral mapping by providing high-resolution three-dimensional color context information. In addition, the stereoscopic imagery especially characterizes landing sites and their geologic context [1]. The HRSC surface resolution and the digital terrain models bridge the gap in scales between the highest-ground-resolution images (e.g., HiRISE) and global coverage observations (e.g., Viking). This is also the case with respect to DTMs (e.g., MOLA and local high-resolution DTMs). HRSC is also used as a cartographic basis to correlate panchromatic and multispectral stereo data. The unique multi-angle imaging technique of the HRSC supports its stereo capability by providing not only a stereo triplet but also a stereo quintuplet, making the photogrammetric processing very robust [1, 3]. The capabilities for three-dimensional orbital reconnaissance of the Martian surface are ideally met by HRSC, making this camera unique in the international Mars exploration effort.
HERCULES/MSI: a multispectral imager with geolocation for STS-70
NASA Astrophysics Data System (ADS)
Simi, Christopher G.; Kindsfather, Randy; Pickard, Henry; Howard, William, III; Norton, Mark C.; Dixon, Roberta
1995-11-01
A multispectral intensified CCD imager combined with a ring-laser-gyroscope-based inertial measurement unit was flown on the Space Shuttle Discovery from July 13-22, 1995 (Space Transport System Flight No. 70, STS-70). The camera includes a six-position filter wheel, a third-generation image intensifier, and a CCD camera. The camera is integrated with a laser gyroscope system that determines the ground position of the imagery to an accuracy of better than three nautical miles. The camera has two modes of operation: a panchromatic mode for high-magnification imaging [ground sample distance (GSD) of 4 m], and a multispectral mode consisting of six different user-selectable spectral ranges at reduced magnification (12 m GSD). This paper discusses the system hardware and technical trade-offs involved with camera optimization, and presents imagery observed during the shuttle mission.
1990-02-14
Range: 4 billion miles from Earth, at 32 degrees above the ecliptic. P-36057C This color image of the Sun, Earth, and Venus is one of the first, and maybe only, images that show our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time, with narrow-angle camera frames centered on each planet. The wide-angle image was taken with the camera's darkest filter, a methane absorption band, and the shortest possible exposure, one two-hundredth of a second, to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system, yet it is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of this brightness is a bright, burned-out image with multiple reflections from the optics of the camera. The rays around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide-angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaicked into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce the color image. The violet, green, and blue filters were used, with exposure times of 0.72, 0.48, and 0.72 seconds for Earth and 0.36, 0.24, and 0.36 seconds for Venus. The images also show long linear streaks resulting from scattering of sunlight off parts of the camera and its shade.
Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery
NASA Astrophysics Data System (ADS)
Kwoh, L. K.; Huang, X.; Tan, W. J.
2012-07-01
XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52-0.60 µm for Green, 0.63-0.69 µm for Red and 0.76-0.89 µm for NIR at 12 m resolution. In the design of the IRIS camera, the three bands are acquired by three lines of CCDs (NIR, Red and Green). These CCDs are physically separated in the focal plane and their first pixels are not absolutely aligned. The micro-satellite platform is also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separations agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs red and green vs red CCDs respectively). The cross-track alignments were 0.05 pixel and 5.9 pixels for the NIR vs red and green vs red CCDs respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower operating temperature at which XSAT currently operates. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
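The polynomial attitude compensation can be illustrated with a short sketch; the residual values, line counts, and polynomial order here are synthetic stand-ins, assuming per-line attitude residuals have already been estimated from ground control points.

    import numpy as np

    # Synthetic stand-ins: roll residuals (arcsec) estimated from GCPs at a
    # few scan lines of one image.
    lines = np.array([0, 500, 1000, 1500, 2000, 2500, 3000], dtype=float)
    roll_residual = np.array([1.2, 0.8, 0.1, -0.4, -0.2, 0.5, 1.1])

    # Fit a 4th-order polynomial over normalized line number, then evaluate
    # a smooth roll correction for every scan line; pitch and yaw are
    # handled the same way.
    s = lines / lines.max()
    coeffs = np.polyfit(s, roll_residual, deg=4)
    roll_correction = np.polyval(coeffs, np.arange(3001) / lines.max())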
An integrated compact airborne multispectral imaging system using embedded computer
NASA Astrophysics Data System (ADS)
Zhang, Yuedong; Wang, Li; Zhang, Xuguo
2015-08-01
An integrated compact airborne multispectral imaging system using an embedded-computer-based control system was developed for small-aircraft multispectral imaging applications. The system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system) and an embedded computer. The embedded computer has excellent universality and expansibility, and has advantages in volume and weight for an airborne platform, so it can meet the requirements of the control system of the integrated airborne multispectral imaging system. The embedded computer controls the camera parameter settings, operates the filter wheel and stabilized platform, acquires the image and POS data, and stores the images and data. The system can connect peripheral devices through the ports of the embedded computer, so system operation and management of the stored image data are easy. This airborne multispectral imaging system has the advantages of small volume, multiple functions, and good expansibility. The imaging experiment results show that this system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.
Morphology and Composition of Localized Lunar Dark Mantle Deposits With LROC Data
NASA Astrophysics Data System (ADS)
Gustafson, O.; Bell, J. F.; Gaddis, L. R.; Hawke, B. R.; Robinson, M. S.; LROC Science Team
2010-12-01
Clementine color (ultraviolet, visible or UVVIS) and Lunar Reconnaissance Orbiter (LRO) Wide Angle (WAC) and Narrow Angle (NAC) camera data provide the means to investigate localized lunar dark-mantle deposits (DMDs) of potential pyroclastic origin. Our goals are to (1) examine the morphology and physical characteristics of these deposits with LROC WAC and NAC data; (2) extend methods used in earlier studies of lunar DMDs with Clementine spectral reflectance (CSR) data; (3) use LRO WAC multispectral data to complement and extend the CSR data for compositional analyses; and (4) apply these results to identify the likely mode of emplacement and study the diversity of compositions among these deposits. Pyroclastic deposits have been recognized all across the Moon, identified by their low albedo, smooth texture, and mantling relationship to underlying features. Gaddis et al. (2003) presented a compositional analysis of 75 potential lunar pyroclastic deposits (LPDs) based on CSR measurements. New LRO camera (LROC) data permit more extensive analyses of such deposits than previously possible. Our study began with six sites on the southeastern limb of the Moon that contain nine of the cataloged 75 potential pyroclastic deposits: Humboldt (4 deposits), Petavius, Barnard, Abel B, Abel C, and Titius. Our analysis found that some of the DMDs exhibit qualities characteristic of fluid emplacement, such as flat surfaces, sharp margins, embaying relationships, and flow textures. We conclude that the localized DMDs are a complex class of features, many of which may have formed by a combination of effusive and pyroclastic emplacement mechanisms. We have extended this analysis to include additional localized DMDs from the catalog of 75 potential pyroclastic deposits. We have examined high resolution (up to 0.5 m/p) NAC images as they become available to assess the mode of emplacement of the deposits, locate potential volcanic vents, and assess physical characteristics of the DMDs such as thickness, roughness, and rock abundance. Within and around each DMD, the Clementine UVVIS multispectral mosaic (100 m/p, 5 bands at 415, 750, 900, 950, and 1000 nm) and LROC WAC multispectral image cubes (75 to 400 m/p, 7 bands at 320, 360, 415, 565, 605, 645, and 690 nm) have been used to extract spectral reflectance data. Spectral ratio plots were prepared to compare deposits and draw conclusions regarding compositional differences, such as mafic mineral or titanium content and distribution, both within and between DMDs. The result of the study will be an improved classification of these deposits in terms of emplacement mechanisms and composition, including identifying compositional affinities among DMDs and between DMDs and other volcanic deposits.
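The spectral-ratio comparison described above can be sketched in a few lines; the cube, band ordering, and deposit mask below are illustrative placeholders, not LROC pipeline products.

    import numpy as np

    # Stand-in 7-band WAC-like cube, ordered (320, 360, 415, 565, 605,
    # 645, 690 nm); the deposit outline is a hypothetical mask.
    cube = np.random.rand(7, 256, 256)
    dmd_mask = np.zeros((256, 256), dtype=bool)
    dmd_mask[100:140, 80:130] = True

    ratio = cube[0] / cube[2]    # 320 nm / 415 nm ratio image
    print("mean 320/415 ratio, deposit:   ", ratio[dmd_mask].mean())
    print("mean 320/415 ratio, background:", ratio[~dmd_mask].mean())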
Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K
2014-07-01
We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space.
Airborne system for multispectral, multiangle polarimetric imaging.
Bowles, Jeffrey H; Korwan, Daniel R; Montes, Marcos J; Gray, Deric J; Gillis, David B; Lamela, Gia M; Miller, W David
2015-11-01
In this paper, we describe the design, fabrication, calibration, and deployment of an airborne multispectral polarimetric imager. The motivation for the development of this instrument was to explore its ability to provide information about water constituents, such as particle size and type. The instrument is based on four 16 MP cameras and uses wire grid polarizers (aligned at 0°, 45°, 90°, and 135°) to provide the separation of the polarization states. A five-position filter wheel provides for four narrow-band spectral filters (435, 550, 625, and 750 nm) and one blocked position for dark-level measurements. When flown, the instrument is mounted on a programmable stage that provides control of the view angles. View angles ranging to ±65° from the nadir have been used. Data processing provides a measure of the polarimetric signature as a function of both the view zenith and view azimuth angles. As a validation of our initial results, we compare our measurements over water with the output of a Monte Carlo code, both of which show neutral points off the principal plane. The locations of the calculated and measured neutral points are compared. The random error level in the measured degree of linear polarization (8% at 435 nm) is shown to be better than 0.25%.
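With four co-registered frames at the polarizer orientations listed above, the linear Stokes parameters and degree of linear polarization follow from the standard relations; a minimal sketch, assuming the frames are numpy arrays on a common grid:

    import numpy as np

    def linear_stokes(i0, i45, i90, i135):
        """Linear Stokes parameters from four polarizer orientations
        (0, 45, 90, 135 degrees)."""
        s0 = 0.5 * (i0 + i45 + i90 + i135)     # total intensity
        s1 = i0 - i90
        s2 = i45 - i135
        dolp = np.sqrt(s1**2 + s2**2) / np.clip(s0, 1e-9, None)
        aolp = 0.5 * np.arctan2(s2, s1)        # angle of linear polarization
        return s0, s1, s2, dolp, aolp

    # Toy usage with random frames standing in for the four camera images.
    rng = np.random.default_rng(0)
    frames = rng.random((4, 480, 640))
    s0, s1, s2, dolp, aolp = linear_stokes(*frames)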
Jung, Kyunghwa; Choi, Hyunseok; Hong, Hanpyo; Adikrishna, Arnold; Jeon, In-Ho; Hong, Jaesung
2017-02-01
A hands-free region-of-interest (ROI) selection interface is proposed for solo surgery using a wide-angle endoscope. A wide-angle endoscope provides images with a larger field of view than a conventional endoscope. With an appropriate ROI selection interface, surgeons can also obtain a detailed local view as if they had moved a conventional endoscope to a specific position and direction. To manipulate the endoscope without releasing the surgical instrument in hand, a mini-camera is attached to the instrument, and the images taken by the attached camera are analyzed. When a surgeon moves the instrument, the instrument orientation is calculated by image processing. Surgeons can select the ROI with this instrument movement after switching from 'task mode' to 'selection mode.' The accelerated KAZE (AKAZE) algorithm is used to track the features of the camera images once the instrument is moved. Both the wide-angle and detailed local views are displayed simultaneously, and a surgeon can move the local view area by moving the mini-camera attached to the surgical instrument. Local view selection for a solo surgery was performed without releasing the instrument. The accuracy of camera pose estimation was not significantly different between camera resolutions, but it was significantly different between background camera images with different numbers of features (P < 0.01). The success rate of ROI selection diminished as the number of separated regions increased; however, up to 12 separated regions with a region size of 160 × 160 pixels were selected with no failure. Surgical tasks on a phantom model and a cadaver were attempted to verify the feasibility in a clinical environment. Hands-free endoscope manipulation without releasing the instruments in hand was achieved. The proposed method requires only a small, low-cost camera and image processing. The technique enables surgeons to perform solo surgeries without a camera assistant.
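The feature-tracking step can be sketched with OpenCV's AKAZE implementation; this is a generic illustration of tracking between two mini-camera frames, not the authors' code, and the file names are placeholders.

    import cv2
    import numpy as np

    # Two consecutive grayscale frames from the instrument-mounted camera.
    prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    akaze = cv2.AKAZE_create()
    kp1, des1 = akaze.detectAndCompute(prev, None)
    kp2, des2 = akaze.detectAndCompute(curr, None)

    # Hamming distance suits AKAZE's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    # The mean displacement of the best matches approximates the
    # image-plane motion of the instrument between frames.
    shifts = np.array([np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt)
                       for m in matches[:50]])
    print("estimated motion (dx, dy):", shifts.mean(axis=0))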
NASA Astrophysics Data System (ADS)
Klaessens, John H. G. M.; Nelisse, Martin; Verdaasdonk, Rudolf M.; Noordmans, Herke Jan
2013-03-01
During clinical interventions, objective and quantitative information on tissue perfusion, oxygenation or temperature can be useful for the surgical strategy. Local (point) measurements give limited information and affected areas can easily be missed; imaging large areas is therefore required. In this study a LED-based multispectral imaging system (MSI, 17 different wavelengths, 370 nm-880 nm) and a thermal camera were applied during clinical interventions: tissue flap transplantations (ENT), local anesthetic block, and open brain surgery (epileptic seizure). The images covered an area of 20x20 cm. Measurements in an (operating) room turned out to be more complicated than laboratory experiments due to light fluctuations, movement of the patient and a limited angle of view. By constantly measuring the background light and using a white reference, light fluctuations and movement were corrected. Oxygenation concentration images could be calculated and combined with the thermal images. The effectiveness of local anesthesia of a hand could be predicted at an early stage using the thermal camera, and the reperfusion of a transplanted skin flap could be imaged. During brain surgery, a temporarily hyper-perfused area was witnessed which was probably related to an epileptic attack. A LED-based multispectral imaging system combined with thermal imaging provides complementary information on perfusion and oxygenation changes; these are promising techniques for real-time diagnostics during clinical interventions.
NASA Astrophysics Data System (ADS)
McMackin, Lenore; Herman, Matthew A.; Weston, Tyler
2016-02-01
We present the design of a multi-spectral imager built using the architecture of the single-pixel camera. The architecture is enabled by the novel sampling theory of compressive sensing implemented optically using the Texas Instruments DLP™ micro-mirror array. The array not only implements spatial modulation necessary for compressive imaging but also provides unique diffractive spectral features that result in a multi-spectral, high-spatial resolution imager design. The new camera design provides multi-spectral imagery in a wavelength range that extends from the visible to the shortwave infrared without reduction in spatial resolution. In addition to the compressive imaging spectrometer design, we present a diffractive model of the architecture that allows us to predict a variety of detailed functional spatial and spectral design features. We present modeling results, architectural design and experimental results that prove the concept.
MISR Global Images See the Light of Day
NASA Technical Reports Server (NTRS)
2002-01-01
As of July 31, 2002, global multi-angle, multi-spectral radiance products are available from the MISR instrument aboard the Terra satellite. Measuring the radiative properties of different types of surfaces, clouds and atmospheric particulates is an important step toward understanding the Earth's climate system. These images are among the first planet-wide summary views to be publicly released from the Multi-angle Imaging SpectroRadiometer experiment. Data for these images were collected during the month of March 2002, and each pixel represents monthly-averaged daylight radiances from an area measuring 1/2 degree in latitude by 1/2 degree in longitude. The top panel is from MISR's nadir (vertical-viewing) camera and combines data from the red, green and blue spectral bands to create a natural color image. The central view combines near-infrared, red, and green spectral data to create a false-color rendition that enhances highly vegetated terrain. It takes 9 days for MISR to view the entire globe, and only areas within 8 degrees of latitude of the north and south poles are not observed due to the Terra orbit inclination. Because a single pole-to-pole swath of MISR data is just 400 kilometers wide, multiple swaths must be mosaicked to create these global views. Discontinuities appear in some cloud patterns as a consequence of changes in cloud cover from one day to another. The lower panel is a composite in which red, green, and blue radiances from MISR's 70-degree forward-viewing camera are displayed in the northern hemisphere, and radiances from the 70-degree backward-viewing camera are displayed in the southern hemisphere. At the March equinox (spring in the northern hemisphere, autumn in the southern hemisphere), the Sun is near the equator. Therefore, both oblique angles are observing the Earth in 'forward scattering', particularly at high latitudes. Forward scattering occurs when you (or MISR) observe an object with the Sun at a point in the sky that is in front of you. Relative to the nadir view, this geometry accentuates the appearance of polar clouds, and can even reveal clouds that are invisible in the nadir direction. In relatively clear ocean areas, the oblique-angle composite is generally brighter than its nadir counterpart due to enhanced reflection of light by atmospheric particulates. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
NASA Astrophysics Data System (ADS)
Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio
2015-07-01
Atherosclerosis is a primary cause of critical ischemic diseases like heart infarction or stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that can evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for the quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber with a 0.7 mm outer diameter, and an irradiation fiber consisting of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% of lipid were prepared and measured with the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of the atherosclerotic plaque phantoms based on the lipid volume fractions was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
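The spectral angle mapper used above has a compact closed form: the angle between each measured pixel spectrum and a reference spectrum. A minimal sketch, with an illustrative reference spectrum and decision threshold (both assumptions, not the paper's values):

    import numpy as np

    def spectral_angle(pixels, reference):
        """Spectral angle mapper: angle (radians) between each pixel
        spectrum and a reference; small angles mean similar composition.
        pixels    : (N, bands) measured spectra
        reference : (bands,) reference spectrum
        """
        num = pixels @ reference
        den = np.linalg.norm(pixels, axis=1) * np.linalg.norm(reference)
        return np.arccos(np.clip(num / den, -1.0, 1.0))

    # Toy usage with the three angioscope bands (1150, 1200, 1300 nm).
    ref = np.array([0.42, 0.30, 0.45])      # illustrative lipid spectrum
    pix = np.random.rand(1000, 3)
    angles = spectral_angle(pix, ref)
    lipid_like = angles < 0.1               # hypothetical threshold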
Performance Assessment and Geometric Calibration of RESOURCESAT-2
NASA Astrophysics Data System (ADS)
Radhadevi, P. V.; Solanki, S. S.; Akilan, A.; Jyothi, M. V.; Nagasubramanian, V.
2016-06-01
Resourcesat-2 (RS-2) has successfully completed five years of operations in its orbit. The satellite has multi-resolution and multi-spectral capabilities on a single platform. Continuous and autonomous co-registration, geo-location and radiometric calibration of image data from different sensors with widely varying view angles and resolutions was one of the challenges of RS-2 data processing. The on-orbit geometric performance of the RS-2 sensors was widely assessed and calibrated during the initial-phase operations. Since then, as an ongoing activity, various geometric performance data have been generated periodically. This is performed with sites of dense ground control points (GCPs). These parameters are correlated to the direct geo-location accuracy of the RS-2 sensors and are monitored and validated to maintain the performance. This paper brings out the geometric accuracy assessment, calibration and validation done for about 500 datasets of RS-2. The objectives of this study are to ensure the best absolute and relative location accuracy of the different cameras, location performance with payload steering, and co-registration of multiple bands. This is done using a viewing geometry model, given ephemeris and attitude data, precise camera geometry and datum transformation. In the model, the forward and reverse transformations between the coordinate systems associated with the focal plane, payload, body, orbit and ground are rigorously and explicitly defined. System-level tests using comparisons to ground check points have validated the operational geo-location accuracy performance and the stability of the calibration parameters.
Time-resolved multispectral imaging of combustion reactions
NASA Astrophysics Data System (ADS)
Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Frédérick
2015-10-01
Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool, the thermal broadband camera, which allows target characterization in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, to characterize the injection and the ignition in a combustion chamber, or to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge of spectral emissivity, and this information is not directly available from broadband images. However, spectral information can be obtained using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of the combustion products of a candle in which black powder was burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.
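The core of narrow-band temperature estimation is inverting the Planck law at a channel's centre wavelength. A minimal sketch, assuming unit emissivity and SI radiance units (so it yields a brightness temperature, not the full multi-channel fit described above):

    import numpy as np

    H = 6.62607015e-34   # Planck constant [J s]
    C = 2.99792458e8     # speed of light [m/s]
    KB = 1.380649e-23    # Boltzmann constant [J/K]

    def brightness_temperature(radiance, wavelength):
        """Invert the Planck law at one narrow spectral channel.
        radiance in W m^-2 sr^-1 m^-1, wavelength in m; unit emissivity
        is assumed."""
        a = H * C / (wavelength * KB)
        b = 2.0 * H * C**2 / wavelength**5
        return a / np.log1p(b / radiance)

    # Example: a 4.0 um channel measuring 1.16e10 W m^-2 sr^-1 m^-1
    # corresponds to a brightness temperature of roughly 1500 K.
    print(brightness_temperature(1.16e10, 4.0e-6))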
Time-resolved multispectral imaging of combustion reaction
NASA Astrophysics Data System (ADS)
Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Fréderick
2015-05-01
Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool, the thermal broadband camera, which allows target characterization in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, to characterize the injection and the ignition in a combustion chamber, or to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge about spectral emissivity, and this information is not directly available from broadband images. However, spectral information can be obtained using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of the combustion products of a candle in which black powder was burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.
Using a trichromatic CCD camera for spectral skylight estimation.
López-Alvarez, Miguel A; Hernández-Andrés, Javier; Romero, Javier; Olmo, F J; Cazorla, A; Alados-Arboledas, L
2008-12-01
In a previous work [J. Opt. Soc. Am. A 24, 942-956 (2007)] we showed how to design an optimum multispectral system aimed at spectral recovery of skylight. Since high-resolution multispectral images of skylight could be interesting for many scientific disciplines, here we also propose a nonoptimum but much cheaper and faster approach to achieve this goal by using a trichromatic RGB charge-coupled device (CCD) digital camera. The camera is attached to a fish-eye lens, hence permitting us to obtain a spectrum of every point of the skydome corresponding to each pixel of the image. In this work we show how to apply multispectral techniques to the sensors' responses of a common trichromatic camera in order to obtain skylight spectra from them. This spectral information is accurate enough to estimate experimental values of some climate parameters or to be used in algorithms for automatic cloud detection, among many other possible scientific applications.
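The recovery of a full spectrum from three sensor responses is typically a learned linear mapping. A minimal sketch of one common approach (ridge-regularized regression on a training set; the data, dimensions, and regularization weight are stand-ins, not the paper's method in detail):

    import numpy as np

    # Synthetic training set: known skylight spectra and the matching RGB
    # responses of the camera.
    rng = np.random.default_rng(1)
    spectra = rng.random((101, 200))   # 101 wavelength samples x 200 skies
    rgb = rng.random((3, 200))         # camera responses for the same skies

    # Ridge-regularized linear estimator mapping RGB to a full spectrum.
    lam = 1e-3
    W = spectra @ rgb.T @ np.linalg.inv(rgb @ rgb.T + lam * np.eye(3))

    # Apply per pixel: one RGB triple yields one estimated skylight spectrum.
    estimated_spectrum = W @ rng.random(3)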
Clancy, Neil T.; Stoyanov, Danail; James, David R. C.; Di Marco, Aimee; Sauvage, Vincent; Clark, James; Yang, Guang-Zhong; Elson, Daniel S.
2012-01-01
Sequential multispectral imaging is an acquisition technique that involves collecting images of a target at different wavelengths, to compile a spectrum for each pixel. In surgical applications it suffers from low illumination levels and motion artefacts. A three-channel rigid endoscope system has been developed that allows simultaneous recording of stereoscopic and multispectral images. Salient features on the tissue surface may be tracked during the acquisition in the stereo cameras and, using multiple camera triangulation techniques, this information used to align the multispectral images automatically even though the tissue or camera is moving. This paper describes a detailed validation of the set-up in a controlled experiment before presenting the first in vivo use of the device in a porcine minimally invasive surgical procedure. Multispectral images of the large bowel were acquired and used to extract the relative concentration of haemoglobin in the tissue despite motion due to breathing during the acquisition. Using the stereoscopic information it was also possible to overlay the multispectral information on the reconstructed 3D surface. This experiment demonstrates the ability of this system for measuring blood perfusion changes in the tissue during surgery and its potential use as a platform for other sequential imaging modalities. PMID:23082296
Voyager spacecraft images of Jupiter and Saturn
NASA Technical Reports Server (NTRS)
Birnbaum, M. M.
1982-01-01
The Voyager imaging system is described, noting that it is made up of a narrow-angle and a wide-angle TV camera, each in turn consisting of optics, a filter wheel and shutter assembly, a vidicon tube, and an electronics subsystem. The narrow-angle camera has a focal length of 1500 mm; its field of view is 0.42 deg and its focal ratio is f/8.5. For the wide-angle camera, the focal length is 200 mm, the field of view 3.2 deg, and the focal ratio of f/3.5. Images are exposed by each camera through one of eight filters in the filter wheel on the photoconductive surface of a magnetically focused and deflected vidicon having a diameter of 25 mm. The vidicon storage surface (target) is a selenium-sulfur film having an active area of 11.14 x 11.14 mm; it holds a frame consisting of 800 lines with 800 picture elements per line. Pictures of Jupiter, Saturn, and their moons are presented, with short descriptions given of the area being viewed.
Low-cost multispectral imaging for remote sensing of lettuce health
NASA Astrophysics Data System (ADS)
Ren, David D. W.; Tripathi, Siddhant; Li, Larry K. B.
2017-01-01
In agricultural remote sensing, unmanned aerial vehicle (UAV) platforms offer many advantages over conventional satellite and full-scale airborne platforms. One of the most important advantages is their ability to capture high spatial resolution images (1-10 cm) on-demand and at different viewing angles. However, UAV platforms typically rely on the use of multiple cameras, which can be costly and difficult to operate. We present the development of a simple low-cost imaging system for remote sensing of crop health and demonstrate it on lettuce (Lactuca sativa) grown in Hong Kong. To identify the optimal vegetation index, we recorded images of both healthy and unhealthy lettuce, and used them as input in an expectation maximization cluster analysis with a Gaussian mixture model. Results from unsupervised and supervised clustering show that, among four widely used vegetation indices, the blue wide-dynamic range vegetation index is the most accurate. This study shows that it is readily possible to design and build a remote sensing system capable of determining the health status of lettuce at a reasonably low cost (
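The index-plus-clustering pipeline described above can be sketched compactly; the vegetation-index formulation below is one published form of the blue wide-dynamic-range index (weighting coefficient assumed to be 0.1), and the image data are synthetic stand-ins.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def bwdrvi(nir, blue, alpha=0.1):
        """Blue wide-dynamic-range vegetation index, in one common
        formulation: (alpha*NIR - Blue) / (alpha*NIR + Blue)."""
        return (alpha * nir - blue) / (alpha * nir + blue)

    # Synthetic stand-ins for co-registered NIR and blue band images.
    rng = np.random.default_rng(2)
    nir, blue = rng.random((2, 128, 128))
    index = bwdrvi(nir, blue)

    # A two-component Gaussian mixture separates healthy/unhealthy pixels.
    gm = GaussianMixture(n_components=2, random_state=0)
    labels = gm.fit_predict(index.reshape(-1, 1)).reshape(index.shape)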
Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples
Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.
2014-01-01
Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510
Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera
NASA Astrophysics Data System (ADS)
Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and the non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets, are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of the sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.
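A generic radiometric calibration chain (not the actual LROC pipeline, whose coefficients and correction terms are instrument-specific) can be sketched as dark subtraction, flat-field division, and scaling to radiance:

    import numpy as np

    # All arrays and constants below are illustrative stand-ins.
    raw = np.random.randint(0, 4096, (1024, 1024)).astype(float)  # raw DN
    dark = np.full_like(raw, 60.0)                  # dark/offset frame
    flat = np.random.uniform(0.9, 1.1, raw.shape)   # normalized flat field
    resp = 3.2e-3                                   # DN-to-radiance factor
    exposure = 0.012                                # integration time [s]

    radiance = (raw - dark) / flat * resp / exposure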
Multispectral photography for earth resources
NASA Technical Reports Server (NTRS)
Wenderoth, S.; Yost, E.; Kalia, R.; Anderson, R.
1972-01-01
A guide for producing accurate multispectral results for earth resource applications is presented along with theoretical and analytical concepts of color and multispectral photography. Topics discussed include: capabilities and limitations of color and color infrared films; image color measurements; methods of relating ground phenomena to film density and color measurement; sensitometry; considerations in the selection of multispectral cameras and components; and mission planning.
NASA Technical Reports Server (NTRS)
Slater, P. N.; Jackson, R. D.
1982-01-01
Ground-measured spectral reflectance data for Avondale loam and drought-stressed and unstressed wheat were converted into digital counts for spectral bands 5 and 7 of the Landsat Multispectral Scanner System (MSS). For dry loam, the differences between ratios of MSS bands 7-5 as determined from space and from ground level measurements were 2.3 percent for clear and 5.6 percent for turbid atmospheric conditions. By contrast, for wet loam the differences were 10.4 and 29.5 percent. It is found that atmospheric conditions may cause a delay of from 3 to 7 days in the discrimination between drought-stressed and unstressed wheat. For oblique angle observations the atmospheric modification of ground-measured reflectances increased with angle at a greater rate in the 0/180 deg azimuth than in the 90/270 deg azimuth. Implications of this result are discussed for oblique angle Systeme Probatoire d'Observation de la Terre (SPOT), Mapsat, future multispectral linear array system imagery, and wide-angle imagery collected from scanners in high-altitude aircraft.
Feng, Lei; Fang, Hui; Zhou, Wei-Jun; Huang, Min; He, Yong
2006-09-01
Site-specific variable nitrogen application is one of the major precision crop production management operations. Obtaining sufficient crop nitrogen stress information is essential for achieving effective site-specific nitrogen application. The present paper describes the development of a multi-spectral nitrogen deficiency sensor, which uses three channels (green, red, near-infrared) of crop images to determine the nitrogen level of canola. The sensor assesses nitrogen stress by estimating the SPAD value of the canola from canopy reflectance sensed in the three channels (green, red, near-infrared) of the multi-spectral camera. The core of this investigation is the calibration between the multi-spectral measurements and the crop nitrogen levels measured using a SPAD-502 chlorophyll meter. Based on the results obtained from this study, it can be concluded that a multi-spectral CCD camera can provide sufficient information to perform reasonable SPAD value estimation during field operations.
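The calibration step amounts to regressing SPAD readings on the three-channel reflectances; a minimal sketch with synthetic data standing in for field measurements (the paper's actual calibration model may differ):

    import numpy as np

    rng = np.random.default_rng(2)
    # Synthetic calibration data: (green, red, NIR) canopy reflectance for
    # 40 plots and their SPAD-502 readings (toy ground truth).
    refl = rng.random((40, 3))
    spad = 55.0 - 30.0 * refl[:, 1] + 5.0 * rng.random(40)

    # Multivariate least squares with an intercept column.
    A = np.hstack([refl, np.ones((40, 1))])
    coef, *_ = np.linalg.lstsq(A, spad, rcond=None)

    # Estimate SPAD for a new plot from its three-channel reflectance.
    new_plot = np.array([0.12, 0.08, 0.45, 1.0])   # G, R, NIR, intercept
    print("estimated SPAD:", new_plot @ coef)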
Instrumentation for Infrared Airglow Clutter.
1987-03-10
The control unit commands gain and filter position to the Camera Head and monitors these parameters as well as the preamp video. GAZER is equipped with a Lenzar wide-angle, low-light-level TV sensor (Lenzar Intensicon-8 LLLTV) using a second-generation micro-channel intensifier and a proprietary camera tube.
Low SWaP multispectral sensors using dichroic filter arrays
NASA Astrophysics Data System (ADS)
Dougherty, John; Varghese, Ron
2015-06-01
The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters [1] into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4-band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches, including their passivity, spectral range, customization options, and scalable production.
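The de-mosaicing step is easy to illustrate for a hypothetical 2x2 RGB + NIR mosaic (the channel layout below is an assumption, not the vendor's actual pattern); each channel is extracted at quarter resolution, and a full pipeline would interpolate back to full resolution.

    import numpy as np

    # Raw mosaic frame: R at (0,0), G at (0,1), B at (1,0), NIR at (1,1).
    raw = np.random.randint(0, 4096, (480, 640)).astype(float)

    r   = raw[0::2, 0::2]
    g   = raw[0::2, 1::2]
    b   = raw[1::2, 0::2]
    nir = raw[1::2, 1::2]
    # Each plane is (240, 320); interpolation would restore full resolution.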
Toslak, Devrim; Liu, Changgeng; Alam, Minhaj Nur; Yao, Xincheng
2018-06-01
A portable fundus imager is essential for emerging telemedicine screening and point-of-care examination of eye diseases. However, existing portable fundus cameras have limited field of view (FOV) and frequently require pupillary dilation. We report here a miniaturized indirect ophthalmoscopy-based nonmydriatic fundus camera with a snapshot FOV up to 67° external angle, which corresponds to a 101° eye angle. The wide-field fundus camera consists of a near-infrared light source (LS) for retinal guidance and a white LS for color retinal imaging. By incorporating digital image registration and glare elimination methods, a dual-image acquisition approach was used to achieve reflection artifact-free fundus photography.
Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera
NASA Astrophysics Data System (ADS)
Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu
2016-09-01
We performed an experiment on achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under visible and infrared light. As a result, the removal of chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration set assuming the use of a night-vision camera in an automobile and showed the effectiveness of the WFC system.
NASA Astrophysics Data System (ADS)
Sahoo, Sujit Kumar; Tang, Dongliang; Dang, Cuong
2018-02-01
Large-field-of-view multispectral imaging through scattering media is a fundamental quest in the optics community. It has gained special attention from researchers in recent years for its wide range of potential applications. However, the main bottlenecks of current imaging systems are the requirements for specific illumination, poor image quality, and limited field of view. In this work, we demonstrated single-shot high-resolution colour imaging through scattering media using a monochromatic camera. This novel imaging technique is enabled by the spatial and spectral decorrelation properties and the optical memory effect of the scattering medium. Moreover, deconvolution-based image processing removes the above-mentioned drawbacks that arise from iterative refocusing, scanning, or phase-retrieval procedures.
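Deconvolution-based recovery through a scattering medium commonly uses a Wiener filter once the system point spread function is known; a minimal sketch on synthetic data (the PSF, SNR weight, and image sizes are assumptions):

    import numpy as np

    def wiener_deconvolve(blurred, psf, snr=100.0):
        """Frequency-domain Wiener deconvolution given a known PSF."""
        H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
        G = np.fft.fft2(blurred)
        W = np.conj(H) / (np.abs(H)**2 + 1.0 / snr)
        return np.real(np.fft.ifft2(W * G))

    # Toy usage: blur a point object with a Gaussian PSF, then deconvolve.
    x = np.zeros((128, 128)); x[64, 64] = 1.0
    yy, xx = np.mgrid[-64:64, -64:64]
    psf = np.exp(-(xx**2 + yy**2) / 50.0); psf /= psf.sum()
    blurred = np.real(np.fft.ifft2(np.fft.fft2(x) *
                                   np.fft.fft2(np.fft.ifftshift(psf))))
    recovered = wiener_deconvolve(blurred, psf)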
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Hall, J. B., Jr.
1977-01-01
Ocean dumping of waste materials is a significant environmental concern in the New York Bight. One of these waste materials, sewage sludge, was monitored in an experiment conducted in the New York Bight on September 22, 1975. Remote sensing over controlled sewage sludge dumping included an 11-band multispectral scanner, five multispectral cameras, and one mapping camera. Concurrent in situ water samples were taken and acoustical measurements were made of the sewage sludge plumes. Data were obtained for sewage sludge plumes resulting from line (moving barge) and spot (stationary barge) dumps. Multiple aircraft overpasses were made to evaluate temporal effects on the plume signature.
A spectral reflectance estimation technique using multispectral data from the Viking lander camera
NASA Technical Reports Server (NTRS)
Park, S. K.; Huck, F. O.
1976-01-01
A technique is formulated for constructing spectral reflectance curve estimates from multispectral data obtained with the Viking lander camera. The multispectral data are limited to six spectral channels in the wavelength range from 0.4 to 1.1 micrometers, and most of these channels exhibit appreciable out-of-band response. The output of each channel is expressed as a linear (integral) function of the (known) solar irradiance, atmospheric transmittance, and camera spectral responsivity and the (unknown) spectral reflectance. This produces six equations which are used to determine the coefficients in a representation of the spectral reflectance as a linear combination of known basis functions. Natural cubic spline reflectance estimates are produced for a variety of materials that can reasonably be expected to occur on Mars. In each case the dominant reflectance features are accurately reproduced, but small-period features are lost due to the limited number of channels. This technique may be a valuable aid in selecting the number of spectral channels and their responsivity shapes when designing a multispectral imaging system.
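The six-equation estimation step can be sketched directly: with the reflectance modeled as a linear combination of basis functions, the channel integrals reduce to a 6x6 linear system. The spectra below are synthetic stand-ins, and a simple polynomial basis replaces the paper's cubic splines.

    import numpy as np

    rng = np.random.default_rng(3)
    wl = np.linspace(0.4, 1.1, 200)           # wavelength grid [micrometers]
    E = np.ones_like(wl)                      # solar irradiance (stand-in)
    tau = np.ones_like(wl)                    # atmospheric transmittance
    R = rng.random((6, wl.size))              # six channel responsivities
    B = np.array([wl**j for j in range(6)])   # simple polynomial basis

    # Channel output c_i = integral of E * tau * R_i * rho, with
    # rho = sum_j a_j B_j, reduces to the 6x6 system M a = c.
    M = np.array([[np.trapz(E * tau * R[i] * B[j], wl) for j in range(6)]
                  for i in range(6)])
    c = rng.random(6)                         # measured channel outputs
    a = np.linalg.solve(M, c)                 # basis coefficients
    rho_estimate = a @ B                      # estimated reflectance curve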
1999-08-24
One wide-angle and eight narrow-angle camera images of Miranda, taken by NASA Voyager 2, were combined in this view. The controlled mosaic was transformed to an orthographic view centered on the south pole.
The MVACS Surface Stereo Imager on Mars Polar Lander
NASA Astrophysics Data System (ADS)
Smith, P. H.; Reynolds, R.; Weinberg, J.; Friedman, T.; Lemmon, M. T.; Tanner, R.; Reid, R. J.; Marcialis, R. L.; Bos, B. J.; Oquest, C.; Keller, H. U.; Markiewicz, W. J.; Kramm, R.; Gliem, F.; Rueffer, P.
2001-08-01
The Surface Stereo Imager (SSI), a stereoscopic, multispectral camera on the Mars Polar Lander, is described in terms of its capabilities for studying the Martian polar environment. The camera's two eyes, separated by 15.0 cm, provide the camera with range-finding ability. Each eye illuminates half of a single CCD detector with a field of view of 13.8° high by 14.3° wide and has 12 selectable filters between 440 and 1000 nm.
The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation
NASA Astrophysics Data System (ADS)
Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team
2002-12-01
The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024x2048 Mitel frame transfer CCD detector arrays, each having a 1024x1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel or a rectangular FOV of 16° × 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with a graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from -55°C to +5°C. An onboard calibration target and fiducial marks provide the ability to validate the radiometric and geometric calibration on Mars. Pancam relies heavily on use of the JPL ICER wavelet compression algorithm to maximize data return within stringent mission downlink limits. The scientific goals of the Pancam investigation are to: (a) obtain monoscopic and stereoscopic image mosaics to assess the morphology, topography, and geologic context of each MER landing site; (b) obtain multispectral visible to short-wave near-IR images of selected regions to determine surface color and mineralogic properties; (c) obtain multispectral images over a range of viewing geometries to constrain surface photometric and physical properties; and (d) obtain images of the Martian sky, including direct images of the Sun, to determine dust and aerosol opacity and physical properties. In addition, Pancam also serves a variety of operational functions on the MER mission, including (e) serving as the primary Sun-finding camera for rover navigation; (f) resolving objects on the scale of the rover wheels to distances of ~100 m to help guide navigation decisions; (g) providing stereo coverage adequate for the generation of digital terrain models to help guide and refine rover traverse decisions; (h) providing high resolution images and other context information to guide the selection of the most interesting in situ sampling targets; and (i) supporting acquisition and release of exciting E/PO products.
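The quoted optics numbers are mutually consistent, which a two-line check makes explicit: a 0.28 mrad/pixel IFOV at a 42 mm focal length implies a detector pitch near 12 micrometers, and 1024 active pixels give the quoted ~16 degree field of view.

    import numpy as np

    ifov_rad = 0.28e-3                                 # IFOV per pixel [rad]
    focal_length_m = 42e-3                             # focal length [m]
    pixel_pitch_um = ifov_rad * focal_length_m * 1e6   # ~11.8 micrometers
    fov_deg = np.degrees(1024 * ifov_rad)              # ~16.4 degrees
    print(pixel_pitch_um, fov_deg)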
Optimized Multi-Spectral Filter Array Based Imaging of Natural Scenes.
Li, Yuqi; Majumder, Aditi; Zhang, Hao; Gopi, M
2018-04-12
Multi-spectral imaging using a camera with more than three channels is an efficient method to acquire and reconstruct spectral data and is used extensively in tasks like object recognition, relighted rendering, and color constancy. Recently developed methods are used to only guide content-dependent filter selection where the set of spectral reflectances to be recovered are known a priori. We present the first content-independent spectral imaging pipeline that allows optimal selection of multiple channels. We also present algorithms for optimal placement of the channels in the color filter array yielding an efficient demosaicing order resulting in accurate spectral recovery of natural reflectance functions. These reflectance functions have the property that their power spectrum statistically exhibits a power-law behavior. Using this property, we propose power-law based error descriptors that are minimized to optimize the imaging pipeline. We extensively verify our models and optimizations using large sets of commercially available wide-band filters to demonstrate the greater accuracy and efficiency of our multi-spectral imaging pipeline over existing methods.
Optimized Multi-Spectral Filter Array Based Imaging of Natural Scenes
Li, Yuqi; Majumder, Aditi; Zhang, Hao; Gopi, M.
2018-01-01
Multi-spectral imaging using a camera with more than three channels is an efficient method to acquire and reconstruct spectral data and is used extensively in tasks like object recognition, relighted rendering, and color constancy. Recently developed methods are used to only guide content-dependent filter selection where the set of spectral reflectances to be recovered are known a priori. We present the first content-independent spectral imaging pipeline that allows optimal selection of multiple channels. We also present algorithms for optimal placement of the channels in the color filter array yielding an efficient demosaicing order resulting in accurate spectral recovery of natural reflectance functions. These reflectance functions have the property that their power spectrum statistically exhibits a power-law behavior. Using this property, we propose power-law based error descriptors that are minimized to optimize the imaging pipeline. We extensively verify our models and optimizations using large sets of commercially available wide-band filters to demonstrate the greater accuracy and efficiency of our multi-spectral imaging pipeline over existing methods. PMID:29649114
Computational multispectral video imaging [Invited].
Wang, Peng; Menon, Rajesh
2018-01-01
Multispectral imagers reveal information unperceivable to humans and conventional cameras. Here, we demonstrate a compact single-shot multispectral video-imaging camera by placing a micro-structured diffractive filter in close proximity to the image sensor. The diffractive filter converts spectral information to a spatial code on the sensor pixels. Following a calibration step, this code can be inverted via regularization-based linear algebra to compute the multispectral image. We experimentally demonstrated spectral resolution of 9.6 nm within the visible band (430-718 nm). We further show that the spatial resolution is enhanced by over 30% compared with the case without the diffractive filter. We also demonstrate Vis-IR imaging with the same sensor. Because no absorptive color filters are utilized, sensitivity is preserved as well. Finally, the diffractive filters can be easily manufactured using optical lithography and replication techniques.
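The "regularization-based linear algebra" inversion mentioned above can be sketched generically as a Tikhonov (ridge) solve against a calibrated system matrix; the matrix, sizes, and regularization weight below are stand-ins, not the authors' calibration.

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.random((500, 300))      # calibrated spatio-spectral code (stand-in)
    x_true = rng.random(300)        # vectorized multispectral scene
    y = A @ x_true + 0.01 * rng.standard_normal(500)   # one sensor shot

    lam = 1e-2                      # regularization weight
    x_hat = np.linalg.solve(A.T @ A + lam * np.eye(300), A.T @ y)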
Multi-spectral imaging with infrared sensitive organic light emitting diode
Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky
2014-01-01
Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxial grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589
NASA Technical Reports Server (NTRS)
1982-01-01
Model II Multispectral Camera is an advanced aerial camera that provides optimum enhancement of a scene by recording spectral signatures of ground objects only in narrow, preselected bands of the electromagnetic spectrum. Its photos have applications in such areas as agriculture, forestry, water pollution investigations, soil analysis, geologic exploration, water depth studies and camouflage detection. The target scene is simultaneously photographed in four separate spectral bands. Using a multispectral viewer such as their Model 75, Spectral Data creates a color image from the black-and-white positives taken by the camera. With this optical image analysis unit, all four bands are superimposed in accurate registration and illuminated with combinations of blue, green, red, and white light. The best color combination for displaying the target object is selected and printed. Spectral Data Corporation produces several types of remote sensing equipment and also provides aerial survey, image processing and analysis, and a number of other remote sensing services.
Determination of the Actual Land Use Pattern Using Unmanned Aerial Vehicles and Multispectral Camera
NASA Astrophysics Data System (ADS)
Dindaroğlu, T.; Gündoğan, R.; Gülci, S.
2017-11-01
The international initiatives developed in the context of combating global warming are based on the monitoring of Land Use, Land-Use Change, and Forestry (LULUCF). Determination of changes in land use patterns is used to determine the effects of greenhouse gas emissions and to reduce adverse effects in subsequent processes. This process, which requires the investigation and control of quite large areas, has undoubtedly increased the importance of technological tools and equipment, and the use of carrier platforms and various commercially inexpensive sensors has become widespread. In this study, unmanned aerial flights were carried out over the research fields of the Kahramanmaras Sutcu Imam University campus area. An unmanned aerial vehicle (UAV, a multi-propeller hexacopter) served as the carrier platform for aerial photographs, and a multispectral camera was used to determine the land use pattern with high sensitivity.
Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array
NASA Astrophysics Data System (ADS)
Houben, Sebastian
2015-03-01
The variety of vehicle-mounted sensors needed to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping field of view of a multi-camera fisheye surround view system, as used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. strongly varying resolution) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for this purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss reasons for and avoidance of the observed caveats, and present first results on a prototype topview setup.
Solar System Portrait - 60 Frame Mosaic
1996-09-13
The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever portrait of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. http://photojournal.jpl.nasa.gov/catalog/PIA00451
NASA Astrophysics Data System (ADS)
Piermattei, Livia; Bozzi, Carlo Alberto; Mancini, Adriano; Tassetti, Anna Nora; Karel, Wilfried; Pfeifer, Norbert
2017-04-01
Unmanned aerial vehicles (UAVs) in combination with consumer grade cameras have become standard tools for photogrammetric applications and surveying. The recent generation of multispectral, cost-efficient and lightweight cameras has fostered a breakthrough in the practical application of UAVs for precision agriculture. For this application, multispectral cameras typically use Green, Red, Red-Edge (RE) and Near Infrared (NIR) wavebands to capture both visible and invisible images of crops and vegetation. These bands are very effective for deriving characteristics like soil productivity, plant health and overall growth. However, the quality of results is affected by the sensor architecture, the spatial and spectral resolutions, the pattern of image collection, and the processing of the multispectral images. In particular, collecting data with multiple sensors requires an accurate spatial co-registration of the various UAV image datasets. Multispectral processed data in precision agriculture are mainly presented as orthorectified mosaics used to export information maps and vegetation indices. This work aims to investigate the acquisition parameters and processing approaches of this new type of image data in order to generate orthoimages using different sensors and UAV platforms. Within our experimental area we placed a grid of artificial targets, whose position was determined with differential global positioning system (dGPS) measurements. Targets were used as ground control points to georeference the images and as checkpoints to verify the accuracy of the georeferenced mosaics. The primary aim is to present a method for the spatial co-registration of visible, Red-Edge, and NIR image sets. To demonstrate the applicability and accuracy of our methodology, multi-sensor datasets were collected over the same area and approximately at the same time using the fixed-wing UAV senseFly "eBee". The images were acquired with the camera Canon S110 RGB, the multispectral cameras Canon S110 NIR and S110 RE and with the multi-camera system Parrot Sequoia, which is composed of single-band cameras (Green, Red, Red Edge, NIR and RGB). Imagery from each sensor was georeferenced and mosaicked with the commercial software Agisoft PhotoScan Pro and different approaches for image orientation were compared. To assess the overall spatial accuracy of each dataset the root mean square error was computed between check point coordinates measured with dGPS and coordinates retrieved from georeferenced image mosaics. Additionally, image datasets from different UAV platforms (i.e. DJI Phantom 4Pro, DJI Phantom 3 professional, and DJI Inspire 1 Pro) were acquired over the same area and the spatial accuracy of the orthoimages was evaluated.
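The checkpoint-based accuracy assessment described above reduces to a root mean square error between two coordinate lists; a minimal sketch follows, with hypothetical checkpoint coordinates.

```python
import numpy as np

def checkpoint_rmse(measured_xy, mosaic_xy):
    """RMSE between dGPS checkpoint coordinates and the coordinates of the
    same targets read from a georeferenced mosaic (both as N x 2 arrays)."""
    d = np.asarray(measured_xy) - np.asarray(mosaic_xy)
    return np.sqrt(np.mean(np.sum(d**2, axis=1)))

# Hypothetical checkpoints (metres, map projection coordinates).
gps    = [[500100.12, 4649800.40], [500152.88, 4649835.10]]
mosaic = [[500100.19, 4649800.33], [500152.80, 4649835.21]]
print(f"RMSE: {checkpoint_rmse(gps, mosaic):.3f} m")
```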
Geometric calibration of lens and filter distortions for multispectral filter-wheel cameras.
Brauers, Johannes; Aach, Til
2011-02-01
High-fidelity color image acquisition with a multispectral camera utilizes optical filters to separate the visible electromagnetic spectrum into several passbands. This is often realized with a computer-controlled filter wheel, where each position is equipped with an optical bandpass filter. For each filter wheel position, a grayscale image is acquired, and the passbands are finally combined into a multispectral image. However, the different optical properties and non-coplanar alignment of the filters cause image aberrations, since the optical path is slightly different for each filter wheel position. As in a normal camera system, the lens causes additional wavelength-dependent image distortions called chromatic aberrations. When transforming the multispectral image with these aberrations into an RGB image, color fringes appear, and the image exhibits a pincushion or barrel distortion. In this paper, we address both the distortions caused by the lens and by the filters. Based on a physical model of the bandpass filters, we show that the aberrations caused by the filters can be modeled by displaced image planes. The lens distortions are modeled by an extended pinhole camera model, which results in a remaining mean calibration error of only 0.07 pixels. Using an absolute calibration target, we then geometrically calibrate each passband and compensate for both lens and filter distortions simultaneously. We show that both types of aberrations can be compensated and present detailed results on the remaining calibration errors.
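A minimal sketch of the kind of per-passband distortion correction described above, using a two-coefficient Brown radial model with placeholder coefficients; the paper's full model additionally accounts for filter-induced displacements of the image plane.

```python
import numpy as np

def undistort_points(pts, fc, cc, k1, k2):
    """Remove radial distortion from pixel coordinates with a two-coefficient
    Brown model, iteratively inverting the forward model. Per-passband
    coefficients (k1, k2) would come from the calibration; the values used
    below are placeholders."""
    x = (pts[:, 0] - cc[0]) / fc
    y = (pts[:, 1] - cc[1]) / fc
    xu, yu = x.copy(), y.copy()
    for _ in range(10):                          # fixed-point iteration
        r2 = xu**2 + yu**2
        factor = 1 + k1 * r2 + k2 * r2**2
        xu, yu = x / factor, y / factor
    return np.column_stack([xu * fc + cc[0], yu * fc + cc[1]])

pts = np.array([[1200.0, 800.0], [10.0, 20.0]])
print(undistort_points(pts, fc=2800.0, cc=(1024.0, 768.0), k1=-0.12, k2=0.03))
```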
Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt
NASA Technical Reports Server (NTRS)
Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.
1977-01-01
Remote sensing airborne surveys were conducted, as part of the rehabilitation plan for the Suez Canal Zone, using an I2S multispectral camera and a Bendix LN-3 passive infrared scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near-infrared bands. The scanner was operated in the thermal infrared band of 8 to 14 microns, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were utilized in the construction of new geological, structural lineation, and drainage maps for the Suez Canal Zone at a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.
Analyzing RCD30 Oblique Performance in a Production Environment
NASA Astrophysics Data System (ADS)
Soler, M. E.; Kornus, W.; Magariños, A.; Pla, M.
2016-06-01
In 2014 the Institut Cartogràfic i Geològic de Catalunya (ICGC) decided to incorporate digital oblique imagery in its portfolio in response to the growing demand for this product. The reason can be attributed to its useful applications in a wide variety of fields and, most recently, to an increasing interest in 3D modeling. The selection phase for a digital oblique camera led to the purchase of the Leica RCD30 Oblique system, an 80-MPixel multispectral medium-format camera which consists of one Nadir camera and four oblique viewing cameras acquiring images at an off-Nadir angle of 35º. The system also has a multi-directional motion compensation on-board system to deliver the highest image quality. The emergence of airborne oblique cameras has run in parallel to the inclusion of computer vision algorithms into traditional photogrammetric workflows. Such algorithms rely on having multiple views of the same area of interest and take advantage of the image redundancy for automatic feature extraction. The multiview capability is highly fostered by the use of oblique systems, which capture different points of view simultaneously with each camera shot. Different companies and national mapping agencies (NMAs) have started pilot projects to assess the capabilities of the 3D mesh that can be obtained using correlation techniques. Beyond a software prototyping phase, and taking into account the currently immature state of several components of the oblique imagery workflow, the ICGC has focused on deploying a real production environment with special interest in matching the performance and quality of the existing production lines based on classical Nadir images. This paper introduces different test scenarios and layouts to analyze the impact of different variables on the geometric and radiometric performance. Variables such as flight altitude, side and forward overlap, and ground control point measurements and location have been considered for the evaluation of aerial triangulation and stereo plotting. Furthermore, two different flight configurations have been designed to measure the quality of the absolute radiometric calibration and the resolving power of the system. To quantify the effective resolving power of RCD30 Oblique images, a tool based on the computation of the Line Spread Function has been developed. The tool processes a region of interest that contains a single contour in order to extract a numerical measure of edge smoothness for a given flight session. The ICGC is highly devoted to deriving information from satellite and airborne multispectral remote sensing imagery. A seamless Normalized Difference Vegetation Index (NDVI) retrieved from Digital Metric Camera (DMC) reflectance imagery is one of the products in ICGC's portfolio. As an evolution of this well-defined product, this paper presents an evaluation of the absolute radiometric calibration of the RCD30 Oblique sensor. To assess the quality of the measure, the ICGC has developed a procedure based on simultaneous acquisition of RCD30 Oblique imagery and radiometrically calibrated AISA (Airborne Hyperspectral Imaging System) imagery.
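The line-spread-function measurement can be sketched as below, assuming a region of interest that contains a single near-vertical edge: average rows to get the edge spread function, differentiate to get the LSF, and report its width. This is a rough stand-in for the ICGC tool, whose internals are not published here.

```python
import numpy as np

def edge_smoothness(roi):
    """Estimate edge smoothness from a grayscale ROI containing one roughly
    vertical contour: the edge spread function (ESF) is the row-averaged
    profile, the line spread function (LSF) is its derivative, and the
    returned metric is the LSF's full width at half maximum in pixels."""
    esf = roi.mean(axis=0)                     # average across rows
    lsf = np.abs(np.diff(esf))                 # derivative of the ESF
    half = lsf.max() / 2.0
    above = np.where(lsf >= half)[0]
    return above[-1] - above[0] + 1            # FWHM in pixels

# Synthetic ROI: a blurred step edge (sigmoid profile repeated over 20 rows).
x = np.linspace(-5, 5, 100)
roi = np.tile(1.0 / (1.0 + np.exp(-2 * x)), (20, 1))
print(edge_smoothness(roi))
```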
NASA Astrophysics Data System (ADS)
Frouin, Robert; Deschamps, Pierre-Yves; Rothschild, Richard; Stephan, Edward; Leblanc, Philippe; Duttweiler, Fred; Ghaemi, Tony; Riedi, Jérôme
2006-12-01
The Monitoring Aerosols in the Ultraviolet Experiment (MAUVE) and the Short-Wave Infrared Polarimeter Experiment (SWIPE) instruments have been designed to collect, from a typical sun-synchronous polar orbit at 800 km altitude, global observations of the spectral, polarized, and directional radiance reflected by the earth-atmosphere system for a wide range of applications. Based on the heritage of the POLDER radiometer, the MAUVE/SWIPE instrument concept combines the merits of TOMS for observing in the ultraviolet, MISR for wide field-of-view range, MODIS for multi-spectral aspects in the visible and near infrared, and the POLDER instrument for polarization. The instruments are camera systems with 2-dimensional detector arrays, allowing a 120-degree field-of-view with adequate ground resolution (i.e., 0.4 or 0.8 km at nadir) from satellite altitude. Multi-angle viewing is achieved by the along-track migration at spacecraft velocity of the 2-dimensional field-of-view. Between the cameras' optical assembly and detector array are two filter wheels, one carrying spectral filters, the other polarizing filters, allowing measurements of the first three Stokes parameters, I, Q, and U, of the incident radiation in 16 spectral bands optimally placed in the interval 350-2200 nm. The spectral range is 350-1050 nm for the MAUVE instrument and 1050-2200 nm for the SWIPE instrument. The radiometric requirements are defined to fully exploit the multi-angular, multi-spectral, and multi-polarized capability of the instruments. These include a wide dynamic range, a signal-to-noise ratio above 500 in all channels at maximum radiance level, i.e., when viewing a surface target of albedo equal to 1, and a noise-equivalent differential reflectance better than 0.0005 at low signal level for a sun at zenith. To achieve daily global coverage, a pair of MAUVE and SWIPE instruments would be carried by each of two mini-satellites placed on interlaced orbits. The equator crossing times of the two satellites would be adjusted to allow simultaneous observations of the overlapping zone viewed from the two parallel orbits of the twin satellites. Using twin satellites instead of a single satellite would allow measurements over a more complete range of scattering angles. A MAUVE/SWIPE satellite mission would improve significantly the accuracy of ocean color observations from space and would extend the retrieval of ocean optical properties to the ultraviolet, where they become very sensitive to detritus material and dissolved organic matter. It would also provide a complete description of the scattering and absorption properties of aerosol particles, as well as their size distribution and vertical distribution. Over land, the retrieved bidirectional reflectance function would allow a better classification of terrestrial vegetation and discrimination of surface types. The twin-satellite concept, by providing stereoscopic capability, would offer the possibility to analyze the three-dimensional structure and radiative properties of cloud fields.
NASA Astrophysics Data System (ADS)
Wicaksono, Pramaditya; Salivian Wisnu Kumara, Ignatius; Kamal, Muhammad; Afif Fauzan, Muhammad; Zhafarina, Zhafirah; Agus Nurswantoro, Dwi; Noviaris Yogyantoro, Rifka
2017-12-01
Although spectrally different, seagrass species may not be able to be mapped from multispectral remote sensing images due to the limitation of their spectral resolution. Therefore, it is important to quantitatively assess the possibility of mapping seagrass species using multispectral images by resampling seagrass species spectra to multispectral bands. Seagrass species spectra were measured on harvested seagrass leaves. The spectral resolution of the multispectral images used in this research was adopted from WorldView-2, Quickbird, Sentinel-2A, ASTER VNIR, and Landsat 8 OLI. These images are widely available and can serve as a good representative baseline for previous or future remote sensing images. The seagrass species considered in this research are Enhalus acoroides (Ea), Thalassodendron ciliatum (Tc), Thalassia hemprichii (Th), Cymodocea rotundata (Cr), Cymodocea serrulata (Cs), Halodule uninervis (Hu), Halodule pinifolia (Hp), Syringodium isoetifolium (Si), Halophila ovalis (Ho), and Halophila minor (Hm). The multispectral resampling analysis indicates that the resampled spectra exhibit shapes and patterns similar to the original spectra but are less precise, and they lose the unique absorption features of the seagrass species. Relying on spectral bands alone, multispectral images are not effective in mapping these seagrass species individually, as shown by the poor and inconsistent results of the Spectral Angle Mapper (SAM) classification technique in classifying seagrass species using seagrass species spectra as pure endmembers. Only Sentinel-2A produced an acceptable classification result using SAM.
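For reference, the SAM technique used above amounts to measuring the angle between a pixel spectrum and an endmember spectrum; a minimal sketch follows, with hypothetical four-band resampled spectra.

```python
import numpy as np

def spectral_angle(s, ref):
    """Spectral Angle Mapper distance (radians) between a pixel spectrum s
    and a reference endmember ref; smaller angles mean closer matches."""
    s, ref = np.asarray(s, float), np.asarray(ref, float)
    cos = np.dot(s, ref) / (np.linalg.norm(s) * np.linalg.norm(ref))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Hypothetical 4-band spectra resampled to a multispectral sensor.
pixel = [0.04, 0.07, 0.05, 0.30]
ea    = [0.05, 0.08, 0.06, 0.35]   # stand-in Enhalus acoroides endmember
th    = [0.03, 0.09, 0.04, 0.20]   # stand-in Thalassia hemprichii endmember
print(min((spectral_angle(pixel, e), name) for e, name in [(ea, "Ea"), (th, "Th")]))
```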
Fabrication of multi-focal microlens array on curved surface for wide-angle camera module
NASA Astrophysics Data System (ADS)
Pan, Jun-Gu; Su, Guo-Dung J.
2017-08-01
In this paper, we present a wide-angle and compact camera module that consists of a microlens array with different focal lengths on a curved surface. The design integrates the principles of an insect's compound eye and the human eye: it contains a curved hexagonal microlens array and a spherical lens. Normal mobile phone cameras usually need no fewer than four lenses, whereas our proposed system uses only one. Furthermore, the thickness of our proposed system is only 2.08 mm, and the diagonal full field of view is about 100 degrees. To make the critical microlens array, we used inkjet printing to control the surface shape of each microlens to achieve different focal lengths, and a replication method to form the curved hexagonal microlens array.
NASA Astrophysics Data System (ADS)
Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.
1990-10-01
Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are pushbroom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.
First Results from the Wide Angle Camera of the ROSETTA Mission
NASA Astrophysics Data System (ADS)
Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; De Cecco, M.; Parzianello, G.; Zaccariotto, M.; Da Deppo, V.; Naletto, G.
This paper gives a brief description of the Wide Angle Camera (WAC), built by the Center of Studies and Activities for Space (CISAS) of the University of Padova for the ESA ROSETTA Mission, of the data we have obtained about the new mission targets, and of the first results achieved after the launch in March 2004. The WAC is part of the OSIRIS imaging system, built under the PI-ship of Dr. U. Keller (Max-Planck-Institute for Solar System Studies), which also comprises a Narrow Angle Camera (NAC) built by the Laboratoire d'Astrophysique Spatiale (LAS) of Marseille. CISAS also had the responsibility to build the shutter and the front door mechanism for the NAC. The images show the excellent optical quality of the WAC, exceeding the specifications in terms of encircled energy (80% in one pixel over a FoV of 12×12 square degrees), limiting magnitude (fainter than 13th magnitude in a 30 s exposure through a wideband red filter), and low distortion.
Generalized assorted pixel camera: postcapture control of resolution, dynamic range, and spectrum.
Yasuma, Fumihito; Mitsunaga, Tomoo; Iso, Daisuke; Nayar, Shree K
2010-09-01
We propose the concept of a generalized assorted pixel (GAP) camera, which enables the user to capture a single image of a scene and, after the fact, control the tradeoff between spatial resolution, dynamic range and spectral detail. The GAP camera uses a complex array (or mosaic) of color filters. A major problem with using such an array is that the captured image is severely under-sampled for at least some of the filter types. This leads to reconstructed images with strong aliasing. We make four contributions in this paper: 1) we present a comprehensive optimization method to arrive at the spatial and spectral layout of the color filter array of a GAP camera. 2) We develop a novel algorithm for reconstructing the under-sampled channels of the image while minimizing aliasing artifacts. 3) We demonstrate how the user can capture a single image and then control the tradeoff of spatial resolution to generate a variety of images, including monochrome, high dynamic range (HDR) monochrome, RGB, HDR RGB, and multispectral images. 4) Finally, the performance of our GAP camera has been verified using extensive simulations that use multispectral images of real world scenes. A large database of these multispectral images has been made available at http://www1.cs.columbia.edu/CAVE/projects/gap_camera/ for use by the research community.
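Reconstruction of an under-sampled CFA channel can be illustrated with normalized-convolution interpolation: smooth the masked samples and divide by the smoothed mask. This is only a simple stand-in for the paper's aliasing-minimizing algorithm.

```python
import numpy as np
from scipy.signal import convolve2d

def interpolate_sparse_channel(values, mask, ksize=5):
    """Fill in an under-sampled CFA channel by normalized convolution.
    A simple stand-in for the GAP camera's reconstruction, which is
    considerably more sophisticated."""
    kernel = np.ones((ksize, ksize))
    num = convolve2d(values * mask, kernel, mode="same")
    den = convolve2d(mask.astype(float), kernel, mode="same")
    return np.where(den > 0, num / den, 0.0)

rng = np.random.default_rng(1)
img = rng.random((32, 32))
mask = np.zeros((32, 32))
mask[::4, ::4] = 1            # channel sampled at every 4th pixel in each axis
print(interpolate_sparse_channel(img, mask).shape)
```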
Registration of 3D and Multispectral Data for the Study of Cultural Heritage Surfaces
Chane, Camille Simon; Schütze, Rainer; Boochs, Frank; Marzani, Franck S.
2013-01-01
We present a technique for the multi-sensor registration of featureless datasets based on the photogrammetric tracking of the acquisition systems in use. This method is developed for the in situ study of cultural heritage objects and is tested by digitizing a small canvas successively with a 3D digitization system and a multispectral camera while simultaneously tracking the acquisition systems with four cameras and using a cubic target frame with a side length of 500 mm. The achieved tracking accuracy is better than 0.03 mm spatially and 0.150 mrad angularly. This allows us to seamlessly register the 3D acquisitions and to project the multispectral acquisitions on the 3D model. PMID:23322103
NASA Astrophysics Data System (ADS)
Swain, Pradyumna; Mark, David
2004-09-01
The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised in the presence of curved CCD applications, in conjunction with large-format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associated wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than their flat-CCD counterparts, lightening the cameras and enabling faster scanning and 3D integration of objects moving within a planetary terrain environment. Preliminary experiments conducted at the Sarnoff Corporation indicate the feasibility of curved CCD imagers with acceptable electro-optic integrity. Currently, we are in the process of evaluating the electro-optic performance of a curved wafer-scale CCD imager. Detailed ray trace modeling and experimental electro-optical performance data obtained from the curved imager will be presented at the conference.
Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish
2018-01-01
Purpose: The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Methods: Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Results: Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Conclusion: Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching. PMID:29283133
Omnidirectional Underwater Camera Design and Calibration
Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David
2015-01-01
This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
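The explicit per-ray modeling mentioned above rests on the vector form of Snell's law; a minimal sketch chaining an air-glass-water flat port follows, with assumed refractive indices. The paper's simulator handles general housing geometries.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at an interface with unit normal n (pointing
    back toward the incident medium), using Snell's law in vector form;
    returns None at total internal reflection."""
    d, n = np.asarray(d, float), np.asarray(n, float)
    r = n1 / n2
    c = -np.dot(n, d)
    s2 = r**2 * (1.0 - c**2)
    if s2 > 1.0:
        return None                     # total internal reflection
    return r * d + (r * c - np.sqrt(1.0 - s2)) * n

ray = np.array([0.3, 0.0, 0.954])       # oblique ray heading toward the port
ray /= np.linalg.norm(ray)
normal = np.array([0.0, 0.0, -1.0])     # flat port normal, facing the camera
in_glass = refract(ray, normal, 1.00, 1.49)   # assumed air -> acrylic
in_water = refract(in_glass, normal, 1.49, 1.33)  # assumed acrylic -> water
print(in_water)
```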
Uncertainty in multispectral lidar signals caused by incidence angle effects
Nevalainen, Olli; Hakala, Teemu; Kaasalainen, Mikko
2018-01-01
Multispectral terrestrial laser scanning (TLS) is an emerging technology. Several manufacturers already offer commercial dual- or three-wavelength airborne laser scanners, while multispectral TLS is still carried out mainly with research instruments. Many of these research efforts have focused on the study of vegetation. The aim of this paper is to study the uncertainty of the measurement of spectral indices of vegetation with multispectral lidar. Using two spectral indices as examples, we find that the uncertainty is due to systematic errors caused by the wavelength dependency of laser incidence angle effects. This finding is empirical, and the error cannot be removed by modelling or instrument modification. The discovery and study of these effects has been enabled by hyperspectral and multispectral TLS, and it has become a subject of active research within the past few years. We summarize the most recent studies on multi-wavelength incidence angle effects and present new results on the effect of specular reflection from the leaf surface, and of the surface structure, which have been suggested to play a key role. We also discuss the consequences for the measurement of spectral indices with multispectral TLS, and a possible correction scheme using a synthetic laser footprint. PMID:29503718
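The core difficulty can be made concrete with a naive Lambertian correction: a wavelength-independent cosine factor cancels in a two-channel ratio index, so it cannot remove the wavelength-dependent angle effects the paper reports. Channels and values below are illustrative assumptions.

```python
import numpy as np

def cosine_corrected_index(i_nir, i_red, incidence_deg):
    """Apply a naive Lambertian (cosine) correction to two lidar channel
    intensities, then form an NDVI-style ratio index. Because the same
    cosine factor divides both channels, it cancels in the ratio: a
    wavelength-independent correction leaves the index unchanged."""
    cos_t = np.cos(np.radians(incidence_deg))
    nir, red = i_nir / cos_t, i_red / cos_t
    return (nir - red) / (nir + red)

# Identical to the uncorrected ratio, illustrating why per-wavelength
# incidence-angle effects need a wavelength-dependent treatment.
print(cosine_corrected_index(0.60, 0.20, 30.0))
```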
Severe storm environments: A Skylab EREP report
NASA Technical Reports Server (NTRS)
Pitts, D. E.; Sasaki, Y.; Lee, J. T. (Principal Investigator)
1978-01-01
The results from the severe storm experiment over Texas and Oklahoma are presented. Correlations among the data on soil moisture, water temperature, and cloud characteristics were considered. The sensors used in this study were multispectral band cameras, multispectral band scanners, infrared spectrometers, radiometers, and scatterometers.
Multispectral Photography: the obscure becomes the obvious
ERIC Educational Resources Information Center
Polgrean, John
1974-01-01
Commonly used in map making, real estate zoning, and highway route location, aerial photography planes equipped with multispectral cameras may, among many environmental applications, now be used to locate mineral deposits, define marshland boundaries, study water pollution, and detect diseases in crops and forests. (KM)
A Full View of Pluto's Stunning Crescent
2015-10-29
In September, NASA's New Horizons team released a stunning but incomplete image of Pluto's crescent. Thanks to new processing work by the science team, New Horizons is releasing the entire, breathtaking image of Pluto. This image was made just 15 minutes after New Horizons' closest approach to Pluto on July 14, 2015, as the spacecraft looked back at Pluto toward the sun. The wide-angle perspective of this view shows the deep haze layers of Pluto's atmosphere extending all the way around Pluto, revealing the silhouetted profiles of rugged plateaus on the night (left) side. The shadow of Pluto cast on its atmospheric hazes can also be seen at the uppermost part of the disk. On the sunlit side of Pluto (right), the smooth expanse of the informally named icy plain Sputnik Planum is flanked to the west (above, in this orientation) by rugged mountains up to 11,000 feet (3,500 meters) high, including the informally named Norgay Montes in the foreground and Hillary Montes on the skyline. Below (east) of Sputnik, rougher terrain is cut by apparent glaciers. The backlighting highlights more than a dozen high-altitude layers of haze in Pluto's tenuous atmosphere. The horizontal streaks in the sky beyond Pluto are stars, smeared out by the motion of the camera as it tracked Pluto. The image was taken with New Horizons' Multi-spectral Visible Imaging Camera (MVIC) from a distance of 11,000 miles (18,000 kilometers) to Pluto. The resolution is 700 meters (0.4 miles).
Multispectral imaging system for contaminant detection
NASA Technical Reports Server (NTRS)
Poole, Gavin H. (Inventor)
2003-01-01
An automated inspection system for detecting digestive contaminants on food items as they are being processed for consumption includes a conveyor for transporting the food items, a light sealed enclosure which surrounds a portion of the conveyor, with a light source and a multispectral or hyperspectral digital imaging camera disposed within the enclosure. Operation of the conveyor, light source and camera are controlled by a central computer unit. Light reflected by the food items within the enclosure is detected in predetermined wavelength bands, and detected intensity values are analyzed to detect the presence of digestive contamination.
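A decision rule of the general flavor described (band intensities analyzed against thresholds) might look like the sketch below; the band indices and threshold are hypothetical and do not reproduce the patented system's actual rule.

```python
import numpy as np

def contamination_mask(cube, band_a, band_b, threshold):
    """Flag pixels whose two-band intensity ratio exceeds a threshold in a
    multispectral cube of shape (rows, cols, bands). Band choices and the
    threshold are placeholders for whatever the inspection system uses."""
    ratio = cube[:, :, band_a] / np.maximum(cube[:, :, band_b], 1e-6)
    return ratio > threshold

cube = np.random.default_rng(2).random((64, 64, 8))
mask = contamination_mask(cube, band_a=5, band_b=2, threshold=1.8)
print(mask.sum(), "pixels flagged")
```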
Highly Portable Airborne Multispectral Imaging System
NASA Technical Reports Server (NTRS)
Lehnemann, Robert; Mcnamee, Todd
2001-01-01
A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1 to 1.5 km. The system was developed especially for use in coastal environments and is well suited for remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.
Multispectral radiation envelope characteristics of aerial infrared targets
NASA Astrophysics Data System (ADS)
Kou, Tian; Zhou, Zhongliang; Liu, Hongqiang; Yang, Yuanzhi; Lu, Chunguang
2018-07-01
Multispectral detection signals are relatively stable and complementary to single spectral detection signals with deficiencies of severe scintillation and poor anti-interference. To take advantage of multispectral radiation characteristics in the application of infrared target detection, the concept of a multispectral radiation envelope is proposed. To build the multispectral radiation envelope model, the temperature distribution of an aerial infrared target is calculated first. By considering the coupling heat transfer process, the heat balance equation is built by using the node network, and the convective heat transfer laws as a function of target speed are uncovered. Then, the tail flame temperature distribution model is built and the temperature distributions at different horizontal distances are calculated. Second, to obtain the optimal detection angles, envelope models of reflected background multispectral radiation and target multispectral radiation are built. Finally, the envelope characteristics of the aerial target multispectral radiation are analyzed in different wavebands in detail. The results we obtained reflect Wien's displacement law and prove the effectiveness and reasonableness of the envelope model, and also indicate that the major difference between multispectral wavebands is greatly influenced by the target speed. Moreover, optimal detection angles are obtained by numerical simulation, and these are very important for accurate and fast target detection, attack decision-making and developing multispectral detection platforms.
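The radiance side of such envelope models starts from Planck's law and Wien's displacement law; a minimal sketch follows, with an assumed target temperature and band (not values from the paper).

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann (SI)

def planck_radiance(wavelength_m, temp_k):
    """Planck spectral radiance (W sr^-1 m^-3) of a blackbody, the starting
    point for any band-integrated target radiance model."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

temp = 600.0                                # assumed hot-part temperature, K
peak_um = 2898.0 / temp                     # Wien's law (b = 2898 um K)
band = np.linspace(3e-6, 5e-6, 200)         # mid-wave IR band, 3-5 um
band_radiance = np.sum(planck_radiance(band, temp)) * (band[1] - band[0])
print(f"peak at {peak_um:.2f} um, 3-5 um radiance {band_radiance:.1f} W sr^-1 m^-2")
```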
Solar System Portrait - View of the Sun, Earth and Venus
1996-09-13
This color image of the sun, Earth and Venus was taken by the Voyager 1 spacecraft Feb. 14, 1990, when it was approximately 32 degrees above the plane of the ecliptic and at a slant-range distance of approximately 4 billion miles. It is the first -- and may be the only -- time that we will ever see our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time with two narrow-angle pictures centered on each planet. The wide-angle was taken with the camera's darkest filter (a methane absorption band), and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system but is still eight million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of the brightness is a bright burned out image with multiple reflections from the optics in the camera. The "rays" around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaiced into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce a color image. The violet, green and blue filters were used; exposure times were, for the Earth image, 0.72, 0.48 and 0.72 seconds, and for the Venus frame, 0.36, 0.24 and 0.36, respectively. Although the planetary pictures were taken with the narrow-angle camera (1500 mm focal length) and were not pointed directly at the sun, they show the effects of the glare from the nearby sun, in the form of long linear streaks resulting from the scattering of sunlight off parts of the camera and its sun shade. From Voyager's great distance both Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. Detailed analysis also suggests that Voyager detected the moon as well, but it is too faint to be seen without special processing. Venus was only 0.11 pixel in diameter. The faint colored structure in both planetary frames results from sunlight scattered in the optics. http://photojournal.jpl.nasa.gov/catalog/PIA00450
Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.
Song, Kai-Tai; Tai, Jen-Chao
2006-10-01
Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Interesting experimental results are presented to validate the robustness and accuracy of the proposed method.
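Once focal length and tilt are recovered, vehicle positions (and hence speeds) follow from back-projecting tracked image points onto the road plane. Below is a simplified zero-pan sketch of that geometry; the paper additionally estimates the pan angle, and the calibration values used here are hypothetical.

```python
import numpy as np

def pixel_to_ground(u, v, f, tilt, height):
    """Project an image point (u right, v down, pixels relative to the
    principal point) onto the road plane for a camera at the given height
    with zero pan, focal length f (pixels), and downward tilt (radians).
    Returns (lateral, longitudinal) ground coordinates in metres."""
    ct, st = np.cos(tilt), np.sin(tilt)
    denom = v * ct + f * st        # > 0 for rays that hit the ground ahead
    if denom <= 0:
        raise ValueError("ray does not intersect the ground plane")
    t = height / denom
    return t * u, t * (f * ct - v * st)

# Hypothetical calibration: f = 1000 px, tilt = 10 degrees, camera 8 m high.
print(pixel_to_ground(50.0, 120.0, 1000.0, np.radians(10.0), 8.0))
```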
2011-07-01
Cameras were installed around the test pan and an underwater GoPro video camera recorded the fire from below the layer of fuel. ... A GoPro video camera with a wide-angle lens recorded the tests. ... camera and the GoPro video camera were not used for fire suppression experiments. ... Two 1/4-in thick stainless steel test pans were ...
Imaging During MESSENGER's Second Flyby of Mercury
NASA Astrophysics Data System (ADS)
Chabot, N. L.; Prockter, L. M.; Murchie, S. L.; Robinson, M. S.; Laslo, N. R.; Kang, H. K.; Hawkins, S. E.; Vaughan, R. M.; Head, J. W.; Solomon, S. C.; MESSENGER Team
2008-12-01
During MESSENGER's second flyby of Mercury on October 6, 2008, the Mercury Dual Imaging System (MDIS) will acquire 1287 images. The images will include coverage of about 30% of Mercury's surface not previously seen by spacecraft. A portion of the newly imaged terrain will be viewed during the inbound portion of the flyby. On the outbound leg, MDIS will image additional previously unseen terrain as well as regions imaged under different illumination geometry by Mariner 10. These new images, when combined with images from Mariner 10 and from MESSENGER's first Mercury flyby, will enable the first regional-resolution global view of Mercury constituting a combined total coverage of about 96% of the planet's surface. MDIS consists of both a Wide Angle Camera (WAC) and a Narrow Angle Camera (NAC). During MESSENGER's second Mercury flyby, the following imaging activities are planned: about 86 minutes before the spacecraft's closest pass by the planet, the WAC will acquire images through 11 different narrow-band color filters of the approaching crescent planet at a resolution of about 5 km/pixel. At slightly less than 1 hour to closest approach, the NAC will acquire a 4-column x 11-row mosaic with an approximate resolution of 450 m/pixel. At 8 minutes after closest approach, the WAC will obtain the highest-resolution multispectral images to date of Mercury's surface, imaging a portion of the surface through 11 color filters at resolutions of about 250-600 m/pixel. A strip of high-resolution NAC images, with a resolution of approximately 100 m/pixel, will follow these WAC observations. The NAC will next acquire a 15-column x 13-row high-resolution mosaic of the northern hemisphere of the departing planet, beginning approximately 21 minutes after closest approach, with resolutions of 140-300 m/pixel; this mosaic will fill a large gore in the Mariner 10 data. At about 42 minutes following closest approach, the WAC will acquire a 3x3, 11-filter, full-planet mosaic with an average resolution of 2.5 km/pixel. Two NAC mosaics of the entire departing planet will be acquired beginning about 66 minutes after closest approach, with resolutions of 500-700 m/pixel. About 89 minutes following closest approach, the WAC will acquire a multispectral image set with a resolution of about 5 km/pixel. Following this WAC image set, MDIS will continue to acquire occasional images with both the WAC and NAC until 20 hours after closest approach, at which time the flyby data will begin being transmitted to Earth.
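The quoted pixel scales follow from slant range times the camera's per-pixel instantaneous field of view; the sketch below uses an assumed round-number IFOV rather than the published MDIS value.

```python
def ground_sample_distance(range_km, ifov_urad):
    """Ground sample distance (m/pixel) for a framing camera: slant range
    times the per-pixel instantaneous field of view (small-angle form).
    The IFOV used in the example is an assumed round number."""
    return range_km * 1000.0 * ifov_urad * 1e-6

print(ground_sample_distance(range_km=3600.0, ifov_urad=25.0))  # ~90 m/pixel
```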
Have a Nice Spring! MOC Revisits "Happy Face" Crater
2005-05-16
Smile! Spring has sprung in the martian southern hemisphere. With it comes the annual retreat of the winter polar frost cap. This view of "Happy Face Crater"--officially named "Galle Crater"--shows patches of white water ice frost in and around the crater's south-facing slopes. Slopes that face south will retain frost longer than north-facing slopes because they do not receive as much sunlight in early spring. This picture is a composite of images taken by the Mars Global Surveyor Mars Orbiter Camera (MOC) red and blue wide angle cameras. The wide angle cameras were designed to monitor the changing weather, frost, and wind patterns on Mars. Galle Crater is located on the east rim of the Argyre Basin and is about 215 kilometers (134 miles) across. In this picture, illumination is from the upper left and north is up. http://photojournal.jpl.nasa.gov/catalog/PIA02325
Harry E. Brown
1962-01-01
The canopy camera is a device of new design that takes wide-angle, overhead photographs of vegetation canopies, cloud cover, topographic horizons, and similar subjects. Since the entire hemisphere is photographed in a single exposure, the resulting photograph is circular, with the horizon forming the perimeter and the zenith the center. Photographs of this type provide...
Evaluation of multispectral plenoptic camera
NASA Astrophysics Data System (ADS)
Meng, Lingfei; Sun, Ting; Kosoglow, Rich; Berkner, Kathrin
2013-01-01
Plenoptic cameras enable capture of a 4D lightfield, allowing digital refocusing and depth estimation from data captured with a compact portable camera. Whereas most work on plenoptic camera design has been based on a simplistic geometric-optics characterization of the optical path only, little work has been done on optimizing end-to-end system performance for a specific application. Such design optimization requires design tools that include careful parameterization of main lens elements, as well as microlens array and sensor characteristics. In this paper we are interested in evaluating the performance of a multispectral plenoptic camera, i.e. a camera with spectral filters inserted into the aperture plane of the main lens. Such a camera enables single-snapshot spectral data acquisition [1-3]. We first describe in detail an end-to-end imaging system model for a spectrally coded plenoptic camera that we briefly introduced in [4]. Different performance metrics are defined to evaluate the spectral reconstruction quality. We then present a prototype which is developed based on a modified DSLR camera containing a lenslet array on the sensor and a filter array in the main lens. Finally we evaluate the spectral reconstruction performance of a spectral plenoptic camera based on both simulation and measurements obtained from the prototype.
Phase Curves of Nix and Hydra from the New Horizons Imaging Cameras
NASA Astrophysics Data System (ADS)
Verbiscer, Anne J.; Porter, Simon B.; Buratti, Bonnie J.; Weaver, Harold A.; Spencer, John R.; Showalter, Mark R.; Buie, Marc W.; Hofgartner, Jason D.; Hicks, Michael D.; Ennico-Smith, Kimberly; Olkin, Catherine B.; Stern, S. Alan; Young, Leslie A.; Cheng, Andrew; (The New Horizons Team
2018-01-01
NASA’s New Horizons spacecraft’s voyage through the Pluto system centered on 2015 July 14 provided images of Pluto’s small satellites Nix and Hydra at viewing angles unattainable from Earth. Here, we present solar phase curves of the two largest of Pluto’s small moons, Nix and Hydra, observed by the New Horizons LOng Range Reconnaissance Imager and Multi-spectral Visible Imaging Camera, which reveal the scattering properties of their icy surfaces in visible light. Construction of these solar phase curves enables comparisons between the photometric properties of Pluto’s small moons and those of other icy satellites in the outer solar system. Nix and Hydra have higher visible albedos than those of other resonant Kuiper Belt objects and irregular satellites of the giant planets, but not as high as small satellites of Saturn interior to Titan. Both Nix and Hydra appear to scatter visible light preferentially in the forward direction, unlike most icy satellites in the outer solar system, which are typically backscattering.
USDA-ARS's Scientific Manuscript database
This research developed a multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet/blue LED excitation for detection of fecal contamination on Golden Delicious apples. Using a hyperspectral line-scan imaging system consisting of an EMCCD camera, spectrograph, an...
Digital Astronaut Photography: A Discovery Dataset for Archaeology
NASA Technical Reports Server (NTRS)
Stefanov, William L.
2010-01-01
Astronaut photography acquired from the International Space Station (ISS) using commercial off-the-shelf cameras offers a freely-accessible source for high to very high resolution (4-20 m/pixel) visible-wavelength digital data of Earth. Since ISS Expedition 1 in 2000, over 373,000 images of the Earth-Moon system (including land surface, ocean, atmospheric, and lunar images) have been added to the Gateway to Astronaut Photography of Earth online database (http://eol.jsc.nasa.gov ). Handheld astronaut photographs vary in look angle, time of acquisition, solar illumination, and spatial resolution. These attributes of digital astronaut photography result from a unique combination of ISS orbital dynamics, mission operations, camera systems, and the individual skills of the astronaut. The variable nature of astronaut photography makes the dataset uniquely useful for archaeological applications in comparison with more traditional nadir-viewing multispectral datasets acquired from unmanned orbital platforms. For example, surface features such as trenches, walls, ruins, urban patterns, and vegetation clearing and regrowth patterns may be accentuated by low sun angles and oblique viewing conditions (Fig. 1). High spatial resolution digital astronaut photographs can also be used with sophisticated land cover classification and spatial analysis approaches like Object Based Image Analysis, increasing the potential for use in archaeological characterization of landscapes and specific sites.
Feasibility study and quality assessment of unmanned aircraft system-derived multispectral images
NASA Astrophysics Data System (ADS)
Chang, Kuo-Jen
2017-04-01
The purpose of this study is to explore the precision and applicability of UAS-derived multispectral images. In this study, a Micro-MCA6 multispectral camera was mounted on a quadcopter. The Micro-MCA6 shoots the images of each single band synchronously. By means of geotagged images and control points, orthomosaic images of each single band were first generated at 14 cm resolution, and the complete multispectral image was merged from the 6 bands. In order to improve the spatial resolution, the 6-band image was fused with a 9 cm resolution image taken by an RGB camera. The quality of each single band was verified using control points and check points; the standard deviations of the errors are within 1 to 2 pixels for each band. The quality of the multispectral image was also compared with a 3 cm resolution orthomosaic RGB image gathered by the UAV on the same mission; here the standard deviations of the errors are within 2 to 3 pixels. The results show that the errors stem from image blur and from band dislocation in identifying object edges. Finally, the normalized difference vegetation index (NDVI) was extracted from the image to explore the condition of the vegetation and the nature of the environment. This study demonstrates the feasibility and capability of high resolution multispectral imaging.
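The NDVI referred to here is the standard normalized band difference. A minimal sketch, assuming co-registered NIR and red reflectance arrays; the values below are synthetic, not from the study:

```python
import numpy as np

def ndvi(nir, red, eps=1e-12):
    """Normalized difference vegetation index from co-registered band arrays."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Example with synthetic reflectance bands (values in [0, 1]).
rng = np.random.default_rng(1)
nir_band = rng.uniform(0.3, 0.6, size=(100, 100))    # vegetation reflects NIR
red_band = rng.uniform(0.05, 0.15, size=(100, 100))  # and absorbs red
print("mean NDVI:", ndvi(nir_band, red_band).mean())
```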
Laser- and Multi-Spectral Monitoring of Natural Objects from UAVs
NASA Astrophysics Data System (ADS)
Reiterer, Alexander; Frey, Simon; Koch, Barbara; Stemmler, Simon; Weinacker, Holger; Hoffmann, Annemarie; Weiler, Markus; Hergarten, Stefan
2016-04-01
The paper describes the research, development and evaluation of a lightweight sensor system for UAVs. The system is composed of three main components: (1) a laser scanning module, (2) a multi-spectral camera system, and (3) a processing/storage unit. All three components are newly developed. Besides measurement precision and frequency, the low weight has been one of the most challenging requirements. The current system has a total weight of about 2.5 kg and is designed as a self-contained unit (incl. storage and battery units). The main features of the system are: laser-based multi-echo 3D measurement at a wavelength of 905 nm (fully eye-safe), a measurement range up to 200 m, a measurement frequency of 40 kHz, a scanning frequency of 16 Hz, and a relative distance accuracy of 10 mm. The system is equipped with both GNSS and IMU. Alternatively, a multi-visual-odometry system has been integrated to estimate the trajectory of the UAV from image features (based on this system, a calculation of 3D coordinates without GNSS is possible). The integrated multi-spectral camera system is based on conventional CMOS image chips equipped with special sets of band-pass interference filters with a full width half maximum (FWHM) of 50 nm. Good results for calculating the normalized difference vegetation index (NDVI) and the wide dynamic range vegetation index (WDRVI) have been achieved using the band-pass interference filter set with a FWHM of 50 nm and exposure times between 5000 μs and 7000 μs. The system is currently used for monitoring natural objects and surfaces, such as forests, as well as for geo-risk analysis (landslides). By measuring 3D geometric and multi-spectral information, a reliable monitoring and interpretation of the dataset is possible. The paper gives an overview of the development steps, the system, the evaluation and first results.
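The WDRVI mentioned above differs from NDVI only by a weighting coefficient applied to the NIR band. A minimal sketch; the coefficient value 0.15 is an assumption taken from the range commonly used in the literature, not from this paper:

```python
import numpy as np

def wdrvi(nir, red, alpha=0.15, eps=1e-12):
    """Wide dynamic range vegetation index: a weighted NDVI in which
    alpha < 1 attenuates the NIR band so the index remains sensitive
    at high biomass, where plain NDVI saturates."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (alpha * nir - red) / (alpha * nir + red + eps)

# alpha = 1 recovers NDVI exactly.
print(wdrvi(0.5, 0.08), wdrvi(0.5, 0.08, alpha=1.0))
```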
Prediction of Viking lander camera image quality
NASA Technical Reports Server (NTRS)
Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.
1976-01-01
Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.
NASA Astrophysics Data System (ADS)
Haase, I.; Oberst, J.; Scholten, F.; Wählisch, M.; Gläser, P.; Karachevtseva, I.; Robinson, M. S.
2012-05-01
Newly acquired high resolution Lunar Reconnaissance Orbiter Camera (LROC) images allow accurate determination of the coordinates of Apollo hardware, sampling stations, and photographic viewpoints. In particular, the positions from where the Apollo 17 astronauts recorded panoramic image series, at the so-called “traverse stations”, were precisely determined for traverse path reconstruction. We analyzed observations made in Apollo surface photography as well as orthorectified orbital images (0.5 m/pixel) and Digital Terrain Models (DTMs) (1.5 m/pixel and 100 m/pixel) derived from LROC Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images. Key features captured in the Apollo panoramic sequences were identified in LROC NAC orthoimages. Angular directions of these features were measured in the panoramic images and fitted to the NAC orthoimage by applying least squares techniques. As a result, we obtained the surface panoramic camera positions to within 50 cm. At the same time, the camera orientations, North azimuth angles and distances to nearby features of interest were also determined. Here, initial results are shown for traverse station 1 (northwest of Steno Crater) as well as the Apollo Lunar Surface Experiment Package (ALSEP) area.
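The least-squares step can be illustrated with a simplified planar version of the problem: given map coordinates of features identified in the orthoimage and the azimuths measured to them in a panorama, a Gauss-Newton fit recovers the camera ground position. This is our simplification (known orientation, flat terrain), not the authors' actual adjustment:

```python
import numpy as np

def resect_position(landmarks, azimuths, x0, iters=20):
    """Estimate a camera ground position from azimuths to known landmarks.

    landmarks: (N, 2) map coordinates (e.g., read off an orthoimage).
    azimuths:  (N,) measured directions to those landmarks, radians,
               counted clockwise from map north (surveying convention).
    x0:        (2,) initial position guess.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        dx = landmarks[:, 0] - x[0]
        dy = landmarks[:, 1] - x[1]
        pred = np.arctan2(dx, dy)                       # azimuth from north
        res = np.angle(np.exp(1j * (azimuths - pred)))  # wrap to [-pi, pi]
        r2 = dx**2 + dy**2
        J = np.column_stack([dy / r2, -dx / r2])        # d(res)/d(x), d(res)/d(y)
        delta, *_ = np.linalg.lstsq(J, -res, rcond=None)
        x += delta
        if np.linalg.norm(delta) < 1e-9:
            break
    return x

# Example: recover a position from bearings to four mapped features.
true_pos = np.array([10.0, 20.0])
marks = np.array([[0.0, 0.0], [50.0, 10.0], [30.0, 60.0], [-20.0, 40.0]])
az = np.arctan2(marks[:, 0] - true_pos[0], marks[:, 1] - true_pos[1])
print(resect_position(marks, az, x0=[0.0, 0.0]))   # ~ [10. 20.]
```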
NASA Astrophysics Data System (ADS)
Da Deppo, V.; Naletto, G.; Nicolosi, P.; Zambolin, P.; De Cecco, M.; Debei, S.; Parzianello, G.; Ramous, P.; Zaccariotto, M.; Fornasier, S.; Verani, S.; Thomas, N.; Barthol, P.; Hviid, S. F.; Sebastian, I.; Meller, R.; Sierks, H.; Keller, H. U.; Barbieri, C.; Angrilli, F.; Lamy, P.; Rodrigo, R.; Rickman, H.; Wenzel, K. P.
2017-11-01
Rosetta is one of the cornerstone missions of the European Space Agency, with the goal of a rendezvous with comet 67P/Churyumov-Gerasimenko in 2014. The imaging instrument on board the satellite is OSIRIS (Optical, Spectroscopic and Infrared Remote Imaging System), a cooperation among several European institutes, which consists of two cameras: a Narrow (NAC) and a Wide Angle Camera (WAC). The WAC optical design is innovative: it adopts an all-reflecting, unvignetted and unobstructed two-mirror configuration that covers a 12° × 12° field of view with an F/5.6 aperture and gives a nominal contrast ratio of about 10⁻⁴. The flight model of this camera has been successfully integrated and tested in our laboratories, and has finally been integrated on the satellite, which is now awaiting launch in February 2004. In this paper we describe the optical characteristics of the camera and summarize the results so far obtained with the preliminary calibration data. The analysis of the optical performance of this model shows good agreement between theoretical performance and experimental results.
Automatic helmet-wearing detection for law enforcement using CCTV cameras
NASA Astrophysics Data System (ADS)
Wonghabut, P.; Kumphong, J.; Satiennam, T.; Ung-arunyawee, R.; Leelapatra, W.
2018-04-01
The objective of this research is to develop an application for enforcing helmet wearing using CCTV cameras. The developed application aims to assist law enforcement by police, eventually changing risk behaviours and consequently reducing the number of accidents and their severity. Conceptually, the application software, implemented in C++ with the OpenCV library, uses two CCTV cameras with different angles of view. Video frames recorded by the wide-angle CCTV camera are used to detect motorcyclists. If a motorcyclist without a helmet is found, the zoomed (narrow-angle) CCTV camera is activated to capture an image of the violating motorcyclist and the motorcycle license plate in real time. Captured images are managed by a MySQL database for ticket issuing. The results show that the developed program is able to detect 81% of motorcyclists across various motorcycle types during daytime and night-time. The validation results reveal that the program achieves 74% accuracy in detecting motorcyclists without helmets.
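The paper's detector itself is not published; purely as an illustration of the wide-angle detection stage of such a two-camera pipeline, the sketch below (in Python rather than the authors' C++, for brevity) substitutes OpenCV's stock HOG pedestrian detector for the motorcyclist detector, with the helmet classification and narrow-angle trigger left as comments. The video filename is hypothetical.

```python
import cv2

# Stock HOG pedestrian detector as a stand-in for the paper's
# (unpublished) motorcyclist detector.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("wide_angle_cctv.mp4")   # hypothetical input
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rects, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    for (x, y, w, h) in rects:
        # A trained helmet/no-helmet classifier would be applied to the
        # head region of each detection here; on a "no helmet" result,
        # the narrow-angle camera would be triggered to capture the
        # rider and license plate.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) == 27:   # Esc to quit
        break
cap.release()
```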
Rosetta/OSIRIS - Nucleus morphology and activity of comet 67P/Churyumov-Gerasimenko
NASA Astrophysics Data System (ADS)
Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rickman, Hans; Rodrigo, Rafael; Koschny, Detlef
2015-04-01
ESA's Rosetta mission arrived on August 6, 2014, at target comet 67P/Churyumov-Gerasimenko after 10 years of cruise. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. It comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for the wide field coma investigations. OSIRIS imaged the nucleus and coma of the comet from the arrival throughout the mapping phase, PHILAE landing, early escort phase and close fly-by. The overview paper will discuss the surface morphology and activity of the nucleus as seen in gas, dust, and local jets as well as small scale structures in the local topography.
Multispectral and colour analysis for ubiquinone solutions and biological samples
NASA Astrophysics Data System (ADS)
Timofeeva, Elvira O.; Gorbunova, Elena V.; Chertov, Aleksandr N.
2017-02-01
Oxidative damage to cell structures underlies most of the mechanisms that lead to disease and the senescence of the human body. The presence of antioxidant issues, such as redox potential imbalance, in the human body is a very important question for modern clinical diagnostics. Applying multispectral and colour analysis of human skin to the optical diagnostics of ubiquinone, an antioxidant widely distributed in the human body, can be one step toward a device for clinical diagnostics of redox potential or for quality control of cosmetics. In the current research, multispectral images of hand skin were recorded with a monochrome camera and a set of coloured filters. The recorded multispectral imaging data were processed using principal component analysis. Colour characteristics of the skin were also calculated before and after treatment with a facial mask containing ubiquinone, and the results of the mask treatment were compared with treatment using an oily ubiquinone solution. Although the results did not clearly discriminate healthy skin from skin stressed by reactive oxygen species, the methods described in this research can identify how the skin surface changes after antioxidant treatment. In the future it will be important to accompany the optical tests of human skin with biomedical tests.
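Principal component analysis of a multispectral stack reduces the band dimension to a few uncorrelated component images. A minimal sketch of the decomposition via SVD; the data here are synthetic and the function name is illustrative:

```python
import numpy as np

def multispectral_pca(cube, n_components=3):
    """PCA of an (H, W, bands) multispectral image via SVD.

    Returns the component images (H, W, n_components) and the fraction
    of total variance each retained component explains."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b).astype(np.float64)
    flat -= flat.mean(axis=0)                    # center each band
    u, s, vt = np.linalg.svd(flat, full_matrices=False)
    scores = flat @ vt[:n_components].T          # project onto components
    explained = (s**2 / np.sum(s**2))[:n_components]
    return scores.reshape(h, w, n_components), explained

rng = np.random.default_rng(3)
cube = rng.uniform(size=(32, 32, 7))
comps, var = multispectral_pca(cube)
print(comps.shape, var.round(3))
```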
Stand-off CWA imaging system: second sight MS
NASA Astrophysics Data System (ADS)
Bernascolle, Philippe F.; Elichabe, Audrey; Fervel, Franck; Haumonté, Jean-Baptiste
2012-06-01
In recent years, several manufacturers of IR imaging devices have launched commercial models applicable to a wide range of chemical species. These cameras are rugged and sufficiently sensitive to detect low concentrations of toxic and combustible gases. Bertin Technologies, specialized in the design and supply of innovative systems for industry, defense and health, has developed a stand-off gas imaging system using multi-spectral infrared imaging technology. With this system, the gas cloud size, localization and evolution can be displayed in real time. This technology was developed several years ago in partnership with the CEB, a French MoD CBRN organization. The goal was to meet the need for early warning of a chemical threat. With night-and-day efficiency at ranges up to 5 km, the system is able to detect Chemical Warfare Agents (CWA), critical Toxic Industrial Compounds (TIC) and flammable gases. The system has been adapted to detect industrial spillage using off-the-shelf uncooled infrared cameras, allowing 24/7 surveillance without costly frequent maintenance. The changes brought to the system comply with Military Specifications (MS) and primarily focus on signal processing that improves the classification of the detected products and on the simplification of the Human Machine Interface (HMI). Second Sight MS is the only mass-produced, passive stand-off CWA imaging system with a wide angle (up to 60°) already used by several regular armies around the world. This paper examines the performance of this IR gas imager when exposed to several CWA, TIC and simulant compounds. First, we describe the Second Sight MS system. The theory of the gas detection, visualization and classification functions has been described elsewhere, so we only summarize it here. We then present the main topic of this paper, the results of tests done in the laboratory on live agents and in the open field on simulants. The sensitivity threshold of the camera measured in the laboratory on some CWA (G, H agents...) and TIC (ammonia, sulfur dioxide...) is given, and the detection and visualization of a gas cloud in open-field testing with some simulants (DMMP, SF6) at far distance is also shown.
The Activity of Comet 67P/Churyumov-Gerasimenko as Seen by Rosetta/OSIRIS
NASA Astrophysics Data System (ADS)
Sierks, H.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Rickman, H.; Koschny, D.
2015-12-01
The Rosetta mission of the European Space Agency arrived on August 6, 2014, at the target comet 67P/Churyumov-Gerasimenko. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. OSIRIS consists of a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide field gas and dust coma investigations. OSIRIS observed the coma and the nucleus of comet 67P/C-G during approach, arrival, and the landing of PHILAE. OSIRIS continued comet monitoring and the mapping of surface and activity in 2015, with close high-resolution fly-bys and remote wide angle observations. The scientific results reveal a nucleus with two lobes and varied morphology. Active regions are located at steep cliffs and collapsed pits which form collimated gas jets. Dust is accelerated by the gas, forming bright jet filaments and the large scale, diffuse coma of the comet. We will present activity and surface changes observed in the Northern and Southern hemispheres and around perihelion passage.
Evaluation of modified portable digital camera for screening of diabetic retinopathy.
Chalam, Kakarla V; Brar, Vikram S; Keshavamurthy, Ravi
2009-01-01
To describe a portable wide-field noncontact digital camera for posterior segment photography. The digital camera has a compound lens consisting of two optical elements (a 90-dpt and a 20-dpt lens) attached to a 7.2-megapixel camera. White-light-emitting diodes are used to illuminate the fundus and reduce source reflection. The camera is set to candlelight mode, the optical zoom is standardized to ×2.4, and the focus is manually set to 3.0 m. The new technique provides quality wide-angle digital images of the retina (60°) in patients with dilated pupils, at a fraction of the cost of established digital fundus photography. The modified digital camera is a useful alternative technique for acquiring fundus images and provides a tool for screening posterior segment conditions, including diabetic retinopathy, in a variety of clinical settings.
NASA Technical Reports Server (NTRS)
1998-01-01
Under a Jet Propulsion Laboratory SBIR (Small Business Innovative Research) contract, Cambridge Research and Instrumentation Inc. developed a new class of filters for the construction of small, low-cost multispectral imagers. The VariSpec liquid crystal filter enables users to obtain multi-spectral, ultra-high resolution images using a monochrome CCD (charge coupled device) camera. Application areas include biomedical imaging, remote sensing, and machine vision.
NASA Technical Reports Server (NTRS)
Park, Jung- Ho; Kim, Jongmin; Zukic, Muamer; Torr, Douglas G.
1994-01-01
We report the design of multilayer reflective filters for the self-filtering cameras of the NUVIEWS project. Wide angle self-filtering cameras were designed to image the C IV (154.9 nm) line emission and H2 Lyman band fluorescence (centered at 161 nm) over a 20 deg x 30 deg field of view. A key element of the filter design includes the development of pi-multilayers optimized to provide maximum reflectance at 154.9 nm and 161 nm for the respective cameras without significant spectral sensitivity to the large cone angle of the incident radiation. We applied self-filtering concepts to design NUVIEWS telescope filters that are composed of three reflective mirrors and one folding mirror. The filters, with narrow bandwidths of 6 and 8 nm at 154.9 and 161 nm, respectively, have net throughputs of more than 50% with average blocking of out-of-band wavelengths better than 3 × 10⁻⁴%.
Afocal viewport optics for underwater imaging
NASA Astrophysics Data System (ADS)
Slater, Dan
2014-09-01
A conventional camera can be adapted for underwater use by enclosing it in a sealed waterproof pressure housing with a viewport. The viewport, as an optical interface between water and air, must account for both the camera and the water's optical characteristics while also providing a high pressure water seal. Limited hydrospace visibility drives a need for wide angle viewports. Practical optical interfaces between seawater and air vary from simple flat plate windows to complex water contact lenses. This paper first provides a brief overview of the physical and optical properties of the ocean environment along with suitable optical materials. This is followed by a discussion of the characteristics of various afocal underwater viewport types including flat windows, domes and the Ivanoff corrector lens, a derivative of a Galilean wide angle camera adapter. Several new and interesting optical designs derived from the Ivanoff corrector lens are presented, including a pair of very compact afocal viewport lenses that are compatible with both in-water and in-air environments and an afocal underwater hyper-hemispherical fisheye lens.
Dual multispectral and 3D structured light laparoscope
NASA Astrophysics Data System (ADS)
Clancy, Neil T.; Lin, Jianyu; Arya, Shobhit; Hanna, George B.; Elson, Daniel S.
2015-03-01
Intraoperative feedback on tissue function, such as blood volume and oxygenation, would be useful to the surgeon in cases where current clinical practice relies on subjective measures, such as identification of ischaemic bowel or tissue viability during anastomosis formation. Also, tissue surface profiling may be used to detect and identify certain pathologies, as well as diagnosing aspects of tissue health such as gut motility. In this paper a dual modality laparoscopic system is presented that combines multispectral reflectance and 3D surface imaging. White light illumination from a xenon source is detected by a laparoscope-mounted fast filter wheel camera to assemble a multispectral image (MSI) cube. Surface shape is then calculated using a spectrally-encoded structured light (SL) pattern detected by the same camera and triangulated using an active stereo technique. Images of porcine small bowel were acquired during open surgery. Tissue reflectance spectra were acquired and blood volume was calculated at each spatial pixel across the bowel wall and mesentery. SL features were segmented and identified using a 'normalised cut' algorithm and the colour vector of each spot. Using the 3D geometry defined by the camera coordinate system, the multispectral data could be overlaid onto the surface mesh. Dual MSI and SL imaging has the potential to provide augmented views to the surgeon, supplying diagnostic information related to blood supply health and organ function. Future work on this system will include filter optimisation to reduce noise in tissue optical property measurement, and minimising spot identification errors in the SL pattern.
Airborne camera and spectrometer experiments and data evaluation
NASA Astrophysics Data System (ADS)
Lehmann, F. F.; Bucher, T.; Pless, S.; Wohlfeil, J.; Hirschmüller, H.
2009-09-01
New stereo push broom camera systems have been developed at the German Aerospace Centre (DLR). The new small multispectral systems (Multi Functional Camerahead - MFC, Advanced Multispectral Scanner - AMS) are lightweight and compact and provide three or five RGB stereo lines of 8000, 10,000 or 14,000 pixels, which are used for stereo processing and the generation of Digital Surface Models (DSM) and near True Orthoimage Mosaics (TOM). Simultaneous acquisition with different types of MFC cameras for infrared and RGB data has been successfully tested. All spectral channels record the image data in full resolution; pan-sharpening is not necessary. Analogous to the line scanner data, an automatic processing chain exists for UltraCamD and UltraCamX. The different systems have been flown for different types of applications; the main fields of interest include environmental applications (flooding simulations, monitoring tasks, classification) and 3D modelling (e.g. city mapping). From the DSM and TOM data, Digital Terrain Models (DTM) and 3D city models are derived. Textures for the facades are taken from oblique orthoimages, which are created from the same input data as the TOM and the DSM. The resulting models are characterised by high geometric accuracy and the perfect fit of image data and DSM. DLR is permanently developing and testing a wide range of sensor types and imaging platforms for terrestrial and space applications. The MFC sensors have been flown in combination with laser systems and imaging spectrometers, and special data fusion products have been developed. These products include hyperspectral orthoimages and 3D hyperspectral data.
Colors of active regions on comet 67P
NASA Astrophysics Data System (ADS)
Oklay, N.; Vincent, J.-B.; Sierks, H.; Besse, S.; Fornasier, S.; Barucci, M. A.; Lara, L.; Scholten, F.; Preusker, F.; Lazzarin, M.; Pajola, M.; La Forgia, F.
2015-10-01
The OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) scientific imager (Keller et al. 2007) has been successfully delivering images of comet 67P/Churyumov-Gerasimenko from both its wide angle camera (WAC) and narrow angle camera (NAC) since the arrival of ESA's Rosetta spacecraft at the comet. Both cameras are equipped with filters covering the wavelength range of about 200 nm to 1000 nm. The comet nucleus is mapped with different combinations of the filters at resolutions up to 15 cm/px. Besides determining the surface morphology in great detail (Thomas et al. 2015), such high resolution images provided us a means to unambiguously link some activity in the coma to a series of pits on the nucleus surface (Vincent et al. 2015).
The Europa Imaging System (EIS): Investigating Europa's geology, ice shell, and current activity
NASA Astrophysics Data System (ADS)
Turtle, Elizabeth; Thomas, Nicolas; Fletcher, Leigh; Hayes, Alexander; Ernst, Carolyn; Collins, Geoffrey; Hansen, Candice; Kirk, Randolph L.; Nimmo, Francis; McEwen, Alfred; Hurford, Terry; Barr Mlinar, Amy; Quick, Lynnae; Patterson, Wes; Soderblom, Jason
2016-07-01
NASA's Europa Mission, planned for launch in 2022, will perform more than 40 flybys of Europa with altitudes at closest approach as low as 25 km. The instrument payload includes the Europa Imaging System (EIS), a camera suite designed to transform our understanding of Europa through global decameter-scale coverage, topographic and color mapping, and unprecedented sub-meter-scale imaging. EIS combines narrow-angle and wide-angle cameras to address these science goals: • Constrain the formation processes of surface features by characterizing endogenic geologic structures, surface units, global cross-cutting relationships, and relationships to Europa's subsurface structure and potential near-surface water. • Search for evidence of recent or current activity, including potential plumes. • Characterize the ice shell by constraining its thickness and correlating surface features with subsurface structures detected by ice penetrating radar. • Characterize scientifically compelling landing sites and hazards by determining the nature of the surface at scales relevant to a potential lander. EIS Narrow-angle Camera (NAC): The NAC, with a 2.3° x 1.2° field of view (FOV) and a 10-μrad instantaneous FOV (IFOV), achieves 0.5-m pixel scale over a 2-km-wide swath from 50-km altitude. A 2-axis gimbal enables independent targeting, allowing very high-resolution stereo imaging to generate digital topographic models (DTMs) with 4-m spatial scale and 0.5-m vertical precision over the 2-km swath from 50-km altitude. The gimbal also makes near-global (>95%) mapping of Europa possible at ≤50-m pixel scale, as well as regional stereo imaging. The NAC will also perform high-phase-angle observations to search for potential plumes. EIS Wide-angle Camera (WAC): The WAC has a 48° x 24° FOV, with a 218-μrad IFOV, and is designed to acquire pushbroom stereo swaths along flyby ground-tracks. From an altitude of 50 km, the WAC achieves 11-m pixel scale over a 44-km-wide swath, generating DTMs with 32-m spatial scale and 4-m vertical precision. These data also support characterization of surface clutter for interpretation of radar deep and shallow sounding modes. Detectors: The cameras have identical rapid-readout, radiation-hard 4k x 2k CMOS detectors and can image in both pushbroom and framing modes. Color observations are acquired by pushbroom imaging using six broadband filters (~300-1050 nm), allowing mapping of surface units for correlation with geologic structures, topography, and compositional units from other instruments.
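The quoted pixel scales follow directly from the instantaneous field of view multiplied by the slant range. A quick numerical check of the NAC and WAC figures at the 50-km reference altitude:

```python
# Ground pixel scale = IFOV (rad) * range (m).
altitude_m = 50_000
nac_ifov_rad = 10e-6
wac_ifov_rad = 218e-6
print(f"NAC pixel scale: {nac_ifov_rad * altitude_m:.2f} m")   # 0.50 m
print(f"WAC pixel scale: {wac_ifov_rad * altitude_m:.2f} m")   # 10.90 m
```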
NASA Technical Reports Server (NTRS)
1998-01-01
Color composite of condensate clouds over Tharsis made from red and blue images with a synthesized green channel. Mars Orbiter Camera wide angle frames from Orbit 48. Figure caption from Science Magazine.
Investigation of Parallax Issues for Multi-Lens Multispectral Camera Band Co-Registration
NASA Astrophysics Data System (ADS)
Jhan, J. P.; Rau, J. Y.; Haala, N.; Cramer, M.
2017-08-01
Multi-lens multispectral cameras (MSCs), such as the Micasense RedEdge and Parrot Sequoia, record multispectral information through separate lenses. Their light weight and small size make them well suited for mounting on an Unmanned Aerial System (UAS) to collect high spatial resolution images for vegetation investigation. However, the multi-sensor geometry of the multi-lens structure induces significant band misregistration in the original images, so band co-registration is necessary in order to obtain accurate spectral information. A robust and adaptive band-to-band image transform (RABBIT) is proposed to perform band co-registration of multi-lens MSCs. The first step obtains the camera rig information from camera system calibration and uses the calibrated results for image transformation and lens distortion correction. Since the calibration uncertainty leads to different amounts of systematic error, the last step optimizes the results in order to acquire better co-registration accuracy. Because parallax causes significant band misregistration when images are acquired closer to the targets, four datasets acquired with the RedEdge and Sequoia, including aerial and close-range imagery, were used to evaluate the performance of RABBIT. The results for aerial images show that RABBIT achieves sub-pixel accuracy, suitable for band co-registration of any multi-lens MSC. The close-range images show the same performance when band co-registration focuses on a specific target for 3D modelling, or when the target is equidistant from the camera.
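RABBIT itself relies on calibrated rig geometry. As a generic point of comparison only, the sketch below aligns one band to another with OpenCV's ECC image alignment, which estimates an affine warp directly from image intensities; this is a stand-in illustration, not the RABBIT algorithm, and assumes two grayscale band arrays of equal size.

```python
import cv2
import numpy as np

def coregister_band(reference, band):
    """Align one spectral band to a reference band with an affine warp
    estimated by ECC maximization, then resample it onto the reference
    grid."""
    ref = reference.astype(np.float32)
    mov = band.astype(np.float32)
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    _, warp = cv2.findTransformECC(ref, mov, warp, cv2.MOTION_AFFINE,
                                   criteria, None, 5)
    h, w = reference.shape
    return cv2.warpAffine(mov, warp, (w, h),
                          flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
```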
Geomorphologic mapping of the lunar crater Tycho and its impact melt deposits
NASA Astrophysics Data System (ADS)
Krüger, T.; van der Bogert, C. H.; Hiesinger, H.
2016-07-01
Using SELENE/Kaguya Terrain Camera and Lunar Reconnaissance Orbiter Camera (LROC) data, we produced a new, high-resolution (10 m/pixel), geomorphological and impact melt distribution map for the lunar crater Tycho. The distal ejecta blanket and crater rays were investigated using LROC wide-angle camera (WAC) data (100 m/pixel), while the fine-scale morphologies of individual units were documented using high resolution (∼0.5 m/pixel) LROC narrow-angle camera (NAC) frames. In particular, Tycho shows a large coherent melt sheet on the crater floor, melt pools and flows along the terraced walls, and melt pools on the continuous ejecta blanket. The crater floor of Tycho exhibits three distinct units, distinguishable by their elevation and hummocky surface morphology. The distribution of impact melt pools and ejecta, as well as topographic asymmetries, support the formation of Tycho as an oblique impact from the W-SW. The asymmetric ejecta blanket, significantly reduced melt emplacement uprange, and the depressed uprange crater rim at Tycho suggest an impact angle of ∼25-45°.
Programmable LED-based integrating sphere light source for wide-field fluorescence microscopy.
Rehman, Aziz Ul; Anwer, Ayad G; Goldys, Ewa M
2017-12-01
Wide-field fluorescence microscopy commonly uses a mercury lamp, which has limited spectral capabilities. We designed and built a programmable integrating sphere light (PISL) source which consists of nine LEDs, light-collecting optics, a commercially available integrating sphere and a baffle. The PISL source is tuneable in the range 365-490 nm with a uniform spatial profile and sufficient power at the objective to carry out spectral imaging. We retrofitted a standard inverted fluorescence microscope, a DM IRB (Leica), with a PISL source by mounting it together with a highly sensitive low-noise CMOS camera. The capabilities of the setup have been demonstrated by carrying out multispectral autofluorescence imaging of live BV2 cells.
NASA Astrophysics Data System (ADS)
Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.
2003-07-01
We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP), where, as in other applications, a small, lightweight, digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems, and the electronics is truly portable, fitting in a standard boardcase. The camera is designed to be offered at a competitive price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project on screening for ROP; telemedical applications are a perfect use for this camera system, exploiting both of its advantages: portability and digital imaging.
NASA Astrophysics Data System (ADS)
Pronichev, A. N.; Polyakov, E. V.; Tupitsyn, N. N.; Frenkel, M. A.; Mozhenkova, A. V.
2017-01-01
The article describes the use of computer optical microscopy with a multispectral camera to characterize the texture of bone marrow blasts in patients with different variants of acute lymphoblastic leukemia: B and T types. Specific characteristics of the chromatin of the blast nuclei were obtained for the different types of acute lymphoblastic leukemia.
Image denoising and deblurring using multispectral data
NASA Astrophysics Data System (ADS)
Semenishchev, E. A.; Voronin, V. V.; Marchuk, V. I.
2017-05-01
Decision-making systems are currently widespread. These systems are based on the analysis of video sequences and additional data, such as volume, change in size, the behavior of one or a group of objects, temperature gradients, and the presence of local areas with strong differences. Security and control systems are the main areas of application. Noise in the images strongly influences the subsequent processing and decision making. This paper considers the problem of primary signal processing for image denoising and deblurring of multispectral data. The additional information from multispectral channels can improve the efficiency of object classification. We use a method that combines information about the objects obtained by cameras in different frequency bands, based on simultaneous minimization of an L2 data term and the first-order squared differences of the sequence of estimates, to denoise the images and restore blur at the edges. In case of information loss, an approach is applied based on interpolation of data taken from the analysis of objects located in other areas and on information obtained from the multispectral camera. The effectiveness of the proposed approach is shown on a set of test images.
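One reading of the "simultaneous minimization of an L2 term and first-order squared differences" is the classical quadratic (Tikhonov) smoother sketched below; this is our interpretation for illustration, not the authors' exact formulation.

```python
import numpy as np

def smooth_l2(y, lam=5.0):
    """Denoise a 1-D signal by minimizing ||x - y||^2 + lam * ||Dx||^2,
    where D takes first-order differences; the minimizer solves the
    linear system (I + lam * D^T D) x = y."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)          # (n-1, n) difference operator
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, y)

# Example: smooth a noisy step signal and report the residual error.
rng = np.random.default_rng(2)
signal = np.concatenate([np.zeros(50), np.ones(50)])
noisy = signal + rng.normal(0, 0.1, size=100)
print(f"mean abs error after smoothing: "
      f"{np.abs(smooth_l2(noisy) - signal).mean():.4f}")
```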
Wide-Angle Polarimetric Camera for Korea Pathfinder Lunar Orbiter
NASA Astrophysics Data System (ADS)
Choi, Y. J.; Kim, S.; Kang, K. I.
2016-12-01
Polarimetric data contain valuable information about the lunar surface, such as the grain size and porosity of the regolith. However, polarimetry of the Moon from lunar orbit has not yet been performed. We plan to perform such polarimetry from lunar orbit with the Korea Pathfinder Lunar Orbiter (KPLO), which will be launched around 2018/2019 as the first Korean lunar mission. The Wide-Angle Polarimetric Camera (PolCam) has been selected as one of the onboard instruments for KPLO. The science objectives are: (1) to obtain polarization data of the whole lunar surface at wavelengths of 430 nm and 650 nm over the phase angle range from 0° to 120° with a spatial resolution of 80 m; (2) to obtain the reflectance ratios at 320 nm and 430 nm for the whole lunar surface with a spatial resolution of 80 m. We will summarize recent ground-based polarimetric observations of the lunar surface and briefly introduce the science rationale and operation concept of PolCam.
NASA Astrophysics Data System (ADS)
Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi
2013-06-01
Due to increasing security concerns, a complete security system should consist of two major components: a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under a given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be used in practice for a new customized sensor design, associated with given illuminations, for improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his or her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high resolution imagery for real-time behavior understanding, research in automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the intrinsic parameters of the PTZ cameras and their relative positions. Experimental results demonstrate that our proposed algorithm offers substantially reduced computational complexity and improved flexibility at the cost of slightly decreased pixel accuracy, as compared to Chen and Wang's method [18].
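The unified polynomial model is described only at a high level here. A minimal sketch of the general idea, fitting a low-degree bivariate polynomial from one camera's (pan, tilt) pointing angles to the other's over jointly observed targets; the synthetic mapping below is illustrative, not the paper's model:

```python
import numpy as np

def poly_features(pan, tilt, degree=2):
    """Bivariate monomials pan^i * tilt^j up to the given total degree."""
    return np.column_stack([pan**i * tilt**j
                            for i in range(degree + 1)
                            for j in range(degree + 1 - i)])

def fit_ptz_mapping(src, dst, degree=2):
    """Least-squares polynomial map from camera A's (pan, tilt) angles
    to camera B's, learned from joint observations of the same targets."""
    A = poly_features(src[:, 0], src[:, 1], degree)
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coeffs

def apply_ptz_mapping(coeffs, src, degree=2):
    return poly_features(src[:, 0], src[:, 1], degree) @ coeffs

# Example with a synthetic ground-truth mapping.
rng = np.random.default_rng(4)
src = rng.uniform(-30, 30, size=(40, 2))
dst = np.column_stack([0.9 * src[:, 0] + 5, 1.1 * src[:, 1] - 2])
c = fit_ptz_mapping(src, dst)
print(np.abs(apply_ptz_mapping(c, src) - dst).max())   # ~0
```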
Quasi-microscope concept for planetary missions.
Huck, F O; Arvidson, R E; Burcher, E E; Giat, O; Wall, S D
1977-09-01
Viking lander cameras have returned stereo and multispectral views of the Martian surface with a resolution that approaches 2 mm/lp in the near field. A two-orders-of-magnitude increase in resolution could be obtained for collected surface samples by augmenting these cameras with auxiliary optics that would neither impose special camera design requirements nor limit the cameras field of view of the terrain. Quasi-microscope images would provide valuable data on the physical and chemical characteristics of planetary regoliths.
1998-03-13
Color composite of condensate clouds over Tharsis made from red and blue images with a synthesized green channel. Mars Orbiter Camera wide angle frames from Orbit 48. http://photojournal.jpl.nasa.gov/catalog/PIA00812
Pancam: A Multispectral Imaging Investigation on the NASA 2003 Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.
2003-01-01
One of the six science payload elements carried on each of the NASA Mars Exploration Rovers (MER; Figure 1) is the Panoramic Camera System, or Pancam. Pancam consists of three major components: a pair of digital CCD cameras, the Pancam Mast Assembly (PMA), and a radiometric calibration target. The PMA provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. The calibration target provides a set of reference color and grayscale standards for calibration validation, and a shadow post for quantification of the direct vs. diffuse illumination of the scene. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360 of azimuth and from zenith to nadir, providing a complete view of the scene around the rover in up to 12 unique wavelengths. The major characteristics of Pancam are summarized.
Time-of-Flight Microwave Camera.
Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh
2015-10-05
Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
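The quoted figures are mutually consistent: 200 ps of time resolution corresponds to about 6 cm of free-space path, and the 4 GHz sweep bandwidth sets the FMCW range resolution c/(2B). A quick check:

```python
C = 299_792_458.0            # speed of light, m/s
bandwidth_hz = 4e9           # X-band sweep: 8-12 GHz
time_res_s = 200e-12         # quoted time resolution

print(f"range resolution c/(2B): {C / (2 * bandwidth_hz) * 100:.2f} cm")
print(f"free-space path in {time_res_s * 1e12:.0f} ps: "
      f"{C * time_res_s * 100:.1f} cm")   # ~6 cm, matching the abstract
```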
Radiometric sensitivity comparisons of multispectral imaging systems
NASA Technical Reports Server (NTRS)
Lu, Nadine C.; Slater, Philip N.
1989-01-01
Multispectral imaging systems provide much of the basic data used by the land and ocean civilian remote-sensing community. Numerous multispectral imaging systems have been and are being developed. A common way to compare the radiometric performance of these systems is to examine their noise-equivalent change in reflectance, NEΔρ. The NEΔρ of a system is the reflectance difference that is equal to the noise in the recorded signal. A comparison is made of the NEΔρ of seven different multispectral imaging systems (AVHRR, AVIRIS, ETM, HIRIS, MODIS-N, SPOT-1 HRV, and TM) for a set of three atmospheric conditions (continental aerosol with 23-km visibility, continental aerosol with 5-km visibility, and a Rayleigh atmosphere), five values of ground reflectance (0.01, 0.10, 0.25, 0.50, and 1.00), a nadir viewing angle, and a solar zenith angle of 45 deg.
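Ignoring the atmosphere, apparent reflectance scales radiance by pi * d^2 / (E_sun * cos(theta_s)), so NEΔρ is simply the noise-equivalent radiance scaled by the same factor. The sketch below uses this simplification with illustrative numbers; the paper's actual comparison includes atmospheric radiative transfer.

```python
import numpy as np

def ne_delta_rho(ne_delta_L, esun, sun_zenith_deg, d_au=1.0):
    """Noise-equivalent reflectance change from noise-equivalent radiance,
    atmosphere ignored: rho = pi * L * d^2 / (Esun * cos(theta_s)).

    ne_delta_L: noise-equivalent radiance, W m^-2 sr^-1 um^-1
    esun:       exo-atmospheric solar irradiance in the band, W m^-2 um^-1
    d_au:       Earth-Sun distance in astronomical units
    """
    theta = np.radians(sun_zenith_deg)
    return np.pi * ne_delta_L * d_au**2 / (esun * np.cos(theta))

# Illustrative numbers only (not from the paper): Esun = 1550 W m^-2 um^-1,
# NE-delta-L = 0.5 W m^-2 sr^-1 um^-1, 45 deg solar zenith angle.
print(f"NE-delta-rho ~ {ne_delta_rho(0.5, 1550.0, 45.0):.4f}")
```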
Application of multispectral color photography to flame flow visualization
NASA Technical Reports Server (NTRS)
Stoffers, G.
1979-01-01
For flames of short duration and low radiation intensity, spectroscopic flame diagnostics is difficult. In order to find some other means of extracting information about the flame structure from its radiation, the feasibility of using multispectral color photography was successfully evaluated. Since the flame photographs are close-ups, there is considerable parallax between the single images when several cameras are used, and additive color viewing is not possible. Each image must be analyzed individually, so it is advisable to use color film in all cameras. One can either use color films of different spectral sensitivities or color films of the same type with different color filters. Sharp cutting filters are recommended.
2017-11-27
These two images illustrate just how far Cassini traveled to get to Saturn. On the left is one of the earliest images Cassini took of the ringed planet, captured during the long voyage from the inner solar system. On the right is one of Cassini's final images of Saturn, showing the site where the spacecraft would enter the atmosphere on the following day. In the left image, taken in 2001, about six months after the spacecraft passed Jupiter for a gravity assist flyby, the best view of Saturn using the spacecraft's high-resolution (narrow-angle) camera was on the order of what could be seen using the Earth-orbiting Hubble Space Telescope. At the end of the mission (at right), from close to Saturn, even the lower resolution (wide-angle) camera could capture just a tiny part of the planet. The left image looks toward Saturn from 20 degrees below the ring plane and was taken on July 13, 2001 in wavelengths of infrared light centered at 727 nanometers using the Cassini spacecraft narrow-angle camera. The view at right is centered on a point 6 degrees north of the equator and was taken in visible light using the wide-angle camera on Sept. 14, 2017. The view on the left was acquired at a distance of approximately 317 million miles (510 million kilometers) from Saturn. Image scale is about 1,900 miles (3,100 kilometers) per pixel. The view at right was acquired at a distance of approximately 360,000 miles (579,000 kilometers) from Saturn. Image scale is 22 miles (35 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21353
Johnson, J. R.; Grundy, W.M.; Lemmon, M.T.; Bell, J.F.; Johnson, M.J.; Deen, R.; Arvidson, R. E.; Farrand, W. H.; Guinness, E.; Hayes, A.G.; Herkenhoff, K. E.; Seelos, F.; Soderblom, J.; Squyres, S.
2006-01-01
The Panoramic Camera (Pancam) on the Mars Exploration Rover Opportunity acquired visible/near-infrared multispectral observations of soils and rocks under varying viewing and illumination geometries that were modeled using radiative transfer theory to improve interpretations of the microphysical and surface scattering nature of materials in Meridiani Planum. Nearly 25,000 individual measurements were collected of rock and soil units identified by their color and morphologic properties over a wide range of phase angles (0-150°) at Eagle crater, in the surrounding plains, in Endurance crater, and in the plains between Endurance and Erebus craters through Sol 492. Corrections for diffuse skylight incorporated sky models based on observations of atmospheric opacity throughout the mission. Disparity maps created from Pancam stereo images allowed inclusion of local facet orientation estimates. Outcrop rocks overall exhibited the highest single scattering albedos (~0.9 at 753 nm), and most spherule-rich soils exhibited the lowest (~0.6 at 753 nm). Macroscopic roughness among outcrop rocks varied but was typically larger than spherule-rich soils. Data sets with sufficient phase angle coverage (resulting in well-constrained Hapke parameters) suggested that models using single-term and two-term Henyey-Greenstein phase functions exhibit a dominantly broad backscattering trend for most undisturbed spherule-rich soils. Rover tracks and other compressed soils exhibited forward scattering, while outcrop rocks were intermediate in their scattering behaviors. Some phase functions exhibited wavelength-dependent trends that may result from variations in thin deposits of airfall dust that occurred during the mission. Copyright 2006 by the American Geophysical Union.
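The single-term Henyey-Greenstein function referred to above has one asymmetry parameter. Under one common planetary-photometry convention (sign conventions vary between authors), positive values give forward scattering and negative values backscattering:

```python
import numpy as np

def hg_phase(phase_angle_deg, xi):
    """Single-term Henyey-Greenstein phase function of phase angle g.

    With this sign convention, xi > 0 describes a forward-scattering
    particle and xi < 0 a backscattering one."""
    g = np.radians(phase_angle_deg)
    return (1 - xi**2) / (1 + 2 * xi * np.cos(g) + xi**2) ** 1.5

# A backscattering particle peaks near zero phase; a forward-scattering
# one (cf. the compressed soils in rover tracks) peaks at large phase.
for xi in (-0.3, 0.3):
    print(xi, [round(hg_phase(a, xi), 3) for a in (0, 60, 120, 150)])
```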
Mars Daily Global Image from April 1999
2000-09-08
Twelve orbits a day provide the NASA Mars Global Surveyor MOC wide angle cameras a global snapshot of weather patterns across the planet. Here, bluish-white water ice clouds hang above the Tharsis volcanoes.
[Reliability of retinal imaging screening in retinopathy of prematurity].
Navarro-Blanco, C; Peralta-Calvo, J; Pastora-Salvador, N; Alvarez-Rementería, L; Chamorro, E; Sánchez-Ramos, C
2014-09-01
Retinopathy of prematurity (ROP) is a potentially avoidable cause of blindness in children. Advances in neonatal care have made possible the survival of extremely premature infants, who show a greater incidence of the disease. The aim of the study is to evaluate the reliability of ROP screening using retinography imaging with the RetCam 3 wide-angle camera, and to study the variability of ROP diagnosis depending on the evaluator. The indirect ophthalmoscopy exam was performed by a pediatric ROP-expert ophthalmologist. The same ophthalmologist and a technician specialized in digital image capture took retinal images using the RetCam 3 wide-angle camera. A total of 30 image sets were analyzed by 3 masked groups: group A (8 ophthalmologists), group B (5 experts in vision), and group C (2 ROP-expert ophthalmologists). Relative to the diagnosis by indirect ophthalmoscopy, the sensitivity (26-93), Kappa (0.24-0.80), and the percent agreement were statistically significant in group C for the diagnosis of ROP Type 1. In the diagnosis of ROP Type 1 + Type 2, Kappa (0.17-0.33) and the percent agreement (58-90) were statistically significant, with higher values in group C. Diagnosis by ROP-expert ophthalmologists using the RetCam 3 wide-angle camera proved to be a reliable method.
Snowstorm Along the China-Mongolia-Russia Borders
NASA Technical Reports Server (NTRS)
2004-01-01
Heavy snowfall on March 12, 2004, across north China's Inner Mongolia Autonomous Region, Mongolia and Russia, caused train and highway traffic to stop for several days along the Russia-China border. This pair of images from the Multi-angle Imaging SpectroRadiometer (MISR) highlights the snow and surface properties across the region on March 13. The left-hand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The right-hand image is a multi-angle false-color view made from the red band data of the 46-degree aftward camera, the nadir camera, and the 46-degree forward camera. About midway between the frozen expanse of China's Hulun Nur Lake (along the right-hand edge of the images) and Russia's Torey Lakes (above image center) is a dark linear feature that corresponds with the China-Mongolia border. In the upper portion of the images, many small plumes of black smoke rise from coal and wood fires and blow toward the southeast over the frozen lakes and snow-covered grasslands. Along the upper left-hand portion of the images, in Russia's Yablonovyy mountain range and the Onon River Valley, the terrain becomes more hilly and forested. In the nadir image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the multi-angle composite, open-canopy forested areas are indicated by green hues. Since this is a multi-angle composite, the green color arises not from the color of the leaves but from the architecture of the surface cover. The green areas appear brighter at the nadir angle than at the oblique angles because more of the snow-covered surface in the gaps between the trees is visible. Color variations in the multi-angle composite also indicate angular reflectance properties for areas covered by snow and ice. The light blue color of the frozen lakes is due to the increased forward scattering of smooth ice, and light orange colors indicate rougher ice or snow, which scatters more light in the backward direction. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire Earth between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 22525. The panels cover an area of about 355 kilometers x 380 kilometers, and utilize data from blocks 50 to 52 within World Reference System-2 path 126. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
2014-03-01
U.S. Air Force, and others have demonstrated the utility of SUAS in natural disasters such as the Fukushima Daiichi meltdown to take photographs at...factor. Multispectral Imagery (MSI) has proven capable of dismount detection with several distinct wavelengths. This research proposes a spectral...Epipolar lines depicted in blue, show the geometric relationship between the two cameras after stereo rectification
Performance Characteristics For The Orbiter Camera Payload System's Large Format Camera (LFC)
NASA Astrophysics Data System (ADS)
Mollberg, Bernard H.
1981-11-01
The Orbiter Camera Payload System, the OCPS, is an integrated photographic system which is carried into Earth orbit as a payload in the Shuttle Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC) which is a precision wide-angle cartographic instrument that is capable of producing high resolution stereophotography of great geometric fidelity in multiple base to height ratios. The primary design objective for the LFC was to maximize all system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment.
Hyperspectral imaging for food processing automation
NASA Astrophysics Data System (ADS)
Park, Bosoon; Lawrence, Kurt C.; Windham, William R.; Smith, Doug P.; Feldner, Peggy W.
2002-11-01
This paper presents research results demonstrating that hyperspectral imaging can be used effectively for detecting feces (from duodenum, ceca, and colon) and ingesta on the surface of poultry carcasses, and its potential application to real-time, on-line processing of poultry for automatic safety inspection. The hyperspectral imaging system included a line scan camera with a prism-grating-prism spectrograph, fiber optic line lighting, motorized lens control, and hyperspectral image processing software. Hyperspectral image processing algorithms, specifically the band ratio of dual-wavelength (565/517) images and thresholding, were effective for identifying fecal and ingesta contamination of poultry carcasses. A multispectral imaging system including a common aperture camera with three optical trim filters (515.4 nm with 8.6-nm FWHM, 566.4 nm with 8.8-nm FWHM, and 631 nm with 10.2-nm FWHM), which were selected and validated by the hyperspectral imaging system, was developed for real-time, on-line application. The total image processing time for the multispectral images captured by the common aperture camera was approximately 251 ms, or 3.99 frames/s. A preliminary test shows that the accuracy of the real-time multispectral imaging system in detecting feces and ingesta on corn/soybean-fed poultry carcasses was 96%. However, many false positive spots, which cause system errors, were also detected.
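The dual-wavelength band-ratio step reduces to an element-wise division and threshold. A minimal sketch; the threshold value here is illustrative, not the paper's calibrated one:

```python
import numpy as np

def contamination_mask(band_565, band_517, ratio_threshold=1.05):
    """Flag pixels whose 565/517 nm band ratio exceeds a threshold
    (the threshold value is illustrative, not from the paper)."""
    b565 = band_565.astype(np.float64)
    b517 = np.clip(band_517.astype(np.float64), 1e-6, None)
    return (b565 / b517) > ratio_threshold

# Example on synthetic band images.
rng = np.random.default_rng(5)
b517 = rng.uniform(0.2, 0.4, size=(64, 64))
b565 = b517 * rng.uniform(0.9, 1.3, size=(64, 64))
print("flagged pixels:", contamination_mask(b565, b517).sum())
```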
Fernández-Guisuraga, José Manuel; Sanz-Ablanedo, Enoc; Suárez-Seoane, Susana; Calvo, Leonor
2018-02-14
This study evaluated the opportunities and challenges of using drones to obtain multispectral orthomosaics at ultra-high resolution that could be useful for monitoring large and heterogeneous burned areas. We conducted a survey using an octocopter equipped with a Parrot SEQUOIA multispectral camera in a 3000 ha framework located within the perimeter of a megafire in Spain. We assessed the quality of both the camera raw imagery and the multispectral orthomosaic obtained, as well as the required processing capability. Additionally, we compared the spatial information provided by the drone orthomosaic at ultra-high spatial resolution with another image provided by the WorldView-2 satellite at high spatial resolution. The drone raw imagery presented some anomalies, such as horizontal banding noise and non-homogeneous radiometry. Camera locations showed a lack of synchrony of the single frequency GPS receiver. The georeferencing process based on ground control points achieved an error lower than 30 cm in X-Y and lower than 55 cm in Z. The drone orthomosaic provided more information in terms of spatial variability in heterogeneous burned areas in comparison with the WorldView-2 satellite imagery. The drone orthomosaic could constitute a viable alternative for the evaluation of post-fire vegetation regeneration in large and heterogeneous burned areas.
Medium-sized aperture camera for Earth observation
NASA Astrophysics Data System (ADS)
Kim, Eugene D.; Choi, Young-Wan; Kang, Myung-Seok; Kim, Ee-Eul; Yang, Ho-Soon; Rasheed, Ad. Aziz Ad.; Arshad, Ahmad Sabirin
2017-11-01
Satrec Initiative and ATSB have been developing a medium-sized aperture camera (MAC) as an earth observation payload for a small satellite. The push-broom type high-resolution camera has one panchromatic and four multispectral channels, with ground sampling distances of 2.5 m and 5 m, respectively, at a nominal altitude of 685 km. The 300 mm aperture Cassegrain telescope contains two aspheric mirrors and two spherical correction lenses. With a philosophy of building a simple and cost-effective camera, the mirrors incorporate no light-weighting, and the linear CCDs are mounted on a single PCB with no beam splitters. MAC is the main payload of RazakSAT, to be launched in 2005. RazakSAT is a 180 kg satellite including MAC, designed to provide high-resolution imagery of 20 km swath width from a near equatorial orbit (NEqO). The mission objective is to demonstrate the capability of a high-resolution remote sensing satellite system on a near equatorial orbit. This paper gives an overview of the MAC and RazakSAT programmes and presents the current development status of MAC, focusing on key optical aspects of the Qualification Model.
NASA Astrophysics Data System (ADS)
Gicquel, Adeline; Vincent, Jean-Baptiste; Sierks, Holger; Rose, Martin; Agarwal, Jessica; Deller, Jakob; Guettler, Carsten; Hoefner, Sebastian; Hofmann, Marc; Hu, Xuanyu; Kovacs, Gabor; Oklay Vincent, Nilda; Shi, Xian; Tubiana, Cecilia; Barbieri, Cesare; Lamy, Phylippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team
2016-10-01
Images of the nucleus and the coma (gas and dust) of comet 67P/Churyumov-Gerasimenko have been acquired by the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) camera system since March 2014 using both the wide angle camera (WAC) and the narrow angle camera (NAC). We are using the NAC camera to study the bright outburst observed on July 29th, 2015 in the southern hemisphere. The NAC covers wavelengths between 250 and 1000 nm with a combination of 12 filters. The high spatial resolution is needed to localize the source point of the outburst on the surface of the nucleus. At the time of the observations, the heliocentric distance was 1.25 AU and the distance between the spacecraft and the comet was 126 km. We aim to understand the physics leading to such outgassing: is the jet associated with the outburst controlled by the micro-topography, or by suddenly exposed ice? We are using the Direct Simulation Monte Carlo (DSMC) method to study the gas flow close to the nucleus. The goal of the DSMC code is to reproduce the opening angle of the jet and constrain the outgassing ratio between the outburst source and the local region. The results of this model will be compared to the images obtained with the NAC camera.
An earth imaging camera simulation using wide-scale construction of reflectance surfaces
NASA Astrophysics Data System (ADS)
Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk
2013-10-01
Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
Multi-spectral endogenous fluorescence imaging for bacterial differentiation
NASA Astrophysics Data System (ADS)
Chernomyrdin, Nikita V.; Babayants, Margarita V.; Korotkov, Oleg V.; Kudrin, Konstantin G.; Rimskaya, Elena N.; Shikunova, Irina A.; Kurlov, Vladimir N.; Cherkasova, Olga P.; Komandin, Gennady A.; Reshetov, Igor V.; Zaytsev, Kirill I.
2017-07-01
In this paper, multi-spectral endogenous fluorescence imaging was implemented for bacterial differentiation. The fluorescence imaging was performed using a digital camera equipped with a set of visible bandpass filters. Narrowband 365 nm ultraviolet radiation, passed through a beam homogenizer, was used to excite the sample fluorescence. In order to increase the signal-to-noise ratio and suppress the non-fluorescence background in the images, the intensity of the UV excitation was modulated using a mechanical chopper. Principal-component analysis was applied to differentiate the bacterial samples based on the multi-spectral endogenous fluorescence images.
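As a rough illustration of the principal-component step (a minimal sketch, not code from the paper; array shapes, band count, and function names are assumptions):

```python
# Minimal sketch: per-pixel principal-component score maps from a stack of
# bandpass-filtered fluorescence images. Shapes and band count are assumed.
import numpy as np
from sklearn.decomposition import PCA

def pca_score_maps(image_stack, n_components=2):
    """image_stack: float array of shape (bands, height, width)."""
    bands, h, w = image_stack.shape
    spectra = image_stack.reshape(bands, -1).T      # (pixels, bands)
    scores = PCA(n_components=n_components).fit_transform(spectra)
    return scores.T.reshape(n_components, h, w)     # one score map per PC
```

Bacterial samples that differ in endogenous fluorophores would then separate as clusters in the PC1-PC2 plane.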
NASA Astrophysics Data System (ADS)
Torkildsen, H. E.; Hovland, H.; Opsahl, T.; Haavardsholm, T. V.; Nicolas, S.; Skauli, T.
2014-06-01
In some applications of multi- or hyperspectral imaging, it is important to have a compact sensor. The most compact spectral imaging sensors are based on spectral filtering in the focal plane. For hyperspectral imaging, it has been proposed to use a "linearly variable" bandpass filter in the focal plane, combined with scanning of the field of view. As the image of a given object in the scene moves across the field of view, it is observed through parts of the filter with varying center wavelength, and a complete spectrum can be assembled. However, if the radiance received from the object varies with viewing angle or with time, the reconstructed spectrum will be distorted. We describe a camera design where this hyperspectral functionality is traded for multispectral imaging with better spectral integrity. Spectral distortion is minimized by using a patterned filter with 6 bands arranged close together, so that a scene object is seen by each spectral band in rapid succession and with minimal change in viewing angle. The set of 6 bands is repeated 4 times so that the spectral data can be checked for internal consistency. Still, the total extent of the filter in the scan direction is small. Therefore the remainder of the image sensor can be used for conventional imaging with potential for using motion tracking and 3D reconstruction to support the spectral imaging function. We show detailed characterization of the point spread function of the camera, demonstrating the importance of such characterization as a basis for image reconstruction. A simplified image reconstruction based on feature-based image coregistration is shown to yield reasonable results. Elimination of spectral artifacts due to scene motion is demonstrated.
Retinal oxygen saturation evaluation by multi-spectral fundus imaging
NASA Astrophysics Data System (ADS)
Khoobehi, Bahram; Ning, Jinfeng; Puissegur, Elise; Bordeaux, Kimberly; Balasubramanian, Madhusudhanan; Beach, James
2007-03-01
Purpose: To develop a multi-spectral method to measure oxygen saturation of the retina in the human eye. Methods: Five Cynomolgus monkeys with normal eyes were anesthetized with intramuscular ketamine/xylazine and intravenous pentobarbital. Multi-spectral fundus imaging was performed in the five monkeys with a commercial fundus camera equipped with a liquid crystal tunable filter in the illumination light path and a 16-bit digital camera. Recording parameters were controlled with software written specifically for the application. Seven images at successively longer oxygen-sensing wavelengths were recorded within 4 seconds. Individual images for each wavelength were captured in less than 100 msec of flash illumination. Slightly misaligned images of separate wavelengths due to slight eye motion were corrected by translational and rotational image registration prior to analysis. Numerical values of relative oxygen saturation of retinal arteries and veins and the underlying tissue between the artery/vein pairs were evaluated by an algorithm previously described, now corrected for blood volume from averaged pixels (n > 1000). Color saturation maps were constructed by applying the algorithm at each image pixel using a Matlab script. Results: Both the numerical values of relative oxygen saturation and the saturation maps correspond to the physiological condition; that is, in a normal retina, the artery is more saturated than the tissue and the tissue is more saturated than the vein. With the multi-spectral fundus camera and proper registration of the multi-wavelength images, we were able to determine oxygen saturation in the primate retinal structures on a time scale tolerable for human subjects. Conclusions: Seven-wavelength multi-spectral imagery can be used to measure oxygen saturation in retinal artery, vein, and tissue (microcirculation). This technique is safe and can be used to monitor oxygen uptake in humans.
NASA Technical Reports Server (NTRS)
1999-01-01
This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.
Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona. Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. -- A remote wide-angle camera captures liftoff of the Delta II rocket carrying the Gravity Probe B spacecraft from Space Launch Complex 2 on Vandenberg AFB, Calif., at 9:57:24 a.m. PDT.
MOVING BEYOND COLOR: THE CASE FOR MULTISPECTRAL IMAGING IN BRIGHTFIELD PATHOLOGY.
Cukierski, William J; Qi, Xin; Foran, David J
2009-01-01
A multispectral camera is capable of imaging a histologic slide at narrow bandwidths over the range of the visible spectrum. While several uses for multispectral imaging (MSI) have been demonstrated in pathology [1, 2], there is no unified consensus over when and how MSI might benefit automated analysis [3, 4]. In this work, we use a linear-algebra framework to investigate the relationship between the spectral image and its standard-image counterpart. The multispectral "cube" is treated as an extension of a traditional image in a high-dimensional color space. The concept of metamers is introduced and used to derive regions of the visible spectrum where MSI may provide an advantage. Furthermore, histological stains which are amenable to analysis by MSI are reported. We show the Commission internationale de l'éclairage (CIE) 1931 transformation from spectrum to color is non-neighborhood preserving. Empirical results are demonstrated on multispectral images of peripheral blood smears.
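To make the metamer argument concrete, here is a hedged sketch (ours, not the paper's): the spectrum-to-color conversion is a 3 × n linear map, so any null-space component can be added to a spectrum without changing its color. The matrix below is a random placeholder for the actual CIE 1931 color-matching functions.

```python
# Sketch: metamers as the null space of a linear spectrum-to-color map.
# A is a random stand-in for the 3 x n matrix of CIE color-matching functions.
import numpy as np

rng = np.random.default_rng(0)
n_bands = 31                        # e.g., 400-700 nm sampled every 10 nm
A = rng.random((3, n_bands))        # placeholder color-matching matrix

s1 = rng.random(n_bands)            # an arbitrary spectrum
_, _, Vt = np.linalg.svd(A)         # rows 3..n-1 of Vt span the null space
s2 = s1 + 0.1 * Vt[-1]              # a different spectrum, same color

print(np.allclose(A @ s1, A @ s2))  # True: s1 and s2 are metamers under A
```

The 28-dimensional null space in this toy setup is exactly the information a three-channel color camera discards and a multispectral camera can retain.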
Skylab investigation of the upwelling off the Northwest coast of Africa
NASA Technical Reports Server (NTRS)
Szekielda, K. H.; Suszkowski, D. J.; Tabor, P. S.
1975-01-01
The upwelling off the NW coast of Africa in the vicinity of Cape Blanc was studied in February - March 1974 from aircraft and in September 1973 from Skylab. The aircraft study was designed to determine the effectiveness of a differential radiometer in quantifying surface chlorophyll concentrations. Photographic images of the S190A Multispectral Camera and the S190B Earth Terrain Camera from Skylab were used to study distributional patterns of suspended material and to locate ocean color boundaries. The thermal channel of the S192 Multispectral Scanner was used to map sea-surface temperature distributions offshore of Cape Blanc. Correlating ocean color changes with temperature gradients is an effective method of qualitatively estimating biological productivity in the upwelling region off Africa.
Evaluation of the Quality of Action Cameras with Wide-Angle Lenses in Uav Photogrammetry
NASA Astrophysics Data System (ADS)
Hastedt, H.; Ekkel, T.; Luhmann, T.
2016-06-01
The application of light-weight cameras in UAV photogrammetry is required due to restrictions in payload. In general, consumer cameras with a normal lens type are applied to a UAV system. The availability of action cameras, like the GoPro Hero4 Black, including a wide-angle (fish-eye) lens offers new perspectives in UAV projects. In these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. The GoPro Hero4 is evaluated using different acquisition modes. We investigate the extent to which the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to the evaluation processes in UAV photogrammetry. Therefore, different calibration setups and processing procedures are assessed and discussed. Additionally, a pre-correction of the initial distortion by GoPro Studio and its application for photogrammetric purposes is evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan. We analyse the extent to which a pre-calibration and pre-correction of a GoPro Hero4 reinforce the reliability and accuracy of a flight scenario.
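As a hedged sketch of the OpenCV route evaluated in the paper (chessboard size, file paths, and flags are our assumptions, not the authors' setup):

```python
# Sketch: fish-eye calibration of an action camera with OpenCV's fisheye model.
import cv2
import glob
import numpy as np

pattern = (9, 6)                                   # inner chessboard corners
objp = np.zeros((1, pattern[0] * pattern[1], 3), np.float64)
objp[0, :, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib/*.jpg"):              # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners.reshape(1, -1, 2).astype(np.float64))

K, D = np.zeros((3, 3)), np.zeros((4, 1))          # intrinsics, distortion
rms, K, D, _, _ = cv2.fisheye.calibrate(
    obj_points, img_points, gray.shape[::-1], K, D,
    flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW)
print("RMS reprojection error [px]:", rms)
```

The RMS reprojection error is the kind of accuracy figure such calibration setups are compared on.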
NASA Astrophysics Data System (ADS)
Ehrhart, Matthias; Lienhart, Werner
2017-09-01
The importance of automated prism tracking is growing with the rising automation of total station measurements in machine control, monitoring and one-person operation. In this article we summarize and explain the different techniques that are used to coarsely search for a prism, to precisely aim at a prism, and to identify whether the correct prism is tracked. Along with the state-of-the-art review, we discuss and experimentally evaluate possible improvements based on the image data of an additional wide-angle camera which is available for many total stations today. In cases in which the total station's fine aiming module loses the prism, the tracked object may still be visible to the wide-angle camera because of its larger field of view. The theodolite angles towards the target can then be derived from its image coordinates, which facilitates a fast reacquisition of the prism. In experimental measurements we demonstrate that our image-based approach for the coarse target search is 4 to 10 times faster than conventional approaches.
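A minimal sketch of the re-aiming geometry (ours, not the article's implementation; the focal length and pixel pitch below are illustrative, not instrument specifications):

```python
# Sketch: converting a prism's pixel position in the wide-angle camera into
# angular offsets for re-aiming the total station (simple pinhole model).
import math

def pixel_to_angles(u, v, cx, cy, focal_mm=4.0, pitch_um=3.75):
    """Return (horizontal, vertical) offsets in degrees from the optical axis."""
    dx_mm = (u - cx) * pitch_um * 1e-3   # offset on the sensor, mm
    dy_mm = (v - cy) * pitch_um * 1e-3
    return (math.degrees(math.atan2(dx_mm, focal_mm)),
            math.degrees(math.atan2(dy_mm, focal_mm)))

# e.g., a prism detected 400 px right of the principal point:
print(pixel_to_angles(1360, 540, cx=960.0, cy=540.0))
```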
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Arneson, H. M.; Farrand, W. H.; Goetz, W.; Hayes, A. G.; Herkenhoff, K.; Johnson, M. J.; Johnson, J. R.; Joseph, J.; Kinch, K.
2005-01-01
Introduction. The panoramic camera (Pancam) multispectral, stereoscopic imaging systems on the Mars Exploration Rovers Spirit and Opportunity [1] have acquired and downlinked more than 45,000 images (35 Gbits of data) over more than 700 combined sols of operation on Mars as of early January 2005. A large subset of these images were acquired as part of 26 large multispectral and/or broadband "albedo" panoramas (15 on Spirit, 11 on Opportunity) covering large ranges of azimuth (12 spanning 360°) and designed to characterize major regional color and albedo characteristics of the landing sites and various points along both rover traverses.
Unmanned spacecraft for surveying earth's resources
NASA Technical Reports Server (NTRS)
George, T. A.
1970-01-01
The technical objectives and payloads for ERTS A and B are discussed. The primary emphasis is on coverage of the United States and the ocean areas immediately adjacent, using 3-camera return beam vidicon TV system, 4-channel multispectral point scanner, data collection system, and wideband video tape recorder. The expected performance and system characteristics of the RBV system and the 4-band multispectral object plane point scanner are outlined. Ground station considerations are also given.
Time-of-Flight Microwave Camera
Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh
2015-01-01
Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598
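As a quick consistency check of the stated figures (our arithmetic, not the paper's):

```latex
c\,\Delta t = (3\times10^{8}\ \mathrm{m/s})\,(200\times10^{-12}\ \mathrm{s})
            = 6\ \mathrm{cm},
\qquad
\Delta t \sim \frac{1}{B} = \frac{1}{4\ \mathrm{GHz}} = 250\ \mathrm{ps}
```

so the 200 ps time resolution matches the quoted 6 cm free-space path and is of the same order as the limit set by the 4 GHz (8 GHz-12 GHz) sweep bandwidth.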
Design and development of an airborne multispectral imaging system
NASA Astrophysics Data System (ADS)
Kulkarni, Rahul R.; Bachnak, Rafic; Lyle, Stacey; Steidley, Carl W.
2002-08-01
Advances in imaging technology and sensors have made airborne remote sensing systems viable for many applications that require reasonably good resolution at low cost. Digital cameras are making their mark on the market by providing high resolution at very high rates. This paper describes an aircraft-mounted imaging system (AMIS) that is being designed and developed at Texas A&M University-Corpus Christi (A&M-CC) with the support of a grant from NASA. The approach is to first develop and test a one-camera system that will be upgraded into a five-camera system offering multi-spectral capabilities. AMIS is low cost, rugged, and portable, and has its own battery power source. Its immediate use will be to acquire images of the coastal area in the Gulf of Mexico for a variety of studies covering a broad spectral range from the near-ultraviolet to the near-infrared. This paper describes AMIS and its characteristics, discusses the process for selecting the major components, and presents the progress.
LROC WAC Ultraviolet Reflectance of the Moon
NASA Astrophysics Data System (ADS)
Robinson, M. S.; Denevi, B. W.; Sato, H.; Hapke, B. W.; Hawke, B. R.
2011-10-01
Earth-based color filter photography, first acquired in the 1960s, showed color differences related to morphologic boundaries on the Moon [1]. These color units were interpreted to indicate compositional differences, thought to be the result of variations in titanium content [1]. Later it was shown that iron abundance (FeO) also plays a dominant role in controlling color in lunar soils [2]. Equally important is the maturity of a lunar soil in terms of its reflectance properties (albedo and color) [3]. Maturity is a measure of the state of alteration of surface materials due to sputtering and high velocity micrometeorite impacts over time [3]. The Clementine (CL) spacecraft provided the first global and digital visible through infrared observations of the Moon [4]. This pioneering dataset allowed significant advances in our understanding of compositional (FeO and TiO2) and maturation differences across the Moon [5,6]. Later, the Lunar Prospector (LP) gamma ray and neutron experiments provided the first global, albeit low resolution, elemental maps [7]. Newly acquired Moon Mineralogy Mapper hyperspectral measurements are now providing the means to better characterize mineralogic variations on a global scale [8]. Our knowledge of ultraviolet color differences between geologic units is limited to low resolution (km scale) nearside telescopic observations, high resolution Hubble Space Telescope images of three small areas [9], and laboratory analyses of lunar materials [10,11]. These previous studies detailed color differences in the UV (100 to 400 nm) related to composition and physical state. HST UV (250 nm) and visible (502 nm) color differences were found to correlate with TiO2, and were relatively insensitive to maturity effects seen in visible ratios (CL) [9]. These two results led to the conclusion that improvements in TiO2 estimation accuracy over existing methods may be possible through a simple UV/visible ratio [9]. The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) provides the first global lunar ultraviolet through visible (321 nm to 689 nm) multispectral observations [12]. The WAC is a seven-color push-frame imager with nominal resolutions of 400 m (321, 360 nm) and 100 m (415, 566, 604, 643, 689 nm). Due to its wide field-of-view (60° in color mode), the phase angle within a single line varies by ±30°, thus requiring the derivation of a precise photometric characterization [13] before any interpretations of lunar reflectance properties can be made. The current WAC photometric correction relies on multiple WAC observations of the same area over a broad range of phase angles and typically results in relative corrections good to a few percent [13].
2018-03-05
In this image, NASA's Cassini sees Saturn and its rings through a haze of Sun glare on the camera lens. If you could travel to Saturn in person and look out the window of your spacecraft when the Sun was at a certain angle, you might see a view very similar to this one. Images taken using red, green and blue spectral filters were combined to show the scene in natural color. The images were taken with Cassini's wide-angle camera on June 23, 2013, at a distance of approximately 491,200 miles (790,500 kilometers) from Saturn. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA17185
Surface compositional variation on the comet 67P/Churyumov-Gerasimenko by OSIRIS data
NASA Astrophysics Data System (ADS)
Barucci, M. A.; Fornasier, S.; Feller, C.; Perna, D.; Hasselmann, H.; Deshapriya, J. D. P.; Fulchignoni, M.; Besse, S.; Sierks, H.; Forgia, F.; Lazzarin, M.; Pommerol, A.; Oklay, N.; Lara, L.; Scholten, F.; Preusker, F.; Leyrat, C.; Pajola, M.; Osiris-Rosetta Team
2015-10-01
Since the Rosetta mission arrived at comet 67P/Churyumov-Gerasimenko (67P/C-G) in July 2014, the comet nucleus has been mapped by both the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System, [1]) NAC (Narrow Angle Camera) and WAC (Wide Angle Camera), acquiring a huge quantity of images of the surface at different wavelength bands, under variable illumination conditions and spatial resolutions, and producing the most detailed, highest-spatial-resolution maps of a comet nucleus surface. 67P/C-G's nucleus shows an irregular bi-lobed shape of complex morphology, with terrains showing intricate features [2, 3] and a surface that is heterogeneous at different scales.
Wetland Vegetation Integrity Assessment with Low Altitude Multispectral Uav Imagery
NASA Astrophysics Data System (ADS)
Boon, M. A.; Tesfamichael, S.
2017-08-01
Multispectral sensors suitable for Unmanned Aerial Vehicles (UAVs) were until recently too heavy and bulky, but this has changed and they are now commercially available. The usage of these sensors is mostly directed towards the agricultural sector, with a focus on precision farming. Applications of these sensors for mapping of wetland ecosystems are rare. Here, we evaluate the performance of low altitude multispectral UAV imagery to determine the state of wetland vegetation in a localised spatial area. Specifically, NDVI derived from multispectral UAV imagery was used to inform the determination of the integrity of the wetland vegetation. Furthermore, we tested different software applications for the processing of the imagery. The advantages and disadvantages we experienced with these applications are also briefly presented in this paper. A JAG-M fixed-wing imaging system equipped with a MicaSense RedEdge multispectral camera was utilised for the survey. A single surveying campaign was undertaken in early autumn over a 17 ha study area at the Kameelzynkraal farm, Gauteng Province, South Africa. Structure-from-motion photogrammetry software was used to reconstruct the camera positions and terrain features to derive a high resolution orthorectified mosaic. The MicaSense Atlas cloud-based data platform, Pix4D and PhotoScan were utilised for the processing. The WET-Health level one methodology was followed for the vegetation assessment, where wetland health is a measure of the deviation of a wetland's structure and function from its natural reference condition. An on-site evaluation of the vegetation integrity was first completed. Disturbance classes were then mapped using the high resolution multispectral orthoimages and NDVI. The WET-Health vegetation module completed with the aid of the multispectral UAV products indicated that the vegetation of the wetland is largely modified ("D" PES Category) and that the condition is expected to deteriorate (change score) in the future. However, a lower impact score was determined utilising the multispectral UAV imagery and NDVI. The result is a more accurate estimation of the impacts in the wetland.
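For reference, the NDVI used to map the disturbance classes is the standard red/near-infrared ratio (a minimal sketch; band extraction from the RedEdge rasters is assumed):

```python
# Sketch: NDVI from co-registered red and near-infrared orthomosaic bands.
import numpy as np

def ndvi(red, nir, eps=1e-9):
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + eps)   # in [-1, 1]; higher = more vigor
```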
Global Albedo Variations on Mars from Recent MRO/MARCI and Other Space-Based Observations
NASA Astrophysics Data System (ADS)
Bell, J. F., III; Wellington, D. F.
2017-12-01
Dramatic changes in Mars surface albedo have been quantified by telescopic, orbital, and surface-based observations over the last 40 years. These changes provide important inputs for global and mesoscale climate models, enabling characterization of seasonal and secular variations in the distribution of mobile surface materials (dust, sand) in the planet's current climate regime. Much of the modern record of dust storms and albedo changes comes from synoptic-scale global imaging from the Viking Orbiter, Mars Global Surveyor (MGS), Hubble Space Telescope (HST), and Mars Reconnaissance Orbiter (MRO) missions, as well as local-scale observations from long-lived surface platforms like the Spirit and Opportunity rovers. Here we focus on the substantial time history of global-scale images acquired from the MRO Mars Color Imager (MARCI). MARCI is a wide-angle multispectral imager that acquires daily coverage of most of the surface at up to 1 km/pixel. MARCI has been in orbit since 2006, providing six Mars years of continuous surface and atmospheric observations, and building on the nearly five previous Mars years of global-scale imaging from the MGS Mars Orbiter Camera Wide Angle (MOC/WA) imager, which operated from 1997 to 2006. While many of the most significant MARCI-observed changes in the surface albedo are the result of large dust storms, other regions experience seasonal darkening events that repeat with different degrees of annual regularity. Some of these are associated with local dust storms, while for others, frequent surface changes take place with no associated evidence for dust storms, suggesting action by seasonally-variable winds and/or small-scale storms/dust devils too small to resolve. Discrete areas of dramatic surface changes across widely separated regions of Tharsis and in portions of Solis Lacus and Syrtis Major are among the regions where surface changes have been observed without a direct association to specific detectable dust storm events. Deposition following the annual southern summer dusty season plays a significant role in maintaining the cyclic nature of these changes. These and other historical observations also show that major regional or global-scale dust storms produce unique changes that may require several Mars years to reverse.
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator); Vanderbilt, V. C.; Robinson, B. F.; Biehl, L. L.; Vanderbilt, A. S.
1981-01-01
The reflectance response with view angle of wheat was analyzed. The analysis, which assumes there are no atmospheric effects and otherwise simulates the response of a multispectral scanner, is based upon spectra taken continuously in wavelength from 0.45 to 2.4 micrometers at more than 1200 view/illumination directions using an Exotech model 20C spectroradiometer. Data were acquired six meters above four wheat canopies, each at a different growth stage. The analysis shows that the canopy reflective response is a pronounced function of illumination angle, scanner view angle and wavelength. The variation is greater at low solar elevations compared to high solar elevations.
NASA Astrophysics Data System (ADS)
Wang, Sheng; Bandini, Filippo; Jakobsen, Jakob; Zarco-Tejada, Pablo J.; Köppl, Christian Josef; Haugård Olesen, Daniel; Ibrom, Andreas; Bauer-Gottwein, Peter; Garcia, Monica
2017-04-01
Unmanned Aerial Systems (UAS) can collect optical and thermal hyperspatial (<1 m) imagery at low cost and with flexible revisit times, regardless of cloud cover. The reflectance and radiometric temperature signatures of the land surface, closely linked with vegetation structure and functioning, are already part of models to predict Evapotranspiration (ET) and Gross Primary Productivity (GPP) from satellites. However, there remain challenges for operational monitoring using UAS compared to satellites: the payload capacity of most commercial UAS is less than 2 kg, miniaturized sensors have low signal-to-noise ratios, and their small fields of view require mosaicking hundreds of images with accurate orthorectification. In addition, wind gusts and lower platform stability require appropriate geometric and radiometric corrections. Finally, modeling fluxes on days without images is still an issue for both satellite and UAS applications. This study focuses on designing an operational UAS-based monitoring system, including payload design and sensor calibration, based on routine collection of optical and thermal images in a Danish willow field to jointly monitor ET and GPP dynamics continuously at daily time steps. The payload (<2 kg) consists of a multispectral camera (Tetracam Mini-MCA6), a thermal infrared camera (FLIR Tau 2), a digital camera (Sony RX-100) used to retrieve accurate digital elevation models (DEMs) for multispectral and thermal image orthorectification, and a standard single-frequency GNSS receiver (UBlox) or a real-time kinematic dual-frequency system (Novatel Inc. flexpack6+OEM628). Geometric calibration of the digital and multispectral cameras was conducted to recover intrinsic camera parameters. After geometric calibration, accurate DEMs with vertical errors of about 10 cm could be retrieved. Radiometric calibration of the multispectral camera was conducted with an integrating sphere (Labsphere CSTM-USS-2000C), and the laboratory calibration showed that the camera-measured radiance had a bias within ±4.8%. The thermal camera was calibrated using a black body at varying target and ambient temperatures, reaching a laboratory accuracy with an RMSE of 0.95 K. A joint model of ET and GPP was applied using two parsimonious, physiologically based models: a modified version of the Priestley-Taylor Jet Propulsion Laboratory model (Fisher et al., 2008; Garcia et al., 2013) and a Light Use Efficiency approach (Potter et al., 1993). Both models estimate ET and GPP under optimum potential conditions, down-regulated by the same biophysical constraints dependent on remote sensing and atmospheric data to reflect multiple stresses. Vegetation indices were calculated from the multispectral data to assess vegetation conditions, while thermal infrared imagery was used to compute a thermal inertia index to infer soil moisture constraints. To interpolate radiometric temperature between flights, a prognostic Surface Energy Balance model (Margulis et al., 2001) based on the force-restore method was applied in a data assimilation scheme to obtain continuous ET and GPP fluxes. With this operational system, regular flight campaigns with a hexacopter (DJI S900) have been conducted at a Danish willow flux site (Risø) over the 2016 growing season. The observed energy, water and carbon fluxes from the Risø eddy covariance flux tower were used to validate the model simulation. This UAS monitoring system is suitable for agricultural management and land-atmosphere interaction studies.
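For orientation, the Priestley-Taylor potential rate at the core of PT-JPL-type models takes the standard form below (supplied by us for the reader, not quoted from the abstract):

```latex
\lambda E_{p} \;=\; \alpha\,\frac{\Delta}{\Delta+\gamma}\,(R_{n}-G),
\qquad \alpha \approx 1.26
```

where $\Delta$ is the slope of the saturation vapour pressure curve, $\gamma$ the psychrometric constant, $R_n$ net radiation, and $G$ the soil heat flux; models in the PT-JPL family then down-regulate this potential rate with biophysical constraints derived from vegetation indices and thermal data such as those described above.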
Photogrammetry System and Method for Determining Relative Motion Between Two Bodies
NASA Technical Reports Server (NTRS)
Miller, Samuel A. (Inventor); Severance, Kurt (Inventor)
2014-01-01
A photogrammetry system and method provide for determining the relative position between two objects. The system utilizes one or more imaging devices, such as high speed cameras, that are mounted on a first body, and three or more photogrammetry targets of a known location on a second body. The system and method can be utilized with cameras having fish-eye, hyperbolic, omnidirectional, or other lenses. The system and method do not require overlapping fields-of-view if two or more cameras are utilized. The system and method derive relative orientation by equally weighting information from an arbitrary number of heterogeneous cameras, all with non-overlapping fields-of-view. Furthermore, the system can make the measurements with arbitrary wide-angle lenses on the cameras.
Analysis of the effect on optical equipment caused by solar position in target flight measure
NASA Astrophysics Data System (ADS)
Zhu, Shun-hua; Hu, Hai-bin
2012-11-01
Optical equipment is widely used to measure flight parameters in target flight performance tests, but the equipment is sensitive to the Sun's rays. To prevent direct sunlight from shining into the camera lens of the optical equipment during measurements, the angle between the observation direction and the line connecting the camera lens and the Sun should be kept large. This article presents a method for calculating the solar azimuth and altitude at the optical equipment for any time and place on Earth, a model of the equipment's observation direction, and a model of the angle between the observation direction and the lens-Sun line. A simulation of the effect of solar position on the optical equipment at different times, dates, months, and target flight directions is also given.
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei
2016-01-01
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454
NASA Astrophysics Data System (ADS)
Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.
2015-08-01
The Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow band filters. Due to its small size and light weight, it is suitable for mounting on an Unmanned Aerial System (UAS) to acquire imagery of high spectral, spatial and temporal resolution for various remote sensing applications. However, each band's spectral width is only 10 nm, which results in low image resolution and signal-to-noise ratio, unsuitable for image matching and digital surface model (DSM) generation. Moreover, since the spectral correlation among the 12 bands of MiniMCA images is low, it is difficult to perform tie-point matching and aerial triangulation at the same time. In this study, we thus propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher spatial resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS used, the two sensors can be flown together or individually. In this study, we adopt a fixed-wing UAS to carry a Canon EOS 5D Mark2 DSLR camera and a MiniMCA-12 multi-spectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose as master band one MiniMCA-12 band whose spectral range overlaps with the DSLR camera. However, since the lenses of the MiniMCA-12 have different perspective centers and viewing angles, the original 12 channels exhibit a significant band misregistration effect, and the first issue encountered is to reduce it. Since all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm and all images overlap by almost 98%, so we propose a modified projective transformation (MPT) method together with two systematic error correction procedures to register all 12 bands of imagery in the same image space. This means that the 12 bands of images acquired at the same exposure time have the same interior orientation parameters (IOPs) and exterior orientation parameters (EOPs) after band-to-band registration (BBR). Thus, in the aerial triangulation stage, the master band of the MiniMCA-12 is treated as a reference channel to link with the DSLR RGB images: all reference images from the master band and all RGB images are triangulated at the same time in the same coordinate system of ground control points (GCPs). Since the spatial resolution of the RGB images is higher than that of the MiniMCA-12, the GCPs can be marked on the RGB images alone, even if they cannot be recognized on the MiniMCA images. Furthermore, a one meter gridded digital surface model (DSM) is created from the RGB images and applied to the MiniMCA imagery for ortho-rectification. Quantitative error analyses show that the proposed BBR scheme achieves an average misregistration residual length of 0.33 pixels, and that the co-registration errors among the 12 MiniMCA ortho-images and between the MiniMCA and Canon RGB ortho-images are all less than 0.6 pixels. The experimental results demonstrate that the proposed method is robust, reliable and accurate for future remote sensing applications.
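A generic stand-in for the band-to-band registration step (ours; the paper's modified projective transformation with systematic-error corrections is more involved than a plain homography):

```python
# Sketch: register one MiniMCA band to the master band with a homography.
import cv2
import numpy as np

def register_band(band, master):
    """band, master: 8-bit grayscale images of the same scene."""
    orb = cv2.ORB_create(4000)
    k1, d1 = orb.detectAndCompute(band, None)
    k2, d2 = orb.detectAndCompute(master, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return cv2.warpPerspective(band, H, master.shape[::-1])
```

A constrained transformation like the paper's MPT is presumably more robust than generic feature matching on low signal-to-noise narrowband imagery.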
NASA Astrophysics Data System (ADS)
Howett, C. J. A.; Ennico, K.; Olkin, C. B.; Buie, M. W.; Verbiscer, A. J.; Zangari, A. M.; Parker, A. H.; Reuter, D. C.; Grundy, W. M.; Weaver, H. A.; Young, L. A.; Stern, S. A.
2017-05-01
Light curves produced from color observations taken during New Horizons' approach to the Pluto-system by its Multi-spectral Visible Imaging Camera (MVIC, part of the Ralph instrument) are analyzed. Fifty-seven observations were analyzed; they were obtained between 9th April and 3rd July 2015, at a phase angle of 14.5° to 15.1°, a sub-observer latitude of 51.2°N to 51.5°N, and a sub-solar latitude of 41.2°N. MVIC has four color channels; all are discussed for completeness but only two were found to produce reliable light curves: Blue (400-550 nm) and Red (540-700 nm). The other two channels, Near Infrared (780-975 nm) and Methane-Band (860-910 nm), were found to be potentially erroneous and too noisy, respectively. The Blue and Red light curves show that Charon's surface is neutral in color, but slightly brighter on its Pluto-facing hemisphere. This is consistent with previous studies made with the Johnson B and V bands, which are at shorter wavelengths than those of the MVIC Blue and Red channels, respectively.
The Uses of a Polarimetric Camera
2008-09-01
For the images displayed in this thesis the author used two different lenses. One of the lenses is an ARSAT H 20mm with an F-number of 2.8; this lens was used for all the wide-angle images collected. For the telephoto images collected, the author used a NIKKOR 200mm lens which has an F-number of 4.0. (Remaining fragments are table-of-contents entries: Degree of Linear Polarization (DOLP); Phase Angle of Polarization.)
NASA Technical Reports Server (NTRS)
1997-01-01
Sections of MOC images P024_01 and P024_02, shown here in color composite form, were acquired with the low resolution red and blue wide angle cameras over a 5 minute period starting when Mars Global Surveyor was at its closest point to the planet at the beginning of its 24th orbit (around 4:00 AM PDT on October 20, 1997). To make this image, a third component (green) was synthesized from the red and blue images. During the imaging period, the camera was pointed straight down towards the martian surface, 176 km (109 miles) below the spacecraft. During the time it took to acquire the image, the spacecraft rose to an altitude of 310 km (193 miles). Owing to camera scanning rate and data volume constraints, the image was acquired at a resolution of roughly 1 km (0.609 mile) per pixel. The image shown here covers an area from 12° to 26° N latitude and 126° to 138° W longitude. The image is oriented with north to the top.
As has been noted in other MOC releases, Olympus Mons is the largest of the major Tharsis volcanoes, rising 25 km (15.5 miles) and stretching nearly 550 km (340 miles) east-west. The summit caldera, a composite of as many as seven roughly circular collapse depressions, is 66 by 83 km (41 by 52 miles) across. Also seen in this image are water-ice clouds that accumulate around and above the volcano during the late afternoon (at the time the image was acquired, the summit was at 5:30 PM local solar time). To understand the value of orbital observations, compare this image with the two taken during approach (PIA00929 and PIA00936), which are representative of the best resolution from Earth. Through Monday, October 28, the MOC had acquired a total of 132 images, most of which were at low sun elevation angles. Of these images, 74 were taken with the high resolution narrow angle camera and 58 with the low resolution wide angle cameras. Twenty-eight narrow angle and 24 wide angle images were taken after the suspension of aerobraking. These images, including the one shown above, are among the best returned so far. Launched on November 7, 1996, Mars Global Surveyor entered Mars orbit on Thursday, September 11, 1997. The original mission plan called for using friction with the planet's atmosphere to reduce the orbital energy, leading to a two-year mapping mission from close, circular orbit (beginning in March 1998). Owing to difficulties with one of the two solar panels, aerobraking was suspended in mid-October and is scheduled to resume in mid-November. Many of the original objectives of the mission, and in particular those of the camera, are likely to be accomplished as the mission progresses. Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
A scan-angle correction for thermal infrared multispectral data using side lapping images
Watson, K.
1996-01-01
Thermal infrared multispectral scanner (TIMS) images, acquired with side lapping flight lines, provide dual angle observations of the same area on the ground and can thus be used to estimate variations in the atmospheric transmission with scan angle. The method was tested using TIMS aircraft data for six flight lines with about 30% sidelap for an area within Joshua Tree National Park, California. Generally the results correspond to predictions for the transmission scan-angle coefficient based on a standard atmospheric model although some differences were observed at the longer wavelength channels. A change was detected for the last pair of lines that may indicate either spatial or temporal atmospheric variation. The results demonstrate that the method provides information for correcting regional survey data (requiring multiple adjacent flight lines) that can be important in detecting subtle changes in lithology.
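A simplified single-layer reading of the method (our sketch; the paper's estimator may differ): with slant-path transmission $\tau(\theta)=e^{-k\sec\theta}$ and path-radiance differences neglected, two views of the same ground at scan angles $\theta_1$ and $\theta_2$ give

```latex
\frac{L(\theta_1)}{L(\theta_2)} \approx e^{-k(\sec\theta_1-\sec\theta_2)}
\quad\Longrightarrow\quad
k \approx \frac{\ln L(\theta_2)-\ln L(\theta_1)}{\sec\theta_1-\sec\theta_2}
```

which is how dual-angle observations from side-lapping lines can constrain the transmission scan-angle coefficient.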
Multi-Angle View of the Canary Islands
NASA Technical Reports Server (NTRS)
2000-01-01
A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-01-01
Recent advances in remote-sensing technology and applications are examined in reviews and reports. Topics addressed include the use of Landsat TM data to assess suspended-sediment dispersion in a coastal lagoon, the use of sun incidence angle and IR reflectance levels in mapping old-growth coniferous forests, information-management systems, Large-Format-Camera soil mapping, and the economic potential of Landsat TM winter-wheat crop-condition assessment. Consideration is given to measurement of ephemeral gully erosion by airborne laser ranging, the creation of a multipurpose cadaster, high-resolution remote sensing and the news media, the role of vegetation in the global carbon cycle, PC applications in analytical photogrammetry, multispectral geological remote sensing of a suspected impact crater, fractional calculus in digital terrain modeling, and automated mapping using GP-based survey data.
Denniss, Jonathan; Schiessl, Ingo; Nourrit, Vincent; Fenerty, Cecilia H; Gautam, Ramesh; Henson, David B
2011-11-07
To investigate the relationship between neuroretinal rim (NRR) differential light absorption (DLA, a measure of spectral absorption properties) and visual field (VF) sensitivity in primary open-angle glaucoma (POAG). Patients diagnosed with (n = 22) or suspected of having (n = 7) POAG were imaged with a multispectral system incorporating a modified digital fundus camera, 250-W tungsten-halogen lamp, and fast-tuneable liquid crystal filter. Five images were captured sequentially within 1.0 second at wavelengths selected according to absorption properties of hemoglobin (range, 570-610 nm), and a Beer-Lambert law model was used to produce DLA maps of residual NRR from the images. Patients also underwent VF testing. Differences in NRR DLA in vertically opposing 180° and 45° sectors either side of the horizontal midline were compared with corresponding differences in VF sensitivity on both decibel and linear scales by Spearman's rank correlation. The decibel VF sensitivity scale showed significant relationships between superior-inferior NRR DLA difference and sensitivity differences between corresponding VF areas in 180° NRR sectors (Spearman ρ = 0.68; P < 0.0001), superior-/inferior-temporal 45° NRR sectors (ρ = 0.57; P < 0.002), and superior-/inferior-nasal 45° NRR sectors (ρ = 0.59; P < 0.001). Using the linear VF sensitivity scale significant relationships were found for 180° NRR sectors (ρ = 0.62; P < 0.0002) and superior-inferior-nasal 45° NRR sectors (ρ = 0.53; P < 0.002). No significant difference was found between correlations using the linear or decibel VF sensitivity scales. Residual NRR DLA is related to VF sensitivity in POAG. Multispectral imaging may provide clinically important information for the assessment and management of POAG.
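A minimal sketch of a Beer-Lambert-style DLA computation (ours, with a crude reference estimate; the study's model and normalization are more careful):

```python
# Sketch: differential light absorption (DLA) map from two co-registered
# fundus images at hemoglobin-sensitive wavelengths (e.g., 570 and 610 nm).
import numpy as np

def dla_map(img_a, img_b, eps=1e-6):
    ref_a = np.percentile(img_a, 99)        # stand-in unabsorbed reference
    ref_b = np.percentile(img_b, 99)
    od_a = -np.log((img_a + eps) / ref_a)   # optical density at wavelength a
    od_b = -np.log((img_b + eps) / ref_b)
    return od_a - od_b                      # differential absorption map
```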
2015-08-20
This view from NASA Cassini spacecraft looks toward Saturn icy moon Dione, with giant Saturn and its rings in the background, just prior to the mission final close approach to the moon on August 17, 2015. At lower right is the large, multi-ringed impact basin named Evander, which is about 220 miles (350 kilometers) wide. The canyons of Padua Chasma, features that form part of Dione's bright, wispy terrain, reach into the darkness at left. Imaging scientists combined nine visible light (clear spectral filter) images to create this mosaic view: eight from the narrow-angle camera and one from the wide-angle camera, which fills in an area at lower left. The scene is an orthographic projection centered on terrain at 0.2 degrees north latitude, 179 degrees west longitude on Dione. An orthographic view is most like the view seen by a distant observer looking through a telescope. North on Dione is up. The view was acquired at distances ranging from approximately 106,000 miles (170,000 kilometers) to 39,000 miles (63,000 kilometers) from Dione and at a sun-Dione-spacecraft, or phase, angle of 35 degrees. Image scale is about 1,500 feet (450 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA19650
Non-contact measurement of rotation angle with solo camera
NASA Astrophysics Data System (ADS)
Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun
2015-02-01
To measure the rotation angle of an object about its axis, a non-contact rotation angle measurement method based on a single camera is proposed. The intrinsic parameters of the camera were calibrated with a chessboard target following plane-based calibration theory. The translation and rotation matrices between the object coordinate frame and the camera coordinate frame were calculated from the correspondence between the corner positions on the object and their coordinates in the image. The rotation angle between the measured object and the camera was then recovered from the rotation matrix. A precise angle dividing table (PADT) was used as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error did not exceed +/- 0.01 degree.
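A minimal sketch of the single-camera pose idea, assuming OpenCV; the chessboard geometry and the choice of the camera z-axis as the rotation axis are illustrative, and the intrinsics K/dist are assumed to come from a prior cv2.calibrateCamera run:

import cv2
import numpy as np

pattern = (9, 6)     # inner chessboard corners (assumed board)
square = 10.0        # square size in mm (assumed)
obj_pts = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

def rotation_about_z(gray, K, dist):
    # Find corners, solve the object-to-camera pose, and return the
    # rotation angle about the camera z-axis in degrees.
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        raise RuntimeError("chessboard not found")
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    return np.degrees(np.arctan2(R[1, 0], R[0, 0]))

# The measured rotation is the difference of this angle between two frames.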
Remote sensing and spectral analysis of plumes from ocean dumping in the New York Bight Apex
NASA Technical Reports Server (NTRS)
Johnson, R. W.
1980-01-01
The application of the remote sensing techniques of aerial photography and multispectral scanning in the qualitative and quantitative analysis of plumes from ocean dumping of waste materials is investigated in the New York Bight Apex. Plumes resulting from the dumping of acid waste and sewage sludge were observed by Ocean Color Scanner at an altitude of 19.7 km and by Modular Multispectral Scanner and mapping camera at an altitude of 3.0 km. Results of the qualitative analysis of multispectral and photographic data for the mapping, location, and identification of pollution features without concurrent sea truth measurements are presented which demonstrate the usefulness of in-scene calibration. Quantitative distributions of the suspended solids in sewage sludge released in spot and line dumps are also determined by a multiple regression analysis of multispectral and sea truth data.
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
NASA Astrophysics Data System (ADS)
Beyer, Ross A.; Archinal, B.; Li, R.; Mattson, S.; Moratto, Z.; McEwen, A.; Oberst, J.; Robinson, M.
2009-09-01
The Lunar Reconnaissance Orbiter Camera (LROC) will obtain two types of multiple overlapping coverage to derive terrain models of the lunar surface. LROC has two Narrow Angle Cameras (NACs), working jointly to provide a wider (in the cross-track direction) field of view, as well as a Wide Angle Camera (WAC). LRO's orbit precesses, and the same target can be viewed at different solar azimuth and incidence angles providing the opportunity to acquire `photometric stereo' in addition to traditional `geometric stereo' data. Geometric stereo refers to images acquired by LROC with two observations at different times. They must have different emission angles to provide a stereo convergence angle such that the resultant images have enough parallax for a reasonable stereo solution. The lighting at the target must not be radically different. If shadows move substantially between observations, it is very difficult to correlate the images. The majority of NAC geometric stereo will be acquired with one nadir and one off-pointed image (20 degree roll). Alternatively, pairs can be obtained with two spacecraft rolls (one to the left and one to the right) providing a stereo convergence angle up to 40 degrees. Overlapping WAC images from adjacent orbits can be used to generate topography of near-global coverage at kilometer-scale effective spatial resolution. Photometric stereo refers to multiple-look observations of the same target under different lighting conditions. LROC will acquire at least three (ideally five) observations of a target. These observations should have near identical emission angles, but with varying solar azimuth and incidence angles. These types of images can be processed via various methods to derive single pixel resolution topography and surface albedo. The LROC team will produce some topographic models, but stereo data collection is focused on acquiring the highest quality data so that such models can be generated later.
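The parallax budget behind these roll choices can be put in numbers with standard stereo geometry (an illustrative back-of-envelope calculation, not LROC team code): for two looks rolled to opposite sides at emission angles e1 and e2, one meter of terrain height displaces the image point by tan(e1) + tan(e2) meters on the ground.

import math

def parallax_per_meter(e1_deg, e2_deg):
    # Ground parallax generated per meter of height:
    # dp/dh = tan(e1) + tan(e2) for opposite-side looks.
    return math.tan(math.radians(e1_deg)) + math.tan(math.radians(e2_deg))

print(parallax_per_meter(0, 20))    # nadir + 20-deg roll: ~0.36 m per m
print(parallax_per_meter(20, 20))   # two opposite rolls:  ~0.73 m per m
# With ~0.5 m NAC pixels, a one-pixel match then corresponds to roughly
# 1.4 m or 0.7 m of height, respectively.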
Development of infrared scene projectors for testing fire-fighter cameras
NASA Astrophysics Data System (ADS)
Neira, Jorge E.; Rice, Joseph P.; Amon, Francine K.
2008-04-01
We have developed two types of infrared scene projectors for hardware-in-the-loop testing of thermal imaging cameras such as those used by fire-fighters. In one, direct projection, images are projected directly into the camera. In the other, indirect projection, images are projected onto a diffuse screen, which is then viewed by the camera. Both projectors use a digital micromirror array as the spatial light modulator, in the form of a Micromirror Array Projection System (MAPS) engine having a resolution of 800 x 600, aluminum-coated mirrors on a 17 micrometer pitch, and a ZnSe protective window. Fire-fighter cameras are often based upon uncooled microbolometer arrays and typically have resolutions of 320 x 240 or lower. For direct projection, we use an argon-arc source, which provides spectral radiance equivalent to a 10,000 Kelvin blackbody over the 7 micrometer to 14 micrometer wavelength range, to illuminate the micromirror array. For indirect projection, an expanded 4 W CO2 laser beam at a wavelength of 10.6 micrometers illuminates the micromirror array, and the scene formed by the first-order diffracted light from the array is projected onto a diffuse aluminum screen. In both projectors, a well-calibrated reference camera is used to provide non-uniformity correction and brightness calibration of the projected scenes, and the fire-fighter cameras alternately view the same scenes. In this paper, we compare the two methods for this application and report on our quantitative results. Indirect projection has the advantage of more easily filling the wide field of view of the fire-fighter cameras, which is typically about 50 degrees. Direct projection more efficiently utilizes the available light, which will become important in emerging multispectral and hyperspectral applications.
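The reference-camera step described above amounts to a per-pixel gain/offset (two-point) correction; the sketch below is a generic version under assumed inputs, not the authors' projector software:

import numpy as np

def two_point_nuc(frames_low, frames_high, L_low, L_high):
    # frames_low/high: stacks of projector frames at two uniform
    # brightness levels as seen by the calibrated reference camera;
    # L_low/high: radiances reported by the reference camera.
    m_low = frames_low.mean(axis=0)
    m_high = frames_high.mean(axis=0)
    gain = (L_high - L_low) / (m_high - m_low)   # per-pixel gain
    offset = L_low - gain * m_low                # per-pixel offset
    return gain, offset

# gain * frame + offset then maps any projected frame to a
# radiometrically known scene for the camera under test.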
NASA Astrophysics Data System (ADS)
Yang, Xue; Hu, Yajia; Li, Gang; Lin, Ling
2018-02-01
This paper proposes an optimized lighting method that applies a shaped-function signal to increase the dynamic range of a light-emitting-diode (LED) multispectral imaging system. The method is based on the linear response zone of the analog-to-digital conversion (ADC) and the spectral response of the camera. Auxiliary light in a higher-sensitivity region of the camera's spectral response is introduced to increase the number of A/D quantization levels that fall within the linear response zone of the ADC and to improve the signal-to-noise ratio. The active light is modulated by the shaped-function signal to improve the gray-scale resolution of the image, while the auxiliary light is modulated by a constant-intensity signal, which makes it straightforward to recover the images formed under the active illumination. The least squares method is employed to precisely extract the desired images. One wavelength of LED-based multispectral imaging was taken as an example. Experiments demonstrated that both the gray-scale resolution and the information accuracy of the images acquired by the proposed method were significantly improved. The optimized method opens up avenues for hyperspectral imaging of biological tissue.
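A hedged sketch of the least-squares extraction: if the active LED follows a known waveform s_t while the auxiliary LED is constant, each pixel obeys I_t = A*s_t + B, and the desired active-light image is the slope A (the model and names are illustrative, not the authors' code):

import numpy as np

def extract_active_image(frames, s):
    # frames: (T, H, W) stack captured in sync with the modulation;
    # s: (T,) shaped-function waveform of the active LED.
    T, H, W = frames.shape
    X = np.column_stack([s, np.ones(T)])       # design matrix [s, 1]
    Y = frames.reshape(T, -1)
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    A = coef[0].reshape(H, W)                  # active-light image
    B = coef[1].reshape(H, W)                  # auxiliary + ambient part
    return A, B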
Comparison and evaluation of datasets for off-angle iris recognition
NASA Astrophysics Data System (ADS)
Kurtuncu, Osman M.; Cerme, Gamze N.; Karakaya, Mahmut
2016-05-01
In this paper, we investigated the publicly available iris recognition datasets and their data capture procedures in order to determine whether they are suitable for stand-off iris recognition research. The majority of iris recognition datasets include only frontal iris images. Even where a dataset includes off-angle iris images, the frontal and off-angle images were not captured at the same time. Comparison of frontal and off-angle iris images shows not only differences in gaze angle but also changes in pupil dilation and accommodation. In order to isolate the effect of gaze angle from other challenging factors, including dilation and accommodation, the frontal and off-angle iris images should be captured at the same time by two different cameras. We therefore developed an iris image acquisition platform using two cameras, where one camera captures the frontal iris image and the other captures the iris image from an off-angle view. Based on a comparison of Hamming distances between frontal and off-angle iris images captured with the two-camera setup and with a one-camera setup, we observed that the Hamming distance in the two-camera setup is smaller than in the one-camera setup by margins ranging from 0.001 to 0.05. These results show that, in order to obtain accurate results in off-angle iris recognition research, a two-camera setup is necessary to distinguish the challenging factors from each other.
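For reference, the Hamming distances compared in such studies are typically the masked fractional measure sketched below (the standard Daugman-style definition; not code from this paper):

import numpy as np

def iris_hamming(code_a, code_b, mask_a, mask_b):
    # codes and masks are boolean arrays; masks flag usable
    # (non-occluded) bits of each iris code.
    valid = mask_a & mask_b
    if valid.sum() == 0:
        return 1.0
    return np.count_nonzero((code_a ^ code_b) & valid) / valid.sum()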
Image quality prediction - An aid to the Viking lander imaging investigation on Mars
NASA Technical Reports Server (NTRS)
Huck, F. O.; Wall, S. D.
1976-01-01
Image quality criteria and image quality predictions are formulated for the multispectral panoramic cameras carried by the Viking Mars landers. Image quality predictions are based on expected camera performance, Mars surface radiance, and lighting and viewing geometry (fields of view, Mars lander shadows, solar day-night alternation), and are needed for diagnosing camera performance, for arriving at a preflight imaging strategy, and for revising that strategy should the need arise. Landing considerations, camera control instructions, camera control logic, aspects of the imaging process (spectral response, spatial response, sensitivity), and likely problems are discussed. Major concerns include degradation of camera response by isotope radiation, uncertainties in lighting and viewing geometry and in landing site local topography, contamination of the camera window by dust abrasion, and initial errors in assigning camera dynamic ranges (gains and offsets).
Csutak, A; Lengyel, I; Jonasson, F; Leung, I; Geirsdottir, A; Xing, W; Peto, T
2010-10-01
To establish the agreement between image grading of conventional (45°) and ultra wide-angle (200°) digital images in the macula. In 2008, the 12-year follow-up was conducted on 573 participants of the Reykjavik Eye Study. This study included the use of the Optos P200C AF ultra wide-angle laser scanning ophthalmoscope alongside Zeiss FF 450 conventional digital fundus camera on 121 eyes with or without age-related macular degeneration using the International Classification System. Of these eyes, detailed grading was carried out on five cases each with hard drusen, geographic atrophy and chorioretinal neovascularisation, and six cases of soft drusen. Exact agreement and κ-statistics were calculated. Comparison of the conventional and ultra wide-angle images in the macula showed an overall 96.43% agreement (κ=0.93) with no disagreement at end-stage disease; although in one eye chorioretinal neovascularisation was graded as drusenoid pigment epithelial detachment. Of patients with drusen only, the exact agreement was 96.1%. The detailed grading showed no clinically significant disagreement between the conventional 45° and 200° images. On the basis of our results, there is a good agreement between grading conventional and ultra wide-angle images in the macula.
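The agreement statistics quoted above follow from a standard exact-agreement and Cohen's kappa computation; the 2x2 grading table below is a toy example, not the study's data:

import numpy as np

def cohens_kappa(confusion):
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    po = np.trace(c) / n                      # observed agreement
    pe = (c.sum(0) * c.sum(1)).sum() / n**2   # chance agreement
    return (po - pe) / (1 - pe)

print(cohens_kappa([[13, 1], [0, 14]]))       # ~0.93 for 27/28 agreement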
UAV based mapping of variation in grassland yield for forage production in Arctic environments
NASA Astrophysics Data System (ADS)
Davids, C.; Karlsen, S. R.; Jørgensen, M.; Ancin Murguzur, F. J.
2017-12-01
Grassland cultivation for animal feed is the key agricultural activity in northern Norway. Even though the growing season has increased by at least a week in the last 30 years, grassland yields appear to have declined, probably due to more challenging winter conditions and changing agronomy practices. The ability for local and regional crop productivity forecasting would assist farmers with management decisions and would provide local and national authorities with a better overview over productivity and potential problems due to e.g. winter damage. Remote sensing technology has long been used to estimate and map the variability of various biophysical parameters, but calibration is important. In order to establish the relationship between spectral reflectance and grass yield in northern European environments we combine Sentinel-2 time series, UAV-based multispectral measurements, and ground-based spectroradiometry, with biomass analyses and observations of species composition. In this presentation we will focus on the results from the UAV data acquisition. We used a multirotor UAV with different sensors (a multispectral Rikola camera, and NDVI and RGB cameras) to image a number of cultivated grasslands of different age and productivity in northern Norway in June/July 2016 and 2017. Following UAV data acquisition, 10 to 20 in situ measurements were made per field using a FieldSpec3 (350-2500 nm). In addition, samples were taken to determine biomass and grass species composition. The imaging and sampling was done immediately prior to harvesting. The Rikola camera, when used as a stand-alone camera mounted on a UAV, can collect 15 bands with a spectral width of 10-15 nm in the range between 500-890 nm. In the initial analysis of the 2016 data we investigated how well different vegetation indices correlated with biomass and showed that vegetation indices that include red edge bands perform better than widely used indices such as NDVI. We will extend the analysis with partial least square regression once the 2017 data becomes available and in this presentation we will show the results of both the partial least square regression analysis and vegetation indices for the pooled data from the 2016 and 2017 acquisition.
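The index comparison mentioned above reduces to formulas of the following form; the band centers noted in the comments are assumed for illustration:

import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-9)

def ndre(nir, red_edge):
    # Red-edge variant; indices of this family were the better
    # biomass predictors in the analysis described above.
    return (nir - red_edge) / (nir + red_edge + 1e-9)

# inputs: reflectance bands, e.g. red ~660 nm, red edge ~720 nm,
# NIR ~800 nm from the 15-band Rikola camera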
New Horizons Tracks an Asteroid
2007-04-02
The two spots in this image are a composite of two images of asteroid 2002 JF56 taken on June 11 and June 12, 2006, with the Multispectral Visible Imaging Camera component of the New Horizons Ralph imager.
Miniature Wide-Angle Lens for Small-Pixel Electronic Camera
NASA Technical Reports Server (NTRS)
Mouroulis, Pantazis; Blazejewski, Edward
2009-01-01
A proposed wide-angle lens would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The example lens has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also discussed.
1996-01-29
In this false color image of Neptune, objects that are deep in the atmosphere are blue, while those at higher altitudes are white. The image was taken by Voyager 2's wide-angle camera through an orange filter and two different methane filters. http://photojournal.jpl.nasa.gov/catalog/PIA00051
Combined position and diameter measures for lunar craters
Arthur, D.W.G.
1977-01-01
The note addresses the problem of simultaneously measuring positions and diameters of circular impact craters on wide-angle photographs of approximately spherical planets such as the Moon and Mercury. The method allows for situations in which the camera is not aligned on the planet's center. © 1977.
NASA Astrophysics Data System (ADS)
Li, J.; Wu, Z.; Wei, X.; Zhang, Y.; Feng, F.; Guo, F.
2018-04-01
Cross-calibration has the advantages of high precision, low resource requirements, and simple implementation, and it has been widely used in recent years. The four wide-field-of-view (WFV) cameras on board the Gaofen-1 satellite provide high spatial resolution and wide combined coverage (4 × 200 km) without onboard calibration. In this paper, the four-band radiometric cross-calibration coefficients of the WFV1 camera were obtained based on radiometric and geometric matching, taking the Landsat 8 OLI (Operational Land Imager) sensor as reference. The Scale Invariant Feature Transform (SIFT) feature detection method and a distance and included-angle weighting method were introduced to correct misregistration of the WFV-OLI image pairs. A radiative transfer model was used to eliminate the differences between the OLI sensor and the WFV1 camera through a spectral match factor (SMF). The near-infrared band of the WFV1 camera encompasses water vapor absorption bands, so a look-up table (LUT) of the SMF as a function of water vapor amount was established to estimate water vapor effects. A surface synchronization experiment was designed to verify the reliability of the cross-calibration coefficients, which appear to perform better than the official coefficients published by the China Centre for Resources Satellite Data and Application (CCRSDA).
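In outline, the cross-calibration regresses SMF-adjusted reference radiance against matched WFV1 digital numbers; a hedged sketch under that reading (formulation and names illustrative, not the published procedure):

import numpy as np

def cross_cal_coeffs(dn_wfv, L_oli, smf):
    # dn_wfv: WFV1 digital numbers at matched pixels; L_oli: reference
    # OLI radiances; smf: spectral match factor converting the OLI
    # band radiance to the WFV1 band.
    L_pred = L_oli * smf
    gain, offset = np.polyfit(dn_wfv, L_pred, 1)
    return gain, offset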
The NEAR Multispectral Imager.
NASA Astrophysics Data System (ADS)
Hawkins, S. E., III
1998-06-01
The Multispectral Imager, one of the primary instruments on the Near Earth Asteroid Rendezvous (NEAR) spacecraft, uses a five-element refractive optics telescope, an eight-position filter wheel, and a charge-coupled device detector to acquire images over its sensitive wavelength range of approximately 400-1100 nm. The primary science objectives of the Multispectral Imager are to determine the morphology and composition of the surface of asteroid 433 Eros. The camera will have a critical role in navigating to the asteroid. Seven narrowband spectral filters have been selected to provide multicolor imaging for comparative studies with previous observations of asteroids in the same class as Eros. The eighth filter is broadband and will be used for optical navigation. An overview of the instrument is presented, and design parameters and tradeoffs are discussed.
Solutions on a high-speed wide-angle zoom lens with aspheric surfaces
NASA Astrophysics Data System (ADS)
Yamanashi, Takanori
2012-10-01
Recent development in CMOS and digital camera technology has accelerated the business and market share of digital cinematography. In terms of optical design, this technology has increased the need to carefully consider pixel pitch and characteristics of the imager. When the field angle at the wide end, zoom ratio, and F-number are specified, choosing an appropriate zoom lens type is crucial. In addition, appropriate power distributions and lens configurations are required. At points near the wide end of a zoom lens, it is known that an aspheric surface is an effective means to correct off-axis aberrations. On the other hand, optical designers have to focus on manufacturability of aspheric surfaces and perform required analysis with respect to the surface shape. Centration errors aside, it is also important to know the sensitivity to aspheric shape errors and their effect on image quality. In this paper, wide angle cine zoom lens design examples are introduced and their main characteristics are described. Moreover, technical challenges are pointed out and solutions are proposed.
NASA Technical Reports Server (NTRS)
2005-01-01
This false color image of Saturn's moon Mimas reveals variation in either the composition or texture across its surface. During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles). This image is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined with a single black and white picture that isolates and maps regional color differences to create the final product. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green. Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of the image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil. This image was obtained when the Cassini spacecraft was above 25 degrees south, 134 degrees west latitude and longitude. The Sun-Mimas-spacecraft angle was 45 degrees and north is at the top. The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo. For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .
Unmanned Aircraft Systems Used over Western U.S. Rangelands to Characterize Terrestrial Ecosystems
NASA Astrophysics Data System (ADS)
Rango, A.
2015-12-01
New remote sensing methods to quantify terrestrial ecosystems have developed rapidly over the past 10 years. New platforms with improved aeronautical capabilities have become known as Unmanned Aircraft Systems (UAS). In addition to the new aircraft, sensors are becoming smaller and some can fit into limited payload bays. The miniaturization process is well underway, but much remains to be done. Rather than using a wide variety of sensors, a limited number of instruments is recommended. At the moment we fly 2-3 instruments (digital SLR camera, 6-band multispectral camera, and single video camera). Our flights are primarily over low-population-density western U.S. rangeland, with objectives to assess rangeland health, active erosion, vegetation change, phenology, livestock movement, and vegetation type consumed by grazing animals. All of our UAS flights are made using a serpentine flight path with overlapping images at an altitude of 700 ft (215 m). This altitude yields hyperspatial imagery with a resolution of 5-15 cm depending upon the sensor being used, and it allows determination of vegetation type based on plant structure and vegetation geometries, or by multispectral analysis. In addition to advances in aircraft and sensor technology, image processing software has become more sophisticated. Future development is necessary, and we can expect improvement in sensors, aircraft, data collection, and application to terrestrial ecosystems. Of 17 ARS research laboratories across the country, four are interested in future UAS applications and another 13 already have at least one UAS. In 2015 the Federal Aviation Administration proposed a framework of recommendations that would allow routine use of certain small UAS (those weighing less than 55 lb (25 kg)). Although these new regulations will provide increased flexibility in how flights are made, other operations will still require the use of a Certificate of Authorization.
NASA Astrophysics Data System (ADS)
Nikolashkin, S. V.; Reshetnikov, A. A.
2017-11-01
The system of video surveillance during active rocket experiments at the Polar Geophysical Observatory "Tixie" and studies of the effects of "Soyuz" vehicle launches from the "Vostochny" cosmodrome over the territory of the Republic of Sakha (Yakutia) is presented. The system consists of three AHD video cameras with different angles of view mounted on a common platform on a tripod, with the possibility of manual guiding. The main camera, with a high-sensitivity black-and-white CCD matrix (SONY EXview HAD II), is equipped, depending on the task, with either an "MTO-1000" (F = 1000 mm) or a "Jupiter-21M" (F = 300 mm) lens and is designed for detailed imaging of luminous formations. The second camera is of the same type but has a 30 degree angle of view; it is intended for imaging the general scene and large objects, and for referencing object coordinates to the stars. The third, color wide-angle camera (120 degrees) is designed for registration against landmarks in the daytime; the optical axis of this channel is directed 60 degrees downward. The data are recorded on the hard disk of a four-channel digital video recorder. Tests of the original two-channel version of the system were conducted during a geophysical rocket launch at Tixie in September 2015 and demonstrated its effectiveness.
Visual imaging control systems of the Mariner to Jupiter and Saturn spacecraft
NASA Technical Reports Server (NTRS)
Larks, L.
1979-01-01
The design and fabrication of optical systems for the Mariner Jupiter/Saturn (Voyager) mission are described. Because of the large distances of these planets from the sun, the spacecraft was designed without solar panels; electricity is generated on board by radioisotope thermoelectric generators (RTGs). The presence of the RTGs and Jupiter's radiation environment required that the optical systems be fabricated from radiation-stabilized materials. A narrow-angle and a wide-angle camera are located on the spacecraft scan platform, with the narrow-angle lens a modification of the Mariner 10 lens. The optical system is described, noting that the lens was modified by moving the aperture correctors forward and placing a spider-mounted secondary mirror at the original back surface of the second aperture corrector. The wide-angle lens was made of cerium-doped, radiation-stabilized optical glass with the greatest blue transmittance, which would be resistant to RTG and Jupiter radiation.
Research on Geometric Calibration of Spaceborne Linear Array Whiskbroom Camera
Sheng, Qinghong; Wang, Qi; Xiao, Hui; Wang, Qing
2018-01-01
The geometric calibration of a spaceborne thermal-infrared camera with high spatial resolution and wide coverage can set benchmarks for providing accurate geographical coordinates for the retrieval of land surface temperature. Using linear array whiskbroom Charge-Coupled Device (CCD) arrays to image the Earth makes it possible to obtain thermal-infrared images of large breadth at high spatial resolution. Focusing on the whiskbroom characteristics of equal time intervals and unequal angles, the present study proposes a spaceborne linear-array-scanning imaging geometric model and calibrates the temporal system parameters and whiskbroom angle parameters. With the help of the YG-14, China's first satellite equipped with thermal-infrared cameras of high spatial resolution, China's Anyang Imaging and Taiyuan Imaging are used to conduct a geometric calibration experiment and a verification test, respectively. Results show that the plane positioning accuracy without ground control points (GCPs) is better than 30 pixels and the plane positioning accuracy with GCPs is better than 1 pixel. PMID:29337885
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; ...
2016-11-28
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.
2016-09-15
NASA's Cassini spacecraft stared at Saturn for nearly 44 hours on April 25 to 27, 2016, to obtain this movie showing just over four Saturn days. With Cassini's orbit being moved closer to the planet in preparation for the mission's 2017 finale, scientists took this final opportunity to capture a long movie in which the planet's full disk fit into a single wide-angle camera frame. Visible at top is the giant hexagon-shaped jet stream that surrounds the planet's north pole. Each side of this huge shape is slightly wider than Earth. The resolution of the 250 natural color wide-angle camera frames comprising this movie is 512x512 pixels, rather than the camera's full resolution of 1024x1024 pixels. Cassini's imaging cameras have the ability to take reduced-size images like these in order to decrease the amount of data storage space required for an observation. The spacecraft began acquiring this sequence of images just after it obtained the images to make a three-panel color mosaic. When it began taking images for this movie sequence, Cassini was 1,847,000 miles (2,973,000 kilometers) from Saturn, with an image scale of 221 miles (355 kilometers) per pixel. When it finished gathering the images, the spacecraft had moved 171,000 miles (275,000 kilometers) closer to the planet, with an image scale of 200 miles (322 kilometers) per pixel. A movie is available at http://photojournal.jpl.nasa.gov/catalog/PIA21047
Auto-calibration of GF-1 WFV images using flat terrain
NASA Astrophysics Data System (ADS)
Zhang, Guo; Xu, Kai; Huang, Wenchao
2017-12-01
Four wide field view (WFV) cameras with 16-m multispectral medium-resolution and a combined swath of 800 km are onboard the Gaofen-1 (GF-1) satellite, which can increase the revisit frequency to less than 4 days and enable large-scale land monitoring. The detection and elimination of WFV camera distortions is key for subsequent applications. Due to the wide swath of WFV images, geometric calibration using either conventional methods based on the ground control field (GCF) or GCF independent methods is problematic. This is predominantly because current GCFs in China fail to cover the whole WFV image and most GCF independent methods are used for close-range photogrammetry or computer vision fields. This study proposes an auto-calibration method using flat terrain to detect nonlinear distortions of GF-1 WFV images. First, a classic geometric calibration model is built for the GF1 WFV camera, and at least two images with an overlap area that cover flat terrain are collected, then the elevation residuals between the real elevation and that calculated by forward intersection are used to solve nonlinear distortion parameters in WFV images. Experiments demonstrate that the orientation accuracy of the proposed method evaluated by GCF CPs is within 0.6 pixel, and residual errors manifest as random errors. Validation using Google Earth CPs further proves the effectiveness of auto-calibration, and the whole scene is undistorted compared to not using calibration parameters. The orientation accuracy of the proposed method and the GCF method is compared. The maximum difference is approximately 0.3 pixel, and the factors behind this discrepancy are analyzed. Generally, this method can effectively compensate for distortions in the GF-1 WFV camera.
Space infrared telescope facility wide field and diffraction limited array camera (IRAC)
NASA Technical Reports Server (NTRS)
Fazio, Giovanni G.
1988-01-01
The wide-field and diffraction-limited array camera (IRAC) is capable of two-dimensional photometry in either a wide-field or diffraction-limited mode over the wavelength range from 2 to 30 microns with a possible extension to 120 microns. A low-doped indium antimonide detector was developed for 1.8 to 5.0 microns, detectors were tested and optimized for the entire 1.8 to 30 micron range, beamsplitters were developed and tested for the 1.8 to 30 micron range, and tradeoff studies of the camera's optical system were performed. Data are presented on the performance of InSb, Si:In, Si:Ga, and Si:Sb array detectors bump-bonded to a multiplexed CMOS readout chip of the source-follower type at SIRTF operating backgrounds (equal to or less than 1 x 10^8 ph/sq cm/sec) and temperatures (4 to 12 K). Some results at higher temperatures are also presented for comparison with the SIRTF temperature results. Data are also presented on the performance of IRAC beamsplitters at room temperature at both 0 and 45 deg angle of incidence and on the performance of the all-reflecting optical system baselined for the camera.
The Athena Pancam and Color Microscopic Imager (CMI)
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Herkenhoff, K. E.; Schwochert, M.; Morris, R. V.; Sullivan, R.
2000-01-01
The Athena Mars rover payload includes two primary science-grade imagers: Pancam, a multispectral, stereo, panoramic camera system, and the Color Microscopic Imager (CMI), a multispectral and variable depth-of-field microscope. Both of these instruments will help to achieve the primary Athena science goals by providing information on the geology, mineralogy, and climate history of the landing site. In addition, Pancam provides important support for rover navigation and target selection for Athena in situ investigations. Here we describe the science goals, instrument designs, and instrument performance of the Pancam and CMI investigations.
Classification of human carcinoma cells using multispectral imagery
NASA Astrophysics Data System (ADS)
Çınar, Umut; Y. Çetin, Yasemin; Çetin-Atalay, Rengul; Çetin, Enis
2016-03-01
In this paper, we present a technique for automatically classifying human carcinoma cell images using textural features. An image dataset containing microscopy biopsy images from different patients for 14 distinct cancer cell line type is studied. The images are captured using a RGB camera attached to an inverted microscopy device. Texture based Gabor features are extracted from multispectral input images. SVM classifier is used to generate a descriptive model for the purpose of cell line classification. The experimental results depict satisfactory performance, and the proposed method is versatile for various microscopy magnification options.
Multispectral imaging of the ocular fundus using light emitting diode illumination
NASA Astrophysics Data System (ADS)
Everdell, N. L.; Styles, I. B.; Calcagni, A.; Gibson, J.; Hebden, J.; Claridge, E.
2010-09-01
We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.
High resolution multispectral photogrammetric imagery: enhancement, interpretation and evaluations
NASA Astrophysics Data System (ADS)
Roberts, Arthur; Haefele, Martin; Bostater, Charles; Becker, Thomas
2007-10-01
A variety of aerial mapping cameras were adapted and developed into simulated multiband digital photogrammetric mapping systems. Direct digital multispectral cameras, two multiband cameras (IIS 4-band and Itek 9-band), and paired mapping and reconnaissance cameras were evaluated for digital spectral performance and photogrammetric mapping accuracy in an aquatic environment. Aerial films (24 cm x 24 cm format) tested were Agfa color negative and extended red (visible and near infrared) panchromatic, and Kodak color infrared and B&W (visible and near infrared) infrared. All films were negative processed to published standards and digitally converted at either 16 (color) or 10 (B&W) microns. Excellent precision in the digital conversions was obtained, with scanning errors of less than one micron. Radiometric data conversion was undertaken using linear density conversion and centered 8-bit histogram exposure. This resulted in multiple 8-bit spectral image bands that were unaltered (not radiometrically enhanced) "optical count" conversions of film density. This provided the best film density conversion to a digital product while retaining the original film density characteristics. Data covering water depth, water quality, surface roughness, and bottom substrate were acquired using different measurement techniques as well as different techniques to locate sampling points on the imagery. Despite extensive efforts to obtain accurate ground truth data, location errors, measurement errors, and variations in the correlation between water depth and remotely sensed signal persisted. These errors must be considered endemic and may not be removable through even the most elaborate sampling setup. Results indicate that multispectral photogrammetric systems offer improved feature mapping capability.
ERTS-1 - Teaching us a new way to see.
NASA Technical Reports Server (NTRS)
Mercanti, E. P.
1973-01-01
The ERTS-1 payload is discussed, giving attention to three television cameras, which view the same area in three different spectral bands. The payload includes also a multispectral scanner subsystem and a data collection system which collects information from some 150 remote, unattended, instrumented ground platforms. Many government agencies use ERTS-1 data as integral parts of their ongoing programs. Through its EROS program, the Interior Department represents the largest single recipient and user agency of data obtained from NASA aircraft and spacecraft designed to gather repetitive information related to a wide variety of earth-science and natural-resources disciplines. Questions of environmental impact are considered together with applications in agriculture, forestry, marine resources, geography, and the survey of water resources.
Wide field-of-view dual-band multispectral muzzle flash detection
NASA Astrophysics Data System (ADS)
Montoya, J.; Melchor, J.; Spiliotis, P.; Taplin, L.
2013-06-01
Sensor technologies are undergoing revolutionary advances, as seen in the rapid growth of multispectral methodologies. Increases in spatial, spectral, and temporal resolution, and in breadth of spectral coverage, render feasible sensors that function with unprecedented performance. A system was developed that addresses many of the key hardware requirements for a practical dual-band multispectral acquisition system, including wide field of view and spectral/temporal shift between dual bands. The system was designed using a novel dichroic beam splitter and dual band-pass filter configuration that creates two side-by-side images of a scene on a single sensor. A high-speed CMOS sensor was used to simultaneously capture data from the entire scene in both spectral bands using a short focal-length lens that provided a wide field-of-view. The beam-splitter components were arranged such that the two images were maintained in optical alignment and real-time intra-band processing could be carried out using only simple arithmetic on the image halves. An experiment related to limitations of the system to address multispectral detection requirements was performed. This characterized the system's low spectral variation across its wide field of view. This paper provides lessons learned on the general limitation of key hardware components required for multispectral muzzle flash detection, using the system as a hardware example combined with simulated multispectral muzzle flash and background signatures.
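Because the two bands land on the left and right halves of one sensor, the intra-band arithmetic mentioned above can be a single array operation; a sketch with an illustrative normalized-contrast threshold:

import numpy as np

def flash_candidates(frame, thresh=0.2):
    # frame: one co-registered dual-band image; band 1 on the left
    # half, band 2 on the right half of the sensor.
    h, w = frame.shape
    band1 = frame[:, : w // 2].astype(float)
    band2 = frame[:, w // 2 :].astype(float)
    contrast = (band1 - band2) / (band1 + band2 + 1e-9)
    return contrast > thresh    # boolean map of flash-like pixels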
Clementine Observes the Moon, Solar Corona, and Venus
NASA Technical Reports Server (NTRS)
1997-01-01
In 1994, during its flight, the Clementine spacecraft returned images of the Moon. In addition to the geologic mapping cameras, the Clementine spacecraft also carried two Star Tracker cameras for navigation. These lightweight (0.3 kg) cameras kept the spacecraft on track by constantly observing the positions of stars, reminiscent of the age-old seafaring tradition of sextant/star navigation. These navigation cameras were also used to take some spectacular wide-angle images of the Moon.
In this picture the Moon is seen illuminated solely by light reflected from the Earth--Earthshine! The bright glow on the lunar horizon is caused by light from the solar corona; the sun is just behind the lunar limb. Caught in this image is the planet Venus at the top of the frame.
2015-10-15
NASA's Cassini spacecraft zoomed by Saturn's icy moon Enceladus on Oct. 14, 2015, capturing this stunning image of the moon's north pole. A companion view from the wide-angle camera (PIA20010) shows a zoomed out view of the same region for context. Scientists expected the north polar region of Enceladus to be heavily cratered, based on low-resolution images from the Voyager mission, but high-resolution Cassini images show a landscape of stark contrasts. Thin cracks cross over the pole -- the northernmost extent of a global system of such fractures. Before this Cassini flyby, scientists did not know if the fractures extended so far north on Enceladus. North on Enceladus is up. The image was taken in visible green light with the Cassini spacecraft narrow-angle camera. The view was acquired at a distance of approximately 4,000 miles (6,000 kilometers) from Enceladus and at a Sun-Enceladus-spacecraft, or phase, angle of 9 degrees. Image scale is 115 feet (35 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA19660
New Satellite Project Aerosol-UA: Remote Sensing of Aerosols in the Terrestrial Atmosphere
NASA Technical Reports Server (NTRS)
Milinevsky, G.; Yatskiv, Ya.; Degtyaryov, O.; Syniavskyi, I.; Mishchenko, Michael I.; Rosenbush, V.; Ivanov, Yu.; Makarov, A.; Bovchaliuk, A.; Danylevsky, V.;
2016-01-01
We discuss the development of the Ukrainian space project Aerosol-UA, which has the following three main objectives: (1) to monitor the spatial distribution of key characteristics of terrestrial tropospheric and stratospheric aerosols; (2) to provide a comprehensive observational database enabling accurate quantitative estimates of the aerosol contribution to the energy budget of the climate system; and (3) to quantify the contribution of anthropogenic aerosols to climate and ecological processes. The remote sensing concept of the project is based on precise orbital measurements of the intensity and polarization of sunlight scattered by the atmosphere and the surface with a scanning polarimeter accompanied by a wide-angle multispectral imager-polarimeter. Preparations have already been made for the development of the instrument suite for the Aerosol-UA project, in particular, of the multi-channel scanning polarimeter (ScanPol) designed for remote sensing studies of the global distribution of aerosol and cloud properties (such as particle size, morphology, and composition) in the terrestrial atmosphere by polarimetric and spectrophotometric measurements of the scattered sunlight in a wide range of wavelengths and viewing directions from which a scene location is observed. ScanPol is accompanied by the multispectral wide-angle imager-polarimeter (MSIP), which serves to collect information on cloud conditions and images of the Earth's surface. Various components of the polarimeter ScanPol have been prototyped, including the opto-mechanical and electronic assemblies and the scanning mirror controller. Preliminary synthetic data simulations for the retrieval of aerosol parameters over land surfaces have been performed using the Generalized Retrieval of Aerosol and Surface Properties (GRASP) algorithm. Methods for the validation of satellite data using ground-based observations of aerosol properties are also discussed. We assume that designing, building, and launching into orbit a multi-functional high-precision scanning polarimeter and an imager-polarimeter should make a significant contribution to the study of natural and anthropogenic aerosols and their climatic and ecological effects.
NASA Astrophysics Data System (ADS)
Karachevtseva, I. P.; Kozlova, N. A.; Kokhanov, A. A.; Zubarev, A. E.; Nadezhdina, I. E.; Patratiy, V. D.; Konopikhin, A. A.; Basilevsky, A. T.; Abdrakhimov, A. M.; Oberst, J.; Haase, I.; Jolliff, B. L.; Plescia, J. B.; Robinson, M. S.
2017-02-01
The Lunar Reconnaissance Orbiter Camera (LROC) system consists of a Wide Angle Camera (WAC) and Narrow Angle Camera (NAC). NAC images (∼0.5 to 1.7 m/pixel) reveal details of the Luna-21 landing site and Lunokhod-2 traverse area. We derived a Digital Elevation Model (DEM) and an orthomosaic for the study region using photogrammetric stereo processing techniques with NAC images. The DEM and mosaic allowed us to analyze the topography and morphology of the landing site area and to map the Lunokhod-2 rover route. The total range of topographic elevation along the traverse was found to be less than 144 m; and the rover encountered slopes of up to 20°. With the orthomosaic tied to the lunar reference frame, we derived coordinates of the Lunokhod-2 landing module and overnight stop points. We identified the exact rover route by following its tracks and determined its total length as 39.16 km, more than was estimated during the mission (37 km), which until recently was a distance record for planetary robotic rovers held for more than 40 years.
Towards fish-eye camera based in-home activity assessment.
Bas, Erhan; Erdogmus, Deniz; Ozertem, Umut; Pavel, Misha
2008-01-01
Indoors localization, activity classification, and behavioral modeling are increasingly important for surveillance applications including independent living and remote health monitoring. In this paper, we study the suitability of fish-eye cameras (high-resolution CCD sensors with very-wide-angle lenses) for the purpose of monitoring people in indoors environments. The results indicate that these sensors are very useful for automatic activity monitoring and people tracking. We identify practical and mathematical problems related to information extraction from these video sequences and identify future directions to solve these issues.
The Panoramic Camera (Pancam) Investigation on the NASA 2003 Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.
2003-01-01
The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360 degrees of azimuth and from zenith to nadir, providing a complete view of the scene around the rover.
13. 22"X34" original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...
13. 22"X34" original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4"=1'-0" (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
10. 22"X34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...
10. 22"X34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2"=1'-0". (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data
Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem
2016-01-01
The main objective of this study is to determine the importance of a machine vision approach for the classification of five types of land cover data: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover types accurately. Multispectral data for these land covers were acquired with a handheld device named a multispectral radiometer in the form of five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features for each image. The 30 most discriminant features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). The clustering of the selected texture data was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN; n-class). Using an 80/20 cross-validation split, we obtained an accuracy of 91.332% for the texture data and 96.40% for the multispectral data, respectively. PMID:27376088
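A rough stand-in for the described pipeline using scikit-learn, with random placeholder data, a mutual-information selector in place of the paper's F + PA + MI combination, and arbitrary network sizes:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.neural_network import MLPClassifier

X = np.random.rand(200, 229)        # 229 texture features per image (toy)
y = np.random.randint(0, 5, 200)    # five land cover classes

X30 = SelectKBest(mutual_info_classif, k=30).fit_transform(X, y)
Xtr, Xte, ytr, yte = train_test_split(X30, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))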
Non-contact assessment of melanin distribution via multispectral temporal illumination coding
NASA Astrophysics Data System (ADS)
Amelard, Robert; Scharfenberger, Christian; Wong, Alexander; Clausi, David A.
2015-03-01
Melanin is a pigment that is highly absorptive in the UV and visible electromagnetic spectra. It is responsible for perceived skin tone, and protects against harmful UV effects. Abnormal melanin distribution is often an indicator of melanoma. We propose a novel approach for non-contact assessment of melanin distribution via multispectral temporal illumination coding, estimating the two-dimensional melanin distribution based on its absorptive characteristics. In the proposed system, a novel multispectral, cross-polarized, temporally coded illumination sequence is synchronized with a camera to measure reflectance under both multispectral and ambient illumination. This allows us to eliminate the ambient illumination contribution from the acquired reflectance measurements, and also to determine the melanin distribution in an observed region based on the spectral properties of melanin using the Beer-Lambert law. Using this information, melanin distribution maps can be generated for objective, quantitative assessment of the skin type of individuals. We show that the melanin distribution map correctly identifies areas with high melanin densities (e.g., nevi).
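The ambient-elimination and absorbance steps can be sketched as an on/off frame subtraction followed by a Beer-Lambert absorbance, assuming camera-illumination synchronization (names and the reference normalization are illustrative, not the authors' implementation):

import numpy as np

def ambient_free(frame_on, frame_off):
    # With the camera synchronized to the illumination code, the
    # difference of an LED-on and an LED-off frame removes ambient light.
    return np.clip(frame_on.astype(float) - frame_off.astype(float), 0, None)

def melanin_absorbance(R, R_ref, eps=1e-6):
    # Beer-Lambert-style absorbance at a melanin-sensitive wavelength,
    # normalized by a reference reflectance.
    return -np.log((R + eps) / (R_ref + eps))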
Johnson, J. R.; Grundy, W.M.; Lemmon, M.T.; Bell, J.F.; Johnson, M.J.; Deen, R.G.; Arvidson, R. E.; Farrand, W. H.; Guinness, E.A.; Hayes, A.G.; Herkenhoff, K. E.; Seelos, F.; Soderblom, J.; Squyres, S.
2006-01-01
Multispectral observations of rocks and soils were acquired under varying illumination and viewing geometries in visible/near-infrared wavelengths by the Panoramic Camera (Pancam) on the Spirit Mars Exploration Rover to provide constraints on the physical and mineralogical nature of geologic materials in Gusev Crater. Data sets were acquired at six sites located near the landing site, in the surrounding plains, and in the West Spur and Husband Hill regions of the Columbia Hills. From these ~600 images, over 10,000 regions of interest were selected of rocks and soils over a wide range of phase angles (0-130°). Corrections for diffuse skylight incorporated sky models based on observations of atmospheric opacity throughout the mission. Disparity maps created from Pancam stereo images allowed inclusion of estimates of local facet orientations in the sky models. Single-term and two-term phase functions derived from Hapke scattering models exhibit a dominantly broad backscattering trend for soils and "Red" rocks inferred to be covered with variable amounts of dust and other coatings, consistent with the results from the Viking Lander and Imager for Mars Pathfinder cameras. Darker "Gray" rock surfaces (inferred to be relatively less dust covered) display more narrow, forward scattering behaviors, consistent with particles exhibiting little internal scattering. Gray and Red rocks are macroscopically rougher than most soil units, although a "dust-cleaning" event observed near the Paso Robles site caused an increase in soil surface roughness in addition to a substantial decrease in surface single scattering albedo. Gray rocks near the rim of Bonneville Crater exhibit the largest macroscopic roughness (theta-bar) among all units, as well as the greatest backscattering among Gray rocks. Photometric properties of coated Red rocks vary in the West Spur region, possibly as a result of weathering differences related to elevation-dependent aeolian regimes. Copyright 2006 by the American Geophysical Union.
Upper wide-angle viewing system for ITER.
Lasnier, C J; McLean, A G; Gattuso, A; O'Neill, R; Smiley, M; Vasquez, J; Feder, R; Smith, M; Stratton, B; Johnson, D; Verlaan, A L; Heijmans, J A C
2016-11-01
The Upper Wide Angle Viewing System (UWAVS) will be installed on five upper ports of ITER. This paper shows major requirements, gives an overview of the preliminary design with reasons for some design choices, examines self-emitted IR light from UWAVS optics and its effect on accuracy, and shows calculations of signal-to-noise ratios for the two-color temperature output as a function of integration time and divertor temperature. Accurate temperature output requires correction for vacuum window absorption vs. wavelength and for self-emitted IR, which requires good measurement of the temperature of the optical components. The anticipated signal-to-noise ratio using presently available IR cameras is adequate for the required 500 Hz frame rate.
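The two-color temperature output rests on the ratio of radiances in two IR bands. A sketch under the Wien approximation with a graybody assumption; the example wavelengths are hypothetical, not the UWAVS design values:

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant, m*K

def two_color_temperature(s1, s2, lam1, lam2):
    """Temperature (K) from the ratio of two background- and
    self-emission-corrected radiance signals at wavelengths lam1, lam2 (m),
    assuming a graybody so emissivity cancels in the ratio."""
    rho = s1 / s2
    return C2 * (1 / lam1 - 1 / lam2) / np.log((lam2 / lam1) ** 5 / rho)

# e.g. hypothetical mid-IR bands at 3.9 and 4.5 microns:
# two_color_temperature(s1, s2, 3.9e-6, 4.5e-6)
```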
A Warping Framework for Wide-Angle Imaging and Perspective Manipulation
ERIC Educational Resources Information Center
Carroll, Robert E.
2013-01-01
Nearly all photographs are created with lenses that approximate an ideal pinhole camera--that is, a perspective projection. This projection has proven useful not only for creating realistic depictions, but also for its expressive flexibility. Beginning in the Renaissance, the notion of perspective gave artists a systematic way to represent…
Schiaparelli Crater Rim and Interior Deposits
NASA Technical Reports Server (NTRS)
1998-01-01
A portion of the rim and interior of the large impact crater Schiaparelli is seen at different resolutions in images acquired October 18, 1997 by the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) and by the Viking Orbiter 1 twenty years earlier. The left image is a MOC wide angle camera 'context' image showing much of the eastern portion of the crater at roughly 1 km (0.6 mi) per picture element. The image is about 390 by 730 km (240 X 450 miles). Shown within the wide angle image is the outline of a portion of the best Viking image (center, 371S53), acquired at a resolution of about 240 m/pixel (790 feet). The area covered is 144 X 144 km (89 X 89 miles). The right image is the high resolution narrow angle camera view. The area covered is very small--3.9 X 10.2 km (2.4 X 6.33 mi)--but is seen at 63 times higher resolution than the Viking image. The subdued relief and bright surface are attributed to blanketing by dust; many small craters have been completely filled in, and only the most recent (and very small) craters appear sharp and bowl-shaped. Some of the small craters are only 10-12 m (30-35 feet) across. Occasional dark streaks on steeper slopes are small debris slides that have probably occurred in the past few decades. The two prominent, narrow ridges in the center of the image may be related to the adjustment of the crater floor to age or the weight of the material filling the basin.
Malin Space Science Systems (MSSS) and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
The Day the Earth Smiled: Sneak Preview
2013-07-22
In this rare image taken on July 19, 2013, the wide-angle camera on NASA's Cassini spacecraft has captured Saturn's rings and our planet Earth and its moon in the same frame. It is only one footprint in a mosaic of 33 footprints covering the entire Saturn ring system (including Saturn itself). At each footprint, images were taken in different spectral filters for a total of 323 images: some were taken for scientific purposes and some to produce a natural color mosaic. This is the only wide-angle footprint that has the Earth-moon system in it. The dark side of Saturn, its bright limb, the main rings, the F ring, and the G and E rings are clearly seen; the limb of Saturn and the F ring are overexposed. The "breaks" in the brightness of Saturn's limb are due to the shadows of the rings on the globe of Saturn, preventing sunlight from shining through the atmosphere in those regions. The E and G rings have been brightened for better visibility. Earth, which is 898 million miles (1.44 billion kilometers) away in this image, appears as a blue dot at center right; the moon can be seen as a fainter protrusion off its right side. An arrow indicates their location in the annotated version. (The two are clearly seen as separate objects in the accompanying composite image PIA14949.) The other bright dots nearby are stars. This is only the third time ever that Earth has been imaged from the outer solar system. The acquisition of this image, along with the accompanying composite narrow- and wide-angle image of Earth and the moon and the full mosaic from which both are taken, marked the first time that inhabitants of Earth knew in advance that their planet was being imaged. That opportunity allowed people around the world to join together in social events to celebrate the occasion. This view looks toward the unilluminated side of the rings from about 20 degrees below the ring plane. Images taken using red, green and blue spectral filters were combined to create this natural color view. The images were obtained with the Cassini spacecraft wide-angle camera on July 19, 2013 at a distance of approximately 753,000 miles (1.212 million kilometers) from Saturn, and approximately 898.414 million miles (1.445858 billion kilometers) from Earth. Image scale on Saturn is 43 miles (69 kilometers) per pixel; image scale on the Earth is 53,820 miles (86,620 kilometers) per pixel. The illuminated areas of neither Earth nor the Moon are resolved here. Consequently, the size of each "dot" is the same size that a point of light of comparable brightness would have in the wide-angle camera. http://photojournal.jpl.nasa.gov/catalog/PIA17171
Remote identification of individual volunteer cotton plants
USDA-ARS's Scientific Manuscript database
Although airborne multispectral remote sensing can identify fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants that can similarly provide habitat for boll weevils. However, when consumer-grade cameras are used, each pix...
NASA Technical Reports Server (NTRS)
2002-01-01
One of the benefits of the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) Extended Mission is the opportunity to observe how the planet's weather changes during a second full martian year. This picture of Arsia Mons was taken June 19, 2001; southern spring equinox occurred the same day. Arsia Mons is a volcano nearly large enough to cover the state of New Mexico. On this particular day (the first day of Spring), the MOC wide angle cameras documented an unusual spiral-shaped cloud within the 110 km (68 mi) diameter caldera--the summit crater--of the giant volcano. Because the cloud is bright both in the red and blue images acquired by the wide angle cameras, it probably consisted mostly of fine dust grains. The cloud's spin may have been induced by winds off the inner slopes of the volcano's caldera walls resulting from the temperature differences between the walls and the caldera floor, or by a vortex as winds blew up and over the caldera. Similar spiral clouds were seen inside the caldera for several days; we don't know if this was a single cloud that persisted throughout that time or one that regenerated each afternoon. Sunlight illuminates this scene from the left/upper left. Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
Zhang, Dongyan; Zhou, Xingen; Zhang, Jian; Lan, Yubin; Xu, Chao; Liang, Dong
2018-01-01
Detection and monitoring are the first essential step for effective management of sheath blight (ShB), a major disease in rice worldwide. Unmanned aerial systems have a high potential of being utilized to improve this detection process since they can reduce the time needed for scouting for the disease at a field scale, and are affordable and user-friendly in operation. In this study, a commercialized quadrotor unmanned aerial vehicle (UAV), equipped with digital and multispectral cameras, was used to capture imagery data of research plots with 67 rice cultivars and elite lines. Collected imagery data were then processed and analyzed to characterize the development of ShB and quantify different levels of the disease in the field. Through color features extraction and color space transformation of images, it was found that the color transformation could qualitatively detect the infected areas of ShB in the field plots. However, it was less effective to detect different levels of the disease. Five vegetation indices were then calculated from the multispectral images, and ground truths of disease severity and GreenSeeker measured NDVI (Normalized Difference Vegetation Index) were collected. The results of relationship analyses indicate that there was a strong correlation between ground-measured NDVIs and image-extracted NDVIs with the R2 of 0.907 and the root mean square error (RMSE) of 0.0854, and a good correlation between image-extracted NDVIs and disease severity with the R2 of 0.627 and the RMSE of 0.0852. Use of image-based NDVIs extracted from multispectral images could quantify different levels of ShB in the field plots with an accuracy of 63%. These results demonstrate that a customer-grade UAV integrated with digital and multispectral cameras can be an effective tool to detect the ShB disease at a field scale.
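A sketch of the NDVI extraction and regression step behind the reported R² and RMSE; the band arrays and plot-mean values are hypothetical placeholders showing only the computation:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from multispectral bands."""
    return (nir - red) / (nir + red + 1e-9)

# Hypothetical per-plot means: image-extracted vs. ground-measured NDVI.
img_ndvi = np.array([0.71, 0.63, 0.55, 0.48])
gnd_ndvi = np.array([0.73, 0.65, 0.52, 0.50])

slope, intercept = np.polyfit(img_ndvi, gnd_ndvi, 1)
r2 = np.corrcoef(img_ndvi, gnd_ndvi)[0, 1] ** 2
rmse = np.sqrt(np.mean((np.polyval([slope, intercept], img_ndvi) - gnd_ndvi) ** 2))
```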
On the spectral reflectance properties of materials exposed at the Viking landing sites
NASA Technical Reports Server (NTRS)
Guinness, E.; Arvidson, R.; Dale-Bannister, M.; Singer, R.; Bruckenthal, E.
1987-01-01
Reflectance data derived from Viking Lander multispectral data were used to characterize the types of soils and blocks exposed at the landing sites and to search for evidence of relatively unaltered igneous rocks. A comprehensive effort was mounted to examine the multispectral data that combined testing of camera radiometric calibrations, explicit removal of the effects of atmospheric attenuation and skylight, and quantitative comparison of the corrected data to reflectance data from laboratory materials. Bi-directional reflectances for the blue, green and red channels were determined for 31 block and soil exposures at the Viking landing sites.
Applications of remote sensing to watershed management
NASA Technical Reports Server (NTRS)
Rango, A.
1975-01-01
Aircraft and satellite remote sensing systems which are capable of contributing to watershed management are described and include: the multispectral scanner subsystem on LANDSAT and the basic multispectral camera array flown on high altitude aircraft such as the U-2. Various aspects of watershed management investigated by remote sensing systems are discussed. Major areas included are: snow mapping, surface water inventories, flood management, hydrologic land use monitoring, and watershed modeling. It is indicated that technological advances in remote sensing of hydrological data must be coupled with an expansion of awareness and training in remote sensing techniques of the watershed management community.
Remote sensing techniques applied to multispectral recognition of the Aranjuez pilot zone
NASA Technical Reports Server (NTRS)
Lemos, G. L.; Salinas, J.; Rebollo, M.
1977-01-01
A rectangular (7 x 14 km) area 40 km S of Madrid was remote-sensed with a three-stage recognition process. Ground truth was established in the first phase, airborne sensing with a multispectral scanner and photographic cameras was used in the second phase, and Landsat satellite data were obtained in the third phase. Agronomic and hydrological photointerpretation problems are discussed. Color, black/white, and labeled areas are displayed for crop recognition in the land-use survey; turbidity, concentrations of pollutants and natural chemicals, and densitometry of the water are considered in the evaluation of water resources.
Morphology and Dynamics of Jets of Comet 67P Churyumov-Gerasimenko: Early Phase Development
NASA Astrophysics Data System (ADS)
Lin, Zhong-Yi; Ip, Wing-Huen; Lai, Ian-Lin; Lee, Jui-Chi; Pajola, Maurizio; Lara, Luisa; Gutierrez, Pedro; Rodrigo, Rafael; Bodewits, Dennis; A'Hearn, Mike; Vincent, Jean-Baptiste; Agarwal, Jessica; Keller, Uwe; Mottola, Stefano; Bertini, Ivano; Lowry, Stephen; Rozek, Agata; Liao, Ying; Rosetta Osiris Coi Team
2015-04-01
The scientific camera, OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System), onboard the Rosetta spacecraft comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field dust and gas coma investigations. Here we describe the dynamical behavior of jets in the dust coma, monitored continuously with dust filters from arrival at the comet (August 2014) through the mapping phase (October 2014). The analysis will cover the study of the time variability of jets, the source regions of these jets, the excess brightness of jets relative to the averaged coma brightness, and the brightness distribution of dust jets along the projected distance. The jets detected between August and September originated mostly from the neck region (Hapi). Morphological changes appeared over a time scale of several days in September. The brightness slope of the dust jets is much steeper than that of the background coma. This might be related to the sublimation or fragmentation of the emitted dust grains. Inter-comparison with results from other experiments will be necessary to understand the difference between the dust emitted from Hapi and that from the head and the body of the nucleus surface. The physical properties of the Hapi jets will be compared to those of the dust jets (and their source regions) that emerge as comet 67P moves toward perihelion.
Allometric constraints to inversion of canopy structure from remote sensing
NASA Astrophysics Data System (ADS)
Wolf, A.; Berry, J. A.; Asner, G. P.
2008-12-01
Canopy radiative transfer models employ a large number of vegetation architectural and leaf biochemical attributes. Studies of leaf biochemistry show a wide array of chemical and spectral diversity that suggests that several leaf biochemical constituents can be independently retrieved from multi-spectral remotely sensed imagery. In contrast, attempts to exploit multi-angle imagery to retrieve canopy structure only succeed in finding two or three of the many unknown canopy architectural attributes. We examine a database of over 5000 destructive tree harvests from Eurasia to show that allometry - the covariation of plant form across a broad range of plant size and canopy density - restricts the architectural diversity of plant canopies into a single composite variable ranging from young canopies with many short trees with small crowns to older canopies with fewer trees and larger crowns. Moreover, these architectural attributes are closely linked to biomass via allometric constraints such as the "self-thinning law". We use the measured variance and covariance of plant canopy architecture in these stands to drive the radiative transfer model DISORD, which employs the Li-Strahler geometric optics model. The correlations introduced in the Monte Carlo study are used to determine which attributes of canopy architecture lead to important variation that can be observed by multi-angle or multi-spectral satellite observations, using the sun-view geometry characteristic of MODIS observations in different biomes located at different latitude bands. We conclude that although multi-angle/multi-spectral remote sensing is only sensitive to some of the many unknown canopy attributes that ecologists would wish to know, the strong allometric covariation between these attributes and others permits a large number of inferences, such as forest biomass, that will yield meaningful next-generation vegetation products useful for data assimilation.
Combined Infrared Stereo and Laser Ranging Cloud Measurements from Shuttle Mission STS-85
NASA Technical Reports Server (NTRS)
Lancaster, Redgie S.; Spinhirne, James D.; Starr, David O'C. (Technical Monitor)
2001-01-01
Multi-angle remote sensing provides a wealth of information for earth and climate monitoring, and as technology advances, so do the options for developing instrumentation versatile enough to meet the demands associated with these types of measurements. In the current work, the multiangle measurement capability of the Infrared Spectral Imaging Radiometer is demonstrated. This instrument flew as part of mission STS-85 of the space shuttle Columbia in 1997 and was the first earth-observing radiometer to incorporate an uncooled microbolometer array detector as its image sensor. Specifically, a method for computing cloud-top height from the multi-spectral stereo measurements acquired during this flight has been developed, and the results demonstrate that a vertical precision of 10.6 km was achieved. Further, the accuracy of these measurements is confirmed by comparison with coincident direct laser ranging measurements from the Shuttle Laser Altimeter. Mission STS-85 was the first space flight to combine laser ranging and thermal IR camera systems for cloud remote sensing.
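To first order, the stereo retrieval triangulates the along-track parallax of a cloud feature between two view angles. A simplified flat-earth sketch; the actual ISIR processing is more involved, and the numbers are illustrative:

```python
import numpy as np

def cloud_top_height(parallax_m, theta_fwd_deg, theta_aft_deg):
    """Feature height from its apparent along-track displacement between
    two view zenith angles: parallax = h * (tan(t_fwd) - tan(t_aft))."""
    t_fwd = np.tan(np.radians(theta_fwd_deg))
    t_aft = np.tan(np.radians(theta_aft_deg))
    return parallax_m / abs(t_fwd - t_aft)

# e.g. 4 km apparent displacement between +30 and -30 deg views -> ~3.5 km
print(cloud_top_height(4000.0, 30.0, -30.0))
```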
The Effect of Camera Angle and Image Size on Source Credibility and Interpersonal Attraction.
ERIC Educational Resources Information Center
McCain, Thomas A.; Wakshlag, Jacob J.
The purpose of this study was to examine the effects of two nonverbal visual variables (camera angle and image size) on variables developed in a nonmediated context (source credibility and interpersonal attraction). Camera angle and image size were manipulated in eight video taped television newscasts which were subsequently presented to eight…
Clementine Observes the Moon, Solar Corona, and Venus
1999-06-12
In 1994, during its flight, NASA's Clementine spacecraft returned images of the Moon. In addition to the geologic mapping cameras, the Clementine spacecraft also carried two Star Tracker cameras for navigation. These lightweight (0.3 kg) cameras kept the spacecraft on track by constantly observing the positions of stars, reminiscent of the age-old seafaring tradition of sextant/star navigation. These navigation cameras were also used to take some spectacular wide angle images of the Moon. In this picture the Moon is seen illuminated solely by light reflected from the Earth--Earthshine! The bright glow on the lunar horizon is caused by light from the solar corona; the sun is just behind the lunar limb. Caught in this image is the planet Venus at the top of the frame. http://photojournal.jpl.nasa.gov/catalog/PIA00434
Quality Assessment of 3d Reconstruction Using Fisheye and Perspective Sensors
NASA Astrophysics Data System (ADS)
Strecha, C.; Zoller, R.; Rutishauser, S.; Brot, B.; Schneider-Zapp, K.; Chovancova, V.; Krull, M.; Glassey, L.
2015-03-01
Recent mathematical advances, growing alongside the use of unmanned aerial vehicles, have not only overcome the restriction of roll and pitch angles during flight but also enabled us to apply non-metric cameras in photogrammetric methods, providing more flexibility for sensor selection. Fisheye cameras, for example, advantageously provide images with wide coverage; however, these images are extremely distorted and their non-uniform resolutions make them more difficult to use for mapping or terrestrial 3D modelling. In this paper, we compare the usability of different camera-lens combinations, using the complete workflow implemented in Pix4Dmapper to achieve the final terrestrial reconstruction result of a well-known historical site in Switzerland: the Chillon Castle. We assess the accuracy of the outcome acquired by consumer cameras with perspective and fisheye lenses, comparing the results to a laser scanner point cloud.
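The distortion and non-uniform resolution noted above follow from the projection geometry: a rectilinear (pinhole) lens maps incidence angle to image radius as r = f tan(theta), while a typical fisheye uses, for example, the equidistant mapping r = f theta. A small sketch with the model choice assumed; real fisheye lenses require a calibrated polynomial model:

```python
import numpy as np

def r_rectilinear(theta_rad, f_mm):
    """Ideal pinhole (perspective) projection: r = f * tan(theta)."""
    return f_mm * np.tan(theta_rad)

def r_fisheye_equidistant(theta_rad, f_mm):
    """Equidistant fisheye projection: r = f * theta. Other common models
    (equisolid 2f*sin(theta/2), stereographic 2f*tan(theta/2)) differ."""
    return f_mm * theta_rad

# At 80 deg off-axis the rectilinear radius is ~4x the fisheye radius:
theta = np.radians(80.0)
print(r_rectilinear(theta, 8.0), r_fisheye_equidistant(theta, 8.0))
```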
Alexandridis, Thomas K; Tamouridou, Afroditi Alexandra; Pantazi, Xanthoula Eirini; Lagopodi, Anastasia L; Kashefi, Javid; Ovakoglou, Georgios; Polychronos, Vassilios; Moshou, Dimitrios
2017-09-01
In the present study, the detection and mapping of Silybum marianum (L.) Gaertn. weed using novelty detection classifiers is reported. A multispectral camera (green-red-NIR) on board a fixed wing unmanned aerial vehicle (UAV) was employed for obtaining high-resolution images. Four novelty detection classifiers were used to identify S. marianum between other vegetation in a field. The classifiers were One Class Support Vector Machine (OC-SVM), One Class Self-Organizing Maps (OC-SOM), Autoencoders and One Class Principal Component Analysis (OC-PCA). As input features to the novelty detection classifiers, the three spectral bands and texture were used. The S. marianum identification accuracy using OC-SVM reached an overall accuracy of 96%. The results show the feasibility of effective S. marianum mapping by means of novelty detection classifiers acting on multispectral UAV imagery.
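A minimal sketch of one-class novelty detection on multispectral-plus-texture features, using scikit-learn's OneClassSVM; the feature layout and data are placeholders, not the study's pipeline:

```python
import numpy as np
from sklearn.svm import OneClassSVM

# One row per pixel: green, red, NIR reflectance plus a texture measure
# (placeholder random data standing in for the UAV imagery features).
X_weed = np.random.rand(200, 4)    # known S. marianum training samples
X_scene = np.random.rand(1000, 4)  # unlabeled mixed-vegetation pixels

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
clf.fit(X_weed)                    # learn the target (weed) class only
labels = clf.predict(X_scene)      # +1 = weed-like, -1 = other vegetation
```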
Development of low-cost high-performance multispectral camera system at Banpil
NASA Astrophysics Data System (ADS)
Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.
2014-05-01
Banpil Photonics (Banpil) has developed a low-cost high-performance multispectral camera system for Visible to Short-Wave Infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512 pixel InGaAs uncooled camera system is designed to provide a compact, small form factor within a cubic inch; high sensitivity, needing less than 100 electrons; high dynamic range, exceeding 190 dB; high frame rates, greater than 1000 frames per second (FPS) at full resolution; and low power consumption, below 1 W. These are practically all of the features highly desirable in military imaging applications to expand deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor and an innovation integrating advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g. focal plane array (FPA) and Read-Out Integrated Circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high-performance imaging system, and their forecast cost structure, is presented.
A Wide Field of View Plasma Spectrometer
Skoug, Ruth M.; Funsten, Herbert O.; Moebius, Eberhard; ...
2016-07-01
Here we present a fundamentally new type of space plasma spectrometer, the wide field of view plasma spectrometer, whose field of view is >1.25π ster using fewer resources than traditional methods. The enabling component is analogous to a pinhole camera with an electrostatic energy-angle filter at the image plane. Particle energy-per-charge is selected with a tunable bias voltage applied to the filter plate relative to the pinhole aperture plate. For a given bias voltage, charged particles from different directions are focused by different angles to different locations. Particles with appropriate locations and angles can transit the filter plate and are measured using a microchannel plate detector with a position-sensitive anode. Full energy and angle coverage are obtained using a single high-voltage power supply, resulting in considerable resource savings and allowing measurements at fast timescales. Lastly, we present laboratory prototype measurements and simulations demonstrating the instrument concept and discuss optimizations of the instrument design for application to space measurements.
Challenges and solutions for high performance SWIR lens design
NASA Astrophysics Data System (ADS)
Gardner, M. C.; Rogers, P. J.; Wilde, M. F.; Cook, T.; Shipton, A.
2016-10-01
Shortwave infrared (SWIR) cameras are becoming increasingly attractive due to the improving size and resolution, and decreasing prices, of InGaAs focal plane arrays (FPAs). The rapid development of competitively priced HD-performance SWIR cameras has not been matched in SWIR imaging lenses, with the result that the lens is now more likely to be the limiting factor in imaging quality than the FPA. Adapting existing lens designs from the visible region by re-coating for SWIR will improve total transmission, but diminished image-quality metrics such as MTF, and in particular degraded large-field-angle performance (vignetting, field curvature, and distortion), are serious consequences. To meet this challenge, original SWIR solutions are presented, including a wide-field-of-view fixed-focal-length lens for commercial machine vision (CMV) and a wide-angle, small, lightweight defence lens, and their relevant design considerations are discussed. Issues restricting suitable glass types will be examined. The index and dispersion properties at SWIR wavelengths can differ significantly from their visible values, resulting in unusual glass combinations when matching doublet elements. Materials are chosen to simultaneously allow athermalization of the design as well as matched CTEs within doublet elements. Recently, thinned backside-illuminated InGaAs devices have made Vis-SWIR cameras viable. The SWIR band is sufficiently close to the visible that the same constituent materials can be used for AR coatings covering both bands. Keeping the lens short and mass low can easily result in high incidence angles, which in turn complicate coating design, especially when extended beyond SWIR into the visible band. This paper also explores the potential performance of wideband Vis-SWIR AR coatings.
Wide-field fundus imaging with trans-palpebral illumination.
Toslak, Devrim; Thapa, Damber; Chen, Yanjun; Erol, Muhammet Kazim; Paul Chan, R V; Yao, Xincheng
2017-01-28
In conventional fundus imaging devices, transpupillary illumination is used for illuminating the inside of the eye. In this method, the illumination light is directed into the posterior segment of the eye through the cornea and passes the pupillary area. As a result of sharing the pupillary area between the illumination beam and the observation path, pupil dilation is typically necessary for wide-angle fundus examination, and the field of view is inherently limited. An alternative approach is to deliver light from the sclera. It is possible to image a wider retinal area with trans-scleral illumination. However, the requirement of physical contact between the illumination probe and the sclera is a drawback of this method. We report here trans-palpebral illumination as a new method to deliver the light through the upper eyelid (palpebra). For this study, we used a 1.5 mm diameter fiber with a warm white LED light source. To illuminate the inside of the eye, the fiber illuminator was placed at the location corresponding to the pars plana region. A custom designed optical system was attached to a digital camera for retinal imaging. The optical system contained a 90 diopter ophthalmic lens and a 25 diopter relay lens. The ophthalmic lens collected light coming from the posterior of the eye and formed an aerial image between the ophthalmic and relay lenses. The aerial image was captured by the camera through the relay lens. An adequate illumination level was obtained to capture wide-angle fundus images within ocular safety limits, defined by the ISO 15004-2:2007 standard. This novel trans-palpebral illumination approach enables wide-angle fundus photography without eyeball contact and pupil dilation.
Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid
2016-06-13
Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
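For context, the quoted dB figures follow the 20 log10 light-level-ratio convention, so 82.06 dB corresponds to roughly a 12,700:1 max/min irradiance ratio (a back-of-envelope check, not a figure from the paper):

```python
import math

def dynamic_range_db(i_max, i_min):
    """Optical dynamic range in dB, 20*log10 irradiance-ratio convention."""
    return 20 * math.log10(i_max / i_min)

ratio = 10 ** (82.06 / 20)  # ~12,680:1 for the demonstrated 82.06 dB
```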
NASA Technical Reports Server (NTRS)
Ponseggi, B. G. (Editor); Johnson, H. C. (Editor)
1985-01-01
Papers are presented on the picosecond electronic framing camera, photogrammetric techniques using high-speed cineradiography, picosecond semiconductor lasers for characterizing high-speed image shutters, the measurement of dynamic strain by high-speed moire photography, the fast framing camera with independent frame adjustments, design considerations for a data recording system, and nanosecond optical shutters. Consideration is given to boundary-layer transition detectors, holographic imaging, laser holographic interferometry in wind tunnels, heterodyne holographic interferometry, a multispectral video imaging and analysis system, a gated intensified camera, a charge-injection-device profile camera, a gated silicon-intensified-target streak tube and nanosecond-gated photoemissive shutter tubes. Topics discussed include high time-space resolved photography of lasers, time-resolved X-ray spectrographic instrumentation for laser studies, a time-resolving X-ray spectrometer, a femtosecond streak camera, streak tubes and cameras, and a short pulse X-ray diagnostic development facility.
Surveillance Using Multiple Unmanned Aerial Vehicles
2009-03-01
The BATCAM's wingspan was 21 in. vs. Jodeh's 9.1 ft, its propulsion was electric vs. Jodeh's gas engine, and its cameras were body-fixed vs. gimballed. Table 3.1 (BATCAM camera FOV angles): depression angle 49° (front camera) vs. 39° (side camera); horizontal FOV 48° for both cameras; vertical FOV 40° for both cameras. The aircraft is powered by a quiet electric motor. The batteries can be recharged with a car cigarette lighter in less than an hour. Assembly of the wing airframe takes less than a minute, and...
Types of rocks exposed at the Viking landing sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guinness, E.; Arvidson, R.; Dale-Bannister, M.
1985-01-01
Spectral estimates derived from Viking Lander multispectral images have been used to investigate the types of rocks exposed at both landing sites, and to infer whether the rocks are primary igneous rocks or weathering products. These analyses should aid interpretations of spectra to be returned from the Visual and Infrared Mapping Spectrometer on the upcoming Mars Observer Mission. A series of gray surfaces on the Landers were used to check the accuracy of the camera preflight calibrations. Results indicate that the pre-flight calibrations for the three color channels are probably correct for all cameras but camera 2 on Lander 1. The calibration for the infrared channels appears to have changed, although the cause is not known. For this paper, only the color channels were used to derive data for rocks. Rocks at both sites exhibit a variety of reflectance values. For example, reflectance estimates for two rocks in the blue (0.4-0.5 microns), green (0.5-0.6 microns), and red (0.6-0.75 microns) channels are 0.16, 0.23, and 0.33 and 0.12, 0.19, 0.37 at a phase angle of 20 degrees. These values have been compared with laboratory reflectance spectra of analog materials and telescopic spectra of Mars, both convolved to the Lander bandpasses. Lander values for some rocks are similar to earth based observations of martian dark regions and with certain mafic igneous rocks thinly coated with amorphous ferric-oxide rich weathering products. These results are consistent with previous interpretations.
Reconditioning of Cassini Narrow-Angle Camera
2002-07-23
These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera optics, followed by the successful removal of the haze by warming treatments.
Multi-Angle Snowflake Camera Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuefer, Martin; Bailey, J.
2016-07-01
The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36º. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' field of view within a 10° angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
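Fall speed follows directly from the trigger timing: a sketch assuming a known vertical separation between the two emitter levels (the separation and times below are illustrative, not MASC specifications):

```python
def fall_speed(dz_m, t_upper_s, t_lower_s):
    """Hydrometeor fall speed from successive trigger times at two
    IR emitter levels separated vertically by dz_m."""
    return dz_m / (t_lower_s - t_upper_s)

# e.g. a 32 mm emitter separation crossed in 16 ms -> 2.0 m/s
v = fall_speed(0.032, 0.000, 0.016)
```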
NASA Technical Reports Server (NTRS)
2005-01-01
False color images of Saturn's moon, Mimas, reveal variation in either the composition or texture across its surface. During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles). The image at the left is a narrow angle clear-filter image, which was separately processed to enhance the contrast in brightness and sharpness of visible features. The image at the right is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined into a single black and white picture that isolates and maps regional color differences. This 'color map' was then superimposed over the clear-filter image at the left. The combination of color map and brightness image shows how the color differences across the Mimas surface materials are tied to geological features. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green. Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of each image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil. The images were obtained when the Cassini spacecraft was above 25 degrees south, 134 degrees west latitude and longitude. The Sun-Mimas-spacecraft angle was 45 degrees and north is at the top. The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo. For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .
USDA-ARS's Scientific Manuscript database
An Unmanned Agricultural Robotics System (UARS) is acquired, rebuilt with desired hardware, and operated in both classrooms and field. The UARS includes crop height sensor, crop canopy analyzer, normalized difference vegetative index (NDVI) sensor, multispectral camera, and hyperspectral radiometer...
Angle of sky light polarization derived from digital images of the sky under various conditions.
Zhang, Wenjing; Cao, Yu; Zhang, Xuanzhe; Yang, Yi; Ning, Yu
2017-01-20
Skylight polarization is used for navigation by some birds and insects. Skylight polarization also has potential for human navigation applications. Its advantages include relative immunity from interference and the absence of error accumulation over time. However, there are presently few examples of practical applications for polarization navigation technology. The main reason is its weak robustness under cloudy weather conditions. In this paper, real-time measurement of the skylight polarization pattern across the sky has been achieved with a wide field of view camera. The images were processed under a new reference coordinate system to clearly display the symmetrical distribution of the angle of polarization with respect to the solar meridian. A new algorithm for the extraction of the image axis of symmetry is proposed, in which the real-time azimuth angle between the camera and the solar meridian is accurately calculated. Our experimental results under different weather conditions show that polarization navigation has high accuracy, is strongly robust, and performs well during fog and haze, clouds, and strong sunlight.
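A sketch of the per-pixel angle-of-polarization computation that underlies the symmetry-axis extraction, assuming measurements at four analyzer orientations (the paper's acquisition scheme may differ):

```python
import numpy as np

def angle_of_polarization(i0, i45, i90, i135):
    """Per-pixel angle of polarization from four analyzer orientations:
    Q = I0 - I90, U = I45 - I135, AoP = 0.5 * atan2(U, Q)."""
    q = i0 - i90
    u = i45 - i135
    return 0.5 * np.arctan2(u, q)  # radians, measured from the 0 deg axis
```

The solar meridian can then be found as the axis about which this AoP map is mirror-symmetric.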
Multispectral and polarimetric photodetection using a plasmonic metasurface
NASA Astrophysics Data System (ADS)
Pelzman, Charles; Cho, Sang-Yeon
2018-01-01
We present a metasurface-integrated Si 2-D CMOS sensor array for multispectral and polarimetric photodetection applications. The demonstrated sensor is based on the polarization selective extraordinary optical transmission from periodic subwavelength nanostructures, acting as artificial atoms, known as meta-atoms. The meta-atoms were created by patterning periodic rectangular apertures that support optical resonance at the designed spectral bands. By spatially separating meta-atom clusters with different lattice constants and orientations, the demonstrated metasurface can convert the polarization and spectral information of an optical input into a 2-D intensity pattern. As a proof-of-concept experiment, we measured the linear components of the Stokes parameters directly from captured images using a CMOS camera at four spectral bands. Compared to existing multispectral polarimetric sensors, the demonstrated metasurface-integrated CMOS system is compact and does not require any moving components, offering great potential for advanced photodetection applications.
Lu, Liang; Qi, Lin; Luo, Yisong; Jiao, Hengchao; Dong, Junyu
2018-03-02
Multi-spectral photometric stereo can recover pixel-wise surface normals from a single RGB image. The difficulty is that the intensity in each channel entangles illumination, albedo and camera response; thus, an initial estimate of the normal is required in optimization-based solutions. In this paper, we propose to make a rough depth estimation using a deep convolutional neural network (CNN) instead of using depth sensors or binocular stereo devices. Since high-resolution ground-truth data is expensive to obtain, we designed a network and trained it with rendered images of synthetic 3D objects. We use the model to predict initial normals of real-world objects and iteratively optimize the fine-scale geometry in the multi-spectral photometric stereo framework. The experimental results illustrate the improvement of the proposed method compared with existing methods.
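A sketch of the closed-form Lambertian step that the CNN prediction initializes: with three spectral channels acting as three effective light directions, per-pixel normals follow from n = L^-1 I. The light-direction matrix below is an assumed calibration, not the authors' rig:

```python
import numpy as np

# Assumed per-channel effective light directions (rows, unit vectors).
L = np.array([[ 0.0, 0.5, 0.866],
              [ 0.5, 0.0, 0.866],
              [-0.5, 0.0, 0.866]])

def normals_from_rgb(img):
    """img: H x W x 3 channel intensities. Per-pixel Lambertian solve
    n = L^-1 I, then normalization to unit length."""
    n = np.einsum("ij,hwj->hwi", np.linalg.inv(L), img)
    norm = np.linalg.norm(n, axis=-1, keepdims=True)
    return n / np.clip(norm, 1e-9, None)
```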
Tamouridou, Afroditi A.; Lagopodi, Anastasia L.; Kashefi, Javid; Kasampalis, Dimitris; Kontouris, Georgios; Moshou, Dimitrios
2017-01-01
Remote sensing techniques are routinely used in plant species discrimination and in weed mapping. In the presented work, successful Silybum marianum detection and mapping using multilayer neural networks is demonstrated. A multispectral camera (green-red-near infrared) attached on a fixed wing unmanned aerial vehicle (UAV) was utilized for the acquisition of high-resolution images (0.1 m resolution). The Multilayer Perceptron with Automatic Relevance Determination (MLP-ARD) was used to identify the S. marianum among other vegetation, mostly Avena sterilis L. The three spectral bands of Red, Green, Near Infrared (NIR) and the texture layer resulting from local variance were used as input. The S. marianum identification rates using MLP-ARD reached an accuracy of 99.54%. The study had a one-year duration, meaning that the results are specific to that season, although the accuracy shows the interesting potential of S. marianum mapping with MLP-ARD on multispectral UAV imagery.
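A sketch of the local-variance texture layer used as the fourth input feature, with an assumed window size:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(band, size=5):
    """Texture layer: local variance of a band in a size x size window
    (window size assumed; var = E[x^2] - E[x]^2 per neighborhood)."""
    band = band.astype(float)
    mean = uniform_filter(band, size)
    mean_sq = uniform_filter(band ** 2, size)
    return np.maximum(mean_sq - mean ** 2, 0.0)  # clip float round-off
```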
Apollo 9 Mission image - S0-65 Multispectral Photography - Texas
2009-01-21
Earth Observation taken by the Apollo 9 crew. View is of Galveston and Freeport in Texas. Latitude was 28.42 N by Longitude 94.54 W, Overlap was 80%, Altitude miles were 105 and cloud cover was 5%. This imagery was taken as part of the NASA S0-65 Experiment "Multispectral Terrain Photography". The experiment provides simultaneous satellite photography of the Earth's surface in three distinct spectral bands. The photography consists of four almost spatially identical photographs. The images of ground objects appear in the same coordinate positions on all four photos in the multispectral set within the opto-mechanical tolerances of the Hasselblad cameras in the Apollo 9 spacecraft. Band designation for this frame is A. Film and filter is Ektachrome SO-368, Infrared Color Wratten 15. Mean Wavelength of Sensitivity is green, red and infrared. The Nominal Bandpass is total sensitivity of all dye layers 510-900nm.
Apollo 9 Mission image - S0-65 Multispectral Photography - New Mexico
2009-01-21
Earth Observation taken by the Apollo 9 crew. View is of Carrizozo in New Mexico and includes lava flow and snow. Latitude was 33.42 N by Longitude 106.10 W, Overlap was 7.5%, Altitude miles were 121 and cloud cover was 0%. This imagery was taken as part of the NASA S0-65 Experiment "Multispectral Terrain Photography". The experiment provides simultaneous satellite photography of the Earth's surface in three distinct spectral bands. The photography consists of four almost spatially identical photographs. The images of ground objects appear in the same coordinate positions on all four photos in the multispectral set within the opto-mechanical tolerances of the Hasselblad cameras in the Apollo 9 spacecraft. Band designation for this frame is A. Film and filter is Ektachrome SO-368, Infrared Color Wratten 15. Mean Wavelength of Sensitivity is green, red and infrared. The Nominal Bandpass is total sensitivity of all dye layers 510-900nm.
Apollo 9 Mission image - S0-65 Multispectral Photography - California
2009-01-21
Earth Observation taken by the Apollo 9 crew. View is of Salton Sea and Imperial Valley in California. Latitude was 33.09 N by Longitude 116.14 W, Overlap was 50%, Altitude miles were 103 and cloud cover was 35%. This imagery was taken as part of the NASA S0-65 Experiment "Multispectral Terrain Photography". The experiment provides simultaneous satellite photography of the Earth's surface in three distinct spectral bands. The photography consists of four almost spatially identical photographs. The images of ground objects appear in the same coordinate positions on all four photos in the multispectral set within the opto-mechanical tolerances of the Hasselblad cameras in the Apollo 9 spacecraft. Band designation for this frame is A. Film and filter is Ektachrome SO-368, Infrared Color Wratten 15. Mean Wavelength of Sensitivity is green, red and infrared. The Nominal Bandpass is total sensitivity of all dye layers 510-900nm.
NASA Astrophysics Data System (ADS)
Chernomyrdin, Nikita V.; Zaytsev, Kirill I.; Lesnichaya, Anastasiya D.; Kudrin, Konstantin G.; Cherkasova, Olga P.; Kurlov, Vladimir N.; Shikunova, Irina A.; Perchik, Alexei V.; Yurchenko, Stanislav O.; Reshetov, Igor V.
2016-09-01
In the present paper, the ability to differentiate basal cell carcinoma (BCC) from healthy skin by combining multi-spectral autofluorescence imaging, principal component analysis (PCA), and linear discriminant analysis (LDA) is demonstrated. For this purpose, an experimental setup, which includes excitation and detection branches, has been assembled. The excitation branch utilizes a mercury arc lamp equipped with a 365-nm narrow-linewidth excitation filter, a beam homogenizer, and a mechanical chopper. The detection branch employs a set of bandpass filters with the central wavelength of spectral transparency of λ = 400, 450, 500, and 550 nm, and a digital camera. The setup has been used to study three samples of freshly excised BCC. PCA and LDA have been implemented to analyze the data of multi-spectral fluorescence imaging. The results of this pilot study highlight the advantages of the proposed imaging technique for skin cancer diagnosis.
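A minimal sketch of the PCA-then-LDA analysis chain on four-band autofluorescence features; the data, shapes, and component count are placeholders, not the study's settings:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# One row per pixel/ROI: intensities at 400, 450, 500, 550 nm.
X = np.random.rand(300, 4)        # placeholder fluorescence features
y = np.random.randint(0, 2, 300)  # 0 = healthy skin, 1 = BCC (placeholder)

model = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
model.fit(X, y)
pred = model.predict(X)           # per-pixel tissue classification
```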
There is no bidirectional hot-spot in Sentinel-2 data
NASA Astrophysics Data System (ADS)
Li, Z.; Roy, D. P.; Zhang, H.
2017-12-01
The Sentinel-2 multi-spectral instrument (MSI) acquires reflective wavelength observations with directional effects due to surface reflectance anisotropy, often described by the bidirectional reflectance distribution function (BRDF). Recently, we quantified Sentinel-2A (S2A) BRDF effects for 20° × 10° of southern Africa sensed in January and in April 2016 and found maximum BRDF effects for the January data and at the western scan edge, i.e., in the back-scatter direction (Roy et al. 2017). The hot-spot is the term used to describe the increased directional reflectance that occurs over most surfaces when the solar and viewing directions coincide, and has been observed in wide-field of view data such as MODIS. Recently, we observed that Landsat data will not have a hot-spot because the global annual minimum solar zenith angle is more than twice the maximum view zenith angle (Zhang et al. 2016). This presentation examines if there is a S2A hot-spot which may be possible as it has a wider field of view (20.6°) and higher orbit (786 km) than Landsat. We examined a global year of S2A metadata extracted using the Committee on Earth Observation Satellite Visualization Environment (COVE) tool, computed the solar zenith angles in the acquisition corners, and ranked the acquisitions by the solar zenith angle in the back-scatter direction. The available image data for the 10 acquisitions with the smallest solar zenith angle over the year were ordered from the ESA and their geometries examined in detail. The acquisition closest to the hot-spot had a maximum scattering angle of 173.61° on its western edge (view zenith angle 11.91°, solar zenith angle 17.97°) and was acquired over 60.80°W 24.37°N on June 2nd 2016. Given that hot-spots are only apparent when the scattering angle is close to 180° we conclude from this global annual analysis that there is no hot-spot in Sentinel-2 data. Roy, D.P, Li, J., Zhang, H.K., Yan, L., Huang, H., Li, Z., 2017, Examination of Sentinel-2A multi-spectral instrument (MSI) reflectance anisotropy and the suitability of a general method to normalize MSI reflectance to nadir BRDF adjusted reflectance, RSE. 199, 25-38. Zhang, H. K., Roy, D.P., Kovalskyy, V., 2016, Optimal solar geometry definition for global long term Landsat time series bi-directional reflectance normalization, IEEE TGRS. 54(3), 1410-1418.
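The hot-spot test reduces to computing the scattering angle from the acquisition geometry, with 180° at exact backscatter. A sketch that roughly reproduces the quoted 173.61° case; the relative azimuth is assumed near zero in the back-scatter direction:

```python
import numpy as np

def scattering_angle(theta_s_deg, theta_v_deg, rel_az_deg):
    """Scattering angle in degrees (180 = exact backscatter) from solar
    zenith, view zenith, and sun-view relative azimuth angles."""
    ts, tv, ra = np.radians([theta_s_deg, theta_v_deg, rel_az_deg])
    cos_phase = np.cos(ts) * np.cos(tv) + np.sin(ts) * np.sin(tv) * np.cos(ra)
    return 180.0 - np.degrees(np.arccos(np.clip(cos_phase, -1.0, 1.0)))

# The acquisition quoted above, with relative azimuth assumed ~0:
print(scattering_angle(17.97, 11.91, 0.0))  # ~173.9 deg, short of 180
```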
Evaluation of a novel collimator for molecular breast tomosynthesis.
Gilland, David R; Welch, Benjamin L; Lee, Seungjoon; Kross, Brian; Weisenberger, Andrew G
2017-11-01
This study investigated a novel gamma camera for molecular breast tomosynthesis (MBT), which is a nuclear breast imaging method that uses limited angle tomography. The camera is equipped with a variable angle, slant-hole (VASH) collimator that allows the camera to remain close to the breast throughout the acquisition. The goal of this study was to evaluate the spatial resolution and count sensitivity of this camera and to compare contrast and contrast-to-noise ratio (CNR) with conventional planar imaging using an experimental breast phantom. The VASH collimator mounts to a commercial gamma camera for breast imaging that uses a pixelated (3.2 mm), 15 × 20 cm NaI crystal. Spatial resolution was measured in planar images over a range of distances from the collimator (30-100 mm) and a range of slant angles (-25° to 25°) using 99m Tc line sources. Spatial resolution was also measured in reconstructed MBT images including in the depth dimension. The images were reconstructed from data acquired over the -25° to 25° angular range using an iterative algorithm adapted to the slant-hole geometry. Sensitivity was measured over the range of slant angles using a disk source. Measured spatial resolution and sensitivity were compared to theoretical values. Contrast and CNR were measured using a breast phantom containing spherical lesions (6.2 mm and 7.8 mm diameter) and positioned over a range of depths in the phantom. The MBT and planar methods had equal scan time, and the count density in the breast phantom data was similar to that in clinical nuclear breast imaging. The MBT method used an iterative reconstruction algorithm combined with a postreconstruction Metz filter. The measured spatial resolution in planar images agreed well with theoretical calculations over the range of distances and slant angles. The measured FWHM was 9.7 mm at 50 mm distance. In reconstructed MBT images, the spatial resolution in the depth dimension was approximately 2.2 mm greater than the other two dimensions due to the limited angle data. The measured count sensitivity agreed closely with theory over all slant angles when using a wide energy window. At 0° slant angle, measured sensitivity was 19.7 counts sec -1 μCi -1 with the open energy window and 11.2 counts sec -1 μCi -1 with a 20% wide photopeak window (126 to 154 keV). The measured CNR in the MBT images was significantly greater than in the planar images for all but the lowest CNR cases where the lesion detectability was extremely low for both MBT and planar. The 7.8 mm lesion at 37 mm depth was marginally detectable in the planar image but easily visible in the MBT image. The improved CNR with MBT was due to a large improvement in contrast, which out-weighed the increase in image noise. The spatial resolution and count sensitivity measurements with the prototype MBT system matched theoretical calculations, and the measured CNR in breast phantom images was generally greater with the MBT system compared to conventional planar imaging. These results demonstrate the potential of the proposed MBT system to improve lesion detection in nuclear breast imaging. © 2017 American Association of Physicists in Medicine.
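A sketch of one common contrast and CNR convention for the ROI-based comparison described; the paper's exact ROI and noise definitions may differ:

```python
import numpy as np

def contrast_and_cnr(roi_lesion, roi_background):
    """Lesion contrast (m_l - m_b)/m_b and contrast-to-noise ratio
    (m_l - m_b)/sigma_b from lesion and background ROI pixel counts."""
    m_l, m_b = np.mean(roi_lesion), np.mean(roi_background)
    contrast = (m_l - m_b) / m_b
    cnr = (m_l - m_b) / np.std(roi_background)
    return contrast, cnr
```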
Cinematic camera emulation using two-dimensional color transforms
NASA Astrophysics Data System (ADS)
McElvain, Jon S.; Gish, Walter
2015-02-01
For cinematic and episodic productions, on-set look management is an important component of the creative process, and involves iterative adjustments of the set, actors, lighting, and camera configuration. Instead of using the professional motion picture camera to establish a particular look, the use of a smaller form factor DSLR is considered for this purpose due to its increased agility. Because the spectral response characteristics will differ between the two camera systems, a camera emulation transform is needed to approximate the behavior of the destination camera. Recently, two-dimensional transforms have been shown to provide high-accuracy conversion of raw camera signals to a defined colorimetric state. In this study, the same formalism is used for camera emulation, whereby a Canon 5D Mark III DSLR is used to approximate the behavior of a Red Epic cinematic camera. The spectral response characteristics for both cameras were measured and used to build 2D as well as 3x3 matrix emulation transforms. When tested on multispectral image databases, the 2D emulation transforms outperform their matrix counterparts, particularly for images containing highly chromatic content.
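To make the comparison concrete, a 3x3 matrix emulation transform of the kind used as the baseline above can be fit by ordinary least squares from paired raw samples of the two cameras. The sketch below uses hypothetical stand-in data; the 2D transforms evaluated in the paper replace this single matrix with a more expressive mapping.

```python
import numpy as np

# Hypothetical paired training samples: raw RGB responses of the source
# DSLR and the destination cinema camera to the same scenes/spectra.
rgb_dslr = np.random.rand(1000, 3)   # stand-in for measured Canon 5D III data
rgb_epic = np.random.rand(1000, 3)   # stand-in for measured Red Epic data

# Fit the 3x3 matrix M minimizing ||rgb_dslr @ M - rgb_epic||^2.
M, *_ = np.linalg.lstsq(rgb_dslr, rgb_epic, rcond=None)

# Emulation: map DSLR signals into approximate cinema-camera signals.
rgb_emulated = rgb_dslr @ M
rms_error = np.sqrt(np.mean((rgb_emulated - rgb_epic) ** 2))
```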
Clinical evaluation of melanomas and common nevi by spectral imaging
Diebele, Ilze; Kuzmina, Ilona; Lihachev, Alexey; Kapostinsh, Janis; Derjabo, Alexander; Valeine, Lauma; Spigulis, Janis
2012-01-01
A clinical trial on multi-spectral imaging of malignant and non-malignant skin pathologies, comprising 17 melanomas and 65 pigmented common nevi, was performed. Optical density data of the skin pathologies were obtained in the spectral range 450–950 nm using the multispectral camera Nuance EX. An image parameter and maps capable of distinguishing melanoma from pigmented nevi were proposed. The diagnostic criterion is based on skin optical density differences at three fixed wavelengths: 540 nm, 650 nm, and 950 nm. The sensitivity and specificity of this method were estimated to be 94% and 89%, respectively. The proposed methodology and potential clinical applications are discussed. PMID:22435095
Nanohole-array-based device for 2D snapshot multispectral imaging
Najiminaini, Mohamadreza; Vasefi, Fartash; Kaminska, Bozena; Carson, Jeffrey J. L.
2013-01-01
We present a two-dimensional (2D) snapshot multispectral imager that utilizes the optical transmission characteristics of nanohole arrays (NHAs) in a gold film to resolve a mixture of input colors into multiple spectral bands. The multispectral device consists of blocks of NHAs, wherein each NHA has a unique periodicity that results in transmission resonances and minima in the visible and near-infrared regions. The multispectral device was illuminated over a wide spectral range, and the transmission was spectrally unmixed using a least-squares estimation algorithm. A NHA-based multispectral imaging system was built and tested in both reflection and transmission modes. The NHA-based multispectral imager was capable of extracting 2D multispectral images representative of four independent bands within the spectral range of 662 nm to 832 nm for a variety of targets. The multispectral device can potentially be integrated into a variety of imaging sensor systems. PMID:24005065
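The least-squares unmixing step can be illustrated with a minimal sketch: assuming a calibration matrix of per-band responses for each NHA block (hypothetical numbers here), the band intensities at an image location are recovered from the signals measured behind the blocks.

```python
import numpy as np

# A[i, j]: response of NHA block i to spectral band j, assumed measured
# during calibration (hypothetical values).
rng = np.random.default_rng(0)
A = rng.random((8, 4))            # 8 NHA periodicities, 4 spectral bands

# b[i]: transmitted signal recorded behind NHA block i for one scene point.
true_bands = np.array([0.9, 0.2, 0.5, 0.7])
b = A @ true_bands

# Least-squares estimate of the per-band intensities.
bands_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(bands_est)                  # recovers true_bands in this noise-free case
```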
NASA Technical Reports Server (NTRS)
Luchini, Chris B.
1997-01-01
Development of camera and instrument simulations for space exploration requires the development of scientifically accurate models of the objects to be studied. Several planned cometary missions have prompted the development of a three-dimensional, multi-spectral, anisotropic multiple-scattering model of the cometary coma.
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Kohn, Silvia
1993-01-01
The pilot's ability to derive Control-Oriented Visual Field Information from teleoperated Helmet-Mounted Displays in Nap-of-the-Earth flight is investigated. The visual field with these types of displays, commonly used in Apache and Cobra helicopter night operations, originates from a relatively narrow field-of-view Forward-Looking Infrared camera, gimbal-mounted at the nose of the aircraft and slaved to the pilot's line of sight in order to obtain a wide-angle field-of-regard. Pilots have encountered considerable difficulties in controlling the aircraft by these devices. Experimental simulator results presented here indicate that part of these difficulties can be attributed to head/camera slaving system phase lags and errors. In the presence of voluntary head rotation, these slaving system imperfections are shown to impair the Control-Oriented Visual Field Information vital in vehicular control, such as the perception of the anticipated flight path or the vehicle yaw rate. Since, in the presence of slaving system imperfections, the pilot will tend to minimize head rotation, the full wide-angle field-of-regard of the line-of-sight-slaved Helmet-Mounted Display is not always fully utilized.
NASA Astrophysics Data System (ADS)
Ott, T.; Drolshagen, E.; Koschny, D.; Güttler, C.; Tubiana, C.; Frattin, E.; Agarwal, J.; Sierks, H.; Bertini, I.; Barbieri, C.; Lamy, P. I.; Rodrigo, R.; Rickman, H.; A'Hearn, M. F.; Barucci, M. A.; Bertaux, J.-L.; Boudreault, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; De Cecco, M.; Deller, J.; Feller, C.; Fornasier, S.; Fulle, M.; Geiger, B.; Gicquel, A.; Groussin, O.; Gutiérrez, P. J.; Hofmann, M.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Knollenberg, J.; Kovacs, G.; Kramm, J. R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Lin, Z.-Y.; López-Moreno, J. J.; Marzari, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Shi, X.; Thomas, N.; Vincent, J.-B.; Poppe, B.
2017-07-01
The OSIRIS (optical, spectroscopic and infrared remote imaging system) instrument on board the ESA Rosetta spacecraft collected data on 67P/Churyumov-Gerasimenko for over 2 yr. OSIRIS consists of two cameras, a Narrow Angle Camera and a Wide Angle Camera. For specific imaging sequences related to the observation of dust aggregates in 67P's coma, the two cameras were operated simultaneously. The two cameras are mounted 0.7 m apart from each other; this baseline yields a parallax shift of the apparent particle trails between the analysed images that is inversely proportional to the particles' distance. Thanks to such shifts, the distance between the observed dust aggregates and the spacecraft was determined. This method works for particles closer than 6000 m to the spacecraft and requires very few assumptions. We found over 250 particles in a suitable distance range, with sizes of some centimetres, masses in the range of 10⁻⁶–10² kg, and a mean velocity of about 2.4 m s⁻¹ relative to the nucleus. Furthermore, the spectral slope was analysed, showing a decrease in the median spectral slope of the particles with time. The further a particle is from the spacecraft, the fainter its signal; this selection effect was counterbalanced by a debiasing. Moreover, the dust mass-loss rate of the nucleus could be computed, as well as the Afρ of the comet around perihelion. The summed-up dust mass-loss rate for the mass bins 10⁻⁴–10² kg is almost 8300 kg s⁻¹.
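Under the small-angle approximation, the baseline geometry above gives distance = baseline / (pixel shift × angular pixel scale). A rough sketch, using the 0.7 m baseline from the text and an assumed angular pixel scale (the value below is illustrative, not taken from the paper):

```python
BASELINE_M = 0.7       # separation of the two OSIRIS cameras (from the text)
IFOV_RAD = 18.6e-6     # assumed NAC angular pixel scale, rad/pixel (illustrative)

def particle_distance_m(parallax_shift_px: float) -> float:
    """Distance of a dust aggregate from the parallax shift of its trail
    between the simultaneous frames (small-angle approximation)."""
    return BASELINE_M / (parallax_shift_px * IFOV_RAD)

print(particle_distance_m(6.0))   # ~6.3 km, near the quoted 6000 m working limit
```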
NASA Astrophysics Data System (ADS)
Bell, J. F.; Godber, A.; McNair, S.; Caplinger, M. A.; Maki, J. N.; Lemmon, M. T.; Van Beek, J.; Malin, M. C.; Wellington, D.; Kinch, K. M.; Madsen, M. B.; Hardgrove, C.; Ravine, M. A.; Jensen, E.; Harker, D.; Anderson, R. B.; Herkenhoff, K. E.; Morris, R. V.; Cisneros, E.; Deen, R. G.
2017-07-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted 2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) "true color" images, multispectral images in nine additional bands spanning 400-1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration.
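The quoted fields of view follow directly from the IFOVs and pixel counts; a quick cross-check (small differences from the quoted values reflect rounding and distortion):

```python
import math

# FOV ~ IFOV (rad/pixel) x pixel count, across each Mastcam's 1648-pixel width.
for name, ifov_mrad in [("M-34", 0.22), ("M-100", 0.074)]:
    fov_deg = math.degrees(ifov_mrad * 1e-3 * 1648)
    print(f"{name}: {fov_deg:.1f} deg")   # ~20.8 deg and ~7.0 deg
```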
A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles
NASA Technical Reports Server (NTRS)
Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.
2009-01-01
The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black-and-white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured to mount on multiple vehicles, and they act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black-and-white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black-and-white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.
Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)
NASA Astrophysics Data System (ADS)
Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.
The Lunar Reconnaissance Orbiter Mission Camera (LROC) H. Hiesinger (1,2), M.S. Robinson (3), A.S. McEwen (4), E.P. Turtle (4), E.M. Eliason (4), B.L. Jolliff (5), M.C. Malin (6), and P.C. Thomas (7) (1) Brown Univ., Dept. of Geological Sciences, Providence RI 02912, Harald_Hiesinger@brown.edu, (2) Westfaelische Wilhelms-University, (3) Northwestern Univ., (4) LPL, Univ. of Arizona, (5) Washington Univ., (6) Malin Space Science Systems, (7) Cornell Univ. The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter- and smaller-scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near-permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets: 1) meter-scale maps of regions of permanent or near-permanent illumination of polar massifs; 2) high-resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degrees) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. LROC consists of two narrow-angle cameras (NACs), which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degrees. The NAC readout noise is better than 100 e-, and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25-km long or a 2x2 binned image 100-km long. The WAC has two 6-mm-focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR and the other in the UV. Each has a cross-track FOV of 90 degrees. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible and a swath width of ~100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e-, and, as with the NAC, pixel values are digitized to 12 bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 TeraBytes (TB) of raw data.
Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than any previous mission. [1] Malin et al., JGR, 106, 17651-17672, 2001.
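The quoted NAC buffer capacity is consistent with the stated swath and image length if one assumes the samples are stored at one byte per pixel after conversion to 8-bit values (an assumption; the abstract only gives the 256 MB figure):

```python
swath_px = 5000               # NAC CCD line-array length
scale_m_per_px = 0.5          # panchromatic pixel scale
image_length_m = 25_000       # full-swath image 25 km long

lines = image_length_m / scale_m_per_px    # 50,000 lines
megabytes = swath_px * lines / 1e6         # ~250 MB at 1 byte/pixel
print(megabytes)                           # fits within the 256 MB buffer
```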
NASA Astrophysics Data System (ADS)
Davis, P. A.; Cagney, L. E.; Kohl, K. A.; Gushue, T. M.; Fritzinger, C.; Bennett, G. E.; Hamill, J. F.; Melis, T. S.
2010-12-01
Periodically, the Grand Canyon Monitoring and Research Center of the U.S. Geological Survey collects and interprets high-resolution (20-cm), airborne multispectral imagery and digital surface models (DSMs) to monitor the effects of Glen Canyon Dam operations on natural and cultural resources of the Colorado River in Grand Canyon. We previously employed the first generation of the ADS40 in 2000 and the Zeiss-Imaging Digital Mapping Camera (DMC) in 2005. Data from both sensors displayed band-image misregistration owing to multiple sensor optics and image smearing along abrupt scarps due to errors in image rectification software, both of which increased post-processing time, cost, and errors from image classification. Also, the near-infrared gain on the early, 8-bit ADS40 was not properly set and its signal was saturated for the more chlorophyll-rich vegetation, which limited our vegetation mapping. Both sensors had stereo panchromatic capability for generating a DSM. The ADS40 performed to specifications; the DMC failed. In 2009, we employed the new ADS40 SH52 to acquire 11-bit multispectral data with a single lens (20-cm positional accuracy), as well as stereo panchromatic data that provided a 1-m cell DSM (40-cm root-mean-square vertical error at one sigma). Analyses of the multispectral data showed near-perfect registration of its four band images at our 20-cm resolution, a linear response to ground reflectance, and a large dynamic range and good sensitivity (except for the blue band). Data were acquired over a 10-day period for the 450-km-long river corridor, in which acquisition time and atmospheric conditions varied considerably during inclement weather. We received 266 orthorectified flightlines for the corridor, choosing to calibrate and mosaic the data ourselves to ensure a flawless mosaic with consistent, realistic spectral information. A linear least-squares cross-calibration of overlapping flightlines for the corridor showed that the dominant factors in inter-flightline variability were solar zenith angle and atmospheric scattering, which respectively affect the slope and intercept of the calibration. The inter-flightline calibration slopes were consistently close to the square of the ratio of the cosines of the zenith angles of each pair of overlapping flightlines. Our results corroborate previous observations that the cosine of solar zenith angle is a good approximation for atmospheric transmission and the use of its square in radiometric calibrations may compensate for that effect and the effect of non-nadir sun angle on surface reflectance. It was more expedient to acquire imagery for each sub-linear river segment by collecting 5-6 parallel flightlines; river sinuosity caused us to use 2-3 flightlines for each segment. Surfaces near flightline edges were often smeared and replaced with adjacent, more nadir-viewed flightline data. Eliminating surface smearing was the most time-consuming aspect of creating a flawless image mosaic for the river corridor, but its removal will increase the efficiency and accuracy of image analyses of monitoring parameters of interest to river managers.
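The observed slope relation can be written directly: for two overlapping flightlines with solar zenith angles θ₁ and θ₂, the expected inter-flightline calibration slope is (cos θ₁ / cos θ₂)². A minimal sketch with hypothetical angles:

```python
import math

def expected_slope(theta1_deg: float, theta2_deg: float) -> float:
    """Square of the ratio of the cosines of the two solar zenith angles,
    the empirical inter-flightline slope reported above."""
    return (math.cos(math.radians(theta1_deg)) /
            math.cos(math.radians(theta2_deg))) ** 2

print(expected_slope(35.0, 42.0))   # hypothetical pair of collects: ~1.2
```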
Modeling of the ITER-like wide-angle infrared thermography view of JET.
Aumeunier, M-H; Firdaouss, M; Travère, J-M; Loarer, T; Gauthier, E; Martin, V; Chabaud, D; Humbert, E
2012-10-01
Infrared (IR) thermography systems are mandatory to ensure safe plasma operation in fusion devices. However, IR measurements are much more complicated in a metallic environment because of the spurious contributions of reflected fluxes. This paper presents a fully predictive photonic simulation able to accurately assess the surface temperature measurement obtained with classical IR thermography for a given plasma scenario, taking into account the optical properties of the PFC materials. The simulation has been carried out for the ITER-like wide-angle infrared camera view of JET and compared with experimental data. The consequences and effects of the low emissivity and of the bidirectional reflectance distribution function used in the model for the metallic PFCs on the contribution of the reflected flux in the analysis are discussed.
2013-12-23
The globe of Saturn, seen here in natural color, is reminiscent of a holiday ornament in this wide-angle view from NASA's Cassini spacecraft. The characteristic hexagonal shape of Saturn's northern jet stream, somewhat yellow here, is visible. At the pole lies a Saturnian version of a high-speed hurricane, eye and all. This view is centered on terrain at 75 degrees north latitude, 120 degrees west longitude. Images taken using red, green and blue spectral filters were combined to create this natural-color view. The images were taken with the Cassini spacecraft wide-angle camera on July 22, 2013. This view was acquired at a distance of approximately 611,000 miles (984,000 kilometers) from Saturn. Image scale is 51 miles (82 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA17175
The PanCam instrument on the 2018 Exomars rover: Scientific objectives
NASA Astrophysics Data System (ADS)
Jaumann, Ralf; Coates, Andrew; Hauber, Ernst; Hoffmann, Harald; Schmitz, Nicole; Le Deit, Laetitia; Tirsch, Daniela; Paar, Gerhard; Griffiths, Andrew
2010-05-01
The ExoMars Panoramic Camera System is an imaging suite of three camera heads to be mounted on the ExoMars rover's mast, with the boresight 1.8 m above ground. As of the ExoMars Pasteur Payload Design Review (PDR) in 2009, PanCam consists of two identical wide-angle cameras (WAC) with fixed focal length lenses, and a high-resolution camera (HRC) with an automatic focus mechanism, placed adjacent to the right WAC. The WAC stereo pair provides binocular vision for stereoscopic studies as well as 12 filter positions (per camera) for stereoscopic colour imaging and scientific multispectral studies. The stereo baseline of the pair is 500 mm. The two WACs have 22 mm focal length, f/10 lenses that illuminate detectors with 1024 × 1024 pixels. WAC lenses are fixed, with an optimal focus set to 4 m, and a focus range from 1.2 m (corresponding to the nearest view of the calibration target on the rover deck) to infinity. The HRC is able to focus between 0.9 m (distance to a drill core on the rover's sample tray) and infinity. The instantaneous fields of view of the WAC and HRC are 580 μrad/pixel and 83 μrad/pixel, respectively. The corresponding resolutions (in mm/pixel) at a distance of 2 m are 1.2 (WAC) and 0.17 (HRC); at 100 m distance they are 58 (WAC) and 8.3 (HRC). WAC and HRC will be geometrically co-aligned. The main scientific goal of PanCam is the geologic characterisation of the environment in which the rover is operating, providing the context for investigations carried out by the other instruments of the Pasteur payload. PanCam data will serve as a bridge between orbital data (high-resolution images from HRSC, CTX, and HiRISE, and spectrometer data from OMEGA and CRISM) and the data acquired in situ on the Martian surface. The position of HRC on top of the rover's mast enables the detailed panoramic inspection of surface features over the full horizontal range of 360°, even at large distances, an important prerequisite to identify the scientifically most promising targets and to plan the rover's traverse. Key to the success of PanCam is the provision of data that allow the determination of rock lithology, either of boulders on the surface or of outcrops. This task requires high spatial resolution as well as colour capabilities. The stereo images provide complementary information on the three-dimensional properties (i.e. the shape) of rocks. As an example, the degree of rounding of rocks as a result of fluvial transport can reveal the erosional history of the investigated particles, with possible implications for the chronology and intensity of rock-water interaction. The identification of lithology and geological history of rocks will strongly benefit from the co-aligned views of WAC (colour, stereo) and HRC (high spatial resolution), which will ensure that 3D and multispectral information is available together with fine-scale textural information for each scene. Stereo information is also of utmost importance for the determination of outcrop geometry (e.g., strike and dip of layered sequences), which helps to understand the emplacement history of sedimentary and volcanic rocks (e.g., cross-bedding, unconformities, etc.). PanCam will further reveal physical soil properties such as cohesion by imaging sites where the soil is disturbed by the rover's wheels and the drill. Another essential task of PanCam is the imaging of samples (from the drill) before ingestion into the rover for further analysis by other instruments.
PanCam can be tilted vertically and will also study the atmosphere (e.g., dust loading, opacity, clouds) and aeolian processes related to surface-atmosphere interactions, such as dust devils.
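The quoted ground resolutions follow from resolution ≈ IFOV × distance; a quick check against the numbers above:

```python
# WAC: 580 μrad/pixel, HRC: 83 μrad/pixel (from the text).
for cam, ifov_urad in [("WAC", 580.0), ("HRC", 83.0)]:
    for dist_m in (2.0, 100.0):
        res_mm = ifov_urad * 1e-6 * dist_m * 1000.0
        print(f"{cam} at {dist_m:.0f} m: {res_mm:.2f} mm/pixel")
# Reproduces 1.2 / 0.17 mm per pixel at 2 m and 58 / 8.3 mm per pixel at 100 m.
```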
Addressing challenges of modulation transfer function measurement with fisheye lens cameras
NASA Astrophysics Data System (ADS)
Deegan, Brian M.; Denny, Patrick E.; Zlokolica, Vladimir; Dever, Barry; Russell, Laura
2015-03-01
Modulation transfer function (MTF) is a well-defined and accepted method of measuring image sharpness. The slanted-edge test, as defined in ISO 12233, is a standard method of calculating MTF, and is widely used for lens alignment and auto-focus algorithm verification. However, there are a number of challenges which should be considered when measuring MTF in cameras with fisheye lenses. Due to trade-offs related to Petzval curvature, planarity of the image plane is difficult to achieve in fisheye lenses. It is therefore critical to have the ability to accurately measure sharpness throughout the entire image, particularly for lens alignment. One challenge for fisheye lenses is that, because of the radial distortion, the slanted edges will have different angles, depending on the location within the image and on the distortion profile of the lens. Previous work in the literature indicates that MTF measurements are robust for angles between 2 and 10 degrees. Outside of this range, MTF measurements become unreliable. Also, the slanted edge itself will be curved by the lens distortion, causing further measurement problems. This study summarises the difficulties in the use of MTF for sharpness measurement in fisheye lens cameras, and proposes mitigations and alternative methods.
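The core of the slanted-edge method can be sketched in a few lines: project and bin pixels across the edge into an edge-spread function (ESF), differentiate to obtain the line-spread function (LSF), and take the normalized FFT magnitude. This is a minimal illustration only; a full ISO 12233 implementation also estimates the edge angle and performs oversampled binning, which is exactly where the fisheye distortion described above causes trouble.

```python
import numpy as np

def mtf_from_esf(esf: np.ndarray) -> np.ndarray:
    """MTF from an already-projected, binned edge-spread function."""
    lsf = np.diff(esf)                 # line-spread function
    lsf = lsf * np.hanning(lsf.size)   # window to limit noise leakage
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                # normalize to unity at DC

# Hypothetical ESF sampled across a slightly blurred edge:
esf = 1.0 / (1.0 + np.exp(-np.linspace(-6, 6, 128)))
print(mtf_from_esf(esf)[:5])
```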
The MESSENGER Earth Flyby: Results from the Mercury Dual Imaging System
NASA Astrophysics Data System (ADS)
Prockter, L. M.; Murchie, S. L.; Hawkins, S. E.; Robinson, M. S.; Shelton, R. G.; Vaughan, R. M.; Solomon, S. C.
2005-12-01
The MESSENGER (MErcury Surface, Space ENvironment, Geochemistry, and Ranging) spacecraft was launched from Cape Canaveral Air Force Station, Fla., on 3 August 2004. It returned to Earth for a gravity assist on 2 August 2005, providing an exceptional opportunity for the Science Team to perform instrument calibrations and to test some of the data acquisition sequences that will be used to meet Mercury science goals. The Mercury Dual Imaging System (MDIS), one of seven science instruments on MESSENGER, consists of a wide-angle and a narrow-angle imager that together can map landforms, track variations in surface color, and carry out stereogrammetry. The two imagers are mounted on a pivot platform that enables the instrument to point in a different direction from the spacecraft boresight, allowing great flexibility and increased imaging coverage. During the week prior to the closest approach to Earth, MDIS acquired a number of images of the Moon for radiometric calibration and to test optical navigation sequences that will be used to target planetary flybys. Twenty-four hours before closest approach, images of the Earth were acquired with 11 filters of the wide-angle camera. After MDIS flew over the nightside of the Earth, additional color images centered on South America were obtained at sufficiently high resolution to discriminate small-scale features such as the Amazon River and Lake Titicaca. During its departure from Earth, MDIS acquired a sequence of images taken in three filters every 4 minutes over a period of 24 hours. These images have been assembled into a movie of a crescent Earth that begins as South America slides across the terminator into darkness and continues for one full Earth rotation. This movie and the other images have provided a successful test of the sequences that will be used during the MESSENGER Mercury flybys in 2008 and 2009 and have demonstrated the high quality of the MDIS wide-angle camera.
System for critical infrastructure security based on multispectral observation-detection module
NASA Astrophysics Data System (ADS)
Trzaskawka, Piotr; Kastek, Mariusz; Życzkowski, Marek; Dulski, Rafał; Szustakowski, Mieczysław; Ciurapiński, Wiesław; Bareła, Jarosław
2013-10-01
Recent terrorist attacks, and the possibility of such actions in the future, have driven the development of security systems for critical infrastructures that embrace both sensor technologies and the technical organization of systems. Perimeter protection of stationary objects has until now been based on a ring of two-zone fencing and visible-light cameras with illumination; such installations are being displaced effectively by multisensor systems that combine: visible-band technology (day/night cameras registering the optical contrast of a scene), thermal technology (low-cost bolometric cameras recording the thermal contrast of a scene), and active ground radars at microwave and millimetre wavelengths that detect reflected radiation. Merging these three different technologies into one system requires a methodology for selecting the installation conditions and sensor parameters. This procedure enables the construction of a system with correlated range, resolution, field of view, and object identification. An important technical problem connected with a multispectral system is its software, which couples the radar with the cameras. This software can be used for automatic focusing of the cameras, automatically guiding the cameras to an object detected by the radar, tracking the object, and localizing the object on a digital map, as well as for target identification and alerting. Based on a "plug and play" architecture, the system provides flexibility and simple integration of sensors and devices in TCP/IP networks. Using a graphical user interface it is possible to control the sensors, monitor streaming video and other data over the network, visualize the results of the data fusion process, and obtain detailed information about detected intruders on a digital map. The system provides high-level applications and operator workload reduction with features such as sensor-to-sensor cueing from detection devices, automatic e-mail notification, and alarm triggering. The paper presents the structure and elements of a critical infrastructure protection solution based on a modular multisensor security system, focusing mainly on the methodology for selecting sensor parameters. The results of tests in real conditions are also presented.
Wide-Field-of-View, High-Resolution, Stereoscopic Imager
NASA Technical Reports Server (NTRS)
Prechtl, Eric F.; Sedwick, Raymond J.
2010-01-01
A device combines video feeds from multiple cameras to provide wide-field-of-view, high-resolution, stereoscopic video to the user. The prototype under development consists of two camera assemblies, one for each eye. One of these assemblies incorporates a mounting structure with multiple cameras attached at offset angles. The video signals from the cameras are fed to a central processing platform where each frame is color-processed and mapped into a single contiguous wide-field-of-view image. Because the resolution of most display devices is typically smaller than the processed map, a cropped portion of the video feed is output to the display device. The positioning of the cropped window will likely be controlled through the use of a head-tracking device, allowing the user to turn his or her head side-to-side or up and down to view different portions of the captured image. There are multiple options for the display of the stereoscopic image. The use of head-mounted displays is one likely implementation; the use of 3D projection technologies is another potential approach under consideration. The technology can be adapted in a multitude of ways. The computing platform is scalable, such that the number, resolution, and sensitivity of the cameras can be leveraged to improve image resolution and field of view. Miniaturization efforts can be pursued to shrink the package for better mobility. Power-savings studies can be performed to enable unattended, remote sensing packages. Image compression and transmission technologies can be incorporated to enable an improved telepresence experience.
Flight test comparison of film type SO-289 and film type 2424 in the AMPS camera
NASA Technical Reports Server (NTRS)
Perry, L.
1975-01-01
A flight test was conducted to determine the suitability of SO-289 multispectral infrared aerial film for Earth Resources' use. It was directly compared to film type 2424, infrared aerographic film, the IR film in current use. The exposure parameters for both films are given.
Improved Airborne System for Sensing Wildfires
NASA Technical Reports Server (NTRS)
McKeown, Donald; Richardson, Michael
2008-01-01
The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.
Comparison of Spectral Characteristic between LAPAN-A3 and Sentinel-2A
NASA Astrophysics Data System (ADS)
Zylshal, Z.; Sari, N. M.; Nugroho, J. T.; Kushardono, D.
2017-12-01
The Indonesian National Institute of Aeronautics and Space (LAPAN) started building its experimental microsatellites in 2007 and was finally able to launch its first microsatellite, dubbed LAPAN-A1/LAPAN-Tubsat. With the launch of LAPAN-A3/LAPAN-IPB, the Indonesian experimental satellite programme reached its third generation. LAPAN-A3 carries multiple payloads, including a multispectral push-broom imager, a digital matrix camera, and a video camera. This paper aims to highlight the spectral differences between LAPAN-A3 and the well-established Sentinel-2A multispectral instrument, to investigate the potential of using LAPAN-A3 data to complement other well-established medium-resolution satellite data. Comparisons between corresponding bands and band transformations were performed over a dataset. Three areas of interest were chosen as test sites. Linear regression and the Pearson correlation coefficient were then calculated between the corresponding bands. The preliminary results showed a moderate correlation between the two sensors, with Pearson correlation coefficients ranging from 0.39 to 0.65. Some issues were found regarding the radiometric quality over the whole LAPAN-A3 scene.
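The band-to-band comparison reduces to a per-band regression and correlation over co-registered pixels; a minimal sketch with placeholder arrays:

```python
import numpy as np

# Flattened, co-registered pixel values of one corresponding band pair
# (placeholders for actual LAPAN-A3 and Sentinel-2A samples).
lapan_band = np.random.rand(10_000)
sentinel_band = np.random.rand(10_000)

r = np.corrcoef(lapan_band, sentinel_band)[0, 1]         # Pearson coefficient
gain, offset = np.polyfit(lapan_band, sentinel_band, 1)  # linear regression
print(r, gain, offset)
```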
Application of remote sensing techniques to hydrography with emphasis on bathymetry. M.S. Thesis
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Meireles, D. S.
1980-01-01
Remote sensing techniques are utilized for the determination of hydrographic characteristics, with emphasis on bathymetry. Two sensor systems were utilized: the Wild RC-10 metric camera and the Multispectral Scanner of the LANDSAT satellite (MSS-LANDSAT). From the metric camera photographs, photographic density data are obtained at points of known depth. A correlation between the variables density and depth is calculated through a linear regression. From this regression line, the depth of points with known photographic density is determined. The LANDSAT MSS images are interpreted automatically in the Interactive Multispectral Analysis System (I-100), yielding subareas of points with the same gray level. With some simplifications, it is assumed that the depth of a point is directly related to its gray level. Subareas with points of the same depth are then determined and isobathymetric curves are drawn. The coastline is obtained through the sensor systems already mentioned. Advantages and limitations of the techniques and of the sensor systems utilized are discussed, and the results are compared with ground truth.
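The density-depth step is a simple linear regression; a sketch with hypothetical calibration points:

```python
import numpy as np

# Photographic density at points of known depth (hypothetical values).
density = np.array([0.42, 0.55, 0.61, 0.70, 0.83])
depth_m = np.array([1.5, 2.8, 3.4, 4.6, 6.1])

# Regression line relating density to depth.
slope, intercept = np.polyfit(density, depth_m, 1)

# Depth estimate at a point where only the density is known.
print(slope * 0.65 + intercept)
```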
NASA Astrophysics Data System (ADS)
Yong, Sang-Soon; Ra, Sung-Woong
2007-10-01
The Multi-Spectral Camera (MSC) is the main payload on the KOMPSAT-2 satellite for earth remote sensing. The MSC instrument has one (1) channel for panchromatic imaging and four (4) channels for multi-spectral imaging, covering the spectral range from 450 nm to 900 nm using TDI CCD Focal Plane Arrays (FPAs). The instrument images the earth using a push-broom motion with a swath width of 15 km and a ground sample distance (GSD) of 1 m over the entire field of view (FOV) at an altitude of 685 km. The instrument is designed for an on-orbit operation duty cycle of 20% over the mission lifetime of 3 years, with programmable gain/offset and on-board image data compression/storage. The compression method on the KOMPSAT-2 MSC was selected to match the EOS input rate and the PDTS output data rate in the MSC image data chain. MSC performance was carefully managed to minimize any degradation, and it was analyzed and restored at the KGS (KOMPSAT Ground Station) during the LEOP and Cal/Val (Calibration and Validation) phases. In this paper, the on-orbit image data chain in the MSC and the image data processing at the KGS, including a general MSC description, are briefly described. The influence of the on-board compression algorithms and of the ground-station performance restoration methods on image performance is analyzed, and the relation between the two is discussed.
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele
2011-06-01
Recent advances in micro-optical element fabrication using gray-scale technology have opened up the opportunity to create simultaneous multi-spectral imaging with fine-structure diffractive lenses. This paper discusses an approach that uses diffractive optical lenses configured in an array (lenslet array) and placed in close proximity to the focal plane array, which enables a small, compact, simultaneous multispectral imaging camera [1]. The lenslet array is designed so that all lenslets have a common focal length, with each lenslet tuned for a different wavelength. The number of simultaneous spectral images is determined by the number of individually configured lenslets in the array. The number of spectral images can be increased by a factor of 2 when using a dual-band focal plane array (MWIR/LWIR) by exploiting multiple diffraction orders. In addition, modulation of the focal length of the lenslet array with piezoelectric actuation will enable spectral-bin fill-in, allowing additional spectral coverage while giving up simultaneity. Different lenslet-array spectral imaging concept designs are presented in this paper, along with a unique concept for prefiltering the radiation focused on the detector. This approach to spectral imaging has applications in the detection of chemical agents, both in aerosolized form and as liquids on surfaces. It can also be applied to the detection of weaponized biological agents and to IED detection in various forms, from manufacturing to deployment and post-detection during forensic analysis.
1986-01-24
Range: 236,000 km (147,000 mi). Resolution: 33 km (20 mi). P-29525B/W This Voyager 2 image reveals a continuous distribution of small particles throughout the Uranus ring system. This unique geometry, the highest phase angle at which Voyager imaged the rings, allows us to see lanes of fine dust particles not visible from other viewing angles. All the previously known rings are visible. However, some of the brightest features in the image are bright dust lanes not previously seen. The combination of this unique geometry and a long, 96-second exposure allowed this spectacular observation, acquired through the clear filter of Voyager 2's wide-angle camera. The long exposure produced a noticeable, non-uniform smear, as well as streaks due to trailed stars.
Real-time full-motion color Flash lidar for target detection and identification
NASA Astrophysics Data System (ADS)
Nelson, Roy; Coppock, Eric; Craig, Rex; Craner, Jeremy; Nicks, Dennis; von Niederhausern, Kurt
2015-05-01
Greatly improved understanding of areas and objects of interest can be gained when real time, full-motion Flash LiDAR is fused with inertial navigation data and multi-spectral context imagery. On its own, full-motion Flash LiDAR provides the opportunity to exploit the z dimension for improved intelligence vs. 2-D full-motion video (FMV). The intelligence value of this data is enhanced when it is combined with inertial navigation data to produce an extended, georegistered data set suitable for a variety of analysis. Further, when fused with multispectral context imagery the typical point cloud now becomes a rich 3-D scene which is intuitively obvious to the user and allows rapid cognitive analysis with little or no training. Ball Aerospace has developed and demonstrated a real-time, full-motion LIDAR system that fuses context imagery (VIS to MWIR demonstrated) and inertial navigation data in real time, and can stream these information-rich geolocated/fused 3-D scenes from an airborne platform. In addition, since the higher-resolution context camera is boresighted and frame synchronized to the LiDAR camera and the LiDAR camera is an array sensor, techniques have been developed to rapidly interpolate the LIDAR pixel values creating a point cloud that has the same resolution as the context camera, effectively creating a high definition (HD) LiDAR image. This paper presents a design overview of the Ball TotalSight™ LIDAR system along with typical results over urban and rural areas collected from both rotary and fixed-wing aircraft. We conclude with a discussion of future work.
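The interpolation of the lower-resolution LiDAR array up to the boresighted context camera's resolution could look like the sketch below (bilinear resampling shown; the system's actual interpolation scheme is not detailed above):

```python
import numpy as np
from scipy.ndimage import zoom

lidar_range = np.random.rand(128, 128)   # placeholder Flash LiDAR range frame
context = np.random.rand(1024, 1024)     # placeholder context camera frame

# Upsample the LiDAR frame to context resolution for per-pixel fusion.
hd_lidar = zoom(lidar_range, 1024 / 128, order=1)   # order=1: bilinear
assert hd_lidar.shape == context.shape
```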
Analysis of multispectral signatures and investigation of multi-aspect remote sensing techniques
NASA Technical Reports Server (NTRS)
Malila, W. A.; Hieber, R. H.; Sarno, J. E.
1974-01-01
Two major aspects of remote sensing with multispectral scanners (MSS) are investigated. The first, multispectral signature analysis, includes the effects on classification performance of systematic variations found in the average signals received from various ground covers, as well as the prediction of these variations with theoretical models of physical processes. The foremost effects studied are those associated with the time of day at which airborne MSS data are collected. Six data collection runs made over the same flight line in a period of five hours are analyzed; it is found that the time span significantly affects classification performance. Variations associated with scan angle are also studied. The second major topic of discussion is multi-aspect remote sensing, a new concept in remote sensing with scanners. Here, data are collected on multiple passes by a scanner that can be tilted to scan forward of the aircraft at different angles on different passes. The use of such spatially registered data to achieve improved classification of agricultural scenes is investigated and found promising. Also considered are the possibilities of extracting, from multi-aspect data, information on the condition of corn canopies and the stand characteristics of forests.
Kim, Min-Gab; Kim, Jin-Yong
2018-05-01
In this paper, we introduce a method to overcome the limitations of thickness measurement of a micro-patterned thin film. A spectroscopic imaging reflectometer system consisting of an acousto-optic tunable filter, a charge-coupled-device camera, and a high-magnification objective lens was proposed, and a stack of multispectral images was generated. To secure improved accuracy and lateral resolution in the reconstruction of the two-dimensional thin-film thickness, prior to the analysis of the spectral reflectance profiles from each pixel of the multispectral images, an image restoration based on an iterative deconvolution algorithm was applied to compensate for image degradation caused by blurring.
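The paper does not name its iterative deconvolution algorithm; Richardson-Lucy is one common choice and serves here only as a plausible stand-in for the restoration step:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iters=30):
    """Richardson-Lucy deconvolution (a stand-in for the unnamed
    iterative algorithm used in the paper)."""
    est = np.full_like(observed, observed.mean())
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iters):
        blurred = fftconvolve(est, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        est = est * fftconvolve(ratio, psf_mirror, mode="same")
    return est

# Hypothetical blurred spectral frame and a measured/assumed PSF:
frame = np.random.rand(64, 64)
psf = np.outer(np.hanning(7), np.hanning(7)); psf /= psf.sum()
restored = richardson_lucy(frame, psf)
```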
Reciprocity testing of Kodak film type SO-289 multispectral infrared aerial film
NASA Technical Reports Server (NTRS)
Lockwood, H. E.
1975-01-01
Kodak multispectral infrared aerial film type SO-289 was tested for reciprocity characteristics because of the variance between the I-B sensitometer exposure times (8 seconds and 4 seconds) and the camera exposure time (1/500 second) used on the ASTP stratospheric aerosol measurement project. Test exposures were made on the flight emulsion using a Mead star system sensitometer, the films were processed to ASTP control standards, and the resulting densities read and reciprocity data calculated. It was found that less exposure was required to produce a typical density (1.3) at 1/500 second exposure time than at an 8 second exposure time. This exposure factor was 2.8.
Geometric correction and digital elevation extraction using multiple MTI datasets
Mercier, Jeffrey A.; Schowengerdt, Robert A.; Storey, James C.; Smith, Jody L.
2007-01-01
Digital Elevation Models (DEMs) are traditionally acquired from a stereo pair of aerial photographs sequentially captured by an airborne metric camera. Standard DEM extraction techniques can be naturally extended to satellite imagery, but the particular characteristics of satellite imaging can cause difficulties. The spacecraft ephemeris with respect to the ground site during image collects is the most important factor in the elevation extraction process. When the angle of separation between the stereo images is small, the extraction process typically produces measurements with low accuracy, while a large angle of separation can cause an excessive number of erroneous points in the DEM from occlusion of ground areas. The use of three or more images registered to the same ground area can potentially reduce these problems and improve the accuracy of the extracted DEM. The pointing capability of some sensors, such as the Multispectral Thermal Imager (MTI), allows for multiple collects of the same area from different perspectives. This functionality of MTI makes it a good candidate for the implementation of a DEM extraction algorithm using multiple images for improved accuracy. Evaluation of this capability and development of algorithms to geometrically model the MTI sensor and extract DEMs from multi-look MTI imagery are described in this paper. An RMS elevation error of 6.3 meters is achieved using 11 ground test points, while the MTI band has a 5-meter ground sample distance.
NASA Astrophysics Data System (ADS)
Volkov, Boris; Mathews, Marlon S.; Abookasis, David
2015-03-01
Multispectral imaging has received significant attention over the last decade, as it integrates spectroscopy, imaging, and tomographic analysis concurrently to acquire both spatial and spectral information from biological tissue. In the present study, a multispectral setup based on the projection of structured illumination at several near-infrared wavelengths and at different spatial frequencies is applied to quantitatively assess brain function before, during, and after the onset of traumatic brain injury in the intact mouse brain (n=5). For the production of head injury, we used the weight-drop method, in which a cylindrical metallic rod falling along a metal tube strikes the mouse's head. Structured light was projected onto the scalp surface and diffusely reflected light was recorded by a CCD camera positioned perpendicular to the mouse head. Following data analysis, we were able to concurrently show a series of hemodynamic and morphologic changes over time, including higher deoxyhemoglobin, reduced oxygen saturation, and cell swelling, in comparison with baseline measurements. Overall, the results demonstrate the capability of multispectral imaging based on structured illumination to detect and map brain tissue optical and physiological properties following brain injury in a simple, noninvasive, and noncontact manner.
NASA Astrophysics Data System (ADS)
Cavigelli, Lukas; Bernath, Dominic; Magno, Michele; Benini, Luca
2016-10-01
Detecting and classifying targets in video streams from surveillance cameras is a cumbersome, error-prone, and expensive task. Often, the incurred costs are prohibitive for real-time monitoring. This leads to data being stored locally or transmitted to a central storage site for post-incident examination. The required communication links and archiving of the video data are still expensive, and this setup excludes preemptive actions to respond to imminent threats. An effective way to overcome these limitations is to build a smart camera that analyzes the data on-site, close to the sensor, and transmits alerts when relevant video sequences are detected. Deep neural networks (DNNs) have come to outperform humans in visual classification tasks and are also performing exceptionally well on other computer vision tasks. The concept of DNNs and Convolutional Networks (ConvNets) can easily be extended to make use of higher-dimensional input data such as multispectral data. We explore this opportunity in terms of achievable accuracy and required computational effort. To analyze the precision of DNNs for scene labeling in an urban surveillance scenario, we have created a dataset with 8 classes obtained in a field experiment. We combine an RGB camera with a 25-channel VIS-NIR snapshot sensor to assess the potential of multispectral image data for target classification. We evaluate several new DNNs, showing that the spectral information fused together with the RGB frames can be used to improve the accuracy of the system or to achieve similar accuracy with a 3x smaller computational effort. We achieve a very high per-pixel accuracy of 99.1%. Even for scarcely occurring but particularly interesting classes, such as cars, 75% of the pixels are labeled correctly, with errors occurring only around the borders of the objects. This high accuracy was obtained with a training set of only 30 labeled images, paving the way for fast adaptation to various application scenarios.
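One straightforward way to fuse the RGB and 25-channel VIS-NIR data, consistent with the channel-stacking idea above though not necessarily the paper's exact architecture, is to concatenate the co-registered frames along the channel axis before the first convolution:

```python
import torch
import torch.nn as nn

rgb = torch.rand(1, 3, 240, 320)      # placeholder RGB frame
visnir = torch.rand(1, 25, 240, 320)  # placeholder co-registered VIS-NIR cube

x = torch.cat([rgb, visnir], dim=1)   # (1, 28, H, W) multispectral input

# First layer of a scene-labeling ConvNet widened to 28 input channels.
first_conv = nn.Conv2d(in_channels=28, out_channels=64, kernel_size=3, padding=1)
features = first_conv(x)              # feeds the rest of the network
```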
Overview of the Multi-Spectral Imager on the NEAR spacecraft
NASA Astrophysics Data System (ADS)
Hawkins, S. E., III
1996-07-01
The Multi-Spectral Imager on the Near Earth Asteroid Rendezvous (NEAR) spacecraft is a 1 Hz frame rate CCD camera sensitive in the visible and near-infrared bands (~400-1100 nm). MSI is the primary instrument on the spacecraft to determine the morphology and composition of the surface of asteroid 433 Eros. In addition, the camera will be used to assist in navigation to the asteroid. The instrument uses refractive optics and has an eight-position spectral filter wheel to select different wavelength bands. The MSI optical focal length of 168 mm gives a 2.9° × 2.25° field of view. The CCD is passively cooled, and the 537 × 244 pixel array output is digitized to 12 bits. Electronic shuttering increases the effective dynamic range of the instrument by more than a factor of 100. A one-time deployable cover protects the instrument during ground testing operations and launch. A reduced-aperture viewport permits full field-of-view imaging while the cover is in place. A Data Processing Unit (DPU) provides the digital interface between the spacecraft and the Camera Head and uses an RTX2010 processor. The DPU provides an eight-frame image buffer, lossy and lossless data compression routines, and automatic exposure control. An overview of the instrument is presented, and design parameters and trade-offs are discussed.
Inflight Radiometric Calibration of New Horizons' Multispectral Visible Imaging Camera (MVIC)
NASA Technical Reports Server (NTRS)
Howett, C. J. A.; Parker, A. H.; Olkin, C. B.; Reuter, D. C.; Ennico, K.; Grundy, W. M.; Graps, A. L.; Harrison, K. P.; Throop, H. B.; Buie, M. W.;
2016-01-01
We discuss two semi-independent calibration techniques used to determine the inflight radiometric calibration for the New Horizons Multi-spectral Visible Imaging Camera (MVIC). The first calibration technique compares the measured number of counts (DN) observed from a number of well-calibrated stars to those predicted using the component-level calibration. The ratio of these values provides a multiplicative factor that allows a conversion from the preflight calibration to the more accurate inflight one, for each detector. The second calibration technique is a channel-wise relative radiometric calibration for MVIC's blue, near-infrared, and methane color channels, using Hubble and New Horizons observations of Charon and scaling from the red-channel stellar calibration. Both calibration techniques produce very similar results (better than 7% agreement), providing strong validation of the techniques used. Since the stellar calibration described here can be performed without a color target in the field of view and covers all of MVIC's detectors, it was used to provide the radiometric keyword values delivered by the New Horizons project to the Planetary Data System (PDS). These keyword values allow each observation to be converted from counts to physical units; a description of how these keyword values were generated is included. Finally, mitigation techniques adopted for the gain drift observed in the near-infrared detector and in one of the panchromatic framing cameras are also discussed.
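The first technique reduces to a per-detector ratio; a minimal sketch with hypothetical stellar counts:

```python
import numpy as np

# Hypothetical measured and predicted counts (DN) for calibration stars
# on one detector.
dn_measured = np.array([10250.0, 8410.0, 15320.0])
dn_predicted = np.array([9800.0, 8100.0, 14650.0])

# Multiplicative factor converting the preflight (component-level)
# calibration to the inflight one for this detector.
factor = np.mean(dn_measured / dn_predicted)
print(factor)
```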
Wide-Field Optic for Autonomous Acquisition of Laser Link
NASA Technical Reports Server (NTRS)
Page, Norman A.; Charles, Jeffrey R.; Biswas, Abhijit
2011-01-01
An innovation reported in Two-Camera Acquisition and Tracking of a Flying Target, NASA Tech Briefs, Vol. 32, No. 8 (August 2008), p. 20, used a commercial fish-eye lens and an electronic imaging camera for initially locating objects, with subsequent handover to an actuated narrow-field camera. But this operated against a dark-sky background. An improved solution involves an optical design based on custom optical components for the wide-field optical system that directly addresses the key limitations in acquiring a laser signal from a moving source such as an aircraft or a spacecraft. The first challenge was to increase the light-collection entrance aperture diameter, which was approximately 1 mm in the first prototype. The new design presented here increases this entrance aperture diameter to 4.2 mm, equivalent to a more than 16 times larger collection area. One of the trades made in realizing this improvement was to restrict the field of view to 80° in elevation and 360° in azimuth. This trade stems from practical considerations: laser beam propagation through the excessively high air mass in the line of sight (LOS) at low elevation angles results in vulnerability to severe atmospheric turbulence and attenuation. An additional benefit of the new design is that the large entrance aperture is maintained even at large off-axis angles when the optic is pointed at zenith. The second critical limitation, implementing spectral filtering in the design, was tackled by collimating the light prior to focusing it onto the focal plane. This allows the placement of the narrow spectral filter in the collimated portion of the beam. For the narrow-band spectral filter to function properly, it is necessary to adequately control the range of incident angles at which received light intercepts the filter. When this angle is restricted via collimation, narrower spectral filtering can be implemented. The collimated beam (and the filter) must be relatively large to reduce the incident angle down to only a few degrees. In the presented embodiment, the filter diameter is more than ten times larger than the entrance aperture; specifically, the filter has a clear aperture of about 51 mm. The optical design is refractive and comprises nine custom refractive elements and an interference filter. The restricted maximum angle through the narrow-band filter ensures the efficient use of an optical filter with a 2-nm noise-equivalent bandwidth at low elevation angles (where the range is longest), at the expense of lower efficiency at high elevations, which can be tolerated because the range at high elevation angles is shorter. The image circle is 12 mm in diameter, mapped to 80° × 360° of sky, centered on the zenith.
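Assuming an equidistant (f-theta) fisheye mapping, r = k·θ, the plate scale implied by the quoted image circle is easy to estimate; the mapping itself is an assumption, since the abstract does not state the design's distortion profile:

```python
radius_mm = 12.0 / 2       # quoted 12-mm image circle
max_zenith_deg = 80.0      # quoted radial sky coverage (80 deg x 360 deg)

# Assumed equidistant mapping: image height proportional to zenith angle.
k = radius_mm / max_zenith_deg   # ~0.075 mm per degree of zenith angle
print(k * 30.0)                  # image height of a source 30 deg off zenith
```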
Cai, Fuhong; Lu, Wen; Shi, Wuxiong; He, Sailing
2017-11-15
Spatially-explicit data are essential for remote sensing of ecological phenomena, and recent innovations in mobile device platforms have led to an upsurge in on-site rapid detection. For instance, CMOS chips in smartphones and digital cameras serve as excellent sensors for scientific research. In this paper, a mobile device-based imaging spectrometer module (weighing about 99 g) is developed and mounted on a single-lens reflex camera. Using this lightweight module, together with commonly used photographic equipment, we demonstrate its utility through a series of on-site multispectral imaging experiments, including ocean (or lake) water-color sensing and plant reflectance measurement. From the experiments we obtain 3D spectral image cubes, which can be further analyzed for environmental monitoring. Moreover, our system can be applied to many kinds of cameras, e.g., aerial and underwater cameras. Therefore, any camera can be upgraded to an imaging spectrometer with the help of our miniaturized module. We believe it has the potential to become a versatile tool for on-site investigation in many applications.
A Wide-Angle Camera for the Mobile Asteroid Surface Scout (MASCOT) on Hayabusa-2
NASA Astrophysics Data System (ADS)
Schmitz, N.; Koncz, A.; Jaumann, R.; Hoffmann, H.; Jobs, D.; Kachlicki, J.; Michaelis, H.; Mottola, S.; Pforte, B.; Schroeder, S.; Terzer, R.; Trauthan, F.; Tschentscher, M.; Weisse, S.; Ho, T.-M.; Biele, J.; Ulamec, S.; Broll, B.; Kruselburger, A.; Perez-Prieto, L.
2014-04-01
JAXA's Hayabusa-2 mission, an asteroid sample return mission, is scheduled for launch in December 2014, for a rendezvous with the C-type asteroid 1999 JU3 in 2018. MASCOT, the Mobile Asteroid Surface Scout [1], is a small lander designed to deliver ground truth for the orbiter's remote measurements, support the selection of sampling sites, and provide context for the returned samples. MASCOT's main objective is to investigate the landing site's geomorphology, the internal structure, texture and composition of the regolith (dust, soil and rocks), and the thermal, mechanical, and magnetic properties of the surface. MASCOT carries a payload of four scientific instruments: a camera, a radiometer, a magnetometer and a hyperspectral microscope. The camera (MASCOT CAM) was designed and built by DLR's Institute of Planetary Research, together with Airbus DS Germany.
7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...
7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Design and Development of a Low-Cost Aerial Mobile Mapping System for Multi-Purpose Applications
NASA Astrophysics Data System (ADS)
Acevedo Pardo, C.; Farjas Abadía, M.; Sternberg, H.
2015-08-01
The research project with the working title "Design and development of a low-cost modular Aerial Mobile Mapping System" took shape during the last year as the result of numerous discussions and considerations with colleagues from the HafenCity University Hamburg, Department Geomatics. The aim of the project is to design a sensor platform which can be mounted preferentially on a UAV, but can also be integrated on any adaptable vehicle. The system should perform direct scanning of surfaces with a laser scanner, supported by sensors for determining the position and attitude of the platform. The modular design allows its extension with other sensors such as multispectral cameras, digital cameras or multiple-camera systems.
Flooding in the Aftermath of Hurricane Katrina
NASA Technical Reports Server (NTRS)
2005-01-01
These views of the Louisiana and Mississippi regions were acquired before and one day after Katrina made landfall along the Gulf of Mexico coast, and highlight many of the changes to the rivers and vegetation that occurred between the two views. The images were acquired by NASA's Multi-angle Imaging SpectroRadiometer (MISR) on August 14 and August 30, 2005. These multiangular, multispectral false-color composites were created using red band data from MISR's 46° backward- and forward-viewing cameras, and near-infrared data from MISR's nadir camera. Such a display causes water bodies and inundated soil to appear in blue and purple hues, and highly vegetated areas to appear bright green. The scene differentiation is a result of both spectral effects (living vegetation is highly reflective at near-infrared wavelengths whereas water is absorbing) and of angular effects (wet surfaces preferentially forward scatter sunlight). The two images were processed identically and extend from the regions of Greenville, Mississippi (upper left) to Mobile Bay, Alabama (lower right). There are numerous inlets along the Mississippi coast that were not apparent in the pre-Katrina image; the most dramatic of these is a new inlet in the Pascagoula River. The post-Katrina flooding along the edges of Lake Pontchartrain and the city of New Orleans is also apparent. In addition, the agricultural lands along the Mississippi floodplain in the upper left exhibit stronger near-infrared brightness before Katrina. After Katrina, many of these agricultural areas exhibit a stronger signal to MISR's oblique cameras, indicating the presence of inundated soil throughout the floodplain. Note that clouds appear in a different spot for each view angle due to a parallax effect resulting from their height above the surface. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously, viewing the entire globe between 82° north and 82° south latitude every nine days. Each image covers an area of about 380 kilometers by 410 kilometers. The data products were generated from a portion of the imagery acquired during Terra orbits 30091 and 30324 and utilize data from blocks 64-67 within World Reference System-2 path 22. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Science Mission Directorate, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is managed for NASA by the California Institute of Technology.
NASA Technical Reports Server (NTRS)
2001-01-01
Surface brightness contrasts accentuated by a thin layer of snow enable a network of rivers, roads, and farmland boundaries to stand out clearly in these MISR images of southeastern Saskatchewan and southwestern Manitoba. The lefthand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The righthand image is a multi-angle false-color view made from the red band data of the 60-degree aftward camera, the nadir camera, and the 60-degree forward camera. In each image, the selected channels are displayed as red, green, and blue, respectively. The data were acquired April 17, 2001 during Terra orbit 7083, and cover an area measuring about 285 kilometers x 400 kilometers. North is at the top.
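As a sketch of how such a multi-angle false-color composite is assembled, the scheme amounts to stacking three co-registered band/angle images into the RGB display channels, per the assignment described above (array names are hypothetical, and the operational MISR product generation is more involved):

```python
import numpy as np

def false_color(r_src, g_src, b_src):
    """Stack three co-registered grayscale images into an RGB composite,
    with a simple 2-98 percentile contrast stretch per channel."""
    def stretch(band):
        lo, hi = np.percentile(band, (2, 98))
        return np.clip((band - lo) / (hi - lo), 0.0, 1.0)
    return np.dstack([stretch(r_src), stretch(g_src), stretch(b_src)])

# Multi-angle composite described above: 60-degree aftward red -> R,
# nadir red -> G, 60-degree forward red -> B.
# composite = false_color(red_aft60, red_nadir, red_fwd60)
```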
The junction of the Assiniboine and Qu'Appelle Rivers in the bottom part of the images is just east of the Saskatchewan-Manitoba border. During the growing season, the rich, fertile soils in this area support numerous fields of wheat, canola, barley, flaxseed, and rye. Beef cattle are raised in fenced pastures. To the north, the terrain becomes more rocky and forested. Many frozen lakes are visible as white patches in the top right. The narrow, linear, north-south trending patterns about a third of the way down from the upper right corner are snow-filled depressions alternating with vegetated ridges, most probably carved by glacial flow. In the lefthand image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the righthand image, several forested regions are clearly visible in green hues. Since this is a multi-angle composite, the green arises not from the color of the leaves but from the architecture of the surface cover. Progressing southeastward along the Manitoba Escarpment, the forested areas include the Pasquia Hills, the Porcupine Hills, Duck Mountain Provincial Park, and Riding Mountain National Park. The forests are brighter in the nadir than at the oblique angles, probably because more of the snow-covered surface is visible in the gaps between the trees. In contrast, the valley between the Pasquia and Porcupine Hills near the top of the images appears bright red in the lefthand image (indicating high vegetation abundance) but shows a mauve color in the multi-angle view. This means that it is darker in the nadir than at the oblique angles. Examination of imagery acquired after the snow has melted should establish whether this difference is related to the amount of snow on the surface or is indicative of a different type of vegetation structure. Saskatchewan and Manitoba are believed to derive their names from the Cree words for the winding and swift-flowing waters of the Saskatchewan River and for a narrows on Lake Manitoba where the roaring sound of wind and water evoked the voice of the Great Spirit. They are two of Canada's Prairie Provinces; Alberta is the third. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
NASA Astrophysics Data System (ADS)
Johnson, J. R.; Bell, J. F., III; Hayes, A.; Deen, R. G.; Godber, A.; Arvidson, R. E.; Lemmon, M. T.
2015-12-01
The Mastcam imaging system on the Curiosity rover continued acquisition of multispectral images of the same terrain at multiple times of day at three new rover locations between Sols 872 and 1003. These data sets will be used to investigate the light scattering properties of rocks and soils along the Curiosity traverse using radiative transfer models. Images were acquired by the Mastcam-34 (M-34) camera on Sols 872-892 at 8 times of day (Mojave drill location), Sols 914-917 (Telegraph Peak drill location) at 9 times of day, and Sols 1000-1003 at 8 times of day (Stimson-Murray Formation contact near Marias Pass). Data sets were acquired using filters centered at 445, 527, 751, and 1012 nm, and the images were JPEG-compressed. Data sets typically were pointed ~east and ~west to provide phase angle coverage from near 0° to 125-140° for a variety of rocks and soils. Also acquired on Sols 917-918 at the Telegraph Peak site was a multiple time-of-day Mastcam sequence pointed southeast using only the broadband Bayer filters, which provided losslessly compressed images with phase angles of ~55-129°. Navcam stereo images were also acquired with each data set to provide broadband photometry and terrain measurements for computing surface normals and local incidence and emission angles used in photometric modeling. On Sol 1028, the MAHLI camera was used as a goniometer to acquire images at 20 arm positions, all centered at the same location within the work volume from a near-constant distance of 85 cm from the surface. Although this experiment was run at only one time of day (~15:30 LTST), it provided phase angle coverage from ~30° to ~111°. The terrain included the contact between the uppermost portion of the Murray Formation and the Stimson sandstones, and this was the first acquisition of both Mastcam and MAHLI photometry images at the same rover location. The MAHLI images also allowed construction of a 3D shape model of the Stimson-Murray contact region. The attached figure shows a phase color composite of the western Stimson area, created using phase angles of 8°, 78°, and 130° at 751 nm. The red areas correspond to highly backscattering materials that appear to concentrate along linear fractures throughout this area. The blue areas correspond to more forward scattering materials dispersed through the stratigraphic sequence.
Constraining Aerosol Properties with the Spectrally-Resolved Phase Function of Pluto's Hazes
NASA Astrophysics Data System (ADS)
Parker, A. H.; Howett, C.; Olkin, C.; Protopapa, S.; Grundy, W. M.; Gladstone, R.; Young, L. A.; Horst, S. M.; Weaver, H. A., Jr.; Moore, J. M.; Ennico Smith, K.; Stern, A.
2017-12-01
The Multi-spectral Visible Imaging Camera (MVIC) and Lisa Hardaway Infrared Mapping Spectrometer (LEISA) aboard New Horizons imaged Pluto at high phase throughout departure from the system in July of 2015. The repeated MVIC color scans captured the phase behavior of Pluto's atmospheric hazes through phase angles of 165.0 to 169.5 degrees in four bandpasses in the visible and NIR. A spatially-resolved departure LEISA scan delivered moderate SNR NIR spectra of the hazes over wavelengths from 1.25 - 2.5 microns. Here we present our analysis of the departure MVIC and LEISA data, extracting high precision color phase curves of the hazes using the most up-to-date radiometric calibration and NIR gain drift corrections. We interpret these phase curves and spectra using Mie theory to constrain the size and composition of haze particles, with results indicating broad similarity to Titan aerosol analogues ("tholins"). Finally, we will explore the implications of the nature of these haze particles for the evolution of Pluto's surface as they settle out onto it over time.
NASA Astrophysics Data System (ADS)
Liu, Yongfeng; Zhang, You-tong; Gou, Chenhua; Tian, Hongsen
2008-12-01
Temperature laser-induced fluorescence (LIF) 2-D imaging measurements using a new multi-spectral detection strategy are reported for high-pressure flames in a high-speed diesel engine. A schematic of the experimental set-up is outlined and the experimental data on the diesel engine are summarized. The injection system is a third-generation Bosch high-pressure common rail with a maximum pressure of 160 MPa. The injector is equipped with a six-hole nozzle, where each hole has a diameter of 0.124 mm and is slightly offset (by 1.0 mm) from the center of the cylinder axis to allow better cooling of the narrow bridge between the exhaust valves. The measurement system includes a blower, which supplies the intake flow, and a prototype single-valve direct-injection diesel engine head modified to accommodate the swirl-type injector. 14-bit digital CCD cameras are employed to achieve a greater level of accuracy in comparison with previous measurements. Temperature field spatial distributions in the cylinder for different crank angle degrees are obtained in a single direct-injection diesel engine.
2D temperature field measurement in a direct-injection engine using LIF technology
NASA Astrophysics Data System (ADS)
Liu, Yongfeng; Tian, Hongsen; Yang, Jianwei; Sun, Jianmin; Zhu, Aihua
2011-12-01
A new multi-spectral detection strategy for temperature laser-induced fluorescence (LIF) 2-D imaging measurements is reported for high-pressure flames in a high-speed diesel engine. A schematic of the experimental set-up is outlined and the experimental data on the diesel engine are summarized. The injection system is a third-generation Bosch high-pressure common rail with a maximum pressure of 160 MPa. The injector is equipped with a six-hole nozzle, where each hole has a diameter of 0.124 mm and is slightly offset from the center of the cylinder axis to allow better cooling of the narrow bridge between the exhaust valves. The measurement system includes a blower, which supplies the intake flow, and a prototype single-valve direct-injection diesel engine head modified to accommodate the swirl-type injector. 14-bit digital CCD cameras are employed to achieve a greater level of accuracy in comparison with previous measurements. Temperature field spatial distributions in the cylinder for different crank angle degrees are obtained in a single direct-injection diesel engine.
NASA Astrophysics Data System (ADS)
Ghionis, George; Trygonis, Vassilis; Karydis, Antonis; Vousdoukas, Michalis; Alexandrakis, George; Drakopoulos, Panos; Andreadis, Olympos; Psarros, Fotis; Velegrakis, Antonis; Poulos, Serafim
2016-04-01
Effective beach management requires environmental assessments that are based on sound science, are cost-effective and are available to beach users and managers in an accessible, timely and transparent manner. The most common problems are: 1) The available field data are scarce and of sub-optimal spatio-temporal resolution and coverage, 2) our understanding of local beach processes needs to be improved in order to accurately model/forecast beach dynamics under a changing climate, and 3) the information provided by coastal scientists/engineers in the form of data, models and scientific interpretation is often too complicated to be of direct use by coastal managers/decision makers. A multispectral video system has been developed, consisting of one or more video cameras operating in the visible part of the spectrum, a passive near-infrared (NIR) camera, an active NIR camera system, a thermal infrared camera and a spherical video camera, coupled with innovative image processing algorithms and a telemetric system for the monitoring of coastal environmental parameters. The complete system has the capability to record, process and communicate (in quasi-real time) high frequency information on shoreline position, wave breaking zones, wave run-up, erosion hot spots along the shoreline, nearshore wave height, turbidity, underwater visibility, wind speed and direction, air and sea temperature, solar radiation, UV radiation, relative humidity, barometric pressure and rainfall. An innovative, remotely-controlled interactive visual monitoring system, based on the spherical video camera (with 360°field of view), combines the video streams from all cameras and can be used by beach managers to monitor (in real time) beach user numbers, flow activities and safety at beaches of high touristic value. The high resolution near infrared cameras permit 24-hour monitoring of beach processes, while the thermal camera provides information on beach sediment temperature and moisture, can detect upwelling in the nearshore zone, and enhances the safety of beach users. All data can be presented in real- or quasi-real time and are stored for future analysis and training/validation of coastal processes models. Acknowledgements: This work was supported by the project BEACHTOUR (11SYN-8-1466) of the Operational Program "Cooperation 2011, Competitiveness and Entrepreneurship", co-funded by the European Regional Development Fund and the Greek Ministry of Education and Religious Affairs.
Sky brightness and color measurements during the 21 August 2017 total solar eclipse.
Bruns, Donald G; Bruns, Ronald D
2018-06-01
The sky brightness was measured during the partial phases and during totality of the 21 August 2017 total solar eclipse. A tracking CCD camera with color filters and a wide-angle lens allowed measurements across a wide field of view, recording images every 10 s. The partially and totally eclipsed Sun was kept behind an occulting disk attached to the camera, allowing direct brightness measurements from 1.5° to 38° from the Sun. During the partial phases, the sky brightness as a function of time closely followed the integrated intensity of the unobscured fraction of the solar disk. A redder sky was measured close to the Sun just before totality, caused by the redder color of the exposed solar limb. During totality, a bluer sky was measured, dimmer than the normal sky by a factor of 10,000. Suggestions for enhanced measurements at future eclipses are offered.
Towards a Single Sensor Passive Solution for Automated Fall Detection
Belshaw, Michael; Taati, Babak; Snoek, Jasper; Mihailidis, Alex
2012-01-01
Falling in the home is one of the major challenges to independent living among older adults. The associated costs, coupled with a rapidly growing elderly population, are placing a burden on healthcare systems worldwide that will swiftly become unbearable. To facilitate expeditious emergency care, we have developed an artificially intelligent camera-based system that automatically detects if a person within the field-of-view has fallen. The system addresses concerns raised in earlier work and the requirements of a widely deployable in-home solution. The presented prototype utilizes a consumer-grade camera modified with a wide-angle lens. Machine learning techniques applied to carefully engineered features allow the system to classify falls at high accuracy while maintaining invariance to lighting, environment and the presence of multiple moving objects. This paper describes the system, outlines the algorithms used and presents empirical validation of its effectiveness. PMID:22254671
Mars Global Surveyor: 7 Years in Orbit!
NASA Technical Reports Server (NTRS)
2004-01-01
12 September 2004 Today, 12 September 2004, the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) team celebrates 7 Earth years orbiting Mars. MGS first reached the red planet and performed its critical orbit insertion burn on 12 September 1997. Over the past 7 years, MOC has returned over 170,000 images; its narrow angle camera has covered about 4.5% of the surface, and its wide angle cameras have viewed 100% of the planet nearly every day. At this time, MOC is not acquiring data because Mars is on the other side of the Sun relative to Earth. This period, known as Solar Conjunction, occurs about once every 26 months. During Solar Conjunction, no radio communications from spacecraft that are orbiting or have landed on Mars can be received. MOC was turned off on 7 September and is expected to resume operations on 25 September 2004, when Mars re-emerges from behind the Sun. The rotating color image of Mars shown here was compiled from MOC red and blue wide angle daily global images acquired exactly 1 Mars year ago on 26 October 2002 (Ls 86.4°). In other words, Mars today (12 September 2004) should look about the same as the view provided here. Presently, Mars is in very late northern spring, and the north polar cap has retreated almost to its summer configuration. Water ice clouds form each afternoon at this time of year over the large volcanoes in the Tharsis and Elysium regions. A discontinuous belt of clouds forms over the martian equator; it is most prominent north of the Valles Marineris trough system. In the southern hemisphere, it is late autumn and the giant Hellas Basin floor is nearly white with seasonal frost cover. The south polar cap is not visible; it is enveloped in seasonal darkness. The northern summer and southern winter seasons will begin on 20 September 2004.
6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...
6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Wageningen UR Unmanned Aerial Remote Sensing Facility - Overview of activities
NASA Astrophysics Data System (ADS)
Bartholomeus, Harm; Keesstra, Saskia; Kooistra, Lammert; Suomalainen, Juha; Mucher, Sander; Kramer, Henk; Franke, Jappe
2016-04-01
To support environmental management there is an increasing need for timely, accurate and detailed information on our land. Unmanned Aerial Systems (UAS) are increasingly used to monitor agricultural crop development, habitat quality or urban heat efficiency. An important reason is that UAS technology is maturing quickly, while the flexible capabilities of UAS fill a gap between satellite-based and ground-based geo-sensing systems. In 2012, different groups within Wageningen University and Research Centre established an Unmanned Aerial Remote Sensing Facility. The objective of this facility is threefold: a) to develop innovation in the field of remote sensing science by providing a platform for dedicated and high-quality experiments; b) to support high-quality UAS services by providing calibration facilities and disseminating processing procedures to the UAS user community; and c) to promote and test the use of UAS in a broad range of application fields like habitat monitoring, precision agriculture and land degradation assessment. The facility is hosted by the Laboratory of Geo-Information Science and Remote Sensing (GRS) and the Department of Soil Physics and Land Management (SLM) of Wageningen University together with the team Earth Informatics (EI) of Alterra. The added value of the Unmanned Aerial Remote Sensing Facility is that, compared to for example satellite-based remote sensing, more dedicated science experiments can be prepared. This includes, for example, more frequent observations in time (e.g., diurnal observations), observations of an object under different observation angles for characterization of BRDF, and flexibility in the use of camera and sensor types. In this way, laboratory-type setups can be tested in a field situation and effects of up-scaling can be tested. In recent years we developed and implemented different camera systems (e.g., a hyperspectral pushbroom system and multispectral frame cameras) which we operated in projects all around the world, while new sensor systems, such as LiDAR and a full-frame hyperspectral camera, are planned. In the presentation we will give an overview of our activities, ranging from erosion studies, decision support for precision agriculture, and determining leaf biochemistry and canopy structure in tropical forests, to the mapping of coastal zones.
Bautista, Pinky A; Yagi, Yukako
2011-01-01
In this paper we introduce a digital staining method for histopathology images captured with an n-band multispectral camera. The method consists of two major processes: enhancement of the original spectral transmittance and transformation of the enhanced transmittance to its target spectral configuration. Enhancement is accomplished by shifting the original transmittance by the scaled difference between the original transmittance and the transmittance estimated with m dominant principal component (PC) vectors; the m PC vectors are determined from the transmittance samples of the background image. Transformation of the enhanced transmittance to the target spectral configuration is done using an n×n transformation matrix, derived by applying a least squares method to the enhanced and target spectral training data samples of the different tissue components. Experimental results on the digital conversion of a hematoxylin and eosin (H&E) stained multispectral image to its Masson's trichrome stained (MT) equivalent show the viability of the method.
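A compact sketch of the two-step pipeline just described, in NumPy; the PC basis is taken from background-pixel transmittance samples, and all array names, shapes and the scaling constant k are illustrative assumptions rather than the authors' implementation:

```python
import numpy as np

def enhance_transmittance(T, background, m, k=1.0):
    """Shift each n-band transmittance in T (pixels x n) by the scaled
    difference between T and its reconstruction from the m dominant
    principal components of the background samples."""
    centered = background - background.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    P = Vt[:m].T                 # (n, m) PC basis from the background
    T_hat = (T @ P) @ P.T        # transmittance estimated with m PCs
    return T + k * (T - T_hat)

def digital_stain_matrix(T_enhanced_train, T_target_train):
    """n x n matrix mapping enhanced (e.g., H&E) spectra to target
    (e.g., Masson's trichrome) spectra, fitted by least squares."""
    W, *_ = np.linalg.lstsq(T_enhanced_train, T_target_train, rcond=None)
    return W

# usage: T_mt = enhance_transmittance(T_he, background, m=3) @ W
```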
Spectral imaging spreads into new industrial and on-field applications
NASA Astrophysics Data System (ADS)
Bouyé, Clémentine; Robin, Thierry; d'Humières, Benoît
2018-02-01
Numerous recent innovative developments have greatly reduced the cost and size of hyperspectral and multispectral cameras. The resulting products - compact, reliable, low-cost, easy-to-use - meet end-user requirements in major fields: agriculture, food and beverages, pharmaceutics, machine vision, and health. The boom of this technology in industrial and on-field applications is getting closer. Indeed, the spectral imaging market is at a turning point: a high growth rate of 20% is expected over the next 5 years, and the number of cameras sold will increase from 3,600 in 2017 to more than 9,000 in 2022.
Image quality prediction: an aid to the Viking Lander imaging investigation on Mars.
Huck, F O; Wall, S D
1976-07-01
Two Viking spacecraft scheduled to land on Mars in the summer of 1976 will return multispectral panoramas of the Martian surface with resolutions 4 orders of magnitude higher than have been previously obtained and stereo views with resolutions approaching that of the human eye. Mission constraints and uncertainties require a carefully planned imaging investigation that is supported by a computer model of camera response and surface features to aid in diagnosing camera performance, in establishing a preflight imaging strategy, and in rapidly revising this strategy if pictures returned from Mars reveal unfavorable or unanticipated conditions.
Lunar UV-visible-IR mapping interferometric spectrometer
NASA Technical Reports Server (NTRS)
Smith, W. Hayden; Haskin, L.; Korotev, R.; Arvidson, R.; Mckinnon, W.; Hapke, B.; Larson, S.; Lucey, P.
1992-01-01
Ultraviolet-visible-infrared mapping digital array scanned interferometers for lunar compositional surveys were developed. The research has defined a no-moving-parts, low-weight, low-power, high-throughput, and electronically adaptable digital array scanned interferometer that achieves measurement objectives encompassing and improving upon all the requirements defined by the LEXSWIG for lunar mineralogical investigation. In addition, LUMIS provides a new, important ultraviolet spectral mapping capability, a high-spatial-resolution line-scan camera, and multispectral camera capabilities. An instrument configuration optimized for spectral mapping and imaging of the lunar surface is described, along with spectral results in support of the instrument design.
NASA Technical Reports Server (NTRS)
1989-01-01
This pair of Voyager 2 images (FDS 11446.21 and 11448.10), two 591-s exposures obtained through the clear filter of the wide angle camera, show the full ring system with the highest sensitivity. Visible in this figure are the bright, narrow N53 and N63 rings, the diffuse N42 ring, and (faintly) the plateau outside of the N53 ring (with its slight brightening near 57,500 km).
Costless Platform for High Resolution Stereoscopic Images of a High Gothic Facade
NASA Astrophysics Data System (ADS)
Héno, R.; Chandelier, L.; Schelstraete, D.
2012-07-01
In October 2011, the PPMD specialized master's degree students (Photogrammetry, Positioning and Deformation Measurement) of the French ENSG (IGN's School of Geomatics, the Ecole Nationale des Sciences Géographiques) were asked to survey the main facade of the cathedral of Amiens, which is very complex in both size and decoration. Although it was first planned to use a lift truck for the image survey, budget considerations and a taste for experimentation led the project in another direction: images shot from ground level with a long-focal-length camera would be combined with complementary images shot from the higher galleries available on the main facade, using a wide-angle camera fixed on a horizontal 2.5-meter-long pole. This heterogeneous image survey is being processed by the PPMD master's degree students during this academic year. Among other types of products, 3D point clouds will be calculated for specific parts of the facade from both sources of images. If the proposed device and methodology for obtaining full image coverage of the main facade prove fruitful, the image acquisition phase will be completed later by another team. This article focuses on the production of 3D point clouds from wide-angle images of the rose window of the main facade.
Visual field information in Nap-of-the-Earth flight by teleoperated Helmet-Mounted displays
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Kohn, S.; Merhav, S. J.
1991-01-01
The human ability to derive Control-Oriented Visual Field Information from teleoperated Helmet-Mounted Displays in Nap-of-the-Earth flight is investigated. The visual field with these types of displays originates from a Forward-Looking Infrared camera, gimbal-mounted at the front of the aircraft and slaved to the pilot's line of sight to obtain wide-angle visual coverage. Although these displays have proved effective in Apache and Cobra helicopter night operations, they demand very high pilot proficiency and impose a heavy workload. Experimental work presented in the paper has shown that part of the difficulty encountered in vehicular control by means of these displays can be attributed to the narrow viewing aperture and to head/camera slaving-system phase lags. Both of these shortcomings impair visuo-vestibular coordination when voluntary head rotation is present. This can result in errors in estimating the Control-Oriented Visual Field Information vital to vehicular control, such as the vehicle yaw rate or the anticipated flight path, or can even lead to visuo-vestibular conflicts (motion sickness). Since, under these conditions, the pilot tends to minimize head rotation, the full wide-angle coverage of the Helmet-Mounted Display, provided by the line-of-sight slaving system, is not always fully utilized.
2009-08-01
habitat analysis because of the high horizontal error between the mosaicked image tiles. The imagery was collected with a non-metric camera and likewise... possible with true color imagery (digital orthophotos) or multispectral imagery, but usually comes at a much higher cost. Due to its availability and
2015-10-15
This high-resolution image captured by NASA's New Horizons spacecraft combines blue, red and infrared images taken by the Ralph/Multispectral Visible Imaging Camera (MVIC). The bright expanse is the western lobe of the "heart," informally called Sputnik Planum, which has been found to be rich in nitrogen, carbon monoxide and methane ices. http://photojournal.jpl.nasa.gov/catalog/PIA20007
Calibration of Action Cameras for Photogrammetric Purposes
Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo
2014-01-01
The use of action cameras for photogrammetry purposes is not widespread, due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of action cameras, we must apply a careful and reliable self-calibration prior to any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video capture modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
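A minimal self-calibration sketch in the spirit of the paper, using standard OpenCV calls; the chessboard size, file paths and the default pinhole distortion model are assumptions (the authors' software may differ, e.g., a fisheye model can suit strong wide-angle distortion better):

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)  # inner corners of an assumed chessboard target
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts, size = [], [], None
for fname in glob.glob("calib/*.jpg"):     # hypothetical calibration frames
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Estimate intrinsics and distortion, then undistort a captured scene.
rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
scene = cv2.imread("scene.jpg")            # hypothetical GoPro still
undistorted = cv2.undistort(scene, K, dist)
```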
Gyrocopter-Based Remote Sensing Platform
NASA Astrophysics Data System (ADS)
Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.
2015-04-01
In this paper the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications, utilizing a gyrocopter as a carrier platform, is described. The current sensor configuration consists of a high-resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover, a custom-developed thermal imaging system, composed of a VIS-PAN camera and a LWIR camera, is used for aerial recordings in the thermal infrared range. Furthermore, another custom-developed, highly flexible imaging system for high-resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities, and the precalculated demands with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched into mosaics.
Strong, Conor J; Burnside, Niall G; Llewellyn, Dan
2017-01-01
The loss of unimproved grassland has led to species decline in a wide range of taxonomic groups. Agricultural intensification has resulted in fragmented patches of remnant grassland habitat both across Europe and internationally. The monitoring of remnant patches of this habitat is critically important; however, traditional surveying of large, remote landscapes is a notoriously costly and difficult task. The emergence of small Unmanned Aircraft Systems (sUAS) equipped with low-cost multi-spectral cameras offers an alternative to traditional grassland survey methods, with the potential to progress and innovate the monitoring and future conservation of this habitat globally. The aim of this article is to investigate the potential of sUAS for rapid detection of threatened unimproved grassland and to test the use of an Enhanced Normalized Difference Vegetation Index (ENDVI). A sUAS aerial survey was undertaken at a site nationally recognised as an important location for fragmented unimproved mesotrophic grassland in the south east of England, UK. A multispectral camera was used to capture imagery in the visible and near-infrared spectra, and the ENDVI was calculated and its discrimination performance compared with a range of more traditional vegetation indices. To validate the results of the analysis, ground quadrat surveys were carried out to determine the grassland communities present. Quadrat surveys identified three community types within the site: unimproved grassland, improved grassland and rush pasture. All six vegetation indices tested were able to distinguish between the broad habitat types of grassland and rush pasture, whilst only three could differentiate vegetation at the community level. The ENDVI was the most effective index when differentiating grasslands at the community level. The mechanisms behind the improved performance of the ENDVI are discussed and recommendations are made for areas of future research and study. PMID:29023504
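For reference, a sketch of the index computation; the ENDVI form below is the one commonly used with near-infrared/green/blue imagery from modified cameras, and the paper's exact formulation may differ:

```python
import numpy as np

def endvi(nir, green, blue):
    """ENDVI = ((NIR + G) - 2B) / ((NIR + G) + 2B), computed elementwise
    on co-registered float band arrays; zero where the denominator vanishes."""
    num = (nir + green) - 2.0 * blue
    den = (nir + green) + 2.0 * blue
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)
```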
Application of Sensor Fusion to Improve Uav Image Classification
NASA Astrophysics Data System (ADS)
Jabari, S.; Fathollahi, F.; Zhang, Y.
2017-08-01
Image classification is one of the most important tasks of remote sensing projects, including those based on UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort. In this study, we show that sensor fusion can improve image quality, which in turn increases the accuracy of image classification. Here, we tested two sensor fusion configurations, using a panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to those acquired by a high-resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The results show that the proposed sensor fusion configurations achieve higher accuracies than the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on board in UAV missions and performing image fusion can help achieve higher-quality images and accordingly more accurate classification results.
2016-03-17
This enhanced color view of Pluto's surface diversity was created by merging Ralph/Multispectral Visible Imaging Camera (MVIC) color imagery (650 meters per pixel) with Long Range Reconnaissance Imager panchromatic imagery (230 meters per pixel). At lower right, ancient, heavily cratered terrain is coated with dark, reddish tholins. At upper right, volatile ices filling the informally named Sputnik Planum have modified the surface, creating a chaos-like array of blocky mountains. Volatile ice also occupies a few nearby deep craters, and in some areas the volatile ice is pocked with arrays of small sublimation pits. At left, and across the bottom of the scene, gray-white CH4 ice deposits modify tectonic ridges, the rims of craters, and north-facing slopes. The scene in this image is 260 miles (420 kilometers) wide and 140 miles (225 kilometers) from top to bottom; north is to the upper left. http://photojournal.jpl.nasa.gov/catalog/PIA20534
Radar and infrared remote sensing of terrain, water resources, arctic sea ice, and agriculture
NASA Technical Reports Server (NTRS)
Biggs, A. W.
1983-01-01
Radar range measurements, basic waveforms of radar systems, and radar displays are initially described. These are followed by backscatter from several types of terrain and vegetation as a function of frequency and grazing angle. Analytical models for this backscatter include facet models of radar return, with range-angle, velocity-range, velocity-angle, range-only, velocity-only, and angle-only discriminations. Several side-looking airborne radar geometries are presented. Radar images of Arctic sea ice, freshwater lake ice, cloud-covered terrain, and related scenes are presented to identify applications of radar imagery. Volume scatter models are applied to radar imagery from alpine snowfields. Short-pulse ice-thickness radar for subsurface probing of freshwater ice and sea ice is discussed. Infrared scanners, including multispectral ones, are described. Diffusion of cold water into a river, Arctic sea ice, power plant discharges, volcanic heat, and related subjects are presented in thermal imagery. Multispectral radar and infrared imagery are discussed, with comparisons of photographic, infrared, and radar imagery of the same terrain or subjects.
A single camera photogrammetry system for multi-angle fast localization of EEG electrodes.
Qian, Shuo; Sheng, Yang
2011-11-01
Photogrammetry has become an effective method for determining electroencephalography (EEG) electrode positions in three dimensions (3D). Capturing multi-angle images of the electrodes on the head is a fundamental objective in the design of a photogrammetry system for EEG localization. Methods in previous studies are all based on the use of either a rotating camera or multiple cameras, which are time-consuming or not cost-effective. This study presents a novel photogrammetry system that can acquire multi-angle head images simultaneously from a single camera position. By aligning two planar mirrors at an angle of 51.4°, seven views of the head with 25 electrodes are captured simultaneously by a digital camera placed in front of them. A complete set of algorithms for electrode recognition, matching, and 3D reconstruction is developed. The elapsed time of the whole localization procedure is about 3 min, and the camera calibration computation takes about 1 min after the measurement of calibration points. The positioning accuracy, with a maximum error of 1.19 mm, is acceptable. Experimental results demonstrate that the proposed system provides a fast and cost-effective method for EEG positioning.
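The mirror angle follows from elementary two-mirror geometry: mirrors inclined at angle θ produce 360°/θ − 1 reflected images, so

$$N_{\mathrm{views}} = \left(\frac{360^{\circ}}{51.4^{\circ}} - 1\right) + 1 \approx 7,$$

that is, six mirror images plus the direct view yield the seven simultaneous views of the head.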
Mitigating fluorescence spectral overlap in wide-field endoscopic imaging
Hou, Vivian; Nelson, Leonard Y.; Seibel, Eric J.
2013-01-01
The number of molecular species suitable for multispectral fluorescence imaging is limited due to the overlap of the emission spectra of indicator fluorophores, e.g., dyes and nanoparticles. To remove fluorophore emission cross-talk in wide-field multispectral fluorescence molecular imaging, we evaluate three different solutions: (1) image stitching, (2) concurrent imaging with cross-talk ratio subtraction algorithm, and (3) frame-sequential imaging. A phantom with fluorophore emission cross-talk is fabricated, and a 1.2-mm ultrathin scanning fiber endoscope (SFE) is used to test and compare these approaches. Results show that fluorophore emission cross-talk could be successfully avoided or significantly reduced. Near term, the concurrent imaging method of wide-field multispectral fluorescence SFE is viable for early stage cancer detection and localization in vivo. Furthermore, a means to enhance exogenous fluorescence target-to-background ratio by the reduction of tissue autofluorescence background is demonstrated. PMID:23966226
7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...
7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...
2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Smartphone-Guided Needle Angle Selection During CT-Guided Procedures.
Xu, Sheng; Krishnasamy, Venkatesh; Levy, Elliot; Li, Ming; Tse, Zion Tsz Ho; Wood, Bradford John
2018-01-01
In CT-guided intervention, translation from a planned needle insertion angle to the actual insertion angle is estimated using only the physician's visuospatial abilities. An iPhone app was developed to reduce reliance on the operator's ability to estimate and reproduce angles. The app overlays the planned angle on the smartphone's camera display in real time based on the smartphone's orientation. The needle's angle is selected by visually comparing the actual needle with the guideline in the display. If the smartphone's screen is perpendicular to the planned path, the app shows the Bull's-Eye View mode, in which the angle is selected by bringing the needle's hub into overlap with its tip in the camera view. In phantom studies, we evaluated the accuracies of the hardware, the Guideline mode, and the Bull's-Eye View mode and showed the app's clinical efficacy. A proof-of-concept clinical case was also performed. The hardware accuracy was 0.37° ± 0.27° (mean ± SD). The mean error and navigation time were 1.0° ± 0.9° and 8.7 ± 2.3 seconds for a senior radiologist with 25 years' experience, and 1.5° ± 1.3° and 8.0 ± 1.6 seconds for a junior radiologist with 4 years' experience. The accuracy of the Bull's-Eye View mode was 2.9° ± 1.1°. Combined CT and smartphone guidance was significantly more accurate than CT-only guidance for the first needle pass (p = 0.046), which led to a smaller final targeting error (mean distance from needle tip to target, 2.5 vs 7.9 mm). Mobile devices can be useful for guiding needle-based interventions. The hardware is low cost and widely available, and the method is accurate, effective, and easy to implement.
An ordinary camera in an extraordinary location: Outreach with the Mars Webcam
NASA Astrophysics Data System (ADS)
Ormston, T.; Denis, M.; Scuka, D.; Griebel, H.
2011-09-01
The European Space Agency's Mars Express mission was launched in 2003 and was Europe's first mission to Mars. On board was a small camera designed to provide 'visual telemetry' of the separation of the Beagle-2 lander. After achieving its goal it was shut down while the primary science mission of Mars Express got underway. In 2007 this camera was reactivated by the flight control team of Mars Express for the purpose of providing public education and outreach, turning it into the 'Mars Webcam'. The camera is a small, 640×480 pixel colour CMOS camera with a wide-angle 30°×40° field of view. This makes it very similar in almost every way to the average home PC webcam. The major difference is that this webcam is not in an average location but is instead in orbit around Mars. On a strict basis of non-interference with the primary science activities, the camera is turned on to provide unique wide-angle views of the planet below. A highly automated process ensures that the observations are scheduled on the spacecraft and then uploaded to the internet as rapidly as possible. There is no intermediate stage, so that visitors to the Mars Webcam blog serve as 'citizen scientists'. Full raw datasets and processing instructions are provided along with a mechanism to allow visitors to comment on the blog. Members of the public are encouraged to use this in either a personal or an educational context and work with the images. We then take their excellent work and showcase it back on the blog. We even apply techniques developed by them to improve the data and webcam experience for others. The accessibility and simplicity of the images also make the data ideal for educational use, especially as educational projects can then be showcased on the site as inspiration for others. The oft-neglected target audience of space enthusiasts is also important, as this allows them to participate as part of an interplanetary instrument team. This paper will cover the history of the project and the technical background behind using the camera and linking the results to an accessible blog format. It will also cover the outreach successes of the project, some of the contributions from the Mars Webcam community, opportunities to use and work with the Mars Webcam, and plans for future uses of the camera.
Design and fabrication of multispectral optics using expanded glass map
NASA Astrophysics Data System (ADS)
Bayya, Shyam; Gibson, Daniel; Nguyen, Vinh; Sanghera, Jasbinder; Kotov, Mikhail; Drake, Gryphon; Deegan, John; Lindberg, George
2015-06-01
As the desire for compact multispectral imagers in various DoD platforms grows, the dearth of multispectral optics is widely felt. With the limited number of material choices for optics, these multispectral imagers are often very bulky and impractical on many weight-sensitive platforms. To address this issue, NRL has developed a large set of unique infrared glasses that transmit from 0.9 to >14 μm in wavelength and expand the glass map for multispectral optics with refractive indices from 2.38 to 3.17. They show a large spread in dispersion (Abbe number) and offer some unique solutions for multispectral optics designs. The new NRL glasses can be easily molded and also fused together to make bonded doublets. A Zemax-compatible glass file has been created and is available upon request. In this paper we present some designs, optics fabrication and imaging, all using NRL materials.
Design framework for a spectral mask for a plenoptic camera
NASA Astrophysics Data System (ADS)
Berkner, Kathrin; Shroff, Sapna A.
2012-01-01
Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield. Such camera designs, capturing directional ray information, enable applications such as digital refocusing, rotation, or depth estimation. Only a few address capturing spectral information of the scene. It has been demonstrated that by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the pupil plane of the main lens, sampling of the spectral dimension of the plenoptic function is performed. As a result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial against spectral information captured with a single sensor. Little work has been performed so far on analyzing the effects of diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper we demonstrate simulation of a spectrally-coded plenoptic camera optical system via wave propagation analysis, evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for optimization of the spectral mask for a few sample applications.
Dual-emissive quantum dots for multispectral intraoperative fluorescence imaging.
Chin, Patrick T K; Buckle, Tessa; Aguirre de Miguel, Arantxa; Meskers, Stefan C J; Janssen, René A J; van Leeuwen, Fijs W B
2010-09-01
Fluorescence molecular imaging is rapidly increasing in popularity in image-guided surgery applications. To help develop its full surgical potential, it remains a challenge to generate dual-emissive imaging agents that allow for combined visible assessment and sensitive camera-based imaging. To this end, we now describe multispectral InP/ZnS quantum dots (QDs) that exhibit a bright visible green/yellow exciton emission combined with a long-lived far-red defect emission. The intensity of the latter emission was enhanced by X-ray irradiation and allows for: 1) inverted QD-density-dependent defect emission intensity, showing improved efficacies at lower QD densities, and 2) detection without direct illumination and interference from autofluorescence.
NASA Astrophysics Data System (ADS)
Ewerlöf, Maria; Larsson, Marcus; Salerud, E. Göran
2017-02-01
Hyperspectral imaging (HSI) can estimate the spatial distribution of skin blood oxygenation using visible to near-infrared light. HSI oximeters often use a liquid-crystal tunable filter, an acousto-optic tunable filter or mechanically adjustable filter wheels, whose response/switching times are too long to monitor tissue hemodynamics. This work aims to evaluate a multispectral snapshot imaging system that estimates skin blood volume and oxygen saturation with high temporal and spatial resolution. We use a snapshot imager, the xiSpec camera (MQ022HG-IM-SM4X4-VIS, XIMEA), having 16 wavelength-specific Fabry-Perot filters overlaid on a custom CMOS chip. The spectral distributions of the bands overlap substantially, however, which must be taken into account for an accurate analysis. An inverse Monte Carlo analysis is performed using a two-layered skin tissue model defined by epidermal thickness, haemoglobin concentration and oxygen saturation, melanin concentration and a spectrally dependent reduced-scattering coefficient, all parameters relevant for human skin. The analysis takes into account the spectral detector response of the xiSpec camera. At each spatial location in the field of view, we compare the simulated output to the detected diffusely backscattered spectra to find the best fit. The imager is evaluated for spatial and temporal variations during arterial and venous occlusion protocols applied to the forearm. Estimated blood volume changes and oxygenation maps at 512x272 pixels show values comparable to reference measurements performed in contact with the skin tissue. We conclude that the snapshot xiSpec camera, paired with an inverse Monte Carlo algorithm, permits us to use this sensor for spatial and temporal measurement of varying physiological parameters, such as skin tissue blood volume and oxygenation.
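A schematic of the fitting step described above, assuming the 16 overlapping band responses are available as a response matrix and a black-box tissue model stands in for the two-layer Monte Carlo simulation (all names are hypothetical):

```python
import numpy as np
from scipy.optimize import least_squares

def band_signals(spectrum, responses, dlam):
    """Project a model reflectance spectrum through the 16 overlapping
    band responses (responses: 16 x n_wavelengths, on a uniform grid
    with spacing dlam) to predict the 16 detected band values."""
    return (responses * spectrum[None, :]).sum(axis=1) * dlam

def fit_pixel(measured, simulate_spectrum, p0, responses, dlam):
    """Fit tissue parameters p (e.g., blood volume, oxygen saturation)
    so the simulated backscatter spectrum, seen through the camera's
    band responses, matches the measured band values."""
    resid = lambda p: band_signals(simulate_spectrum(p), responses, dlam) - measured
    return least_squares(resid, p0).x
```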
Multispectral Snapshot Imagers Onboard Small Satellite Formations for Multi-Angular Remote Sensing
NASA Technical Reports Server (NTRS)
Nag, Sreeja; Hewagama, Tilak; Georgiev, Georgi; Pasquale, Bert; Aslam, Shahid; Gatebe, Charles K.
2017-01-01
Multispectral snapshot imagers are capable of producing 2D spatial images with a single exposure at selected, numerous wavelengths using the same camera, and therefore operate differently from push-broom or whisk-broom imagers. They are payloads of choice in multi-angular, multi-spectral imaging missions that use small satellites flying in controlled formation to retrieve Earth science measurements dependent on the target's Bidirectional Reflectance Distribution Function (BRDF). Narrow fields of view are needed to capture images with moderate spatial resolution. This paper quantifies the dependencies of the imager's optical system, spectral elements and camera on the requirements of the formation mission, and their impact on performance metrics such as spectral range, swath and signal-to-noise ratio (SNR). All variables and metrics have been generated from a comprehensive payload design tool. The baseline optical parameters selected (diameter 7 cm, focal length 10.5 cm, pixel size 20 micron, field of view 1.15 deg) are achievable with available snapshot imaging technologies. The spectral components shortlisted were waveguide spectrometers, acousto-optic tunable filters (AOTF), electronically actuated Fabry-Perot interferometers, and integral field spectrographs. Qualitative evaluation favored AOTFs because of their low weight, small size, and flight heritage. Quantitative analysis showed that waveguide spectrometers perform better in terms of achievable swath (10-90 km) and SNR (greater than 20) for 86 wavebands, but the data volume generated would need very high-bandwidth communication to downlink. AOTFs meet the external data volume caps as well as the minimum spectral (wavebands) and radiometric (SNR) requirements, and therefore are found to be currently feasible in spite of lower swath and SNR.
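As a rough illustration of how the quoted baseline optics map onto the mission metrics discussed above, here is a back-of-the-envelope sketch; the 500 km orbit altitude and the 0.65 μm reference wavelength are assumptions, while the aperture, focal length, pixel pitch, and field of view are the baseline values from the abstract.

```python
import math

D = 0.07          # aperture diameter [m], from the abstract
f = 0.105         # focal length [m]
pixel = 20e-6     # pixel pitch [m]
fov = math.radians(1.15)
h = 500e3         # assumed orbit altitude [m], not stated in the abstract
lam = 0.65e-6     # assumed representative waveband [m]

ifov = pixel / f                     # instantaneous field of view [rad]
gsd = h * ifov                       # ground sample distance at nadir [m]
swath = 2 * h * math.tan(fov / 2)    # nadir swath (flat-Earth approx.) [m]
airy = 1.22 * lam / D                # diffraction-limited angular resolution
print(f"IFOV = {ifov*1e6:.1f} urad -> GSD {gsd:.0f} m")
print(f"swath = {swath/1e3:.1f} km")
print(f"diffraction limit = {airy*1e6:.1f} urad -> {h*airy:.0f} m on ground")
```

With these assumptions the swath comes out near 10 km, consistent with the low end of the 10-90 km range quoted above.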
NASA Astrophysics Data System (ADS)
Nebiker, S.; Lack, N.; Abächerli, M.; Läderach, S.
2016-06-01
In this paper we investigate the performance of new light-weight multispectral sensors for micro UAVs and their application to selected tasks in agronomical research and agricultural practice. The investigations are based on a series of flight campaigns in 2014 and 2015 covering a number of agronomical test sites with experiments on rape, barley, onion, potato and other crops. In our sensor comparison we included a high-end multispectral multiSPEC 4C camera with bandpass colour filters and a reference channel in the zenith direction, and a low-cost, consumer-grade Canon S110 NIR camera with Bayer-pattern colour filters. Ground-based reference measurements were obtained using a terrestrial hyperspectral field spectrometer. The investigations show that measurements with the high-end system consistently match very well with ground-based field spectrometer measurements, with a mean deviation of just 0.01-0.04 NDVI values. The low-cost system, while delivering better spatial resolution, exhibited significant biases. The sensors were subsequently used to address selected agronomical questions, including crop yield estimation in rape and barley and plant disease detection in potato and onion cultivations. High levels of correlation between different vegetation indices and reference yield measurements were obtained for rape and barley. In the case of barley, the NDRE index shows an average correlation of 87% with reference yield when species are taken into account. With high geometric resolutions and ground sample distances (GSDs) down to 2.5 cm, the effects of a thrips infestation in onion could be analysed and potato blight was successfully detected at an early stage of infestation.
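For reference, the two vegetation indices named above are simple normalized band differences; a minimal sketch follows, where the band arrays are toy stand-ins for radiometrically calibrated reflectance images.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, per pixel."""
    return (nir - red) / (nir + red + 1e-9)

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index, per pixel."""
    return (nir - red_edge) / (nir + red_edge + 1e-9)

# Toy reflectance bands (rows x cols); real inputs would be calibrated
# multiSPEC 4C band images.
nir = np.random.rand(4, 4) * 0.5 + 0.3
red = np.random.rand(4, 4) * 0.1
red_edge = np.random.rand(4, 4) * 0.3
print(ndvi(nir, red).round(2))
print(ndre(nir, red_edge).round(2))
```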
An effective rectification method for lenselet-based plenoptic cameras
NASA Astrophysics Data System (ADS)
Jin, Jing; Cao, Yiwei; Cai, Weijia; Zheng, Wanlu; Zhou, Ping
2016-10-01
The lenselet-based plenoptic camera has recently drawn considerable attention in the field of computational photography. The additional information inherent in the light field allows a wide range of applications, but some preliminary processing of the raw image is necessary before further operations. In this paper, an effective method is presented for the rotation rectification of the raw image. The rotation is caused by imperfect positioning of the micro-lens array relative to the sensor plane in commercially available Lytro plenoptic cameras. The key to our method is locating the center of each micro-lens image, which is projected by a micro-lens. Because of vignetting, the pixel values at the center of a micro-lens image are higher than those at its periphery. A mask is applied to probe the micro-lens image and locate the center area by finding the local maximum response. The error of the center coordinate estimate is corrected, and the angle of rotation is computed via a subsequent line fitting. The algorithm was performed on two images captured by different Lytro cameras. The angles of rotation are -0.3600° and -0.0621°, respectively, and the rectified raw image is useful and reliable for further operations, such as extraction of the sub-aperture images. The experimental results demonstrate that our method is efficient and accurate.
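A minimal sketch of the two steps described, vignetting-based centre detection and line-fit rotation estimation; the pitch, thresholds, and toy inputs are illustrative assumptions, not the paper's calibration values.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def microlens_centers(raw, pitch=10):
    """Locate micro-lens image centres as local maxima of a bright raw
    image: vignetting makes each micro-image brightest at its centre."""
    local_max = maximum_filter(raw, size=pitch) == raw
    ys, xs = np.nonzero(local_max & (raw > raw.mean()))
    return xs.astype(float), ys.astype(float)

def rotation_from_row(xs, ys):
    """Fit a line to the centres of a single lenslet row; the slope of
    the fitted line gives the sensor-vs-array rotation angle."""
    slope, _ = np.polyfit(xs, ys, 1)
    return np.degrees(np.arctan(slope))

# Sanity checks on toy inputs.
raw = np.zeros((30, 30)); raw[5, 5] = raw[5, 15] = raw[15, 5] = 1.0
print(microlens_centers(raw, pitch=10))
xs = np.arange(0, 1000, 10, dtype=float)          # one lenslet row,
ys = 5.0 + np.tan(np.radians(-0.36)) * xs         # tilted by -0.36 deg
print(rotation_from_row(xs, ys))                  # ~ -0.36
```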
Marcot, Bruce G.; Jorgenson, M. Torre; DeGange, Anthony R.
2014-01-01
5. A Canon® Rebel 3Ti with a Sigma zoom lens (18–200 mm focal length). The Drift® HD-170 and GoPro® Hero3 cameras were secured to the struts and underwing for nadir (direct downward) imaging. The Panasonic® and Canon® cameras were each hand-held for oblique-angle landscape images, shooting through the airplanes' windows, targeting both general landscape conditions as well as landscape features of special interest, such as tundra fire scars and landslips. The Drift® and GoPro® cameras each were set for time-lapse photography at 5-second intervals for overlapping coverage. Photographs from all cameras (100 percent .jpg format) were date- and time-synchronized to Global Positioning System (GPS) waypoints taken during the flights, also at 5-second intervals, providing precise geotagging (latitude-longitude) of all files. All photographs were adjusted for color saturation and gamma, and nadir photographs were corrected for lens distortion from the Drift® and GoPro® cameras' 170° wide-angle lenses. EXIF (exchangeable image file format) data on camera settings and geotagging were extracted into spreadsheet databases. An additional 1 hour, 20 minutes, and 43 seconds of high-resolution video was recorded at 60 frames per second with the GoPro® camera along selected transect segments, and also was image-adjusted and corrected for lens distortion. Geotagged locations of 12,395 nadir photographs from the Drift® and GoPro® cameras were overlaid in a geographic information system (ArcMap 10.0) onto a map of 44 ecotypes (land- and water-cover types) of the Arctic Network study area. The presence and area of each ecotype occurring within a geographic information system window centered on the location of each photograph were recorded and included in the spreadsheet databases. All original and adjusted photographs, videos, GPS flight tracks, and photograph databases are available by contacting ascweb@usgs.gov.
Apollo 9 Mission image - S0-65 Multispectral Photography - Georgia
2009-02-19
AS09-26A-3792A (11 March 1969) --- Color infrared photograph of the Atlanta, Georgia area taken on March 11, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey (SO-65) experiment. At 11:21 a.m. (EST) when this picture was taken, the Apollo 9 spacecraft was at an altitude of 106 nautical miles, and the sun elevation was 47 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 33 degrees 10 minutes north latitude, and 84 degrees and 40 minutes west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.
Tissue classification for laparoscopic image understanding based on multispectral texture analysis
NASA Astrophysics Data System (ADS)
Zhang, Yan; Wirkert, Sebastian J.; Iszatt, Justin; Kenngott, Hannes; Wagner, Martin; Mayer, Benjamin; Stock, Christian; Clancy, Neil T.; Elson, Daniel S.; Maier-Hein, Lena
2016-03-01
Intra-operative tissue classification is one of the prerequisites for providing context-aware visualization in computer-assisted minimally invasive surgeries. As many anatomical structures are difficult to differentiate in conventional RGB medical images, we propose a classification method based on multispectral image patches. In a comprehensive ex vivo study we show (1) that multispectral imaging data is superior to RGB data for organ tissue classification when used in conjunction with widely applied feature descriptors and (2) that combining the tissue texture with the reflectance spectrum improves the classification performance. Multispectral tissue analysis could thus evolve as a key enabling technique in computer-assisted laparoscopy.
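A toy sketch of the kind of combined spectrum-plus-texture classification the study evaluates; the feature descriptors, synthetic data, and SVM choice here are stand-ins for the paper's actual descriptors and ex vivo dataset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def patch_features(patch):
    """Concatenate a mean reflectance spectrum with a crude texture cue
    (per-band local standard deviation); simple stand-ins for the
    descriptor combinations evaluated in the paper."""
    spectrum = patch.mean(axis=(0, 1))
    texture = patch.std(axis=(0, 1))
    return np.concatenate([spectrum, texture])

# Toy data: 200 8x8 patches with 8 spectral bands, two "organ" classes
# differing in both spectral shape and texture amplitude.
X, y = [], []
for label in (0, 1):
    base = np.linspace(0.2, 0.8, 8) if label else np.linspace(0.8, 0.2, 8)
    for _ in range(100):
        patch = base + rng.normal(0, 0.05 + 0.1 * label, size=(8, 8, 8))
        X.append(patch_features(patch))
        y.append(label)
Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), random_state=0)
clf = SVC().fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```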
Bell, James F.; Godber, A.; McNair, S.; Caplinger, M.A.; Maki, J.N.; Lemmon, M.T.; Van Beek, J.; Malin, M.C.; Wellington, D.; Kinch, K.M.; Madsen, M.B.; Hardgrove, C.; Ravine, M.A.; Jensen, E.; Harker, D.; Anderson, Ryan; Herkenhoff, Kenneth E.; Morris, R.V.; Cisneros, E.; Deen, R.G.
2017-01-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted ~2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) “true color” images, multispectral images in nine additional bands spanning ~400–1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration.
3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...
3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
Thermal Effects on Camera Focal Length in Messenger Star Calibration and Orbital Imaging
NASA Astrophysics Data System (ADS)
Burmeister, S.; Elgner, S.; Preusker, F.; Stark, A.; Oberst, J.
2018-04-01
We analyze images taken by the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft for the camera's thermal response in the harsh thermal environment near Mercury. Specifically, we study thermally induced variations in the focal length of the Mercury Dual Imaging System (MDIS). Within the several hundred images of star fields, the Wide Angle Camera (WAC) typically captures up to 250 stars in one frame of the panchromatic channel. We measure star positions and relate these to the known star coordinates taken from the Tycho-2 catalogue. We solve for camera pointing, the focal length parameter, and two non-symmetrical distortion parameters for each image. Using data from the temperature sensors on the camera focal plane, we model a linear focal length function of the form f(T) = A0 + A1·T. Next, we use images from MESSENGER's orbital mapping mission. We deal with large image blocks, typically used for the production of high-resolution digital terrain models (DTMs). We analyzed images from the combined quadrangles H03 and H07, a selected region covered by approx. 10,600 images, in which we identified about 83,900 tiepoints. Using bundle block adjustments, we solved for the unknown coordinates of the control points, the pointing of the camera, as well as the camera's focal length. We then fit the above linear function with respect to the focal plane temperature. As a result, we find a complex response of the camera to the thermal conditions of the spacecraft. To first order, we see a linear increase of approx. 0.0107 mm per degree temperature for the Narrow-Angle Camera (NAC). This is in agreement with the observed thermal response seen in images of the panchromatic channel of the WAC. Unfortunately, further comparisons of results from the two methods, both of which use different portions of the available image data, are limited. If left uncorrected, these effects may pose significant difficulties in photogrammetric analysis; specifically, they may be responsible for erroneous long-wavelength trends in topographic models.
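The linear thermal model f(T) = A0 + A1·T reduces to an ordinary least-squares fit once per-image focal-length solutions and focal-plane temperatures are in hand; a minimal sketch with invented data follows (only the ~0.0107 mm/deg slope is taken from the abstract; the other numbers are illustrative).

```python
import numpy as np

# Toy star-calibration results: per-image focal-length solutions [mm]
# versus focal-plane temperatures [deg C]. Values are illustrative only.
T = np.array([-5.0, 3.0, 10.0, 18.0, 25.0, 33.0])
f = 549.7 + 0.0107 * T + np.random.default_rng(1).normal(0, 0.002, T.size)

A1, A0 = np.polyfit(T, f, 1)   # f(T) = A0 + A1*T
print(f"f(T) = {A0:.4f} mm + {A1:.5f} mm/degC * T")
```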
NASA Astrophysics Data System (ADS)
Kim, Manjae; Kim, Sewoong; Hwang, Minjoo; Kim, Jihun; Je, Minkyu; Jang, Jae Eun; Lee, Dong Hun; Hwang, Jae Youn
2017-02-01
To date, the incidence rates of various skin diseases have increased due to hereditary and environmental factors including stress, irregular diet, pollution, etc. Among these skin diseases, seborrheic dermatitis and psoriasis are chronic, relapsing dermatoses involving infection and temporary alopecia. However, they typically exhibit similar symptoms, resulting in difficulty discriminating between them. To prevent their associated complications and to select appropriate treatments, it is crucial to discriminate between seborrheic dermatitis and psoriasis with high specificity and sensitivity, and further to continuously and quantitatively monitor the skin lesions during treatment at locations other than a hospital. Thus, we here demonstrate a mobile multispectral imaging system connected to a smartphone for self-diagnosis of seborrheic dermatitis and further discrimination between seborrheic dermatitis and psoriasis on the scalp, which is the more challenging case. Using the developed system, multispectral imaging and analysis of seborrheic dermatitis and psoriasis on the scalp was carried out. We found that the spectral signatures of seborrheic dermatitis and psoriasis were discernible, and thus seborrheic dermatitis on the scalp could be distinguished from psoriasis by using the system. In particular, the smartphone-based multispectral imaging and analysis offered better discrimination between seborrheic dermatitis and psoriasis than RGB imaging and analysis. These results suggest that the smartphone-based multispectral imaging system has the potential for self-diagnosis of seborrheic dermatitis with high portability and specificity.
Band co-registration modeling of LAPAN-A3/IPB multispectral imager based on satellite attitude
NASA Astrophysics Data System (ADS)
Hakim, P. R.; Syafrudin, A. H.; Utama, S.; Jayani, A. P. S.
2018-05-01
One significant geometric distortion in images from the LAPAN-A3/IPB multispectral imager is the co-registration error between the color channel detectors. Band co-registration distortion can usually be corrected using one of several approaches: a manual method, an image-matching algorithm, or a sensor modeling and calibration approach. This paper develops another approach to minimize band co-registration distortion in LAPAN-A3/IPB multispectral images, using supervised modeling of image-matching results with respect to satellite attitude. Modeling results show that band co-registration error in the across-track axis is strongly influenced by the yaw angle, while error in the along-track axis is fairly influenced by both the pitch and roll angles. The accuracy of the obtained models is good, lying between 1 and 3 pixels of error for each axis of each band pair. This means that the model can be used to correct distorted images without the need for a slower image-matching algorithm, or the laborious effort required by the manual and sensor calibration approaches. Since the calculation can be executed in seconds, this approach can be used for real-time quick-look image processing in the ground station or even in on-board satellite image processing.
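The supervised model described amounts to regressing measured band offsets against attitude angles; a minimal least-squares sketch with synthetic data follows (the coefficients and noise levels are illustrative, not the LAPAN-A3/IPB calibration).

```python
import numpy as np

# Toy supervised model: band co-registration offsets [pixels] measured
# by image matching, regressed against satellite attitude [degrees].
rng = np.random.default_rng(2)
n = 200
roll, pitch, yaw = rng.normal(0, 1, (3, n))
dx = 0.8 * yaw + rng.normal(0, 0.3, n)                 # across-track error
dy = 0.5 * pitch + 0.4 * roll + rng.normal(0, 0.3, n)  # along-track error

A = np.column_stack([roll, pitch, yaw, np.ones(n)])
cx, *_ = np.linalg.lstsq(A, dx, rcond=None)
cy, *_ = np.linalg.lstsq(A, dy, rcond=None)
print("across-track coeffs (roll, pitch, yaw, bias):", cx.round(2))
print("along-track  coeffs (roll, pitch, yaw, bias):", cy.round(2))
```

Once fitted, the model predicts the per-band shift from telemetered attitude alone, which is what makes the real-time correction described above feasible.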
A position and attitude vision measurement system for wind tunnel slender model
NASA Astrophysics Data System (ADS)
Cheng, Lei; Yang, Yinong; Xue, Bindang; Zhou, Fugen; Bai, Xiangzhi
2014-11-01
A position and attitude vision measurement system for drop tests of slender models in a wind tunnel is designed and developed. The system uses two high-speed cameras: one placed to the side of the model, and another positioned to look up at the model from below. Simple symbols are set on the model. The main idea of the system is image matching between projection images of the 3D digital model and the images captured by the cameras. First, we evaluate the pitch angle, the roll angle, and the position of the centroid of the model by recognizing the symbols in the images captured by the side camera. Then, based on the evaluated attitude information and a series of candidate yaw angles, a series of projection images of the 3D digital model is generated. Finally, these projection images are matched against the image captured by the upward-looking camera, and the yaw angle corresponding to the best-matching projection image is taken as the yaw angle of the model. Simulation experiments show that the maximal error of attitude measurement is less than 0.05°, which meets the demands of wind tunnel tests.
Compact camera technologies for real-time false-color imaging in the SWIR band
NASA Astrophysics Data System (ADS)
Dougherty, John; Jennings, Todd; Snikkers, Marco
2013-11-01
Previously, real-time false-color multispectral imaging was not available in a single compact true-snapshot imager. Recent technology improvements now allow this technique to be used in practical applications. This paper covers those advancements as well as a case study of its use in UAVs, where the technology is enabling new remote sensing methodologies.
Multispectral fundus imaging for early detection of diabetic retinopathy
NASA Astrophysics Data System (ADS)
Beach, James M.; Tiedeman, James S.; Hopkins, Mark F.; Sabharwal, Yashvinder S.
1999-04-01
Functional imaging of the retina and associated structures may provide information for early assessment of the risk of developing retinopathy in diabetic patients. Here we show results of retinal oximetry performed using multi-spectral reflectance imaging techniques to assess hemoglobin (Hb) oxygen saturation (OS) in blood vessels of the inner retina and oxygen utilization at the optic nerve, in diabetic patients without retinopathy or with early disease, during experimental hyperglycemia. Retinal images were obtained through a fundus camera and simultaneously recorded at up to four wavelengths using image-splitting modules coupled to a digital camera. Changes in OS in large retinal vessels, in average OS in disk tissue, and in the reduced state of cytochrome oxidase (CO) at the disk were determined from changes in reflectance associated with the oxidation/reduction states of Hb and CO. A step to high blood sugar lowered venous oxygen saturation to a degree dependent on disease duration. A moderate increase in blood sugar produced higher levels of reduced CO in both the disk and surrounding tissue without a detectable change in average tissue OS. Results suggest that regulation of retinal blood supply and oxygen consumption are altered by hyperglycemia, and that such functional changes are present before clinical signs of retinopathy.
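Two-wavelength reflectance oximetry of the kind used here typically maps an optical-density ratio at an oxygen-sensitive and an isosbestic wavelength to saturation via an instrument calibration; a hedged sketch follows, in which the wavelength choices and calibration constants are assumptions, not values from the paper.

```python
import numpy as np

def od_ratio_saturation(refl_sensitive, refl_isosbestic, calib=(1.0, -1.0)):
    """Two-wavelength oximetry sketch: the ratio of optical densities at
    an oxygen-sensitive and an isosbestic wavelength varies roughly
    linearly with Hb O2 saturation; `calib` = (a, b) in OS = a + b*ODR
    must come from instrument calibration."""
    od_s = -np.log10(refl_sensitive)   # optical density, sensitive band
    od_i = -np.log10(refl_isosbestic)  # optical density, isosbestic band
    odr = od_s / od_i
    a, b = calib
    return a + b * odr

# Toy vessel reflectances, e.g. at ~600 nm (sensitive) and ~569 nm
# (isosbestic); the calibration pair here is purely illustrative.
print(od_ratio_saturation(0.35, 0.30, calib=(1.25, -0.6)))
```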
Newer views of the Moon: Comparing spectra from Clementine and the Moon Mineralogy Mapper
Kramer, G.Y.; Besse, S.; Nettles, J.; Combe, J.-P.; Clark, R.N.; Pieters, C.M.; Staid, M.; Malaret, E.; Boardman, J.; Green, R.O.; Head, J.W.; McCord, T.B.
2011-01-01
The Moon Mineralogy Mapper (M3) provided the first global hyperspectral data of the lunar surface in 85 bands from 460 to 2980 nm. The Clementine mission provided the first global multispectral maps of the lunar surface in 11 spectral bands across the ultraviolet-visible (UV-VIS) and near-infrared (NIR). In an effort to understand how M3 improves our ability to analyze and interpret lunar data, we compare M3 spectra with those from Clementine's UV-VIS and NIR cameras. We have found that M3 reflectance values are lower across all wavelengths compared with albedos from both of Clementine's UV-VIS and NIR cameras. M3 spectra show the Moon to be redder, that is, to have a steeper continuum slope, than indicated by Clementine. The 1 μm absorption band depths may be comparable between the instruments, but Clementine data consistently exhibit shallower 2 μm band depths than M3. Absorption band minima are difficult to compare due to the significantly different spectral resolutions. Copyright 2011 by the American Geophysical Union.
Skylab and ERTS-1 investigations of coastal land use and water properties. [Delaware Bay
NASA Technical Reports Server (NTRS)
Klemas, V. (Principal Investigator); Bartlett, D.; Rogers, R.
1974-01-01
The author has identified the following significant results. ERTS-1 multispectral scanner and Skylab's S190A, S190B, and S192 data products were evaluated for their utility in studying current circulation, suspended sediment concentrations and pollution dispersal in Delaware Bay and in mapping coastal vegetation and land use. Imagery from the ERTS-1 MSS, S190A and S190B cameras shows considerable detail in water structure, circulation, suspended sediment distribution and within waste disposal plumes in shelf waters. These data products were also used in differentiating and mapping twelve coastal vegetation and land use classes. The spatial resolution of the S190A multispectral facility appears to be about 30 to 70 meters while that of the S190B earth terrain camera is about 10 to 30 meters. Such resolution, along with good cartographic quality, indicates a considerable potential for mapping coastal land use and monitoring water properties in estuaries and on the continental shelf. The ERTS-1 MSS has a resolution of about 70-100 meters. Moreover, its regular 18-day cycle permits observation of important changes, including the environmental impact of coastal zone development on coastal vegetation and ecology.
COMPARISON OF RETINAL PATHOLOGY VISUALIZATION IN MULTISPECTRAL SCANNING LASER IMAGING.
Meshi, Amit; Lin, Tiezhu; Dans, Kunny; Chen, Kevin C; Amador, Manuel; Hasenstab, Kyle; Muftuoglu, Ilkay Kilic; Nudleman, Eric; Chao, Daniel; Bartsch, Dirk-Uwe; Freeman, William R
2018-03-16
To compare retinal pathology visualization in multispectral scanning laser ophthalmoscope imaging between the Spectralis and Optos devices. This retrospective cross-sectional study included 42 eyes from 30 patients with age-related macular degeneration (19 eyes), diabetic retinopathy (10 eyes), and epiretinal membrane (13 eyes). All patients underwent retinal imaging with a color fundus camera (broad-spectrum white light), the Spectralis HRA-2 system (3-color monochromatic lasers), and the Optos P200 system (2-color monochromatic lasers). The Optos image was cropped to a similar size as the Spectralis image. Seven masked graders marked retinal pathologies in each image within a 5 × 5 grid that included the macula. The average area with detected retinal pathology in all eyes was larger in the Spectralis images compared with Optos images (32.4% larger, P < 0.0001), mainly because of better visualization of epiretinal membrane and retinal hemorrhage. The average detection rate of age-related macular degeneration and diabetic retinopathy pathologies was similar across the three modalities, whereas the epiretinal membrane detection rate was significantly higher in the Spectralis images. Spectralis tricolor multispectral scanning laser ophthalmoscope imaging had a higher rate of pathology detection, primarily because of better epiretinal membrane and retinal hemorrhage visualization, compared with Optos bicolor multispectral scanning laser ophthalmoscope imaging.
Development of an online line-scan imaging system for chicken inspection and differentiation
NASA Astrophysics Data System (ADS)
Yang, Chun-Chieh; Chan, Diane E.; Chao, Kuanglin; Chen, Yud-Ren; Kim, Moon S.
2006-10-01
An online line-scan imaging system was developed for differentiation of wholesome and systemically diseased chickens. The hyperspectral imaging system used in this research can be directly converted to multispectral operation and would provide the ideal implementation of essential features for data-efficient high-speed multispectral classification algorithms. The imaging system consisted of an electron-multiplying charge-coupled-device (EMCCD) camera and an imaging spectrograph for line-scan images. The system scanned the surfaces of chicken carcasses on an eviscerating line at a poultry processing plant in December 2005. A method was created to recognize birds entering and exiting the field of view, and to locate a Region of Interest on the chicken images from which useful spectra were extracted for analysis. From analysis of the difference spectra between wholesome and systemically diseased chickens, four wavelengths of 468 nm, 501 nm, 582 nm and 629 nm were selected as key wavelengths for differentiation. The method of locating the Region of Interest will also have practical application in multispectral operation of the line-scan imaging system for online chicken inspection. This line-scan imaging system makes possible the implementation of multispectral inspection using the key wavelengths determined in this study with minimal software adaptations and without the need for cross-system calibration.
2016-10-17
Pandora is seen here, in isolation beside Saturn's kinked and constantly changing F ring. Pandora (near upper right) is 50 miles (81 kilometers) wide. The moon has an elongated, potato-like shape (see PIA07632). Two faint ringlets are visible within the Encke Gap, near lower left. The gap is about 202 miles (325 kilometers) wide. The much narrower Keeler Gap, which lies outside the Encke Gap, is maintained by the diminutive moon Daphnis (not seen here). This view looks toward the sunlit side of the rings from about 23 degrees above the ring plane. The image was taken in visible light with the Cassini spacecraft narrow-angle camera on Aug. 12, 2016. The view was acquired at a distance of approximately 907,000 miles (1.46 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 113 degrees. Image scale is 6 miles (9 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20504
Auto-converging stereo cameras for 3D robotic tele-operation
NASA Astrophysics Data System (ADS)
Edmondson, Richard; Aycock, Todd; Chenault, David
2012-06-01
Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow operator control of the camera convergence angle. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an automatic convergence algorithm in a field-programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than on adjusting the vision system. The auto-convergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
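The geometry the auto-convergence algorithm must satisfy is simple once an object distance is known (the FPGA algorithm estimates this from scene content); a minimal sketch with an assumed baseline and distance:

```python
import math

def convergence_angle(baseline_m, distance_m):
    """Total vergence angle so that both optical axes intersect at the
    object distance; each camera is toed in by half this value."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

# E.g. a 12 cm stereo baseline converging on an object 3 m away.
print(f"{convergence_angle(0.12, 3.0):.2f} deg total vergence")
```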
NASA Technical Reports Server (NTRS)
Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.
1994-01-01
Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. These data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero-gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
Spectral methods to detect cometary minerals with OSIRIS on board Rosetta
NASA Astrophysics Data System (ADS)
Oklay, N.; Vincent, J.-B.; Sierks, H.
2013-09-01
Comet 67P/Churyumov-Gerasimenko is going to be observed by the OSIRIS scientific imager (Keller et al. 2007) on board ESA's spacecraft Rosetta with a combination of 12 filters in the wavelength range of 250-1000 nm for the narrow angle camera (NAC) and 14 filters in the wavelength range of 240-720 nm for the wide angle camera (WAC). NAC filters are suitable for surface composition studies, while WAC filters are designed for gas and radical emission studies. In order to investigate the composition of the comet surface from the observed images, we need to understand how to detect different minerals and which compositional information can be derived from the NAC filters. Therefore, the most common cometary silicates, e.g. enstatite and forsterite, are investigated together with two hydrated silicates (serpentine and smectite) for the determination of the spectral methods. Laboratory data for the selected minerals are collected from the RELAB database (http://www.planetary.brown.edu/relabdocs/relab.htm), and absolute spectra of the minerals as observed through the OSIRIS NAC filters are calculated. Due to the limited spectral range of the laboratory data, the Far-UV and Neutral density filters of the NAC are excluded from this analysis. The NAC filters considered in this study are listed in Table 1 and the number of collected laboratory spectra in Table 2. Detection and separation of the minerals will not only allow us to study the surface composition but also to study observed composition changes due to cometary activity during the mission.
Lock-in imaging with synchronous digital mirror demodulation
NASA Astrophysics Data System (ADS)
Bush, Michael G.
2010-04-01
Lock-in imaging enables high-contrast imaging in adverse conditions by exploiting a modulated light source and homodyne detection. We report results on a patent-pending lock-in imaging system fabricated from commercial-off-the-shelf parts utilizing standard cameras and a spatial light modulator. By leveraging the capabilities of standard parts we are able to present a low-cost, high-resolution, high-sensitivity camera with applications in search and rescue, identification friend or foe (IFF), and covert surveillance. Different operating modes allow the same instrument to be utilized for dual-band multispectral imaging or high dynamic range imaging, increasing its flexibility in different operational settings.
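A minimal sketch of the homodyne step, correlating each pixel's time series with in-phase and quadrature references at the modulation frequency; the frame rate, modulation frequency, and toy scene are assumptions, not parameters of the reported system.

```python
import numpy as np

def lockin_demodulate(frames, f_mod, fps):
    """Homodyne detection on a frame stack (time, rows, cols): correlate
    each pixel's time series with I/Q references at the modulation
    frequency and return the amplitude image."""
    t = np.arange(frames.shape[0]) / fps
    ref_i = np.cos(2 * np.pi * f_mod * t)
    ref_q = np.sin(2 * np.pi * f_mod * t)
    I = np.tensordot(ref_i, frames, axes=1) * 2 / len(t)
    Q = np.tensordot(ref_q, frames, axes=1) * 2 / len(t)
    return np.hypot(I, Q)

# Toy scene: a weakly modulated spot buried in a strong, noisy,
# unmodulated background.
rng = np.random.default_rng(3)
t = np.arange(256) / 120.0                            # 256 frames, 120 fps
frames = 100 + 5 * rng.normal(size=(256, 32, 32))
frames[:, 16, 16] += 3 * np.cos(2 * np.pi * 10 * t)   # 10 Hz beacon
amp = lockin_demodulate(frames, f_mod=10, fps=120)
print("recovered beacon at:", np.unravel_index(amp.argmax(), amp.shape))
```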
Cloud and aerosol polarimetric imager
NASA Astrophysics Data System (ADS)
Zhang, Junqiang; Shao, Jianbing; Yan, Changxiang
2014-02-01
The Cloud and Aerosol Polarimetric Imager (CAPI), China's first spaceborne cloud and aerosol polarimetric detector, is developed to acquire atmospheric cloud and aerosol data from which aerosol optical and microphysical properties are retrieved, improving the retrieval precision of greenhouse gases (GHGs). The instrument is neither a POLarization and Directionality of the Earth's Reflectances (POLDER) nor a Directional Polarimetric Camera (DPC) type of polarized camera. It is a multispectral push-broom system using linear detectors, and can acquire spectral data in 5 bands, from the ultraviolet (UV) to the SWIR, of the same ground feature at the same time without any moving parts. This paper describes the CAPI instrument characteristics, composition, calibration, and recent development.
NASA Astrophysics Data System (ADS)
Castro Marín, J. M.; Brown, V. J. G.; López Jiménez, A. C.; Rodríguez Gómez, J.; Rodrigo, R.
2001-05-01
The Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS) is an instrument carried on board the European Space Agency spacecraft Rosetta, which will be launched in January 2003 to study the comet Wirtanen in situ. The electronic design of the mechanism controller board (MCB) system of the two OSIRIS optical cameras, the narrow angle camera and the wide angle camera, is described here. The system comprises two boards mounted on an aluminum frame as part of an electronics box that contains the power supply and the digital processor unit of the instrument. The mechanisms controlled by the MCB for each camera are the front door assembly and a filter wheel assembly. The front door assembly for each camera is driven by a four-phase, permanent magnet stepper motor. Each filter wheel assembly consists of two eight-position filter wheels, each wheel driven by a four-phase, variable reluctance stepper motor. Each motor, in all the assemblies, also contains a redundant set of four stator phase windings that can be energized separately or in parallel with the main windings. All stepper motors are driven in both directions using the full-step unipolar mode of operation. The MCB also performs general housekeeping data acquisition for the OSIRIS instrument, i.e., mechanism position encoders and temperature measurements. The electronic design is quite new in that it uses field-programmable gate array (FPGA) devices, avoiding the now-traditional approach of a system controlled by microcontrollers and software. Electrical tests of the engineering model have been performed successfully, and the system is ready for space qualification after environmental testing. This system may be of interest to institutions involved in future space experiments with similar needs for mechanism control.
Spatial Variations of Spectral Properties of (21) Lutetia as Observed by OSIRIS/Rosetta
NASA Astrophysics Data System (ADS)
Leyrat, Cedric; Sierks, H.; Barbieri, C.; Barucci, A.; Da Deppo, V.; De Leon, J.; Fulchignoni, M.; Fornasier, S.; Groussin, O.; Hviid, S. F.; Jorda, L.; Keller, H. U.; La Forgia, F.; Lara, L.; Lazzarin, M.; Magrin, S.; Marchi, S.; Thomas, N.; Schroder, S. E.; OSIRIS Team
2010-10-01
On July 10, 2010, the Rosetta ESA/NASA spacecraft successfully flew by the asteroid (21) Lutetia, which became the largest asteroid yet observed by a space probe. The closest approach occurred at 15:45 UTC at a relative speed of 15 km/s and a relative distance of 3160 km. The Narrow Angle Camera (NAC) and the Wide Angle Camera (WAC) of the OSIRIS instrument onboard Rosetta acquired images at phase angles ranging from almost zero to more than 150 degrees. The best spatial resolution (60 m/pixel) revealed a very complex topography with several distinct features and differing crater surface densities. Spectrophotometric analysis of the data suggests spatial variations of the albedo and spectral properties at the surface of the asteroid, at least in the northern hemisphere. Numerous data sets have been obtained at different wavelengths from 270 nm to 980 nm. We will first present a color-color analysis of the data in order to locate areas where surface variegation is present. We will also present a more detailed study of spectral properties using the shape model and different statistical methods. Possible variations of the surface spectral properties with the slope of the ground and the orientation of the gravity field will be discussed as well.
The Fringe Reading Facility at the Max-Planck-Institut fuer Stroemungsforschung
NASA Astrophysics Data System (ADS)
Becker, F.; Meier, G. E. A.; Wegner, H.; Timm, R.; Wenskus, R.
1987-05-01
A Mach-Zehnder interferometer is used for optical flow measurements in a transonic wind tunnel. Holographic interferograms are reconstructed by illumination with a He-Ne-laser and viewed by a video camera through wide angle optics. This setup was used for investigating industrial double exposure holograms of truck tires in order to develop methods of automatic recognition of certain manufacturing faults. Automatic input is achieved by a transient recorder digitizing the output of a TV camera and transferring the digitized data to a PDP11-34. Interest centered around sequences of interferograms showing the interaction of vortices with a profile and subsequent emission of sound generated by this process. The objective is the extraction of quantitative data which relates to the emission of noise.
NASA Technical Reports Server (NTRS)
2003-01-01
MGS MOC Release No. MOC2-344, 28 April 2003
This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image mosaic was constructed from data acquired by the MOC red wide angle camera. The large, circular feature in the upper left is Aram Chaos, an ancient impact crater filled with layered sedimentary rock that was later disrupted and eroded to form a blocky, 'chaotic' appearance. To the southeast of Aram Chaos, in the lower right of this picture, is Iani Chaos. The light-toned patches amid the large blocks of Iani Chaos are known from higher-resolution MOC images to be layered, sedimentary rock outcrops. The picture center is near 0.5°N, 20°W. Sunlight illuminates the scene from the left/upper left.
Junocam: Juno's Outreach Camera
NASA Astrophysics Data System (ADS)
Hansen, C. J.; Caplinger, M. A.; Ingersoll, A.; Ravine, M. A.; Jensen, E.; Bolton, S.; Orton, G.
2017-11-01
Junocam is a wide-angle camera designed to capture the unique polar perspective of Jupiter offered by Juno's polar orbit. Junocam's four-color images include the best spatial resolution ever acquired of Jupiter's cloudtops. Junocam will look for convective clouds and lightning in thunderstorms and derive the heights of the clouds. Junocam will support Juno's radiometer experiment by identifying any unusual atmospheric conditions such as hotspots. Junocam is on the spacecraft explicitly to reach out to the public and share the excitement of space exploration. The public is an essential part of our virtual team: amateur astronomers will supply ground-based images for use in planning, the public will weigh in on which images to acquire, and the amateur image processing community will help process the data.
Virtual displays for 360-degree video
NASA Astrophysics Data System (ADS)
Gilbert, Stephen; Boonsuk, Wutthigrai; Kelly, Jonathan W.
2012-03-01
In this paper we describe a novel approach for comparing users' spatial cognition when using different depictions of 360-degree video on a traditional 2D display. By using virtual cameras within a game engine and texture mapping of these camera feeds to an arbitrary shape, we were able to offer users a 360-degree interface composed of four 90-degree views, two 180-degree views, or one 360-degree view of the same interactive environment. An example experiment using these interfaces is described. This technique for creating alternative displays of wide-angle video facilitates the exploration of how compressed or fish-eye distortions affect spatial perception of the environment, and can benefit the creation of interfaces for surveillance and remote system teleoperation.
NASA Technical Reports Server (NTRS)
Jenniskens, Peter; Nugent, David; Murthy, Jayant; Tedesco, Ed; DeVincenzi, Donal L. (Technical Monitor)
2000-01-01
In November 1997, the Midcourse Space Experiment satellite (MSX) was deployed to observe the Leonid shower from space. The shower lived up to expectations, with abundant bright fireballs. Twenty-nine meteors were detected by a wide-angle, visible-wavelength camera near the limb of the Earth in a 48-minute interval, and three meteors by the narrow-field camera. This amounts to a meteoroid influx of (5.5 ± 0.6) × 10^-5 km^-2 h^-1 for masses greater than 0.3 gram. The limiting magnitude for limb observations of Leonid meteors was measured at M_v = -1.5 mag. The Leonid shower magnitude population index was 1.6 ± 0.2 down to M_v = -7 mag, with no sign of an upper mass cut-off.
Atmospheric aerosol profiling with a bistatic imaging lidar system.
Barnes, John E; Sharma, N C Parikh; Kaplan, Trevor B
2007-05-20
Atmospheric aerosols have been profiled using a simple, imaging, bistatic lidar system. A vertical laser beam is imaged onto a charge-coupled-device camera from the ground to the zenith with a wide-angle lens (CLidar). The altitudes are derived geometrically from the position of the camera and laser with submeter resolution near the ground. The system requires no overlap correction needed in monostatic lidar systems and needs a much smaller dynamic range. Nighttime measurements of both molecular and aerosol scattering were made at Mauna Loa Observatory. The CLidar aerosol total scatter compares very well with a nephelometer measuring at 10 m above the ground. The results build on earlier work that compared purely molecular scattered light to theory, and detail instrument improvements.
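The geometric altitude retrieval described above is a one-line relation: with a vertical beam and a camera a horizontal distance D away, scattering imaged at elevation angle E comes from altitude D·tan(E). A minimal sketch with an assumed 50 m camera-laser baseline:

```python
import math

def scatter_altitude(camera_laser_distance_m, elevation_deg):
    """Bistatic (CLidar) geometry: light imaged at elevation angle E was
    scattered from the vertical beam at altitude D * tan(E)."""
    return camera_laser_distance_m * math.tan(math.radians(elevation_deg))

# Pixels near the horizon map to low altitudes, pixels near zenith to
# high ones, which is why a single wide-angle image profiles the column.
for e in (10, 45, 80, 89):
    print(f"elevation {e:2d} deg -> {scatter_altitude(50, e):9.1f} m")
```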
NASA Astrophysics Data System (ADS)
Manessa, Masita Dwi Mandini; Kanno, Ariyo; Sagawa, Tatsuyuki; Sekine, Masahiko; Nurdin, Nurjannah
2018-01-01
Lyzenga's multispectral bathymetry formula has attracted considerable interest due to its simplicity. However, there has been little discussion of the effect that variation in optical conditions and bottom types, which commonly appears in coral reef environments, has on this formula's results. The present paper evaluates Lyzenga's multispectral bathymetry formula for a variety of optical conditions and bottom types. A noiseless dataset of above-water remote sensing reflectance from WorldView-2 images over Case-1 shallow coral reef water is simulated using a radiative transfer model. The simulation-based assessment shows that Lyzenga's formula performs robustly, with adequate generality and good accuracy, under a range of conditions. As expected, the influence of bottom type on depth estimation accuracy is far greater than the influence of other optical parameters, i.e., chlorophyll-a concentration and solar zenith angle. Further, based on the simulation dataset, Lyzenga's formula estimates depth when the bottom type is unknown almost as accurately as when the bottom type is known. This study provides a better understanding of Lyzenga's multispectral bathymetry formula under various optical conditions and bottom types.
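Lyzenga's linear formula estimates depth from log-transformed, deep-water-corrected reflectances; a minimal sketch follows, in which the band values and regression coefficients are illustrative (in practice the coefficients are fitted against known depths).

```python
import numpy as np

def lyzenga_depth(rrs, rrs_deep, coeffs):
    """Lyzenga-style multispectral depth estimate:
    z = h0 + sum_i h_i * ln(Rrs_i - Rrs_deep_i),
    where `coeffs` = (h0, h1, ..., hn) comes from a regression against
    known depths; the values used below are illustrative only."""
    X = np.log(np.maximum(rrs - rrs_deep, 1e-6))
    return coeffs[0] + X @ np.asarray(coeffs[1:])

# Toy two-band example (e.g. blue and green Rrs over shallow reef),
# two pixels; the darker pixel yields the greater depth.
rrs = np.array([[0.012, 0.018], [0.009, 0.011]])
rrs_deep = np.array([0.004, 0.003])
print(lyzenga_depth(rrs, rrs_deep, coeffs=(0.0, -2.0, -4.0)))
```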
Preliminary results on photometric properties of materials at the Sagan Memorial Station, Mars
Johnson, J. R.; Kirk, R.; Soderblom, L.A.; Gaddis, L.; Reid, R.J.; Britt, D.T.; Smith, P.; Lemmon, M.; Thomas, N.; Bell, J.F.; Bridges, N.T.; Anderson, R.; Herkenhoff, K. E.; Maki, J.; Murchie, S.; Dummel, A.; Jaumann, R.; Trauthan, F.; Arnold, G.
1999-01-01
Reflectance measurements of selected rocks and soils over a wide range of illumination geometries obtained by the Imager for Mars Pathfinder (IMP) camera provide constraints on interpretations of the physical and mineralogical nature of geologic materials at the landing site. The data sets consist of (1) three small "photometric spot" subframed scenes, covering phase angles from 20° to 150°; (2) two image strips composed of three subframed images each, located along the antisunrise and antisunset lines (photometric equator), covering phase angles from ~0° to 155°; and (3) full-image scenes of the rock "Yogi," covering phase angles from 48° to 100°. Phase functions extracted from calibrated data exhibit a dominantly backscattering photometric function, consistent with the results from the Viking lander cameras. However, forward scattering behavior does appear at phase angles >140°, particularly for the darker gray rock surfaces. Preliminary efforts using a Hapke scattering model are useful in comparing surface properties of different rock and soil types but are not well constrained, possibly due to the incomplete phase angle availability, uncertainties related to the photometric function of the calibration targets, and/or the competing effects of diffuse and direct lighting. Preliminary interpretations of the derived Hapke parameters suggest that (1) red rocks can be modeled as a mixture of gray rocks with a coating of bright and dark soil or dust, and (2) gray rocks have macroscopically smoother surfaces composed of microscopically homogeneous, clear materials with little internal scattering, which may imply a glass-like or varnished surface. Copyright 1999 by the American Geophysical Union.
Fluctuations of Lake Eyre, South Australia
NASA Technical Reports Server (NTRS)
2002-01-01
Lake Eyre is a large salt lake situated between two deserts in one of Australia's driest regions. However, this low-lying lake attracts run-off from one of the largest inland drainage systems in the world. The drainage basin is very responsive to rainfall variations, and changes dramatically with Australia's inter-annual weather fluctuations. When Lake Eyre fills, as it did in 1989, it is temporarily Australia's largest lake, and becomes dense with birds, frogs and colorful plant life. The Lake responds to extended dry periods (often associated with El Nino events) by drying completely. These four images from the Multi-angle Imaging SpectroRadiometer contrast the lake area at the start of the austral summers of 2000 and 2002. The top two panels portray the region as it appeared on December 9, 2000. Heavy rains in the first part of 2000 caused both the north and south sections of the lake to fill partially, and the northern part of the lake still contained significant standing water by the time these data were acquired. The bottom panels were captured on November 29, 2002. Rainfall during 2002 was significantly below average ( http://www.bom.gov.au/ ), although showers occurring in the week before the image was acquired helped alleviate this condition slightly. The left-hand panels portray the area as it appeared to MISR's vertical-viewing (nadir) camera, and are false-color views comprised of data from the near-infrared, green and blue channels. Here, wet and/or moist surfaces appear blue-green, since water selectively absorbs longer wavelengths such as near-infrared. The right-hand panels are multi-angle composites created with red band data from MISR's 60-degree forward, nadir and 60-degree backward-viewing cameras, displayed as red, green and blue, respectively. In these multi-angle composites, color variations serve as a proxy for changes in angular reflectance, and indicate textural properties of the surface related to roughness and/or moisture content. Data from the two dates were processed identically to preserve relative variations in brightness between them. Wet surfaces or areas with standing water appear green due to the effect of sunglint at the nadir camera view angle. Dry, salt encrusted parts of the lake appear bright white or gray. Purple areas have enhanced forward scattering, possibly as a result of surface moistness. Some variations exhibited by the multi-angle composites are not discernible in the nadir multi-spectral images and vice versa, suggesting that the combination of angular and spectral information is a more powerful diagnostic of surface conditions than either technique by itself. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbits 5194 and 15679. The panels cover an area of 146 kilometers x 122 kilometers, and utilize data from blocks 113 to 114 within World Reference System-2 path 100. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY ...
8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
NASA Technical Reports Server (NTRS)
Holub, R.; Shenk, W. E.
1973-01-01
Four registered channels (0.2 to 4, 6.5 to 7, 10 to 11, and 20 to 23 microns) of the Nimbus 3 Medium Resolution Infrared Radiometer (MRIR) were used to study 24-hr changes in the structure of an extratropical cyclone during a 6-day period in May 1969. Use of a stereographic-horizon map projection insured that the storm was mapped with a single perspective throughout the series and allowed the convenient preparation of 24-hr difference maps of the infrared radiation fields. Single-channel and multispectral analysis techniques were employed to establish the positions and vertical slopes of jetstreams, large cloud systems, and major features of middle and upper tropospheric circulation. Use of these techniques plus the difference maps and continuity of observation allowed the early detection of secondary cyclones developing within the circulation of the primary cyclone. An automated, multispectral cloud-type identification technique was developed, and comparisons that were made with conventional ship reports and with high-resolution visual data from the image dissector camera system showed good agreement.
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
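A minimal sketch of the spectral phasor transform applied to a three-channel (RGB) stack, as discussed above; the toy pixels are illustrative.

```python
import numpy as np

def spectral_phasor(stack):
    """First-harmonic spectral phasor of an image stack with N spectral
    channels (here N=3 for RGB): each pixel maps to a point (G, S) that
    encodes its normalized spectral shape, independent of intensity."""
    n = stack.shape[-1]
    k = np.arange(n)
    total = stack.sum(axis=-1, keepdims=True) + 1e-9
    g = (stack * np.cos(2 * np.pi * k / n)).sum(-1, keepdims=True) / total
    s = (stack * np.sin(2 * np.pi * k / n)).sum(-1, keepdims=True) / total
    return g[..., 0], s[..., 0]

# Toy RGB image: reddish vs bluish pixels land at distinct phasor spots,
# which is what makes the phasor plot useful for sample exploration.
img = np.zeros((2, 2, 3))
img[0] = [0.8, 0.1, 0.1]   # reddish row
img[1] = [0.1, 0.1, 0.8]   # bluish row
G, S = spectral_phasor(img)
print(np.round(G, 2)); print(np.round(S, 2))
```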
The Orbiter camera payload system's large-format camera and attitude reference system
NASA Technical Reports Server (NTRS)
Schardt, B. B.; Mollberg, B. H.
1985-01-01
The Orbiter camera payload system (OCPS) is an integrated photographic system carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a large-format camera (LFC), a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. A secondary and supporting system to the LFC is the attitude reference system (ARS), a dual-lens stellar camera array (SCA) and camera support structure. The SCA is a 70 mm film system that is rigidly mounted to the LFC lens support structure and, through the simultaneous acquisition of two star fields with each earth viewing LFC frame, makes it possible to precisely determine the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high-precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment. The full OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on Oct. 5, 1984, as a major payload aboard the STS-41G mission.
ERIC Educational Resources Information Center
Mackworth, Norman H.; And Others
1972-01-01
The Mackworth wide-angle reflection eye camera was used to record the position of the gaze on a display of 16 white symbols. One of these symbols changed to red after 30 seconds, remained red for a minute of testing, and then became white again. The subjects were 10 aphasic children (aged 5-9), who were compared with a group of 10 normal children,…
Eastern Space and Missile Center (ESMC) Capability.
1983-09-16
Sites Fig. 4 ETR Tracking Telescopes. A unique feature at the ETR is the ability to compute a ... The Contraves Model 151 includes a TV camera, a wideband ... main objective lens. The Contraves wideband transmitter sends video signals from either the main objective TV or the DAGE wide-angle TV system to the ... Modified main objective plus the time of day to 0.1 second. To use the ESMC precise 2400 b/s acquisition data system, the Contraves computer system ...
Aspects of Voyager photogrammetry
NASA Technical Reports Server (NTRS)
Wu, Sherman S. C.; Schafer, Francis J.; Jordan, Raymond; Howington, Annie-Elpis
1987-01-01
In January 1986, Voyager 2 took a series of pictures of Uranus and its satellites with the Imaging Science System (ISS) on board the spacecraft. Based on six stereo images from the ISS narrow-angle camera, a topographic map was compiled of the Southern Hemisphere of Miranda, one of Uranus' moons. Assuming a spherical figure, a 20-km surface relief is shown on the map. With three additional images from the ISS wide-angle camera, a control network of Miranda's Southern Hemisphere was established by analytical photogrammetry, producing 88 ground points for the control of multiple-model compilation on the AS-11AM analytical stereoplotter. Digital terrain data from the topographic map of Miranda have also been produced. By combining these data and the image data from the Voyager 2 mission, perspective views or even a movie of the mapped area can be made. The application of these newly developed techniques to Voyager 1 imagery, which includes a few overlapping pictures of Io and Ganymede, permits the compilation of contour maps or topographic profiles of these bodies on the analytical stereoplotters.
Multispectral imaging approach for simplified non-invasive in-vivo evaluation of gingival erythema
NASA Astrophysics Data System (ADS)
Eckhard, Timo; Valero, Eva M.; Nieves, Juan L.; Gallegos-Rueda, José M.; Mesa, Francisco
2012-03-01
Erythema is a common visual sign of gingivitis. In this work, a new and simple low-cost image capture and analysis method for erythema assessment is proposed. The method is based on digital still images of gingivae and applied on a pixel-by-pixel basis. Multispectral images are acquired with a conventional digital camera and multiplexed LED illumination panels at 460 nm and 630 nm peak wavelengths. An automatic workflow segments teeth from gingiva regions in the images and creates a map of local blood oxygenation levels, which relates to the presence of erythema. The map is computed from the ratio of the two spectral images. An advantage of the proposed approach is that the whole process is easy to manage by dental health care professionals in a clinical environment.
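The oxygenation-related map described above reduces, per pixel, to a ratio of the two narrow-band images. Below is a minimal sketch under stated assumptions: which band forms the numerator, and that teeth have already been masked out, are choices made here for illustration rather than details taken from the paper.

```python
import numpy as np

def erythema_map(img_460, img_630, gum_mask, eps=1e-6):
    """Pixel-wise ratio of the 630 nm to the 460 nm image over gingiva.

    img_460, img_630 : float images captured under the 460 nm and
                       630 nm LED panels (registered to each other)
    gum_mask         : boolean array, True on gingiva pixels (teeth
                       already segmented out by the workflow)
    The ratio is taken here as a proxy for the oxygenation-related
    signal; the choice of numerator band is an assumption.
    """
    ratio = img_630 / (img_460 + eps)
    return np.where(gum_mask, ratio, np.nan)
```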
1996-01-29
In this image from NASA's Voyager 2 wide-angle camera, taken on Aug. 23, 1989, the two main rings of Neptune can be clearly seen. In the lower part of the frame the originally announced ring arc, consisting of three distinct features, is visible. This feature covers about 35 degrees of longitude and has yet to be radially resolved in Voyager images. From higher resolution images it is known that this region contains much more material than the diffuse belts seen elsewhere in its orbit, which seem to encircle the planet. This is consistent with the fact that ground-based observations of stellar occultations by the rings show them to be very broken and clumpy. The more sensitive wide-angle camera is revealing more widely distributed but fainter material. Each of these rings of material lies just outside the orbit of a newly discovered moon. One of these moons, 1989N2, may be seen in the upper right corner. The moon is streaked by its orbital motion, whereas the stars in the frame are less smeared. The dark areas around the bright moon and star are artifacts of the processing required to bring out the faint rings. This wide-angle image was taken from a range of 2 million kilometers (1.2 million miles), through the clear filter. http://photojournal.jpl.nasa.gov/catalog/PIA00053
Apollo 9 Mission image - S0-65 Multispectral Photography - New Mexico and Texas
1969-03-12
AS09-26A-3807A (12 March 1969) --- Color infrared photograph of the Texas-New Mexico border area, between Lubbock and Roswell, taken on March 12, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey (SO65). At 11:30 a.m. (EST) when this picture was made the Apollo 9 spacecraft was at an altitude of 119 nautical miles, and the sun elevation was 38 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 33 degrees 42 minutes north latitude, and 103 degrees 1 minute west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.
Apollo 9 Mission image - S0-65 Multispectral Photography - Georgia
2009-02-19
AS09-26A-3816A (12 March 1969) --- Color infrared photograph of the Atlantic coast of Georgia, Brunswick area, taken on March 12, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey SO65 Experiment. At 11:35 a.m. (EST) when this picture was made the Apollo 9 spacecraft was at an altitude of 102 nautical miles, and the sun elevation was 51 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 31 degrees 16 minutes north latitude, and 81 degrees 17 minutes west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.
Public-Requested Mars Image: Crater on Pavonis Mons
NASA Technical Reports Server (NTRS)
2003-01-01
MGS MOC Release No. MOC2-481, 12 September 2003
This image is one of the first pair obtained in the Public Target Request program, which accepts suggestions for sites to photograph with the Mars Orbiter Camera on NASA's Mars Global Surveyor spacecraft. It is a narrow-angle (high-resolution) view of a portion of the lower wall and floor of the caldera at the top of a martian volcano named Pavonis Mons. A companion picture is a wide-angle context image, taken at the same time as the high-resolution view. The white box in the context frame shows the location of the high-resolution picture. Pavonis Mons is a broad shield volcano. Its summit region is about 14 kilometers (8.7 miles) above the martian datum (zero-elevation reference level). The caldera is about 4.6 kilometers (2.8 miles) deep. The caldera formed by collapse--long ago--as molten rock withdrew to greater depths within the volcano. The high-resolution picture shows that today the floor and walls of this caldera are covered by a thick, textured mantle of dust, perhaps more than 1 meter (1 yard) deep. Larger boulders and rock outcroppings poke out from within this dust mantle. They are seen as small, dark dots and mounds on the lower slopes of the wall in the high-resolution image. The narrow-angle Mars Orbiter Camera image has a resolution of 1.5 meters (about 5 feet) per pixel and covers an area 1.5 kilometers (0.9 mile) wide by 9 kilometers (5.6 miles) long. The context image, covering much of the summit region of Pavonis Mons, is about 115 kilometers (72 miles) wide. Sunlight illuminates both images from the lower left; north is toward the upper right; east is to the right. The high-resolution view is located near 0.4 degrees north latitude, 112.8 degrees west longitude.
NASA Technical Reports Server (NTRS)
1998-01-01
Positive Systems has worked in conjunction with Stennis Space Center to design the ADAR System 5500. This is a four-band airborne digital imaging system used to capture multispectral imagery similar to that available from satellite platforms such as Landsat, SPOT and the new generation of high resolution satellites. Positive Systems has provided remote sensing services for the development of digital aerial camera systems and software for commercial aerial imaging applications.
Binocular Multispectral Adaptive Imaging System (BMAIS)
2010-07-26
system for pilots that adaptively integrates shortwave infrared (SWIR), visible, near-IR (NIR), off-head thermal, and computer symbology/imagery into ... respective areas. BMAIS is a binocular helmet-mounted imaging system that features dual shortwave infrared (SWIR) cameras, embedded image processors and ... algorithms and fusion of other sensor sites such as forward-looking infrared (FLIR) and other aircraft subsystems. BMAIS is attached to the helmet
Performance evaluation of a quasi-microscope for planetary landers
NASA Technical Reports Server (NTRS)
Burcher, E. E.; Huck, F. O.; Wall, S. D.; Woehrle, S. B.
1977-01-01
Spatial resolutions achieved with cameras on lunar and planetary landers have been limited to about 1 mm, whereas microscopes of the type proposed for such landers could have obtained resolutions of about 1 um but were never accepted because of their complexity and weight. The quasi-microscope evaluated in this paper could provide intermediate resolutions of about 10 um with relatively simple optics that would augment a camera, such as the Viking lander camera, without imposing special design requirements on the camera or limiting its field of view of the terrain. Images of natural particulate samples taken in black and white and in color show that grain size, shape, and texture are made visible for unconsolidated materials in a 50- to 500-um size range. Such information may provide broad outlines of planetary surface mineralogy and allow inferences to be made of grain origin and evolution. The mineralogical descriptions of single grains would be aided by the reflectance spectra that could, for example, be estimated from the six-channel multispectral data of the Viking lander camera.
First NAC Image Obtained in Mercury Orbit
2017-12-08
NASA image acquired: March 29, 2011 This is the first image of Mercury taken from orbit with MESSENGER’s Narrow Angle Camera (NAC). MESSENGER’s camera system, the Mercury Dual Imaging System (MDIS), has two cameras: the Narrow Angle Camera and the Wide Angle Camera (WAC). Comparison of this image with MESSENGER’s first WAC image of the same region shows the substantial difference between the fields of view of the two cameras. At 1.5°, the field of view of the NAC is seven times smaller than the 10.5° field of view of the WAC. This image was taken using MDIS’s pivot. MDIS is mounted on a pivoting platform and is the only instrument in MESSENGER’s payload capable of movement independent of the spacecraft. The other instruments are fixed in place, and most point down the spacecraft’s boresight at all times, relying solely on the guidance and control system for pointing. The 90° range of motion of the pivot gives MDIS a much-needed extra degree of freedom, allowing MDIS to image the planet’s surface at times when spacecraft geometry would normally prevent it from doing so. The pivot also gives MDIS additional imaging opportunities by allowing it to view more of the surface than that at which the boresight-aligned instruments are pointed at any given time. On March 17, 2011 (March 18, 2011, UTC), MESSENGER became the first spacecraft ever to orbit the planet Mercury. The mission is currently in the commissioning phase, during which spacecraft and instrument performance are verified through a series of specially designed checkout activities. In the course of the one-year primary mission, the spacecraft's seven scientific instruments and radio science investigation will unravel the history and evolution of the Solar System's innermost planet. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington
NASA Astrophysics Data System (ADS)
Melis, Marcello; Miccoli, Matteo; Quarta, Donato
2013-05-01
A couple of years ago we proposed, in this same session, an extension to standard colorimetry (CIE '31) that we called Hypercolorimetry. It was based on an even sampling of the 300-1000 nm wavelength range, with the definition of 7 hypercolor matching functions optimally shaped to minimize metamerism. Since then we have consolidated the approach through a large number of multispectral analyses and specialized the system for non-invasive diagnosis of paintings and frescos. In this paper we describe the whole process, from the multispectral image acquisition to the final 7-band computation, and we show the results on paintings from Masters of the colour. We describe and propose a systematic approach to non-invasive diagnosis that is able to turn a subjective analysis into a repeatable measure, independent of the specific lighting conditions and of the specific acquisition system. Along with Hypercolorimetry and its consolidation in the field of non-invasive diagnosis, we also developed a standard spectral reflectance database of pure pigments and pigments painted with different bindings. As we will see, this database can be compared to the reflectances of the painting to help the diagnostician in identifying the proper matter. We used a Nikon D800FR (Full Range) camera, a 36-megapixel reflex camera modified under a Nikon/Profilocolore joint project to achieve 300-1000 nm sensitivity. The large amount of data allowed us to perform very accurate pixel comparisons based on their spectral reflectance. All the original pigments and their bindings were provided by the Opificio delle Pietre Dure, Firenze, Italy, while the analyzed masterpieces belong to the collection of the Pinacoteca Nazionale of Bologna, Italy.
Galileo multispectral imaging of Earth.
Geissler, P; Thompson, W R; Greenberg, R; Moersch, J; McEwen, A; Sagan, C
1995-08-25
Nearly 6000 multispectral images of Earth were acquired by the Galileo spacecraft during its two flybys. The Galileo images offer a unique perspective on our home planet through the spectral capability made possible by four narrowband near-infrared filters, intended for observations of methane in Jupiter's atmosphere, which are not incorporated in any of the currently operating Earth orbital remote sensing systems. Spectral variations due to mineralogy, vegetative cover, and condensed water are effectively mapped by the visible and near-infrared multispectral imagery, showing a wide variety of biological, meteorological, and geological phenomena. Global tectonic and volcanic processes are clearly illustrated by these images, providing a useful basis for comparative planetary geology. Differences between plant species are detected through the narrowband IR filters on Galileo, allowing regional measurements of variation in the "red edge" of chlorophyll and the depth of the 1-micrometer water band, which is diagnostic of leaf moisture content. Although evidence of life is widespread in the Galileo data set, only a single image (at approximately 2 km/pixel) shows geometrization plausibly attributable to our technical civilization. Water vapor can be uniquely imaged in the Galileo 0.73-micrometer band, permitting spectral discrimination of moist and dry clouds with otherwise similar albedo. Surface snow and ice can be readily distinguished from cloud cover by narrowband imaging within the sensitivity range of Galileo's silicon CCD camera. Ice grain size variations can be mapped using the weak H2O absorption at 1 micrometer, a technique which may find important applications in the exploration of the moons of Jupiter. The Galileo images have the potential to make unique contributions to Earth science in the areas of geological, meteorological and biological remote sensing, due to the inclusion of previously untried narrowband IR filters. The vast scale and near global coverage of the Galileo data set complements the higher-resolution data from Earth orbiting systems and may provide a valuable reference point for future studies of global change.
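The 1-micrometer water-band depth used above is a standard continuum-removed band-depth measurement. The sketch below shows the usual three-point form; the wavelengths and reflectance values are illustrative, not Galileo data.

```python
def band_depth(r_left, r_center, r_right, w_left, w_center, w_right):
    """Depth of an absorption band below a linear continuum.

    r_* : reflectances at the left shoulder, band center, and right
          shoulder of the absorption feature
    w_* : the corresponding wavelengths (e.g. micrometers)
    Returns 1 - R_center / R_continuum, so deeper bands give larger values.
    """
    t = (w_center - w_left) / (w_right - w_left)
    continuum = r_left + t * (r_right - r_left)   # continuum at band center
    return 1.0 - r_center / continuum

# Illustrative 1-um water band from shoulders near 0.89 and 1.10 um
depth = band_depth(0.42, 0.35, 0.44, 0.89, 1.00, 1.10)
```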
Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft
NASA Astrophysics Data System (ADS)
Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.
2015-02-01
Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performances of the cameras is now mandatory to be able to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found a stability of the instrumental performances within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is found to be as expected in the visible range, but lower than expected in the UV and IR range. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.
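The refined photometric calibration described above boils down to comparing the count rate measured for a standard star with the in-band flux predicted by folding the star's spectrum through the instrument response. A minimal sketch of that comparison follows; the names and data layout are illustrative, not the OSIRIS pipeline.

```python
import numpy as np

def calibration_factor(wl, star_flux, response, measured_dn_per_s):
    """Absolute-flux calibration from a standard star, one filter.

    wl        : wavelength grid [nm]
    star_flux : known stellar spectral flux on that grid
    response  : end-to-end system efficiency (optics x filter x detector)
    measured_dn_per_s : background-subtracted count rate in the star's
                        aperture for this filter
    Returns flux per DN/s; dividing image count rates by it converts
    counts to physical flux units.
    """
    predicted = np.trapz(star_flux * response, wl)  # expected in-band flux
    return predicted / measured_dn_per_s
```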
NASA Astrophysics Data System (ADS)
Diner, D. J.; Martonchik, J. V.; Sanghavi, S.; Xu, F.; Garay, M. J.; Bradley, C.; Chipman, R.; McClain, S.
2011-12-01
Passive retrievals of aerosol properties from aircraft or satellite must account for surface reflection at the lower boundary. Future missions such as Aerosol-Cloud-Ecosystem (ACE) will use multiangular, multispectral, and polarimetric imagery for aerosol remote sensing. Interpreting such multidimensional measurements requires representing the aerosols by a set of optical and microphysical parameters and modeling the surface bidirectional reflectance distribution function (BRDF). We are developing a surface model represented by a matrix BRDF that describes both intensity and polarization. The BRDF is the sum of a depolarizing volumetric (diffuse) scattering term represented by the modified Rahman-Pinty-Verstraete (mRPV) function, and a specular reflection term corresponding to a distribution of tilted microfacets, each of which reflects according to the Fresnel laws. In order to limit the number of parameters that need to be retrieved, empirical constraints are placed on the surface reflection model, e.g., that the volumetric component can be written as the product of a function only of wavelength and a function only of illumination and view geometry and that the polarized surface reflectance is spectrally neutral. Validation of these assumptions is required to establish a successful surface reflectance model that can be used as part of the aerosol retrievals. The Ground-based and Airborne Multiangle SpectroPolarimetric Imagers (GroundMSPI and AirMSPI) are pushbroom cameras that use a novel dual-photoelastic modulator (PEM) design to measure the Stokes vector components I, Q, and U, degree of linear polarization (DOLP), and angle of linear polarization (AOLP) with high accuracy. Intensity bands are centered at 355, 380, 445, 555, 660, 865, and 935 nm, and polarization channels are at 470, 660, and 865 nm. GroundMSPI and AirMSPI data collected on clear days are being used to further develop and validate the parametric surface model. For GroundMSPI, time sequences of intensity and polarization imagery are acquired throughout the day, and motion of the Sun through the sky provides variable scattering angle. AirMSPI acquires multiangular imagery from the NASA ER-2 aircraft by pointing the camera at different angles using a motorized gimbal. In this paper, we will present examples of GroundMSPI and AirMSPI imagery and explore how well the parametric surface model is able to represent the measured intensity and polarization data.
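For reference, the volumetric (diffuse) part of the surface model above, the mRPV function, can be written compactly. The sketch below follows one common statement of the model: a (cos)^(k-1) angular envelope, an exponential phase term exp(b·cos g), and a hot-spot factor. Treat it as an illustration of the functional form rather than the exact parameterization used by the MSPI team.

```python
import numpy as np

def mrpv_brdf(theta_i, theta_v, dphi, rho0, k, b):
    """Modified Rahman-Pinty-Verstraete (mRPV) bidirectional
    reflectance factor (a sketch of the common form).

    theta_i, theta_v : illumination and view zenith angles [rad]
    dphi             : relative azimuth [rad]
    rho0, k, b       : amplitude, bowl/bell exponent, asymmetry
    """
    mu_i, mu_v = np.cos(theta_i), np.cos(theta_v)
    # Phase angle g between incident and reflected directions
    cos_g = mu_i * mu_v + np.sin(theta_i) * np.sin(theta_v) * np.cos(dphi)
    # Angular envelope: [mu_i * mu_v * (mu_i + mu_v)]^(k-1)
    envelope = (mu_i * mu_v * (mu_i + mu_v)) ** (k - 1)
    # Hot-spot (opposition) factor
    big_g = np.sqrt(np.tan(theta_i) ** 2 + np.tan(theta_v) ** 2
                    - 2 * np.tan(theta_i) * np.tan(theta_v) * np.cos(dphi))
    hotspot = 1 + (1 - rho0) / (1 + big_g)
    return rho0 * envelope * np.exp(b * cos_g) * hotspot
```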
NASA Technical Reports Server (NTRS)
2003-01-01
MGS MOC Release No. MOC2-387, 10 June 2003
This is a Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide angle view of the Charitum Montes, south of Argyre Planitia, in early June 2003. The seasonal south polar frost cap, composed of carbon dioxide, has been retreating southward through this area since spring began a month ago. The bright features toward the bottom of this picture are surfaces covered by frost. The picture is located near 57°S, 43°W. North is at the top, south is at the bottom. Sunlight illuminates the scene from the upper left. The area shown is about 217 km (135 miles) wide.
MISR Multi-angle Views of Sunday Morning Fires
NASA Technical Reports Server (NTRS)
2007-01-01
Hot, dry Santa Ana winds began blowing through the Los Angeles and San Diego areas on Sunday October 21, 2007. Wind speeds ranging from 30 to 50 mph were measured in the area, with extremely low relative humidities. These winds, coupled with exceptionally dry conditions due to lack of rainfall, resulted in a number of fires in the Los Angeles and San Diego areas, causing the evacuation of more than 250,000 people. These two images show the Southern California coast from Los Angeles to San Diego from two of the nine cameras on the Multi-angle Imaging SpectroRadiometer (MISR) instrument on the NASA EOS Terra satellite. These images were obtained around 11:35 a.m. PDT on Sunday morning, October 21, 2007 and show a number of plumes extending out over the Pacific ocean. In addition, locations identified as potential hot spots from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on the same satellite are outlined in red. The left image is from MISR's nadir-looking camera and the plumes appear very faint. The image on the right is from MISR's 60° forward-looking camera, which accentuates the amount of light scattered by aerosols in the atmosphere, including smoke and dust. Both these images are false color and contain information from MISR's red, green, blue and near-infrared wavelengths, which makes vegetated land appear greener than it would naturally. Notice in the right hand image that the color of the plumes associated with the MODIS hot spots is bluish, while plumes not associated with hot spots appear more yellow. This is because the latter plumes are composed of dust kicked up by the strong Santa Ana winds. In some locations along Interstate 5 on this date, visibility was severely reduced due to blowing dust. MISR's multiangle and multispectral capability gives it the ability to distinguish smoke from dust in this situation. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These images were generated from a portion of the imagery acquired during Terra orbit 41713, and use data from blocks 63 to 66 within World Reference System-2 path 40. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center. JPL is a division of the California Institute of Technology.
Baffling system for the Wide Angle Camera (WAC) of ROSETTA mission
NASA Astrophysics Data System (ADS)
Brunello, Pierfrancesco; Peron, Fabio; Barbieri, Cesare; Fornasier, Sonia
2000-10-01
After the experience of the GIOTTO flyby of comet Halley in 1986, the European Space Agency planned to improve the scientific knowledge of these astronomical objects by means of an even more ambitious rendezvous mission with another comet (P/Wirtanen). This mission, named ROSETTA, will run from 2003 to 2013, ending after the comet perihelion phase and including fly-bys of two asteroids of the main belt (140 Siwa and 4979 Otawara). The scientific priority of the mission is the in situ investigation of the cometary nucleus, with the aim of better understanding the formation and composition of planetesimals and their evolution over the last 4.5 billion years. In this context, the authors were involved in the design of the baffling for the Wide Angle Camera (WAC) of the imaging system (OSIRIS) carried on board the spacecraft. Scientific requirements for the WAC are: a large field of view (FOV) of 12° × 12° with a resolution of 100 μrad per pixel, UV response, and a contrast ratio of 10⁻⁴ in order to detect gaseous and dusty features close to the nucleus of the comet. To achieve these performances, a fairly novel class of optical solutions employing off-axis sections of concentric mirrors was explored. Regarding baffling, the peculiar demand was the rejection of stray light generated by the optics for sources within the FOV, since the optical entrance aperture is located at the level of the secondary mirror (instead of the primary, as usual). This paper describes the baffle design and analyzes its performance, calculated by numerical simulation with ray-tracing methods, at different angles of incidence of the light, for sources both outside and inside the field of view.
NASA Astrophysics Data System (ADS)
Gicquel, A.; Vincent, J.-B.; Agarwal, J.; A'Hearn, M. F.; Bertini, I.; Bodewits, D.; Sierks, H.; Lin, Z.-Y.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; Deller, J.; De Cecco, M.; Frattin, E.; El-Maarry, M. R.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Gutiérrez-Marquez, P.; Güttler, C.; Höfner, S.; Hofmann, M.; Hu, X.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Moreno, J. J. Lopez; Lowry, S.; Marzari, F.; Masoumzadeh, N.; Massironi, M.; Moreno, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Pommerol, A.; Preusker, F.; Scholten, F.; Shi, X.; Thomas, N.; Toth, I.; Tubiana, C.
2016-11-01
Beginning in 2014 March, the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras began capturing images of the nucleus and coma (gas and dust) of comet 67P/Churyumov-Gerasimenko using both the wide angle camera (WAC) and the narrow angle camera (NAC). The many observations taken since July of 2014 have been used to study the morphology, location, and temporal variation of the comet's dust jets. We analysed the dust monitoring observations shortly after the southern vernal equinox on 2015 May 30 and 31 with the WAC at the heliocentric distance Rh = 1.53 AU, where it is possible to observe that the jet rotates with the nucleus. We found that the decline of brightness as a function of the distance of the jet is much steeper than the background coma, which is a first indication of sublimation. We adapted a model of sublimation of icy aggregates and studied the effect as a function of the physical properties of the aggregates (composition and size). The major finding of this paper was that through the sublimation of the aggregates of dirty grains (radius a between 5 and 50 μm) we were able to completely reproduce the radial brightness profile of a jet beyond 4 km from the nucleus. To reproduce the data, we needed to inject a number of aggregates between 8.5 × 10¹³ and 8.5 × 10¹⁰ for a = 5 and 50 μm, respectively, or an initial mass of H2O ice around 22 kg.
Multispectral computational ghost imaging with multiplexed illumination
NASA Astrophysics Data System (ADS)
Huang, Jian; Shi, Dongfeng
2017-07-01
Computational ghost imaging has attracted wide attention from researchers in many fields over the last two decades. Multispectral imaging as one application of computational ghost imaging possesses spatial and spectral resolving abilities, and is very useful for surveying scenes and extracting detailed information. Existing multispectral imagers mostly utilize narrow band filters or dispersive optical devices to separate light of different wavelengths, and then use multiple bucket detectors or an array detector to record them separately. Here, we propose a novel multispectral ghost imaging method that uses one single bucket detector with multiplexed illumination to produce a colored image. The multiplexed illumination patterns are produced by three binary encoded matrices (corresponding to the red, green and blue colored information, respectively) and random patterns. The results of the simulation and experiment have verified that our method can be effective in recovering the colored object. Multispectral images are produced simultaneously by one single-pixel detector, which significantly reduces the amount of data acquisition.
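The essence of the multiplexed scheme above is that each color channel gets its own statistically independent pattern set, the single bucket detector records the summed signal, and correlating that signal with one channel's patterns recovers that channel alone. Below is a toy end-to-end simulation under those assumptions; all sizes, names, and the synthetic scene are illustrative, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 16
n_patterns = 4000

# Hypothetical scene: one reflectance map per color channel
scene = rng.random((3, H, W))

# Random binary illumination patterns, one independent set per channel;
# the single bucket detector sums light from all channels at once.
patterns = rng.integers(0, 2, size=(3, n_patterns, H, W)).astype(float)
bucket = np.einsum('cnhw,chw->n', patterns, scene)

# Correlation (mean-subtracted) reconstruction, per channel:
# <(B - <B>) * P_c> recovers channel c because the three pattern sets
# are statistically independent of each other.
recon = np.empty_like(scene)
for c in range(3):
    p = patterns[c]
    recon[c] = np.tensordot(bucket - bucket.mean(),
                            p - p.mean(axis=0), axes=(0, 0)) / n_patterns
```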
A telephoto camera system with shooting direction control by gaze detection
NASA Astrophysics Data System (ADS)
Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro
2015-05-01
For safe driving, it is important for the driver to check traffic conditions, such as traffic lights or traffic signs, as early as possible. If an on-vehicle camera captures images of distant objects relevant to traffic conditions and shows them to the driver, the driver can assess those conditions earlier. To image distant objects clearly, the camera's focal length must be long; however, with a long focal length the camera's field of view is too narrow to check traffic conditions. Therefore, to obtain the necessary images from long distance, the camera must combine a long focal length with a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera captured and displayed the telescopic image. However, that study used a touch panel to indicate the shooting direction, which can distract from driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, so as not to disturb driving. The proposed system is composed of a gaze detector and an active telephoto camera with controlled shooting direction. We adopt a non-wearable detection method to avoid hindering driving; the gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. Experiments confirmed that the proposed system captures images in the direction of the subject's gaze.
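The core geometric step in such a system is mapping a gaze point on the wide-angle image to pan/tilt commands for the galvanometer scanners. Here is a minimal sketch assuming a simple pinhole model and co-located cameras; real systems need a parallax calibration, and all names are illustrative.

```python
import math

def gaze_to_mirror_angles(u, v, img_w, img_h, hfov_deg, vfov_deg):
    """Convert a gaze point (u, v) on the wide-angle image to pan/tilt
    angles [deg] for the scanners steering the telephoto camera.

    Assumes a pinhole model for the wide camera and a shared optical
    center between the two cameras (an idealization).
    """
    # Offsets from image center, normalized to [-0.5, 0.5]
    x = (u - img_w / 2) / img_w
    y = (v - img_h / 2) / img_h
    # Pinhole geometry: angle = atan(2 * offset * tan(fov / 2))
    pan = math.degrees(math.atan(2 * x * math.tan(math.radians(hfov_deg / 2))))
    tilt = math.degrees(math.atan(2 * y * math.tan(math.radians(vfov_deg / 2))))
    return pan, tilt
```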
Field trials for determining the visible and infrared transmittance of screening smoke
NASA Astrophysics Data System (ADS)
Sánchez Oliveros, Carmen; Santa-María Sánchez, Guillermo; Rosique Pérez, Carlos
2009-09-01
In order to evaluate the concealment capability of smoke, the Countermeasures Laboratory of the Institute of Technology "Marañosa" (ITM) has carried out a set of tests measuring the transmittance of multispectral smoke tins in several bands of the electromagnetic spectrum. The smoke composition, based on red phosphorous, was developed and patented by this laboratory as part of a projectile development. The smoke transmittance was measured by means of thermography as well as spectroradiometry. Black bodies and halogen lamps were used as infrared and visible sources of radiation. The measurements were carried out in June 2008 at the Marañosa field (Spain) with two MWIR cameras, two LWIR cameras, one CCD visible camera, one CVF IR spectroradiometer covering the interval 1.5 to 14 microns, and one silicon-array spectroradiometer for the 0.2 to 1.1 μm range. The transmittance and dimensions of the smoke screen were characterized in the visible band and in the MWIR (3-5 μm) and LWIR (8-12 μm) regions. The screen was about 30 meters wide and 5 meters high. The transmittances were about 0.3 in the IR bands and better than 0.1 in the visible one. The screens proved effective over the time of persistence in all of the tests. The results obtained from the imaging and non-imaging systems were in good agreement. Meteorological conditions during the tests, such as wind speed, are decisive for the use of this kind of optical countermeasure.
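Band-averaged transmittance in each spectral region reduces to a ratio of source signals measured with and without the screen, after background subtraction. A minimal sketch of that arithmetic follows; real IR processing must also account for path radiance and emission from the smoke itself, which this toy formula ignores.

```python
def transmittance(signal_with_smoke, signal_clear, background):
    """Band-averaged screen transmittance from source signals measured
    through the smoke and in clear air, each minus the ambient
    background level recorded next to the source."""
    return (signal_with_smoke - background) / (signal_clear - background)
```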
Common aperture multispectral spotter camera: Spectro XR
NASA Astrophysics Data System (ADS)
Petrushevsky, Vladimir; Freiman, Dov; Diamant, Idan; Giladi, Shira; Leibovich, Maor
2017-10-01
The Spectro XRTM is an advanced color/NIR/SWIR/MWIR 16'' payload recently developed by Elbit Systems / ELOP. The payload's primary sensor is a spotter camera with a common 7'' aperture. The sensor suite also includes an MWIR zoom, EO zoom, laser designator or rangefinder, laser pointer / illuminator, and laser spot tracker. A rigid structure, vibration damping, and 4-axis gimbals enable a high level of line-of-sight stabilization. The payload's feature list includes a multi-target video tracker, precise boresight, strap-on IMU, embedded moving map, geodetic calculations suite, and image fusion. The paper describes the main technical characteristics of the spotter camera. A visible-quality, all-metal front catadioptric telescope maintains optical performance over a wide range of environmental conditions. High-efficiency coatings separate the incoming light into EO, SWIR, and MWIR band channels. Both EO and SWIR bands have dual FOV and 3 spectral filters each. Several variants of focal plane array formats are supported. The common-aperture design facilitates superior DRI performance in EO and SWIR compared with conventionally configured payloads. Special spectral calibration and color correction extend the effective range of color imaging. An advanced CMOS FPA and the low F-number of the optics facilitate low-light performance. The SWIR band provides further atmospheric penetration, as well as see-spot capability at especially long ranges, owing to asynchronous pulse detection. The MWIR band has good sharpness over the entire field of view and (with a full-HD FPA) delivers detail far exceeding that of VGA-equipped FLIRs. The Spectro XR offers a level of performance typically associated with larger and heavier payloads.
High Angular Resolution Measurements of the Anisotropy of Reflectance of Sea Ice and Snow
NASA Astrophysics Data System (ADS)
Goyens, C.; Marty, S.; Leymarie, E.; Antoine, D.; Babin, M.; Bélanger, S.
2018-01-01
We introduce a new method to determine the anisotropy of reflectance of sea ice and snow at spatial scales from 1 m2 to 80 m2 using a multispectral circular fish-eye radiance camera (CE600). The CE600 allows measuring radiance simultaneously in all directions of a hemisphere at a 1° angular resolution. The spectral characteristics of the reflectance and its dependency on illumination conditions obtained from the camera are compared to those obtained with a hyperspectral field spectroradiometer manufactured by Analytical Spectral Device, Inc. (ASD). Results confirm the potential of the CE600, with the suggested measurement setup and data processing, to measure commensurable sea ice and snow hemispherical-directional reflectance factor, HDRF, values. Compared to the ASD, the reflectance anisotropy measured with the CE600 provides much higher resolution in terms of directional reflectance (N = 16,020). The hyperangular resolution allows detecting features that were overlooked using the ASD due to its limited number of measurement angles (N = 25). This data set of HDRF further documents variations in the anisotropy of the reflectance of snow and ice with the geometry of observation and illumination conditions and its spectral and spatial scale dependency. Finally, in order to reproduce the hyperangular CE600 reflectance measurements over the entire 400-900 nm spectral range, a regression-based method is proposed to combine the ASD and CE600 measurements. Results confirm that both instruments may be used in synergy to construct a hyperangular and hyperspectral snow and ice reflectance anisotropy data set.
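Per-pixel HDRF values from a radiance camera are commonly obtained by ratioing the target radiance in each viewing direction against a calibrated near-Lambertian reference panel under the same illumination. Here is a minimal sketch of that computation with an assumed panel reflectance; the paper's actual processing chain is more involved.

```python
def hdrf(radiance_img, panel_radiance, panel_reflectance=0.99):
    """Hemispherical-directional reflectance factor per camera pixel.

    radiance_img     : (H, W) radiance from the fish-eye camera; each
                       pixel corresponds to one viewing direction
    panel_radiance   : radiance of a calibrated near-Lambertian panel
                       viewed under the same illumination
    panel_reflectance: panel calibration factor (assumed value here)
    """
    return radiance_img / panel_radiance * panel_reflectance
```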
NASA Astrophysics Data System (ADS)
Moissl, Richard; Kueppers, Michael
2016-10-01
In this paper we present the results of an analysis of a large part of the existing image data from the OSIRIS camera system onboard the Rosetta spacecraft, in which stars of sufficient brightness (down to a limiting magnitude of 6) have been observed through the coma of comet 67/P Churyumov-Gerasimenko ("C-G"). Over the course of the Rosetta main mission the coma of the comet underwent large changes in density and structure, owing to the changing insolation along the orbit of C-G. We report on the changes of the stellar signals in the wavelength ranges covered by the filters of the OSIRIS Narrow-Angle (NAC) and Wide-Angle (WAC) cameras. Acknowledgements: OSIRIS was built by a consortium led by the Max-Planck-Institut für Sonnensystemforschung, Göttingen, Germany, in collaboration with CISAS, University of Padova, Italy, the Laboratoire d'Astrophysique de Marseille, France, the Instituto de Astrofísica de Andalucia, CSIC, Granada, Spain, the Scientific Support Office of the European Space Agency, Noordwijk, The Netherlands, the Instituto Nacional de Técnica Aeroespacial, Madrid, Spain, the Universidad Politéchnica de Madrid, Spain, the Department of Physics and Astronomy of Uppsala University, Sweden, and the Institut für Datentechnik und Kommunikationsnetze der Technischen Universität Braunschweig, Germany.
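A natural way to quantify such changes in stellar signal is the line-of-sight extinction optical depth per filter, tau = -ln(F_through_coma / F_unobscured). A minimal sketch of that relation follows; it is illustrative, not the authors' pipeline.

```python
import numpy as np

def coma_optical_depth(flux_through_coma, flux_unobscured):
    """Line-of-sight optical depth of the coma from stellar dimming,
    evaluated per filter: tau = -ln(F_obscured / F_clear).
    Positive tau indicates net extinction at that wavelength."""
    return -np.log(flux_through_coma / flux_unobscured)
```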
Southern Florida's River of Grass
NASA Technical Reports Server (NTRS)
2002-01-01
Florida's Everglades is a region of broad, slow-moving sheets of water flowing southward over low-lying areas from Lake Okeechobee to the Gulf of Mexico. In places this remarkable 'river of grass' is 80 kilometers wide. These images from the Multi-angle Imaging SpectroRadiometer show the Everglades region on January 16, 2002. Each image covers an area measuring 191 kilometers x 205 kilometers. The data were captured during Terra orbit 11072. On the left is a natural color view acquired by MISR's nadir camera. A portion of Lake Okeechobee is visible at the top, to the right of image center. South of the lake, whose name derives from the Seminole word for 'big water,' an extensive region of farmland known as the Everglades Agricultural Area is recognizable by its many clustered squares. Over half of the sugar produced in the United States is grown here. Urban areas along the east coast and in the northern part of the image extend to the boundaries of Big Cypress Swamp, situated north of Everglades National Park. The image on the right combines red-band data from the 46-degree backward, nadir and 46-degree forward-viewing camera angles to create a red, green, blue false-color composite. One of the interesting uses of the composite image is for detecting surface water. Wet surfaces appear blue in this rendition because sun glitter produces a greater signal at the forward camera's view angle. Wetlands visible in these images include a series of shallow impoundments called Water Conservation Areas which were built to speed water flow through the Everglades in times of drought. In parts of the Everglades, these levees and extensive systems such as the Miami and Tamiami Canals have altered the natural cycles of water flow. For example, the water volume of the Shark River Slough, a natural wetland which feeds Everglades National Park, is influenced by the Tamiami Canal. The unique and intrinsic value of the Everglades is now widely recognized, and efforts to restore the natural water cycles are underway.
MISR Images Forest Fires and Hurricane
NASA Technical Reports Server (NTRS)
2000-01-01
These images show forest fires raging in Montana and Hurricane Hector swirling in the Pacific. These two unrelated, large-scale examples of nature's fury were captured by the Multi-angle Imaging SpectroRadiometer (MISR) during a single orbit of NASA's Terra satellite on August 14, 2000.
In the left image, huge smoke plumes rise from devastating wildfires in the Bitterroot Mountain Range near the Montana-Idaho border. Flathead Lake is near the upper left, and the Great Salt Lake is at the bottom right. Smoke accumulating in the canyons and plains is also visible. This image was generated from the MISR camera that looks forward at a steep angle (60 degrees); the instrument has nine different cameras viewing Earth at different angles. The smoke is far more visible when seen at this highly oblique angle than it would be in a conventional, straight-downward (nadir) view. The wide extent of the smoke is evident from comparison with the image on the right, a view of Hurricane Hector acquired from MISR's nadir-viewing camera. Both images show an area of approximately 400 kilometers (250 miles) in width and about 850 kilometers (530 miles) in length. When this image of Hector was taken, the eastern Pacific tropical cyclone was located approximately 1,100 kilometers (680 miles) west of the southern tip of Baja California, Mexico. The eye is faintly visible and measures 25 kilometers (16 miles) in diameter. The storm was beginning to weaken, and 24 hours later the National Weather Service downgraded Hector from a hurricane to a tropical storm. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology. For more information: http://www-misr.jpl.nasa.gov
Enabling Technologies for High-accuracy Multiangle Spectropolarimetric Imaging from Space
NASA Technical Reports Server (NTRS)
Diner, David J.; Macenka, Steven A.; Seshadri, Suresh; Bruce, Carl E.; Jau, Bruno; Chipman, Russell A.; Cairns, Brian; Keller, Christoph; Foo, Leslie D.
2004-01-01
Satellite remote sensing plays a major role in measuring the optical and radiative properties, environmental impact, and spatial and temporal distribution of tropospheric aerosols. In this paper, we envision a new generation of spaceborne imager that integrates the unique strengths of multispectral, multiangle, and polarimetric approaches, thereby achieving better accuracies in aerosol optical depth and particle properties than can be achieved using any one method by itself. Design goals include spectral coverage from the near-UV to the shortwave infrared; global coverage within a few days; intensity and polarimetric imaging simultaneously at multiple view angles; kilometer to sub-kilometer spatial resolution; and measurement of the degree of linear polarization for a subset of the spectral complement with an uncertainty of 0.5% or less. The latter requirement is technically the most challenging. In particular, an approach for dealing with inter-detector gain variations is essential to avoid false polarization signals. We propose using rapid modulation of the input polarization state to overcome this problem, using a high-speed variable retarder in the camera design. Technologies for rapid retardance modulation include mechanically rotating retarders, liquid crystals, and photoelastic modulators (PEMs). We conclude that the latter are the most suitable.
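For reference, the polarimetric quantities involved follow directly from the measured Stokes components: DOLP = sqrt(Q² + U²)/I and AOLP = ½·atan2(U, Q). A minimal sketch:

```python
import numpy as np

def linear_polarization(I, Q, U):
    """Degree and angle of linear polarization from Stokes components.
    DOLP = sqrt(Q^2 + U^2) / I ; AOLP = 0.5 * atan2(U, Q) [rad]."""
    dolp = np.sqrt(Q**2 + U**2) / I
    aolp = 0.5 * np.arctan2(U, Q)
    return dolp, aolp
```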
Multispectral system analysis through modeling and simulation
NASA Technical Reports Server (NTRS)
Malila, W. A.; Gleason, J. M.; Cicone, R. C.
1977-01-01
The design and development of multispectral remote sensor systems and associated information extraction techniques should be optimized under the physical and economic constraints encountered and yet be effective over a wide range of scene and environmental conditions. Direct measurement of the full range of conditions to be encountered can be difficult, time consuming, and costly. Simulation of multispectral data by modeling scene, atmosphere, sensor, and data classifier characteristics is set forth as a viable alternative, particularly when coupled with limited sets of empirical measurements. A multispectral system modeling capability is described. Use of the model is illustrated for several applications - interpretation of remotely sensed data from agricultural and forest scenes, evaluating atmospheric effects in Landsat data, examining system design and operational configuration, and development of information extraction techniques.
Component pattern analysis of chemicals using multispectral THz imaging system
NASA Astrophysics Data System (ADS)
Kawase, Kodo; Ogawa, Yuichi; Watanabe, Yuki
2004-04-01
We have developed a novel basic technology for terahertz (THz) imaging that allows detection and identification of chemicals by introducing component spatial pattern analysis. The spatial distributions of the chemicals were obtained from terahertz multispectral transillumination images, using absorption spectra previously measured with a widely tunable THz-wave parametric oscillator. Further, we applied this technique to the detection and identification of illicit drugs concealed in envelopes. The samples we used were methamphetamine and MDMA, two of the most widely consumed illegal drugs in Japan, and aspirin as a reference.
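Component spatial pattern analysis of this kind can be cast, per pixel, as non-negative unmixing against the previously measured absorption library. Here is a minimal sketch using non-negative least squares; the variable names and data layout are illustrative, not the authors' code.

```python
import numpy as np
from scipy.optimize import nnls

def component_pattern(spectra, pixel_absorbance):
    """Solve for non-negative abundances x in A x ~= b.

    spectra          : (n_bands, n_chemicals) absorption library
                       measured with the tunable THz source
    pixel_absorbance : (n_bands,) multispectral absorbance of one pixel
    Returns per-chemical abundances; mapping x over all pixels yields
    the component spatial patterns.
    """
    x, _residual = nnls(spectra, pixel_absorbance)
    return x
```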
2015-04-29
This image from NASA's MESSENGER spacecraft covers a small area located about 115 km south of the center of Mansart crater. The smallest craters visible in the image are about the size of the 16-meter (52-foot) crater that will be made by the impact of the MESSENGER spacecraft. The impact will take place tomorrow, April 30, 2015. Just left of center is a crater that is about 80 meters in diameter. The bright area on its right wall may be an outcrop of hollows material. Date acquired: April 28, 2015 Image Mission Elapsed Time (MET): 72505530 Image ID: 8408666 Instrument: Narrow Angle Camera (NAC) of the Mercury Dual Imaging System (MDIS) Center Latitude: 69.8° N Center Longitude: 303.7° E Resolution: 2.0 meters/pixel Scale: The scene is about 1 km (0.6 miles) wide. This image has not been map projected. Incidence Angle: 79.0° Emission Angle: 11.0° Phase Angle: 90.0° http://photojournal.jpl.nasa.gov/catalog/PIA19442
Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System
Lu, Yu; Wang, Keyi; Fan, Gongshu
2016-01-01
A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensors during radiometric response calibration, to eliminate the focusing effect of uniform light from an integrating sphere. The linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and used to blend images, so that panoramas reflect the objective luminance; this overcomes the limitation of stitching approaches that achieve realism only through smoothing. The dynamic range limitation of a single image sensor with a wide-angle lens can be resolved by using multiple cameras that together cover a large field of view; the dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
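In practice, calibration results of this kind are applied per frame before blending: dark subtraction, optional linearization through the measured response curve, and flat-field division to remove vignetting. A minimal sketch under those assumptions (not the authors' exact pipeline; all names are illustrative):

```python
import numpy as np

def correct_frame(raw, dark, flat):
    """Dark-subtract and de-vignette one frame before blending.

    raw  : captured frame (float DN)
    dark : dark frame at the same exposure and temperature
    flat : dark-subtracted flat-field frame (e.g. from an integrating
           sphere), encoding vignetting and pixel-to-pixel sensitivity
    """
    gain = flat.mean() / np.maximum(flat, 1e-6)   # normalize to mean level
    return (raw - dark) * gain                    # flatten vignetting falloff
```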
Global Controlled Mosaic of Mercury from MESSENGER Orbital Images
NASA Astrophysics Data System (ADS)
Becker, K. J.; Weller, L. A.; Edmundson, K. L.; Becker, T. L.; Robinson, M. S.; Solomon, S. C.
2011-12-01
The MESSENGER spacecraft entered orbit around Mercury in March 2011. Since then, the Mercury Dual Imaging System (MDIS) has been steadily acquiring images from the monochrome, narrow-angle camera (NAC) and the multispectral, wide-angle camera (WAC). With these images, the U.S. Geological Survey (USGS) is constructing a global, controlled monochrome base map of the planet using the Integrated Software for Imagers and Spectrometers (ISIS3) [1]. Although the characterization of the MESSENGER spacecraft's navigation and attitude data has proven to be reliable to date, an element of uncertainty in these parameters is unavoidable. This leads to registration offsets between images in the base map. To minimize these errors, images are controlled using a least-squares bundle adjustment that provides refined spacecraft attitude and position parameters plus triangulated ground coordinates of image tie points. As a first effort, 4542 images (2781 NAC, 1761 WAC G filter) have been controlled with a root mean squared error of 0.25 pixels in image space [2]. A preliminary digital elevation model (DEM) is also being produced from the large number of ground points (~ 47,000) triangulated in this adjustment. The region defined by these points ranges from 80°S to 86°N latitude and 158°E to 358°E longitude. A symmetric, unimodal distribution and a dynamic range of 10.5 km characterize the hypsometry of this area. Minimum, maximum, and mean elevations are -5.0, 5.5, and -0.2 km relative to the mean radius of Mercury (2440 km) as defined by the mission. The USGS will use the DEM and base map for the construction of a registered color (WAC) map of high spatial integrity essential for reliable scientific interpretation of the color data. Ongoing improvements to the base map will be made as new images from MDIS become available, providing continuity in resolution, illumination, and viewing conditions. Additional bundle adjustments will further improve spacecraft attitude. The results from further bundle adjustments will ultimately be provided to users in the form of a new, smithed (derived) CK SPICE [3] kernel (C-matrix subsystem dealing with orientation of spacecraft and rotating structures on the spacecraft), replacing the original reconstructed kernel (typically provided by the mission navigation team). The determination of updated attitude parameters for every image acquired by MDIS is a primary goal of the USGS. [1] Anderson, J. A., et al. (2004) Modernization of the Integrated Software for Imagers and Spectrometers, Lunar Planet. Sci. 35, abstract 2039. [2] Edmundson, K. L., et al. (2011), Preliminary photogrammetric control of MESSENGER orbital images of Mercury, GSA Annual Meeting, submitted. [3] Acton, C. H. (1996), Ancillary data services of NASA's Navigation and Ancillary Information Facility, Planet. Space Sci. 44, 65-70.
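The heart of such a bundle adjustment is the reprojection residual minimized over all images and tie points: each triangulated ground point is projected through a camera's exterior orientation and compared with the measured image coordinates. A minimal pinhole-model sketch follows; it illustrates the idea only, as ISIS3's actual sensor models are considerably more elaborate.

```python
import numpy as np

def reprojection_residual(ground_pt, cam_pos, cam_rot, focal, observed_px):
    """Residual for one tie-point observation in a least-squares
    bundle adjustment.

    ground_pt   : (3,) triangulated point in body-fixed coordinates
    cam_pos     : (3,) spacecraft position at exposure time
    cam_rot     : (3, 3) rotation, body-fixed frame -> camera frame
    focal       : focal length expressed in pixel units
    observed_px : (2,) measured tie-point image coordinates
    """
    p_cam = cam_rot @ (ground_pt - cam_pos)     # point in camera frame
    projected = focal * p_cam[:2] / p_cam[2]    # pinhole projection
    return observed_px - projected              # what the solver minimizes
```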
Selkowitz, David J.; Green, Gordon; Peterson, Birgit E.; Wylie, Bruce
2012-01-01
Spatially explicit representations of vegetation canopy height over large regions are necessary for a wide variety of inventory, monitoring, and modeling activities. Although airborne lidar data has been successfully used to develop vegetation canopy height maps in many regions, for vast, sparsely populated regions such as the boreal forest biome, airborne lidar is not widely available. An alternative approach to canopy height mapping in areas where airborne lidar data is limited is to use spaceborne lidar measurements in combination with multi-angular and multi-spectral remote sensing data to produce comprehensive canopy height maps for the entire region. This study uses spaceborne lidar data from the Geosciences Laser Altimeter System (GLAS) as training data for regression tree models that incorporate multi-angular and multi-spectral data from the Multi-Angle Imaging Spectroradiometer (MISR) and the Moderate Resolution Imaging SpectroRadiometer (MODIS) to map vegetation canopy height across a 1,300,000 km2 swath of boreal forest in Interior Alaska. Results are compared to in situ height measurements as well as airborne lidar data. Although many of the GLAS-derived canopy height estimates are inaccurate, applying a series of filters incorporating both data associated with the GLAS shots as well as ancillary data such as land cover can identify the majority of height estimates with significant errors, resulting in a filtered dataset with much higher accuracy. Results from the regression tree models indicate that late winter MISR imagery acquired under snow-covered conditions is effective for mapping canopy heights ranging from 5 to 15 m, which includes the vast majority of forests in the region. It appears that neither MISR nor MODIS imagery acquired during the growing season is effective for canopy height mapping, although including summer multi-spectral MODIS data along with winter MISR imagery does appear to provide a slight increase in the accuracy of resulting height maps. The finding that winter, snow-covered MISR imagery can be used to map canopy height is important because clear sky days are nearly three times as common during the late winter period as during the growing season. The increased odds of acquiring cloud-free imagery during the target acquisition period make regularly updated forest height inventories for Interior Alaska much more feasible. A major advantage of the GLAS–MISR–MODIS canopy height mapping methodology described here is that this approach uses only data that is freely available worldwide, making the approach potentially applicable across the entire circumpolar boreal forest region.
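Conceptually, the mapping step above trains a regression tree on filtered GLAS heights against per-footprint MISR/MODIS predictors and then applies it wall-to-wall. A minimal sketch with synthetic arrays standing in for the real predictor stack; the feature layout, parameters, and data are all illustrative, not the study's configuration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical training data: filtered GLAS canopy heights as labels,
# predictors from winter MISR angular bands plus summer MODIS bands.
rng = np.random.default_rng(1)
X_train = rng.random((500, 6))            # e.g. MISR cameras + MODIS bands
y_train = 5 + 10 * rng.random(500)        # heights mostly 5-15 m

tree = DecisionTreeRegressor(max_depth=8, min_samples_leaf=20)
tree.fit(X_train, y_train)

# Apply the fitted model to every grid cell of the mapped swath
X_map = rng.random((10000, 6))
height_map = tree.predict(X_map)
```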
2017-06-26
NASA's Cassini spacecraft peers toward a sliver of Saturn's sunlit atmosphere while the icy rings stretch across the foreground as a dark band. This view looks toward the unilluminated side of the rings from about 7 degrees below the ring plane. The image was taken in green light with the Cassini spacecraft wide-angle camera on March 31, 2017. The view was obtained at a distance of approximately 620,000 miles (1 million kilometers) from Saturn. Image scale is 38 miles (61 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21334
1986-01-24
P-29516 BW Range: 125,000 kilometers (78,000 miles) Voyager 2's wide-angle camera captured this view of the outer part of the Uranian ring system just 11 minutes before passing through the ring plane. The resolution in this clear-filter view is slightly better than 9 km (6 mi). The brightest, outermost ring is known as epsilon. Interior to epsilon lie (from top) the newly discovered 10th ring of Uranus--designated 1986UR1 and barely visible here--and then the delta, gamma and eta rings.
NASA Technical Reports Server (NTRS)
2005-01-01
This somewhat oblique blue wide angle Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows the 174 km (108 mi) diameter crater, Terby, and its vicinity in December 2004. Located north of Hellas, this region can be covered with seasonal frost and ground-hugging fog, even in the afternoon, despite being north of 30°S. The subtle, wavy pattern is a manifestation of fog. Location near: 28°S, 286°W Illumination from: upper left Season: Southern Winter
NASA Astrophysics Data System (ADS)
Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest
The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquisition of multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near-permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high-resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses, repeat global coverage will be obtained as lighting conditions change, providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale, providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews, a collection of stereo pairs is being acquired that enables 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc.) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies, including, for example, lunar pyroclastics, domes, and tectonics, are underway [e.g., 10-17]. The first PDS release of LROC data will be in March 2010 and will include all images from the commissioning phase and the first 3 months of the mapping phase.
[1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165; [13] Plescia et al. (2010) 41st LPSC, #2160; [14] Lawrence et al. (2010) 41st LPSC, #1906; [15] Gaddis et al. (2010) 41st LPSC, #2059; [16] Watters et al. (2010) 41st LPSC, #1863; [17] Garry et al. (2010) 41st LPSC, #2278.
A multi-modal stereo microscope based on a spatial light modulator.
Lee, M P; Gibson, G M; Bowman, R; Bernet, S; Ritsch-Marte, M; Phillips, D B; Padgett, M J
2013-07-15
Spatial Light Modulators (SLMs) can emulate classic microscopy contrast techniques, including differential interference contrast (DIC) and (spiral) phase contrast. Their programmability offers flexibility, including the option to multiplex images for single-shot quantitative imaging or for simultaneous multi-plane imaging (depth-of-field multiplexing). We report the development of a microscope sharing many of the previously demonstrated capabilities, within a holographic implementation of a stereo microscope. Furthermore, we use the SLM to combine stereo microscopy with a refocusing filter and with a darkfield filter. The instrument is built around a custom inverted microscope and equipped with an SLM which produces the various imaging modes laterally displaced on the same camera chip. In addition, there is a wide angle camera for visualisation of a larger region of the sample.
EXPERIMENTS IN LITHOGRAPHY FROM REMOTE SENSOR IMAGERY.
Kidwell, R. H.; McSweeney, J.; Warren, A.; Zang, E.; Vickers, E.
1983-01-01
Imagery from remote sensing systems such as the Landsat multispectral scanner and return beam vidicon, as well as synthetic aperture radar and conventional optical camera systems, contains information at resolutions far in excess of that which can be reproduced by the lithographic printing process. The data often require special handling to produce both standard and special map products. Some conclusions have been drawn regarding processing techniques, procedures for production, and printing limitations.
1973-06-01
This EREP color infrared photograph of the Uncompahgre Plateau area of Colorado was taken in June of 1973 by the Earth Terrain Camera (Skylab EREP Experiment S190B) of the Skylab's Multi-spectral Photographic Facility during the Skylab-2 mission. Skylab stereoscopic data provided the best identification of vegetation complexes and delineation of vegetation boundaries, particularly in areas where changes in relief were related to changes in vegetation type (a common occurrence in wild-land vegetation communities).
Cytology 3D structure formation based on optical microscopy images
NASA Astrophysics Data System (ADS)
Pronichev, A. N.; Polyakov, E. V.; Shabalova, I. P.; Djangirova, T. V.; Zaitsev, S. M.
2017-01-01
The article is devoted to optimization of the imaging parameters for biological preparations in optical microscopy using a multispectral camera in the visible range of electromagnetic radiation. A model for the image formation of virtual preparations is proposed. Based on the experimental results, the optimum number of layers was determined for scanning the object in depth while maintaining a holistic perception of its structure.
NASA Technical Reports Server (NTRS)
Bergamini, E. W.; Depaula, A. R., Jr.; Martins, R. C. D. O.
1984-01-01
Data relating to the on-board supervision subsystem are presented, as considered in a conference between INPE and NASA personnel held with the purpose of initiating a joint effort leading to the implementation of the Brazilian remote sensing experiment (BRESEX). The BRESEX should consist, basically, of a multispectral camera for Earth observation, to be tested on a future space shuttle flight.
NASA Astrophysics Data System (ADS)
Ciurapiński, Wieslaw; Dulski, Rafal; Kastek, Mariusz; Szustakowski, Mieczyslaw; Bieszczad, Grzegorz; Życzkowski, Marek; Trzaskawka, Piotr; Piszczek, Marek
2009-09-01
The paper presents the concept of a multispectral protection system for perimeter protection of stationary and moving objects. The system consists of an active ground radar and thermal and visible cameras. The radar allows the system to locate potential intruders and to direct the observation area of the system cameras. The multisensor construction of the system significantly improves the probability of intruder detection and reduces false alarms. The final decision from the system is worked out using image data. The method of data fusion used in the system is presented. The system works under the control of the FLIR Nexus system, which offers complete technology and components to create network-based, high-end integrated systems for security and surveillance applications. Based on a unique "plug and play" architecture, the system provides unmatched flexibility and simple integration of sensors and devices in TCP/IP networks. Using a graphical user interface, it is possible to control sensors, monitor streaming video and other data over the network, visualize the results of the data fusion process, and obtain detailed information about detected intruders on a digital map. The system provides high-level applications and operator workload reduction with features such as sensor-to-sensor cueing from detection devices, automatic e-mail notification, and alarm triggering.
Baikejiang, Reheman; Zhang, Wei; Li, Changqing
2017-01-01
Diffuse optical tomography (DOT) has attracted attention in the last two decades due to its intrinsic sensitivity in imaging tissue chromophores such as hemoglobin, water, and lipid. However, DOT has not yet been clinically accepted due to its low spatial resolution, caused by strong optical scattering in tissues. Structural guidance provided by an anatomical imaging modality enhances DOT imaging substantially. Here, we propose a computed tomography (CT) guided multispectral DOT imaging system for breast cancer imaging. To validate its feasibility, we have built a prototype DOT imaging system which consists of a laser at a wavelength of 650 nm and an electron multiplying charge coupled device (EMCCD) camera. We have validated the CT-guided DOT reconstruction algorithms with numerical simulations and phantom experiments, in which different imaging setup parameters, such as the number of projection measurements and the width of the measurement patch, have been investigated. Our results indicate that an air-cooled EMCCD camera is good enough for transmission-mode DOT imaging. We have also found that measurements at six angular projections are sufficient for DOT to reconstruct optical targets with 2 and 4 times absorption contrast when the CT guidance is applied. Finally, we describe our future research plan on integration of a multispectral DOT imaging system into a breast CT scanner.
Zhang, Wenjing; Cao, Yu; Zhang, Xuanzhe; Liu, Zejin
2015-10-20
Stable information from the sky light polarization pattern can be used for navigation, with advantages such as good anti-interference performance and no cumulative error effect. However, existing methods of sky light polarization measurement either have poor real-time performance or require complex systems. Inspired by the navigational capability of the Cataglyphis ant and its compound eyes, we introduce a new approach that acquires the all-sky image under different polarization directions with one camera and without a rotating polarizer, so as to detect the polarization pattern across the full sky in a single snapshot. Our system is based on a handheld light field camera with a wide-angle lens and a triplet linear polarizer placed over its aperture stop. Experimental results agree with the theoretical predictions. Both real-time detection and a simple, low-cost architecture demonstrate the advantages of the approach proposed in this paper.
The 1997 Spring Regression of the Martian South Polar Cap: Mars Orbiter Camera Observations
James, P.B.; Cantor, B.A.; Malin, M.C.; Edgett, K.; Carr, M.H.; Danielson, G.E.; Ingersoll, A.P.; Davies, M.E.; Hartmann, W.K.; McEwen, A.S.; Soderblom, L.A.; Thomas, P.C.; Veverka, J.
2000-01-01
The Mars Orbiter Cameras (MOC) on Mars Global Surveyor observed the south polar cap of Mars during its spring recession in 1997. The images acquired by the wide angle cameras reveal a pattern of recession that is qualitatively similar to that observed by Viking in 1977 but that differs in at least two respects. The 1977 recession in the 0° to 120° longitude sector was accelerated relative to the 1997 observations after Ls = 240°; the Mountains of Mitchel also detached from the main cap earlier in 1997. Comparison of the MOC images with Mars Orbiter Laser Altimeter data shows that the Mountains of Mitchel feature is controlled by local topography. Relatively dark, low albedo regions well within the boundaries of the seasonal cap were observed to have red-to-violet ratios that characterize them as frost units rather than unfrosted or partially frosted ground; this suggests the possibility of regions covered by CO2 frost having different grain sizes.
2007-01-16
Both luminous and translucent, the C ring sweeps out of the darkness of Saturn's shadow and obscures the planet at lower left. The ring is characterized by broad, isolated bright areas, or "plateaus," surrounded by fainter material. This view looks toward the unlit side of the rings from about 19 degrees above the ringplane. North on Saturn is up. The dark, inner B ring is seen at lower right. The image was taken in visible light with the Cassini spacecraft wide-angle camera on Dec. 15, 2006 at a distance of approximately 632,000 kilometers (393,000 miles) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 56 degrees. Image scale is 34 kilometers (21 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA08855
NASA Astrophysics Data System (ADS)
Bruegge, Carol J.; Val, Sebastian; Diner, David J.; Jovanovic, Veljko; Gray, Ellyn; Di Girolamo, Larry; Zhao, Guangyu
2014-09-01
The Multi-angle Imaging SpectroRadiometer (MISR) has successfully operated on the EOS/Terra spacecraft since 1999. It consists of nine cameras pointing from nadir to 70.5° view angle, with four spectral channels per camera. Specifications call for a radiometric uncertainty of 3% absolute and 1% relative to the other cameras. To accomplish this, MISR utilizes an on-board calibrator (OBC) to measure camera response changes. Once every two months the two Spectralon panels are deployed to direct solar light into the cameras. Six photodiode sets measure the illumination level, and these measurements are compared to MISR raw digital numbers to determine the radiometric gain coefficients used in Level 1 data processing. Although panel stability is not required, there has been little detectable change in panel reflectance, attributed to careful preflight handling techniques. The cameras themselves have degraded in radiometric response by 10% since launch, but calibration updates using the detector-based scheme have compensated for these drifts and allowed the radiance products to meet accuracy requirements. Validation using Sahara desert observations shows that there has been a drift of ~1% in the reported nadir-view radiance over a decade, common to all spectral bands.
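The gain-update step described here reduces to regressing raw camera digital numbers against the photodiode-measured panel radiances. A minimal sketch with synthetic numbers (not MISR data; the linear response model is an assumption of the illustration):

```python
import numpy as np

# Photodiode-measured panel radiance (synthetic values)
L = np.array([50.0, 100.0, 150.0, 200.0, 250.0])    # W m^-2 sr^-1 um^-1
# Camera raw digital numbers at the same illumination levels (synthetic)
dn = np.array([1010.0, 2005.0, 3020.0, 3990.0, 5015.0])

# Linear response model dn = g * L + offset; the gain g feeds Level 1 processing
g, offset = np.polyfit(L, dn, 1)
print(f"gain = {g:.2f} DN per radiance unit, offset = {offset:.1f} DN")

# Radiometric conversion for a new scene pixel uses the inverted model
L_scene = (4500.0 - offset) / g
print(f"scene radiance: {L_scene:.1f}")
```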
Applications of Action Cam Sensors in the Archaeological Yard
NASA Astrophysics Data System (ADS)
Pepe, M.; Ackermann, S.; Fregonese, L.; Fassi, F.; Adami, A.
2018-05-01
In recent years, special digital cameras called "action cameras" or "action cams" have become popular due to their low price, small size, light weight, robustness, and capacity to record videos and photos even in extreme environmental conditions. Indeed, these cameras were designed mainly to capture sport actions and to work in dirt, through bumps, underwater, and at a range of external temperatures. High-resolution digital single-lens reflex (DSLR) cameras are usually preferred in the photogrammetric field: beyond sensor resolution, the combination of such cameras with fixed, low-distortion lenses is preferred for accurate 3D measurements. Action cameras, by contrast, have small wide-angle lenses with lower performance in terms of sensor resolution, lens quality, and distortion. However, considering the ability of action cameras to acquire images under conditions that may be difficult for standard DSLR cameras, and their lower price, they can be an interesting option for documenting the state of the places during archaeological excavation activities. In this paper, the influence of lens radial distortion and chromatic aberration on this type of camera in self-calibration mode is investigated, and an evaluation of their application in the field of Cultural Heritage is discussed. Using a suitable technique, it has been possible to improve the accuracy of the 3D model obtained from action cam images. Case studies show the quality and utility of this type of sensor in the survey of archaeological artefacts.
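Self-calibration of such cameras is commonly done with OpenCV's checkerboard pipeline; a minimal sketch along those lines (the image directory, board geometry, and scene file are placeholders, and this generic routine stands in for the paper's specific self-calibration variant):

```python
import glob

import cv2
import numpy as np

board = (9, 6)                                    # inner corners of the target
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/*.jpg"):             # hypothetical image set
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    ok, corners = cv2.findChessboardCorners(gray, board)
    if ok:
        obj_pts.append(objp)
        img_pts.append(corners)
        size = gray.shape[::-1]

# dist holds k1, k2, p1, p2, k3; the radial terms dominate for the
# wide-angle lenses discussed above.
rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("RMS reprojection error (px):", rms)
undistorted = cv2.undistort(cv2.imread("calib/scene.jpg"), K, dist)
```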
The Mars NetLander panoramic camera
NASA Astrophysics Data System (ADS)
Jaumann, Ralf; Langevin, Yves; Hauber, Ernst; Oberst, Jürgen; Grothues, Hans-Georg; Hoffmann, Harald; Soufflot, Alain; Bertaux, Jean-Loup; Dimarellis, Emmanuel; Mottola, Stefano; Bibring, Jean-Pierre; Neukum, Gerhard; Albertz, Jörg; Masson, Philippe; Pinet, Patrick; Lamy, Philippe; Formisano, Vittorio
2000-10-01
The panoramic camera (PanCam) imaging experiment is designed to obtain high-resolution multispectral stereoscopic panoramic images from each of the four Mars NetLander 2005 sites. The main scientific objectives to be addressed by the PanCam experiment are (1) to locate the landing sites and support the NetLander network sciences, (2) to geologically investigate and map the landing sites, and (3) to study the properties of the atmosphere and of variable phenomena. To place in situ measurements at a landing site into a proper regional context, it is necessary to determine the lander orientation on the ground and to exactly locate the position of the landing site with respect to the available cartographic database. This is not possible by tracking alone due to the lack of on-ground orientation and the so-called map-tie problem. Images provided by the PanCam allow determination of accurate tilt and north directions for each lander and identification of the lander locations based on landmarks, which can also be recognized in appropriate orbiter imagery. With this information, it will further be possible to improve the Mars-wide geodetic control point network and the resulting geometric precision of global map products. The major geoscientific objectives of the PanCam lander images are the recognition of surface features like ripples, ridges and troughs, and the identification and characterization of different rock and surface units based on their morphology, distribution, spectral characteristics, and physical properties. The analysis of the PanCam imagery will finally result in the generation of precise map products for each of the landing sites. So far, comparative geologic studies of the Martian surface are restricted to the temporally separated Mars Pathfinder and the two Viking Lander missions. Further lander missions are in preparation (Beagle-2, Mars Surveyor 03). NetLander provides the unique opportunity to nearly double the number of accessible landing sites by providing simultaneous and long-term observations at four different surface locations, which becomes especially important for studies of variable surface features as well as properties and phenomena of the atmosphere. Major changes on the surface that can be detected by PanCam are caused by eolian activities and condensation processes, which directly reflect variations in the prevailing near-surface wind regime and the diurnal and seasonal volatile and dust cycles. Atmospheric studies will concentrate on the detection of clouds, measurements of the aerosol contents and the water vapor absorption at 936 nm. In order to meet these objectives, the proposed PanCam instrument is a highly miniaturized, dedicated stereo and multispectral imaging device. The camera consists of two identical camera cubes, which are arranged in a common housing at a fixed stereo base length of 11 cm. Each camera cube is equipped with a CCD frame transfer detector with 1024×1024 active pixels and optics with a focal length of 13 mm, yielding a field-of-view of 53°×53° and an instantaneous field of view of 1.1 mrad. A filter swivel with six positions provides different color band passes in the wavelength range of 400-950 nm. The camera head is mounted on top of a deployable scissors boom and can be rotated by 360° to obtain a full panorama, which is covered by eight images. The boom raises the camera head to a final altitude of 90 cm above the surface. Most camera activities will take place within the first week and the first month of the mission.
During the remainder of the mission, the camera will operate with a reduced data rate to monitor time-dependent variations on a daily basis. PanCam is a joint German/French project with contributions from DLR, Institute of Space Sensor Technology and Planetary Exploration, Berlin, Institut d'Astrophysique Spatiale, CNRS, Orsay, and Service d'Aéronomie, CNRS, Verrières-le-Buisson.
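The panorama and optics numbers quoted above are easy to sanity-check. A minimal sketch, using only the 53° FOV, 1.1 mrad IFOV, and 13 mm focal length from the abstract; everything else is arithmetic:

```python
import math

# A 53-deg frame needs at least ceil(360/53) = 7 positions for a full
# panorama; the quoted 8 frames leave 8*53 - 360 = 64 deg of total overlap
# for mosaicking.
fov_deg = 53.0
print(math.ceil(360.0 / fov_deg))          # -> 7

# IFOV ~= pixel pitch / focal length, so 1.1 mrad at f = 13 mm implies
# roughly a 14 micron detector pitch.
f_mm, ifov_rad = 13.0, 1.1e-3
print(round(ifov_rad * f_mm * 1000.0, 1))  # -> 14.3 (microns)
```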
Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation
Bell, J.F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.N.; Arneson, H.M.; Brown, D.; Collins, S.A.; Dingizian, A.; Elliot, S.T.; Hagerott, E.C.; Hayes, A.G.; Johnson, M.J.; Johnson, J. R.; Joseph, J.; Kinch, K.; Lemmon, M.T.; Morris, R.V.; Scherr, L.; Schwochert, M.; Shepard, M.K.; Smith, G.H.; Sohl-Dickstein, J. N.; Sullivan, R.J.; Sullivan, W.T.; Wadsworth, M.
2003-01-01
The Panoramic Camera (Pancam) investigation is part of the Athena science payload launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The scientific goals of the Pancam investigation are to assess the high-resolution morphology, topography, and geologic context of each MER landing site, to obtain color images to constrain the mineralogic, photometric, and physical properties of surface materials, and to determine dust and aerosol opacity and physical properties from direct imaging of the Sun and sky. Pancam also provides mission support measurements for the rovers, including Sun-finding for rover navigation, hazard identification and digital terrain modeling to help guide long-term rover traverse decisions, high-resolution imaging to help guide the selection of in situ sampling targets, and acquisition of education and public outreach products. The Pancam optical, mechanical, and electronics design were optimized to achieve these science and mission support goals. Pancam is a multispectral, stereoscopic, panoramic imaging system consisting of two digital cameras mounted on a mast 1.5 m above the Martian surface. The mast allows Pancam to image the full 360° in azimuth and ±90° in elevation. Each Pancam camera utilizes a 1024 × 1024 active imaging area frame transfer CCD detector array. The Pancam optics have an effective focal length of 43 mm and a focal ratio of f/20, yielding an instantaneous field of view of 0.27 mrad/pixel and a field of view of 16° × 16°. Each rover's two Pancam "eyes" are separated by 30 cm and have a 1° toe-in to provide adequate stereo parallax. Each eye also includes a small eight-position filter wheel to allow surface mineralogic studies, multispectral sky imaging, and direct Sun imaging in the 400-1100 nm wavelength region. Pancam was designed and calibrated to operate within specifications on Mars at temperatures from -55° to +5°C. An onboard calibration target and fiducial marks provide the capability to validate the radiometric and geometric calibration on Mars. Copyright 2003 by the American Geophysical Union.
NASA Technical Reports Server (NTRS)
Blonski, Slawomir; Gasser, Gerald; Russell, Jeffrey; Ryan, Robert; Terrie, Greg; Zanoni, Vicki
2001-01-01
Multispectral data requirements for Earth science applications are not always rigorously studied before a new remote sensing system is designed. A study of the spatial resolution, spectral bandpasses, and radiometric sensitivity requirements of real-world applications would focus the design on providing maximum benefit to the end-user community. To support systematic studies of multispectral data requirements, the Applications Research Toolbox (ART) has been developed at NASA's Stennis Space Center. The ART software allows users to create and assess simulated datasets while varying a wide range of system parameters. The simulations are based on data acquired by existing multispectral and hyperspectral instruments, and the resulting datasets can be further evaluated for specific end-user applications. Spectral synthesis of multispectral images from hyperspectral data is a key part of the ART software. In this process, hyperspectral image cubes are transformed into multispectral imagery without changes in spatial sampling and resolution. The transformation algorithm takes into account the spectral responses of both the synthesized, broad multispectral bands and the utilized, narrow hyperspectral bands. To validate the spectral synthesis algorithm, simulated multispectral images were compared with images collected near-coincidentally by the Landsat 7 ETM+ and the EO-1 ALI instruments. Hyperspectral images acquired with the airborne AVIRIS instrument and with the Hyperion instrument onboard the EO-1 satellite were used as input data for the presented simulations.
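The spectral-synthesis step lends itself to a compact illustration. A minimal sketch, assuming a made-up Gaussian band response rather than the toolbox's actual measured ETM+/ALI response curves (all arrays are synthetic):

```python
import numpy as np

wl = np.arange(400, 1001, 10.0)             # hyperspectral band centers, nm
cube = np.random.rand(64, 64, wl.size)      # synthetic radiance cube (y, x, band)

def synthesize(cube, wl, center, fwhm):
    """Form a broad band as a response-weighted average of narrow bands."""
    sigma = fwhm / 2.3548                    # FWHM -> Gaussian sigma
    resp = np.exp(-0.5 * ((wl - center) / sigma) ** 2)
    resp /= resp.sum()                       # normalize the band response
    return np.tensordot(cube, resp, axes=([2], [0]))

red_band = synthesize(cube, wl, center=660.0, fwhm=60.0)  # an ETM+-like red band
print(red_band.shape)   # (64, 64): same spatial grid, no resampling involved
```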
Image-based path planning for automated virtual colonoscopy navigation
NASA Astrophysics Data System (ADS)
Hong, Wei
2008-03-01
Virtual colonoscopy (VC) is a noninvasive method for colonic polyp screening that reconstructs three-dimensional models of the colon from computed tomography (CT) scans. In virtual colonoscopy fly-through navigation, it is crucial to generate an optimal camera path for efficient clinical examination. In conventional methods, the centerline of the colon lumen is usually used as the camera path. To extract the colon centerline, time-consuming pre-processing algorithms such as colon segmentation, distance transformation, or topological thinning must be performed before the fly-through navigation. In this paper, we present an efficient image-based path planning algorithm for automated virtual colonoscopy fly-through navigation that requires no pre-processing. Our algorithm only needs the physician to provide a seed point as the starting camera position using 2D axial CT images. A wide-angle fisheye camera model is used to generate a depth image from the current camera position. Two types of navigational landmarks, safe regions and target regions, are extracted from the depth images. The camera position and its corresponding view direction are then determined using these landmarks. The experimental results show that the generated paths are accurate and increase user comfort during fly-through navigation. Moreover, because of the efficiency of our path planning and rendering algorithms, our VC fly-through navigation system can still guarantee 30 FPS.
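A toy sketch of the depth-image steering idea follows; the rendered fisheye depth map, the smoothing window, and the steering rule are stand-ins for the paper's safe-region and target-region extraction:

```python
import numpy as np
from scipy.ndimage import uniform_filter

depth = np.random.rand(256, 256) * 50.0     # synthetic fisheye depth map, mm
smooth = uniform_filter(depth, size=15)     # suppress single-pixel maxima

# Steer toward the deepest (farthest, hence safest) region of the view.
iy, ix = np.unravel_index(np.argmax(smooth), smooth.shape)
# Mapping the chosen pixel back to a 3D view direction would use the
# fisheye camera model, omitted here.
print(f"steer toward pixel ({ix}, {iy}), local mean depth {smooth[iy, ix]:.1f} mm")
```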
NASA Astrophysics Data System (ADS)
Shinoj, V. K.; Murukeshan, V. M.; Hong, Jesmond; Baskaran, M.; Aung, Tin
2015-07-01
Noninvasive medical imaging techniques have generated great interest and shown high potential in the research and development of ocular imaging and follow-up procedures. It is well known that angle closure glaucoma is one of the major ocular diseases/conditions that cause blindness, and its identification and treatment rely primarily on angle assessment techniques. In this paper, we illustrate a probe-based imaging approach to obtain images of the angle region of the eye. The proposed probe consists of a micro CCD camera and LED/NIR laser light sources, configured at the distal end to enable imaging of the iridocorneal region inside the eye. With this proposed dual-modal probe, imaging is performed in light (white visible LED on) and dark (NIR laser source alone) conditions, and the angle region is discernible in both cases. Imaging with NIR sources is of major significance for anterior chamber imaging since it avoids pupil constriction caused by bright light, and thereby the artificial alteration of the anterior chamber angle. The proposed methodology and developed scheme are expected to find potential application in glaucoma detection and diagnosis.
Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D.
Lasnier, C J; Allen, S L; Ellis, R E; Fenstermacher, M E; McLean, A G; Meyer, W H; Morris, K; Seppala, L G; Crabtree, K; Van Zeeland, M A
2014-11-01
An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.
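The CAD-warp spatial calibration described above amounts to fitting a projective transform between rendered and observed landmarks. A minimal sketch using OpenCV; the point pairs and file name are placeholders, not DIII-D data, and in practice many more in-vessel landmarks would be used:

```python
import cv2
import numpy as np

# Matched landmark coordinates: CAD render vs. camera data image (placeholders)
cad_pts = np.float32([[100, 120], [540, 110], [560, 400], [90, 415]])
img_pts = np.float32([[112, 131], [548, 119], [571, 412], [99, 430]])

# Projective transform mapping render coordinates onto image coordinates
H, _ = cv2.findHomography(cad_pts, img_pts, 0)

cad = cv2.imread("cad_render.png")                # hypothetical file
warped = cv2.warpPerspective(cad, H, (640, 480))  # overlays onto the IR frame
```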
Reductions in injury crashes associated with red light camera enforcement in Oxnard, California.
Retting, Richard A; Kyrychenko, Sergey Y
2002-11-01
This study estimated the impact of red light camera enforcement on motor vehicle crashes in one of the first US communities to employ such cameras: Oxnard, California. Crash data were analyzed for Oxnard and for 3 comparison cities. Changes in crash frequencies were compared for Oxnard and the control cities, and for signalized and nonsignalized intersections, by means of a generalized linear regression model. Overall, crashes at signalized intersections throughout Oxnard were reduced by 7% and injury crashes were reduced by 29%. Right-angle crashes, those most associated with red light violations, were reduced by 32%; right-angle crashes involving injuries were reduced by 68%. Because red light cameras can be a permanent component of the transportation infrastructure, crash reductions attributed to camera enforcement should be sustainable.
NASA Astrophysics Data System (ADS)
Suzuki, H.; Yamada, M.; Kouyama, T.; Tatsumi, E.; Kameda, S.; Honda, R.; Sawada, H.; Ogawa, N.; Morota, T.; Honda, C.; Sakatani, N.; Hayakawa, M.; Yokota, Y.; Yamamoto, Y.; Sugita, S.
2018-01-01
Hayabusa2, the first sample return mission to a C-type asteroid, was launched by the Japan Aerospace Exploration Agency (JAXA) on December 3, 2014 and will arrive at the asteroid in the middle of 2018 to collect samples from its surface, which may contain both hydrated minerals and organics. The optical navigation camera (ONC) system on board Hayabusa2 consists of three individual framing CCD cameras, ONC-T for a telescopic nadir view, ONC-W1 for a wide-angle nadir view, and ONC-W2 for a wide-angle slant view, which will be used to observe the surface of Ryugu. The cameras will be used to measure the global asteroid shape, local morphologies, and visible spectroscopic properties. Thus, image data obtained by the ONC will provide essential information for selecting landing (sampling) sites on the asteroid. This study reports the results of initial inflight calibration based on observations of the Earth, Mars, the Moon, and stars to verify and characterize the optical performance of the ONC, such as flat-field sensitivity, spectral sensitivity, point-spread function (PSF), distortion, and stray light of ONC-T, and distortion for ONC-W1 and W2. We found some potential problems that may influence our science observations. These include changes in the flat-field sensitivity of all bands from the values measured in the pre-flight calibration, and the existence of stray light that arises under certain conditions of spacecraft attitude with respect to the Sun. Countermeasures for these problems were evaluated using data obtained during the initial in-flight calibration. The results of our inflight calibration indicate that the error of spectroscopic measurements around 0.7 μm using the 0.55, 0.70, and 0.86 μm bands of ONC-T can be lower than 0.7% after these countermeasures and pixel binning. This result suggests that ONC-T should be able to detect the typical strength (∼3%) of the serpentine absorption band often found on CM chondrites and low-albedo asteroids with ≥4σ confidence.
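The quoted ≥4σ figure follows directly from the band-depth arithmetic. A back-of-the-envelope sketch: the reflectance values below are illustrative, while the 0.7% error and the three band centers come from the abstract:

```python
# Band depth at 0.70 um measured against a linear continuum through the
# 0.55 and 0.86 um bands; detection significance is depth over error.
r550, r700, r860 = 0.050, 0.0494, 0.052      # synthetic reflectances
cont700 = r550 + (r860 - r550) * (0.70 - 0.55) / (0.86 - 0.55)
depth = 1.0 - r700 / cont700                  # fractional band depth
sigma = 0.007                                 # 0.7% spectrophotometric error
print(f"band depth {depth * 100:.1f}%, detection at {depth / sigma:.1f} sigma")
# -> ~3.1% depth detected at ~4.4 sigma, consistent with the text.
```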
Development of a stiffness-angle law for simplifying the measurement of human hair stiffness.
Jung, I K; Park, S C; Lee, Y R; Bin, S A; Hong, Y D; Eun, D; Lee, J H; Roh, Y S; Kim, B M
2018-04-01
This research examines the effect of caffeine absorption on hair stiffness. To test hair stiffness, we have developed an evaluation method that is not only accurate but also inexpensive. Our evaluation method culminated in a model, called the Stiffness-Angle Law, which describes the elastic properties of hair and can be widely applied to the development of hair care products. Small molecules (≤500 g/mol) such as caffeine can be absorbed into hair. A common shampoo containing 4% caffeine was formulated and applied to hair 10 times, after which the hair stiffness was measured. The caffeine absorption of the treated hair was observed using Fourier-transform infrared spectroscopy (FTIR) with a focal plane array (FPA) detector. Our evaluation method for measuring hair stiffness consists of a regular camera and a support for single strands of hair. After attaching the hair to the support, the bending angle of the hair was observed with the camera and measured, and the hair strand was then weighed. The stiffness of the hair was calculated from our proposed Stiffness-Angle Law using three variables: the bending angle, the weight of the hair, and the distance the hair was pulled across the support. The caffeine absorption was confirmed by FTIR analysis: the concentration of amide bonds in the hair increased measurably due to caffeine absorption. After caffeine was absorbed into the hair, the bending angle and weight of the hair changed. Applying these measured changes to the Stiffness-Angle Law, it was confirmed that the hair stiffness increased by 13.2% due to caffeine absorption. The theoretical results using the Stiffness-Angle Law agree with the visual examinations of hair exposed to caffeine and also with the known results of hair stiffness from a previous report. Our evaluation method, combined with the proposed Stiffness-Angle Law, provides an accurate and inexpensive technique for measuring the bending stiffness of human hair. © 2018 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
Compact Autonomous Hemispheric Vision System
NASA Technical Reports Server (NTRS)
Pingree, Paula J.; Cunningham, Thomas J.; Werne, Thomas A.; Eastwood, Michael L.; Walch, Marc J.; Staehle, Robert L.
2012-01-01
Solar System Exploration camera implementations to date have involved either single cameras with wide field-of-view (FOV) and consequently coarser spatial resolution, cameras on a movable mast, or single cameras necessitating rotation of the host vehicle to afford visibility outside a relatively narrow FOV. These cameras require detailed commanding from the ground or separate onboard computers to operate properly, and are incapable of making decisions based on image content that control pointing and downlink strategy. For color, a filter wheel having selectable positions was often added, which added moving parts, size, mass, and power, and reduced reliability. A system was developed based on a general-purpose miniature visible-light camera using advanced CMOS (complementary metal oxide semiconductor) imager technology. The baseline camera has a 92° FOV, and six cameras are arranged in an angled-up carousel fashion, with FOV overlaps such that the system has a 360° FOV in azimuth. A seventh camera, also with a FOV of 92°, is installed normal to the plane of the other six cameras, giving the system a >90° FOV in elevation and completing the hemispheric vision system. A central unit houses the common electronics box (CEB) controlling the system (power conversion, data processing, memory, and control software). Stereo is achieved by adding a second system on a baseline, and color is achieved by stacking two more systems (for a total of three, each equipped with its own filter). Two connectors on the bottom of the CEB provide a connection to a carrier (rover, spacecraft, balloon, etc.) for telemetry, commands, and power. This system has no moving parts. The system's onboard software (SW) supports autonomous operations such as pattern recognition and tracking.
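The carousel arithmetic above is quick to verify. A sketch using only the numbers quoted in this entry:

```python
# Six 92-deg cameras spaced evenly around the azimuth circle sit 60 deg
# apart, so adjacent fields share 32 deg of overlap; the seventh,
# up-looking 92-deg camera closes the hemisphere.
n_cameras, fov = 6, 92.0
spacing = 360.0 / n_cameras     # 60 deg between adjacent optical axes
overlap = fov - spacing         # 32 deg of pairwise overlap
print(spacing, overlap)
```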
NASA Technical Reports Server (NTRS)
Holland, S. Douglas (Inventor)
1992-01-01
A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
WindCam and MSPI: two cloud and aerosol instrument concepts derived from Terra/MISR heritage
NASA Astrophysics Data System (ADS)
Diner, David J.; Mischna, Michael; Chipman, Russell A.; Davis, Ab; Cairns, Brian; Davies, Roger; Kahn, Ralph A.; Muller, Jan-Peter; Torres, Omar
2008-08-01
The Multi-angle Imaging SpectroRadiometer (MISR) has been acquiring global cloud and aerosol data from polar orbit since February 2000. MISR acquires moderately high-resolution imagery at nine view angles from nadir to 70.5°, in four visible/near-infrared spectral bands. Stereoscopic parallax, time lapse among the nine views, and the variation of radiance with angle and wavelength enable retrieval of geometric cloud and aerosol plume heights, height-resolved cloud-tracked winds, and aerosol optical depth and particle property information. Two instrument concepts based upon MISR heritage are in development. The Cloud Motion Vector Camera, or WindCam, is a simplified version comprised of a lightweight, compact, wide-angle camera to acquire multiangle stereo imagery at a single visible wavelength. A constellation of three WindCam instruments in polar Earth orbit would obtain height-resolved cloud-motion winds with daily global coverage, making it a low-cost complement to a spaceborne lidar wind measurement system. The Multiangle SpectroPolarimetric Imager (MSPI) is aimed at aerosol and cloud microphysical properties, and is a candidate for the National Research Council Decadal Survey's Aerosol-Cloud-Ecosystem (ACE) mission. MSPI combines the capabilities of MISR with those of other aerosol sensors, extending the spectral coverage to the ultraviolet and shortwave infrared and incorporating high-accuracy polarimetric imaging. Based on requirements for the nonimaging Aerosol Polarimeter Sensor on NASA's Glory mission, a degree of linear polarization uncertainty of 0.5% is specified within a subset of the MSPI bands. We are developing a polarization imaging approach using photoelastic modulators (PEMs) to accomplish this objective.
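The 0.5% specification above is an uncertainty on the degree of linear polarization (DOLP). A minimal sketch of the quantity itself, computed from synthetic Stokes images:

```python
import numpy as np

I = np.full((128, 128), 100.0)             # intensity image (synthetic)
Q = np.random.normal(2.0, 0.1, I.shape)    # linear Stokes components (synthetic)
U = np.random.normal(1.0, 0.1, I.shape)

dolp = np.sqrt(Q ** 2 + U ** 2) / I        # degree of linear polarization, in [0, 1]
# The MSPI requirement quoted above amounts to sigma(dolp) <= 0.005
# within the specified bands.
print(f"mean DOLP: {dolp.mean():.4f}")
```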
Panoramic 3d Vision on the ExoMars Rover
NASA Astrophysics Data System (ADS)
Paar, G.; Griffiths, A. D.; Barnes, D. P.; Coates, A. J.; Jaumann, R.; Oberst, J.; Gao, Y.; Ellery, A.; Li, R.
The Pasteur payload on the ESA ExoMars Rover 2011/2013 is designed to search for evidence of extant or extinct life either on or up to ~2 m below the surface of Mars. The rover will be equipped with a panoramic imaging system, to be developed by a UK, German, Austrian, Swiss, Italian and French team, for visual characterization of the rover's surroundings and (in conjunction with an infrared imaging spectrometer) remote detection of potential sample sites. The Panoramic Camera system consists of a wide angle multispectral stereo pair with 65° field-of-view (WAC; 1.1 mrad/pixel) and a high resolution monoscopic camera (HRC; the current design has 59.7 µrad/pixel and a 3.5° field-of-view). Its scientific goals and operational requirements can be summarized as follows: • Determination of objects to be investigated in situ by other instruments, for operations planning • Backup and support for the rover visual navigation system (path planning, determination of subsequent rover positions and orientation/tilt within the 3d environment), and localization of the landing site (by stellar navigation or by combination of orbiter and ground panoramic images) • Geological characterization (using narrow band geology filters) and cartography of the local environments (local Digital Terrain Model or DTM) • Study of atmospheric properties and variable phenomena near the Martian surface (e.g. aerosol opacity, water vapour column density, clouds, dust devils, meteors, surface frosts) • Geodetic studies (observations of Sun, bright stars, Phobos/Deimos). The performance of 3d data processing is a key element of mission planning and scientific data analysis. The 3d Vision Team within the Panoramic Camera development Consortium reports on the current status of development, consisting of the following items: • Hardware Layout & Engineering: The geometric setup of the system (location on the mast & viewing angles, mutual mounting between WAC and HRC) needs to be optimized w.r.t. fields of view, ranging (distance measurement) capability, data rate, necessity of calibration targets, hardware & data interfaces to other subsystems (e.g. navigation), as well as accuracy impacts of sensor design and compression ratio. • Geometric Calibration: The geometric properties of the individual cameras including various spectral filters, their mutual relations, and the dynamic geometrical relation between the rover frame and the cameras - with the mast in between - are precisely described by a calibration process. During surface operations these relations will be continuously checked and updated by photogrammetric means; environmental influences such as temperature, pressure and the Mars gravity will be taken into account. • Surface Mapping: Stereo imaging using the WAC pair is used for 3d reconstruction of the rover vicinity to identify, locate and characterize potentially interesting spots (3-10 for an experimental cycle to be performed within approx. 10-30 sols). The HRC is used for high resolution imagery of these regions of interest, to be overlaid on the 3d reconstruction and potentially refined by shape-from-shading techniques. A quick processing result is crucial for time-critical operations planning; therefore emphasis is laid on automatic behaviour and intrinsic error detection mechanisms. The mapping results will be continuously fused, updated and synchronized with the map used by the navigation system.
The surface representation needs to take into account the different resolutions of HRC and WAC, as well as uncommon or even unexpected image acquisition modes such as long range, wide baseline stereo from different rover positions, or escape strategies in the case of loss of one of the stereo camera heads. • Panorama Mosaicking: The production of a high resolution stereoscopic panorama is nowadays state of the art in computer vision. However, certain challenges - such as the need for access to accurate spherical coordinates, maintenance of radiometric & spectral response in various spectral bands, fusion between HRC and WAC, super resolution, and again the requirement of quick yet robust processing - will add some complexity to the ground processing system. • Visualization for Operations Planning: Efficient operations planning is directly related to an ergonomic and well performing visualization. It is intended to adapt existing tools to an integrated visualization solution for the purposes of scientific site characterization, view planning, reachability mapping/instrument placement of pointing sensors (including the panoramic imaging system itself), and selection of regions of interest. The main interfaces between the individual components as well as the first version of a user requirement document are currently under definition. Besides support for sensor layout and calibration, the 3d vision system will consist of 2-3 main modules to be used during ground processing & utilization of the ExoMars Rover panoramic imaging system.
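Stereo ranging performance for the WAC pair follows the usual parallax relation. A hedged sketch: the 1.1 mrad/pixel IFOV is from the abstract, while the half-pixel matching accuracy and the stereo baseline B are assumptions (the abstract does not state the WAC base):

```python
# Range error for a stereo pair: dZ ~= Z**2 * d_theta / B, where d_theta is
# the angular disparity resolution and B the stereo baseline.
ifov = 1.1e-3            # rad/pixel, WAC (from the abstract)
d_theta = 0.5 * ifov     # assumed half-pixel matching accuracy
B = 0.5                  # m, assumed baseline (not given above)

for Z in (2.0, 5.0, 10.0):                 # ranges of interest, m
    dZ = Z ** 2 * d_theta / B
    print(f"Z = {Z:4.1f} m  ->  dZ ~ {dZ * 100:.1f} cm")
```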
NASA Technical Reports Server (NTRS)
1978-01-01
NASA remote sensing technology is being employed in archeological studies of the Anasazi Indians, who lived in New Mexico one thousand years ago. Under contract with the National Park Service, NASA's Technology Applications Center at the University of New Mexico is interpreting multispectral scanner data and demonstrating how aerospace scanning techniques can uncover features of prehistoric ruins not visible in conventional aerial photographs. The Center's initial study focused on Chaco Canyon, a pre-Columbian Anasazi site in northwestern New Mexico. Chaco Canyon is a national monument, and it has been well explored on the ground and by aerial photography. But the National Park Service was interested in the potential of multispectral scanning for producing evidence of prehistoric roads, field patterns, and dwelling areas not discernible in aerial photographs. The multispectral scanner produces imaging data in the invisible as well as the visible portions of the spectrum. This data is converted to pictures that bring out features not visible to the naked eye or to cameras. The Technology Applications Center joined forces with Bendix Aerospace Systems Division, Ann Arbor, Michigan, which provided a scanner-equipped airplane for mapping the Chaco Canyon area. The NASA group processed the scanner images and employed computerized image enhancement techniques to bring out additional detail.
Mountain pine beetle detection and monitoring: evaluation of airborne imagery
NASA Astrophysics Data System (ADS)
Roberts, A.; Bone, C.; Dragicevic, S.; Ettya, A.; Northrup, J.; Reich, R.
2007-10-01
The processing and evaluation of digital airborne imagery for detection, monitoring, and modeling of mountain pine beetle (MPB) infestations is examined. The most efficient and reliable remote sensing strategy for identification and mapping of MPB infestation stages ("current" to "red" to "grey" attack) in lodgepole pine forests is determined with respect to the most practical and cost-effective procedures. This research was planned specifically to enhance knowledge by determining the remote sensing imaging systems and analytical procedures that optimize resource management for this critical forest health problem. Within the context of this study, airborne remote sensing of forest environments for forest health determinations (MPB) is most suitably undertaken using multispectral digitally converted imagery (aerial photography) at scales of 1:8000 for early detection of current MPB attack and 1:16000 for mapping and sequential monitoring of red and grey attack. Digital conversion should be undertaken at 10 to 16 microns for B&W multispectral imagery and 16 to 24 microns for colour and colour infrared imagery. From an "operational" perspective, the use of twin mapping cameras with colour and B&W or colour infrared film will provide the best approximation of multispectral digital imagery, with near-comparable performance, in a competitive private sector context (open bidding).
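The scale and scan-aperture recommendations above imply specific ground sample distances. A quick sketch of the conversion, with scan apertures chosen from the quoted ranges:

```python
# Ground sample distance of scanned film: photo scale times scanning aperture.
def gsd_cm(scale_denominator, aperture_um):
    return scale_denominator * aperture_um * 1e-6 * 100.0  # -> centimeters

print(gsd_cm(8000, 12))    # ~9.6 cm/pixel for 1:8000 early-detection imagery
print(gsd_cm(16000, 20))   # ~32 cm/pixel for 1:16000 monitoring imagery
```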
Multispectral and hyperspectral measurements of soldier's camouflage equipment
NASA Astrophysics Data System (ADS)
Kastek, Mariusz; Piątkowski, Tadeusz; Dulski, Rafal; Chamberland, Martin; Lagueux, Philippe; Farley, Vincent
2012-06-01
In today's electro-optic warfare era, it is vital for a nation's defense to possess the most advanced measurement and signature intelligence (MASINT) capabilities. This is critical for gaining a strategic advantage in the planning of military operations and deployments. The thermal infrared region of the electromagnetic spectrum is a key region exploited for infrared reconnaissance and surveillance missions. The Military University of Technology has conducted an intensive measurement campaign of various soldier's camouflage devices as part of building a database of infrared signatures. Infrared hyperspectral and broadband/multispectral imaging sensors have become key technologies for performing signature measurements. The Telops Hyper-Cam LW product represents a unique commercial offering with outstanding performance and versatility for the collection of hyperspectral infrared images. The Hyper-Cam provides infrared imagery of a target (320 × 256 pixels) at very high spectral resolution (down to 0.25 cm-1). Moreover, the Military University of Technology has made use of a suite of scientific-grade commercial infrared cameras to further measure and assess the targets from a broadband/multispectral perspective. The experiment concept and measurement results are presented in this paper.
NASA Astrophysics Data System (ADS)
Louchard, Eric; Farm, Brian; Acker, Andrew
2008-04-01
BAE Systems Sensor Systems Identification & Surveillance (IS) has developed, under contract with the Office of Naval Research, a multispectral airborne sensor system and processing algorithms capable of detecting mine-like objects in the surf zone and land mines in the beach zone. BAE Systems has used this system in a blind test at a test range established by the Naval Surface Warfare Center - Panama City Division (NSWC-PCD) at Eglin Air Force Base. The airborne and ground subsystems used in this test are described, with graphical illustrations of the detection algorithms. We report on the performance of the system configured to operate with a human operator analyzing data on a ground station. A subsurface mine detection capability is demonstrated for underwater bottom-proud mines in the surf zone and moored mines in shallow water. Surface float detection and proud land mine detection capabilities are also demonstrated. Our analysis shows that this BAE Systems-developed multispectral airborne sensor provides a robust technical foundation for a viable mine countermeasures system, and would be a valuable asset for use prior to an amphibious assault.
NASA Astrophysics Data System (ADS)
de Villiers, Jason; Jermy, Robert; Nicolls, Fred
2014-06-01
This paper presents a system to determine the photogrammetric parameters of a camera: the lens distortion, focal length, and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb line method, allows many radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce results superior to low-order models, despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted-to-undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion), allowing deterministic rates far exceeding real time. The focal length is determined so as to minimise the error in absolute photogrammetric positional measurement for both multi-camera and monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount, which allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/LWIR camera array, and a simple laboratory optical helmet tracker.
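For concreteness, a minimal sketch of a Brown-style distortion model with the quoted coefficient counts (5 radial, 3 tangential); the coefficient values are illustrative, the mapping runs undistorted-to-distorted, and the inverse transformation is typically obtained by numerical iteration:

```python
def distort(x, y, k, p):
    """Brown-style model on normalized image coordinates.

    k: radial coefficients k1..k5; p: tangential coefficients p1..p3, with
    p3 scaling the tangential term as in Brown's extended formulation.
    """
    r2 = x * x + y * y
    radial = 1.0 + sum(ki * r2 ** (i + 1) for i, ki in enumerate(k))
    tx = 2.0 * p[0] * x * y + p[1] * (r2 + 2.0 * x * x)
    ty = p[0] * (r2 + 2.0 * y * y) + 2.0 * p[1] * x * y
    s = 1.0 + p[2] * r2
    return x * radial + tx * s, y * radial + ty * s

k = [-0.25, 0.08, -0.01, 0.002, -0.0001]   # five radial terms (illustrative)
p = [1e-4, -5e-5, 1e-6]                    # three tangential terms (illustrative)
print(distort(0.3, -0.2, k, p))
```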
Multispectral Palmprint Recognition Using a Quaternion Matrix
Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng
2012-01-01
Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, and thus can achieve better recognition accuracy. Previously, multispectral palmprint images were treated as a kind of multi-modal biometric, and fusion at the image level or matching-score level was used. However, some spectral information is lost during image-level or matching-score-level fusion. In this study, we propose a new method for multispectral images based on a quaternion model that fully utilizes the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix; then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied, respectively, to the matrix to extract palmprint features. After that, the Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of the two distances and a nearest-neighbor classifier were employed for the recognition decision. Experimental results showed that using the quaternion matrix achieves a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%. PMID:22666049
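As a rough sketch of the quaternion representation and the decision rule (not the authors' code; the channel-to-component assignment and helper names are assumptions), in Python with NumPy:

import numpy as np

def quaternion_encode(red, green, blue, nir):
    """Stack the four spectral palmprint images into a quaternion matrix
    Q = R + G*i + B*j + NIR*k, stored here as an (H, W, 4) real array."""
    return np.stack([red, green, blue, nir], axis=-1).astype(float)

def quaternion_modulus(Q):
    """Per-pixel modulus |q|; a simple real-valued projection that
    PCA- or DWT-based feature extractors can consume."""
    return np.sqrt((Q ** 2).sum(axis=-1))

def classify(query_feats, gallery_feats, labels):
    """Decision rule from the abstract: sum the two feature distances
    (e.g. PCA-based and DWT-based), then pick the nearest neighbor.
    query_feats: [pca_vec, dwt_vec]; gallery_feats: matching (N, d) arrays."""
    dist = sum(np.linalg.norm(g - q, axis=1)
               for q, g in zip(query_feats, gallery_feats))
    return labels[np.argmin(dist)]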
Feasibility of using Eastman Kodak type 3400 film for high altitude multispectral photography
NASA Technical Reports Server (NTRS)
Perry, L.
1972-01-01
A photographic test flight of the NASA RB-57F was conducted on March 25, 1972, over Houston and West Texas, to determine the suitability of Eastman Kodak type 3400 film as a replacement for type 2402 film in the Hasselblad cameras. An additional purpose was to test GAF film type 2914, a new black and white film similar to 2402, but with higher maximum gamma and greater dynamic range.
Data acquisition system for operational earth observation missions
NASA Technical Reports Server (NTRS)
Deerwester, J. M.; Alexander, D.; Arno, R. D.; Edsinger, L. E.; Norman, S. M.; Sinclair, K. F.; Tindle, E. L.; Wood, R. D.
1972-01-01
The data acquisition system capabilities expected to be available in the 1980 time period as part of operational Earth observation missions are identified. Here, "data acquisition system" means the sensor platform (spacecraft or aircraft), the sensors themselves, and the communication system. Future capabilities and support requirements are projected for the following sensors: film camera, return beam vidicon, multispectral scanner, infrared scanner, infrared radiometer, microwave scanner, microwave radiometer, coherent side-looking radar, and scatterometer.
Multi-Angle Imager for Aerosols (MAIA) Investigation of Airborne Particle Health Impacts
NASA Astrophysics Data System (ADS)
Diner, D. J.
2016-12-01
Airborne particulate matter (PM) is a well-known cause of heart disease, cardiovascular and respiratory illness, low birth weight, and lung cancer. The Global Burden of Disease (GBD) Study ranks PM as a major environmental risk factor worldwide. Global maps of PM2.5 concentrations derived from satellite instruments, including MISR and MODIS, have provided key contributions to the GBD and many other health-related investigations. Although it is well established that PM exposure increases the risks of mortality and morbidity, our understanding of the relative toxicity of specific PM types is relatively poor. To address this, the Multi-Angle Imager for Aerosols (MAIA) investigation was proposed to NASA's third Earth Venture Instrument (EVI-3) solicitation. The satellite instrument that is part of the investigation is a multiangle, multispectral, and polarimetric camera system based on the first- and second-generation Airborne Multiangle SpectroPolarimetric Imagers, AirMSPI and AirMSPI-2. MAIA was selected for funding in March 2016. Estimates of the abundances of different aerosol types from the WRF-Chem model will be combined with MAIA instrument data. Geostatistical models derived from collocated surface and MAIA retrievals will then be used to relate retrieved fractional column aerosol optical depths to near-surface concentrations of major PM constituents, including sulfate, nitrate, organic carbon, black carbon, and dust. Epidemiological analyses of geocoded birth, death, and hospital records will be used to associate exposure to PM types with adverse health outcomes. MAIA launch is planned for early in the next decade. The MAIA instrument incorporates a pair of cameras on a two-axis gimbal to provide regional multiangle observations of selected, globally distributed target areas. Primary Target Areas (PTAs) on five continents are chosen to include major population centers covering a range of PM concentrations and particle types, surface-based aerosol sunphotometers, PM size discrimination and chemical speciation monitors, and access to geocoded health datasets. The MAIA investigation brings together an international team of researchers and policy specialists with expertise in remote sensing, aerosol science, air quality, epidemiology, and public health.
Video Mosaicking for Inspection of Gas Pipelines
NASA Technical Reports Server (NTRS)
Magruder, Darby; Chien, Chiun-Hong
2005-01-01
A vision system that includes a specially designed video camera and an image-data-processing computer is under development as a prototype of robotic systems for visual inspection of the interior surfaces of pipes, and especially of gas pipelines. The system is capable of providing both forward views and mosaicked radial views that can be displayed in real time or after inspection. To avoid the complexities associated with moving parts and to provide simultaneous forward and radial views, the video camera is equipped with a wide-angle (>165°) fish-eye lens aimed along the axis of the pipe to be inspected. Nine white-light-emitting diodes (LEDs) placed just outside the field of view of the lens (see Figure 1) provide ample diffuse illumination for a high-contrast image of the interior pipe wall. The video camera contains a 2/3-in. (1.7-cm) charge-coupled-device (CCD) photodetector array and functions according to the National Television Standards Committee (NTSC) standard. The video output of the camera is sent to an off-the-shelf video capture board (frame grabber) by use of a peripheral component interconnect (PCI) interface in the computer, which is of the 400-MHz, Pentium II (or equivalent) class. Prior video-mosaicking techniques are applicable to narrow-field-of-view (low-distortion) images of evenly illuminated, relatively flat surfaces viewed along approximately perpendicular lines by cameras that do not rotate and that move approximately parallel to the viewed surfaces. One such technique for real-time creation of mosaic images of the ocean floor involves the use of visual correspondences based on area correlation, during both the acquisition of separate images of adjacent areas and the consolidation (equivalently, integration) of the separate images into a mosaic image, in order to ensure that there are no gaps in the mosaic image. The data-processing technique used for mosaicking in the present system also involves area correlation, but with several notable differences. Because the wide-angle lens introduces considerable distortion, the image data must be processed to effectively unwarp the images (see Figure 2). The computer executes special software that includes an unwarping algorithm that takes explicit account of the cylindrical pipe geometry. To reduce the processing time needed for unwarping, parameters of the geometric mapping between the circular view of the fish-eye lens and the pipe wall are determined in advance from calibration images and compiled into an electronic lookup table. The software incorporates the assumption that the optical axis of the camera is parallel (rather than perpendicular) to the direction of motion of the camera. The software also compensates for the decrease in illumination with distance from the ring of LEDs.
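The lookup-table unwarping step might be sketched as follows, assuming an equidistant fish-eye model (r = f·θ) and a known pipe radius; the real system derives its mapping from calibration images of the cylindrical geometry, so every parameter and name here is illustrative.

import numpy as np

def build_unwarp_lut(h_out, w_out, cx, cy, f_px, pipe_radius):
    """Precompute source pixels in the fisheye frame for each pixel of the
    unrolled pipe-wall image (rows: distance along the pipe axis, columns:
    angle around the pipe). Equidistant model r = f * theta assumed."""
    theta = np.linspace(0.0, 2.0 * np.pi, w_out, endpoint=False)
    z = np.linspace(0.1, 2.0, h_out)              # axial distance ahead (m)
    zz, aa = np.meshgrid(z, theta, indexing='ij')
    off_axis = np.arctan2(pipe_radius, zz)        # angle off the optical axis
    r_px = f_px * off_axis
    u = np.rint(cx + r_px * np.cos(aa)).astype(int)
    v = np.rint(cy + r_px * np.sin(aa)).astype(int)
    return u, v

def unwarp(frame, lut):
    """Apply the precomputed table to one video frame."""
    u, v = lut
    u = np.clip(u, 0, frame.shape[1] - 1)
    v = np.clip(v, 0, frame.shape[0] - 1)
    return frame[v, u]

Precomputing the table once turns per-frame unwarping into a single array gather, which is consistent with the real-time goal described above.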
Development of an Aerosol Opacity Retrieval Algorithm for Use with Multi-Angle Land Surface Images
NASA Technical Reports Server (NTRS)
Diner, D.; Paradise, S.; Martonchik, J.
1994-01-01
In 1998, the Multi-angle Imaging SpectroRadiometer (MISR) will fly aboard the EOS-AM1 spacecraft. MISR will enable unique methods for retrieving the properties of atmospheric aerosols, by providing global imagery of the Earth at nine viewing angles in four visible and near-IR spectral bands. As part of the MISR algorithm development, theoretical methods of analyzing multi-angle, multi-spectral data are being tested using images acquired by the airborne Advanced Solid-State Array Spectroradiometer (ASAS). In this paper we derive a method to be used over land surfaces for retrieving the change in opacity between spectral bands, which can then be used in conjunction with an aerosol model to derive a bound on absolute opacity.
5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...
5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Multispectral scanner optical system
NASA Technical Reports Server (NTRS)
Stokes, R. C.; Koch, N. G. (Inventor)
1980-01-01
An optical system for use in a multispectral scanner of the type used in video imaging devices is disclosed. Electromagnetic radiation reflected by a rotating scan mirror is focused by a concave primary telescope mirror and collimated by a second concave mirror. The collimated beam is split by a dichroic filter which transmits radiant energy in the infrared spectrum and reflects visible and near infrared energy. The long wavelength beam is filtered and focused on an infrared detector positioned in a cryogenic environment. The short wavelength beam is dispersed by a pair of prisms, then projected on an array of detectors also mounted in a cryogenic environment and oriented at an angle relative to the optical path of the dispersed short wavelength beam.
Interdisciplinary scientist participation in the Phobos mission
NASA Technical Reports Server (NTRS)
1992-01-01
Data was acquired from VSK (2 wide-angle visible-NIR TV cameras at 0.4 to 0.6 micrometers and 0.8 to 1.1 micrometers, and a narrow-angle TV camera), KRFM (10-band UV-visible spectrometer at 0.3 to 0.6 micrometers and a 6-band radiometer at 5-50 micrometers), and ISM (a 128-channel NIR imaging spectrometer at 0.8 to 3 micrometers). These data provided improved mapping coverage of Phobos; improved mass, shape, and volume determinations, with the density shown to be lower than that of all known meteorites, suggesting a porous interior; evidence for a physically, spectrally and possibly compositionally heterogeneous surface; and proof that the spectral properties do not closely resemble those of unaltered carbonaceous chondrites, but show more resemblance to the spectra of altered mafic material. For Mars, the data show that the underlying rock type can be distinguished through the global dust cover; that the spectral properties and possibly composition vary laterally between and within the geologic provinces; that the surface physical properties vary laterally, and in many cases, the boundaries coincide with those of the geologic units; and the acquired data also demonstrate the value of reflectance spectroscopy and radiometry to the study of Martian geology.
The emplacement of long lava flows in Mare Imbrium, the Moon
NASA Astrophysics Data System (ADS)
Garry, W. B.
2012-12-01
Lava flow margins are scarce on the lunar surface. The best developed lava flows on the Moon occur in Mare Imbrium, where flow margins are traceable along nearly their entire flow length. The flow field originates in the southwest part of the basin from a fissure or series of fissures and cones located in the vicinity of Euler crater and erupted in three phases (Phases I, II, III) over a period of 0.5 billion years (3.0-2.5 Ga). The flow field was originally mapped with Apollo and Lunar Orbiter data by Schaber (1973); that mapping shows the flow field extends 200 to 1200 km from the presumed source area and covers an area of 2.0 x 10^5 km^2, with an estimated eruptive volume of 4 x 10^4 km^3. Phase I flows extend 1200 km and have the largest flow volume but, interestingly, do not exhibit visible topography and are instead defined by a difference in color from the surrounding mare flows. Phase II and III flows have well-defined flow margins (10-65 m thick) and channels (0.4-2.0 km wide, 40-70 m deep), but shorter flow lengths of 600 km and 400 km, respectively. Recent missions, including Lunar Reconnaissance Orbiter (LRO), Kaguya (Selene), and Clementine, provide high-resolution data sets of these lava flows. Using a combination of data sets, including images from the LRO Wide Angle Camera (WAC) (50-100 m/pixel) and Narrow Angle Camera (NAC) (up to 0.5 m/pixel), the Kaguya Terrain Camera (TC) (10 m/pixel), and topography from the LRO Lunar Orbiter Laser Altimeter (LOLA), the morphology has been remapped and topographic measurements of the flow features have been made in an effort to reevaluate the emplacement of the flow field. Morphologic mapping reveals a different flow path for Phase I compared to the original mapping by Schaber (1973). The boundaries of the Phase I flow field have been revised based on Moon Mineralogy Mapper color ratio images (Staid et al., 2011). This has implications for the area covered and volume erupted during this stage, as well as the age of Phase I. Flow features and margins have been identified in the Phase I flow within the LROC WAC mosaic and in NAC images; these areas have a mottled appearance. LOLA profiles over the more prominent flow lobes in Phase I reveal these margins are less than 10 m thick. Phase II and III morphology maps are similar to previous flow maps. Phase III lobes near Euler are 10-12 km wide and 20-30 m thick, based on measurements of the LOLA 1024 ppd Elevation Digital Terrain Model (DTM) in JMoon. One of the longer Phase III lobes varies between 15 and 50 km wide and 25 to 60 m thick, with the thickest section at the distal end of the lobe. The Phase II lobe is 15 to 25 m thick and up to 35 km wide. The eruptive volume of the Mare Imbrium lava flows has been compared to terrestrial flood basalts. The morphology of the lobes in Phases II and III, which includes levees, thick flow fronts, and lobate margins, suggests these could be similar to terrestrial aa-style flows. The Phase I flows might be more representative of sheet flows, pahoehoe-style flows, or inflated flows. Morphologic comparisons will be made with terrestrial flows at Askja volcano in Iceland, a potential analog for comparing different styles of emplacement for the flows in Mare Imbrium.
ERIC Educational Resources Information Center
Beverly, Robert E.; Young, Thomas J.
Two hundred forty college undergraduates participated in a study of the effect of camera angle on an audience's perceptual judgments of source credibility, dominance, attraction, and homophily. The subjects were divided into four groups and each group was shown a videotape presentation in which sources had been videotaped according to one of four…
NASA Technical Reports Server (NTRS)
Jagge, Amy
2016-01-01
With ever-changing landscapes and environmental conditions due to human-induced climate change, adaptability is imperative for the long-term success of facilities and Federal agency missions. To mitigate the effects of climate change, indicators such as above-ground biomass change must be identified to establish a comprehensive monitoring effort. Researching the varying effects of climate change on ecosystems can provide a scientific framework that will help produce informative, strategic, and tactical policies for environmental adaptation. As a proactive approach to climate change mitigation, NASA tasked the Climate Change Adaptation Science Investigators Workgroup (CASI) to provide climate change expertise and data to Center facility managers and planners in order to ensure sustainability based on predictive models and current research. Generation of historical datasets that will be used in an agency-wide effort to establish strategies for climate change mitigation and adaptation at NASA facilities is part of the CASI strategy. Using time series of historical remotely sensed data is a well-established means of measuring change over time. CASI investigators have acquired multispectral and hyperspectral optical and LiDAR remotely sensed datasets from NASA Earth observation satellites (including the International Space Station), airborne sensors, and astronaut photography using handheld digital cameras to create a historical dataset for the Johnson Space Center, as well as the Houston and Galveston area. The raster imagery within each dataset has been georectified, and the multispectral and hyperspectral imagery has been atmospherically corrected. Using ArcGIS for Server, the CASI-Regional Remote Sensing data have been published as an image service and can be visualized through a basic web mapping application. Future work will include a customized web mapping application created using a JavaScript Application Programming Interface (API), and inclusion of the CASI data for the NASA Johnson Space Center in a NASA-wide GIS Institutional Portal.
3D bubble reconstruction using multiple cameras and space carving method
NASA Astrophysics Data System (ADS)
Fu, Yucheng; Liu, Yang
2018-07-01
An accurate measurement of bubble shape and size is of significant value in understanding the behavior of bubbles in many engineering applications. Past studies usually use one or two cameras to estimate bubble volume and surface area, among other parameters; the 3D bubble shape and rotation angle are generally not available in these studies. To overcome this challenge and obtain more detailed information on individual bubbles, a 3D imaging system consisting of four high-speed cameras is developed in this paper, and the space carving method is used to reconstruct the 3D bubble shape from the recorded high-speed images taken from different view angles. The proposed method can reconstruct the bubble surface with minimal assumptions. A benchmarking test is performed in a 3 cm × 1 cm rectangular channel with stagnant water. The results show that the newly proposed method can measure the bubble volume with an error of less than 2% compared with the syringe reading; the conventional two-camera system has an error of around 10%, and the one-camera system an error greater than 25%. The visualization of a rising 3D bubble demonstrates the wall's influence on bubble rotation angle and aspect ratio, which also explains the large error in single-camera measurements.
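A minimal sketch of the space-carving step, assuming calibrated 3x4 projection matrices and binary silhouette masks from each of the four views (names and data layout are illustrative, not the authors' code):

import numpy as np

def carve(voxels, silhouettes, projections):
    """Keep a candidate voxel only if it projects inside the bubble
    silhouette in every camera view.
    voxels: (N, 3) candidate centers; silhouettes: list of boolean masks;
    projections: list of 3x4 camera matrices, one per view."""
    keep = np.ones(len(voxels), dtype=bool)
    homog = np.hstack([voxels, np.ones((len(voxels), 1))])
    for mask, P in zip(silhouettes, projections):
        uvw = homog @ P.T
        u = np.rint(uvw[:, 0] / uvw[:, 2]).astype(int)
        v = np.rint(uvw[:, 1] / uvw[:, 2]).astype(int)
        inside = (u >= 0) & (u < mask.shape[1]) & (v >= 0) & (v < mask.shape[0])
        keep &= inside
        keep[inside] &= mask[v[inside], u[inside]]
    return voxels[keep]

The bubble volume then follows as the count of surviving voxels times the voxel volume, which is how a carved reconstruction supports the volume benchmarking described above.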
High-emulation mask recognition with high-resolution hyperspectral video capture system
NASA Astrophysics Data System (ADS)
Feng, Jiao; Fang, Xiaojing; Li, Shoufeng; Wang, Yongjin
2014-11-01
We present a method for distinguishing a human face from a high-emulation mask, which is increasingly used by criminals for activities such as stealing card numbers and passwords at ATMs. Traditional facial recognition techniques have difficulty detecting such camouflaged criminals. In this paper, we use a high-resolution hyperspectral video capture system to detect high-emulation masks. An RGB camera is used for traditional facial recognition, while a prism and a grayscale camera capture spectral information of the observed face. Experiments show that a mask made of silica gel has a different spectral reflectance than human skin. As multispectral images offer this additional information about physical characteristics, high-emulation masks can be readily recognized.
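One simple way to act on such a reflectance difference is a spectral-angle comparison against a skin reference spectrum; this is an illustrative sketch of the idea, not the paper's detection pipeline, and the threshold is an assumption:

import numpy as np

def spectral_angle(s1, s2):
    """Angle between two reflectance vectors; small angles indicate
    similar materials."""
    c = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return np.arccos(np.clip(c, -1.0, 1.0))

def looks_like_skin(pixel_spectrum, skin_reference, threshold=0.1):
    """Flag a pixel as skin if its spectrum lies within an (assumed)
    angular threshold of the skin reference; a mask material with a
    different reflectance curve would fall outside it."""
    return spectral_angle(pixel_spectrum, skin_reference) < threshold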
NASA Astrophysics Data System (ADS)
Bernat, Amir S.; Bar-Am, Kfir; Cataldo, Leigh; Bolton, Frank J.; Kahn, Bruce S.; Levitz, David
2018-02-01
Cervical cancer is a leading cause of death for women in low-resource settings. In order to better detect cervical dysplasia, a low-cost multi-spectral colposcope was developed utilizing low-cost LEDs and an area scan camera. The device is capable of both traditional colposcopic imaging and multi-spectral image capture. Following initial bench testing, the device was deployed to a gynecology clinic, where it was used to image patients in a colposcopy setting. Both traditional colposcopic images and spectral data from patients were uploaded to a cloud server for remote analysis. Multi-spectral imaging (~30 second capture) took place before any clinical procedure; the standard of care was followed thereafter. If acetic acid was used in the standard of care, a post-acetowhitening colposcopic image was also captured. In analyzing the data, normal and abnormal regions were identified in the colposcopic images by an expert clinician. Spectral data were fit to a theoretical model based on diffusion theory, yielding information on scattering and absorption parameters. Data were grouped according to clinician labeling of the tissue, as well as any additional clinical test results available (Pap, HPV, biopsy). Altogether, N=20 patients were imaged in this study, 9 of them with abnormalities. In comparing normal and abnormal regions of interest from patients, substantial differences were measured in blood content, while differences in oxygen saturation parameters were more subtle. These results suggest that optical measurements made using low-cost spectral imaging systems can distinguish between normal and pathological tissues.
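The fitting step might look like the sketch below: a toy diffusion-style reflectance model, with absorption driven by a blood-volume fraction and oxygen saturation and scattering by a Mie-type power law, fit with SciPy. The functional form, the placeholder extinction values, and all parameter names are assumptions for illustration; the abstract describes the authors' model only as diffusion-theory based.

import numpy as np
from scipy.optimize import curve_fit

wl = np.linspace(450, 650, 50)  # wavelengths (nm)
# crude placeholder hemoglobin extinction curves; a real fit would use
# tabulated oxy-/deoxyhemoglobin spectra
eps_hbo2 = np.interp(wl, [450, 550, 650], [60.0, 35.0, 2.0])
eps_hb = np.interp(wl, [450, 550, 650], [55.0, 40.0, 8.0])

def reflectance(wl, f_blood, s_o2, a_mus, b_mie):
    """Toy semi-infinite diffusion shape: R ~ exp(-sqrt(3*mua*(mua+mus')))."""
    mua = f_blood * (s_o2 * eps_hbo2 + (1.0 - s_o2) * eps_hb)
    mus = a_mus * (wl / 550.0) ** (-b_mie)
    return np.exp(-np.sqrt(3.0 * mua * (mua + mus)))

# fit a measured spectrum (synthetic here) for per-region parameters such
# as blood content (f_blood) and oxygen saturation (s_o2)
r_meas = reflectance(wl, 0.02, 0.7, 2.0, 1.2) + np.random.normal(0, 0.01, wl.size)
popt, _ = curve_fit(reflectance, wl, r_meas, p0=[0.01, 0.5, 1.0, 1.0],
                    bounds=([0, 0, 0.1, 0.1], [1, 1, 10, 4]))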
Multipurpose Hyperspectral Imaging System
NASA Technical Reports Server (NTRS)
Mao, Chengye; Smith, David; Lanoue, Mark A.; Poole, Gavin H.; Heitschmidt, Jerry; Martinez, Luis; Windham, William A.; Lawrence, Kurt C.; Park, Bosoon
2005-01-01
A hyperspectral imaging system of high spectral and spatial resolution that incorporates several innovative features has been developed around a focal plane scanner (U.S. Patent 6,166,373). This feature enables the system to be used for both airborne/spaceborne and laboratory hyperspectral imaging with or without relative movement of the imaging system, and it can scan a target of any size as long as the target can be imaged at the focal plane; examples include automated inspection of food items and identification of single-celled organisms. The spectral resolution of this system is greater than that of prior terrestrial multispectral imaging systems. Moreover, unlike prior high-spectral-resolution airborne and spaceborne hyperspectral imaging systems, this system does not rely on relative movement of the target and the imaging system to sweep an imaging line across a scene. This compact system (see figure) consists of a front objective mounted on a translation stage with a motorized actuator, and a line-slit imaging spectrograph mounted within a rotary assembly with a rear adaptor to a charge-coupled-device (CCD) camera. Push-broom scanning is carried out by the motorized actuator, which can be controlled either manually by an operator or automatically by a computer to drive the line slit across an image at a focal plane of the front objective. To reduce cost, the system has been designed to integrate as many off-the-shelf components as possible, including the CCD camera and spectrograph. The system has achieved high spectral and spatial resolutions by using a high-quality CCD camera, spectrograph, and front objective lens. Fixtures for attachment of the system to a microscope (U.S. Patent 6,495,818 B1) make it possible to acquire multispectral images of single cells and other microscopic objects.
Near-field observation platform
NASA Astrophysics Data System (ADS)
Schlemmer, Harry; Baeurle, Constantin; Vogel, Holger
2008-04-01
A miniaturized near-field observation platform is presented, comprising a sensitive daylight camera and an uncooled micro-bolometer thermal imager, each equipped with a wide-angle lens. Both cameras are optimised for ranges between a few meters and 200 m. The platform features a stabilised line of sight and can therefore also be used on a vehicle in motion. The line of sight can either be directed manually or the platform can be used in a panoramic mode. The video output is connected to a control panel where algorithms for moving-target indication or tracking can be applied in order to support the observer. The near-field platform can also be netted with the vehicle system, and its signals can be utilised, e.g., to designate a new target to the main periscope or the weapon sight.
1986-01-22
Range: 2.7 million kilometers (1.7 million miles). P-29497C. This Voyager 2 false-color composite of Uranus demonstrates the usefulness of special filters in the Voyager cameras for revealing the presence of high-altitude hazes in Uranus' atmosphere. The picture is a composite of images obtained through the single orange and two methane filters of Voyager's wide-angle camera. The orange, short-wavelength methane, and long-wavelength methane images are displayed, respectively, as blue, green, and orange. The pink area centered on the pole is due to the presence of hazes high in the atmosphere that reflect the light before it has traversed a long enough path through the atmosphere to suffer absorption by methane gas. The bluest regions, at mid-latitudes, represent the most haze-free areas on Uranus; thus, deeper cloud levels can be detected in these areas.
Modified plenoptic camera for phase and amplitude wavefront sensing
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Davis, Christopher C.
2013-09-01
Shack-Hartmann sensors have been widely applied in wavefront sensing. However, they are limited to measuring slightly distorted wavefronts whose local tilt does not exceed the numerical aperture of the micro-lens array, and cross-talk of incident waves on the micro-lens array must be strictly avoided. In the medium-to-strong turbulence regimes of optical communication, where large jitter in the angle of arrival and local interference caused by beam break-up are common phenomena, Shack-Hartmann sensors no longer serve as effective tools for revealing distortions in a signal wave. Our design of a modified plenoptic camera shows great potential for observing and extracting useful information from severely disturbed wavefronts. Furthermore, by separating complex interference patterns into several minor interference cases, it may also be capable of determining regional phase differences of coherently illuminated objects.
Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.
Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno
2008-11-17
The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to on-board image processing and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections with the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera; and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in light of the experimental protocol, and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
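A minimal sketch of the two corrections the study highlights (flat-field division for vignetting, and a normalized band index to cancel illumination), with all names and the flat-field source assumed:

import numpy as np

def correct_vignetting(img, flat):
    """Divide out a per-pixel flat-field (e.g. an image of a uniform
    target, normalized to 1 at the center) to remove radial fall-off."""
    return img.astype(float) / np.clip(flat, 1e-6, None)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index; band ratios of this kind
    also cancel much of the between-date illumination variation the
    study reports."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)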
Uav Borne Low Altitude Photogrammetry System
NASA Astrophysics Data System (ADS)
Lin, Z.; Su, G.; Xie, F.
2012-07-01
In this paper, three major aspects of Unmanned Aerial Vehicle (UAV) systems for low-altitude aerial photogrammetry are discussed: the flying platform, the imaging sensor system, and the data processing software. First, the performance and suitability of the available UAV platforms (e.g., fixed-wing UAVs, unmanned helicopters, and unmanned airships) are compared and analyzed against technical requirements such as minimum cruising speed, shortest taxiing distance, flight-control capability, and performance in turbulence. Second, considering platform payload limits and sensor resolution, together with the exposure equation and optical information theory, emphasis is placed on the principles of designing self-calibrating, self-stabilizing combined wide-angle digital cameras (e.g., double-camera and four-camera combinations). Finally, a software package named MAP-AT, designed around the particularities of UAV platforms and sensors, is developed and introduced. Beyond the common functions of aerial image processing, MAP-AT emphasizes automatic extraction, automatic checking, and operator-assisted addition of tie points for images with large tilt angles. Following the process for low-altitude photogrammetry with UAVs recommended in this paper, more than ten aerial photogrammetry missions have been completed; their aerial triangulation, digital orthophoto (DOM), and digital line graph (DLG) accuracies meet the standard requirements for 1:2000, 1:1000, and 1:500 mapping.
Polarimetric Observations of the Lunar Surface
NASA Astrophysics Data System (ADS)
Kim, S.
2017-12-01
Polarimetric images contain valuable information on the lunar surface, such as the grain size and porosity of the regolith, from which one can estimate the space weathering environment on the lunar surface. Surprisingly, polarimetric observation has never before been conducted from lunar orbit. A Wide-Angle Polarimetric Camera (PolCam) has recently been selected as one of three Korean science instruments onboard the Korea Pathfinder Lunar Orbiter (KPLO), planned for launch in 2019/2020 as the first Korean lunar mission. PolCam will obtain 80 m-resolution polarimetric images of the whole lunar surface between -70º and +70º latitude at the 320, 430, and 750 nm bands, for phase angles up to 115º. I will also discuss previous polarimetric studies of the lunar surface based on our ground-based observations.