Sample records for angle infrared camera

  1. New gonioscopy system using only infrared light.

    PubMed

    Sugimoto, Kota; Ito, Kunio; Matsunaga, Koichi; Miura, Katsuya; Esaki, Koji; Uji, Yukitaka

    2005-08-01

    To describe an infrared gonioscopy system designed to observe the anterior chamber angle under natural mydriasis in a completely darkened room. An infrared light filter was used to modify the light source of the slit-lamp microscope. A television monitor connected to a CCD monochrome camera was used to indirectly observe the angle. Use of the infrared system enabled observation of the angle under natural mydriasis in a completely darkened room. Infrared gonioscopy is a useful procedure for the observation of the angle under natural mydriasis.

  2. Near-infrared light-guided miniaturized indirect ophthalmoscopy for nonmydriatic wide-field fundus photography.

    PubMed

    Toslak, Devrim; Liu, Changgeng; Alam, Minhaj Nur; Yao, Xincheng

    2018-06-01

    A portable fundus imager is essential for emerging telemedicine screening and point-of-care examination of eye diseases. However, existing portable fundus cameras have limited field of view (FOV) and frequently require pupillary dilation. We report here a miniaturized indirect ophthalmoscopy-based nonmydriatic fundus camera with a snapshot FOV up to 67° external angle, which corresponds to a 101° eye angle. The wide-field fundus camera consists of a near-infrared light source (LS) for retinal guidance and a white LS for color retinal imaging. By incorporating digital image registration and glare elimination methods, a dual-image acquisition approach was used to achieve reflection artifact-free fundus photography.

  3. Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera

    NASA Astrophysics Data System (ADS)

    Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu

    2016-09-01

    We perform an experiment of achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under both visible and infrared light. As a result, the removal of the chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration set assuming the use of a night vision camera in an automobile and showed the effect of the WFC system.

  4. Instrumentation for Infrared Airglow Clutter.

    DTIC Science & Technology

    1987-03-10

    gain, and filter position to the Camera Head, and monitors these parameters as well as preamp video. GAZER is equipped with a Lenzar wide angle, low... Specifications/Parameters VIDEO SENSOR: Camera... LENZAR Intensicon-8 LLLTV using 2nd gen micro-channel intensifier and proprietary camera tube

  5. Multi-Angle Snowflake Camera Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stuefer, Martin; Bailey, J.

    2016-07-01

    The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36°. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' fields of view within a 10° angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
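    The pairing of a very short exposure with micron-scale resolution can be sanity-checked with a quick motion-blur estimate. This sketch is not from the handbook; the ~1 m/s fall speed is an assumed typical value for snowflakes:

```python
# Motion blur = fall speed x exposure time; compare with the pixel resolution.
def motion_blur_um(fall_speed_m_s, exposure_s):
    """Smear in micrometers accumulated during one exposure."""
    return fall_speed_m_s * exposure_s * 1e6  # m -> um

blur = motion_blur_um(fall_speed_m_s=1.0, exposure_s=1 / 25000)
print(f"blur ~ {blur:.0f} um")  # ~40 um, the same order as the 9-37 um resolution
```

At 1/25,000 s the blur stays comparable to a single pixel, which is why the stated exposure suffices for sharp freefall imagery.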

  6. Research on Geometric Calibration of Spaceborne Linear Array Whiskbroom Camera

    PubMed Central

    Sheng, Qinghong; Wang, Qi; Xiao, Hui; Wang, Qing

    2018-01-01

    The geometric calibration of a spaceborne thermal-infrared camera with high spatial resolution and wide coverage can set benchmarks for providing accurate geographical coordinates for the retrieval of land surface temperature. Using linear array whiskbroom Charge-Coupled Device (CCD) arrays to image the Earth makes it possible to obtain wide-swath thermal-infrared images with high spatial resolution. Focusing on the whiskbroom characteristics of equal time intervals and unequal angles, the present study proposes a spaceborne linear-array-scanning imaging geometric model and calibrates the temporal system parameters and whiskbroom angle parameters. Using the YG-14, China's first satellite equipped with thermal-infrared cameras of high spatial resolution, imagery of Anyang and Taiyuan is used to conduct a geometric calibration experiment and a verification test, respectively. Results show that the plane positioning accuracy without ground control points (GCPs) is better than 30 pixels, and the plane positioning accuracy with GCPs is better than 1 pixel. PMID:29337885

  7. Auto-converging stereo cameras for 3D robotic tele-operation

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Aycock, Todd; Chenault, David

    2012-06-01

    Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow operator control of the camera convergence angle. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an automatic convergence algorithm in a field programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than on adjustment of the vision system. The auto-convergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
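    The geometry behind such a convergence adjustment is simple triangulation: the toe-in angle that makes both optical axes intersect at the target shrinks with distance. A minimal sketch, assuming a hypothetical 6 cm stereo baseline (the TALON kit's actual baseline is not given in the record):

```python
import math

def convergence_angle_deg(baseline_m, distance_m):
    """Total toe-in angle so both optical axes intersect at the target distance."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

# Nearby objects need far more toe-in than distant ones:
for d in (0.5, 2.0, 10.0):
    print(f"{d:5.1f} m -> {convergence_angle_deg(0.06, d):.2f} deg")
```

This is why a fixed convergence setting cannot serve all ranges, and why the FPGA re-estimates the angle from scene content as range changes.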

  8. Experimental Study of Multispectral Characteristics of an Unmanned Aerial Vehicle at Different Observation Angles

    PubMed Central

    Zheng, Haijing; Bai, Tingzhu; Wang, Quanxi; Cao, Fengmei; Shao, Long; Sun, Zhaotian

    2018-01-01

    This study investigates multispectral characteristics of an unmanned aerial vehicle (UAV) at different observation angles by experiment. The UAV and its engine are tested on the ground in the cruise state. Spectral radiation intensities at different observation angles are obtained in the infrared band of 0.9–15 μm by a spectral radiometer. Meanwhile, infrared images are captured separately by long-wavelength infrared (LWIR), mid-wavelength infrared (MWIR), and short-wavelength infrared (SWIR) cameras. Additionally, orientation maps of the radiation area and radiance are obtained. The results suggest that the spectral radiation intensity of the UAV is determined by its exhaust plume and that the main infrared emission bands occur at 2.7 μm and 4.3 μm. At observation angles in the range of 0°–90°, the radiation area of the UAV in MWIR band is greatest; however, at angles greater than 90°, the radiation area in the SWIR band is greatest. In addition, the radiance of the UAV at an angle of 0° is strongest. These conclusions can guide IR stealth technique development for UAVs. PMID:29389880

  9. Colors of active regions on comet 67P

    NASA Astrophysics Data System (ADS)

    Oklay, N.; Vincent, J.-B.; Sierks, H.; Besse, S.; Fornasier, S.; Barucci, M. A.; Lara, L.; Scholten, F.; Preusker, F.; Lazzarin, M.; Pajola, M.; La Forgia, F.

    2015-10-01

    The OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) scientific imager (Keller et al. 2007) has been successfully delivering images of comet 67P/Churyumov-Gerasimenko from both its wide angle camera (WAC) and narrow angle camera (NAC) since the arrival of ESA's Rosetta spacecraft at the comet. Both cameras are equipped with filters covering the wavelength range of about 200 nm to 1000 nm. The comet nucleus is mapped with different combinations of the filters at resolutions up to 15 cm/px. Besides determining the surface morphology in great detail (Thomas et al. 2015), such high resolution images provided us a means to unambiguously link some activity in the coma to a series of pits on the nucleus surface (Vincent et al. 2015).

  10. Compensation method for the influence of angle of view on animal temperature measurement using thermal imaging camera combined with depth image.

    PubMed

    Jiao, Leizi; Dong, Daming; Zhao, Xiande; Han, Pengcheng

    2016-12-01

    In this study, we proposed an animal surface temperature measurement method based on a Kinect sensor and an infrared thermal imager to facilitate the screening of animals with febrile diseases. Because of the random motion and small surface temperature variation of animals, the influence of the angle of view on temperature measurement is significant. The method proposed in the present study can compensate for the temperature measurement error caused by the angle of view. First, we analyzed the relationship between measured temperature and angle of view and established a mathematical model for compensating the influence of the angle of view, with a correlation coefficient above 0.99. Second, a fusion method for depth and infrared thermal images was established for synchronous image capture with the Kinect sensor and infrared thermal imager, and the angle of view of each pixel was calculated. According to the experimental results, without compensation, the temperature image measured at an angle of view of 74° to 76° differed by more than 2 °C from that measured at an angle of view of 0°. After compensation, the temperature difference range was only 0.03-1.2 °C. This method is applicable for real-time compensation of errors caused by the angle of view during temperature measurement with an infrared thermal imager. Copyright © 2016 Elsevier Ltd. All rights reserved.
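    The idea of calibrating measured temperature against angle and then inverting the fit can be sketched with a simple lookup-table compensation. The calibration values below are synthetic illustrations; the paper's actual model form and coefficients are not reproduced in the abstract:

```python
from bisect import bisect_right

# Calibration table: the same reference target measured at several viewing
# angles (synthetic demo values, not the paper's data).
CAL_ANGLES = [0, 15, 30, 45, 60, 75]                     # viewing angle, deg
CAL_RATIO = [1.000, 0.997, 0.989, 0.974, 0.950, 0.916]   # measured / true temp

def attenuation(angle_deg):
    """Linearly interpolate the calibrated measured/true ratio."""
    if angle_deg <= CAL_ANGLES[0]:
        return CAL_RATIO[0]
    if angle_deg >= CAL_ANGLES[-1]:
        return CAL_RATIO[-1]
    i = bisect_right(CAL_ANGLES, angle_deg) - 1
    frac = (angle_deg - CAL_ANGLES[i]) / (CAL_ANGLES[i + 1] - CAL_ANGLES[i])
    return CAL_RATIO[i] + frac * (CAL_RATIO[i + 1] - CAL_RATIO[i])

def compensate(t_measured, angle_deg):
    """Divide out the angle-dependent attenuation."""
    return t_measured / attenuation(angle_deg)

print(round(compensate(34.8, 75), 1))  # prints 38.0: back near the 0-deg reading
```

With the per-pixel angle of view from the depth camera, such a correction can be applied to every pixel of the thermal image in real time.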

  11. Land-based infrared imagery for marine mammal detection

    NASA Astrophysics Data System (ADS)

    Graber, Joseph; Thomson, Jim; Polagye, Brian; Jessup, Andrew

    2011-09-01

    A land-based infrared (IR) camera is used to detect endangered Southern Resident killer whales in Puget Sound, Washington, USA. The observations are motivated by a proposed tidal energy pilot project, which will be required to monitor for environmental effects. Potential monitoring methods also include visual observation, passive acoustics, and active acoustics. The effectiveness of observations in the infrared spectrum is compared to observations in the visible spectrum to assess the viability of infrared imagery for cetacean detection and classification. Imagery was obtained at Lime Kiln Park, Washington from 7/6/10 to 7/9/10 using a FLIR Thermovision A40M infrared camera (7.5-14 μm, 37° HFOV, 320×240 pixels) under ideal atmospheric conditions (clear skies, calm seas, and wind speeds of 0-4 m/s). Whales were detected during both day (9 detections) and night (75 detections) at distances ranging from 42 to 162 m. The temperature contrast between dorsal fins and the sea surface ranged from 0.5 to 4.6 °C. Differences in emissivity from sea surface to dorsal fin are shown to aid detection at high incidence angles (near grazing). A comparison to theory is presented, and observed deviations from theory are investigated. A guide for infrared camera selection based on site geometry and desired target size is presented, with specific considerations regarding marine mammal detection. Atmospheric conditions required to use visible and infrared cameras for marine mammal detection are established and compared with 2008 meteorological data for the proposed tidal energy site. Using conservative assumptions, infrared observations are predicted to provide a 74% increase in hours of possible detection compared with visual observations.
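    The camera-selection guide mentioned above amounts to an instantaneous-field-of-view (IFOV) calculation: how many pixels a target of a given size subtends at a given range. The A40M numbers come from the abstract; the ~0.5 m dorsal-fin size is an assumed value for illustration:

```python
import math

def pixels_on_target(hfov_deg, h_pixels, range_m, target_m):
    """Number of pixels a target of the given size spans at the given range."""
    ifov_rad = math.radians(hfov_deg) / h_pixels   # per-pixel angular footprint
    footprint_m = range_m * ifov_rad               # size of one pixel on target
    return target_m / footprint_m

# FLIR A40M (37 deg HFOV, 320 px wide) viewing a ~0.5 m dorsal fin:
print(f"{pixels_on_target(37, 320, 42, 0.5):.1f} px at  42 m")
print(f"{pixels_on_target(37, 320, 162, 0.5):.1f} px at 162 m")
```

At the far end of the reported detection range the fin spans only a pixel or two, which shows why lens choice and site geometry dominate the selection of a suitable camera.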

  12. Surface compositional variation on the comet 67P/Churyumov-Gerasimenko by OSIRIS data

    NASA Astrophysics Data System (ADS)

    Barucci, M. A.; Fornasier, S.; Feller, C.; Perna, D.; Hasselmann, H.; Deshapriya, J. D. P.; Fulchignoni, M.; Besse, S.; Sierks, H.; Forgia, F.; Lazzarin, M.; Pommerol, A.; Oklay, N.; Lara, L.; Scholten, F.; Preusker, F.; Leyrat, C.; Pajola, M.; Osiris-Rosetta Team

    2015-10-01

    Since the Rosetta mission arrived at comet 67P/Churyumov-Gerasimenko (67P/C-G) in July 2014, the comet nucleus has been mapped by both the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System, [1]) NAC (Narrow Angle Camera) and WAC (Wide Angle Camera), acquiring a huge quantity of surface images in different wavelength bands, under variable illumination conditions and spatial resolutions, and producing the most detailed maps at the highest spatial resolution of a comet nucleus surface. 67P/C-G's nucleus shows an irregular bi-lobed shape of complex morphology, with terrains showing intricate features [2, 3] and a surface that is heterogeneous at different scales.

  13. Modeling of the ITER-like wide-angle infrared thermography view of JET.

    PubMed

    Aumeunier, M-H; Firdaouss, M; Travère, J-M; Loarer, T; Gauthier, E; Martin, V; Chabaud, D; Humbert, E

    2012-10-01

    Infrared (IR) thermography systems are mandatory to ensure safe plasma operation in fusion devices. However, IR measurements are made much more complicated in a metallic environment because of the spurious contributions of reflected fluxes. This paper presents a full predictive photonic simulation able to accurately assess the surface temperature measurement obtained with classical IR thermography for a given plasma scenario, taking into account the optical properties of the plasma-facing component (PFC) materials. This simulation has been carried out for the ITER-like wide angle infrared camera view of JET and compared with experimental data. The consequences of the low emissivity and of the bidirectional reflectance distribution function used in the model for the metallic PFCs on the contribution of the reflected flux in the analysis are discussed.
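    The measurement problem the simulation addresses can be summarized as the camera seeing a blend of emitted and reflected radiance, L = eps * L_bb(T_surf) + (1 - eps) * L_refl. A graybody, total-radiance (T^4) sketch of how low emissivity inflates the apparent temperature; the emissivity and temperatures are illustrative values, not JET data:

```python
def apparent_temp_K(t_surf, t_refl, emissivity):
    """Graybody, total-radiance approximation (L ~ T^4): blend emitted and
    reflected contributions, then convert back to an equivalent blackbody
    temperature as a simple camera would."""
    blended = emissivity * t_surf**4 + (1 - emissivity) * t_refl**4
    return blended ** 0.25

# A low-emissivity metallic surface at 400 K reflecting a 600 K environment
# reads far hotter than it is:
print(round(apparent_temp_K(400.0, 600.0, 0.3), 1))
```

Real IR cameras work in a finite spectral band (so the Planck function replaces the T^4 law), but the qualitative conclusion is the same: for low-emissivity PFCs the reflected term can dominate the measurement, which is what the photonic simulation quantifies.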

  14. Modelling of the outburst on July 29th , 2015 observed with OSIRIS in the southern hemisphere of comet 67P/Churyumov-Gerasimenko

    NASA Astrophysics Data System (ADS)

    Gicquel, Adeline; Vincent, Jean-Baptiste; Sierks, Holger; Rose, Martin; Agarwal, Jessica; Deller, Jakob; Guettler, Carsten; Hoefner, Sebastian; Hofmann, Marc; Hu, Xuanyu; Kovacs, Gabor; Oklay Vincent, Nilda; Shi, Xian; Tubiana, Cecilia; Barbieri, Cesare; Lamy, Phylippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team

    2016-10-01

    Images of the nucleus and the coma (gas and dust) of comet 67P/Churyumov-Gerasimenko have been acquired by the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) camera system since March 2014 using both the wide angle camera (WAC) and the narrow angle camera (NAC). We are using the NAC camera to study the bright outburst observed on July 29th, 2015 in the southern hemisphere. The NAC covers wavelengths between 250 and 1000 nm with a combination of 12 filters. Its high spatial resolution is needed to localize the source point of the outburst on the surface of the nucleus. At the time of the observations, the heliocentric distance was 1.25 AU and the distance between the spacecraft and the comet was 126 km. We aim to understand the physics leading to such outgassing: Is the jet associated with the outburst controlled by the micro-topography? Or by suddenly exposed ice? We are using the Direct Simulation Monte Carlo (DSMC) method to study the gas flow close to the nucleus. The goal of the DSMC code is to reproduce the opening angle of the jet and to constrain the outgassing ratio between the outburst source and the local region. The results of this model will be compared to the images obtained with the NAC camera.

  15. VizieR Online Data Catalog: Solar neighborhood. XXXVII. RVs for M dwarfs (Benedict+, 2016)

    NASA Astrophysics Data System (ADS)

    Benedict, G. F.; Henry, T. J.; Franz, O. G.; McArthur, B. E.; Wasserman, L. H.; Jao, W.-C.; Cargile, P. A.; Dieterich, S. B.; Bradley, A. J.; Nelan, E. P.; Whipple, A. L.

    2017-05-01

    During this project we observed with two Fine Guidance Sensor (FGS) units: FGS 3 from 1992 to 2000, and FGS 1r from 2000 to 2009. FGS 1r replaced the original FGS 1 during Hubble Space Telescope (HST) Servicing Mission 3A in late 1999. We included visual, photographic, and CCD observations of separations and position angles from Geyer et al. 1988AJ.....95.1841G for our analysis of GJ 65 AB. We include a single observation of G 193-027 AB from Beuzit et al. 2004A&A...425..997B, who used the Adaptive Optics Bonnette system on the Canada-France-Hawaii Telescope (CFHT). For GJ 65 AB we include five Very Large Telescope/NAos-COnica (VLT/NACO) measures of position angle and separation (Kervella et al. 2016A&A...593A.127K). For our analysis of GJ 623 AB, we included astrometric observations (Martinache et al. 2007ApJ...661..496M) performed with the Palomar High Angular Resolution Observer (PHARO) instrument on the Palomar 200in (5m) telescope and with the Near InfraRed Camera 2 (NIRC2) instrument on the Keck II telescope. Separations have typical errors of 2 mas. Position angle errors average 0.5°. Measurements are included for GJ 22 AC from McCarthy et al. 1991AJ....101..214M and for GJ 473 AB from Henry et al. 1992AJ....103.1369H and Torres et al. 1999AJ....117..562T, who used a two-dimensional infrared speckle camera containing a 58×62 pixel InSb array on the Steward Observatory 90in telescope. We also include infrared speckle observations by Woitas et al. 2003A&A...406..293W, who obtained fourteen separation and position angle measurements for GJ 22 AC with the near-infrared cameras MAGIC and OMEGA Cass at the 3.5m telescope on Calar Alto. We also include a few speckle observations at optical wavelengths from the Special Astrophysical Observatory 6m Bolshoi Azimuth Telescope (BTA) and 1m Zeiss (Balega et al. 1994, Cat. J/A+AS/105/503), from the CFHT (Blazit et al. 1987) and from the Differential Speckle Survey Instrument (DSSI) on the Wisconsin, Indiana, Yale, National Optical Astronomy Observatory (WIYN) 3.5m (Horch et al. 2012, Cat. J/AJ/143/10). Where available, we use astrometric observations from HST instruments other than the FGSs, including the Faint Object Camera (FOC; Barbieri et al. 1996A&A...315..418B), the Faint Object Spectrograph (FOS; Schultz et al. 1998PASP..110...31S), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS; Golimowski et al. 2004AJ....128.1733G), and the Wide-Field Planetary Camera 2 (WFPC2; Schroeder et al. 2000AJ....119..906S; Dieterich et al. 2012, Cat. J/AJ/144/64). Our radial velocity measurements, listed in table 3, are from two sources. We obtained most radial velocity data with the McDonald 2.1m Struve telescope and the Sandiford Cassegrain Echelle spectrograph, hereafter CE. The CE delivers a dispersion equivalent to 2.5 km/s/pix (R=λ/Δλ=60000) with a wavelength range of 5500 ≤ λ ≤ 6700 Å spread across 26 orders (apertures). The McDonald data were collected during 33 observing runs from 1995 to 2009. Some GJ 623 AB velocities came from the Hobby-Eberly Telescope (HET) using the Tull Spectrograph. (3 data files).

  16. Rosetta/OSIRIS - Nucleus morphology and activity of comet 67P/Churyumov-Gerasimenko

    NASA Astrophysics Data System (ADS)

    Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rickman, Hans; Rodrigo, Rafael; Koschny, Detlef

    2015-04-01

    ESA's Rosetta mission arrived on August 6, 2014, at target comet 67P/Churyumov-Gerasimenko after 10 years of cruise. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. It comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide field coma investigations. OSIRIS imaged the nucleus and coma of the comet from arrival throughout the mapping phase, PHILAE landing, early escort phase and close fly-by. The overview paper will discuss the surface morphology and activity of the nucleus as seen in gas, dust, and local jets as well as small scale structures in the local topography.

  17. Multi-viewer tracking integral imaging system and its viewing zone analysis.

    PubMed

    Park, Gilbae; Jung, Jae-Hyun; Hong, Keehoon; Kim, Yunhee; Kim, Young-Hoon; Min, Sung-Wook; Lee, Byoungho

    2009-09-28

    We propose a multi-viewer tracking integral imaging system for viewing angle and viewing zone improvement. In the tracking integral imaging system, the pickup angles of each elemental lens in the lens array are decided by the positions of the viewers, which means the elemental image can be made for each viewer to provide a wider viewing angle and a larger viewing zone. Our tracking integral imaging system is implemented with an infrared camera and infrared light emitting diodes, which can track the viewers' exact positions robustly. For multiple viewers to watch integrated three-dimensional images in the tracking integral imaging system, it is necessary to formulate the relationship between the multiple viewers' positions and the elemental images. We analyzed the relationship and the conditions for multiple viewers, and verified them by implementing a two-viewer tracking integral imaging system.

  18. Multi-Angle Snowflake Camera Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shkurko, Konstantin; Garrett, T.; Gaustad, K.

    The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of the depth of field of its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via a FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: images rely on the OpenCV image processing library, and derived aggregated statistics rely on some clever averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
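    The fallspeed computation described above reduces to distance over time between the two trigger arrays. A minimal sketch; the trigger timestamps are invented for illustration:

```python
def fall_speed_m_s(t_upper_s, t_lower_s, separation_m=0.032):
    """Speed from the time to traverse the 32 mm between the upper and lower
    near-infrared trigger arrays (upper array fires first)."""
    return separation_m / (t_lower_s - t_upper_s)

# A flake crossing the upper array at t=0 s and the lower array 26.7 ms later:
print(round(fall_speed_m_s(0.000, 0.0267), 2))  # prints 1.2 (m/s)
```

In practice the VAP also has to reject spurious trigger pairs (e.g. from two different particles), which is part of what the trigger electronics and averaging logic handle.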

  19. Two Perspectives on Forest Fire

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Multi-angle Imaging Spectroradiometer (MISR) images of smoke plumes from wildfires in western Montana acquired on August 14, 2000. A portion of Flathead Lake is visible at the top, and the Bitterroot Range traverses the images. The left view is from MISR's vertical-viewing (nadir) camera. The right view is from the camera that looks forward at a steep angle (60 degrees). The smoke location and extent are far more visible when seen at this highly oblique angle. However, vegetation is much darker in the forward view. A brown burn scar is located nearly in the exact center of the nadir image, while in the high-angle view it is shrouded in smoke. Also visible in the center and upper right of the images, and more obvious in the clearer nadir view, are checkerboard patterns on the surface associated with land ownership boundaries and logging. Compare these images with the high resolution infrared imagery captured nearby by Landsat 7 half an hour earlier. Images by NASA/GSFC/JPL, MISR Science Team.

  20. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-08-31

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth of field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
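    The "optimal viewing angle" question reduces to covering the measured head-movement envelope at the working distance. This sketch is only the geometric core of that trade-off; the 20 cm travel range and 60 cm distance are assumed illustrative values, not the paper's measurements:

```python
import math

def required_hfov_deg(head_travel_m, working_distance_m):
    """Horizontal FOV needed to keep the head in frame over its travel range."""
    return math.degrees(2 * math.atan((head_travel_m / 2) / working_distance_m))

# e.g. +/-10 cm of lateral head movement at a 60 cm monitor distance:
print(round(required_hfov_deg(0.20, 0.60), 1))  # prints 18.9 (degrees)
```

A wider lens covers more head movement but spreads fewer pixels over the eye region, which is exactly the viewing-angle/DOF trade-off the study measures empirically.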

  21. Preliminary calibration results of the wide angle camera of the imaging instrument OSIRIS for the Rosetta mission

    NASA Astrophysics Data System (ADS)

    Da Deppo, V.; Naletto, G.; Nicolosi, P.; Zambolin, P.; De Cecco, M.; Debei, S.; Parzianello, G.; Ramous, P.; Zaccariotto, M.; Fornasier, S.; Verani, S.; Thomas, N.; Barthol, P.; Hviid, S. F.; Sebastian, I.; Meller, R.; Sierks, H.; Keller, H. U.; Barbieri, C.; Angrilli, F.; Lamy, P.; Rodrigo, R.; Rickman, H.; Wenzel, K. P.

    2017-11-01

    Rosetta is one of the cornerstone missions of the European Space Agency, with the goal of a rendezvous with comet 67P/Churyumov-Gerasimenko in 2014. The imaging instrument on board the satellite is OSIRIS (Optical, Spectroscopic and Infrared Remote Imaging System), a cooperation among several European institutes, which consists of two cameras: a Narrow Angle Camera (NAC) and a Wide Angle Camera (WAC). The WAC optical design is innovative: it adopts an all-reflecting, unvignetted and unobstructed two-mirror configuration which covers a 12° × 12° field of view with an F/5.6 aperture and gives a nominal contrast ratio of about 10^-4. The flight model of this camera has been successfully integrated and tested in our laboratories, and has finally been integrated on the satellite, which is now waiting to be launched in February 2004. In this paper we describe the optical characteristics of the camera and summarize the results obtained so far with the preliminary calibration data. The analysis of the optical performance of this model shows good agreement between theoretical performance and experimental results.

  22. Mechanism controller system for the optical spectroscopic and infrared remote imaging system instrument on board the Rosetta space mission

    NASA Astrophysics Data System (ADS)

    Castro Marín, J. M.; Brown, V. J. G.; López Jiménez, A. C.; Rodríguez Gómez, J.; Rodrigo, R.

    2001-05-01

    The optical, spectroscopic, and infrared remote imaging system (OSIRIS) is an instrument carried on board the European Space Agency spacecraft Rosetta, which will be launched in January 2003 to study comet Wirtanen in situ. The electronic design of the mechanism controller board (MCB) system of the two OSIRIS optical cameras, the narrow angle camera and the wide angle camera, is described here. The system comprises two boards mounted on an aluminum frame as part of an electronics box that contains the power supply and the digital processor unit of the instrument. The mechanisms controlled by the MCB for each camera are the front door assembly and a filter wheel assembly. The front door assembly for each camera is driven by a four-phase, permanent magnet stepper motor. Each filter wheel assembly consists of two eight-filter wheels, each driven by a four-phase, variable reluctance stepper motor. Each motor, in all the assemblies, also contains a redundant set of four stator phase windings that can be energized separately or in parallel with the main windings. All stepper motors are driven in both directions using the full-step unipolar mode of operation. The MCB also performs general housekeeping data acquisition for the OSIRIS instrument, i.e., mechanism position encoders and temperature measurements. The electronic design is novel in its use of field programmable gate array devices, which avoid the now-traditional approach of a system controlled by microcontrollers and software. Electrical tests of the engineering model have been performed successfully and the system is ready for space qualification after environmental testing. This system may be of interest to institutions involved in future space experiments with similar needs for mechanism control.
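    Full-step unipolar drive of a four-phase stepper cycles the windings in a fixed rotation, reversed for the opposite direction. A minimal sketch of the one-phase-on variant of that sequence (the MCB implements this in FPGA logic, not software; phase names and ordering here are illustrative):

```python
# Full-step, one-phase-on energization sequence for a 4-phase unipolar stepper.
SEQUENCE = [
    (1, 0, 0, 0),  # phase A
    (0, 1, 0, 0),  # phase B
    (0, 0, 1, 0),  # phase C
    (0, 0, 0, 1),  # phase D
]

def step_pattern(step_index, direction=+1):
    """Winding energization for a given step; direction=-1 reverses rotation."""
    return SEQUENCE[(direction * step_index) % 4]

print([step_pattern(i) for i in range(4)])                  # A, B, C, D
print([step_pattern(i, direction=-1) for i in range(4)])    # A, D, C, B
```

A two-phase-on full-step variant (A+B, B+C, ...) trades higher torque for higher power draw; the abstract does not say which variant the MCB uses.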

  4. 3D medical thermography device

    NASA Astrophysics Data System (ADS)

    Moghadam, Peyman

    2015-05-01

In this paper, a novel handheld 3D medical thermography system is introduced. The proposed system consists of a thermal-infrared camera, a color camera, and a depth camera rigidly attached in close proximity and mounted on an ergonomic handle. As a practitioner smoothly moves the device around the human body, the proposed system generates and builds up a precise 3D thermogram model by incorporating information from each new measurement in real time. Because the data are acquired in motion, multiple points of view are obtained. These multiple views are adaptively combined by taking into account the reliability of each individual measurement, which can vary with factors such as the angle of incidence, the distance between the device and the subject, environmental sensor data, and other influences on the confidence of the thermal-infrared data at capture time. Finally, several case studies are presented to support the usability and performance of the proposed system.
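The adaptive combination of measurements can be illustrated with a minimal confidence-weighted average; the paper's actual weighting model, which folds in angle of incidence, distance, and other factors, is richer than the single scalar confidence assumed here:

```python
import numpy as np

def fuse(temps_c, confidences):
    """Confidence-weighted average of repeated thermal readings of one
    surface point. The per-measurement confidence stands in for the
    paper's richer weighting (angle of incidence, distance, ...)."""
    t = np.asarray(temps_c, float)
    c = np.asarray(confidences, float)
    return float((t * c).sum() / c.sum())

# A near-normal, close-range reading outweighs an oblique one:
fused = fuse([36.8, 35.9], [0.9, 0.3])
```

Running this update as each new frame arrives gives the real-time build-up the abstract describes.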

  5. Winter precipitation particle size distribution measurement by Multi-Angle Snowflake Camera

    NASA Astrophysics Data System (ADS)

    Huang, Gwo-Jong; Kleinkort, Cameron; Bringi, V. N.; Notaroš, Branislav M.

    2017-12-01

From the radar meteorology viewpoint, the most important properties for quantitative precipitation estimation of winter events are the 3D shape, size, and mass of precipitation particles, as well as the particle size distribution (PSD). To measure these properties precisely, optical instruments may be the best choice. The Multi-Angle Snowflake Camera (MASC) is a relatively new instrument equipped with three high-resolution cameras that capture winter precipitation particle images from three non-parallel angles, in addition to measuring the particle fall speed using two pairs of infrared motion sensors. However, MASC results have so far usually been presented as monthly or seasonal statistics, with particle sizes given as histograms; no previous study has used the MASC for a single-storm analysis, and none has used it to measure the PSD. We propose a methodology for obtaining the winter precipitation PSD measured by the MASC, and present and discuss the development, implementation, and application of the new technique for PSD computation based on MASC images. Overall, this is the first study of the MASC-based PSD. We present MASC PSD experiments and results for segments of two snow events to demonstrate the performance of our PSD algorithm. The results show that the self-consistency of the MASC-measured single-camera PSDs is good. To cross-validate the PSD measurements, we compare the MASC mean PSD (averaged over the three cameras) with a collocated 2D Video Disdrometer, and observe good agreement between the two sets of results.
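The PSD computation referred to above can be sketched in the generic disdrometer form, where each detected particle is weighted by the inverse of the volume it samples; the effective sampling area and fall-speed treatment here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def psd(diameters_mm, fallspeeds_ms, area_m2, t_s, bins_mm):
    """Disdrometer-style PSD N(D) in m^-3 mm^-1 from per-particle sizes
    and fall speeds. Each particle contributes 1/(A*v*T*dD): slower
    particles linger in the sampling volume, so one detection stands
    for a larger number concentration."""
    d = np.asarray(diameters_mm, float)
    v = np.asarray(fallspeeds_ms, float)
    n = np.zeros(len(bins_mm) - 1)
    idx = np.digitize(d, bins_mm) - 1          # size-bin index per particle
    for i, w in zip(idx, 1.0 / (area_m2 * v * t_s)):
        if 0 <= i < len(n):                    # drop out-of-range sizes
            n[i] += w
    return n / np.diff(bins_mm)                # normalize by bin width
```

With three cameras, one such PSD per camera can be averaged to obtain the mean PSD compared against the 2D Video Disdrometer.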

  6. Techniques for Surface-Temperature Measurements and Transition Detection on Projectiles at Hypersonic Velocities--Status Report No. 2

    NASA Technical Reports Server (NTRS)

    Bogdanoff, D. W.; Wilder, M. C.

    2006-01-01

The latest developments in a research effort to advance techniques for measuring surface temperatures and heat fluxes and determining transition locations on projectiles in hypersonic free flight in a ballistic range are described. Spherical and hemispherical titanium projectiles were launched at muzzle velocities of 4.6-5.8 km/sec into air and nitrogen at pressures of 95-380 Torr. Hemisphere models with diameters of 2.22 cm had maximum pitch and yaw angles of 5.5-8 degrees and 4.7-7 degrees, depending on whether they were launched using an evacuated launch tube or not. Hemisphere models with diameters of 2.86 cm had maximum pitch and yaw angles of 2.0-2.5 degrees. Three intensified-charge-coupled-device (ICCD) cameras with wavelength sensitivity ranges of 480-870 nm (as well as one infrared camera with a wavelength sensitivity range of 3 to 5 microns) were used to obtain images of the projectiles in flight. Helium plumes were used to remove the radiating gas cap around the projectiles at the locations where ICCD camera images were taken. ICCD and infrared (IR) camera images of titanium hemisphere projectiles at velocities of 4.0-4.4 km/sec are presented, as well as preliminary temperature data for these projectiles. Comparisons were made of normalized temperature data for shots at approx. 190 Torr in air and nitrogen, with and without the launch tube evacuated. Shots into nitrogen had temperatures 6% lower than those into air. Evacuation of the launch tube was also found to lower the projectile temperatures by approx. 6%.

  7. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral-density (ND) filtering in front of the theodolite's telescope. One could then safely view and measure the laser's boresight through the theodolite's telescope without great risk to one's eyes. This method for a Class II visible-wavelength laser is not acceptable even to consider attempting for a Class IV laser, and is not applicable for an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available. It is a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera includes power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. An additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as to visible light. Hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.
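The "calculated amount" of ND filtering mentioned above amounts to choosing an optical density; a sketch follows, in which the safe viewing level is an illustrative placeholder, not a laser-safety limit (real maximum permissible exposures come from ANSI Z136.1):

```python
import math

def nd_optical_density(beam_power_w, safe_power_w=1e-6):
    """Optical density needed to attenuate a beam to a chosen viewing
    level. safe_power_w is an illustrative placeholder, NOT a safety
    limit; real maximum permissible exposures come from ANSI Z136.1."""
    return math.log10(beam_power_w / safe_power_w)

# Bringing a 5 mW beam down to 1 microwatt calls for roughly OD 3.7:
od = nd_optical_density(5e-3)
```

Stacking filters adds their optical densities, so the result can be met with any combination whose ODs sum to at least this value.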

  8. Boundary Layer Transition Detection on a Rotor Blade Using Rotating Mirror Thermography

    NASA Technical Reports Server (NTRS)

    Heineck, James T.; Schuelein, Erich; Raffel, Markus

    2014-01-01

Laminar-to-turbulent transition on a rotor blade in hover has been imaged using an area-scan infrared camera. A new method for tracking a blade using a rotating mirror was employed. The mirror axis of rotation roughly corresponded to the rotor axis of rotation, and the mirror rotational frequency was 1/2 that of the rotor; because a reflected image rotates at twice the mirror's rate, this holds the blade nearly stationary in the camera's field of view. This permitted the use of cameras whose integration time would otherwise have been too long to prevent image blur due to the motion of the blade. This article will show the use of this method for a rotor blade at different collective pitch angles.

  9. Infrared and visible cooperative vehicle identification markings

    NASA Astrophysics Data System (ADS)

    O'Keefe, Eoin S.; Raven, Peter N.

    2006-05-01

    Airborne surveillance helicopters and aeroplanes used by security and defence forces around the world increasingly rely on their visible band and thermal infrared cameras to prosecute operations such as the co-ordination of police vehicles during the apprehension of a stolen car, or direction of all emergency services at a serious rail crash. To perform their function effectively, it is necessary for the airborne officers to unambiguously identify police and the other emergency service vehicles. In the visible band, identification is achieved by placing high contrast symbols and characters on the vehicle roof. However, at the wavelengths at which thermal imagers operate, the dark and light coloured materials have similar low reflectivity and the visible markings cannot be discerned. Hence there is a requirement for a method of passively and unobtrusively marking vehicles concurrently in the visible and thermal infrared, over a large range of viewing angles. In this paper we discuss the design, detailed angle-dependent spectroscopic characterisation and operation of novel visible and infrared vehicle marking materials, and present airborne IR and visible imagery of materials in use.

  10. WindCam and MSPI: two cloud and aerosol instrument concepts derived from Terra/MISR heritage

    NASA Astrophysics Data System (ADS)

    Diner, David J.; Mischna, Michael; Chipman, Russell A.; Davis, Ab; Cairns, Brian; Davies, Roger; Kahn, Ralph A.; Muller, Jan-Peter; Torres, Omar

    2008-08-01

The Multi-angle Imaging SpectroRadiometer (MISR) has been acquiring global cloud and aerosol data from polar orbit since February 2000. MISR acquires moderately high-resolution imagery at nine view angles from nadir to 70.5°, in four visible/near-infrared spectral bands. Stereoscopic parallax, time lapse among the nine views, and the variation of radiance with angle and wavelength enable retrieval of geometric cloud and aerosol plume heights, height-resolved cloud-tracked winds, and aerosol optical depth and particle property information. Two instrument concepts based upon MISR heritage are in development. The Cloud Motion Vector Camera, or WindCam, is a simplified version comprising a lightweight, compact, wide-angle camera to acquire multiangle stereo imagery at a single visible wavelength. A constellation of three WindCam instruments in polar Earth orbit would obtain height-resolved cloud-motion winds with daily global coverage, making it a low-cost complement to a spaceborne lidar wind measurement system. The Multiangle SpectroPolarimetric Imager (MSPI) is aimed at aerosol and cloud microphysical properties, and is a candidate for the National Research Council Decadal Survey's Aerosol-Cloud-Ecosystem (ACE) mission. MSPI combines the capabilities of MISR with those of other aerosol sensors, extending the spectral coverage to the ultraviolet and shortwave infrared and incorporating high-accuracy polarimetric imaging. Based on requirements for the nonimaging Aerosol Polarimeter Sensor on NASA's Glory mission, a degree of linear polarization uncertainty of 0.5% is specified within a subset of the MSPI bands. We are developing a polarization imaging approach using photoelastic modulators (PEMs) to accomplish this objective.
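The stereoscopic height retrieval that the multiangle views enable can be sketched to first order; this wind-free approximation is illustrative only, since the operational MISR retrieval solves jointly for height and along-track wind:

```python
import math

def stereo_cloud_height(parallax_m, view1_deg, view2_deg):
    """First-order cloud-top height from the along-track parallax (in
    metres on the ground) between two view angles, ignoring cloud
    motion; the operational retrieval solves height and wind jointly."""
    return parallax_m / (math.tan(math.radians(view1_deg))
                         - math.tan(math.radians(view2_deg)))

# A feature shifting ~19.8 km between the 70.5-degree and nadir views
# sits near 7 km altitude:
h = stereo_cloud_height(19780.0, 70.5, 0.0)
```

The time lapse between the same two views is what turns the residual (motion-induced) parallax into the cloud-tracked wind.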

  11. Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D.

    PubMed

    Lasnier, C J; Allen, S L; Ellis, R E; Fenstermacher, M E; McLean, A G; Meyer, W H; Morris, K; Seppala, L G; Crabtree, K; Van Zeeland, M A

    2014-11-01

    An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.

  12. Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D

    DOE PAGES

    Lasnier, Charles J.; Allen, Steve L.; Ellis, Ronald E.; ...

    2014-08-26

An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.

  13. Thermoelectric infrared imaging sensors for automotive applications

    NASA Astrophysics Data System (ADS)

    Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto

    2004-07-01

This paper describes three low-cost thermoelectric infrared imaging sensors, having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs), respectively, and two experimental automotive application systems. The FPAs are fabricated with a conventional IC process plus micromachining technologies and have low-cost potential. Among these sensors, the 2,304-element sensor provides a high responsivity of 5,500 V/W and a very small size by adopting a vacuum-sealed package integrated with a wide-angle ZnS lens. One experimental system, incorporated in the Nissan ASV-2, is a blind-spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body. The system can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function. This system consists of a visible camera and infrared sensors, and it helps alert the driver to the presence of a pedestrian in a rear blind spot. Various issues that will need to be addressed in order to expand the automotive applications of IR imaging sensors in the future are also summarized. This performance is suitable for consumer electronics as well as automotive applications.

  14. Preliminary status of POLICAN: A near-infrared imaging polarimeter

    NASA Astrophysics Data System (ADS)

    Devaraj, R.; Luna, A.; Carrasco, L.; Mayya, Y. D.

    2015-10-01

POLICAN is a near-infrared (J, H, K) imaging polarimeter developed for the Cananea near-infrared camera (CANICA) at the 2.1-m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located at Cananea, Sonora, México. The camera has a 1024 x 1024 HgCdTe detector (HAWAII array) with a plate scale of 0.32 arcsec/pixel, providing a field of view of 5.5 x 5.5 arcmin. POLICAN is mounted externally to CANICA for narrow-field (f/12) linear polarimetric observations. It consists of a rotating superachromatic (1-2.7 μm) half waveplate and a fixed wire-grid polarizer as the analyzer. The light is modulated by setting the half waveplate at different angles (0°, 22.5°, 45°, 67.5°), and linear combinations of the Stokes parameters (I, Q, and U) are obtained. Image reduction and removal of instrumental polarization consist of dark noise subtraction, polarimetric flat fielding, and background sky subtraction. Polarimetric calibration is performed by observing polarization standards available in the literature. The astrometry correction is performed by matching common stars with the Two Micron All Sky Survey. POLICAN's bright and limiting magnitudes are approximately 6th and 16th magnitude, corresponding to saturation and photon noise, respectively. POLICAN currently achieves a polarimetric accuracy of about 3.0% and polarization angle uncertainties within 3°. Preliminary observations of star-forming regions are being carried out in order to study their magnetic field properties.
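The modulation scheme can be written down explicitly: for an ideal half waveplate at angle θ ahead of a fixed analyzer, the detected intensity is I(θ) = ½(I + Q cos 4θ + U sin 4θ), so the four positions listed above yield the normalized Stokes parameters directly. The sketch below assumes idealized optics and ignores the instrumental polarization that POLICAN removes during reduction:

```python
import math

def stokes_from_hwp(i0, i22, i45, i67):
    """Normalized Stokes parameters from intensities measured with the
    half waveplate at 0, 22.5, 45 and 67.5 degrees, assuming ideal
    optics: I(theta) = 0.5*(I + Q*cos(4*theta) + U*sin(4*theta))."""
    q = (i0 - i45) / (i0 + i45)            # cos-4theta channel
    u = (i22 - i67) / (i22 + i67)          # sin-4theta channel
    p = math.hypot(q, u)                   # degree of linear polarization
    angle = 0.5 * math.degrees(math.atan2(u, q))  # polarization angle (deg)
    return q, u, p, angle

# A source with 3% polarization at a 30-degree polarization angle:
q, u, p, ang = stokes_from_hwp(1.015, 1.02598, 0.985, 0.97402)
```

The factor of 4θ is why a half waveplate covers the full (q, u) plane with only a 90° rotation range.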

  15. Alpha and Omega

    NASA Image and Video Library

    2017-11-27

    These two images illustrate just how far Cassini traveled to get to Saturn. On the left is one of the earliest images Cassini took of the ringed planet, captured during the long voyage from the inner solar system. On the right is one of Cassini's final images of Saturn, showing the site where the spacecraft would enter the atmosphere on the following day. In the left image, taken in 2001, about six months after the spacecraft passed Jupiter for a gravity assist flyby, the best view of Saturn using the spacecraft's high-resolution (narrow-angle) camera was on the order of what could be seen using the Earth-orbiting Hubble Space Telescope. At the end of the mission (at right), from close to Saturn, even the lower resolution (wide-angle) camera could capture just a tiny part of the planet. The left image looks toward Saturn from 20 degrees below the ring plane and was taken on July 13, 2001 in wavelengths of infrared light centered at 727 nanometers using the Cassini spacecraft narrow-angle camera. The view at right is centered on a point 6 degrees north of the equator and was taken in visible light using the wide-angle camera on Sept. 14, 2017. The view on the left was acquired at a distance of approximately 317 million miles (510 million kilometers) from Saturn. Image scale is about 1,900 miles (3,100 kilometers) per pixel. The view at right was acquired at a distance of approximately 360,000 miles (579,000 kilometers) from Saturn. Image scale is 22 miles (35 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21353

  16. The Activity of Comet 67P/Churyumov-Gerasimenko as Seen by Rosetta/OSIRIS

    NASA Astrophysics Data System (ADS)

    Sierks, H.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Rickman, H.; Koschny, D.

    2015-12-01

The Rosetta mission of the European Space Agency arrived at its target, comet 67P/Churyumov-Gerasimenko, on August 6, 2014. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. OSIRIS consists of a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field gas and dust coma investigations. OSIRIS observed the coma and the nucleus of comet 67P/C-G during approach, arrival, and the landing of PHILAE. OSIRIS continued comet monitoring and mapping of surface and activity in 2015, with close high-resolution fly-bys and remote wide-angle observations. The scientific results reveal a nucleus with two lobes and varied morphology. Active regions are located at steep cliffs and collapsed pits, which form collimated gas jets. Dust is accelerated by the gas, forming bright jet filaments and the large-scale, diffuse coma of the comet. We will present activity and surface changes observed in the Northern and Southern hemispheres and around perihelion passage.

  17. Two Titans

    NASA Image and Video Library

    2017-08-11

These two views of Saturn's moon Titan exemplify how NASA's Cassini spacecraft has revealed the surface of this fascinating world. Cassini carried several instruments to pierce the veil of hydrocarbon haze that enshrouds Titan. The mission's imaging cameras also have several spectral filters sensitive to specific wavelengths of infrared light that are able to make it through the haze to the surface and back into space. These "spectral windows" have enabled the imaging cameras to map nearly the entire surface of Titan. In addition to Titan's surface, images from both the imaging cameras and VIMS have provided windows into the moon's ever-changing atmosphere, chronicling the appearance and movement of hazes and clouds over the years. A large, bright and feathery band of summer clouds can be seen arcing across high northern latitudes in the view at right. These views were obtained with the Cassini spacecraft narrow-angle camera on March 21, 2017. Images taken using red, green and blue spectral filters were combined to create the natural-color view at left. The false-color view at right was made by substituting an infrared image (centered at 938 nanometers) for the red color channel. The views were acquired at a distance of approximately 613,000 miles (986,000 kilometers) from Titan. Image scale is about 4 miles (6 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21624

  18. Snowstorm Along the China-Mongolia-Russia Borders

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Heavy snowfall on March 12, 2004, across north China's Inner Mongolia Autonomous Region, Mongolia and Russia, caused train and highway traffic to stop for several days along the Russia-China border. This pair of images from the Multi-angle Imaging SpectroRadiometer (MISR) highlights the snow and surface properties across the region on March 13. The left-hand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The right-hand image is a multi-angle false-color view made from the red band data of the 46-degree aftward camera, the nadir camera, and the 46-degree forward camera.

    About midway between the frozen expanse of China's Hulun Nur Lake (along the right-hand edge of the images) and Russia's Torey Lakes (above image center) is a dark linear feature that corresponds with the China-Mongolia border. In the upper portion of the images, many small plumes of black smoke rise from coal and wood fires and blow toward the southeast over the frozen lakes and snow-covered grasslands. Along the upper left-hand portion of the images, in Russia's Yablonovyy mountain range and the Onon River Valley, the terrain becomes more hilly and forested. In the nadir image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the multi-angle composite, open-canopy forested areas are indicated by green hues. Since this is a multi-angle composite, the green color arises not from the color of the leaves but from the architecture of the surface cover. The green areas appear brighter at the nadir angle than at the oblique angles because more of the snow-covered surface in the gaps between the trees is visible. Color variations in the multi-angle composite also indicate angular reflectance properties for areas covered by snow and ice. The light blue color of the frozen lakes is due to the increased forward scattering of smooth ice, and light orange colors indicate rougher ice or snow, which scatters more light in the backward direction.

    The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire Earth between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 22525. The panels cover an area of about 355 kilometers x 380 kilometers, and utilize data from blocks 50 to 52 within World Reference System-2 path 126.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  19. The space shuttle payload planning working groups. Volume 1: Astronomy

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The space astronomy missions to be accomplished by the space shuttle are discussed. The principal instrument is the Large Space Telescope optimized for the ultraviolet and visible regions of the spectrum, but usable also in the infrared. Two infrared telescopes are also proposed and their characteristics are described. Other instruments considered for the astronomical observations are: (1) a very wide angle ultraviolet camera, (2) a grazing incidence telescope, (3) Explorer-class free flyers to measure the cosmic microwave background, and (4) rocket-class instruments which can fly frequently on a variety of missions. The stability requirements of the space shuttle for accomplishing the astronomy mission are defined.

  20. Low-speed flowfield characterization by infrared measurements of surface temperatures

    NASA Technical Reports Server (NTRS)

    Gartenberg, E.; Roberts, A. S., Jr.; Mcree, G. J.

    1989-01-01

    An experimental program was aimed at identifying areas in low speed aerodynamic research where infrared imaging systems can make significant contributions. Implementing a new technique, a long electrically heated wire was placed across a laminar jet. By measuring the temperature distribution along the wire with the IR imaging camera, the flow behavior was identified. Furthermore, using Nusselt number correlations, the velocity distribution could be deduced. The same approach was used to survey wakes behind cylinders in a wind-tunnel. This method is suited to investigate flows with position dependent velocities, e.g., boundary layers, confined flows, jets, wakes, and shear layers. It was found that the IR imaging camera cannot accurately track high gradient temperature fields. A correlation procedure was devised to account for this limitation. Other wind-tunnel experiments included tracking the development of the laminar boundary layer over a warmed flat plate by measuring the chordwise temperature distribution. This technique was applied also to the flow downstream from a rearward facing step. Finally, the IR imaging system was used to study boundary layer behavior over an airfoil at angles of attack from zero up to separation. The results were confirmed with tufts observable both visually and with the IR imaging camera.
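The velocity deduction from Nusselt-number correlations described above can be sketched by inverting a heated-wire energy balance; Kramers' correlation is used below as one common choice, since the report does not state which correlation was applied, and the air properties are rough room-temperature values:

```python
import math

def velocity_from_wire_temp(q_per_len, d, t_wire, t_air,
                            k=0.026, visc=1.5e-5, pr=0.71):
    """Flow velocity from the temperature of an electrically heated wire.

    Energy balance per unit length: q' = h*pi*d*(Tw - Ta), with
    h = Nu*k/d, and Kramers' correlation
    Nu = 0.42*Pr**0.2 + 0.57*Pr**(1/3)*Re**0.5.
    Air properties k, visc, Pr are rough room-temperature values.
    """
    nusselt = q_per_len / (math.pi * k * (t_wire - t_air))
    a = 0.42 * pr ** 0.2
    b = 0.57 * pr ** (1 / 3)
    re = ((nusselt - a) / b) ** 2      # invert Kramers for Reynolds number
    return re * visc / d               # U = Re * visc / d
```

Applied pointwise along the wire, the IR-measured temperature distribution maps directly to a velocity profile across the jet or wake.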

  1. Dust mass distribution around comet 67P/Churyumov-Gerasimenko determined via parallax measurements using Rosetta's OSIRIS cameras

    NASA Astrophysics Data System (ADS)

    Ott, T.; Drolshagen, E.; Koschny, D.; Güttler, C.; Tubiana, C.; Frattin, E.; Agarwal, J.; Sierks, H.; Bertini, I.; Barbieri, C.; Lamy, P. I.; Rodrigo, R.; Rickman, H.; A'Hearn, M. F.; Barucci, M. A.; Bertaux, J.-L.; Boudreault, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; De Cecco, M.; Deller, J.; Feller, C.; Fornasier, S.; Fulle, M.; Geiger, B.; Gicquel, A.; Groussin, O.; Gutiérrez, P. J.; Hofmann, M.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Knollenberg, J.; Kovacs, G.; Kramm, J. R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Lin, Z.-Y.; López-Moreno, J. J.; Marzari, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Shi, X.; Thomas, N.; Vincent, J.-B.; Poppe, B.

    2017-07-01

The OSIRIS (optical, spectroscopic and infrared remote imaging system) instrument on board the ESA Rosetta spacecraft collected data on 67P/Churyumov-Gerasimenko for over 2 yr. OSIRIS consists of two cameras, a Narrow Angle Camera and a Wide Angle Camera. For specific imaging sequences related to the observation of dust aggregates in 67P's coma, the two cameras were operated simultaneously. The two cameras are mounted 0.7 m apart from each other; this baseline yields a parallax shift of the apparent particle trails in the analysed images that is inversely proportional to their distance. Thanks to such shifts, the distance between observed dust aggregates and the spacecraft was determined. This method works for particles closer than 6000 m to the spacecraft and requires very few assumptions. We found over 250 particles in a suitable distance range, with sizes of some centimetres, masses in the range of 10^-6 to 10^2 kg, and a mean velocity of about 2.4 m s^-1 relative to the nucleus. Furthermore, the spectral slope was analysed, showing a decrease in the median spectral slope of the particles with time. The further a particle is from the spacecraft, the fainter its signal; this effect was counterbalanced by debiasing. Moreover, the dust mass-loss rate of the nucleus could be computed, as well as the Afρ of the comet around perihelion. The summed-up dust mass-loss rate for the mass bins 10^-4 to 10^2 kg is almost 8300 kg s^-1.
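The parallax ranging reduces to a small-angle formula: with the B = 0.7 m baseline, an angular trail shift Δθ between the two cameras gives distance d ≈ B/Δθ. A sketch follows, in which the pixel scale is an assumed illustrative value, not the calibrated OSIRIS plate scale:

```python
def parallax_distance(pixel_shift, ifov_rad_per_px, baseline_m=0.7):
    """Particle distance from the parallax shift of its trail between
    the two cameras (OSIRIS camera baseline: 0.7 m). The pixel scale
    passed in is an assumption, not the calibrated instrument value."""
    delta_theta = pixel_shift * ifov_rad_per_px  # small-angle shift (rad)
    return baseline_m / delta_theta              # d = B / delta_theta

# A 10-pixel trail shift at an assumed 20 microradian/pixel scale:
d = parallax_distance(10, 20e-6)  # well inside the ~6000 m working range
```

The 6000 m limit arises because beyond it the shift shrinks below a reliably measurable fraction of a pixel.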

  2. Quantifying seasonal variation of leaf area index using near-infrared digital camera in a rice paddy

    NASA Astrophysics Data System (ADS)

    Hwang, Y.; Ryu, Y.; Kim, J.

    2017-12-01

Digital cameras have been widely used to quantify leaf area index (LAI). Numerous simple and automatic methods have been proposed to improve digital-camera-based LAI estimates. However, most studies in rice paddies have relied on arbitrary thresholds or complex radiative transfer models to make binary images. Moreover, only a few studies have reported continuous, automatic observation of LAI over the season in a rice paddy. The objective of this study is to quantify seasonal variations of LAI using raw near-infrared (NIR) images coupled with a histogram shape-based algorithm in a rice paddy. As vegetation highly reflects NIR light, we installed a NIR digital camera 1.8 m above the ground surface and acquired unsaturated raw-format images at one-hour intervals at solar zenith angles between 15° and 80° over the entire growing season in 2016 (from May to September). We applied a sub-pixel classification combined with a light-scattering correction method. Finally, to confirm the accuracy of the quantified LAI, we also conducted direct (destructive sampling) and indirect (LAI-2200) manual observations of LAI once per ten days on average. Preliminary results show that NIR-derived LAI agreed well with in-situ observations, but divergence tended to appear once the rice canopy was fully developed. Continuous monitoring of LAI in rice paddies will help in understanding carbon and water fluxes and in evaluating satellite-based LAI products.
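A histogram shape-based binarization followed by a gap-fraction inversion can be sketched as follows; Otsu's method stands in for the paper's classifier, and the Beer-Lambert inversion assumes a spherical leaf angle distribution (extinction coefficient 0.5), both illustrative assumptions:

```python
import numpy as np

def lai_from_nir(nir, zenith_deg, k=0.5):
    """Gap-fraction LAI from one NIR image. Otsu's threshold stands in
    for the paper's histogram shape-based classifier; k = 0.5 assumes a
    spherical leaf angle distribution. Illustrative only."""
    hist, edges = np.histogram(nir, bins=256)
    p = hist / hist.sum()
    w = np.cumsum(p)                       # class-0 (background) weight
    m = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (m[-1] * w - m) ** 2 / (w * (1.0 - w))
    t = np.nanargmax(sigma_b)              # Otsu: max between-class variance
    gap = 1.0 - (nir > edges[t]).mean()    # gap fraction after binarization
    mu = np.cos(np.radians(zenith_deg))
    return float(-mu * np.log(gap) / k)    # Beer-Lambert inversion
```

Because the inversion saturates as the gap fraction approaches zero, estimates diverge for a closed canopy, consistent with the late-season divergence the abstract reports.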

  3. SLR digital camera for forensic photography

    NASA Astrophysics Data System (ADS)

    Har, Donghwan; Son, Youngho; Lee, Sungwon

    2004-06-01

Forensic photography, which was systematically established in the late 19th century by Alphonse Bertillon of France, has developed considerably over the past 100 years, and this development will accelerate further with high technologies, in particular digital technology. This paper reviews three studies to answer the question: can the SLR digital camera replace traditional silver-halide ultraviolet and infrared photography? 1. Comparison of the relative ultraviolet and infrared sensitivity of the SLR digital camera to silver-halide photography. 2. How much is ultraviolet or infrared sensitivity improved when removing the UV/IR cutoff filter built into the SLR digital camera? 3. Comparison of the relative sensitivity of CCD and CMOS sensors for ultraviolet and infrared. The test results showed that the SLR digital camera has a very low sensitivity for ultraviolet and infrared. The cause was found to be the UV/IR cutoff filter mounted in front of the image sensor. Removing the UV/IR cutoff filter significantly improved the sensitivity for ultraviolet and infrared. Particularly for infrared, the sensitivity of the SLR digital camera was better than that of silver-halide film. This shows the possibility of replacing silver-halide ultraviolet and infrared photography with the SLR digital camera. Thus, the SLR digital camera seems to be useful for forensic photography, which deals with many ultraviolet and infrared photographs.

  4. Preclinical imaging of iridocorneal angle and fundus using a modified integrated flexible handheld probe

    PubMed Central

    Hong, Xun Jie Jeesmond; Shinoj, Vengalathunadakal K.; Murukeshan, Vadakke Matham; Baskaran, Mani; Aung, Tin

    2017-01-01

    A flexible handheld imaging probe consisting of a 3 mm × 3 mm charge-coupled device camera, light-emitting diode light sources, and a near-infrared laser source is designed and developed. The imaging probe is designed with specifications to capture iridocorneal angle images and posterior segment images. Light propagation from the anterior chamber of the eye to the exterior is considered analytically using Snell's law. Imaging of the iridocorneal angle region and fundus is performed on ex vivo porcine samples and subsequently on small laboratory animals, such as the New Zealand white rabbit and a nonhuman primate, in vivo. The integrated flexible handheld probe demonstrates high repeatability in iridocorneal angle and fundus documentation. The proposed concept and methodology are expected to find potential application in the diagnosis, prognosis, and management of glaucoma. PMID:28413809

  5. Advanced Video Data-Acquisition System For Flight Research

    NASA Technical Reports Server (NTRS)

    Miller, Geoffrey; Richwine, David M.; Hass, Neal E.

    1996-01-01

    Advanced video data-acquisition system (AVDAS) developed to satisfy variety of requirements for in-flight video documentation. Requirements range from providing images for visualization of airflows around fighter airplanes at high angles of attack to obtaining safety-of-flight documentation. F/A-18 AVDAS takes advantage of very capable systems like the NITE Hawk forward-looking infrared (FLIR) pod and recent video developments like miniature charge-coupled-device (CCD) color video cameras and other flight-qualified video hardware.

  6. ARNICA, the Arcetri Near-Infrared Camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.

    1996-04-01

    ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the infrared telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at TIRGO (December 1992 to December 1993) and an observing run at the William Herschel Telescope, Canary Islands (December 1993). System performance is defined in terms of the efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources.

  7. Accuracy and Precision of a Surgical Navigation System: Effect of Camera and Patient Tracker Position and Number of Active Markers.

    PubMed

    Gundle, Kenneth R; White, Jedediah K; Conrad, Ernest U; Ching, Randal P

    2017-01-01

    Surgical navigation systems are increasingly used to aid resection and reconstruction of osseous malignancies. In the process of implementing image-based surgical navigation systems, there are numerous opportunities for error that may impact surgical outcome. This study aimed to examine modifiable sources of error in an idealized scenario, when using a bidirectional infrared surgical navigation system. Accuracy and precision were assessed using a computerized-numerical-controlled (CNC) machined grid with known distances between indentations while varying: 1) the distance from the grid to the navigation camera (range 150 to 247cm), 2) the distance from the grid to the patient tracker device (range 20 to 40cm), and 3) whether the minimum or maximum number of bidirectional infrared markers were actively functioning. For each scenario, distances between grid points were measured at 10-mm increments between 10 and 120mm, with twelve measurements made at each distance. The accuracy outcome was the root mean square (RMS) error between the navigation system distance and the actual grid distance. To assess precision, four indentations were recorded six times for each scenario while also varying the angle of the navigation system pointer. The outcome for precision testing was the standard deviation of the distance between each measured point to the mean three-dimensional coordinate of the six points for each cluster. Univariate and multiple linear regression revealed that as the distance from the navigation camera to the grid increased, the RMS error increased (p<0.001). The RMS error also increased when not all infrared markers were actively tracking (p=0.03), and as the measured distance increased (p<0.001). In a multivariate model, these factors accounted for 58% of the overall variance in the RMS error. 
Standard deviations in repeated measures also increased when not all infrared markers were active (p<0.001), and as the distance between the navigation camera and the physical space increased (p=0.005). Location of the patient tracker did not affect accuracy (p=0.36) or precision (p=0.97). In our model laboratory test environment, the infrared bidirectional navigation system was more accurate and precise when the distance from the navigation camera to the physical (working) space was minimized and all bidirectional markers were active. These findings may require alterations in operating room setup and software changes to improve the performance of this system.
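The two outcome measures can be sketched directly from their definitions in the abstract; the helper names and sample numbers below are illustrative, not from the study:

```python
import numpy as np

def rms_error(measured, actual):
    """Root mean square error between navigation-system and true grid distances."""
    measured = np.asarray(measured, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.sqrt(np.mean((measured - actual) ** 2)))

def precision_sd(points):
    """Precision: SD of distances from each repeated 3D measurement
    to the mean (centroid) of its cluster."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts - pts.mean(axis=0), axis=1)
    return float(d.std(ddof=1))

# Hypothetical readings: measured vs. true distances (mm), and one point cluster
rms = rms_error([10.1, 20.2, 30.0], [10.0, 20.0, 30.0])
sd = precision_sd([[0, 0, 0], [2, 0, 0], [0, 2, 0], [2, 2, 0]])
```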

  8. Rings Around the Pole

    NASA Image and Video Library

    2005-01-20

    Atmospheric features in Saturn's north polar region are revealed in spectacular detail in this Cassini image, taken in the near infrared spectral region, where methane gas is not very absorbing. The dark shadows of Saturn's rings drape across the planet, creating the illusion of atmospheric bands. Dots of bright clouds give the appearance that this is an active place. The image was taken with the Cassini spacecraft wide angle camera on Dec. 14, 2004, at a distance of 717,800 kilometers (446,100 miles) from Saturn through a filter sensitive to wavelengths of infrared light centered at 939 nanometers. The image scale is about 43 kilometers (27 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA06567

  9. Non-contact measurement of rotation angle with solo camera

    NASA Astrophysics Data System (ADS)

    Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun

    2015-02-01

    To measure a rotation angle around the axis of an object, a non-contact rotation angle measurement method based on a single camera was proposed. The intrinsic parameters of the camera were calibrated using a chessboard according to plane calibration theory. The translation matrix and rotation matrix between the object coordinate frame and the camera coordinate frame were calculated from the relationship between the corners' positions on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed ±0.01 degree.
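The final step, recovering the rotation angle from the estimated rotation matrix, can be sketched as follows. This assumes, for illustration, a rotation about the camera's z-axis; the helper names are hypothetical, and in practice the matrix would come from a pose solver such as OpenCV's solvePnP on the chessboard corners:

```python
import numpy as np

def rot_z(deg):
    """Rotation matrix for a rotation of `deg` degrees about the z-axis."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rotation_angle_z(R):
    """Recover the rotation angle (degrees) about z from a 3x3 rotation matrix."""
    return float(np.degrees(np.arctan2(R[1, 0], R[0, 0])))

angle = rotation_angle_z(rot_z(23.5))   # recovers the 23.5-degree rotation
```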

  10. Impact Site: Cassini's Final Image

    NASA Image and Video Library

    2017-09-15

    This monochrome view is the last image taken by the imaging cameras on NASA's Cassini spacecraft. It looks toward the planet's night side, lit by reflected light from the rings, and shows the location at which the spacecraft would enter the planet's atmosphere hours later. A natural color view, created using images taken with red, green and blue spectral filters, is also provided (Figure 1). The imaging cameras obtained this view at approximately the same time that Cassini's visual and infrared mapping spectrometer made its own observations of the impact area in the thermal infrared. This location -- the site of Cassini's atmospheric entry -- was at this time on the night side of the planet, but would rotate into daylight by the time Cassini made its final dive into Saturn's upper atmosphere, ending its remarkable 13-year exploration of Saturn. The view was acquired on Sept. 14, 2017 at 19:59 UTC (spacecraft event time). The view was taken in visible light using the Cassini spacecraft wide-angle camera at a distance of 394,000 miles (634,000 kilometers) from Saturn. Image scale is about 11 miles (17 kilometers) per pixel. The original image has a size of 512x512 pixels. A movie is available at https://photojournal.jpl.nasa.gov/catalog/PIA21895

  11. The North

    NASA Image and Video Library

    2017-10-30

    Reflected sunlight is the source of the illumination for visible wavelength images such as the one above. However, at longer infrared wavelengths, direct thermal emission from objects dominates over reflected sunlight. This enabled instruments that can detect infrared radiation to observe the pole even in the dark days of winter when Cassini first arrived at Saturn and Saturn's northern hemisphere was shrouded in shadow. Now, 13 years later, the north pole basks in full sunlight. Close to the northern summer solstice, sunlight illuminates the previously dark region, permitting Cassini scientists to study this area with the spacecraft's full suite of imagers. This view looks toward the northern hemisphere from about 34 degrees above Saturn's ringplane. The image was taken with the Cassini spacecraft wide-angle camera on April 25, 2017 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 274,000 miles (441,000 kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 111 degrees. Image scale is 16 miles (26 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21351

  12. Emissivity Measurements of Additively Manufactured Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, Robert Vaughn; Reid, Robert Stowers; Baker, Andrew M.

    The emissivity of common 3D printing materials such as ABS and PLA was measured using a reflectivity meter, yielding a value of approximately 0.92. Adding a conductive material to the filament appears to decrease the surface emissivity. The angular dependence of the emissivity and the apparent temperature was measured using a FLIR infrared camera, showing that the emissivity changes little for viewing angles below 40 degrees and drops off dramatically beyond 70 degrees.

  13. Infrared cameras are potential traceable "fixed points" for future thermometry studies.

    PubMed

    Yap Kannan, R; Keresztes, K; Hussain, S; Coats, T J; Bown, M J

    2015-01-01

    The National Physical Laboratory (NPL) requires that "fixed points" whose temperatures have been established by the International Temperature Scale of 1990 (ITS-90) be used for device calibration. In practice, a "near" blackbody radiator together with a standard platinum resistance thermometer is accepted as the standard. The aim of this study was to report the correlation and limits of agreement (LOA) of a thermal infrared camera and a non-contact infrared temporal thermometer against each other and against the "near" blackbody radiator. Temperature readings from an infrared thermography camera (FLIR T650sc) and a non-contact infrared temporal thermometer (Hubdic FS-700) were compared to a near blackbody (Hyperion R blackbody model 982) at 0.5 °C increments between 20 and 40 °C. At each increment, the blackbody cavity temperature was confirmed with the platinum resistance thermometer. Measurements were taken first with the thermal infrared camera and then with the infrared thermometer, with each device mounted in turn on a stand at a fixed distance of 20 cm and 5 cm from the blackbody aperture, respectively. The platinum thermometer under-estimated the blackbody temperature by 0.015 °C (95% LOA: -0.08 °C to 0.05 °C), in contrast to the thermal infrared camera and the infrared thermometer, which over-estimated the blackbody temperature by 0.16 °C (95% LOA: 0.03 °C to 0.28 °C) and 0.75 °C (95% LOA: -0.30 °C to 1.79 °C), respectively. The infrared thermometer over-estimated the thermal infrared camera measurements by 0.6 °C (95% LOA: -0.46 °C to 1.65 °C). In conclusion, the thermal infrared camera is a potential temperature reference "fixed point" that could substitute for mercury thermometers. However, further repeatability and reproducibility studies will be required with different models of thermal infrared cameras.
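The bias and 95% limits of agreement reported above follow the standard Bland-Altman calculation (mean difference ± 1.96 SD). A minimal sketch with hypothetical paired readings, not the study's data:

```python
import numpy as np

def limits_of_agreement(device, reference):
    """Mean bias and 95% limits of agreement (Bland-Altman) for paired readings."""
    diff = np.asarray(device, dtype=float) - np.asarray(reference, dtype=float)
    bias = float(diff.mean())
    half_width = 1.96 * float(diff.std(ddof=1))
    return bias, (bias - half_width, bias + half_width)

# Hypothetical paired readings (deg C): IR camera vs. blackbody set point
camera = np.array([20.18, 25.12, 30.15, 35.20, 40.14])
blackbody = np.array([20.0, 25.0, 30.0, 35.0, 40.0])
bias, (lo, hi) = limits_of_agreement(camera, blackbody)
```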

  14. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. Comparison and evaluation of datasets for off-angle iris recognition

    NASA Astrophysics Data System (ADS)

    Kurtuncu, Osman M.; Cerme, Gamze N.; Karakaya, Mahmut

    2016-05-01

    In this paper, we investigated the publicly available iris recognition datasets and their data capture procedures to determine whether they are suitable for stand-off iris recognition research. The majority of iris recognition datasets include only frontal iris images. Even when a dataset includes off-angle iris images, the frontal and off-angle images were not captured at the same time. Comparison of frontal and off-angle iris images shows not only differences in gaze angle but also changes in pupil dilation and accommodation. In order to isolate the effect of gaze angle from other challenging issues, including dilation and accommodation, the frontal and off-angle iris images should be captured at the same time by two different cameras. We therefore developed an iris image acquisition platform using two cameras, where one camera captures a frontal iris image while the other captures the iris from off-angle. Based on the comparison of Hamming distances between frontal and off-angle iris images captured with the two-camera setup and the one-camera setup, we observed that the Hamming distance in the two-camera setup is lower than in the one-camera setup, by 0.001 to 0.05. These results show that, for accurate off-angle iris recognition research, a two-camera setup is necessary to distinguish the challenging issues from one another.
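Iris codes are compared by fractional Hamming distance, the fraction of disagreeing bits between two binary codes (optionally restricted to bits valid in both masks). A minimal sketch; the 2048-bit code length and ~5% bit corruption are illustrative, not from the paper:

```python
import numpy as np

def fractional_hamming(code_a, code_b, mask=None):
    """Fractional Hamming distance between two binary iris codes,
    counting only bits marked valid in the mask (if given)."""
    a, b = np.asarray(code_a, bool), np.asarray(code_b, bool)
    valid = np.ones_like(a) if mask is None else np.asarray(mask, bool)
    disagree = (a ^ b) & valid
    return disagree.sum() / valid.sum()

rng = np.random.default_rng(1)
frontal = rng.integers(0, 2, 2048).astype(bool)     # frontal-view code
off_angle = frontal.copy()
flip = rng.choice(2048, 100, replace=False)         # corrupt ~5% of bits
off_angle[flip] ^= True
hd = fractional_hamming(frontal, off_angle)
```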

  16. Improved calibration-based non-uniformity correction method for uncooled infrared camera

    NASA Astrophysics Data System (ADS)

    Liu, Chengwei; Sui, Xiubao

    2017-08-01

    With the latest improvements in microbolometer focal plane arrays (FPAs), uncooled infrared (IR) cameras have become the most widely used devices in thermography, especially in handheld instruments. However, changing ambient conditions and the non-uniform response of the sensors make it difficult to correct the non-uniformity of an uncooled infrared camera. In this paper, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera, a novel model is proposed for calibration-based non-uniformity correction (NUC). In this model, we introduce the FPA temperature, together with the responses of the microbolometers under different ambient temperatures, to calculate the correction parameters. Based on the proposed model, the correction parameters can be worked out from calibration measurements under controlled ambient conditions against a uniform blackbody. All correction parameters are determined after the calibration process and then used to correct the non-uniformity of the infrared camera in real time. This paper presents the details of the compensation procedure and the performance of the proposed calibration-based non-uniformity correction method, which was evaluated on realistic IR images obtained by a 384x288-pixel uncooled long-wave infrared (LWIR) camera operated under changing ambient conditions. The results show that the method excludes the influence of changing ambient conditions and ensures that the infrared camera has stable performance.
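The paper's model adds FPA-temperature dependence, but the underlying idea extends classic two-point calibration NUC, in which a per-pixel gain and offset are fitted from two uniform blackbody exposures. A minimal sketch under a linear-response assumption (all names, array sizes, and radiance levels are illustrative):

```python
import numpy as np

def two_point_nuc(raw_lo, raw_hi, level_lo, level_hi):
    """Per-pixel gain/offset mapping the two blackbody calibration
    frames onto their known uniform radiance levels."""
    gain = (level_hi - level_lo) / (raw_hi - raw_lo)
    offset = level_lo - gain * raw_lo
    return gain, offset

# Simulated 288x384 FPA with non-uniform per-pixel response: raw = g*L + o
rng = np.random.default_rng(2)
true_gain = rng.normal(1.0, 0.05, (288, 384))
true_off = rng.normal(0.0, 20.0, (288, 384))
scene = lambda level: true_gain * level + true_off

# Calibrate on two uniform blackbody levels, then correct an unseen scene
gain, offset = two_point_nuc(scene(1000.0), scene(3000.0), 1000.0, 3000.0)
corrected = gain * scene(2000.0) + offset   # uniform response restored
```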

  17. Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena

    NASA Astrophysics Data System (ADS)

    Pei Wong, Choun; Subramaniam, R.

    2018-05-01

    The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  18. Seeing in a Different Light--Using an Infrared Camera to Teach Heat Transfer and Optical Phenomena

    ERIC Educational Resources Information Center

    Wong, Choun Pei; Subramaniam, R.

    2018-01-01

    The infrared camera is a useful tool in physics education to 'see' in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  19. Fiber-Optic Surface Temperature Sensor Based on Modal Interference.

    PubMed

    Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc

    2016-07-28

    Spatially-integrated surface temperature sensing is highly useful for controlling processes, detecting hazardous conditions, and monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation by means of cost-effective image processing of few-mode interference patterns. New developments in sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on the image processing of interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system consisting of an infrared camera and a precision temperature probe. The optical technique agrees with the reference system. Compared with conventional surface temperature probes, it offers low heat-capacity temperature measurement errors, easier spatial deployment, replacement of multi-angle infrared camera imaging, and continuous monitoring of surfaces that are not visually accessible.

  20. VizieR Online Data Catalog: NIR polarimetric study in the LMC N159/N160 field (Kim+, 2017)

    NASA Astrophysics Data System (ADS)

    Kim, J.; Jeong, W.-S.; Pyo, J.; Pak, S.; Park, W.-K.; Kwon, J.; Tamura, M.

    2018-04-01

    Simultaneous JHKs polarimetric observations of the N159/N160 complex were performed on 2007 February 3 and 5. We used the near-infrared camera SIRIUS (Nagayama et al. 2003SPIE.4841..459N) and the polarimeter SIRPOL (Kandori et al. 2006SPIE.6269E..51K) on the Infrared Survey Facility (IRSF) 1.4 m telescope at the South African Astronomical Observatory in Sutherland, South Africa. The camera has a field of view of 7.7'x7.7' and a pixel scale of 0.45"/pixel. One set of observations for a target field consisted of 20 s exposures at 10 dithered positions for four wave-plate angles (0°, 45°, 22.5°, and 67.5°) in the J, H, and Ks bands; the whole sequence was repeated 10 and 9 times for the N159 and N160 fields, centered at (α, δ)2000=(5h39m37.1s, -69°43'45.1") and (5h40m05.6s, -69°36'25.8"), respectively. (2 data files).
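From intensity images at the four half-wave-plate angles, the linear Stokes parameters follow in the usual way: I from the sum, Q from the 0°/45° difference, and U from the 22.5°/67.5° difference. A minimal sketch of that reduction, not the authors' pipeline (the sample intensities are invented):

```python
import numpy as np

def linear_polarization(i0, i45, i22, i67):
    """Stokes I, Q, U and degree/angle of linear polarization from
    intensities at half-wave-plate angles 0, 45, 22.5, 67.5 degrees."""
    I = 0.5 * (i0 + i45 + i22 + i67)
    Q = i0 - i45
    U = i22 - i67
    P = np.sqrt(Q**2 + U**2) / I                 # degree of polarization
    theta = 0.5 * np.degrees(np.arctan2(U, Q))   # polarization angle (deg)
    return I, Q, U, P, theta

# A source with total intensity 100, Q = 10, U = 5 would be measured as:
I, Q, U, P, theta = linear_polarization(55.0, 45.0, 52.5, 47.5)
```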

  1. SPARTAN Near-IR Camera | SOAR

    Science.gov Websites

    The Spartan Infrared Camera is a high spatial resolution near-IR imager for the SOAR telescope. Its focal plane consists of four detector arrays.

  2. Flooding in the Aftermath of Hurricane Katrina

    NASA Technical Reports Server (NTRS)

    2005-01-01

    These views of the Louisiana and Mississippi regions were acquired before and one day after Katrina made landfall along the Gulf of Mexico coast, and highlight many of the changes to the rivers and vegetation that occurred between the two views. The images were acquired by NASA's Multi-angle Imaging SpectroRadiometer (MISR) on August 14 and August 30, 2005. These multiangular, multispectral false-color composites were created using red band data from MISR's 46° backward and forward-viewing cameras, and near-infrared data from MISR's nadir camera. Such a display causes water bodies and inundated soil to appear in blue and purple hues, and highly vegetated areas to appear bright green. The scene differentiation is a result of both spectral effects (living vegetation is highly reflective at near-infrared wavelengths whereas water is absorbing) and of angular effects (wet surfaces preferentially forward scatter sunlight). The two images were processed identically and extend from the regions of Greenville, Mississippi (upper left) to Mobile Bay, Alabama (lower right).

    There are numerous rivers along the Mississippi coast that were not apparent in the pre-Katrina image; the most dramatic of these is a new inlet in the Pascagoula River that was not apparent before Katrina. The post-Katrina flooding along the edges of Lake Pontchartrain and the city of New Orleans is also apparent. In addition, the agricultural lands along the Mississippi floodplain in the upper left exhibit stronger near-infrared brightness before Katrina. After Katrina, many of these agricultural areas exhibit a stronger signal to MISR's oblique cameras, indicating the presence of inundated soil throughout the floodplain. Note that clouds appear in a different spot for each view angle due to a parallax effect resulting from their height above the surface.

    The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously, viewing the entire globe between 82° north and 82° south latitude every nine days. Each image covers an area of about 380 kilometers by 410 kilometers. The data products were generated from a portion of the imagery acquired during Terra orbits 30091 and 30324 and utilize data from blocks 64-67 within World Reference System-2 path 22.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Science Mission Directorate, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is managed for NASA by the California Institute of Technology.

  3. Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery

    NASA Technical Reports Server (NTRS)

    Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei

    2012-01-01

    We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profiles of patients. A pixel-coregistered, simultaneously read mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements, especially for objects of unknown emissivity. Even the dual-band measurement can yield inaccurate results because emissivity is a function of wavelength. We have therefore been developing a four-band QWIP camera for accurate temperature measurement of remote objects.
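The motivation for dual-band measurement can be illustrated with classic two-color (ratio) pyrometry: under a gray-body assumption the unknown emissivity cancels in the ratio of two band signals, and the abstract's caveat corresponds to that assumption failing when emissivity varies with wavelength. A sketch using the Wien approximation (the band centers and emissivity value are illustrative, not the QWIP camera's):

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(lam, T, emissivity=1.0):
    """Wien-approximation spectral radiance (arbitrary overall scale)."""
    return emissivity * lam**-5 * np.exp(-C2 / (lam * T))

def ratio_temperature(s1, s2, lam1, lam2):
    """Gray-body two-band (ratio) temperature from band signals s1, s2."""
    num = C2 * (1.0 / lam1 - 1.0 / lam2)
    den = 5.0 * np.log(lam2 / lam1) - np.log(s1 / s2)
    return num / den

lam_mwir, lam_lwir = 4.5e-6, 9.0e-6   # representative band centers (m)
T_true = 310.0                        # body temperature (K)
eps = 0.85                            # unknown, but equal in both bands
s1 = wien_radiance(lam_mwir, T_true, eps)
s2 = wien_radiance(lam_lwir, T_true, eps)
T_est = ratio_temperature(s1, s2, lam_mwir, lam_lwir)   # emissivity cancels
```

If `eps` differed between the two bands, the ratio would be biased, which is exactly why a four-band system is being pursued.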

  4. Infrared imaging results of an excited planar jet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrington, R.B.

    1991-12-01

    Planar jets are used for many applications including heating, cooling, and ventilation. Generally such a jet is designed to provide good mixing within an enclosure. In building applications, the jet provides both thermal comfort and adequate indoor air quality. Increased mixing rates may lead to less short-circuiting of conditioned air, elimination of dead zones within the occupied zone, reduced energy costs, increased occupant comfort, and higher indoor air quality. This paper discusses using an infrared imaging system to show the effect of excitation of a jet on the spread angle and on the jet mixing efficiency. Infrared imaging captures a large number of data points in real time (over 50,000 data points per image), providing significant advantages over single-point measurements. We used a screen mesh with a time constant of approximately 0.3 seconds as a target for the infrared camera to detect temperature variations in the jet. The infrared images show increased jet spread due to excitation of the jet. Digital data reduction and analysis show the change in jet isotherms and quantify the increased mixing caused by excitation. 17 refs., 20 figs.

  5. Accuracy and Precision of a Surgical Navigation System: Effect of Camera and Patient Tracker Position and Number of Active Markers

    PubMed Central

    Gundle, Kenneth R.; White, Jedediah K.; Conrad, Ernest U.; Ching, Randal P.

    2017-01-01

    Introduction: Surgical navigation systems are increasingly used to aid resection and reconstruction of osseous malignancies. In the process of implementing image-based surgical navigation systems, there are numerous opportunities for error that may impact surgical outcome. This study aimed to examine modifiable sources of error in an idealized scenario, when using a bidirectional infrared surgical navigation system. Materials and Methods: Accuracy and precision were assessed using a computerized-numerical-controlled (CNC) machined grid with known distances between indentations while varying: 1) the distance from the grid to the navigation camera (range 150 to 247cm), 2) the distance from the grid to the patient tracker device (range 20 to 40cm), and 3) whether the minimum or maximum number of bidirectional infrared markers were actively functioning. For each scenario, distances between grid points were measured at 10-mm increments between 10 and 120mm, with twelve measurements made at each distance. The accuracy outcome was the root mean square (RMS) error between the navigation system distance and the actual grid distance. To assess precision, four indentations were recorded six times for each scenario while also varying the angle of the navigation system pointer. The outcome for precision testing was the standard deviation of the distance between each measured point to the mean three-dimensional coordinate of the six points for each cluster. Results: Univariate and multiple linear regression revealed that as the distance from the navigation camera to the grid increased, the RMS error increased (p<0.001). The RMS error also increased when not all infrared markers were actively tracking (p=0.03), and as the measured distance increased (p<0.001). In a multivariate model, these factors accounted for 58% of the overall variance in the RMS error. 
Standard deviations in repeated measures also increased when not all infrared markers were active (p<0.001), and as the distance between navigation camera and physical space increased (p=0.005). Location of the patient tracker did not affect accuracy (p=0.36) or precision (p=0.97). Conclusion: In our model laboratory test environment, the infrared bidirectional navigation system was more accurate and precise when the distance from the navigation camera to the physical (working) space was minimized and all bidirectional markers were active. These findings may require alterations in operating room setup and software changes to improve the performance of this system. PMID:28694888

  6. A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.

    PubMed

    Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi

    2016-08-30

    This paper proposes a novel infrared camera array guidance system capable of tracking and providing the real-time position and speed of a fixed-wing unmanned aerial vehicle (UAV) during the landing process. The system comprises three novel parts: (1) a cooperative long-range optical imaging module based on an infrared camera array and near-infrared laser lamps; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with a fixed-wing aircraft demonstrate that our infrared camera array system can guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for automatic, accurate UAV landing in Global Positioning System (GPS)-denied environments.
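The 3D tracking step of such a camera array can be sketched with standard linear (DLT) triangulation of the laser marker from two calibrated views; the camera layout, baseline, and intrinsics below are hypothetical, not the paper's:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two camera views,
    given 3x4 projection matrices P1, P2 and pixel observations x1, x2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                     # null vector = homogeneous 3D point
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point with a 3x4 projection matrix to pixels."""
    h = P @ np.append(X, 1.0)
    return h[:2] / h[2]

# Two calibrated cameras with a 1 m baseline along x (hypothetical layout)
K = np.array([[1000.0, 0, 320], [0, 1000.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([2.0, 0.5, 50.0])   # marker 50 m in front of the array
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```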

  7. Mapping wave breaking and residual foam using infrared remote sensing

    NASA Astrophysics Data System (ADS)

    Carini, R. J.; Jessup, A. T.; Chickadel, C.

    2012-12-01

    Quantifying wave breaking in the surfzone is important for the advancement of models that seek to accurately predict energy dissipation, near-shore circulation, wave-current interactions, and air-sea gas transfer. Electro-optical remote sensing has been used to identify breaking waves, but the residual foam left after a wave has broken is indistinguishable from active foam in the visible band, which makes identification of active breaking difficult. Here, we explore infrared remote sensing of breaking waves at near-grazing incidence angles to differentiate between active and residual foam in the surfzone. Measurements were made at two field sites: Duck, NC, in September 2010 (Surf Zone Optics) and New River Inlet, NC, in May 2012 (RIVET). At both sites, multiple IR cameras were mounted on a tower onshore, viewing the surfzone at near-grazing incidence angles. At such angles, small changes in viewing angle, such as those produced by the slope of a wave face, cause large modulations of the infrared signal, so the passage of waves can be seen in IR imagery. Wave breaking, however, is identified by the resulting foam. Foam has a higher emissivity than undisturbed water and thus appears warmer in an IR image. Residual foam cools quickly [Marmorino and Smith, 2005], making its signal distinct from that of foam produced during active wave breaking. We will use these properties to develop a technique to produce spatial and temporal maps of active breaking and residual foam. These products can then be used to validate current models of surfzone bubbles and foam coverage. From the maps, we can also estimate energy dissipation due to wave breaking in the surfzone and compare this to estimates made with in situ data.

    [Figure: Infrared image of the surfzone at Duck, NC, with examples of actively breaking foam and cool residual foam labeled.]

  8. 13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  9. 10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  10. Waves on Saturn

    NASA Technical Reports Server (NTRS)

    2005-01-01

    An up-close look at Saturn's atmosphere shows wavelike structures in the planet's constantly changing clouds.

    Feathery striations in the lower right appear to be small-scale waves propagating at a higher altitude than the other cloud features.

    The image was taken with the Cassini spacecraft wide-angle camera on April 14, 2005, through a filter sensitive to wavelengths of infrared light centered at 727 nanometers and at a distance of approximately 386,000 kilometers (240,000 miles) from Saturn. The image scale is 19 kilometers (12 miles) per pixel.

  11. Seeing the Storm

    NASA Image and Video Library

    2007-03-08

    This beautiful look at Saturn's south polar atmosphere shows the hurricane-like polar storm swirling there. Sunlight highlights its high cloud walls, especially around the 10 o'clock position. The image was taken with the Cassini spacecraft wide-angle camera using a spectral filter sensitive to wavelengths of infrared light centered at 939 nanometers. The image was taken on Jan. 30, 2007 at a distance of approximately 1.1 million kilometers (700,000 miles) from Saturn. Image scale is 61 kilometers (38 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA08892

  12. Overcoming Pose Limitations of a Skin-Cued Histograms of Oriented Gradients Dismount Detector Through Contextual Use of Skin Islands and Multiple Support Vector Machines

    DTIC Science & Technology

    2011-03-24

    HOG) dismount detector that cues based off of the presence of human skin (to limit false detections and to reduce the search space complexity). While...wave infrared wavelengths in addition to the visible spectra in order to identify human skin [29] and selectively scan the image for the presence of...and the angle of the acquisition camera. Consequently, it is expected that limitations exist on the humans' range of motion or stance that still

  13. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  14. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many types of equipment. Testing such a system with a traditional infrared camera test system and a separate visible CCD test system requires two rounds of installation and alignment. The large-aperture test system for infrared and visible CCD cameras instead shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, reducing both cost and installation and alignment time. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design reduces the shift of the collimator's focal position with environmental temperature, improving both the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, making it commercially attractive.
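Multiple-frame averaging suppresses zero-mean random noise by roughly 1/sqrt(N) for N averaged frames. A minimal sketch of the idea on synthetic frames (the frame size and noise level are illustrative, not the test system's actual parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = np.full((64, 64), 100.0)          # ideal target radiance (arbitrary units)

def noisy_frame():
    # Each captured frame is the ideal scene plus additive random noise.
    return truth + rng.normal(0.0, 5.0, truth.shape)

single = noisy_frame()
averaged = np.mean([noisy_frame() for _ in range(25)], axis=0)

# Residual noise drops roughly as 1/sqrt(N): 25 frames cut it about 5x.
print(np.std(single - truth), np.std(averaged - truth))
```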

  15. Morphology and Dynamics of Jets of Comet 67P Churyumov-Gerasimenko: Early Phase Development

    NASA Astrophysics Data System (ADS)

    Lin, Zhong-Yi; Ip, Wing-Huen; Lai, Ian-Lin; Lee, Jui-Chi; Pajola, Maurizio; Lara, Luisa; Gutierrez, Pedro; Rodrigo, Rafael; Bodewits, Dennis; A'Hearn, Mike; Vincent, Jean-Baptiste; Agarwal, Jessica; Keller, Uwe; Mottola, Stefano; Bertini, Ivano; Lowry, Stephen; Rozek, Agata; Liao, Ying; Rosetta Osiris Coi Team

    2015-04-01

    The scientific camera, OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System), onboard the Rosetta spacecraft comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field investigations of the dust and gas coma. The dynamical behavior of jets in the dust coma, continuously monitored using dust filters from arrival at the comet (August 2014) through the mapping phase (October 2014), is described here. The analysis covers the time variability of the jets, their source regions, the excess brightness of jets relative to the average coma brightness, and the brightness distribution of dust jets along the projected distance. The jets detected between August and September originated mostly from the neck region (Hapi). Morphological changes appeared over a time scale of several days in September. The brightness slope of the dust jets is much steeper than that of the background coma, which might be related to sublimation or fragmentation of the emitted dust grains. Inter-comparison with results from other experiments will be necessary to understand the difference between the dust emitted from Hapi and that from the head and body of the nucleus surface. The physical properties of the Hapi jets will be compared to those of the dust jets (and their source regions) that emerge as comet 67P moves through perihelion.

  16. Study of carbonate concretions using imaging spectroscopy in the Frontier Formation, Wyoming

    NASA Astrophysics Data System (ADS)

    de Linaje, Virginia Alonso; Khan, Shuhab D.; Bhattacharya, Janok

    2018-04-01

    Imaging spectroscopy is applied to study diagenetic processes of the Wall Creek Member of the Cretaceous Frontier Formation, Wyoming. Visible Near-Infrared and Shortwave-Infrared hyperspectral cameras were used to scan near vertical and well-exposed outcrop walls to analyze lateral and vertical geochemical variations. Reflectance spectra were analyzed and compared with high-resolution laboratory spectral and hyperspectral imaging data. Spectral Angle Mapper (SAM) and Mixture Tuned Matched Filtering (MTMF) classification algorithms were applied to quantify facies and mineral abundances in the Frontier Formation. MTMF is the most effective and reliable technique when studying spectrally similar materials. Classification results show that calcite cement in concretions associated with the channel facies is homogeneously distributed, whereas the bar facies was shown to be interbedded with layers of non-calcite-cemented sandstone.
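The Spectral Angle Mapper used above measures similarity as the angle between a pixel spectrum and a reference spectrum, which makes it insensitive to overall illumination scaling. A minimal sketch (the spectra are illustrative, not the study's data):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between two spectra.

    Small angles indicate similar materials; multiplying a spectrum by a
    constant (illumination change) leaves the angle unchanged.
    """
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

calcite = np.array([0.55, 0.60, 0.48, 0.30])   # illustrative 4-band spectra
sand    = np.array([0.20, 0.25, 0.30, 0.33])

print(spectral_angle(2.0 * calcite, calcite))  # scaling: angle stays near zero
print(spectral_angle(sand, calcite))           # different material: larger angle
```

Classification then assigns each pixel to the reference (endmember) with the smallest angle, typically subject to a maximum-angle threshold.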

  17. Sublimation of icy aggregates in the coma of comet 67P/Churyumov-Gerasimenko detected with the OSIRIS cameras on board Rosetta

    NASA Astrophysics Data System (ADS)

    Gicquel, A.; Vincent, J.-B.; Agarwal, J.; A'Hearn, M. F.; Bertini, I.; Bodewits, D.; Sierks, H.; Lin, Z.-Y.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; Deller, J.; De Cecco, M.; Frattin, E.; El-Maarry, M. R.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Gutiérrez-Marquez, P.; Güttler, C.; Höfner, S.; Hofmann, M.; Hu, X.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Moreno, J. J. Lopez; Lowry, S.; Marzari, F.; Masoumzadeh, N.; Massironi, M.; Moreno, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Pommerol, A.; Preusker, F.; Scholten, F.; Shi, X.; Thomas, N.; Toth, I.; Tubiana, C.

    2016-11-01

    Beginning in 2014 March, the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras began capturing images of the nucleus and coma (gas and dust) of comet 67P/Churyumov-Gerasimenko using both the wide angle camera (WAC) and the narrow angle camera (NAC). The many observations taken since July of 2014 have been used to study the morphology, location, and temporal variation of the comet's dust jets. We analysed the dust monitoring observations shortly after the southern vernal equinox on 2015 May 30 and 31 with the WAC at the heliocentric distance Rh = 1.53 AU, where it is possible to observe that the jet rotates with the nucleus. We found that the decline of brightness as a function of the distance of the jet is much steeper than the background coma, which is a first indication of sublimation. We adapted a model of sublimation of icy aggregates and studied the effect as a function of the physical properties of the aggregates (composition and size). The major finding of this paper was that through the sublimation of the aggregates of dirty grains (radius a between 5 and 50 μm) we were able to completely reproduce the radial brightness profile of a jet beyond 4 km from the nucleus. To reproduce the data, we needed to inject a number of aggregates between 8.5 × 10^13 and 8.5 × 10^10 for a = 5 and 50 μm, respectively, or an initial mass of H2O ice around 22 kg.

  18. Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System

    NASA Astrophysics Data System (ADS)

    Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    In this paper, we present an automatic vision-based traffic sign recognition system that can detect and classify traffic signs at long distance under different lighting conditions. The system is built on an originally proposed dual-focal active camera system, in which a telephoto camera assists a wide-angle camera: the telephoto camera captures a high-accuracy image of an object of interest within the field of view of the wide-angle camera, providing enough information for recognition when the sign appears at too low a resolution in the wide-angle image. In the proposed system, traffic sign detection and classification are processed separately on the images from the two cameras. In addition, to detect traffic signs against complex backgrounds under different lighting conditions, we propose a color transformation that is invariant to lighting changes; it highlights the pattern of traffic signs by reducing the complexity of the background. Based on this color transformation, a multi-resolution detector with a cascade structure is trained and used to locate traffic signs at low resolution in the wide-angle image. After detection, the system actively captures a high-accuracy image of each detected sign by controlling the direction and exposure time of the telephoto camera using information from the wide-angle camera. For classification, a hierarchical classifier is constructed and used to recognize the detected signs in the high-accuracy telephoto image. Finally, a set of traffic sign recognition experiments is presented; the results demonstrate that the proposed system can effectively recognize traffic signs that appear at low resolution under different lighting conditions.

  19. [Evaluation of Iris Morphology Viewed through Stromal Edematous Corneas by Infrared Camera].

    PubMed

    Kobayashi, Masaaki; Morishige, Naoyuki; Morita, Yukiko; Yamada, Naoyuki; Kobayashi, Motomi; Sonoda, Koh-Hei

    2016-02-01

    We previously reported that an infrared camera enables observation of iris morphology through the edematous corneas of Peters' anomaly. The purpose of this study was to observe iris morphology in eyes with bullous keratopathy or failed grafts with an infrared camera. Eleven subjects with bullous keratopathy or failed grafts (6 men and 5 women; mean age ± SD, 72.7 ± 13.0 years) were enrolled. Iris morphology was observed using the visible-light and near-infrared modes of an infrared camera (MeibomPen), and the detectability of pupil shape, iris pattern, and the presence of an iridectomy was evaluated. Infrared-mode observation detected the pupil shape in 11 of 11 cases, the iris pattern in 3 of 11 cases, and the presence of an iridectomy in 9 of 11 cases, whereas visible-light observation could not detect any iris morphological features. Infrared optics are thus valuable for observing iris morphology through stromal edematous corneas.

  20. Coherent infrared imaging camera (CIRIC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP), coupled with Monolithic Microwave Integrated Circuit (MMIC) technology, are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras, which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry also provides spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

  1. Ten-Meter Scale Topography and Roughness of Mars Exploration Rovers Landing Sites and Martian Polar Regions

    NASA Technical Reports Server (NTRS)

    Ivanov, Anton B.

    2003-01-01

    The Mars Orbiter Camera (MOC) has been operating on board the Mars Global Surveyor (MGS) spacecraft since 1998. It consists of three cameras: red and blue Wide Angle cameras (FOV = 140 deg.) and a Narrow Angle camera (FOV = 0.44 deg.). The Wide Angle cameras provide surface resolution down to 230 m/pixel, and the Narrow Angle camera down to 1.5 m/pixel. This work is a continuation of a project we have reported on previously. Since then we have refined and improved our stereo correlation algorithm and have processed many more stereo pairs. We will discuss results of our analysis of stereo pairs located at the Mars Exploration Rover (MER) landing sites and address the feasibility of recovering topography from stereo pairs (especially in the polar regions) taken during the MGS 'Relay-16' mode.

  2. The Effect of Camera Angle and Image Size on Source Credibility and Interpersonal Attraction.

    ERIC Educational Resources Information Center

    McCain, Thomas A.; Wakshlag, Jacob J.

    The purpose of this study was to examine the effects of two nonverbal visual variables (camera angle and image size) on variables developed in a nonmediated context (source credibility and interpersonal attraction). Camera angle and image size were manipulated in eight video taped television newscasts which were subsequently presented to eight…

  3. High-Resolution Mars Camera Test Image of Moon Infrared

    NASA Image and Video Library

    2005-09-13

    This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The image was taken by the High Resolution Imaging Science Experiment camera on Sept. 8, 2005.

  4. Application of spatially modulated near-infrared structured light to study changes in optical properties of mouse brain tissue during heat stress.

    PubMed

    Shaul, Oren; Fanrazi-Kahana, Michal; Meitav, Omri; Pinhasi, Gad A; Abookasis, David

    2017-11-10

    Heat stress (HS) is a medical emergency defined by abnormally elevated body temperature that causes biochemical, physiological, and hematological changes. The goal of the present research was to detect variations in the optical properties (absorption, reduced scattering, and refractive index coefficients) of mouse brain tissue during HS by using near-infrared (NIR) spatial light modulation. NIR spatial patterns with different spatial phases were used to separate the effects of tissue scattering from those of absorption. Decoupling optical scattering from absorption enables quantification of a tissue's chemical constituents (related to light absorption) and structural properties (related to light scattering). Technically, structured light patterns at low and high spatial frequencies at six wavelengths between 690 and 970 nm were projected onto the mouse scalp surface while diffusely reflected light was recorded by a CCD camera positioned perpendicular to the scalp. Concurrently with pattern projection, brain temperature was measured with a thermal camera positioned slightly off angle from the mouse head, while core body temperature was monitored by a thermocouple probe. Data analysis demonstrated variations from baseline measurements in a battery of intrinsic brain properties following HS.
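Decoupling scattering from absorption in structured-light (SFDI-style) measurements starts by demodulating the projected patterns. A common approach, shown here as a sketch rather than the authors' exact pipeline, projects each sinusoidal pattern at three phase offsets and recovers the modulated (AC) reflectance amplitude per pixel:

```python
import numpy as np

def demodulate_ac(i1, i2, i3):
    """Three-phase demodulation: recover the AC reflectance amplitude from
    three images of the same sinusoidal pattern shifted by 0, 2*pi/3, 4*pi/3.
    This identity holds exactly for any underlying phase at each pixel."""
    return (np.sqrt(2.0) / 3.0) * np.sqrt((i1 - i2) ** 2 +
                                          (i2 - i3) ** 2 +
                                          (i3 - i1) ** 2)

# Synthetic check: a pattern with DC level 1.0 and modulation amplitude 0.4.
x = np.linspace(0.0, 1.0, 200)
fx = 5.0                                   # spatial frequency (cycles per unit)
frames = [1.0 + 0.4 * np.cos(2 * np.pi * fx * x + p)
          for p in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]
ac = demodulate_ac(*frames)
print(ac.mean())                           # recovers the 0.4 modulation amplitude
```

The AC amplitude at low versus high spatial frequency responds differently to absorption and scattering, which is what allows the two coefficients to be separated by fitting a light-transport model.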

  5. Circles and Hexagons

    NASA Image and Video Library

    2017-10-09

    Saturn's cloud belts generally move around the planet in a circular path, but one feature is slightly different. The planet's wandering, hexagon-shaped polar jet stream breaks the mold -- a reminder that surprises lurk everywhere in the solar system. This atmospheric feature was first observed by the Voyager mission in the early 1980s, and was dubbed "the hexagon." Cassini's visual and infrared mapping spectrometer was first to spy the hexagon during the mission, since it could see the feature's outline while the pole was still immersed in wintry darkness. The hexagon became visible to Cassini's imaging cameras as sunlight returned to the northern hemisphere. This view looks toward the northern hemisphere of Saturn -- in summer when this view was acquired -- from above 65 degrees north latitude. The image was taken with the Cassini spacecraft wide-angle camera on June 28, 2017 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 536,000 miles (862,000 kilometers) from Saturn. Image scale is 32 miles (52 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21348

  6. Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Wierzbicki, Damian; Fryskowska, Anna; Kedzierski, Michal; Wojtkowska, Michalina; Delis, Paulina

    2018-01-01

    Unmanned aerial vehicles are suited to various photogrammetry and remote sensing missions. Such platforms are equipped with various optoelectronic sensors imaging in the visible and infrared spectral ranges, as well as thermal sensors. Nowadays, near-infrared (NIR) images acquired from low altitudes are often used for producing orthophoto maps for precision agriculture, among other applications. One major problem results from the use of low-cost custom and compact NIR cameras with wide-angle lenses that introduce vignetting; in numerous cases, such cameras acquire images of low radiometric quality, depending on the lighting conditions. The paper presents a method of radiometric quality assessment of low-altitude NIR imagery from a custom sensor. The method relies on statistical analysis of the NIR images. The data used for the analyses were acquired from various altitudes in various weather and lighting conditions. An objective NIR imagery quality index was determined as a result of the research. The results obtained using this index enabled the classification of images into three categories: good, medium, and low radiometric quality. The classification makes it possible to determine the a priori error of the acquired images and assess whether a rerun of the photogrammetric flight is necessary.
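The abstract does not reproduce the quality index itself, but the three-way good/medium/low classification can be illustrated with a toy SNR-style score; the noise estimator and the thresholds below are purely hypothetical, not the paper's:

```python
import numpy as np

def radiometric_quality(image):
    """Toy quality score (not the paper's actual index): rate an NIR frame by
    the ratio of its mean brightness to a crude high-frequency noise estimate,
    then bin the score into three illustrative categories."""
    # Differencing adjacent pixels doubles iid noise variance, hence /sqrt(2).
    noise = np.std(np.diff(image, axis=1)) / np.sqrt(2.0)
    snr = image.mean() / max(noise, 1e-12)
    if snr > 50.0:
        return "good"
    if snr > 20.0:
        return "medium"
    return "low"

rng = np.random.default_rng(1)
clean = 120.0 + rng.normal(0.0, 1.0, (100, 100))    # well-lit, low-noise frame
noisy = 120.0 + rng.normal(0.0, 12.0, (100, 100))   # poor lighting, high noise
print(radiometric_quality(clean), radiometric_quality(noisy))
```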

  7. Radiometric Cross-Calibration of GAOFEN-1 Wfv Cameras with LANDSAT-8 Oli and Modis Sensors Based on Radiation and Geometry Matching

    NASA Astrophysics Data System (ADS)

    Li, J.; Wu, Z.; Wei, X.; Zhang, Y.; Feng, F.; Guo, F.

    2018-04-01

    Cross-calibration has the advantages of high precision, low resource requirements, and simple implementation, and has been widely used in recent years. The four wide-field-of-view (WFV) cameras on board the Gaofen-1 satellite provide high spatial resolution and wide combined coverage (4 × 200 km) but lack onboard calibration. In this paper, the four-band radiometric cross-calibration coefficients of the WFV1 camera were obtained based on radiation and geometry matching, taking the Landsat-8 OLI (Operational Land Imager) sensor as reference. The Scale Invariant Feature Transform (SIFT) feature detection method and a distance- and included-angle weighting method were introduced to correct misregistration of the WFV-OLI image pairs. A radiative transfer model was used to eliminate the spectral differences between the OLI sensor and the WFV1 camera through a spectral match factor (SMF). The near-infrared band of the WFV1 camera encompasses water vapor absorption bands, so a look-up table (LUT) of SMF as a function of water vapor amount was established to estimate water vapor effects. A surface synchronization experiment was designed to verify the reliability of the cross-calibration coefficients, which appear to perform better than the official coefficients published by the China Centre for Resources Satellite Data and Application (CCRSDA).
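Once image pairs are co-registered and spectrally matched, cross-calibration reduces to a linear fit between matched measurements: reference radiance versus uncalibrated digital numbers. A sketch with hypothetical matched samples (not the paper's data):

```python
import numpy as np

# Hypothetical matched samples: mean DN of WFV1 patches vs. top-of-atmosphere
# radiance of the same patches from Landsat-8 OLI, after applying the
# spectral match factor.
wfv_dn = np.array([120.0, 310.0, 505.0, 720.0, 910.0])
oli_radiance = np.array([24.5, 62.0, 101.5, 144.0, 181.0])  # W m^-2 sr^-1 um^-1

# Cross-calibration coefficients: radiance = gain * DN + bias.
gain, bias = np.polyfit(wfv_dn, oli_radiance, 1)
predicted = gain * wfv_dn + bias
print(gain, bias)
print(np.max(np.abs(predicted - oli_radiance)))   # fit residual
```

In practice each spectral band gets its own gain and bias, and the residuals over many matched scenes indicate the calibration uncertainty.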

  8. Space-based infrared sensors of space target imaging effect analysis

    NASA Astrophysics Data System (ADS)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

    Target identification is one of the core problems of ballistic missile defense, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a space-based infrared sensor imaging model of a ballistic target treated as a point source against the planetary atmosphere. It then simulates the infrared imaging of the exo-atmospheric ballistic target from two aspects, the space-based sensor's camera parameters and the target characteristics, and analyzes the effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target imagery.

  9. Challenges and solutions for high performance SWIR lens design

    NASA Astrophysics Data System (ADS)

    Gardner, M. C.; Rogers, P. J.; Wilde, M. F.; Cook, T.; Shipton, A.

    2016-10-01

    Shortwave infrared (SWIR) cameras are becoming increasingly attractive due to the improving size and resolution and decreasing prices of InGaAs focal plane arrays (FPAs). The rapid development of competitively priced HD-performance SWIR cameras has not been matched in SWIR imaging lenses, with the result that the lens is now more likely than the FPA to be the limiting factor in imaging quality. Adapting existing lens designs from the visible region by re-coating for SWIR will improve total transmission, but diminished image quality metrics such as MTF, and in particular degraded large-field-angle performance (vignetting, field curvature, and distortion), are serious consequences. To meet this challenge, original SWIR solutions are presented, including a wide-field-of-view fixed-focal-length lens for commercial machine vision (CMV) and a wide-angle, small, lightweight defence lens, and their relevant design considerations are discussed. Issues restricting suitable glass types are examined: the index and dispersion properties at SWIR wavelengths can differ significantly from their visible values, resulting in unusual glass combinations when matching doublet elements. The materials chosen simultaneously allow athermalization of the design while ensuring matched CTEs within doublet elements. Recently, thinned backside-illuminated InGaAs devices have made Vis-SWIR cameras viable. The SWIR band is sufficiently close to the visible that the same constituent materials can be used for AR coatings covering both bands. Keeping the lens short and its mass low can easily result in high incidence angles, which in turn complicates coating design, especially when extended beyond SWIR into the visible band. This paper also explores the potential performance of wideband Vis-SWIR AR coatings.

  10. Digital cartography of Io

    NASA Technical Reports Server (NTRS)

    Mcewen, Alfred S.; Duck, B.; Edwards, Kathleen

    1991-01-01

    A high resolution controlled mosaic of the hemisphere of Io centered on longitude 310 degrees is produced. Digital cartographic techniques were employed. Approximately 80 Voyager 1 clear and blue filter frames were utilized. This mosaic was merged with low-resolution color images. This dataset is compared to the geologic map of this region. Passage of the Voyager spacecraft through the Io plasma torus during acquisition of the highest resolution images exposed the vidicon detectors to ionizing radiation, resulting in dark-current buildup on the vidicon. Because the vidicon is scanned from top to bottom, more charge accumulated toward the bottom of the frames, and the additive error increases from top to bottom as a ramp function. This ramp function was removed by using a model. Photometric normalizations were applied using the Minnaert function. An attempt to use Hapke's photometric function revealed that this function does not adequately describe Io's limb darkening at emission angles greater than 80 degrees. In contrast, the Minnaert function accurately describes the limb darkening up to emission angles of about 89 degrees. The improved set of discrete camera angles derived from this effort will be used in conjunction with the space telemetry pointing history file (the IPPS file), corrected at 4- or 12-second intervals, to derive a revised time history for the pointing of the Infrared Interferometric Spectrometer (IRIS). For IRIS observations acquired between camera shutterings, the IPPS file can be corrected by linear interpolation, provided that the spacecraft motions were continuous. Image areas corresponding to the fields of view of IRIS spectra acquired between camera shutterings will be extracted from the mosaic to place the IRIS observations and hotspot models into geologic context.
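The Minnaert photometric normalization mentioned above models limb darkening as I = I0 · mu0^k · mu^(k-1), where mu0 and mu are the cosines of the incidence and emission angles and k is the Minnaert exponent. A sketch of the normalization with an assumed k (not the value fitted for Io):

```python
import numpy as np

def minnaert_normalize(i_obs, mu0, mu, k=0.9):
    """Minnaert photometric normalization.

    The observed brightness follows I = I0 * mu0**k * mu**(k - 1); dividing by
    the angular factor recovers the normal albedo I0. The exponent k = 0.9 is
    an illustrative assumption, not a fitted value.
    """
    return i_obs / (mu0 ** k * mu ** (k - 1.0))

# Round trip: synthesize a brightness value from a known I0, then recover it.
i0 = 0.35
inc, emi = np.radians(30.0), np.radians(60.0)     # incidence and emission angles
mu0, mu = np.cos(inc), np.cos(emi)
observed = i0 * mu0 ** 0.9 * mu ** (0.9 - 1.0)
print(minnaert_normalize(observed, mu0, mu))      # recovers i0
```

Note that k = 1 reduces the model to Lambertian scattering, and the normalization diverges as mu approaches zero, consistent with the abstract's observation that the fit degrades only very near the limb (emission angles near 90 degrees).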

  11. Surveillance Using Multiple Unmanned Aerial Vehicles

    DTIC Science & Technology

    2009-03-01

    BATCAM wingspan was 21” vs Jodeh’s 9.1 ft, the BATCAM’s propulsion was electric vs. Jodeh’s gas engine, cameras were body fixed vs. gimballed, and ...

    Table 3.1: BATCAM Camera FOV Angles

        Angle             Front Camera   Side Camera
        Depression angle  49°            39°
        Horizontal FOV    48°            48°
        Vertical FOV      40°            40°

    ... by a quiet electric motor. The batteries can be recharged with a car cigarette lighter in less than an hour. Assembly of the wing airframe takes less than a minute, and

  12. Space infrared telescope facility wide field and diffraction limited array camera (IRAC)

    NASA Technical Reports Server (NTRS)

    Fazio, Giovanni G.

    1988-01-01

    The wide-field and diffraction-limited array camera (IRAC) is capable of two-dimensional photometry in either a wide-field or diffraction-limited mode over the wavelength range from 2 to 30 microns, with a possible extension to 120 microns. A low-doped indium antimonide detector was developed for 1.8 to 5.0 microns, detectors were tested and optimized for the entire 1.8 to 30 micron range, beamsplitters were developed and tested for the 1.8 to 30 micron range, and tradeoff studies of the camera's optical system were performed. Data are presented on the performance of InSb, Si:In, Si:Ga, and Si:Sb array detectors bump-bonded to a multiplexed CMOS readout chip of the source-follower type at SIRTF operating backgrounds (≤ 1 × 10^8 ph cm^-2 s^-1) and temperatures (4 to 12 K). Some results at higher temperatures are also presented for comparison to the SIRTF-temperature results. Data are also presented on the performance of IRAC beamsplitters at room temperature at both 0 and 45 deg angle of incidence, and on the performance of the all-reflecting optical system baselined for the camera.

  13. Reconditioning of Cassini Narrow-Angle Camera

    NASA Image and Video Library

    2002-07-23

    These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera optics, then the successful removal of the haze by warming treatments.

  14. Visual cueing considerations in Nap-of-the-Earth helicopter flight head-slaved helmet-mounted displays

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Kohn, Silvia

    1993-01-01

    The pilot's ability to derive control-oriented visual field information from teleoperated helmet-mounted displays in nap-of-the-earth flight is investigated. The visual field with these types of displays, commonly used in Apache and Cobra helicopter night operations, originates from a relatively narrow-field-of-view forward-looking infrared camera, gimbal-mounted at the nose of the aircraft and slaved to the pilot's line of sight in order to obtain a wide-angle field of regard. Pilots have encountered considerable difficulties in controlling the aircraft with these devices. Experimental simulator results presented here indicate that part of these difficulties can be attributed to head/camera slaving system phase lags and errors. In the presence of voluntary head rotation, these slaving system imperfections are shown to impair the control-oriented visual field information vital to vehicular control, such as the perception of the anticipated flight path or the vehicle yaw rate. Since, in the presence of slaving system imperfections, the pilot will tend to minimize head rotation, the full wide-angle field of regard of the line-of-sight-slaved helmet-mounted display is not always fully utilized.

  15. Comparison between diffuse infrared and acoustic transmission over the human skull.

    PubMed

    Wang, Q; Reganti, N; Yoshioka, Y; Howell, M; Clement, G T

    2015-01-01

    Skull-induced distortion and attenuation present a challenge to both transcranial imaging and therapy. Whereas therapeutic procedures have successfully offset aberration using prior CT scans, this approach is impractical for imaging. In an effort to provide a simplified means of aberration correction, we have been investigating the use of diffuse infrared light as an indicator of acoustic properties. Infrared wavelengths were specifically selected for tissue penetration; however, this preliminary study was performed through bone alone in a transmission mode to facilitate comparison with acoustic measurements. The inner surface of a half human skull, cut along the sagittal midline, was illuminated using an infrared heat lamp, and images of the outer surface were acquired with an IR-sensitive camera. Images at a range of source angles were acquired and averaged to eliminate source bias. Acoustic measurements were likewise obtained over the surface with a source (1 MHz, 12.7-mm diameter) oriented parallel to the skull surface and a hydrophone receiver (1-mm PVDF). Preliminary results reveal a positive correlation between sound speed and optical intensity, whereas poor correlation is observed between acoustic amplitude and optical intensity.

  16. Lock-in thermographic inspection of squats on rail steel head

    NASA Astrophysics Data System (ADS)

    Peng, D.; Jones, R.

    2013-03-01

    The development of squat defects has become a major concern in numerous railway systems throughout the world. Infrared thermography is a relatively new non-destructive inspection technique used for a wide range of applications. However, it has not been used for rail squat detection. Lock-in thermography is a non-destructive inspection technique that uses an infrared camera to detect thermal waves. A thermal image is produced, which displays the local thermal wave variation in phase or amplitude. In inhomogeneous materials, the amplitude and phase of the thermal wave carry information related to both the local thermal properties and the nature of the structure being inspected. By examining the infrared thermal signature of squat damage on the head of steel rails, it was possible to generate a relationship matching squat depth to thermal image phase angle, using appropriate experimental/numerical calibration. The results showed that, with additional data sets obtained from further experimental tests, the clarity of this relationship can be improved to a level whereby infrared thermal contours can be directly translated into the precise subsurface behaviour of a squat.
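    The phase and amplitude images that lock-in thermography produces are typically recovered by demodulating the camera frames against the excitation. A common approach (not necessarily the one used in this paper) is the four-point method: with four frames sampled at quarter-period intervals of the modulation, per-pixel amplitude and phase follow directly. A self-contained sketch on synthetic data:

```python
import numpy as np

# Four-point lock-in demodulation: S1..S4 are frames captured at
# t = 0, T/4, T/2, 3T/4 of the modulation period T. The standard
# formulas recover per-pixel thermal-wave amplitude and phase.
def lockin_demodulate(s1, s2, s3, s4):
    amplitude = 0.5 * np.sqrt((s1 - s3) ** 2 + (s2 - s4) ** 2)
    phase = np.arctan2(s1 - s3, s2 - s4)  # radians
    return amplitude, phase

# Synthetic 4x4-pixel scene: thermal wave of amplitude A = 2 and
# phase pi/4 riding on a DC background, sampled four times per cycle.
A, phi, dc = 2.0, np.pi / 4, 10.0
frames = [dc + A * np.sin(2 * np.pi * k / 4 + phi) * np.ones((4, 4))
          for k in range(4)]
amp, ph = lockin_demodulate(*frames)
print(amp[0, 0], ph[0, 0])  # recovers A = 2.0 and phase = pi/4
```

The DC background cancels in the differences, which is why lock-in imaging is robust to uneven heating; the recovered phase image is what the paper calibrates against squat depth.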

  17. MISR at 15: Multiple Perspectives on Our Changing Earth

    NASA Astrophysics Data System (ADS)

    Diner, D. J.; Ackerman, T. P.; Braverman, A. J.; Bruegge, C. J.; Chopping, M. J.; Clothiaux, E. E.; Davies, R.; Di Girolamo, L.; Garay, M. J.; Jovanovic, V. M.; Kahn, R. A.; Kalashnikova, O.; Knyazikhin, Y.; Liu, Y.; Marchand, R.; Martonchik, J. V.; Muller, J. P.; Nolin, A. W.; Pinty, B.; Verstraete, M. M.; Wu, D. L.

    2014-12-01

    Launched aboard NASA's Terra satellite in December 1999, the Multi-angle Imaging SpectroRadiometer (MISR) instrument has opened new vistas in remote sensing of our home planet. Its 9 pushbroom cameras provide as many view angles ranging from 70 degrees forward to 70 degrees backward along Terra's flight track, in four visible and near-infrared spectral bands. MISR's well-calibrated, accurately co-registered, and moderately high spatial resolution radiance images have been coupled with novel data processing algorithms to mine the information content of angular reflectance anisotropy and multi-camera stereophotogrammetry, enabling new perspectives on the 3-D structure and dynamics of Earth's atmosphere and surface in support of climate and environmental research. Beginning with "first light" in February 2000, the nearly 15-year (and counting) MISR observational record provides an unprecedented data set with applications to multiple disciplines, documenting regional, global, short-term, and long-term changes in aerosol optical depths, aerosol type, near-surface particulate pollution, spectral top-of-atmosphere and surface albedos, aerosol plume-top and cloud-top heights, height-resolved cloud fractions, atmospheric motion vectors, and the structure of vegetated and ice-covered terrains. Recent computational advances include aerosol retrievals at finer spatial resolution than previously possible, and production of near-real time tropospheric winds with a latency of less than 3 hours, making possible for the first time the assimilation of MISR data into weather forecast models. In addition, recent algorithmic and technological developments provide the means of using and acquiring multi-angular data in new ways, such as the application of optical tomography to map 3-D atmospheric structure; building smaller multi-angle instruments in the future; and extending the multi-angular imaging methodology to the ultraviolet, shortwave infrared, and polarimetric realms. 
Such advances promise further enhancements to the observational power of the remote sensing approaches that MISR has pioneered.

  18. Application of infrared camera to bituminous concrete pavements: measuring vehicle

    NASA Astrophysics Data System (ADS)

    Janků, Michal; Stryk, Josef

    2017-09-01

    Infrared thermography (IR) has been used for decades in certain fields. However, the level of advancement of measuring devices had not been sufficient for some applications. Over recent years, good-quality thermal cameras with high resolution and very high thermal sensitivity have started to appear on the market. This development in measuring technology has allowed the use of infrared thermography in new fields and by a larger number of users. This article describes research in progress at the Transport Research Centre focused on the use of infrared thermography for diagnostics of bituminous road pavements. A measuring vehicle, equipped with a thermal camera, digital camera, and GPS sensor, was designed for pavement diagnostics. New, highly sensitive thermal cameras make it possible to measure very small temperature differences from the moving vehicle. This study shows the potential of high-speed inspection without lane closures while using IR thermography.

  19. Uncooled infrared sensors: rapid growth and future perspective

    NASA Astrophysics Data System (ADS)

    Balcerak, Raymond S.

    2000-07-01

    Uncooled infrared cameras are now available for both the military and commercial markets. The current camera technology incorporates the fruits of many years of development, focusing on the details of pixel design, novel material processing, and low-noise read-out electronics. The rapid insertion of cameras into systems is testimony to the successful completion of this 'first phase' of development. In the military market, the first uncooled infrared cameras will be used for weapon sights, driver's viewers, and helmet-mounted cameras. Major commercial applications include night driving, security, police and fire fighting, and thermography, primarily for preventive maintenance and process control. The technology for the next generation of cameras is even more demanding, but within reach. The paper outlines the technology program planned for the next generation of cameras, and the approaches to further enhance performance, even to the radiation limit of thermal detectors.

  20. Electro-optical system for gunshot detection: analysis, concept, and performance

    NASA Astrophysics Data System (ADS)

    Kastek, M.; Dulski, R.; Madura, H.; Trzaskawka, P.; Bieszczad, G.; Sosnowski, T.

    2011-08-01

    The paper discusses the technical possibilities of building an effective electro-optical sensor unit for sniper detection using infrared cameras. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. First, an analysis is presented of the three distinct phases of sniper activity: before, during, and after the shot. On the basis of experimental data, the parameters defining the relevant sniper signatures were determined; these are essential in assessing the capability of an infrared camera to detect sniper activity. A sniper's body and muzzle flash were analyzed as targets, the phenomena that make it possible to detect sniper activity in the infrared spectrum were described, and the physical limitations were analyzed. The infrared systems under consideration were simulated using NVTherm software. Calculations were performed for several cameras equipped with different lenses and detector types, and detection ranges were simulated for selected sniper-detection scenarios. After analysis of the simulation results, the technical specifications required of an infrared sniper detection system to provide the assumed detection range were discussed. Finally, an infrared camera setup was proposed that can detect a sniper at a range of 1000 meters.

  1. Unstructured Facility Navigation by Applying the NIST 4D/RCS Architecture

    DTIC Science & Technology

    2006-07-01

    control, and the planner); wire- less data and emergency stop radios; GPS receiver; inertial navigation unit; dual stereo cameras; infrared sensors...current Actuators Wheel motors, camera controls Scale & filter signals status commands commands commands GPS Antenna Dual stereo cameras...used in the sensory processing module include the two pairs of stereo color cameras, the physical bumper and infrared bumper sensors, the motor

  2. Infrared Fiber Imager

    DTIC Science & Technology

    1999-05-12

    to an infrared television camera AVTO TVS-2100. The detector in the camera was an InSb crystal having sensitivity in the wavelength region between 3.0...Serial Number: Navy Case: 79,823 camera AVTO TVS-2100, with a detector of the In Sb crystal, having peak sensitivity in the wavelength region between

  3. Students' Framing of Laboratory Exercises Using Infrared Cameras

    ERIC Educational Resources Information Center

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-01-01

    Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the…

  4. Watercolor World

    NASA Image and Video Library

    2017-04-17

    When imaged by NASA's Cassini spacecraft at infrared wavelengths that pierce the planet's upper haze layer, the high-speed winds of Saturn's atmosphere produce watercolor-like patterns. With no solid surface creating atmospheric drag, winds on Saturn can reach speeds of more than 1,100 miles per hour (1,800 kilometers per hour) -- some of the fastest in the solar system. This view was taken from a vantage point about 28 degrees above Saturn's equator. The image was taken with the Cassini spacecraft wide-angle camera on Dec. 2, 2016, with a combination of spectral filters that preferentially admits wavelengths of near-infrared light centered at 728 nanometers. The view was acquired at a distance of approximately 592,000 miles (953,000 kilometers) from Saturn. Image scale is 35 miles (57 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA20528
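    The per-pixel image scale quoted in such captions follows from simple small-angle geometry: scale ≈ range × per-pixel angular resolution (IFOV). The ~60 microradian IFOV assumed below for Cassini's ISS wide-angle camera is not stated in the caption itself:

```python
# Image scale from range and instantaneous field of view (IFOV).
# Assumed value: ISS wide-angle camera IFOV of ~60 urad per pixel.
range_km = 953_000      # spacecraft-to-Saturn distance from the caption
ifov_rad = 60e-6        # assumed per-pixel angular resolution, radians

scale_km_per_px = range_km * ifov_rad  # small-angle approximation
print(f"{scale_km_per_px:.0f} km/pixel")
```

The result, roughly 57 km per pixel, is consistent with the 57 kilometers per pixel stated in the caption.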

  5. An infrared jet in Centaurus A - A link to the extranuclear activity in distant radio galaxies?

    NASA Technical Reports Server (NTRS)

    Joy, Marshall; Harvey, P. M.; Tollestrup, E. V.; Sellgren, K.; Mcgregor, P. J.

    1991-01-01

    High-resolution NIR images of the visually obscured central region of Centaurus A (NGC 5128) were obtained with the University of Texas array camera on the AAT in June 1988, in order to investigate the effect of the active nucleus on the surrounding galaxy. The J (1.25 micron), H (1.65 micron), and K (2.2 micron) images of the central 40 arcsec of the galaxy revealed an emission feature extending about 10 arcsec northeast of the nucleus at the same position angle as the X-ray and radio jets. This jet is most prominent at the 1.25 micron wavelength, where its brightness was comparable to that of the nucleus. The observed properties of the 'infrared jet' were found to be similar to those seen in distant radio sources.

  6. Near-infrared hyperspectral imaging of atherosclerotic tissue phantom

    NASA Astrophysics Data System (ADS)

    Ishii, K.; Nagao, R.; Kitayabu, A.; Awazu, K.

    2013-06-01

    A method to identify vulnerable plaques that are likely to cause acute coronary events is needed. The objective of this study is to identify vulnerable plaques by hyperspectral imaging in the near-infrared range (NIR-HSI) for an angioscopic application. In this study, NIR-HSI of atherosclerotic tissue phantoms was demonstrated under simulated angioscopic conditions. The NIR-HSI system was constructed from a NIR supercontinuum light source and a mercury-cadmium-telluride camera. Spectral absorbance values were obtained in the wavelength range from 1150 to 2400 nm at 10 nm intervals. The hyperspectral images were constructed with the spectral angle mapper algorithm. As a result, detection of the lipid area in the atherosclerotic tissue phantom under angioscopic observation conditions was achieved, especially at wavelengths around 1200 nm, which corresponds to the second overtone of the CH stretching vibration mode.
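    The spectral angle mapper (SAM) named above classifies each pixel by the angle between its spectrum and a reference spectrum; a smaller angle means a closer match, independent of overall brightness. A minimal sketch with hypothetical 4-band spectra (not the study's 1150-2400 nm data):

```python
import numpy as np

# Spectral angle mapper: angle between a pixel spectrum and a
# reference spectrum, treated as vectors in band space.
def spectral_angle(pixel, reference):
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))  # radians

# Hypothetical reference absorbance spectra (4 bands, illustrative).
lipid_ref = np.array([0.82, 0.95, 0.60, 0.41])
normal_ref = np.array([0.30, 0.35, 0.55, 0.70])

# A pixel spectrum resembling the lipid reference is assigned the
# class whose reference makes the smaller spectral angle.
pixel = np.array([0.80, 0.90, 0.58, 0.45])
label = ("lipid"
         if spectral_angle(pixel, lipid_ref) < spectral_angle(pixel, normal_ref)
         else "normal")
print(label)
```

Because SAM compares direction rather than magnitude, it tolerates the uneven illumination expected under angioscopic observation conditions.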

  7. Northern Summer on Titan

    NASA Image and Video Library

    2017-06-14

    NASA's Cassini spacecraft sees bright methane clouds drifting in the summer skies of Saturn's moon Titan, along with dark hydrocarbon lakes and seas clustered around the north pole. Compared to earlier in Cassini's mission, most of the surface in the moon's northern high latitudes is now illuminated by the sun. The image was taken with the Cassini spacecraft narrow-angle camera on June 9, 2017, using a spectral filter that preferentially admits wavelengths of near-infrared light centered at 938 nanometers. Cassini obtained the view at a distance of about 315,000 miles (507,000 kilometers) from Titan. https://photojournal.jpl.nasa.gov/catalog/PIA21615

  8. Distinguishing the road conditions of dry, aquaplane, and frozen by using a three-color infrared camera

    NASA Astrophysics Data System (ADS)

    Tabuchi, Toru; Yamagata, Shigeki; Tamura, Tetsuo

    2003-04-01

    As automobile traffic increases, there is growing demand for information that helps avoid accidents. We discuss how an infrared camera can identify three conditions of the road surface: dry, aquaplane, and frozen. The principles of this method are: (1) we have found that a 3-color infrared camera can distinguish these conditions using proper data processing; (2) the emissivity of the materials on the road surface (concrete, water, ice) differs in the three wavelength regions; (3) the sky's temperature is lower than the road's, and the emissivity of the road depends on the road surface condition. The 3-color infrared camera therefore measures the energy reflected from the sky off the road surface together with the self-radiation of the road surface, and the road condition can be distinguished by processing the energy pattern measured in the three wavelength regions. Our experimental results confirm that the emissivity of concrete differs from that of water. An infrared camera whose NETD (noise equivalent temperature difference) in each of the three wavelength bands is 1.0 °C or less can distinguish the road conditions by using this emissivity difference.
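    The classification idea described above amounts to matching the measured 3-band pattern against reference patterns for each surface type. A minimal sketch, with all emissivity values as hypothetical placeholders rather than measured data:

```python
import numpy as np

# Hypothetical 3-band emissivity reference patterns for the three
# road-surface conditions (placeholder values, not measurements).
REFERENCES = {
    "dry":    np.array([0.95, 0.93, 0.91]),   # bare concrete
    "water":  np.array([0.98, 0.96, 0.99]),   # aquaplane (wet)
    "frozen": np.array([0.97, 0.99, 0.96]),   # ice
}

def classify_surface(measured):
    # Nearest-neighbor match in 3-band emissivity space: pick the
    # reference pattern with the smallest Euclidean distance.
    return min(REFERENCES,
               key=lambda k: np.linalg.norm(measured - REFERENCES[k]))

print(classify_surface(np.array([0.975, 0.955, 0.985])))
```

In practice the measured pattern would first be derived from the three band radiances by separating sky reflection from self-radiation, which is where the camera's per-band NETD requirement enters.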

  9. Looking Up to the Giant

    NASA Image and Video Library

    2015-08-03

    Thanks to the illumination angle, Mimas (right) and Dione (left) appear to be staring up at a giant Saturn looming in the background. Although certainly large enough to be noticeable, moons like Mimas (246 miles or 396 kilometers across) and Dione (698 miles or 1123 kilometers across) are tiny compared to Saturn (75,400 miles or 120,700 kilometers across). Even the enormous moon Titan (3,200 miles or 5,150 kilometers across) is dwarfed by the giant planet. This view looks toward the unilluminated side of the rings from about one degree above the ring plane. The image was taken with the Cassini spacecraft wide-angle camera on May 27, 2015, using a spectral filter that preferentially admits wavelengths of near-infrared light centered at 728 nanometers. The view was obtained at a distance of approximately 634,000 miles (one million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 85 degrees. Image scale is 38 miles (61 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18331

  10. Mitigation of Atmospheric Effects on Imaging Systems

    DTIC Science & Technology

    2004-03-31

    focal length. The imaging system had two cameras: an Electrim camera sensitive in the visible (0.6 µm) waveband and an Amber QWIP infrared camera...sensitive in the 9-micron region. The Amber QWIP infrared camera had 256x256 pixels, pixel pitch 38 µm, focal length of 1.8 m, FOV of 5.4 x 5.4 mr...each day. Unfortunately, signals from the different read ports of the Electrim camera picked up noise on their way to the digitizer, and this resulted

  11. The Detection and Photometric Redshift Determination of Distant Galaxies using SIRTF's Infrared Array Camera

    NASA Technical Reports Server (NTRS)

    Simpson, C.; Eisenhardt, P.

    1998-01-01

    We investigate the ability of the Space Infrared Telescope Facility's Infrared Array Camera to detect distant (z3) galaxies and measure their photometric redshifts. Our analysis shows that changing the original long wavelength filter specifications provides significant improvements in performance in this and other areas.

  12. ARNICA, the Arcetri near-infrared camera: Astronomical performance assessment.

    NASA Astrophysics Data System (ADS)

    Hunt, L. K.; Lisi, F.; Testi, L.; Baffa, C.; Borelli, S.; Maiolino, R.; Moriondo, G.; Stanga, R. M.

    1996-01-01

    The Arcetri near-infrared camera ARNICA was built as a users' instrument for the Infrared Telescope at Gornergrat (TIRGO), and is based on a 256x256 NICMOS 3 detector. In this paper, we discuss ARNICA's optical and astronomical performance at the TIRGO and at the William Herschel Telescope on La Palma. Optical performance is evaluated in terms of plate scale, distortion, point spread function, and ghosting. Astronomical performance is characterized by camera efficiency, sensitivity, and spatial uniformity of the photometry.

  13. A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.

    2009-01-01

    The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black and white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured to mount multiple vehicles, and act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black and white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black and white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.

  14. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  15. Development of a stiffness-angle law for simplifying the measurement of human hair stiffness.

    PubMed

    Jung, I K; Park, S C; Lee, Y R; Bin, S A; Hong, Y D; Eun, D; Lee, J H; Roh, Y S; Kim, B M

    2018-04-01

    This research examines the effect of caffeine absorption on hair stiffness. To test hair stiffness, we have developed an evaluation method that is not only accurate but also inexpensive. Our evaluation method for measuring hair stiffness culminated in a model, called the Stiffness-Angle Law, which describes the elastic properties of hair and can be widely applied to the development of hair care products. Small molecules (≤500 g mol⁻¹) such as caffeine can be absorbed into hair. A common shampoo containing 4% caffeine was formulated and applied to hair 10 times, after which the hair stiffness was measured. The caffeine absorption of the treated hair was observed using Fourier-transform infrared spectroscopy (FTIR) with a focal plane array (FPA) detector. Our evaluation method for measuring hair stiffness consists of a regular camera and a support for single strands of hair. After the hair was attached to the support, the bending angle of the hair was observed with the camera and measured, and the hair strand was then weighed. The stiffness of the hair was calculated based on our proposed Stiffness-Angle Law using three variables: angle, weight of hair, and the distance the hair was pulled across the support. The caffeine absorption was confirmed by FTIR analysis; the concentration of amide bonds in the hair clearly increased due to caffeine absorption. After caffeine was absorbed into the hair, the bending angle and weight of the hair changed. Applying these measured changes to the Stiffness-Angle Law, it was confirmed that the hair stiffness increased by 13.2% due to caffeine absorption. The theoretical results using the Stiffness-Angle Law agree with the visual examinations of hair exposed to caffeine and also with the known results of hair stiffness from a previous report. Our evaluation method, combined with our proposed Stiffness-Angle Law, provides an accurate and inexpensive technique for measuring the bending stiffness of human hair. 
© 2018 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  16. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  17. 33 CFR 117.993 - Lake Champlain.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) A sufficient number of infrared cameras shall be maintained in good working order at all times with... infrared cameras to verify that the channel is clear of all approaching vessel traffic. All approaching...

  18. 33 CFR 117.993 - Lake Champlain.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) A sufficient number of infrared cameras shall be maintained in good working order at all times with... infrared cameras to verify that the channel is clear of all approaching vessel traffic. All approaching...

  19. A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer.

    PubMed

    Shen, Bailey Y; Mukai, Shizuo

    2017-01-01

    Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient.

  20. A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer

    PubMed Central

    Shen, Bailey Y.

    2017-01-01

    Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient. PMID:28396802

  1. C-RED one: ultra-high speed wavefront sensing in the infrared made possible

    NASA Astrophysics Data System (ADS)

    Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian

    2016-07-01

    First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with subelectron readout noise. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, a truly disruptive technology in imaging. We present the camera's performance and main features and compare them to those of other high-performance wavefront-sensing cameras, such as OCAM2, in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.

  2. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.

    2011-01-01

    The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive-only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 µm) or long-wave infrared (LWIR) radiation (8-12 µm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24-hour water and 12-hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.

  3. Search for and limits on plume activity on Mimas, Tethys, and Dione with the Cassini Visual Infrared Mapping Spectrometer (VIMS)

    USGS Publications Warehouse

    Buratti, B.J.; Faulk, S.P.; Mosher, J.; Baines, K.H.; Brown, R.H.; Clark, R.N.; Nicholson, P.D.

    2011-01-01

    Cassini Visual Infrared Mapping Spectrometer (VIMS) observations of Mimas, Tethys, and Dione obtained during the nominal and extended missions at large solar phase angles were analyzed to search for plume activity. No forward-scattered peaks in the solar phase curves of these satellites were detected. The upper limit on water vapor production for Mimas and Tethys is one order of magnitude less than the production for Enceladus. For Dione, the upper limit is two orders of magnitude less, suggesting this world is as inert as Rhea (Pitman, K.M., Buratti, B.J., Mosher, J.A., Bauer, J.M., Momary, T., Brown, R.H., Nicholson, P.D., Hedman, M.M. [2008]. Astrophys. J. Lett. 680, L65-L68). Although the plumes are best seen at ~2.0 µm, Imaging Science Subsystem (ISS) Narrow Angle Camera images obtained at the same time as the VIMS data were also inspected for these features. None of the Cassini ISS images shows evidence for plumes. The absence of evidence for any Enceladus-like plumes on the medium-sized saturnian satellites cannot absolutely rule out current geologic activity. The activity may be below our threshold of detection, or it may be occurring but not captured in the handful of observations at large solar phase angles obtained for each moon. Many VIMS and ISS images of Enceladus at large solar phase angles, for example, do not contain plumes, as the active "tiger stripes" in the south pole region are pointed away from the spacecraft at these times. The 7-year Cassini Solstice Mission is scheduled to gather additional measurements at large solar phase angles that are capable of revealing activity on the saturnian moons. © 2011 Elsevier Inc.

  4. A wide-angle camera module for disposable endoscopy

    NASA Astrophysics Data System (ADS)

    Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee

    2016-08-01

    A wide-angle miniaturized camera module for disposable endoscopes is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and an LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, which further reduces its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype disposable endoscope was implemented to perform pre-clinical animal testing, in which the esophagus of an adult beagle dog was observed. These results demonstrate the feasibility of a cost-effective, high-performance camera module for disposable endoscopy.

  5. Portable Long-Wavelength Infrared Camera for Civilian Application

    NASA Technical Reports Server (NTRS)

    Gunapala, S. D.; Krabach, T. N.; Bandara, S. V.; Liu, J. K.

    1997-01-01

    In this paper, we discuss the performance of a portable long-wavelength infrared camera in terms of quantum efficiency, noise equivalent temperature difference (NEΔT), minimum resolvable temperature difference (MRTD), uniformity, etc., and its applications in science, medicine, and defense.

  6. Standoff aircraft IR characterization with ABB dual-band hyper spectral imager

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Lantagne, Stéphane; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc

    2012-09-01

    Remote-sensing infrared characterization of rapidly evolving events generally involves combining a spectro-radiometer and infrared camera(s) as separate instruments. Time synchronization, spatial co-registration, consistent radiometric calibration, and managing several systems are important challenges to overcome; they complicate the processing of target infrared characterization data and increase the sources of error affecting the final radiometric accuracy. MR-i is a dual-band hyperspectral imaging spectro-radiometer that combines two 256 × 256 pixel infrared cameras and an infrared spectro-radiometer in a single instrument. This field instrument generates spectral datacubes in the MWIR and LWIR. It is designed to acquire the spectral signatures of rapidly evolving events. The design is modular. The spectrometer has two output ports configured with two simultaneously operated cameras to either widen the spectral coverage or increase the dynamic range of the measured amplitudes. Various telescope options are available for the input port. Recent platform developments and field-trial measurement performance are presented for a system configuration dedicated to the characterization of airborne targets.

  7. Design of a Remote Infrared Images and Other Data Acquisition Station for outdoor applications

    NASA Astrophysics Data System (ADS)

    Béland, M.-A.; Djupkep, F. B. D.; Bendada, A.; Maldague, X.; Ferrarini, G.; Bison, P.; Grinzato, E.

    2013-05-01

    The Infrared Images and Other Data Acquisition Station enables a user located inside a laboratory to acquire visible and infrared images, as well as distance measurements, in an outdoor environment over an Internet connection. The station acquires data using an infrared camera, a visible camera, and a rangefinder, and can be operated through a web page or through Python functions.

  8. Low-cost low-power uncooled a-Si-based micro infrared camera for unattended ground sensor applications

    NASA Astrophysics Data System (ADS)

    Schimert, Thomas R.; Ratcliff, David D.; Brady, John F., III; Ropson, Steven J.; Gooch, Roland W.; Ritchey, Bobbi; McCardel, P.; Rachels, K.; Wand, Marty; Weinstein, M.; Wynn, John

    1999-07-01

    Low power and low cost are primary requirements for an imaging infrared camera used in unattended ground sensor arrays. In this paper, an amorphous silicon (a-Si) microbolometer-based uncooled infrared camera technology offering a low-cost, low-power solution to infrared surveillance for UGS applications is presented. A 15 × 31 micro infrared camera (MIRC) has been demonstrated which exhibits an f/1 noise equivalent temperature difference of approximately 67 mK. This sensitivity has been achieved without a thermoelectric cooler for array temperature stabilization, thereby significantly reducing the power requirements. The chopperless camera can operate from snapshot mode (1 Hz) to video frame rate (30 Hz). Power consumption of 0.4 W without display and 0.75 W with display has been demonstrated at 30 Hz operation. The demonstrated 15 × 31 camera exhibits a 35 mm camera form factor employing a low-cost f/1 singlet optic and LED display, as well as low-cost vacuum packaging. A larger 120 × 160 version of the MIRC is also in development and is discussed; it exhibits a substantially smaller form factor and incorporates all the low-cost, low-power features demonstrated in the 15 × 31 MIRC prototype. In this paper, the a-Si microbolometer technology for the MIRC is presented, along with the camera's key features and performance parameters.

  9. Effect of indocyanine green angiography using infrared fundus camera on subsequent dark adaptation and electroretinogram.

    PubMed

    Wen, Feng; Yu, Minzhong; Wu, Dezheng; Ma, Juanmei; Wu, Lezheng

    2002-07-01

    To observe the effect of indocyanine green angiography (ICGA) with an infrared fundus camera on subsequent dark adaptation and the Ganzfeld electroretinogram (ERG), the ERGs of 38 eyes with different retinal diseases were recorded before and after ICGA during a 40-min dark adaptation period. ICGA was performed with a Topcon 50IA retinal camera. The Ganzfeld ERG was recorded with a Neuropack II evoked-response recorder. The results showed that ICGA did not affect the latencies or amplitudes of the ERG rod response, cone response, or mixed maximum response (p>0.05). This suggests that ICGA using an infrared fundus camera can be performed prior to recording the Ganzfeld ERG.

  10. High-frame-rate infrared and visible cameras for test range instrumentation

    NASA Astrophysics Data System (ADS)

    Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.

    1995-09-01

    Field-deployable, high-frame-rate camera systems have been developed to support test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 × 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 × 1024 format backside-illuminated CCD. The monolithic MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3-5 micron thermal imaging in selectable modes from a 30 Hz frame rate, 640 × 480 frame size, and 33 ms integration time to a 300 Hz frame rate, 133 × 142 frame size, and 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 × 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from a 30 Hz frame rate, 1024 × 1024 frame size, and 32 ms integration time to a 300 Hz frame rate, 1024 × 1024 frame size (with 2:1 vertical binning), and 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 × 480 pixels) and 12-bit digital video.

  11. All sky imaging observations in visible and infrared waveband for validation of satellite cloud and aerosol products

    NASA Astrophysics Data System (ADS)

    Lu, Daren; Huo, Juan; Zhang, W.; Liu, J.

    A series of satellite sensors operating at visible and infrared wavelengths has been successfully flown on a number of research satellites, e.g., NOAA/AVHRR and MODIS onboard Terra and Aqua. A number of cloud and aerosol products have been produced and released in recent years. However, validating product quality and accuracy remains a challenge for the atmospheric remote sensing community. In this paper, we propose a ground-based validation scheme for satellite-derived cloud and aerosol products that combines visible and thermal-infrared all-sky imaging with surface meteorological observations. In the scheme, a visible digital camera with a fish-eye lens continuously monitors the whole sky with a view angle greater than 180 degrees. The camera system is calibrated both geometrically and radiometrically (broad blue, green, and red bands) so that a retrieval method can detect the spatial distribution of clear and cloudy sky and its temporal variation. A calibrated scanning thermal-infrared thermometer monitors the all-sky brightness temperature distribution. An algorithm is developed to detect clear and cloudy sky, as well as cloud base height, using the sky brightness distribution together with surface temperature and humidity as input. These composite retrievals of the clear- and cloudy-sky distribution can then be used to validate the satellite retrievals through both near-simultaneous comparisons and statistical analyses. This talk presents results from field observations and comparisons completed in Beijing (40 deg N, 116.5 deg E) in 2003 and 2004. This work is supported by NSFC grant No. 4002700 and MOST grant No. 2001CCA02200.
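    The clear/cloudy pixel discrimination described above is commonly implemented with a red-to-blue ratio test on the calibrated sky image: clear sky is strongly blue-dominated (Rayleigh scattering), while cloud scatters nearly neutrally. A minimal sketch of this standard technique follows; the function name, array shapes, and threshold value are illustrative assumptions, not details from the paper:

```python
import numpy as np

def cloud_mask(red, blue, threshold=0.6):
    """Classify sky pixels as cloudy using the red/blue ratio.

    Clear sky scatters strongly in the blue, so the red/blue ratio is low;
    clouds scatter nearly neutrally, so the ratio approaches 1. The
    threshold is an illustrative value that would need calibration
    against real observations.
    """
    ratio = red.astype(float) / np.maximum(blue.astype(float), 1e-6)
    return ratio > threshold  # True = cloudy pixel

# Example: a synthetic 2x2 sky patch (top row clear, bottom row cloudy)
red = np.array([[30, 30], [200, 210]], dtype=np.uint8)
blue = np.array([[120, 115], [205, 215]], dtype=np.uint8)
mask = cloud_mask(red, blue)
cloud_fraction = mask.mean()
```

In practice the threshold varies with solar zenith angle and aerosol loading, which is one reason the paper pairs the visible camera with a thermal-infrared channel.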

  12. A user-friendly technical set-up for infrared photography of forensic findings.

    PubMed

    Rost, Thomas; Kalberer, Nicole; Scheurer, Eva

    2017-09-01

    Infrared photography is interesting for use in forensic science and forensic medicine since it reveals findings that are almost invisible to the human eye. Originally, infrared photography was made possible by placing an infrared transmission filter in front of the camera objective lens. However, this set-up has many drawbacks, such as the loss of the autofocus function, the need for an external infrared source, and long exposure times that make a tripod necessary. These limitations have so far prevented the routine application of infrared photography in forensics. In this study, a professional modification inside the digital camera body was evaluated with regard to camera handling and image quality. This permanent modification consisted of replacing the built-in infrared blocking filter with an infrared transmission filter of 700 nm or 830 nm, respectively. The application of this camera set-up to the photo-documentation of forensically relevant post-mortem findings was investigated using examples of trace evidence such as gunshot residues on the skin, external findings such as hematomas, and an exemplary internal finding, i.e., Wischnewski spots in a putrefied stomach. Scattered light created by indirect flash yielded a more uniform illumination of the object, and the 700 nm filter produced better pictures than the 830 nm filter. Compared with pictures taken under visible light, infrared photographs generally showed better contrast. This allowed more details to be discerned and revealed findings that were not otherwise visible, such as imprints on a fabric and tattoos in mummified skin. The permanent modification of a digital camera with a built-in 700 nm infrared transmission filter resulted in a user-friendly and efficient set-up suitable for daily forensic routine. The main advantages were a clear picture in the viewfinder, an autofocus usable over the whole range of infrared light, and the possibility of using short shutter speeds, which allows infrared pictures to be taken free-hand. The proposed set-up with a modified camera allows a user-friendly application of infrared photography in post-mortem settings. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Swirls and Shadows

    NASA Image and Video Library

    2015-05-04

    Saturn's surface is painted with swirls and shadows. Each swirl here is a weather system, reminding us of how dynamic Saturn's atmosphere is. Images taken in the near-infrared (like this one) permit us to peer through Saturn's methane haze layer to the clouds below. Scientists track the clouds and weather systems in the hopes of better understanding Saturn's complex atmosphere - and thus Earth's as well. This view looks toward the sunlit side of the rings from about 17 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on Feb. 8, 2015 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 794,000 miles (1.3 million kilometers) from Saturn. Image scale is 47 miles (76 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/pia18311
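    The image scale quoted in captions like the one above follows directly from the camera's per-pixel angular resolution (IFOV) multiplied by the range to the target. A quick sanity check, using an assumed wide-angle-camera IFOV of roughly 60 microradians (an approximate published value, not taken from this caption):

```python
def image_scale_km(distance_km, ifov_rad):
    """Ground-sample scale (km/pixel) for a camera at a given range.

    Small-angle approximation: scale = range * angular pixel size.
    """
    return distance_km * ifov_rad

# Distance from the caption above (~1.3 million km); IFOV is an assumption.
scale = image_scale_km(1.3e6, 60e-6)  # ~78 km/pixel, close to the quoted 76
```

The small discrepancy against the caption's 76 km/pixel is consistent with the assumed round-number IFOV and the rounded distance.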

  14. TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope

    NASA Astrophysics Data System (ADS)

    Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.

    Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512×512 pixel InSb Aladdin III Quadrant focal plane array (FPA) sensitive to photons in the 1-5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on calibration observations carried out during 2017 May 11-14 and 2017 October 7-31. After the preliminary characterization, the camera was released to the Indian and Belgian astronomical community for science observations in 2017 May. The camera offers a field of view (FoV) of ~86.5″ × 86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is capable of deep observations in the J, H and K bands comparable to other 4 m class telescopes available worldwide. Another highlight of this camera is its observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L band (nbL; λcen ~ 3.59 μm). Hence, the camera could be a good complementary instrument for observing the bright nbL-band sources that are saturated in the Spitzer Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1 band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.

  15. TIRCAM2: The TIFR near infrared imaging camera

    NASA Astrophysics Data System (ADS)

    Naik, M. B.; Ojha, D. K.; Ghosh, S. K.; Poojary, S. S.; Jadhav, R. B.; Meshram, G. S.; Sandimani, P. R.; Bhagat, S. B.; D'Costa, S. L. A.; Gharat, S. M.; Bakalkar, C. B.; Ninan, J. P.; Joshi, J. S.

    2012-12-01

    TIRCAM2 (TIFR near infrared imaging camera - II) is a closed-cycle cooled imager that has been developed by the Infrared Astronomy Group at the Tata Institute of Fundamental Research for observations in the near-infrared band of 1 to 3.7 μm with existing Indian telescopes. In this paper, we describe some of the technical details of TIRCAM2 and report its observing capabilities, measured performance and limiting magnitudes with the 2-m IUCAA Girawali telescope and the 1.2-m PRL Gurushikhar telescope. The main highlight is the camera's capability of observing in the nbL (3.59 μm) band, enabling our primary motivation of mapping polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm.

  16. The Eye of Saturn

    NASA Image and Video Library

    2014-08-04

    Like a giant eye for the giant planet, Saturn's great vortex at its north pole appears to stare back at Cassini as NASA's Cassini spacecraft stares at it. Measurements have sized the "eye" at a staggering 1,240 miles (2,000 kilometers) across with cloud speeds as fast as 330 miles per hour (150 meters per second). For color views of the eye and the surrounding region, see PIA14946 and PIA14944. The image was taken with the Cassini spacecraft narrow-angle camera on April 2, 2014 using a combination of spectral filters which preferentially admit wavelengths of near-infrared light centered at 748 nanometers. The view was obtained at a distance of approximately 1.4 million miles (2.2 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 43 degrees. Image scale is 8 miles (13 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18273

  17. Vortex and Rings

    NASA Image and Video Library

    2014-07-07

    NASA's Cassini spacecraft captures three magnificent sights at once: Saturn's north polar vortex and hexagon along with its expansive rings. The hexagon, which is wider than two Earths, owes its appearance to the jet stream that forms its perimeter. The jet stream forms a six-lobed, stationary wave which wraps around the north polar regions at a latitude of roughly 77 degrees North. This view looks toward the sunlit side of the rings from about 37 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on April 2, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 1.4 million miles (2.2 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 43 degrees. Image scale is 81 miles (131 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18274

  18. The W. M. Keck Observatory Infrared Vortex Coronagraph and a First Image of HIP 79124 B

    NASA Astrophysics Data System (ADS)

    Serabyn, E.; Huby, E.; Matthews, K.; Mawet, D.; Absil, O.; Femenia, B.; Wizinowich, P.; Karlsson, M.; Bottom, M.; Campbell, R.; Carlomagno, B.; Defrère, D.; Delacroix, C.; Forsberg, P.; Gomez Gonzalez, C.; Habraken, S.; Jolivet, A.; Liewer, K.; Lilley, S.; Piron, P.; Reggiani, M.; Surdej, J.; Tran, H.; Vargas Catalán, E.; Wertz, O.

    2017-01-01

    An optical vortex coronagraph has been implemented within the NIRC2 camera on the Keck II telescope and used to carry out on-sky tests and observations. The development of this new L′-band observational mode is described, and an initial demonstration of the new capability is presented: a resolved image of the low-mass companion to HIP 79124, which had previously been detected by means of interferometry. With HIP 79124 B at a projected separation of 186.5 mas, both the small inner working angle of the vortex coronagraph and the related imaging improvements were crucial in imaging this close companion directly. Due to higher Strehl ratios and more relaxed contrasts in L′ band versus H band, this new coronagraphic capability will enable high-contrast, small-angle observations of nearby young exoplanets and disks on a par with those of shorter-wavelength extreme adaptive optics coronagraphs.

  19. Tomographic reconstruction of an aerosol plume using passive multiangle observations from the MISR satellite instrument

    NASA Astrophysics Data System (ADS)

    Garay, Michael J.; Davis, Anthony B.; Diner, David J.

    2016-12-01

    We present initial results using computed tomography to reconstruct the three-dimensional structure of an aerosol plume from passive observations made by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. MISR views the Earth from nine different angles at four visible and near-infrared wavelengths. Adopting the 672 nm channel, we treat each view as an independent measure of aerosol optical thickness along the line of sight at 1.1 km resolution. A smoke plume over dark water is selected as it provides a more tractable lower boundary condition for the retrieval. A tomographic algorithm is used to reconstruct the horizontal and vertical aerosol extinction field for one along-track slice from the path of all camera rays passing through a regular grid. The results compare well with ground-based lidar observations from a nearby Micropulse Lidar Network site.
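    The reconstruction step described above can be illustrated with a toy algebraic reconstruction technique (ART/Kaczmarz) solver: each camera ray contributes one linear equation (its measured optical thickness equals the path-length-weighted sum of extinction in the grid cells it crosses), and the solver iterates toward a consistent extinction field. The grid size, ray geometry, and values below are illustrative assumptions, not MISR's actual retrieval algorithm:

```python
import numpy as np

def art_reconstruct(A, b, n_iter=200, relax=0.5):
    """Kaczmarz / ART: solve A @ x ≈ b for a non-negative extinction field x.

    A[i, j] = path length of ray i through grid cell j
    b[i]    = measured optical thickness along ray i
    Each sweep projects x onto the hyperplane of one ray equation at a time.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            a = A[i]
            norm = a @ a
            if norm > 0:
                x += relax * (b[i] - a @ x) / norm * a
        x = np.maximum(x, 0.0)  # extinction cannot be negative
    return x

# Toy 2-cell slice viewed by three rays at different angles
A = np.array([[1.0, 0.0],   # ray through cell 0 only
              [0.0, 1.0],   # ray through cell 1 only
              [1.0, 1.0]])  # oblique ray through both cells
x_true = np.array([0.2, 0.5])
b = A @ x_true              # synthetic optical-thickness measurements
x = art_reconstruct(A, b)   # recovers x_true to high accuracy
```

With only nine MISR view angles the real problem is far more underdetermined than this toy case, which is why the paper restricts itself to one along-track slice over dark water.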

  20. A Search for Faint, Diffuse Halo Emission in Edge-On Galaxies with Spitzer/IRAC

    NASA Astrophysics Data System (ADS)

    Ashby, Matthew; Arendt, R. G.; Pipher, J. L.; Forrest, W. J.; Marengo, M.; Barmby, P.; Willner, S. P.; Stauffer, J. R.; Fazio, G. G.

    2006-12-01

    We present deep infrared mosaics of the nearby edge-on spiral galaxies NGC 891, 4244, 4565, and 5907. These data were acquired at 3.6, 4.5, 5.8, and 8.0 microns using the Infrared Array Camera aboard Spitzer as part of GTO program number 3. This effort is designed to detect the putative faint, diffuse emission from halos and thick disks of spiral galaxies in the near-mid infrared under the thermally stable, low-background conditions of space. These conditions in combination with the advantageous viewing angles presented by these well-known edge-on spirals provide arguably the best opportunity to characterize the halo/thick disk components of such galaxies in the infrared. In this contribution we describe our observations, data reduction techniques, corrections for artifacts in the data, and the modeling approach we applied to analyze this unique dataset. This work is based in part on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory, California Institute of Technology under a contract with NASA. Support for this work was provided by NASA through an award issued by JPL/Caltech.

  1. Visual field information in Nap-of-the-Earth flight by teleoperated Helmet-Mounted displays

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Kohn, S.; Merhav, S. J.

    1991-01-01

    The human ability to derive Control-Oriented Visual Field Information from teleoperated Helmet-Mounted Displays in Nap-of-the-Earth flight is investigated. The visual field with these types of displays originates from a Forward-Looking Infrared camera, gimbal-mounted at the front of the aircraft and slaved to the pilot's line of sight to obtain wide-angle visual coverage. Although these displays have proved effective in Apache and Cobra helicopter night operations, they demand very high pilot proficiency and impose a heavy workload. Experimental work presented in the paper shows that part of the difficulty encountered in vehicular control with these displays can be attributed to the narrow viewing aperture and to phase lags in the head/camera slaving system. Both shortcomings impair visuo-vestibular coordination when voluntary head rotation is present. This can cause errors in estimating the Control-Oriented Visual Field Information vital to vehicular control, such as the vehicle yaw rate or the anticipated flight path, and may even lead to visuo-vestibular conflicts (motion sickness). Since, under these conditions, the pilot tends to minimize head rotation, the full wide-angle coverage of the Helmet-Mounted Display provided by the line-of-sight slaving system is not always fully utilized.

  2. Dunelands of Titan

    NASA Image and Video Library

    2015-11-02

    Saturn's frigid moon Titan has some characteristics that are oddly similar to Earth, but still slightly alien. It has clouds, rain and lakes (made of methane and ethane), a solid surface (made of water ice), and vast dune fields (filled with hydrocarbon sands). The dark, H-shaped area seen here contains two of the dune-filled regions, Fensal (in the north) and Aztlan (to the south). Cassini's cameras have frequently monitored the surface of Titan (3200 miles or 5150 kilometers across) to look for changes in its features over the course of the mission. Any changes would help scientists better understand different phenomena like winds and dune formation on this strangely earth-like moon. For a closer view of Fensal-Aztlan, see PIA07732 . This view looks toward the leading side of Titan. North on Titan is up. The image was taken with the Cassini spacecraft narrow-angle camera on July 25, 2015 using a spectral filter sensitive to wavelengths of near-infrared light centered at 938 nanometers. The view was obtained at a distance of approximately 450,000 miles (730,000 kilometers) from Titan and at a Sun-Titan-spacecraft, or phase, angle of 32 degrees. Image scale is 3 miles (4 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18341

  3. Saskatchewan and Manitoba

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Surface brightness contrasts accentuated by a thin layer of snow enable a network of rivers, roads, and farmland boundaries to stand out clearly in these MISR images of southeastern Saskatchewan and southwestern Manitoba. The lefthand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The righthand image is a multi-angle false-color view made from the red band data of the 60-degree aftward camera, the nadir camera, and the 60-degree forward camera. In each image, the selected channels are displayed as red, green, and blue, respectively. The data were acquired April 17, 2001 during Terra orbit 7083, and cover an area measuring about 285 kilometers x 400 kilometers. North is at the top.

    The junction of the Assiniboine and Qu'Apelle Rivers in the bottom part of the images is just east of the Saskatchewan-Manitoba border. During the growing season, the rich, fertile soils in this area support numerous fields of wheat, canola, barley, flaxseed, and rye. Beef cattle are raised in fenced pastures. To the north, the terrain becomes more rocky and forested. Many frozen lakes are visible as white patches in the top right. The narrow linear, north-south trending patterns about a third of the way down from the upper right corner are snow-filled depressions alternating with vegetated ridges, most probably carved by glacial flow.

    In the lefthand image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the righthand image, several forested regions are clearly visible in green hues. Since this is a multi-angle composite, the green arises not from the color of the leaves but from the architecture of the surface cover. Progressing southeastward along the Manitoba Escarpment, the forested areas include the Pasquia Hills, the Porcupine Hills, Duck Mountain Provincial Park, and Riding Mountain National Park. The forests are brighter in the nadir than at the oblique angles, probably because more of the snow-covered surface is visible in the gaps between the trees. In contrast, the valley between the Pasquia and Porcupine Hills near the top of the images appears bright red in the lefthand image (indicating high vegetation abundance) but shows a mauve color in the multi-angle view. This means that it is darker in the nadir than at the oblique angles. Examination of imagery acquired after the snow has melted should establish whether this difference is related to the amount of snow on the surface or is indicative of a different type of vegetation structure.

    Saskatchewan and Manitoba are believed to derive their names from the Cree words for the winding and swift-flowing waters of the Saskatchewan River and for a narrows on Lake Manitoba where the roaring sound of wind and water evoked the voice of the Great Spirit. They are two of Canada's Prairie Provinces; Alberta is the third.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  4. Surging Across the Rings

    NASA Image and Video Library

    2007-07-26

    A surge in brightness appears on the rings directly opposite the Sun from the Cassini spacecraft. This "opposition surge" travels across the rings as the spacecraft watches. This view looks toward the sunlit side of the rings from about 9 degrees below the ringplane. The image was taken in visible light with the Cassini spacecraft wide-angle camera on June 12, 2007 using a spectral filter sensitive to wavelengths of infrared light centered at 853 nanometers. The view was acquired at a distance of approximately 524,374 kilometers (325,830 miles) from Saturn. Image scale is 31 kilometers (19 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA08992

  5. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. Low-cost thermo-electric infrared FPAs and their automotive applications

    NASA Astrophysics Data System (ADS)

    Hirota, Masaki; Ohta, Yoshimi; Fukuyama, Yasuhiro

    2008-04-01

    This paper describes three low-cost infrared focal plane arrays (FPAs), with 1,536, 2,304, and 10,800 elements, along with experimental vehicle systems. They have low-cost potential because each element consists of p-n polysilicon thermocouples, which allows the use of low-cost ultra-fine microfabrication technology commonly employed in conventional semiconductor manufacturing processes. To increase the responsivity of the FPA, we have developed a precisely patterned Au-black absorber with high infrared absorptivity of more than 90%. The 2,304-element FPA achieved a high responsivity of 4,300 V/W. To reduce packaging cost, we developed a vacuum-sealed package integrated with a molded ZnS lens. The camera intended for temperature measurement of the passenger cabin is a compact, lightweight device that measures 45 x 45 x 30 mm and weighs 190 g. It achieves a noise equivalent temperature difference (NETD) of less than 0.7°C from 0 to 40°C. In this paper, we also present several experimental systems that use infrared cameras. One is a blind-spot pedestrian warning system that employs four infrared cameras. It detects the infrared radiation emitted from a human body and alerts the driver when a pedestrian is in a blind spot; the system can also prevent the vehicle from moving toward the pedestrian. Another system uses a visible-light camera and infrared sensors to detect a pedestrian in a rear blind spot and alert the driver. The third system is a new type of human-machine interface that enables the driver to control the car's audio system without letting go of the steering wheel. Uncooled infrared cameras are still costly, which limits their automotive use to high-end luxury cars at present. To promote widespread use of IR imaging sensors on vehicles, their cost must be reduced further.

  7. Types of rocks exposed at the Viking landing sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guinness, E.; Arvidson, R.; Dale-Bannister, M.

    1985-01-01

    Spectral estimates derived from Viking Lander multispectral images have been used to investigate the types of rocks exposed at both landing sites, and to infer whether the rocks are primary igneous rocks or weathering products. These analyses should aid interpretations of spectra to be returned from the Visual and Infrared Mapping Spectrometer on the upcoming Mars Observer mission. A series of gray surfaces on the Landers were used to check the accuracy of the camera preflight calibrations. Results indicate that the preflight calibrations for the three color channels are probably correct for all cameras but camera 2 on Lander 1. The calibration for the infrared channels appears to have changed, although the cause is not known. For this paper, only the color channels were used to derive data for rocks. Rocks at both sites exhibit a variety of reflectance values. For example, reflectance estimates for two rocks in the blue (0.4-0.5 microns), green (0.5-0.6 microns), and red (0.6-0.75 microns) channels are 0.16, 0.23, and 0.33 and 0.12, 0.19, and 0.37 at a phase angle of 20 degrees. These values have been compared with laboratory reflectance spectra of analog materials and telescopic spectra of Mars, both convolved to the Lander bandpasses. Lander values for some rocks are similar to Earth-based observations of martian dark regions and to certain mafic igneous rocks thinly coated with amorphous, ferric-oxide-rich weathering products. These results are consistent with previous interpretations.

  8. Observation of runaway electrons by infrared camera in J-TEXT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tong, R. H.; Chen, Z. Y., E-mail: zychen@hust.edu.cn; Zhang, M.

    2016-11-15

    When the energy of confined runaway electrons approaches several tens of MeV, the runaway electrons emit synchrotron radiation at infrared wavelengths. An infrared camera working in the 3-5 μm wavelength range has been developed to study runaway electrons in the Joint Texas Experimental Tokamak (J-TEXT). The camera is located in the equatorial plane, looking tangentially into the direction of electron approach. The runaway electron beam inside the plasma has been observed during the flattop phase. With the camera's fast acquisition, the behavior of the runaway electron beam has been observed directly during the runaway current plateau following massive gas injection triggered disruptions.

  9. InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications

    NASA Technical Reports Server (NTRS)

    Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon

    1996-01-01

    In 1994, Inframetrics introduced the InfraCAM(TM), a high-resolution hand-held thermal imager. As the world's smallest, lightest, and lowest-power PtSi-based infrared camera, the InfraCAM is ideal for a wide range of industrial, nondestructive testing, surveillance, and scientific applications. In addition to numerous commercial applications, its light weight and low power consumption make the InfraCAM extremely valuable for adaptation to spaceborne applications. Consequently, the InfraCAM was selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) spaceborne experiment. In this experiment, a solid fuel is ignited in a low-gravity environment, and the combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground-based testing in preparation for space flight in the fall of 1995.

  10. Sniper detection using infrared camera: technical possibilities and limitations

    NASA Astrophysics Data System (ADS)

    Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.

    2010-04-01

    The paper discusses the technical possibilities of building an effective sniper-detection system using infrared cameras. The phenomena that make it possible to detect sniper activity in the infrared spectrum are described, and the physical limitations are analyzed. Both cooled and uncooled detectors were considered, and three phases of sniper activity were taken into account: before, during, and after the shot. On the basis of experimental data, the target-defining parameters essential for assessing the capability of an infrared camera to detect sniper activity were determined; the sniper's body and the muzzle flash were analyzed as targets. Detection ranges were simulated for an assumed sniper-detection scenario. An infrared sniper detection system capable of fulfilling the requirements is discussed, and the results of the analysis and simulations are presented.

  11. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  12. Quasi-stiffness of the knee joint in flexion and extension during the golf swing.

    PubMed

    Choi, Ahnryul; Sim, Taeyong; Mun, Joung Hwan

    2015-01-01

    Biomechanical understanding of the knee joint during a golf swing is essential to improve performance and prevent injury. In this study, we quantified the flexion/extension angle and moment as the primary knee movement, and evaluated quasi-stiffness represented by moment-angle coupling in the knee joint. Eighteen skilled and 23 unskilled golfers participated in this study. Six infrared cameras and two force platforms were used to record a swing motion. The anatomical angle and moment were calculated from kinematic and kinetic models, and quasi-stiffness of the knee joint was determined as an instantaneous slope of moment-angle curves. The lead knee of the skilled group had decreased resistance duration compared with the unskilled group (P < 0.05), and the resistance duration of the lead knee was lower than that of the trail knee in the skilled group (P < 0.01). The lead knee of the skilled golfers had greater flexible excursion duration than the trail knee of the skilled golfers, and of both the lead and trail knees of the unskilled golfers. These results provide critical information for preventing knee injuries during a golf swing and developing rehabilitation strategies following surgery.

  13. Thermal Scanning of Dental Pulp Chamber by Thermocouple System and Infrared Camera during Photo Curing of Resin Composites.

    PubMed

    Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda

    2018-01-01

    Due to the thermal hazard during composite restorations, this study was designed to scan the pulp temperature with a thermocouple and an infrared camera during photo-polymerization of different composites. A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1-mm increments of each composite (four composite types were incorporated) were inserted and photo-polymerized using either LED or QTH systems for 60 s while the temperature was recorded at 10-s intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. Thereafter, the data were analyzed by repeated-measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α = 0.05). The pulp temperature increased significantly during photo-polymerization (P < 0.001), while there was no significant difference between the results recorded by the thermocouple and those recorded by the infrared camera (P > 0.05). Moreover, different composite materials and LCUs led to similar outcomes (P > 0.05). Although various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record parallel measurements of dental pulp temperature.

  14. Power estimation of martial arts movement using 3D motion capture camera

    NASA Astrophysics Data System (ADS)

    Azraai, Nur Zaidi; Awang Soh, Ahmad Afiq Sabqi; Mat Jafri, Mohd Zubir

    2017-06-01

    Motion capture (MOCAP) cameras have been widely used in areas such as biomechanics, physiology, animation, and the arts. This project approaches the problem through classical mechanics and extends the application of MOCAP to sports. Most researchers use a force plate, but a force plate can measure only the force of impact; we are keen to observe the kinematics of the movement as well. Martial arts use more than one part of the human body, and for this project the martial art `Silat' was chosen because of its wide practice in Malaysia. Two performers were selected, one experienced in `Silat' practice and one with no experience at all, so that the energy and force generated by the two could be compared. Each performer delivered punches with the same posture; two types of punching moves were selected. Before measurement, a calibration was performed with a T-stick fitted with markers so that the software knew the area covered by the cameras, reducing error during analysis. A punching bag of mass 60 kg was hung on an iron bar as a target and used to determine the impact force of each punch; the bag was also fitted with an optical marker so that its movement after impact could be observed. Eight cameras were used, two on each wall at different angles in a rectangular room of 270 ft2, covering approximately 50 ft2. Only a small area was covered so that less noise would be detected, making the measurement more accurate. Markers were placed on the segments of the entire arm being observed and measured. The passive markers used in this project reflect the infrared light generated by the cameras back to the camera sensors, so that marker positions can be detected and shown in the software. Using many cameras increases the precision and accuracy of marker tracking. Performer movement was recorded and analyzed with the Cortex motion analysis software, in which the velocity and acceleration of a performer's movement can be measured. With a classical mechanics approach, we estimated the power and impact force and found that the experienced performer produces higher power and impact force than the inexperienced performer.
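    The classical-mechanics estimate described above can be sketched as finite differences over a marker trajectory: velocity from successive positions, acceleration from successive velocities, then force F = m·a for an assumed effective striking mass and power P = F·v. The sample rate, effective mass, and trajectory below are hypothetical illustrative values, not data from the study.

```python
# Hedged sketch of the classical-mechanics estimate: differentiate a marker
# trajectory to get velocity and acceleration, then F = m*a and P = F*v.
# The 100 Hz capture rate, 3 kg effective striking mass, and positions are
# hypothetical illustrative values, not measurements from the study.

def derivative(samples, dt):
    """Central-difference derivative of a list of scalar samples."""
    return [(samples[i + 1] - samples[i - 1]) / (2 * dt)
            for i in range(1, len(samples) - 1)]

DT = 0.01    # s, hypothetical 100 Hz capture rate
MASS = 3.0   # kg, hypothetical effective fist/forearm mass

# Hypothetical 1-D marker positions (m) along the punch direction.
x = [0.00, 0.02, 0.06, 0.12, 0.20, 0.30, 0.42, 0.56]

v = derivative(x, DT)        # m/s, one sample shorter at each end
a = derivative(v, DT)        # m/s^2
force = [MASS * ai for ai in a]                      # N
power = [fi * vi for fi, vi in zip(force, v[1:-1])]  # W, aligned with a

print(f"peak speed {max(v):.1f} m/s, "
      f"peak force {max(force):.0f} N, peak power {max(power):.0f} W")
```

    A real pipeline would low-pass filter the marker positions before differentiating, since finite differences amplify measurement noise.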

  15. Variation in detection among passive infrared triggered-cameras used in wildlife research

    USGS Publications Warehouse

    Damm, Philip E.; Grand, James B.; Barnett, Steven W.

    2010-01-01

    Precise and accurate estimates of demographics such as age structure, productivity, and density are necessary in determining habitat and harvest management strategies for wildlife populations. Surveys using automated cameras are becoming an increasingly popular tool for estimating these parameters. However, most camera studies fail to incorporate detection probabilities, leading to parameter underestimation. The objective of this study was to determine the sources of heterogeneity in detection for trail cameras that incorporate a passive infrared (PIR) triggering system sensitive to heat and motion. Images were collected at four baited sites within the Conecuh National Forest, Alabama, using three cameras at each site operating continuously over the same seven-day period. Detection was estimated for four groups of animals based on taxonomic group and body size. Our hypotheses of detection considered variation among bait sites and cameras. The best model (w=0.99) estimated different rates of detection for each camera in addition to different detection rates for four animal groupings. Factors that explain this variability might include poor manufacturing tolerances, variation in PIR sensitivity, animal behavior, and species-specific infrared radiation. Population surveys using trail cameras with PIR systems must incorporate detection rates for individual cameras. Incorporating time-lapse triggering systems into survey designs should eliminate issues associated with PIR systems.
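    The study's conclusion that surveys must incorporate camera-specific detection rates can be illustrated with a simple per-camera binomial estimate: each camera's detection probability is the fraction of known visitation events it actually triggered on. The counts below are hypothetical, not data from the study.

```python
# Hedged sketch: camera-specific detection probability as a binomial MLE,
# p_hat = detections / events, with a normal-approximation standard error.
# Camera names and counts are hypothetical illustrative values.

from math import sqrt

def detection_rate(detections, events):
    """Return (p_hat, standard error) for one camera."""
    p = detections / events
    se = sqrt(p * (1 - p) / events)
    return p, se

# Hypothetical counts: images triggered vs. known visitation events.
cameras = {"cam1": (45, 60), "cam2": (30, 60), "cam3": (51, 60)}

for name, (hits, trials) in cameras.items():
    p, se = detection_rate(hits, trials)
    print(f"{name}: p = {p:.2f} +/- {se:.2f}")
```

    The spread between cameras in such output is exactly the per-camera heterogeneity the best-supported model in the study describes.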

  16. Visualising the invisible

    NASA Astrophysics Data System (ADS)

    Laychak, M. B.

    2008-06-01

    In addition to the optical camera Megacam, the Canada-France-Hawaii Telescope operates a wide-field infrared camera, Wircam, and a spectrograph/spectropolarimeter, Espadons. When these instruments were commissioned, the challenge arose to create educational outreach programmes incorporating the concepts of infrared astronomy and spectroscopy. We integrated spectroscopy into discussions of extrasolar planets and the search for life, two topics routinely requested by teachers for classroom talks. Making the infrared accessible to students provided a unique challenge, one that we met through the implementation and use of webcams modified for infrared use.

  17. "Wow, It Turned out Red! First, a Little Yellow, and Then Red!" 1st-Graders' Work with an Infrared Camera

    ERIC Educational Resources Information Center

    Jeppsson, Fredrik; Frejd, Johanna; Lundmark, Frida

    2017-01-01

    This study focuses on investigating how students make use of their bodily experiences in combination with infrared (IR) cameras, as a way to make meaning in learning about heat, temperature, and friction. A class of 20 primary students (age 7-8 years), divided into three groups, took part in three IR camera laboratory experiments. The qualitative…

  18. Adaptive illumination source for multispectral vision system applied to material discrimination

    NASA Astrophysics Data System (ADS)

    Conde, Olga M.; Cobo, Adolfo; Cantero, Paulino; Conde, David; Mirapeix, Jesús; Cubillas, Ana M.; López-Higuera, José M.

    2008-04-01

    A multispectral system based on a monochrome camera and an adaptive illumination source is presented in this paper. Its preliminary application is focused on material discrimination for the food and beverage industries, where monochrome, color, and infrared imaging have been successfully applied to this task. This work proposes a different approach, in which the wavelengths relevant to the required discrimination task are selected in advance using a Sequential Forward Floating Selection (SFFS) algorithm. A light source based on Light Emitting Diodes (LEDs) at these wavelengths is then used to sequentially illuminate the material under analysis, and the resulting images are captured by a CCD camera with spectral response across the entire range of the selected wavelengths. Finally, the resulting multispectral planes are processed using a Spectral Angle Mapping (SAM) algorithm, whose output is the desired material classification. Among other advantages, this approach of controlled and specific illumination produces multispectral imaging with a simple monochrome camera, and cold illumination restricted to specific relevant wavelengths, which is desirable for the food and beverage industry. The proposed system has been tested successfully for the automatic detection of foreign objects in the tobacco processing industry.
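    The SAM classification step mentioned above compares each pixel's spectrum to reference spectra by the angle between them as vectors, assigning the pixel to the closest reference; illumination-intensity changes scale a spectrum but leave the angle unchanged, which is why SAM suits this kind of system. The band count, reflectance values, and class names below are hypothetical.

```python
# Hedged sketch of Spectral Angle Mapping (SAM): classify a pixel by the
# angle between its spectrum and each reference spectrum. The reflectance
# values and class names are hypothetical illustrative values.

from math import acos, sqrt

def spectral_angle(a, b):
    """Angle in radians between two spectra given as equal-length lists."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))

def classify(pixel, references):
    """Return the reference-class name with the smallest spectral angle."""
    return min(references, key=lambda name: spectral_angle(pixel, references[name]))

# Hypothetical reflectances at three LED-selected wavelengths.
refs = {"tobacco": [0.30, 0.55, 0.40], "foreign": [0.70, 0.20, 0.10]}
pixel = [0.28, 0.50, 0.38]

print(classify(pixel, refs))  # prints "tobacco"
```

    The clamp inside `acos` guards against floating-point values marginally outside [-1, 1] for near-parallel spectra.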

  19. Decoupling Intensity Radiated by the Emitter in Distance Estimation from Camera to IR Emitter

    PubMed Central

    Cano-García, Angel E.; Galilea, José Luis Lázaro; Fernández, Pedro; Infante, Arturo Luis; Pompa-Chacón, Yamilet; Vázquez, Carlos Andrés Luna

    2013-01-01

    Various models using a radiometric approach have been proposed to solve the problem of estimating the distance between a camera and an infrared emitter diode (IRED). They depend directly on the radiant intensity of the emitter, set by the IRED bias current. As is known, this current drifts with temperature, and the drift transfers to the distance estimation method. This paper proposes an alternative approach that removes the temperature drift from the distance estimation method by eliminating the dependence on radiant intensity. The main aim was to use the relative accumulated energy together with other defined models, such as the zeroth-frequency component of the FFT of the IRED image and the standard deviation of pixel gray-level intensities in the region of interest containing the IRED image. By using these models, an expression free of IRED radiant intensity was obtained. Furthermore, the final model permitted simultaneous estimation of the distance between the IRED and the camera and of the IRED orientation angle. The alternative presented in this paper gave a 3% maximum relative error over a range of distances up to 3 m. PMID:23727954

  20. IrisDenseNet: Robust Iris Segmentation Using Densely Connected Fully Convolutional Networks in the Images by Visible Light and Near-Infrared Light Camera Sensors

    PubMed Central

    Arsalan, Muhammad; Naqvi, Rizwan Ali; Kim, Dong Seop; Nguyen, Phong Ha; Owais, Muhammad; Park, Kang Ryoung

    2018-01-01

    The recent advancements in computer vision have opened new horizons for deploying biometric recognition algorithms in mobile and handheld devices. Similarly, accurate iris recognition is now much needed in unconstrained scenarios. These environments make the acquired iris image exhibit occlusion, low resolution, blur, unusual glint, ghost effect, and off-angles. The prevailing segmentation algorithms cannot cope with these constraints. In addition, when near-infrared (NIR) light is unavailable, iris segmentation in visible-light environments is challenging owing to visible-light noise. Deep learning with convolutional neural networks (CNN) has brought a considerable breakthrough in various applications. To address the iris segmentation issues in challenging situations with visible light and near-infrared light camera sensors, this paper proposes a densely connected fully convolutional network (IrisDenseNet), which can determine the true iris boundary even in inferior-quality images by using better information gradient flow between the dense blocks. In the experiments conducted, five datasets of visible light and NIR environments were used. For the visible-light environment, the noisy iris challenge evaluation part-II (NICE-II, selected from the UBIRIS.v2 database) and mobile iris challenge evaluation (MICHE-I) datasets were used. For the NIR environment, the Institute of Automation, Chinese Academy of Sciences (CASIA) v4.0 interval, CASIA v4.0 distance, and IIT Delhi v1.0 iris datasets were used. Experimental results showed the optimal segmentation of the proposed IrisDenseNet and its excellent performance over existing algorithms for all five datasets. PMID:29748495

  1. Hayabusa2 Mission Overview

    NASA Astrophysics Data System (ADS)

    Watanabe, Sei-ichiro; Tsuda, Yuichi; Yoshikawa, Makoto; Tanaka, Satoshi; Saiki, Takanao; Nakazawa, Satoru

    2017-07-01

    The Hayabusa2 mission journeys to the C-type near-Earth asteroid (162173) Ryugu (1999 JU3) to observe and explore the 900 m-sized object, as well as return samples collected from the surface layer. The Hayabusa2 spacecraft developed by the Japan Aerospace Exploration Agency (JAXA) was successfully launched on December 3, 2014 by an H-IIA launch vehicle and performed an Earth swing-by on December 3, 2015 to set it on a course toward its target Ryugu. Hayabusa2 aims at increasing our knowledge of the early history and transfer processes of the solar system through deciphering memories recorded on Ryugu, especially about the origin of water and organic materials transferred to the Earth's region. Hayabusa2 carries four remote-sensing instruments: a telescopic optical camera with seven colors (ONC-T), a laser altimeter (LIDAR), a near-infrared spectrometer covering the 3-μm absorption band (NIRS3), and a thermal infrared imager (TIR). It also has three small rovers of MINERVA-II and a small lander, MASCOT (Mobile Asteroid Surface Scout), developed by the German Aerospace Center (DLR) in cooperation with the French space agency CNES. MASCOT has a wide-angle imager (MasCam), a 6-band thermal radiometer (MARA), a 3-axis magnetometer (MasMag), and a hyperspectral infrared microscope (MicrOmega). Further, Hayabusa2 has a sampling device (SMP) and impact experiment devices, which consist of a small carry-on impactor (SCI) and a deployable camera (DCAM3). The interdisciplinary research using the data from these onboard and lander instruments and the analyses of returned samples are the key to the success of the mission.

  2. IrisDenseNet: Robust Iris Segmentation Using Densely Connected Fully Convolutional Networks in the Images by Visible Light and Near-Infrared Light Camera Sensors.

    PubMed

    Arsalan, Muhammad; Naqvi, Rizwan Ali; Kim, Dong Seop; Nguyen, Phong Ha; Owais, Muhammad; Park, Kang Ryoung

    2018-05-10

    The recent advancements in computer vision have opened new horizons for deploying biometric recognition algorithms in mobile and handheld devices. Similarly, accurate iris recognition is now much needed in unconstrained scenarios. These environments make the acquired iris image exhibit occlusion, low resolution, blur, unusual glint, ghost effect, and off-angles. The prevailing segmentation algorithms cannot cope with these constraints. In addition, when near-infrared (NIR) light is unavailable, iris segmentation in visible-light environments is challenging owing to visible-light noise. Deep learning with convolutional neural networks (CNN) has brought a considerable breakthrough in various applications. To address the iris segmentation issues in challenging situations with visible light and near-infrared light camera sensors, this paper proposes a densely connected fully convolutional network (IrisDenseNet), which can determine the true iris boundary even in inferior-quality images by using better information gradient flow between the dense blocks. In the experiments conducted, five datasets of visible light and NIR environments were used. For the visible-light environment, the noisy iris challenge evaluation part-II (NICE-II, selected from the UBIRIS.v2 database) and mobile iris challenge evaluation (MICHE-I) datasets were used. For the NIR environment, the Institute of Automation, Chinese Academy of Sciences (CASIA) v4.0 interval, CASIA v4.0 distance, and IIT Delhi v1.0 iris datasets were used. Experimental results showed the optimal segmentation of the proposed IrisDenseNet and its excellent performance over existing algorithms for all five datasets.

  3. Determination of spatial distribution of increase in bone temperature during drilling by infrared thermography: preliminary report.

    PubMed

    Augustin, Goran; Davila, Slavko; Udiljak, Toma; Vedrina, Denis Stjepan; Bagatin, Dinko

    2009-05-01

    During drilling of bone, the temperature can rise above 47 degrees C and cause irreversible osteonecrosis. Previously, the spatial distribution of the temperature increase in bone could only be inferred from several thermocouples placed around the drilling site. The aim of this study was to use an infrared thermographic camera to determine the spatial distribution of the temperature increase in bone during drilling. One combination of drill parameters was used (drill diameter 4.5 mm; drill speed 1,820 rpm; feed rate 84 mm/min; drill point angle 100 degrees) without external irrigation, at a room temperature of 26 degrees C. The temperature increase during drilling was analyzed with the infrared thermographic camera in two perpendicular planes. Thermographic pictures were taken before drilling, during drilling (with measurement of maximal temperature values), and after extraction of the drill from the bone. The thermograms show that the temperature increase has an irregular shape, with the maximal increase along the cortical bone, the most compact component of the bone. The width of the area above the critical temperature is three times the width of the cortical bone. Viewed from the front, the distribution of the temperature increase follows the form of the cortical bone (a segment of a ring), which offers the highest resistance to drilling and hence the greatest friction. Thermography showed that the temperature increase spreads through the cortical bone, the most compact and dense part, which generates the most frictional heat during drilling; the medullary cavity, because of its gelatinous structure, contributes only to thermal dissipation.
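    One advantage of thermography over point thermocouples, as described above, is that the whole region above the 47 degrees C osteonecrosis threshold can be delineated at once. A minimal sketch of that thresholding step on a thermogram, using a hypothetical temperature grid rather than data from the study:

```python
# Hedged sketch: find the pixels of a thermogram above the 47 C
# osteonecrosis threshold cited in the abstract. The temperature grid is
# a hypothetical illustrative 2-D thermogram, not data from the study.

THRESHOLD_C = 47.0

thermogram = [
    [36.0, 38.5, 41.0, 38.2, 36.1],
    [39.0, 46.5, 52.0, 47.5, 38.8],
    [40.2, 48.9, 55.3, 49.1, 39.5],
    [38.4, 45.0, 50.2, 46.0, 37.9],
]

# Coordinates of every pixel exceeding the critical temperature.
hot = [(r, c) for r, row in enumerate(thermogram)
       for c, t in enumerate(row) if t > THRESHOLD_C]

total = sum(len(row) for row in thermogram)
print(f"{len(hot)} of {total} pixels exceed {THRESHOLD_C} C")
```

    Given the camera's pixel pitch and working distance, the count of hot pixels converts directly into the physical area at risk.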

  4. A single camera photogrammetry system for multi-angle fast localization of EEG electrodes.

    PubMed

    Qian, Shuo; Sheng, Yang

    2011-11-01

    Photogrammetry has become an effective method for determining electroencephalography (EEG) electrode positions in three dimensions (3D). Capturing multi-angle images of the electrodes on the head is a fundamental objective in the design of a photogrammetry system for EEG localization. Methods in previous studies are all based on either a rotating camera or multiple cameras, which are time-consuming or not cost-effective. This study presents a novel photogrammetry system that can acquire multi-angle head images simultaneously from a single camera position. By aligning two planar mirrors at an angle of 51.4°, seven views of the head with 25 electrodes are captured simultaneously by a digital camera placed in front of them. A complete set of algorithms for electrode recognition, matching, and 3D reconstruction was developed. The whole localization procedure takes about 3 min, and the camera calibration computation takes about 1 min after measurement of the calibration points. The positioning accuracy, with a maximum error of 1.19 mm, is acceptable. Experimental results demonstrate that the proposed system provides a fast and cost-effective method for EEG positioning.
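    The seven simultaneous views follow from standard kaleidoscope geometry, under the usual assumption that two planar mirrors meeting at angle θ produce 360°/θ − 1 reflected images, to which the direct view is added. A quick check of the 51.4° figure under that assumption:

```python
# Hedged sketch: two planar mirrors at angle theta form a kaleidoscope with
# 360/theta - 1 reflected images; adding the direct view gives 360/theta
# views in total. This is a standard geometric rule, used here only to
# sanity-check the seven views quoted for theta = 51.4 degrees.

def total_views(theta_deg):
    reflections = round(360.0 / theta_deg - 1)
    return reflections + 1  # reflected images plus the direct view

print(total_views(51.4))  # prints 7
```

    Note that 51.4° is 360°/7 to one decimal place, which is presumably why that mirror angle was chosen.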

  5. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  6. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  7. Thermal-depth matching in dynamic scene based on affine projection and feature registration

    NASA Astrophysics Data System (ADS)

    Wang, Hongyu; Jia, Tong; Wu, Chengdong; Li, Yongqiang

    2018-03-01

    This paper studies the construction of a 3D temperature distribution reconstruction system based on depth and thermal infrared information. A traditional calibration method cannot be used directly, because depth and thermal infrared cameras are not sensitive to a color calibration board; this paper therefore designs a calibration board suited to both cameras to complete their calibration. A local feature descriptor for thermal and depth images is also proposed, and a belief propagation matching algorithm is investigated based on spatial affine transformation matching and local feature matching. The 3D temperature distribution model is built by matching the 3D point cloud with the 2D thermal infrared information. Experimental results show that the method accurately constructs the 3D temperature distribution model and has strong robustness.

  8. Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA

    NASA Astrophysics Data System (ADS)

    Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki

    2017-11-01

    SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid- and far-infrared astronomy with a cryogenically cooled 3-m class telescope, envisioned for launch in the early 2020s. The Mid-infrared Camera and Spectrometer (MCS) is a focal plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38 μm. MCS consists of two relay optical modules and four scientific optical modules: WFC (Wide Field Camera; 5' x 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2'.5 long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000), and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here, we present the optical design and expected optical performance of MCS. Most of the MCS optics adopt an off-axis reflective design, covering the wide wavelength range of 5-38 μm without chromatic aberration and minimizing problems due to changes in the shapes and refractive indices of materials from room temperature to cryogenic temperature. To meet the demanding requirements of wide field of view, small F-number, and large spectral resolving power in a compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]), a design method using free-form surfaces for compact reflective optics such as head-mounted displays. As a result, we have successfully designed compact reflective optics for MCS with as-built diffraction-limited image resolution.

  9. STS-109 Flight Day 3 Highlights

    NASA Technical Reports Server (NTRS)

    2002-01-01

    This footage from the third day of the STS-109 mission to service the Hubble Space Telescope (HST) begins with the grappling of the HST by the robotic arm of the Columbia Orbiter, operated by Mission Specialist Nancy Currie. During the grappling, numerous angles deliver close-up images of the telescope which appears to be in good shape despite many years in orbit around the Earth. Following the positioning of the HST on its berthing platform in the Shuttle bay, the robotic arm is used to perform an external survey of the telescope. Some cursory details are given about different equipment which will be installed on the HST including a replacement cooling system for the Near Infrared Camera Multi-Object Spectrometer (NICMOS) and the Advanced Camera for Surveys. Following the survey, there is footage of the retraction of both of the telescope's two flexible solar arrays, which was successful. These arrays will be replaced by rigid solar arrays with decreased surface area and increased performance.

  10. Convolutional Neural Network-Based Human Detection in Nighttime Images Using Visible Light Camera Sensors.

    PubMed

    Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung

    2017-05-08

    Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at long distance is growing in importance. Existing research using visible-light cameras has mainly focused on human detection during daytime hours, when there is outside light; human detection during nighttime hours, when there is no outside light, is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras, or thermal cameras, have been used. NIR illuminators, however, are limited in illumination angle and distance, and the illuminator power must be adaptively adjusted depending on whether the object is close or far away. Thermal cameras remain costly, which makes them difficult to install and use in a variety of places. Because of this, research has been conducted on nighttime human detection using visible-light cameras, but it has focused on objects at short distance in indoor environments, or on video-based methods that capture and process multiple images, which increases processing time. To resolve these problems, this paper presents a method that uses a single image captured at night by a visible-light camera to detect humans in a variety of environments, based on a convolutional neural network. Experimental results on a self-constructed Dongguk nighttime human detection database (DNHD-DB1) and two open databases (the Korea Advanced Institute of Science and Technology (KAIST) and Computer Vision Center (CVC) databases) show high-accuracy human detection in a variety of environments and excellent performance compared to existing methods.

  11. Near-infrared transillumination photography of intraocular tumours.

    PubMed

    Krohn, Jørgen; Ulltang, Erlend; Kjersem, Bård

    2013-10-01

    To present a technique for near-infrared transillumination imaging of intraocular tumours based on the modifications of a conventional digital slit lamp camera system. The Haag-Streit Photo-Slit Lamp BX 900 (Haag-Streit AG) was used for transillumination photography by gently pressing the tip of the background illumination cable against the surface of the patient's eye. Thus the light from the flash unit was transmitted into the eye, leading to improved illumination and image resolution. The modification for near-infrared photography was done by replacing the original camera with a Canon EOS 30D (Canon Inc) converted by Advanced Camera Services Ltd. In this camera, the infrared blocking filter was exchanged for a 720 nm long-pass filter, so that the near-infrared part of the spectrum was recorded by the sensor. The technique was applied in eight patients: three with anterior choroidal melanoma, three with ciliary body melanoma and two with ocular pigment alterations. The good diagnostic quality of the photographs made it possible to evaluate the exact location and extent of the lesions in relation to pigmented intraocular landmarks such as the ora serrata and ciliary body. The photographic procedure did not lead to any complications. We recommend near-infrared transillumination photography as a supplementary diagnostic tool for the evaluation and documentation of anteriorly located intraocular tumours.

  12. Prototype of microbolometer thermal infrared camera for forest fire detection from space

    NASA Astrophysics Data System (ADS)

    Guerin, Francois; Dantes, Didier; Bouzou, Nathalie; Chorier, Philippe; Bouchardy, Anne-Marie; Rollin, Joël

    2017-11-01

    The thermal infrared (TIR) camera contributes to the Earth observation FUEGO mission by discriminating clouds and smoke, rejecting false alarms, and monitoring forest fires. Consequently, the camera needs a large dynamic range of detectable radiances. Small volume, low mass and low power are required by the small FUEGO payload. These specifications can also make the camera attractive for other similar missions.

  13. Report Of The HST Strategy Panel: A Strategy For Recovery

    DTIC Science & Technology

    1991-01-01

    orbit change out: the Wide Field/Planetary Camera II (WFPC II), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Space ...are the Space Telescope Imaging Spectrograph (STIS), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS), and the second Wide Field and...expected to fail to lock due to duplicity was 20%; on-orbit data indicates that 10% may be a better estimate, but the guide stars were preselected

  14. Design of an infrared camera based aircraft detection system for laser guide star installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, H.; Macintosh, B.

    1996-03-05

    There have been incidents in which the irradiance from laser guide stars has temporarily blinded pilots or passengers of aircraft. An aircraft detection system based on passive near-infrared cameras (instead of active radar) is described in this report.

  15. Thermal Scanning of Dental Pulp Chamber by Thermocouple System and Infrared Camera during Photo Curing of Resin Composites

    PubMed Central

    Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda

    2018-01-01

    Introduction: Due to the thermal hazard during composite restorations, this study was designed to scan the pulp temperature with a thermocouple and an infrared camera while photo polymerizing different composites. Methods and Materials: A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1-mm increments of each composite (four composite types were incorporated) were inserted and photo polymerized with either an LED or a QTH system for 60 sec while the temperature was recorded at 10-sec intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. Thereafter, the data were analyzed by repeated measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α=0.05). Results: The pulp temperature increased significantly (repeated measures) during photo polymerization (P=0.000), while there was no significant difference between the results recorded by the thermocouple and the infrared camera (P>0.05). Moreover, different composite materials and LCUs led to similar outcomes (P>0.05). Conclusion: Although various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record parallel measurements of dental pulp temperature. PMID:29707014

  16. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  17. Mimas Showing False Colors #2

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This false color image of Saturn's moon Mimas reveals variation in either the composition or texture across its surface.

    During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles).

    This image is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined with a single black and white picture that isolates and maps regional color differences to create the final product.

    Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green.

    Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of the image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil.

    This image was obtained when the Cassini spacecraft was above 25 degrees south, 134 degrees west latitude and longitude. The Sun-Mimas-spacecraft angle was 45 degrees and north is at the top.

    The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

    For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .

  18. Space imaging infrared optical guidance for autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu

    2008-08-01

    We have developed the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, based on an uncooled infrared camera and a focusing technique that detects objects to be avoided and sets the drive path. For this purpose we built a servomotor drive system to control the focus of the infrared camera lens. The best focus position is determined by autofocus image processing based on the 4-term Daubechies wavelet transform and is then converted to the distance of the object. We built an aluminum-frame ground vehicle, 900 mm long and 800 mm wide, to mount the autofocus infrared unit; it carries an Ackermann front steering system and a rear motor drive. To confirm the guidance ability of the system, we ran experiments on the detection of an actual car on the road and of a roadside wall by the infrared autofocus unit. As a result, the autofocus processing based on the Daubechies wavelet transform clearly detected the best-focus image and gave the depth of the object from the infrared camera unit.
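
    The depth-from-focus idea in this abstract can be sketched in code: a focus measure built from the detail coefficients of the 4-tap Daubechies (db2) wavelet, with the sharpest frame in a focal stack taken as best focus. This is a minimal illustration under our own assumptions, not the authors' implementation; the filter taps are the standard db2 coefficients and all function names are ours.

```python
import numpy as np

# Standard 4-tap Daubechies (db2) analysis filters.
_s3 = np.sqrt(3.0)
_H = np.array([1 + _s3, 3 + _s3, 3 - _s3, 1 - _s3]) / (4 * np.sqrt(2))  # low-pass
_G = np.array([_H[3], -_H[2], _H[1], -_H[0]])                           # high-pass

def focus_measure(img):
    """Energy of db2 detail coefficients along rows and columns (higher = sharper)."""
    img = np.asarray(img, dtype=float)
    rows = np.apply_along_axis(lambda v: np.convolve(v, _G, mode="valid")[::2], 1, img)
    cols = np.apply_along_axis(lambda v: np.convolve(v, _G, mode="valid")[::2], 0, img)
    return float((rows ** 2).sum() + (cols ** 2).sum())

def best_focus(stack):
    """Index of the sharpest frame in a focal stack (list of 2-D arrays)."""
    return int(np.argmax([focus_measure(f) for f in stack]))
```

    The best-focus lens position would then be mapped to object distance through the lens calibration, as the abstract describes.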

  19. A position and attitude vision measurement system for wind tunnel slender model

    NASA Astrophysics Data System (ADS)

    Cheng, Lei; Yang, Yinong; Xue, Bindang; Zhou, Fugen; Bai, Xiangzhi

    2014-11-01

    A vision measurement system for the position and attitude of a slender drop-test model in a wind tunnel is designed and developed. The system uses two high-speed cameras: one placed to the side of the model, the other positioned to look up at the model. Simple symbols are set on the model. The main idea of the system is image matching between projection images of the 3D digital model and the image captured by the camera. First, the pitch angle, roll angle and centroid position of the model are estimated by recognizing the symbols in the images captured by the side camera. Then, based on the estimated attitude and a series of candidate yaw angles, a series of projection images of the 3D digital model is generated. Finally, these projection images are matched against the image captured by the upward-looking camera, and the yaw angle of the best-matching projection is taken as the yaw angle of the model. Simulation experiments show that the maximal attitude measurement error is less than 0.05°, which meets the demands of wind tunnel tests.
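
    The projection-matching step can be illustrated with a short sketch: each candidate yaw angle's projection image is scored against the captured image, and the best-scoring yaw is returned. The paper's actual matching criterion is not stated in the abstract, so the use of normalized cross-correlation here is an assumption, as are the function names.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size images."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def estimate_yaw(captured, projections):
    """projections maps candidate yaw angle -> projection image of the 3D model;
    returns the yaw whose projection best matches the captured image."""
    return max(projections, key=lambda yaw: ncc(captured, projections[yaw]))
```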

  20. 24/7 security system: 60-FPS color EMCCD camera with integral human recognition

    NASA Astrophysics Data System (ADS)

    Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.

    2007-04-01

    An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and monochrome under quarter-moonlight to overcast starlight illumination. Sixty-frame-per-second operation and progressive scanning minimize motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms, to detect/localize/track targets and reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors that are used are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars and trucks. Detection and tracking of targets too small for template-based detection is achieved. For face and vehicle targets the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.

  1. Techniques for Transition and Surface Temperature Measurements on Projectiles at Hypersonic Velocities- A Status Report

    NASA Technical Reports Server (NTRS)

    Wilder, M. C.; Bogdanoff, D. W.

    2005-01-01

    A research effort to advance techniques for determining transition location and measuring surface temperatures on graphite-tipped projectiles in hypersonic flight in a ballistic range is described. Projectiles were launched at muzzle velocities of approx. 4.7 km/sec into air at pressures of 190-570 Torr. Most launches had maximum pitch and yaw angles of 2.5-5 degrees at pressures of 380 Torr and above and 3-6 degrees at pressures of 190-380 Torr. Arcjet-ablated and machined, bead-blasted projectiles were launched; special cleaning techniques had to be developed for the latter class of projectiles. Improved methods of using helium to remove the radiating gas cap around the projectiles at the locations where ICCD (intensified charge coupled device) camera images were taken are described. Two ICCD cameras with a wavelength sensitivity range of 480-870 nm have been used in this program for several years to obtain images. In the last year, a third camera, with a wavelength sensitivity range of 1.5-5 microns [in the infrared (IR)], has been added. ICCD and IR camera images of hemisphere nose and 70 degree sphere-cone nose projectiles at velocities of 4.0-4.7 km/sec are presented. The ICCD images clearly show a region of steep temperature rise indicative of transition from laminar to turbulent flow. Preliminary temperature data for the graphite projectile noses are presented.

  2. Low-cost camera modifications and methodologies for very-high-resolution digital images

    USDA-ARS?s Scientific Manuscript database

    Aerial color and color-infrared photography are usually acquired at high altitude so the ground resolution of the photographs is < 1 m. Moreover, current color-infrared cameras and manned aircraft flight time are expensive, so the objective is the development of alternative methods for obtaining ve...

  3. Robust Behavior Recognition in Intelligent Surveillance Environments.

    PubMed

    Batchuluun, Ganbayar; Kim, Yeong Gon; Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung

    2016-06-30

    Intelligent surveillance systems have been studied by many researchers. These systems should operate in both daytime and nighttime, but objects are invisible in images captured by a visible light camera during the night. Therefore, near-infrared (NIR) cameras and thermal cameras (based on medium-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) light) have been considered as nighttime alternatives. Because of the need to operate during both daytime and nighttime, and because NIR cameras require an additional NIR illuminator that must illuminate a wide area over a great distance at night, a dual system of visible light and thermal cameras is used in our research, and we propose a new behavior recognition method for intelligent surveillance environments. Twelve datasets were compiled by collecting data in various environments, and they were used to obtain experimental results. The recognition accuracy of our method was found to be 97.6%, confirming that our method outperforms previous methods.

  4. Polarized Light from Jupiter

    NASA Technical Reports Server (NTRS)

    2001-01-01

    These images taken through the wide angle camera near closest approach in the deep near-infrared methane band, combined with filters which sense electromagnetic radiation of orthogonal polarization, show that the light from the poles is polarized. That is, the poles appear bright in one image, and dark in the other. Polarized light is most readily scattered by aerosols. These images indicate that the aerosol particles at Jupiter's poles are small and likely consist of aggregates of even smaller particles, whereas the particles at the equator and covering the Great Red Spot are larger. Images like these will allow scientists to ascertain the distribution, size and shape of aerosols, and consequently, the distribution of heat, in Jupiter's atmosphere.

  5. Dual light-emitting diode-based multichannel microscopy for whole-slide multiplane, multispectral and phase imaging.

    PubMed

    Liao, Jun; Wang, Zhe; Zhang, Zibang; Bian, Zichao; Guo, Kaikai; Nambiar, Aparna; Jiang, Yutong; Jiang, Shaowei; Zhong, Jingang; Choma, Michael; Zheng, Guoan

    2018-02-01

    We report the development of a multichannel microscopy for whole-slide multiplane, multispectral and phase imaging. We use trinocular heads to split the beam path into 6 independent channels and employ a camera array for parallel data acquisition, achieving a maximum data throughput of approximately 1 gigapixel per second. To perform single-frame rapid autofocusing, we place 2 near-infrared light-emitting diodes (LEDs) at the back focal plane of the condenser lens to illuminate the sample from 2 different incident angles. A hot mirror is used to direct the near-infrared light to an autofocusing camera. For multiplane whole-slide imaging (WSI), we acquire 6 different focal planes of a thick specimen simultaneously. For multispectral WSI, we relay the 6 independent image planes to the same focal position and simultaneously acquire information at 6 spectral bands. For whole-slide phase imaging, we acquire images at 3 focal positions simultaneously and use the transport-of-intensity equation to recover the phase information. We also provide an open-source design to further increase the number of channels from 6 to 15. The reported platform provides a simple solution for multiplexed fluorescence imaging and multimodal WSI. Acquiring an instant focal stack without z-scanning may also enable fast 3-dimensional dynamic tracking of various biological samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
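
    The transport-of-intensity step mentioned above has a standard FFT-based solution: under near-uniform intensity I0, the TIE reduces to a Poisson equation, laplacian(phi) = -(k/I0) dI/dz, which can be inverted in Fourier space. The sketch below assumes this simplified uniform-intensity form with a small regularizer; it is a generic illustration of the technique, not the authors' code.

```python
import numpy as np

def tie_phase(i_minus, i_in_focus, i_plus, dz, wavelength, pixel, eps=1e-6):
    """Recover phase from a 3-plane intensity stack via the transport-of-intensity
    equation, assuming near-uniform intensity: laplacian(phi) = -(k / I0) dI/dz."""
    k = 2.0 * np.pi / wavelength
    didz = (i_plus - i_minus) / (2.0 * dz)            # central-difference dI/dz
    rhs = -k * didz / i_in_focus.mean()
    ny, nx = rhs.shape
    fy = np.fft.fftfreq(ny, d=pixel)
    fx = np.fft.fftfreq(nx, d=pixel)
    q2 = (2.0 * np.pi) ** 2 * (fx[None, :] ** 2 + fy[:, None] ** 2)
    phi_hat = -np.fft.fft2(rhs) / (q2 + eps)          # inverse Laplacian, regularized
    phi_hat[0, 0] = 0.0                               # the phase mean is undefined
    return np.real(np.fft.ifft2(phi_hat))
```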

  6. Infrared On-Orbit RCC Inspection With the EVA IR Camera: Development of Flight Hardware From a COTS System

    NASA Technical Reports Server (NTRS)

    Gazanik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Jenkins, Rusty; Yates, Rusty; Stephan, Ryan

    2005-01-01

    In November 2004, NASA's Space Shuttle Program approved the development of the Extravehicular (EVA) Infrared (IR) Camera to test the application of infrared thermography to on-orbit reinforced carbon-carbon (RCC) damage detection. A multi-center team composed of members from NASA's Johnson Space Center (JSC), Langley Research Center (LaRC), and Goddard Space Flight Center (GSFC) was formed to develop the camera system and plan a flight test. The initial development schedule called for the delivery of the system in time to support STS-115 in late 2005. At the request of Shuttle Program managers and the flight crews, the team accelerated its schedule and delivered a certified EVA IR Camera system in time to support STS-114 in July 2005 as a contingency. The development of the camera system, led by LaRC, was based on the Commercial-Off-the-Shelf (COTS) FLIR S65 handheld infrared camera. An assessment of the S65 system with regard to space-flight operation was critical to the project. This paper discusses the space-flight assessment and describes the significant modifications required for EVA use by the astronaut crew. The on-orbit inspection technique will be demonstrated during the third EVA of STS-121 in September 2005 by imaging damaged RCC samples mounted in a box in the Shuttle's cargo bay.

  7. Power estimation of martial arts movement with different physical, mood, and behavior using motion capture camera

    NASA Astrophysics Data System (ADS)

    Awang Soh, Ahmad Afiq Sabqi; Mat Jafri, Mohd Zubir; Azraai, Nur Zaidi

    2017-07-01

    In the Malay world, traditional spirit rituals are used in healing practices and in daily life. The Malay martial art silat is no exception: some branches include spirit rituals that practitioners say help them in combat. In this study we used no ritual; instead we introduced a medicinal preparation and environmental changes while the subjects performed. Two performers (fighters) were selected, one with experience in martial arts training and one without. A motion capture (MOCAP) camera system was used to observe and analyze the movements. Eight cameras were placed in the MOCAP room, two on each wall and facing the center of the room, so that markers stamped on a performer's limbs are seen from every angle and are unlikely to be lost. Passive markers were used, reflecting infrared light generated by sources around each camera lens back to the camera sensor. A 60-kg punching bag hung from an iron bar served as the target for the performers' punches; markers were also stamped on the bag so that its swing when hit could be measured. Each performer performed two moves per condition with the same position and posture, and between conditions the environment was changed without the performer's knowledge: the first two punches were thrown in a normal environment, in the second condition positive music was played to change the performer's mood, and in the third a medicinal cream/oil that makes the skin feel slightly hot was applied. The process was then repeated with the inexperienced performer. Marker positions were analyzed with the Cortex Motion Analysis software, from which the kinetics and kinematics of each performer were estimated. The results show an increase in kinetic measures under the changed environments, with different results for the two performers.

  8. LWIR NUC using an uncooled microbolometer camera

    NASA Astrophysics Data System (ADS)

    Laveigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian; McHugh, Steve

    2010-04-01

    Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector. Ideally, NUC will be performed in the same band in which the scene projector will be used. Cooled, large format MWIR cameras are readily available and have been successfully used to perform NUC; however, cooled large format LWIR cameras are not as common and are prohibitively expensive. Large format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an IRSP. Santa Barbara Infrared, Inc. reports progress on a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. Camera instability, temporal response, and thermal resolution are the main difficulties. A discussion of processes developed to mitigate these issues follows.
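
    For readers unfamiliar with NUC, the classic two-point correction can be sketched as follows: per-pixel gain and offset are derived from flat frames at two known reference levels and then applied to subsequent frames. This is the generic textbook form under our own naming, not Santa Barbara Infrared's process.

```python
import numpy as np

def two_point_nuc(cold, hot, ref_cold, ref_hot):
    """Per-pixel gain/offset from flat frames at two known reference levels."""
    gain = (ref_hot - ref_cold) / (hot - cold)
    offset = ref_cold - gain * cold
    return gain, offset

def correct(frame, gain, offset):
    """Apply the two-point non-uniformity correction to a raw frame."""
    return gain * frame + offset
```

    After correction, every pixel of a uniform (flat) scene maps to the same value by construction; residual non-uniformity comes from nonlinearity and drift, the instabilities discussed in the abstract.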

  9. Compensation method of cloud infrared radiation interference based on a spinning projectile's attitude measurement

    NASA Astrophysics Data System (ADS)

    Xu, Miaomiao; Bu, Xiongzhu; Yu, Jing; He, Zilu

    2018-01-01

    Based on the study of Earth infrared radiation and the further requirement of anti-cloud-interference capability in a spinning projectile's infrared attitude measurement, a compensation method for cloud infrared radiation interference is proposed. First, a theoretical model of the infrared radiation interference is established by analyzing the generation mechanism and interference characteristics of cloud infrared radiation. Then, the influence of cloud infrared radiation on the attitude angle is calculated in two situations. In the first, the projectile is inside cloud, and the maximum roll-angle error can reach ±20 deg. In the second, the projectile is outside of cloud, and the interference can make the projectile's attitude angle unmeasurable. Finally, a multisensor weighted fusion algorithm based on a trust function is proposed to reduce the influence of cloud infrared radiation. Semiphysical experiments show that the roll-angle error with the weighted fusion algorithm stays within ±0.5 deg in the presence of cloud infrared radiation interference. The proposed method improves the roll-angle accuracy of attitude measurement by nearly four times and addresses the low accuracy of infrared attitude measurement in the navigation and guidance field.
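
    The paper's trust function is not detailed in the abstract, but the general shape of a multisensor weighted fusion of roll-angle estimates can be sketched as inverse-variance weighting, with angles combined as unit vectors so that wrap-around at 0°/360° fuses correctly. The weighting choice and all names below are assumptions for illustration.

```python
import math

def fuse_roll(angles_deg, variances):
    """Variance-weighted fusion of roll-angle estimates; angles are combined as
    unit vectors so that wrap-around (e.g. 359 deg and 1 deg) fuses to ~0 deg."""
    weights = [1.0 / v for v in variances]  # more trust = smaller variance
    s = sum(w * math.sin(math.radians(a)) for w, a in zip(weights, angles_deg))
    c = sum(w * math.cos(math.radians(a)) for w, a in zip(weights, angles_deg))
    return math.degrees(math.atan2(s, c)) % 360.0
```

    A sensor judged untrustworthy (e.g. an infrared channel corrupted by cloud radiance) is simply assigned a large variance, so the fused estimate leans on the remaining sensors.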

  10. Pettit holds cameras in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-15

    ISS030-E-175788 (15 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, is pictured with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.

  11. 67P/Churyumov-Gerasimenko: Activity between March and June 2014 as observed from Rosetta/OSIRIS

    NASA Astrophysics Data System (ADS)

    Tubiana, C.; Snodgrass, C.; Bertini, I.; Mottola, S.; Vincent, J.-B.; Lara, L.; Fornasier, S.; Knollenberg, J.; Thomas, N.; Fulle, M.; Agarwal, J.; Bodewits, D.; Ferri, F.; Güttler, C.; Gutierrez, P. J.; La Forgia, F.; Lowry, S.; Magrin, S.; Oklay, N.; Pajola, M.; Rodrigo, R.; Sierks, H.; A'Hearn, M. F.; Angrilli, F.; Barbieri, C.; Barucci, M. A.; Bertaux, J.-L.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; De Cecco, M.; Debei, S.; Groussin, O.; Hviid, S. F.; Ip, W.; Jorda, L.; Keller, H. U.; Koschny, D.; Kramm, R.; Kührt, E.; Küppers, M.; Lazzarin, M.; Lamy, P. L.; Lopez Moreno, J. J.; Marzari, F.; Michalik, H.; Naletto, G.; Rickman, H.; Sabau, L.; Wenzel, K.-P.

    2015-01-01

    Aims: 67P/Churyumov-Gerasimenko is the target comet of the ESA's Rosetta mission. After commissioning at the end of March 2014, the Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS) onboard Rosetta started imaging the comet and its dust environment to investigate how they change and evolve while approaching the Sun. Methods: We focused our work on Narrow Angle Camera (NAC) orange images and Wide Angle Camera (WAC) red and visible-610 images acquired between 2014 March 23 and June 24, when the nucleus of 67P was unresolved and moving from approximately 4.3 AU to 3.8 AU inbound. During this period the 67P-Rosetta distance decreased from 5 million to 120 thousand km. Results: Through aperture photometry, we investigated how the comet brightness varies with heliocentric distance. 67P was likely already weakly active at the end of March 2014, with excess flux above that expected for the nucleus. The comet's brightness was mostly constant during the three months of approach observations, apart from one outburst that occurred around April 30 and a second increase in flux after June 20. Coma was resolved in the profiles from mid-April. Analysis of the coma morphology suggests that most of the activity comes from a source towards the celestial north pole of the comet, but the outburst that occurred on April 30 released material in a different direction.
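
    The aperture photometry used here is a standard technique and can be sketched in a few lines: sum the flux inside a circular aperture around the source and subtract a sky level estimated from a surrounding annulus. This is a generic illustration with our own function names, not the OSIRIS pipeline.

```python
import numpy as np

def aperture_photometry(img, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted flux in a circular aperture of radius r_ap,
    with the sky level taken as the median of an annulus r_in..r_out."""
    y, x = np.indices(img.shape)
    d2 = (x - cx) ** 2 + (y - cy) ** 2
    aperture = d2 <= r_ap ** 2
    annulus = (d2 >= r_in ** 2) & (d2 <= r_out ** 2)
    sky = np.median(img[annulus])
    return float(img[aperture].sum() - sky * aperture.sum())
```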

  12. Fluctuations of Lake Eyre, South Australia

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Lake Eyre is a large salt lake situated between two deserts in one of Australia's driest regions. However, this low-lying lake attracts run-off from one of the largest inland drainage systems in the world. The drainage basin is very responsive to rainfall variations, and changes dramatically with Australia's inter-annual weather fluctuations. When Lake Eyre fills, as it did in 1989, it is temporarily Australia's largest lake, and becomes dense with birds, frogs and colorful plant life. The Lake responds to extended dry periods (often associated with El Nino events) by drying completely.

    These four images from the Multi-angle Imaging SpectroRadiometer contrast the lake area at the start of the austral summers of 2000 and 2002. The top two panels portray the region as it appeared on December 9, 2000. Heavy rains in the first part of 2000 caused both the north and south sections of the lake to fill partially and the northern part of the lake still contained significant standing water by the time these data were acquired. The bottom panels were captured on November 29, 2002. Rainfall during 2002 was significantly below average ( http://www.bom.gov.au/ ), although showers occurring in the week before the image was acquired helped alleviate this condition slightly.

    The left-hand panels portray the area as it appeared to MISR's vertical-viewing (nadir) camera, and are false-color views comprised of data from the near-infrared, green and blue channels. Here, wet and/or moist surfaces appear blue-green, since water selectively absorbs longer wavelengths such as near-infrared. The right-hand panels are multi-angle composites created with red band data from MISR's 60-degree forward, nadir and 60-degree backward-viewing cameras, displayed as red, green and blue, respectively. In these multi-angle composites, color variations serve as a proxy for changes in angular reflectance, and indicate textural properties of the surface related to roughness and/or moisture content.Data from the two dates were processed identically to preserve relative variations in brightness between them. Wet surfaces or areas with standing water appear green due to the effect of sunglint at the nadir camera view angle. Dry, salt encrusted parts of the lake appear bright white or gray. Purple areas have enhanced forward scattering, possibly as a result of surface moistness. Some variations exhibited by the multi-angle composites are not discernible in the nadir multi-spectral images and vice versa, suggesting that the combination of angular and spectral information is a more powerful diagnostic of surface conditions than either technique by itself.

    The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbits 5194 and 15679. The panels cover an area of 146 kilometers x 122 kilometers, and utilize data from blocks 113 to 114 within World Reference System-2 path 100.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory,Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  13. Infrared detectors and test technology of cryogenic camera

    NASA Astrophysics Data System (ADS)

    Yang, Xiaole; Liu, Xingxin; Xing, Mailing; Ling, Long

    2016-10-01

    A cryogenic camera, widely used in deep-space detection, cools its optical system and support structure with cryogenic refrigeration technology, thereby improving sensitivity. The characteristics and design points of the infrared detector are discussed in combination with the camera's characteristics. At the same time, cryogenic-background test systems for the chip and for the detector assembly are established: the chip test system is based on a variable-temperature, multilayer Dewar, and the assembly test system is based on a target and background simulator in a thermal vacuum environment. The core of the tests is establishing the cryogenic background. The non-uniformity, dead-pixel ratio and noise of the test results are given finally. The established test systems support the design and calculation of infrared systems.

  14. Performance and Calibration of H2RG Detectors and SIDECAR ASICs for the RATIR Camera

    NASA Technical Reports Server (NTRS)

    Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Klein, Christopher R.; Butler, Nathaniel R.; Bloom, Josh; de Diego, José A.; Farah, Alejandro D.; Gehrels, Neil A.; Georgiev, Leonid; et al.

    2012-01-01

    The Reionization And Transient InfraRed (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) followup and will provide simultaneous optical and infrared photometric capabilities. The infrared portion of this camera incorporates two Teledyne HgCdTe HAWAII-2RG detectors, controlled by Teledyne's SIDECAR ASICs. While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 interface card and IDE development environment. Together, this setup comprises Teledyne's Development Kit, which is a bundled solution that can be efficiently integrated into future ground-based systems. In this presentation, we characterize the system's read noise, dark current, and conversion gain.

  15. Obstacle Detection and Avoidance of a Mobile Robotic Platform Using Active Depth Sensing

    DTIC Science & Technology

    2014-06-01

    At a price of nearly one tenth that of a laser range finder, the Xbox Kinect uses an infrared projector and camera to capture images of its environment in three dimensions. (Figure 9 of the report: RGB image captured by the camera on the Xbox Kinect.)

  16. Demonstration of First 9 Micron cutoff 640 x 486 GaAs Based Quantum Well Infrared PhotoDetector (QWIP) Snap-Shot Camera

    NASA Technical Reports Server (NTRS)

    Gunapala, S.; Bandara, S. V.; Liu, J. K.; Hong, W.; Sundaram, M.; Maker, P. D.; Muller, R. E.

    1997-01-01

    In this paper, we discuss the development of this very sensitive long-wavelength infrared (LWIR) camera based on a GaAs/AlGaAs QWIP focal plane array (FPA) and its performance in quantum efficiency, NEΔT, uniformity, and operability.

  17. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, "Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera," awarded for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO) and use of this camera for observations of star-formation regions. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation, which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  18. Methane Painting

    NASA Image and Video Library

    2015-09-07

    Why does Saturn look like it's been painted with a dark brush in this infrared image, but Dione looks untouched? Perhaps an artist with very specific tastes in palettes? The answer is methane. This image was taken in a wavelength that is absorbed by methane. Dark areas seen here on Saturn are regions with thicker clouds, where light has to travel through more methane on its way into and back out of the atmosphere. Since Dione (698 miles or 1,123 kilometers across) doesn't have an atmosphere rich in methane the way Saturn does, it does not experience similar absorption -- the sunlight simply bounces off its icy surface. Shadows of the rings are seen cast onto the planet at lower right. This view looks toward Saturn from the unilluminated side of the rings, about 0.3 degrees below the ring plane. The image was taken with the Cassini spacecraft wide-angle camera on May 27, 2015 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 728 nanometers. http://photojournal.jpl.nasa.gov/catalog/PIA18336

  19. Silicon Based Schottky Barrier Infrared Sensors For Power System And Industrial Applications

    NASA Astrophysics Data System (ADS)

    Elabd, Hammam; Kosonocky, Walter F.

    1984-03-01

    Schottky barrier infrared charge-coupled device sensors (IR-CCDs) have been developed. PtSi Schottky barrier detectors require cooling to liquid-nitrogen temperature and cover the wavelength range between 1 and 6 μm. The PtSi IR-CCDs can be used in industrial thermography with NEΔT below 0.1°C. Pd2Si Schottky barrier detectors require cooling to 145 K and cover the spectral range between 1 and 3.5 μm. Pd2Si IR-CCDs can be used in imaging high-temperature scenes with NEΔT around 100°C. Several high-density staring area and line imagers are available. Both interlaced and noninterlaced area imagers can be operated with variable and TV-compatible frame rates as well as various field-of-view angles. The advantages of silicon fabrication technology in terms of cost and high-density structures open the door to the design of special-purpose thermal camera systems for a number of power system and industrial applications.

  20. Combined Infrared Stereo and Laser Ranging Cloud Measurements from Shuttle Mission STS-85

    NASA Technical Reports Server (NTRS)

    Lancaster, Redgie S.; Spinhirne, James D.; Starr, David O'C. (Technical Monitor)

    2001-01-01

    Multi-angle remote sensing provides a wealth of information for earth and climate monitoring, and as technology advances, so do the options for developing instrumentation versatile enough to meet the demands of these types of measurements. In the current work, the multi-angle measurement capability of the Infrared Spectral Imaging Radiometer is demonstrated. This instrument flew as part of mission STS-85 of the space shuttle Columbia in 1997 and was the first earth-observing radiometer to incorporate an uncooled microbolometer array detector as its image sensor. Specifically, a method for computing cloud-top height from the multi-spectral stereo measurements acquired during this flight has been developed, and the results demonstrate that a vertical precision of 10.6 km was achieved. Further, the accuracy of these measurements is confirmed by comparison with coincident direct laser-ranging measurements from the Shuttle Laser Altimeter. Mission STS-85 was the first space flight to combine laser ranging and thermal IR camera systems for cloud remote sensing.
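    The stereo principle behind such a cloud-top-height retrieval can be illustrated with the standard parallax relation: a feature at height h is displaced along track by h·(tan θ_fwd + tan θ_aft) between two view angles. A minimal sketch under a flat, nadir-symmetric geometry, not the instrument team's actual algorithm:

    ```python
    import math

    def cloud_top_height(disparity_m, theta_fwd_deg, theta_aft_deg):
        """Cloud-top height from along-track stereo parallax.

        disparity_m: apparent along-track displacement (in meters on the
        ground) of a cloud feature between the forward and aft views.
        Assumes a flat, nadir-symmetric viewing geometry, which is a
        simplification of a real spaceborne retrieval.
        """
        spread = (math.tan(math.radians(theta_fwd_deg))
                  + math.tan(math.radians(theta_aft_deg)))
        return disparity_m / spread
    ```

    For symmetric ±45° views, a 2 km along-track disparity corresponds to a 1 km cloud-top height; the achievable vertical precision is then set by how finely the disparity can be measured.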

  1. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  2. Infrared Imaging Camera Final Report CRADA No. TC02061.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, E. V.; Nebeker, S.

    This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100 nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov to develop a prototype commercial infrared (IR) framing camera and to deliver it to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera, and the test results revealed that it exceeded presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two-year project. It was not started on time due to changes in the IPP project funding conditions; the funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes within Russian government export regulations, directed by export controls on high-technology items that can be used to develop military weapons. The IR camera was on the list of items requiring export controls. After negotiations, the ISTC and the Russian government allowed delivery of the camera to LLNL. There were no significant technical or business changes to the original project.

  3. Investigation of the influence of spatial degrees of freedom on thermal infrared measurement

    NASA Astrophysics Data System (ADS)

    Fleuret, Julien R.; Yousefi, Bardia; Lei, Lei; Djupkep Dizeu, Frank Billy; Zhang, Hai; Sfarra, Stefano; Ouellet, Denis; Maldague, Xavier P. V.

    2017-05-01

    Long Wavelength Infrared (LWIR) cameras can provide a representation of a part of the light spectrum that is sensitive to temperature. These cameras, also named Thermal Infrared (TIR) cameras, are powerful tools for detecting features that cannot be seen by other imaging technologies. For instance, they enable defect detection in materials, detection of fever and anxiety in mammals, and many other capabilities for numerous applications. However, the accuracy of thermal cameras can be affected by many parameters; the most critical involves the relative position of the camera with respect to the object of interest. Several models have been proposed to minimize the influence of some of these parameters, but they are mostly tied to specific applications. Because such models are based on prior information related to context, their applicability to other contexts cannot be easily assessed. The few remaining models are mostly associated with a specific device. In this paper, the authors study the influence of camera position on measurement accuracy. Modeling the position of the camera relative to the object of interest depends on many parameters, so to make the study as accurate as possible, the camera position is represented with a five-dimensional model. The aim of this study is to investigate and attempt to introduce a model that is as independent of the device as possible.

  4. The Wide Angle Camera of the ROSETTA Mission

    NASA Astrophysics Data System (ADS)

    Barbieri, C.; Fornasier, S.; Verani, S.; Bertini, I.; Lazzarin, M.; Rampazzi, F.; Cremonese, G.; Ragazzoni, R.; Marzari, F.; Angrilli, F.; Bianchini, G. A.; Debei, S.; Dececco, M.; Guizzo, G.; Parzianello, G.; Ramous, P.; Saggin, B.; Zaccariotto, M.; Da Deppo, V.; Naletto, G.; Nicolosi, G.; Pelizzo, M. G.; Tondello, G.; Brunello, P.; Peron, F.

    This paper aims to give a brief description of the Wide Angle Camera (WAC), built by the Centro Servizi e Attività Spaziali (CISAS) of the University of Padova for the ESA ROSETTA Mission to comet 46P/Wirtanen and asteroids 4979 Otawara and 140 Siwa. The WAC is part of the OSIRIS imaging system, which also comprises a Narrow Angle Camera (NAC) built by the Laboratoire d'Astrophysique Spatiale (LAS) of Marseille. CISAS was also responsible for building the shutter and the front cover mechanism for the NAC. The flight model of the WAC was delivered in December 2001, and has already been integrated on ROSETTA.

  5. Camera traps can be heard and seen by animals.

    PubMed

    Meek, Paul D; Ballard, Guy-Anthony; Fleming, Peter J S; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps; in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing ranges (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine whether animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.

  6. Exploring the imaging properties of thin lenses for cryogenic infrared cameras

    NASA Astrophysics Data System (ADS)

    Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura

    2016-05-01

    Designing a cryogenic camera is a good strategy for miniaturizing and simplifying an infrared camera that uses a cooled detector. Integrating the optics inside the cold shield makes the design simple to athermalize, guarantees a cold pupil, and relaxes the requirement of a long back focal length in short-focal-length systems. In this way, cameras made of a single lens or two lenses are viable systems with good optical features and stable image correction. However, this approach adds significant optical mass inside the dewar and thus increases the camera's cool-down time. ONERA is currently exploring a minimalist strategy of giving an imaging function to the thin optical plates already found in conventional dewars, yielding a cryogenic camera with the same cool-down time as a traditional dewar that has no imaging function. Two examples are presented. The first is a camera using a dual-band infrared detector, made of a lens outside the dewar and a lens inside the cold shield, the latter carrying most of the system's optical power; we were able to design a cold plano-convex lens less than 1 mm thick. The second is an evolution of a former cryogenic camera called SOIE, in which the cold meniscus was replaced by a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performances of both cameras will be compared.

  7. Zoom-in on Epimetheus

    NASA Image and Video Library

    2017-07-03

    This zoomed-in view of Epimetheus, one of the highest-resolution views ever taken, shows a surface covered in craters, vivid reminders of the hazards of space. Epimetheus (70 miles or 113 kilometers across) is too small for its gravity to hold onto an atmosphere. It is also too small to be geologically active. There is therefore no way to erase the scars from meteor impacts, except for the generation of new impact craters on top of old ones. This view looks toward the anti-Saturn side of Epimetheus. North on Epimetheus is up and rotated 32 degrees to the right. The image was taken with the Cassini spacecraft narrow-angle camera on Feb. 21, 2017 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 939 nanometers. The view was acquired at a distance of approximately 9,300 miles (15,000 kilometers) from Epimetheus and at a Sun-Epimetheus-spacecraft, or phase, angle of 71 degrees. Image scale is 290 feet (89 meters) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21335
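    The image scale quoted in captions like this one follows from the range and the camera's instantaneous field of view (IFOV): scale = range × IFOV. A minimal sketch; the ~6 microradian default is approximately the published IFOV of Cassini's narrow-angle camera, used here as an assumption rather than a value stated in the caption:

    ```python
    def pixel_scale_m(range_m, ifov_rad=6.0e-6):
        """Linear size of one pixel at the target: range times IFOV.

        ifov_rad defaults to ~6 microradians, roughly the Cassini
        narrow-angle camera's per-pixel field of view (an assumption).
        """
        return range_m * ifov_rad
    ```

    At the caption's 15,000 km range this gives roughly 90 meters per pixel, consistent with the quoted 290 feet (89 meters).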

  8. Detail on Dione False color

    NASA Image and Video Library

    2006-01-27

    The leading hemisphere of Dione displays subtle variations in color across its surface in this false color view. To create this view, ultraviolet, green and infrared images were combined into a single black and white picture that isolates and maps regional color differences. This "color map" was then superposed over a clear-filter image. The origin of the color differences is not yet understood, but may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil. Terrain visible here is on the moon's leading hemisphere. North on Dione (1,126 kilometers, or 700 miles across) is up and rotated 17 degrees to the right. All images were acquired with the Cassini spacecraft narrow-angle camera on Dec. 24, 2005 at a distance of approximately 597,000 kilometers (371,000 miles) from Dione and at a Sun-Dione-spacecraft, or phase, angle of 21 degrees. Image scale is 4 kilometers (2 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA07688

  9. Mighty Little Dot

    NASA Image and Video Library

    2014-12-01

    Enceladus (visible in the lower-left corner of the image) is but a speck before enormous Saturn, but even a small moon can generate big waves of excitement throughout the scientific community. Enceladus, only 313 miles (504 kilometers) across, spurts vapor jets from its south pole. The presence of these jets from Enceladus has been the subject of intense study since they were discovered by Cassini. Their presence may point to a sub-surface water reservoir. This view looks toward the unilluminated side of the rings from about 2 degrees below the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on Oct. 20, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 589,000 miles (948,000 kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 26 degrees. Image scale is 35 miles (57 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18296

  10. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and on the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which gives only a subjective overall test result. Parameters that can be measured include image quality via the modulation transfer function (MTF), broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing of the optics, additional parameters such as best focus plane, image plane tilt, auto-focus quality, and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.
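    The MTF measurement mentioned above rests on a textbook relation: the MTF is the normalized magnitude of the Fourier transform of the system's line spread function (LSF). A minimal sketch of that relation, not the test station's actual procedure; `mtf_from_lsf` is a hypothetical helper:

    ```python
    import numpy as np

    def mtf_from_lsf(lsf, dx):
        """MTF as the normalized |FFT| of a measured line spread function.

        lsf: 1-D samples of the line spread function.
        dx:  sample spacing (e.g. mm per pixel); the returned frequencies
             are in cycles per unit of dx.
        """
        spectrum = np.abs(np.fft.rfft(lsf))
        mtf = spectrum / spectrum[0]          # normalize to 1 at zero frequency
        freqs = np.fft.rfftfreq(len(lsf), d=dx)
        return freqs, mtf
    ```

    A sanity check of the relation: an ideal impulse-like LSF (perfect optics) yields an MTF of 1 at all spatial frequencies, while any blur widens the LSF and rolls the MTF off toward high frequencies.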

  11. Pettit works with two still cameras mounted together in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-21

    ISS030-E-049636 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.

  12. Pettit works with two still cameras mounted together in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-21

    ISS030-E-049643 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.

  13. Southern Quebec in Late Winter

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These images of Canada's Quebec province were acquired by the Multi-angle Imaging SpectroRadiometer on March 4, 2001. The region's forests are a mixture of coniferous and hardwood trees, and 'sugar-shack' festivities are held at this time of year to celebrate the beginning of maple syrup production. The large river visible in the images is the northeast-flowing St. Lawrence. The city of Montreal is located near the lower left corner, and Quebec City, at the upper right, is near the mouth of the partially ice-covered St. Lawrence Seaway.

    Both spectral and angular information are retrieved for every scene observed by MISR. The left-hand image was acquired by the instrument's vertical-viewing (nadir) camera, and is a false-color spectral composite from the near-infrared, red, and blue bands. The right-hand image is a false-color angular composite using red band data from the 60-degree backward-viewing, nadir, and 60-degree forward-viewing cameras. In each case, the individual channels of data are displayed as red, green, and blue, respectively.

    Much of the ground remains covered or partially covered with snow. Vegetation appears red in the left-hand image because of its high near-infrared brightness. In the multi-angle composite, vegetated areas appear in shades of green because they are brighter at nadir, possibly as a result of an underlying blanket of snow which is more visible from this direction. Enhanced forward scatter from the smooth water surface results in bluer hues, whereas urban areas look somewhat orange, possibly due to the effect of vertical structures which preferentially backscatter sunlight.

    The data were acquired during Terra orbit 6441, and cover an area measuring 275 kilometers x 310 kilometers.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  14. Attitude identification for SCOLE using two infrared cameras

    NASA Technical Reports Server (NTRS)

    Shenhar, Joram

    1991-01-01

    An algorithm is presented that incorporates real time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs), mounted on the SCOLE platform, permitting arbitrary camera placement and an on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six degrees of freedom attitude tracking, regardless of the attitude time history. A description is provided of the algorithm in the camera identification mode as well as the mode of target tracking. Experimental data from a reduced size SCOLE-like lab model, reflecting the performance of the camera identification and the tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
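    The attitude-extraction step described above, recovering a six-degree-of-freedom pose from three tracked LEDs, can be illustrated with the standard least-squares rigid-body fit (the Kabsch algorithm). This is a generic sketch under the assumption that the LEDs' 3-D positions have already been triangulated from the two camera views; it is not a reconstruction of the SCOLE code listed in the report:

    ```python
    import numpy as np

    def attitude_from_leds(ref_pts, obs_pts):
        """Best-fit rotation R and translation t mapping reference LED
        positions to their currently observed 3-D positions (Kabsch).

        ref_pts, obs_pts: (3, 3) arrays, one row per (non-collinear) LED.
        """
        ref_c = ref_pts - ref_pts.mean(axis=0)      # center both point sets
        obs_c = obs_pts - obs_pts.mean(axis=0)
        H = ref_c.T @ obs_c                         # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = obs_pts.mean(axis=0) - R @ ref_pts.mean(axis=0)
        return R, t
    ```

    Three non-collinear markers are the minimum needed to fix all six degrees of freedom, which matches the three-LED arrangement on the SCOLE platform.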

  15. Built-in hyperspectral camera for smartphone in visible, near-infrared and middle-infrared lights region (third report): spectroscopic imaging for broad-area and real-time componential analysis system against local unexpected terrorism and disasters

    NASA Astrophysics Data System (ADS)

    Hosono, Satsuki; Kawashima, Natsumi; Wollherr, Dirk; Ishimaru, Ichiro

    2016-05-01

    Distributed networks that collect chemical-component information with high-mobility platforms, such as drones or smartphones, will work effectively for investigating, clarifying, and predicting unexpected local terrorism and disasters such as localized torrential downpours. We previously proposed and reported a spectroscopic line-imager for smartphones at this conference. In this paper, we describe wide-area spectroscopic-image construction by estimating 6 DOF (degrees of freedom: translations x, y, z and rotations θx, θy, θz) from line data in order to observe and analyze the surrounding chemical environment. Recently, smartphone movies taken by people who happened to be at a scene have proved effective for analyzing what kind of phenomenon occurred there, but when a gas tank suddenly blew up, visible-light RGB cameras could not identify which chemical gas components were polluting the surrounding atmosphere. Fourier spectroscopy is well established for chemical-component analysis in laboratory use, but volatile gases should be analyzed promptly at accident sites. Because humidity absorption in the near- and mid-infrared is highly sensitive, we should also be able to detect humidity in the sky from wide-field spectroscopic images. Six-DOF sensors are now readily used to estimate the position and attitude of a UAV (unmanned aerial vehicle) or smartphone, but for long-distance views, angle-measurement accuracy is insufficient to merge line data because of the lever-arm effect. We are therefore attempting to estimate the 6 DOF with high accuracy by searching for corresponding pixels between line spectroscopic images.

  16. A Summer View of Russia's Lena Delta and Olenek

    NASA Technical Reports Server (NTRS)

    2004-01-01

    These views of the Russian Arctic were acquired by NASA's Multi-angle Imaging SpectroRadiometer (MISR) instrument on July 11, 2004, when the brief arctic summer had transformed the frozen tundra and the thousands of lakes, channels, and rivers of the Lena Delta into a fertile wetland, and when the usual blanket of thick snow had melted from the vast plains and taiga forests. This set of three images cover an area in the northern part of the Eastern Siberian Sakha Republic. The Olenek River wends northeast from the bottom of the images to the upper left, and the top portions of the images are dominated by the delta into which the mighty Lena River empties when it reaches the Laptev Sea. At left is a natural color image from MISR's nadir (vertical-viewing) camera, in which the rivers appear murky due to the presence of sediment, and photosynthetically-active vegetation appears green. The center image is also from MISR's nadir camera, but is a false color view in which the predominant red color is due to the brightness of vegetation at near-infrared wavelengths. The most photosynthetically active parts of this area are the Lena Delta, in the lower half of the image, and throughout the great stretch of land that curves across the Olenek River and extends northeast beyond the relatively barren ranges of the Volyoi mountains (the pale tan-colored area to the right of image center).

    The right-hand image is a multi-angle false-color view made from the red band data of the 60° backward, nadir, and 60° forward cameras, displayed as red, green and blue, respectively. Water appears blue in this image because sun glitter makes smooth, wet surfaces look brighter at the forward camera's view angle. Much of the landscape and many low clouds appear purple since these surfaces are both forward and backward scattering, and clouds that are further from the surface appear in a different spot for each view angle, creating a rainbow-like appearance. However, the vegetated region that is darker green in the natural color nadir image also appears to exhibit a faint greenish hue in the multi-angle composite. A possible explanation for this subtle green effect is that the taiga forest trees (or dwarf shrubs) are not too dense here. Since the nadir camera is more likely to observe any gaps between the trees or shrubs, and since the vegetation is not as bright (in the red band) as the underlying soil or surface, the brighter underlying surface results in an area that is relatively brighter at the nadir view angle. Accurate maps of vegetation structural units are an essential part of understanding the seasonal exchanges of energy and water at the Earth's surface, and of preserving the biodiversity in these regions.

    The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82° north and 82° south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 24273. The panels cover an area of about 230 kilometers x 420 kilometers, and utilize data from blocks 30 to 34 within World Reference System-2 path 134.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  17. Multi-spectral imaging with infrared sensitive organic light emitting diode

    PubMed Central

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-01-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive, epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low-cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR-sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589

  18. Multi-spectral imaging with infrared sensitive organic light emitting diode

    NASA Astrophysics Data System (ADS)

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-08-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive, epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low-cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR-sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions.

  19. Built-in hyperspectral camera for smartphone in visible, near-infrared and middle-infrared lights region (second report): sensitivity improvement of Fourier-spectroscopic imaging to detect diffuse reflection lights from internal human tissues for healthcare sensors

    NASA Astrophysics Data System (ADS)

    Kawashima, Natsumi; Hosono, Satsuki; Ishimaru, Ichiro

    2016-05-01

    We proposed the snapshot-type Fourier spectroscopic imaging for smartphones described in the first report at this conference. For spectroscopic component analysis, such as non-invasive blood glucose sensing, the diffuse reflection light from internal human skin is very weak for conventional hyperspectral cameras, such as the AOTF (acousto-optic tunable filter) type. Furthermore, it is well known that the spectral absorption of mid-infrared light, or Raman spectroscopy especially in the long-wavelength region, is effective for quantitatively distinguishing specific biomedical components such as glucose concentration. The main issue, however, is that the photon energies of middle-infrared light and the intensities of Raman scattering are extremely weak. To improve the sensitivity of our spectroscopic imager, we proposed the wide-field-stop and beam-expansion method. Our line spectroscopic imager introduces a single slit as a field stop on the conjugate objective plane. Obviously, a wider slit width in the field stop increases the detected light intensity, at the cost of spatial resolution. Because our method is based on wavefront-division interferometry, the problem is that a wider single slit makes the diffraction angle narrower: the narrower diameter of the collimated objective beams degrades the visibility of the interferograms. By installing a relatively inclined phase shifter on the optical Fourier-transform plane of an infinity-corrected optical system, the two collimated half-fluxes of the objective beam originating from a single bright point on the object surface pass through the wedge prism and the cuboid glass, respectively. These two beams interfere with each other and form an interferogram as a spatial fringe pattern. We therefore installed a concave cylindrical lens between the wider slit and the objective lens as a beam expander. We successfully obtained the spectroscopic signature of hemoglobin from light reflected from human fingers.
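
    In Fourier spectroscopic imaging of this kind, the spectrum is recovered by Fourier-transforming the recorded fringe pattern (the interferogram). A minimal sketch with a synthetic monochromatic interferogram; all values here are illustrative, not taken from the instrument described above:

    ```python
    import math

    # Synthetic interferogram: a single wavelength produces a cosine fringe
    # across the N samples of the OPD axis (values are illustrative).
    N = 256
    k_true = 20                      # fringe frequency in cycles per record
    interferogram = [1.0 + math.cos(2 * math.pi * k_true * n / N) for n in range(N)]

    # Magnitude of the discrete Fourier transform gives the spectrum:
    # a monochromatic source appears as a single spectral peak.
    def dft_mag(x, k):
        re = sum(x[n] * math.cos(2 * math.pi * k * n / len(x)) for n in range(len(x)))
        im = -sum(x[n] * math.sin(2 * math.pi * k * n / len(x)) for n in range(len(x)))
        return math.hypot(re, im)

    # Skip k = 0 (the DC term from the constant offset) and search up to Nyquist.
    spectrum = [dft_mag(interferogram, k) for k in range(1, N // 2)]
    peak_k = 1 + max(range(len(spectrum)), key=spectrum.__getitem__)
    print(peak_k)  # → 20, the recovered fringe frequency
    ```

    The same transform applied per row of the slit image yields one spectrum per spatial position, which is the essence of line spectroscopic imaging.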

  20. Development of plenoptic infrared camera using low dimensional material based photodetectors

    NASA Astrophysics Data System (ADS)

    Chen, Liangliang

    Infrared (IR) sensors have extended imaging from the submicron visible spectrum to wavelengths of tens of microns, and have been widely used in military and civilian applications. Conventional IR cameras based on bulk semiconductor materials suffer from low frame rate, low resolution, temperature dependence, and high cost, whereas nanotechnology based on low-dimensional materials such as the carbon nanotube (CNT) has made considerable progress in research and industry. The unique properties of CNTs motivate the investigation of CNT-based IR photodetectors and imaging systems, addressing the sensitivity, speed, and cooling difficulties of state-of-the-art IR imagers. Reliability and stability are critical to the transition from nanoscience to nanoengineering, especially for infrared sensing: not only for a fundamental understanding of the processes underlying CNT photoresponse, but also for the development of a novel infrared-sensitive material with unique optical and electrical features. In this research, a sandwich-structured sensor was fabricated between two polymer layers: the polyimide substrate isolated the sensor from background noise, and a top parylene coating blocked humid environmental factors. At the same time, the fabrication process was optimized by dielectrophoresis with real-time electrical monitoring and by multiple annealing steps, improving fabrication yield and sensor performance. The nanoscale infrared photodetector was characterized with digital microscopy and a precision linear stage in order to understand it fully. In addition, a low-noise, high-gain readout system was designed together with the CNT photodetector to make a nano-sensor IR camera practical. To capture more of the infrared light field, compressive sensing algorithms were employed for light-field sampling, 3-D imaging, and compressive video sensing. 
The redundancy of the whole light field, including angular images for the light field, binocular images for the 3-D camera, and temporal information in video streams, is extracted and expressed in a compressive representation. Computational algorithms are then applied to reconstruct images beyond 2-D static information. Super-resolution signal processing is subsequently used to enhance the spatial resolution of the images. The complete camera system delivers deeply detailed content for infrared spectrum sensing.

  1. University Physics Students' Ideas of Thermal Radiation Expressed in Open Laboratory Activities Using Infrared Cameras

    ERIC Educational Resources Information Center

    Haglund, Jesper; Melander, Emil; Weiszflog, Matthias; Andersson, Staffan

    2017-01-01

    Background: University physics students were engaged in open-ended thermodynamics laboratory activities with a focus on understanding a chosen phenomenon or the principle of laboratory apparatus, such as thermal radiation and a heat pump. Students had access to handheld infrared (IR) cameras for their investigations. Purpose: The purpose of the…

  2. Noisy Ocular Recognition Based on Three Convolutional Neural Networks.

    PubMed

    Lee, Min Beom; Hong, Hyung Gil; Park, Kang Ryoung

    2017-12-17

    In recent years, iris recognition systems have been gaining increasing acceptance for applications such as access control and smartphone security. When iris images are obtained under unconstrained conditions, image quality is degraded by optical and motion blur, off-angle view (the user's eyes looking somewhere other than straight into the camera), specular reflection (SR), and other factors. Such noisy iris images increase intra-individual variation and, as a result, reduce the accuracy of iris recognition. A typical iris recognition system requires a near-infrared (NIR) illuminator along with an NIR camera, which are larger and more expensive than fingerprint recognition equipment. Hence, many studies have proposed methods using iris images captured by a visible-light camera without the need for an additional illuminator. In this research, we propose a new recognition method for noisy iris and ocular images that uses one iris and two periocular regions, based on three convolutional neural networks (CNNs). Experiments were conducted using the Noisy Iris Challenge Evaluation-Part II (NICE.II) training dataset (selected from the University of Beira Iris (UBIRIS).v2 database), the Mobile Iris Challenge Evaluation (MICHE) database, and the Institute of Automation, Chinese Academy of Sciences (CASIA)-Iris-Distance database. As a result, the method proposed by this study outperformed previous methods.

  3. Mimas Showing False Colors #1

    NASA Technical Reports Server (NTRS)

    2005-01-01

    False color images of Saturn's moon, Mimas, reveal variation in either the composition or texture across its surface.

    During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles).

    The image at the left is a narrow angle clear-filter image, which was separately processed to enhance the contrast in brightness and sharpness of visible features. The image at the right is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined into a single black and white picture that isolates and maps regional color differences. This 'color map' was then superimposed over the clear-filter image at the left.

    The combination of color map and brightness image shows how the color differences across the Mimas surface materials are tied to geological features. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green.

    Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of each image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil.

    The images were obtained when the Cassini spacecraft was above 25 degrees south, 134 degrees west latitude and longitude. The Sun-Mimas-spacecraft angle was 45 degrees and north is at the top.

    The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

    For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .

  4. Kinematic analysis of head, trunk, and pelvic motion during mirror therapy for stroke patients

    PubMed Central

    Kim, Jinmin; Yi, Jaehoon; Song, Chang-Ho

    2017-01-01

    [Purpose] The purpose of this study was to investigate mirror therapy (MT) condition by analyzing kinematic parameters according to mirror size and angle. [Subjects and Methods] Three hemiparesis stroke patients and five healthy adults participated in this cross-sectional study. Kinematic parameters during the MT were collected over a total of 5 trials for each subject (3 mirror angles × 3 mirror sizes). Center of pressure (COP) excursion data was collected by force plate, and other kinematic parameters by infra-red cameras. [Results] The larger the size and smaller the angle, the overall dependent variables decreased in all participants. Particularly, when virtual reality reflection equipment (VRRE) was used, the value of the flexion and the lateral tilt was the closest to the midline compared to all other independent variables. Moreover, it showed tendency of moving towards the affected side. Based on the results, MT for stroke patients has a disadvantage of shifting weight and leaning towards the unaffected side during therapy. [Conclusion] Therefore, it seems to be more effective in terms of clinics to apply VRRE to make up for the weak parts and provide more elaborate visual feedback. PMID:29184290

  5. Kinematic analysis of head, trunk, and pelvic motion during mirror therapy for stroke patients.

    PubMed

    Kim, Jinmin; Yi, Jaehoon; Song, Chang-Ho

    2017-10-01

    [Purpose] The purpose of this study was to investigate mirror therapy (MT) condition by analyzing kinematic parameters according to mirror size and angle. [Subjects and Methods] Three hemiparesis stroke patients and five healthy adults participated in this cross-sectional study. Kinematic parameters during the MT were collected over a total of 5 trials for each subject (3 mirror angles × 3 mirror sizes). Center of pressure (COP) excursion data was collected by force plate, and other kinematic parameters by infra-red cameras. [Results] The larger the size and smaller the angle, the overall dependent variables decreased in all participants. Particularly, when virtual reality reflection equipment (VRRE) was used, the value of the flexion and the lateral tilt was the closest to the midline compared to all other independent variables. Moreover, it showed tendency of moving towards the affected side. Based on the results, MT for stroke patients has a disadvantage of shifting weight and leaning towards the unaffected side during therapy. [Conclusion] Therefore, it seems to be more effective in terms of clinics to apply VRRE to make up for the weak parts and provide more elaborate visual feedback.

  6. Long-Wavelength 640 x 486 GaAs/AlGaAs Quantum Well Infrared Photodetector Snap-Shot Camera

    NASA Technical Reports Server (NTRS)

    Gunapala, Sarath D.; Bandara, Sumith V.; Liu, John K.; Hong, Winn; Sundaram, Mani; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Carralejo, Ronald

    1998-01-01

    A 9-micrometer-cutoff 640 x 486 snapshot quantum well infrared photodetector (QWIP) camera has been demonstrated. The performance of this QWIP camera is reported, including indoor and outdoor imaging. A noise equivalent differential temperature (NEΔT) of 36 mK has been achieved against a 300 K background with f/2 optics. This is in good agreement with the expected focal plane array sensitivity given the practical limitations on the charge-handling capacity of the multiplexer, read noise, bias voltage, and operating temperature.

  7. Good Old Summer Time

    NASA Image and Video Library

    2017-07-31

    Saturn's northern hemisphere reached its summer solstice in mid-2017, bringing continuous sunshine to the planet's far north. The solstice took place on May 24, 2017. The Cassini mission is using the unparalleled opportunity to observe changes that occur on the planet as the Saturnian seasons turn. This view looks toward the sunlit side of the rings from about 17 degrees above the ring plane. The image was taken with the Cassini spacecraft wide-angle camera on April 17, 2017 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 939 nanometers. The view was acquired at a distance of approximately 733,000 miles (1.2 million kilometers) from Saturn. Image scale is 44 miles (70 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21337

  8. Optimization of a miniature short-wavelength infrared objective optics of a short-wavelength infrared to visible upconversion layer attached to a mobile-devices visible camera

    NASA Astrophysics Data System (ADS)

    Kadosh, Itai; Sarusi, Gabby

    2017-10-01

    The use of dual cameras in parallax to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept in which the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night vision capability, while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. In order to maintain commonality of the two cameras, we propose to attach to one of the cameras a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and for the additional upconversion layer, whose thickness is <1 μm. Such a layer should be attached in close proximity to the mobile device's visible-range camera sensor (the CMOS sensor). This paper presents a SWIR objective optical design and optimization that conforms mechanically to the visible objective design but with different lenses, in order to maintain commonality and serve as a proof of concept. Such a SWIR objective design is very challenging, since it requires mimicking the original visible mobile camera lenses' sizes and mechanical housing so that we can adhere to the visible optical and mechanical design. We present an in-depth feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore-optics design.

  9. Voyager spacecraft images of Jupiter and Saturn

    NASA Technical Reports Server (NTRS)

    Birnbaum, M. M.

    1982-01-01

    The Voyager imaging system is described, noting that it is made up of a narrow-angle and a wide-angle TV camera, each in turn consisting of optics, a filter wheel and shutter assembly, a vidicon tube, and an electronics subsystem. The narrow-angle camera has a focal length of 1500 mm; its field of view is 0.42 deg and its focal ratio is f/8.5. For the wide-angle camera, the focal length is 200 mm, the field of view 3.2 deg, and the focal ratio f/3.5. Images are exposed by each camera through one of eight filters in the filter wheel onto the photoconductive surface of a magnetically focused and deflected vidicon 25 mm in diameter. The vidicon storage surface (target) is a selenium-sulfur film with an active area of 11.14 x 11.14 mm; it holds a frame consisting of 800 lines with 800 picture elements per line. Pictures of Jupiter, Saturn, and their moons are presented, with short descriptions of the areas viewed.
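
    The quoted fields of view are consistent with the focal lengths and the 11.14 mm target size via the standard relation FOV = 2·atan(d/2f); a quick check (the geometry is standard, not stated in the record):

    ```python
    import math

    def fov_deg(target_mm, focal_mm):
        """Full field of view of a camera: 2 * atan(d / 2f)."""
        return 2 * math.degrees(math.atan(target_mm / (2 * focal_mm)))

    # Voyager narrow-angle camera: 1500 mm focal length, 11.14 mm target
    print(round(fov_deg(11.14, 1500), 2))   # ≈ 0.43 deg (the quoted 0.42 deg, to rounding)
    # Wide-angle camera: 200 mm focal length
    print(round(fov_deg(11.14, 200), 1))    # ≈ 3.2 deg, as quoted
    ```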

  10. Estimating the Infrared Radiation Wavelength Emitted by a Remote Control Device Using a Digital Camera

    ERIC Educational Resources Information Center

    Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol

    2011-01-01

    The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)
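
    With both patterns captured in one photo, the unknown IR wavelength follows from the grating equation d·sin θ = mλ: the grating spacing d cancels when the red laser line is used as a reference. A sketch with hypothetical measurements (the distances below are invented for illustration, not from the article):

    ```python
    import math

    # Hypothetical measurements from such a photo (illustrative values only):
    # first-order diffraction spots at lateral distance x from the central
    # maximum, with the grating a distance L from the screen/camera.
    L = 100.0          # cm, grating-to-screen distance
    x_red = 12.0       # cm, first-order spot of the red reference laser
    x_ir = 18.0        # cm, first-order spot of the TV remote control
    lam_red = 650e-9   # m, known wavelength of the red laser

    def sin_theta(x, L):
        # sine of the diffraction angle for a spot at lateral distance x
        return x / math.hypot(x, L)

    # Same grating and same order m for both patterns, so d cancels:
    #   lam_ir / lam_red = sin(theta_ir) / sin(theta_red)
    lam_ir = lam_red * sin_theta(x_ir, L) / sin_theta(x_red, L)
    print(round(lam_ir * 1e9))   # estimated IR wavelength in nm
    ```

    With these example numbers the estimate lands near 966 nm, in the range typical of remote-control LEDs (roughly 940-950 nm).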

  11. Feasibility Study of Utilizing Existing Infrared Array Cameras for Daylight Star Tracking on NASA's Ultra Long Duration Balloon (ULDB) Missions

    NASA Technical Reports Server (NTRS)

    Tueller, Jack (Technical Monitor); Fazio, Giovanni G.; Tolls, Volker

    2004-01-01

    The purpose of this study was to investigate the feasibility of developing a daytime star tracker for ULDB flights using a commercially available off-the-shelf infrared array camera. This report describes the system used for ground-based tests, the observations, the test results, and gives recommendations for continued development.

  12. Testing and Evaluation of Low-Light Sensors to Enhance Intelligence, Surveillance, and Reconnaissance (ISR) and Real-Time Situational Awareness

    DTIC Science & Technology

    2009-03-01

    infrared, thermal, or night vision applications. Understanding the true capabilities and limitations of the ALAN camera and its applicability to a ... an option to more expensive infrared, thermal, or night vision applications. Ultimately, it will be clear whether the configuration of the Kestrel ...

  13. Camera Traps Can Be Heard and Seen by Animals

    PubMed Central

    Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps; in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured the ultrasonic (n = 5) and infrared illumination (n = 7) outputs of a subset of the camera trap models. We then compared the perceptive hearing ranges (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine whether animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive hearing range of most mammals and produce illumination that can be seen by many species. PMID:25354356

  14. Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.

    PubMed

    Song, Kai-Tai; Tai, Jen-Chao

    2006-10-01

    Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Interesting experimental results are presented to validate the robustness and accuracy of the proposed method.
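
    Once the focal length and tilt angle are known, image rows can be mapped to ground distances, which is what makes vehicle speed estimation possible. A simplified flat-road sketch using standard inverse perspective geometry (this is illustrative, not the paper's exact formulation; the camera height and all numeric values are assumed):

    ```python
    import math

    def ground_distance(v, cy, f_px, tilt_rad, h_m):
        """Map an image row v to distance along the road from the camera base.

        Flat-ground model: a pixel (v - cy) below the principal point views a
        ray depressed by tilt + atan((v - cy) / f) from the horizontal.
        """
        angle = tilt_rad + math.atan((v - cy) / f_px)
        return h_m / math.tan(angle)

    # Assumed calibration: 1000 px focal length, 10 deg tilt, 8 m mast height
    f_px, tilt, h = 1000.0, math.radians(10.0), 8.0
    cy = 240.0   # principal point row

    # A vehicle tracked from row 300 to row 360 over 0.5 s:
    d1 = ground_distance(300, cy, f_px, tilt, h)
    d2 = ground_distance(360, cy, f_px, tilt, h)
    speed_kmh = abs(d2 - d1) / 0.5 * 3.6
    print(round(d1, 1), round(d2, 1), round(speed_kmh, 1))  # m, m, km/h
    ```

    The sensitivity of the result to f and tilt is exactly why the calibration accuracy discussed above dominates the speed-measurement error.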

  15. Radiometric stability of the Multi-angle Imaging SpectroRadiometer (MISR) following 15 years on-orbit

    NASA Astrophysics Data System (ADS)

    Bruegge, Carol J.; Val, Sebastian; Diner, David J.; Jovanovic, Veljko; Gray, Ellyn; Di Girolamo, Larry; Zhao, Guangyu

    2014-09-01

    The Multi-angle Imaging SpectroRadiometer (MISR) has successfully operated on the EOS/Terra spacecraft since 1999. It consists of nine cameras pointing from nadir to 70.5° view angle, with four spectral channels per camera. Specifications call for a radiometric uncertainty of 3% absolute and 1% relative to the other cameras. To accomplish this, MISR utilizes an on-board calibrator (OBC) to measure camera response changes. Once every two months the two Spectralon panels are deployed to direct sunlight into the cameras. Six photodiode sets measure the illumination level, which is compared to MISR raw digital numbers to determine the radiometric gain coefficients used in Level 1 data processing. Although panel stability is not required, there has been little detectable change in panel reflectance, attributed to careful preflight handling techniques. The cameras themselves have degraded in radiometric response by 10% since launch, but calibration updates using the detector-based scheme have compensated for these drifts and allowed the radiance products to meet accuracy requirements. Validation using Sahara desert observations shows a drift of ~1% in the reported nadir-view radiance over a decade, common to all spectral bands.
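
    At its core, each bimonthly calibration reduces to a linear regression of camera digital numbers against photodiode-measured radiance. A toy version of that fit (the numbers are synthetic stand-ins, not MISR data):

    ```python
    # Ordinary least-squares fit DN = gain * L + offset, the per-channel
    # radiometric calibration relating photodiode radiance L to camera
    # digital number DN. All values below are synthetic, for illustration.
    radiance = [10.0, 20.0, 40.0, 80.0, 160.0]    # W m^-2 sr^-1 um^-1
    dn = [215.0, 420.0, 830.0, 1650.0, 3290.0]    # measured counts

    n = len(radiance)
    mean_L = sum(radiance) / n
    mean_DN = sum(dn) / n
    gain = sum((L - D_mean_prod := 0) or (L - mean_L) * (D - mean_DN)
               for L, D in zip(radiance, dn))  # numerator: covariance sum
    gain /= sum((L - mean_L) ** 2 for L in radiance)
    offset = mean_DN - gain * mean_L

    # Invert the fit to convert raw counts back to calibrated radiance
    def to_radiance(counts):
        return (counts - offset) / gain

    print(round(gain, 2), round(offset, 1))  # → 20.5 10.0 for this data
    ```

    In Level 1 processing the inverse mapping (counts to radiance) is the quantity applied to every pixel; drift in `gain` over the mission is what the bimonthly panel deployments track.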

  16. Mapping the Apollo 17 landing site area based on Lunar Reconnaissance Orbiter Camera images and Apollo surface photography

    NASA Astrophysics Data System (ADS)

    Haase, I.; Oberst, J.; Scholten, F.; Wählisch, M.; Gläser, P.; Karachevtseva, I.; Robinson, M. S.

    2012-05-01

    Newly acquired high resolution Lunar Reconnaissance Orbiter Camera (LROC) images allow accurate determination of the coordinates of Apollo hardware, sampling stations, and photographic viewpoints. In particular, the positions from where the Apollo 17 astronauts recorded panoramic image series, at the so-called “traverse stations”, were precisely determined for traverse path reconstruction. We analyzed observations made in Apollo surface photography as well as orthorectified orbital images (0.5 m/pixel) and Digital Terrain Models (DTMs) (1.5 m/pixel and 100 m/pixel) derived from LROC Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images. Key features captured in the Apollo panoramic sequences were identified in LROC NAC orthoimages. Angular directions of these features were measured in the panoramic images and fitted to the NAC orthoimage by applying least squares techniques. As a result, we obtained the surface panoramic camera positions to within 50 cm. At the same time, the camera orientations, North azimuth angles and distances to nearby features of interest were also determined. Here, initial results are shown for traverse station 1 (northwest of Steno Crater) as well as the Apollo Lunar Surface Experiment Package (ALSEP) area.
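
    Fitting measured angular directions to known orthoimage coordinates is essentially a resection problem: each azimuth constrains the camera to a line through the corresponding feature, and least squares intersects those lines. A toy 2-D version (feature coordinates and the camera position are invented for illustration):

    ```python
    import math

    # Known map coordinates of features identified in the orthoimage (m),
    # and the North azimuth measured to each in the panorama. All numbers
    # are invented; azimuths are synthesized from a known camera position.
    features = [(100.0, 200.0), (250.0, 180.0), (180.0, 50.0)]
    true_cam = (150.0, 120.0)
    azimuths = [math.degrees(math.atan2(x - true_cam[0], y - true_cam[1]))
                for x, y in features]   # x east, y north

    # Each azimuth theta to feature (xi, yi) constrains the camera (x, y):
    #   cos(theta) * x - sin(theta) * y = cos(theta) * xi - sin(theta) * yi
    # Stack the equations and solve the 2x2 normal equations.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (xi, yi), az in zip(features, azimuths):
        c, s = math.cos(math.radians(az)), math.sin(math.radians(az))
        rhs = c * xi - s * yi
        a11 += c * c; a12 += -c * s; a22 += s * s
        b1 += c * rhs; b2 += -s * rhs
    det = a11 * a22 - a12 * a12
    x = (a22 * b1 - a12 * b2) / det
    y = (a11 * b2 - a12 * b1) / det
    print(round(x, 1), round(y, 1))  # → 150.0 120.0, the camera position
    ```

    With real panoramas the azimuths carry measurement noise, so the overdetermined fit (many features per station) is what yields the sub-meter positions reported above.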

  17. High spatial resolution infrared camera as ISS external experiment

    NASA Astrophysics Data System (ADS)

    Eckehard, Lorenz; Frerker, Hap; Fitch, Robert Alan

    A high spatial resolution infrared camera as an ISS external experiment for monitoring global climate change uses ISS internal and external resources (e.g., data storage). The optical experiment will consist of an infrared camera for monitoring global climate change from the ISS. This technology was evaluated by the German small-satellite mission BIRD and further developed in several ESA projects. Compared to BIRD, the presented instrument uses proven, advanced sensor technologies (ISS external) together with ISS on-board processing and storage capabilities (internal). The instrument will be equipped with a serial interface for TM/TC and several relay commands for the power supply. For data processing and storage, a mass memory is required. Access to current attitude data is highly desired in order to produce geo-referenced maps, if possible by on-board processing.

  18. Method for determining and displaying the spacial distribution of a spectral pattern of received light

    DOEpatents

    Bennett, C.L.

    1996-07-23

    An imaging Fourier transform spectrometer is described having a Fourier transform infrared spectrometer providing a series of images to a focal plane array camera. The focal plane array camera is clocked to a multiple of the zero-crossing occurrences caused by a moving mirror of the Fourier transform infrared spectrometer, as detected by a laser detector, such that the frame capture rate of the focal plane array camera corresponds to a multiple of the zero-crossing rate of the Fourier transform infrared spectrometer. The images are transmitted to a computer for processing such that representations of the images, as viewed in the light of an arbitrary spectral "fingerprint" pattern, can be displayed on a monitor or otherwise stored and manipulated by the computer. 2 figs.
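
    Clocking the camera to reference-laser zero crossings samples the interferogram at uniform optical path difference (OPD) steps of half the reference wavelength, which fixes the alias-free band and, together with the number of frames per sweep, the spectral resolution. A back-of-the-envelope sketch (the HeNe reference wavelength and frame count are assumed for illustration, not taken from the patent):

    ```python
    # Zero crossings of the reference-laser fringe occur every half
    # wavelength of OPD, so clocking the focal plane array off them
    # samples the interferogram at uniform OPD steps.
    lam_ref = 632.8e-9            # m, assumed HeNe reference wavelength
    opd_step = lam_ref / 2        # OPD advance between zero crossings

    # Nyquist limit: highest wavenumber measurable without aliasing,
    # which corresponds to a shortest measurable wavelength of lam_ref.
    nyquist_wavenumber = 1 / (2 * opd_step)        # m^-1
    shortest_wavelength = 1 / nyquist_wavenumber   # = lam_ref

    # Spectral resolution is set by the maximum OPD scanned; e.g. a
    # camera clocked for 4096 frames per mirror sweep:
    frames = 4096
    max_opd = frames * opd_step
    resolution_cm1 = 1 / max_opd / 100             # unapodized, in cm^-1
    print(round(shortest_wavelength * 1e9, 1), round(resolution_cm1, 2))
    ```

    Clocking at a multiple of the zero-crossing rate, as the patent allows, simply subdivides `opd_step` and extends the alias-free band accordingly.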

  19. Advanced imaging research and development at DARPA

    NASA Astrophysics Data System (ADS)

    Dhar, Nibir K.; Dat, Ravi

    2012-06-01

    Advances in imaging technology have a huge impact on our daily lives. Innovations in optics, focal plane arrays (FPAs), microelectronics, and computation have revolutionized camera design. As a result, new approaches to camera design and low-cost manufacturing are now possible. These advances are clearly evident in the visible wavelength band, thanks to pixel scaling and improvements in silicon material and CMOS technology; CMOS cameras are available in cell phones and many other consumer products. Advances in infrared imaging technology have been slower due to market volume and many technological barriers in detector materials and optics, as well as fundamental limits imposed by the scaling laws of optics. There is, of course, much room for improvement in both visible and infrared imaging technology. This paper highlights various technology development projects at DARPA that advance imaging technology in both the visible and the infrared. Challenges and potential solutions are highlighted in areas related to wide field-of-view camera design, small-pitch pixels, broadband and multiband detectors, and focal plane arrays.

  20. Lock-in thermography using a cellphone attachment infrared camera

    NASA Astrophysics Data System (ADS)

    Razani, Marjan; Parkhimchyk, Artur; Tabatabaei, Nima

    2018-03-01

    Lock-in thermography (LIT) is a thermal-wave-based, non-destructive testing technique that has been widely utilized in research settings for the characterization and evaluation of biological and industrial materials. However, despite promising research outcomes, the widespread adoption of LIT in industry, and its commercialization, is hindered by the high cost of the infrared cameras used in LIT setups. In this paper, we report on the feasibility of using inexpensive cellphone-attachment infrared cameras for performing LIT. While the cost of such cameras is more than two orders of magnitude less than their research-grade counterparts, our experimental results on a block sample with subsurface defects and a tooth with early dental caries suggest that acceptable performance can be achieved through careful instrumentation and the implementation of proper data acquisition and image processing steps. We anticipate that this study will pave the way for the development of low-cost thermography systems and their commercialization as inexpensive tools for non-destructive testing of industrial samples, as well as affordable clinical devices for diagnostic imaging of biological tissues.
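
    Lock-in demodulation itself is straightforward: each pixel's frame sequence is correlated against sine and cosine references at the modulation frequency, yielding per-pixel amplitude and phase images. A minimal single-pixel sketch (frame rate, modulation frequency, and signal values are all assumed for illustration):

    ```python
    import math

    # Synthetic single-pixel signal: a thermal oscillation at the lock-in
    # frequency f, riding on a DC offset. Illustrative values only.
    f = 1.0                  # Hz, modulation frequency
    fs = 32.0                # Hz, camera frame rate
    n_frames = 256           # an integer number of periods (256/32 = 8)
    amp_true, phase_true = 0.5, math.radians(30.0)
    frames = [10.0 + amp_true * math.cos(2 * math.pi * f * k / fs + phase_true)
              for k in range(n_frames)]

    # Lock-in demodulation: correlate with quadrature references. Over an
    # integer number of periods the DC offset averages out exactly.
    I = (2.0 / n_frames) * sum(s * math.cos(2 * math.pi * f * k / fs)
                               for k, s in enumerate(frames))
    Q = (2.0 / n_frames) * sum(s * -math.sin(2 * math.pi * f * k / fs)
                               for k, s in enumerate(frames))
    amplitude = math.hypot(I, Q)                 # recovers amp_true
    phase_deg = math.degrees(math.atan2(Q, I))   # recovers phase_true
    print(round(amplitude, 3), round(phase_deg, 1))  # → 0.5 30.0
    ```

    The phase image is what reveals subsurface defects: phase is insensitive to surface emissivity variations, which is one reason LIT tolerates the lower radiometric quality of inexpensive cameras.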

  1. 15-micro-m 128 x 128 GaAs/Al(x)Ga(1-x) As Quantum Well Infrared Photodetector Focal Plane Array Camera

    NASA Technical Reports Server (NTRS)

    Gunapala, Sarath D.; Park, Jin S.; Sarusi, Gabby; Lin, True-Lon; Liu, John K.; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Hoelter, Ted

    1997-01-01

    In this paper, we discuss the development of very sensitive, very long wavelength infrared GaAs/Al(x)Ga(1-x)As quantum well infrared photodetectors (QWIPs) based on the bound-to-quasi-bound intersubband transition, the fabrication of random reflectors for efficient light coupling, and the demonstration of a 15-micrometer-cutoff 128 x 128 focal plane array imaging camera. Excellent imagery, with a noise equivalent differential temperature (NEΔT) of 30 mK, has been achieved.

  2. Image processing with cellular nonlinear networks implemented on field-programmable gate arrays for real-time applications in nuclear fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palazzo, S.; Vagliasindi, G.; Arena, P.

    2010-08-15

    In the past years, cameras have become increasingly common tools in scientific applications. They are now quite systematically used in magnetic confinement fusion, to the point that infrared imaging is starting to be used systematically for real-time machine protection in major devices. However, in order to guarantee that the control system can always react rapidly in critical situations, the time required for processing the images must be as predictable as possible. The approach described in this paper combines the new computational paradigm of cellular nonlinear networks (CNNs) with field-programmable gate arrays and has been tested in an application for the detection of hot spots on the plasma-facing components in JET. The developed system is able to perform real-time hot spot recognition by processing the image stream captured by JET's wide-angle infrared camera, with the guarantee that computational time is constant and deterministic. The statistical results obtained from a quite extensive set of examples show that this solution approximates very well an ad hoc serial software algorithm, with no false or missed alarms and an almost perfect overlap of alarm intervals. The computational time can be reduced to a millisecond time scale for 8-bit 496x560 images. Moreover, in our implementation, the computational time, besides being deterministic, is practically independent of the number of iterations performed by the CNN, unlike software CNN implementations.

  3. The NASA - Arc 10/20 micron camera

    NASA Technical Reports Server (NTRS)

    Roellig, T. L.; Cooper, R.; Deutsch, L. K.; Mccreight, C.; Mckelvey, M.; Pendleton, Y. J.; Witteborn, F. C.; Yuen, L.; Mcmahon, T.; Werner, M. W.

    1994-01-01

    A new infrared camera (AIR Camera) has been developed at NASA - Ames Research Center for observations from ground-based telescopes. The heart of the camera is a Hughes 58 x 62 pixel arsenic-doped silicon detector array that has the spectral sensitivity range to allow observations in both the 10 and 20 micron atmospheric windows.

  4. A hands-free region-of-interest selection interface for solo surgery with a wide-angle endoscope: preclinical proof of concept.

    PubMed

    Jung, Kyunghwa; Choi, Hyunseok; Hong, Hanpyo; Adikrishna, Arnold; Jeon, In-Ho; Hong, Jaesung

    2017-02-01

    A hands-free region-of-interest (ROI) selection interface is proposed for solo surgery using a wide-angle endoscope. A wide-angle endoscope provides images with a larger field of view than a conventional endoscope. With an appropriate selection interface for a ROI, surgeons can also obtain a detailed local view as if they had moved a conventional endoscope to a specific position and direction. To manipulate the endoscope without releasing the surgical instrument in hand, a mini-camera is attached to the instrument, and the images taken by the attached camera are analyzed. When a surgeon moves the instrument, the instrument orientation is calculated by image processing. Surgeons can select the ROI with this instrument movement after switching from 'task mode' to 'selection mode.' The accelerated KAZE algorithm is used to track the features of the camera images once the instrument is moved. Both the wide-angle and detailed local views are displayed simultaneously, and a surgeon can move the local view area by moving the mini-camera attached to the surgical instrument. Local view selection for a solo surgery was performed without releasing the instrument. The accuracy of camera pose estimation was not significantly different between camera resolutions, but it was significantly different between background camera images with different numbers of features (P < 0.01). The success rate of ROI selection diminished as the number of separated regions increased. However, up to 12 separated regions with a region size of 160 × 160 pixels were selected with no failure. Surgical tasks on a phantom model and a cadaver were attempted to verify the feasibility in a clinical environment. Hands-free endoscope manipulation without releasing the instruments in hand was achieved. The proposed method requires only a small, low-cost camera and image processing. The technique enables surgeons to perform solo surgeries without a camera assistant.

  5. SCC500: next-generation infrared imaging camera core products with highly flexible architecture for unique camera designs

    NASA Astrophysics Data System (ADS)

    Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott

    2003-09-01

    A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product enabling easy, drop-in camera designs for quick time-to-market product introductions.

  6. ARC-1990-AC79-7127

    NASA Image and Video Library

    1990-02-14

    Range: 4 billion miles from Earth, at 32 degrees to the ecliptic. P-36057C This color image of the Sun, Earth, and Venus is one of the first, and maybe only, images that show our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time, with narrow-angle camera frames centered on each planet. The wide-angle image was taken with the camera's darkest filter, a methane absorption band, and the shortest possible exposure, one two-hundredth of a second, to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system, yet it is still eight times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of the brightness is a bright, burned-out image with multiple reflections from the optics of the camera. The rays around the sun are a diffraction pattern of the calibration lamp, which is mounted in front of the wide-angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaicked into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce the color image. The violet, green, and blue filters were used, with exposure times of 0.72, 0.48, and 0.72 seconds for Earth, and 0.36, 0.24, and 0.36 seconds for Venus. The images also show long linear streaks resulting from scattering of sunlight off parts of the camera and its shade.

  7. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.

  8. PNIC - A near infrared camera for testing focal plane arrays

    NASA Astrophysics Data System (ADS)

    Hereld, Mark; Harper, D. A.; Pernic, R. J.; Rauscher, Bernard J.

    1990-07-01

    This paper describes the design and the performance of the Astrophysical Research Consortium prototype near-infrared camera (pNIC) designed to test focal plane arrays both on and off the telescope. Special attention is given to the detector in pNIC, the mechanical and optical designs, the electronics, and the instrument interface. Experiments performed to illustrate the most salient aspects of pNIC are described.

  9. POLICAN: A Near-infrared Imaging Polarimeter at the 2.1m OAGH Telescope

    NASA Astrophysics Data System (ADS)

    Devaraj, R.; Luna, A.; Carrasco, L.; Vázquez-Rodríguez, M. A.; Mayya, Y. D.; Tánori, J. G.; Serrano Bernal, E. O.

    2018-05-01

    POLICAN is a near-infrared imaging linear polarimeter developed for the Cananea Near-infrared Camera (CANICA) at the 2.1 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. POLICAN is mounted ahead of CANICA and consists of a rotating super-achromatic (1–2.7 μm) half-wave plate (HWP) as the modulator and a fixed wire-grid polarizer as the analyzer. CANICA has a 1024 × 1024 HgCdTe detector with a plate scale of 0.32 arcsec/pixel and provides a field of view of 5.5 × 5.5 arcmin2. The polarimetric observations are carried out by modulating the incoming light through different steps of half-wave plate angles (0°, 22.5°, 45°, 67.5°) to establish the linear Stokes parameters (I, Q, and U). Image reduction consists of dark subtraction, polarimetric flat fielding, and sky subtraction. The astrometry and photometric calibrations are performed using the publicly available data from the Two Micron All Sky Survey. Polarimetric calibration includes observations of globular clusters and polarization standards available in the literature. Analysis of multiple observations of globular clusters yielded an instrumental polarization of 0.51%. Uncertainties in polarization range from 0.1% to 10% from the brightest 7 mag to faintest 16 mag stars. The polarimetric accuracy achieved is better than 0.5% and the position angle errors less than 5° for stars brighter than 13 mag in H-band. POLICAN is mainly being used to study the scattered polarization and magnetic fields in and around star-forming regions of the interstellar medium.
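    The four-step HWP modulation described above reduces to simple arithmetic: with a rotating half-wave plate and a fixed analyzer, the measured intensity at HWP angle t is 0.5·(I + Q·cos 4t + U·sin 4t), so the four angles yield I, Q, and U directly. A minimal sketch of that reduction (the input counts below are invented for illustration, not POLICAN data):

```python
import math

# Hedged sketch of Stokes-parameter recovery from the four HWP steps
# (0, 22.5, 45, 67.5 degrees) of a rotating-HWP, fixed-analyzer polarimeter.
def stokes_from_hwp(i0, i22, i45, i67):
    I = i0 + i45              # also equals i22 + i67 in an ideal system
    Q = i0 - i45
    U = i22 - i67
    p = math.hypot(Q, U) / I                      # degree of linear polarization
    theta = 0.5 * math.degrees(math.atan2(U, Q))  # polarization position angle
    return I, Q, U, p, theta

# Illustrative counts at the four HWP angles:
I, Q, U, p, theta = stokes_from_hwp(600.0, 550.0, 400.0, 450.0)
print(round(p, 3), round(theta, 1))
```

    The redundancy (I can be formed from either angle pair) is one way such instruments check for sky variation between modulation steps.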

  10. Thermographic measurements of high-speed metal cutting

    NASA Astrophysics Data System (ADS)

    Mueller, Bernhard; Renz, Ulrich

    2002-03-01

    Thermographic measurements of a high-speed cutting process have been performed with an infrared camera. To realize images without motion blur, the integration times were reduced to a few microseconds. Since the high tool wear influences the measured temperatures, a set-up has been realized which enables small cutting lengths. Only single images have been recorded because the process is too fast to acquire a sequence of images even with the frame rate of the very fast infrared camera which has been used. To expose the camera when the rotating tool is in the middle of the camera image, an experimental set-up with a light barrier and a digital delay generator with a time resolution of 1 ns has been realized. This enables a very exact triggering of the camera at the desired position of the tool in the image. Since the cutting depth is between 0.1 and 0.2 mm, a high spatial resolution was also necessary, which was obtained by a special close-up lens allowing a resolution of approximately 45 µm. The experimental set-up will be described, and infrared images and evaluated temperatures of a titanium alloy and a carbon steel will be presented for cutting speeds up to 42 m/s.

  11. Reductions in injury crashes associated with red light camera enforcement in Oxnard, California.

    PubMed

    Retting, Richard A; Kyrychenko, Sergey Y

    2002-11-01

    This study estimated the impact of red light camera enforcement on motor vehicle crashes in one of the first US communities to employ such cameras: Oxnard, California. Crash data were analyzed for Oxnard and for 3 comparison cities. Changes in crash frequencies were compared for Oxnard and control cities and for signalized and nonsignalized intersections by means of a generalized linear regression model. Overall, crashes at signalized intersections throughout Oxnard were reduced by 7% and injury crashes were reduced by 29%. Right-angle crashes, those most associated with red light violations, were reduced by 32%; right-angle crashes involving injuries were reduced by 68%. Because red light cameras can be a permanent component of the transportation infrastructure, crash reductions attributed to camera enforcement should be sustainable.

  12. Confocal retinal imaging using a digital light projector with a near infrared VCSEL source

    NASA Astrophysics Data System (ADS)

    Muller, Matthew S.; Elsner, Ann E.

    2018-02-01

    A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1" LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging.

  13. In-situ calibration of nonuniformity in infrared staring and modulated systems

    NASA Astrophysics Data System (ADS)

    Black, Wiley T.

    Infrared cameras can directly measure the apparent temperature of objects, providing thermal imaging. However, the raw output from most infrared cameras suffers from a strong, often limiting noise source called nonuniformity. Manufacturing imperfections in infrared focal planes lead to high pixel-to-pixel sensitivity to electronic bias, focal plane temperature, and other effects. The resulting imagery can only provide useful thermal imaging after a nonuniformity calibration has been performed. Traditionally, these calibrations are performed by momentarily blocking the field of view with a flat-temperature plate or blackbody cavity. However, because the pattern is a coupling of manufactured sensitivities with operational variations, periodic recalibration is required, sometimes on the order of tens of seconds. A class of computational methods called Scene-Based Nonuniformity Correction (SBNUC) has been researched for over 20 years, in which the nonuniformity calibration is estimated in digital processing by analysis of the video stream in the presence of camera motion. The most sophisticated SBNUC methods can completely and robustly eliminate the high-spatial-frequency component of nonuniformity with only an initial reference calibration or potentially no physical calibration. I will demonstrate a novel algorithm that advances these SBNUC techniques to support all spatial frequencies of nonuniformity correction. Long-wave infrared microgrid polarimeters are a class of camera that incorporate a microscale per-pixel wire-grid polarizer directly affixed to each pixel of the focal plane. These cameras have the capability of simultaneously measuring thermal imagery and polarization in a robust integrated package with no moving parts. I will describe the necessary adaptations of my SBNUC method to operate on this class of sensor as well as demonstrate SBNUC performance in LWIR polarimetry video collected on the UA mall.
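    The abstract's algorithm is not given, but the simplest member of the SBNUC family it builds on is the constant-statistics method: under sufficient camera motion every pixel sees a similar distribution of scene radiance, so each pixel's temporal mean estimates its fixed-pattern offset. A minimal sketch under that assumption (offset-only, no gain term):

```python
# Constant-statistics SBNUC sketch (an assumption for illustration; the
# dissertation's algorithm is more sophisticated and corrects all spatial
# frequencies). Frames are 2D lists of raw counts from a moving camera.

def estimate_offsets(frames):
    """Per-pixel offset = pixel's temporal mean minus the global mean."""
    rows, cols = len(frames[0]), len(frames[0][0])
    n = len(frames)
    means = [[sum(f[r][c] for f in frames) / n for c in range(cols)]
             for r in range(rows)]
    global_mean = sum(map(sum, means)) / (rows * cols)
    return [[m - global_mean for m in row] for row in means]

def correct(frame, offsets):
    """Subtract the estimated fixed-pattern offset from one frame."""
    return [[frame[r][c] - offsets[r][c] for c in range(len(frame[0]))]
            for r in range(len(frame))]
```

    For example, frames built as a uniform scene plus a zero-mean fixed pattern are restored to the uniform scene after correction. Real scenes violate the constant-statistics assumption locally, which is why more advanced SBNUC methods exploit inter-frame motion registration instead.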

  14. Small Wonders

    NASA Image and Video Library

    2017-06-28

    This montage of views from NASA's Cassini spacecraft shows three of Saturn's small ring moons: Atlas, Daphnis and Pan at the same scale for ease of comparison. Two differences between Atlas and Pan are obvious in this montage. Pan's equatorial band is much thinner and more sharply defined, and the central mass of Atlas (the part underneath the smooth equatorial band) appears to be smaller than that of Pan. Images of Atlas and Pan taken using infrared, green and ultraviolet spectral filters were combined to create enhanced-color views, which highlight subtle color differences across the moons' surfaces at wavelengths not visible to human eyes. (The Daphnis image was colored using the same green filter image for all three color channels, adjusted to have a realistic appearance next to the other two moons.) All of these images were taken using the Cassini spacecraft narrow-angle camera. The images of Atlas were acquired on April 12, 2017, at a distance of 10,000 miles (16,000 kilometers) and at a sun-moon-spacecraft angle (or phase angle) of 37 degrees. The images of Pan were taken on March 7, 2017, at a distance of 16,000 miles (26,000 kilometers) and a phase angle of 21 degrees. The Daphnis image was obtained on Jan. 16, 2017, at a distance of 17,000 miles (28,000 kilometers) and at a phase angle of 71 degrees. All images are oriented so that north is up. A monochrome version is available at https://photojournal.jpl.nasa.gov/catalog/PIA21449

  15. Comparative Study of Convective Heat Transfer Performance of Steam and Air Flow in Rib Roughened Channels

    NASA Astrophysics Data System (ADS)

    Ma, Chao; Ji, Yongbin; Ge, Bing; Zang, Shusheng; Chen, Hua

    2018-04-01

    A comparative experimental study of heat transfer characteristics of steam and air flow in rectangular channels roughened with parallel ribs was conducted by using an infrared camera. Effects of Reynolds numbers and rib angles on the steam and air convective heat transfer have been obtained and compared with each other for Reynolds numbers from about 4,000 to 15,000. For all the ribbed channels the rib pitch to height ratio (p/e) is 10, and the rib height to channel hydraulic diameter ratio is 0.078, while the rib angles are varied from 90° to 45°. Based on the experimental results, it can be found that, even though the heat transfer distributions of steam and air flow in the ribbed channels are similar to each other, the steam flow can obtain higher convective heat transfer enhancement capability, and the heat transfer enhancement of both the steam and air becomes greater with the rib angle decreasing from 90° to 45°. At a Reynolds number of about 12,000, the area-averaged Nusselt numbers of the steam flow are about 13.9%, 14.2%, 19.9% and 23.9% higher than those of the air flow for the rib angles of 90°, 75°, 60° and 45° respectively. With the experimental results, correlations for Nusselt number in terms of Reynolds number and rib angle for the steam and air flow in the ribbed channels were developed respectively.
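    In IR-thermography experiments of this kind the camera supplies the wall temperature field, from which a local Nusselt number follows as Nu = h·Dh/k with h = q''/(Tw − Tb). A minimal sketch of that reduction (the heat flux, temperatures, and fluid properties below are invented for illustration, not values from this experiment):

```python
# Illustrative reduction of IR-thermography data to a local Nusselt number,
# assuming a constant-heat-flux heated wall.
def nusselt(q_flux, t_wall, t_bulk, d_hydraulic, k_fluid):
    """q_flux in W/m^2, temperatures in deg C (or K), lengths in m, k in W/mK."""
    h = q_flux / (t_wall - t_bulk)      # convective coefficient, W/m^2K
    return h * d_hydraulic / k_fluid    # dimensionless Nusselt number

Nu = nusselt(q_flux=5000.0, t_wall=120.0, t_bulk=100.0,
             d_hydraulic=0.015, k_fluid=0.025)
print(round(Nu, 1))  # 150.0
```

    Area-averaging Nu over the ribbed wall, as done in the abstract, then permits the steam-vs-air percentage comparisons at matched Reynolds number.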

  16. Dark Spots on Titan

    NASA Image and Video Library

    2005-05-02

    This recent image of Titan reveals more complex patterns of bright and dark regions on the surface, including a small, dark, circular feature, completely surrounded by brighter material. During the two most recent flybys of Titan, on March 31 and April 16, 2005, Cassini captured a number of images of the hemisphere of Titan that faces Saturn. The image at the left is taken from a mosaic of images obtained in March 2005 (see PIA06222) and shows the location of the more recently acquired image at the right. The new image shows intriguing details in the bright and dark patterns near an 80-kilometer-wide (50-mile) crater seen first by Cassini's synthetic aperture radar experiment during a Titan flyby in February 2005 (see PIA07368) and subsequently seen by the imaging science subsystem cameras as a dark spot (center of the image at the left). Interestingly, a smaller, roughly 20-kilometer-wide (12-mile), dark and circular feature can be seen within an irregularly shaped, brighter ring, and is similar to the larger dark spot associated with the radar crater. However, the imaging cameras see only brightness variations, and without topographic information, the identity of this feature as an impact crater cannot be conclusively determined from this image. The visual and infrared mapping spectrometer, which is sensitive to longer wavelengths where Titan's atmospheric haze is less obscuring, observed this area simultaneously with the imaging cameras, so those data, and perhaps future observations by Cassini's radar, may help to answer the question of this feature's origin. The new image at the right consists of five images that have been added together and enhanced to bring out surface detail and to reduce noise, although some camera artifacts remain.
These images were taken with the Cassini spacecraft narrow-angle camera using a filter sensitive to wavelengths of infrared light centered at 938 nanometers -- considered to be the imaging science subsystem's best spectral filter for observing the surface of Titan. This view was acquired from a distance of 33,000 kilometers (20,500 miles). The pixel scale of this image is 390 meters (0.2 miles) per pixel, although the actual resolution is likely to be several times larger. http://photojournal.jpl.nasa.gov/catalog/PIA06234

  17. Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA

    NASA Astrophysics Data System (ADS)

    Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.

    We present the final optical design of PANIC (PAnoramic Near Infrared Camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well defined internal pupil, available for reducing the thermal background with a cryogenic pupil stop. A mosaic of four Teledyne HAWAII-2RG 2k × 2k detectors will give a field of view of 31.9 arcmin × 31.9 arcmin.

  18. Multi-channel automotive night vision system

    NASA Astrophysics Data System (ADS)

    Lu, Gang; Wang, Li-jun; Zhang, Yi

    2013-09-01

    A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image-processing display unit; the cameras are placed at the front, left, right and rear of the automobile. The system uses a near-infrared laser light source whose beam is collimated; the light source contains a thermoelectric cooler (TEC), can be synchronized with the camera focusing, and also has automatic light intensity adjustment, which together ensure the image quality. The principle and composition of the system are described in detail; on this basis, beam collimation, the LD driving and LD temperature control of the near-infrared laser light source, and the four-channel image-processing display are discussed. The system can be used for driver assistance, car BLIS, car parking assist and car alarm systems by day and night.

  19. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  20. Imaging of breast cancer with mid- and long-wave infrared camera.

    PubMed

    Joro, R; Lääperi, A-L; Dastidar, P; Soimakallio, S; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Järvenpää, R

    2008-01-01

    In this novel study the breasts of 15 women with palpable breast cancer were preoperatively imaged with three technically different infrared (IR) cameras (microbolometer, MB; quantum well, QWIP; and photovoltaic, PV) to compare their ability to differentiate breast cancer from normal tissue. The IR images were processed, the data for frequency analysis were collected from dynamic IR images by pixel-based analysis, and from each image a selectively windowed regional analysis was carried out, based on the angiogenesis and nitric oxide production of cancer tissue causing vasomotor and cardiogenic frequency differences compared to normal tissue. Our results show that the GaAs QWIP camera and the InSb PV camera demonstrate the frequency difference between normal and cancerous breast tissue, the PV camera more clearly. With selected image processing operations, more detailed frequency analyses could be applied to the suspicious area. The MB camera was not suitable for tissue differentiation, as the difference between noise and effective signal was unsatisfactory.

  1. Optimising Camera Traps for Monitoring Small Mammals

    PubMed Central

    Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats ( Mustela erminea ), feral cats (Felis catus) and hedgehogs ( Erinaceus europaeus ). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

  2. Firefly: A HOT camera core for thermal imagers with enhanced functionality

    NASA Astrophysics Data System (ADS)

    Pillans, Luke; Harmer, Jack; Edwards, Tim

    2015-06-01

    Raising the operating temperature of mercury cadmium telluride infrared detectors from 80K to above 160K creates new applications for high performance infrared imagers by vastly reducing the size, weight and power consumption of the integrated cryogenic cooler. Realizing the benefits of Higher Operating Temperature (HOT) requires a new kind of infrared camera core with the flexibility to address emerging applications in handheld, weapon mounted and UAV markets. This paper discusses the Firefly core developed to address these needs by Selex ES in Southampton, UK. Firefly represents a fundamental redesign of the infrared signal chain, reducing power consumption and providing compatibility with low cost, low power Commercial Off-The-Shelf (COTS) computing technology. This paper describes key innovations in this signal chain: a ROIC purpose built to minimize power consumption in the proximity electronics, GPU based image processing of infrared video, and a software customisable infrared core which can communicate wirelessly with other Battlespace systems.

  3. Don't get burned: thermal monitoring of vessel sealing using a miniature infrared camera

    NASA Astrophysics Data System (ADS)

    Lin, Shan; Fichera, Loris; Fulton, Mitchell J.; Webster, Robert J.

    2017-03-01

    Miniature infrared cameras have recently come to market in a form factor that facilitates packaging in endoscopic or other minimally invasive surgical instruments. If absolute temperature measurements can be made with these cameras, they may be useful for non-contact monitoring of electrocautery-based vessel sealing, or other thermal surgical processes like thermal ablation of tumors. As a first step in evaluating the feasibility of optical medical thermometry with these new cameras, in this paper we explore how well thermal measurements can be made with them. These cameras measure the raw flux of incoming IR radiation, and we perform a calibration procedure to map their readings to absolute temperature values in the range between 40 and 150 °C. Furthermore, we propose and validate a method to estimate the spatial extent of heat spread created by a cautery tool based on the thermal images.
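    The calibration step described above maps raw radiometric counts to absolute temperature over the 40-150 °C range. The abstract does not give the fitted model; a minimal least-squares sketch under the assumption of an approximately linear counts-to-temperature response (the counts and temperatures below are invented for illustration):

```python
# Hypothetical least-squares calibration mapping raw IR sensor counts to
# blackbody temperature. Real sensors are often mildly nonlinear, in which
# case a higher-order polynomial or Planck-based model would replace this.
def fit_linear(counts, temps):
    """Ordinary least-squares fit temps ~ slope * counts + intercept."""
    n = len(counts)
    sx, sy = sum(counts), sum(temps)
    sxx = sum(x * x for x in counts)
    sxy = sum(x * y for x, y in zip(counts, temps))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Calibration points: (raw counts, blackbody reference temperature in deg C)
counts = [1000.0, 2000.0, 3000.0]
temps = [40.0, 95.0, 150.0]
a, b = fit_linear(counts, temps)
print(round(a * 2500.0 + b, 1))  # predicted temperature at 2500 counts: 122.5
```

    Once the fit is validated against held-out blackbody measurements, the same mapping applied per pixel yields the absolute-temperature images used to monitor heat spread around the cautery tool.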

  4. The Effect of Mediated Camera Angle on Receiver Evaluations of Source Credibility, Dominance, Attraction and Homophily.

    ERIC Educational Resources Information Center

    Beverly, Robert E.; Young, Thomas J.

    Two hundred forty college undergraduates participated in a study of the effect of camera angle on an audience's perceptual judgments of source credibility, dominance, attraction, and homophily. The subjects were divided into four groups and each group was shown a videotape presentation in which sources had been videotaped according to one of four…

  5. High-Resolution Mars Camera Test Image of Moon (Infrared)

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The mission's High Resolution Imaging Science Experiment camera took the image on Sept. 8, 2005, while at a distance of about 10 million kilometers (6 million miles) from the Moon. The dark feature on the right is Mare Crisium. From that distance, the Moon would appear as a star-like point of light to the unaided eye. The test verified the camera's focusing capability and provided an opportunity for calibration. The spacecraft's Context Camera and Optical Navigation Camera also performed as expected during the test.

    The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across.

  6. 3D bubble reconstruction using multiple cameras and space carving method

    NASA Astrophysics Data System (ADS)

    Fu, Yucheng; Liu, Yang

    2018-07-01

    An accurate measurement of bubble shape and size has a significant value in understanding the behavior of bubbles that exist in many engineering applications. Past studies usually use one or two cameras to estimate bubble volume, surface area, among other parameters. The 3D bubble shape and rotation angle are generally not available in these studies. To overcome this challenge and obtain more detailed information of individual bubbles, a 3D imaging system consisting of four high-speed cameras is developed in this paper, and the space carving method is used to reconstruct the 3D bubble shape based on the recorded high-speed images from different view angles. The proposed method can reconstruct the bubble surface with minimal assumptions. A benchmarking test is performed in a 3 cm  ×  1 cm rectangular channel with stagnant water. The results show that the newly proposed method can measure the bubble volume with an error of less than 2% compared with the syringe reading. The conventional two-camera system has an error around 10%. The one-camera system has an error greater than 25%. The visualization of a 3D bubble rising demonstrates the wall influence on bubble rotation angle and aspect ratio. This also explains the large error that exists in the single camera measurement.
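    Space carving keeps a voxel only if its projection falls inside the bubble silhouette in every camera view. A deliberately simplified sketch of that test (assumptions for illustration: two orthographic views along the x and y axes and binary silhouettes; the actual system uses four calibrated high-speed cameras with full projective geometry):

```python
# Minimal space-carving sketch on an n^3 voxel grid.
def carve(grid_n, silhouettes):
    """silhouettes: dict axis -> set of (u, v) pixels covered by the bubble."""
    kept = set()
    for x in range(grid_n):
        for y in range(grid_n):
            for z in range(grid_n):
                # a voxel survives only if every view sees it inside the silhouette
                if ((y, z) in silhouettes['x'] and
                        (x, z) in silhouettes['y']):
                    kept.add((x, y, z))
    return kept

# A 2x2x2 "bubble" inside a 4^3 grid, seen from the two directions:
sil = {'x': {(y, z) for y in (1, 2) for z in (1, 2)},
       'y': {(x, z) for x in (1, 2) for z in (1, 2)}}
voxels = carve(4, sil)
print(len(voxels))  # 8 voxels survive carving
```

    Bubble volume then follows as the surviving voxel count times the voxel volume; adding views tightens the carved hull, which is consistent with the reported error falling from over 25% (one camera) to under 2% (four cameras).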

  7. Oil Fire Plumes Over Baghdad

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Dark smoke from oil fires extends for about 60 kilometers south of Iraq's capital city of Baghdad in these images, acquired by the Multi-angle Imaging SpectroRadiometer (MISR) on April 2, 2003. The thick, almost black smoke is apparent near image center and contains chemical and particulate components hazardous to human health and the environment.

    The top panel is from MISR's vertical-viewing (nadir) camera. Vegetated areas appear red here because this display is constructed using near-infrared, red and blue band data, displayed as red, green and blue, respectively, to produce a false-color image. The bottom panel is a combination of two camera views of the same area and is a 3-D stereo anaglyph in which red band nadir camera data are displayed as red, and red band data from the 60-degree backward-viewing camera are displayed as green and blue. Both panels are oriented with north to the left in order to facilitate stereo viewing. Viewing the 3-D anaglyph with red/blue glasses (with the red filter placed over the left eye and the blue filter over the right) makes it possible to see the rising smoke against the surface terrain. This technique helps to distinguish features in the atmosphere from those on the surface. In addition to the smoke, several high, thin cirrus clouds (barely visible in the nadir view) are readily observed using the stereo image.
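The anaglyph recipe in the caption (red-band nadir data in the red channel, red-band data from the backward-viewing camera in both green and blue) amounts to a simple channel stack. A sketch with hypothetical co-registered arrays:

```python
import numpy as np

def misr_style_anaglyph(nadir_red, backward_red):
    """Build a red/cyan stereo anaglyph the way the caption describes:
    red-band nadir data as red, red-band data from the backward-viewing
    camera as both green and blue."""
    return np.dstack([nadir_red, backward_red, backward_red])

# Hypothetical 2x2 co-registered red-band images (0-255 counts)
nadir = np.array([[10, 20], [30, 40]], dtype=np.uint8)
backward = np.array([[50, 60], [70, 80]], dtype=np.uint8)
ana = misr_style_anaglyph(nadir, backward)
# ana[..., 0] comes from the nadir view; ana[..., 1] and ana[..., 2]
# come from the backward view, so red/blue glasses separate the two views
```

Because the two views see the scene from different along-track angles, elevated features such as rising smoke are displaced between the red and cyan channels, which is what produces the depth impression behind the glasses.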

    The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 17489. The panels cover an area of about 187 kilometers x 123 kilometers, and use data from blocks 63 to 65 within World Reference System-2 path 168.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory,Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  8. Automatic helmet-wearing detection for law enforcement using CCTV cameras

    NASA Astrophysics Data System (ADS)

    Wonghabut, P.; Kumphong, J.; Satiennam, T.; Ung-arunyawee, R.; Leelapatra, W.

    2018-04-01

    The objective of this research is to develop an application for enforcing helmet wearing using CCTV cameras. The developed application aims to assist law enforcement by police, eventually changing risk behaviours and consequently reducing the number of accidents and their severity. Conceptually, the application software, implemented in C++ with the OpenCV library, uses two CCTV cameras with different angles of view. Video frames recorded by the wide-angle CCTV camera are used to detect motorcyclists. If a motorcyclist without a helmet is found, the zoomed (narrow-angle) CCTV camera is activated to capture images of the violating motorcyclist and the motorcycle license plate in real time. Captured images are managed by a MySQL database for ticket issuing. The results show that the developed program is able to detect 81% of motorcyclists on various motorcycle types during daytime and night-time. The validation results reveal that the program achieves 74% accuracy in detecting motorcyclists without helmets.

  9. Use of a microscope-mounted wide-angle point of view camera to record optimal hand position in ocular surgery.

    PubMed

    Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K

    2014-07-01

    We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  10. Optical and Acoustic Sensor-Based 3D Ball Motion Estimation for Ball Sport Simulators †.

    PubMed

    Seo, Sang-Woo; Kim, Myunggyu; Kim, Yejin

    2018-04-25

    Estimation of the motion of ball-shaped objects is essential for the operation of ball sport simulators. In this paper, we propose a system for estimating 3D ball motion, including speed and angle of projection, by using acoustic vector and infrared (IR) scanning sensors. Our system comprises three steps: sound-based ball firing detection, sound source localization, and IR scanning for motion analysis. First, an impulsive sound classification based on the mel-frequency cepstrum and a feed-forward neural network is introduced to detect the ball launch sound. An impulsive sound source localization using a 2D microelectromechanical system (MEMS) microphone array and delay-and-sum beamforming is presented to estimate the firing position. The time and position of the ball in 3D space are then determined from a high-speed infrared scanning method. Our experimental results demonstrate that sound-based estimation of ball motion allows a wider activity area than comparable camera-based methods. Thus, it can be practically applied to various simulations in sports such as soccer and baseball.
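The delay-and-sum localization step can be illustrated with a minimal sketch: delay each microphone channel by the amount a candidate direction predicts, sum, and pick the direction with the most coherent (highest-power) output. This assumes a 1D linear array and far-field plane waves for simplicity, not the authors' 2D MEMS arrangement:

```python
import numpy as np

def delay_and_sum_doa(signals, mic_x, fs, c=343.0,
                      angles=np.linspace(-90.0, 90.0, 181)):
    """Steer a linear microphone array over candidate angles and return the
    angle (degrees) whose aligned-and-summed output has the most power."""
    n = signals.shape[1]
    t = np.arange(n) / fs
    powers = []
    for a in np.deg2rad(angles):
        delays = mic_x * np.sin(a) / c      # far-field delay per microphone
        summed = np.zeros(n)
        for sig, d in zip(signals, delays):
            # advance each channel by its hypothesised delay, then sum
            summed += np.interp(t, t - d, sig, left=0.0, right=0.0)
        powers.append(np.sum(summed ** 2))
    return float(angles[int(np.argmax(powers))])

# Simulate an impulsive "launch sound" arriving 30 degrees off broadside
fs, c = 48000, 343.0
mic_x = np.array([0.00, 0.05, 0.10, 0.15])          # 4 mics, 5 cm spacing
t = np.arange(1024) / fs
pulse = np.exp(-((t - 0.005) * 4000.0) ** 2)        # Gaussian impulse
true_delays = mic_x * np.sin(np.deg2rad(30.0)) / c
sigs = np.stack([np.interp(t - d, t, pulse) for d in true_delays])
est = delay_and_sum_doa(sigs, mic_x, fs)            # est lands near 30
```

A 2D array, as in the paper, repeats the same alignment over two angular coordinates to recover a firing position rather than just a bearing.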

  11. Near-infrared hyperspectral imaging of atherosclerotic plaque in WHHLMI rabbit artery

    NASA Astrophysics Data System (ADS)

    Ishii, Katsunori; Kitayabu, Akiko; Omiya, Kota; Honda, Norihiro; Awazu, Kunio

    2013-03-01

    Hyperspectral imaging (HSI) of rabbit atherosclerotic plaque in the near-infrared (NIR) range from 1150 to 2400 nm was demonstrated. A method to identify vulnerable plaques, which are likely to cause acute coronary events, has been sought. The objective of this study is to identify vulnerable plaques by NIR-HSI for an angioscopic application. We observed hyperspectral images of atherosclerotic plaque in WHHLMI rabbit (atherosclerotic rabbit) artery under simulated angioscopic conditions. The NIR-HSI system was constructed from an NIR supercontinuum light source and a mercury-cadmium-telluride camera. Spectral absorbance values (log(1/R) data) were obtained in the wavelength range from 1150 to 2400 nm at 10 nm intervals, and the hyperspectral images were constructed with a spectral angle mapper algorithm. As a result, detection of atherosclerotic plaque under angioscopic observation conditions was achieved, especially around 1200 nm, which corresponds to the second overtone of the CH stretching vibration mode. NIR-HSI is thus considered to serve as an angioscopic diagnosis technique to identify vulnerable plaques without clamping and saline injection.
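The spectral angle mapper used to build the hyperspectral images compares each pixel spectrum to a reference spectrum by the angle between them in band space, so it is insensitive to overall brightness. A minimal sketch with toy two-band data and a hypothetical angle threshold:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum;
    small angles mean similar spectral shape regardless of brightness."""
    cosang = np.dot(pixel, reference) / (np.linalg.norm(pixel) *
                                         np.linalg.norm(reference))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def sam_classify(cube, references, max_angle=0.1):
    """Label each pixel of a (rows, cols, bands) cube with the index of the
    closest reference spectrum, or -1 if none is within max_angle."""
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    angles = np.array([[spectral_angle(p, r) for r in references]
                       for p in flat])
    best = np.argmin(angles, axis=1)
    best[angles[np.arange(len(flat)), best] > max_angle] = -1
    return best.reshape(h, w)

# Toy 2-band cube: a "plaque-like" spectrum [1, 2] vs a "wall-like" [2, 1]
refs = [np.array([1.0, 2.0]), np.array([2.0, 1.0])]
cube = np.array([[[2.0, 4.0], [4.0, 2.0]]])  # scaled copies of each reference
labels = sam_classify(cube, refs)
# labels == [[0, 1]]: SAM ignores the factor-of-two brightness difference
```

That brightness invariance is what makes SAM attractive under angioscopic illumination, where the distance to the vessel wall, and hence the returned intensity, varies from pixel to pixel.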

  12. Evaporation of Binary Sessile Drops: Infrared and Acoustic Methods To Track Alcohol Concentration at the Interface and on the Surface.

    PubMed

    Chen, Pin; Toubal, Malika; Carlier, Julien; Harmand, Souad; Nongaillard, Bertrand; Bigerelle, Maxence

    2016-09-27

    Evaporation of droplets of three pure liquids (water, 1-butanol, and ethanol) and four binary solutions (a 5 wt % 1-butanol-water solution and 5, 25, and 50 wt % ethanol-water solutions) deposited on hydrophobic silicon was investigated. A drop shape analyzer was used to measure the contact angle, diameter, and volume of the droplets, and an infrared camera was used for infrared thermal mapping of the droplet surface. An acoustic high-frequency echography technique was, for the first time, applied to track the alcohol concentration in a binary-solution droplet. Evaporation of pure alcohol droplets was carried out at different values of relative humidity (RH); the behavior of pure ethanol evaporation was notably influenced by the ambient humidity because of its hygroscopic nature. Evaporation of droplets of water and the binary solutions was performed at a temperature of 22 °C and a mean humidity of approximately 50%. The exhaustion times of alcohol in the droplets estimated by the acoustic and visual methods were similar for the water-1-butanol mixture; for the water-ethanol mixture, however, the acoustic method gave a longer time than the visual method because of residual ethanol at the bottom of the droplet.

  13. Polarimetric Thermal Imaging

    DTIC Science & Technology

    2007-03-01

    front of a large-area blackbody as background. The viewing angle, defined as the angle between the surface normal and the camera line of sight, was varied by... and polarization angle were derived from the Stokes parameters. The dependence of these polarization characteristics on viewing angle was investigated
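The snippet's derivation of polarization characteristics from the Stokes parameters follows standard relations. A sketch assuming intensities measured behind a linear polarizer at 0°, 45°, 90°, and 135° (the snippet does not specify the measurement scheme):

```python
import numpy as np

def stokes_from_polarizer(I0, I45, I90, I135):
    """Linear Stokes parameters from intensities behind a polarizer at
    0/45/90/135 degrees, plus the degree and angle of linear polarization."""
    S0 = 0.5 * (I0 + I45 + I90 + I135)    # total intensity (averaged)
    S1 = I0 - I90
    S2 = I45 - I135
    dolp = np.hypot(S1, S2) / S0          # degree of linear polarization
    aolp = 0.5 * np.arctan2(S2, S1)       # polarization angle (radians)
    return S0, S1, S2, dolp, aolp

# Simulate a partially polarized pixel via the Malus-law intensity pattern
S0_true, p_true, psi_true = 2.0, 0.4, np.deg2rad(20.0)
I = lambda th: 0.5 * S0_true * (1 + p_true * np.cos(2 * (th - psi_true)))
S0, S1, S2, dolp, aolp = stokes_from_polarizer(
    I(0), I(np.pi / 4), I(np.pi / 2), I(3 * np.pi / 4))
# dolp recovers p_true and aolp recovers psi_true for this simulated pixel
```

In polarimetric thermal imaging, the dependence of `dolp` on viewing angle is exactly the effect the report varies the blackbody geometry to measure.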

  14. Temperature measurement with industrial color camera devices

    NASA Astrophysics Data System (ADS)

    Schmidradler, Dieter J.; Berndorfer, Thomas; van Dyck, Walter; Pretschuh, Juergen

    1999-05-01

    This paper discusses color-camera-based temperature measurement. Usually, visual imaging and infrared image sensing are treated as two separate disciplines. We show that a well-selected color camera device can be a cheaper, more robust, and more sophisticated solution for optical temperature measurement in several cases. Herein, only implementation fragments and important restrictions for the sensing element are discussed. Our aim is to draw the reader's attention to the use of visual image sensors for measuring thermal radiation and temperature, and to give reasons for the need for improved technologies for infrared camera devices. With our industrial partner AVL List, we successfully used the proposed sensor to perform temperature measurements of flames inside the combustion chamber of diesel engines, which finally led to the presented insights.
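One standard way a color camera can yield temperature is two-color (ratio) pyrometry: the ratio of intensities in two channels fixes the temperature of a hot greybody independently of its emissivity and of geometric factors. This is a generic illustration using Wien's approximation with hypothetical channel wavelengths, not necessarily the authors' method:

```python
import math

C2 = 1.4388e-2  # second radiation constant h*c/k, in m*K

def wien_intensity(lam, T):
    """Spectral intensity in Wien's approximation (arbitrary units)."""
    return lam ** -5 * math.exp(-C2 / (lam * T))

def ratio_temperature(i_red, i_green, lam_red, lam_green):
    """Invert the red/green intensity ratio for temperature, assuming a
    greybody (equal emissivity in both channels)."""
    r = math.log(i_red / i_green) - 5 * math.log(lam_green / lam_red)
    return C2 * (1 / lam_green - 1 / lam_red) / r

# Round trip at a flame-like temperature, hypothetical channel wavelengths
lam_r, lam_g = 620e-9, 540e-9
T_true = 2100.0
T_est = ratio_temperature(wien_intensity(lam_r, T_true),
                          wien_intensity(lam_g, T_true), lam_r, lam_g)
```

In practice the camera's spectral response per channel is broad rather than monochromatic, so a real implementation calibrates the ratio-to-temperature mapping against a reference source instead of using the closed form directly.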

  15. Measurement of reach envelopes with a four-camera Selective Spot Recognition (SELSPOT) system

    NASA Technical Reports Server (NTRS)

    Stramler, J. H., Jr.; Woolford, B. J.

    1983-01-01

    The basic Selective Spot Recognition (SELSPOT) system is essentially a system which uses infrared LEDs and a 'camera' with an infrared-sensitive photodetector, a focusing lens, and some A/D electronics to produce a digital output representing an X and Y coordinate for each LED for each camera. When the data are synthesized across all cameras with appropriate calibrations, an XYZ set of coordinates is obtained for each LED at a given point in time. Attention is given to the operating modes, a system checkout, and reach envelopes and software. The Video Recording Adapter (VRA) represents the main addition to the basic SELSPOT system. The VRA contains a microprocessor and other electronics which permit user selection of several options and some interaction with the system.

  16. Infrared stereo calibration for unmanned ground vehicle navigation

    NASA Astrophysics Data System (ADS)

    Harguess, Josh; Strange, Shawn

    2014-06-01

    The problem of calibrating two color cameras as a stereo pair has been heavily researched, and many off-the-shelf software packages, such as the Robot Operating System and OpenCV, include calibration routines that work in most cases. However, the problem of calibrating two infrared (IR) cameras for the purposes of sensor fusion and point cloud generation is relatively new, and many challenges exist. We present a comparison of color camera and IR camera stereo calibration using data from an unmanned ground vehicle. There are two main challenges in IR stereo calibration: the calibration board (material, design, etc.) and the accuracy of calibration pattern detection. We present our analysis of these challenges along with our IR stereo calibration methodology. Finally, we present our results both visually and analytically with computed reprojection errors.
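The reprojection error used to score a calibration is simply the RMS pixel distance between where the calibrated model reprojects the pattern corners and where they were actually detected. A minimal sketch with hypothetical corner positions:

```python
import numpy as np

def reprojection_rmse(projected, detected):
    """RMS distance in pixels between reprojected calibration-pattern
    corners and their detected image positions — the usual summary score
    for a (stereo) camera calibration."""
    d = projected - detected
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))

# Hypothetical corner positions (pixels) for one calibration image
detected = np.array([[100.0, 100.0], [200.0, 100.0], [100.0, 200.0]])
projected = detected + np.array([[0.5, 0.0], [0.0, 0.5], [-0.5, 0.0]])
err = reprojection_rmse(projected, detected)
# err == 0.5 pixels for this constructed example
```

For IR calibration the detection side dominates this score: blurred or low-contrast pattern corners in thermal imagery shift the `detected` positions, which inflates the reprojection error even when the optical model itself is sound.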

  17. A new high-speed IR camera system

    NASA Technical Reports Server (NTRS)

    Travis, Jeffrey W.; Shu, Peter K.; Jhabvala, Murzy D.; Kasten, Michael S.; Moseley, Samuel H.; Casey, Sean C.; Mcgovern, Lawrence K.; Luers, Philip J.; Dabney, Philip W.; Kaipa, Ravi C.

    1994-01-01

    A multi-organizational team at the Goddard Space Flight Center is developing a new far infrared (FIR) camera system which furthers the state of the art for this type of instrument by incorporating recent advances in several technological disciplines. All aspects of the camera system are optimized for operation at the high data rates required for astronomical observations in the far infrared. The instrument is built around a Blocked Impurity Band (BIB) detector array which exhibits responsivity over a broad wavelength band and which is capable of operating at 1000 frames/sec, and consists of a focal plane dewar, a compact camera head electronics package, and a Digital Signal Processor (DSP)-based data system residing in a standard 486 personal computer. In this paper we discuss the overall system architecture, the focal plane dewar, and advanced features and design considerations for the electronics. This system, or one derived from it, may prove useful for many commercial and/or industrial infrared imaging or spectroscopic applications, including thermal machine vision for robotic manufacturing, photographic observation of short-duration thermal events such as combustion or chemical reactions, and high-resolution surveillance imaging.

  18. Ring King

    NASA Image and Video Library

    2014-08-18

    Saturn reigns supreme, encircled by its retinue of rings. Although all four giant planets have ring systems, Saturn's is by far the most massive and impressive. Scientists are trying to understand why by studying how the rings have formed and how they have evolved over time. Also seen in this image is Saturn's famous north polar vortex and hexagon. This view looks toward the sunlit side of the rings from about 37 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on May 4, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 2 million miles (3 million kilometers) from Saturn. Image scale is 110 miles (180 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18278

  19. Instrument Overview of the JEM-EUSO Mission

    NASA Technical Reports Server (NTRS)

    Kajino, F.; Yamamoto, T.; Sakata, M.; Yamamoto, Y.; Sato, H.; Ebizuka, N.; Ebisuzaki, T.; Uehara, Y.; Ohmori, H.; Kawasaki, Y.; hide

    2007-01-01

    JEM-EUSO, with a large and wide-angle telescope mounted on the International Space Station (ISS), has been planned as a space mission to explore extremes of the universe through the investigation of extreme energy cosmic rays, by detecting the photons which accompany air showers developed in the earth's atmosphere. JEM-EUSO will be launched by the Japanese H-II Transfer Vehicle (HTV) and mounted at the Exposed Facility of the Japanese Experiment Module (JEM/EF) of the ISS in the second phase of the utilization plan. The telescope consists of high-transmittance optical Fresnel lenses with a diameter of 2.5 m, 200k channels of multi-anode photomultiplier tubes, and focal surface front-end, readout, trigger, and system electronics. An infrared camera and a LIDAR system will also be used to monitor the earth's atmosphere.

  20. Neptune's small dark spot (D2)

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This bulls-eye view of Neptune's small dark spot (D2) was obtained by Voyager 2's narrow-angle camera. Banding surrounding the feature indicates unseen strong winds, while structures within the bright spot suggest both active upwelling of clouds and rotation about the center. A rotation rate has not yet been measured, but the V-shaped structure near the right edge of the bright area indicates that the spot rotates clockwise. Unlike the Great Red Spot on Jupiter, which rotates counterclockwise, if the D2 spot on Neptune rotates clockwise, the material will be descending in the dark oval region. The fact that infrared data will yield temperature information about the region above the clouds makes this observation especially valuable. The Voyager Mission is conducted by JPL for NASA's Office of Space Science and Applications.

  1. Noisy Ocular Recognition Based on Three Convolutional Neural Networks

    PubMed Central

    Lee, Min Beom; Hong, Hyung Gil; Park, Kang Ryoung

    2017-01-01

    In recent years, the iris recognition system has been gaining increasing acceptance for applications such as access control and smartphone security. When images of the iris are obtained under unconstrained conditions, image quality is degraded by optical and motion blur, off-angle view (the user’s eyes looking somewhere else, not into the front of the camera), specular reflection (SR) and other factors. Such noisy iris images increase intra-individual variations and, as a result, reduce the accuracy of iris recognition. A typical iris recognition system requires a near-infrared (NIR) illuminator along with an NIR camera, which are larger and more expensive than fingerprint recognition equipment. Hence, many studies have proposed methods of using iris images captured by a visible light camera without the need for an additional illuminator. In this research, we propose a new recognition method for noisy iris and ocular images by using one iris and two periocular regions, based on three convolutional neural networks (CNNs). Experiments were conducted by using the noisy iris challenge evaluation-part II (NICE.II) training dataset (selected from the university of Beira iris (UBIRIS).v2 database), mobile iris challenge evaluation (MICHE) database, and institute of automation of Chinese academy of sciences (CASIA)-Iris-Distance database. As a result, the method proposed by this study outperformed previous methods. PMID:29258217

  2. A study of thermographic diagnosis system and imaging algorithm by distributed thermal data using single infrared sensor.

    PubMed

    Yoon, Se Jin; Noh, Si Cheol; Choi, Heung Ho

    2007-01-01

    The infrared diagnosis device provides two-dimensional images and patient-oriented results that can be easily understood by the inspection target by using infrared cameras; however, it has disadvantages such as large size, high price, and inconvenient maintenance. In this regard, this study proposes a small-sized body-heat diagnosis device using a single infrared sensor, and implements an infrared detection system together with an algorithm that reconstructs thermography from the point-source temperature data it acquires. The developed system had a temperature resolution of 0.1 degree and a reproducibility of +/-0.1 degree. The accuracy was 90.39% at an error bound of +/-0 degree and 99.98% at an error bound of +/-0.1 degree. To evaluate the proposed algorithm and system, the results were compared with infrared images from a camera-based method. Thermal images of clinical significance were obtained from a patient with a lesion to verify the system's clinical applicability.

  3. Esthetic smile preferences and the orientation of the maxillary occlusal plane.

    PubMed

    Kattadiyil, Mathew T; Goodacre, Charles J; Naylor, W Patrick; Maveli, Thomas C

    2012-12-01

    The anteroposterior orientation of the maxillary occlusal plane has an important role in the creation, assessment, and perception of an esthetic smile. However, the effect of the angle at which this plane is visualized (the viewing angle) in a broad smile has not been quantified. The purpose of this study was to assess the esthetic preferences of dental professionals and nondentists by using 3 viewing angles of the anteroposterior orientation of the maxillary occlusal plane. After Institutional Review Board approval, standardized digital photographic images of the smiles of 100 participants were recorded by simultaneously triggering 3 cameras set at different viewing angles. The top camera was positioned 10 degrees above the occlusal plane (camera #1, Top view); the center camera was positioned at the level of the occlusal plane (camera #2, Center view); and the bottom camera was located 10 degrees below the occlusal plane (camera #3, Bottom view). Forty-two dental professionals and 31 nondentists (persons from the general population) independently evaluated digital images of each participant's smile captured from the Top view, Center view, and Bottom view. The 73 evaluators were asked individually through a questionnaire to rank the 3 photographic images of each patient as 'most pleasing,' 'somewhat pleasing,' or 'least pleasing,' with most pleasing being the most esthetic view and the preferred orientation of the occlusal plane. The resulting esthetic preferences were statistically analyzed by using the Friedman test. In addition, the participants were asked to rank their own images from the 3 viewing angles as 'most pleasing,' 'somewhat pleasing,' and 'least pleasing.' The 73 evaluators found statistically significant differences in the esthetic preferences between the Top and Bottom views and between the Center and Bottom views (P<.001). No significant differences were found between the Top and Center views. 
The Top position was marginally preferred over the Center, and both were significantly preferred over the Bottom position. When the participants evaluated their own smiles, a significantly greater number (P< .001) preferred the Top view over the Center or the Bottom views. No significant differences were found in preferences based on the demographics of the evaluators when comparing age, education, gender, profession, and race. The esthetic preference for the maxillary occlusal plane was influenced by the viewing angle with the higher (Top) and center views preferred by both dental and nondental evaluators. The participants themselves preferred the higher view of their smile significantly more often than the center or lower angle views (P<.001). Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  4. First light observations with TIFR Near Infrared Imaging Camera (TIRCAM-II)

    NASA Astrophysics Data System (ADS)

    Ojha, D. K.; Ghosh, S. K.; D'Costa, S. L. A.; Naik, M. B.; Sandimani, P. R.; Poojary, S. S.; Bhagat, S. B.; Jadhav, R. B.; Meshram, G. S.; Bakalkar, C. B.; Ramaprakash, A. N.; Mohan, V.; Joshi, J.

    TIFR near infrared imaging camera (TIRCAM-II) is based on the Aladdin III Quadrant InSb focal plane array (512×512 pixels; 27.6 μm pixel size; sensitive between 1-5.5 μm). TIRCAM-II had its first engineering run with the 2 m IUCAA telescope at Girawali during February - March 2011. The first light observations with TIRCAM-II were quite successful. Several infrared standard stars, the Trapezium Cluster in the Orion region, McNeil's nebula, etc., were observed in the J and K bands and in a narrow band at 3.6 μm (nbL). In the nbL band, some bright stars could be detected from the Girawali site. The performance of TIRCAM-II is discussed in the light of these preliminary near-infrared observations.

  5. Teaching physics and understanding infrared thermal imaging

    NASA Astrophysics Data System (ADS)

    Vollmer, Michael; Möllmann, Klaus-Peter

    2017-08-01

    Infrared thermal imaging is a very rapidly evolving field. The latest trend is small smartphone IR camera accessories, which are making infrared imaging a widespread and well-known consumer product. Applications range from medical diagnosis and building inspection to industrial predictive maintenance and visualization in the natural sciences. Infrared cameras allow not only qualitative imaging and visualization but also quantitative measurement of the surface temperatures of objects. On the one hand, they are a particularly suitable tool for teaching optics, radiation physics, and many selected topics in other fields of physics; on the other hand, there is an increasing need for engineers and physicists who understand these complex, state-of-the-art photonics systems. Students must therefore also learn and understand the physics underlying such systems.

  6. Confocal Retinal Imaging Using a Digital Light Projector with a Near Infrared VCSEL Source

    PubMed Central

    Muller, Matthew S.; Elsner, Ann E.

    2018-01-01

    A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1″ LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging. PMID:29899586

  7. The infrared imaging radiometer for PICASSO-CENA

    NASA Astrophysics Data System (ADS)

    Corlay, Gilles; Arnolfo, Marie-Christine; Bret-Dibat, Thierry; Lifferman, Anne; Pelon, Jacques

    2017-11-01

    Microbolometers are infrared detectors based on an emerging technology that has been developed over the past few years, mainly in the US and a few other countries. The main targets of these developments are low-cost, low-performance military and civilian applications such as surveillance cameras. Applications in space are now arising thanks to the design simplification and the associated cost reduction allowed by this new technology. Among the four instruments of the PICASSO-CENA payload, the Imaging Infrared Radiometer (IIR) is based on microbolometer technology. An infrared camera in development for the IASI instrument is the core of the IIR. The aim of this paper is to recall the PICASSO-CENA mission goals, to describe the IIR instrument architecture, to highlight its main features and performance, and to give its development status.

  8. Phantom Limb

    NASA Image and Video Library

    2017-09-25

    The brightly lit limb of a crescent Enceladus looks ethereal against the blackness of space. The rest of the moon, lit by light reflected from Saturn, presents a ghostly appearance. Enceladus (313 miles or 504 kilometers across) is back-lit in this image, as is apparent from the thin crescent. However, the Sun-Enceladus-spacecraft (or phase) angle, at 141 degrees, is too low to make the moon's famous plumes easily visible. This view looks toward the Saturn-facing hemisphere of Enceladus. North on Enceladus is up. The above image is a composite of images taken with the Cassini spacecraft narrow-angle camera on March 29, 2017 using filters that allow infrared, green, and ultraviolet light. The image filter centered on 930 nm (IR) is red in this image, the image filter centered on the green is green, and the image filter centered on 338 nm (UV) is blue. The view was obtained at a distance of approximately 110,000 miles (180,000 kilometers) from Enceladus. Image scale is 0.6 miles (1 kilometer) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21346

  9. A Closer Look at Telesto False-Color

    NASA Image and Video Library

    2006-02-08

    These views show surface features and color variation on the Trojan moon Telesto. The smooth surface of this moon suggests that, like Pandora, it is covered with a mantle of fine, dust-sized icy material. The monochrome image was taken in visible light (see PIA07696). To create the false-color view, ultraviolet, green and infrared images were combined into a single black and white picture that isolates and maps regional color differences. This "color map" was then superposed over a clear-filter image. The origin of the color differences is not yet understood, but may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil. Tiny Telesto is a mere 24 kilometers (15 miles) wide. The image was acquired with the Cassini spacecraft narrow-angle camera on Dec. 25, 2005 at a distance of approximately 20,000 kilometers (12,000 miles) from Telesto and at a Sun-Telesto-spacecraft, or phase, angle of 58 degrees. Image scale is 118 meters (387 feet) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA07697

  10. Integrated calibration between digital camera and laser scanner from mobile mapping system for land vehicles

    NASA Astrophysics Data System (ADS)

    Zhao, Guihua; Chen, Hong; Li, Xingquan; Zou, Xiaoliang

    The paper presents the concepts of lever arm and boresight angle, the design requirements of calibration sites, and the integrated calibration method for the boresight angles of a digital camera or laser scanner. Taking test data collected by Applanix's LandMark system as an example, the camera calibration method is introduced, based on piling three consecutive stereo images and an OTF calibration method using ground control points. The laser boresight-angle calibration uses a combined manual and automatic method with ground control points. Integrated calibration between the digital camera and laser scanner is introduced to improve the systematic precision of the two sensors. Analysis of the measured positions of ground control points against their corresponding image points in sequence images shows agreement to within about 15 cm in relative error and 20 cm in absolute error. Comparison of ground control points with their corresponding laser point clouds shows errors of less than 20 cm. These experimental results indicate that the mobile mapping system is an efficient and reliable system for rapidly generating high-accuracy, high-density road spatial data.

  11. Synchrotron emission diagnostic of full-orbit kinetic simulations of runaway electrons in tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Carbajal Gomez, Leopoldo; Del-Castillo-Negrete, Diego

    2017-10-01

    Developing avoidance or mitigation strategies of runaway electrons (RE) for the safe operation of ITER is imperative. Synchrotron radiation (SR) of RE is routinely used in current tokamak experiments to diagnose RE. We present the results of a newly developed camera diagnostic of SR for full-orbit kinetic simulations of RE in DIII-D-like plasmas that simultaneously includes: full-orbit effects, information of the spectral and angular distribution of SR of each electron, and basic geometric optics of a camera. We observe a strong dependence of the SR measured by the camera on the pitch angle distribution of RE, namely we find that crescent shapes of the SR on the camera pictures relate to RE distributions with small pitch angles, while ellipse shapes relate to distributions of RE with larger pitch angles. A weak dependence of the SR measured by the camera with the RE energy, value of the q-profile at the edge, and the chosen range of wavelengths is found. Furthermore, we observe that oversimplifying the angular distribution of the SR changes the synchrotron spectra and overestimates its amplitude. Research sponsored by the LDRD Program of ORNL, managed by UT-Battelle, LLC, for the U. S. DoE.

  12. A low-cost dual-camera imaging system for aerial applicators

    USDA-ARS?s Scientific Manuscript database

    Agricultural aircraft provide a readily available remote sensing platform as low-cost and easy-to-use consumer-grade cameras are being increasingly used for aerial imaging. In this article, we report on a dual-camera imaging system we recently assembled that can capture RGB and near-infrared (NIR) i...

  13. A DirtI Application for LBT Commissioning Campaigns

    NASA Astrophysics Data System (ADS)

    Borelli, J. L.

    2009-09-01

    In order to characterize the Gregorian focal stations and test the performance achieved by the Large Binocular Telescope (LBT) adaptive optics system, two infrared test cameras were constructed within a joint project between INAF (Osservatorio Astronomico di Bologna, Italy) and the Max Planck Institute for Astronomy (Germany). This paper describes the functionality of, and the successful results obtained with, the Daemon for the Infrared Test Camera Interface (DirtI) during commissioning campaigns.

  14. Research on camera on-orbit radiometric calibration based on black body and infrared calibration stars

    NASA Astrophysics Data System (ADS)

    Wang, YuDu; Su, XiaoFeng; Zhang, WanYing; Chen, FanSheng

    2018-05-01

    Affected by the launching process and the space environment, the response of a space camera is inevitably attenuated, so an on-orbit radiometric calibration is necessary. In this paper, we propose a calibration method based on accurate infrared standard stars to increase the precision of infrared radiation measurement. Because stars can be treated as point targets, we use them as the radiometric calibration source and establish a Taylor-expansion method and an energy-extrapolation model based on the WISE and 2MASS catalogs. We then update the calibration results obtained from the black body. Finally, the calibration mechanism is designed and the design is verified by an on-orbit test. The experimental results show that the irradiance extrapolation error is about 3% and the accuracy of the calibration method is about 10%, which satisfies the requirements of on-orbit calibration.
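
    A minimal sketch of the kind of energy extrapolation described, assuming the standard Pogson magnitude relation and a simple log-log (first-order Taylor) interpolation between two catalog bands; the zero-point value and band wavelengths below are placeholders, not the paper's actual model:

```python
import math

def band_irradiance(mag, zero_point_irradiance):
    """In-band irradiance from a catalog magnitude (Pogson relation).
    zero_point_irradiance is the band's irradiance at magnitude 0
    (a placeholder value must be supplied from the catalog documentation)."""
    return zero_point_irradiance * 10.0 ** (-0.4 * mag)

def extrapolate_log_linear(lam1, E1, lam2, E2, lam_cam):
    """First-order (log-log) extrapolation of irradiance from two catalog
    bands at wavelengths lam1, lam2 to the camera's effective wavelength."""
    b = (math.log(E2) - math.log(E1)) / (math.log(lam2) - math.log(lam1))
    a = math.log(E1) - b * math.log(lam1)
    return math.exp(a + b * math.log(lam_cam))
```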

  15. Ensuring long-term stability of infrared camera absolute calibration.

    PubMed

    Kattnig, Alain; Thetas, Sophie; Primot, Jérôme

    2015-07-13

    Absolute calibration of cryogenic 3-5 µm and 8-10 µm infrared cameras is notoriously unstable and thus has to be repeated before actual measurements. Moreover, the signal-to-noise ratio of the imagery is lowered, decreasing its quality. These performance degradations strongly lessen the suitability of infrared imaging. The defects are often blamed on detectors reaching a different "response state" after each return to cryogenic conditions, even after accounting for the detrimental effects of imperfect stray-light management. We show here that the detectors are not to blame and that the culprit can instead dwell in the proximity electronics. We identify an unexpected source of instability in the initial voltage of the detectors' integration capacitance. We then show that this parameter can be easily measured and taken into account. In this way we demonstrate that a one-month-old calibration of a 3-5 µm camera has retained its validity.
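
    The correction described can be illustrated with a conventional two-point (blackbody) radiometric calibration in which a measured shift of the integration capacitor's initial voltage is subtracted before applying the gain. This is a hedged sketch of the idea, not the authors' exact procedure:

```python
def two_point_calibration(dn_cold, dn_hot, L_cold, L_hot):
    """Per-pixel gain and offset from two blackbody references
    (the standard two-point absolute calibration)."""
    gain = (L_hot - L_cold) / (dn_hot - dn_cold)
    offset = L_cold - gain * dn_cold
    return gain, offset

def radiance(dn, gain, offset, v0_shift_dn=0.0):
    """Apply the calibration; v0_shift_dn models a measured change (in DN)
    of the integrator's initial voltage since calibration -- the kind of
    correction the abstract suggests, here purely illustrative."""
    return gain * (dn - v0_shift_dn) + offset
```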

  16. AFRC2016-0116-065

    NASA Image and Video Library

    2016-04-15

    The newest instrument, an infrared camera called the High-resolution Airborne Wideband Camera-Plus (HAWC+), was installed on the Stratospheric Observatory for Infrared Astronomy, SOFIA, in April of 2016. This is the only currently operating astronomical camera that makes images using far-infrared light, allowing studies of low-temperature early stages of star and planet formation. HAWC+ includes a polarimeter, a device that measures the alignment of incoming light waves. With the polarimeter, HAWC+ can map magnetic fields in star forming regions and in the environment around the supermassive black hole at the center of the Milky Way galaxy. These new maps can reveal how the strength and direction of magnetic fields affect the rate at which interstellar clouds condense to form new stars. A team led by C. Darren Dowell at NASA’s Jet Propulsion Laboratory and including participants from more than a dozen institutions developed the instrument.

  17. Method for determining and displaying the spacial distribution of a spectral pattern of received light

    DOEpatents

    Bennett, Charles L.

    1996-01-01

    An imaging Fourier transform spectrometer (10, 210) having a Fourier transform infrared spectrometer (12) providing a series of images (40) to a focal plane array camera (38). The focal plane array camera (38) is clocked to a multiple of zero crossing occurrences as caused by a moving mirror (18) of the Fourier transform infrared spectrometer (12) and as detected by a laser detector (50) such that the frame capture rate of the focal plane array camera (38) corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer (12). The images (40) are transmitted to a computer (45) for processing such that representations of the images (40) as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor (60) or otherwise stored and manipulated by the computer (45).
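
    The clocking scheme rests on the fact that each λ/2 of moving-mirror travel produces one fringe (zero crossing) of the reference laser. A small sketch of the arithmetic, assuming a HeNe reference wavelength; the mirror velocity in the example is purely illustrative:

```python
def fringe_rate(mirror_velocity_m_s, laser_wavelength_m=632.8e-9):
    """Zero-crossing (fringe) frequency of the reference interferogram:
    each lambda/2 of moving-mirror travel produces one fringe, f = 2v/lambda."""
    return 2.0 * mirror_velocity_m_s / laser_wavelength_m

def frame_rate(mirror_velocity_m_s, crossings_per_frame=1):
    """Camera frame clock derived from an integer number of zero crossings."""
    return fringe_rate(mirror_velocity_m_s) / crossings_per_frame
```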

  18. CANICA: The Cananea Near-Infrared Camera at the 2.1 m OAGH Telescope

    NASA Astrophysics Data System (ADS)

    Carrasco, L.; Hernández Utrera, O.; Vázquez, S.; Mayya, Y. D.; Carrasco, E.; Pedraza, J.; Castillo-Domínguez, E.; Escobedo, G.; Devaraj, R.; Luna, A.

    2017-10-01

    The Cananea near-infrared camera (CANICA) is an instrument commissioned at the 2.12 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA operates in the near-infrared in multiple bands, including the J (1.24 μm), H (1.63 μm), and K' (2.12 μm) broad bands. CANICA is located at the Ritchey-Chrétien focal plane of the telescope, reimaging the f/12 beam into an f/6 beam. The detector is a 1024 × 1024 HgCdTe HAWAII array with an 18.5 μm pixel pitch, covering a field of view of 5.5 × 5.5 arcmin² at a plate scale of 0.32 arcsec/pixel. The camera is enclosed in a cryostat cooled with liquid nitrogen to 77 K. The cryostat contains the collimator, two 15-position filter wheels, a single fixed set of reimaging optics, and the detector.
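
    The quoted field of view is consistent with the detector format and plate scale: 1024 pixels × 0.32 arcsec/pixel ≈ 5.46 arcmin per side, i.e. the stated 5.5 × 5.5 arcmin field. A one-line check:

```python
def fov_arcmin(n_pix, plate_scale_arcsec_per_px):
    """Field of view along one detector axis, in arcminutes."""
    return n_pix * plate_scale_arcsec_per_px / 60.0
```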

  19. Note: Simple hysteresis parameter inspector for camera module with liquid lens

    NASA Astrophysics Data System (ADS)

    Chen, Po-Jui; Liao, Tai-Shan; Hwang, Chi-Hung

    2010-05-01

    A method to inspect the hysteresis parameter is presented in this article. The hysteresis of a whole camera module with a liquid lens can be measured, rather than merely that of a single lens. Because variation in focal length influences image quality, we propose using the sharpness of images captured by the camera module for hysteresis evaluation. Experiments reveal that the profile of the sharpness hysteresis corresponds to the contact-angle characteristic of the liquid lens. It can therefore be inferred that the hysteresis of the camera module is induced by the contact angle of the liquid lens. An inspection takes only 20 s to complete; compared with other instruments, this method is thus better suited to integration into mass production lines for online quality assurance.
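
    A sketch of how such an inspection could be scored, using a gradient-energy sharpness metric (one common focus measure, assumed here; the abstract does not specify the exact metric) and the maximum gap between the up-sweep and down-sweep sharpness curves:

```python
import numpy as np

def sharpness(img):
    """Gradient-energy focus metric (an assumed, common choice)."""
    gy, gx = np.gradient(np.asarray(img, dtype=float))
    return float(np.mean(gx ** 2 + gy ** 2))

def hysteresis(sharp_up, sharp_down):
    """Maximum gap between up-sweep and down-sweep sharpness curves,
    sampled at the same drive voltages."""
    return float(np.max(np.abs(np.asarray(sharp_up) - np.asarray(sharp_down))))
```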

  20. Autonomous pedestrian localization technique using CMOS camera sensors

    NASA Astrophysics Data System (ADS)

    Chun, Chanwoo

    2014-09-01

    We present a pedestrian localization technique that does not need infrastructure. The proposed angle-only measurement method requires specially manufactured shoes. Each shoe has two CMOS cameras and two markers, such as LEDs, attached on its inward side. The line-of-sight (LOS) angles toward the two markers on the forward shoe are measured using the two cameras on the rear shoe. Our simulation results show that a pedestrian walking through a shopping mall wearing this device can be accurately guided to the front of a destination store located 100 m away, provided the floor plan of the mall is available.
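
    In the floor plane, the angle-only measurement reduces to intersecting two bearing rays from the known camera baseline. A minimal sketch of that triangulation (idealized planar geometry; the actual shoe-mounted system is more involved):

```python
import math

def triangulate(baseline, theta1, theta2):
    """Intersect two bearing rays: observers at (0, 0) and (baseline, 0),
    angles measured from the baseline axis (radians). Returns (x, y)."""
    t1, t2 = math.tan(theta1), math.tan(theta2)
    x = t2 * baseline / (t2 - t1)
    return x, t1 * x
```

    Repeating this marker fix for every stride advances the position estimate step by step, which is why no installed infrastructure is needed.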

  1. Navigating surgical fluorescence cameras using near-infrared optical tracking.

    PubMed

    van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs

    2018-05-01

    Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera within reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced interference during NIR fluorescence imaging. To support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one light-emitting-diode setup simulating an OTS. Following the outcome of these measurements, the OTS settings were optimized. Measurements indicated the OTS interference was caused by: (1) spectral overlap between the OTS light and the camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  2. Seasonal variations of leaf and canopy properties tracked by ground-based NDVI imagery in a temperate forest.

    PubMed

    Yang, Hualei; Yang, Xi; Heskel, Mary; Sun, Shucun; Tang, Jianwu

    2017-04-28

    Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems through the link between growing-season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and to track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance in the visible and near-infrared (NIR) bands at high spatial and temporal resolution, and found that the camera-based NDVI (camera-NDVI) agreed well with the leaf expansion process measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). We found significant linear relationships between camera-NDVI and leaf chlorophyll concentration and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. We therefore recommend ground-based camera-NDVI as a powerful tool for long-term, near-surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
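
    Camera-NDVI follows the standard NDVI definition applied to co-registered red (visible) and NIR bands. A minimal sketch (the epsilon guard is an implementation detail, not from the paper):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI from co-registered near-infrared and red reflectance values;
    works elementwise on whole images. Epsilon avoids 0/0 on dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)
```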

  3. Students' framing of laboratory exercises using infrared cameras

    NASA Astrophysics Data System (ADS)

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-12-01

    Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the predict-observe-explain approach of White and Gunstone. The activities involved central thermal concepts that focused on heat conduction and dissipative processes such as friction and collisions. Students' interactions within each activity were videotaped and the analysis focuses on how a purposefully selected group of three students engaged with the exercises. As the basis for an interpretative study, a "thick" narrative description of the students' epistemological and conceptual framing of the exercises and how they took advantage of the disciplinary affordance of IR cameras in the thermal domain is provided. Findings include that the students largely shared their conceptual framing of the four activities, but differed among themselves in their epistemological framing, for instance, in how far they found it relevant to digress from the laboratory instructions when inquiring into thermal phenomena. In conclusion, the study unveils the disciplinary affordances of infrared cameras, in the sense of their use in providing access to knowledge about macroscopic thermal science.

  4. Easily Accessible Camera Mount

    NASA Technical Reports Server (NTRS)

    Chalson, H. E.

    1986-01-01

    Modified mount enables fast alignment of movie cameras in explosionproof housings. Mount screws onto side of housing and is readily reached through side door. Mount includes right-angle drive mechanism containing two miter gears that turn threaded shaft. Shaft drives movable dovetail clamping jaw that engages fixed dovetail plate on camera. Mechanism aligns camera in housing and secures it. Reduces installation time by 80 percent.

  5. Conception of a cheap infrared camera using a Fresnel lens

    NASA Astrophysics Data System (ADS)

    Grulois, Tatiana; Druart, Guillaume; Guérineau, Nicolas; Crastes, Arnaud; Sauer, Hervé; Chavel, Pierre

    2014-09-01

    Today, huge efforts are being made in research and industry to design compact, cheap, uncooled infrared optical systems for low-cost imaging applications. Indeed, infrared cameras are currently too expensive to be widespread; if their cost can be cut, new markets are expected to open. In this paper, we present the cheap broadband microimager we have designed. It operates in the long-wavelength infrared range and uses only one silicon lens, at minimal manufacturing cost. Our concept is based on the use of thin optics: because some absorption can then be tolerated, inexpensive unconventional materials can be used. Our imager uses a thin Fresnel lens. Up to now, Fresnel lenses have not been used for broadband imaging applications because of their disastrous chromatic properties. However, we show that working in a high diffraction order can significantly reduce chromatism. A prototype has been built, and the performance of the camera is discussed. Its characterization has been carried out in terms of modulation transfer function (MTF) and noise equivalent temperature difference (NETD). Finally, experimental images are presented.
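
    Working in a high diffraction order m makes the Fresnel lens "harmonic": wavelengths λ_p = mλ0/p (integer p) share a common focus, so several wavelengths across the LWIR band are focused together and the residual chromatic focal shift is reduced. A sketch of that relation (the design order and design wavelength below are illustrative, not the prototype's actual values):

```python
def common_focus_wavelengths(m, lam0, band=(8e-6, 12e-6)):
    """Wavelengths within `band` (meters) that a harmonic (multi-order)
    Fresnel lens of design order m and design wavelength lam0 brings
    to the common focus: lam_p = m * lam0 / p for integer p."""
    lams, p = [], 1
    while m * lam0 / p >= band[0]:
        lam = m * lam0 / p
        if lam <= band[1]:
            lams.append(lam)
        p += 1
    return lams
```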

  6. Forward-Looking Infrared Cameras for Micrometeorological Applications within Vineyards

    PubMed Central

    Katurji, Marwan; Zawar-Reza, Peyman

    2016-01-01

    We apply the principles of atmospheric surface-layer dynamics within a vineyard canopy to demonstrate the use of forward-looking infrared cameras measuring surface brightness temperature (spectral bandwidth 7.5 to 14 μm) at a relatively high temporal resolution of one frame every 10 s. The temporal surface-brightness signal over a few hours of the stable nighttime boundary layer, intermittently interrupted by periods of turbulent heat-flux surges, was shown to be related to meteorological measurements from an in situ eddy-covariance system and reflected the above-canopy wind variability. The infrared raster images were collected, and the resulting self-organized spatial clusters provided meteorological context when compared with the in situ data. The spatial brightness-temperature pattern was explained in terms of the presence or absence of nighttime cloud cover, the down-welling of long-wave radiation, and the canopy turbulent heat flux. Time-sequential thermography as demonstrated in this research provides positive evidence for the application of thermal infrared cameras in the domain of micrometeorology and can enhance our spatial understanding of turbulent eddy interactions with the surface. PMID:27649208

  7. LIFTING THE VEIL OF DUST TO REVEAL THE SECRETS OF SPIRAL GALAXIES

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Astronomers have combined information from the NASA Hubble Space Telescope's visible- and infrared-light cameras to show the hearts of four spiral galaxies peppered with ancient populations of stars. The top row of pictures, taken by a ground-based telescope, represents complete views of each galaxy. The blue boxes outline the regions observed by the Hubble telescope. The bottom row represents composite pictures from Hubble's visible- and infrared-light cameras, the Wide Field and Planetary Camera 2 (WFPC2) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). Astronomers combined views from both cameras to obtain the true ages of the stars surrounding each galaxy's bulge. The Hubble telescope's sharper resolution allows astronomers to study the intricate structure of a galaxy's core. The galaxies are ordered by the size of their bulges. NGC 5838, an 'S0' galaxy, is dominated by a large bulge and has no visible spiral arms; NGC 7537, an 'Sbc' galaxy, has a small bulge and loosely wound spiral arms. Astronomers think that the structure of NGC 7537 is very similar to our Milky Way. The galaxy images are composites made from WFPC2 images taken with blue (4445 Angstroms) and red (8269 Angstroms) filters, and NICMOS images taken in the infrared (16,000 Angstroms). They were taken in June, July, and August of 1997. Credits for the ground-based images: Allan Sandage (The Observatories of the Carnegie Institution of Washington) and John Bedke (Computer Sciences Corporation and the Space Telescope Science Institute) Credits for WFPC2 and NICMOS composites: NASA, ESA, and Reynier Peletier (University of Nottingham, United Kingdom)

  8. Capturing method for integral three-dimensional imaging using multiviewpoint robotic cameras

    NASA Astrophysics Data System (ADS)

    Ikeya, Kensuke; Arai, Jun; Mishina, Tomoyuki; Yamaguchi, Masahiro

    2018-03-01

    Integral three-dimensional (3-D) technology for next-generation 3-D television must be able to capture dynamic moving subjects with pan, tilt, and zoom camerawork as well as current TV program production does. We propose a capturing method for integral 3-D imaging using multiviewpoint robotic cameras. The cameras are controlled through a cooperative synchronous system composed of a master camera controlled by a camera operator and reference cameras that are utilized for 3-D reconstruction. When the operator captures a subject using the master camera, the region reproduced by the integral 3-D display is regulated in real space according to the subject's position and the view angle of the master camera. Using the cooperative control function, the reference cameras can capture images at the narrowest view angle that does not lose any part of the object region, thereby maximizing the resolution of the image. 3-D models are reconstructed by estimating depth from complementary multiviewpoint images captured by robotic cameras arranged in a two-dimensional array. The model is converted into elemental images to generate the integral 3-D images. In experiments, we reconstructed integral 3-D images of karate players and confirmed that the proposed method satisfied the above requirements.

  9. Method and apparatus for implementing material thermal property measurement by flash thermal imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Jiangang

    A method and apparatus are provided for measuring material thermal properties, including the thermal effusivity of a coating and/or film or of a bulk material of uniform properties. The test apparatus includes an infrared camera; a data acquisition and processing computer coupled to the infrared camera for acquiring and processing thermal image data; and a flash lamp, covered by an enhanced optical filter that attenuates the entire infrared wavelength range, providing an input of heat onto the surface of a two-layer sample. A series of thermal images is taken of the surface of the two-layer sample.

  10. Ambient and Cryogenic Alignment Verification and Performance of the Infrared Multi-Object Spectrometer

    NASA Technical Reports Server (NTRS)

    Connelly, Joseph A.; Ohl, Raymond G.; Mink, Ronald G.; Mentzell, J. Eric; Saha, Timo T.; Tveekrem, June L.; Hylan, Jason E.; Sparr, Leroy M.; Chambers, V. John; Hagopian, John G.

    2003-01-01

    The Infrared Multi-Object Spectrometer (IRMOS) is a facility instrument for the Kitt Peak National Observatory 4 and 2.1 meter telescopes. IRMOS is a near-IR (0.8 - 2.5 micron) spectrometer with low- to mid-resolving power (R = 300 - 3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arc-min field of view using a commercial Micro Electro-Mechanical Systems (MEMS) Digital Micro-mirror Device (DMD) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems. The focal reducer images the focal plane of the telescope onto the DMD field stop, and the spectrograph images the DMD onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and the ambient and cryogenic imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve to verify alignment, and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides further verification of the optomechanical alignment method and a measurement of near-angle scattered light due to mirror small-scale surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides spectral lines at 546.1 nm and 1550 nm, and a CCD camera and IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard test results validate this prediction. We conclude with an instrument performance prediction for first light.

  11. MISR Global Images See the Light of Day

    NASA Technical Reports Server (NTRS)

    2002-01-01

    As of July 31, 2002, global multi-angle, multi-spectral radiance products are available from the MISR instrument aboard the Terra satellite. Measuring the radiative properties of different types of surfaces, clouds and atmospheric particulates is an important step toward understanding the Earth's climate system. These images are among the first planet-wide summary views to be publicly released from the Multi-angle Imaging SpectroRadiometer experiment. Data for these images were collected during the month of March 2002, and each pixel represents monthly-averaged daylight radiances from an area measuring 1/2 degree in latitude by 1/2 degree in longitude.

    The top panel is from MISR's nadir (vertical-viewing) camera and combines data from the red, green and blue spectral bands to create a natural color image. The central view combines near-infrared, red, and green spectral data to create a false-color rendition that enhances highly vegetated terrain. It takes 9 days for MISR to view the entire globe, and only areas within 8 degrees of latitude of the north and south poles are not observed due to the Terra orbit inclination. Because a single pole-to-pole swath of MISR data is just 400 kilometers wide, multiple swaths must be mosaiced to create these global views. Discontinuities appear in some cloud patterns as a consequence of changes in cloud cover from one day to another.

    The lower panel is a composite in which red, green, and blue radiances from MISR's 70-degree forward-viewing camera are displayed in the northern hemisphere, and radiances from the 70-degree backward-viewing camera are displayed in the southern hemisphere. At the March equinox (spring in the northern hemisphere, autumn in the southern hemisphere), the Sun is near the equator. Therefore, both oblique angles are observing the Earth in 'forward scattering', particularly at high latitudes. Forward scattering occurs when you (or MISR) observe an object with the Sun at a point in the sky that is in front of you. Relative to the nadir view, this geometry accentuates the appearance of polar clouds, and can even reveal clouds that are invisible in the nadir direction. In relatively clear ocean areas, the oblique-angle composite is generally brighter than its nadir counterpart due to enhanced reflection of light by atmospheric particulates.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  12. Spinning projectile's attitude measurement with LW infrared radiation under sea-sky background

    NASA Astrophysics Data System (ADS)

    Xu, Miaomiao; Bu, Xiongzhu; Yu, Jing; He, Zilu

    2018-05-01

    With the further development of research on infrared radiation under sea-sky backgrounds and the requirements of spinning-projectile attitude measurement, the sea-sky infrared radiation field is used to determine a spinning projectile's attitude angles in place of inertial sensors. First, the generation mechanism of sea-sky infrared radiation is analysed, and a mathematical model of the radiation in the long-wave (LW) infrared 8-14 μm band is derived by calculating the sea-surface and sky infrared radiation. Second, according to the motion characteristics of a spinning projectile, an attitude measurement model for infrared sensors on the projectile's three axes is established, and the feasibility of the model is analysed by simulation. Finally, an attitude calculation algorithm is designed to improve the accuracy of the attitude angle estimate. Semi-physical experiments show that the estimation error of the segmented interactive algorithm for the pitch and roll angles is within ±1.5°. The attitude measurement method is effective and feasible, and provides an accurate measurement basis for the guidance of spinning projectiles.
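
    One way to picture the measurement principle: as the projectile spins, an infrared sensor sweeping the sea-sky radiation field produces a roughly sinusoidal signal whose phase encodes the roll angle. The sketch below is a hedged toy stand-in (a least-squares phase fit at a known spin rate), not the paper's segmented interactive algorithm:

```python
import numpy as np

def roll_phase(t, signal, spin_hz):
    """Least-squares fit of s(t) ~ A*sin(wt) + B*cos(wt) + C at the known
    spin rate; the roll phase offset is atan2(B, A)."""
    w = 2.0 * np.pi * spin_hz
    M = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    A, B, _ = np.linalg.lstsq(M, signal, rcond=None)[0]
    return float(np.arctan2(B, A))
```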

  13. Performance of Backshort-Under-Grid Kilopixel TES Arrays for HAWC+

    NASA Technical Reports Server (NTRS)

    Staguhn, J. G.; Benford, D. J.; Dowell, C. D.; Fixsen, D. J.; Hilton, G. C.; Irwin, K. D.; Jhabvala, C. A.; Maher, S. F.; Miller, T. M.; Moseley, S. H.; hide

    2016-01-01

    We present results from laboratory detector characterizations of the first kilopixel BUG arrays for the High-resolution Airborne Wideband Camera Plus (HAWC+), the imaging far-infrared polarimeter camera for the Stratospheric Observatory for Infrared Astronomy (SOFIA). Our tests demonstrate that the array performance is consistent with the predicted properties. Here, we highlight results obtained for the thermal conductivity, noise performance, and detector speed, along with first optical results demonstrating the pixel yield of the arrays.

  14. Infrared imaging spectrometry by the use of bundled chalcogenide glass fibers and a PtSi CCD camera

    NASA Astrophysics Data System (ADS)

    Saito, Mitsunori; Kikuchi, Katsuhiro; Tanaka, Chinari; Sone, Hiroshi; Morimoto, Shozo; Yamashita, Toshiharu T.; Nishii, Junji

    1999-10-01

    A coherent fiber bundle for infrared image transmission was prepared by arranging 8400 chalcogenide (AsS) glass fibers. The fiber bundle, 1 m in length, is transmissive in the infrared spectral region of 1 - 6 micrometer. A remote spectroscopic imaging system was constructed with the fiber bundle and an infrared PtSi CCD camera. The system was used for the real-time observation (frame time: 1/60 s) of gas distribution. Infrared light from a SiC heater was delivered to a gas cell through a chalcogenide fiber, and transmitted light was observed through the fiber bundle. A band-pass filter was used for the selection of gas species. A He-Ne laser of 3.4 micrometer wavelength was also used for the observation of hydrocarbon gases. Gases bursting from a nozzle were observed successfully by a remote imaging system.

  15. AMICA (Antarctic Multiband Infrared CAmera) project

    NASA Astrophysics Data System (ADS)

    Dolci, Mauro; Straniero, Oscar; Valentini, Gaetano; Di Rico, Gianluca; Ragni, Maurizio; Pelusi, Danilo; Di Varano, Igor; Giuliani, Croce; Di Cianno, Amico; Valentini, Angelo; Corcione, Leonardo; Bortoletto, Favio; D'Alessandro, Maurizio; Bonoli, Carlotta; Giro, Enrico; Fantinel, Daniela; Magrin, Demetrio; Zerbi, Filippo M.; Riva, Alberto; Molinari, Emilio; Conconi, Paolo; De Caprio, Vincenzo; Busso, Maurizio; Tosti, Gino; Nucciarelli, Giuliano; Roncella, Fabio; Abia, Carlos

    2006-06-01

    The Antarctic Plateau offers unique opportunities for ground-based infrared astronomy. AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging from Dome C in the near- (1 - 5 μm) and mid- (5 - 27 μm) infrared wavelength regions. The camera consists of two channels, equipped with a Raytheon InSb 256 array detector and a DRS MF-128 Si:As IBC array detector, cryocooled to 35 and 7 K respectively. Cryogenic devices will move a filter wheel and a sliding mirror used to feed the two detectors alternately. Fast control and readout, synchronized with the chopping secondary mirror of the telescope, will be required because of the large background expected at these wavelengths, especially beyond 10 μm. An environmental control system is needed to ensure the correct start-up, shut-down and housekeeping of the camera. The main technical challenge is the extreme environmental conditions of Dome C (T about -90 °C, p around 640 mbar) and the need for complete automation of the overall system. AMICA will be mounted at the Nasmyth focus of the 80 cm IRAIT telescope and will perform survey-mode automatic observations of selected regions of the Southern sky. The first goal will be a direct estimate of the observational quality of this new, highly promising site for infrared astronomy. In addition, IRAIT, equipped with AMICA, is expected to provide a significant improvement in the knowledge of fundamental astrophysical processes, such as the late stages of stellar evolution (especially AGB and post-AGB stars) and star formation.

  16. Near infrared photography with a vacuum-cold camera. [Orion nebula observation

    NASA Technical Reports Server (NTRS)

    Rossano, G. S.; Russell, R. W.; Cornett, R. H.

    1980-01-01

    Cooled sensitized plates of the Orion nebula region and of Sh2-149 have been obtained in the wavelength ranges 8000-9000 A and 9000-11,000 A with a recently designed and constructed vacuum-cold camera. The sensitization procedures are described and the camera design is presented.

  17. A technical innovation for improving identification of the trackers by the LED cameras in navigation-assisted total knee arthroplasty.

    PubMed

    Darmanis, Spyridon; Toms, Andrew; Durman, Robert; Moore, Donna; Eyres, Keith

    2007-07-01

    To reduce the operating time in computer-assisted navigated total knee replacement (TKR) by improving communication between the infrared camera and the trackers placed on the patient. The innovation involves placing a routinely used laser pointer on top of the camera, so that the infrared camera focuses precisely on the trackers located on the knee being operated on. A prospective randomized study was performed involving 40 patients divided into two groups, A and B. Both groups underwent navigated TKR, but for group B patients a laser pointer was used to improve the targeting capability of the camera. Without the laser pointer, the camera had to be moved a mean of 9.2 times in order to identify the trackers. With the introduction of the laser pointer, this was reduced to 0.9 times. Accordingly, the additional mean time required without the laser pointer was 11.6 minutes. Time delays are a major problem in computer-assisted surgery, and our technical suggestion can help reduce the delays associated with this particular application.

  18. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. IRAIT project: future mid-IR operations at Dome C during summer

    NASA Astrophysics Data System (ADS)

    Tosti, Gino; IRAIT Collaboration

    The IRAIT project consists of a robotic mid-infrared telescope that will be hosted at Dome C, at the Italian-French Concordia station on the Antarctic Plateau. The telescope was built in collaboration with the PNRA (Technology and Earth-Sun Interaction and Astrophysics sectors). Its focal plane instrumentation is a mid-infrared camera (5-25 μm), based on the TIRCAM II prototype, which is the result of a joint effort between institutes of CNR and INAF. International collaborations with French and Spanish institutes for the construction of a near-infrared spectrographic camera have also been started. We present the status of the project and the ongoing developments that will make it possible to start infrared observations at Dome C during the 2005-2006 summer Antarctic campaign.

  20. The Absolute Reflectance and New Calibration Site of the Moon

    NASA Astrophysics Data System (ADS)

    Wu, Yunzhao; Wang, Zhenchao; Cai, Wei; Lu, Yu

    2018-05-01

    How bright the Moon is remains a simple but fundamental and important question. Although numerous efforts have been made to answer it, such as sophisticated electro-optical measurements and suggested calibration sites, the answer is still debated. An in situ measurement with a calibration panel on the surface of the Moon is crucial for obtaining the accurate absolute reflectance and resolving the debate. China’s Chang’E-3 (CE-3) “Yutu” rover accomplished this type of measurement using the Visible-Near Infrared Spectrometer (VNIS). The measurements of the VNIS, which were at large emission and phase angles, complement existing measurements for the range of photometric geometry. The in situ reflectance shows that the CE-3 landing site is very dark, with an average reflectance of 3.86% in the visible bands. The results are compared with recent mission instruments: the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC), the Spectral Profiler (SP) on board SELENE, the Moon Mineralogy Mapper (M3) on board Chandrayaan-1, and the Chang’E-1 Interference Imaging Spectrometer (IIM). The differences in the measurements of these instruments are very large and indicate inherent differences in their absolute calibration. The M3 and IIM measurements are smaller than LROC WAC and SP, and the VNIS measurement falls between these two pairs. When using the Moon as a radiance source for the on-orbit calibration of spacecraft instruments, one should be cautious about the data. We propose that the CE-3 landing site, a young and homogeneous surface, should serve as the new calibration site.

  1. Spatial variability in the seasonal south polar CAP of Mars

    NASA Astrophysics Data System (ADS)

    Calvin, Wendy M.; Martin, Terry Z.

    1994-10-01

    The first comprehensive discussion of the south seasonal polar cap spectra obtained by the Mariner 7 infrared spectrometer in the short-wavelength region (2-4 microns) is presented. The infrared spectra are correlated with images acquired by the wide-angle camera. Significant spectral variation is noted in the cap interior, and regions of varying water frost abundance, CO2 ice/frost cover, and CO2-ice path length can be distinguished. Many of these spectral variations correlate with heterogeneity noted in the camera images, but certain significant infrared spectral variations are not discernible in the visible. Simple reflectance models are used to classify the observed spectral variations into four regions. Region I is at the cap edge, where there is enhanced absorption beyond 3 microns, inferred to be caused by an increased abundance of water frost; the increase in water abundance over that in the interior is at the level of a few parts per thousand or less. Region II is the typical cap interior, characterized by spectral features of CO2 ice at grain sizes of several millimeters to centimeters; these spectra also indicate the presence of water frost at the parts-per-thousand level. A third, unusual region (III) is defined by three spectra in which weak CO2 absorption features are as much as twice as strong as in the average cap spectra and are assumed to be caused by an increased path length in the CO2. Such large paths are inconsistent with the high reflectance in the visible and at 2.2 microns and suggest layered structures or deposition conditions that are not accounted for in current reflectance models. The final region (IV) is an area of thinning frost coverage or transparent ice well in the interior of the seasonal cap; these spectra are a combination of CO2 and ground signatures.

  2. Spatial variability in the seasonal south polar cap of Mars

    NASA Technical Reports Server (NTRS)

    Calvin, Wendy M.; Martin, Terry Z.

    1994-01-01

    The first comprehensive discussion of the south seasonal polar cap spectra obtained by the Mariner 7 infrared spectrometer in the short-wavelength region (2-4 microns) is presented. The infrared spectra are correlated with images acquired by the wide-angle camera. Significant spectral variation is noted in the cap interior, and regions of varying water frost abundance, CO2 ice/frost cover, and CO2-ice path length can be distinguished. Many of these spectral variations correlate with heterogeneity noted in the camera images, but certain significant infrared spectral variations are not discernible in the visible. Simple reflectance models are used to classify the observed spectral variations into four regions. Region I is at the cap edge, where there is enhanced absorption beyond 3 microns, inferred to be caused by an increased abundance of water frost; the increase in water abundance over that in the interior is at the level of a few parts per thousand or less. Region II is the typical cap interior, characterized by spectral features of CO2 ice at grain sizes of several millimeters to centimeters; these spectra also indicate the presence of water frost at the parts-per-thousand level. A third, unusual region (III) is defined by three spectra in which weak CO2 absorption features are as much as twice as strong as in the average cap spectra and are assumed to be caused by an increased path length in the CO2. Such large paths are inconsistent with the high reflectance in the visible and at 2.2 microns and suggest layered structures or deposition conditions that are not accounted for in current reflectance models. The final region (IV) is an area of thinning frost coverage or transparent ice well in the interior of the seasonal cap; these spectra are a combination of CO2 and ground signatures.

  3. EARLY SCIENCE WITH SOFIA, THE STRATOSPHERIC OBSERVATORY FOR INFRARED ASTRONOMY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, E. T.; Becklin, E. E.; De Buizer, J. M.

    The Stratospheric Observatory For Infrared Astronomy (SOFIA) is an airborne observatory consisting of a specially modified Boeing 747SP with a 2.7 m telescope, flying at altitudes as high as 13.7 km (45,000 ft). Designed to observe at wavelengths from 0.3 μm to 1.6 mm, SOFIA operates above 99.8% of the water vapor that obscures much of the infrared and submillimeter. SOFIA has seven science instruments under development, including an occultation photometer; near-, mid-, and far-infrared cameras; infrared spectrometers; and heterodyne receivers. SOFIA, a joint project between NASA and the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt), began initial science flights in December 2010 and conducted 30 science flights in the subsequent year. During this early science period three instruments have flown: the mid-infrared camera FORCAST, the heterodyne spectrometer GREAT, and the occultation photometer HIPO. This Letter provides an overview of the observatory and its early performance.

  4. ARNICA, the NICMOS 3 imaging camera of TIRGO.

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Hunt, L.; Stanga, R.

    ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 μm that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1″ per pixel, with sky coverage of more than 4 arcmin × 4 arcmin on the NICMOS 3 detector array (256×256 pixels, 40 μm pixel side). The camera is remotely controlled by a PC 486, connected to the array control electronics via a fiber-optic link. A C-language package, running under MS-DOS on the PC 486, acquires and stores the frames and controls the timing of the array. The camera is intended for imaging of large extragalactic and Galactic fields; a large effort has been dedicated to exploring the possibility of achieving precise photometric measurements in the J, H, and K astronomical bands, with very promising results.

  5. Experience with the UKIRT InSb array camera

    NASA Technical Reports Server (NTRS)

    Mclean, Ian S.; Casali, Mark M.; Wright, Gillian S.; Aspin, Colin

    1989-01-01

    The cryogenic infrared camera, IRCAM, has been operating routinely on the 3.8 m UK Infrared Telescope on Mauna Kea, Hawaii for over two years. The camera, which uses a 62x58 element Indium Antimonide array from Santa Barbara Research Center, was designed and built at the Royal Observatory, Edinburgh which operates UKIRT on behalf of the UK Science and Engineering Research Council. Over the past two years at least 60% of the available time on UKIRT has been allocated for IRCAM observations. Described here are some of the properties of this instrument and its detector which influence astronomical performance. Observational techniques and the power of IR arrays with some recent astronomical results are discussed.

  6. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  7. Charon's light curves, as observed by New Horizons' Ralph color camera (MVIC) on approach to the Pluto system

    NASA Astrophysics Data System (ADS)

    Howett, C. J. A.; Ennico, K.; Olkin, C. B.; Buie, M. W.; Verbiscer, A. J.; Zangari, A. M.; Parker, A. H.; Reuter, D. C.; Grundy, W. M.; Weaver, H. A.; Young, L. A.; Stern, S. A.

    2017-05-01

    Light curves produced from color observations taken during New Horizons' approach to the Pluto system by its Multi-spectral Visible Imaging Camera (MVIC, part of the Ralph instrument) are analyzed. Fifty-seven observations, obtained between 9 April and 3 July 2015 at a phase angle of 14.5° to 15.1°, a sub-observer latitude of 51.2°N to 51.5°N, and a sub-solar latitude of 41.2°N, were analyzed. MVIC has four color channels; all are discussed for completeness, but only two were found to produce reliable light curves: Blue (400-550 nm) and Red (540-700 nm). The other two channels, Near Infrared (780-975 nm) and Methane-Band (860-910 nm), were found to be potentially erroneous and too noisy, respectively. The Blue and Red light curves show that Charon's surface is neutral in color but slightly brighter on its Pluto-facing hemisphere. This is consistent with previous studies made with the Johnson B and V bands, which are at shorter wavelengths than the MVIC Blue and Red channels, respectively.

  8. Charon's Light Curves, as Observed by New Horizons' Ralph Color Camera (MVIC) on Approach to the Pluto System.

    NASA Technical Reports Server (NTRS)

    Howett, C. J. A.; Ennico, K.; Olkin, C. B.; Buie, M. W.; Verbiscer, A. J.; Zangari, A. M.; Parker, A. H.; Reuter, D. C.; Grundy, W. M.; Weaver, H. A.; hide

    2016-01-01

    Light curves produced from color observations taken during New Horizons' approach to the Pluto system by its Multi-spectral Visible Imaging Camera (MVIC, part of the Ralph instrument) are analyzed. Fifty-seven observations, obtained between 9 April and 3 July 2015 at a phase angle of 14.5 degrees to 15.1 degrees, a sub-observer latitude of 51.2 degrees North to 51.5 degrees North, and a sub-solar latitude of 41.2 degrees North, were analyzed. MVIC has four color channels; all are discussed for completeness, but only two were found to produce reliable light curves: Blue (400-550 nm) and Red (540-700 nm). The other two channels, Near Infrared (780-975 nm) and Methane-Band (860-910 nm), were found to be potentially erroneous and too noisy, respectively. The Blue and Red light curves show that Charon's surface is neutral in color but slightly brighter on its Pluto-facing hemisphere. This is consistent with previous studies made with the Johnson B and V bands, which are at shorter wavelengths than the MVIC Blue and Red channels, respectively.

  9. Development of a portable multispectral thermal infrared camera

    NASA Technical Reports Server (NTRS)

    Osterwisch, Frederick G.

    1991-01-01

    The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument, designated AA465, has applications in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0-13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5-micron-wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display, which the operator uses to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-person operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single-exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times; as such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument was performed from June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The instrument was delivered to NASA in March 1991. The instrument's primary potential commercial and research use is as a field geologist's exploration tool. Other applications have been suggested but not investigated in depth; these are measurements for process control in commercial materials processing and quality control functions which require information on surface heterogeneity.

  10. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2016-04-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500-Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line-scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb; these are important for imaging shadowed craters, studying ∼1 m scale objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000, but a signal-dependent additive correction is required and applied for DN < 600. A predictive model of detector temperature and dark level was developed to command the dark-level offset; this avoids images with a cutoff at DN = 0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
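
    The correction sequence described in this record (background subtraction, a signal-dependent additive fix below DN 600, then flatfield division) can be sketched schematically. This is not the flight-derived LROC NAC correction; the function name and the nonlinearity coefficient below are illustrative assumptions only:

```python
import numpy as np

DN_LINEAR_MIN = 600  # detector is linear above this DN, per the abstract

def calibrate(raw, background, flatfield, nonlin_offset=5.0):
    """Schematic NAC-style correction: background subtraction, then a
    signal-dependent additive fix for low-signal pixels, then flatfield.
    `nonlin_offset` is a made-up constant standing in for the real,
    flight-derived correction."""
    dn = np.asarray(raw, dtype=float) - background
    low = dn < DN_LINEAR_MIN
    # Additive correction that shrinks to zero as DN approaches the linear range.
    dn[low] += nonlin_offset * (1.0 - dn[low] / DN_LINEAR_MIN)
    return dn / flatfield

# One low-signal pixel (gets the additive fix) and one in the linear range.
img = calibrate(np.array([100.0, 1500.0]), background=50.0, flatfield=1.0)
```

    Absolute radiometry would then come from a multiplicative factor tied to the ROLO cross-calibration, which is outside this sketch.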

  11. Geomorphologic mapping of the lunar crater Tycho and its impact melt deposits

    NASA Astrophysics Data System (ADS)

    Krüger, T.; van der Bogert, C. H.; Hiesinger, H.

    2016-07-01

    Using SELENE/Kaguya Terrain Camera and Lunar Reconnaissance Orbiter Camera (LROC) data, we produced a new, high-resolution (10 m/pixel), geomorphological and impact melt distribution map for the lunar crater Tycho. The distal ejecta blanket and crater rays were investigated using LROC wide-angle camera (WAC) data (100 m/pixel), while the fine-scale morphologies of individual units were documented using high resolution (∼0.5 m/pixel) LROC narrow-angle camera (NAC) frames. In particular, Tycho shows a large coherent melt sheet on the crater floor, melt pools and flows along the terraced walls, and melt pools on the continuous ejecta blanket. The crater floor of Tycho exhibits three distinct units, distinguishable by their elevation and hummocky surface morphology. The distribution of impact melt pools and ejecta, as well as topographic asymmetries, support the formation of Tycho as an oblique impact from the W-SW. The asymmetric ejecta blanket, significantly reduced melt emplacement uprange, and the depressed uprange crater rim at Tycho suggest an impact angle of ∼25-45°.

  12. Volcano monitoring with an infrared camera: first insights from Villarrica Volcano

    NASA Astrophysics Data System (ADS)

    Rosas Sotomayor, Florencia; Amigo Ramos, Alvaro; Velasquez Vargas, Gabriela; Medina, Roxana; Thomas, Helen; Prata, Fred; Geoffroy, Carolina

    2015-04-01

    This contribution focuses on the first trials of almost 24/7 monitoring of Villarrica volcano with an infrared camera. Results must be compared with other SO2 remote sensing instruments, such as DOAS and the UV camera, for the daytime measurements. Infrared remote sensing of volcanic emissions is a fast and safe method for obtaining gas abundances in volcanic plumes, in particular when access to the vent is difficult, during volcanic crises, and at night. In recent years, a ground-based infrared camera (Nicair) has been developed by Nicarnica Aviation, which quantifies SO2 and ash in volcanic plumes based on the infrared radiance at specific wavelengths through the application of filters. Three Nicair1 (first model) cameras have been acquired by the Geological Survey of Chile in order to study degassing of active volcanoes. Several trials with the instruments have been performed at northern Chilean volcanoes and have shown that the retrieved SO2 concentrations and fluxes fall within the expected intervals. Measurements were also performed at Villarrica volcano, where a location for a fixed camera was found 8 km from the crater: a coffee house with electrical power, a wifi network, committed owners, and a full view of the volcano summit. The first measurements are being made and processed in order to obtain full-day and full-week SO2 emission records, analyze data transfer and storage, improve remote control of the instrument and notebook in case of breakdown, and provide web-cam/GoPro support, all toward the goal of the project: a fixed station to monitor and study Villarrica volcano with a Nicair1, integrating and comparing these results with other remote sensing instruments. This work also aims to strengthen bonds with the community by developing teaching material and giving talks that communicate volcanic hazards and other geoscience topics to the people who live "just around the corner" from one of the most active volcanoes in Chile.

  13. Multiple-frame IR photo-recorder KIT-3M

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, E; Wilkins, P; Nebeker, N

    2006-05-15

    This paper reports the experimental results of a high-speed multi-frame infrared camera developed in Sarov at VNIIEF. Earlier [1] we discussed the possibility of creating a multi-frame infrared radiation photo-recorder with a framing frequency of about 1 MHz. The basis of the photo-recorder is a semiconductor ionization camera [2, 3], which converts IR radiation in the 1-10 micrometer spectral range into a visible image. Several sequential thermal images are registered by using the IR converter in conjunction with a multi-frame electron-optical camera. In the present report we discuss the performance characteristics of a prototype commercial 9-frame high-speed IR photo-recorder. The image converter records infrared images of thermal fields corresponding to temperatures ranging from 300 °C to 2000 °C with an exposure time of 1-20 μs at a frame frequency up to 500 kHz. The IR photo-recorder camera is useful for recording the time evolution of thermal fields in fast processes such as gas dynamics, ballistics, pulsed welding, thermal processing, the automotive industry, aircraft construction, and pulsed-power electric experiments, and for the measurement of spatial mode characteristics of IR-laser radiation.

  14. A fuzzy automated object classification by infrared laser camera

    NASA Astrophysics Data System (ADS)

    Kanazawa, Seigo; Taniguchi, Kazuhiko; Asari, Kazunari; Kuramoto, Kei; Kobashi, Syoji; Hata, Yutaka

    2011-06-01

    Home security at night is very important, and a system that watches a person's movements is useful for security. This paper describes a system for classifying adults, children, and other objects from the distance distribution measured by an infrared laser camera. The camera radiates near-infrared waves, receives the reflected waves, and converts the time of flight into a distance distribution. Our method consists of four steps. First, we perform background subtraction and noise rejection on the distance distribution. Second, we apply fuzzy clustering to the distance distribution, forming several clusters. Third, we extract features such as the height, thickness, aspect ratio, and area ratio of each cluster. Then, we build fuzzy if-then rules from knowledge of adults, children, and other objects so as to classify each cluster as adult, child, or other; here, we constructed a fuzzy membership function for each feature. Finally, we assign each cluster the class with the highest fuzzy degree among adult, child, and other. In our experiment, we set up the camera in a room and tested three cases. The method successfully classified them in real-time processing.
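
    The final rule-based step of the method above can be sketched in miniature. The triangular membership functions and height thresholds below are hypothetical stand-ins (the paper builds its rules from several features, not height alone):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical if-then rules over a single feature (cluster height in meters).
RULES = {
    "adult": lambda f: tri(f["height"], 1.4, 1.7, 2.1),
    "child": lambda f: tri(f["height"], 0.6, 1.1, 1.5),
    "other": lambda f: tri(f["height"], 0.0, 0.4, 0.8),
}

def classify(features):
    """Return the class with the highest fuzzy degree, plus all degrees."""
    degrees = {label: rule(features) for label, rule in RULES.items()}
    return max(degrees, key=degrees.get), degrees

label, degrees = classify({"height": 1.65})
print(label)  # adult
```

    A full implementation would combine the degrees from each feature's membership function (e.g., by taking their minimum) before picking the winning class.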

  15. Research on a solid state-streak camera based on an electro-optic crystal

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang

    2006-06-01

    With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, a streak camera is widely utilized in measuring ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and making measurements of laser-induced discharge. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity vs. position (or wavelength) information on the ultrafast process. Current streak cameras are based on a sweep electric pulse and an image-converting tube with a wavelength-sensitive photocathode covering the x-ray to near-infrared region; this kind of streak camera is comparatively costly and complex. This paper describes the design and performance of a new type of streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera achieves time resolution by direct photon-beam deflection using the electro-optic effect, and can replace current streak cameras from the visible to the near-infrared region. After computer-aided simulation, we designed a crystal streak camera with a potential time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits, a crystal with a larger electro-optic coefficient, for example LN (γ33 = 33.6×10⁻¹² m/V), and an optimized optical system may lead to a time resolution better than 1 ns.

  16. Seasonal variations of leaf and canopy properties tracked by ground-based NDVI imagery in a temperate forest

    DOE PAGES

    Yang, Hualei; Yang, Xi; Heskel, Mary; ...

    2017-04-28

    Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the infrared-camera-based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). Here we found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near-surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
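
    Camera-NDVI is the standard normalized difference computed per pixel from the NIR and visible (red) band reflectances. A minimal sketch, with illustrative reflectance values (the function name is ours, not from the study):

```python
import numpy as np

def camera_ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI = (NIR - red) / (NIR + red) from reflectance arrays.
    `eps` guards against division by zero on dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Dense canopy reflects strongly in NIR and weakly in red -> high NDVI (~0.8 here);
# bare soil reflects the two bands similarly -> NDVI near 0.
canopy = camera_ndvi([0.50], [0.05])
soil = camera_ndvi([0.25], [0.20])
```

    Tracking the daily mean of this quantity over a canopy region of interest is what lets the imagery follow leaf expansion through the season.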

  17. Seasonal variations of leaf and canopy properties tracked by ground-based NDVI imagery in a temperate forest

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Hualei; Yang, Xi; Heskel, Mary

    Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the infrared-camera-based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). Here we found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near-surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.

  18. Study of optical techniques for the Ames unitary wind tunnel. Part 5: Infrared imagery

    NASA Technical Reports Server (NTRS)

    Lee, George

    1992-01-01

    A survey of infrared thermography for aerodynamics was made. Particular attention was paid to boundary layer transition detection. IR thermography flow visualization of 2-D and 3-D separation was surveyed. Heat transfer measurements and surface temperature measurements were also covered. Comparisons of several commercial IR cameras were made. The use of a recently purchased IR camera in the Ames Unitary Plan Wind Tunnels was studied. Optical access for these facilities and the methods to scan typical models was investigated.

  19. The infrared camera application for calculating the impact of the feed screw thermal expansion on machining accuracy

    NASA Astrophysics Data System (ADS)

    Matras, A.

    2017-08-01

    The paper discusses the impact of feed screw heating on machining accuracy. The test stand was built around a Haas Mini Mill 2 CNC milling machine and a FLIR SC620 infrared camera. Measurements of the workpiece were performed on a Taylor Hobson Talysurf Intra 50 profilometer. The research showed that 60 minutes of intensive milling machine operation caused thermal expansion of the feed screw, which influenced the dimensional error of the workpiece.

  20. Use and validation of mirrorless digital single light reflex camera for recording of vitreoretinal surgeries in high definition

    PubMed Central

    Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish

    2018-01-01

    Purpose: The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Methods: Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Results: Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Conclusion: Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching. PMID:29283133

  2. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray-tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
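    The ray-tracing approach described above replaces the single pinhole projection with explicit refraction at each housing interface. As an illustrative sketch (not the authors' simulator; the flat-port geometry and refractive indices are assumptions), the vector form of Snell's law can be applied twice to trace a camera ray from air through a glass port into water:

    ```python
    import numpy as np

    def refract(d, n, n1, n2):
        """Refract unit direction d at an interface with unit normal n
        (pointing toward the incoming ray), using the vector form of Snell's law."""
        d = d / np.linalg.norm(d)
        n = n / np.linalg.norm(n)
        r = n1 / n2
        cos_i = -np.dot(n, d)
        sin2_t = r**2 * (1.0 - cos_i**2)
        if sin2_t > 1.0:
            return None  # total internal reflection
        cos_t = np.sqrt(1.0 - sin2_t)
        return r * d + (r * cos_i - cos_t) * n

    # A camera ray leaving the housing: air -> glass port -> water
    ray = np.array([0.3, 0.0, 1.0])
    normal = np.array([0.0, 0.0, -1.0])        # flat-port normal, toward the camera
    in_glass = refract(ray, normal, 1.000, 1.52)
    in_water = refract(in_glass, normal, 1.52, 1.33)
    ```

    Rays near the edge of a wide-angle lens's FOV hit the port at steep angles and bend the most, which is exactly why the pinhole model breaks down underwater and per-ray modeling is needed.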

  3. VizieR Online Data Catalog: Antennae galaxies (NGC 4038/4039) revisited (Whitmore+, 2010)

    NASA Astrophysics Data System (ADS)

    Whitmore, B. C.; Chandar, R.; Schweizer, F.; Rothberg, B.; Leitherer, C.; Rieke, M.; Rieke, G.; Blair, W. P.; Mengel, S.; Alonso-Herrero, A.

    2012-06-01

    Observations of the main bodies of NGC 4038/39 were made with the Hubble Space Telescope (HST), using the ACS, as part of Program GO-10188. Multi-band photometry was obtained in the following optical broadband filters: F435W (~B), F550M (~V), and F814W (~I). Archival F336W photometry of the Antennae (Program GO-5962) was used to supplement our optical ACS/WFC observations. Infrared observations were made using the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) camera on HST as part of Program GO-10188. Observations were made using the NIC2 camera with the F160W, F187N, and F237M filters, and the NIC3 camera with the F110W, F160W, F164W, F187N, and F222M filters. (10 data files).

  4. Characterization and optimization for detector systems of IGRINS

    NASA Astrophysics Data System (ADS)

    Jeong, Ueejeong; Chun, Moo-Young; Oh, Jae Sok; Park, Chan; Yuk, In-Soo; Oh, Heeyoung; Kim, Kang-Min; Ko, Kyeong Yeon; Pavel, Michael D.; Yu, Young Sam; Jaffe, Daniel T.

    2014-07-01

    IGRINS (Immersion GRating INfrared Spectrometer) is a high-resolution wide-band infrared spectrograph developed by the Korea Astronomy and Space Science Institute (KASI) and the University of Texas at Austin (UT). This spectrograph has H-band and K-band science cameras and a slit-viewing camera, all three of which use Teledyne's λc ~ 2.5 μm 2k×2k HgCdTe HAWAII-2RG CMOS detectors. The two spectrograph cameras employ science-grade detectors, while the slit-viewing camera includes an engineering-grade detector. Teledyne's cryogenic SIDECAR ASIC boards and JADE2 USB interface cards were installed to control these detectors. We performed experiments to characterize and optimize the detector systems in the IGRINS cryostat. We present measurements and optimization of noise, dark current, and reference-level stability obtained under dark conditions. We also discuss well depth, linearity, and conversion gain measurements obtained using an external light source.

  5. KSC-01pp1760

    NASA Image and Video Library

    2001-11-29

    KENNEDY SPACE CENTER, Fla. -- Fully unwrapped, the Advanced Camera for Surveys, which is suspended by an overhead crane, is checked over by workers. Part of the payload on the Hubble Space Telescope Servicing Mission, STS-109, the ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. Tasks for the mission include replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002

  6. Painted Saturn

    NASA Image and Video Library

    2014-09-29

    Saturn's many cloud patterns, swept along by high-speed winds, look as if they were painted on by some eager alien artist in this image from NASA's Cassini spacecraft. With no real surface features to slow them down, wind speeds on Saturn can top 1,100 mph (1,800 kph), more than four times the top speeds on Earth. This view looks toward the sunlit side of the rings from about 29 degrees above the ringplane. The image was taken with the Cassini spacecraft's wide-angle camera on April 4, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 1.1 million miles (1.8 million kilometers) from Saturn. Image scale is 68 miles (109 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18280

  7. Frozen Paradise

    NASA Image and Video Library

    2015-03-02

    Named after a Japanese paradise, the Senkyo region of Titan (the dark area below and to the right of center) is a bit less welcoming than its namesake. With a very inhospitable average temperature of approximately 290 degrees below zero Fahrenheit (-180 degrees Celsius), water on Titan (3,200 miles or 5,150 kilometers across) freezes hard enough to be essentially considered rock. This view looks toward the Saturn-facing side of Titan. North on Titan is up and rotated 33 degrees to the right. The image was taken with the Cassini spacecraft narrow-angle camera on Jan. 8, 2015 using a near-infrared filter which is centered at 938 nanometers. The view was acquired at a distance of approximately 1.2 million miles (1.9 million kilometers) from Titan. Image scale is 7 miles (11 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/pia18309

  8. Intraoperative Near-Infrared Fluorescence Imaging using indocyanine green in colorectal carcinomatosis surgery: Proof of concept.

    PubMed

    Barabino, G; Klein, J P; Porcheron, J; Grichine, A; Coll, J-L; Cottier, M

    2016-12-01

    This study assesses the value of using intraoperative near-infrared fluorescence imaging with indocyanine green to detect colorectal carcinomatosis during oncological surgery. In colorectal carcinomatosis, two of the most important prognostic factors are completeness of staging and completeness of cytoreductive surgery. Presently, intraoperative assessment of tumoral margins relies on palpation and visual inspection. The recent introduction of near-infrared fluorescence image guidance provides new opportunities in surgery, particularly cancer surgery. The study was a non-randomized, monocentric, pilot "ex vivo" blinded clinical trial validated by the ethics committee of the University Hospital of Saint-Etienne. Ten patients with colorectal carcinomatosis scheduled for cytoreductive surgery were included. Patients received 0.25 mg/kg of indocyanine green intravenously 24 h before surgery. A near-infrared camera was used to detect fluorescent lesions "ex vivo". There was no surgical mortality. Each analysis was done blindly. Of a total of 88 lesions analyzed, 58 were classified by a pathologist as cancerous and 30 as non-cancerous. Among the 58 cancerous lesions, 42 were correctly classified by the intraoperative near-infrared camera (sensitivity of 72.4%). Among the 30 non-cancerous lesions, 18 were correctly classified (specificity of 60.0%). Near-infrared fluorescence imaging is a promising technique for intraoperative tumor identification. It could help the surgeon determine resection margins and reduce the risk of locoregional recurrence. Copyright © 2016 Elsevier Ltd, BASO ~ the Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.

  9. Time-dependent spatial intensity profiles of near-infrared idler pulses from nanosecond optical parametric oscillators

    NASA Astrophysics Data System (ADS)

    Olafsen, L. J.; Olafsen, J. S.; Eaves, I. K.

    2018-06-01

    We report on an experimental investigation of the time-dependent spatial intensity distribution of near-infrared idler pulses from an optical parametric oscillator measured using an infrared (IR) camera, in contrast to beam profiles obtained using traditional knife-edge techniques. Comparisons show the information gained by utilizing the thermal camera provides more detail than the spatially- or time-averaged measurements from a knife-edge profile. Synchronization, averaging, and thresholding techniques are applied to enhance the images acquired. The additional information obtained can improve the process by which semiconductor devices and other IR lasers are characterized for their beam quality and output response and thereby result in IR devices with higher performance.

  10. A portable W-band radar system for enhancement of infrared vision in fire fighting operations

    NASA Astrophysics Data System (ADS)

    Klenner, Mathias; Zech, Christian; Hülsmann, Axel; Kühn, Jutta; Schlechtweg, Michael; Hahmann, Konstantin; Kleiner, Bernhard; Ulrich, Michael; Ambacher, Oliver

    2016-10-01

    In this paper, we present a millimeter wave radar system which will enhance the performance of infrared cameras used for fire-fighting applications. The radar module is compact and lightweight such that the system can be combined with inertial sensors and integrated in a hand-held infrared camera. This allows for precise distance measurements in harsh environmental conditions, such as tunnel or industrial fires, where optical sensors are unreliable or fail. We discuss the design of the RF front-end, the antenna and a quasi-optical lens for beam shaping as well as signal processing and demonstrate the performance of the system by in situ measurements in a smoke filled environment.

  11. High speed Infrared imaging method for observation of the fast varying temperature phenomena

    NASA Astrophysics Data System (ADS)

    Moghadam, Reza; Alavi, Kambiz; Yuan, Baohong

    With recent improvements in high-end commercial R&D camera technology, many challenges in high-speed IR imaging have been overcome. The core benefits of this technology are the ability to capture fast-varying phenomena without image blur, to acquire enough data to properly characterize dynamic energy, and to increase the dynamic range without compromising the number of frames per second. This study presents a noninvasive method for determining the intensity field of a high-intensity focused ultrasound (HIFU) beam using infrared imaging. A high-speed infrared camera was placed above the tissue-mimicking material that was heated by HIFU, with no other sensors present in the HIFU axial beam. A MATLAB simulation code was used to perform a finite-element solution of the pressure-wave propagation and heat equations within the phantom, and the temperature rise in the phantom was computed. Three different power levels of HIFU transducers were tested, and the predicted temperature increases were within about 25% of the IR measurements. The fundamental theory and methods developed in this research can be used to detect fast-varying temperature phenomena in combination with infrared filters.

  12. AMICA: The First camera for Near- and Mid-Infrared Astronomical Imaging at Dome C

    NASA Astrophysics Data System (ADS)

    Straniero, O.; Dolci, M.; Valentini, A.; Valentini, G.; di Rico, G.; Ragni, M.; Giuliani, C.; di Cianno, A.; di Varano, I.; Corcione, L.; Bortoletto, F.; D'Alessandro, M.; Magrin, D.; Bonoli, C.; Giro, E.; Fantinel, D.; Zerbi, F. M.; Riva, A.; de Caprio, V.; Molinari, E.; Conconi, P.; Busso, M.; Tosti, G.; Abia, C. A.

    AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging in the near- (1-5 μm) and mid- (5-27 μm) infrared wavelength regions. Equipped with two detectors, an InSb 256×256 and a Si:As 128×128 IBC, cooled to 35 and 7 K respectively, it will be the first instrument to investigate the potential of the Italian-French base Concordia for IR astronomy. The main technical challenge is represented by the extreme conditions of Dome C (T ~ -90 °C, p ~ 640 mbar). An environmental control system ensures the correct start-up, shut-down, and housekeeping of the various components of the camera. AMICA will be mounted on the IRAIT telescope and will perform survey-mode observations of the Southern sky. Its first task is to provide important site-quality data. Substantial contributions to the solution of fundamental astrophysical questions, such as those related to the late phases of stellar evolution and to star-formation processes, are also expected.

  13. Infrared needle mapping to assist biopsy procedures and training.

    PubMed

    Shar, Bruce; Leis, John; Coucher, John

    2018-04-01

    A computed tomography (CT) biopsy is a radiological procedure in which a needle is used to withdraw tissue or a fluid specimen from a lesion of interest inside a patient's body. The needle is progressively advanced into the patient's body, guided by the most recent CT scan. CT-guided biopsies invariably expose patients to high doses of radiation, due to the number of scans required while the needle is advanced. This study details the design of a novel method to aid biopsy procedures using infrared cameras. Two cameras are used to image the biopsy needle area, from which the proposed algorithm computes an estimate of the needle endpoint, which is projected onto the CT image space. This estimated position may be used to guide the needle between scans, resulting in a reduction in the number of CT scans that need to be performed during the biopsy procedure. The authors formulate a 2D augmentation system which compensates for camera pose, and show that multiple low-cost infrared imaging devices provide a promising approach.
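    The abstract does not spell out the algorithm, but the core step, estimating a 3D point from two calibrated cameras, is classic two-view triangulation. A minimal sketch under stated assumptions (the camera centres, ray directions, and the midpoint method below are illustrative, not the authors' implementation):

    ```python
    import numpy as np

    def triangulate(p1, d1, p2, d2):
        """Midpoint of the shortest segment between two skew viewing rays
        p_i + t_i * d_i (camera centre p_i, back-projected direction d_i)."""
        d1 = d1 / np.linalg.norm(d1)
        d2 = d2 / np.linalg.norm(d2)
        w0 = p1 - p2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b            # ~0 when the rays are near-parallel
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
        return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
    ```

    In practice the directions come from each camera's intrinsic calibration, and the resulting point is then mapped into the CT image space via the pose compensation the authors describe.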

  14. STRONG EVIDENCE FOR THE DENSITY-WAVE THEORY OF SPIRAL STRUCTURE IN DISK GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pour-Imani, Hamed; Kennefick, Daniel; Kennefick, Julia

    2016-08-10

    The density-wave theory of galactic spiral-arm structure makes a striking prediction: the pitch angle of spiral arms should vary with the wavelength of the galaxy's image. The reason is that stars are born in the density wave but move out of it as they age. They move ahead of the density wave inside the co-rotation radius and fall behind outside of it, resulting in a tighter pitch angle at wavelengths that image stars (optical and near-infrared) than at those associated with star formation (far-infrared and ultraviolet). In this study we combined a large sample size with a wide range of wavelengths, from the ultraviolet to the infrared, to investigate this issue. For each galaxy we used an optical-wavelength image (B-band: 445 nm) and images from the Spitzer Space Telescope at two infrared wavelengths (3.6 and 8.0 μm), and we measured the pitch angle with the 2DFFT and Spirality codes. We find that the B-band and 3.6 μm images have smaller pitch angles than the infrared 8.0 μm image in all cases, in agreement with the prediction of density-wave theory. We also used ultraviolet images from the Galaxy Evolution Explorer, whose pitch angles agreed with the measurements made at 8 μm.

  15. Impact of autofluorescence-based identification of parathyroids during total thyroidectomy on postoperative hypocalcemia: a before and after controlled study.

    PubMed

    Benmiloud, Fares; Rebaudet, Stanislas; Varoquaux, Arthur; Penaranda, Guillaume; Bannier, Marie; Denizot, Anne

    2018-01-01

    The clinical impact of intraoperative autofluorescence-based identification of the parathyroids using a near-infrared camera remains unknown. In a before-and-after controlled study, we compared all patients who underwent total thyroidectomy by the same surgeon during Period 1 (January 2015 to January 2016) without near-infrared imaging (near-infrared- group) with those operated on during Period 2 (February 2016 to September 2016) using a near-infrared camera (near-infrared+ group). In parallel, we also compared all patients who underwent surgery without near-infrared imaging during those same periods by another surgeon in the same unit (control groups). Main outcomes included postoperative hypocalcemia, parathyroid identification, autotransplantation, and inadvertent resection. The near-infrared+ group displayed significantly lower postoperative hypocalcemia rates (5.2%) than the near-infrared- group (20.9%; P < .001). Compared with the near-infrared- patients, the near-infrared+ group exhibited an increased mean number of identified parathyroids and reduced parathyroid autotransplantation rates, although no difference was observed in inadvertent resection rates. Parathyroids were identified via near-infrared imaging before they were visualized by the surgeon in 68% of patients. In the control groups, parathyroid identification improved significantly from Period 1 to Period 2, although autotransplantation, inadvertent resection, and postoperative hypocalcemia rates did not differ. Near-infrared use during total thyroidectomy significantly reduced postoperative hypocalcemia, improved parathyroid identification, and reduced the autotransplantation rate. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Investigation into the use of photoanthropometry in facial image comparison.

    PubMed

    Moreton, Reuben; Morley, Johanna

    2011-10-10

    Photoanthropometry is a metric-based facial image comparison technique. Measurements of the face are taken from an image using predetermined facial landmarks. Measurements are then converted to proportionality indices (PIs) and compared to PIs from another facial image. Photoanthropometry has been presented as a facial image comparison technique in UK courts for over 15 years. It is generally accepted that extrinsic factors (e.g. orientation of the head, camera angle, and distance from the camera) can cause discrepancies in anthropometric measurements of the face from photographs. However, there has been limited empirical research into quantifying the influence of such variables. The aim of this study was to determine the reliability of photoanthropometric measurements between different images of the same individual taken with different angulations of the camera. The study examined the facial measurements of 25 individuals from high-resolution photographs taken at different horizontal and vertical camera angles in a controlled environment. Results show that the degree of variability in facial measurements of the same individual due to variations in camera angle can be as great as the variability of facial measurements between different individuals. The results suggest that photoanthropometric facial comparison, as it is currently practiced, is unsuitable for elimination purposes. Preliminary investigations into the effects of distance from the camera and image resolution in poor-quality images suggest that such images are not an accurate representation of an individual's face; however, further work is required. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
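    PIs are simple distance ratios, which is also why they are so sensitive to camera angle: a pose change alters each projected distance by a different amount. A minimal sketch of the PI computation (the landmark names, coordinates, and choice of base distance are illustrative assumptions):

    ```python
    import numpy as np

    def proportionality_indices(landmarks, pairs, base):
        """landmarks: dict of 2D image coordinates; pairs: named landmark pairs;
        base: the landmark pair whose distance normalizes all the others."""
        dist = lambda a, b: float(np.linalg.norm(np.asarray(landmarks[a]) -
                                                 np.asarray(landmarks[b])))
        base_d = dist(*base)
        return {name: dist(a, b) / base_d for name, (a, b) in pairs.items()}

    # Hypothetical frontal-view landmarks (pixel coordinates)
    marks = {"left_ex": (100, 200), "right_ex": (220, 200),   # outer eye corners
             "nasion": (160, 195), "subnasale": (160, 260)}
    pis = proportionality_indices(
        marks,
        {"nose_length": ("nasion", "subnasale")},
        base=("left_ex", "right_ex"))        # normalize by inter-eye distance
    ```

    Because each projected distance changes differently with head pose, the same face yields different PIs at different camera angles; that between-image variability is exactly what the study quantifies.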

  17. Evaluation of Suppression of Hydroprocessed Renewable Jet (HRJ) Fuel Fires with Aqueous Film Forming Foam (AFFF)

    DTIC Science & Technology

    2011-07-01

    cameras were installed around the test pan and an underwater GoPro® video camera recorded the fire from below the layer of fuel. 3.2.2. Camera Images... 3.2.3. Video Images A GoPro® video camera with a wide-angle lens recorded the tests... camera and the GoPro® video camera were not used for fire suppression experiments. 3.3.2. Test Pans Two ¼-in thick stainless steel test pans were

  18. Early forest fire detection using principal component analysis of infrared video

    NASA Astrophysics Data System (ADS)

    Saghri, John A.; Radjabi, Ryan; Jacobs, John T.

    2011-09-01

    A land-based early forest fire detection scheme that exploits the infrared (IR) temporal signature of a fire plume is described. Unlike common land-based and/or satellite-based techniques, which rely on measurement and discrimination of the fire plume directly from its infrared and/or visible reflectance imagery, this scheme is based on exploitation of the fire plume's temporal signature, i.e., temperature fluctuations over the observation period. The method is simple and relatively inexpensive to implement, and the false alarm rate is expected to be lower than that of existing methods. Land-based IR cameras are installed in a step-stare-mode configuration in potential fire-prone areas. The sequence of IR video frames from each camera is digitally processed to determine if there is a fire within the camera's field of view (FOV). The process involves applying a principal component transformation (PCT) to each nonoverlapping sequence of video frames from the camera to produce a corresponding sequence of temporally uncorrelated principal component (PC) images. Since the pixels that form a fire plume exhibit statistically similar temporal variation (i.e., have a unique temporal signature), PCT conveniently renders the footprint/trace of the fire plume in low-order PC images. The PC image that best reveals the trace of the fire plume is then selected and spatially filtered via simple threshold and median filter operations to remove background clutter, such as traces of tree branches moving in the wind.
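    The PCT step can be sketched in a few lines: taking the covariance over time makes each eigenvector a temporal pattern, and its projection onto the video is a PC image in which pixels sharing that pattern, such as a flickering plume, stand out. The sketch below is illustrative only (the component choice, threshold, and filter size are assumptions, not the authors' pipeline):

    ```python
    import numpy as np
    from scipy.ndimage import median_filter

    def plume_trace(frames, pc_index=0, k=3.0):
        """frames: (T, H, W) block of IR video.  Project the video onto one
        temporal principal component and threshold the resulting PC image."""
        T, H, W = frames.shape
        X = frames.reshape(T, -1).astype(float)
        X -= X.mean(axis=0)                    # remove the static background
        C = (X @ X.T) / (X.shape[1] - 1)       # T x T temporal covariance
        _, vecs = np.linalg.eigh(C)            # eigenvalues in ascending order
        pc_img = np.abs(vecs[:, -1 - pc_index] @ X).reshape(H, W)
        mask = pc_img > pc_img.mean() + k * pc_img.std()
        # median filter suppresses isolated clutter (e.g., moving branches)
        return median_filter(mask.astype(np.uint8), size=3).astype(bool)
    ```

    On a synthetic block where a small patch of pixels flickers coherently above background noise, the patch dominates a low-order PC image and survives the threshold while uncorrelated noise does not.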

  19. Methods for LWIR Radiometric Calibration and Characterization

    NASA Technical Reports Server (NTRS)

    Ryan, Robert; Harrington, Gary; Howell, Dane; Pagnutti, Mary; Zanoni, Vicki

    2002-01-01

    The utility of a remote sensing system increases with its ability to retrieve surface temperature or radiance accurately. Research applications, such as sea temperature and power plant discharge, require a 0.2 C resolution or better for absolute temperature retrievals. Other applications, including agricultural water stress detection, require at least a 1 C resolution. To achieve these levels of accuracy routinely, scientists must perform laboratory and onboard calibration, as well as in-flight vicarious radiometric characterization. A common approach used for in-flight radiometric characterization incorporates a well-calibrated infrared radiometer that is mounted on a buoy and placed on a uniform water body. The radiometer monitors radiant temperature along with pressure, humidity, and temperature measurements of an associated column of atmosphere. On very still waters, however, a buoy can significantly disturb these measurements. Researchers at NASA's Stennis Space Center (SSC) have developed a novel approach of using an uncooled infrared camera mounted on a boom to quantify buoy effects. Another critical aspect of using buoy-mounted infrared radiometers is the need for extensive laboratory characterization of the instruments' radiometric sensitivity, field of view, and spectral response. Proper surface temperature retrieval also requires detailed knowledge of both the upward emission and the reflected sky emission. Recent work at SSC has demonstrated that the use of a polarization-based radiometer operating at the Brewster angle can greatly simplify temperature retrieval as well as improve overall accuracy.

  20. HST NICMOS Observations of the Polarization of NGC 1068

    NASA Technical Reports Server (NTRS)

    Simpson, Janet P.; Colgan, Sean W. J.; Erickson, Edwin F.; Hines, Dean C.; Schultz, A. S. B.; Trammell, Susan R.; DeVincenzi, D. (Technical Monitor)

    2002-01-01

    We have observed the polarized light at 2 microns in the center of NGC 1068 with HST (Hubble Space Telescope) NICMOS (Near Infrared Camera and Multi-Object Spectrometer) Camera 2. The nucleus is dominated by a bright, unresolved source, polarized at a level of 6.0 +/- 1.2% with a position angle of 122 degrees +/- 1.5 degrees. There are two polarized lobes extending up to 8" northeast and southwest of the nucleus. The polarized flux in both lobes is quite clumpy, with the maximum polarization occurring in the southwest lobe at a level of 17% when smoothed to 0.23" resolution. The perpendiculars to the polarization vectors in these two lobes point back to the intense unresolved nuclear source to within one 0.076" Camera 2 pixel, thereby confirming that this source is the origin of the scattered light and therefore the probable AGN (Active Galactic Nucleus) central engine. Whereas the polarization of the nucleus is probably caused by dichroic absorption, the polarization in the lobes is almost certainly caused by scattering, with very little contribution from dichroic absorption. Features in the polarized lobes include a gap at a distance of about 1" from the nucleus in the southwest lobe and a "knot" of emission about 5" northeast of the nucleus. Both features had been discussed by ground-based observers, but they are much better defined with the high spatial resolution of NICMOS. The northeast knot may be the side of a molecular cloud facing the nucleus; this cloud may be preventing the expansion of the northeast radio lobe at the head of the radio synchrotron-radiation-emitting jet. We also report the presence of two ghosts in the Camera 2 polarizers.

  1. Designing the optimal semi-warm NIR spectrograph for SALT via detailed thermal analysis

    NASA Astrophysics Data System (ADS)

    Wolf, Marsha J.; Sheinis, Andrew I.; Mulligan, Mark P.; Wong, Jeffrey P.; Rogers, Allen

    2008-07-01

    The near infrared (NIR) upgrade to the Robert Stobie Spectrograph (RSS) on the Southern African Large Telescope (SALT), RSS/NIR, extends the spectral coverage of all modes of the optical spectrograph. The RSS/NIR is a low- to medium-resolution spectrograph with broadband, spectropolarimetric, and Fabry-Perot imaging capabilities. The optical and NIR arms can be used simultaneously to extend spectral coverage from 3200 Å to approximately 1.6 μm. Both arms utilize high-efficiency volume phase holographic gratings via articulating gratings and cameras. The NIR camera incorporates a HAWAII-2RG detector with an Epps optical design consisting of 6 spherical elements, providing subpixel rms image sizes of 7.5 +/- 1.0 μm over all wavelengths and field angles. The NIR spectrograph is semi-warm, sharing a common slit plane and partial collimator with the optical arm. A pre-dewar, cooled to below ambient temperature, houses the final NIR collimator optic, the grating/Fabry-Perot etalon, the polarizing beam splitter, and the first three camera optics. The last three camera elements, blocking filters, and detector are housed in a cryogenically cooled dewar. The semi-warm design concept has long been proposed as an economical way to extend optical instruments into the NIR; however, success has been very limited. A major portion of our design effort entails a detailed thermal analysis using non-sequential ray tracing to interactively guide the mechanical design and determine a truly realizable long-wavelength cutoff over which astronomical observations will be sky-limited. In this paper we describe our thermal analysis, design concepts for the staged cooling scheme, and results to be incorporated into the overall mechanical design and baffling.

  2. Bending strength measurements at different materials used for IR-cut filters in mobile camera devices

    NASA Astrophysics Data System (ADS)

    Dietrich, Volker; Hartmann, Peter; Kerz, Franca

    2015-03-01

    Digital cameras are present everywhere in our daily life; science, business, and private life cannot be imagined without digital images. The quality of an image is often rated by its color rendering. To obtain correct color recognition, a near-infrared cut (IRC) filter must be used to alter the sensitivity of the imaging sensor. Increasing requirements related to color balance and larger angles of incidence (AOI) have forced the use of new materials, such as the BG6X series, which substitutes for interference-coated filters on D263 thin glass. Although the optical properties are the major design criteria, devices have to withstand numerous environmental conditions during use and manufacturing, e.g., temperature change, humidity, and mechanical shock, as well as mechanical stress. The new materials behave differently with respect to all these aspects; they are usually more sensitive to these requirements to a larger or smaller extent, and their mechanical strength is especially different. Reliable strength data are of major interest for mobile phone camera applications. Because the bending strength of a glass component depends not only on the material itself but mainly on the surface treatment and test conditions, a single number for the strength may be misleading if the conditions of the test and the samples are not described precisely. Therefore, Schott started investigations into the bending strength of various IRC filter materials. Different test methods were used to obtain statistically relevant data.

  3. Improved signal to noise ratio and sensitivity of an infrared imaging video bolometer on large helical device by using an infrared periscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Shwetang N., E-mail: pandya.shwetang@LHD.nifs.ac.jp; Sano, Ryuichi; Peterson, Byron J.

    An Infrared imaging Video Bolometer (IRVB) diagnostic is currently being used in the Large Helical Device (LHD) for studying the localization of radiation structures near the magnetic island and helical divertor X-points during plasma detachment and for 3D tomography. This research demands a high signal to noise ratio (SNR) and sensitivity, to improve the temporal resolution for studying the evolution of radiation structures during plasma detachment, and a wide IRVB field of view (FoV) for tomography. Introduction of an infrared periscope allows achievement of a higher SNR and higher sensitivity, which in turn permits a twofold improvement in the temporal resolution of the diagnostic. Higher SNR along with a wide FoV is achieved simultaneously by reducing the separation of the IRVB detector (metal foil) from the bolometer's aperture and the LHD plasma. Altering the distances to meet these requirements results in an increased separation between the foil and the IR camera, which by itself would degrade the diagnostic's sensitivity by 1.5-fold. Using an infrared periscope to image the IRVB foil results in a 7.5-fold increase in the number of IR camera pixels imaging the foil. This improves the IRVB sensitivity, which depends on the square root of the number of IR camera pixels averaged per bolometer channel. Despite the slower f-number (f/# = 1.35) and reduced transmission (τ0 = 89%, due to an increased number of lens elements) of the periscope, the diagnostic with an infrared periscope operational on LHD has improved in sensitivity and SNR by factors of 1.4 and 4.5, respectively, compared to the original diagnostic without a periscope (i.e., the IRVB foil imaged directly by the IR camera through conventional optics). The bolometer's field of view has also doubled. The paper discusses these improvements in detail.
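    The square-root scaling quoted in this abstract can be sanity-checked with a quick back-of-the-envelope calculation; this is an illustrative sketch assuming the pixel gain and the geometric loss combine multiplicatively (the paper's full noise budget, which also folds in the periscope's f-number and transmission, refines the final 1.4-fold figure):

```python
import math

# Sensitivity scales as the square root of the number of IR-camera pixels
# averaged per bolometer channel, so a 7.5-fold pixel increase gives:
pixel_gain = math.sqrt(7.5)            # ~2.74

# The larger foil-to-camera separation alone degrades sensitivity 1.5-fold:
geometry_loss = 1.5

# Combining the two multiplicatively gives a rough upper bound on the net
# sensitivity improvement, before periscope throughput losses (slower
# f-number, 89% transmission) bring it down toward the reported 1.4-fold:
net_gain = pixel_gain / geometry_loss  # ~1.83
```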

  4. Infrared-enhanced TV for fire detection

    NASA Technical Reports Server (NTRS)

    Hall, J. R.

    1978-01-01

    Closed-circuit television is superior to conventional smoke or heat sensors for detecting fires in large open spaces. A single TV camera scans the entire area, whereas many conventional sensors and a maze of interconnecting wiring might be required for the same coverage. The camera is monitored by a person who trips an alarm if a fire is detected, or electronic circuitry can process the camera signal for a fully automatic alarm system.

  5. Uncertainty of Videogrammetric Techniques used for Aerodynamic Testing

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Liu, Tianshu; DeLoach, Richard

    2002-01-01

    The uncertainty of videogrammetric techniques used for the measurement of static aeroelastic wind tunnel model deformation and wind tunnel model pitch angle is discussed. Sensitivity analyses and geometrical considerations of uncertainty are augmented by analyses of experimental data in which videogrammetric angle measurements were taken simultaneously with precision servo accelerometers corrected for dynamics. An analysis of variance (ANOVA) is presented to examine error dependence on angle of attack, on the sensor used (inertial or optical), and on tunnel state variables such as Mach number. Experimental comparisons with a high-accuracy indexing table are presented. Small roll angles are found to introduce a zero-shift in the measured angles. It is shown experimentally that, provided the proper constraints necessary for a solution are met, a single-camera solution can be comparable to a two-camera intersection result. The relative immunity of optical techniques to dynamics is illustrated.

  6. Design and Calibration of a Dispersive Imaging Spectrometer Adaptor for a Fast IR Camera on NSTX-U

    NASA Astrophysics Data System (ADS)

    Reksoatmodjo, Richard; Gray, Travis; Princeton Plasma Physics Laboratory Team

    2017-10-01

    A dispersive spectrometer adaptor was designed, constructed, and calibrated for use on a fast infrared camera employed to measure temperatures on the lower divertor tiles of the NSTX-U tokamak. This adaptor efficiently and evenly filters and distributes long-wavelength infrared photons between 8.0 and 12.0 microns across the 128×128 pixel detector of the fast IR camera. By determining the width of these separated wavelength bands across the camera detector, and then determining the corresponding average photon count for each photon wavelength, the temperature, and thus the heat flux, of the divertor tiles can be calculated very accurately using Planck's law. This approach of designing an exterior dispersive adaptor for the fast IR camera allows accurate temperature measurements to be made of materials with unknown emissivity. Further, the relative simplicity and affordability of this adaptor design provides an attractive option over more expensive, slower, dispersive IR camera systems. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No. DE-AC02-09CH11466.
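    The final step described above, turning a measured in-band radiance into a temperature, rests on inverting Planck's law. A minimal single-wavelength sketch (assuming a blackbody, i.e. unit emissivity, and ignoring the band averaging that the real adaptor performs over each wavelength bin) might look like:

```python
import math

H = 6.62607015e-34  # Planck constant (J s)
C = 2.99792458e8    # speed of light (m/s)
K = 1.380649e-23    # Boltzmann constant (J/K)

def planck_radiance(wl, T):
    """Planck's law: spectral radiance B(wl, T) in W sr^-1 m^-3
    for wavelength wl (m) and temperature T (K)."""
    return (2 * H * C**2 / wl**5) / math.expm1(H * C / (wl * K * T))

def brightness_temperature(wl, B):
    """Invert Planck's law at one wavelength: the temperature of a
    blackbody producing spectral radiance B at wavelength wl."""
    return (H * C / (wl * K)) / math.log1p(2 * H * C**2 / (wl**5 * B))

# Round trip at 10 microns (mid LWIR band) and 600 K:
wl = 10e-6
B = planck_radiance(wl, 600.0)
T = brightness_temperature(wl, B)  # recovers ~600 K
```

Using `expm1`/`log1p` keeps the round trip numerically stable at long wavelengths, where the exponent is small.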

  7. Directional infrared temperature and emissivity of vegetation: Measurements and models

    NASA Technical Reports Server (NTRS)

    Norman, J. M.; Castello, S.; Balick, L. K.

    1994-01-01

    Directional thermal radiance from vegetation depends on many factors, including the architecture of the plant canopy, thermal irradiance, emissivity of the foliage and soil, view angle, slope, and the kinetic temperature distribution within the vegetation-soil system. A one-dimensional model, which includes the influence of topography, indicates that thermal emissivity of vegetation canopies may remain constant with view angle, or emissivity may increase or decrease as view angle from nadir increases. Typically, variations of emissivity with view angle are less than 0.01. As view angle increases away from nadir, directional infrared canopy temperature usually decreases but may remain nearly constant or even increase. Variations in directional temperature with view angle may be 5 °C or more. Model predictions of directional emissivity are compared with field measurements in corn canopies and over a bare soil using a method that requires two infrared thermometers, one sensitive to the 8 to 14 micrometer wavelength band and a second to the 14 to 22 micrometer band. After correction for CO2 absorption by the atmosphere, a directional canopy emissivity can be obtained as a function of view angle in the 8 to 14 micrometer band to an accuracy of about 0.005. Modeled and measured canopy emissivities for corn varied slightly with view angle (0.990 at nadir and 0.982 at a 75 deg view zenith angle) and did not appear to vary significantly with view angle for the bare soil. Canopy emissivity is generally nearer to unity than leaf emissivity. At high spectral resolution, canopy thermal emissivity may vary by 0.02 with wavelength even though leaf emissivity may vary by 0.07. The one-dimensional model provides reasonably accurate predictions of infrared temperature and can be used to study the dependence of infrared temperature on various plant, soil, and environmental factors.

  8. Miranda

    NASA Image and Video Library

    1999-08-24

    One wide-angle and eight narrow-angle camera images of Miranda, taken by NASA Voyager 2, were combined in this view. The controlled mosaic was transformed to an orthographic view centered on the south pole.

  9. An infrared jet in Centaurus A (NGC 5128): Evidence for interaction between the active nucleus and the interstellar medium

    NASA Technical Reports Server (NTRS)

    Joy, Marshall; Harvey, P. M.; Tollestrup, E. V.; Mcgregor, P. J.; Hyland, A. R.

    1990-01-01

    In the present study, higher-resolution near infrared images of the visually obscured central region of Centaurus A were obtained in order to investigate the effects of the active nucleus on the surrounding galaxy. Researchers present J (1.25 microns), H (1.65 microns), and K (2.2 microns) images of the central 40 arcseconds of the galaxy, taken with the Univ. of Texas InSb array camera on the Anglo-Australian 3.9 meter telescope. These images reveal a jet extending approx. 10 arcseconds to the northeast of the nucleus at the same position angle as the X-ray and radio jets. The infrared jet is most prominent at the shortest wavelength (1.25 microns), where its brightness surpasses that of the nucleus. The blue appearance of the infrared jet is remarkable considering the heavy obscuration that is evident at visual wavelengths. The amount of reddening in the vicinity of the jet is determined from the measured colors of the stellar core of the galaxy, and this value is used to generate an extinction-corrected energy distribution. In contrast to previously studied optical and infrared jets in active nuclei, the short-wavelength prominence of the Cen A jet indicates that it cannot be attributed to synchrotron emission from a beam of relativistic electrons. The remaining viable mechanisms involve an interaction between the interstellar medium and the active nucleus: the infrared radiation from the jet may be due to emission from interstellar gas that has been entrained and heated by the flow of relativistic particles from the nucleus; alternatively, luminous blue stars may have been created by compression of interstellar material by the relativistic plasma. To investigate these proposed mechanisms, near-infrared spectroscopic studies of Cen A are in progress to look for collisionally excited molecular hydrogen emission lines and recombination lines from ionized gas.

  10. THE ORION H ii REGION AND THE ORION BAR IN THE MID-INFRARED

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salgado, F.; Tielens, A. G. G. M.; Berné, O.

    2016-10-20

    We present mid-infrared photometry of the Orion bar obtained with the Faint Object infraRed Camera for the SOFIA Telescope (FORCAST) on board SOFIA at 6.4, 6.6, 7.7, 19.7, 31.5, and 37.1 μm. By complementing this observation with archival FORCAST and Herschel/PACS images, we are able to construct a complete infrared spectral energy distribution of the Huygens region in the Orion nebula. Comparing the infrared images with gas tracers, we find that PACS maps trace the molecular cloud, while the FORCAST data trace the photodissociation region (PDR) and the H ii region. Analysis of the energetics of the region reveals that the PDR extends for 0.28 pc along the line of sight and that the bar is inclined at an angle of 4°. The infrared and submillimeter images reveal that the Orion bar represents a swept-up shell with a thickness of 0.1 pc. The mass of the shell implies a shock velocity of ≃3 km s⁻¹ and an age of ≃10⁵ years for the H ii region. Our analysis shows that the UV and infrared dust opacities in the H ii region and the PDR are a factor of 5 to 10 lower than in the diffuse interstellar medium. In the ionized gas, Lyα photons are a major source of dust heating at distances larger than ≃0.06 pc from θ¹ Ori C. Dust temperatures can be explained if the size of the grains is between 0.1 and 1 μm. We derive the photoelectric heating efficiency of the atomic gas in the Orion bar. The results are in good qualitative agreement with models, and the quantitative differences indicate a decreased polycyclic aromatic hydrocarbon abundance in this region.

  11. 360 degree vision system: opportunities in transportation

    NASA Astrophysics Data System (ADS)

    Thibault, Simon

    2007-09-01

    Panoramic technologies are experiencing new and exciting opportunities in the transportation industries. The advantages of panoramic imagers are numerous: increased area coverage with fewer cameras, imaging of multiple targets simultaneously, instantaneous full-horizon detection, easier integration of various applications on the same imager, and others. This paper reports our work on panomorph optics and potential usage in transportation applications. The novel panomorph lens is a new type of high-resolution panoramic imager perfectly suited to the transportation industries. The panomorph lens uses optimization techniques to improve the performance of a customized optical system for specific applications. By adding a custom angle-to-pixel relation at the optical design stage, the optical system provides ideal image coverage designed to reduce and optimize the processing. The optics can be customized for the visible, near infra-red (NIR), or infra-red (IR) wavebands. The panomorph lens is designed to optimize the cost per pixel, which is particularly important in the IR. We discuss the use of the 360° vision system, which can enhance on-board collision avoidance systems, intelligent cruise controls, and parking assistance. 360° panoramic vision systems might enable safer highways and a significant reduction in casualties.

  12. Precise determination of the heat delivery during in vivo magnetic nanoparticle hyperthermia with infrared thermography

    NASA Astrophysics Data System (ADS)

    Rodrigues, Harley F.; Capistrano, Gustavo; Mello, Francyelli M.; Zufelato, Nicholas; Silveira-Lacerda, Elisângela; Bakuzis, Andris F.

    2017-05-01

    Non-invasive and real-time monitoring of the heat delivery during magnetic nanoparticle hyperthermia (MNH) is of fundamental importance to predict clinical outcomes for cancer treatment. Infrared thermography (IRT) can determine the surface temperature due to three-dimensional heat delivery inside a subcutaneous tumor, an argument that is supported by numerical simulations. However, for precise temperature determination, it is of crucial relevance to use a correct experimental configuration. This work reports an MNH study using a sarcoma 180 murine tumor containing 3.9 mg of intratumorally injected manganese-ferrite nanoparticles. MNH was performed at low field amplitude and non-uniform field configuration. Five 30 min in vivo magnetic hyperthermia experiments were performed, monitoring the surface temperature with a fiber optical sensor and thermal camera at distinct angles with respect to the animal's surface. The results indicate that temperature errors as large as 7 °C can occur if the experiment is not properly designed. A new IRT error model is found to explain the data. More importantly, we show how to precisely monitor temperature with IRT during hyperthermia, which could positively impact heat dosimetry and clinical planning.

  13. Modified algorithm for mineral identification in LWIR hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Yousefi, Bardia; Sojasi, Saeed; Liaigre, Kévin; Ibarra Castanedo, Clemente; Beaudoin, Georges; Huot, François; Maldague, Xavier P. V.; Chamberland, Martin

    2017-05-01

    The applications of hyperspectral infrared imagery in different fields of research are significant and growing. It is mainly used in remote sensing for target detection, vegetation detection, urban area categorization, astronomy, and geological applications. The geological applications of this technology mainly consist of mineral identification using airborne or satellite imagery. We address a quantitative and qualitative assessment of mineral identification under laboratory conditions. We strive to identify nine different mineral grains (Biotite, Diopside, Epidote, Goethite, Kyanite, Scheelite, Smithsonite, Tourmaline, Quartz). A hyperspectral camera in the Long Wave Infrared (LWIR, 7.7-11.8 μm) with a LW-macro lens providing a spatial resolution of 100 μm, an infragold plate, and a heating source are the instruments used in the experiment. The proposed algorithm clusters all the pixel spectra into different categories. The best representatives of each cluster are then chosen and compared with the ASTER spectral library of JPL/NASA through spectral comparison techniques, such as the Spectral Angle Mapper (SAM) and Normalized Cross Correlation (NCC). The results of the algorithm indicate significant computational efficiency (more than 20 times faster than previous algorithms) and show promising performance for mineral identification.
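    The spectral comparison step can be sketched as follows. This is a minimal illustration of SAM and NCC scoring against a small in-memory library; the mineral names and reflectance values below are made-up placeholders, not the authors' pipeline or the ASTER library format:

```python
import numpy as np

def spectral_angle(s, r):
    """Spectral Angle Mapper: angle (radians) between a pixel spectrum s
    and a reference spectrum r; a smaller angle means a better match."""
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def ncc(s, r):
    """Normalized cross-correlation between two mean-removed spectra."""
    s0, r0 = s - s.mean(), r - r.mean()
    return np.dot(s0, r0) / (np.linalg.norm(s0) * np.linalg.norm(r0))

# Toy example: match a cluster-representative spectrum against a tiny library.
library = {
    "quartz":  np.array([0.9, 0.7, 0.4, 0.3, 0.5]),
    "biotite": np.array([0.2, 0.3, 0.6, 0.8, 0.7]),
}
pixel = np.array([0.85, 0.72, 0.41, 0.33, 0.48])
best = min(library, key=lambda m: spectral_angle(pixel, library[m]))
# best -> "quartz" (smallest spectral angle)
```

SAM is insensitive to overall brightness scaling (only the spectral shape matters), which is why it is a common choice for comparing emission spectra measured under different illumination or heating conditions.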

  14. Precise determination of the heat delivery during in vivo magnetic nanoparticle hyperthermia with infrared thermography.

    PubMed

    Rodrigues, Harley F; Capistrano, Gustavo; Mello, Francyelli M; Zufelato, Nicholas; Silveira-Lacerda, Elisângela; Bakuzis, Andris F

    2017-05-21

    Non-invasive and real-time monitoring of the heat delivery during magnetic nanoparticle hyperthermia (MNH) is of fundamental importance to predict clinical outcomes for cancer treatment. Infrared thermography (IRT) can determine the surface temperature due to three-dimensional heat delivery inside a subcutaneous tumor, an argument that is supported by numerical simulations. However, for precise temperature determination, it is of crucial relevance to use a correct experimental configuration. This work reports an MNH study using a sarcoma 180 murine tumor containing 3.9 mg of intratumorally injected manganese-ferrite nanoparticles. MNH was performed at low field amplitude and non-uniform field configuration. Five 30 min in vivo magnetic hyperthermia experiments were performed, monitoring the surface temperature with a fiber optical sensor and thermal camera at distinct angles with respect to the animal's surface. The results indicate that temperature errors as large as [Formula: see text]C can occur if the experiment is not properly designed. A new IRT error model is found to explain the data. More importantly, we show how to precisely monitor temperature with IRT during hyperthermia, which could positively impact heat dosimetry and clinical planning.

  15. Saturnian Hexagon Collage

    NASA Image and Video Library

    2016-12-06

    This collage of images from NASA's Cassini spacecraft shows Saturn's northern hemisphere and rings as viewed with four different spectral filters. Each filter is sensitive to different wavelengths of light and reveals clouds and hazes at different altitudes. Clockwise from top left, the filters used are sensitive to violet (420 nanometers), red (648 nanometers), near-infrared (728 nanometers) and infrared (939 nanometers) light. The image was taken with the Cassini spacecraft wide-angle camera on Dec. 2, 2016, at a distance of about 400,000 miles (640,000 kilometers) from Saturn. Image scale is 95 miles (153 kilometers) per pixel. The images have been enlarged by a factor of two. The original versions of these images, as sent by the spacecraft, have a size of 256 pixels by 256 pixels. Cassini's images are sometimes compressed to smaller sizes due to data storage limitations on the spacecraft, or to allow a larger number of images to be taken than would otherwise be possible. These images were obtained about two days before Cassini's first close pass by the outer edges of Saturn's main rings during its penultimate mission phase. http://photojournal.jpl.nasa.gov/catalog/PIA21053

  16. The near infrared imaging system for the real-time protection of the JET ITER-like wall

    NASA Astrophysics Data System (ADS)

    Huber, A.; Kinna, D.; Huber, V.; Arnoux, G.; Balboa, I.; Balorin, C.; Carman, P.; Carvalho, P.; Collins, S.; Conway, N.; McCullen, P.; Jachmich, S.; Jouve, M.; Linsmeier, Ch; Lomanowski, B.; Lomas, P. J.; Lowry, C. G.; Maggi, C. F.; Matthews, G. F.; May-Smith, T.; Meigs, A.; Mertens, Ph; Nunes, I.; Price, M.; Puglia, P.; Riccardo, V.; Rimini, F. G.; Sergienko, G.; Tsalas, M.; Zastrow, K.-D.; contributors, JET

    2017-12-01

    This paper describes the design, implementation, and operation of the near infrared (NIR) imaging diagnostic system of the JET ITER-like wall (JET-ILW) plasma experiment and its integration into the existing JET protection architecture. The imaging system comprises four wide-angle views, four tangential divertor views, and two top views of the divertor, covering 66% of the first wall and up to 43% of the divertor. The operating temperature ranges which must be observed by the NIR protection cameras are, for the materials used on JET: Be, 700 °C-1400 °C; W coating, 700 °C-1370 °C; W bulk, 700 °C-1400 °C. The Real-Time Protection system has operated routinely since 2011 and has successfully demonstrated its capability to avoid overheating of the main-chamber beryllium wall as well as of the divertor W and W-coated carbon fibre composite (CFC) tiles. During this period, less than 0.5% of the terminated discharges were aborted by a malfunction of the system. About 2%-3% of the discharges were terminated due to the detection of actual hot spots.

  17. Calibration and verification of thermographic cameras for geometric measurements

    NASA Astrophysics Data System (ADS)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire temperature and geometric information, although calibration and verification procedures are only usual for thermal data, for which black bodies are used. Moreover, the geometric information is important for many fields such as architecture, civil engineering, and industry. This work presents a calibration procedure that allows photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability, and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of the companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both thermographic and visual cameras are able to detect it. Two thermographic cameras from the Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified, and compared using calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm for all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between the thermographic cameras shows slightly better results for the Nec camera.

  18. Photometry of Galactic and Extragalactic Far-Infrared Sources using the 91.5 cm Airborne Infrared Telescope

    NASA Technical Reports Server (NTRS)

    Harper, D. A.

    1996-01-01

    The objective of this grant was to construct a series of far infrared photometers, cameras, and supporting systems for use in astronomical observations in the Kuiper Airborne Observatory. The observations have included studies of galaxies, star formation regions, and objects within the Solar System.

  19. Solar System Portrait - 60 Frame Mosaic

    NASA Image and Video Library

    1996-09-13

    The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever portrait of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. 
Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. http://photojournal.jpl.nasa.gov/catalog/PIA00451

  20. Solar System Portrait - 60 Frame Mosaic

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever 'portrait' of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. 
Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun.

  1. Star Formation as Seen by the Infrared Array Camera on Spitzer

    NASA Technical Reports Server (NTRS)

    Smith, Howard A.; Allen, L.; Megeath, T.; Barmby, P.; Calvet, N.; Fazio, G.; Hartmann, L.; Myers, P.; Marengo, M.; Gutermuth, R.

    2004-01-01

    The Infrared Array Camera (IRAC) onboard Spitzer has imaged regions of star formation (SF) in its four IR bands with spatial resolutions of approximately 2"/pixel. IRAC is sensitive enough to detect very faint, embedded young stars at levels of tens of μJy, and IRAC photometry can categorize their stages of development: from young protostars with infalling envelopes (Class 0/I) to stars whose infrared excesses derive from accreting circumstellar disks (Class II) to evolved stars dominated by photospheric emission. The IRAC images also clearly reveal and help diagnose associated regions of shocked and/or PDR emission in the clouds; we find existing models provide a good start at explaining the continuum of the SF regions IRAC observes.

  2. Near infra-red astronomy with adaptive optics and laser guide stars at the Keck Observatory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Max, C.E.; Gavel, D.T.; Olivier, S.S.

    1995-08-03

    A laser guide star adaptive optics system is being built for the W. M. Keck Observatory's 10-meter Keck II telescope. Two new near infra-red instruments will be used with this system: a high-resolution camera (NIRC 2) and an echelle spectrometer (NIRSPEC). The authors describe the expected capabilities of these instruments for high-resolution astronomy, using adaptive optics with either a natural star or a sodium-layer laser guide star as a reference. They compare the expected performance of these planned Keck adaptive optics instruments with that predicted for the NICMOS near infra-red camera, which is scheduled to be installed on the Hubble Space Telescope in 1997.

  3. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90° field of view in monochrome mode and 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength-dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
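    A wavelength-dependent radial distortion correction of the kind mentioned above can be sketched as follows; the one-term radial model and the per-band coefficients are hypothetical placeholders for illustration, not the LROC team's published camera model:

```python
import numpy as np

def undistort_radial(x, y, k1):
    """Correct a one-term radial distortion. For normalized detector
    coordinates (x, y) about the optical axis, the forward model is
    r_d = r * (1 + k1 * r**2); a first-order inverse, accurate for
    small distortion, is applied here."""
    r2 = x * x + y * y
    factor = 1.0 / (1.0 + k1 * r2)
    return x * factor, y * factor

# Hypothetical per-band coefficients: linear interpolation in wavelength
# mimics a chromatic (wavelength-dependent) radial-distortion model.
band_k1 = {415e-9: -0.020, 566e-9: -0.018, 689e-9: -0.016}

def k1_for(wl):
    wls = sorted(band_k1)
    return float(np.interp(wl, wls, [band_k1[w] for w in wls]))

# Round trip: apply the forward distortion, then correct it.
k1 = k1_for(566e-9)
x_ideal, y_ideal = 0.5, 0.5
scale = 1.0 + k1 * (x_ideal**2 + y_ideal**2)
x_d, y_d = x_ideal * scale, y_ideal * scale
x_c, y_c = undistort_radial(x_d, y_d, k1)  # recovers ~(0.5, 0.5)
```

The first-order inverse leaves a residual of order k1², which is why real camera models either iterate the inverse or fit separate forward/inverse polynomials.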

  4. Alignment and Performance of the Infrared Multi-Object Spectrometer

    NASA Technical Reports Server (NTRS)

    Connelly, Joseph A.; Ohl, Raymond G.; Mentzell, J. Eric; Madison, Timothy J.; Hylan, Jason E.; Mink, Ronald G.; Saha, Timo T.; Tveekrem, June L.; Sparr, Leroy M.; Chambers, V. John; hide

    2004-01-01

    The Infrared Multi-Object Spectrometer (IRMOS) is a principal investigator class instrument for the Kitt Peak National Observatory 4 m and 2.1 m telescopes. IRMOS is a near-IR (0.8 - 2.5 micron) spectrometer with low- to mid-resolving power (R = 300 - 3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arcmin field of view (4 m telescope) using a commercial Micro Electro-Mechanical Systems (MEMS) micro-mirror array (MMA) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems: the focal reducer images the focal plane of the telescope onto the MMA field stop, and the spectrograph images the MMA onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and ambient imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve as a qualitative alignment guide, and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides verification of the optomechanical alignment method and a measurement of near-angle scattered light due to small-scale mirror surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides a spectral line at 546.1 nm, a blackbody source provides a line at 1550 nm, and a CCD camera and an IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard and instrument-level test results validate this prediction. We conclude with an instrument performance prediction for cryogenic operation and first light in late 2003.

  5. Hyperspectral imaging spectroradiometer improves radiometric accuracy

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Bouchard, Robert; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc

    2013-06-01

    Reliable and accurate infrared characterization is necessary to measure the specific spectral signatures of aircraft and of associated infrared countermeasures (i.e. flares). Infrared characterization is essential to improve countermeasure efficiency, improve friend-foe identification, and reduce the risk of friendly fire. Typical infrared characterization setups include a variety of panchromatic cameras and spectroradiometers, each bringing essential information: cameras measure the spatial distribution of targets, and spectroradiometers provide the spectral distribution of the emitted energy. However, combining separate instruments introduces possible radiometric errors and uncertainties that can be reduced with hyperspectral imagers, which measure the spectral and spatial distributions of the energy at the same time, ensuring the temporal and spatial cohesion of the collected information. This paper presents a quantitative analysis of the main contributors to radiometric uncertainty and shows how a hyperspectral imager can reduce these uncertainties.

  6. Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt

    NASA Technical Reports Server (NTRS)

    Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.

    1977-01-01

    Airborne remote sensing surveys of the Suez Canal Zone were conducted, as part of the rehabilitation plan, using an I2S multispectral camera and a Bendix LN-3 passive infrared scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near infrared bands. The scanner was operated in the thermal infrared band of 8 to 14 microns, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were utilized in the construction of new geological, structural lineation, and drainage maps of the Suez Canal Zone at a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.

  7. Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array

    NASA Astrophysics Data System (ADS)

    Houben, Sebastian

    2015-03-01

    The variety of vehicle-mounted sensors needed to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping fields of view of a multi-camera fisheye surround-view system, such as those used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. strongly varying resolution across the image) demand that the image processing pipeline be adapted to several problems that do not arise in classical stereo vision performed with cameras carefully designed for that purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss reasons for and avoidance of the identified caveats, and present first results on a prototype topview setup.

  8. What convention is used for the illumination and view angles?

    Atmospheric Science Data Center

    2014-12-08

    ... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...

  9. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellut, Paolo; Sherwin, Gary

    2011-01-01

    TIR cameras can be used for day/night Unmanned Ground Vehicle (UGV) autonomous navigation when stealth is required. The quality of uncooled TIR cameras has significantly improved over the last decade, making them a viable option at low speed. Limiting factors for stereo ranging with uncooled LWIR cameras are image blur and low-texture scenes. TIR perception capabilities JPL has explored include: (1) single- and dual-band TIR terrain classification, (2) obstacle detection (pedestrians, vehicles, tree trunks, ditches, and water), and (3) perception through obscurants.

  10. An infrared image based methodology for breast lesions screening

    NASA Astrophysics Data System (ADS)

    Morais, K. C. C.; Vargas, J. V. C.; Reisemberger, G. G.; Freitas, F. N. P.; Oliari, S. H.; Brioschi, M. L.; Louveira, M. H.; Spautz, C.; Dias, F. G.; Gasperin, P.; Budel, V. M.; Cordeiro, R. A. G.; Schittini, A. P. P.; Neto, C. D.

    2016-05-01

    The objective of this paper is to evaluate the potential of a structured methodology for breast lesion screening, based on infrared imaging temperature measurements of a healthy control group, used to establish expected normality ranges, and of breast cancer patients previously diagnosed through biopsies of the affected regions. An analysis of the systematic error of the infrared camera skin temperature measurements was conducted in several different regions of the body, by direct comparison to high-precision thermistor temperature measurements, showing that infrared camera temperatures are consistently around 2 °C above the thermistor temperatures. Therefore, a method of conjugated gradients is proposed to eliminate the imprecision of direct infrared temperature measurement by calculating the temperature difference between two points, which cancels out the error. The method exploits the approximate bilateral symmetry of the human body and compares measured dimensionless temperature difference values (Δθ̄) between two symmetric regions of the patient's breast. Because the dimensionless temperature accounts for the breast region, the surrounding ambient, and the individual core temperatures, the interpretation of results for different individuals becomes simple and objective. The range of normal whole-breast average dimensionless temperature differences was determined for 101 healthy individuals, and, assuming that breast temperatures follow a unimodal normal distribution, the healthy normal range for each region was taken as the mean dimensionless temperature difference plus or minus twice the standard deviation of the measurements (Δθ̄ ± 2σ), so as to represent 95% of the population. Forty-seven patients with breast cancer previously diagnosed through biopsies were examined with the method, which detected breast abnormalities in 45 cases (96%).
    Therefore, the conjugated gradients method was considered effective for breast lesion screening through infrared imaging as a basis for recommending a biopsy, even with a low optical resolution camera (160 × 120 pixels) and a thermal resolution of 0.1 °C, whose results were compared to those of a higher resolution camera (320 × 240 pixels). The main conclusion is that the method has potential as a noninvasive screening exam for individuals with breast complaints, indicating whether or not the patient should be submitted to a biopsy.
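    The screening decision described above (compare symmetric regions, flag differences outside the mean plus or minus two standard deviations of the healthy group) can be sketched as follows. The exact normalization of the dimensionless temperature and all numeric values are illustrative assumptions, not the paper's definitions:

```python
import numpy as np

def dimensionless_theta(t_skin, t_ambient, t_core):
    """Assumed normalization: 0 at ambient temperature, 1 at core
    temperature. The paper's exact definition may differ."""
    return (t_skin - t_ambient) / (t_core - t_ambient)

def asymmetry_flag(t_left, t_right, t_ambient, t_core,
                   mean_healthy, sigma_healthy):
    """Flag the patient if the left/right dimensionless temperature
    difference falls outside mean_healthy +/- 2*sigma_healthy, the band
    covering ~95% of a unimodal normal healthy population."""
    d = abs(dimensionless_theta(np.mean(t_left), t_ambient, t_core)
            - dimensionless_theta(np.mean(t_right), t_ambient, t_core))
    lo, hi = mean_healthy - 2 * sigma_healthy, mean_healthy + 2 * sigma_healthy
    return not (lo <= d <= hi)
```

    Because each comparison is a difference between two readings from the same camera, a constant calibration offset (the ~2 °C bias noted above) cancels out.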

  11. Accuracy, resolution, and cost comparisons between small format and mapping cameras for environmental mapping

    NASA Technical Reports Server (NTRS)

    Clegg, R. H.; Scherz, J. P.

    1975-01-01

    Successful aerial photography depends on aerial cameras providing acceptable photographs within cost restrictions of the job. For topographic mapping where ultimate accuracy is required only large format mapping cameras will suffice. For mapping environmental patterns of vegetation, soils, or water pollution, 9-inch cameras often exceed accuracy and cost requirements, and small formats may be better. In choosing the best camera for environmental mapping, relative capabilities and costs must be understood. This study compares resolution, photo interpretation potential, metric accuracy, and cost of 9-inch, 70mm, and 35mm cameras for obtaining simultaneous color and color infrared photography for environmental mapping purposes.

  12. First experiences with ARNICA, the ARCETRI observatory imaging camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Hunt, L.; Maiolino, R.; Moriondo, G.; Stanga, R.

    1994-03-01

    ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a common-use instrument for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4 arcmin x 4 arcmin on the NICMOS 3 (256 x 256 pixels, 40 micrometer side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature of detector and optics is 76 K. We give an estimate of performance, in terms of sensitivity with an assigned observing time, along with some preliminary considerations on photometric accuracy.

  13. Apollo 9 Mission image - S0-65 Multispectral Photography - Georgia

    NASA Image and Video Library

    2009-02-19

    AS09-26A-3792A (11 March 1969) --- Color infrared photograph of the Atlanta, Georgia area taken on March 11, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey (SO-65) experiment. At 11:21 a.m. (EST) when this picture was taken, the Apollo 9 spacecraft was at an altitude of 106 nautical miles, and the sun elevation was 47 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 33 degrees 10 minutes north latitude, and 84 degrees and 40 minutes west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.

  14. 3-D Flow Visualization with a Light-field Camera

    NASA Astrophysics Data System (ADS)

    Thurow, B.

    2012-12-01

    Light-field cameras have received attention recently due to their ability to acquire photographs that can be computationally refocused after they have been acquired. In this work, we describe the development of a light-field camera system for 3D visualization of turbulent flows. The camera developed in our lab, also known as a plenoptic camera, uses an array of microlenses mounted next to an image sensor to resolve both the position and angle of light rays incident upon the camera. For flow visualization, the flow field is seeded with small particles that follow the fluid's motion and are imaged using the camera and a pulsed light source. The tomographic MART algorithm is then applied to the light-field data in order to reconstruct a 3D volume of the instantaneous particle field. 3D, 3C velocity vectors are then determined from a pair of 3D particle fields using conventional cross-correlation algorithms. As an illustration of the concept, 3D/3C velocity measurements of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. Future experiments are planned to use the camera to study the influence of wall permeability on the 3-D structure of the turbulent boundary layer.
    Figure captions: Schematic illustrating the concept of a plenoptic camera, where each pixel represents both the position and angle of light rays entering the camera; this information can be used to computationally refocus an image after it has been acquired. Instantaneous 3D velocity field of a turbulent boundary layer determined using light-field data captured by a plenoptic camera.

  15. First Results from the Wide Angle Camera of the ROSETTA Mission .

    NASA Astrophysics Data System (ADS)

    Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; De Cecco, M.; Parzianello, G.; Zaccariotto, M.; Da Deppo, V.; Naletto, G.

    This paper gives a brief description of the Wide Angle Camera (WAC), built by the Center of Studies and Activities for Space (CISAS) of the University of Padova for the ESA ROSETTA mission, of the data obtained on the new mission targets, and of the first results achieved after the launch in March 2004. The WAC is part of the OSIRIS imaging system, built under the PI-ship of Dr. U. Keller (Max Planck Institute for Solar System Studies), which also comprises a Narrow Angle Camera (NAC) built by the Laboratoire d'Astrophysique Spatiale (LAS) of Marseille. CISAS also had the responsibility of building the shutter and the front-door mechanism for the NAC. The images show the excellent optical quality of the WAC, which exceeds the specifications in terms of encircled energy (80% in one pixel over a 12 x 12 square-degree field of view), limiting magnitude (fainter than 13th magnitude in a 30 s exposure through a wideband red filter), and level of distortion.

  16. LROC Stereo Observations

    NASA Astrophysics Data System (ADS)

    Beyer, Ross A.; Archinal, B.; Li, R.; Mattson, S.; Moratto, Z.; McEwen, A.; Oberst, J.; Robinson, M.

    2009-09-01

    The Lunar Reconnaissance Orbiter Camera (LROC) will obtain two types of multiple overlapping coverage to derive terrain models of the lunar surface. LROC has two Narrow Angle Cameras (NACs), working jointly to provide a wider (in the cross-track direction) field of view, as well as a Wide Angle Camera (WAC). LRO's orbit precesses, and the same target can be viewed at different solar azimuth and incidence angles providing the opportunity to acquire `photometric stereo' in addition to traditional `geometric stereo' data. Geometric stereo refers to images acquired by LROC with two observations at different times. They must have different emission angles to provide a stereo convergence angle such that the resultant images have enough parallax for a reasonable stereo solution. The lighting at the target must not be radically different. If shadows move substantially between observations, it is very difficult to correlate the images. The majority of NAC geometric stereo will be acquired with one nadir and one off-pointed image (20 degree roll). Alternatively, pairs can be obtained with two spacecraft rolls (one to the left and one to the right) providing a stereo convergence angle up to 40 degrees. Overlapping WAC images from adjacent orbits can be used to generate topography of near-global coverage at kilometer-scale effective spatial resolution. Photometric stereo refers to multiple-look observations of the same target under different lighting conditions. LROC will acquire at least three (ideally five) observations of a target. These observations should have near identical emission angles, but with varying solar azimuth and incidence angles. These types of images can be processed via various methods to derive single pixel resolution topography and surface albedo. The LROC team will produce some topographic models, but stereo data collection is focused on acquiring the highest quality data so that such models can be generated later.
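    The link between stereo convergence angle and recoverable relief in geometric stereo can be illustrated with a simplified flat-terrain parallax relation (a sketch for intuition, not the LROC team's production photogrammetry): for two looks at emission angles e1 and e2 on opposite sides of nadir, ground parallax p relates to height h as p = h * (tan e1 + tan e2).

```python
import math

def height_from_parallax(parallax_px, gsd_m, emission1_deg, emission2_deg):
    """Relief height from measured stereo parallax, assuming a simplified
    flat-terrain geometry with the two looks on opposite sides of nadir.

    parallax_px : parallax between the two images, in pixels
    gsd_m       : ground sampling distance, metres per pixel
    emission*   : emission (look) angles of the two observations, degrees
    """
    base = (math.tan(math.radians(emission1_deg))
            + math.tan(math.radians(emission2_deg)))
    return parallax_px * gsd_m / base

# One nadir NAC image plus one 20-degree off-pointed image, 0.5 m GSD:
h = height_from_parallax(10.0, 0.5, 0.0, 20.0)
```

    The formula makes the trade-off visible: a larger convergence angle increases the parallax per metre of relief, but also increases the lighting and geometric differences that make image correlation harder.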

  17. Ambient-Light-Canceling Camera Using Subtraction of Frames

    NASA Technical Reports Server (NTRS)

    Morookian, John Michael

    2004-01-01

    The ambient-light-canceling camera (ALCC) is a proposed near-infrared electronic camera that would utilize a combination of (1) synchronized illumination during alternate frame periods and (2) subtraction of readouts from consecutive frames to obtain images without a background component of ambient light. The ALCC is intended especially for use in tracking the motion of an eye by the pupil center corneal reflection (PCCR) method. Eye tracking by the PCCR method has shown potential for application in human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological deficiencies. In the PCCR method, an eye is illuminated by near-infrared light from a light-emitting diode (LED). Some of the infrared light is reflected from the surface of the cornea. Some of the infrared light enters the eye through the pupil and is reflected from the back of the eye out through the pupil, a phenomenon commonly observed as the red-eye effect in flash photography. An electronic camera is oriented to image the user's eye. The output of the camera is digitized and processed by algorithms that locate the two reflections. Then, from the locations of the centers of the two reflections, the direction of gaze is computed. As described thus far, the PCCR method is susceptible to errors caused by reflections of ambient light. Although a near-infrared band-pass optical filter can be used to discriminate against ambient light, some sources of ambient light have enough in-band power to compete with the LED signal. The mode of operation of the ALCC would complement or supplant spectral filtering by providing more nearly complete cancellation of the effect of ambient light. In the operation of the ALCC, a near-infrared LED would be pulsed on during one camera frame period and off during the next frame period.
Thus, the scene would be illuminated by both the LED (signal) light and the ambient (background) light during one frame period, and by only the ambient (background) light during the next frame period. The camera output would be digitized and sent to a computer, wherein the pixel values of the background-only frame would be subtracted from the pixel values of the signal-plus-background frame to obtain signal-only pixel values (see figure). To prevent artifacts of motion from entering the images, it would be necessary to acquire image data at a rate greater than the standard video rate of 30 frames per second. For this purpose, the ALCC would exploit a novel control technique developed at NASA's Jet Propulsion Laboratory for advanced charge-coupled-device (CCD) cameras. This technique provides for readout from a subwindow [region of interest (ROI)] within the image frame. Because the desired reflections from the eye would typically occupy a small fraction of the area within the image frame, the ROI capability would make it possible to acquire and subtract pixel values at rates of several hundred frames per second, considerably greater than the standard video rate and sufficient both to (1) suppress motion artifacts and (2) track the motion of the eye between consecutive subtractive frame pairs.
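    The two-frame subtraction at the heart of the ALCC can be sketched in a few lines. This is a minimal illustration of the principle only; the function name, array shapes, and integer-type handling are assumptions, not the proposed camera's firmware:

```python
import numpy as np

def cancel_ambient(frame_led_on, frame_led_off):
    """Subtract the ambient-only frame from the LED+ambient frame.

    Frames are unsigned-integer arrays from consecutive readouts. The
    subtraction is done in a signed type and clipped at zero so that
    pixels where the ambient estimate exceeds the signal frame (noise,
    motion) do not wrap around to large values.
    """
    diff = frame_led_on.astype(np.int32) - frame_led_off.astype(np.int32)
    return np.clip(diff, 0, None).astype(frame_led_on.dtype)

# Toy example: uniform ambient background plus one LED reflection.
ambient = np.full((4, 4), 100, dtype=np.uint8)
led_on = ambient.copy()
led_on[1, 1] += 50                  # simulated corneal reflection
signal = cancel_ambient(led_on, ambient)
```

    The clipping step matters in practice: naive unsigned subtraction would turn small negative residuals into near-maximum pixel values and create false reflections.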

  18. Augmented reality glass-free three-dimensional display with the stereo camera

    NASA Astrophysics Data System (ADS)

    Pang, Bo; Sang, Xinzhu; Chen, Duo; Xing, Shujun; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu

    2017-10-01

    An improved method for augmented reality (AR) glass-free three-dimensional (3D) display, based on a stereo camera and a lenticular lens array that presents parallax content from different angles, is proposed. Compared with previous AR implementations based on a two-dimensional (2D) panel display with only one viewpoint, the proposed method realizes glass-free 3D display of virtual objects and the real scene with 32 virtual viewpoints. Accordingly, viewers can obtain rich 3D stereo information from different viewing angles based on binocular parallax. Experimental results show that this improved stereo-camera-based method realizes AR glass-free 3D display, and that both the virtual objects and the real scene exhibit realistic and pronounced stereo performance.

  19. Estimation of wetland evapotranspiration in northern New York using infrared thermometry

    NASA Astrophysics Data System (ADS)

    Hwang, K.; Chandler, D. G.

    2016-12-01

    Evapotranspiration (ET) is an important component of the water budget and often regarded as a major water loss. In freshwater wetlands, cumulative annual ET can equal precipitation under well-watered conditions. Wetland ET is therefore an important control on contaminant and nutrient transport. Yet quantification of wetland ET is challenged by complex surface characteristics, diverse plant species and density, and variations in wetland shape and size. As handheld infrared (IR) cameras have become available, studies exploiting the new technology have increased, especially in agriculture and hydrology. The benefits of IR cameras include (1) high spatial resolution, (2) high sample rates, (3) real-time imaging, (4) a constant viewing geometry, and (5) no need for atmosphere and cloud corrections. Compared with traditional methods, infrared thermometry is capable of monitoring at the scale of a small pond or localized plant community. This enables finer-scale surveys of heterogeneous land surfaces rather than strict dependence on atmospheric variables. Despite this potential, there have been few studies of ET and drought stress with IR cameras. In this study, the infrared thermometry-based method was applied to estimate ET over wetland plant species in the St. Lawrence River Valley, NY. The results are evaluated against traditional methods to test applicability over multiple vegetation species in the same area.

  20. Water ingress detection in honeycomb sandwich panels by passive infrared thermography using a high-resolution thermal imaging camera

    NASA Astrophysics Data System (ADS)

    Ibarra-Castanedo, C.; Brault, L.; Marcotte, F.; Genest, M.; Farley, V.; Maldague, X.

    2012-06-01

    Water ingress in honeycomb structures is of great concern for the civil and military aerospace industries. Pressure and temperature variations during take-off and landing produce considerable stress on aircraft structures, promoting moisture ingress (by diffusion through fibers or by direct ingress through voids, cracks or unsealed joints) into the core. The presence of water (or other fluids such as kerosene, hydraulic fluid and de-icing agents) in any of its forms (vapor, liquid or ice) promotes corrosion and cell breakage, and induces composite layer delaminations and skin disbonds. In this study, testing specimens were produced from unserviceable parts of military aircraft. In order to simulate atmospheric conditions during landing, selected core areas were filled with measured quantities of water and then frozen in a cold chamber. The specimens were then removed from the chamber and monitored for over 20 minutes as they warmed up, using a cooled high-resolution infrared camera. Results have shown that detection and quantification of water ingress in honeycomb sandwich structures by passive infrared thermography is possible using an HD mid-wave infrared camera for volumes of water as low as 0.2 ml and from distances as far as 20 m from the target.

  1. SOFIA Science Instruments: Commissioning, Upgrades and Future Opportunities

    NASA Technical Reports Server (NTRS)

    Smith, Erin C.

    2014-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is the world's largest airborne observatory, featuring a 2.5 meter telescope housed in the aft section of a Boeing 747SP aircraft. SOFIA's current instrument suite includes: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), a 5-40 µm dual-band imager/grism spectrometer developed at Cornell University; HIPO (High-speed Imaging Photometer for Occultations), a 0.3-1.1 micron imager built by Lowell Observatory; FLITECAM (First Light Infrared Test Experiment CAMera), a 1-5 micron wide-field imager/grism spectrometer developed at UCLA; FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), a 42-210 micron IFU grating spectrograph completed by the University of Stuttgart; and EXES (Echelon-Cross-Echelle Spectrograph), a 5-28 micron high-resolution spectrometer being completed by UC Davis and NASA Ames. A second-generation instrument, HAWC+ (High-resolution Airborne Wideband Camera), is a 50-240 micron imager being upgraded at JPL to add polarimetry and new detectors developed at GSFC. SOFIA will continually update its instrument suite with new instrumentation, technology demonstration experiments and upgrades to the existing instruments. This paper details instrument capabilities and status as well as plans for future instrumentation, including the call for proposals for third-generation SOFIA science instruments.

  2. POLICAN: A near-infrared imaging polarimeter at OAGH

    NASA Astrophysics Data System (ADS)

    Devaraj, R.; Luna, A.; Carrasco, L.; Mayya, Y. D.; Serrano-Bernal, O.

    2017-07-01

    We present POLICAN, a near-infrared linear imaging polarimeter developed for the Cananea near-infrared camera (CANICA) at the 2.1 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located at Cananea, Sonora, México. POLICAN reaches a limiting magnitude of about 16 mag with a polarimetric accuracy of about 1% for bright sources.

  3. A state observer for using a slow camera as a sensor for fast control applications

    NASA Astrophysics Data System (ADS)

    Gahleitner, Reinhard; Schagerl, Martin

    2013-03-01

    This contribution concerns a problem that often arises in vision-based control when a camera is used as a sensor for fast control applications, or more precisely, when the sample rate of the control loop is higher than the frame rate of the camera. In control applications for mechanical axes, e.g. in robotics or automated production, a camera with some image processing can be used as a sensor to detect positions or angles. The sample time in these applications is typically in the range of a few milliseconds or less, which demands a camera with a high frame rate of up to 1000 fps. The presented solution is a special state observer that can work with a slower, and therefore cheaper, camera to estimate the state variables at the higher sample rate of the control loop. To simplify the image processing for the determination of positions or angles and to make it more robust, LED markers are applied to the plant. Simulation and experimental results show that the concept can be used even if the plant is unstable, like the inverted pendulum.
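    The observer idea, predicting with the plant model at every fast control sample and correcting only on the samples where a camera frame arrives, can be sketched as below. The Luenberger form, the system matrices, and the scalar demo are illustrative assumptions, not the authors' exact design:

```python
import numpy as np

def run_observer(A, B, C, L, u_seq, y_frames):
    """Luenberger-style observer running at the fast control-loop rate.

    u_seq    : (n_steps, m) control inputs, one per fast sample
    y_frames : dict mapping step index -> camera measurement; keys are
               sparse because the camera is slower than the control loop
    Returns the (n_steps, n) sequence of state estimates.
    """
    x_hat = np.zeros(A.shape[0])
    history = []
    for k, u in enumerate(u_seq):
        x_hat = A @ x_hat + B @ u                      # predict every sample
        if k in y_frames:                              # correct only on frames
            x_hat = x_hat + L @ (y_frames[k] - C @ x_hat)
        history.append(x_hat.copy())
    return np.array(history)

# Scalar demo: a constant position measured at one tenth of the loop rate.
A = np.array([[1.0]]); B = np.array([[0.0]]); C = np.array([[1.0]])
L = np.array([[0.5]])
u_seq = np.zeros((100, 1))
frames = {k: np.array([1.0]) for k in range(0, 100, 10)}
est = run_observer(A, B, C, L, u_seq, frames)
```

    Between frames the estimate evolves purely on the model, which is why an unstable plant such as the inverted pendulum still requires the camera frames to arrive often enough for the correction to bound the estimation error.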

  4. ARNICA: the Arcetri Observatory NICMOS3 imaging camera

    NASA Astrophysics Data System (ADS)

    Lisi, Franco; Baffa, Carlo; Hunt, Leslie K.

    1993-10-01

    ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4' x 4' on the NICMOS 3 (256 x 256 pixels, 40 micrometers side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature is 76 K. The camera is remotely controlled by a 486 PC, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the 486 PC, acquires and stores the frames, and controls the timing of the array. We give an estimate of performance, in terms of sensitivity with an assigned observing time, along with some details on the main parameters of the NICMOS 3 detector.

  5. KSC01pd1736

    NASA Image and Video Library

    2001-11-26

    KENNEDY SPACE CENTER, Fla. -- A piece of equipment for Hubble Space Telescope Servicing mission is moved inside Hangar AE, Cape Canaveral. In the canister is the Advanced Camera for Surveys (ACS). The ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. The goal of the mission, STS-109, is to service the HST, replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002

  6. KSC-01pp1758

    NASA Image and Video Library

    2001-11-29

    KENNEDY SPACE CENTER, Fla. -- In Hangar A&E, workers watch as an overhead crane lifts the Advanced Camera for Surveys out of its transportation container. Part of the payload on the Hubble Space Telescope Servicing Mission, STS-109, the ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. Tasks for the mission include replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002

  7. KSC01pd1735

    NASA Image and Video Library

    2001-11-26

    KENNEDY SPACE CENTER, Fla. - A piece of equipment for the Hubble Space Telescope Servicing Mission arrives at Hangar AE, Cape Canaveral. Inside the canister is the Advanced Camera for Surveys (ACS). The ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. The goal of the mission, STS-109, is to service the HST, replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002.

  8. Dot Against the Dark

    NASA Image and Video Library

    2014-09-02

    As if trying to get our attention, Mimas is positioned against the shadow of Saturn's rings, bright on dark. As we near summer in Saturn's northern hemisphere, the rings cast ever larger shadows on the planet. With a reflectivity of about 96 percent, Mimas (246 miles, or 396 kilometers across) appears bright against the less-reflective Saturn. This view looks toward the sunlit side of the rings from about 10 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on July 13, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 1.1 million miles (1.8 million kilometers) from Saturn and approximately 1 million miles (1.6 million kilometers) from Mimas. Image scale is 67 miles (108 kilometers) per pixel at Saturn and 60 miles (97 kilometers) per pixel at Mimas. http://photojournal.jpl.nasa.gov/catalog/PIA18282

  9. Low-loss negative index metamaterials for X, Ku, and K microwave bands

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, David A.; Vedral, L. James; Smith, David A.

    2015-04-15

    Low-loss, negative-index of refraction metamaterials were designed and tested for X, Ku, and K microwave frequency bands. An S-shaped, split-ring resonator was used as a unit cell to design homogeneous slabs of negative-index metamaterials. Then, the slabs of metamaterials were cut into prisms to measure experimentally the negative index of refraction of a plane electromagnetic wave. Theoretical simulations using High-Frequency Structural Simulator, a finite element equation solver, were in good agreement with experimental measurements. The negative index of refraction was retrieved from the angle- and frequency-dependence of the transmitted intensity of the microwave beam through the metamaterial prism and compared well to simulations; in addition, near-field electromagnetic intensity mapping was conducted with an infrared camera, and there was also a good match with the simulations for the expected frequency ranges of the negative index of refraction.
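    The sign flip that the prism measurement exploits follows directly from Snell's law: with a negative relative index, the refracted beam emerges on the opposite side of the surface normal. A minimal sketch (plain Snell refraction with an illustrative relative index `n_rel`; not the authors' retrieval procedure):

    ```python
    import math

    def refraction_angle(theta_i_deg, n_rel):
        """Snell's law: sin(theta_i) = n_rel * sin(theta_t).

        A negative relative index yields a negative refraction angle,
        i.e. the beam bends to the opposite side of the normal, which
        is what the prism transmission measurement reveals.
        """
        s = math.sin(math.radians(theta_i_deg)) / n_rel
        return math.degrees(math.asin(s))

    # Ordinary material bends one way, negative-index material the other.
    print(refraction_angle(30.0, 1.5) > 0, refraction_angle(30.0, -1.5) < 0)  # True True
    ```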

  10. Adhesive Wear of Rollers in Vacuum

    NASA Technical Reports Server (NTRS)

    Shaeef, Iqbal; Krantz, Timothy L.

    2012-01-01

    This work was done to support NASA's James Webb Space Telescope, which is equipped with a Near Infrared Camera and Spectrograph and Micro Shutter Assembly (MSA). An MSA mechanism's qualification test in cryogenic vacuum at 30 K for 96,000 cycles resulted in roller wear and the formation of some debris. Lab tests in vacuum were conducted at NASA Glenn Research Center (GRC) to understand the wear of Ti6Al4V mated with 440F steel rollers. Misalignment angle was found to have the most significant effect on debris formation. At a misalignment angle of 1.4°, a significant amount of wear debris was formed within 50,000 cycles. Very few wear particles were found for a zero misalignment angle, and the total wear was small even after 367,000 cycles. The mode of wear in all the tests was attributed to adhesion, which was clearly evident from video records as well as the plate-like amalgamated debris material from both rollers. The adhesive wear rate was found to be approximately proportional to the misalignment angle. The wear is a two-way phenomenon, and the mixing of both roller materials in the wear debris was confirmed by x-ray fluorescence (XRF) and energy dispersive x-ray (EDX) spectra. While there was a net loss of mass from the steel rollers, XRF and EDX spectra showed peaks of Ti on the steel rollers and peaks of Fe on the Ti rollers. These results are useful for designers in terms of maintaining appropriate tolerances to avoid misalignment of rolling elements and the resulting severe wear.

  11. Determination of the microbolometric FPA's responsivity with imaging system's radiometric considerations

    NASA Astrophysics Data System (ADS)

    Gogler, Slawomir; Bieszczad, Grzegorz; Krupinski, Michal

    2013-10-01

    Thermal imagers, and the infrared array sensors used in them, undergo a calibration procedure during manufacturing in which their voltage sensitivity to incident radiation is evaluated. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not held to such elevated standards, it is still important that the image faithfully represent temperature variations across the scene. Detectors in a thermal camera are illuminated by infrared radiation transmitted through an infrared-transmitting optical system. Often an optical system, when exposed to a uniform Lambertian source, forms a non-uniform irradiance distribution in its image plane. In order to carry out an accurate non-uniformity correction, it is essential to correctly predict the irradiance distribution produced by a uniform source. In this article, a non-uniformity correction method is presented that takes the optical system's radiometry into account. Predictions of the irradiance distribution are compared with measured irradiance values. The presented radiometric model allows a fast and accurate non-uniformity correction to be carried out.
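    For context, the simplest form of non-uniformity correction (without the article's optical radiometric model) is the classic two-point scheme: a per-pixel gain and offset computed from two uniform reference sources. A sketch under the assumption of an idealized linear detector response:

    ```python
    import numpy as np

    def two_point_nuc(raw, cold_ref, hot_ref, t_cold, t_hot):
        """Classic two-point non-uniformity correction.

        cold_ref / hot_ref: per-pixel mean responses to two uniform
        blackbody sources at temperatures t_cold and t_hot. Returns the
        raw frame mapped onto a common (temperature-calibrated) response.
        """
        gain = (t_hot - t_cold) / (hot_ref - cold_ref)   # per-pixel gain
        offset = t_cold - gain * cold_ref                # per-pixel offset
        return gain * raw + offset

    # Synthetic 4x4 sensor with per-pixel gain/offset non-uniformity.
    rng = np.random.default_rng(0)
    true_gain = 1.0 + 0.1 * rng.standard_normal((4, 4))
    true_offset = 0.5 * rng.standard_normal((4, 4))

    def sensor(scene_temp):
        return true_gain * scene_temp + true_offset

    cold, hot = sensor(20.0), sensor(40.0)
    corrected = two_point_nuc(sensor(30.0), cold, hot, 20.0, 40.0)
    print(np.allclose(corrected, 30.0))  # uniform scene recovered: True
    ```

    For a linear detector the correction is exact; the article's point is that the reference irradiance itself is non-uniform once an optical system is in front of the array, which this simple scheme does not model.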

  12. Solar System Portrait - View of the Sun, Earth and Venus

    NASA Image and Video Library

    1996-09-13

    This color image of the sun, Earth and Venus was taken by the Voyager 1 spacecraft Feb. 14, 1990, when it was approximately 32 degrees above the plane of the ecliptic and at a slant-range distance of approximately 4 billion miles. It is the first -- and may be the only -- time that we will ever see our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time with two narrow-angle pictures centered on each planet. The wide-angle was taken with the camera's darkest filter (a methane absorption band), and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system but is still eight million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of the brightness is a bright burned out image with multiple reflections from the optics in the camera. The "rays" around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaiced into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce a color image. The violet, green and blue filters were used; exposure times were, for the Earth image, 0.72, 0.48 and 0.72 seconds, and for the Venus frame, 0.36, 0.24 and 0.36, respectively. 
Although the planetary pictures were taken with the narrow-angle camera (1500 mm focal length) and were not pointed directly at the sun, they show the effects of the glare from the nearby sun, in the form of long linear streaks resulting from the scattering of sunlight off parts of the camera and its sun shade. From Voyager's great distance both Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. Detailed analysis also suggests that Voyager detected the moon as well, but it is too faint to be seen without special processing. Venus was only 0.11 pixel in diameter. The faint colored structure in both planetary frames results from sunlight scattered in the optics. http://photojournal.jpl.nasa.gov/catalog/PIA00450

  13. Solar System Portrait - View of the Sun, Earth and Venus

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This color image of the sun, Earth and Venus was taken by the Voyager 1 spacecraft Feb. 14, 1990, when it was approximately 32 degrees above the plane of the ecliptic and at a slant-range distance of approximately 4 billion miles. It is the first -- and may be the only -- time that we will ever see our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time with two narrow-angle pictures centered on each planet. The wide-angle was taken with the camera's darkest filter (a methane absorption band), and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system but is still eight million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of the brightness is a bright burned out image with multiple reflections from the optics in the camera. The 'rays' around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaiced into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce a color image. The violet, green and blue filters were used; exposure times were, for the Earth image, 0.72, 0.48 and 0.72 seconds, and for the Venus frame, 0.36, 0.24 and 0.36, respectively. 
Although the planetary pictures were taken with the narrow-angle camera (1500 mm focal length) and were not pointed directly at the sun, they show the effects of the glare from the nearby sun, in the form of long linear streaks resulting from the scattering of sunlight off parts of the camera and its sun shade. From Voyager's great distance both Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. Detailed analysis also suggests that Voyager detected the moon as well, but it is too faint to be seen without special processing. Venus was only 0.11 pixel in diameter. The faint colored structure in both planetary frames results from sunlight scattered in the optics.

  14. Stand-off CWA imaging system: second sight MS

    NASA Astrophysics Data System (ADS)

    Bernascolle, Philippe F.; Elichabe, Audrey; Fervel, Franck; Haumonté, Jean-Baptiste

    2012-06-01

    In recent years, several manufacturers of IR imaging devices have launched commercial models applicable to a wide range of chemical species. These cameras are rugged and sufficiently sensitive to detect low concentrations of toxic and combustible gases. Bertin Technologies, specialized in the design and supply of innovative systems for industry, defense and health, has developed a stand-off gas imaging system using a multi-spectral infrared imaging technology. With this system, the gas cloud size, localization and evolution can be displayed in real time. This technology was developed several years ago in partnership with the CEB, a French MoD CBRN organization. The goal was to meet the need for early warning of a chemical threat. With a day-and-night range of up to 5 km, this process is able to detect Chemical Warfare Agents (CWA), critical Toxic Industrial Compounds (TIC) and also flammable gases. The system has been adapted to detect industrial spillage, using off-the-shelf uncooled infrared cameras, allowing 24/7 surveillance without costly frequent maintenance. The changes brought to the system are in compliance with Military Specifications (MS) and primarily focus on the signal processing, improving the classification of the detected products, and on the simplification of the Human Machine Interface (HMI). Second Sight MS is the only mass-produced, passive stand-off CWA imaging system with a wide angle (up to 60°) already used by several regular armies around the world. This paper examines this IR gas imager's performance when exposed to several CWA, TIC and simulant compounds. First, we describe the Second Sight MS system. The theory of its gas detection, visualization and classification functions has already been described elsewhere, so we only summarize it here. We then present the main topic of this paper, the results of tests done in the laboratory on live agents and in the open field on simulants. The sensitivity threshold of the camera, measured in the laboratory on some CWA (G and H agents, among others) and TIC (ammonia, sulfur dioxide, and others), will be given. The results of the detection and visualization of a gas cloud at long range in open-field testing for some simulants (DMMP, SF6) will also be shown.

  15. Multi-Angle View of the Canary Islands

    NASA Technical Reports Server (NTRS)

    2000-01-01

    A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  16. Multi-color, rotationally resolved photometry of asteroid 21 Lutetia from OSIRIS/Rosetta observations

    NASA Astrophysics Data System (ADS)

    Lamy, P. L.; Faury, G.; Jorda, L.; Kaasalainen, M.; Hviid, S. F.

    2010-10-01

    Context. Asteroid 21 Lutetia is the second target of the Rosetta space mission. Extensive pre-encounter, space-, and ground-based observations are being performed to prepare for the flyby in July 2010. Aims: The aim of this article is to accurately characterize the photometric properties of this asteroid over a broad spectral range from the ultraviolet to the near-infrared and to search for evidence of surface inhomogeneities. Methods: The asteroid was imaged on 2 and 3 January 2007 with the Narrow Angle Camera (NAC) of the Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS) during the cruise phase of the Rosetta spacecraft. The geometric conditions were such that the aspect angle was 44° (i.e., mid-northern latitudes) and the phase angle 22.4°. Lutetia was continuously monitored over 14.3 h, thus exceeding one and a half rotational periods, with twelve filters whose spectral coverage extended from 271 to 986 nm. An accurate photometric calibration was obtained from the observations of a solar analog star, 16 Cyg B. Results: High-quality light curves in the U, B, V, R and I photometric bands were obtained. Once they were merged with previous light curves from over some 45 years, the sidereal period is accurately determined: Prot = 8.168271 ± 0.000002 h. Color variations with rotational phase are marginally detected with the ultraviolet filter centered at 368 nm but are absent in the other visible and near-infrared filters. The albedo is directly determined from the observed maximum cross-section obtained from an elaborated shape model that results from a combination of adaptive-optics imaging and light curve inversion. Using current solutions for the phase function, we find geometric albedos pV = 0.130 ± 0.014 when using the linear phase function and pV(H-G) = 0.180 ± 0.018 when using the (H-G) phase function, which incorporates the opposition effect. 
The spectral variation of the reflectance indicates a steady decrease with decreasing wavelength rather than a sharp fall-off. Photometric tables (Tables 4 to 8) are only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/521/A19
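    For context, the (H-G) phase function referred to above is the standard IAU two-parameter magnitude system (Bowell et al. 1989). A sketch of its reduced-magnitude evaluation; the H and G inputs below are chosen purely for illustration, not taken from this paper:

    ```python
    import math

    def hg_reduced_magnitude(H, G, alpha_deg):
        """Reduced magnitude in the IAU (H, G) phase system.

        Uses the two-exponential approximation of the basis functions
        Phi1, Phi2 (Bowell et al. 1989). At zero phase angle the
        reduced magnitude equals H; the opposition effect is carried
        by the shape of Phi1 and Phi2 near alpha = 0.
        """
        a = math.radians(alpha_deg)
        phi1 = math.exp(-3.33 * math.tan(a / 2.0) ** 0.63)
        phi2 = math.exp(-1.87 * math.tan(a / 2.0) ** 1.22)
        return H - 2.5 * math.log10((1.0 - G) * phi1 + G * phi2)

    # Illustrative inputs: at zero phase the reduced magnitude is just H,
    # and it grows fainter (larger) with increasing phase angle.
    print(round(hg_reduced_magnitude(7.25, 0.15, 0.0), 2))  # 7.25
    ```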

  17. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    USGS Publications Warehouse

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  18. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    PubMed

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  19. Miniaturized fundus camera

    NASA Astrophysics Data System (ADS)

    Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.

    2003-07-01

    We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, but also in other applications, a small, lightweight, digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems. The electronics are truly portable, fitting in a standard board case. The camera is designed to be offered at a competitive price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project on screening for ROP. Telemedical applications are a perfect match for this camera system, exploiting both of its advantages: portability and digital imaging.

  20. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification.

    PubMed

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-07-08

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians, due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have tried to solve this issue by feeding both the visible light and FIR camera images into the CNN. This, however, takes longer to process and makes the system structure more complex, as the CNN needs to process both camera images. This research adaptively selects the more appropriate candidate between the two pedestrian images from the visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors, using visible light and FIR cameras. The results showed that the proposed method performs better than previously reported methods.
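    A toy sketch of the kind of fuzzy selection the abstract describes; the triangular membership shapes, rule weights, input normalization, and the `select_camera` interface are all illustrative assumptions, not the paper's trained FIS:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside (a, c)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def select_camera(visible_brightness, temperature_contrast):
        """Pick the more promising candidate image: visible or FIR.

        Inputs are assumed normalized to [0, 1]. Rules (fuzzy AND = min):
        dark scene with good thermal contrast favors FIR; a bright scene
        with poor thermal contrast favors the visible camera.
        """
        dark = tri(visible_brightness, -0.01, 0.0, 0.4)
        bright_enough = tri(visible_brightness, 0.2, 0.7, 1.01)
        low_contrast = tri(temperature_contrast, -0.01, 0.0, 0.3)
        high_contrast = tri(temperature_contrast, 0.2, 0.7, 1.01)

        score_fir = max(min(dark, high_contrast), dark * 0.5)
        score_visible = max(min(bright_enough, low_contrast), bright_enough * 0.5)
        return "FIR" if score_fir > score_visible else "visible"

    print(select_camera(0.05, 0.8))  # dark night, warm pedestrian -> FIR
    print(select_camera(0.9, 0.05))  # bright day, hot background -> visible
    ```

    In the paper's pipeline the winner of this selection step would then be passed to a CNN for verification; the scoring here only stands in for that first stage.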

  1. Space infrared telescope facility wide field and diffraction limited array camera (IRAC)

    NASA Technical Reports Server (NTRS)

    Fazio, G. G.

    1986-01-01

    IRAC focal plane detector technology was developed and studies of alternate focal plane configurations were supported. While any of the alternate focal planes under consideration would have a major impact on the Infrared Array Camera, it was possible to proceed with detector development and optical analysis research based on the proposed design since, to a large degree, the studies undertaken are generic to any SIRTF imaging instrument. Development of the proposed instrument was also important in a situation in which none of the alternate configurations has received the approval of the Science Working Group.

  2. Using Thermal Radiation in Detection of Negative Obstacles

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Matthies, Larry H.

    2009-01-01

    A method of automated detection of negative obstacles (potholes, ditches, and the like) ahead of ground vehicles at night involves processing of imagery from thermal-infrared cameras aimed at the terrain ahead of the vehicles. The method is being developed as part of an overall obstacle-avoidance scheme for autonomous and semi-autonomous off-road robotic vehicles. The method could also be applied to help human drivers of cars and trucks avoid negative obstacles -- a development that may entail only modest additional cost, inasmuch as some commercially available passenger cars are already equipped with infrared cameras as aids for nighttime operation.

  3. Infrared Camera Diagnostic for Heat Flux Measurements on NSTX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Mastrovito; R. Maingi; H.W. Kugel

    2003-03-25

    An infrared imaging system has been installed on NSTX (National Spherical Torus Experiment) at the Princeton Plasma Physics Laboratory to measure the surface temperatures on the lower divertor and center stack. The imaging system is based on an Indigo Alpha 160 x 128 microbolometer camera with 12 bits/pixel operating in the 7-13 µm range with a 30 Hz frame rate and a dynamic temperature range of 0-700 degrees C. From these data and knowledge of graphite thermal properties, the heat flux is derived with a classic one-dimensional conduction model. Preliminary results of heat flux scaling are reported.
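    One classic one-dimensional inversion of this kind is the Cook-Felderman scheme, which recovers surface heat flux from a surface-temperature history assuming conduction into a semi-infinite solid. The abstract does not state which discretization NSTX used, so the sketch below is illustrative, with rough graphite-like material constants:

    ```python
    import math

    def cook_felderman(times, temps, k, rho, c):
        """Surface heat flux from a sampled surface-temperature history,
        assuming 1-D conduction into a semi-infinite solid
        (Cook-Felderman discretization of the Duhamel integral)."""
        coef = 2.0 * math.sqrt(rho * c * k / math.pi)
        q = [0.0]
        for n in range(1, len(times)):
            s = 0.0
            for i in range(1, n + 1):
                s += (temps[i] - temps[i - 1]) / (
                    math.sqrt(times[n] - times[i - 1]) + math.sqrt(times[n] - times[i]))
            q.append(coef * s)
        return q

    # Sanity check: a constant flux q0 into a semi-infinite solid gives
    # T_s(t) = 2*q0*sqrt(t / (pi*rho*c*k)); the inversion should recover q0.
    k, rho, c = 100.0, 1800.0, 710.0       # rough graphite-like W/mK, kg/m3, J/kgK
    q0 = 1.0e6                              # 1 MW/m^2
    ts = [i * 1e-3 for i in range(200)]     # 0.2 s sampled at 1 kHz
    Ts = [2.0 * q0 * math.sqrt(t / (math.pi * rho * c * k)) for t in ts]
    print(abs(cook_felderman(ts, Ts, k, rho, c)[-1] / q0 - 1.0) < 0.05)  # True
    ```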

  4. Qualification Test Report for 450 Gallon Crashworthy Fuel Tank for U.S. Air Force H-53 Helicopter. Volume 6

    DTIC Science & Technology

    1982-04-02

    General S130 Eclipse computer. 2.2.3 Photographic Coverage Each crash test was recorded on 16 mm color film by four W cameras. The event was filmed at...rotate further nose-up until impact. Unfortunately, all cameras had either run out of film or had been turned off prior to impact so that there is no...record of impact angle or crash events. From visual observations at the time, the impact angle seemed to be nearly 90° nose-up. What film exists

  5. Student Measurements of the Double Star Eta Cassiopeiae

    NASA Astrophysics Data System (ADS)

    Brewer, Mark; Cacace, Gabriel; Do, Vivian; Griffith, Nicholas; Malan, Alexandria; Paredes, Hanna; Peticolas, Brian; Stasiak, Kathryne

    2016-10-01

    The double star Eta Cassiopeiae was measured at Vanguard Preparatory School. Digital measurements were made with a 14-inch telescope equipped with a CCD camera. The plate scale was determined to be 0.50 arcseconds per pixel. The separation and position angle were determined to be 13.3 arcseconds and 340.4 degrees by use of astronomy software. Previous observations reported in the Washington Double Star Catalog were used as a comparison. The camera angle was found to be the ultimate issue in the skewed data gathered for the double star.
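    The reduction described above amounts to converting two pixel positions into a separation and a position angle via the plate scale. A sketch under one common orientation convention (north along +y, east toward -x, after subtracting the camera rotation; the function name and interface are illustrative). An uncalibrated camera rotation shifts every position angle by the same amount, which is exactly the kind of systematic error the abstract attributes to the camera angle:

    ```python
    import math

    def separation_position_angle(primary, secondary, plate_scale, camera_angle_deg=0.0):
        """Separation (arcsec) and position angle (deg, north through east)
        of a double star from chip pixel coordinates.

        Assumes +y points north and +x points west once the camera
        rotation (camera_angle_deg) is removed; adapt the signs to the
        actual optical train.
        """
        dx = secondary[0] - primary[0]
        dy = secondary[1] - primary[1]
        sep = plate_scale * math.hypot(dx, dy)
        # Angle measured from north (+y) opening toward east (-x).
        pa = (math.degrees(math.atan2(-dx, dy)) - camera_angle_deg) % 360.0
        return sep, pa

    # A companion 26.6 px due "north" at 0.50 arcsec/px: 13.3 arcsec, PA 0.
    sep, pa = separation_position_angle((100.0, 100.0), (100.0, 126.6), 0.5)
    print(round(sep, 2), round(pa, 1))  # 13.3 0.0
    ```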

  6. Change detection and characterization of volcanic activity using ground based low-light and near infrared cameras to monitor incandescence and thermal signatures

    NASA Astrophysics Data System (ADS)

    Harrild, M.; Webley, P.; Dehn, J.

    2014-12-01

    Knowledge and understanding of precursory events and thermal signatures are vital for monitoring volcanogenic processes, as activity can often range from low-level lava effusion to large explosive eruptions, easily capable of ejecting ash up to aircraft cruise altitudes. Using ground-based remote sensing techniques to monitor and detect this activity is essential, but the required equipment and maintenance are often expensive. Our investigation explores the use of low-light cameras to image volcanic activity in the visible to near-infrared (NIR) portion of the electromagnetic spectrum. These cameras are ideal for monitoring as they are cheap, consume little power, are easily replaced, and can provide near real-time data. We focus here on the early detection of volcanic activity, using automated scripts that capture streaming online webcam imagery and evaluate image pixel brightness values to determine relative changes and flag increases in activity. The script is written in Python, an open-source programming language, to reduce the overall cost to potential consumers and increase the applicability of these tools across the volcanological community. In addition, by performing laboratory tests to determine the spectral response of these cameras, a direct comparison of collocated low-light and thermal infrared cameras has allowed approximate eruption temperatures and effusion rates to be determined from pixel brightness. The results of a field campaign in June 2013 to Stromboli volcano, Italy, are also presented here. Future field campaigns to Latin America will include collaborations with INSIVUMEH in Guatemala, to apply our techniques to Fuego and Santiaguito volcanoes.
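    A minimal Python sketch of the kind of automated brightness-change flagging described above (rolling baseline plus sigma threshold; the window length, threshold, and frame source are illustrative assumptions, and real use would pull frames from a webcam stream):

    ```python
    from collections import deque

    def make_brightness_monitor(window=30, sigma_thresh=3.0):
        """Flag frames whose mean brightness jumps above a rolling baseline."""
        history = deque(maxlen=window)

        def check(frame):  # frame: 2-D sequence of pixel brightness values
            mean = sum(sum(row) for row in frame) / sum(len(row) for row in frame)
            flagged = False
            if len(history) == history.maxlen:  # baseline established
                mu = sum(history) / len(history)
                var = sum((v - mu) ** 2 for v in history) / len(history)
                flagged = mean > mu + sigma_thresh * max(var ** 0.5, 1e-6)
            history.append(mean)
            return flagged

        return check

    # Six quiet frames establish the baseline; a sudden bright frame is flagged.
    monitor = make_brightness_monitor(window=5, sigma_thresh=3.0)
    quiet = [[10, 10], [10, 10]]
    bright = [[200, 200], [200, 200]]
    flags = [monitor(quiet) for _ in range(6)] + [monitor(bright)]
    print(flags[-1], any(flags[:-1]))  # True False
    ```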

  7. Change detection and characterization of volcanic activity using ground based low-light and near infrared cameras to monitor incandescence and thermal signatures

    NASA Astrophysics Data System (ADS)

    Harrild, Martin; Webley, Peter; Dehn, Jonathan

    2015-04-01

    Knowledge and understanding of precursory events and thermal signatures are vital for monitoring volcanogenic processes, as activity can often range from low-level lava effusion to large explosive eruptions, easily capable of ejecting ash up to aircraft cruise altitudes. Using ground-based remote sensing techniques to monitor and detect this activity is essential, but the required equipment and maintenance are often expensive. Our investigation explores the use of low-light cameras to image volcanic activity in the visible to near-infrared (NIR) portion of the electromagnetic spectrum. These cameras are ideal for monitoring as they are cheap, consume little power, are easily replaced, and can provide near real-time data. We focus here on the early detection of volcanic activity, using automated scripts that capture streaming online webcam imagery and evaluate image pixel brightness values to determine relative changes and flag increases in activity. The script is written in Python, an open-source programming language, to reduce the overall cost to potential consumers and increase the applicability of these tools across the volcanological community. In addition, by performing laboratory tests to determine the spectral response of these cameras, a direct comparison of collocated low-light and thermal infrared cameras has allowed approximate eruption temperatures and effusion rates to be determined from pixel brightness. The results of a field campaign in June 2013 to Stromboli volcano, Italy, are also presented here. Future field campaigns to Latin America will include collaborations with INSIVUMEH in Guatemala, to apply our techniques to Fuego and Santiaguito volcanoes.

  8. Apollo 9 Mission image - S0-65 Multispectral Photography - New Mexico and Texas

    NASA Image and Video Library

    1969-03-12

    AS09-26A-3807A (12 March 1969) --- Color infrared photograph of the Texas-New Mexico border area, between Lubbock and Roswell, taken on March 12, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey (SO65). At 11:30 a.m. (EST) when this picture was made the Apollo 9 spacecraft was at an altitude of 119 nautical miles, and the sun elevation was 38 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 33 degrees 42 minutes north latitude, and 103 degrees 1 minute west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.

  9. More than Meets the Eye - Infrared Cameras in Open-Ended University Thermodynamics Labs

    NASA Astrophysics Data System (ADS)

    Melander, Emil; Haglund, Jesper; Weiszflog, Matthias; Andersson, Staffan

    2016-12-01

    Educational research has found that students have challenges understanding thermal science. Undergraduate physics students have difficulties differentiating basic thermal concepts, such as heat, temperature, and internal energy. Engineering students have been found to have difficulties grasping surface emissivity as a thermal material property. One potential source of students' challenges with thermal science is the lack of opportunity to visualize energy transfer in intuitive ways with traditional measurement equipment. Thermodynamics laboratories have typically depended on point measures of temperature by use of thermometers (detecting heat conduction) or pyrometers (detecting heat radiation). In contrast, thermal imaging by means of an infrared (IR) camera provides a real-time, holistic image. Here we provide some background on IR cameras and their uses in education, and summarize five qualitative investigations that we have used in our courses.

  10. Apollo 9 Mission image - S0-65 Multispectral Photography - Georgia

    NASA Image and Video Library

    2009-02-19

    AS09-26A-3816A (12 March 1969) --- Color infrared photograph of the Atlantic coast of Georgia, Brunswick area, taken on March 12, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey SO65 Experiment. At 11:35 a.m. (EST) when this picture was made the Apollo 9 spacecraft was at an altitude of 102 nautical miles, and the sun elevation was 51 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 31 degrees 16 minutes north latitude, and 81 degrees 17 minutes west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.

  11. Diffraction experiments with infrared remote controls

    NASA Astrophysics Data System (ADS)

    Kuhn, Jochen; Vogt, Patrik

    2012-02-01

    In this paper we describe an experiment in which radiation emitted by an infrared remote control is passed through a diffraction grating. An image of the diffraction pattern is captured using a cell phone camera and then used to determine the wavelength of the radiation.
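
    The wavelength recovery described above rests on the grating equation d sin θ = mλ, with θ obtained from the geometry of the photographed pattern. A worked example, with an assumed grating pitch and assumed measured distances (not the article's values), might look like:

```python
import math

d = 1e-3 / 300   # slit spacing of an assumed 300 lines/mm grating, in metres
L = 0.50         # grating-to-screen distance (m), measured from the photo
x = 0.147        # offset of the first-order (m = 1) maximum on the screen (m)

theta = math.atan2(x, L)               # diffraction angle of the m = 1 maximum
wavelength = d * math.sin(theta) / 1   # grating equation: d*sin(theta) = m*lambda
wavelength_nm = wavelength * 1e9       # IR remote controls emit near 940 nm
```

    With these illustrative numbers the result lands near the typical 940 nm emission of consumer IR remote-control LEDs.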

  12. [Development of a Surgical Navigation System with Beam Split and Fusion of the Visible and Near-Infrared Fluorescence].

    PubMed

    Yang, Xiaofeng; Wu, Wei; Wang, Guoan

    2015-04-01

    This paper presents a surgical optical navigation system with non-invasive, real-time positioning capability for open surgical procedures. The design is based on the principle of near-infrared fluorescence molecular imaging, combining in vivo fluorescence excitation, multi-channel spectral camera technology, and image-fusion software. A visible and near-infrared ring-LED excitation source, multi-channel band-pass filters, a two-CCD spectral camera, and a computer system were integrated, and, as a result, a new surgical optical navigation system was successfully developed. When a near-infrared fluorescent agent is injected, the system can display anatomical images of the tissue surface and near-infrared fluorescence functional images of the surgical field simultaneously. The system can identify lymphatic vessels, lymph nodes, and tumor margins that the surgeon cannot detect with the naked eye intraoperatively, effectively guiding removal of the tumor tissue and significantly improving the success rate of surgery. The technologies have obtained a national patent, with patent No. ZI. 2011 1 0292374. 1.

  13. 2001 Mars Odyssey Images Earth (Visible and Infrared)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    2001 Mars Odyssey's Thermal Emission Imaging System (THEMIS) acquired these images of the Earth using its visible and infrared cameras as it left the Earth. The visible image shows the thin crescent viewed from Odyssey's perspective. The infrared image was acquired at exactly the same time, but shows the entire Earth using the infrared's 'night-vision' capability. In visible light the instrument sees only reflected sunlight and therefore sees nothing on the night side of the planet. In infrared light the camera observes the light emitted by all regions of the Earth. The coldest ground temperatures seen correspond to the nighttime regions of Antarctica; the warmest temperatures occur in Australia. The low temperature in Antarctica is minus 50 degrees Celsius (minus 58 degrees Fahrenheit); the high temperature at night in Australia is 9 degrees Celsius (48.2 degrees Fahrenheit). These temperatures agree remarkably well with observed temperatures of minus 63 degrees Celsius at Vostok Station in Antarctica, and 10 degrees Celsius in Australia. The images were taken at a distance of 3,563,735 kilometers (more than 2 million miles) on April 19, 2001 as the Odyssey spacecraft left Earth.

  14. Strategic options towards an affordable high-performance infrared camera

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.

    2016-05-01

    The promise of infrared (IR) imaging attaining low cost akin to the success of CMOS sensors has been hampered by the inability to achieve the cost advantages necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm, despite well-documented advantages. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640x512 pixel InGaAs uncooled system with high sensitivity, low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second (FPS)) at full resolution, and low power consumption (<1 W) in a compact package. This camera paves the way towards mass-market adoption by not only demonstrating the high-performance IR imaging capability demanded by military and industrial applications, but also illuminating a path towards the price points essential for adoption in consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably towards automation, readout electronics compatible with multiple focal plane arrays, and dense, ultra-small pixel pitch devices.

  15. In vitro near-infrared imaging of occlusal dental caries using a germanium-enhanced CMOS camera

    NASA Astrophysics Data System (ADS)

    Lee, Chulsung; Darling, Cynthia L.; Fried, Daniel

    2010-02-01

    The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
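
    The contrast between sound and carious regions can be computed as sketched below. The exact contrast definition used in the study is not given in the abstract, so the Weber-style form and the sample intensities here are assumptions for illustration only.

```python
def contrast(sound_mean, lesion_mean):
    """Contrast between sound and carious regions for transillumination,
    where the lesion appears darker than the surrounding sound enamel."""
    return (sound_mean - lesion_mean) / sound_mean

# Illustrative mean pixel intensities from manually selected regions of interest
sound = 180.0    # sound enamel
lesion = 95.0    # carious region
c = contrast(sound, lesion)   # higher values suggest a more severe lesion
```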

  16. In vitro near-infrared imaging of occlusal dental caries using germanium enhanced CMOS camera.

    PubMed

    Lee, Chulsung; Darling, Cynthia L; Fried, Daniel

    2010-03-01

    The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.

  17. A near-Infrared SETI Experiment: Alignment and Astrometric precision

    NASA Astrophysics Data System (ADS)

    Duenas, Andres; Maire, Jerome; Wright, Shelley; Drake, Frank D.; Marcy, Geoffrey W.; Siemion, Andrew; Stone, Remington P. S.; Tallis, Melisa; Treffers, Richard R.; Werthimer, Dan

    2016-06-01

    Beginning in March 2015, a Near-InfraRed Optical SETI (NIROSETI) instrument, aiming to search for fast nanosecond laser pulses, has been commissioned on the Nickel 1-m telescope at Lick Observatory. The NIROSETI instrument uses an optical guide camera, a Point Grey unit with a Sony ICX694 CCD, to align selected sources onto two 200 µm near-infrared avalanche photodiodes (APDs), each with a field of view of 2.5"x2.5". These APD detectors operate at very high bandwidths and are able to detect pulse widths extending down into the nanosecond range. Aligning sources onto these relatively small detectors requires characterizing the guide camera plate scale, the static optical distortion solution, and the relative orientation with respect to the APD detectors. We determined the guide camera plate scale to be 55.9 ± 2.7 milliarcseconds/pixel and the magnitude limit to be 18.15 mag (+1.07/-0.58) in V-band. We will present the full distortion solution of the guide camera, its orientation, and our alignment method between the camera and the two APDs, and will discuss target selection within the NIROSETI observational campaign, including coordination with Breakthrough Listen.
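
    A plate scale like the 55.9 mas/pixel quoted above can be derived from a calibration pair of known angular separation. The sketch below shows the basic computation; the star separation and pixel coordinates are invented for illustration and are not the NIROSETI calibration data.

```python
import math

def plate_scale(sep_arcsec, x1, y1, x2, y2):
    """Plate scale in milliarcseconds per pixel, from a star pair with a
    known angular separation imaged at two pixel positions."""
    pixels = math.hypot(x2 - x1, y2 - y1)
    return 1000.0 * sep_arcsec / pixels

# A pair 10.0 arcsec apart landing ~178.9 pixels apart on the guide camera
scale = plate_scale(10.0, 100.0, 100.0, 226.5, 226.5)
```

    In practice many pairs across the field would be fit together with the distortion solution rather than a single pair.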

  18. Russian Arctic

    Atmospheric Science Data Center

    2013-04-16

    ... faint greenish hue in the multi-angle composite. This subtle effect suggests that the nadir camera is observing more of the brighter ... energy and water at the Earth's surface, and for preserving biodiversity. The Multi-angle Imaging SpectroRadiometer observes the daylit ...

  19. Behavioral patterns and in-situ target strength of the hairtail ( Trichiurus lepturus) via coupling of scientific echosounder and acoustic camera data

    NASA Astrophysics Data System (ADS)

    Hwang, Kangseok; Yoon, Eun-A.; Kang, Sukyung; Cha, Hyungkee; Lee, Kyounghoon

    2017-12-01

    The present study focuses on the influence of swimming angle on the target strength (TS) of the hairtail (Trichiurus lepturus). We measured in-situ TS at 38 and 120 kHz with luring lamps at a fishing ground for jigging boats near the coastal waters of Jeju-do in Korea. The swimming angle and size of hairtails were measured using an acoustic camera. The mean preanal length was estimated to be 13.5 cm (SD = 2.7 cm) and the mean swimming tilt angle 43.9° (SD = 17.6°). The mean TS values were -35.7 and -41.2 dB at 38 and 120 kHz, respectively. These results will assist in understanding the influence of swimming angle on the TS of hairtails and, thus, improve the accuracy of biomass estimates.
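
    Mean TS values such as those above are conventionally averaged in the linear (backscattering cross-section) domain, not directly in dB. A sketch of that standard conversion follows; the sample TS values are illustrative, not the study's measurements.

```python
import math

def mean_ts(ts_values_db):
    """Mean target strength in dB: convert each TS to linear backscattering
    cross-section, average, then convert back to dB."""
    linear = [10 ** (ts / 10) for ts in ts_values_db]
    return 10 * math.log10(sum(linear) / len(linear))

m = mean_ts([-34.0, -36.0, -38.0])   # dominated by the strongest echoes
```

    Note the result sits closer to the strongest value than an arithmetic mean of the dB numbers would, which is why the linear-domain average is used.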

  20. Methods to attack or defend the professional integrity and competency of infrared thermographers and their work; what every attorney and infrared thermographer needs to know before going into a lawsuit

    NASA Astrophysics Data System (ADS)

    Colbert, Fred

    2013-05-01

    There has been a significant increase in the number of in-house infrared thermographic predictive maintenance programs for electrical/mechanical inspections as compared to out-sourced programs using hired consultants. In addition, the number of infrared consulting companies offering out-sourced programs has also grown exponentially. These market segments include building envelope (commercial and residential), refractory, boiler evaluations, etc. These surges are driven by two main factors: 1. The low cost of investment in the equipment (the cost of cameras and peripherals continues to decline). 2. Novel marketing campaigns by the camera manufacturers, who are looking to sell more cameras into an otherwise saturated market. The key characteristic of these campaigns is to oversimplify the applications and understate the significance of the technical training, specific skills, and experience needed to obtain the risk-lowering information that a facility manager requires. These camera-selling campaigns focus on the simplicity of taking a thermogram but ignore what it actually takes to perform and manage a credible, valid IR program, which in turn exposes everyone to tremendous liability. As in-house and out-sourced consulting services compete head to head for share in a constricted market, the price of out-sourced services drops. To stay competitive on price, something must be compromised, and that compromise is the knowledge, technical skill, and experience of the thermographer; this is reflected in the skill sets of in-house thermographers as well. This oversimplification of skill and experience is producing the "Perfect Storm" for infrared thermography, for both in-house and out-sourced programs.

  1. A new spherical scanning system for infrared reflectography of paintings

    NASA Astrophysics Data System (ADS)

    Gargano, M.; Cavaliere, F.; Viganò, D.; Galli, A.; Ludwig, N.

    2017-03-01

    Infrared reflectography is an imaging technique used to visualize the underdrawings of ancient paintings; it relies on the fact that most pigment layers are quite transparent to infrared radiation in the spectral band between 0.8 μm and 2.5 μm. InGaAs cameras are nowadays the devices most widely used to visualize underdrawings, but due to the small size of their detectors these cameras are usually mounted on scanning systems to record high-resolution reflectograms. This work describes a portable scanning-system prototype based on a spherical scanning geometry, built around a lightweight, low-cost motorized head. The motorized head allows the refocusing adjustment needed to compensate for the variable camera-painting distance during the rotation of the camera. The prototype was tested first in the laboratory and then in situ on the Giotto panel "God the Father with Angels" at a resolution of 256 pixels per inch. The system's performance is comparable with that of other reflectographic devices, with the advantage of extending the scanned area up to 1 m × 1 m with a 40-minute scanning time. The present configuration can easily be modified to increase the resolution up to 560 pixels per inch or to extend the scanned area up to 2 m × 2 m.
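
    The need for refocusing follows from simple geometry: when the camera pivots about a fixed point facing a flat painting, the distance along the optical axis to the surface grows with the pan angle. The sketch below shows that geometric relation only; the prototype's actual correction law is not given in the abstract.

```python
import math

def camera_distance(d0, theta_deg):
    """Distance from pivot to a flat painting along the optical axis when
    the axis is rotated theta_deg away from the surface normal."""
    return d0 / math.cos(math.radians(theta_deg))

# 1 m at normal incidence grows to ~1.15 m at a 30 degree pan angle
d = camera_distance(1.0, 30.0)
```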

  2. Small Unmanned Aerial Vehicles; DHS’s Answer to Border Surveillance Requirements

    DTIC Science & Technology

    2013-03-01

    5 of more than 4000 illegal aliens, including the seizure of more than 15,000 pounds of marijuana .13 In addition to the Predator UAVs being...payload includes two color video cameras, an infrared camera that offers night vision capability and synthetic aperture radar that provides high

  3. UXO Forum 1996

    DTIC Science & Technology

    1996-01-01

    used to locate and characterize a magnetic dipole source, and this finding accelerated the development of superconducting tensor gradiometers for... superconducting magnetic field gradiometer, two-color infrared camera, synthetic aperture radar, and a visible spectrum camera. The combination of these...Pieter Hoekstra, Blackhawk GeoSciences ......................................... 68 Prediction for UXO Shape and Orientation Effects on Magnetic

  4. Can reliable sage-grouse lek counts be obtained using aerial infrared technology

    USGS Publications Warehouse

    Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.

    2013-01-01

    More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-winged aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative technique to conventional ground-based methods, but further research is needed. We discuss multiple advantages to aerial infrared surveys, including counting in remote areas, representing greater spatial variation, and increasing the number of counted leks per season. Aerial infrared lek counts may be a valuable wildlife management tool that releases time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.

  5. Detailed real-time infrared radiation simulation applied to the sea surface

    NASA Astrophysics Data System (ADS)

    Zhang, Xuemin; Wu, Limin; Long, Liang; Zhang, Lisha

    2018-01-01

    In this paper, the infrared radiation characteristics of the sea background are studied. First, MODTRAN 4.0 was used to calculate the mid-infrared and far-infrared transmittance, the solar spectral irradiance, and the atmospheric and sea-surface radiation. Second, grid models for different sea conditions were generated from the JONSWAP sea spectrum, based on gravity-wave theory. The spectral scattering of solar and atmospheric background radiation was studied, and the total infrared radiation of the sea surface was calculated. Finally, the infrared radiation of a patch of sea surface was mapped to each pixel of the detector, and the infrared radiation was simulated. The conclusion is that solar radiance has a great influence on the infrared radiance: when the detector's viewing angle is close to the sun's elevation angle, bright spots appear on the sea surface.
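
    The JONSWAP spectrum used above has a standard closed form. The sketch below uses textbook default parameters (alpha, gamma, and the peak frequency are illustrative, not the paper's fitted sea states):

```python
import math

def jonswap(f, f_p=0.1, alpha=0.0081, gamma=3.3, g=9.81):
    """Spectral density S(f) [m^2/Hz] of the JONSWAP wave spectrum:
    a Pierson-Moskowitz base shape times a peak-enhancement factor."""
    sigma = 0.07 if f <= f_p else 0.09          # peak-width parameter
    r = math.exp(-((f - f_p) ** 2) / (2 * sigma ** 2 * f_p ** 2))
    pm = (alpha * g ** 2 * (2 * math.pi) ** -4 * f ** -5
          * math.exp(-1.25 * (f_p / f) ** 4))   # Pierson-Moskowitz form
    return pm * gamma ** r                      # peak enhancement

# Spectral density peaks at the peak frequency f_p and falls off on both sides
s_peak, s_low, s_high = jonswap(0.1), jonswap(0.05), jonswap(0.3)
```

    Sampling such a spectrum over frequency and direction gives the wave amplitudes from which the sea-surface grid is built.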

  6. Simulation of laser beam reflection at the sea surface modeling and validation

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric; Repasi, Endre

    2013-06-01

    A 3D simulation of the reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation is suitable for the pre-calculation of images for cameras operating in different spectral wavebands (visible, short-wave infrared) for a bistatic configuration of laser source and receiver under different atmospheric conditions. In the visible waveband, the calculated detected total power of reflected laser light from a 660 nm laser source is compared with data collected in a field trial. Our computer simulation comprises the 3D simulation of a maritime scene (open sea/clear sky) and the simulation of the laser beam reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves. To predict the view of a camera, the sea surface radiance must be calculated for the specific waveband. Additionally, the radiances of laser light specularly reflected at the wind-roughened sea surface are modeled considering an analytical statistical sea surface BRDF (bidirectional reflectance distribution function). Validation of simulation results is a prerequisite before applying the computer simulation to maritime laser applications. For validation purposes, data (images and meteorological data) were selected from field measurements, using a 660 nm cw laser diode to produce laser beam reflection at the water surface and recording images with a TV camera. The validation is done by numerical comparison of the measured total laser power extracted from recorded images with the corresponding simulation results. The results of the comparison are presented for different incident (zenith/azimuth) angles of the laser beam.

  7. Fabrication of multi-focal microlens array on curved surface for wide-angle camera module

    NASA Astrophysics Data System (ADS)

    Pan, Jun-Gu; Su, Guo-Dung J.

    2017-08-01

    In this paper, we present a wide-angle and compact camera module that consists of a microlens array with different focal lengths on a curved surface. The design integrates the principles of an insect's compound eye and the human eye: it contains a curved hexagonal microlens array and a spherical lens. Whereas normal mobile phone cameras usually need no fewer than four lenses, our proposed system uses only one. Furthermore, the thickness of our proposed system is only 2.08 mm and the diagonal full field of view is about 100 degrees. To fabricate the critical microlens array, we used inkjet printing to control the surface shape of each microlens to achieve different focal lengths, and a replication method to form the curved hexagonal microlens array.

  8. MUSIC - Multifunctional stereo imaging camera system for wide angle and high resolution stereo and color observations on the Mars-94 mission

    NASA Astrophysics Data System (ADS)

    Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.

    1990-10-01

    Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and a wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, and technology. Both HRSC and WAOSS are push-broom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.

  9. Modelling of the sublimation of icy grains in the coma of comet 67P/Churyumov-Gerasimenko

    NASA Astrophysics Data System (ADS)

    Gicquel, A.; Vincent, J.-B.; Shi, X.; Sierks, H.; Rose, M.; Güttler, C.; Tubiana, C.

    2015-10-01

    The ESA (European Space Agency) Rosetta spacecraft was launched on 2 March 2004, to reach comet 67P/Churyumov-Gerasimenko in August 2014. Since March 2014, images of the nucleus and the coma (gas and dust) of the comet have been acquired by the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) camera system [1] using both the wide angle camera (WAC) and the narrow angle camera (NAC). The orbiter will be maintained in the vicinity of the comet until perihelion (Rh=1.3 AU) or even until Rh=1.8 AU post-perihelion (December 2015). Nineteen months of uninterrupted, close-up observations of the gas and dust coma will be obtained and will help to characterize the evolution of comet gas and dust activity during its approach to the Sun. Indeed, for the first time, we will follow the development of a comet's coma from a close distance. The study of the dust-gas interaction in the coma will also highlight the sublimation of icy grains. Although the sublimation of icy grains is a known process, it has not yet been integrated into a complete dust-gas model. We are using the Direct Simulation Monte Carlo (DSMC) method to study the gas flow close to the nucleus. The code, called PI-DSMC (www.pidsmc.com), can simulate millions of molecules for multiple species. Once the gas flow is simulated, we inject dust particles with zero initial velocity and take into account the three forces acting on grains in a cometary environment (drag force, gravity, and radiation pressure). We used a DLL (Dynamic Link Library) module to integrate the sublimation of icy grains into the gas flow, allowing us to study the effect of the additional gas on the dust particle trajectories.
For a quantitative analysis of the sublimation of icy, outflowing grains we will consider an ensemble of grains of various radii with different compositions [2]. The evolution of the grains, once they are ejected into the coma, depends on their initial size, their composition, and the heliocentric distance (because the temperature of a grain is higher close to the Sun). The grain temperatures will be derived by assuming equilibrium between the energy absorbed from the Sun, the energy re-radiated in the infrared, and the cooling by sublimation. We will use Mie theory [3, 4] to compute the scattering properties of an assumed grain (size, shape, and composition, including mineralogy and porosity). We follow the evolution of grains until the icy layer sublimates completely. Once ejected into the gas flow, the generated molecules have no preferred direction. First results highlight that sublimation has a significant influence on the dust trajectories and generates a gas cloud that moves with the velocity of the icy grains. Our model can produce artificial images for a wide range of parameters, including outgassing rate, surface temperature, dust properties, and sublimation of icy grains. The results of this model will be compared with the images obtained with the OSIRIS camera and with published data from other instruments.
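
    The grain-temperature equilibrium described above can be written in its standard textbook form (the symbols and exact balance below are a conventional formulation, not taken from the abstract): absorbed sunlight equals thermal re-emission plus sublimation cooling,

```latex
% Radiative equilibrium for an icy grain of radius a at heliocentric
% distance r_h, with absorption efficiency Q_abs, solar flux F_sun at 1 AU,
% Planck function B_lambda, latent heat L, and mass-loss rate dm/dt:
\pi a^{2} \int Q_{\mathrm{abs}}(\lambda, a)\,
    \frac{F_{\odot}(\lambda)}{r_h^{2}}\, \mathrm{d}\lambda
  = 4\pi a^{2} \int Q_{\mathrm{abs}}(\lambda, a)\,
    \pi B_{\lambda}(T_g)\, \mathrm{d}\lambda
  + L(T_g)\, \frac{\mathrm{d}m}{\mathrm{d}t}
```

    Solving this balance for the grain temperature T_g at each heliocentric distance then fixes the sublimation rate that drives the grain's evolution.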

  10. Marangoni Flow Induced Evaporation Enhancement on Binary Sessile Drops.

    PubMed

    Chen, Pin; Harmand, Souad; Ouenzerfi, Safouene; Schiffler, Jesse

    2017-06-15

    The evaporation processes of pure water, pure 1-butanol, and 5% 1-butanol aqueous solution drops on heated hydrophobic substrates are investigated to determine the effect of temperature on the drop evaporation behavior. The evolution of the parameters (contact angle, diameter, and volume) during evaporation measured using a drop shape analyzer and the infrared thermal mapping of the drop surface recorded by an infrared camera were used in investigating the evaporation process. The pure 1-butanol drop does not show any thermal instability at different substrate temperatures, while the convection cells created by the thermal Marangoni effect appear on the surface of the pure water drop from 50 °C. Because 1-butanol and water have different surface tensions, the infrared video of the 5% 1-butanol aqueous solution drop shows that the convection cells are generated by the solutal Marangoni effect at any substrate temperature. Furthermore, when the substrate temperature exceeds 50 °C, coexistence of the thermal and solutal Marangoni flows is observed. By analyzing the relation between the ratio of the evaporation rate of pure water and 1-butanol aqueous solution drops and the Marangoni number, a series of empirical equations for predicting the evaporation rates of pure water and 1-butanol aqueous solution drops at the initial time as well as the equations for the evaporation rate of 1-butanol aqueous solution drop before the depletion of alcohol are derived. The results of these equations correspond fairly well to the experimental data.

  11. Jupiter-Io Montage

    NASA Technical Reports Server (NTRS)

    2007-01-01

    This is a montage of New Horizons images of Jupiter and its volcanic moon Io, taken during the spacecraft's Jupiter flyby in early 2007. The Jupiter image is an infrared color composite taken by the spacecraft's near-infrared imaging spectrometer, the Linear Etalon Imaging Spectral Array (LEISA) at 1:40 UT on Feb. 28, 2007. The infrared wavelengths used (red: 1.59 um, green: 1.94 um, blue: 1.85 um) highlight variations in the altitude of the Jovian cloud tops, with blue denoting high-altitude clouds and hazes, and red indicating deeper clouds. The prominent bluish-white oval is the Great Red Spot. The observation was made at a solar phase angle of 75 degrees but has been projected onto a crescent to remove distortion caused by Jupiter's rotation during the scan. The Io image, taken at 00:25 UT on March 1, 2007, is an approximately true-color composite taken by the panchromatic Long-Range Reconnaissance Imager (LORRI), with color information provided by the 0.5 um ('blue') and 0.9 um ('methane') channels of the Multispectral Visible Imaging Camera (MVIC). The image shows a major eruption in progress on Io's night side, at the northern volcano Tvashtar. Incandescent lava glows red beneath a 330-kilometer-high volcanic plume, whose uppermost portions are illuminated by sunlight. The plume appears blue due to scattering of light by small particles in the plume.

    This montage appears on the cover of the Oct. 12, 2007, issue of Science magazine.

  12. Greenland's Coast in Holiday Colors

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Vibrant reds, emerald greens, brilliant whites, and pastel blues adorn this view of the area surrounding the Jakobshavn Glacier on the western coast of Greenland. The image is a false-color (near-infrared, green, blue) view acquired by the Multi-angle Imaging SpectroRadiometer's nadir camera. The brightness of vegetation in the near-infrared contributes to the reddish hues; glacial silt gives rise to the green color of the water; and blue-colored melt ponds are visible in the bright white ice. A scattering of small icebergs in Disco Bay adds a touch of glittery sparkle to the scene.

    The large island in the upper left is called Qeqertarsuaq. To the east of this island, and just above image center, is the outlet of the fast-flowing Jakobshavn (or Ilulissat) glacier. Jakobshavn is considered to have the highest iceberg production of all Greenland glaciers and is a major drainage outlet for a large portion of the western side of the ice sheet. Icebergs released from the glacier drift slowly with the ocean currents and pose hazards for shipping along the coast.

    The Multi-angle Imaging SpectroRadiometer views the daylit Earth continuously, and the entire globe between 82 degrees north and 82 degrees south latitude is observed every 9 days. These data products were generated from a portion of the imagery acquired on June 18, 2003 during Terra orbit 18615. The images cover an area of about 254 kilometers x 210 kilometers, and use data from blocks 34 to 35 within World Reference System-2 path 10.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  13. Status of the JWST Science Instrument Payload

    NASA Technical Reports Server (NTRS)

    Greenhouse, Matt

    2016-01-01

    The James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) system consists of five sensors (four of them science instruments): the Mid-Infrared Instrument (MIRI), Near Infrared Imager and Slitless Spectrograph (NIRISS), Fine Guidance Sensor (FGS), Near InfraRed Camera (NIRCam), and Near InfraRed Spectrograph (NIRSpec); and nine instrument support systems: the Optical Metering Structure System, Electrical Harness System, Harness Radiator System, ISIM Electronics Compartment, ISIM Remote Services Unit, Cryogenic Thermal Control System, Command and Data Handling System, Flight Software System, and Operations Scripts System.

  14. The Utility of Using a Near-Infrared (NIR) Camera to Measure Beach Surface Moisture

    NASA Astrophysics Data System (ADS)

    Nelson, S.; Schmutz, P. P.

    2017-12-01

    Surface moisture content is an important factor that must be considered when studying aeolian sediment transport in a beach environment. Several instruments and procedures are available for measuring surface moisture content (e.g., moisture probes, LiDAR, and gravimetric moisture data from surface scrapings); however, these methods can be inaccurate, costly, or impractical, particularly in the field. Near-infrared (NIR) spectral band imagery is another technique used to obtain moisture data. NIR imagery has predominantly been used in remote sensing and has yet to be used for ground-based measurements. Dry sand reflects the infrared radiation given off by the sun, whereas wet sand absorbs it. This study therefore assesses the utility of measuring the surface moisture content of beach sand with a modified NIR camera. A traditional point-and-shoot digital camera was internally modified with the placement of a visible-light-blocking filter. Images were taken of three different types of beach sand at controlled moisture content values, with sunlight as the source of infrared radiation. A technique was established through trial and error by comparing the resultant histogram values, obtained in Adobe Photoshop, across the various moisture conditions. The resultant IR absorption histogram values were calibrated to actual gravimetric moisture content from surface scrapings of the samples. Overall, the results illustrate that the NIR-modified camera does not adequately measure beach surface moisture content. However, there were noted differences in IR absorption histogram values among the different sediment types. Sediment with darker quartz mineralogy produced larger variations in histogram values, but the technique is not sensitive enough to accurately represent low moisture percentages, which are of most importance when studying aeolian sediment transport.
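    The calibration step described above can be sketched as a simple mapping from mean NIR image brightness to gravimetric moisture. This is a hedged illustration only: the function names and calibration endpoints (`dry_level`, `wet_level`, `wet_moisture_pct`) are hypothetical placeholders, not values from the study.

```python
def mean_nir_level(pixels):
    """Mean grayscale level (0-255) of a NIR image patch."""
    flat = [p for row in pixels for p in row]
    return sum(flat) / len(flat)

def moisture_from_nir(mean_level, dry_level=200.0, wet_level=60.0,
                      wet_moisture_pct=25.0):
    """Map mean NIR brightness to gravimetric moisture (percent).

    Brighter (more reflective) sand is drier. dry_level and wet_level
    stand in for calibration endpoints that would come from surface
    scrapings; they are assumptions, not published values.
    """
    frac = (dry_level - mean_level) / (dry_level - wet_level)
    frac = min(max(frac, 0.0), 1.0)  # clamp to the calibrated range
    return frac * wet_moisture_pct

patch = [[180, 175, 190],
         [185, 178, 182]]  # toy NIR pixel values
moisture = moisture_from_nir(mean_nir_level(patch))
```

    A linear two-point calibration like this is the simplest possible model; the study's reported sensitivity problems at low moisture suggest a real mapping would need more calibration points per sediment type.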

  15. Preplanning and Evaluating Video Documentaries and Features.

    ERIC Educational Resources Information Center

    Maynard, Riley

    1997-01-01

    This article presents a ten-part pre-production outline and post-production evaluation that helps communications students more effectively improve video skills. Examines camera movement and motion, camera angle and perspective, lighting, audio, graphics, backgrounds and color, special effects, editing, transitions, and music. Provides a glossary…

  16. Single-Camera Stereoscopy Setup to Visualize 3D Dusty Plasma Flows

    NASA Astrophysics Data System (ADS)

    Romero-Talamas, C. A.; Lemma, T.; Bates, E. M.; Birmingham, W. J.; Rivera, W. F.

    2016-10-01

    A setup to visualize and track individual particles in multi-layered dusty plasma flows is presented. The setup consists of a single camera with variable frame rate, and a pair of adjustable mirrors that project the same field of view from two different angles to the camera, allowing for three-dimensional tracking of particles. Flows are generated by inclining the plane in which the dust is levitated using a specially designed setup that allows for external motion control without compromising vacuum. Dust illumination is achieved with an optics arrangement that includes a Powell lens that creates a laser fan with adjustable thickness and with approximately constant intensity everywhere. Both the illumination and the stereoscopy setup allow for the camera to be placed at right angles with respect to the levitation plane, in preparation for magnetized dusty plasma experiments in which there will be no direct optical access to the levitation plane. Image data and analysis of unmagnetized dusty plasma flows acquired with this setup are presented.
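    The mirror pair effectively gives two virtual views of the dust cloud from known angles, from which particle positions can be triangulated. Below is a minimal sketch under a simplifying assumption of orthographic projection; the actual setup's geometry and calibration are not specified in the abstract, so the projection model here is an assumption.

```python
import math

def triangulate(u1, u2, v, theta1, theta2):
    """Recover (x, y, z) of a particle from two mirror views.

    Simplified orthographic model (an assumption): view i, rotated by
    theta_i (radians) about the vertical axis, measures
    u_i = x*cos(theta_i) - z*sin(theta_i); v is the shared vertical
    image coordinate.
    """
    a1, b1 = math.cos(theta1), -math.sin(theta1)
    a2, b2 = math.cos(theta2), -math.sin(theta2)
    det = a1 * b2 - a2 * b1  # nonzero whenever theta1 != theta2
    x = (u1 * b2 - u2 * b1) / det
    z = (a1 * u2 - a2 * u1) / det
    return x, v, z

# Two views 30 degrees apart, as adjustable mirrors might provide.
t1, t2 = math.radians(15), math.radians(-15)
```

    The 2x2 solve shows why two distinct viewing angles suffice for depth: each view constrains one line in the (x, z) plane, and the lines intersect at the particle.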

  17. CIRCE: The Canarias InfraRed Camera Experiment for the Gran Telescopio Canarias

    NASA Astrophysics Data System (ADS)

    Eikenberry, Stephen S.; Charcos, Miguel; Edwards, Michelle L.; Garner, Alan; Lasso-Cabrera, Nestor; Stelter, Richard D.; Marin-Franch, Antonio; Raines, S. Nicholas; Ackley, Kendall; Bennett, John G.; Cenarro, Javier A.; Chinn, Brian; Donoso, H. Veronica; Frommeyer, Raymond; Hanna, Kevin; Herlevich, Michael D.; Julian, Jeff; Miller, Paola; Mullin, Scott; Murphey, Charles H.; Packham, Chris; Varosi, Frank; Vega, Claudia; Warner, Craig; Ramaprakash, A. N.; Burse, Mahesh; Punnadi, Sunjit; Chordia, Pravin; Gerarts, Andreas; Martín, Héctor De Paz; Calero, María Martín; Scarpa, Riccardo; Acosta, Sergio Fernandez; Sánchez, William Miguel Hernández; Siegel, Benjamin; Pérez, Francisco Francisco; Martín, Himar D. Viera; Losada, José A. Rodríguez; Nuñez, Agustín; Tejero, Álvaro; González, Carlos E. Martín; Rodríguez, César Cabrera; Sendra, Jordi Molgó; Rodriguez, J. Esteban; Cáceres, J. Israel Fernádez; García, Luis A. Rodríguez; Lopez, Manuel Huertas; Dominguez, Raul; Gaggstatter, Tim; Lavers, Antonio Cabrera; Geier, Stefan; Pessev, Peter; Sarajedini, Ata; Castro-Tirado, A. J.

    The Canarias InfraRed Camera Experiment (CIRCE) is a near-infrared (1-2.5 μm) imager, polarimeter, and low-resolution spectrograph operating as a visitor instrument for the 10.4-m Gran Telescopio Canarias (GTC). It was designed and built largely by graduate students and postdocs, with help from the University of Florida (UF) astronomy engineering group, and is funded by UF and the US National Science Foundation. CIRCE is intended to help fill the gap in near-infrared capabilities prior to the arrival of the Espectrógrafo Multiobjeto Infra-Rojo (EMIR) at the GTC, and will also provide the following scientific capabilities to complement EMIR after its arrival: high-resolution imaging, narrowband imaging, high-time-resolution photometry, imaging polarimetry, and low-resolution spectroscopy. In this paper, we review the design, fabrication, integration, lab testing, and on-sky performance results for CIRCE. These include a novel approach to the opto-mechanical design, fabrication, and alignment.

  18. Statistical Analysis of an Infrared Thermography Inspection of Reinforced Carbon-Carbon

    NASA Technical Reports Server (NTRS)

    Comeaux, Kayla

    2011-01-01

    Each piece of flight hardware used on the shuttle must be analyzed and pass NASA requirements before the shuttle is ready for launch. One tool used to detect cracks that lie within flight hardware is infrared flash thermography. This is a non-destructive testing technique that uses an intense flash of light to heat the surface of a material, after which an infrared camera records the cooling of the material. Because cracks within the material obstruct the natural heat flow through it, they are visible in the data from the infrared camera. We used Ecotherm, a software program, to collect data pertaining to the delaminations, and analyzed the data using Ecotherm and the University of Dayton Log Logistic Probability of Detection (POD) software. The goal was to reproduce the statistical analysis produced by the University of Dayton software by using scatter plots, log transforms, and residuals to test the assumption of normality for the residuals.

  19. Computing camera heading: A study

    NASA Astrophysics Data System (ADS)

    Zhang, John Jiaxiang

    2000-08-01

    An accurate estimate of the motion of a camera is a crucial first step for the 3D reconstruction of sites, objects, and buildings from video. Solutions to the camera heading problem can be readily applied to many areas, such as robotic navigation, surgical operations, video special effects, multimedia, and, lately, even internet commerce. Given image sequences of a real-world scene, the problem is to calculate the directions of the camera translations. The presence of rotations makes this problem very hard, because rotations and translations can have similar effects on the images and are thus difficult to tell apart. However, the visual angles between the projection rays of point pairs are unaffected by rotations, and their changes over time contain sufficient information to determine the direction of camera translation. We developed a new formulation of the visual angle disparity approach, first introduced by Tomasi, to the camera heading problem. Our new derivation makes theoretical analysis possible. Most notably, we obtain a theorem that locates all possible singularities of the residual function for the underlying optimization problem. This allows all computational trouble spots to be identified beforehand and reliable, accurate optimization methods to be designed. A bootstrap-jackknife resampling method simultaneously reduces complexity and tolerates outliers well. Experiments with image sequences show accurate results when compared with the true camera motion as measured with mechanical devices.
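    The key property the approach relies on, that the visual angle between two projection rays is unchanged by a camera rotation, is easy to verify numerically. A small sketch (a rotation about the optical axis is used as the example; any rigid rotation preserves the angle):

```python
import math

def visual_angle(p, q):
    """Angle between the projection rays through 3D points p and q."""
    dot = sum(a * b for a, b in zip(p, q))
    norm_p = math.sqrt(sum(a * a for a in p))
    norm_q = math.sqrt(sum(a * a for a in q))
    return math.acos(dot / (norm_p * norm_q))

def rotate_z(p, t):
    """Rotate a ray direction by angle t about the optical (z) axis."""
    x, y, z = p
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t),
            z)

p, q = (0.2, -0.1, 1.0), (-0.3, 0.4, 1.0)  # two scene rays
t = 0.7  # arbitrary camera roll
unrotated = visual_angle(p, q)
rotated = visual_angle(rotate_z(p, t), rotate_z(q, t))
```

    A translation, by contrast, changes the rays through fixed scene points and hence the visual angles, which is what makes the angle disparity informative about heading.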

  20. Recent Mastcam and MAHLI Visible/Near-Infrared Spectrophotometric Observations: Pahrump Hills to Marias Pass

    NASA Astrophysics Data System (ADS)

    Johnson, J. R.; Bell, J. F., III; Hayes, A.; Deen, R. G.; Godber, A.; Arvidson, R. E.; Lemmon, M. T.

    2015-12-01

    The Mastcam imaging system on the Curiosity rover continued acquisition of multispectral images of the same terrain at multiple times of day at three new rover locations between sols 872 and 1003. These data sets will be used to investigate the light scattering properties of rocks and soils along the Curiosity traverse using radiative transfer models. Images were acquired by the Mastcam-34 (M-34) camera on Sols 872-892 at 8 times of day (Mojave drill location), Sols 914-917 (Telegraph Peak drill location) at 9 times of day, and Sols 1000-1003 at 8 times of day (Stimson-Murray Formation contact near Marias Pass). Data sets were acquired using filters centered at 445, 527, 751, and 1012 nm, and the images were JPEG-compressed. Data sets typically were pointed ~east and ~west to provide phase angle coverage from near 0° to 125-140° for a variety of rocks and soils. Also acquired on Sols 917-918 at the Telegraph Peak site was a multiple time-of-day Mastcam sequence pointed southeast using only the broadband Bayer filters, which provided losslessly compressed images with phase angles ~55-129°. Navcam stereo images were also acquired with each data set to provide broadband photometry and terrain measurements for computing surface normals and local incidence and emission angles used in photometric modeling. On Sol 1028, the MAHLI camera was used as a goniometer to acquire images at 20 arm positions, all centered on the same location within the work volume from a near-constant distance of 85 cm from the surface. Although this experiment was run at only one time of day (~15:30 LTST), it provided phase angle coverage from ~30° to ~111°. The terrain included the contact between the uppermost portion of the Murray Formation and the Stimson sandstones, and this was the first acquisition of both Mastcam and MAHLI photometry images at the same rover location. The MAHLI images also allowed construction of a 3D shape model of the Stimson-Murray contact region.
The attached figure shows a phase color composite of the western Stimson area, created using phase angles of 8°, 78°, and 130° at 751 nm. The red areas correspond to highly backscattering materials that appear to concentrate along linear fractures throughout this area. The blue areas correspond to more forward scattering materials dispersed through the stratigraphic sequence.

  1. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    PubMed Central

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities, but they struggle to sample small animals reliably. We introduce a novel active camera trap system that enables the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals, and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, can be coupled to digital PIR cameras and is designed to detect small animals traversing small tunnels, narrow trails, small clearings, and areas along walls or drift fencing. PMID:28981533

  2. Upgrade of the infrared camera diagnostics for the JET ITER-like wall divertor.

    PubMed

    Balboa, I; Arnoux, G; Eich, T; Sieglin, B; Devaux, S; Zeidner, W; Morlock, C; Kruezi, U; Sergienko, G; Kinna, D; Thomas, P D; Rack, M

    2012-10-01

    For the new ITER-like wall at JET, two new infrared diagnostics (KL9B, KL3B) have been installed. These diagnostics operate between 3.5 and 5 μm at sampling frequencies of up to ∼20 kHz. KL9B and KL3B image the horizontal and vertical tiles of the divertor. The divertor tiles are tungsten-coated carbon fiber composite, except the central tile, which is bulk tungsten and consists of lamella segments. The thermal emission between lamellae affects the surface temperature measurement, and therefore the existing camera KL9A has been upgraded to achieve a higher spatial resolution (by a factor of 2). A technical description of KL9A, KL9B, and KL3B and cross-correlation with a near-infrared camera and a two-color pyrometer are presented.

  3. C-RED One : the infrared camera using the Saphira e-APD detector

    NASA Astrophysics Data System (ADS)

    Greffe, Timothée.; Feautrier, Philippe; Gach, Jean-Luc; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Boutolleau, David; Baker, Ian

    2016-08-01

    First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise and very low background. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, a genuinely disruptive technology in imaging. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on board using an FPGA. We present its performance and describe its main features. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.

  4. Selecting among competing models of electro-optic, infrared camera system range performance

    USGS Publications Warehouse

    Nichols, Jonathan M.; Hines, James E.; Nichols, James D.

    2013-01-01

    Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range performance models. Model selection based on Akaike's Information Criterion (AIC) is presented for the type of data collected during a typical human observer and target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions for distances other than the specific set at which experimental trials were conducted.
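    AIC-based selection of the kind described reduces to scoring each fitted model by 2k − 2 ln L and comparing the scores, often via Akaike weights. A minimal sketch with hypothetical log-likelihoods and parameter counts (the numbers below are invented for illustration, not from the paper):

```python
import math

def aic(log_likelihood, k):
    """Akaike's Information Criterion: 2k - 2 ln(L)."""
    return 2 * k - 2 * log_likelihood

def akaike_weights(scores):
    """Relative likelihood of each model, normalized to sum to 1."""
    best = min(scores)
    rel = [math.exp(-0.5 * (s - best)) for s in scores]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical fits: (maximized log-likelihood, parameter count).
models = [(-120.0, 2), (-115.5, 4), (-115.2, 7)]
scores = [aic(ll, k) for ll, k in models]
weights = akaike_weights(scores)
# The 4-parameter model wins: its AIC is lowest because the extra
# parameters of the 7-parameter model do not buy enough likelihood.
```

    The weights make the trade-off explicit: AIC penalizes parameter count, so a slightly worse fit with fewer parameters can still be the preferred model.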

  5. Binocular Multispectral Adaptive Imaging System (BMAIS)

    DTIC Science & Technology

    2010-07-26

    …system for pilots that adaptively integrates shortwave infrared (SWIR), visible, near-IR (NIR), off-head thermal, and computer symbology/imagery into…respective areas. BMAIS is a binocular helmet-mounted imaging system that features dual shortwave infrared (SWIR) cameras, embedded image processors and…algorithms and fusion of other sensor sites such as forward looking infrared (FLIR) and other aircraft subsystems. BMAIS is attached to the helmet…

  6. An integrated multispectral video and environmental monitoring system for the study of coastal processes and the support of beach management operations

    NASA Astrophysics Data System (ADS)

    Ghionis, George; Trygonis, Vassilis; Karydis, Antonis; Vousdoukas, Michalis; Alexandrakis, George; Drakopoulos, Panos; Amdreadis, Olympos; Psarros, Fotis; Velegrakis, Antonis; Poulos, Serafim

    2016-04-01

    Effective beach management requires environmental assessments that are based on sound science, are cost-effective, and are available to beach users and managers in an accessible, timely and transparent manner. The most common problems are: 1) the available field data are scarce and of sub-optimal spatio-temporal resolution and coverage; 2) our understanding of local beach processes needs to be improved in order to accurately model/forecast beach dynamics under a changing climate; and 3) the information provided by coastal scientists/engineers in the form of data, models and scientific interpretation is often too complicated to be of direct use by coastal managers/decision makers. A multispectral video system has been developed, consisting of one or more video cameras operating in the visible part of the spectrum, a passive near-infrared (NIR) camera, an active NIR camera system, a thermal infrared camera and a spherical video camera, coupled with innovative image processing algorithms and a telemetric system for the monitoring of coastal environmental parameters. The complete system has the capability to record, process and communicate (in quasi-real time) high-frequency information on shoreline position, wave breaking zones, wave run-up, erosion hot spots along the shoreline, nearshore wave height, turbidity, underwater visibility, wind speed and direction, air and sea temperature, solar radiation, UV radiation, relative humidity, barometric pressure and rainfall. An innovative, remotely controlled interactive visual monitoring system, based on the spherical video camera (with 360° field of view), combines the video streams from all cameras and can be used by beach managers to monitor (in real time) beach user numbers, flow activities and safety at beaches of high touristic value. 
The high resolution near infrared cameras permit 24-hour monitoring of beach processes, while the thermal camera provides information on beach sediment temperature and moisture, can detect upwelling in the nearshore zone, and enhances the safety of beach users. All data can be presented in real- or quasi-real time and are stored for future analysis and training/validation of coastal processes models. Acknowledgements: This work was supported by the project BEACHTOUR (11SYN-8-1466) of the Operational Program "Cooperation 2011, Competitiveness and Entrepreneurship", co-funded by the European Regional Development Fund and the Greek Ministry of Education and Religious Affairs.

  7. A hidden view of wildlife conservation: How camera traps aid science, research and management

    USGS Publications Warehouse

    O'Connell, Allan F.

    2015-01-01

    Camera traps — remotely activated cameras with infrared sensors — first gained measurable popularity in wildlife conservation in the early 1990s. Today, they’re used for a variety of activities, from species-specific research to broad-scale inventory or monitoring programs that, in some cases, attempt to detect biodiversity across vast landscapes. As this modern tool continues to evolve, it’s worth examining its uses and benefits for wildlife management and conservation.

  8. The Effect of a Pre-Lens Aperture on the Temperature Range and Image Uniformity of Microbolometer Infrared Cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinwiddie, Ralph Barton; Parris, Larkin S.; Lindal, John M.

    This paper explores the temperature range extension of long-wavelength infrared (LWIR) cameras by placing an aperture in front of the lens. An aperture smaller than the lens will reduce the radiance reaching the sensor, allowing the camera to image targets much hotter than typically allowable. These higher temperatures were accurately determined after developing a correction factor, which was applied to the built-in temperature calibration. The relationship between aperture diameter and temperature range is linear. The effect of pre-lens apertures on image uniformity is a form of anti-vignetting, meaning the corners appear brighter (hotter) than the rest of the image. An example of using this technique to measure the temperatures of high-melting-point polymers during 3D printing provides valuable information on the time required for the weld-line temperature to fall below the glass transition temperature.

  9. Augmented reality in laser laboratories

    NASA Astrophysics Data System (ADS)

    Quercioli, Franco

    2018-05-01

    Laser safety glasses block visibility of the laser light. This is a big nuisance when a clear view of the beam path is required. A headset made up of a smartphone and a viewer can overcome this problem. The user looks at the image of the real world on the cellphone display, captured by its rear camera. An unimpeded and safe sight of the laser beam is then achieved. If the infrared blocking filter of the smartphone camera is removed, the spectral sensitivity of the CMOS image sensor extends in the near infrared region up to 1100 nm. This substantial improvement widens the usability of the device to many laser systems for industrial and medical applications, which are located in this spectral region. The paper describes this modification of a phone camera to extend its sensitivity beyond the visible and make a true augmented reality laser viewer.

  10. Basic temperature correction of QWIP cameras in thermoelastic/plastic tests of composite materials.

    PubMed

    Boccardi, Simone; Carlomagno, Giovanni Maria; Meola, Carosena

    2016-12-01

    The present work is concerned with the use of a quantum well infrared photodetector (QWIP) infrared camera to measure very small temperature variations, related to thermoelastic/plastic effects, that develop on composites under relatively low loads, either periodic or due to impact. As is evident from previous work, some temperature variations are difficult to measure, being at the edge of the IR camera resolution and/or affected by instrument noise. Nevertheless, they can be valuable either for obtaining information about the material characteristics and its behavior under periodic load (thermoelastic) or for assessing the overall extent of delaminations due to impact (thermo-plastic). An image post-processing procedure is described herein that, with the help of a reference signal, allows for suppression of the instrument noise and better discrimination of the thermal signatures induced by the two different loads.

  11. The Hubble Space Telescope: UV, Visible, and Near-Infrared Pursuits

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer

    2010-01-01

    The Hubble Space Telescope continues to push the limits on world-class astrophysics. Cameras including the Advanced Camera for Surveys and the new panchromatic Wide Field Camera 3, which was installed during last year's successful servicing mission (SM4), offer imaging from near-infrared through ultraviolet wavelengths. Spectroscopic studies of sources from black holes to exoplanet atmospheres are making great advances through the versatile use of STIS, the Space Telescope Imaging Spectrograph. The new Cosmic Origins Spectrograph, also installed last year, is the most sensitive UV spectrograph to fly in space and is uniquely suited to address particular scientific questions on galaxy halos, the intergalactic medium, and the cosmic web. With these outstanding capabilities on HST come complex needs for laboratory astrophysics support, including atomic and line identification data. I will provide an overview of Hubble's current capabilities and the scientific programs and goals that particularly benefit from laboratory astrophysics studies.

  12. Feasibility evaluation of a motion detection system with face images for stereotactic radiosurgery.

    PubMed

    Yamakawa, Takuya; Ogawa, Koichi; Iyatomi, Hitoshi; Kunieda, Etsuo

    2011-01-01

    In stereotactic radiosurgery we can irradiate a targeted volume precisely with a narrow high-energy x-ray beam, and thus the motion of a targeted area may cause side effects to normal organs. This paper describes our motion detection system with three USB cameras. To reduce the effect of change in illuminance in a tracking area we used an infrared light and USB cameras that were sensitive to the infrared light. The motion detection of a patient was performed by tracking his/her ears and nose with three USB cameras, where pattern matching between a predefined template image for each view and acquired images was done by an exhaustive search method with a general-purpose computing on a graphics processing unit (GPGPU). The results of the experiments showed that the measurement accuracy of our system was less than 0.7 mm, amounting to less than half of that of our previous system.
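    The exhaustive template search described above can be sketched as a sum-of-absolute-differences (SAD) scan over all offsets; the real system runs this per camera on a GPU, but the logic is the same. The image and template values below are toy data, not from the paper.

```python
def match_template(image, template):
    """Exhaustive search: slide the template over the image and return the
    (row, col) offset with the minimum sum of absolute differences (SAD)."""
    th, tw = len(template), len(template[0])
    best_sad, best_pos = None, (0, 0)
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if best_sad is None or sad < best_sad:
                best_sad, best_pos = sad, (r, c)
    return best_pos

# Toy grayscale frame with the template pattern embedded at offset (1, 1).
image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
position = match_template(image, template)  # -> (1, 1)
```

    Frame-to-frame displacement of the matched position, scaled by the camera calibration, gives the sub-millimeter motion estimate; the exhaustive scan is O(image × template), which is why the authors offload it to a GPGPU.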

  13. Tracking a Head-Mounted Display in a Room-Sized Environment with Head-Mounted Cameras

    DTIC Science & Technology

    1990-04-01

    …poor resolution and a very limited working volume [Wan90]. OPTOTRAK [Nor88] uses one camera with two dual-axis CCD infrared position sensors. Each… [Nor88] Northern Digital. Trade literature on Optotrak - Northern Digital’s Three Dimensional Optical Motion Tracking and Analysis System. Northern Digital

  14. The diagnosing of plasmas using spectroscopy and imaging on Proto-MPEX

    NASA Astrophysics Data System (ADS)

    Baldwin, K. A.; Biewer, T. M.; Crouse Powers, J.; Hardin, R.; Johnson, S.; McCleese, A.; Shaw, G. C.; Showers, M.; Skeen, C.

    2015-11-01

    The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device being developed at Oak Ridge National Laboratory (ORNL). This machine is planned to study plasma-material interaction (PMI) physics relevant to future fusion reactors. We tested and learned to use spectroscopy and imaging tools, consisting of a spectrometer, a high-speed camera, an infrared camera, and thermocouples. The spectrometer measures the color of the light from the plasma and its intensity. We used the high-speed camera to see how the magnetic field acts on the plasma and how the gas is heated into the fourth state of matter. The thermocouples measure the temperature of the objects they are placed against, which in this case are the end plates of the machine. We used the infrared camera to see the heat pattern of the plasma on the end plates. Data from these instruments will be shown. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725 and the Oak Ridge Associated Universities ARC program.

  15. Studies of coronal lines with electronic cameras during the eclipse of 7 march 1970.

    PubMed

    Fort, B

    1970-12-01

    The experimental design described here allows us to study, with 2-Å bandpass filters, the brightness distribution of the green coronal line, the two infrared lines of Fe XIII, and the neighboring coronal continuum. For the first time in an eclipse expedition, electrostatic cameras derived from the Lallemand type were used; full advantage was taken of their speed, especially in the near-infrared spectral range, and of their good photometric qualities. They permit the measurement of the intensity and polarization of the coronal lines to a height of 1.25 solar radii above the limb of the sun, with a spatial resolution ≥ (10″)².

  16. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC) and Submillimeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADRs) to cool their detectors to 200 mK and 100 mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  17. Thin and thick cloud top height retrieval algorithm with the Infrared Camera and LIDAR of the JEM-EUSO Space Mission

    NASA Astrophysics Data System (ADS)

    Sáez-Cano, G.; Morales de los Ríos, J. A.; del Peral, L.; Neronov, A.; Wada, S.; Rodríguez Frías, M. D.

    2015-03-01

    The origin of cosmic rays has remained a mystery for more than a century. JEM-EUSO is a pioneering space-based telescope that will be located at the International Space Station (ISS); its aim is to detect Ultra High Energy Cosmic Rays (UHECR) and Extremely High Energy Cosmic Rays (EHECR) by observing the atmosphere. Unlike ground-based telescopes, JEM-EUSO will observe from above, and therefore, for proper UHECR reconstruction under cloudy conditions, a key element of JEM-EUSO is an Atmospheric Monitoring System (AMS). This AMS consists of a space-qualified bi-spectral Infrared Camera, which will provide the cloud coverage and cloud top height in the JEM-EUSO Field of View (FoV), and a LIDAR, which will measure the atmospheric optical depth in the direction in which it is fired. In this paper we explain the effects of clouds on the determination of the UHECR arrival direction. Moreover, since cloud top height retrieval is crucial for analyzing UHECR and EHECR events under cloudy conditions, the retrieval algorithm that fulfills the technical requirements of the Infrared Camera of JEM-EUSO for reconstructing the cloud top height is reported here.

  18. Getting Closer

    NASA Image and Video Library

    2005-06-20

    One of the two pictures of Tempel 1 (see also PIA02101) taken by Deep Impact's medium-resolution camera is shown next to data of the comet taken by the spacecraft's infrared spectrometer. This instrument breaks apart light like a prism to reveal the "fingerprints," or signatures, of chemicals. Even though the spacecraft was over 10 days away from the comet when these data were acquired, it detected some of the molecules making up the comet's gas and dust envelope, or coma. The signatures of these molecules -- including water, hydrocarbons, carbon dioxide and carbon monoxide -- can be seen in the graph, or spectrum. Deep Impact's impactor spacecraft is scheduled to collide with Tempel 1 at 10:52 p.m. Pacific time on July 3 (1:52 a.m. Eastern time, July 4). The mission's flyby spacecraft will use its infrared spectrometer to sample the ejected material, providing the first look at the chemical composition of a comet's nucleus. These data were acquired from June 20 to 21, 2005. The picture of Tempel 1 was taken by the flyby spacecraft's medium-resolution instrument camera. The infrared spectrometer uses the same telescope as the high-resolution instrument camera. http://photojournal.jpl.nasa.gov/catalog/PIA02100

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalas, Paul G.; Rajan, Abhijith; Wang, Jason J.

    Here, we present the first scattered light detections of the HD 106906 debris disk using the Gemini/Gemini Planet Imager in the infrared and Hubble Space Telescope (HST)/Advanced Camera for Surveys in the optical. HD 106906 is a 13 Myr old F5V star in the Sco–Cen association, with a previously detected planet-mass candidate HD 106906b projected 650 AU from the host star. Our observations reveal a near edge-on debris disk that has a central cleared region with radius ~50 AU, and an outer extent >500 AU. The HST data show that the outer regions are highly asymmetric, resembling the "needle" morphology seen for the HD 15115 debris disk. The planet candidate is oriented ~21° away from the position angle of the primary's debris disk, strongly suggesting non-coplanarity with the system. We hypothesize that HD 106906b could be dynamically involved in the perturbation of the primary's disk, and investigate whether or not there is evidence for a circumplanetary dust disk or cloud that is either primordial or captured from the primary. In conclusion, we show that both the existing optical properties and near-infrared colors of HD 106906b are weakly consistent with this possibility, motivating future work to test for the observational signatures of dust surrounding the planet.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalas, Paul G.; Wang, Jason J.; Duchene, Gaspard

    We present the first scattered light detections of the HD 106906 debris disk using the Gemini/Gemini Planet Imager in the infrared and Hubble Space Telescope (HST)/Advanced Camera for Surveys in the optical. HD 106906 is a 13 Myr old F5V star in the Sco–Cen association, with a previously detected planet-mass candidate HD 106906b projected 650 AU from the host star. Our observations reveal a near edge-on debris disk that has a central cleared region with radius ∼50 AU, and an outer extent >500 AU. The HST data show that the outer regions are highly asymmetric, resembling the “needle” morphology seen for the HD 15115 debris disk. The planet candidate is oriented ∼21° away from the position angle of the primary’s debris disk, strongly suggesting non-coplanarity with the system. We hypothesize that HD 106906b could be dynamically involved in the perturbation of the primary’s disk, and investigate whether or not there is evidence for a circumplanetary dust disk or cloud that is either primordial or captured from the primary. We show that both the existing optical properties and near-infrared colors of HD 106906b are weakly consistent with this possibility, motivating future work to test for the observational signatures of dust surrounding the planet.

  1. Sunny Side of a Comet

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Figure 1: Temperature Map

    This image composite shows comet Tempel 1 in visible (left) and infrared (right) light (figure 1). The infrared picture highlights the warm, or sunlit, side of the comet, where NASA's Deep Impact probe later hit. These data were acquired about six minutes before impact. The visible image was taken by the medium-resolution camera on the mission's flyby spacecraft, and the infrared data were acquired by the flyby craft's infrared spectrometer.

  2. Shutterless non-uniformity correction for the long-term stability of an uncooled long-wave infrared camera

    NASA Astrophysics Data System (ADS)

    Liu, Chengwei; Sui, Xiubao; Gu, Guohua; Chen, Qian

    2018-02-01

    For the uncooled long-wave infrared (LWIR) camera, the infrared (IR) irradiation received by the focal plane array (FPA) is a crucial factor affecting image quality. Ambient temperature fluctuation as well as system power consumption can change the FPA temperature and radiation characteristics inside the IR camera; these further degrade the imaging performance. In this paper, we present a novel shutterless non-uniformity correction method to compensate for non-uniformity caused by variations in ambient temperature. Our method combines a calibration-based method with the properties of a scene-based method to obtain correction parameters at different ambient temperatures, so that camera performance is less influenced by ambient temperature fluctuation or system power consumption. The calibration is carried out in a temperature chamber with slowly changing ambient temperature and a blackbody as a uniform radiation source. A sufficient number of uniform images are captured and the gain coefficients are calculated during this period. In practical application, the offset parameters are then calculated via the least squares method from the gain coefficients, the captured uniform images and the actual scene, yielding a corrected output through the gain coefficients and offset parameters. The performance of the proposed method is evaluated on realistic IR images and compared with two existing methods. The images used in the experiments were obtained with a 384 × 288 pixel uncooled LWIR camera. Results show that the proposed method adaptively updates the correction parameters as the scene changes and is more robust to temperature fluctuation than the other two methods.
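
    As a companion to the abstract above, here is a minimal gain/offset non-uniformity correction sketch. It shows only a classic two-point calibration step; the paper's distinctive contribution, updating offsets in operation by least squares against the scene, is not reproduced. The simulated FPA model and all numeric values are assumptions for illustration.

```python
import numpy as np

# Minimal two-point non-uniformity correction sketch (illustrative only).

def two_point_nuc(frame_lo, frame_hi, radiance_lo, radiance_hi):
    """Per-pixel gain/offset from two uniform blackbody frames."""
    gain = (radiance_hi - radiance_lo) / (frame_hi - frame_lo)
    offset = radiance_lo - gain * frame_lo
    return gain, offset

rng = np.random.default_rng(0)
shape = (8, 8)
true_gain = 1.0 + 0.05 * rng.standard_normal(shape)   # pixel responsivity spread
true_offset = 2.0 * rng.standard_normal(shape)        # pixel dark-offset spread

def fpa(radiance):
    """Simulated raw FPA output for a uniform input radiance."""
    return true_gain * radiance + true_offset

# Calibration against a blackbody at two radiance levels
gain, offset = two_point_nuc(fpa(50.0), fpa(100.0), 50.0, 100.0)

# Correction applied to a new uniform scene removes the fixed pattern
scene = 80.0
corrected = gain * fpa(scene) + offset
print(round(float(np.abs(corrected - scene).max()), 9))  # ~0
```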

  3. Low-cost panoramic infrared surveillance system

    NASA Astrophysics Data System (ADS)

    Kecskes, Ian; Engel, Ezra; Wolfe, Christopher M.; Thomson, George

    2017-05-01

    A nighttime surveillance concept consisting of a single-surface omnidirectional mirror assembly and an uncooled vanadium oxide (VOx) longwave infrared (LWIR) camera has been developed. This configuration provides a continuous field of view spanning 360° in azimuth and more than 110° in elevation. Both the camera and the mirror are readily available, off-the-shelf, inexpensive products. The mirror assembly is marketed for use in the visible spectrum and requires only minor modifications to function in the LWIR spectrum. The compactness and portability of this optical package offer significant advantages over many existing infrared surveillance systems. The developed system was evaluated on its ability to detect moving, human-sized heat sources at ranges between 10 m and 70 m. Raw camera images captured by the system are converted from rectangular coordinates in the camera focal plane to polar coordinates and then unwrapped into the user's azimuth and elevation system. Digital background subtraction and color mapping are applied to the images to increase the user's ability to extract moving items from background clutter. A second optical system, consisting of a commercially available 50 mm f/1.2 ATHERM lens and a second LWIR camera, is used to examine the details of objects of interest identified with the panoramic imager. A description of the components of the proof of concept is given, followed by a presentation of raw images taken by the panoramic LWIR imager. The method by which these images are analyzed is described, and the results are presented side-by-side with the output of the 50 mm LWIR imager and a panoramic visible-light imager. Finally, the concept and its future development are discussed.
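
    The coordinate unwrapping step described above (circular mirror image to an azimuth/elevation panorama) can be sketched as follows. The linear radius-to-elevation mapping and nearest-neighbour sampling are simplifying assumptions, not the authors' exact mirror geometry.

```python
import numpy as np

# Sketch: unwrap a circular omnidirectional image into an
# (elevation, azimuth) panorama by inverse mapping each output pixel
# to a (radius, angle) position and sampling by nearest neighbour.

def unwrap(img, center, r_min, r_max, out_w=360, out_h=90):
    cy, cx = center
    az = np.linspace(0, 2 * np.pi, out_w, endpoint=False)   # azimuth bins
    r = np.linspace(r_min, r_max, out_h)                    # elevation -> radius
    rr, aa = np.meshgrid(r, az, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(aa)).astype(int), 0, img.shape[0] - 1)
    xs = np.clip(np.round(cx + rr * np.cos(aa)).astype(int), 0, img.shape[1] - 1)
    return img[ys, xs]

# Toy test: a ring of hot pixels should unwrap to a horizontal band
img = np.zeros((101, 101))
yy, xx = np.mgrid[0:101, 0:101]
ring = np.hypot(yy - 50, xx - 50)
img[(ring > 29) & (ring < 31)] = 1.0
pano = unwrap(img, (50, 50), r_min=10, r_max=45, out_w=72, out_h=36)
print(pano.shape)  # (36, 72): elevation rows, azimuth columns
```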

  4. A Spectralon BRF Data Base for MISR Calibration Application

    NASA Technical Reports Server (NTRS)

    Bruegge, C.; Chrien, N.; Haner, D.

    1999-01-01

    The Multi-angle Imaging SpectroRadiometer (MISR) is an Earth observing sensor which will provide global retrievals of aerosols, clouds, and land surface parameters. Instrument specifications require high accuracy absolute calibration, as well as accurate camera-to-camera, band-to-band and pixel-to-pixel relative response determinations.

  5. The canopy camera

    Treesearch

    Harry E. Brown

    1962-01-01

    The canopy camera is a device of new design that takes wide-angle, overhead photographs of vegetation canopies, cloud cover, topographic horizons, and similar subjects. Since the entire hemisphere is photographed in a single exposure, the resulting photograph is circular, with the horizon forming the perimeter and the zenith the center. Photographs of this type provide...

  6. Cloud top structure of Venus revealed by Subaru/COMICS mid-infrared images

    NASA Astrophysics Data System (ADS)

    Sato, T. M.; Sagawa, H.; Kouyama, T.; Mitsuyama, K.; Satoh, T.; Ohtsuki, S.; Ueno, M.; Kasaba, Y.; Nakamura, M.; Imamura, T.

    2014-04-01

    We have investigated the cloud top structure of Venus by analyzing ground-based images obtained by the Cooled Mid-Infrared Camera and Spectrometer (COMICS), mounted on the 8.2-m Subaru Telescope. In this presentation, we will overview the observational results and discuss their interpretations.

  7. TIRGO and its instrumentation

    NASA Astrophysics Data System (ADS)

    Baffa, Carlo; Gennari, Sandro; Hunt, Leslie K.; Lisi, Franco; Tofani, Gianni; Vanzi, Leonardo

    1995-09-01

    We describe the general characteristics of the TIRGO infrared telescope, located on Gornergrat (Switzerland), and its most recent instrumentation. This telescope is specifically designed for infrared astronomical observations. Two newly designed instruments are presented: the imaging camera Arnica and the long-slit spectrometer LonGSp, both based on two-dimensional array detectors.

  8. Near infrared observations of S 155. Evidence of induced star formation?

    NASA Astrophysics Data System (ADS)

    Hunt, L. K.; Lisi, F.; Felli, M.; Tofani, G.

    In order to investigate the possible existence of embedded objects of recent formation in the area of the Cepheus B - Sh2-155 interface, the authors have observed the region of the compact radio continuum source with the new near infrared camera ARNICA and the TIRGO telescope.

  9. Finite Element Modeling and Long Wave Infrared Imaging for Detection and Identification of Buried Objects

    DTIC Science & Technology

    surface temperature profile of a sandbox containing buried objects using a long-wave infrared camera. Images were recorded for several days under ambient...time of day. Best detection of buried objects corresponded to shallow depths for observed intervals where maxima/minima ambient temperatures coincided

  10. Simulation of a polarized laser beam reflected at the sea surface: modeling and validation

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric

    2015-05-01

    A 3-D simulation of the polarization-dependent reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation considers polarized or unpolarized laser sources and calculates the polarization states upon reflection at the sea surface. It is suitable for the radiance calculation of the scene in different spectral wavebands (e.g. near-infrared, SWIR), excluding camera degradations. The simulation also considers a bistatic configuration of laser source and receiver, as well as different atmospheric conditions. In the SWIR, the detected total power of reflected laser light is compared with data collected in a field trial. Our computer simulation combines the 3-D simulation of a maritime scene (open sea/clear sky) with the simulation of polarized or unpolarized laser light reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves. To predict the input of a camera equipped with a linear polarizer, the polarized sea surface radiance must be calculated for the specific waveband. The s- and p-polarization states are calculated for the emitted sea surface radiance and the specularly reflected sky radiance to determine the total polarized sea surface radiance of each component. The states of polarization and the radiance of laser light specularly reflected at the wind-roughened sea surface are calculated by considering the s- and p-components of the electric field of the laser light with respect to the specular plane of incidence. This is done by using the formalism of their coherence matrices according to E. Wolf [1]. Additionally, an analytical statistical sea surface BRDF (bidirectional reflectance distribution function) is considered for the reflection of laser light radiances. Validation of the simulation results is required to ensure model credibility and applicability to maritime laser applications.
For validation purposes, field measurement data (images and meteorological data) were analyzed. An infrared laser, with or without a mounted polarizer, produced laser beam reflections at the water surface, and images were recorded by a camera equipped with a polarizer in horizontal or vertical alignment. The validation is done by numerical comparison of the measured total laser power extracted from the recorded images with the corresponding simulation results. The results of the comparison are presented for different incidence (zenith/azimuth) angles of the laser beam and different polarizer configurations for the laser (vertical/horizontal/none) and the camera (vertical/horizontal).
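
    A basic building block of such a polarized reflection model is the split into s- and p-components at each surface facet. A minimal sketch of the corresponding Fresnel power reflectances for a smooth air-water interface follows; the real-valued refractive index and single-facet geometry are simplifications relative to the coherence-matrix treatment used in the paper.

```python
import numpy as np

# Fresnel power reflectances for s- and p-polarised light at a flat
# air-water interface (real refractive index, single smooth facet,
# no wave statistics) -- the per-facet core of a polarised sea-surface
# reflection model.

def fresnel_rs_rp(theta_i_rad, n1=1.0, n2=1.33):
    """Return (R_s, R_p) power reflectances at incidence angle theta_i."""
    ci = np.cos(theta_i_rad)
    st = n1 / n2 * np.sin(theta_i_rad)               # Snell's law
    ct = np.sqrt(1.0 - st**2)                        # cosine of refraction angle
    rs = (n1 * ci - n2 * ct) / (n1 * ci + n2 * ct)   # s amplitude coefficient
    rp = (n2 * ci - n1 * ct) / (n2 * ci + n1 * ct)   # p amplitude coefficient
    return rs**2, rp**2

# At Brewster's angle arctan(n2/n1), the p reflectance vanishes
brewster = np.arctan(1.33)
rs, rp = fresnel_rs_rp(brewster)
print(round(float(rp), 6))  # 0.0
```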

  11. Fluid Fantasy

    NASA Image and Video Library

    2016-10-24

    Saturn's clouds are full of raw beauty, but they also represent a playground for a branch of physics called fluid dynamics, which seeks to understand the motion of gases and liquids. Saturn's lack of a solid planetary surface (as on Earth, Mars or Venus) means that its atmosphere is free to flow around the planet essentially without obstruction. This is one factor that generates Saturn's pattern of alternating belts and zones -- one of the main features of its dynamic atmosphere. Winds in the belts blow at speeds different from those in the adjacent zones, leading to the formation of vortices along the boundaries between the two. And vigorous convection occasionally leads to storms and waves. Saturn's innermost rings are just visible at the bottom and in the upper left corner. This view is centered on clouds at 25 degrees north latitude on Saturn. The image was taken with the Cassini spacecraft wide-angle camera on July 20, 2016 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 728 nanometers. The view was obtained at a distance of approximately 752,000 miles (1.21 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 6 degrees. Image scale is 45 miles (72 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20503

  12. Inter-joint coordination between hips and trunk during downswings: Effects on the clubhead speed.

    PubMed

    Choi, Ahnryul; Lee, In-Kwang; Choi, Mun-Taek; Mun, Joung Hwan

    2016-10-01

    Understanding the inter-joint coordination between the rotational movements of each hip and the trunk in golf provides basic knowledge of how the neuromuscular system organises the related joints to perform a successful swing motion. In this study, we evaluated the inter-joint coordination characteristics between the rotational movements of the hips and trunk during golf downswings. Twenty-one right-handed male professional golfers were recruited for this study. Infrared cameras were installed to capture the swing motion. The axial rotation angle, angular velocity and inter-joint coordination were calculated by the Euler angle, the numerical difference method and continuous relative phase, respectively. More typical inter-joint coordination was demonstrated in the leading hip/trunk pair than in the trailing hip/trunk pair. Three coordination characteristics of the leading hip/trunk showed a significant relationship with clubhead speed at impact (r < -0.5) in male professional golfers. An increased rotation difference between the leading hip and trunk over the overall downswing phase, as well as faster rotation of the leading hip than of the trunk in the early downswing, play important roles in increasing clubhead speed. These inter-joint coordination strategies have great potential to serve as a biomechanical guideline for improving the swing performance of unskilled golfers.
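
    The continuous relative phase mentioned above can be sketched with the common phase-portrait definition: each joint's phase is atan2 of its normalised angular velocity over its normalised angle, and the relative phase is the difference of the two unwrapped phases. The normalisation choice and the toy sinusoidal hip/trunk signals below are assumptions for illustration, not the study's data or code.

```python
import numpy as np

# Continuous relative phase (CRP) sketch between two joint rotations.

def phase_angle(angle, velocity):
    """Phase from a normalised phase portrait, unwrapped over time."""
    a = 2 * (angle - angle.min()) / (angle.max() - angle.min()) - 1
    v = velocity / np.abs(velocity).max()
    return np.unwrap(np.arctan2(v, a))

def crp(angle1, vel1, angle2, vel2):
    """Relative phase (degrees) between joint 1 and joint 2."""
    return np.degrees(phase_angle(angle1, vel1) - phase_angle(angle2, vel2))

t = np.linspace(0, 1, 200)
hip = np.sin(2 * np.pi * t)                 # toy leading-hip rotation
trunk = np.sin(2 * np.pi * t - np.pi / 4)   # toy trunk rotation, 45 deg behind
rel = crp(hip, np.gradient(hip, t), trunk, np.gradient(trunk, t))
print(round(float(np.median(rel)), 1))      # about -45: constant 45-degree lag
```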

  13. Improved determination of dynamic balance using the centre of mass and centre of pressure inclination variables in a complete golf swing cycle.

    PubMed

    Choi, Ahnryul; Sim, Taeyong; Mun, Joung Hwan

    2016-01-01

    Golf requires proper dynamic balance to accurately control the club head through a harmonious coordination of each human segment and joint. In this study, we evaluated dynamic balance ability during a golf swing using centre of mass (COM)-centre of pressure (COP) inclination variables. Twelve professional, 13 amateur and 10 novice golfers participated in this study. Six infrared cameras, two force platforms and SB-Clinic software were used to measure the net COM and COP trajectories. To evaluate dynamic balance ability, the COM-COP inclination angle, COM-COP inclination angular velocity and normalised COM-COP inclination angular jerk were used. The professional golfer group showed a smaller COM-COP inclination angle and angular velocity than the novice golfer group in the lead/trail direction (P < 0.01). For the normalised COM-COP inclination angular jerk, the professional golfer group showed a lower value than the other two groups in all directions. Professional golfers tend to exhibit improved dynamic balance, which can be attributed to a neuromusculoskeletal system that maintains balance through proper postural control. This study has the potential to allow for an evaluation of the dynamic balance mechanism and will provide useful basic information for swing training and the prevention of golf injuries.
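
    The COM-COP inclination angle can be sketched as the angle between the vertical and the line joining the centre of pressure (on the force plate) to the centre of mass. The coordinate convention below is an assumption; the study's angular velocity and normalised jerk variables (time derivatives of this angle) are not reproduced here.

```python
import numpy as np

# Sketch of the COM-COP inclination angle (degrees from vertical).

def inclination_angle_deg(com, cop):
    """com: (x, y, z) in metres, z up; cop: (x, y, 0) on the force plate."""
    d = np.asarray(com, float) - np.asarray(cop, float)
    horizontal = np.hypot(d[0], d[1])              # horizontal offset
    return np.degrees(np.arctan2(horizontal, d[2]))

# COM 1 m above a COP displaced 0.1 m horizontally -> atan(0.1/1)
print(round(inclination_angle_deg((0.1, 0.0, 1.0), (0.0, 0.0, 0.0)), 1))  # 5.7
```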

  14. A multiscale video system for studying optical phenomena during active experiments in the upper atmosphere

    NASA Astrophysics Data System (ADS)

    Nikolashkin, S. V.; Reshetnikov, A. A.

    2017-11-01

    The system of video surveillance during active rocket experiments at the Polar Geophysical Observatory "Tixie", and studies of the effects of "Soyuz" vehicle launches from the "Vostochny" cosmodrome over the territory of the Republic of Sakha (Yakutia), is presented. The system consists of three AHD video cameras with different angles of view mounted on a common platform fixed to a tripod, with the possibility of manual guiding. The main camera, with a high-sensitivity black-and-white CCD matrix (SONY EXview HAD II), is equipped, depending on the task, with an "MTO-1000" (F = 1000 mm) or "Jupiter-21M" (F = 300 mm) lens and is designed for detailed imaging of luminous formations. The second camera is of the same type but has a 30-degree angle of view; it is intended for imaging the general scene and large objects, and for referencing object coordinates to the stars. The third, color, wide-angle camera (120 degrees) is designed for referencing to landmarks in the daytime; the optical axis of this channel is directed 60 degrees downward. The data are recorded on the hard disk of a four-channel digital video recorder. Tests of the original two-channel version of the system were conducted during the launch of a geophysical rocket at Tixie in September 2015 and showed its effectiveness.

  15. Bridge deck surface temperature monitoring by infrared thermography and inner structure identification using PPT and PCT analysis methods

    NASA Astrophysics Data System (ADS)

    Dumoulin, Jean

    2013-04-01

    One of the objectives of the ISTIMES project was to evaluate the potential offered by the integration of different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, we focused our research and development efforts on uncooled infrared camera techniques due to their promising potential for dissemination, linked to their relatively low cost on the market. In parallel, work was carried out to identify well-adapted implementation protocols and the key limits of the Pulse Phase Thermography (PPT) and Principal Component Thermography (PCT) processing methods used to analyse thermal image sequences and retrieve information about the inner structure. The first part of this research addresses infrared thermography measurement used in quantitative mode (outside laboratory conditions), rather than in qualitative mode (vision applied to surveys). In this context, thermal radiative corrections must be applied in real time to the raw acquired data, using additional measurements, to take into account the evolution of the natural environment over time. The camera sensor therefore has to be smart enough to apply, in real time, the calibration law and radiometric corrections in a varying atmosphere. A complete measurement system was accordingly studied and developed [1] with low-cost infrared cameras available on the market. In the system developed, the infrared camera is coupled with other sensors to feed simplified radiative models running, in real time, on the GPU of a small PC. The whole measurement system was deployed on the "Musmeci" bridge located in Potenza (Italy). No traffic interruption was required during the mounting of the measurement system. The infrared camera was fixed on top of a mast at 6 m elevation above the surface of the bridge deck. A small weather station was added on the same mast, 1 m below the camera.
A GPS antenna was also fixed at the base of the mast, at the same elevation as the bridge deck surface. The trial took place over 4 days, with the system left in stand-alone acquisition mode for 3 of them. Thanks to the software developed and the small computer hardware used, thermal images were acquired at a frame rate of 0.1 Hz by averaging 50 thermal images, while the original camera frame rate remained fixed at 5 Hz. Each hour, a thermal image sequence was stored on the internal hard drive; data could also be retrieved, on demand, over a wireless connection using a tablet PC. In the second part of this work, the thermal image sequences were analysed. Two analysis approaches were studied: one based on the Fast Fourier Transform [2] and one based on Principal Component Analysis [3-4]. The results obtained show that the inner structure of the deck was identified, even though the thermal images were affected by the bridge being open to traffic for the whole duration of the experiment. ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663. References: [1] Dumoulin J. and Averty R., "Development of an infrared system coupled with a weather station for real time atmospheric corrections using GPU computing: Application to bridge monitoring", QIRT 2012, Naples, Italy, June 2012. [2] Cooley J.W., Tukey J.W., "An algorithm for the machine calculation of complex Fourier series", Mathematics of Computation, vol. 19, no. 90, 1965, pp. 297-301. [3] Rajic N., "Principal component thermography for flaw contrast enhancement and flaw depth characterization in composite structures", Composite Structures, vol. 58, pp. 521-528, 2002. [4] Marinetti S., Grinzato E., Bison P. G., Bozzi E., Chimenti M., Pieri G. and Salvetti O., "Statistical analysis of IR thermographic sequences by PCA", Infrared Physics & Technology, vol. 46, pp. 85-91, 2004.
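
    The PCT step used in the second part of the analysis can be sketched as follows, following the general recipe of the Rajic reference cited above: flatten the thermal image sequence to a frames-by-pixels matrix, subtract the temporal mean, and take the SVD, so that the leading spatial components concentrate structured thermal contrast. The toy cooling curves are invented for illustration and are not the bridge data.

```python
import numpy as np

# Principal Component Thermography (PCT) sketch on a synthetic sequence.

def pct(sequence):
    """sequence: (n_frames, h, w) array -> (singular values, spatial components)."""
    n, h, w = sequence.shape
    flat = sequence.reshape(n, h * w)
    flat = flat - flat.mean(axis=0)                  # remove per-pixel temporal mean
    _, s, vt = np.linalg.svd(flat, full_matrices=False)
    return s, vt.reshape(n, h, w)                    # components viewable as images

# Toy sequence: a buried "defect" patch cooling at a different rate
t = np.linspace(0.1, 1.0, 20)
base = 1.0 / np.sqrt(t)                              # background cooling curve
defect = 1.0 / t**0.3                                # slower-cooling patch
seq = base[:, None, None] * np.ones((20, 16, 16))
seq[:, 6:10, 6:10] = defect[:, None, None]

s, comps = pct(seq)
print(s[0] > 10 * s[2])  # True: two temporal behaviours -> two dominant components
```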

  16. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell; Nettles, Mindy

    2015-01-01

    The Additive Manufacturing Infrared Inspection Task started the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques. This project will benefit additive manufacturing by providing real-time inspection of internal geometry that is not currently possible, and by reducing the time and cost of additively manufactured parts through automated real-time dimensional inspections that eliminate post-production inspection.

  17. Calibration procedures of the Tore-Supra infrared endoscopes

    NASA Astrophysics Data System (ADS)

    Desgranges, C.; Jouve, M.; Balorin, C.; Reichle, R.; Firdaouss, M.; Lipa, M.; Chantant, M.; Gardarein, J. L.; Saille, A.; Loarer, T.

    2018-01-01

    Five endoscopes equipped with infrared cameras working in the mid-infrared range (3-5 μm) are installed on the controlled thermonuclear fusion research device Tore-Supra. These endoscopes monitor the surface temperature of the plasma facing components to prevent their overheating. The signals delivered by the infrared cameras through the endoscopes are analysed and used, on the one hand, in a real-time feedback control loop acting on the plasma heating systems to decrease the surface temperatures of the plasma facing components when necessary, and on the other hand for physics studies such as the determination of the incoming heat flux. To fulfil these two roles, very accurate knowledge of the absolute surface temperatures is mandatory; consequently, the infrared endoscopes must be calibrated through a very careful procedure. This means determining their transmission coefficients, which is a delicate operation. Methods to calibrate the infrared endoscopes during the shutdown period of the Tore-Supra machine are presented. As these do not allow the possible evolution of the transmittances during operation to be determined, an in-situ method is also presented. It permits validation of the calibration performed in the laboratory as well as monitoring of its evolution during machine operation. This is made possible by using the endoscope shutter and a dedicated plasma scenario developed to heat it. Possible improvements of this method are briefly discussed.

  18. Optimal design of an earth observation optical system with dual spectral and high resolution

    NASA Astrophysics Data System (ADS)

    Yan, Pei-pei; Jiang, Kai; Liu, Kai; Duan, Jing; Shan, Qiusha

    2017-02-01

    With the increasing demand for high-resolution remote sensing images by military and civilian users, countries around the world are optimistic about the prospects of higher-resolution remote sensing imagery, and designing a visible/infrared integrated optical system has important value in Earth observation. Because a visible-light system cannot identify camouflage or operate at night, the visible camera should be combined with an infrared camera. An Earth observation optical system with dual spectral bands and high resolution is designed. This paper mainly investigates the integrated design of the visible and infrared optical system, which makes the system lighter and smaller and achieves two uses with one satellite. The working waveband of the system covers the visible and the mid-infrared (3-5 um). Clear imaging in both wavebands is achieved with a dispersive RC system. The focal length of the visible system is 3056 mm with F/# 10.91, and the focal length of the mid-infrared system is 1120 mm with F/# 4. In order to suppress mid-infrared thermal radiation and stray light, a re-imaging design is adopted and the narcissus phenomenon is analyzed. The system is structurally simple, and the requirements on the Modulation Transfer Function (MTF), spot size, energy concentration, distortion, etc. are all satisfied.
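
    For a sense of scale, the resolution implied by such focal lengths can be estimated with the standard ground-sample-distance relation GSD = pixel pitch × altitude / focal length. The pixel pitches and orbit altitude below are assumptions for illustration, not values from the paper; only the two focal lengths come from the abstract.

```python
# Ground sample distance (GSD) sketch for a dual-band Earth observation camera.

def gsd_m(pixel_pitch_m, altitude_m, focal_length_m):
    """GSD = pixel pitch * altitude / focal length (small-angle approximation)."""
    return pixel_pitch_m * altitude_m / focal_length_m

altitude = 500e3                                    # assumed 500 km orbit
vis = gsd_m(7e-6, altitude, 3.056)                  # assumed 7 um pitch, f = 3056 mm
mwir = gsd_m(15e-6, altitude, 1.120)                # assumed 15 um pitch, f = 1120 mm
print(round(vis, 2), round(mwir, 2))                # 1.15 6.7
```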

  19. Monitoring machining conditions by infrared images

    NASA Astrophysics Data System (ADS)

    Borelli, Joao E.; Gonzaga Trabasso, Luis; Gonzaga, Adilson; Coelho, Reginaldo T.

    2001-03-01

    During the machining process, knowledge of the temperature is the most important factor in tool analysis. It allows control of the main factors that influence tool use, lifetime and wear. The temperature in the contact area between the workpiece and the tool results from the material removal of the cutting operation, and it is difficult to obtain because the tool and the workpiece are in motion. One way to measure the temperature in this situation is to detect the infrared radiation. This work presents a new methodology for diagnosis and monitoring of machining processes using infrared images. The infrared image provides a map, in gray tones, of the elements in the process: tool, workpiece and chips. Each gray tone in the image corresponds to a certain temperature for each of those materials, and the relationship between gray tones and temperature is obtained by prior calibration of the infrared camera. The system developed in this work uses an infrared camera, a frame grabber board and software composed of three modules. The first module performs image acquisition and processing. The second module extracts image features and assembles the feature vector. Finally, the third module uses fuzzy logic to evaluate the feature vector and outputs a diagnosis of the tool state.
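
    The gray-tone-to-temperature mapping obtained by prior camera calibration can be sketched as a simple interpolation over a measured calibration curve. The calibration points below are invented for illustration; a real curve would come from blackbody measurements with the specific camera.

```python
import numpy as np

# Sketch: map camera gray levels to temperatures via a calibration curve.

gray_cal = np.array([30, 80, 140, 200, 250], float)     # measured gray levels
temp_cal = np.array([200, 350, 520, 700, 850], float)   # corresponding temps (deg C)

def gray_to_temp(gray):
    """Piecewise-linear interpolation over the calibration points."""
    return np.interp(gray, gray_cal, temp_cal)

# A gray level of 110 lies midway between the 80 and 140 calibration points
print(float(gray_to_temp(110)))  # 435.0
```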

  20. Single Pixel Black Phosphorus Photodetector for Near-Infrared Imaging.

    PubMed

    Miao, Jinshui; Song, Bo; Xu, Zhihao; Cai, Le; Zhang, Suoming; Dong, Lixin; Wang, Chuan

    2018-01-01

    Infrared imaging systems have a wide range of military and civil applications, and 2D nanomaterials have recently emerged as potential sensing materials that may outperform conventional ones such as HgCdTe, InGaAs, and InSb. As an example, 2D black phosphorus (BP) thin film has a thickness-dependent direct bandgap with low shot noise and noncryogenic operation for visible to mid-infrared photodetection. In this paper, the use of a single-pixel photodetector made with few-layer BP thin film for near-infrared imaging applications is demonstrated. The imaging is achieved by combining the photodetector with a digital micromirror device to encode and subsequently reconstruct the image based on a compressive sensing algorithm. Stationary images of a near-infrared laser spot (λ = 830 nm) with up to 64 × 64 pixels are captured using this single-pixel BP camera with 2000 measurements, only half the total number of pixels. The imaging platform demonstrated in this work circumvents the grand challenge of scalable BP material growth for photodetector array fabrication and shows the efficacy of utilizing the outstanding performance of the BP photodetector for future high-speed infrared camera applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
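
    The measurement model behind such a single-pixel camera can be sketched as follows: each DMD pattern yields one detector reading, the inner product of the pattern with the scene. For brevity this sketch uses as many patterns as pixels and plain least squares, rather than the compressive-sensing reconstruction (fewer measurements plus a sparsity prior) used in the paper; the toy scene and pattern choice are assumptions.

```python
import numpy as np

# Single-pixel imaging sketch: y_i = <pattern_i, image>, then solve y = A x.

rng = np.random.default_rng(1)
h = w = 8
n = h * w
image = np.zeros((h, w))
image[2:6, 3:5] = 1.0                                # toy "laser spot" scene

A = rng.integers(0, 2, size=(n, n)).astype(float)    # binary DMD on/off patterns
y = A @ image.ravel()                                # simulated detector readings
recovered = np.linalg.lstsq(A, y, rcond=None)[0].reshape(h, w)
print(round(float(np.abs(recovered - image).max()), 6))  # ~0
```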
