Improved calibration-based non-uniformity correction method for uncooled infrared camera
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao
2017-08-01
With the latest improvements in microbolometer focal plane arrays (FPAs), uncooled infrared (IR) cameras are becoming the most widely used devices in thermography, especially as handheld devices. However, changing ambient conditions and the non-uniform response of the sensors make it difficult to correct the non-uniformity of an uncooled infrared camera. In this paper, a novel model for calibration-based non-uniformity correction (NUC) is proposed, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera. The model introduces the FPA temperature, together with the microbolometer responses measured at different ambient temperatures, to calculate the correction parameters. These parameters are worked out from calibration measurements against a uniform blackbody under controlled ambient conditions; once determined, they are used to correct the non-uniformity of the infrared camera in real time. The paper details the compensation procedure and the performance of the proposed calibration-based NUC method, which was evaluated on realistic IR images from a 384x288-pixel uncooled long-wave infrared (LWIR) camera operated under changing ambient conditions. The results show that the method removes the influence of the changing ambient conditions and ensures stable camera performance.
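As an illustration only (not the authors' exact model), the sketch below assumes each pixel's raw response is linear in target radiance L and FPA temperature T_fpa, y = a·L + b·T_fpa + c, fits the per-pixel coefficients from blackbody calibration frames, and inverts the fit for real-time correction; the function names and synthetic data are hypothetical.

```python
import numpy as np

def fit_nuc(frames, radiances, fpa_temps):
    # frames: (K, H, W) raw blackbody frames; radiances, fpa_temps: shape (K,)
    K, H, W = frames.shape
    A = np.column_stack([radiances, fpa_temps, np.ones(K)])  # (K, 3) design matrix
    Y = frames.reshape(K, -1)                                # (K, H*W) pixel responses
    coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)           # per-pixel a, b, c
    return coeffs.reshape(3, H, W)

def correct(frame, fpa_temp, coeffs):
    a, b, c = coeffs
    return (frame - b * fpa_temp - c) / a                    # per-pixel radiance estimate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = W = 8
    a_t = 1.0 + 0.05 * rng.standard_normal((H, W))           # synthetic per-pixel gains
    b_t = 0.2 * rng.random((H, W))                           # synthetic FPA-temperature terms
    c_t = rng.random((H, W))                                 # synthetic offsets
    rads  = np.array([10.0, 20.0, 30.0, 40.0, 10.0, 40.0])   # blackbody radiances
    temps = np.array([15.0, 15.0, 15.0, 15.0, 35.0, 35.0])   # FPA temperatures
    frames = a_t * rads[:, None, None] + b_t * temps[:, None, None] + c_t
    co = fit_nuc(frames, rads, temps)
    print(np.allclose(correct(frames[0], temps[0], co), rads[0]))  # -> True
```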
Space-based infrared sensors of space target imaging effect analysis
NASA Astrophysics Data System (ADS)
Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang
2018-02-01
Target identification is one of the core problems of a ballistic missile defense system, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a point-source imaging model for space-based infrared sensors observing ballistic targets against the atmospheric background; it then simulates the infrared imaging of such ballistic targets from two aspects, the space-based sensor camera parameters and the target characteristics, and analyzes the effects of camera line-of-sight jitter, camera system noise, and different wavebands on the imaged target.
A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.
Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi
2016-08-30
This paper proposes a novel infrared camera array guidance system with the capability to track and provide the real-time position and speed of a fixed-wing unmanned aerial vehicle (UAV) during a landing process. The system mainly includes three novel parts: (1) a cooperative long-range optical imaging module based on an infrared camera array and near-infrared laser lamps; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with a fixed-wing aircraft demonstrate that our infrared camera array system has the unique ability to guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for automatic, accurate UAV landing in Global Positioning System (GPS)-denied environments.
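The abstract does not give the 3D tracking details, so the following is only a generic sketch of how a laser-marker image point seen by two calibrated cameras of such an array could be converted to a 3D position by linear (DLT) triangulation; the projection matrices P1 and P2 and the toy numbers are assumptions, not the authors' calibration.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    # Linear (DLT) triangulation of one image point seen by two calibrated cameras.
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                       # marker position in world coordinates

if __name__ == "__main__":
    # Toy setup: two cameras with the same intrinsics, one metre apart along x.
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
    X_true = np.array([0.3, -0.1, 5.0, 1.0])  # homogeneous world point
    project = lambda P: (P @ X_true)[:2] / (P @ X_true)[2]
    print(triangulate(P1, P2, project(P1), project(P2)))  # ~ [0.3, -0.1, 5.0]
```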
Thermal-depth matching in dynamic scene based on affine projection and feature registration
NASA Astrophysics Data System (ADS)
Wang, Hongyu; Jia, Tong; Wu, Chengdong; Li, Yongqiang
2018-03-01
This paper studies the construction of a 3D temperature distribution reconstruction system based on depth and thermal infrared information. A traditional calibration method cannot be used directly because depth and thermal infrared cameras are not sensitive to a color calibration board; therefore, a calibration board suited to both cameras is designed to complete their joint calibration. A local feature descriptor for thermal and depth images is also proposed, and a belief propagation matching algorithm is investigated based on spatial affine transformation matching and local feature matching. The 3D temperature distribution model is built by matching the 3D point cloud with the 2D thermal infrared information. Experimental results show that the method can accurately construct the 3D temperature distribution model and is highly robust.
Design of an infrared camera based aircraft detection system for laser guide star installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, H.; Macintosh, B.
1996-03-05
There have been incidents in which the irradiance resulting from laser guide stars has temporarily blinded pilots or passengers of aircraft. An aircraft detection system based on passive near-infrared cameras (instead of active radar) is described in this report.
Coherent infrared imaging camera (CIRIC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.
1995-07-01
New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.
A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer.
Shen, Bailey Y; Mukai, Shizuo
2017-01-01
Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient.
ARNICA, the Arcetri near-infrared camera: Astronomical performance assessment.
NASA Astrophysics Data System (ADS)
Hunt, L. K.; Lisi, F.; Testi, L.; Baffa, C.; Borelli, S.; Maiolino, R.; Moriondo, G.; Stanga, R. M.
1996-01-01
The Arcetri near-infrared camera ARNICA was built as a users' instrument for the Infrared Telescope at Gornergrat (TIRGO), and is based on a 256x256 NICMOS 3 detector. In this paper, we discuss ARNICA's optical and astronomical performance at the TIRGO and at the William Herschel Telescope on La Palma. Optical performance is evaluated in terms of plate scale, distortion, point spread function, and ghosting. Astronomical performance is characterized by camera efficiency, sensitivity, and spatial uniformity of the photometry.
NASA Technical Reports Server (NTRS)
Gunapala, S.; Bandara, S. V.; Liu, J. K.; Hong, W.; Sundaram, M.; Maker, P. D.; Muller, R. E.
1997-01-01
In this paper, we discuss the development of a very sensitive long-wavelength infrared (LWIR) camera based on a GaAs/AlGaAs QWIP focal plane array (FPA) and its performance in terms of quantum efficiency, NEΔT, uniformity, and operability.
InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications
NASA Technical Reports Server (NTRS)
Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon
1996-01-01
In 1994, Inframetrics introduced the InfraCAM(TM), a high resolution hand-held thermal imager. As the world's smallest, lightest and lowest power PtSi based infrared camera, the InfraCAM is ideal for a wide range of industrial, non-destructive testing, surveillance and scientific applications. In addition to numerous commercial applications, the light weight and low power consumption of the InfraCAM make it extremely valuable for adaptation to spaceborne applications. Consequently, the InfraCAM has been selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) spaceborne experiment. In this experiment, a solid fuel is ignited in a low gravity environment. The combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground-based testing in preparation for space flight in the fall of 1995.
NASA Astrophysics Data System (ADS)
Schimert, Thomas R.; Ratcliff, David D.; Brady, John F., III; Ropson, Steven J.; Gooch, Roland W.; Ritchey, Bobbi; McCardel, P.; Rachels, K.; Wand, Marty; Weinstein, M.; Wynn, John
1999-07-01
Low power and low cost are primary requirements for an imaging infrared camera used in unattended ground sensor (UGS) arrays. In this paper, an amorphous silicon (a-Si) microbolometer-based uncooled infrared camera technology offering a low cost, low power solution to infrared surveillance for UGS applications is presented. A 15 × 31 micro infrared camera (MIRC) has been demonstrated which exhibits an f/1 noise equivalent temperature difference of approximately 67 mK. This sensitivity has been achieved without the use of a thermoelectric cooler for array temperature stabilization, thereby significantly reducing the power requirements. The chopperless camera is capable of operating from snapshot mode (1 Hz) to video frame rate (30 Hz). Power consumption of 0.4 W without display and 0.75 W with display has been demonstrated at 30 Hz operation. The demonstrated 15 × 31 camera has a 35 mm camera form factor and employs a low cost f/1 singlet optic and LED display, as well as low cost vacuum packaging. A larger 120 × 160 version of the MIRC, which has a substantially smaller form factor and incorporates all the low cost, low power features demonstrated in the 15 × 31 prototype, is also in development and is discussed. The paper presents the a-Si microbolometer technology for the MIRC, together with its key features and performance parameters.
Infrared detectors and test technology of cryogenic camera
NASA Astrophysics Data System (ADS)
Yang, Xiaole; Liu, Xingxin; Xing, Mailing; Ling, Long
2016-10-01
Cryogenic cameras, which are widely used in deep-space detection, cool the optical system and support structure with cryogenic refrigeration technology, thereby improving sensitivity. The characteristics and design considerations of the infrared detector are discussed in combination with the camera's characteristics. Cryogenic-background test systems for both the chip and the detector assembly are established: the chip test system is based on a variable-temperature, multilayer Dewar, and the assembly test system is based on a target and background simulator in a thermal vacuum environment. The core of the test is establishing the cryogenic background. Finally, the non-uniformity, dead pixel ratio, and noise of the test results are given. The establishment of the test systems supports the design and analysis of infrared systems.
Performance and Calibration of H2RG Detectors and SIDECAR ASICs for the RATIR Camera
NASA Technical Reports Server (NTRS)
Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Klein, Christopher R.; Butler, Nathaniel R.; Bloom, Josh; de Diego, José A.; Simón Farah, Alejandro D.; Gehrels, Neil A.; Georgiev, Leonid;
2012-01-01
The Reionization And Transient Infra-Red (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) follow-up and will provide simultaneous optical and infrared photometric capabilities. The infrared portion of this camera incorporates two Teledyne HgCdTe HAWAII-2RG detectors, controlled by Teledyne's SIDECAR ASICs. While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 interface card and IDE development environment. Together, this setup comprises Teledyne's Development Kit, which is a bundled solution that can be efficiently integrated into future ground-based systems. In this presentation, we characterize the system's read noise, dark current, and conversion gain.
Space imaging infrared optical guidance for autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu
2008-08-01
We have developed the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, based on an uncooled infrared camera and a focusing technique to detect objects to be avoided and to set the drive path. For this purpose we built a servomotor drive system to control the focus of the infrared camera lens. To determine the best focus position we use auto-focus image processing based on the Daubechies wavelet transform with four terms, and the determined best focus position is then converted to the distance of the object. We built an aluminum-frame ground vehicle, 900 mm long and 800 mm wide, to mount the auto-focus infrared unit. The vehicle has an Ackermann front steering system and a rear motor drive system. To confirm the guidance ability of the system, we carried out experiments on the ability of the infrared auto-focus unit to detect an actual car on the road and a roadside wall. As a result, the auto-focus image processing based on the Daubechies wavelet transform detects the best-focus image clearly and gives the distance of the object from the infrared camera unit.
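A minimal sketch of a wavelet focus measure of the kind described, assuming the db4 wavelet from the PyWavelets package (my reading of "four terms"); the mapping from best focus position to object distance is a hypothetical calibration curve, not the authors' servomotor calibration.

```python
import numpy as np
import pywt  # PyWavelets

def focus_measure(image):
    # Energy in the detail sub-bands of a single-level db4 decomposition:
    # a well-focused image concentrates more energy in the high-frequency details.
    _, (cH, cV, cD) = pywt.dwt2(np.asarray(image, dtype=float), 'db4')
    return float((cH**2).sum() + (cV**2).sum() + (cD**2).sum())

def estimate_object_distance(frames, lens_positions, position_to_distance):
    # frames: IR images captured while the servo sweeps the lens focus positions
    scores = [focus_measure(f) for f in frames]
    best_position = lens_positions[int(np.argmax(scores))]
    return position_to_distance(best_position)  # calibration curve (e.g. a lookup table)
```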
Development of plenoptic infrared camera using low dimensional material based photodetectors
NASA Astrophysics Data System (ADS)
Chen, Liangliang
Infrared (IR) sensors have extended imaging from the submicron visible spectrum to wavelengths of tens of microns and are widely used in military and civilian applications. Conventional IR cameras based on bulk semiconductor materials suffer from low frame rate, low resolution, temperature dependence, and high cost, while nanotechnology based on low-dimensional materials such as the carbon nanotube (CNT) has made considerable progress in research and industry. The unique properties of CNTs motivate the investigation of CNT-based IR photodetectors and imaging systems to resolve the sensitivity, speed, and cooling difficulties of state-of-the-art IR imaging. Reliability and stability are critical to the transition from nanoscience to nanoengineering, especially for infrared sensing: not only for a fundamental understanding of the processes induced by the CNT photoresponse, but also for the development of a novel infrared-sensitive material with unique optical and electrical features. In the proposed research, a sandwich-structured sensor was fabricated between two polymer layers: a polyimide substrate isolates the sensor from background noise, and a parylene top layer blocks out humidity. The fabrication process was optimized by real-time, electrically monitored dielectrophoresis and multiple annealing steps to improve fabrication yield and sensor performance. The nanoscale infrared photodetector was characterized with digital microscopy and a precision linear stage. In addition, a low-noise, high-gain readout system was designed together with the CNT photodetector to realize a nano-sensor IR camera. To exploit more of the infrared light field, compressive sensing is applied to light-field sampling, 3D imaging, and video sensing: the redundancy of the full light field, including the angular images of the light field, the binocular images for the 3D camera, and the temporal information of the video streams, is extracted and expressed in a compressive framework, and computational algorithms then reconstruct images beyond 2D static information. Super-resolution signal processing is then used to enhance the spatial resolution of the images. The whole camera system provides richly detailed content for infrared spectrum sensing.
Land-based infrared imagery for marine mammal detection
NASA Astrophysics Data System (ADS)
Graber, Joseph; Thomson, Jim; Polagye, Brian; Jessup, Andrew
2011-09-01
A land-based infrared (IR) camera is used to detect endangered Southern Resident killer whales in Puget Sound, Washington, USA. The observations are motivated by a proposed tidal energy pilot project, which will be required to monitor for environmental effects. Potential monitoring methods also include visual observation, passive acoustics, and active acoustics. The effectiveness of observations in the infrared spectrum is compared to observations in the visible spectrum to assess the viability of infrared imagery for cetacean detection and classification. Imagery was obtained at Lime Kiln Park, Washington from 7/6/10-7/9/10 using a FLIR Thermovision A40M infrared camera (7.5-14 μm, 37° HFOV, 320x240 pixels) under ideal atmospheric conditions (clear skies, calm seas, and wind speed 0-4 m/s). Whales were detected during both day (9 detections) and night (75 detections) at distances ranging from 42 to 162 m. The temperature contrast between dorsal fins and the sea surface ranged from 0.5 to 4.6 °C. Differences in emissivity from sea surface to dorsal fin are shown to aid detection at high incidence angles (near grazing). A comparison to theory is presented, and observed deviations from theory are investigated. A guide for infrared camera selection based on site geometry and desired target size is presented, with specific considerations regarding marine mammal detection. Atmospheric conditions required to use visible and infrared cameras for marine mammal detection are established and compared with 2008 meteorological data for the proposed tidal energy site. Using conservative assumptions, infrared observations are predicted to provide a 74% increase in hours of possible detection, compared with visual observations.
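As a rough illustration of the geometry behind such a camera-selection guide, the snippet below estimates how many pixels a target of a given size subtends at a given range from the horizontal field of view and pixel count; the 0.5 m fin size is an assumed figure, and this is not the paper's actual guide.

```python
import math

def pixels_on_target(target_size_m, range_m, hfov_deg, pixels_across):
    # Small-angle IFOV (angle subtended by one pixel) times range gives the
    # ground sample distance; the target size divided by that is the pixel count.
    ifov_rad = math.radians(hfov_deg) / pixels_across
    ground_sample_m = range_m * ifov_rad
    return target_size_m / ground_sample_m

# Parameters similar to the study's camera: 37 deg HFOV, 320 pixels across.
# An assumed ~0.5 m dorsal fin at the farthest reported detection range of 162 m:
print(round(pixels_on_target(0.5, 162.0, 37.0, 320), 1))  # ~1.5 pixels
```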
Research on camera on orbit radial calibration based on black body and infrared calibration stars
NASA Astrophysics Data System (ADS)
Wang, YuDu; Su, XiaoFeng; Zhang, WanYing; Chen, FanSheng
2018-05-01
Affected by the launch process and the space environment, the response of a space camera is inevitably attenuated, so a spaceborne radiometric calibration is necessary. In this paper, a calibration method based on accurate infrared standard stars is proposed to increase the precision of infrared radiation measurement. Since stars can be considered point targets, we use them as the radiometric calibration source and establish a Taylor-expansion method and an energy extrapolation model based on the WISE and 2MASS catalogs, and then update the calibration results obtained from the blackbody. Finally, the calibration mechanism is designed and the design is verified by an on-orbit test. The experimental results show that the irradiance extrapolation error is about 3% and the accuracy of the calibration method is about 10%, which satisfies the requirements of on-orbit calibration.
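The catalogue energy extrapolation itself is not reproduced here; the sketch below only illustrates, with invented numbers, the final step of fitting a linear counts-versus-irradiance response to a handful of standard stars and inverting it for an unknown target. All values and function names are assumptions.

```python
import numpy as np

def fit_star_response(star_counts, star_irradiances):
    # star_counts: background-subtracted integrated counts (DN) for each star
    # star_irradiances: catalogue-extrapolated in-band irradiance for each star
    gain, offset = np.polyfit(star_irradiances, star_counts, 1)
    return gain, offset

def counts_to_irradiance(counts, gain, offset):
    return (counts - offset) / gain

# Hypothetical calibration with four standard stars (illustrative numbers only)
irr = np.array([2.1e-14, 5.6e-14, 1.3e-13, 4.0e-13])   # in-band irradiance, W/m^2
dn  = np.array([1.1e3, 2.8e3, 6.4e3, 1.95e4])          # measured counts
g, o = fit_star_response(dn, irr)
print(counts_to_irradiance(9.0e3, g, o))                # irradiance of an unknown target
```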
High-frame-rate infrared and visible cameras for test range instrumentation
NASA Astrophysics Data System (ADS)
Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.
1995-09-01
Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.
Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras
NASA Technical Reports Server (NTRS)
Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.
2011-01-01
The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 μm) or long-wave infrared (LWIR) radiation (8-12 μm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24 hour water and 12 hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.
ERIC Educational Resources Information Center
Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol
2011-01-01
The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)
NASA Technical Reports Server (NTRS)
Tueller, Jack (Technical Monitor); Fazio, Giovanni G.; Tolls, Volker
2004-01-01
The purpose of this study was to investigate the feasibility of developing a daytime star tracker for ULDB flights using a commercially available off-the-shelf infrared array camera. This report describes the system used for ground-based tests, the observations, the test results, and gives recommendations for continued development.
The NASA - Arc 10/20 micron camera
NASA Technical Reports Server (NTRS)
Roellig, T. L.; Cooper, R.; Deutsch, L. K.; Mccreight, C.; Mckelvey, M.; Pendleton, Y. J.; Witteborn, F. C.; Yuen, L.; Mcmahon, T.; Werner, M. W.
1994-01-01
A new infrared camera (AIR Camera) has been developed at NASA - Ames Research Center for observations from ground-based telescopes. The heart of the camera is a Hughes 58 x 62 pixel Arsenic-doped Silicon detector array that has the spectral sensitivity range to allow observations in both the 10 and 20 micron atmospheric windows.
Don't get burned: thermal monitoring of vessel sealing using a miniature infrared camera
NASA Astrophysics Data System (ADS)
Lin, Shan; Fichera, Loris; Fulton, Mitchell J.; Webster, Robert J.
2017-03-01
Miniature infrared cameras have recently come to market in a form factor that facilitates packaging in endoscopic or other minimally invasive surgical instruments. If absolute temperature measurements can be made with these cameras, they may be useful for non-contact monitoring of electrocautery-based vessel sealing, or other thermal surgical processes like thermal ablation of tumors. As a first step in evaluating the feasibility of optical medical thermometry with these new cameras, in this paper we explore how well thermal measurements can be made with them. These cameras measure the raw flux of incoming IR radiation, and we perform a calibration procedure to map their readings to absolute temperature values in the range between 40 and 150 °C. Furthermore, we propose and validate a method to estimate the spatial extent of heat spread created by a cautery tool based on the thermal images.
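A hedged sketch of the kind of mapping described above: a low-order polynomial is fitted from raw readings taken against a reference at known temperatures (40-150 °C) and applied per pixel. The calibration points, polynomial order, and function names are invented for illustration, not the authors' procedure.

```python
import numpy as np

def fit_temperature_calibration(raw_readings, reference_temps_c, order=2):
    # raw_readings: mean raw flux values of the reference target at each set point
    # reference_temps_c: the corresponding known reference temperatures (40-150 C)
    return np.polyfit(raw_readings, reference_temps_c, order)

def to_temperature(raw_frame, coeffs):
    # Apply the fitted polynomial per pixel to obtain absolute temperature.
    return np.polyval(coeffs, raw_frame)

# Hypothetical calibration points (raw flux units vs. reference temperature)
raw = np.array([2100.0, 2600.0, 3200.0, 3900.0, 4700.0, 5600.0])
ref = np.array([40.0, 60.0, 80.0, 100.0, 125.0, 150.0])
coeffs = fit_temperature_calibration(raw, ref)
print(np.round(to_temperature(np.array([[2500.0, 5000.0]]), coeffs), 1))
```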
Yang, Hualei; Yang, Xi; Heskel, Mary; Sun, Shucun; Tang, Jianwu
2017-04-28
Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the infrared camera based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). We found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
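For reference, camera-NDVI reduces to the standard NDVI formula applied to the co-registered NIR and visible (red) bands recorded by such a camera; the band variable names and the region-of-interest averaging shown are assumptions.

```python
import numpy as np

def camera_ndvi(nir, red, eps=1e-9):
    # nir, red: co-registered near-infrared and visible (red) band images
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)   # NDVI in [-1, 1]

# A daily canopy value could then be the mean over a region of interest, e.g.:
# daily_ndvi = camera_ndvi(nir_img, red_img)[roi_mask].mean()
```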
Early forest fire detection using principal component analysis of infrared video
NASA Astrophysics Data System (ADS)
Saghri, John A.; Radjabi, Ryan; Jacobs, John T.
2011-09-01
A land-based early forest fire detection scheme which exploits the infrared (IR) temporal signature of a fire plume is described. Unlike common land-based and/or satellite-based techniques, which rely on measurement and discrimination of the fire plume directly from its infrared and/or visible reflectance imagery, this scheme is based on exploitation of the fire plume's temporal signature, i.e., temperature fluctuations over the observation period. The method is simple and relatively inexpensive to implement, and the false alarm rate is expected to be lower than that of existing methods. Land-based infrared (IR) cameras are installed in a step-stare-mode configuration in potential fire-prone areas. The sequence of IR video frames from each camera is digitally processed to determine if there is a fire within the camera's field of view (FOV). The process involves applying a principal component transformation (PCT) to each non-overlapping sequence of video frames from the camera to produce a corresponding sequence of temporally uncorrelated principal component (PC) images. Since the pixels that form a fire plume exhibit statistically similar temporal variation (i.e., have a unique temporal signature), PCT conveniently renders the footprint/trace of the fire plume in low-order PC images. The PC image which best reveals the trace of the fire plume is then selected and spatially filtered via simple threshold and median filter operations to remove background clutter, such as traces of tree branches moving in the wind.
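An illustrative sketch of the processing chain described: a principal component transform over a short frame sequence followed by thresholding and median filtering of a low-order component. It assumes SciPy for the median filter; the component index and z-score threshold are arbitrary choices, not the paper's values.

```python
import numpy as np
from scipy.ndimage import median_filter

def pct_fire_map(frames, component=1, z_thresh=3.0):
    # frames: (T, H, W) non-overlapping IR frame sequence from one step-stare dwell
    T, H, W = frames.shape
    X = frames.reshape(T, -1).astype(float)
    X -= X.mean(axis=0)                          # remove the static background per pixel
    # Principal component transform over time (rows: frames, columns: pixels)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    pc_image = Vt[component].reshape(H, W)       # spatial loading of a low-order component
    z = (pc_image - pc_image.mean()) / (pc_image.std() + 1e-12)
    candidate = np.abs(z) > z_thresh             # pixels with fire-like temporal variation
    return median_filter(candidate.astype(np.uint8), size=3)  # suppress clutter speckle
```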
Imaging of breast cancer with mid- and long-wave infrared camera.
Joro, R; Lääperi, A-L; Dastidar, P; Soimakallio, S; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Järvenpää, R
2008-01-01
In this novel study the breasts of 15 women with palpable breast cancer were preoperatively imaged with three technically different infrared (IR) cameras - microbolometer (MB), quantum well (QWIP) and photovoltaic (PV) - to compare their ability to differentiate breast cancer from normal tissue. The IR images were processed; the data for frequency analysis were collected from dynamic IR images by pixel-based analysis, and a selectively windowed regional analysis was carried out on each image, based on the angiogenesis and nitric oxide production of cancer tissue, which cause vasomotor and cardiogenic frequency differences compared to normal tissue. Our results show that the GaAs QWIP camera and the InSb PV camera demonstrate the frequency difference between normal and cancerous breast tissue, the PV camera more clearly. With selected image processing operations, more detailed frequency analyses could be applied to the suspicious area. The MB camera was not suitable for tissue differentiation, as the difference between noise and effective signal was unsatisfactory.
Variation in detection among passive infrared triggered-cameras used in wildlife research
Damm, Philip E.; Grand, James B.; Barnett, Steven W.
2010-01-01
Precise and accurate estimates of demographics such as age structure, productivity, and density are necessary in determining habitat and harvest management strategies for wildlife populations. Surveys using automated cameras are becoming an increasingly popular tool for estimating these parameters. However, most camera studies fail to incorporate detection probabilities, leading to parameter underestimation. The objective of this study was to determine the sources of heterogeneity in detection for trail cameras that incorporate a passive infrared (PIR) triggering system sensitive to heat and motion. Images were collected at four baited sites within the Conecuh National Forest, Alabama, using three cameras at each site operating continuously over the same seven-day period. Detection was estimated for four groups of animals based on taxonomic group and body size. Our hypotheses of detection considered variation among bait sites and cameras. The best model (w=0.99) estimated different rates of detection for each camera in addition to different detection rates for four animal groupings. Factors that explain this variability might include poor manufacturing tolerances, variation in PIR sensitivity, animal behavior, and species-specific infrared radiation. Population surveys using trail cameras with PIR systems must incorporate detection rates for individual cameras. Incorporating time-lapse triggering systems into survey designs should eliminate issues associated with PIR systems.
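As a toy illustration of the paper's point that detection must be estimated separately for each camera, the snippet below computes a simple detection rate with a normal-approximation confidence interval for three hypothetical co-located cameras viewing the same events; the counts are invented, and this is not the model-selection framework the authors actually used.

```python
import numpy as np

def detection_rate(detections, trials):
    # Simple per-camera detection probability with a normal-approximation 95% CI.
    p = detections / trials
    se = np.sqrt(p * (1.0 - p) / trials)
    return p, (p - 1.96 * se, p + 1.96 * se)

# Hypothetical counts: three co-located cameras viewing the same visitation events
events = {"camera_A": (41, 60), "camera_B": (55, 60), "camera_C": (28, 60)}
for cam, (d, n) in events.items():
    p, ci = detection_rate(d, n)
    print(cam, round(p, 2), tuple(round(c, 2) for c in ci))
```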
Robust Behavior Recognition in Intelligent Surveillance Environments.
Batchuluun, Ganbayar; Kim, Yeong Gon; Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung
2016-06-30
Intelligent surveillance systems have been studied by many researchers. These systems should operate in both daytime and nighttime, but objects are invisible in images captured by a visible light camera during the night. Therefore, near-infrared (NIR) cameras and thermal cameras (based on medium-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) light) have been considered as alternatives for nighttime use. Because the system must work during both daytime and nighttime, and because NIR cameras require an additional NIR illuminator that must cover a wide area over a great distance at night, a dual system of visible light and thermal cameras is used in our research, and we propose a new behavior recognition method for intelligent surveillance environments. Twelve datasets were compiled by collecting data in various environments, and they were used to obtain experimental results. The recognition accuracy of our method was found to be 97.6%, thereby confirming the ability of our method to outperform previous methods.
TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope
NASA Astrophysics Data System (ADS)
Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.
Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512×512 pixel InSb Aladdin III Quadrant focal plane array (FPA) having sensitivity to photons in the 1-5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on the calibration observations carried out during 2017 May 11-14 and 2017 October 7-31. After the preliminary characterization, the camera has been released to the Indian and Belgian astronomical community for science observations since 2017 May. The camera offers a field-of-view (FoV) of ~86.5″ × 86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands comparable to other 4 m class telescopes available world-wide. Another highlight of this camera is the observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λcen ~ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer-Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Park, Jin S.; Sarusi, Gabby; Lin, True-Lon; Liu, John K.; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Hoelter, Ted
1997-01-01
In this paper, we discuss the development of very sensitive, very long wavelength infrared GaAs/Al(x)Ga(1-x)As quantum well infrared photodetectors (QWIPs) based on a bound-to-quasi-bound intersubband transition, the fabrication of random reflectors for efficient light coupling, and the demonstration of a 15 μm cutoff 128 x 128 focal plane array imaging camera. Excellent imagery, with a noise equivalent differential temperature (NEΔT) of 30 mK, has been achieved.
SLR digital camera for forensic photography
NASA Astrophysics Data System (ADS)
Har, Donghwan; Son, Youngho; Lee, Sungwon
2004-06-01
Forensic photography, which was systematically established in the late 19th century by Alphonse Bertillon of France, has developed substantially over the past 100 years, and its development will accelerate further with advances in high technology, in particular digital technology. This paper reviews three studies to answer the question: can the SLR digital camera replace traditional silver halide ultraviolet and infrared photography? 1. Comparison of the relative ultraviolet and infrared sensitivity of the SLR digital camera to silver halide photography. 2. How much is ultraviolet or infrared sensitivity improved when the UV/IR cutoff filter built into the SLR digital camera is removed? 3. Comparison of the relative sensitivity of CCD and CMOS sensors for ultraviolet and infrared. The test results showed that the SLR digital camera has a very low sensitivity for ultraviolet and infrared; the cause was found to be the UV/IR cutoff filter mounted in front of the image sensor. Removing the UV/IR cutoff filter significantly improved the sensitivity for ultraviolet and infrared. Particularly for infrared, the sensitivity of the SLR digital camera was better than that of silver halide film. This shows the possibility of replacing silver halide ultraviolet and infrared photography with the SLR digital camera. Thus, the SLR digital camera seems to be useful for forensic photography, which deals with a lot of ultraviolet and infrared photographs.
Near-infrared transillumination photography of intraocular tumours.
Krohn, Jørgen; Ulltang, Erlend; Kjersem, Bård
2013-10-01
To present a technique for near-infrared transillumination imaging of intraocular tumours based on the modifications of a conventional digital slit lamp camera system. The Haag-Streit Photo-Slit Lamp BX 900 (Haag-Streit AG) was used for transillumination photography by gently pressing the tip of the background illumination cable against the surface of the patient's eye. Thus the light from the flash unit was transmitted into the eye, leading to improved illumination and image resolution. The modification for near-infrared photography was done by replacing the original camera with a Canon EOS 30D (Canon Inc) converted by Advanced Camera Services Ltd. In this camera, the infrared blocking filter was exchanged for a 720 nm long-pass filter, so that the near-infrared part of the spectrum was recorded by the sensor. The technique was applied in eight patients: three with anterior choroidal melanoma, three with ciliary body melanoma and two with ocular pigment alterations. The good diagnostic quality of the photographs made it possible to evaluate the exact location and extent of the lesions in relation to pigmented intraocular landmarks such as the ora serrata and ciliary body. The photographic procedure did not lead to any complications. We recommend near-infrared transillumination photography as a supplementary diagnostic tool for the evaluation and documentation of anteriorly located intraocular tumours.
Multi-spectral imaging with infrared sensitive organic light emitting diode
Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky
2014-01-01
Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxial grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589
ARNICA, the Arcetri Near-Infrared Camera
NASA Astrophysics Data System (ADS)
Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.
1996-04-01
ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December, 1992 to December 1993), and an observing run at the William Herschel Telescope, Canary Islands (December, 1993). System performance is defined in terms of efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources.
Research on a solid state-streak camera based on an electro-optic crystal
NASA Astrophysics Data System (ADS)
Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang
2006-06-01
With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, streak cameras are widely used to measure ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and measuring laser-induced discharges. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity versus position (or wavelength) information on the ultrafast process. Current streak cameras are based on a sweep electric pulse and an image-converting tube with a wavelength-sensitive photocathode covering the x-ray to near-infrared region, and are comparatively costly and complex. This paper describes the design and performance of a new type of streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera achieves time resolution by direct deflection of the photon beam using the electro-optic effect, and can replace current streak cameras from the visible to the near-infrared region. After computer-aided simulation, we designed a crystal streak camera with a potential time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits, a crystal with a larger electro-optic coefficient, for example LN (γ33 = 33.6×10^-12 m/V), and an optimized optical system may lead to a time resolution better than 1 ns.
Camera traps can be heard and seen by animals.
Meek, Paul D; Ballard, Guy-Anthony; Fleming, Peter J S; Schaefer, Michael; Williams, Warwick; Falzon, Greg
2014-01-01
Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps; in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured the ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing ranges (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine whether animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.
NASA Technical Reports Server (NTRS)
Gazanik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Jenkins, Rusty; Yates, Rusty; Stephan, Ryan;
2005-01-01
In November 2004, NASA's Space Shuttle Program approved the development of the Extravehicular (EVA) Infrared (IR) Camera to test the application of infrared thermography to on-orbit reinforced carbon-carbon (RCC) damage detection. A multi-center team composed of members from NASA's Johnson Space Center (JSC), Langley Research Center (LaRC), and Goddard Space Flight Center (GSFC) was formed to develop the camera system and plan a flight test. The initial development schedule called for the delivery of the system in time to support STS-115 in late 2005. At the request of Shuttle Program managers and the flight crews, the team accelerated its schedule and delivered a certified EVA IR Camera system in time to support STS-114 in July 2005 as a contingency. The development of the camera system, led by LaRC, was based on the Commercial-Off-the-Shelf (COTS) FLIR S65 handheld infrared camera. An assessment of the S65 system with regard to space-flight operation was critical to the project. This paper discusses the space-flight assessment and describes the significant modifications required for EVA use by the astronaut crew. The on-orbit inspection technique will be demonstrated during the third EVA of STS-121 in September 2005 by imaging damaged RCC samples mounted in a box in the Shuttle's cargo bay.
Infrared cameras are potential traceable "fixed points" for future thermometry studies.
Yap Kannan, R; Keresztes, K; Hussain, S; Coats, T J; Bown, M J
2015-01-01
The National Physical Laboratory (NPL) requires that "fixed points" whose temperatures have been established by the International Temperature Scale of 1990 (ITS-90) be used for device calibration. In practice, "near" blackbody radiators together with the standard platinum resistance thermometer are accepted as a standard. The aim of this study was to report the correlation and limits of agreement (LOA) of a thermal infrared camera and a non-contact infrared temporal thermometer against each other and against the "near" blackbody radiator. Temperature readings from an infrared thermography camera (FLIR T650sc) and a non-contact infrared temporal thermometer (Hubdic FS-700) were compared to a near blackbody (Hyperion R blackbody model 982) at 0.5 °C increments between 20-40 °C. At each increment, the blackbody cavity temperature was confirmed with the platinum resistance thermometer. Measurements were taken first with the thermal infrared camera and then with the infrared thermometer, with each device mounted in turn on a stand at a fixed distance of 20 cm and 5 cm from the blackbody aperture, respectively. The platinum thermometer under-estimated the blackbody temperature by 0.015 °C (95% LOA: -0.08 °C to 0.05 °C), in contrast to the thermal infrared camera and the infrared thermometer, which over-estimated the blackbody temperature by 0.16 °C (95% LOA: 0.03 °C to 0.28 °C) and 0.75 °C (95% LOA: -0.30 °C to 1.79 °C), respectively. The infrared thermometer over-estimated the thermal infrared camera measurements by 0.6 °C (95% LOA: -0.46 °C to 1.65 °C). In conclusion, the thermal infrared camera is a potential temperature reference "fixed point" that could substitute for mercury thermometers. However, further repeatability and reproducibility studies will be required with different models of thermal infrared cameras.
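The bias and 95% limits of agreement reported above follow the usual Bland-Altman calculation, sketched below with invented paired readings (the numbers are not from the study).

```python
import numpy as np

def bias_and_loa(device_temps, reference_temps):
    # Mean difference (bias) and 95% limits of agreement against the reference.
    diff = np.asarray(device_temps) - np.asarray(reference_temps)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired readings (device vs. blackbody, deg C)
device    = [20.2, 25.1, 30.3, 35.2, 40.1]
reference = [20.0, 25.0, 30.0, 35.0, 40.0]
print(bias_and_loa(device, reference))
```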
Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung
2017-07-08
A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
NASA Astrophysics Data System (ADS)
Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott
2003-09-01
A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product enabling easy, drop-in camera designs for quick time-to-market product introductions.
Investigation of the influence of spatial degrees of freedom on thermal infrared measurement
NASA Astrophysics Data System (ADS)
Fleuret, Julien R.; Yousefi, Bardia; Lei, Lei; Djupkep Dizeu, Frank Billy; Zhang, Hai; Sfarra, Stefano; Ouellet, Denis; Maldague, Xavier P. V.
2017-05-01
Long Wavelength Infrared (LWIR) cameras can provide a representation of a part of the light spectrum that is sensitive to temperature. These cameras, also named Thermal Infrared (TIR) cameras, are powerful tools to detect features that cannot be seen by other imaging technologies. For instance, they enable the detection of defects in materials, fever and anxiety in mammals, and many other features for numerous applications. However, the accuracy of thermal cameras can be affected by many parameters; the most critical involves the relative position of the camera with respect to the object of interest. Several models have been proposed in order to minimize the influence of some of these parameters, but they are mostly related to specific applications. Because such models are based on prior information related to context, their applicability to other contexts cannot be easily assessed. The few remaining models are mostly associated with a specific device. In this paper the authors study the influence of the camera position on the measurement accuracy. Modeling the position of the camera relative to the object of interest depends on many parameters. In order to propose a study that is as accurate as possible, the position of the camera is represented by a five-dimensional model. The aim of this study is to investigate and attempt to introduce a model that is as independent of the device as possible.
Lock-in thermography using a cellphone attachment infrared camera
NASA Astrophysics Data System (ADS)
Razani, Marjan; Parkhimchyk, Artur; Tabatabaei, Nima
2018-03-01
Lock-in thermography (LIT) is a thermal-wave-based, non-destructive testing technique which has been widely utilized in research settings for characterization and evaluation of biological and industrial materials. However, despite promising research outcomes, the widespread adoption of LIT in industry, and its commercialization, is hindered by the high cost of the infrared cameras used in LIT setups. In this paper, we report on the feasibility of using inexpensive cellphone attachment infrared cameras for performing LIT. While the cost of such cameras is over two orders of magnitude less than their research-grade counterparts, our experimental results on a block sample with subsurface defects and a tooth with early dental caries suggest that acceptable performance can be achieved through careful instrumentation and implementation of proper data acquisition and image processing steps. We anticipate that this study will pave the way for the development of low-cost thermography systems and their commercialization as inexpensive tools for non-destructive testing of industrial samples as well as affordable clinical devices for diagnostic imaging of biological tissues.
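As background, LIT typically recovers per-pixel amplitude and phase images by correlating the recorded thermal frame sequence with sine and cosine references at the modulation frequency. A minimal sketch of that standard lock-in demodulation (not the specific processing pipeline of the paper), assuming `frames` is an (N, H, W) stack sampled at `fs` with modulation frequency `f_mod`:

```python
import numpy as np

def lockin_demodulate(frames, fs, f_mod):
    """Per-pixel lock-in amplitude and phase from a thermal frame stack.

    frames : ndarray of shape (N, H, W), N frames sampled at fs [Hz]
    f_mod  : excitation (modulation) frequency [Hz]
    """
    n = frames.shape[0]
    t = np.arange(n) / fs
    ref_sin = np.sin(2 * np.pi * f_mod * t)
    ref_cos = np.cos(2 * np.pi * f_mod * t)
    # Correlate each pixel's time series with the two references
    s = np.tensordot(ref_sin, frames, axes=(0, 0)) * 2.0 / n   # in-phase component
    c = np.tensordot(ref_cos, frames, axes=(0, 0)) * 2.0 / n   # quadrature component
    amplitude = np.hypot(s, c)
    phase = np.arctan2(s, c)
    return amplitude, phase
```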
Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena
NASA Astrophysics Data System (ADS)
Pei Wong, Choun; Subramaniam, R.
2018-05-01
The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.
Seeing in a Different Light--Using an Infrared Camera to Teach Heat Transfer and Optical Phenomena
ERIC Educational Resources Information Center
Wong, Choun Pei; Subramaniam, R.
2018-01-01
The infrared camera is a useful tool in physics education to 'see' in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.
LIFTING THE VEIL OF DUST TO REVEAL THE SECRETS OF SPIRAL GALAXIES
NASA Technical Reports Server (NTRS)
2002-01-01
Astronomers have combined information from the NASA Hubble Space Telescope's visible- and infrared-light cameras to show the hearts of four spiral galaxies peppered with ancient populations of stars. The top row of pictures, taken by a ground-based telescope, represents complete views of each galaxy. The blue boxes outline the regions observed by the Hubble telescope. The bottom row represents composite pictures from Hubble's visible- and infrared-light cameras, the Wide Field and Planetary Camera 2 (WFPC2) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). Astronomers combined views from both cameras to obtain the true ages of the stars surrounding each galaxy's bulge. The Hubble telescope's sharper resolution allows astronomers to study the intricate structure of a galaxy's core. The galaxies are ordered by the size of their bulges. NGC 5838, an 'S0' galaxy, is dominated by a large bulge and has no visible spiral arms; NGC 7537, an 'Sbc' galaxy, has a small bulge and loosely wound spiral arms. Astronomers think that the structure of NGC 7537 is very similar to our Milky Way. The galaxy images are composites made from WFPC2 images taken with blue (4445 Angstroms) and red (8269 Angstroms) filters, and NICMOS images taken in the infrared (16,000 Angstroms). They were taken in June, July, and August of 1997. Credits for the ground-based images: Allan Sandage (The Observatories of the Carnegie Institution of Washington) and John Bedke (Computer Sciences Corporation and the Space Telescope Science Institute) Credits for WFPC2 and NICMOS composites: NASA, ESA, and Reynier Peletier (University of Nottingham, United Kingdom)
SPARTAN Near-IR Camera — System Overview (SOAR instrumentation page): The Spartan Infrared Camera is a high spatial resolution near-IR imager whose focal plane consists of four …
Camera Traps Can Be Heard and Seen by Animals
Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg
2014-01-01
Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps; in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals’ hearing and produce illumination that can be seen by many species. PMID:25354356
The infrared imaging radiometer for PICASSO-CENA
NASA Astrophysics Data System (ADS)
Corlay, Gilles; Arnolfo, Marie-Christine; Bret-Dibat, Thierry; Lifferman, Anne; Pelon, Jacques
2017-11-01
Microbolometers are infrared detectors based on an emerging technology that has been developed mainly in the US and a few other countries over the past few years. The main targets of these developments are low-performance, low-cost military and civilian applications such as survey cameras. Applications in space are now arising thanks to the design simplification and the associated cost reduction allowed by this new technology. Among the four instruments of the payload of PICASSO-CENA, the Imaging Infrared Radiometer (IIR) is based on microbolometer technology. An infrared camera in development for the IASI instrument is the core of the IIR. The aim of the paper is to recall the PICASSO-CENA mission goal, to describe the IIR instrument architecture, to highlight its main features and performances, and to give its development status.
Firefly: A HOT camera core for thermal imagers with enhanced functionality
NASA Astrophysics Data System (ADS)
Pillans, Luke; Harmer, Jack; Edwards, Tim
2015-06-01
Raising the operating temperature of mercury cadmium telluride infrared detectors from 80 K to above 160 K creates new applications for high performance infrared imagers by vastly reducing the size, weight and power consumption of the integrated cryogenic cooler. Realizing the benefits of Higher Operating Temperature (HOT) requires a new kind of infrared camera core with the flexibility to address emerging applications in handheld, weapon-mounted and UAV markets. This paper discusses the Firefly core developed to address these needs by Selex ES in Southampton, UK. Firefly represents a fundamental redesign of the infrared signal chain, reducing power consumption and providing compatibility with low cost, low power Commercial Off-The-Shelf (COTS) computing technology. This paper describes key innovations in this signal chain: a ROIC purpose-built to minimize power consumption in the proximity electronics, GPU-based image processing of infrared video, and a software-customisable infrared core which can communicate wirelessly with other battlespace systems.
Temperature measurement with industrial color camera devices
NASA Astrophysics Data System (ADS)
Schmidradler, Dieter J.; Berndorfer, Thomas; van Dyck, Walter; Pretschuh, Juergen
1999-05-01
This paper discusses color camera based temperature measurement. Usually, visual imaging and infrared image sensing are treated as two separate disciplines. We will show that a well-selected color camera device might be a cheaper, more robust and more sophisticated solution for optical temperature measurement in several cases. Herein, only implementation fragments and important restrictions for the sensing element will be discussed. Our aim is to draw the reader's attention to the use of visual image sensors for measuring thermal radiation and temperature and to give reasons for the need for improved technologies for infrared camera devices. With AVL-List, our industrial partner, we successfully used the proposed sensor to perform temperature measurement of flames inside the combustion chamber of diesel engines, which finally led to the presented insights.
NASA Astrophysics Data System (ADS)
Matras, A.
2017-08-01
The paper discusses the impact of feed screw heating on machining accuracy. The test stand was built around a Haas Mini Mill 2 CNC milling machine and a Flir SC620 infrared camera. Workpiece measurements were performed on a Taylor Hobson Talysurf Intra 50 profilometer. The research showed that 60 minutes of intensive milling machine operation caused thermal expansion of the feed screw, which influenced the dimensional error of the workpiece.
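For orientation, the magnitude of the effect studied here can be approximated with the linear thermal expansion relation ΔL = α·L·ΔT. A rough sketch under assumed values (typical steel coefficient, illustrative screw length and temperature rise; none of the numbers are taken from the paper):

```python
# Linear thermal expansion of a feed screw: dL = alpha * L * dT
alpha = 11.7e-6      # 1/K, typical coefficient for carbon steel (assumed)
length = 0.60        # m, assumed screw length
delta_t = 8.0        # K, assumed temperature rise after 60 min of intensive work

delta_l = alpha * length * delta_t
print(f"Elongation: {delta_l * 1e6:.1f} µm")   # ~56 µm, already significant for precision milling
```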
First light observations with TIFR Near Infrared Imaging Camera (TIRCAM-II)
NASA Astrophysics Data System (ADS)
Ojha, D. K.; Ghosh, S. K.; D'Costa, S. L. A.; Naik, M. B.; Sandimani, P. R.; Poojary, S. S.; Bhagat, S. B.; Jadhav, R. B.; Meshram, G. S.; Bakalkar, C. B.; Ramaprakash, A. N.; Mohan, V.; Joshi, J.
TIFR near infrared imaging camera (TIRCAM-II) is based on the Aladdin III Quadrant InSb focal plane array (512×512 pixels; 27.6 μm pixel size; sensitive between 1-5.5 μm). TIRCAM-II had its first engineering run with the 2 m IUCAA telescope at Girawali during February-March 2011. The first light observations with TIRCAM-II were quite successful. Several infrared standard stars, the Trapezium Cluster in the Orion region, McNeil's nebula, etc., were observed in the J and K bands and in a narrow band at 3.6 μm (nbL). In the nbL band, some bright stars could be detected from the Girawali site. The performance of TIRCAM-II is discussed in the light of preliminary observations in near infrared bands.
Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei
2012-01-01
We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profile of patients. A pixel-coregistered, simultaneously read mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements, especially for objects with unknown emissivity. Even the dual-band measurement can provide inaccurate results due to the fact that emissivity is a function of wavelength. Thus we have been developing a four-band QWIP camera for accurate temperature measurement of remote objects.
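One common way to exploit two co-registered bands when the emissivity is unknown but assumed wavelength-independent (gray body) is ratio thermometry: the emissivity cancels in the ratio of the two band radiances. A hedged sketch of that textbook approach using Planck's law at two representative wavelengths; this is a generic illustration, not necessarily the authors' algorithm, and the wavelengths and emissivity are assumed values:

```python
import numpy as np
from scipy.optimize import brentq

H = 6.626e-34   # Planck constant [J s]
C = 2.998e8     # speed of light [m/s]
KB = 1.381e-23  # Boltzmann constant [J/K]

def planck(wavelength, temp):
    """Spectral radiance of a blackbody [W / (m^2 sr m)]."""
    return (2 * H * C**2 / wavelength**5) / np.expm1(H * C / (wavelength * KB * temp))

def ratio_temperature(l1_meas, l2_meas, wl1, wl2, t_lo=200.0, t_hi=1500.0):
    """Temperature from the ratio of two band radiances; gray-body emissivity cancels."""
    target = l1_meas / l2_meas
    f = lambda t: planck(wl1, t) / planck(wl2, t) - target
    return brentq(f, t_lo, t_hi)

# Illustrative check: simulate a 320 K gray body (emissivity 0.8) seen in MWIR and LWIR bands
wl_mwir, wl_lwir = 4.5e-6, 9.0e-6
eps, true_t = 0.8, 320.0
t_est = ratio_temperature(eps * planck(wl_mwir, true_t), eps * planck(wl_lwir, true_t),
                          wl_mwir, wl_lwir)
print(f"Recovered temperature: {t_est:.1f} K")
```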
Benmiloud, Fares; Rebaudet, Stanislas; Varoquaux, Arthur; Penaranda, Guillaume; Bannier, Marie; Denizot, Anne
2018-01-01
The clinical impact of intraoperative autofluorescence-based identification of parathyroids using a near-infrared camera remains unknown. In a before-and-after controlled study, we compared all patients who underwent total thyroidectomy by the same surgeon during Period 1 (January 2015 to January 2016) without near-infrared (near-infrared- group) and those operated on during Period 2 (February 2016 to September 2016) using a near-infrared camera (near-infrared+ group). In parallel, we also compared all patients who underwent surgery without near-infrared during those same periods by another surgeon in the same unit (control groups). Main outcomes included postoperative hypocalcemia, parathyroid identification, autotransplantation, and inadvertent resection. The near-infrared+ group displayed significantly lower postoperative hypocalcemia rates (5.2%) than the near-infrared- group (20.9%; P < .001). Compared with the near-infrared- patients, the near-infrared+ group exhibited an increased mean number of identified parathyroids and reduced parathyroid autotransplantation rates, although no difference was observed in inadvertent resection rates. Parathyroids were identified via near-infrared before they were visualized by the surgeon in 68% of patients. In the control groups, parathyroid identification improved significantly from Period 1 to Period 2, although autotransplantation, inadvertent resection and postoperative hypocalcemia rates did not differ. Near-infrared use during total thyroidectomy significantly reduced postoperative hypocalcemia, improved parathyroid identification and reduced the autotransplantation rate. Copyright © 2017 Elsevier Inc. All rights reserved.
Toslak, Devrim; Liu, Changgeng; Alam, Minhaj Nur; Yao, Xincheng
2018-06-01
A portable fundus imager is essential for emerging telemedicine screening and point-of-care examination of eye diseases. However, existing portable fundus cameras have limited field of view (FOV) and frequently require pupillary dilation. We report here a miniaturized indirect ophthalmoscopy-based nonmydriatic fundus camera with a snapshot FOV up to 67° external angle, which corresponds to a 101° eye angle. The wide-field fundus camera consists of a near-infrared light source (LS) for retinal guidance and a white LS for color retinal imaging. By incorporating digital image registration and glare elimination methods, a dual-image acquisition approach was used to achieve reflection artifact-free fundus photography.
Infrared Imaging for Inquiry-Based Learning
ERIC Educational Resources Information Center
Xie, Charles; Hazzard, Edmund
2011-01-01
Based on detecting long-wavelength infrared (IR) radiation emitted by the subject, IR imaging shows temperature distribution instantaneously and heat flow dynamically. As a picture is worth a thousand words, an IR camera has great potential in teaching heat transfer, which is otherwise invisible. The idea of using IR imaging in teaching was first…
Volcano monitoring with an infrared camera: first insights from Villarrica Volcano
NASA Astrophysics Data System (ADS)
Rosas Sotomayor, Florencia; Amigo Ramos, Alvaro; Velasquez Vargas, Gabriela; Medina, Roxana; Thomas, Helen; Prata, Fred; Geoffroy, Carolina
2015-04-01
This contribution focuses on the first trials of the almost 24/7 monitoring of Villarrica volcano with an infrared camera. Results must be compared with other SO2 remote sensing instruments, such as DOAS and the UV camera, for the "day" measurements. Infrared remote sensing of volcanic emissions is a fast and safe method to obtain gas abundances in volcanic plumes, in particular when access to the vent is difficult, during volcanic crises and at night. In recent years, a ground-based infrared camera (Nicair) has been developed by Nicarnica Aviation, which quantifies SO2 and ash in volcanic plumes based on the infrared radiance at specific wavelengths through the application of filters. Three Nicair1 (first model) cameras have been acquired by the Geological Survey of Chile in order to study degassing of active volcanoes. Several trials with the instruments have been performed on northern Chilean volcanoes, and have shown that the ranges of retrieved SO2 concentrations and fluxes are as expected. Measurements were also performed at Villarrica volcano, and a location to install a "fixed" camera, 8 km from the crater, was identified: a coffee house with electrical power, a wifi network, polite and committed owners, and a full view of the volcano summit. The first measurements are being made and processed in order to obtain full-day and full-week records of SO2 emissions, analyze data transfer and storage, improve remote control of the instrument and notebook in case of breakdown, and add web-cam/GoPro support, toward the goal of the project: to implement a fixed station to monitor and study Villarrica volcano with a Nicair1, integrating and comparing these results with those of other remote sensing instruments. This work also aims to strengthen bonds with the community by developing teaching material and giving talks to communicate volcanic hazards and other geoscience topics to the people who live "just around the corner" from one of the most active volcanoes in Chile.
IRAIT project: future mid-IR operations at Dome C during summer
NASA Astrophysics Data System (ADS)
Tosti, Gino; IRAIT Collaboration
The IRAIT project consists of a robotic mid-infrared telescope that will be hosted at Dome C in the Italian-French Concordia station on the Antarctic Plateau. The telescope was built in collaboration with the PNRA (Technology and Earth-Sun Interaction and Astrophysics sectors). Its focal plane instrumentation is a mid-infrared camera (5-25 μm), based on the TIRCAM II prototype, which is the result of a joint effort between institutes of CNR and INAF. International collaborations with French and Spanish institutes for the construction of a near infrared spectrographic camera have also been started. We present the status of the project and the ongoing developments that will make it possible to start infrared observations at Dome C during the summer Antarctic campaign 2005-2006.
A new high-speed IR camera system
NASA Technical Reports Server (NTRS)
Travis, Jeffrey W.; Shu, Peter K.; Jhabvala, Murzy D.; Kasten, Michael S.; Moseley, Samuel H.; Casey, Sean C.; Mcgovern, Lawrence K.; Luers, Philip J.; Dabney, Philip W.; Kaipa, Ravi C.
1994-01-01
A multi-organizational team at the Goddard Space Flight Center is developing a new far infrared (FIR) camera system which furthers the state of the art for this type of instrument by incorporating recent advances in several technological disciplines. All aspects of the camera system are optimized for operation at the high data rates required for astronomical observations in the far infrared. The instrument is built around a Blocked Impurity Band (BIB) detector array which exhibits responsivity over a broad wavelength band and which is capable of operating at 1000 frames/sec, and consists of a focal plane dewar, a compact camera head electronics package, and a Digital Signal Processor (DSP)-based data system residing in a standard 486 personal computer. In this paper we discuss the overall system architecture, the focal plane dewar, and advanced features and design considerations for the electronics. This system, or one derived from it, may prove useful for many commercial and/or industrial infrared imaging or spectroscopic applications, including thermal machine vision for robotic manufacturing, photographic observation of short-duration thermal events such as combustion or chemical reactions, and high-resolution surveillance imaging.
The development of large-aperture test system of infrared camera and visible CCD camera
NASA Astrophysics Data System (ADS)
Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying
2015-10-01
Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many types of equipment and applications. If such a system is tested using a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber and computer, which reduces the cost and the time of installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator focal position when the environmental temperature changes, and the image quality of the wide-field collimator and the test accuracy are also improved. Its performance is comparable to that of foreign counterparts at a much lower cost, so it should have a good market.
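The multiple-frame averaging step mentioned above exploits the fact that temporally uncorrelated noise shrinks roughly as 1/√N when N frames are averaged. A minimal sketch of that idea (frame sizes, noise level and frame count are illustrative, not from the paper):

```python
import numpy as np

def average_frames(frames):
    """Average a stack of frames (N, H, W) to suppress random temporal noise by ~1/sqrt(N)."""
    return np.mean(np.asarray(frames, dtype=np.float64), axis=0)

# Quick check with synthetic noisy frames of a flat target
rng = np.random.default_rng(0)
frames = 100.0 + rng.normal(0.0, 2.0, size=(64, 240, 320))   # sigma = 2.0 per frame
avg = average_frames(frames)
print(f"single-frame noise ~ {frames[0].std():.2f}, averaged noise ~ {avg.std():.2f}")  # ~2.0 -> ~0.25
```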
Selecting among competing models of electro-optic, infrared camera system range performance
Nichols, Jonathan M.; Hines, James E.; Nichols, James D.
2013-01-01
Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range performance models. Model selection based on Akaike's Information Criterion (AIC) is presented for the type of data collected during a typical human observer and target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions for distances other than the specific set for which experimental trials were conducted.
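As a reminder of the mechanics behind such model selection, AIC = 2k − 2 ln L̂, and the candidate with the smallest AIC is preferred; ΔAIC values give relative support. A minimal sketch assuming each candidate range-performance model has already been fit and its maximized log-likelihood is known; the model names and numbers below are placeholders, not results from the paper:

```python
def aic(log_likelihood, n_params):
    """Akaike's Information Criterion: 2k - 2 ln(L_hat)."""
    return 2 * n_params - 2 * log_likelihood

# Placeholder fits: (name, maximized log-likelihood, number of parameters)
candidates = [("logistic-in-range", -412.7, 2),
              ("TTP-based",         -405.1, 3),
              ("saturating-exp",    -409.8, 3)]

scores = {name: aic(ll, k) for name, ll, k in candidates}
best = min(scores, key=scores.get)
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name:>18s}: AIC = {score:7.1f}, dAIC = {score - scores[best]:5.1f}")
```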
Conception of a cheap infrared camera using a Fresnel lens
NASA Astrophysics Data System (ADS)
Grulois, Tatiana; Druart, Guillaume; Guérineau, Nicolas; Crastes, Arnaud; Sauer, Hervé; Chavel, Pierre
2014-09-01
Today huge efforts are made in the research and industrial areas to design compact and cheap uncooled infrared optical systems for low-cost imagery applications. Indeed, infrared cameras are currently too expensive to be widespread. If we manage to cut their cost, we expect to open new types of markets. In this paper, we present the cheap broadband microimager we have designed. It operates in the long-wavelength infrared range and uses only one silicon lens at a minimal cost for the manufacturing process. Our concept is based on the use of thin optics. Therefore inexpensive unconventional materials can be used because some absorption can be tolerated. Our imager uses a thin Fresnel lens. Up to now, Fresnel lenses have not been used for broadband imagery applications because of their disastrous chromatic properties. However, we show that working in a high diffraction order can significantly reduce chromatism. A prototype has been made and the performance of our camera is discussed. Its characterization has been carried out in terms of modulation transfer function (MTF) and noise equivalent temperature difference (NETD). Finally, experimental images are presented.
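The chromatism argument can be made concrete with the standard multi-order (harmonic) diffractive lens relation, in which a lens blazed for order p at design wavelength λ0 has focal length f(λ, m) ≈ p·λ0·f0 / (m·λ) in diffraction order m, so for large p the band splits into many orders whose foci all stay close to f0. A hedged numerical sketch of this textbook relation only; the design wavelength, focal length and orders below are assumed values, not the authors' actual design parameters:

```python
import numpy as np

def mod_lens_focal_length(wavelength, f0, lambda0, p):
    """Focal length of a multi-order diffractive lens (design order p, design wavelength lambda0).

    For each wavelength the dominant diffraction order m is taken as the one whose
    focus lies closest to the nominal focal length f0 (textbook relation, assumed).
    """
    m = np.clip(np.round(p * lambda0 / wavelength), 1, None)   # dominant order
    return p * lambda0 * f0 / (m * wavelength)

lambda0, f0 = 10.0e-6, 20.0e-3          # assumed: 10 µm design wavelength, 20 mm focal length
band = np.linspace(8e-6, 12e-6, 9)      # LWIR band
for p in (1, 25):
    f = mod_lens_focal_length(band, f0, lambda0, p)
    print(f"p = {p:2d}: max focal shift over 8-12 µm = {1e3 * (f.max() - f.min()):.2f} mm")
```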
[Evaluation of Iris Morphology Viewed through Stromal Edematous Corneas by Infrared Camera].
Kobayashi, Masaaki; Morishige, Naoyuki; Morita, Yukiko; Yamada, Naoyuki; Kobayashi, Motomi; Sonoda, Koh-Hei
2016-02-01
We previously reported that an infrared camera enables observation of iris morphology through edematous corneas in Peters' anomaly. The purpose of this study was to observe iris morphology through corneas with bullous keratopathy or failed grafts with an infrared camera. Eleven subjects with bullous keratopathy or failed grafts (6 men and 5 women; mean age ± SD, 72.7 ± 13.0 years) were enrolled in this study. The iris morphology was observed using the visible light mode and the near-infrared light mode of an infrared camera (MeibomPen). The detectability of pupil shape, iris pattern and presence of iridectomy was evaluated. Infrared mode observation enabled us to detect the pupil shape in 11 out of 11 cases, the iris pattern in 3 out of 11 cases, and the presence of iridectomy in 9 out of 11 cases, although visible light mode observation could not detect any iris morphological changes. Applying infrared optics was valuable for observation of the iris morphology through stromal edematous corneas.
Calibration and verification of thermographic cameras for geometric measurements
NASA Astrophysics Data System (ADS)
Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.
2011-03-01
Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire temperature and geometric information, although calibration and verification procedures are usual only for the thermal data; black bodies are used for these purposes. Moreover, the geometric information is important for many fields such as architecture, civil engineering and industry. This work presents a calibration procedure that allows photogrammetric restitution and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five Delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both the thermographic and visible cameras are able to detect it. Two thermographic cameras from the Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm in all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between thermographic cameras shows slightly better results for the Nec camera.
In-situ calibration of nonuniformity in infrared staring and modulated systems
NASA Astrophysics Data System (ADS)
Black, Wiley T.
Infrared cameras can directly measure the apparent temperature of objects, providing thermal imaging. However, the raw output from most infrared cameras suffers from a strong, often limiting noise source called nonuniformity. Manufacturing imperfections in infrared focal planes lead to high pixel-to-pixel sensitivity to electronic bias, focal plane temperature, and other effects. The resulting imagery can only provide useful thermal imaging after a nonuniformity calibration has been performed. Traditionally, these calibrations are performed by momentarily blocking the field of view with a flat temperature plate or blackbody cavity. However, because the pattern is a coupling of manufactured sensitivities with operational variations, periodic recalibration is required, sometimes on the order of tens of seconds. A class of computational methods called Scene-Based Nonuniformity Correction (SBNUC) has been researched for over 20 years, in which the nonuniformity calibration is estimated digitally by analyzing the video stream in the presence of camera motion. The most sophisticated SBNUC methods can completely and robustly eliminate the high-spatial-frequency component of nonuniformity with only an initial reference calibration or potentially no physical calibration. I will demonstrate a novel algorithm that advances these SBNUC techniques to support all spatial frequencies of nonuniformity correction. Long-wave infrared microgrid polarimeters are a class of camera that incorporates a microscale wire-grid polarizer directly affixed to each pixel of the focal plane. These cameras have the capability of simultaneously measuring thermal imagery and polarization in a robust integrated package with no moving parts. I will describe the necessary adaptations of my SBNUC method to operate on this class of sensor, as well as demonstrate SBNUC performance on LWIR polarimetry video collected on the UA mall.
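For reference, the classical calibration baseline that scene-based methods aim to replace is the two-point (gain/offset) correction computed from two uniform blackbody views at known radiance levels. A minimal sketch of that baseline, not of the SBNUC algorithm described above; all variable names and values are illustrative:

```python
import numpy as np

def two_point_nuc(cold, hot, level_cold, level_hot):
    """Per-pixel gain and offset so that corrected = gain * raw + offset reproduces the two levels."""
    gain = (level_hot - level_cold) / (hot - cold)
    offset = level_cold - gain * cold
    return gain, offset

def apply_nuc(raw, gain, offset):
    return gain * raw + offset

# Illustrative synthetic example with a fixed-pattern response model
rng = np.random.default_rng(1)
true_gain = rng.normal(1.0, 0.05, (128, 128))
true_offset = rng.normal(0.0, 10.0, (128, 128))
scene = lambda level: true_gain * level + true_offset           # per-pixel response model
gain, offset = two_point_nuc(scene(20.0), scene(35.0), 20.0, 35.0)
corrected = apply_nuc(scene(27.5), gain, offset)
print(f"residual non-uniformity: {corrected.std():.2e}")        # ~0 for this noiseless model
```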
High-Resolution Mars Camera Test Image of Moon Infrared
2005-09-13
This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. This image was taken by the High Resolution Imaging Science Experiment camera on Sept. 8, 2005.
Can reliable sage-grouse lek counts be obtained using aerial infrared technology
Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.
2013-01-01
More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-winged aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative technique to conventional ground-based methods, but further research is needed. We discuss multiple advantages to aerial infrared surveys, including counting in remote areas, representing greater spatial variation, and increasing the number of counted leks per season. Aerial infrared lek counts may be a valuable wildlife management tool that releases time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao; Gu, Guohua; Chen, Qian
2018-02-01
For the uncooled long-wave infrared (LWIR) camera, the infrared (IR) irradiation the focal plane array (FPA) receives is a crucial factor that affects the image quality. Ambient temperature fluctuation as well as system power consumption can result in changes of FPA temperature and radiation characteristics inside the IR camera; these will further degrade the imaging performance. In this paper, we present a novel shutterless non-uniformity correction method to compensate for non-uniformity derived from the variation of ambient temperature. Our method combines a calibration-based method and the properties of a scene-based method to obtain correction parameters at different ambient temperature conditions, so that the IR camera performance can be less influenced by ambient temperature fluctuation or system power consumption. The calibration process is carried out in a temperature chamber with slowly changing ambient temperature and a blackbody as a uniform radiation source. Enough uniform images are captured and the gain coefficients are calculated during this period. Then, in practical application, the offset parameters are calculated via the least squares method based on the gain coefficients, the captured uniform images and the actual scene. Thus we can get a corrected output through the gain coefficients and offset parameters. The performance of our proposed method is evaluated on realistic IR images and compared with two existing methods. The images used in the experiments were obtained by a 384×288 pixel uncooled LWIR camera. Results show that our proposed method can adaptively update correction parameters as the actual target scene changes and is more stable to temperature fluctuation than the other two methods.
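The split between fixed, calibration-derived gains and scene-driven offsets can be illustrated with a much-simplified sketch: keep the per-pixel gains fixed and choose a per-pixel offset that minimizes, in the least-squares sense, the residual against an estimate of the true scene. The spatial low-pass used as the scene estimate and the specific least-squares form below are assumptions for illustration, not the authors' formulation:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def update_offsets(raw_frames, gain):
    """Least-squares per-pixel offset given fixed gains.

    For corrected = gain * raw + offset, choose offset minimizing
    sum_t ||gain * raw_t + offset - scene_t||^2, where scene_t is
    approximated here by a spatial low-pass of the gain-corrected frame
    (a simplified stand-in for the uniform/scene reference).
    """
    residuals = []
    for raw in raw_frames:
        corrected_wo_offset = gain * raw
        scene_estimate = uniform_filter(corrected_wo_offset, size=15)
        residuals.append(scene_estimate - corrected_wo_offset)
    # The least-squares solution for a constant per-pixel offset is the temporal mean residual
    return np.mean(residuals, axis=0)

def correct(raw, gain, offset):
    return gain * raw + offset
```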
Framework for 2D-3D image fusion of infrared thermography with preoperative MRI.
Hoffmann, Nico; Weidner, Florian; Urban, Peter; Meyer, Tobias; Schnabel, Christian; Radev, Yordan; Schackert, Gabriele; Petersohn, Uwe; Koch, Edmund; Gumhold, Stefan; Steiner, Gerald; Kirsch, Matthias
2017-11-27
Multimodal medical image fusion combines information from one or more images in order to improve the diagnostic value. While previous applications mainly focus on merging images from computed tomography, magnetic resonance imaging (MRI), ultrasound and single-photon emission computed tomography, we propose a novel approach for the registration and fusion of preoperative 3D MRI with intraoperative 2D infrared thermography. Image-guided neurosurgeries are based on neuronavigation systems, which further allow us to track the position and orientation of arbitrary cameras. Hereby, we are able to relate the 2D coordinate system of the infrared camera with the 3D MRI coordinate system. The registered image data are then combined by calibration-based image fusion in order to map our intraoperative 2D thermographic images onto the respective brain surface recovered from preoperative MRI. In extensive accuracy measurements, we found that the proposed framework achieves a mean accuracy of 2.46 mm.
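The registration step relies on a tracked camera pose plus the intrinsic calibration of the infrared camera, so that 3D MRI surface points can be projected into 2D thermographic image coordinates with a standard pinhole model. A minimal sketch of that projection, assuming the intrinsic matrix K and a rigid [R|t] from the neuronavigation tracking are already known; all numeric values are illustrative, not from the paper:

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 points in MRI/world coordinates into pixel coordinates of the IR camera."""
    cam = R @ points_3d.T + t.reshape(3, 1)         # world -> camera frame
    uvw = K @ cam                                   # pinhole projection
    return (uvw[:2] / uvw[2]).T                     # perspective divide -> (u, v)

# Illustrative intrinsics/extrinsics (not from the paper)
K = np.array([[600.0, 0.0, 160.0],
              [0.0, 600.0, 120.0],
              [0.0,   0.0,   1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 200.0])        # camera 200 mm in front of the surface
surface_points = np.array([[0.0, 0.0, 0.0], [10.0, -5.0, 2.0]])
print(project_points(surface_points, K, R, t))
```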
Application of infrared camera to bituminous concrete pavements: measuring vehicle
NASA Astrophysics Data System (ADS)
Janků, Michal; Stryk, Josef
2017-09-01
Infrared thermography (IR) has been used for decades in certain fields. However, the technological level of advancement of measuring devices has not been sufficient for some applications. Over recent years, good quality thermal cameras with high resolution and very high thermal sensitivity have started to appear on the market. The development in the field of measuring technologies has allowed the use of infrared thermography in new fields and by a larger number of users. This article describes research in progress at the Transport Research Centre with a focus on the use of infrared thermography for the diagnostics of bituminous road pavements. A measuring vehicle, equipped with a thermal camera, digital camera and GPS sensor, was designed for the diagnostics of pavements. New, highly sensitive thermal cameras allow very small temperature differences to be measured from a moving vehicle. This study shows the potential of high-speed inspection without lane closures while using IR thermography.
AMICA: The First camera for Near- and Mid-Infrared Astronomical Imaging at Dome C
NASA Astrophysics Data System (ADS)
Straniero, O.; Dolci, M.; Valentini, A.; Valentini, G.; di Rico, G.; Ragni, M.; Giuliani, C.; di Cianno, A.; di Varano, I.; Corcione, L.; Bortoletto, F.; D'Alessandro, M.; Magrin, D.; Bonoli, C.; Giro, E.; Fantinel, D.; Zerbi, F. M.; Riva, A.; de Caprio, V.; Molinari, E.; Conconi, P.; Busso, M.; Tosti, G.; Abia, C. A.
AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging in the near- (1-5 μm) and mid- (5-27 μm) infrared wavelength regions. Equipped with two detectors, an InSb 256² and a Si:As 128² IBC, cooled to 35 and 7 K respectively, it will be the first instrument to investigate the potential of the Italian-French base Concordia for IR astronomy. The main technical challenge is represented by the extreme conditions of Dome C (T ≈ -90 °C, p ≈ 640 mbar). An environmental control system ensures the correct start-up, shut-down and housekeeping of the various components of the camera. AMICA will be mounted on the IRAIT telescope and will perform survey-mode observations of the Southern sky. The first task is to provide important site-quality data. Substantial contributions to the solution of fundamental astrophysical quests, such as those related to late phases of stellar evolution and to star formation processes, are also expected.
Cloud top structure of Venus revealed by Subaru/COMICS mid-infrared images
NASA Astrophysics Data System (ADS)
Sato, T. M.; Sagawa, H.; Kouyama, T.; Mitsuyama, K.; Satoh, T.; Ohtsuki, S.; Ueno, M.; Kasaba, Y.; Nakamura, M.; Imamura, T.
2014-04-01
We have investigated the cloud top structure of Venus by analyzing ground-based images obtained by the Cooled Mid-Infrared Camera and Spectrometer (COMICS), mounted on the 8.2-m Subaru Telescope. In this presentation, we will overview the observational results and discuss their interpretations.
NASA Astrophysics Data System (ADS)
Baffa, Carlo; Gennari, Sandro; Hunt, Leslie K.; Lisi, Franco; Tofani, Gianni; Vanzi, Leonardo
1995-09-01
We describe the general characteristics of the TIRGO infrared telescope, located on Gornergrat (Switzerland), and its most recent instrumentation. This telescope is specifically designed for infrared astronomical observations. Two newly designed instruments are presented: the imaging camera Arnica and the long-slit spectrometer LonGSp, both based on two-dimensional array detectors.
Uncooled infrared sensors: rapid growth and future perspective
NASA Astrophysics Data System (ADS)
Balcerak, Raymond S.
2000-07-01
Uncooled infrared cameras are now available for both the military and commercial markets. The current camera technology incorporates the fruits of many years of development, focusing on the details of pixel design, novel material processing, and low-noise read-out electronics. The rapid insertion of cameras into systems is testimony to the successful completion of this 'first phase' of development. In the military market, the first uncooled infrared cameras will be used for weapon sights, drivers' viewers and helmet-mounted cameras. Major commercial applications include night driving, security, police and fire fighting, and thermography, primarily for preventive maintenance and process control. The technology for the next generation of cameras is even more demanding, but within reach. The paper outlines the technology program planned for the next generation of cameras, and the approaches to further enhance performance, even to the radiation limit of thermal detectors.
On the Integration of Medium Wave Infrared Cameras for Vision-Based Navigation
2015-03-01
[…] Visual Structure from Motion (VisualSFM) is an application that performs incremental SfM using images of a scene fed into it [20] […] too drastically in between frames. When this happens, VisualSFM will begin creating a new model with images that do not fit the old one. These new […]
Electro-optical system for gunshot detection: analysis, concept, and performance
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Madura, H.; Trzaskawka, P.; Bieszczad, G.; Sosnowski, T.
2011-08-01
The paper discusses technical possibilities for building an effective electro-optical sensor unit for sniper detection using infrared cameras. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. First, an analysis is presented of three distinct phases of sniper activity: before, during and after the shot. On the basis of experimental data, the parameters defining the relevant sniper signatures were determined, which are essential in assessing the capability of an infrared camera to detect sniper activity. The sniper's body and muzzle flash were analyzed as targets; the phenomena that make it possible to detect sniper activity in the infrared spectrum were described, and the physical limitations were analyzed. The infrared systems under consideration were simulated using NVTherm software. Calculations were performed for several cameras equipped with different lenses and detector types. The simulation of detection ranges was performed for selected scenarios of sniper detection tasks. After analysis of the simulation results, the technical specifications of an infrared sniper detection system required to provide the assumed detection range were discussed. Finally, an infrared camera setup was proposed which can detect a sniper at a range of 1000 meters.
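Detection-range estimates of this kind ultimately come down to how many resolvable cycles the camera places across the target at a given range (Johnson-type criteria). A rough, hedged back-of-the-envelope sketch, not the NVTherm computation itself; the critical dimension, resolvable frequency and cycle requirements below are assumed, commonly quoted values:

```python
def cycles_on_target(critical_dim_m, range_m, resolvable_freq_cyc_per_mrad):
    """Resolvable cycles across the target's critical dimension at a given range."""
    angular_size_mrad = 1000.0 * critical_dim_m / range_m
    return angular_size_mrad * resolvable_freq_cyc_per_mrad

# Assumed values (illustrative only)
critical_dim = 0.5          # m, characteristic dimension of a crouching sniper
resolvable_freq = 2.0       # cycles/mrad the camera resolves at the relevant contrast
required = {"detection": 1.0, "recognition": 4.0, "identification": 8.0}   # classic Johnson-style N50s

for task, n_cycles in required.items():
    max_range = 1000.0 * critical_dim * resolvable_freq / n_cycles
    print(f"{task:>14s}: ~{max_range:6.0f} m")
```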
Unstructured Facility Navigation by Applying the NIST 4D/RCS Architecture
2006-07-01
[…] wireless data and emergency stop radios; GPS receiver; inertial navigation unit; dual stereo cameras; infrared sensors; actuators (wheel motors, camera controls) […] Sensors used in the sensory processing module include the two pairs of stereo color cameras, the physical bumper and infrared bumper sensors, the motor […]
NASA Astrophysics Data System (ADS)
Sáez-Cano, G.; Morales de los Ríos, J. A.; del Peral, L.; Neronov, A.; Wada, S.; Rodríguez Frías, M. D.
2015-03-01
The origin of cosmic rays has remained a mystery for more than a century. JEM-EUSO is a pioneering space-based telescope that will be located on the International Space Station (ISS); its aim is to detect Ultra High Energy Cosmic Rays (UHECR) and Extremely High Energy Cosmic Rays (EHECR) by observing the atmosphere. Unlike ground-based telescopes, JEM-EUSO will observe from above, and therefore, for proper UHECR reconstruction under cloudy conditions, a key element of JEM-EUSO is an Atmospheric Monitoring System (AMS). This AMS consists of a space-qualified bi-spectral Infrared Camera, which will provide the cloud coverage and cloud-top height in the JEM-EUSO Field of View (FoV), and a LIDAR, which will measure the atmospheric optical depth in the direction in which it is fired. In this paper we explain the effects of clouds on the determination of the UHECR arrival direction. Moreover, since cloud-top height retrieval is crucial for analyzing UHECR and EHECR events under cloudy conditions, the retrieval algorithm that fulfills the technical requirements of the Infrared Camera of JEM-EUSO for reconstructing the cloud-top height is reported here.
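A common first-order way to turn an IR brightness temperature into a cloud-top height, often used as a baseline for bi-spectral retrievals, is to compare the cloud-top brightness temperature with a reference profile (surface temperature plus an assumed lapse rate). A hedged sketch of that baseline only, not the retrieval algorithm reported in the paper; the lapse rate and example temperatures are assumptions:

```python
def cloud_top_height_km(bt_cloud_k, t_surface_k, lapse_rate_k_per_km=6.5):
    """First-order cloud-top height from IR brightness temperature and an assumed lapse rate."""
    if bt_cloud_k >= t_surface_k:
        return 0.0                      # warmer than the surface -> treat as clear / very low cloud
    return (t_surface_k - bt_cloud_k) / lapse_rate_k_per_km

# Illustrative example (values assumed): 288 K surface, 243 K cloud-top brightness temperature
print(f"CTH ~ {cloud_top_height_km(243.0, 288.0):.1f} km")   # ~6.9 km
```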
1999-05-12
[…] to an infrared television camera AVTO TVS-2100 (Navy Case 79,823). The detector in the camera was an InSb crystal having peak sensitivity in the wavelength region between 3.0 […]
Students' Framing of Laboratory Exercises Using Infrared Cameras
ERIC Educational Resources Information Center
Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.
2015-01-01
Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the…
Space infrared telescope facility wide field and diffraction limited array camera (IRAC)
NASA Technical Reports Server (NTRS)
Fazio, G. G.
1986-01-01
IRAC focal plane detector technology was developed and studies of alternate focal plane configurations were supported. While any of the alternate focal planes under consideration would have a major impact on the Infrared Array Camera, it was possible to proceed with detector development and optical analysis research based on the proposed design since, to a large degree, the studies undertaken are generic to any SIRTF imaging instrument. Development of the proposed instrument was also important in a situation in which none of the alternate configurations has received the approval of the Science Working Group.
Infrared Camera Diagnostic for Heat Flux Measurements on NSTX
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Mastrovito; R. Maingi; H.W. Kugel
2003-03-25
An infrared imaging system has been installed on NSTX (National Spherical Torus Experiment) at the Princeton Plasma Physics Laboratory to measure the surface temperatures on the lower divertor and center stack. The imaging system is based on an Indigo Alpha 160 x 128 microbolometer camera with 12 bits/pixel operating in the 7-13 {micro}m range with a 30 Hz frame rate and a dynamic temperature range of 0-700 degrees C. From these data and knowledge of graphite thermal properties, the heat flux is derived with a classic one-dimensional conduction model. Preliminary results of heat flux scaling are reported.
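The "classic one-dimensional conduction model" referred to above is usually evaluated with a semi-infinite-solid inversion of the measured surface temperature history, for example the Cook-Felderman discretization sketched below. This is a hedged, generic illustration with assumed graphite properties and a synthetic temperature trace, not the NSTX analysis code:

```python
import numpy as np

def cook_felderman_heat_flux(t, temp, k, rho, cp):
    """Surface heat flux [W/m^2] from a surface temperature history (semi-infinite 1D solid)."""
    coeff = 2.0 * np.sqrt(k * rho * cp / np.pi)
    q = np.zeros_like(temp, dtype=float)
    for n in range(1, len(t)):
        i = np.arange(1, n + 1)
        dT = temp[i] - temp[i - 1]
        denom = np.sqrt(t[n] - t[i - 1]) + np.sqrt(t[n] - t[i])
        q[n] = coeff * np.sum(dT / denom)
    return q

# Assumed room-temperature ballpark properties for graphite
k, rho, cp = 100.0, 1760.0, 710.0          # W/(m K), kg/m^3, J/(kg K)
t = np.linspace(0.0, 0.5, 16)              # s, roughly 30 Hz sampling
temp = 300.0 + 400.0 * np.sqrt(t)          # synthetic heating curve [K] (constant-flux-like)
print(cook_felderman_heat_flux(t, temp, k, rho, cp)[-1])   # ~4 MW/m^2 for these assumed numbers
```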
NASA Astrophysics Data System (ADS)
Harrild, M.; Webley, P.; Dehn, J.
2014-12-01
Knowledge and understanding of precursory events and thermal signatures are vital for monitoring volcanogenic processes, as activity can often range from low level lava effusion to large explosive eruptions, easily capable of ejecting ash up to aircraft cruise altitudes. Using ground based remote sensing techniques to monitor and detect this activity is essential, but often the required equipment and maintenance is expensive. Our investigation explores the use of low-light cameras to image volcanic activity in the visible to near infrared (NIR) portion of the electromagnetic spectrum. These cameras are ideal for monitoring as they are cheap, consume little power, are easily replaced and can provide near real-time data. We focus here on the early detection of volcanic activity, using automated scripts, that capture streaming online webcam imagery and evaluate image pixel brightness values to determine relative changes and flag increases in activity. The script is written in Python, an open source programming language, to reduce the overall cost to potential consumers and increase the application of these tools across the volcanological community. In addition, by performing laboratory tests to determine the spectral response of these cameras, a direct comparison of collocated low-light and thermal infrared cameras has allowed approximate eruption temperatures and effusion rates to be determined from pixel brightness. The results of a field campaign in June, 2013 to Stromboli volcano, Italy, are also presented here. Future field campaigns to Latin America will include collaborations with INSIVUMEH in Guatemala, to apply our techniques to Fuego and Santiaguito volcanoes.
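In the spirit of the automated Python scripts described above, a minimal sketch of the brightness-monitoring idea is: grab a webcam frame from a URL, compute its mean pixel brightness, and flag a jump relative to a running baseline. The URL, threshold, cadence and baseline window below are placeholders, not the project's actual configuration:

```python
import time
import urllib.request
from io import BytesIO

import numpy as np
from PIL import Image

WEBCAM_URL = "http://example.com/volcano_webcam.jpg"   # placeholder URL
THRESHOLD = 1.5                                        # flag when brightness exceeds 1.5x the baseline

def frame_brightness(url):
    """Download one webcam frame and return its mean grayscale brightness (0-255)."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        img = Image.open(BytesIO(resp.read())).convert("L")
    return float(np.asarray(img).mean())

def monitor(interval_s=60, baseline_window=30):
    history = []
    while True:
        b = frame_brightness(WEBCAM_URL)
        if history and b > THRESHOLD * np.mean(history):
            print(f"Possible activity increase: brightness {b:.1f}")
        history = (history + [b])[-baseline_window:]   # keep a rolling baseline
        time.sleep(interval_s)

if __name__ == "__main__":
    monitor()
```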
NASA Astrophysics Data System (ADS)
Harrild, Martin; Webley, Peter; Dehn, Jonathan
2015-04-01
Knowledge and understanding of precursory events and thermal signatures are vital for monitoring volcanogenic processes, as activity can often range from low level lava effusion to large explosive eruptions, easily capable of ejecting ash up to aircraft cruise altitudes. Using ground based remote sensing techniques to monitor and detect this activity is essential, but often the required equipment and maintenance is expensive. Our investigation explores the use of low-light cameras to image volcanic activity in the visible to near infrared (NIR) portion of the electromagnetic spectrum. These cameras are ideal for monitoring as they are cheap, consume little power, are easily replaced and can provide near real-time data. We focus here on the early detection of volcanic activity, using automated scripts, that capture streaming online webcam imagery and evaluate image pixel brightness values to determine relative changes and flag increases in activity. The script is written in Python, an open source programming language, to reduce the overall cost to potential consumers and increase the application of these tools across the volcanological community. In addition, by performing laboratory tests to determine the spectral response of these cameras, a direct comparison of collocated low-light and thermal infrared cameras has allowed approximate eruption temperatures and effusion rates to be determined from pixel brightness. The results of a field campaign in June, 2013 to Stromboli volcano, Italy, are also presented here. Future field campaigns to Latin America will include collaborations with INSIVUMEH in Guatemala, to apply our techniques to Fuego and Santiaguito volcanoes.
Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects
Lambers, Martin; Kolb, Andreas
2017-01-01
In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reference distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data. PMID:29271888
Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.
Bulczak, David; Lambers, Martin; Kolb, Andreas
2017-12-22
In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reference distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.
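For context, AMCW ToF cameras typically recover depth per pixel from four phase-shifted correlation samples, with phase = atan2(A1 − A3, A0 − A2) under one common sign convention and depth = c·phase / (4π·f_mod). A minimal sketch of that standard four-bucket reconstruction, not the simulator described in the paper; the modulation frequency and target depth are assumed values:

```python
import numpy as np

C = 299_792_458.0   # speed of light [m/s]

def amcw_depth(a0, a1, a2, a3, f_mod):
    """Per-pixel depth from four correlation samples at 0°, 90°, 180°, 270° phase offsets."""
    phase = np.arctan2(a1 - a3, a0 - a2)      # wrapped phase
    phase = np.mod(phase, 2 * np.pi)          # map to [0, 2*pi)
    return C * phase / (4 * np.pi * f_mod)

# Illustrative check: synthesize the four samples for a 2.5 m target at 20 MHz modulation
f_mod, true_depth = 20e6, 2.5
true_phase = 4 * np.pi * f_mod * true_depth / C
offsets = np.array([0.0, 0.5 * np.pi, np.pi, 1.5 * np.pi])
samples = 1.0 + 0.5 * np.cos(true_phase - offsets)     # ideal correlation samples
print(amcw_depth(*samples, f_mod))                     # ~2.5
```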
NASA Astrophysics Data System (ADS)
Tabuchi, Toru; Yamagata, Shigeki; Tamura, Tetsuo
2003-04-01
With increasing automobile traffic, there is a growing demand for information that helps avoid accidents. We discuss how an infrared camera can identify three conditions of the road surface (dry, aquaplaning, frozen). The principles of this method are: 1. We have found that a 3-color infrared camera can distinguish these conditions using proper data processing. 2. The emissivity of the materials on the road surface (concrete, water, ice) differs in the three wavelength regions. 3. The sky's temperature is lower than the road's. The emissivity of the road depends on the road surface condition. Therefore, the 3-color infrared camera measures the sky radiation reflected from the road surface together with the self-radiation of the road surface. The road condition can be distinguished by processing the energy pattern measured in the three wavelength regions. Our experimental results show that the emissivity of concrete differs from that of water. An infrared camera whose NETD (noise equivalent temperature difference) in each of the three wavelength bands is 1.0 °C or less can distinguish the road conditions using these emissivity differences.
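The decision logic described above can be illustrated as a nearest-signature classifier over the three band measurements, where each road state has a characteristic emissivity pattern. The signature values below are purely illustrative placeholders, not the paper's measured emissivities:

```python
import numpy as np

# Placeholder 3-band emissivity signatures for each road state (illustrative values only)
SIGNATURES = {
    "dry":    np.array([0.95, 0.94, 0.93]),
    "wet":    np.array([0.96, 0.98, 0.99]),
    "frozen": np.array([0.97, 0.96, 0.98]),
}

def classify_road_state(measured_emissivity):
    """Return the road state whose 3-band emissivity signature is closest to the measurement."""
    measured = np.asarray(measured_emissivity, dtype=float)
    return min(SIGNATURES, key=lambda s: np.linalg.norm(SIGNATURES[s] - measured))

print(classify_road_state([0.955, 0.975, 0.985]))   # -> "wet" for this made-up measurement
```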
Mitigation of Atmospheric Effects on Imaging Systems
2004-03-31
[…] focal length. The imaging system had two cameras: an Electrim camera sensitive in the visible (0.6 µm) waveband and an Amber QWIP infrared camera sensitive in the 9-micron region. The Amber QWIP infrared camera had 256×256 pixels, a pixel pitch of 38 µm, a focal length of 1.8 m, and a FOV of 5.4×5.4 mrad […] each day. Unfortunately, signals from the different read ports of the Electrim camera picked up noise on their way to the digitizer, and this resulted […]
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor.
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-03-23
Recent developments in intelligence surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras have utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background cause detection to be a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works.
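A CNN-based shadow detector of the general kind described here can be sketched as a small patch classifier that labels each image region as shadow or non-shadow. The architecture below is a generic PyTorch illustration with arbitrary layer sizes, not the network proposed in the paper:

```python
import torch
import torch.nn as nn

class ShadowPatchNet(nn.Module):
    """Tiny CNN that classifies 32x32 RGB patches as shadow / non-shadow (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 8 -> 4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 2),                      # logits: [non-shadow, shadow]
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Forward a dummy batch of 8 patches to check tensor shapes
model = ShadowPatchNet()
logits = model(torch.randn(8, 3, 32, 32))
print(logits.shape)     # torch.Size([8, 2])
```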
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-01-01
Recent developments in intelligence surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras have utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background cause detection to be a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works. PMID:29570690
Packet based serial link realized in FPGA dedicated for high resolution infrared image transmission
NASA Astrophysics Data System (ADS)
Bieszczad, Grzegorz
2015-05-01
In this article, the external digital interface specially designed for a thermographic camera built at the Military University of Technology is described. The aim of the article is to illustrate challenges encountered during the design of a thermal vision camera, especially those related to infrared data processing and transmission. The article explains the main requirements for an interface transferring infrared or video digital data and describes the solution we elaborated, based on the Low Voltage Differential Signaling (LVDS) physical layer and signaling scheme. The elaborated image-transmission link is built using an FPGA integrated circuit with built-in high-speed serial transceivers achieving up to 2.5 Gbps throughput. Image transmission is realized using a proprietary packet protocol. The transmission protocol engine was described in VHDL and tested in FPGA hardware. The link is able to transmit 1280x1024@60Hz 24-bit video data over one signal pair. The link was tested by transmitting the thermal-vision camera picture to a remote monitor. A dedicated video link reduces power consumption compared to solutions with ASIC-based encoders and decoders realizing video links such as DVI or packet-based DisplayPort, while simultaneously reducing the wiring needed to establish the link to one pair. The article describes the functions of the modules integrated in the FPGA design, which realize several functions: synchronization to the video source, video stream packetization, interfacing the transceiver module, and dynamic clock generation for video standard conversion.
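To illustrate the kind of packet framing such a proprietary protocol might use, here is a minimal Python sketch that wraps one video line in a packet with a sync word, line number, length field and CRC. The field layout, sync value and CRC choice are hypothetical and not taken from the paper; the actual implementation described above is VHDL inside the FPGA.

```python
import struct
import zlib

def frame_video_line(line_number: int, pixels: bytes) -> bytes:
    """Wrap one video line in a hypothetical packet:
    2-byte sync word | 2-byte line number | 2-byte payload length | payload | CRC32."""
    header = struct.pack(">HHH", 0xA5C3, line_number & 0xFFFF, len(pixels))
    crc = struct.pack(">I", zlib.crc32(header + pixels) & 0xFFFFFFFF)
    return header + pixels + crc

packet = frame_video_line(0, bytes(1280 * 3))  # one 1280-pixel, 24-bit line
print(len(packet))  # 6-byte header + 3840-byte payload + 4-byte CRC = 3850
```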
AMICA (Antarctic Multiband Infrared CAmera) project
NASA Astrophysics Data System (ADS)
Dolci, Mauro; Straniero, Oscar; Valentini, Gaetano; Di Rico, Gianluca; Ragni, Maurizio; Pelusi, Danilo; Di Varano, Igor; Giuliani, Croce; Di Cianno, Amico; Valentini, Angelo; Corcione, Leonardo; Bortoletto, Favio; D'Alessandro, Maurizio; Bonoli, Carlotta; Giro, Enrico; Fantinel, Daniela; Magrin, Demetrio; Zerbi, Filippo M.; Riva, Alberto; Molinari, Emilio; Conconi, Paolo; De Caprio, Vincenzo; Busso, Maurizio; Tosti, Gino; Nucciarelli, Giuliano; Roncella, Fabio; Abia, Carlos
2006-06-01
The Antarctic Plateau offers unique opportunities for ground-based Infrared Astronomy. AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging from Dome-C in the near- (1 - 5 μm) and mid- (5 - 27 μm) infrared wavelength regions. The camera consists of two channels, equipped with a Raytheon InSb 256 array detector and a DRS MF-128 Si:As IBC array detector, cryocooled at 35 and 7 K respectively. Cryogenic devices will move a filter wheel and a sliding mirror, used to alternately feed the two detectors. Fast control and readout, synchronized with the chopping secondary mirror of the telescope, will be required because of the large background expected at these wavelengths, especially beyond 10 μm. An environmental control system is needed to ensure the correct start-up, shut-down and housekeeping of the camera. The main technical challenge is represented by the extreme environmental conditions of Dome C (T about -90 °C, p around 640 mbar) and the need for complete automation of the overall system. AMICA will be mounted at the Nasmyth focus of the 80 cm IRAIT telescope and will perform survey-mode automatic observations of selected regions of the Southern sky. The first goal will be a direct estimate of the observational quality of this new, highly promising site for Infrared Astronomy. In addition, IRAIT, equipped with AMICA, is expected to provide a significant improvement in the knowledge of fundamental astrophysical processes, such as the late stages of stellar evolution (especially AGB and post-AGB stars) and star formation.
NASA Technical Reports Server (NTRS)
Simpson, C.; Eisenhardt, P.
1998-01-01
We investigate the ability of the Space Infrared Telescope Facility's Infrared Array Camera to detect distant (z ≳ 3) galaxies and measure their photometric redshifts. Our analysis shows that changing the original long-wavelength filter specifications provides significant improvements in performance in this and other areas.
NASA Astrophysics Data System (ADS)
Kim, Sungho; Choi, Byungin; Kim, Jieun; Kwon, Soon; Kim, Kyung-Tae
2012-05-01
This paper presents a separate spatio-temporal filter based small infrared target detection method to address the sea-based infrared search and track (IRST) problem in dense sun-glint environment. It is critical to detect small infrared targets such as sea-skimming missiles or asymmetric small ships for national defense. On the sea surface, sun-glint clutters degrade the detection performance. Furthermore, if we have to detect true targets using only three images with a low frame rate camera, then the problem is more difficult. We propose a novel three plot correlation filter and statistics based clutter reduction method to achieve robust small target detection rate in dense sun-glint environment. We validate the robust detection performance of the proposed method via real infrared test sequences including synthetic targets.
An infrared image based methodology for breast lesions screening
NASA Astrophysics Data System (ADS)
Morais, K. C. C.; Vargas, J. V. C.; Reisemberger, G. G.; Freitas, F. N. P.; Oliari, S. H.; Brioschi, M. L.; Louveira, M. H.; Spautz, C.; Dias, F. G.; Gasperin, P.; Budel, V. M.; Cordeiro, R. A. G.; Schittini, A. P. P.; Neto, C. D.
2016-05-01
The objective of this paper is to evaluate the potential of a structured methodology for breast lesion screening, based on infrared imaging temperature measurements of a healthy control group, used to establish expected normality ranges, and of breast cancer patients previously diagnosed through biopsies of the affected regions. An analysis of the systematic error of the infrared camera skin temperature measurements was conducted in several different regions of the body, by direct comparison to high-precision thermistor temperature measurements, showing that infrared camera temperatures are consistently around 2 °C above the thermistor temperatures. Therefore, a conjugated gradients method is proposed to eliminate the imprecision of direct infrared camera temperature measurements, by calculating the temperature difference between two points so that the error cancels out. The method takes into account the approximate bilateral symmetry of the human body and compares measured dimensionless temperature difference values (Δθ̄) between two symmetric regions of the patient's breast; the dimensionless form accounts for the breast region, the surrounding ambient and the individual core temperatures, so that the interpretation of results for different individuals becomes simple and non-subjective. The range of normal whole-breast average dimensionless temperature differences was determined for 101 healthy individuals, and assuming that the breast temperatures exhibit a unimodal normal distribution, the healthy normal range for each region was taken to be the mean dimensionless temperature difference plus or minus twice the standard deviation of the measurements, mean(Δθ̄) ± 2σ(Δθ̄), in order to represent 95% of the population. Forty-seven patients with breast cancer previously diagnosed through biopsies were examined with the method, which was capable of detecting breast abnormalities in 45 cases (96%). Therefore, the conjugated gradients method was considered effective for breast lesion screening through infrared imaging in order to recommend a biopsy, even with a low optical resolution camera (160 × 120 pixels) and a thermal resolution of 0.1 °C, whose results were compared to those of a higher resolution camera (320 × 240 pixels). The main conclusion is that the method has potential for use as a noninvasive screening exam for individuals with breast complaints, indicating whether the patient should be submitted to a biopsy or not.
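The screening rule sketched below follows the abstract's description: compute a dimensionless left/right temperature difference and flag it when it falls outside the healthy mean plus or minus two standard deviations. The exact normalization used for Δθ̄ is an assumption here, since the abstract does not give the formula.

```python
import numpy as np

def dimensionless_diff(t_left, t_right, t_ambient, t_core):
    """Hypothetical form of the dimensionless breast temperature difference:
    normalize the left/right temperature gap by the core-to-ambient span.
    (The paper's exact definition may differ.)"""
    return (np.mean(t_left) - np.mean(t_right)) / (t_core - t_ambient)

def is_abnormal(delta_theta, healthy_mean, healthy_std):
    """Flag a breast region if its value falls outside mean +/- 2*std of the
    healthy control group (the ~95% normality range used in the abstract)."""
    return abs(delta_theta - healthy_mean) > 2.0 * healthy_std
```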
Pattern recognition applied to infrared images for early alerts in fog
NASA Astrophysics Data System (ADS)
Boucher, Vincent; Marchetti, Mario; Dumoulin, Jean; Cord, Aurélien
2014-09-01
Fog causes severe car accidents in western countries because of the poor visibility it induces. Its occurrence and intensity are still very difficult for weather services to forecast. Infrared cameras make it possible to detect and identify objects in fog when visibility is too low for the eye. Over the past years, the implementation of cost-effective infrared cameras on some vehicles has enabled such detection. On the other hand, pattern recognition algorithms based on Canny filters and the Hough transform are a common tool applied to images. Based on these facts, a joint research program between IFSTTAR and Cerema has been developed to study the benefit of infrared images obtained in a fog tunnel during natural fog dissipation. Pattern recognition algorithms have been applied, specifically to road signs, whose shape is usually associated with a specific meaning (circular for a speed limit, triangular for an alert, …). It has been shown that road signs were detected early enough in the infrared images, with respect to images in the visible spectrum, to trigger useful alerts for Advanced Driver Assistance Systems.
NASA Astrophysics Data System (ADS)
Vainer, Boris G.; Morozov, Vitaly V.
A particular branch of biophotonics is the measurement, visualisation and quantitative analysis of infrared (IR) radiation emitted from the surfaces of living objects. Focal plane array (FPA)-based IR cameras make it possible to realize so-called interventional infrared thermal diagnostics in medicine. An integrated technique aimed at advancing this new approach in biomedical science and practice is described in the paper. The assembled system includes a high-performance short-wave (2.45-3.05 μm) or long-wave (8-14 μm) IR camera, two laser Doppler flowmeters (LDF) and additional equipment and complementary facilities for monitoring human cardiovascular status. All these means operate synchronously. The relationship between infrared thermography (IRT) and LDF data in humans with regard to their systemic cardiovascular reactivity is ascertained for the first time. The real-time dynamics of blood supply in a narcotized patient is visualized and quantitatively represented for the first time during surgery, in order to observe how general hyperoxia influences thermoregulatory mechanisms; an abrupt increase in the temperature of the upper limb is observed using IRT. It is outlined that the IRT-based integrated technique may act as a take-off runway leading to the elaboration of informative new methods directly applicable to medicine and the biomedical sciences.
Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors.
Malinowski, Pawel E; Georgitzikis, Epimitheas; Maes, Jorick; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David
2017-12-10
Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by high manufacturing cost and low resolution related to the need of using image sensors based on flip-chip hybridization. One way to enable monolithic integration is by replacing expensive, small-scale III-V-based detector chips with narrow bandgap thin-films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with tunable absorption peak. Photodiode with a 150-nm thick absorber in an inverted architecture shows dark current of 10⁻⁶ A/cm² at −2 V reverse bias and EQE above 20% at 1440 nm wavelength. Optical modeling for top illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling order-of-magnitude cost reduction for infrared sensors.
Lee, Byoung-Hee
2016-04-01
[Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials.
An Inexpensive Digital Infrared Camera
ERIC Educational Resources Information Center
Mills, Allan
2012-01-01
Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)
33 CFR 117.993 - Lake Champlain.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) A sufficient number of infrared cameras shall be maintained in good working order at all times with... infrared cameras to verify that the channel is clear of all approaching vessel traffic. All approaching...
33 CFR 117.993 - Lake Champlain.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) A sufficient number of infrared cameras shall be maintained in good working order at all times with... infrared cameras to verify that the channel is clear of all approaching vessel traffic. All approaching...
C-RED one: ultra-high speed wavefront sensing in the infrared made possible
NASA Astrophysics Data System (ADS)
Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian
2016-07-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, which is a truly disruptive technology in imaging. We will show the performance of the camera and its main features, and compare them to other high-performance wavefront sensing cameras such as OCAM2 in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
Extended spectrum SWIR camera with user-accessible Dewar
NASA Astrophysics Data System (ADS)
Benapfl, Brendan; Miller, John Lester; Vemuri, Hari; Grein, Christoph; Sivananthan, Siva
2017-02-01
Episensors has developed a series of extended short wavelength infrared (eSWIR) cameras based on high-Cd concentration Hg1-xCdxTe absorbers. The cameras have a bandpass extending to a 3 micron cutoff wavelength, opening new applications relative to traditional InGaAs-based cameras. Applications and uses are discussed and examples given. A liquid nitrogen pour-filled version was initially developed. This was followed by a compact Stirling-cooled version with detectors operating at 200 K. Each camera has unique sensitivity and performance characteristics. The cameras' size, weight and power specifications are presented, along with images captured with band pass filters and eSWIR sources to demonstrate spectral response beyond 1.7 microns. The soft seal Dewars of the cameras are designed for accessibility, and can be opened and modified in a standard laboratory environment. This modular approach allows user flexibility for swapping internal components such as cold filters and cold stops. The core electronics of the Stirling-cooled camera are based on a single commercial field programmable gate array (FPGA) that also performs on-board non-uniformity corrections, bad pixel replacement, and directly drives any standard HDMI display.
Portable Long-Wavelength Infrared Camera for Civilian Application
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Krabach, T. N.; Bandara, S. V.; Liu, J. K.
1997-01-01
In this paper, we discuss the performance of this portable long-wavelength infrared camera in terms of quantum efficiency, NEΔT, minimum resolvable temperature difference (MRTD), uniformity, etc., and its applications in science, medicine and defense.
NASA Astrophysics Data System (ADS)
Cabib, Dario; Lavi, Moshe; Gil, Amir; Milman, Uri
2011-06-01
Since the early '90s CI has been involved in the development of FTIR hyperspectral imagers based on a Sagnac or similar type of interferometer. CI also pioneered the commercialization of such hyperspectral imagers in those years. After having developed a visible version based on a CCD in the early '90s (taken on by a spin-off company for biomedical applications) and a 3 to 5 micron infrared version based on a cooled InSb camera in 2008, it is now developing an LWIR version based on an uncooled camera for the 8 to 14 micron range. In this paper we present the design features and expected performance of the system. The instrument is designed to be rugged for field use, and to yield a relatively high spectral resolution of 8 cm⁻¹, an IFOV of 0.5 mrad, a 640x480 pixel spectral cube in less than a minute, and a noise equivalent spectral radiance of 40 nW/cm²/sr/cm⁻¹ at 10 μm. The actually measured performance will be presented in a future paper.
Standoff aircraft IR characterization with ABB dual-band hyper spectral imager
NASA Astrophysics Data System (ADS)
Prel, Florent; Moreau, Louis; Lantagne, Stéphane; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc
2012-09-01
Remote sensing infrared characterization of rapidly evolving events generally involves the combination of a spectro-radiometer and infrared camera(s) as separate instruments. Time synchronization, spatial co-registration, consistent radiometric calibration and managing several systems are important challenges to overcome; they complicate the target infrared characterization data processing and increase the sources of error affecting the final radiometric accuracy. MR-i is a dual-band hyperspectral imaging spectro-radiometer that combines two 256 x 256 pixel infrared cameras and an infrared spectro-radiometer in one single instrument. This field instrument generates spectral datacubes in the MWIR and LWIR. It is designed to acquire the spectral signatures of rapidly evolving events. The design is modular. The spectrometer has two output ports configured with two simultaneously operated cameras to either widen the spectral coverage or increase the dynamic range of the measured amplitudes. Various telescope options are available for the input port. Recent platform developments and field-trial measurement performance will be presented for a system configuration dedicated to the characterization of airborne targets.
Design of a Remote Infrared Images and Other Data Acquisition Station for outdoor applications
NASA Astrophysics Data System (ADS)
Béland, M.-A.; Djupkep, F. B. D.; Bendada, A.; Maldague, X.; Ferrarini, G.; Bison, P.; Grinzato, E.
2013-05-01
The Infrared Images and Other Data Acquisition Station enables a user, who is located inside a laboratory, to acquire visible and infrared images and distances in an outdoor environment with the help of an Internet connection. This station can acquire data using an infrared camera, a visible camera, and a rangefinder. The system can be used through a web page or through Python functions.
Camera Systems Rapidly Scan Large Structures
NASA Technical Reports Server (NTRS)
2013-01-01
Needing a method to quickly scan large structures like an aircraft wing, Langley Research Center developed the line scanning thermography (LST) system. LST works in tandem with a moving infrared camera to capture how a material responds to changes in temperature. Princeton Junction, New Jersey-based MISTRAS Group Inc. now licenses the technology and uses it in power stations and industrial plants.
Wen, Feng; Yu, Minzhong; Wu, Dezheng; Ma, Juanmei; Wu, Lezheng
2002-07-01
To observe the effect of indocyanine green angiography (ICGA) with infrared fundus camera on subsequent dark adaptation and the Ganzfeld electroretinogram (ERG), the ERGs of 38 eyes with different retinal diseases were recorded before and after ICGA during a 40-min dark adaptation period. ICGA was performed with Topcon 50IA retina camera. Ganzfeld ERG was recorded with Neuropack II evoked response recorder. The results showed that ICGA did not affect the latencies and the amplitudes in ERG of rod response, cone response and mixed maximum response (p>0.05). It suggests that ICGA using infrared fundus camera could be performed prior to the recording of the Ganzfeld ERG.
A user-friendly technical set-up for infrared photography of forensic findings.
Rost, Thomas; Kalberer, Nicole; Scheurer, Eva
2017-09-01
Infrared photography is of interest for forensic science and forensic medicine since it reveals findings that are almost invisible to the human eye. Originally, infrared photography was made possible by placing an infrared transmission filter in front of the camera's objective lens. However, this set-up is associated with many drawbacks, such as the loss of the autofocus function, the need for an external infrared source, and long exposure times that make the use of a tripod necessary. These limitations have so far prevented the routine application of infrared photography in forensics. In this study, a professional modification inside the digital camera body was evaluated with regard to camera handling and image quality. This permanent modification consisted of replacing the built-in infrared blocking filter with an infrared transmission filter of 700 nm or 830 nm, respectively. The application of this camera set-up to the photo-documentation of forensically relevant post-mortem findings was investigated on examples of trace evidence such as gunshot residues on the skin, on external findings, e.g. hematomas, as well as on an exemplary internal finding, i.e. Wischnewski spots in a putrefied stomach. The application of scattered light created by indirect flash yielded a more uniform illumination of the object, and the 700 nm filter produced better pictures than the 830 nm filter. Compared to pictures taken under visible light, infrared photographs generally yielded better contrast. This allowed more details to be discerned and revealed findings that were not otherwise visible, such as imprints on a fabric and tattoos in mummified skin. The permanent modification of a digital camera by building in a 700 nm infrared transmission filter resulted in a user-friendly and efficient set-up suitable for daily forensic routine. The main advantages were a clear picture in the viewfinder, an autofocus usable over the whole range of infrared light, and the possibility of using short shutter speeds, which allows taking infrared pictures free-hand. The proposed set-up with a modified camera allows a user-friendly application of infrared photography in post-mortem settings. Copyright © 2017 Elsevier B.V. All rights reserved.
The Mast Cameras and Mars Descent Imager (MARDI) for the 2009 Mars Science Laboratory
NASA Technical Reports Server (NTRS)
Malin, M. C.; Bell, J. F.; Cameron, J.; Dietrich, W. E.; Edgett, K. S.; Hallet, B.; Herkenhoff, K. E.; Lemmon, M. T.; Parker, T. J.; Sullivan, R. J.
2005-01-01
Based on operational experience gained during the Mars Exploration Rover (MER) mission, we proposed and were selected to conduct two related imaging experiments: (1) an investigation of the geology and short-term atmospheric vertical wind profile local to the Mars Science Laboratory (MSL) landing site using descent imaging, and (2) a broadly-based scientific investigation of the MSL locale employing visible and very near infra-red imaging techniques from a pair of mast-mounted, high resolution cameras. Both instruments share a common electronics design, a design also employed for the MSL Mars Hand Lens Imager (MAHLI) [1]. The primary differences between the cameras are in the nature and number of mechanisms and specific optics tailored to each camera's requirements.
Reflective all-sky thermal infrared cloud imager.
Redman, Brian J; Shaw, Joseph A; Nugent, Paul W; Clark, R Trevor; Piazzolla, Sabino
2018-04-30
A reflective all-sky imaging system has been built using a long-wave infrared microbolometer camera and a reflective metal sphere. This compact system was developed for measuring spatial and temporal patterns of clouds and their optical depth in support of applications including Earth-space optical communications. The camera is mounted to the side of the reflective sphere to leave the zenith sky unobstructed. The resulting geometric distortion is removed through an angular map derived from a combination of checkerboard-target imaging, geometric ray tracing, and sun-location-based alignment. A tape of high-emissivity material on the side of the reflector acts as a reference that is used to estimate and remove thermal emission from the metal sphere. Once a bias that is under continuing study was removed, sky radiance measurements from the all-sky imager in the 8-14 μm wavelength range agreed to within 0.91 W/(m² sr) of measurements from a previously calibrated, lens-based infrared cloud imager over its 110° field of view.
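A much-simplified version of the reflector-emission correction described above can be written as follows; the formula, the reflectivity value and the example numbers are assumptions for illustration, not the instrument's actual calibration.

```python
# Minimal sketch of the reflector-emission correction idea (my own simplification,
# not the calibration actually used by the instrument in the abstract):
# the camera sees reflected sky radiance plus thermal emission from the metal sphere.
def corrected_sky_radiance(measured, sphere_radiance, reflectivity):
    """Remove the sphere's own emission and rescale by its reflectivity.
    measured        : radiance observed off the sphere [W/(m^2 sr)]
    sphere_radiance : blackbody-equivalent radiance of the sphere, estimated from
                      the high-emissivity reference tape [W/(m^2 sr)]
    reflectivity    : assumed reflectivity of the metal sphere (0..1)
    """
    emissivity = 1.0 - reflectivity
    return (measured - emissivity * sphere_radiance) / reflectivity

print(corrected_sky_radiance(measured=30.0, sphere_radiance=40.0, reflectivity=0.95))
```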
Focal plane arrays based on Type-II indium arsenide/gallium antimonide superlattices
NASA Astrophysics Data System (ADS)
Delaunay, Pierre-Yves
The goal of this work is to demonstrate that Type-II InAs/GaSb superlattices can perform high quality infrared imaging from the middle (MWIR) to the long (LWIR) wavelength infrared range. Theoretically, focal plane arrays (FPAs) based on this technology could be operated at higher temperatures, with lower dark currents, than the leading HgCdTe platform. This effort focuses on the fabrication of MWIR and LWIR FPAs with performance similar to existing infrared cameras. Some applications in the MWIR require fast, sensitive imagers able to sustain frame rates up to 100 Hz. Such speed can only be achieved with photon detectors. However, these cameras need to be operated below 170 K. Current research in this spectral band focuses on increasing the operating temperature of the FPA to a point where cooling could be performed with compact and reliable thermoelectric coolers. A Type-II superlattice was used to demonstrate a camera with performance similar to HgCdTe that could be operated up to room temperature. At 80 K, the camera could detect temperature differences as low as 10 mK for an integration time shorter than 25 ms. In the LWIR, the electrical performance of Type-II photodiodes is mainly limited by surface leakage. Aggressive processing steps such as hybridization and underfill can increase the dark current of the devices by several orders of magnitude. New cleaning and passivation techniques were used to reduce the dark current of FPA diodes by two orders of magnitude. The absorbing GaSb substrate was also removed to increase the quantum efficiency of the devices up to 90%. At 80 K, an FPA with a 9.6 μm 50%-cutoff in responsivity was able to detect temperature differences as low as 19 mK, limited only by the performance of the testing system. The non-uniformity in responsivity reached 3.8% for a 98.2% operability. The third generation of infrared cameras is based on multi-band imaging in order to improve the recognition capabilities of the imager. Preliminary detectors based on back-to-back diodes presented performance similar to single-color devices; the quantum efficiency was measured to be higher than 40% for both bands. Preliminary imaging results were demonstrated in the LWIR.
TIRCAM2: The TIFR near infrared imaging camera
NASA Astrophysics Data System (ADS)
Naik, M. B.; Ojha, D. K.; Ghosh, S. K.; Poojary, S. S.; Jadhav, R. B.; Meshram, G. S.; Sandimani, P. R.; Bhagat, S. B.; D'Costa, S. L. A.; Gharat, S. M.; Bakalkar, C. B.; Ninan, J. P.; Joshi, J. S.
2012-12-01
TIRCAM2 (TIFR near infrared imaging camera - II) is a closed cycle cooled imager that has been developed by the Infrared Astronomy Group at the Tata Institute of Fundamental Research for observations in the near infrared band of 1 to 3.7 μm with existing Indian telescopes. In this paper, we describe some of the technical details of TIRCAM2 and report its observing capabilities, measured performance and limiting magnitudes with the 2-m IUCAA Girawali telescope and the 1.2-m PRL Gurushikhar telescope. The main highlight is the camera's capability of observing in the nbL (3.59 μm) band, enabling our primary motivation of mapping of Polycyclic Aromatic Hydrocarbon (PAH) emission at 3.3 μm.
Low-cost uncooled VOx infrared camera development
NASA Astrophysics Data System (ADS)
Li, Chuan; Han, C. J.; Skidmore, George D.; Cook, Grady; Kubala, Kenny; Bates, Robert; Temple, Dorota; Lannon, John; Hilton, Allan; Glukh, Konstantin; Hardy, Busbee
2013-06-01
The DRS Tamarisk® 320 camera, introduced in 2011, is a low cost commercial camera based on the 17 µm pixel pitch 320×240 VOx microbolometer technology. A higher resolution 17 µm pixel pitch 640×480 Tamarisk®640 has also been developed and is now in production serving the commercial markets. Recently, under the DARPA sponsored Low Cost Thermal Imager-Manufacturing (LCTI-M) program and internal project, DRS is leading a team of industrial experts from FiveFocal, RTI International and MEMSCAP to develop a small form factor uncooled infrared camera for the military and commercial markets. The objective of the DARPA LCTI-M program is to develop a low SWaP camera (<3.5 cm3 in volume and <500 mW in power consumption) that costs less than US $500 based on a 10,000 units per month production rate. To meet this challenge, DRS is developing several innovative technologies including a small pixel pitch 640×512 VOx uncooled detector, an advanced digital ROIC and low power miniature camera electronics. In addition, DRS and its partners are developing innovative manufacturing processes to reduce production cycle time and costs including wafer scale optic and vacuum packaging manufacturing and a 3-dimensional integrated camera assembly. This paper provides an overview of the DRS Tamarisk® project and LCTI-M related uncooled technology development activities. Highlights of recent progress and challenges will also be discussed. It should be noted that BAE Systems and Raytheon Vision Systems are also participants of the DARPA LCTI-M program.
Estimation of wetland evapotranspiration in northern New York using infrared thermometry
NASA Astrophysics Data System (ADS)
Hwang, K.; Chandler, D. G.
2016-12-01
Evapotranspiration (ET) is an important component of the water budget and is often regarded as a major water loss. In freshwater wetlands, cumulative annual ET can equal precipitation under well-watered conditions. Wetland ET is therefore an important control on contaminant and nutrient transport. Yet quantification of wetland ET is challenged by complex surface characteristics, diverse plant species and density, and variations in wetland shape and size. As handheld infrared (IR) cameras have become available, studies exploiting the new technology have increased, especially in agriculture and hydrology. The benefits of IR cameras include (1) high spatial resolution, (2) high sample rates, (3) real-time imaging, (4) a constant viewing geometry, and (5) no need for atmosphere and cloud corrections. Compared with traditional methods, infrared thermometry is capable of monitoring at the scale of a small pond or localized plant community. This enables finer-scale surveys of heterogeneous land surfaces rather than strict dependence on atmospheric variables. Despite this potential, there have been a limited number of studies of ET and drought stress with IR cameras. In this study, an infrared thermometry-based method was applied to estimate ET over wetland plant species in the St. Lawrence River Valley, NY. The results are evaluated against traditional methods to test applicability over multiple vegetation species in the same area.
Yang, Xiaofeng; Wu, Wei; Wang, Guoan
2015-04-01
This paper presents a surgical optical navigation system with non-invasive, real-time positioning characteristics for open surgical procedures. The design was based on the principle of near-infrared fluorescence molecular imaging. In vivo fluorescence excitation technology, multi-channel spectral camera technology and image fusion software technology were used. A visible and near-infrared ring LED excitation source, multi-channel band-pass filters, spectral camera 2-CCD optical sensor technology and computer systems were integrated, and, as a result, a new surgical optical navigation system was successfully developed. When the near-infrared fluorescent agent was injected, the system could simultaneously display anatomical images of the tissue surface and near-infrared fluorescent functional images of the surgical field. The system can identify the lymphatic vessels, lymph nodes and tumor edges which the doctor cannot find with the naked eye intra-operatively. Our research will effectively guide the surgeon in removing the tumor tissue and significantly improve the success rate of surgery. The technologies have obtained a national patent, with patent No. ZI. 2011 1 0292374. 1.
NASA Astrophysics Data System (ADS)
Sumriddetchkajorn, Sarun; Chaitavon, Kosom
2009-07-01
This paper introduces a parallel measurement approach for fast infrared-based human temperature screening suitable for use in a large public area. Our key idea is based on the combination of simple image processing algorithms, infrared technology, and human flow management. With this multidisciplinary concept, we arrange as many people as possible in a two-dimensional space in front of a thermal imaging camera and then highlight all human facial areas through simple image filtering, image morphology, and particle analysis processes. In this way, each individual's face in the live thermal image can be located and the maximum facial skin temperature can be monitored and displayed. Our experiment shows a measured 1 ms processing time for highlighting all human face areas. With a thermal imaging camera having an FOV of 24° × 18° and 320 × 240 active pixels, the maximum facial skin temperatures from three people's faces located 1.3 m from the camera can also be simultaneously monitored and displayed at a measured rate of 31 fps, limited by the looping process in determining the coordinates of all faces. In our 3-day test under an ambient temperature of 24-30 °C, 57-72% relative humidity, and weak wind from outside the hospital building, hyperthermic patients were identified with 100% sensitivity and 36.4% specificity when the temperature threshold level and the offset temperature value were appropriately chosen. Locating our system away from building doors, air conditioners and electric fans, in order to eliminate wind blowing toward the camera lens, can significantly help improve the system's specificity.
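A bare-bones version of the face-highlighting step (threshold, label connected blobs, report each blob's maximum temperature) might look like the following; the skin-temperature threshold and minimum blob size are illustrative values, not parameters from the paper.

```python
import numpy as np
from scipy import ndimage

def facial_max_temperatures(thermal_frame, skin_threshold_c=33.0, min_pixels=50):
    """Very simplified version of the face-highlighting idea in the abstract:
    threshold the thermal image at a skin-like temperature, label connected
    blobs, and return the maximum temperature of each sufficiently large blob."""
    frame = np.asarray(thermal_frame, dtype=float)
    mask = frame > skin_threshold_c          # candidate skin pixels
    labels, n_blobs = ndimage.label(mask)    # connected-component labeling
    maxima = []
    for i in range(1, n_blobs + 1):
        blob = labels == i
        if blob.sum() >= min_pixels:         # discard tiny warm spots
            maxima.append(float(frame[blob].max()))
    return maxima
```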
Low-cost thermo-electric infrared FPAs and their automotive applications
NASA Astrophysics Data System (ADS)
Hirota, Masaki; Ohta, Yoshimi; Fukuyama, Yasuhiro
2008-04-01
This paper describes three low-cost infrared focal plane arrays (FPAs) having 1,536, 2,304, and 10,800 elements, and experimental vehicle systems. They have low-cost potential because each element consists of p-n polysilicon thermocouples, which allows the use of low-cost ultra-fine microfabrication technology commonly employed in conventional semiconductor manufacturing processes. To increase the responsivity of the FPA, we have developed a precisely patterned Au-black absorber that has high infrared absorptivity of more than 90%. The FPA having 2,304 elements achieved a high responsivity of 4,300 V/W. In order to reduce package cost, we developed a vacuum-sealed package integrated with a molded ZnS lens. The camera aimed at temperature measurement of a passenger cabin is a compact, lightweight device that measures 45 x 45 x 30 mm and weighs 190 g. The camera achieves a noise equivalent temperature difference (NETD) of less than 0.7°C from 0 to 40°C. In this paper, we also present several experimental systems that use infrared cameras. One experimental system is a blind-spot pedestrian warning system that employs four infrared cameras. It can detect the infrared radiation emitted from a human body and alerts the driver when a pedestrian is in a blind spot. The system can also prevent the vehicle from moving in the direction of the pedestrian. Another system uses a visible-light camera and infrared sensors to detect the presence of a pedestrian in a rear blind spot and alerts the driver. The third system is a new type of human-machine interface system that enables the driver to control the car's audio system without letting go of the steering wheel. Uncooled infrared cameras are still costly, which limits their automotive use to high-end luxury cars at present. To promote widespread use of IR imaging sensors on vehicles, we need to reduce their cost further.
NASA Astrophysics Data System (ADS)
Ghionis, George; Trygonis, Vassilis; Karydis, Antonis; Vousdoukas, Michalis; Alexandrakis, George; Drakopoulos, Panos; Amdreadis, Olympos; Psarros, Fotis; Velegrakis, Antonis; Poulos, Serafim
2016-04-01
Effective beach management requires environmental assessments that are based on sound science, are cost-effective and are available to beach users and managers in an accessible, timely and transparent manner. The most common problems are: 1) The available field data are scarce and of sub-optimal spatio-temporal resolution and coverage, 2) our understanding of local beach processes needs to be improved in order to accurately model/forecast beach dynamics under a changing climate, and 3) the information provided by coastal scientists/engineers in the form of data, models and scientific interpretation is often too complicated to be of direct use by coastal managers/decision makers. A multispectral video system has been developed, consisting of one or more video cameras operating in the visible part of the spectrum, a passive near-infrared (NIR) camera, an active NIR camera system, a thermal infrared camera and a spherical video camera, coupled with innovative image processing algorithms and a telemetric system for the monitoring of coastal environmental parameters. The complete system has the capability to record, process and communicate (in quasi-real time) high frequency information on shoreline position, wave breaking zones, wave run-up, erosion hot spots along the shoreline, nearshore wave height, turbidity, underwater visibility, wind speed and direction, air and sea temperature, solar radiation, UV radiation, relative humidity, barometric pressure and rainfall. An innovative, remotely-controlled interactive visual monitoring system, based on the spherical video camera (with 360°field of view), combines the video streams from all cameras and can be used by beach managers to monitor (in real time) beach user numbers, flow activities and safety at beaches of high touristic value. The high resolution near infrared cameras permit 24-hour monitoring of beach processes, while the thermal camera provides information on beach sediment temperature and moisture, can detect upwelling in the nearshore zone, and enhances the safety of beach users. All data can be presented in real- or quasi-real time and are stored for future analysis and training/validation of coastal processes models. Acknowledgements: This work was supported by the project BEACHTOUR (11SYN-8-1466) of the Operational Program "Cooperation 2011, Competitiveness and Entrepreneurship", co-funded by the European Regional Development Fund and the Greek Ministry of Education and Religious Affairs.
Observation of runaway electrons by infrared camera in J-TEXT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tong, R. H.; Chen, Z. Y., E-mail: zychen@hust.edu.cn; Zhang, M.
2016-11-15
When the energy of confined runaway electrons approaches several tens of MeV, the runaway electrons can emit synchrotron radiation in the range of infrared wavelengths. An infrared camera working in the wavelength range of 3-5 μm has been developed to study runaway electrons in the Joint Texas Experimental Tokamak (J-TEXT). The camera is located in the equatorial plane looking tangentially into the direction of electron approach. The runaway electron beam inside the plasma has been observed at the flattop phase. With fast acquisition by the camera, the behavior of the runaway electron beam has been observed directly during the runaway current plateau following massive gas injection triggered disruptions.
Sniper detection using infrared camera: technical possibilities and limitations
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.
2010-04-01
The paper discusses technical possibilities to build an effective system for sniper detection using infrared cameras. Descriptions of phenomena which make it possible to detect sniper activities in infrared spectra as well as analysis of physical limitations were performed. Cooled and uncooled detectors were considered. Three phases of sniper activities were taken into consideration: before, during and after the shot. On the basis of experimental data the parameters defining the target were determined which are essential in assessing the capability of infrared camera to detect sniper activity. A sniper body and muzzle flash were analyzed as targets. The simulation of detection ranges was done for the assumed scenario of sniper detection task. The infrared sniper detection system was discussed, capable of fulfilling the requirements. The discussion of the results of analysis and simulations was finally presented.
In vitro near-infrared imaging of occlusal dental caries using a germanium-enhanced CMOS camera
NASA Astrophysics Data System (ADS)
Lee, Chulsung; Darling, Cynthia L.; Fried, Daniel
2010-02-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
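The lesion contrast referred to above is commonly defined from the transillumination intensities of sound and carious enamel; the (I_sound − I_lesion)/I_sound form used in this sketch is an assumption, since the abstract does not state the exact definition.

```python
def nir_lesion_contrast(sound_intensity, lesion_intensity):
    """Contrast between sound and carious enamel in a NIR transillumination image,
    using the common (I_sound - I_lesion) / I_sound form; the exact definition used
    in the study is not given in the abstract, so treat this as an assumption."""
    return (sound_intensity - lesion_intensity) / sound_intensity

print(nir_lesion_contrast(1000.0, 400.0))  # -> 0.6
```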
Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda
2018-01-01
Due to the thermal hazard during composite restorations, this study was designed to monitor the pulp temperature with a thermocouple and an infrared camera during photopolymerization of different composites. A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1 mm increments of each composite were inserted (four composite types were used) and photopolymerized employing either LED or QTH systems for 60 s while the temperature was recorded at 10 s intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. Thereafter, the data were analyzed by repeated measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α = 0.05). The pulp temperature increased significantly during photopolymerization (P = 0.000), while there was no significant difference between the results recorded by the thermocouple and the infrared camera (P > 0.05). Moreover, different composite materials and LCUs led to similar outcomes (P > 0.05). Although various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record comparable measurements of dental pulp temperature.
Feng, Lei; Fang, Hui; Zhou, Wei-Jun; Huang, Min; He, Yong
2006-09-01
Site-specific variable nitrogen application is one of the major precision crop production management operations. Obtaining sufficient crop nitrogen stress information is essential for achieving effective site-specific nitrogen applications. The present paper describes the development of a multi-spectral nitrogen deficiency sensor, which uses three channels (green, red, near-infrared) of crop images to determine the nitrogen level of canola. This sensor assesses nitrogen stress by estimating the SPAD value of the canola from canopy reflectance sensed in the three channels (green, red, near-infrared) of the multi-spectral camera. The core of this investigation is the calibration method relating the multi-spectral measurements to the crop nitrogen levels measured with a SPAD 502 chlorophyll meter. Based on the results obtained in this study, it can be concluded that a multi-spectral CCD camera can provide sufficient information to perform reasonable SPAD value estimation during field operations.
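One plausible calibration of the kind described above is an ordinary least-squares fit of SPAD readings against the green, red and near-infrared reflectances; the linear model form below is an assumption, not necessarily the calibration used in the paper.

```python
import numpy as np

def fit_spad_model(reflectance_grn, spad_values):
    """Fit a simple linear model SPAD ~ b0 + b1*G + b2*R + b3*NIR by least squares.
    reflectance_grn: N x 3 array of (green, red, NIR) canopy reflectances.
    spad_values:     N SPAD 502 readings used as the calibration reference."""
    X = np.column_stack([np.ones(len(spad_values)), np.asarray(reflectance_grn)])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(spad_values), rcond=None)
    return coeffs

def predict_spad(coeffs, green, red, nir):
    """Apply the fitted coefficients to one new reflectance measurement."""
    return coeffs @ np.array([1.0, green, red, nir])
```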
Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio; Rispoli, Attilio
2010-01-01
This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements under static conditions taken by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms). PMID:22315559
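The alignment problem described above, estimating the rotation between camera-frame line-of-sight vectors and the corresponding body-frame vectors, can be solved with a standard SVD-based (Kabsch/Wahba) fit, sketched below; the paper's own estimator may differ.

```python
import numpy as np

def estimate_camera_attitude(body_vectors, camera_vectors):
    """Estimate the rotation matrix mapping camera-frame unit vectors to the
    aircraft body frame from matched line-of-sight observations
    (a standard SVD/Kabsch solution, not necessarily the paper's method).
    body_vectors, camera_vectors: N x 3 arrays of matched unit vectors."""
    B = np.asarray(body_vectors).T @ np.asarray(camera_vectors)   # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(B)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])       # enforce det(R) = +1
    return U @ D @ Vt
```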
NASA Astrophysics Data System (ADS)
Laychak, M. B.
2008-06-01
In addition to the optical camera Megacam, the Canada-France-Hawaii Telescope operates a large field infrared camera, Wircam, and a spectrograph/spectropolarimeter, Espadons. When these instruments were commissioned, the challenge arose to create educational outreach programmes incorporating the concepts of infrared astronomy and spectroscopy. We integrated spectroscopy into discussions of extrasolar planets and the search for life, two topics routinely requested by teachers for classroom talks. Making the infrared accessible to students provided a unique challenge, one that we met through the implementation and use of webcams modified for infrared use.
ERIC Educational Resources Information Center
Jeppsson, Fredrik; Frejd, Johanna; Lundmark, Frida
2017-01-01
This study focuses on investigating how students make use of their bodily experiences in combination with infrared (IR) cameras, as a way to make meaning in learning about heat, temperature, and friction. A class of 20 primary students (age 7-8 years), divided into three groups, took part in three IR camera laboratory experiments. The qualitative…
Of Detection Limits and Effective Mitigation: The Use of Infrared Cameras for Methane Leak Detection
NASA Astrophysics Data System (ADS)
Ravikumar, A. P.; Wang, J.; McGuire, M.; Bell, C.; Brandt, A. R.
2017-12-01
Mitigating methane emissions, a short-lived and potent greenhouse gas, is critical to limiting global temperature rise to two degrees Celsius as outlined in the Paris Agreement. A major source of anthropogenic methane emissions in the United States is the oil and gas sector. To this end, state and federal governments have recommended the use of optical gas imaging systems in periodic leak detection and repair (LDAR) surveys to detect fugitive emissions or leaks. The most commonly used optical gas imaging (OGI) systems are infrared cameras. In this work, we systematically evaluate the limits of infrared (IR) camera based OGI systems for use in methane leak detection programs. We analyze the effect of various parameters that influence the minimum detectable leak rates of infrared cameras. Blind leak detection tests were carried out at the Department of Energy's MONITOR natural gas test facility in Fort Collins, CO. Leak sources included natural gas wellheads, separators, and tanks. With an EPA-mandated 60 g/hr leak detection threshold for IR cameras, we test leak rates ranging from 4 g/hr to over 350 g/hr at imaging distances between 5 ft and 70 ft from the leak source. We perform these experiments over the course of a week, encompassing a wide range of wind and weather conditions. Using repeated measurements at a given leak rate and imaging distance, we generate detection probability curves as a function of leak size for various imaging distances and measurement conditions. In addition, we estimate the median detection threshold, the leak size at which the probability of detection is 50%, under various scenarios to reduce uncertainty in mitigation effectiveness. Preliminary analysis shows that the median detection threshold varies from 3 g/hr at an imaging distance of 5 ft to over 150 g/hr at 50 ft (ambient temperature: 80 F, winds < 4 m/s). Results from this study can be directly used to improve OGI-based LDAR protocols and reduce uncertainty in estimated mitigation effectiveness. Furthermore, detection limits determined in this study can be used as standards against which to compare new detection technologies.
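The detection-probability curves and median detection threshold mentioned above can be summarized with a simple parametric model; the logistic-in-log-leak-rate form and the slope value below are modelling assumptions, not results from the study.

```python
import numpy as np

def detection_probability(leak_rate_g_per_hr, median_threshold, slope=2.0):
    """Logistic detection-probability curve in log leak rate: probability is 0.5
    when the leak rate equals the median detection threshold. The logistic form
    and the slope value are assumptions used only for illustration."""
    x = np.log(np.asarray(leak_rate_g_per_hr) / median_threshold)
    return 1.0 / (1.0 + np.exp(-slope * x))

# Probability of detecting a 60 g/hr leak if the median threshold at a given
# imaging distance were 150 g/hr:
print(detection_probability(60.0, median_threshold=150.0))
```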
Quantifying seasonal variation of leaf area index using near-infrared digital camera in a rice paddy
NASA Astrophysics Data System (ADS)
Hwang, Y.; Ryu, Y.; Kim, J.
2017-12-01
Digital cameras have been widely used to quantify leaf area index (LAI). Numerous simple and automatic methods have been proposed to improve digital camera based LAI estimates. However, most studies in rice paddies relied on arbitrary thresholds or complex radiative transfer models to make binary images. Moreover, only a few studies reported continuous, automatic observation of LAI over the season in a rice paddy. The objective of this study is to quantify seasonal variations of LAI using raw near-infrared (NIR) images coupled with a histogram shape-based algorithm in a rice paddy. As vegetation highly reflects NIR light, we installed a NIR digital camera 1.8 m above the ground surface and acquired unsaturated raw format images at one-hour intervals between 15° and 80° solar zenith angle over the entire growing season in 2016 (from May to September). We applied a sub-pixel classification combined with a light scattering correction method. Finally, to confirm the accuracy of the quantified LAI, we also conducted direct (destructive sampling) and indirect (LAI-2200) manual observations of LAI once per ten days on average. Preliminary results show that the NIR-derived LAI agreed well with in-situ observations, but divergence tended to appear once the rice canopy was fully developed. Continuous monitoring of LAI in rice paddies will help to better understand carbon and water fluxes and to evaluate satellite-based LAI products.
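As an illustration of a histogram shape-based algorithm of the kind mentioned above, the sketch below uses Otsu's threshold to separate canopy from background in a NIR image and converts the gap fraction to LAI with a Beer-Lambert relation; both the threshold choice and the extinction coefficient are stand-in assumptions, not the paper's method.

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """One common histogram-shape-based threshold (Otsu): pick the gray level
    that maximizes the between-class variance of the image histogram."""
    hist, edges = np.histogram(np.asarray(image).ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                # cumulative class weight
    m0 = np.cumsum(p * centers)      # cumulative class mean
    mt = m0[-1]                      # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mt * w0 - m0) ** 2 / (w0 * (1.0 - w0))
    return centers[np.nanargmax(sigma_b)]

def lai_from_gap_fraction(nir_image, extinction_coeff=0.5):
    """Beer-Lambert style LAI estimate from the canopy gap fraction.
    The extinction coefficient is an assumed, illustrative value."""
    canopy_mask = np.asarray(nir_image) > otsu_threshold(nir_image)
    gap_fraction = max(1.0 - canopy_mask.mean(), 1e-6)  # avoid log(0)
    return -np.log(gap_fraction) / extinction_coeff
```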
Broadband image sensor array based on graphene-CMOS integration
NASA Astrophysics Data System (ADS)
Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank
2017-06-01
Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty of combining semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.
A new spherical scanning system for infrared reflectography of paintings
NASA Astrophysics Data System (ADS)
Gargano, M.; Cavaliere, F.; Viganò, D.; Galli, A.; Ludwig, N.
2017-03-01
Infrared reflectography is an imaging technique used to visualize the underdrawings of ancient paintings; it relies on the fact that most pigment layers are quite transparent to infrared radiation in the spectral band between 0.8 μm and 2.5 μm. InGaAs sensor cameras are nowadays the devices most commonly used to visualize underdrawings, but because of the small size of the detectors these cameras are usually mounted on scanning systems to record high-resolution reflectograms. This work describes a portable scanning-system prototype based on a spherical scanning approach built around a lightweight, low-cost motorized head. The motorized head was built to allow the refocusing adjustment needed to compensate for the variable camera-painting distance during the rotation of the camera. The prototype was tested first in the laboratory and then in situ on the Giotto panel "God the Father with Angels" at a resolution of 256 pixels per inch. The system performance is comparable with that of other reflectographic devices, with the advantage of extending the scanned area up to 1 m × 1 m with a 40 min scanning time. The present configuration can easily be modified to increase the resolution up to 560 pixels per inch or to extend the scanned area up to 2 m × 2 m.
IR observations in gamma-ray blazars
NASA Technical Reports Server (NTRS)
Mahoney, W. A.; Gautier, T. N.; Ressler, M. E.; Wallyn, P.; Durouchoux, P.; Higdon, J. C.
1997-01-01
The infrared photometric and spectral observation of five gamma ray blazars in coordination with the energetic gamma ray experiment telescope (EGRET) onboard the Compton Gamma Ray Observatory is reported. The infrared measurements were made with a Cassegrain infrared camera and the mid-infrared large well imager at the Mt. Palomar 5 m telescope. The emphasis is on the three blazars observed simultaneously by EGRET and the ground-based telescope during viewing period 519. In addition to the acquisition of broadband spectral measurements for direct correlation with the 100 MeV EGRET observations, near infrared images were obtained, enabling a search for intra-day variability to be carried out.
C-RED One and C-RED2: SWIR high-performance cameras using Saphira e-APD and Snake InGaAs detectors
NASA Astrophysics Data System (ADS)
Gach, Jean-Luc; Feautrier, Philippe; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Carmignani, Thomas; Wanwanscappel, Yann; Boutolleau, David
2018-02-01
After the development of the OCAM2 EMCCD fast visible camera dedicated to advanced adaptive optics wavefront sensing, First Light Imaging moved to the SWIR fast cameras with the development of the C-RED One and the C-RED 2 cameras. First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with a subelectron readout noise and very low background. C-RED One is based on the last version of the SAPHIRA detector developed by Leonardo UK. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array which is a real disruptive technology in imagery. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of read out techniques and processes video on-board thanks to an FPGA. We will show its performances and expose its main features. In addition to this project, First Light Imaging developed an InGaAs 640x512 fast camera with unprecedented performances in terms of noise, dark and readout speed based on the SNAKE SWIR detector from Sofradir. The camera was called C-RED 2. The C-RED 2 characteristics and performances will be described. The C-RED One project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944. The C-RED 2 development is supported by the "Investments for the future" program and the Provence Alpes Côte d'Azur Region, in the frame of the CPER.
Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA
NASA Astrophysics Data System (ADS)
Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki
2017-11-01
SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid-and far-infrared astronomy with a cryogenically cooled 3-m class telescope, envisioned for launch in early 2020s. Mid-infrared Camera and Spectrometer (MCS) is a focal plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38μm. MCS consists of two relay optical modules and following four scientific optical modules of WFC (Wide Field Camera; 5'x 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2'.5 long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000) and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here, we present optical design and expected optical performance of MCS. Most parts of MCS optics adopt off-axis reflective system for covering the wide wavelength range of 5-38μm without chromatic aberration and minimizing problems due to changes in shapes and refractive indices of materials from room temperature to cryogenic temperature. In order to achieve the high specification requirements of wide field of view, small F-number and large spectral resolving power with compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]) which is a design method using free-form surfaces for compact reflective optics such as head mount displays. As a result, we have successfully designed compact reflective optics for MCS with as-built performance of diffraction-limited image resolution.
High resolution infrared acquisitions droning over the LUSI mud eruption.
NASA Astrophysics Data System (ADS)
Di Felice, Fabio; Romeo, Giovanni; Di Stefano, Giuseppe; Mazzini, Adriano
2016-04-01
The use of low-cost hand-held infrared (IR) thermal cameras based on uncooled micro-bolometer detector arrays has become more widespread in recent years. Thermal cameras can estimate temperature values without contact and can therefore be used in circumstances where objects are difficult or dangerous to reach, such as volcanic eruptions. Since May 2006 the Indonesian LUSI mud eruption has continued to spew boiling mud, water, aqueous vapor, CO2 and CH4, and covers a surface of nearly 7 km2. At this locality we performed surveys over the unreachable erupting crater. In the framework of the LUSI Lab project (ERC grant n° 308126), in 2014 and 2015, we acquired high-resolution infrared images using a specifically equipped remote-controlled drone flying at an altitude of 100 m. The drone is equipped with GPS and an autopilot system that allows pre-programming of the flying path or the design of grids. The mounted thermal camera has peak spectral sensitivity at long-wave infrared wavelengths (around 10 μm), which are characterized by low water vapor and CO2 absorption. The short-range (high-resolution) acquisitions provide a temperature reading every 40 cm, so it is possible to detect and observe physical phenomena such as thermodynamic behavior and the locations of hot mud and fluid emissions and their changes over time. Despite the harsh logistics and the continuously varying gas concentrations we managed to collect thermal images to estimate the spatial thermal variations of the crater zone. We applied atmospheric corrections to account for infrared absorption by the high concentration of water vapor. Thousands of images have been stitched together to obtain a mosaic of the crater zone. Regular monitoring with heat variation measurements collected, e.g. every six months, could give important information about the volcano's activity and its evolution. A future database of high-resolution infrared and visible images stored on a web server could be a useful monitoring tool. An interesting development will be to use a multi-spectral thermal camera to perform complete near-range remote sensing to detect not only temperature but also gases that are sensitive to particular wavelengths.
Analysis of Infrared Signature Variation and Robust Filter-Based Supersonic Target Detection
Sun, Sun-Gu; Kim, Kyung-Tae
2014-01-01
The difficulty of small infrared target detection originates from the variations of infrared signatures. This paper presents the fundamental physics of infrared target variations and reports the results of a variation analysis of infrared images acquired using a long wave infrared camera over a 24-hour period for different types of backgrounds. Detection parameters such as the signal-to-clutter ratio were compared according to recording time, temperature and humidity. Through the variation analysis, robust target detection methodologies are derived by controlling thresholds and designing a temporal contrast filter to achieve a high detection rate and a low false alarm rate. Experimental results validate the robustness of the proposed scheme by applying it to synthetic and real infrared sequences. PMID:24672290
Prototype of microbolometer thermal infrared camera for forest fire detection from space
NASA Astrophysics Data System (ADS)
Guerin, Francois; Dantes, Didier; Bouzou, Nathalie; Chorier, Philippe; Bouchardy, Anne-Marie; Rollin, Joël.
2017-11-01
The thermal infrared (TIR) camera contributes to the Earth-observation FUEGO mission by discriminating clouds and smoke, rejecting false alarms of forest fires, and monitoring forest fires. Consequently, the camera needs a large dynamic range of detectable radiances. A small volume, low mass and low power are required by the small FUEGO payload. These specifications can be attractive for other similar missions.
Report Of The HST Strategy Panel: A Strategy For Recovery
1991-01-01
orbit change out: the Wide Field/Planetary Camera II (WFPC II), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Space ...are the Space Telescope Imaging Spectrograph (STIS), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS), and the second Wide Field and...expected to fail to lock due to duplicity was 20%; on-orbit data indicates that 10% may be a better estimate, but the guide stars were preselected
Galaxies Gather at Great Distances
NASA Technical Reports Server (NTRS)
2006-01-01
[Figures removed for brevity, see original site: Distant Galaxy Cluster Infrared Survey poster; bird's eye view mosaic and mosaic with clusters; close-up panels at 9.1, 8.7 and 8.6 billion light-years.] Astronomers have discovered nearly 300 galaxy clusters and groups, including almost 100 located 8 to 10 billion light-years away, using the space-based Spitzer Space Telescope and the ground-based Mayall 4-meter telescope at Kitt Peak National Observatory in Tucson, Ariz. The new sample represents a six-fold increase in the number of known galaxy clusters and groups at such extreme distances, and will allow astronomers to systematically study massive galaxies two-thirds of the way back to the Big Bang. A mosaic portraying a bird's eye view of the field in which the distant clusters were found is shown at upper left. It spans a region of sky 40 times larger than that covered by the full moon as seen from Earth. Thousands of individual images from Spitzer's infrared array camera instrument were stitched together to create this mosaic. The distant clusters are marked with orange dots. Close-up images of three of the distant galaxy clusters are shown in the adjoining panels. The clusters appear as a concentration of red dots near the center of each image. These images reveal the galaxies as they were over 8 billion years ago, since that's how long their light took to reach Earth and Spitzer's infrared eyes. These pictures are false-color composites, combining ground-based optical images captured by the Mosaic-I camera on the Mayall 4-meter telescope at Kitt Peak, with infrared pictures taken by Spitzer's infrared array camera. Blue and green represent visible light at wavelengths of 0.4 microns and 0.8 microns, respectively, while red indicates infrared light at 4.5 microns. Kitt Peak National Observatory is part of the National Optical Astronomy Observatory in Tucson, Ariz.
Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda
2018-01-01
Introduction: Due to the thermal hazard during composite restorations, this study was designed to record the pulp temperature with a thermocouple and an infrared camera during photopolymerization of different composites. Methods and Materials: A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1 mm increments of each composite were inserted (four composite types were incorporated) and photopolymerized using either LED or QTH systems for 60 sec while the temperature was recorded at 10 sec intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. Thereafter, the data were analyzed by repeated measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α=0.05). Results: The pulp temperature increased significantly (repeated measures) during photopolymerization (P=0.000), while there was no significant difference between the results recorded by the thermocouple and the infrared camera (P>0.05). Moreover, different composite materials and LCUs led to similar outcomes (P>0.05). Conclusion: Although the various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record parallel measurements of dental pulp temperature. PMID:29707014
Reflective all-sky thermal infrared cloud imager
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redman, Brian J.; Shaw, Joseph A.; Nugent, Paul W.
A reflective all-sky imaging system has been built using a long-wave infrared microbolometer camera and a reflective metal sphere. This compact system was developed for measuring spatial and temporal patterns of clouds and their optical depth in support of applications including Earth-space optical communications. The camera is mounted to the side of the reflective sphere to leave the zenith sky unobstructed. The resulting geometric distortion is removed through an angular map derived from a combination of checkerboard-target imaging, geometric ray tracing, and sun-location-based alignment. A tape of high-emissivity material on the side of the reflector acts as a reference that is used to estimate and remove thermal emission from the metal sphere. In conclusion, once a bias that is under continuing study was removed, sky radiance measurements from the all-sky imager in the 8-14 μm wavelength range agreed to within 0.91 W/(m2 sr) of measurements from a previously calibrated, lens-based infrared cloud imager over its 110° field of view.
An adaptive enhancement algorithm for infrared video based on modified k-means clustering
NASA Astrophysics Data System (ADS)
Zhang, Linze; Wang, Jingqi; Wu, Wen
2016-09-01
In this paper, we propose a video enhancement algorithm to improve the output of an infrared camera. The video obtained by an infrared camera is sometimes very dark when there is no clear target; in this case the infrared video is split into frame images so that image enhancement can be carried out frame by frame. The first frame is divided into k sub-images by K-means clustering according to the gray-level interval each pixel occupies, using a modification that prevents the final cluster centers from lying too close to each other, and each sub-image is then histogram-equalized according to the amount of information it contains. For the subsequent frames, the initial cluster centers are determined by the final cluster centers of the previous frame, and the histogram equalization of each sub-image is again carried out after K-means-based image segmentation. The histogram equalization stretches the gray values of the image over the whole gray-level range, and the gray-level range of each sub-image is determined by the ratio of its pixels to those of the whole frame. Experimental results show that this algorithm can improve the contrast of infrared video in which a night target is not obvious and the scene is therefore dim, and can adaptively reduce, within a certain range, the negative effect of overexposed pixels.
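A compact Python sketch of the scheme described above, with simplifications that are our own: K-means on the gray levels splits each frame into k sub-images, each sub-image is histogram-equalized into an output gray range proportional to its pixel fraction, and the final cluster centers seed the clustering of the next frame. The per-cluster equalization and the handling of cluster centers are illustrative stand-ins rather than the authors' exact modified K-means.

```python
# Frame-by-frame enhancement: cluster gray levels, equalize each cluster into a
# gray range proportional to its pixel fraction, and pass centers to the next frame.
import numpy as np
from sklearn.cluster import KMeans
from skimage import exposure

def enhance_frame(frame, k=3, init_centers=None):
    gray = frame.reshape(-1, 1).astype(float)
    km = KMeans(n_clusters=k,
                init=init_centers if init_centers is not None else "k-means++",
                n_init=1 if init_centers is not None else 10).fit(gray)
    labels = km.labels_.reshape(frame.shape)

    order = np.argsort(km.cluster_centers_.ravel())          # keep intensity order
    fractions = np.array([(labels == c).mean() for c in order])
    bounds = np.concatenate(([0.0], np.cumsum(fractions)))   # output range per cluster

    out = np.zeros(frame.shape, dtype=float)
    for i, c in enumerate(order):
        mask = labels == c
        if mask.any():
            eq = exposure.equalize_hist(frame[mask])          # per-cluster equalization
            out[mask] = bounds[i] + eq * (bounds[i + 1] - bounds[i])
    return out, km.cluster_centers_[order]                    # centers seed the next frame

# Usage over a sequence of frames (e.g. a list of 2-D uint8 arrays):
# centers = None
# for frame in video_frames:
#     enhanced, centers = enhance_frame(frame, k=3, init_centers=centers)
```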
Low-cost camera modifications and methodologies for very-high-resolution digital images
USDA-ARS?s Scientific Manuscript database
Aerial color and color-infrared photography are usually acquired at high altitude so the ground resolution of the photographs is < 1 m. Moreover, current color-infrared cameras and manned aircraft flight time are expensive, so the objective is the development of alternative methods for obtaining ve...
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-08-31
Gaze tracking is the technology that identifies the region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground-truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground-truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers should be able to implement an optimal gaze tracking system more easily. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
LWIR NUC using an uncooled microbolometer camera
NASA Astrophysics Data System (ADS)
Laveigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian; McHugh, Steve
2010-04-01
Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector (IRSP). Ideally, NUC will be performed in the same band in which the scene projector will be used. Cooled, large-format MWIR cameras are readily available and have been successfully used to perform NUC; however, cooled large-format LWIR cameras are not as common and are prohibitively expensive. Large-format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an IRSP. Santa Barbara Infrared, Inc. reports progress on a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. Camera instability, temporal response and thermal resolution are the main difficulties. A discussion of processes developed to mitigate these issues follows.
Single Pixel Black Phosphorus Photodetector for Near-Infrared Imaging.
Miao, Jinshui; Song, Bo; Xu, Zhihao; Cai, Le; Zhang, Suoming; Dong, Lixin; Wang, Chuan
2018-01-01
Infrared imaging systems have a wide range of military and civil applications, and 2D nanomaterials have recently emerged as potential sensing materials that may outperform conventional ones such as HgCdTe, InGaAs, and InSb. As an example, 2D black phosphorus (BP) thin film has a thickness-dependent direct bandgap with low shot noise and noncryogenic operation for visible to mid-infrared photodetection. In this paper, the use of a single-pixel photodetector made with few-layer BP thin film for near-infrared imaging applications is demonstrated. The imaging is achieved by combining the photodetector with a digital micromirror device to encode and subsequently reconstruct the image based on a compressive sensing algorithm. Stationary images of a near-infrared laser spot (λ = 830 nm) with up to 64 × 64 pixels are captured using this single-pixel BP camera with 2000 measurements, which is only half of the total number of pixels. The imaging platform demonstrated in this work circumvents the grand challenge of scalable BP material growth for photodetector array fabrication and shows the efficacy of utilizing the outstanding performance of the BP photodetector for future high-speed infrared camera applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
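For illustration only, a toy Python sketch of the single-pixel compressive imaging idea: random binary DMD patterns form the sensing matrix, the detector records one value per pattern, and the scene is recovered under a DCT-sparsity assumption with orthogonal matching pursuit. The image size, number of patterns, sparsity level and solver are our assumptions, not the reconstruction pipeline used in the paper.

```python
# Toy reconstruction: random binary DMD patterns sense a 16x16 scene, and the
# image is recovered assuming sparsity in a 2-D DCT basis.
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
side = 16
n_pixels = side * side
n_patterns = n_pixels // 2                 # half as many measurements as pixels

scene = np.zeros((side, side))             # synthetic NIR laser spot
scene[6:10, 6:10] = 1.0
x_true = scene.ravel()

Phi = rng.integers(0, 2, size=(n_patterns, n_pixels)).astype(float)  # DMD masks
y = Phi @ x_true                           # one single-pixel reading per mask

D = idct(np.eye(side), axis=0, norm="ortho")   # 1-D inverse DCT basis
Psi = np.kron(D, D)                            # 2-D inverse DCT for raveled images

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=40).fit(Phi @ Psi, y)
x_rec = (Psi @ omp.coef_).reshape(side, side)  # reconstructed 16x16 image
```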
Portable, stand-off spectral imaging camera for detection of effluents and residues
NASA Astrophysics Data System (ADS)
Goldstein, Neil; St. Peter, Benjamin; Grot, Jonathan; Kogan, Michael; Fox, Marsha; Vujkovic-Cvijin, Pajo; Penny, Ryan; Cline, Jason
2015-06-01
A new, compact and portable spectral imaging camera, employing a MEMS-based encoded imaging approach, has been built and demonstrated for detection of hazardous contaminants including gaseous effluents and solid-liquid residues on surfaces. The camera is called the Thermal infrared Reconfigurable Analysis Camera for Effluents and Residues (TRACER). TRACER operates in the long wave infrared and has the potential to detect a wide variety of materials with characteristic spectral signatures in that region. The 30 lb camera is tripod mounted and battery powered. A touch screen control panel provides a simple user interface for most operations. The MEMS spatial light modulator is a Texas Instruments Digital Micromirror Array with custom electronics and firmware control. Simultaneous 1D-spatial and 1D-spectral dimensions are collected, with the second spatial dimension obtained by scanning the internal spectrometer slit. The sensor can be configured to collect data in several modes, including full hyperspectral imagery using Hadamard multiplexing, panchromatic thermal imagery, and chemical-specific contrast imagery, switched with simple user commands. Matched filters and other analog filters can be generated internally on-the-fly and applied in hardware, substantially reducing detection time and improving SNR over HSI software processing, while reducing storage requirements. Results of preliminary instrument evaluation and measurements of flame exhaust are presented.
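A brief sketch, under our own simplifying assumptions, of why Hadamard multiplexing helps: each mask sums many spectral channels onto the detector, and the spectrum is recovered by inverting the orthogonal Hadamard code, spreading detector noise across channels. Real DMD masks are 0/1, so the ±1 code below stands in for complementary mask pairs, and the absorption feature is synthetic.

```python
# Hadamard-multiplexed measurement of a synthetic LWIR spectrum and its decoding.
import numpy as np
from scipy.linalg import hadamard

n = 64                                        # number of spectral channels
H = hadamard(n).astype(float)                 # +/-1 Hadamard encoding patterns

wavenumber = np.linspace(800, 1250, n)        # cm^-1, long-wave infrared band
spectrum = np.exp(-((wavenumber - 1050.0) / 15.0) ** 2)   # synthetic feature

noise = np.random.default_rng(1).normal(scale=0.05, size=n)
y = H @ spectrum + noise                      # multiplexed detector readings

spectrum_hat = H.T @ y / n                    # decode, since H.T @ H = n * I
```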
The Utility of Using a Near-Infrared (NIR) Camera to Measure Beach Surface Moisture
NASA Astrophysics Data System (ADS)
Nelson, S.; Schmutz, P. P.
2017-12-01
Surface moisture content is an important factor that must be considered when studying aeolian sediment transport in a beach environment. A few different instruments and procedures are available for measuring surface moisture content (e.g. moisture probes, LiDAR, and gravimetric moisture data from surface scrapings); however, these methods can be inaccurate, costly, and difficult to apply, particularly in the field. Near-infrared (NIR) spectral band imagery is another technique used to obtain moisture data. NIR imagery has been used predominantly in remote sensing and has yet to be used for ground-based measurements. Dry sand reflects infrared radiation given off by the sun, whereas wet sand absorbs IR radiation. This study therefore assesses the utility of measuring the surface moisture content of beach sand with a modified NIR camera. A traditional point-and-shoot digital camera was internally modified by the placement of a visible-light-blocking filter. Images were taken of three different types of beach sand at controlled moisture content values, with sunlight as the source of infrared radiation. A technique was established through trial and error by comparing the resulting histogram values, obtained in Adobe Photoshop, across the various moisture conditions. The resulting IR absorption histogram values were calibrated against actual gravimetric moisture content from surface scrapings of the samples. Overall, the results illustrate that the NIR-modified camera does not provide the ability to adequately measure beach surface moisture content. However, there were noted differences in IR absorption histogram values among the different sediment types. Sediment with darker quartz mineralogy provided larger variations in histogram values, but the technique is not sensitive enough to accurately represent low moisture percentages, which are of most importance when studying aeolian sediment transport.
VizieR Online Data Catalog: Photometry of YSOs in eight bright-rimmed clouds (Sharma+, 2016)
NASA Astrophysics Data System (ADS)
Sharma, S.; Pandey, A. K.; Borissova, J.; Ojha, D. K.; Ivanov, V. D.; Ogura, K.; Kobayashi, N.; Kurtev, R.; Gopinathan, M.; Yadav, R. K.
2016-08-01
Near-infrared (J, H, K') data for eight selected Bright-Rimmed Clouds (BRCs) along with two nearby field regions (see Table1) were collected with the Infrared Side Port Imager (ISPI) camera (FOV~10.5*10.5arcmin2; scale 0.3arcsec/pixel) on the 4m Blanco telescope at Cerro Tololo Inter-American Observatory (CTIO), Chile, during the nights of 2010 March 03-04. The seeing was ~1arcsec. The individual exposure times were 60 s per frame for all filters. The total exposure time for the target fields was 540s for each J, H, and K' band. We also used the infrared archived data taken from the Infrared Array Camera (IRAC) of the space-based Spitzer telescope at the 3.6, 4.5, 5.8, and 8.0μm bands. We obtained Basic Calibrated Data (BCD) from the Spitzer data archive for all BRCs (except SFO 76, which has no Spitzer data). The exposure time of each BCD was 10.4s (4 data files).
Pettit holds cameras in the U.S. Laboratory
2012-01-15
ISS030-E-175788 (15 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, is pictured with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.
Real time capable infrared thermography for ASDEX Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sieglin, B., E-mail: Bernhard.Sieglin@ipp.mpg.de; Faitsch, M.; Herrmann, A.
2015-11-15
Infrared (IR) thermography is widely used in fusion research to study power exhaust and incident heat load onto the plasma facing components. Due to the short pulse duration of today's fusion experiments, IR systems have mostly been designed for off-line data analysis. For future long pulse devices (e.g., Wendelstein 7-X, ITER), a real time evaluation of the target temperature and heat flux is mandatory. This paper shows the development of a real time capable IR system for ASDEX Upgrade. A compact IR camera has been designed incorporating the necessary magnetic and electric shielding for the detector-cooler assembly. The camera communication is based on the Camera Link industry standard. The data acquisition hardware is based on National Instruments hardware, consisting of a PXIe chassis inside and a fibre optical connected industry computer outside the torus hall. Image processing and data evaluation are performed using real time LabVIEW.
Emergency positioning system accuracy with infrared LEDs in high-security facilities
NASA Astrophysics Data System (ADS)
Knoch, Sierra N.; Nelson, Charles; Walker, Owens
2017-05-01
Instantaneous personnel location presents a challenge in Department of Defense applications where high levels of security restrict real-time tracking of crew members. During emergency situations, command and control requires immediate accountability of all personnel. Current radio frequency (RF) based indoor positioning systems can be unsuitable due to RF leakage and electromagnetic interference with sensitively calibrated machinery on platforms such as ships, submarines and high-security facilities. Infrared light provides a possible solution to this problem. This paper proposes and evaluates an indoor line-of-sight positioning system composed of IR LED transmitters and high-sensitivity CMOS camera receivers. In this system the movement of the LEDs is captured by the camera, uploaded and analyzed; the highest point of power is located and plotted to create a blueprint of crew-member location. Results evaluate accuracy as a function of both wavelength and environmental conditions. Further research will evaluate the accuracy of the LED transmitter and CMOS camera receiver system. Transmissions at both 780 and 850 nm in the IR are analyzed.
Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum †
Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi
2016-01-01
During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from thermal information alone is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in the thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step, while the second step processes patches in an image. Results show that the proposed method gives satisfactory results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781
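A minimal sketch of the CCA idea behind such a reconstruction, assuming flattened, paired thermal and visible training vectors (random placeholders here): the correlated subspace is learned from the pairs and a new thermal input is mapped to a visible-spectrum estimate. This single-step, whole-patch version simplifies the paper's two-step method; patch size and component count are assumptions.

```python
# Learn a thermal-to-visible mapping through the CCA-correlated subspace.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_pairs, patch_dim = 500, 8 * 8               # flattened 8x8 patches

thermal_train = rng.normal(size=(n_pairs, patch_dim))   # thermal-IR patches
visible_train = rng.normal(size=(n_pairs, patch_dim))   # co-registered visible patches

cca = CCA(n_components=10)                    # shared subspace dimensionality
cca.fit(thermal_train, visible_train)

thermal_probe = rng.normal(size=(1, patch_dim))
visible_estimate = cca.predict(thermal_probe) # reconstructed visible-spectrum patch
```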
Multisensory System for the Detection and Localization of Peripheral Subcutaneous Veins
Fernández, Roemi; Armada, Manuel
2017-01-01
This paper proposes a multisensory system for the detection and localization of peripheral subcutaneous veins, as a first step for achieving automatic robotic insertion of catheters in the near future. The multisensory system is based on the combination of a SWIR (Short-Wave Infrared) camera, a TOF (Time-Of-Flight) camera and a NIR (Near Infrared) lighting source. The associated algorithm consists of two main parts: one devoted to the features extraction from the SWIR image, and another envisaged for the registration of the range data provided by the TOF camera, with the SWIR image and the results of the peripheral veins detection. In this way, the detected subcutaneous veins are mapped onto the 3D reconstructed surface, providing a full representation of the region of interest for the automatic catheter insertion. Several experimental tests were carried out in order to evaluate the capabilities of the presented approach. Preliminary results demonstrate the feasibility of the proposed design and highlight the potential benefits of the solution. PMID:28422075
Obstacle Detection and Avoidance of a Mobile Robotic Platform Using Active Depth Sensing
2014-06-01
At the price of nearly one tenth of a laser range finder, the Xbox Kinect uses an infrared projector and camera to capture images of its environment in three dimensions. [Remaining excerpt fragments and figure-list entries, e.g. "RGB image captured by the camera on the Xbox Kinect", omitted.]
NASA Technical Reports Server (NTRS)
Watson, Dan M.
1997-01-01
Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, "Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera," given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO) and use of this camera for observations of star-formation regions. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation, which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.
Infrared Imaging Camera Final Report CRADA No. TC02061.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, E. V.; Nebeker, S.
This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and the Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100 nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera, and the test results revealed that it exceeded the performance of presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two-year project. The project was not started on time due to changes in the IPP project funding conditions; the project funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes within the Russian government's export regulations. These changes were directed by export control regulations on the export of high-technology items that can be used to develop military weapons. The IR camera was on the list subject to export controls. The ISTC and the Russian government, after negotiations, allowed the delivery of the camera to LLNL. There were no significant technical or business changes to the original project.
Exploring the imaging properties of thin lenses for cryogenic infrared cameras
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura
2016-05-01
Designing a cryogenic camera is a good strategy for miniaturizing and simplifying an infrared camera that uses a cooled detector. Indeed, integrating the optics inside the cold shield makes it simple to athermalize the design, guarantees a cold pupil and relaxes the constraint of needing a long back focal length in short-focal-length systems. In this way, cameras made of a single lens or two lenses become viable systems with good optical features and good stability in image correction. However, this involves a relatively significant additional optical mass inside the dewar and thus increases the cool-down time of the camera. ONERA is currently exploring a minimalist strategy consisting of giving an imaging function to the thin optical plates that are found in conventional dewars. In this way, we can make a cryogenic camera that has the same cool-down time as a traditional dewar without an imaging function. Two examples will be presented. The first is a camera using a dual-band infrared detector, made of a lens outside the dewar and a lens inside the cold shield, the latter providing the main optical power of the system; we were able to design a cold plano-convex lens with a thickness of less than 1 mm. The second example is an evolution of a former cryogenic camera called SOIE, in which we replaced the cold meniscus by a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performances of both cameras will be compared.
NASA Technical Reports Server (NTRS)
Howell, Patricia A.; Winfree, William P.; Cramer, K. Elliott
2008-01-01
On July 12, 2006, British-born astronaut Piers Sellers became the first person to conduct thermal nondestructive evaluation experiments in space, demonstrating the feasibility of a new tool for detecting damage to the reinforced carbon-carbon (RCC) structures of the Shuttle. This new tool was an EVA (Extravehicular Activity, or spacewalk) compatible infrared camera developed by NASA engineers. Data was collected both on the wing leading edge of the Orbiter and on pre-damaged samples mounted in the Shuttle's cargo bay. A total of 10 infrared movies were collected during the EVA totaling over 250 megabytes of data. Images were downloaded from the orbiting Shuttle to Johnson Space Center for analysis and processing. Results are shown to be comparable to ground-based thermal inspections performed in the laboratory with the same type of camera and simulated solar heating. The EVA camera system detected flat-bottom holes as small as 2.54cm in diameter with 50% material loss from the back (hidden) surface in RCC during this first test of the EVA IR Camera. Data for the time history of the specimen temperature and the capability of the inspection system for imaging impact damage are presented.
Lee, Chulsung; Lee, Dustin; Darling, Cynthia L; Fried, Daniel
2010-01-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
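As a rough illustration of how such lesion contrast can be computed from an NIR image, the sketch below assumes operator-defined masks for the sound and carious regions; the sign conventions (lesions darker in transillumination, brighter in reflectance) follow common practice and may differ in detail from the exact formulas used in the study.

```python
# Contrast between operator-defined sound and lesion regions of interest.
import numpy as np

def transmission_contrast(image, sound_mask, lesion_mask):
    i_sound = image[sound_mask].mean()
    i_lesion = image[lesion_mask].mean()
    return (i_sound - i_lesion) / i_sound     # ~0 for sound enamel, rises with severity

def reflectance_contrast(image, sound_mask, lesion_mask):
    i_sound = image[sound_mask].mean()
    i_lesion = image[lesion_mask].mean()
    return (i_lesion - i_sound) / i_lesion
```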
NASA Astrophysics Data System (ADS)
Dumoulin, Jean
2013-04-01
One of the objectives of the ISTIMES project was to evaluate the potential offered by the integration of different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, we focused our research and development efforts on uncooled infrared camera techniques because of their promising potential for dissemination, linked to their relatively low cost on the market. In parallel, work was carried out to identify well-adapted implementation protocols and the key limits of the Pulse Phase Thermography (PPT) and Principal Component Thermography (PCT) processing methods used to analyse thermal image sequences and retrieve information about the inner structure. The first part of this research addresses infrared thermography measurement when it is used in quantitative mode outside laboratory conditions, rather than in qualitative mode (vision applied to survey). In such a context, thermal radiative corrections must be applied to the raw data in real time, using additional measurements, to take into account the influence of the evolving natural environment. The camera system therefore has to be smart enough to apply the calibration law and radiometric corrections in real time in a varying atmosphere. A complete measurement system was accordingly studied and developed [1] with low-cost infrared cameras available on the market. In the system developed, the infrared camera is coupled with other sensors that feed simplified radiative models running in real time on the GPU of a small PC. The whole measurement system was installed on the "Musmeci" bridge located in Potenza (Italy). No traffic interruption was required during the mounting of the measurement system. The infrared camera was fixed on top of a mast at 6 m elevation above the surface of the bridge deck. A small weather station was added on the same mast 1 m below the camera, and a GPS antenna was fixed at the base of the mast at the same elevation as the bridge deck surface. The trial took place over 4 days, with the system left in stand-alone acquisition mode for 3 days. Thanks to the software developed and the small computer hardware used, thermal images were acquired at a frame rate of 0.1 Hz by averaging 50 thermal images, leaving the original camera frame rate fixed at 5 Hz. Each hour, a thermal image sequence was stored on the internal hard drive, and data could also be retrieved on demand using a wireless connection and a tablet PC. In the second part of this work, the thermal image sequences were analysed. Two analysis approaches were studied: one based on the Fast Fourier Transform [2] and the other on Principal Component Analysis [3-4]. The results show that the inner structure of the deck was identified, even though the thermal images were affected by the bridge remaining open to traffic for the whole duration of the experiments. ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663.
References: [1] Dumoulin J. and Averty R., "Development of an infrared system coupled with a weather station for real time atmospheric corrections using GPU computing: Application to bridge monitoring", QIRT 2012, Naples, Italy, June 2012. [2] Cooley J.W., Tukey J.W., "An algorithm for the machine calculation of complex Fourier series", Mathematics of Computation, vol. 19, n° 90, 1965, p. 297-301. [3] Rajic N., "Principal component thermography for flaw contrast enhancement and flaw depth characterization in composite structures", Composite Structures, vol. 58, pp. 521-528, 2002. [4] Marinetti S., Grinzato E., Bison P. G., Bozzi E., Chimenti M., Pieri G. and Salvetti O., "Statistical analysis of IR thermographic sequences by PCA", Infrared Physics & Technology, vol. 46, pp. 85-91, 2004.
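To make the two processing routes concrete, here is a compact Python sketch, under our own simplifying assumptions, of pulse phase thermography (a per-pixel FFT along time, keeping a phase image) and principal component thermography in the spirit of Rajic [3] (SVD of the standardized pixels-by-frames matrix). The random sequence is a placeholder for the recorded bridge-deck images.

```python
# PPT: per-pixel FFT along time, keep a phase image at a low frequency bin.
# PCT: SVD of the standardized pixels-by-frames matrix; leading spatial EOFs
# enhance contrast related to the inner structure.
import numpy as np

frames = np.random.default_rng(0).normal(size=(120, 120, 160))  # (time, rows, cols)
t, h, w = frames.shape

# --- PPT: phase image at the first non-zero frequency bin ---
phase = np.angle(np.fft.fft(frames, axis=0))[1]                  # shape (h, w)

# --- PCT: standardize each frame, then SVD ---
A = frames.reshape(t, h * w).T                                   # pixels x frames
A = (A - A.mean(axis=0)) / (A.std(axis=0) + 1e-9)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
eof_images = [U[:, k].reshape(h, w) for k in range(3)]           # leading spatial EOFs
```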
Pettit works with two still cameras mounted together in the U.S. Laboratory
2012-01-21
ISS030-E-049636 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.
Pettit works with two still cameras mounted together in the U.S. Laboratory
2012-01-21
ISS030-E-049643 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.
Attitude identification for SCOLE using two infrared cameras
NASA Technical Reports Server (NTRS)
Shenhar, Joram
1991-01-01
An algorithm is presented that incorporates real time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs), mounted on the SCOLE platform, permitting arbitrary camera placement and an on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six degrees of freedom attitude tracking, regardless of the attitude time history. A description is provided of the algorithm in the camera identification mode as well as the mode of target tracking. Experimental data from a reduced size SCOLE-like lab model, reflecting the performance of the camera identification and the tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
Implementation and performance of shutterless uncooled micro-bolometer cameras
NASA Astrophysics Data System (ADS)
Das, J.; de Gaspari, D.; Cornet, P.; Deroo, P.; Vermeiren, J.; Merken, P.
2015-06-01
A shutterless algorithm has been implemented in the Xenics LWIR thermal cameras and modules. Based on a calibration set and a global temperature coefficient, the optimal non-uniformity correction is calculated on board the camera. The limited resources in the camera require a compact algorithm, so the efficiency of the coding is important. The performance of the shutterless algorithm is studied by comparing the residual non-uniformity (RNU) and signal-to-noise ratio (SNR) of the shutterless and shuttered correction algorithms. From this comparison we conclude that the shutterless correction performs only slightly worse than the standard shuttered algorithm, making it very attractive for thermal infrared applications where small weight and size and continuous operation are important.
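A hedged sketch of what such a correction can look like: per-pixel gain and offset come from a two-point blackbody calibration, and a single global temperature coefficient scales an extra offset with the measured FPA temperature drift. Variable names and coefficient values are illustrative assumptions, not Xenics' on-board implementation.

```python
# Two-point NUC plus a global FPA-temperature drift term.
import numpy as np

def calibrate_two_point(raw_cold, raw_hot, target_cold, target_hot):
    """Per-pixel gain and offset from two uniform blackbody frames."""
    gain = (target_hot - target_cold) / (raw_hot - raw_cold)
    offset = target_cold - gain * raw_cold
    return gain, offset

def shutterless_correct(raw, gain, offset, fpa_temp, fpa_temp_cal, temp_coeff):
    """Apply the NUC with a global FPA-temperature drift term (counts per kelvin)."""
    drift = temp_coeff * (fpa_temp - fpa_temp_cal)
    return gain * (raw - drift) + offset

# Usage with synthetic 320x240 frames:
rng = np.random.default_rng(2)
raw_cold = 2000 + rng.normal(scale=30, size=(240, 320))
raw_hot = 6000 + rng.normal(scale=30, size=(240, 320))
gain, offset = calibrate_two_point(raw_cold, raw_hot, 2000.0, 6000.0)
corrected = shutterless_correct(raw_hot, gain, offset,
                                fpa_temp=38.5, fpa_temp_cal=35.0, temp_coeff=12.0)
```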
NASA Astrophysics Data System (ADS)
Huang, Hua-Wei; Zhang, Yang
2008-08-01
An attempt has been made to characterize the colour spectrum of methane flames under various burning conditions using RGB and HSV colour models instead of resolving the real physical spectrum. The results demonstrate that each type of flame has its own characteristic distribution in both the RGB and the HSV space. It has also been observed that the averaged B and G values in the RGB model represent well the CH* and C2* emission of premixed methane flames. These features may be utilized for flame measurement and monitoring. The great advantage of using a conventional camera for monitoring flame properties based on the colour spectrum is that it is readily available, easy to interface with a computer, cost-effective and has a useful degree of spatial resolution. Furthermore, it has been demonstrated that a conventional digital camera is able to image a flame not only in the visible spectrum but also in the infrared. This feature is useful in avoiding the problem of image saturation typically encountered when capturing very bright sooty flames. As a result, further digital image processing and quantitative information extraction are possible. It has been identified that an infrared image also has its own distribution in both the RGB and HSV colour spaces, in comparison with a flame image in the visible spectrum.
ERIC Educational Resources Information Center
Haglund, Jesper; Melander, Emil; Weiszflog, Matthias; Andersson, Staffan
2017-01-01
Background: University physics students were engaged in open-ended thermodynamics laboratory activities with a focus on understanding a chosen phenomenon or the principle of laboratory apparatus, such as thermal radiation and a heat pump. Students had access to handheld infrared (IR) cameras for their investigations. Purpose: The purpose of the…
Development of an Infrared Remote Sensing System for Continuous Monitoring of Stromboli Volcano
NASA Astrophysics Data System (ADS)
Harig, R.; Burton, M.; Rausch, P.; Jordan, M.; Gorgas, J.; Gerhard, J.
2009-04-01
In order to monitor gases emitted by Stromboli volcano in the Eolian archipelago, Italy, a remote sensing system based on Fourier-transform infrared spectroscopy has been developed and installed on the summit of the volcano. Hot rocks and lava are used as sources of infrared radiation. The system is based on an interferometer with a single detector element in combination with an azimuth-elevation scanning mirror system. The mirror system is used to align the field of view of the instrument. In addition, the system is equipped with an infrared camera. Two basic modes of operation have been implemented: the user may use the infrared image to align the system to a vent that is to be examined, or the scanning system may be used for (hyperspectral) imaging of the scene. In the imaging mode, the scanning mirror is set to move sequentially to all positions within a region of interest defined by the operator using the image generated by the infrared camera. The spectral range used for the measurements is 1600 - 4200 cm-1, allowing the quantification of many gases such as CO, CO2, SO2, and HCl. The spectral resolution is 0.5 cm-1. In order to protect the optical, mechanical and electrical parts of the system from the volcanic gases, all components are contained in a gas-tight aluminium housing. The system is controlled via TCP/IP (data transfer by WLAN), allowing the user to operate it from a remote PC. The infrared image of the scene and measured spectra are transferred to and displayed by a remote PC at INGV or TUHH in real time. However, the system is capable of autonomous operation on the volcano once a measurement has been started. Measurements are stored by an internal embedded PC.
NASA Technical Reports Server (NTRS)
Young, Erick T.; Rieke, G. H.; Low, Frank J.; Haller, E. E.; Beeman, J. W.
1989-01-01
Work at the University of Arizona and at Lawrence Berkeley Laboratory on the development of a far infrared array camera for the Multiband Imaging Photometer on the Space Infrared Telescope Facility (SIRTF) is discussed. The camera design uses stacked linear arrays of Ge:Ga photoconductors to make a full two-dimensional array. Initial results from a 1 x 16 array using a thermally isolated J-FET readout are presented. Dark currents below 300 electrons s^-1 and readout noises of 60 electrons were attained. Operation of these types of detectors in an ionizing radiation environment is discussed. Results of radiation testing using both low energy gamma rays and protons are given. Work on advanced C-MOS cascode readouts that promise lower temperature operation and higher levels of performance than the current J-FET based devices is described.
Development of infrared scene projectors for testing fire-fighter cameras
NASA Astrophysics Data System (ADS)
Neira, Jorge E.; Rice, Joseph P.; Amon, Francine K.
2008-04-01
We have developed two types of infrared scene projectors for hardware-in-the-loop testing of thermal imaging cameras such as those used by fire-fighters. In one, direct projection, images are projected directly into the camera. In the other, indirect projection, images are projected onto a diffuse screen, which is then viewed by the camera. Both projectors use a digital micromirror array as the spatial light modulator, in the form of a Micromirror Array Projection System (MAPS) engine having a resolution of 800 x 600, aluminum-coated mirrors on a 17 micrometer pitch, and a ZnSe protective window. Fire-fighter cameras are often based upon uncooled microbolometer arrays and typically have resolutions of 320 x 240 or lower. For direct projection, we use an argon-arc source, which provides spectral radiance equivalent to a 10,000 Kelvin blackbody over the 7 micrometer to 14 micrometer wavelength range, to illuminate the micromirror array. For indirect projection, an expanded 4 watt CO2 laser beam at a wavelength of 10.6 micrometers illuminates the micromirror array and the scene formed by the first-order diffracted light from the array is projected onto a diffuse aluminum screen. In both projectors, a well-calibrated reference camera is used to provide non-uniformity correction and brightness calibration of the projected scenes, and the fire-fighter cameras alternately view the same scenes. In this paper, we compare the two methods for this application and report on our quantitative results. Indirect projection has the advantage of more easily filling the wide field of view of the fire-fighter cameras, which is typically about 50 degrees. Direct projection more efficiently utilizes the available light, which will become important in emerging multispectral and hyperspectral applications.
Discovery of hotspots on Io using disk-resolved infrared imaging
NASA Technical Reports Server (NTRS)
Spencer, J. R.; Shure, M. A.; Ressler, M. E.; Sinton, W. M.; Goguen, J. D.
1990-01-01
First results are presented using two new techniques for ground-based observation of Io's hotspots. An IR array camera was used to obtain direct IR images of Io with resolution better than 0.5 arcsec, so that more than one hotspot is seen on Io in Jupiter eclipse. The camera was also used to make the first observations of the Jupiter occultation of the hotspots. These new techniques have revealed and located at least three hotspots and will now permit routine ground-based monitoring of the locations, temperatures, and sizes of multiple hotspots on Io.
Long-Wavelength 640 x 486 GaAs/AlGaAs Quantum Well Infrared Photodetector Snap-Shot Camera
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Bandara, Sumith V.; Liu, John K.; Hong, Winn; Sundaram, Mani; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Carralejo, Ronald
1998-01-01
A 9-micrometer cutoff 640 x 486 snap-shot quantum well infrared photodetector (QWIP) camera has been demonstrated. The performance of this QWIP camera is reported, including indoor and outdoor imaging. A noise equivalent differential temperature (NEΔT) of 36 mK has been achieved at 300 K background with f/2 optics. This is in good agreement with the expected focal plane array sensitivity given the practical limitations on charge handling capacity of the multiplexer, read noise, bias voltage, and operating temperature.
Kim, Byeong Hak; Kim, Min Young; Chae, You Seong
2017-01-01
Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera, such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with mid-wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970
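BRANF itself is not reproduced here, but its RPCA stage can be illustrated with a minimal principal component pursuit that splits a stack of co-registered frames into a low-rank background and a sparse noise term; the parameter choices follow common defaults and are assumptions:

```python
import numpy as np

def rpca_pcp(D, n_iter=100):
    """Minimal principal component pursuit: split D into a low-rank part L
    (background) and a sparse part S (noise) via inexact-ALM iterations.
    D is (n_pixels, n_frames): one co-registered IR frame per column."""
    m, n = D.shape
    lam = 1.0 / np.sqrt(max(m, n))
    mu = 0.25 * m * n / (np.abs(D).sum() + 1e-12)
    S = np.zeros_like(D)
    Y = np.zeros_like(D)
    for _ in range(n_iter):
        # low-rank update: singular-value soft thresholding
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # sparse update: element-wise soft thresholding
        R = D - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Y += mu * (D - L - S)
    return L, S

# frames: list of co-registered 2D arrays of identical shape
# D = np.stack([f.ravel() for f in frames], axis=1)
# background, noise = rpca_pcp(D)
```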
Capabilities, performance, and status of the SOFIA science instrument suite
NASA Astrophysics Data System (ADS)
Miles, John W.; Helton, L. Andrew; Sankrit, Ravi; Andersson, B. G.; Becklin, E. E.; De Buizer, James M.; Dowell, C. D.; Dunham, Edward W.; Güsten, Rolf; Harper, Doyal A.; Herter, Terry L.; Keller, Luke D.; Klein, Randolf; Krabbe, Alfred; Marcum, Pamela M.; McLean, Ian S.; Reach, William T.; Richter, Matthew J.; Roellig, Thomas L.; Sandell, Göran; Savage, Maureen L.; Smith, Erin C.; Temi, Pasquale; Vacca, William D.; Vaillancourt, John E.; Van Cleve, Jeffery E.; Young, Erick T.; Zell, Peter T.
2013-09-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne observatory, carrying a 2.5 m telescope onboard a heavily modified Boeing 747SP aircraft. SOFIA is optimized for operation at infrared wavelengths, much of which is obscured for ground-based observatories by atmospheric water vapor. The SOFIA science instrument complement consists of seven instruments: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), GREAT (German Receiver for Astronomy at Terahertz Frequencies), HIPO (High-speed Imaging Photometer for Occultations), FLITECAM (First Light Infrared Test Experiment CAMera), FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), EXES (Echelon-Cross-Echelle Spectrograph), and HAWC (High-resolution Airborne Wideband Camera). FORCAST is a 5-40 μm imager with grism spectroscopy, developed at Cornell University. GREAT is a heterodyne spectrometer providing high-resolution spectroscopy in several bands from 60-240 μm, developed at the Max Planck Institute for Radio Astronomy. HIPO is a 0.3-1.1 μm imager, developed at Lowell Observatory. FLITECAM is a 1-5 μm wide-field imager with grism spectroscopy, developed at UCLA. FIFI-LS is a 42-210 μm integral field imaging grating spectrometer, developed at the University of Stuttgart. EXES is a 5-28 μm high-resolution spectrograph, developed at UC Davis and NASA ARC. HAWC is a 50-240 μm imager, developed at the University of Chicago, and undergoing an upgrade at JPL to add polarimetry capability and substantially larger GSFC detectors. We describe the capabilities, performance, and status of each instrument, highlighting science results obtained using FORCAST, GREAT, and HIPO during SOFIA Early Science observations conducted in 2011.
NASA Astrophysics Data System (ADS)
Kadosh, Itai; Sarusi, Gabby
2017-10-01
The use of dual cameras in parallax to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept in which the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night vision capability while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. In order to maintain commonality of the two cameras, we propose to attach to one of the cameras a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and the additional upconversion layer, whose thickness is <1 μm. Such a layer should be attached in close proximity to the mobile device's visible-range camera sensor (the CMOS sensor). This paper presents, as a proof of concept, a SWIR objective optical design and optimization that mechanically matches the visible objective design but uses different lenses, in order to maintain commonality. Such a SWIR objective design is very challenging since it requires mimicking the original visible mobile camera lenses' sizes and mechanical housing, so that the visible optical and mechanical design can be adhered to. We present in depth a feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore-optics design.
Transient Infrared Measurement of Laser Absorption Properties of Porous Materials
NASA Astrophysics Data System (ADS)
Marynowicz, Andrzej
2016-06-01
Infrared thermography measurements of porous building materials have become more frequent in recent years. Many accompanying techniques for thermal field generation have been developed, including one based on laser radiation. This work presents a simple optimization technique for estimating the laser beam absorption of selected porous building materials, namely clinker brick and cement mortar. The transient temperature measurements were performed with an infrared camera during laser-induced heating of the samples' surfaces. As results, the absorbed fractions of the incident laser beam together with its shape parameter are reported.
2009-03-01
…infrared, thermal, or night vision applications. Understanding the true capabilities and limitations of the ALAN camera and its applicability to a… an option to more expensive infrared, thermal, or night vision applications. Ultimately, it will be clear whether the configuration of the Kestrel…
Gundle, Kenneth R; White, Jedediah K; Conrad, Ernest U; Ching, Randal P
2017-01-01
Surgical navigation systems are increasingly used to aid resection and reconstruction of osseous malignancies. In the process of implementing image-based surgical navigation systems, there are numerous opportunities for error that may impact surgical outcome. This study aimed to examine modifiable sources of error in an idealized scenario, when using a bidirectional infrared surgical navigation system. Accuracy and precision were assessed using a computerized-numerical-controlled (CNC) machined grid with known distances between indentations while varying: 1) the distance from the grid to the navigation camera (range 150 to 247cm), 2) the distance from the grid to the patient tracker device (range 20 to 40cm), and 3) whether the minimum or maximum number of bidirectional infrared markers were actively functioning. For each scenario, distances between grid points were measured at 10-mm increments between 10 and 120mm, with twelve measurements made at each distance. The accuracy outcome was the root mean square (RMS) error between the navigation system distance and the actual grid distance. To assess precision, four indentations were recorded six times for each scenario while also varying the angle of the navigation system pointer. The outcome for precision testing was the standard deviation of the distance between each measured point to the mean three-dimensional coordinate of the six points for each cluster. Univariate and multiple linear regression revealed that as the distance from the navigation camera to the grid increased, the RMS error increased (p<0.001). The RMS error also increased when not all infrared markers were actively tracking (p=0.03), and as the measured distance increased (p<0.001). In a multivariate model, these factors accounted for 58% of the overall variance in the RMS error. Standard deviations in repeated measures also increased when not all infrared markers were active (p<0.001), and as the distance between navigation camera and physical space increased (p=0.005). Location of the patient tracker did not affect accuracy (0.36) or precision (p=0.97). In our model laboratory test environment, the infrared bidirectional navigation system was more accurate and precise when the distance from the navigation camera to the physical (working) space was minimized and all bidirectional markers were active. These findings may require alterations in operating room setup and software changes to improve the performance of this system.
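The accuracy analysis above reduces to an RMS error between navigated and true grid distances, examined against setup variables; a small numpy sketch of that computation follows (variable names are hypothetical):

```python
import numpy as np

def rms_error(measured_mm, true_mm):
    """Root-mean-square error between navigation-system and CNC grid distances."""
    measured_mm = np.asarray(measured_mm, dtype=float)
    true_mm = np.asarray(true_mm, dtype=float)
    return np.sqrt(np.mean((measured_mm - true_mm) ** 2))

def error_vs_factor(factor, errors):
    """Univariate linear fit of RMS error against one setup variable
    (e.g. camera-to-grid distance); returns slope and intercept."""
    slope, intercept = np.polyfit(np.asarray(factor, dtype=float),
                                  np.asarray(errors, dtype=float), 1)
    return slope, intercept

# usage (arrays of per-scenario values, not reproduced here):
# slope, _ = error_vs_factor(camera_to_grid_cm, per_scenario_rms_mm)
```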
GETTING TO THE HEART OF A GALAXY
NASA Technical Reports Server (NTRS)
2002-01-01
This collage of images in visible and infrared light reveals how the barred spiral galaxy NGC 1365 is feeding material into its central region, igniting massive star birth and probably causing its bulge of stars to grow. The material also is fueling a black hole in the galaxy's core. A galaxy's bulge is a central, football-shaped structure composed of stars, gas, and dust. The black-and-white image in the center, taken by a ground-based telescope, displays the entire galaxy. But the telescope's resolution is not powerful enough to reveal the flurry of activity in the galaxy's hub. The blue box in the galaxy's central region outlines the area observed by the NASA Hubble Space Telescope's visible-light camera, the Wide Field and Planetary Camera 2 (WFPC2). The red box pinpoints a narrower view taken by the Hubble telescope's infrared camera, the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). A barred spiral is characterized by a lane of stars, gas, and dust slashing across a galaxy's central region. It has a small bulge that is dominated by a disk of material. The spiral arms begin at both ends of the bar. The bar is funneling material into the hub, which triggers star formation and feeds the bulge. The visible-light picture at upper left is a close-up view of the galaxy's hub. The bright yellow orb is the nucleus. The dark material surrounding the orb is gas and dust that is being funneled into the central region by the bar. The blue regions pinpoint young star clusters. In the infrared image at lower right, the Hubble telescope penetrates the dust seen in the WFPC2 picture to reveal more clusters of young stars. The bright blue dots represent young star clusters; the brightest of the red dots are young star clusters enshrouded in dust and visible only in the infrared image. The fainter red dots are older star clusters. The WFPC2 image is a composite of three filters: near-ultraviolet (3327 Angstroms), visible (5552 Angstroms), and near-infrared (8269 Angstroms). The NICMOS image, taken at a wavelength of 16,000 Angstroms, was combined with the visible and near-infrared wavelengths taken by WFPC2. The WFPC2 image was taken in January 1996; the NICMOS data were taken in April 1998. Credits for the ground-based image: Allan Sandage (The Observatories of the Carnegie Institution of Washington) and John Bedke (Computer Sciences Corporation and the Space Telescope Science Institute) Credits for the WFPC2 image: NASA and John Trauger (Jet Propulsion Laboratory) Credits for the NICMOS image: NASA, ESA, and C. Marcella Carollo (Columbia University)
Automated cloud classification using a ground based infra-red camera and texture analysis techniques
NASA Astrophysics Data System (ADS)
Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.
2013-10-01
Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
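A hedged sketch of the classification stage, using scikit-learn's distance-weighted k-nearest-neighbour classifier as a stand-in; the 45-feature texture extraction itself is not reproduced, and the split and scaling choices are assumptions:

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def train_cloud_classifier(X, y, k=5):
    """Distance-weighted KNN on standardized texture features.

    X : (n_images, 45) texture features from the IR sky images
    y : cloud-type labels assigned by the trained observer (ground truth)
    """
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)
    model = make_pipeline(StandardScaler(),
                          KNeighborsClassifier(n_neighbors=k, weights="distance"))
    model.fit(X_train, y_train)
    return model, model.score(X_test, y_test)
```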
Simultaneous digital super-resolution and nonuniformity correction for infrared imaging systems.
Meza, Pablo; Machuca, Guillermo; Torres, Sergio; Martin, Cesar San; Vera, Esteban
2015-07-20
In this article, we present a novel algorithm to achieve simultaneous digital super-resolution and nonuniformity correction from a sequence of infrared images. We propose to use spatial regularization terms that exploit nonlocal means and the absence of spatial correlation between the scene and the nonuniformity noise sources. We derive an iterative optimization algorithm based on a gradient descent minimization strategy. Results from infrared image sequences corrupted with simulated and real fixed-pattern noise show a competitive performance compared with state-of-the-art methods. A qualitative analysis on the experimental results obtained with images from a variety of infrared cameras indicates that the proposed method provides super-resolution images with significantly less fixed-pattern noise.
High spatial resolution infrared camera as ISS external experiment
NASA Astrophysics Data System (ADS)
Eckehard, Lorenz; Frerker, Hap; Fitch, Robert Alan
A high spatial resolution infrared camera as an ISS external experiment for monitoring global climate changes uses ISS internal and external resources (e.g. data storage). The optical experiment will consist of an infrared camera for monitoring global climate changes from the ISS. This technology was evaluated by the German small satellite mission BIRD and further developed in different ESA projects. Compared to BIRD, the presented instrument uses proven advanced sensor technologies (ISS external) and ISS on-board processing and storage capabilities (internal). The instrument will be equipped with a serial interface for TM/TC and several relay commands for the power supply. For data processing and storage, a mass memory is required. Access to actual attitude data is highly desired to produce geo-referenced maps, if possible by on-board processing.
Bennett, C.L.
1996-07-23
An imaging Fourier transform spectrometer is described having a Fourier transform infrared spectrometer providing a series of images to a focal plane array camera. The focal plane array camera is clocked to a multiple of zero crossing occurrences as caused by a moving mirror of the Fourier transform infrared spectrometer and as detected by a laser detector, such that the frame capture rate of the focal plane array camera corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer. The images are transmitted to a computer for processing such that representations of the images as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor or otherwise stored and manipulated by the computer. 2 figs.
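The clocking relation in this patent abstract follows from the reference-laser interferogram: assuming a sinusoidal interferogram and a HeNe reference (an assumption, the abstract does not name the laser), the zero-crossing rate is twice the fringe frequency 2v/λ, and the frame rate is a multiple of it. A small illustrative helper:

```python
def frame_capture_rate(mirror_velocity_m_s, laser_wavelength_m=632.8e-9, multiple=1):
    """Frame rate implied by clocking the FPA camera to a multiple of the
    reference-laser zero-crossing rate.

    For a mirror moving at velocity v, the optical path difference changes
    at 2*v, so the (assumed sinusoidal) reference interferogram has fringe
    frequency 2*v/lambda and crosses zero twice per fringe period.
    """
    fringe_hz = 2.0 * mirror_velocity_m_s / laser_wavelength_m
    zero_crossing_hz = 2.0 * fringe_hz
    return multiple * zero_crossing_hz

# e.g. a 0.05 mm/s mirror scan with a HeNe reference (assumed values):
# frame_capture_rate(0.05e-3)  ->  about 316 frames per second
```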
Advanced imaging research and development at DARPA
NASA Astrophysics Data System (ADS)
Dhar, Nibir K.; Dat, Ravi
2012-06-01
Advances in imaging technology have a huge impact on our daily lives. Innovations in optics, focal plane arrays (FPA), microelectronics and computation have revolutionized camera design. As a result, new approaches to camera design and low-cost manufacturing are now possible. These advances are clearly evident in the visible wavelength band due to pixel scaling and improvements in silicon material and CMOS technology. CMOS cameras are available in cell phones and many other consumer products. Advances in infrared imaging technology have been slow due to market volume and many technological barriers in detector materials, optics and fundamental limits imposed by the scaling laws of optics. There is, of course, much room for improvement in both visible and infrared imaging technology. This paper highlights various technology development projects at DARPA to advance imaging technology for both the visible and the infrared. Challenges and potential solutions are highlighted in areas related to wide field-of-view camera design, small pitch pixels, broadband and multiband detectors, and focal plane arrays.
ASPIRE - Airborne Spectro-Polarization InfraRed Experiment
NASA Astrophysics Data System (ADS)
DeLuca, E.; Cheimets, P.; Golub, L.; Madsen, C. A.; Marquez, V.; Bryans, P.; Judge, P. G.; Lussier, L.; McIntosh, S. W.; Tomczyk, S.
2017-12-01
Direct measurements of coronal magnetic fields are critical for taking the next step in active region and solar wind modeling and for building the next generation of physics-based space-weather models. We are proposing a new airborne instrument to make these key observations. Building on the successful Airborne InfraRed Spectrograph (AIR-Spec) experiment for the 2017 eclipse, we will design and build a spectro-polarimeter to measure coronal magnetic field during the 2019 South Pacific eclipse. The new instrument will use the AIR-Spec optical bench and the proven pointing, tracking, and stabilization optics. A new cryogenic spectro-polarimeter will be built focusing on the strongest emission lines observed during the eclipse. The AIR-Spec IR camera, slit jaw camera and data acquisition system will all be reused. The poster will outline the optical design and the science goals for ASPIRE.
NASA Astrophysics Data System (ADS)
Jeong, Mira; Nam, Jae-Yeal; Ko, Byoung Chul
2017-09-01
In this paper, we focus on pupil center detection in various video sequences that include head poses and changes in illumination. To detect the pupil center, we first find four eye landmarks in each eye by using cascade local regression based on a regression forest. Based on the rough location of the pupil, a fast radial symmetry transform is applied using the previously found pupil location to refine the pupil center. As the final step, the pupil displacement between the previous frame and the current frame is estimated to maintain accuracy against false localization results occurring in particular frames. We generated a new face dataset, called Keimyung University pupil detection (KMUPD), with an infrared camera. The proposed method was successfully applied to the KMUPD dataset, and the results indicate that its pupil center detection capability is better than that of other methods, with a shorter processing time.
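The displacement-estimation step can be illustrated with phase correlation between eye crops of consecutive frames (OpenCV's phaseCorrelate); the landmark regression and radial-symmetry stages are not reproduced, and this is a generic stand-in rather than the authors' code:

```python
import cv2
import numpy as np

def pupil_displacement(prev_eye_roi, curr_eye_roi):
    """Estimate the frame-to-frame shift of an eye region by phase correlation.

    prev_eye_roi, curr_eye_roi: grayscale crops of identical size around the
    previously detected pupil. Returns (dx, dy) in pixels.
    """
    prev_f = np.float32(prev_eye_roi)
    curr_f = np.float32(curr_eye_roi)
    win = cv2.createHanningWindow(prev_f.shape[::-1], cv2.CV_32F)
    (dx, dy), _response = cv2.phaseCorrelate(prev_f, curr_f, win)
    return dx, dy

# new_center = (old_x + dx, old_y + dy), then refined by the radial-symmetry step
```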
PNIC - A near infrared camera for testing focal plane arrays
NASA Astrophysics Data System (ADS)
Hereld, Mark; Harper, D. A.; Pernic, R. J.; Rauscher, Bernard J.
1990-07-01
This paper describes the design and the performance of the Astrophysical Research Consortium prototype near-infrared camera (pNIC) designed to test focal plane arrays both on and off the telescope. Special attention is given to the detector in pNIC, the mechanical and optical designs, the electronics, and the instrument interface. Experiments performed to illustrate the most salient aspects of pNIC are described.
Thermographic measurements of high-speed metal cutting
NASA Astrophysics Data System (ADS)
Mueller, Bernhard; Renz, Ulrich
2002-03-01
Thermographic measurements of a high-speed cutting process have been performed with an infrared camera. To obtain images without motion blur, the integration times were reduced to a few microseconds. Since the high tool wear influences the measured temperatures, a set-up has been realized which enables small cutting lengths. Only single images have been recorded because the process is too fast to acquire a sequence of images even with the frame rate of the very fast infrared camera which has been used. To expose the camera when the rotating tool is in the middle of the camera image, an experimental set-up with a light barrier and a digital delay generator with a time resolution of 1 ns has been realized. This enables very exact triggering of the camera at the desired position of the tool in the image. Since the cutting depth is between 0.1 and 0.2 mm, a high spatial resolution was also necessary, which was obtained by a special close-up lens allowing a resolution of approximately 45 microns. The experimental set-up will be described and infrared images and evaluated temperatures of a titanium alloy and a carbon steel will be presented for cutting speeds up to 42 m/s.
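The triggering scheme amounts to converting a known distance along the cutting circle into a delay-generator setting; a tiny helper under that assumption (the arc length is a property of the individual set-up and is not given in the abstract):

```python
def trigger_delay_ns(arc_length_mm, cutting_speed_m_s):
    """Delay between the light-barrier signal and the camera trigger so that
    the tool sits in the centre of the image at exposure time.

    arc_length_mm     : distance along the cutting circle from the point where
                        the tool crosses the light barrier to the image centre
                        (assumed known for the individual set-up)
    cutting_speed_m_s : cutting speed, e.g. up to 42 m/s in the experiments
    """
    delay_s = (arc_length_mm * 1e-3) / cutting_speed_m_s
    return delay_s * 1e9   # delay generators are typically programmed in ns

# e.g. 10 mm of arc at 42 m/s -> about 238,000 ns
```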
Instrumentation for Infrared Airglow Clutter.
1987-03-10
…gain, and filter position to the Camera Head, and monitors these parameters as well as preamp video. GAZER is equipped with a Lenzar wide-angle, low… Specifications/parameters, video sensor: Lenzar Intensicon-8 LLLTV camera using a 2nd-generation micro-channel intensifier and proprietary camera tube.
Confocal retinal imaging using a digital light projector with a near infrared VCSEL source
NASA Astrophysics Data System (ADS)
Muller, Matthew S.; Elsner, Ann E.
2018-02-01
A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1" LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging.
A Navigation System for the Visually Impaired: A Fusion of Vision and Depth Sensor
Kanwal, Nadia; Bostanci, Erkan; Currie, Keith; Clark, Adrian F.
2015-01-01
For a number of years, scientists have been trying to develop aids that can make visually impaired people more independent and aware of their surroundings. Computer-based automatic navigation tools are one example of this, motivated by the increasing miniaturization of electronics and the improvement in processing power and sensing capabilities. This paper presents a complete navigation system based on low cost and physically unobtrusive sensors such as a camera and an infrared sensor. The system is based around corners and depth values from Kinect's infrared sensor. Obstacles are found in images from a camera using corner detection, while input from the depth sensor provides the corresponding distance. The combination is both efficient and robust. The system not only identifies hurdles but also suggests a safe path (if available) to the left or right side and tells the user to stop, move left, or move right. The system has been tested in real time by both blindfolded and blind people at different indoor and outdoor locations, demonstrating that it operates adequately. PMID:27057135
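A hedged sketch of the corner-plus-depth decision logic described above, using OpenCV corner detection and a co-registered depth map; the thresholds, the left/right rule and the Kinect access layer are assumptions rather than the authors' implementation:

```python
import cv2
import numpy as np

def navigation_hint(gray_frame, depth_mm, stop_dist_mm=1200):
    """Suggest 'stop', 'left', 'right' or 'forward' from a grayscale camera
    frame and a co-registered depth map (same resolution, depth in mm)."""
    corners = cv2.goodFeaturesToTrack(gray_frame, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)
    if corners is None:
        return "forward"

    h, w = depth_mm.shape
    near_left = near_right = 0
    for x, y in corners.reshape(-1, 2).astype(int):
        d = depth_mm[y, x]
        if 0 < d < stop_dist_mm:                 # obstacle closer than threshold
            if x < w // 2:
                near_left += 1
            else:
                near_right += 1

    if near_left and near_right:
        return "stop"
    if near_left:
        return "right"                            # hurdle on the left: go right
    if near_right:
        return "left"
    return "forward"
```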
Fiber-Optic Surface Temperature Sensor Based on Modal Interference.
Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc
2016-07-28
Spatially-integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation, by means of cost-effective image-processing of few-mode interference patterns. New developments in the field of sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on the image processing of interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system, consisting of an infrared camera and precision temperature probe. As a result, the optical technique is in line with the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: low heat capacity temperature measurement errors, easier spatial deployment, and replacement of multiple angle infrared camera shooting and the continuous monitoring of surfaces that are not visually accessible.
Infrared Speckle Interferometry with 2-D Arrays
NASA Technical Reports Server (NTRS)
Harvey, P. M.; Balkum, S. L.; Monin, J. L.
1994-01-01
We describe results from a program of speckle interferometry with two-dimensional infrared array detectors. Analysis of observations of eta Carinae made with a 58 x 62 InSb detector is discussed. The data have been analyzed with the Labeyrie autocorrelation, a deconvolution of shift-and-add data, and a phase restoration process. Development of a new camera based on a much lower noise HgCdTe detector will lead to a significant improvement in limiting magnitude for IR speckle interferometry.
Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.
We present the final optical design of PANIC (PAnoramic Near Infrared Camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well defined internal pupil, available for reducing the thermal background with a cryogenic pupil stop. A mosaic of four 2k × 2k Hawaii-2RG detectors, made by Teledyne, will give a field of view of 31.9 arcmin × 31.9 arcmin.
Multi-channel automotive night vision system
NASA Astrophysics Data System (ADS)
Lu, Gang; Wang, Li-jun; Zhang, Yi
2013-09-01
A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image processing and display unit; the cameras are placed at the front, left, right and rear of the automobile. The system uses a near-infrared laser light source whose beam is collimated; the light source contains a thermoelectric cooler (TEC), can be synchronized with camera focusing, and has automatic light intensity adjustment, which ensures image quality. The composition of the system is described in detail; on this basis, the beam collimation, the LD driving and LD temperature control of the near-infrared laser light source, and the four-channel image processing and display are discussed. The system can be used for driver assistance, car BLIS, parking assist and alarm systems, by day and night.
Optimising Camera Traps for Monitoring Small Mammals
Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce
2013-01-01
Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats ( Mustela erminea ), feral cats (Felis catus) and hedgehogs ( Erinaceus europaeus ). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790
High-Resolution Mars Camera Test Image of Moon (Infrared)
NASA Technical Reports Server (NTRS)
2005-01-01
This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The mission's High Resolution Imaging Science Experiment camera took the image on Sept. 8, 2005, while at a distance of about 10 million kilometers (6 million miles) from the Moon. The dark feature on the right is Mare Crisium. From that distance, the Moon would appear as a star-like point of light to the unaided eye. The test verified the camera's focusing capability and provided an opportunity for calibration. The spacecraft's Context Camera and Optical Navigation Camera also performed as expected during the test. The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across.
QWIP technology for both military and civilian applications
NASA Astrophysics Data System (ADS)
Gunapala, Sarath D.; Kukkonen, Carl A.; Sirangelo, Mark N.; McQuiston, Barbara K.; Chehayeb, Riad; Kaufmann, M.
2001-10-01
Advanced thermal imaging infrared cameras have been a cost-effective and reliable method to obtain the temperature of objects. Quantum Well Infrared Photodetector (QWIP) based thermal imaging systems have advanced the state of the art and are the most sensitive commercially available thermal systems. QWIP Technologies LLC, under exclusive agreement with the California Institute of Technology (Caltech), is currently manufacturing the QWIP-ChipTM, a 320 x 256 element, bound-to-quasibound QWIP FPA. The camera operates in the long-wave IR band, spectrally peaked at 8.5 μm. The camera is equipped with a 32-bit floating-point digital signal processor combined with multi-tasking software, delivering a digital acquisition resolution of 12 bits with a nominal power consumption of less than 50 W. With a variety of video interface options, remote control capability via an RS-232 connection, and an integrated control driver circuit to support motorized zoom- and focus-compatible lenses, this camera design has excellent applications in both the military and commercial sectors. In the area of remote sensing, high-performance QWIP systems can be used for high-resolution target recognition as part of a new system of airborne platforms (including UAVs). Such systems also have direct application in law enforcement, surveillance, industrial monitoring and road hazard detection systems. This presentation covers the current performance of the commercial QWIP cameras, conceptual platform systems and advanced image processing for use in both military remote sensing and civilian applications currently being developed for road hazard monitoring.
Cao, Yanpeng; Tisse, Christel-Loic
2014-02-01
In this Letter, we propose an efficient and accurate solution to remove temperature-dependent nonuniformity effects introduced by the imaging optics. This single-image-based approach computes optics-related fixed pattern noise (FPN) by fitting the derivatives of correction model to the gradient components, locally computed on an infrared image. A modified bilateral filtering algorithm is applied to local pixel output variations, so that the refined gradients are most likely caused by the nonuniformity associated with optics. The estimated bias field is subtracted from the raw infrared imagery to compensate the intensity variations caused by optics. The proposed method is fundamentally different from the existing nonuniformity correction (NUC) techniques developed for focal plane arrays (FPAs) and provides an essential image processing functionality to achieve completely shutterless NUC for uncooled long-wave infrared (LWIR) imaging systems.
Measurement of reach envelopes with a four-camera Selective Spot Recognition (SELSPOT) system
NASA Technical Reports Server (NTRS)
Stramler, J. H., Jr.; Woolford, B. J.
1983-01-01
The basic Selective Spot Recognition (SELSPOT) system is essentially a system which uses infrared LEDs and a 'camera' with an infrared-sensitive photodetector, a focusing lens, and some A/D electronics to produce a digital output representing an X and Y coordinate for each LED for each camera. When the data are synthesized across all cameras with appropriate calibrations, an XYZ set of coordinates is obtained for each LED at a given point in time. Attention is given to the operating modes, a system checkout, and reach envelopes and software. The Video Recording Adapter (VRA) represents the main addition to the basic SELSPOT system. The VRA contains a microprocessor and other electronics which permit user selection of several options and some interaction with the system.
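Synthesizing the per-camera (X, Y) outputs into XYZ coordinates is, in essence, a multi-camera triangulation; a generic direct-linear-transform sketch is shown below, assuming each calibrated camera is described by a 3x4 projection matrix (this is not the SELSPOT software itself):

```python
import numpy as np

def triangulate_led(projection_matrices, image_points):
    """Least-squares (DLT) triangulation of one LED position.

    projection_matrices : list of 3x4 camera matrices from calibration
    image_points        : list of (x, y) detections, one per camera
    Returns the LED position as a 3-vector in the calibration frame.
    """
    rows = []
    for P, (x, y) in zip(projection_matrices, image_points):
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.vstack(rows)
    # homogeneous solution: right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```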
Infrared stereo calibration for unmanned ground vehicle navigation
NASA Astrophysics Data System (ADS)
Harguess, Josh; Strange, Shawn
2014-06-01
The problem of calibrating two color cameras as a stereo pair has been heavily researched, and many off-the-shelf software packages, such as the Robot Operating System and OpenCV, include calibration routines that work in most cases. However, the problem of calibrating two infrared (IR) cameras for the purposes of sensor fusion and point cloud generation is relatively new, and many challenges exist. We present a comparison of color camera and IR camera stereo calibration using data from an unmanned ground vehicle. There are two main challenges in IR stereo calibration: the calibration board (material, design, etc.) and the accuracy of calibration pattern detection. We present our analysis of these challenges along with our IR stereo calibration methodology. Finally, we present our results both visually and analytically with computed reprojection errors.
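Assuming the calibration pattern can be detected in the IR images (the hard part discussed above), the calibration itself can follow the usual OpenCV pipeline; a compressed sketch, with the board geometry and flags as assumptions:

```python
import cv2
import numpy as np

def ir_stereo_calibrate(left_imgs, right_imgs, board=(9, 6), square_mm=40.0):
    """Stereo-calibrate two IR cameras from paired grayscale chessboard images."""
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square_mm

    obj_pts, left_pts, right_pts = [], [], []
    for li, ri in zip(left_imgs, right_imgs):
        ok_l, cl = cv2.findChessboardCorners(li, board)
        ok_r, cr = cv2.findChessboardCorners(ri, board)
        if ok_l and ok_r:                      # keep only pairs detected in both
            obj_pts.append(objp)
            left_pts.append(cl)
            right_pts.append(cr)

    size = left_imgs[0].shape[::-1]
    # calibrate each camera individually, then solve for the stereo extrinsics
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
    rms, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return rms, K1, d1, K2, d2, R, T   # rms is the reprojection error
```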
Goodman High Throughput Spectrograph | SOAR
SPARTAN Near-IR Camera; Ohio State Infrared Imager/Spectrograph (OSIRIS) - no longer available. The Goodman spectrograph covers the 320-850 nm wavelength range; the paper describing the instrument is Clemens et al. (2004, SPIE).
Yoon, Se Jin; Noh, Si Cheol; Choi, Heung Ho
2007-01-01
Infrared diagnosis devices using infrared cameras provide two-dimensional images and patient-oriented results that can be easily understood by the person being examined; however, they have disadvantages such as large size, high price and inconvenient maintenance. In this regard, this study proposes a small-sized body-heat diagnosis device using a single infrared sensor, and implements an infrared detection system based on a single infrared sensor together with an algorithm that renders thermography from the acquired point-source temperature data. The developed system had a temperature resolution of 0.1 degree and a reproducibility of +/-0.1 degree. The accuracy was 90.39% at an error bound of +/-0 degree and 99.98% at an error bound of +/-0.1 degree. In order to evaluate the proposed algorithm and system, the results were compared with infrared images from a camera-based method. Thermal images with clinical meaning were obtained from a patient with a lesion to verify the system's clinical applicability.
Femtowatt incoherent image conversion from mid-infrared light to near-infrared light
NASA Astrophysics Data System (ADS)
Huang, Nan; Liu, Hongjun; Wang, Zhaolu; Han, Jing; Zhang, Shuan
2017-03-01
We report on the experimental conversion imaging of an incoherent continuous-wave dim source from mid-infrared light to near-infrared light with a lowest input power of 31 femtowatts (fW). Incoherent mid-infrared images of light emission from a heat lamp bulb with an adjustable power supply, at window wavelengths ranging from 2.9 µm to 3.5 µm, are used for upconversion. The sum-frequency generation is realized in a laser cavity resonant at 1064 nm, pumped by a laser diode at 806 nm and built around a periodically poled lithium niobate (PPLN) crystal. The converted infrared image, at a wavelength of ~785 nm and with a resolution of about 120 × 70, is detected with low noise using a silicon-based camera. By optimizing the system parameters, the upconversion quantum efficiency is predicted to be 28% for correctly polarized, on-axis and phase-matched light.
Near-infrared autofluorescence imaging to detect parathyroid glands in thyroid surgery.
Ladurner, R; Al Arabi, N; Guendogar, U; Hallfeldt, Kkj; Stepp, H; Gallwas, Jks
2018-01-01
Objective To identify and save parathyroid glands during thyroidectomy by displaying their autofluorescence. Methods Autofluorescence imaging was carried out during thyroidectomy with and without central lymph node dissection. After visual recognition by the surgeon, the parathyroid glands and the surrounding tissue were exposed to near-infrared light with a wavelength of 690-770 nm using a modified Karl Storz near infrared/indocyanine green endoscopic system. Parathyroid tissue was expected to show near infrared autofluorescence at 820 nm, captured in the blue channel of the camera. Results We investigated 41 parathyroid glands from 20 patients; 37 glands were identified correctly based on near-infrared autofluorescence. Neither lymph nodes nor thyroid revealed substantial autofluorescence and nor did adipose tissue. Conclusions Parathyroid tissue is characterised by showing autofluorescence in the near-infrared spectrum. This effect can be used to identify and preserve parathyroid glands during thyroidectomy.
AKARI Infrared Camera Survey of the Large Magellanic Cloud
NASA Astrophysics Data System (ADS)
Shimonishi, Takashi; Kato, Daisuke; Ita, Yoshifusa; Onaka, Takashi
2015-08-01
The Large Magellanic Cloud (LMC) is one of the closest external galaxies to the Milky Way and has been playing a central role in various fields of modern astronomy and astrophysics. We conducted an unbiased near- to mid-infrared imaging and spectroscopic survey of the LMC with the infrared satellite AKARI. An area of about 10 square degrees of the LMC was observed by five imaging bands (each centered at 3.2, 7, 11, 15, and 24 micron) and the low-resolution slitless prism spectroscopy mode (2--5 micron, R~20) equipped with the Infrared Camera on board AKARI. Based on the data obtained in the survey, we constructed the photometric and spectroscopic catalogues of point sources in the LMC. The photometric catalogue includes about 650,000, 90,000, 49,000, 17,000, 7,000 sources at 3.2, 7, 11, 15, and 24 micron, respectively (Ita et al. 2008, PASJ, 60, 435; Kato et al. 2012, AJ, 144, 179), while the spectroscopic catalogue includes 1,757 sources (Shimonishi et al. 2013, AJ, 145, 32). Both catalogs are publicly released and available through a website (AKARI Observers Page, http://www.ir.isas.ac.jp/AKARI/Observation/). The catalog includes various infrared sources such as young stellar objects, asymptotic giant branch stars, giants/supergiants, and many other cool or dust-enshrouded stars. A large number of near-infrared spectral data, coupled with complementary broadband photometric data, allow us to investigate infrared spectral features of sources by comparison with their spectral energy distributions. Combined use of the present AKARI LMC catalogues with other infrared catalogues such as SAGE and HERITAGE possesses scientific potential that can be applied to various astronomical studies. In this presentation, we report the details of the AKARI photometric and spectroscopic catalogues of the LMC.
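Combining the AKARI LMC catalogues with surveys such as SAGE typically starts with a positional cross-match; a minimal astropy sketch with an assumed matching radius:

```python
import astropy.units as u
from astropy.coordinates import SkyCoord

def crossmatch(akari_ra_deg, akari_dec_deg, other_ra_deg, other_dec_deg,
               radius_arcsec=3.0):
    """Match AKARI LMC point sources to another infrared catalogue by position.

    Returns, for each AKARI source, the index of the nearest counterpart and a
    boolean mask of matches within the (assumed) radius.
    """
    akari = SkyCoord(ra=akari_ra_deg * u.deg, dec=akari_dec_deg * u.deg)
    other = SkyCoord(ra=other_ra_deg * u.deg, dec=other_dec_deg * u.deg)
    idx, sep2d, _ = akari.match_to_catalog_sky(other)
    return idx, sep2d < radius_arcsec * u.arcsec
```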
NASA Astrophysics Data System (ADS)
Gouverneur, B.; Verstockt, S.; Pauwels, E.; Han, J.; de Zeeuw, P. M.; Vermeiren, J.
2012-10-01
Various visible and infrared cameras have been tested for the early detection of wildfires to protect archeological treasures. This analysis was possible thanks to the EU Firesense project (FP7-244088). Although visible cameras are low cost and give good results during daytime for smoke detection, they fall short under bad visibility conditions. In order to improve the fire detection probability and reduce false alarms, several infrared bands are tested, ranging from the NIR to the LWIR. The SWIR and LWIR bands are helpful for locating the fire through smoke if there is a direct line of sight. Emphasis is also placed on the physical and electro-optical system modeling for forest fire detection at short and longer ranges. Fusion of the three bands (visible, SWIR, LWIR) is discussed at the pixel level for image enhancement and for fire detection.
Passive Infrared Thermographic Imaging for Mobile Robot Object Identification
NASA Astrophysics Data System (ADS)
Hinders, M. K.; Fehlman, W. L.
2010-02-01
The usefulness of thermal infrared imaging as a mobile robot sensing modality is explored, and a set of thermal-physical features used to characterize passive thermal objects in outdoor environments is described. Objects that extend laterally beyond the thermal camera's field of view, such as brick walls, hedges, picket fences, and wood walls as well as compact objects that are laterally within the thermal camera's field of view, such as metal poles and tree trunks, are considered. Classification of passive thermal objects is a subtle process since they are not a source for their own emission of thermal energy. A detailed analysis is included of the acquisition and preprocessing of thermal images, as well as the generation and selection of thermal-physical features from these objects within thermal images. Classification performance using these features is discussed, as a precursor to the design of a physics-based model to automatically classify these objects.
Teaching physics and understanding infrared thermal imaging
NASA Astrophysics Data System (ADS)
Vollmer, Michael; Möllmann, Klaus-Peter
2017-08-01
Infrared thermal imaging is a very rapidly evolving field. The latest trends are small smartphone IR camera accessories, making infrared imaging a widespread and well-known consumer product. Applications range from medical diagnosis via building inspections and industrial predictive maintenance to visualization in the natural sciences. Infrared cameras allow qualitative imaging and visualization but also quantitative measurements of the surface temperatures of objects. On the one hand, they are a particularly suitable tool to teach optics, radiation physics and many selected topics in different fields of physics; on the other hand, there is an increasing need for engineers and physicists who understand these complex state-of-the-art photonics systems. Therefore students must also learn and understand the physics underlying these systems.
Atomically thin noble metal dichalcogenide: a broadband mid-infrared semiconductor.
Yu, Xuechao; Yu, Peng; Wu, Di; Singh, Bahadur; Zeng, Qingsheng; Lin, Hsin; Zhou, Wu; Lin, Junhao; Suenaga, Kazu; Liu, Zheng; Wang, Qi Jie
2018-04-18
Mid-infrared technologies underpin many important optoelectronic applications, ranging from optical communications and biomedical imaging to night vision cameras. Although narrow bandgap semiconductors, such as mercury cadmium telluride and indium antimonide, and quantum superlattices based on inter-subband transitions in wide bandgap semiconductors have been employed for mid-infrared applications, it remains a daunting challenge to find other materials that possess suitable bandgaps in this wavelength range. Here, we demonstrate experimentally for the first time that two-dimensional (2D) atomically thin PtSe2 has a variable bandgap in the mid-infrared via layer and defect engineering. We show that bilayer PtSe2 combined with defect modulation possesses strong light absorption in the mid-infrared region, and we realize a mid-infrared photoconductive detector operating over a broadband mid-infrared range. Our results pave the way for atomically thin 2D noble metal dichalcogenides to be employed in high-performance mid-infrared optoelectronic devices.
A low-cost dual-camera imaging system for aerial applicators
USDA-ARS?s Scientific Manuscript database
Agricultural aircraft provide a readily available remote sensing platform as low-cost and easy-to-use consumer-grade cameras are being increasingly used for aerial imaging. In this article, we report on a dual-camera imaging system we recently assembled that can capture RGB and near-infrared (NIR) i...
A DirtI Application for LBT Commissioning Campaigns
NASA Astrophysics Data System (ADS)
Borelli, J. L.
2009-09-01
In order to characterize the Gregorian focal stations and test the performance achieved by the Large Binocular Telescope (LBT) adaptive optics system, two infrared test cameras were constructed within a joint project between INAF (Osservatorio Astronomico di Bologna, Italy) and the Max Planck Institute for Astronomy (Germany). We describe here the functionality of the Daemon for the Infrared Test Camera Interface (DirtI) and the successful results obtained with it during commissioning campaigns.
Ensuring long-term stability of infrared camera absolute calibration.
Kattnig, Alain; Thetas, Sophie; Primot, Jérôme
2015-07-13
Absolute calibration of cryogenic 3-5 µm and 8-10 µm infrared cameras is notoriously unstable and thus has to be repeated before actual measurements. Moreover, the signal-to-noise ratio of the imagery is lowered, decreasing its quality. These performance degradations strongly lessen the suitability of infrared imaging. These defects are often blamed on detectors reaching a different "response state" after each return to cryogenic conditions, while accounting for the detrimental effects of imperfect stray light management. We show here that detectors are not to blame and that the culprit can also dwell in the proximity electronics. We identify an unexpected source of instability in the initial voltage of the integrating capacity of detectors. We then show that this parameter can be easily measured and taken into account. In this way we demonstrate that a one-month-old calibration of a 3-5 µm camera has retained its validity.
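One way to picture the correction is to fold the measured initial voltage of the integrating capacitor into the offset term of a linear radiometric calibration; the sketch below is a schematic model under that assumption, not the authors' processing chain, and every symbol is illustrative:

```python
import numpy as np

def radiometric_correction(raw_dn, gain, offset_ref, v_init, v_init_ref, k_volt):
    """Apply a linear radiometric calibration whose offset is adjusted for the
    measured initial voltage of the detector integrating capacitor.

    raw_dn      : raw frame (digital numbers)
    gain        : per-pixel gain from the original absolute calibration
    offset_ref  : per-pixel offset measured at calibration time
    v_init      : integrating-capacitor initial voltage measured now
    v_init_ref  : the same voltage recorded during calibration
    k_volt      : assumed sensitivity of the offset to that voltage (DN per volt),
                  determined once from dedicated measurements
    """
    offset = offset_ref + k_volt * (v_init - v_init_ref)
    return gain * (np.asarray(raw_dn, dtype=float) - offset)
```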
2016-04-15
The newest instrument, an infrared camera called the High-resolution Airborne Wideband Camera-Plus (HAWC+), was installed on the Stratospheric Observatory for Infrared Astronomy, SOFIA, in April of 2016. This is the only currently operating astronomical camera that makes images using far-infrared light, allowing studies of low-temperature early stages of star and planet formation. HAWC+ includes a polarimeter, a device that measures the alignment of incoming light waves. With the polarimeter, HAWC+ can map magnetic fields in star forming regions and in the environment around the supermassive black hole at the center of the Milky Way galaxy. These new maps can reveal how the strength and direction of magnetic fields affect the rate at which interstellar clouds condense to form new stars. A team led by C. Darren Dowell at NASA’s Jet Propulsion Laboratory and including participants from more than a dozen institutions developed the instrument.
CANICA: The Cananea Near-Infrared Camera at the 2.1 m OAGH Telescope
NASA Astrophysics Data System (ADS)
Carrasco, L.; Hernández Utrera, O.; Vázquez, S.; Mayya, Y. D.; Carrasco, E.; Pedraza, J.; Castillo-Domínguez, E.; Escobedo, G.; Devaraj, R.; Luna, A.
2017-10-01
The Cananea near-infrared camera (CANICA) is an instrument commissioned at the 2.12 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA operates in the near-infrared in multiple bands, including the J (1.24 μm), H (1.63 μm) and K' (2.12 μm) broad bands. CANICA is located at the Ritchey-Chrétien focal plane of the telescope, reimaging the f/12 beam into an f/6 beam. The detector is a 1024 × 1024 HgCdTe HAWAII array with 18.5 μm pixels, covering a field of view of 5.5 × 5.5 arcmin2, for a plate scale of 0.32 arcsec/pixel. The camera is enclosed in a cryostat, cooled with liquid nitrogen to 77 K. The cryostat contains the collimator, two 15-position filter wheels, a single fixed reimaging optic and the detector.
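As a quick consistency check, the quoted field of view follows directly from the plate scale and the array format:

```python
def fov_arcmin(plate_scale_arcsec_per_px, n_pixels):
    """Field of view of a square detector, in arcminutes per side."""
    return plate_scale_arcsec_per_px * n_pixels / 60.0

# CANICA: 0.32 arcsec/pixel on a 1024 x 1024 array -> about 5.5 arcmin per side
# fov_arcmin(0.32, 1024)  ->  5.46
```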
Navigating surgical fluorescence cameras using near-infrared optical tracking.
van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs
2018-05-01
Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera within reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced interference during NIR fluorescence imaging. To support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one OTS-simulating light-emitting-diode setup. Following the outcome of these measurements, the OTS settings were optimized. Measurements indicated the OTS interference was caused by: (1) spectral overlap between the OTS light and the camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Students' framing of laboratory exercises using infrared cameras
NASA Astrophysics Data System (ADS)
Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.
2015-12-01
Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N =30 ) partook in four IR-camera laboratory activities, designed around the predict-observe-explain approach of White and Gunstone. The activities involved central thermal concepts that focused on heat conduction and dissipative processes such as friction and collisions. Students' interactions within each activity were videotaped and the analysis focuses on how a purposefully selected group of three students engaged with the exercises. As the basis for an interpretative study, a "thick" narrative description of the students' epistemological and conceptual framing of the exercises and how they took advantage of the disciplinary affordance of IR cameras in the thermal domain is provided. Findings include that the students largely shared their conceptual framing of the four activities, but differed among themselves in their epistemological framing, for instance, in how far they found it relevant to digress from the laboratory instructions when inquiring into thermal phenomena. In conclusion, the study unveils the disciplinary affordances of infrared cameras, in the sense of their use in providing access to knowledge about macroscopic thermal science.
Gundle, Kenneth R.; White, Jedediah K.; Conrad, Ernest U.; Ching, Randal P.
2017-01-01
Introduction: Surgical navigation systems are increasingly used to aid resection and reconstruction of osseous malignancies. In the process of implementing image-based surgical navigation systems, there are numerous opportunities for error that may impact surgical outcome. This study aimed to examine modifiable sources of error in an idealized scenario, when using a bidirectional infrared surgical navigation system. Materials and Methods: Accuracy and precision were assessed using a computerized-numerical-controlled (CNC) machined grid with known distances between indentations while varying: 1) the distance from the grid to the navigation camera (range 150 to 247 cm), 2) the distance from the grid to the patient tracker device (range 20 to 40 cm), and 3) whether the minimum or maximum number of bidirectional infrared markers were actively functioning. For each scenario, distances between grid points were measured at 10-mm increments between 10 and 120 mm, with twelve measurements made at each distance. The accuracy outcome was the root mean square (RMS) error between the navigation system distance and the actual grid distance. To assess precision, four indentations were recorded six times for each scenario while also varying the angle of the navigation system pointer. The outcome for precision testing was the standard deviation of the distance from each measured point to the mean three-dimensional coordinate of the six points in each cluster. Results: Univariate and multiple linear regression revealed that as the distance from the navigation camera to the grid increased, the RMS error increased (p<0.001). The RMS error also increased when not all infrared markers were actively tracking (p=0.03), and as the measured distance increased (p<0.001). In a multivariate model, these factors accounted for 58% of the overall variance in the RMS error. Standard deviations in repeated measures also increased when not all infrared markers were active (p<0.001), and as the distance between navigation camera and physical space increased (p=0.005). Location of the patient tracker did not affect accuracy (p=0.36) or precision (p=0.97). Conclusion: In our model laboratory test environment, the infrared bidirectional navigation system was more accurate and precise when the distance from the navigation camera to the physical (working) space was minimized and all bidirectional markers were active. These findings may require alterations in operating room setup and software changes to improve the performance of this system. PMID:28694888
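The two outcome metrics described above are straightforward to compute; the following Python sketch illustrates them with hypothetical stand-in data rather than the study's measurements: the accuracy metric is the root-mean-square error between navigated and true grid distances, and the precision metric is the spread of repeated digitizations of a point about their mean coordinate.

import numpy as np

# Hypothetical measurements: navigated vs. true grid distances [mm]
true_d = np.arange(10, 121, 10, dtype=float)                 # 10..120 mm in 10-mm steps
nav_d = true_d + np.random.normal(0.0, 0.4, true_d.size)     # stand-in readings

# Accuracy: RMS error between navigation-system and actual grid distances
rms_error = np.sqrt(np.mean((nav_d - true_d) ** 2))

# Precision: spread of six repeated digitizations of one indentation [mm]
pts = np.random.normal([10.0, 20.0, 5.0], 0.2, size=(6, 3))  # six 3D points around one indentation
centroid = pts.mean(axis=0)
precision_sd = np.std(np.linalg.norm(pts - centroid, axis=1))

print(f"RMS error = {rms_error:.2f} mm, precision SD = {precision_sd:.2f} mm")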
A passive terahertz video camera based on lumped element kinetic inductance detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowe, Sam, E-mail: sam.rowe@astro.cf.ac.uk; Pascale, Enzo; Doyle, Simon
We have developed a passive 350 GHz (850 μm) video-camera to demonstrate lumped element kinetic inductance detectors (LEKIDs)—designed originally for far-infrared astronomy—as an option for general purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ∼0.1 K, which is close to the background limit. The 152 element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low noise amplifier and room temperature frequency domain multiplexing electronics.
Forward-Looking Infrared Cameras for Micrometeorological Applications within Vineyards
Katurji, Marwan; Zawar-Reza, Peyman
2016-01-01
We apply the principles of atmospheric surface layer dynamics within a vineyard canopy to demonstrate the use of forward-looking infrared cameras measuring surface brightness temperature (spectrum bandwidth of 7.5 to 14 μm) at a relatively high temporal rate of 10 s. The temporal surface brightness signal over a few hours of the stable nighttime boundary layer, intermittently interrupted by periods of turbulent heat flux surges, was shown to be related to the observed meteorological measurements by an in situ eddy-covariance system, and reflected the above-canopy wind variability. The infrared raster images were collected and the resultant self-organized spatial cluster provided the meteorological context when compared to in situ data. The spatial brightness temperature pattern was explained in terms of the presence or absence of nighttime cloud cover and down-welling of long-wave radiation and the canopy turbulent heat flux. Time sequential thermography as demonstrated in this research provides positive evidence behind the application of thermal infrared cameras in the domain of micrometeorology, and to enhance our spatial understanding of turbulent eddy interactions with the surface. PMID:27649208
Kinzel, Paul J.; Legleiter, Carl; Nelson, Jonathan M.; Conaway, Jeffrey S.
2017-01-01
Thermal cameras with high sensitivity to medium and long wavelengths can resolve features at the surface of flowing water arising from turbulent mixing. Images acquired by these cameras can be processed with particle image velocimetry (PIV) to compute surface velocities based on the displacement of thermal features as they advect with the flow. We conducted a series of field measurements to test this methodology for remote sensing of surface velocities in rivers. We positioned an infrared video camera at multiple stations across bridges that spanned five rivers in Alaska. Simultaneous non-contact measurements of surface velocity were collected with a radar gun. In situ velocity profiles were collected with Acoustic Doppler Current Profilers (ADCP). Infrared image time series were collected at a frequency of 10 Hz for a one-minute duration at a number of stations spaced across each bridge. Commercial PIV software used a cross-correlation algorithm to calculate pixel displacements between successive frames, which were then scaled to produce surface velocities. A blanking distance below the ADCP prevents a direct measurement of the surface velocity. However, we estimated surface velocity from the ADCP measurements using a program that normalizes each ADCP transect and combines those normalized transects to compute a mean measurement profile. The program can fit a power law to the profile and in so doing provides a velocity index, the ratio between the depth-averaged and surface velocity. For the rivers in this study, the velocity index ranged from 0.82 to 0.92. Average radar and extrapolated ADCP surface velocities were in good agreement with average infrared PIV calculations.
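The displacement step of the PIV processing described above can be illustrated with a minimal FFT cross-correlation of one interrogation window; the sketch below uses synthetic frames and hypothetical scaling values (frame rate, ground resolution, velocity index) and is not the commercial PIV software used in the study.

import numpy as np

def window_displacement(win_a, win_b):
    """Integer-pixel shift of features from win_a to win_b via FFT cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # indices beyond half the window size correspond to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic interrogation windows: frame B is frame A shifted by (dy, dx) = (3, 7)
rng = np.random.default_rng(1)
win_a = rng.random((64, 64))
win_b = np.roll(win_a, (3, 7), axis=(0, 1))
dy, dx = window_displacement(win_a, win_b)

# Hypothetical scaling: 10 Hz frame rate and 5 mm per pixel ground resolution
dt, scale = 0.1, 0.005                                 # seconds per frame, metres per pixel
surface_velocity = np.hypot(dx, dy) * scale / dt       # m/s

# Depth-averaged velocity via the velocity index (0.82-0.92 reported above; 0.85 assumed here)
print(dy, dx, surface_velocity, 0.85 * surface_velocity)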
Method and apparatus for implementing material thermal property measurement by flash thermal imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Jiangang
A method and apparatus are provided for implementing measurement of material thermal properties, including measurement of the thermal effusivity of a coating and/or film or of a bulk material of uniform property. The test apparatus includes an infrared camera; a data acquisition and processing computer coupled to the infrared camera for acquiring and processing thermal image data; and a flash lamp providing an input of heat onto the surface of a two-layer sample, with an enhanced optical filter covering the flash lamp to attenuate the entire infrared wavelength range, while a series of thermal images is taken of the surface of the two-layer sample.
Near-infrared fluorescence imaging with a mobile phone (Conference Presentation)
NASA Astrophysics Data System (ADS)
Ghassemi, Pejhman; Wang, Bohan; Wang, Jianting; Wang, Quanzeng; Chen, Yu; Pfefer, T. Joshua
2017-03-01
Mobile phone cameras employ sensors with near-infrared (NIR) sensitivity, yet this capability has not been exploited for biomedical purposes. Removing the IR-blocking filter from a phone-based camera opens the door to a wide range of techniques and applications for inexpensive, point-of-care biophotonic imaging and sensing. This study provides proof of principle for one of these modalities - phone-based NIR fluorescence imaging. An imaging system was assembled using a 780 nm light source along with excitation and emission filters with 800 nm and 825 nm cut-off wavelengths, respectively. Indocyanine green (ICG) was used as an NIR fluorescence contrast agent in an ex vivo rodent model, a resolution test target and a 3D-printed, tissue-simulating vascular phantom. Raw and processed images for red, green and blue pixel channels were analyzed for quantitative evaluation of fundamental performance characteristics including spectral sensitivity, detection linearity and spatial resolution. Mobile phone results were compared with a scientific CCD. The spatial resolution of the CCD system was consistently superior to that of the phone, and the green phone camera pixels showed better resolution than the blue or red channels. The CCD exhibited similar sensitivity to the processed red and blue pixel channels, yet a greater degree of detection linearity. Raw phone pixel data showed lower sensitivity but greater linearity than processed data. Overall, both qualitative and quantitative results provided strong evidence of the potential of phone-based NIR imaging, which may lead to a wide range of applications from cancer detection to glucose sensing.
Gyrocopter-Based Remote Sensing Platform
NASA Astrophysics Data System (ADS)
Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.
2015-04-01
In this paper the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications, utilizing a gyrocopter as a carrier platform, is described. The current sensor configuration consists of a high-resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover, a custom-developed thermal imaging system composed of a VIS-PAN camera and an LWIR camera is used for aerial recordings in the thermal infrared range. Furthermore, another custom-developed, highly flexible imaging system for high-resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities, and the precalculated requirements with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched into mosaics.
Performance of Backshort-Under-Grid Kilopixel TES Arrays for HAWC+
NASA Technical Reports Server (NTRS)
Staguhn, J. G.; Benford, D. J.; Dowell, C. D.; Fixsen, D. J.; Hilton, G. C.; Irwin, K. D.; Jhabvala, C. A.; Maher, S. F.; Miller, T. M.; Moseley, S. H.;
2016-01-01
We present results from laboratory detector characterizations of the first kilopixel BUG arrays for the High-resolution Airborne Wideband Camera Plus (HAWC+), the imaging far-infrared polarimeter camera for the Stratospheric Observatory for Infrared Astronomy (SOFIA). Our tests demonstrate that the array performance is consistent with the predicted properties. Here, we highlight results obtained for the thermal conductivity, noise performance, and detector speed, together with first optical results demonstrating the pixel yield of the arrays.
Infrared imaging spectrometry by the use of bundled chalcogenide glass fibers and a PtSi CCD camera
NASA Astrophysics Data System (ADS)
Saito, Mitsunori; Kikuchi, Katsuhiro; Tanaka, Chinari; Sone, Hiroshi; Morimoto, Shozo; Yamashita, Toshiharu T.; Nishii, Junji
1999-10-01
A coherent fiber bundle for infrared image transmission was prepared by arranging 8400 chalcogenide (As-S) glass fibers. The fiber bundle, 1 m in length, is transmissive in the infrared spectral region of 1-6 micrometers. A remote spectroscopic imaging system was constructed with the fiber bundle and an infrared PtSi CCD camera. The system was used for real-time observation (frame time: 1/60 s) of gas distributions. Infrared light from a SiC heater was delivered to a gas cell through a chalcogenide fiber, and the transmitted light was observed through the fiber bundle. A band-pass filter was used for the selection of gas species. A He-Ne laser of 3.4 micrometer wavelength was also used for the observation of hydrocarbon gases. Gases bursting from a nozzle were observed successfully by the remote imaging system.
Enhancement of High-Speed Infrared Array Electronics (Center Director's Discretionary Fund)
NASA Technical Reports Server (NTRS)
Sutherland, W. T.
1996-01-01
A state-of-the-art infrared detector was to be used as the sensor in a new spectrometer-camera for astronomical observations. The sensitivity of the detector required the use of low-noise, high-speed electronics in the system design. The key component in the electronic system was the pre-amplifier that amplified the low-voltage signal coming from the detector. The system design was based on the selection of this amplifier, which in turn was driven by the maximum noise level that would still yield the desired sensitivity for the telescope system.
Near infrared photography with a vacuum-cold camera. [Orion nebula observation
NASA Technical Reports Server (NTRS)
Rossano, G. S.; Russell, R. W.; Cornett, R. H.
1980-01-01
Cooled, sensitized plates of the Orion nebula region and of Sh2-149 have been obtained in the wavelength ranges 8000-9000 A and 9000-11,000 A with a recently designed and constructed vacuum-cold camera. Sensitization procedures are described and the camera design is presented.
Darmanis, Spyridon; Toms, Andrew; Durman, Robert; Moore, Donna; Eyres, Keith
2007-07-01
The aim was to reduce the operating time in computer-assisted navigated total knee replacement (TKR) by improving communication between the infrared camera and the trackers placed on the patient. The innovation involves placing a routinely used laser pointer on top of the camera, so that the infrared cameras focus precisely on the trackers located on the knee to be operated on. A prospective randomized study was performed involving 40 patients divided into two groups, A and B. Both groups underwent navigated TKR, but for group B patients a laser pointer was used to improve the targeting capabilities of the cameras. Without the laser pointer, the camera had to be moved a mean of 9.2 times in order to identify the trackers. With the introduction of the laser pointer, this was reduced to 0.9 times. Accordingly, the additional mean time required without the laser pointer was 11.6 minutes. Time delays are a major problem in computer-assisted surgery, and our technical suggestion can contribute towards reducing the delays associated with this particular application.
Infrared Technology Trends and Implications to Home and Building Energy Use Efficiency
NASA Astrophysics Data System (ADS)
Woolaway, James T.
2008-09-01
It has long been realized that infrared technology would have applicability in improving the energy efficiency of homes and buildings. Walls with missing or poor insulation can be quickly evaluated by looking at thermal images of these surfaces. Similarly, air infiltration leaks under doors and around windows leave a telltale thermal signature easily seen in the infrared. The ability to view, evaluate and quickly respond to these images has immediate benefits in addressing and correcting situations where these types of losses are occurring. The principal issue that has limited the use of infrared technology in these applications has been the lack of availability and accessibility of infrared technology at a cost point suited to this market. The emergence of low-cost microbolometer-based infrared cameras, which do not need sensor cooling, will greatly increase the accessibility and use of infrared technology for House Doctor inspections. The technology cost for this use is projected to be less than $1 per inspection.
Cheng, Victor S; Bai, Jinfen; Chen, Yazhu
2009-11-01
As the needs for various kinds of body surface information are wide-ranging, we developed an imaging-sensor integrated system that can synchronously acquire high-resolution three-dimensional (3D) far-infrared (FIR) thermal and true-color images of the body surface. The proposed system integrates one FIR camera and one color camera with a 3D structured-light binocular profilometer. To avoid disturbing the person being examined with the intense light projected directly into the eyes from the LCD projector, we developed a gray-encoding strategy based on an optimum fringe-projection layout. A self-heated checkerboard was employed to calibrate the different types of cameras. We then calibrated the structured light emitted by the LCD projector, based on the stereo-vision principle and a least-squares quadric surface-fitting algorithm. The precise 3D surface can then be fused with the undistorted thermal and color images. To support medical applications, a region of interest (ROI) in the temperature or color image representing a surface area of clinical interest can be located at the corresponding position in the other images through coordinate-system transformation. System evaluation demonstrated a mapping error between the FIR and visual images of three pixels or less. Experiments show that this work is useful in certain disease diagnoses.
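The ROI transfer described above relies on the full calibrated 3D surface; as a much simpler planar illustration of mapping coordinates between the thermal and color views, the OpenCV sketch below uses a homography estimated from hypothetical checkerboard correspondences.

import numpy as np
import cv2

# Hypothetical corresponding points picked on a calibration checkerboard,
# seen by the FIR camera (src) and the color camera (dst), in pixel coordinates.
src = np.array([[100, 80], [400, 90], [390, 300], [110, 290]], dtype=np.float32)
dst = np.array([[120, 60], [430, 75], [415, 310], [130, 295]], dtype=np.float32)

H, _ = cv2.findHomography(src, dst)   # planar mapping from FIR to color image

# Map an ROI outline drawn on the thermal image into the color image.
roi_thermal = np.array([[[150, 120]], [[220, 120]], [[220, 200]], [[150, 200]]],
                       dtype=np.float32)
roi_color = cv2.perspectiveTransform(roi_thermal, H)
print(roi_color.reshape(-1, 2))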
Hyperspectral imaging polarimeter in the infrared
NASA Astrophysics Data System (ADS)
Jensen, Gary L.; Peterson, James Q.
1998-11-01
The Space Dynamics Laboratory at Utah State University is building an infrared Hyperspectral Imaging Polarimeter (HIP). Designed for high spatial and spectral resolution polarimetry of backscattered sunlight from cloud tops in the 2.7 micrometer water band, it will fly aboard the Flying Infrared Signatures Technology Aircraft (FISTA), an Air Force KC-135. It is a proof-of-concept sensor, combining hyperspectral pushbroom imaging with high speed, solid state polarimetry, using as many off-the-shelf components as possible, and utilizing an optical breadboard design for rapid prototyping. It is based around a 256 × 320 window-selectable InSb camera, a solid-state ferroelectric liquid crystal (FLC) polarimeter, and a transmissive diffraction grating.
Probabilistic multi-resolution human classification
NASA Astrophysics Data System (ADS)
Tu, Jun; Ran, H.
2006-02-01
Recently there has been growing interest in using infrared cameras for human detection because of their sharply decreasing prices. The training data used in our work for developing the probabilistic template consist of images known to contain humans of the same height but in different poses and orientations. Multiresolution templates are constructed, based on contours and edges, so that the model does not learn the intensity variations among the background pixels or among the foreground pixels. Each template at every level is then translated so that the centroid of the non-zero pixels matches the geometrical center of the image. After this normalization step, for each pixel of the template, the probability of it belonging to a pedestrian is calculated based on how frequently it appears as 1 in the training data. We also use gait periodicity to verify the pedestrian in a Bayesian manner for the whole blob in a probabilistic way. The test videos had considerable variation in scenes, sizes of people, amount of occlusion, and clutter in the backgrounds. Preliminary experiments show the robustness of the approach.
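A minimal sketch of the probabilistic template described above is shown below, assuming a stack of centroid-aligned binary training silhouettes; the multiresolution levels, contour/edge extraction and the Bayesian gait verification are omitted, and all data are hypothetical.

import numpy as np

def build_probabilistic_template(masks):
    """Per-pixel probability of 'pedestrian', estimated as the frequency of 1s
    across centroid-aligned binary training silhouettes (all the same shape)."""
    stack = np.stack(masks).astype(float)
    return stack.mean(axis=0)

def score_candidate(template, blob_mask):
    """Average template probability over the candidate blob's foreground pixels."""
    fg = blob_mask.astype(bool)
    return template[fg].mean() if fg.any() else 0.0

# Hypothetical training data: three 64x32 silhouette masks
masks = [np.zeros((64, 32), dtype=np.uint8) for _ in range(3)]
for m in masks:
    m[8:60, 10:22] = 1                      # crude torso-shaped region

template = build_probabilistic_template(masks)
candidate = np.zeros((64, 32), dtype=np.uint8)
candidate[10:58, 11:21] = 1
print(f"pedestrian score = {score_candidate(template, candidate):.2f}")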
EARLY SCIENCE WITH SOFIA, THE STRATOSPHERIC OBSERVATORY FOR INFRARED ASTRONOMY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, E. T.; Becklin, E. E.; De Buizer, J. M.
The Stratospheric Observatory For Infrared Astronomy (SOFIA) is an airborne observatory consisting of a specially modified Boeing 747SP with a 2.7 m telescope, flying at altitudes as high as 13.7 km (45,000 ft). Designed to observe at wavelengths from 0.3 μm to 1.6 mm, SOFIA operates above 99.8% of the water vapor that obscures much of the infrared and submillimeter. SOFIA has seven science instruments under development, including an occultation photometer, near-, mid-, and far-infrared cameras, infrared spectrometers, and heterodyne receivers. SOFIA, a joint project between NASA and the German Aerospace Center Deutsches Zentrum fuer Luft- und Raumfahrt, began initial science flights in 2010 December, and has conducted 30 science flights in the subsequent year. During this early science period three instruments have flown: the mid-infrared camera FORCAST, the heterodyne spectrometer GREAT, and the occultation photometer HIPO. This Letter provides an overview of the observatory and its early performance.
ARNICA, the NICMOS 3 imaging camera of TIRGO.
NASA Astrophysics Data System (ADS)
Lisi, F.; Baffa, C.; Hunt, L.; Stanga, R.
ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 μm that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1″ per pixel, with sky coverage of more than 4 arcmin × 4 arcmin on the NICMOS 3 (256×256 pixels, 40 μm side) detector array. The camera is remotely controlled by a PC 486, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the PC 486, acquires and stores the frames and controls the timing of the array. The camera is intended for imaging of large extragalactic and Galactic fields; a large effort has been dedicated to exploring the possibility of achieving precise photometric measurements in the J, H, and K astronomical bands, with very promising results.
Experience with the UKIRT InSb array camera
NASA Technical Reports Server (NTRS)
Mclean, Ian S.; Casali, Mark M.; Wright, Gillian S.; Aspin, Colin
1989-01-01
The cryogenic infrared camera, IRCAM, has been operating routinely on the 3.8 m UK Infrared Telescope on Mauna Kea, Hawaii for over two years. The camera, which uses a 62x58 element Indium Antimonide array from Santa Barbara Research Center, was designed and built at the Royal Observatory, Edinburgh which operates UKIRT on behalf of the UK Science and Engineering Research Council. Over the past two years at least 60% of the available time on UKIRT has been allocated for IRCAM observations. Described here are some of the properties of this instrument and its detector which influence astronomical performance. Observational techniques and the power of IR arrays with some recent astronomical results are discussed.
Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera
NASA Astrophysics Data System (ADS)
Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu
2016-09-01
We perform an experiment of achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under visible and infrared illumination. As a result, the removal of chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration set assuming the use of a night-vision camera in an automobile and showed the effect of the WFC system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croll, Bryce; Albert, Loic; Lafreniere, David
We present detections of the near-infrared thermal emission of three hot Jupiters and one brown dwarf using the Wide-field Infrared Camera (WIRCam) on the Canada-France-Hawaii Telescope (CFHT). These include Ks-band secondary eclipse detections of the hot Jupiters WASP-3b and Qatar-1b and the brown dwarf KELT-1b. We also report Y-band, K_CONT-band, and two new and one reanalyzed Ks-band detections of the thermal emission of the hot Jupiter WASP-12b. We present a new reduction pipeline for CFHT/WIRCam data, which is optimized for high precision photometry. We also describe novel techniques for constraining systematic errors in ground-based near-infrared photometry, so as to return reliable secondary eclipse depths and uncertainties. We discuss the noise properties of our ground-based photometry for wavelengths spanning the near-infrared (the YJHK bands), for faint and bright stars, and for the same object on several occasions. For the hot Jupiters WASP-3b and WASP-12b we demonstrate the repeatability of our eclipse depth measurements in the Ks band; we therefore place stringent limits on the systematics of ground-based, near-infrared photometry, and also rule out violent weather changes in the deep, high pressure atmospheres of these two hot Jupiters at the epochs of our observations.
Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing
NASA Technical Reports Server (NTRS)
Crooke, Julie A.
2003-01-01
The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure the azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral density (ND) filters in front of the theodolite's telescope. One could then safely view and measure the laser's boresight looking through the theodolite's telescope without great risk to one's eyes. This method for a Class II visible-wavelength laser is not acceptable to even consider attempting for a Class IV laser and is not applicable for an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available. It is a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera are power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. Again, the additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as to visible light. Hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.
Development of a portable multispectral thermal infrared camera
NASA Technical Reports Server (NTRS)
Osterwisch, Frederick G.
1991-01-01
The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument, designated AA465, has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0-13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5-micron-wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display to be used by the operator to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single-exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument was performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The instrument was delivered to NASA in March 1991. The primary commercial and research use for this instrument is as a field geologist's exploration tool. Other applications have been suggested but not investigated in depth; these are measurements for process control in commercial materials processing and quality control functions which require information on surface heterogeneity.
Evolution of the SOFIA tracking control system
NASA Astrophysics Data System (ADS)
Fiebig, Norbert; Jakob, Holger; Pfüller, Enrico; Röser, Hans-Peter; Wiedemann, Manuel; Wolf, Jürgen
2014-07-01
The airborne observatory SOFIA (Stratospheric Observatory for Infrared Astronomy) is undergoing a modernization of its tracking system. This included new, highly sensitive tracking cameras, control computers, filter wheels and other equipment, as well as a major redesign of the control software. The experiences along the migration path from an aged 19-inch VMEbus-based control system to modern industrial PCs, and from the VxWorks real-time operating system to embedded Linux and a state-of-the-art software architecture, are presented. Further, a concept is presented for operating the new camera as a scientific instrument as well, in parallel with tracking.
NASA Astrophysics Data System (ADS)
Lu, Daren; Huo, Juan; Zhang, W.; Liu, J.
A series of satellite sensors operating at visible and infrared wavelengths have been successfully operated on board a number of research satellites, e.g. NOAA/AVHRR and MODIS on board Terra and Aqua, and a number of cloud and aerosol products have been produced and released in recent years. However, validating the quality and accuracy of these products remains a challenge for the atmospheric remote sensing community. In this paper, we suggest a ground-based validation scheme for satellite-derived cloud and aerosol products using combined visible and thermal-infrared all-sky imaging observations together with surface meteorological observations. In the scheme, a visible digital camera with a fish-eye lens continuously monitors the whole sky with a view angle greater than 180 deg. The digital camera system is calibrated both geometrically and radiometrically (broad blue, green, and red bands) so that a retrieval method can be used to detect the spatial distribution of clear and cloudy sky and its temporal variations. A calibrated scanning thermal-infrared thermometer monitors the all-sky brightness temperature distribution. An algorithm is developed to detect clear and cloudy sky as well as cloud base height, using the sky brightness distribution and surface temperature and humidity as input. The composite retrievals of clear- and cloudy-sky distribution can then be used to validate the satellite retrievals through direct simultaneous comparisons and through statistics, respectively. The results presented in this talk include the field observations and comparisons completed in Beijing (40 deg N, 116.5 deg E) in 2003 and 2004. This work is supported by NSFC grant No. 4002700 and MOST grant No. 2001CCA02200
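The abstract does not give the retrieval algorithm in detail; a commonly used approach for calibrated RGB all-sky images (offered here only as an illustrative sketch, not as the authors' method) is to threshold the red-to-blue radiance ratio, since clear sky is strongly blue while clouds are nearly grey. The threshold below is a hypothetical, camera-dependent value.

import numpy as np

def cloud_mask_red_blue(rgb, threshold=0.72):
    """Red/blue-ratio cloud screening for an all-sky camera image.
    rgb: float array (H, W, 3); threshold is a tunable, camera-dependent value."""
    red = rgb[..., 0].astype(float)
    blue = rgb[..., 2].astype(float) + 1e-6      # avoid division by zero
    ratio = red / blue
    return ratio > threshold                     # True = cloudy pixel

# Hypothetical toy scene: clear sky is strongly blue, clouds are near-grey
sky = np.zeros((4, 4, 3))
sky[:2] = [0.2, 0.3, 0.8]    # clear half
sky[2:] = [0.7, 0.7, 0.75]   # cloudy half
mask = cloud_mask_red_blue(sky)
print(f"cloud fraction = {mask.mean():.2f}")     # 0.50 for this toy scene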
Infrared thermography for detection of laminar-turbulent transition in low-speed wind tunnel testing
NASA Astrophysics Data System (ADS)
Joseph, Liselle A.; Borgoltz, Aurelien; Devenport, William
2016-05-01
This work presents the details of a system for experimentally identifying laminar-to-turbulent transition using infrared thermography applied to large, metal models in low-speed wind tunnel tests. Key elements of the transition detection system include infrared cameras with sensitivity in the 7.5- to 14.0-µm spectral range and a thin, insulating coat for the model. The fidelity of the system was validated through experiments on two wind-turbine blade airfoil sections tested at Reynolds numbers between Re = 1.5 × 10⁶ and 3 × 10⁶. Results compare well with measurements from surface pressure distributions and stethoscope observations. However, the infrared-based system provides data over a much broader range of conditions and locations on the model. This paper chronicles the design, implementation and validation of the infrared transition detection system, a subject which has not been widely detailed in the literature to date.
Multiple-frame IR photo-recorder KIT-3M
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, E; Wilkins, P; Nebeker, N
2006-05-15
This paper reports the experimental results of a high-speed multi-frame infrared camera which has been developed in Sarov at VNIIEF. Earlier [1] we discussed the possibility of creating a multi-frame infrared photo-recorder with a framing frequency of about 1 MHz. The basis of the photo-recorder is a semiconductor ionization camera [2, 3], which converts IR radiation in the spectral range of 1-10 micrometers into a visible image. Several sequential thermal images are registered by using the IR converter in conjunction with a multi-frame electron-optical camera. In the present report we discuss the performance characteristics of a prototype commercial 9-frame high-speed IR photo-recorder. The image converter records infrared images of thermal fields corresponding to temperatures ranging from 300 °C to 2000 °C with an exposure time of 1-20 µs at a frame frequency up to 500 kHz. The IR photo-recorder camera is useful for recording the time evolution of thermal fields in fast processes such as gas dynamics, ballistics, pulsed welding, thermal processing, the automotive industry, aircraft construction, and pulsed-power electric experiments, and for the measurement of spatial mode characteristics of IR-laser radiation.
A fuzzy automated object classification by infrared laser camera
NASA Astrophysics Data System (ADS)
Kanazawa, Seigo; Taniguchi, Kazuhiko; Asari, Kazunari; Kuramoto, Kei; Kobashi, Syoji; Hata, Yutaka
2011-06-01
Home security at night is very important, and a system that watches a person's movements is useful for security. This paper describes a system for classifying adults, children and other objects from the distance distribution measured by an infrared laser camera. The camera radiates near-infrared waves, receives the reflected ones, and converts the time of flight into a distance distribution. Our method consists of four steps. First, we perform background subtraction and noise rejection on the distance distribution. Second, we apply fuzzy clustering to the distance distribution to form several clusters. Third, we extract features such as the height, thickness, aspect ratio and area ratio of each cluster. We then construct fuzzy if-then rules from knowledge of adults, children and other objects, with a fuzzy membership function for each feature, so as to classify each cluster. Finally, we assign each cluster to the class with the highest fuzzy degree among adult, child and other object. In our experiment, we set up the camera in a room and tested three cases. The method successfully classified them in real-time processing.
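A minimal sketch of the fuzzy if-then step is given below, using a triangular membership function on a single feature (cluster height); the membership parameters and the example feature value are hypothetical, and the actual system combines several features per cluster.

def tri(x, a, b, c):
    """Triangular fuzzy membership: rises from a to a peak at b, falls to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(height_m):
    """Fuzzy degrees for each class from the height feature alone (hypothetical sets)."""
    degrees = {
        "adult": tri(height_m, 1.4, 1.7, 2.1),
        "child": tri(height_m, 0.7, 1.1, 1.5),
        "other": tri(height_m, 0.0, 0.4, 0.9),
    }
    label = max(degrees, key=degrees.get)   # class with the highest fuzzy degree
    return label, degrees

print(classify(1.65))   # -> ('adult', {...})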
Study of optical techniques for the Ames unitary wind tunnel. Part 5: Infrared imagery
NASA Technical Reports Server (NTRS)
Lee, George
1992-01-01
A survey of infrared thermography for aerodynamics was made. Particular attention was paid to boundary layer transition detection. IR thermography flow visualization of 2-D and 3-D separation was surveyed. Heat transfer measurements and surface temperature measurements were also covered. Comparisons of several commercial IR cameras were made. The use of a recently purchased IR camera in the Ames Unitary Plan Wind Tunnels was studied. Optical access for these facilities and the methods to scan typical models was investigated.
Very low cost real time histogram-based contrast enhancer utilizing fixed-point DSP processing
NASA Astrophysics Data System (ADS)
McCaffrey, Nathaniel J.; Pantuso, Francis P.
1998-03-01
A real-time contrast enhancement system utilizing histogram-based algorithms has been developed to operate on standard composite video signals. This low-cost DSP-based system is designed with fixed-point algorithms and an off-chip look-up table (LUT) to reduce the cost considerably over other contemporary approaches. This paper describes several real-time contrast-enhancing systems developed at the Sarnoff Corporation for high-speed visible and infrared cameras. The fixed-point enhancer was derived from these high-performance cameras. The enhancer digitizes analog video and spatially subsamples the stream to quantify the scene's luminance. Simultaneously, the video is streamed through a LUT that has been programmed with the previous calculation. Reducing division operations by subsampling reduces calculation cycles and also allows the processor to be used with cameras of nominal resolutions. All values are written to the LUT during blanking so no frames are lost. The enhancer measures 13 cm × 6.4 cm × 3.2 cm, operates off 9 VAC and consumes 12 W. This processor is small and inexpensive enough to be mounted with field-deployed security cameras and can be used for surveillance, video forensics and real-time medical imaging.
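The processing chain described above (subsample the video to build a luminance histogram, then remap every pixel through a lookup table) can be sketched as follows; the sketch uses floating-point histogram equalization for clarity, whereas the actual enhancer used fixed-point arithmetic and rewrote the LUT during vertical blanking.

import numpy as np

def build_equalization_lut(frame, subsample=4):
    """Build a 256-entry LUT from a spatially subsampled luminance histogram."""
    samples = frame[::subsample, ::subsample]             # cheap luminance estimate
    hist = np.bincount(samples.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]                                         # normalize to 0..1
    return np.round(cdf * 255).astype(np.uint8)

def enhance(frame, lut):
    """Stream the full-resolution frame through the LUT."""
    return lut[frame]

# Hypothetical low-contrast 8-bit frame
frame = np.random.randint(90, 140, size=(240, 320), dtype=np.uint8)
lut = build_equalization_lut(frame)
out = enhance(frame, lut)
print(frame.min(), frame.max(), "->", out.min(), out.max())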
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and are limited by low throughput, where only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information of spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
NASA Astrophysics Data System (ADS)
Harrild, M.; Webley, P. W.; Dehn, J.
2015-12-01
The ability to detect and monitor precursory events, thermal signatures, and ongoing volcanic activity in near-realtime is an invaluable tool. Volcanic hazards often range from low-level lava effusion to large explosive eruptions, easily capable of ejecting ash to aircraft cruise altitudes. Using ground-based remote sensing to detect and monitor this activity is essential, but the required equipment is often expensive and difficult to maintain, which increases the risk to public safety and the likelihood of financial impact. Our investigation explores the use of 'off the shelf' cameras, ranging from computer webcams to low-light security cameras, to monitor volcanic incandescent activity in near-realtime. These cameras are ideal as they operate in the visible and near-infrared (NIR) portions of the electromagnetic spectrum, are relatively cheap to purchase, consume little power, are easily replaced, and can provide telemetered, near-realtime data. We focus on the early detection of volcanic activity, using automated scripts that capture streaming online webcam imagery and evaluate each image according to pixel brightness, in order to automatically detect and identify increases in potentially hazardous activity. The cameras used here range in price from $0 to $1,000 and the scripts are written in Python, an open-source programming language, to reduce the overall cost to potential users and increase the accessibility of these tools, particularly in developing nations. In addition, by performing laboratory tests to determine the spectral response of these cameras, a direct comparison of collocated low-light and thermal infrared cameras has allowed approximate eruption temperatures to be correlated to pixel brightness. Data collected from several volcanoes: (1) Stromboli, Italy; (2) Shiveluch, Russia; (3) Fuego, Guatemala; and (4) Popocatépetl, México, along with campaign data from Stromboli (June 2013) and laboratory tests, are presented here.
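The detection scripts mentioned above are written in Python and flag frames whose brightest pixels rise above a background level; the following is a minimal sketch of that idea, with a hypothetical webcam URL and threshold rather than the project's actual configuration.

import urllib.request
import numpy as np
from PIL import Image
from io import BytesIO

WEBCAM_URL = "http://example.org/volcano/latest.jpg"   # hypothetical camera endpoint
BRIGHTNESS_THRESHOLD = 40.0                            # counts above the quiet background

def fetch_frame(url):
    """Download the latest webcam frame and return it as a grayscale float array."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return np.asarray(Image.open(BytesIO(resp.read())).convert("L"), dtype=float)

def check_activity(frame, background):
    """Flag possible incandescence when the brightest pixels exceed the background frame."""
    hot = np.percentile(frame, 99.5) - np.percentile(background, 99.5)
    return hot > BRIGHTNESS_THRESHOLD

# Usage (network calls left commented out):
# background = fetch_frame(WEBCAM_URL)   # e.g. a frame known to show no activity
# frame = fetch_frame(WEBCAM_URL)
# if check_activity(frame, background):
#     print("possible incandescent activity detected")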
Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing
2015-01-01
This paper describes an airborne high-resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composition method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible-band images and near-infrared-band images in scenes lacking manmade objects, we present an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high-quality multispectral images and that the band-to-band alignment error of the composed multispectral images is less than 2.5 pixels. PMID:26205264
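The registration step named above (SIFT matching followed by a RANSAC-estimated homography) can be sketched with OpenCV as follows; the band-specific strategy for visible-to-near-infrared pairs lacking manmade objects is not reproduced here, and the file names are hypothetical.

import cv2
import numpy as np

def register_band(moving, reference):
    """Warp one band onto another using SIFT matches and a RANSAC homography."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(moving, None)
    kp2, des2 = sift.detectAndCompute(reference, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe ratio test

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    h, w = reference.shape[:2]
    return cv2.warpPerspective(moving, H, (w, h))

# Usage (hypothetical file names for one near-infrared and one visible band):
# nir = cv2.imread("band_nir.tif", cv2.IMREAD_GRAYSCALE)
# red = cv2.imread("band_red.tif", cv2.IMREAD_GRAYSCALE)
# aligned_nir = register_band(nir, red)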
VizieR Online Data Catalog: Antennae galaxies (NGC 4038/4039) revisited (Whitmore+, 2010)
NASA Astrophysics Data System (ADS)
Whitmore, B. C.; Chandar, R.; Schweizer, F.; Rothberg, B.; Leitherer, C.; Rieke, M.; Rieke, G.; Blair, W. P.; Mengel, S.; Alonso-Herrero, A.
2012-06-01
Observations of the main bodies of NGC 4038/39 were made with the Hubble Space Telescope (HST), using the ACS, as part of Program GO-10188. Multi-band photometry was obtained in the following optical broadband filters: F435W (~B), F550M (~V), and F814W (~I). Archival F336W photometry of the Antennae (Program GO-5962) was used to supplement our optical ACS/WFC observations. Infrared observations were made using the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) camera on HST as part of Program GO-10188. Observations were made using the NIC2 camera with the F160W, F187N, and F237M filters, and the NIC3 camera with the F110W, F160W, F164W, F187N, and F222M filters. (10 data files).
Characterization and optimization for detector systems of IGRINS
NASA Astrophysics Data System (ADS)
Jeong, Ueejeong; Chun, Moo-Young; Oh, Jae Sok; Park, Chan; Yuk, In-Soo; Oh, Heeyoung; Kim, Kang-Min; Ko, Kyeong Yeon; Pavel, Michael D.; Yu, Young Sam; Jaffe, Daniel T.
2014-07-01
IGRINS (Immersion GRating INfrared Spectrometer) is a high resolution wide-band infrared spectrograph developed by the Korea Astronomy and Space Science Institute (KASI) and the University of Texas at Austin (UT). This spectrograph has H-band and K-band science cameras and a slit viewing camera, all three of which use Teledyne's λc~2.5 μm 2k×2k HgCdTe HAWAII-2RG CMOS detectors. The two spectrograph cameras employ science grade detectors, while the slit viewing camera includes an engineering grade detector. Teledyne's cryogenic SIDECAR ASIC boards and JADE2 USB interface cards were installed to control those detectors. We performed experiments to characterize and optimize the detector systems in the IGRINS cryostat. We present measurements and optimization of noise, dark current, and reference-level stability obtained under dark conditions. We also discuss well depth, linearity and conversion gain measurements obtained using an external light source.
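Conversion gain is commonly estimated from the mean-variance (photon transfer) relation of flat-field pairs; the sketch below illustrates that standard approach with synthetic frames and is not necessarily the procedure used for the IGRINS detectors.

import numpy as np

def conversion_gain(flat_a, flat_b, bias_a, bias_b):
    """Photon-transfer estimate of gain [e-/ADU] from a pair of flats and a pair of biases.
    Differencing two frames of each type cancels fixed-pattern structure."""
    signal = 0.5 * ((flat_a - bias_a).mean() + (flat_b - bias_b).mean())
    var_flat = np.var(flat_a.astype(float) - flat_b.astype(float)) / 2.0
    var_bias = np.var(bias_a.astype(float) - bias_b.astype(float)) / 2.0
    shot_var = var_flat - var_bias
    return signal / shot_var               # gain in electrons per ADU

# Hypothetical frames standing in for flat and bias exposures
rng = np.random.default_rng(0)
true_gain, level, shape = 2.0, 10000.0, (512, 512)   # e-/ADU, mean signal [e-], frame size
flats = [rng.poisson(level, shape) / true_gain + rng.normal(0.0, 5.0, shape) for _ in range(2)]
biases = [rng.normal(0.0, 5.0, shape) for _ in range(2)]
print(f"estimated gain ~ {conversion_gain(flats[0], flats[1], biases[0], biases[1]):.2f} e-/ADU")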
2001-11-29
KENNEDY SPACE CENTER, Fla. -- Fully unwrapped, the Advanced Camera for Surveys, which is suspended by an overhead crane, is checked over by workers. Part of the payload on the Hubble Space Telescope Servicing Mission, STS-109, the ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. Tasks for the mission include replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
Barabino, G; Klein, J P; Porcheron, J; Grichine, A; Coll, J-L; Cottier, M
2016-12-01
This study assesses the value of using Intraoperative Near Infrared Fluorescence Imaging and Indocyanine green to detect colorectal carcinomatosis during oncological surgery. In colorectal carcinomatosis cancer, two of the most important prognostic factors are completeness of staging and completeness of cytoreductive surgery. Presently, intraoperative assessment of tumoral margins relies on palpation and visual inspection. The recent introduction of Near Infrared fluorescence image guidance provides new opportunities for surgical roles, particularly in cancer surgery. The study was a non-randomized, monocentric, pilot "ex vivo" blinded clinical trial validated by the ethical committee of University Hospital of Saint Etienne. Ten patients with colorectal carcinomatosis cancer scheduled for cytoreductive surgery were included. Patients received 0.25 mg/kg of Indocyanine green intravenously 24 h before surgery. A Near Infrared camera was used to detect "ex-vivo" fluorescent lesions. There was no surgical mortality. Each analysis was done blindly. In a total of 88 lesions analyzed, 58 were classified by a pathologist as cancerous and 30 as non-cancerous. Among the 58 cancerous lesions, 42 were correctly classified by the Intraoperative Near-Infrared camera (sensitivity of 72.4%). Among the 30 non-cancerous lesions, 18 were correctly classified by the Intraoperative Near-Infrared camera (specificity of 60.0%). Near Infrared fluorescence imaging is a promising technique for intraoperative tumor identification. It could help the surgeon to determine resection margins and reduce the risk of locoregional recurrence. Copyright © 2016 Elsevier Ltd, BASO ~ the Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
NASA Astrophysics Data System (ADS)
Olafsen, L. J.; Olafsen, J. S.; Eaves, I. K.
2018-06-01
We report on an experimental investigation of the time-dependent spatial intensity distribution of near-infrared idler pulses from an optical parametric oscillator measured using an infrared (IR) camera, in contrast to beam profiles obtained using traditional knife-edge techniques. Comparisons show the information gained by utilizing the thermal camera provides more detail than the spatially- or time-averaged measurements from a knife-edge profile. Synchronization, averaging, and thresholding techniques are applied to enhance the images acquired. The additional information obtained can improve the process by which semiconductor devices and other IR lasers are characterized for their beam quality and output response and thereby result in IR devices with higher performance.
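One standard way to reduce a camera-recorded intensity distribution to beam-quality numbers is the second-moment (D4σ) width of ISO 11146; the sketch below applies it to a synthetic Gaussian spot and is offered only as an illustration, not as the processing used in this work.

import numpy as np

def d4sigma_widths(img):
    """Second-moment (D4sigma) beam widths, in pixels, of a background-subtracted image."""
    img = img - np.median(img)                 # crude background removal
    img = np.clip(img, 0, None)
    y, x = np.indices(img.shape)
    total = img.sum()
    cx, cy = (x * img).sum() / total, (y * img).sum() / total
    var_x = ((x - cx) ** 2 * img).sum() / total
    var_y = ((y - cy) ** 2 * img).sum() / total
    return 4 * np.sqrt(var_x), 4 * np.sqrt(var_y)

# Synthetic Gaussian spot with 1/e^2 radius w = 20 px (D4sigma should be ~2w = 40 px)
y, x = np.indices((256, 256))
spot = np.exp(-2 * ((x - 128) ** 2 + (y - 128) ** 2) / 20.0 ** 2)
print(d4sigma_widths(spot))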
A portable W-band radar system for enhancement of infrared vision in fire fighting operations
NASA Astrophysics Data System (ADS)
Klenner, Mathias; Zech, Christian; Hülsmann, Axel; Kühn, Jutta; Schlechtweg, Michael; Hahmann, Konstantin; Kleiner, Bernhard; Ulrich, Michael; Ambacher, Oliver
2016-10-01
In this paper, we present a millimeter wave radar system which will enhance the performance of infrared cameras used for fire-fighting applications. The radar module is compact and lightweight such that the system can be combined with inertial sensors and integrated in a hand-held infrared camera. This allows for precise distance measurements in harsh environmental conditions, such as tunnel or industrial fires, where optical sensors are unreliable or fail. We discuss the design of the RF front-end, the antenna and a quasi-optical lens for beam shaping as well as signal processing and demonstrate the performance of the system by in situ measurements in a smoke filled environment.
High speed Infrared imaging method for observation of the fast varying temperature phenomena
NASA Astrophysics Data System (ADS)
Moghadam, Reza; Alavi, Kambiz; Yuan, Baohong
With recent improvements in high-end commercial R&D camera technologies, many challenges in high-speed IR imaging have been overcome. The core benefits of this technology are the ability to capture fast-varying phenomena without image blur, to acquire enough data to properly characterize dynamic energy, and to increase the dynamic range without compromising the number of frames per second. This study presents a noninvasive method for determining the intensity field of a high-intensity focused ultrasound (HIFU) beam using infrared imaging. A high-speed infrared camera was placed above the tissue-mimicking material heated by the HIFU, with no other sensors present in the HIFU axial beam. A MATLAB simulation code was used to obtain a finite-element solution of the pressure-wave propagation and heat equations within the phantom, from which the temperature rise in the phantom was computed. Three different power levels of the HIFU transducer were tested, and the predicted temperature increases were within about 25% of the IR measurements. The fundamental theory and methods developed in this research can be used to detect fast-varying temperature phenomena in combination with infrared filters.
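The heating step of the simulation described above can be illustrated, in a much reduced form, by an explicit finite-difference solution of the heat equation with a focal volumetric source; the Python sketch below uses hypothetical phantom properties and is not the MATLAB finite-element code referred to in the abstract.

import numpy as np

# Hypothetical tissue-phantom properties
k, rho, cp = 0.6, 1000.0, 4000.0       # W/m/K, kg/m^3, J/kg/K
alpha = k / (rho * cp)                 # thermal diffusivity [m^2/s]

n, dx = 101, 0.5e-3                    # 1D grid across the focal region
dt = 0.2 * dx ** 2 / alpha             # stable explicit time step
T = np.zeros(n)                        # temperature rise above ambient [K]

q = np.zeros(n)                        # volumetric heat source from absorbed ultrasound
q[45:56] = 5e6                         # W/m^3, hypothetical focal deposition

for _ in range(int(2.0 / dt)):         # simulate 2 s of sonication
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx ** 2
    T += dt * (alpha * lap + q / (rho * cp))
    T[0] = T[-1] = 0.0                 # fixed-temperature boundaries

print(f"peak temperature rise ~ {T.max():.1f} K after 2 s")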
Research on Geometric Calibration of Spaceborne Linear Array Whiskbroom Camera
Sheng, Qinghong; Wang, Qi; Xiao, Hui; Wang, Qing
2018-01-01
The geometric calibration of a spaceborne thermal-infrared camera with high spatial resolution and wide coverage sets the benchmark for providing accurate geographical coordinates for the retrieval of land surface temperature. Using linear-array whiskbroom Charge-Coupled Device (CCD) arrays to image the Earth makes it possible to acquire wide-swath thermal-infrared images with high spatial resolution. Focusing on the whiskbroom characteristics of equal time intervals and unequal angles, the present study proposes a spaceborne linear-array-scanning imaging geometric model and calibrates the temporal system parameters and the whiskbroom angle parameters. Using YG-14 (China's first satellite equipped with high-spatial-resolution thermal-infrared cameras), imagery of Anyang and Taiyuan is used to conduct a geometric calibration experiment and a verification test, respectively. Results show that the plane positioning accuracy without ground control points (GCPs) is better than 30 pixels and the plane positioning accuracy with GCPs is better than 1 pixel. PMID:29337885
Infrared needle mapping to assist biopsy procedures and training.
Shar, Bruce; Leis, John; Coucher, John
2018-04-01
A computed tomography (CT) biopsy is a radiological procedure which involves using a needle to withdraw tissue or a fluid specimen from a lesion of interest inside a patient's body. The needle is progressively advanced into the patient's body, guided by the most recent CT scan. CT guided biopsies invariably expose patients to high dosages of radiation, due to the number of scans required whilst the needle is advanced. This study details the design of a novel method to aid biopsy procedures using infrared cameras. Two cameras are used to image the biopsy needle area, from which the proposed algorithm computes an estimate of the needle endpoint, which is projected onto the CT image space. This estimated position may be used to guide the needle between scans, and results in a reduction in the number of CT scans that need to be performed during the biopsy procedure. The authors formulate a 2D augmentation system which compensates for camera pose, and show that multiple low-cost infrared imaging devices provide a promising approach.
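The needle-endpoint estimate from two cameras described above amounts to stereo triangulation; the OpenCV sketch below shows linear triangulation with hypothetical projection matrices and normalized image points, and does not reproduce the authors' pose compensation or the projection into CT image space.

import numpy as np
import cv2

# Hypothetical 3x4 projection matrices for the two infrared cameras
# (obtained in practice from a prior stereo calibration).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                      # camera 1 at the origin
R = cv2.Rodrigues(np.array([0.0, np.deg2rad(15.0), 0.0]))[0]       # camera 2 rotated by 15 deg
t = np.array([[-0.2], [0.0], [0.0]])                               # 20 cm baseline
P2 = np.hstack([R, t])

# Hypothetical normalized image coordinates of the needle-tip marker in each view
pt1 = np.array([[0.012], [0.034]])
pt2 = np.array([[0.118], [0.031]])

X = cv2.triangulatePoints(P1, P2, pt1, pt2)     # homogeneous 4-vector
needle_tip = (X[:3] / X[3]).ravel()             # 3D estimate in the camera-1 frame
print(needle_tip)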
Software defined multi-spectral imaging for Arctic sensor networks
NASA Astrophysics Data System (ADS)
Siewert, Sam; Angoth, Vivek; Krishnamurthy, Ramnarayan; Mani, Karthikeyan; Mock, Kenrick; Singh, Surjith B.; Srivistava, Saurav; Wagner, Chris; Claus, Ryan; Vis, Matthew Demi
2016-05-01
Availability of off-the-shelf infrared sensors combined with high definition visible cameras has made possible the construction of a Software Defined Multi-Spectral Imager (SDMSI) combining long-wave, near-infrared and visible imaging. The SDMSI requires a real-time embedded processor to fuse images and to create real-time depth maps for opportunistic uplink in sensor networks. Researchers at Embry Riddle Aeronautical University working with University of Alaska Anchorage at the Arctic Domain Awareness Center and the University of Colorado Boulder have built several versions of a low-cost drop-in-place SDMSI to test alternatives for power efficient image fusion. The SDMSI is intended for use in field applications including marine security, search and rescue operations and environmental surveys in the Arctic region. Based on Arctic marine sensor network mission goals, the team has designed the SDMSI to include features to rank images based on saliency and to provide on camera fusion and depth mapping. A major challenge has been the design of the camera computing system to operate within a 10 to 20 Watt power budget. This paper presents a power analysis of three options: 1) multi-core, 2) field programmable gate array with multi-core, and 3) graphics processing units with multi-core. For each test, power consumed for common fusion workloads has been measured at a range of frame rates and resolutions. Detailed analyses from our power efficiency comparison for workloads specific to stereo depth mapping and sensor fusion are summarized. Preliminary mission feasibility results from testing with off-the-shelf long-wave infrared and visible cameras in Alaska and Arizona are also summarized to demonstrate the value of the SDMSI for applications such as ice tracking, ocean color, soil moisture, animal and marine vessel detection and tracking. The goal is to select the most power efficient solution for the SDMSI for use on UAVs (Unoccupied Aerial Vehicles) and other drop-in-place installations in the Arctic. The prototype selected will be field tested in Alaska in the summer of 2016.
Euro Banknote Recognition System for Blind People.
Dunai Dunai, Larisa; Chillarón Pérez, Mónica; Peris-Fajarnés, Guillermo; Lengua Lengua, Ismael
2017-01-20
This paper presents the development of a portable system that allows blind people to detect and recognize Euro banknotes. The device is based on a Raspberry Pi and a Raspberry Pi NoIR camera (no infrared filter) supplemented with additional infrared illumination, embedded into a pair of sunglasses that permit blind and visually impaired people to handle Euro banknotes independently, especially when receiving cash back while shopping. Banknote detection is based on a modified Viola-Jones algorithm, while banknote value recognition relies on the Speeded-Up Robust Features (SURF) technique. The accuracies of banknote detection and banknote value recognition are 84% and 97.5%, respectively.
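A hedged sketch of this two-stage pipeline in OpenCV (the trained cascade file and banknote template images are hypothetical placeholders; SURF requires opencv-contrib and could be swapped for a free detector such as ORB):

```python
import cv2

# Stage 1: Viola-Jones cascade finds banknote candidates; stage 2: SURF matching picks the value.
cascade = cv2.CascadeClassifier("banknote_cascade.xml")            # hypothetical trained cascade
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)           # opencv-contrib-python
matcher = cv2.BFMatcher(cv2.NORM_L2)

frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)       # hypothetical input frame
templates = {v: cv2.imread(f"euro_{v}.png", cv2.IMREAD_GRAYSCALE) for v in (5, 10, 20, 50)}
tmpl_feats = {v: surf.detectAndCompute(img, None) for v, img in templates.items()}

for (x, y, w, h) in cascade.detectMultiScale(frame, scaleFactor=1.1, minNeighbors=5):
    roi = frame[y:y + h, x:x + w]
    _, desc = surf.detectAndCompute(roi, None)
    if desc is None:
        continue

    def good_matches(desc_t):
        # Lowe ratio test on k-nearest-neighbour matches
        pairs = [p for p in matcher.knnMatch(desc, desc_t, k=2) if len(p) == 2]
        return sum(m.distance < 0.7 * n.distance for m, n in pairs)

    value = max(tmpl_feats, key=lambda v: good_matches(tmpl_feats[v][1]))
    print(f"banknote detected at ({x},{y}): {value} EUR")
```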
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandya, Shwetang N., E-mail: pandya.shwetang@LHD.nifs.ac.jp; Sano, Ryuichi; Peterson, Byron J.
An Infrared imaging Video Bolometer (IRVB) diagnostic is currently being used in the Large Helical Device (LHD) for studying the localization of radiation structures near the magnetic island and helical divertor X-points during plasma detachment and for 3D tomography. This research demands a high signal-to-noise ratio (SNR) and sensitivity to improve the temporal resolution for studying the evolution of radiation structures during plasma detachment, and a wide IRVB field of view (FoV) for tomography. Introduction of an infrared periscope allows a higher SNR and higher sensitivity to be achieved, which in turn permits a twofold improvement in the temporal resolution of the diagnostic. Higher SNR along with a wide FoV is achieved simultaneously by reducing the separation of the IRVB detector (metal foil) from the bolometer's aperture and the LHD plasma. Altering the distances to meet the aforesaid requirements results in an increased separation between the foil and the IR camera. This leads to a degradation of the diagnostic performance in terms of its sensitivity by 1.5-fold. Using an infrared periscope to image the IRVB foil results in a 7.5-fold increase in the number of IR camera pixels imaging the foil. This improves the IRVB sensitivity, which depends on the square root of the number of IR camera pixels being averaged per bolometer channel. Despite the slower f-number (f/# = 1.35) and reduced transmission (τ0 = 89%, due to an increased number of lens elements) for the periscope, the diagnostic with an infrared periscope operational on LHD has improved in terms of sensitivity and SNR by a factor of 1.4 and 4.5, respectively, as compared to the original diagnostic without a periscope (i.e., the IRVB foil being directly imaged by the IR camera through conventional optics). The bolometer's field of view has also increased by two times. The paper discusses these improvements in detail.
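A rough consistency check of the numbers above, assuming sensitivity scales with the square root of the number of camera pixels averaged per bolometer channel, degraded by the 1.5-fold geometry penalty and the periscope transmission (the slower f-number is left out, which is why the estimate comes out slightly above the reported value):

```python
import math

pixel_gain = 7.5          # 7.5x more camera pixels image the foil
geometry_loss = 1.5       # 1.5-fold sensitivity degradation from the new layout
transmission = 0.89       # periscope optics transmission

sensitivity_gain = math.sqrt(pixel_gain) / geometry_loss * transmission
print("estimated net sensitivity gain: %.2f" % sensitivity_gain)
# ~1.6x before accounting for the slower f-number, consistent with the reported 1.4x.
```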
The Stratospheric Observatory for Infrared Astronomy - A New Tool for Planetary Science
NASA Astrophysics Data System (ADS)
Ruzek, M. J.; Becklin, E.; Burgdorf, M. J.; Reach, W.
2010-12-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a joint US/German effort to fly a 2.5 meter telescope on a modified Boeing 747SP aircraft at stratospheric altitudes where the atmosphere is largely transparent at infrared wavelengths. Key goals of the observatory include understanding the formation of stars and planets; the origin and evolution of the interstellar medium; the star formation history of galaxies; and planetary science. SOFIA offers the convenient accessibility of a ground-based observatory coupled with performance advantages of a space-based telescope. SOFIA’s scientific instruments can be exchanged regularly for repairs, to accommodate changing scientific requirements, and to incorporate new technologies. SOFIA’s portability will enable specialized observations of transient and location-specific events such as stellar occultations of Trans-Neptunian Objects. Unlike many spaceborne observatories, SOFIA can observe bright planets and moons directly, and can observe objects closer to the sun than Earth, e.g. comets in their most active phase, and the planet Venus. SOFIA’s first generation instruments cover the spectral range of .3 to 240 microns and have been designed with planetary science in mind. The High-speed Imaging Photometer for Occultations (HIPO) is designed to measure occultations of stars by Kuiper Belt Objects, with SOFIA flying into the predicted shadows and timing the occultation ingress and egress to determine the size of the occulting body. HIPO will also enable transit observations of extrasolar planets. The Faint Object Infrared Camera for the SOFIA Telescope (FORCAST) and the High-resolution Airborne Wideband Camera (HAWC) will enable mid-infrared and far-infrared (respectively) imaging with a wide range of filters for comets and giant planets, and colorimetric observations of small, unresolved bodies to measure the spectral energy distribution of their thermal emission. The German Receiver for Astronomy at Terahertz Frequencies (GREAT) will measure far-infrared and microwave spectral lines at km/s resolution to search for molecular species and achieve a significant improvement over current knowledge of abundance and distribution of water in planetary bodies. The Echelon Cross Echelle Spectrograph (EXES) and the Field Imaging Far Infrared Line Spectrometer (FIFI LS) will provide high-resolution spectral data between 5 and 210 microns to support mineralogical analysis of solar system and extrasolar debris disk dust and observe spectral features in planetary atmospheres. The First Light Infrared Test Experiment Camera (FLITECAM) will offer imaging and moderate resolution spectroscopy at wavelengths between 1 and 5 microns for observations of comets and asteroids, and can be used simultaneously with HIPO to characterize the atmosphere of transiting exoplanets. SOFIA’s first light flight occurred in May, 2010 and the first short science observing program is scheduled to begin in November, 2010. The Program will issue a call for new instrumentation proposals in the summer of 2011, as well as regular calls for observing proposals beginning in late summer 2011. SOFIA is expected to make ~120 science mission flights each year when fully operational in 2014.
Acquisition of the spatial temperature distribution of rock faces by using infrared thermography
NASA Astrophysics Data System (ADS)
Beham, Michael; Rode, Matthias; Schnepfleitner, Harald; Sass, Oliver
2013-04-01
Rock temperature plays a central role in weathering and therefore influences the risk potential originating from rockfall processes. So far, temperature has mainly been acquired with point-based measuring methods, and accordingly two-dimensional temperature data are rare. To overcome this limitation, an infrared camera was used to collect and analyse data on the spatial temperature distribution on 10 x 10 m sections of rock faces in the Gesäuse (900 m a.s.l.) and in the Dachsteingebirge (2700 m a.s.l.) within the framework of the research project ROCKING ALPS (FWF-P24244). The ability of infrared thermography to capture area-wide temperatures has hardly been exploited in this context. In order to investigate the differences between north-facing and south-facing rock faces at about the same time, it was necessary to move the camera between the sites. The resulting offset of the time-lapse infrared images made it necessary to develop a sophisticated methodology to rectify the captured images in order to create matching datasets for further analysis. With the relatively simple camera used, one of the main challenges was to find a way to convert the colour-scale or grey-scale values of the rectified image back to temperature values after the rectification process. The processing steps were mainly carried out with MATLAB. South-facing rock faces generally experienced higher temperatures and amplitudes compared to the north-facing ones. In view of the spatial temperature distribution, the temperatures of shady areas were clearly below those of sunny ones, with the latter also showing the highest amplitudes. Joints and sun-shaded areas were characterised by attenuated diurnal temperature fluctuations closely paralleling the air temperature. The temperature of protruding rock parts and of loose debris responded very quickly to changes in radiation and air temperatures, while massive rock reacted more slowly. The potential effects of temperature on weathering could so far only be assessed qualitatively. However, the variability of temperatures and amplitudes on a rather small and homogeneous section of a rockwall is surprisingly high, which challenges any statements on weathering effectiveness based on point measurements. In simple terms, the use of infrared thermography has proven its value in the presented pilot study and promises to be a valuable tool for research into rock weathering.
Infrared-enhanced TV for fire detection
NASA Technical Reports Server (NTRS)
Hall, J. R.
1978-01-01
Closed-circuit television is superior to conventional smoke or heat sensors for detecting fires in large open spaces. Single TV camera scans entire area, whereas many conventional sensors and maze of interconnecting wiring might be required to get same coverage. Camera is monitored by person who would trip alarm if fire were detected, or electronic circuitry could process camera signal for fully-automatic alarm system.
Design and Calibration of a Dispersive Imaging Spectrometer Adaptor for a Fast IR Camera on NSTX-U
NASA Astrophysics Data System (ADS)
Reksoatmodjo, Richard; Gray, Travis; Princeton Plasma Physics Laboratory Team
2017-10-01
A dispersive spectrometer adaptor was designed, constructed and calibrated for use on a fast infrared camera employed to measure temperatures on the lower divertor tiles of the NSTX-U tokamak. This adaptor efficiently and evenly filters and distributes long-wavelength infrared photons between 8.0 and 12.0 microns across the 128x128 pixel detector of the fast IR camera. By determining the width of these separated wavelength bands across the camera detector, and then determining the corresponding average photon count for each photon wavelength, the temperature, and thus the heat flux, of the divertor tiles can be calculated very accurately using Planck's law. This approach of designing an exterior dispersive adaptor for the fast IR camera allows accurate temperature measurements to be made of materials with unknown emissivity. Further, the relative simplicity and affordability of this adaptor design provide an attractive option over more expensive, slower, dispersive IR camera systems. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No. DE-AC02-09CH11466.
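An illustrative band-radiance inversion of the kind such a measurement relies on (generic Planck-law arithmetic, not the NSTX-U calibration; the emissivity, band edges and "measured" value below are assumed):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

h, c, kB = 6.626e-34, 2.998e8, 1.381e-23   # Planck constant, speed of light, Boltzmann constant

def planck(lam, T):
    """Spectral radiance B_lambda in W m^-2 sr^-1 m^-1."""
    return 2 * h * c**2 / lam**5 / np.expm1(h * c / (lam * kB * T))

def band_radiance(T, lam1=9.0e-6, lam2=10.0e-6, emissivity=0.9):
    # Radiance integrated over one wavelength slice of the 8-12 um output
    return emissivity * quad(lambda lam: planck(lam, T), lam1, lam2)[0]

measured = band_radiance(620.0)            # pretend measurement of a 620 K tile
T_est = brentq(lambda T: band_radiance(T) - measured, 300.0, 2000.0)
print("retrieved tile temperature: %.1f K" % T_est)
```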
A Near-Infrared Spectrometer to Measure Zodiacal Light Absorption Spectrum
NASA Technical Reports Server (NTRS)
Kutyrev, A. S.; Arendt, R.; Dwek, E.; Kimble, R.; Moseley, S. H.; Rapchun, D.; Silverberg, R. F.
2010-01-01
We have developed a high-throughput infrared spectrometer for zodiacal light Fraunhofer line measurements. The instrument is based on a cryogenic dual silicon Fabry-Perot etalon designed to achieve high signal-to-noise Fraunhofer line profile measurements. Very large aperture silicon Fabry-Perot etalons and fast camera optics make these measurements possible. The absorption line profile measurements will provide a model-free measure of the zodiacal light intensity in the near infrared. Knowledge of the zodiacal light brightness is crucial for accurately subtracting the zodiacal foreground when measuring the extragalactic background light. We present the final design of the instrument and the first results of its performance.
Combined hostile fire and optics detection
NASA Astrophysics Data System (ADS)
Brännlund, Carl; Tidström, Jonas; Henriksson, Markus; Sjöqvist, Lars
2013-10-01
Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (Short Wave Infrared) camera-based system with capability to detect and locate snipers both before and after shot over a large field-of-view. The high frame rate SWIR-camera allows resolution of the temporal profile of muzzle flashes which is the infrared signature associated with the ejection of the bullet from the rifle. The capability to detect and discriminate sniper muzzle flashes with this system has been verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with slit-shaped beam profile is scanned over the camera field-of-view to detect retro reflection from optical sights. The optics detection system has been tested at various distances up to 1.15 km showing the feasibility to detect rifle scopes in full daylight. The high speed camera gives the possibility to discriminate false alarms by analyzing the temporal data. The intensity variation, caused by atmospheric turbulence, enables discrimination of small sights from larger reflectors due to aperture averaging, although the targets only cover a single pixel. It is shown that optics detection can be integrated in combination with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability by continuous surveillance of a relatively large field-of-view looks promising. This type of multifunctional system may become an important tool to detect snipers before and after shot.
SOFIA tracking image simulation
NASA Astrophysics Data System (ADS)
Taylor, Charles R.; Gross, Michael A. K.
2016-09-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) tracking camera simulator is a component of the Telescope Assembly Simulator (TASim). TASim is a software simulation of the telescope optics, mounting, and control software. Currently in its fifth major version, TASim is relied upon for telescope operator training, mission planning and rehearsal, and mission control and science instrument software development and testing. TASim has recently been extended for hardware-in-the-loop operation in support of telescope and camera hardware development and control and tracking software improvements. All three SOFIA optical tracking cameras are simulated, including the Focal Plane Imager (FPI), which has recently been upgraded to the status of a science instrument that can be used on its own or in parallel with one of the seven infrared science instruments. The simulation includes tracking camera image simulation of starfields based on the UCAC4 catalog at real-time rates of 4-20 frames per second. For its role in training and planning, it is important for the tracker image simulation to provide images with a realistic appearance and response to changes in operating parameters. For its role in tracker software improvements, it is vital to have realistic signal and noise levels and precise star positions. The design of the software simulation for precise subpixel starfield rendering (including radial distortion), realistic point-spread function as a function of focus, tilt, and collimation, and streaking due to telescope motion will be described. The calibration of the simulation for light sensitivity, dark and bias signal, and noise will also be presented.
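A minimal sketch of subpixel star rendering of the sort described above (assumed pixel-integrated Gaussian PSF plus bias, dark, shot and read noise; all parameter values are illustrative, not TASim's):

```python
import numpy as np
from scipy.special import erf

def render_star(shape, x0, y0, flux, sigma):
    """Pixel-integrated Gaussian PSF centred at subpixel position (x0, y0)."""
    ys, xs = np.arange(shape[0]), np.arange(shape[1])

    def pix_int(centers, c0):
        a = (centers - c0 - 0.5) / (np.sqrt(2) * sigma)
        b = (centers - c0 + 0.5) / (np.sqrt(2) * sigma)
        return 0.5 * (erf(b) - erf(a))

    return flux * np.outer(pix_int(ys, y0), pix_int(xs, x0))

rng = np.random.default_rng(0)
image = render_star((64, 64), x0=31.37, y0=28.82, flux=5e4, sigma=1.6)
image += 200.0 + 0.5                        # bias + dark (counts), assumed values
image = rng.poisson(image).astype(float)    # photon (shot) noise
image += rng.normal(0.0, 5.0, image.shape)  # read noise, 5 counts rms
print("brightest pixel:", image.max())
```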
Near-surface Thermal Infrared Imaging of a Mixed Forest
NASA Astrophysics Data System (ADS)
Aubrecht, D. M.; Helliker, B. R.; Richardson, A. D.
2014-12-01
Measurement of an organism's temperature is of basic physiological importance and therefore necessary for ecosystem modeling, yet most models derive leaf temperature from energy balance arguments or assume it is equal to air temperature. This is because continuous, direct measurement of leaf temperature outside of a controlled environment is difficult and rarely done. An even greater challenge is measuring leaf temperature with the resolution required to understand the underlying energy balance and regulation of plant processes. To measure leaf temperature through the year, we have mounted a high-resolution, thermal infrared camera overlooking the canopy of a temperate deciduous forest. The camera is co-located with an eddy covariance system and a suite of radiometric sensors. Our camera measures longwave thermal infrared (λ = 7.5-14 microns) using a microbolometer array. Suspended in the canopy within the camera FOV is a matte black copper plate instrumented with fine wire thermocouples that acts as a thermal reference for each image. In this presentation, I will discuss the challenges of continuous, long-term field operation of the camera, as well as measurement sensitivity to physical and environmental parameters. Based on this analysis, I will show that the uncertainties in converting radiometric signal to leaf temperature are well constrained. The key parameter for minimizing uncertainty is the emissivity of the objects being imaged: measuring the emissivity to within 0.01 enables leaf temperature to be calculated to within 0.5°C. Finally, I will present differences in leaf temperature observed amongst species. From our two-year record, we characterize high frequency, daily, and seasonal thermal signatures of leaves and crowns in relation to environmental conditions. Our images are taken with sufficient spatial and temporal resolution to quantify the preferential heating of sunlit portions of the canopy and the cooling effect of wind gusts. Future work will be focused on correlations between hyperspectral vegetation indices, fluxes, and thermal signatures to characterize vegetation stress. As water stress increases, causing photosynthesis and transpiration to shut down, heat fluxes, leaf temperature, and narrow band vegetation indices should show signatures of the affected processes.
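A back-of-envelope check of the emissivity sensitivity quoted above, using broadband Stefan-Boltzmann radiometry rather than the camera's 7.5-14 micron band (the leaf and background temperatures are assumptions for the example):

```python
sigma = 5.670e-8                      # Stefan-Boltzmann constant, W m^-2 K^-4

def retrieve_T(L_meas, emissivity, T_background=263.0):
    """Invert measured radiance for surface temperature, removing reflected background."""
    reflected = (1.0 - emissivity) * sigma * T_background**4
    return ((L_meas - reflected) / (emissivity * sigma)) ** 0.25

T_leaf_true, eps_true = 298.0, 0.98
L = eps_true * sigma * T_leaf_true**4 + (1 - eps_true) * sigma * 263.0**4   # simulated measurement

for eps in (0.98, 0.97):
    print("emissivity %.2f -> retrieved leaf T = %.2f K" % (eps, retrieve_T(L, eps)))
# the ~0.3 K shift for a 0.01 emissivity error is the same order as the 0.5 degC figure quoted above
```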
Video System Highlights Hydrogen Fires
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Gleman, Stuart M.; Moerk, John S.
1992-01-01
Video system combines images from visible spectrum and from three bands in infrared spectrum to produce color-coded display in which hydrogen fires distinguished from other sources of heat. Includes linear array of 64 discrete lead selenide mid-infrared detectors operating at room temperature. Images overlaid on black and white image of same scene from standard commercial video camera. In final image, hydrogen fires appear red; carbon-based fires, blue; and other hot objects, mainly green and combinations of green and red. Where no thermal source present, image remains in black and white. System enables high degree of discrimination between hydrogen flames and other thermal emitters.
Cameras Reveal Elements in the Short Wave Infrared
NASA Technical Reports Server (NTRS)
2010-01-01
Goodrich ISR Systems Inc. (formerly Sensors Unlimited Inc.), based out of Princeton, New Jersey, received Small Business Innovation Research (SBIR) contracts from the Jet Propulsion Laboratory, Marshall Space Flight Center, Kennedy Space Center, Goddard Space Flight Center, Ames Research Center, Stennis Space Center, and Langley Research Center to assist in advancing and refining indium gallium arsenide imaging technology. Used on the Lunar Crater Observation and Sensing Satellite (LCROSS) mission in 2009 for imaging the short wave infrared wavelengths, the technology has dozens of applications in military, security and surveillance, machine vision, medical, spectroscopy, semiconductor inspection, instrumentation, thermography, and telecommunications.
HUBBLE SPACE TELESCOPE RESOLVES VOLCANOES ON IO
NASA Technical Reports Server (NTRS)
2002-01-01
This picture is a composite of a black and white near infrared image of Jupiter and its satellite Io and a color image of Io at shorter wavelengths taken at almost the same time on March 5, 1994. These are the first images of a giant planet or its satellites taken by NASA's Hubble Space Telescope (HST) since the repair mission in December 1993. Io is too small for ground-based telescopes to see the surface details. The moon's angular diameter of one arc second is at the resolution limit of ground-based telescopes. Many of these markings correspond to volcanoes that were first revealed in 1979 during the Voyager spacecraft flyby of Jupiter. Several of the volcanoes are periodically active because Io is heated by tides raised by Jupiter's powerful gravity. The volcano Pele appears as a dark spot surrounded by an irregular orange oval in the lower part of the image. The orange material has been ejected from the volcano and spread over a huge area. Though the volcano was first discovered by Voyager, the distinctive orange color of the volcanic deposits is a new discovery in these HST images. (Voyager missed it because its cameras were not sensitive to the near-infrared wavelengths where the color is apparent). The sulfur and sulfur dioxide that probably dominate Io's surface composition cannot produce this orange color, so the Pele volcano must be generating material with a more unusual composition, possibly rich in sodium. The Jupiter image, taken in near-infrared light, was obtained with HST's Wide Field and Planetary Camera in wide field mode. High altitude ammonia crystal clouds are bright in this image because they reflect infrared light before it is absorbed by methane in Jupiter's atmosphere. The most prominent feature is the Great Red Spot, which is conspicuous because of its high clouds. A cap of high-altitude haze appears at Jupiter's south pole. The Wide Field/Planetary Camera 2 was developed by the Jet Propulsion Laboratory and managed by the Goddard Space Flight Center for NASA's Office of Space Science. Credit: John Spencer, Lowell Observatory; NASA
Jolliff, B.; Knoll, A.; Morris, R.V.; Moersch, J.; McSween, H.; Gilmore, M.; Arvidson, R.; Greeley, R.; Herkenhoff, K.; Squyres, S.
2002-01-01
Blind field tests of the Field Integration Design and Operations (FIDO) prototype Mars rover were carried out 7-16 May 2000. A Core Operations Team (COT), sequestered at the Jet Propulsion Laboratory without knowledge of test site location, prepared command sequences and interpreted data acquired by the rover. Instrument sensors included a stereo panoramic camera, navigational and hazard-avoidance cameras, a color microscopic imager, an infrared point spectrometer, and a rock coring drill. The COT designed command sequences, which were relayed by satellite uplink to the rover, and evaluated instrument data. Using aerial photos and Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data, and information from the rover sensors, the COT inferred the geology of the landing site during the 18 sol mission, including lithologic diversity, stratigraphic relationships, environments of deposition, and weathering characteristics. Prominent lithologic units were interpreted to be dolomite-bearing rocks, kaolinite-bearing altered felsic volcanic materials, and basalt. The color panoramic camera revealed sedimentary layering and rock textures, and geologic relationships seen in rock exposures. The infrared point spectrometer permitted identification of prominent carbonate and kaolinite spectral features and permitted correlations to outcrops that could not be reached by the rover. The color microscopic imager revealed fine-scale rock textures, soil components, and results of coring experiments. Test results show that close-up interrogation of rocks is essential to investigations of geologic environments and that observations must include scales ranging from individual boulders and outcrops (microscopic, macroscopic) to orbital remote sensing, with sufficient intermediate steps (descent images) to connect in situ and remote observations.
Nighttime Foreground Pedestrian Detection Based on Three-Dimensional Voxel Surface Model.
Li, Jing; Zhang, Fangbing; Wei, Lisong; Yang, Tao; Lu, Zhaoyang
2017-10-16
Pedestrian detection is among the most frequently-used preprocessing tasks in many surveillance application fields, from low-level people counting to high-level scene understanding. Even though many approaches perform well in the daytime with sufficient illumination, pedestrian detection at night is still a critical and challenging problem for video surveillance systems. To respond to this need, in this paper, we provide an affordable solution with a near-infrared stereo network camera, as well as a novel three-dimensional foreground pedestrian detection model. Specifically, instead of using an expensive thermal camera, we build a near-infrared stereo vision system with two calibrated network cameras and near-infrared lamps. The core of the system is a novel voxel surface model, which is able to estimate the dynamic changes of three-dimensional geometric information of the surveillance scene and to segment and locate foreground pedestrians in real time. A free update policy for unknown points is designed for model updating, and the extracted shadow of the pedestrian is adopted to remove foreground false alarms. To evaluate the performance of the proposed model, the system is deployed in several nighttime surveillance scenes. Experimental results demonstrate that our method is capable of nighttime pedestrian segmentation and detection in real time under heavy occlusion. In addition, the qualitative and quantitative comparison results show that our work outperforms classical background subtraction approaches and a recent RGB-D method, as well as achieving comparable performance with the state-of-the-art deep learning pedestrian detection method even with a much lower hardware cost.
NASA Technical Reports Server (NTRS)
Harper, D. A.
1996-01-01
The objective of this grant was to construct a series of far infrared photometers, cameras, and supporting systems for use in astronomical observations in the Kuiper Airborne Observatory. The observations have included studies of galaxies, star formation regions, and objects within the Solar System.
Continuous All-Sky Cloud Measurements: Cloud Fraction Analysis Based on a Newly Developed Instrument
NASA Astrophysics Data System (ADS)
Aebi, C.; Groebner, J.; Kaempfer, N.; Vuilleumier, L.
2017-12-01
Clouds play an important role in the climate system and are also a crucial parameter for the Earth's surface energy budget. Ground-based measurements of clouds provide data at high temporal resolution in order to quantify their influence on radiation. The newly developed all-sky cloud camera at PMOD/WRC in Davos (Switzerland), the infrared cloud camera (IRCCAM), is a microbolometer sensitive in the 8 - 14 μm wavelength range. To obtain all-sky information, the camera is mounted on top of a frame looking downward onto a spherical gold-plated mirror. The IRCCAM has been measuring continuously (day and nighttime) with a time resolution of one minute in Davos since September 2015. To assess the performance of the IRCCAM, two different visible all-sky cameras (Mobotix Q24M and Schreder VIS-J1006), which can only operate during daytime, are installed in Davos. All three camera systems have different software for calculating fractional cloud coverage from images. Our study mainly analyzes the fractional cloud coverage of the IRCCAM and compares it with the fractional cloud coverage calculated from the two visible cameras. Preliminary results of the measurement accuracy of the IRCCAM compared to the visible cameras indicate that 78 % of the data are within ± 1 octa and even 93 % within ± 2 octas. An uncertainty of 1-2 octas corresponds to the measurement uncertainty of human observers. Therefore, the IRCCAM shows similar performance in detection of cloud coverage as the visible cameras and the human observers, with the advantage that continuous measurements with high temporal resolution are possible.
Autonomous Docking Based on Infrared System for Electric Vehicle Charging in Urban Areas
Pérez, Joshué; Nashashibi, Fawzi; Lefaudeux, Benjamin; Resende, Paulo; Pollard, Evangeline
2013-01-01
Electric vehicles are progressively introduced in urban areas, because of their ability to reduce air pollution, fuel consumption and noise nuisance. Nowadays, some big cities are launching the first electric car-sharing projects to clear traffic jams and enhance urban mobility, as an alternative to the classic public transportation systems. However, there are still some problems to be solved related to energy storage, electric charging and autonomy. In this paper, we present an autonomous docking system for electric vehicle recharging based on an on-board infrared camera that detects infrared beacons installed in the infrastructure. A visual servoing system coupled with an automatic controller allows the vehicle to dock accurately to the recharging booth in a street parking area. The results show good behavior of the implemented system, which is currently deployed as a real prototype system in the city of Paris. PMID:23429581
Multi-color IR sensors based on QWIP technology for security and surveillance applications
NASA Astrophysics Data System (ADS)
Sundaram, Mani; Reisinger, Axel; Dennis, Richard; Patnaude, Kelly; Burrows, Douglas; Cook, Robert; Bundas, Jason
2006-05-01
Room-temperature targets are detected at the furthest distance by imaging them in the long wavelength (LW: 8-12 μm) infrared spectral band where they glow brightest. Focal plane arrays (FPAs) based on quantum well infrared photodetectors (QWIPs) have sensitivity, noise, and cost metrics that have enabled them to become the best commercial solution for certain security and surveillance applications. Recently, QWIP technology has advanced to provide pixel-registered dual-band imaging in both the midwave (MW: 3-5 μm) and longwave infrared spectral bands in a single chip. This elegant technology affords a degree of target discrimination as well as the ability to maximize detection range for hot targets (e.g. missile plumes) by imaging in the midwave and for room-temperature targets (e.g. humans, trucks) by imaging in the longwave with one simple camera. Detection-range calculations are illustrated and FPA performance is presented.
Occultation Spectrophotometry of Extrasolar Planets with SOFIA
NASA Astrophysics Data System (ADS)
Angerhausen, Daniel; Huber, Klaus F.; Mandell, Avi M.; McElwain, Michael W.; Czesla, Stefan; Madhusudhan, Nikku; Morse, Jon A.
2014-04-01
The NASA/DLR Stratospheric Observatory for Infrared Astronomy (SOFIA), a 2.5-meter infrared telescope on board a Boeing 747-SP, will conduct 0.3 - 1,600 μm photometric, spectroscopic, and imaging observations from altitudes as high as 45,000 ft., where the average atmospheric transmission is greater than 80 percent. SOFIA's first light cameras and spectrometers, as well as future generations of instruments, will make important contributions to the characterization of the physical properties of exoplanets. Our analysis shows that optical and near-infrared photometric and spectrophotometric follow-up observations during planetary transits and eclipses will be feasible with SOFIA's instrumentation, in particular the HIPO-FLITECAM optical/NIR instruments. The airborne-based platform has unique advantages in comparison to ground- and space-based observatories in this field of research which we will outline here. Furthermore we will present two exemplary science cases, that will be conducted in SOFIA's cycle 1.
Invisible marker based augmented reality system
NASA Astrophysics Data System (ADS)
Park, Hanhoon; Park, Jong-Il
2005-07-01
Augmented reality (AR) has recently gained significant attention. Previous AR techniques usually need a fiducial marker with known geometry or objects whose structure can be easily estimated, such as a cube. Placing a marker in the workspace of the user can be intrusive. To overcome this limitation, we present an AR system using invisible markers which are created/drawn with an infrared (IR) fluorescent pen. Two cameras are used: an IR camera and a visible camera, positioned on either side of a cold mirror so that their optical centers coincide with each other. We track the invisible markers using the IR camera and visualize AR in the view of the visible camera. Additional algorithms are employed so that the system performs reliably against cluttered backgrounds. Experimental results are given to demonstrate the viability of the proposed system. As an application of the proposed system, the invisible marker can act as a Vision-Based Identity and Geometry (VBIG) tag, which can significantly extend the functionality of RFID. The invisible tag is the same as RFID in that it is not perceivable, but more powerful in that the tag information can be presented to the user by direct projection using a mobile projector or by visualizing AR on the screen of a mobile PDA.
NASA Astrophysics Data System (ADS)
Li, J.; Wu, Z.; Wei, X.; Zhang, Y.; Feng, F.; Guo, F.
2018-04-01
Cross-calibration has the advantages of high precision, low resource requirements and simple implementation, and it has been widely used in recent years. The four wide-field-of-view (WFV) cameras on board the Gaofen-1 satellite provide high spatial resolution and wide combined coverage (4 × 200 km) without onboard calibration. In this paper, the four-band radiometric cross-calibration coefficients of the WFV1 camera were obtained based on radiometric and geometric matching, taking the Landsat 8 OLI (Operational Land Imager) sensor as reference. The Scale Invariant Feature Transform (SIFT) feature detection method and a distance and included-angle weighting method were introduced to correct misregistration of the WFV-OLI image pairs. A radiative transfer model was used to eliminate the spectral differences between the OLI sensor and the WFV1 camera through a spectral match factor (SMF). The near-infrared band of the WFV1 camera encompasses water vapor absorption bands, so a look-up table (LUT) of SMF as a function of water vapor amount was established to estimate the water vapor effects. A synchronous surface experiment was designed to verify the reliability of the cross-calibration coefficients, which appear to perform better than the official coefficients published by the China Centre for Resources Satellite Data and Application (CCRSDA).
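A simplified sketch of the cross-calibration step under an assumed linear sensor model (synthetic matched pixels, a single assumed SMF; not the paper's exact procedure): WFV1 digital numbers are regressed against SMF-adjusted OLI radiance to obtain gain and offset.

```python
import numpy as np

rng = np.random.default_rng(1)
oli_radiance = rng.uniform(40.0, 220.0, 500)     # W m^-2 sr^-1 um^-1 at matched, homogeneous pixels
smf = 0.96                                       # spectral match factor (assumed constant here)
wfv_dn = 5.1 * (smf * oli_radiance) + 120.0 + rng.normal(0, 3.0, 500)   # synthetic WFV1 digital numbers

# linear least-squares fit of DN = gain * (SMF * L_OLI) + offset
A = np.vstack([smf * oli_radiance, np.ones_like(oli_radiance)]).T
(gain, offset), *_ = np.linalg.lstsq(A, wfv_dn, rcond=None)
print("calibration: radiance = (DN - %.1f) / %.2f" % (offset, gain))
```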
Development of an Extra-vehicular (EVA) Infrared (IR) Camera Inspection System
NASA Technical Reports Server (NTRS)
Gazarik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Pandolf, John; Jenkins, Rusty; Yates, Rusty
2006-01-01
Designed to fulfill a critical inspection need for the Space Shuttle Program, the EVA IR Camera System can detect cracks and subsurface defects in the Reinforced Carbon-Carbon (RCC) sections of the Space Shuttle's Thermal Protection System (TPS). The EVA IR Camera performs this detection by taking advantage of the natural thermal gradients induced in the RCC by solar flux and thermal emission from the Earth. This instrument is a compact, low-mass, low-power solution (1.2 cm3, 1.5 kg, 5.0 W) for TPS inspection that exceeds existing requirements for feature detection. Taking advantage of ground-based IR thermography techniques, the EVA IR Camera System provides the Space Shuttle Program with a solution that can be accommodated by the existing inspection system. The EVA IR Camera System augments the visible and laser inspection systems and finds cracks and subsurface damage that are not measurable by the other sensors, and thus fills a critical gap in the Space Shuttle's inspection needs. This paper discusses the on-orbit RCC inspection measurement concept and requirements, and then presents a detailed description of the EVA IR Camera System design.
NASA Astrophysics Data System (ADS)
Bachche, Shivaji; Oka, Koichi
2013-06-01
This paper presents a comparative study of various color space models to determine which is best suited for the detection of green sweet peppers. The images were captured using CCD cameras and infrared cameras and processed with Halcon image processing software. An LED ring around the camera neck was used as artificial lighting to enhance the feature parameters. For color images the CIELab, YIQ, YUV, HSI and HSV color spaces were selected, whereas for infrared images grayscale processing was used. For color images, the HSV color space gave the highest percentage of green sweet pepper detections, followed by HSI, as both describe color in terms of hue/lightness/chroma or hue/lightness/saturation, which are often more relevant for discriminating the fruit in the image at a specific threshold value. Overlapping fruits or fruits covered by leaves are detected better in the HSV color space, since the reflection from fruits produces a stronger histogram response than the reflection from leaves. The IR 80 optical filter failed to distinguish fruits in the images because the filter blocks useful feature information. The 3D coordinates of recognized green sweet peppers were also computed, with the Halcon software providing the location and orientation of the fruits accurately. The depth accuracy along the Z axis was examined: a camera-to-fruit distance of 500 to 600 mm was found to give precise depth estimates when the baseline between the two cameras was kept at 100 mm.
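A minimal HSV-thresholding sketch of the kind of segmentation compared in the study (OpenCV substituted for Halcon; the image file and hue thresholds are illustrative assumptions, not the study's calibrated values):

```python
import cv2
import numpy as np

bgr = cv2.imread("pepper_scene.png")                 # hypothetical input image
hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)

lower = np.array([35, 60, 60])                       # assumed green hue/saturation/value bounds
upper = np.array([85, 255, 255])
mask = cv2.inRange(hsv, lower, upper)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))   # clean small speckles

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    if cv2.contourArea(c) > 500:                     # ignore small blobs
        x, y, w, h = cv2.boundingRect(c)
        print("candidate fruit at", (x, y, w, h))
```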
NASA Astrophysics Data System (ADS)
Arvidson, R. E.; Squyres, S. W.; Baumgartner, E. T.; Schenker, P. S.; Niebur, C. S.; Larsen, K. W.; Seelos IV, F. P.; Snider, N. O.; Jolliff, B. L.
2002-08-01
The Field Integration Design and Operations (FIDO) prototype Mars rover was deployed and operated remotely for 2 weeks in May 2000 in the Black Rock Summit area of Nevada. The blind science operation trials were designed to evaluate the extent to which FIDO-class rovers can be used to conduct traverse science and collect samples. FIDO-based instruments included stereo cameras for navigation and imaging, an infrared point spectrometer, a color microscopic imager for characterization of rocks and soils, and a rock drill for core acquisition. Body-mounted "belly" cameras aided drill deployment, and front and rear hazard cameras enabled terrain hazard avoidance. Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data, a high spatial resolution IKONOS orbital image, and a suite of descent images were used to provide regional- and local-scale terrain and rock type information, from which hypotheses were developed for testing during operations. The rover visited three sites, traversed 30 m, and acquired 1.3 gigabytes of data. The relatively small traverse distance resulted from a geologically rich site in which materials identified on a regional scale from remote-sensing data could be identified on a local scale using rover-based data. Results demonstrate the synergy of mapping terrain from orbit and during descent using imaging and spectroscopy, followed by a rover mission to test inferences and to make discoveries that can be accomplished only with surface mobility systems.
Star Formation as Seen by the Infrared Array Camera on Spitzer
NASA Technical Reports Server (NTRS)
Smith, Howard A.; Allen, L.; Megeath, T.; Barmby, P.; Calvet, N.; Fazio, G.; Hartmann, L.; Myers, P.; Marengo, M.; Gutermuth, R.
2004-01-01
The Infrared Array Camera (IRAC) onboard Spitzer has imaged regions of star formation (SF) in its four IR bands with spatial resolutions of approximately 2"/pixel. IRAC is sensitive enough to detect very faint, embedded young stars at levels of tens of μJy, and IRAC photometry can categorize their stages of development: from young protostars with infalling envelopes (Class 0/I) to stars whose infrared excesses derive from accreting circumstellar disks (Class II) to evolved stars dominated by photospheric emission. The IRAC images also clearly reveal and help diagnose associated regions of shocked and/or PDR emission in the clouds; we find existing models provide a good start at explaining the continuum of the SF regions IRAC observes.
Near infra-red astronomy with adaptive optics and laser guide stars at the Keck Observatory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Max, C.E.; Gavel, D.T.; Olivier, S.S.
1995-08-03
A laser guide star adaptive optics system is being built for the W. M. Keck Observatory's 10-meter Keck II telescope. Two new near infra-red instruments will be used with this system: a high-resolution camera (NIRC 2) and an echelle spectrometer (NIRSPEC). The authors describe the expected capabilities of these instruments for high-resolution astronomy, using adaptive optics with either a natural star or a sodium-layer laser guide star as a reference. They compare the expected performance of these planned Keck adaptive optics instruments with that predicted for the NICMOS near infra-red camera, which is scheduled to be installed on the Hubble Space Telescope in 1997.
Khokhlova, Vera A.; Shmeleva, Svetlana M.; Gavrilov, Leonid R.; Martin, Eleanor; Sadhoo, Neelaksh; Shaw, Adam
2013-01-01
Considerable progress has been achieved in the use of infrared (IR) techniques for qualitative mapping of acoustic fields of high intensity focused ultrasound (HIFU) transducers. The authors have previously developed and demonstrated a method based on IR camera measurement of the temperature rise induced in an absorber less than 2 mm thick by ultrasonic bursts of less than 1 s duration. The goal of this paper was to make the method more quantitative and estimate the absolute intensity distributions by determining an overall calibration factor for the absorber and camera system. The implemented approach involved correlating the temperature rise measured in an absorber using an IR camera with the pressure distribution measured in water using a hydrophone. The measurements were conducted for two HIFU transducers and a flat physiotherapy transducer of 1 MHz frequency. Corresponding correction factors between the free field intensity and temperature were obtained and allowed the conversion of temperature images to intensity distributions. The system described here was able to map in good detail focused and unfocused ultrasound fields with sub-millimeter structure and with local time average intensity from below 0.1 W/cm2 to at least 50 W/cm2. Significantly higher intensities could be measured simply by reducing the duty cycle. PMID:23927199
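A hedged sketch of the calibration idea, assuming a linear relation between the IR-measured temperature rise in the absorber and the hydrophone-measured free-field intensity (the profile values below are made up for illustration):

```python
import numpy as np

delta_T_profile = np.array([0.4, 1.1, 2.6, 4.0, 2.5, 1.0, 0.4])       # K, IR camera line-out
intensity_profile = np.array([1.6, 4.5, 10.2, 16.1, 10.0, 4.1, 1.5])  # W/cm^2, hydrophone line-out

# least-squares calibration factor (W/cm^2 per K), forced through the origin
cal = np.dot(delta_T_profile, intensity_profile) / np.dot(delta_T_profile, delta_T_profile)
print("calibration factor: %.2f W/cm^2 per K" % cal)

delta_T_image = np.random.default_rng(2).uniform(0.0, 4.0, (64, 64))  # stand-in IR frame
intensity_image = cal * delta_T_image                                 # temperature image -> intensity map
print("peak intensity in frame: %.1f W/cm^2" % intensity_image.max())
```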
Non-destructive 3D shape measurement of transparent and black objects with thermal fringes
NASA Astrophysics Data System (ADS)
Brahm, Anika; Rößler, Conrad; Dietrich, Patrick; Heist, Stefan; Kühmstedt, Peter; Notni, Gunther
2016-05-01
Fringe projection is a well-established optical method for the non-destructive contactless three-dimensional (3D) measurement of object surfaces. Typically, fringe sequences in the visible wavelength range (VIS) are projected onto the surfaces of objects to be measured and are observed by two cameras in a stereo vision setup. The reconstruction is done by finding corresponding pixels in both cameras followed by triangulation. Problems can occur if the properties of some materials disturb the measurements. If the objects are transparent, translucent, reflective, or strongly absorbing in the VIS range, the projected patterns cannot be recorded properly. To overcome these challenges, we present a new alternative approach in the infrared (IR) region of the electromagnetic spectrum. For this purpose, two long-wavelength infrared (LWIR) cameras (7.5 - 13 μm) are used to detect the heat radiation emitted from surfaces, which is induced by a pattern projection unit driven by a CO2 laser (10.6 μm). Thus, materials like glass or black objects, e.g. carbon fiber materials, can be measured non-destructively without the need for any additional coatings. We will demonstrate the basic principles of this heat pattern approach and show two types of 3D systems based on a freeform mirror and a GOBO wheel (GOes Before Optics) projector unit.
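For context, a generic 4-step phase-shift evaluation of the kind fringe-projection systems commonly use (textbook formula applied to synthetic fringes, not the authors' GOBO/freeform processing chain):

```python
import numpy as np

rng = np.random.default_rng(3)
h, w = 120, 160
true_phase = 2 * np.pi * np.linspace(0, 8, w)[None, :] * np.ones((h, 1))   # 8 fringes across the image

# four frames with fringes shifted by 0, 90, 180 and 270 degrees, plus camera noise
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
frames = [100 + 50 * np.cos(true_phase + s) + rng.normal(0, 1.0, (h, w)) for s in shifts]

I0, I1, I2, I3 = frames
wrapped = np.arctan2(I3 - I1, I0 - I2)      # wrapped phase in (-pi, pi], used for stereo correspondence
print("wrapped phase range: %.2f to %.2f rad" % (wrapped.min(), wrapped.max()))
```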
Hyperspectral imaging spectro radiometer improves radiometric accuracy
NASA Astrophysics Data System (ADS)
Prel, Florent; Moreau, Louis; Bouchard, Robert; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc
2013-06-01
Reliable and accurate infrared characterization is necessary to measure the specific spectral signatures of aircraft and the associated infrared countermeasure protections (i.e., flares). Infrared characterization is essential to improve countermeasure efficiency, improve friend-foe identification and reduce the risk of friendly fire. Typical infrared characterization measurement setups include a variety of panchromatic cameras and spectroradiometers. Each instrument brings essential information; cameras measure the spatial distribution of targets and spectroradiometers provide the spectral distribution of the emitted energy. However, the combination of separate instruments introduces possible radiometric errors and uncertainties that can be reduced with hyperspectral imagers, which combine spectral and spatial information in the same data set, measuring both distributions at the same time and ensuring the temporal and spatial cohesion of the collected information. This paper presents a quantitative analysis of the main contributors to radiometric uncertainties and shows how a hyperspectral imager can reduce these uncertainties.
Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt
NASA Technical Reports Server (NTRS)
Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.
1977-01-01
Airborne remote sensing surveys of the Suez Canal Zone were conducted, as part of the rehabilitation plan, using an I2S multispectral camera and a Bendix LN-3 passive infrared scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near infrared bands. The scanner was operated in the 8 to 14 micron thermal infrared band, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were used to construct new geological, structural lineation and drainage maps of the Suez Canal Zone at a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.
NASA Astrophysics Data System (ADS)
Chatterjee, Abhijit; Verma, Anurag
2016-05-01
The Advanced Wide Field Sensor (AWiFS) camera caters to the high temporal resolution requirement of the Resourcesat-2A mission, with a repeat cycle of 5 days. The AWiFS camera consists of four spectral bands, three in the visible and near IR and one in the short wave infrared. The imaging concept in the VNIR bands is based on push-broom scanning using linear-array silicon charge-coupled device (CCD) based focal plane arrays (FPAs). An on-board calibration unit for these CCD-based FPAs is used to monitor any degradation in the FPAs over the entire mission life. Four LEDs are operated in constant-current mode, and 16 different light intensity levels are generated by electronically changing the CCD exposure throughout the calibration cycle. This paper describes the experimental setup and characterization results of various flight-model visible LEDs (λP = 650 nm) for the development of the on-board calibration unit of the Advanced Wide Field Sensor (AWiFS) camera of RESOURCESAT-2A. Various LED configurations have been studied to cover the dynamic range of the 6000-pixel silicon CCD focal plane array from 20% to 60% of saturation during the night pass of the satellite, in order to identify degradation of detector elements. The paper also compares simulated and experimental CCD output profiles for different LED combinations in constant-current mode.
Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras
NASA Technical Reports Server (NTRS)
Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellut, Paolo; Sherwin, Gary
2011-01-01
TIR cameras can be used for day/night Unmanned Ground Vehicle (UGV) autonomous navigation when stealth is required. The quality of uncooled TIR cameras has significantly improved over the last decade, making them a viable option at low speed. Limiting factors for stereo ranging with uncooled LWIR cameras are image blur and low-texture scenes. TIR perception capabilities JPL has explored include: (1) single- and dual-band TIR terrain classification; (2) obstacle detection (pedestrians, vehicles, tree trunks, ditches, and water); and (3) perception through obscurants.
WiseEye: Next Generation Expandable and Programmable Camera Trap Platform for Wildlife Research.
Nazir, Sajid; Newey, Scott; Irvine, R Justin; Verdicchio, Fabio; Davidson, Paul; Fairhurst, Gorry; Wal, René van der
2017-01-01
The widespread availability of relatively cheap, reliable and easy to use digital camera traps has led to their extensive use for wildlife research, monitoring and public outreach. Users of these units are, however, often frustrated by the limited options for controlling camera functions, the generation of large numbers of images, and the lack of flexibility to suit different research environments and questions. We describe the development of a user-customisable open source camera trap platform named 'WiseEye', designed to provide flexible camera trap technology for wildlife researchers. The novel platform is based on a Raspberry Pi single-board computer and compatible peripherals that allow the user to control its functions and performance. We introduce the concept of confirmatory sensing, in which passive infrared (PIR) triggering is confirmed through other modalities (e.g. radar, pixel change) to reduce the occurrence of false positive images. This concept, together with user-definable metadata, aided identification of spurious images and greatly reduced post-collection processing time. When tested against a commercial camera trap, WiseEye was found to reduce the incidence of false positive images and false negatives across a range of test conditions. WiseEye represents a step-change in camera trap functionality, greatly increasing the value of this technology for wildlife research and conservation management.
BAE Systems' 17μm LWIR camera core for civil, commercial, and military applications
NASA Astrophysics Data System (ADS)
Lee, Jeffrey; Rodriguez, Christian; Blackwell, Richard
2013-06-01
Seventeen (17) µm pixel Long Wave Infrared (LWIR) sensors based on vanadium oxide (VOx) micro-bolometers have been in full-rate production at BAE Systems' Night Vision Sensors facility in Lexington, MA for the past five years.[1] We introduce here a commercial camera core product, the Airia-MTM imaging module, in a VGA format that reads out in 30 and 60 Hz progressive modes. The camera core is architected to conserve power, with all-digital interfaces from the readout integrated circuit through video output. The architecture enables a variety of input/output interfaces including Camera Link, USB 2.0, micro-display drivers and an optional RS-170 analog output supporting legacy systems. The modular board architecture of the electronics facilitates hardware upgrades, allowing us to capitalize on the latest high-performance, low-power electronics developed for mobile phones. Software and firmware are field upgradeable through a USB 2.0 port. The USB port also gives users access to up to 100 digitally stored (lossless) images.
NASA Technical Reports Server (NTRS)
Clegg, R. H.; Scherz, J. P.
1975-01-01
Successful aerial photography depends on aerial cameras providing acceptable photographs within the cost restrictions of the job. For topographic mapping, where ultimate accuracy is required, only large-format mapping cameras will suffice. For mapping environmental patterns of vegetation, soils, or water pollution, 9-inch cameras often exceed accuracy and cost requirements, and small formats may be better. In choosing the best camera for environmental mapping, relative capabilities and costs must be understood. This study compares resolution, photo interpretation potential, metric accuracy, and cost of 9-inch, 70mm, and 35mm cameras for obtaining simultaneous color and color infrared photography for environmental mapping purposes.
Sun, Guanghao; Nakayama, Yosuke; Dagdanpurev, Sumiyakhand; Abe, Shigeto; Nishimura, Hidekazu; Kirimoto, Tetsuo; Matsui, Takemi
2017-02-01
Infrared thermography (IRT) is used to screen febrile passengers at international airports, but it suffers from low sensitivity. This study explored the application of a combined visible and thermal image processing approach that uses a CMOS camera equipped with IRT to remotely sense multiple vital signs and screen patients with suspected infectious diseases. An IRT system that produced visible and thermal images was used for image acquisition. The subjects' respiration rates were measured by monitoring temperature changes around the nasal areas on thermal images; facial skin temperatures were measured simultaneously. Facial blood circulation causes tiny color changes in visible facial images that enable determination of the heart rate. A logistic regression discriminant function predicted the likelihood of infection within 10 s, based on the measured vital signs. Sixteen patients with an influenza-like illness and 22 control subjects participated in a clinical test at a clinic in Fukushima, Japan. The vital-sign-based IRT screening system had a sensitivity of 87.5% and a negative predictive value of 91.7%; these values are higher than those of conventional fever-based screening approaches. Multiple vital-sign-based screening efficiently detected patients with suspected infectious diseases and offers a promising alternative to conventional fever-based screening.
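The screening decision described above rests on a logistic regression discriminant function over the three remotely sensed vital signs. The following Python sketch shows the general form of such a classifier; the coefficients are hypothetical placeholders, since the fitted values are not given in the abstract.

import math

# Hypothetical coefficients -- the study fits these from clinical data;
# the values below are placeholders for illustration only.
B0, B_TEMP, B_HR, B_RR = -42.0, 0.9, 0.05, 0.3

def infection_probability(skin_temp_c, heart_rate_bpm, resp_rate_bpm):
    """Logistic regression discriminant on three remotely sensed vital signs."""
    z = B0 + B_TEMP * skin_temp_c + B_HR * heart_rate_bpm + B_RR * resp_rate_bpm
    return 1.0 / (1.0 + math.exp(-z))

# Example: a febrile subject measured by the thermal/visible camera system
p = infection_probability(skin_temp_c=37.8, heart_rate_bpm=95, resp_rate_bpm=22)
print(f"predicted likelihood of infection: {p:.2f}")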
First experiences with ARNICA, the ARCETRI observatory imaging camera
NASA Astrophysics Data System (ADS)
Lisi, F.; Baffa, C.; Hunt, L.; Maiolino, R.; Moriondo, G.; Stanga, R.
1994-03-01
ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a common-use instrument for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4 arcmin x 4 arcmin on the NICMOS 3 (256 x 256 pixels, 40 micrometer side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature of detector and optics is 76 K. We give an estimate of performance, in terms of sensitivity for an assigned observing time, along with some preliminary considerations on photometric accuracy.
Apollo 9 Mission image - S0-65 Multispectral Photography - Georgia
2009-02-19
AS09-26A-3792A (11 March 1969) --- Color infrared photograph of the Atlanta, Georgia area taken on March 11, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey (SO-65) experiment. At 11:21 a.m. (EST) when this picture was taken, the Apollo 9 spacecraft was at an altitude of 106 nautical miles, and the sun elevation was 47 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 33 degrees 10 minutes north latitude, and 84 degrees and 40 minutes west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.
Low Cost and Efficient 3d Indoor Mapping Using Multiple Consumer Rgb-D Cameras
NASA Astrophysics Data System (ADS)
Chen, C.; Yang, B. S.; Song, S.
2016-06-01
Driven by the miniaturization and weight reduction of positioning and remote sensing sensors, as well as the urgent need to fuse indoor and outdoor maps for next-generation navigation, 3D indoor mapping from mobile scanning is a hot research and application topic. The point clouds, together with auxiliary data such as colour and infrared images, derived from a 3D indoor mobile mapping suite can be used in a variety of novel applications, including indoor scene visualization, automated floorplan generation, gaming, reverse engineering, navigation, and simulation. State-of-the-art 3D indoor mapping systems equipped with multiple laser scanners produce accurate point clouds of building interiors containing billions of points. However, these laser-scanner-based systems are mostly expensive and not portable. Low-cost consumer RGB-D cameras provide an alternative way to address the core challenge of indoor mapping: capturing the detailed underlying geometry of building interiors. Nevertheless, RGB-D cameras have a very limited field of view, resulting in low efficiency during data collection and incomplete datasets that miss major building structures (e.g. ceilings, walls). Attempting to collect a complete scene without data gaps using a single RGB-D camera is not technically sound because of the large amount of human labour required and the many position parameters that need to be solved. To find an efficient and low-cost way to perform 3D indoor mapping, in this paper we present an indoor mapping suite prototype built upon a novel calibration method that calibrates the internal and external parameters of multiple RGB-D cameras. Three Kinect sensors are mounted on a rig with different view directions to form a large field of view. The calibration procedure has three steps: (1) the internal parameters of the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern; (2) the external parameters between the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern; (3) the external parameters between the Kinects are first calculated using a pre-set calibration field and further refined by an iterative closest point algorithm. Experiments are carried out to validate the proposed method on RGB-D datasets collected by the indoor mapping suite prototype. The effectiveness and accuracy of the proposed method are evaluated by comparing the point clouds derived from the prototype with ground truth data collected by a commercial terrestrial laser scanner at ultra-high density. The overall analysis of the results shows that the proposed method achieves seamless integration of multiple point clouds from different RGB-D cameras collected at 30 frames per second.
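Steps (1) and (2) of the calibration procedure can be illustrated with OpenCV's chessboard routines, as in the Python sketch below; the file names, board geometry and the assumption that the colour and infrared images share the same resolution are all hypothetical, and step (3) (the calibration field plus iterative closest point refinement between Kinects) is not shown.

import glob
import cv2
import numpy as np

# Hypothetical chessboard geometry and file naming -- placeholders only.
BOARD = (9, 6)            # inner corners per row / column
SQUARE = 0.025            # chessboard square size in metres

objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, col_pts, ir_pts = [], [], []
size = None
# Synchronised colour / infrared image pairs of the same board poses (assumed naming)
for col_file, ir_file in zip(sorted(glob.glob("colour_*.png")),
                             sorted(glob.glob("ir_*.png"))):
    col = cv2.imread(col_file, cv2.IMREAD_GRAYSCALE)
    ir = cv2.imread(ir_file, cv2.IMREAD_GRAYSCALE)
    ok_c, c_corners = cv2.findChessboardCorners(col, BOARD)
    ok_i, i_corners = cv2.findChessboardCorners(ir, BOARD)
    if ok_c and ok_i:                      # keep only poses seen by both cameras
        obj_pts.append(objp)
        col_pts.append(c_corners)
        ir_pts.append(i_corners)
        size = col.shape[::-1]

# Step 1: intrinsics of the colour and infrared cameras, estimated separately
_, K_col, d_col, _, _ = cv2.calibrateCamera(obj_pts, col_pts, size, None, None)
_, K_ir, d_ir, _, _ = cv2.calibrateCamera(obj_pts, ir_pts, size, None, None)

# Step 2: extrinsics (rotation R, translation T) between the two cameras,
# holding the intrinsics fixed
_, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
    obj_pts, col_pts, ir_pts, K_col, d_col, K_ir, d_ir, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("colour -> infrared rotation:\n", R)
print("translation (m):", T.ravel())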
Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung
2017-05-08
Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. The existing research using visible light cameras has mainly focused on methods of human detection for daytime hours when there is outside light, but human detection during nighttime hours when there is no outside light is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras or thermal cameras have been used. However, in the case of NIR illuminators, there are limitations in terms of the illumination angle and distance. There are also difficulties because the illuminator power must be adaptively adjusted depending on whether the object is close or far away. In the case of thermal cameras, their cost is still high, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but this has focused on objects at a short distance in an indoor environment or on video-based methods that capture and process multiple images, which increases the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night on a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results on a self-constructed Dongguk night-time human detection database (DNHD-DB1) and two open databases (the Korea Advanced Institute of Science and Technology (KAIST) and Computer Vision Center (CVC) databases) show high-accuracy human detection in a variety of environments and excellent performance compared to existing methods.
Ambient-Light-Canceling Camera Using Subtraction of Frames
NASA Technical Reports Server (NTRS)
Morookian, John Michael
2004-01-01
The ambient-light-canceling camera (ALCC) is a proposed near-infrared electronic camera that would utilize a combination of (1) synchronized illumination during alternate frame periods and (2) subtraction of readouts from consecutive frames to obtain images without a background component of ambient light. The ALCC is intended especially for use in tracking the motion of an eye by the pupil center corneal reflection (PCCR) method. Eye tracking by the PCCR method has shown potential for application in human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological deficiencies. In the PCCR method, an eye is illuminated by near-infrared light from a light-emitting diode (LED). Some of the infrared light is reflected from the surface of the cornea. Some of the infrared light enters the eye through the pupil and is reflected from the back of the eye out through the pupil, a phenomenon commonly observed as the red-eye effect in flash photography. An electronic camera is oriented to image the user's eye. The output of the camera is digitized and processed by algorithms that locate the two reflections. Then, from the locations of the centers of the two reflections, the direction of gaze is computed. As described thus far, the PCCR method is susceptible to errors caused by reflections of ambient light. Although a near-infrared band-pass optical filter can be used to discriminate against ambient light, some sources of ambient light have enough in-band power to compete with the LED signal. The mode of operation of the ALCC would complement or supplant spectral filtering by providing more nearly complete cancellation of the effect of ambient light. In the operation of the ALCC, a near-infrared LED would be pulsed on during one camera frame period and off during the next frame period. Thus, the scene would be illuminated by both the LED (signal) light and the ambient (background) light during one frame period, and would be illuminated with only ambient (background) light during the next frame period. The camera output would be digitized and sent to a computer, wherein the pixel values of the background-only frame would be subtracted from the pixel values of the signal-plus-background frame to obtain signal-only pixel values. To prevent artifacts of motion from entering the images, it would be necessary to acquire image data at a rate greater than the standard video rate of 30 frames per second. For this purpose, the ALCC would exploit a novel control technique developed at NASA's Jet Propulsion Laboratory for advanced charge-coupled-device (CCD) cameras. This technique provides for readout from a subwindow [region of interest (ROI)] within the image frame. Because the desired reflections from the eye would typically occupy a small fraction of the area within the image frame, the ROI capability would make it possible to acquire and subtract pixel values at rates of several hundred frames per second, considerably greater than the standard video rate and sufficient to both (1) suppress motion artifacts and (2) track the motion of the eye between consecutive subtractive frame pairs.
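A minimal numpy sketch of the frame-subtraction scheme described above is given below, assuming that synchronized capture has already produced alternating LED-on and LED-off frames; LED control and ROI readout are outside its scope.

import numpy as np

def cancel_ambient(frames_on, frames_off):
    """Subtract consecutive LED-off (background-only) frames from LED-on
    (signal + background) frames to suppress ambient illumination.

    frames_on, frames_off: arrays of shape (n_frames, height, width),
    captured on alternating frame periods of the camera."""
    on = frames_on.astype(np.int32)    # widen to avoid unsigned underflow
    off = frames_off.astype(np.int32)
    diff = np.clip(on - off, 0, None)  # negative residuals from noise -> 0
    return diff.astype(np.uint16)

# Example with synthetic 8-bit data: a constant ambient level plus an LED glint
rng = np.random.default_rng(0)
ambient = rng.integers(40, 60, size=(4, 120, 160), dtype=np.uint8)
signal = np.zeros_like(ambient); signal[:, 60, 80] = 120   # corneal reflection
frames_on = np.clip(ambient.astype(int) + signal, 0, 255).astype(np.uint8)
result = cancel_ambient(frames_on, ambient)
print("peak of ambient-cancelled frame:", result.max())    # ~120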
A CMOS camera-based system for clinical photoplethysmographic applications
NASA Astrophysics Data System (ADS)
Humphreys, Kenneth; Markham, Charles; Ward, Tomas E.
2005-06-01
In this work an image-based photoplethysmography (PPG) system is developed and tested against a conventional finger-based system as commonly used in clinical practice. A PPG is essentially an optical instrument consisting of a near infrared (NIR) source and detector that is capable of tracking blood flow changes in body tissue. When used with a number of wavelengths in the NIR band, blood oxygenation changes as well as other blood chemical signatures can be ascertained, yielding a very useful device in the clinical realm. Conventionally such a device requires direct contact with the tissue under investigation, which eliminates the possibility of its use for applications like wound management where a tissue oxygenation measurement could be extremely useful. To circumvent this shortcoming we have developed a CMOS camera-based system, which can successfully extract the PPG signal without contact with the tissue under investigation. A comparison of our results with conventional techniques has yielded excellent results.
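As a rough illustration of how a PPG waveform can be extracted from camera frames without contact, the Python sketch below averages pixel intensity over a tissue region of interest and band-pass filters the result around cardiac frequencies; the ROI, filter band and synthetic data are assumptions, not the authors' processing chain.

import numpy as np
from scipy.signal import butter, filtfilt

def extract_ppg(frames, roi, fps):
    """Estimate a PPG waveform from camera frames by averaging pixel intensity
    over a tissue region of interest, then band-pass filtering around typical
    cardiac frequencies (0.7-4 Hz, i.e. 42-240 bpm).

    frames: array (n_frames, height, width) of NIR intensity images
    roi: (row_min, row_max, col_min, col_max) over the tissue"""
    r0, r1, c0, c1 = roi
    raw = frames[:, r0:r1, c0:c1].mean(axis=(1, 2))   # spatial mean per frame
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    return filtfilt(b, a, raw - raw.mean())

# Synthetic example: a 1.2 Hz (72 bpm) pulsation buried in pixel noise
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
frames = 100 + 2 * np.sin(2 * np.pi * 1.2 * t)[:, None, None] \
         + np.random.default_rng(1).normal(0, 1, (fps * seconds, 64, 64))
ppg = extract_ppg(frames, roi=(16, 48, 16, 48), fps=fps)
print("dominant frequency (Hz):",
      np.fft.rfftfreq(ppg.size, 1 / fps)[np.abs(np.fft.rfft(ppg)).argmax()])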
NASA Astrophysics Data System (ADS)
Ibarra-Castanedo, C.; Brault, L.; Marcotte, F.; Genest, M.; Farley, V.; Maldague, X.
2012-06-01
Water ingress in honeycomb structures is of great concern for the civil and military aerospace industries. Pressure and temperature variations during take-off and landing produce considerable stress on aircraft structures, promoting moisture ingress (by diffusion through fibers or by direct ingress through voids, cracks or unsealed joints) into the core. The presence of water (or other fluids such as kerosene, hydraulic fluid and de-icing agents) in any of its forms (vapor, liquid or ice) promotes corrosion and cell breakage, and induces composite layer delaminations and skin disbonds. In this study, testing specimens were produced from unserviceable parts of military aircraft. In order to simulate atmospheric conditions during landing, selected core areas were filled with measured quantities of water and then frozen in a cold chamber. The specimens were then removed from the chamber and monitored for over 20 minutes as they warmed up, using a cooled high-resolution infrared camera. Results show that detection and quantification of water ingress in honeycomb sandwich structures by passive infrared thermography is possible using an HD mid-wave infrared camera for volumes of water as low as 0.2 ml and from distances as far as 20 m from the target.
SOFIA Science Instruments: Commissioning, Upgrades and Future Opportunities
NASA Technical Reports Server (NTRS)
Smith, Erin C.
2014-01-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is the world's largest airborne observatory, featuring a 2.5 meter telescope housed in the aft section of a Boeing 747SP aircraft. SOFIA's current instrument suite includes: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), a 5-40 µm dual band imager/grism spectrometer developed at Cornell University; HIPO (High-speed Imaging Photometer for Occultations), a 0.3-1.1 micron imager built by Lowell Observatory; FLITECAM (First Light Infrared Test Experiment CAMera), a 1-5 micron wide-field imager/grism spectrometer developed at UCLA; FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), a 42-210 micron IFU grating spectrograph completed by the University of Stuttgart; and EXES (Echelon-Cross-Echelle Spectrograph), a 5-28 micron high-resolution spectrometer being completed by UC Davis and NASA Ames. A second generation instrument, HAWC+ (High-resolution Airborne Wideband Camera), is a 50-240 micron imager being upgraded at JPL to add polarimetry and new detectors developed at GSFC. SOFIA will continually update its instrument suite with new instrumentation, technology demonstration experiments and upgrades to the existing instruments. This paper details instrument capabilities and status as well as plans for future instrumentation, including the call for proposals for 3rd generation SOFIA science instruments.
POLICAN: A near-infrared imaging polarimeter at OAGH
NASA Astrophysics Data System (ADS)
Devaraj, R.; Luna, A.; Carrasco, L.; Mayya, Y. D.; Serrano-Bernal, O.
2017-07-01
We present a near-infrared linear imaging polarimeter, POLICAN, developed for the Cananea near-infrared camera (CANICA) at the 2.1 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located at Cananea, Sonora, México. POLICAN reaches a limiting magnitude of about 16 mag, with a polarimetric accuracy of about 1% for bright sources.
ARNICA: the Arcetri Observatory NICMOS3 imaging camera
NASA Astrophysics Data System (ADS)
Lisi, Franco; Baffa, Carlo; Hunt, Leslie K.
1993-10-01
ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4 arcmin x 4 arcmin on the NICMOS 3 (256 x 256 pixels, 40 micrometer side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature is 76 K. The camera is remotely controlled by a 486 PC, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the 486 PC, acquires and stores the frames, and controls the timing of the array. We give an estimate of performance, in terms of sensitivity for an assigned observing time, along with some details on the main parameters of the NICMOS 3 detector.
2001-11-26
KENNEDY SPACE CENTER, Fla. -- A piece of equipment for Hubble Space Telescope Servicing mission is moved inside Hangar AE, Cape Canaveral. In the canister is the Advanced Camera for Surveys (ACS). The ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. The goal of the mission, STS-109, is to service the HST, replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
2001-11-29
KENNEDY SPACE CENTER, Fla. -- In Hangar A&E, workers watch as an overhead crane lifts the Advanced Camera for Surveys out of its transportation container. Part of the payload on the Hubble Space Telescope Servicing Mission, STS-109, the ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. Tasks for the mission include replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
2001-11-26
KENNEDY SPACE CENTER, Fla. - A piece of equipment for Hubble Space Telescope Servicing mission arrives at Hangar AE, Cape Canaveral. Inside the canister is the Advanced Camera for Surveys (ACS). The ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. The goal of the mission, STS-109, is to service the HST, replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
THE VARIABLE NEAR-INFRARED COUNTERPART OF THE MICROQUASAR GRS 1758–258
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luque-Escamilla, Pedro L.; Martí, Josep; Muñoz-Arjonilla, Álvaro J., E-mail: peter@ujaen.es, E-mail: jmarti@ujaen.es, E-mail: ajmunoz@ujaen.es
2014-12-10
We present a new study of the microquasar system GRS 1758–258 in the near-infrared domain based on archival observations with the Hubble Space Telescope and the NICMOS camera. In addition to confirming the near-infrared counterpart pointed out by Muñoz-Arjonilla et al., we show that this object displays significant photometric variability. From its average magnitudes, we also find that GRS 1758–258 fits well within the correlation between the optical/near-infrared and X-ray luminosity known to exist for low-mass, black-hole candidate X-ray binaries in a hard state. Moreover, the spectral energy distribution built using all radio, near-infrared, and X-ray data available closest in time to the NICMOS observations can be reasonably interpreted in terms of a self-absorbed radio jet and an irradiated accretion disk model around a stellar-mass black hole. All these facts match the expected behavior of a compact binary system and strengthen our confidence in the counterpart identification.
Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion
NASA Astrophysics Data System (ADS)
Qiao, Tiezhu; Chen, Lulu; Pang, Yusong; Yan, Gaowei
2018-06-01
Infrared and visible light image fusion technology has been a hot spot in multi-sensor fusion research in recent years. Existing infrared and visible light fusion systems require image registration before fusion because they use two separate cameras; however, the performance of the registration step still needs improvement. Hence, a novel integrative multi-spectral sensor device is proposed for infrared and visible light fusion: using a beam splitter prism, the coaxial light incident through a single lens is projected onto an infrared charge coupled device (CCD) and a visible light CCD, respectively. In this paper, the imaging mechanism of the proposed sensor device is studied along with the process of signal acquisition and fusion. A simulation experiment, which covers the entire process of the optical system, signal acquisition, and signal fusion, is constructed based on an imaging effect model. Additionally, quality evaluation indices are adopted to analyze the simulation results. The experimental results demonstrate that the proposed sensor device is effective and feasible.
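The abstract does not state the fusion rule or the quality evaluation index used; as a generic illustration of pixel-level fusion of co-registered (here, coaxially acquired) infrared and visible frames, the Python sketch below blends the two channels with a fixed weight and reports grey-level entropy, one common fusion quality index.

import numpy as np

def fuse_weighted(ir, vis, w_ir=0.5):
    """Pixel-level weighted-average fusion of co-registered IR and visible frames."""
    fused = w_ir * ir.astype(np.float64) + (1.0 - w_ir) * vis.astype(np.float64)
    return np.clip(fused, 0, 255).astype(np.uint8)

def entropy(img, bins=256):
    """Shannon entropy of the grey-level histogram, a common fusion quality index."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Synthetic co-registered frames: a warm target visible only in the IR channel
rng = np.random.default_rng(2)
vis = rng.integers(0, 120, size=(240, 320)).astype(np.uint8)
ir = vis.copy(); ir[100:140, 150:200] = 250
fused = fuse_weighted(ir, vis, w_ir=0.6)
print("entropy visible / IR / fused: %.2f / %.2f / %.2f"
      % (entropy(vis), entropy(ir), entropy(fused)))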
NASA Astrophysics Data System (ADS)
Gogler, Slawomir; Bieszczad, Grzegorz; Krupinski, Michal
2013-10-01
Thermal imagers and the infrared array sensors used in them undergo a calibration procedure and an evaluation of their voltage sensitivity to incident radiation during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represents temperature variations across the scene. Detectors used in a thermal camera are illuminated by infrared radiation transmitted through an infrared-transmitting optical system. Often an optical system, when exposed to a uniform Lambertian source, forms a non-uniform irradiance distribution in its image plane. In order to carry out an accurate non-uniformity correction it is essential to correctly predict the irradiance distribution produced by a uniform source. In this article a non-uniformity correction method is presented that takes the optical system's radiometry into account. Predictions of the irradiance distribution have been confronted with measured irradiance values. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
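The paper's radiometric model is not reproduced in the abstract; as an illustration of the idea, the Python sketch below predicts the irradiance distribution for a uniform Lambertian source using the classic cos^4 off-axis roll-off and uses it to separate optics-induced shading from true detector non-uniformity before computing per-pixel gains. The pixel pitch, focal length and synthetic calibration frame are hypothetical.

import numpy as np

def relative_illumination(shape, pixel_pitch_um, focal_length_mm):
    """Predicted relative irradiance for a uniform Lambertian source, using the
    classic cos^4(theta) off-axis roll-off as a stand-in radiometric model."""
    h, w = shape
    y, x = np.indices(shape).astype(np.float64)
    r_mm = np.hypot(x - (w - 1) / 2, y - (h - 1) / 2) * pixel_pitch_um * 1e-3
    theta = np.arctan2(r_mm, focal_length_mm)      # field angle of each pixel
    return np.cos(theta) ** 4                      # normalised to 1 on axis

# Synthetic calibration frame: optical shading times per-pixel detector gain
shape = (288, 384)
predicted = relative_illumination(shape, pixel_pitch_um=17.0, focal_length_mm=25.0)
rng = np.random.default_rng(4)
true_gain = rng.normal(1.0, 0.03, shape)           # detector non-uniformity
flat = 1000.0 * predicted * true_gain              # uniform-blackbody exposure

# Divide out the predicted optical shading before deriving per-pixel gain maps,
# so the correction compensates detector non-uniformity rather than the optics.
detector_response = flat / predicted
gain_map = detector_response.mean() / detector_response
print("corner roll-off: %.3f, gain-map spread: %.4f"
      % (predicted.min(), gain_map.std()))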
Solution for the nonuniformity correction of infrared focal plane arrays.
Zhou, Huixin; Liu, Shangqian; Lai, Rui; Wang, Dabao; Cheng, Yubao
2005-05-20
Based on the S-curve model of the detector response of infrared focal plane arrays (IRFPAs), an improved two-point correction algorithm is presented. The algorithm first transforms the nonlinear image data into linear data and then uses the normal two-point algorithm to correct the linear data. The algorithm can effectively overcome the influence of the nonlinearity of the detector's response, and it improves the correction precision and enlarges the dynamic range of the response. A real-time imaging-signal-processing system for IRFPAs based on a digital signal processor and field-programmable gate arrays is also presented. The nonuniformity correction capability of the presented solution is validated by experimental imaging with a 128 x 128 pixel IRFPA camera prototype.
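A sketch of the approach, assuming a logistic-like S-curve response (the paper's exact S-curve model is not given in the abstract): the raw data are first mapped back to a flux-linear domain, and the classic two-point gain/offset correction is then applied in that domain.

import numpy as np

def linearize(raw, v_max=16383.0, alpha=4.0):
    """Invert an assumed logistic S-curve response v = v_max / (1 + exp(-alpha*f))
    to recover a quantity linear in incident flux f. The logistic form and its
    parameters stand in for the paper's S-curve response model."""
    x = np.clip(raw / v_max, 1e-6, 1 - 1e-6)
    return np.log(x / (1.0 - x)) / alpha

def two_point_nuc(frame_low, frame_high, scene):
    """Classic two-point correction applied in the linearised domain, using
    uniform-blackbody frames at a low and a high temperature."""
    low, high = linearize(frame_low), linearize(frame_high)
    gain = (high.mean() - low.mean()) / (high - low)
    offset = low.mean() - gain * low
    return gain * linearize(scene) + offset

# Synthetic demo: per-pixel gain/offset spread plus the S-curve nonlinearity
rng = np.random.default_rng(5)
g = rng.normal(1.0, 0.05, (128, 128))
o = rng.normal(0.0, 0.02, (128, 128))
s_curve = lambda f: 16383.0 / (1.0 + np.exp(-4.0 * (g * f + o)))
bb_low, bb_high, scene = s_curve(0.3), s_curve(0.7), s_curve(0.5)
corrected = two_point_nuc(bb_low, bb_high, scene)
print("non-uniformity before/after: %.4f / %.6f"
      % (scene.std() / scene.mean(), corrected.std() / corrected.mean()))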
New gonioscopy system using only infrared light.
Sugimoto, Kota; Ito, Kunio; Matsunaga, Koichi; Miura, Katsuya; Esaki, Koji; Uji, Yukitaka
2005-08-01
To describe an infrared gonioscopy system designed to observe the anterior chamber angle under natural mydriasis in a completely darkened room. An infrared light filter was used to modify the light source of the slit-lamp microscope. A television monitor connected to a CCD monochrome camera was used to indirectly observe the angle. Use of the infrared system enabled observation of the angle under natural mydriasis in a completely darkened room. Infrared gonioscopy is a useful procedure for the observation of the angle under natural mydriasis.
Near-infrared observations of the variable crab nebula
NASA Astrophysics Data System (ADS)
Yamamoto, M.; Mori, K.; Shibata, S.; Tsujimoto, M.; Misawa, T.; Burrows, D.; Kawai, N.
We present three near-infrared (NIR) observations of the Crab Nebula obtained with CISCO on the Subaru Telescope and the Quick Infrared Camera on the University of Hawaii 88-inch Telescope. The observations were performed in 2004 September, 2005 February, and 2005 October, and were coordinated with X-ray observations obtained with the Chandra X-ray Observatory within 10 days. As shown in previous optical and X-ray monitoring observations, outward-moving wisps and variable knots are detected also in our NIR observations. The NIR variations are closely correlated with variations in the X-ray observations, indicating that both variations are driven by the same physical process. We discuss the origin of the NIR-emitting particles based on the temporal variations as well as the spectral energy distributions of each variable component.
Clear New View of a Classic Spiral
NASA Astrophysics Data System (ADS)
2010-05-01
ESO is releasing a beautiful image of the nearby galaxy Messier 83 taken by the HAWK-I instrument on ESO's Very Large Telescope (VLT) at the Paranal Observatory in Chile. The picture shows the galaxy in infrared light and demonstrates the impressive power of the camera to create one of the sharpest and most detailed pictures of Messier 83 ever taken from the ground. The galaxy Messier 83 (eso0825) is located about 15 million light-years away in the constellation of Hydra (the Sea Serpent). It spans over 40 000 light-years, only 40 percent the size of the Milky Way, but in many ways is quite similar to our home galaxy, both in its spiral shape and the presence of a bar of stars across its centre. Messier 83 is famous among astronomers for its many supernovae: vast explosions that end the lives of some stars. Over the last century, six supernovae have been observed in Messier 83 - a record number that is matched by only one other galaxy. Even without supernovae, Messier 83 is one of the brightest nearby galaxies, visible using just binoculars. Messier 83 has been observed in the infrared part of the spectrum using HAWK-I [1], a powerful camera on ESO's Very Large Telescope (VLT). When viewed in infrared light most of the obscuring dust that hides much of Messier 83 becomes transparent. The brightly lit gas around hot young stars in the spiral arms is also less prominent in infrared pictures. As a result much more of the structure of the galaxy and the vast hordes of its constituent stars can be seen. This clear view is important for astronomers looking for clusters of young stars, especially those hidden in dusty regions of the galaxy. Studying such star clusters was one of the main scientific goals of these observations [2]. When compared to earlier images, the acute vision of HAWK-I reveals far more stars within the galaxy. The combination of the huge mirror of the VLT, the large field of view and great sensitivity of the camera, and the superb observing conditions at ESO's Paranal Observatory makes HAWK-I one of the most powerful near-infrared imagers in the world. Astronomers are eagerly queuing up for the chance to use the camera, which began operation in 2007 (eso0736), and to get some of the best ground-based infrared images ever of the night sky. Notes [1] HAWK-I stands for High-Acuity Wide-field K-band Imager. More technical details about the camera can be found in an earlier press release (eso0736). [2] The data used to prepare this image were acquired by a team led by Mark Gieles (University of Cambridge) and Yuri Beletsky (ESO). Mischa Schirmer (University of Bonn) performed the challenging data processing. More information ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. 
At Paranal, ESO operates the Very Large Telescope, the world's most advanced visible-light astronomical observatory and VISTA, the world's largest survey telescope. ESO is the European partner of a revolutionary astronomical telescope ALMA, the largest astronomical project in existence. ESO is currently planning a 42-metre European Extremely Large optical/near-infrared Telescope, the E-ELT, which will become "the world's biggest eye on the sky".
An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates
Hobbs, Michael T.; Brehme, Cheryl S.
2017-01-01
Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.
Using Thermal Radiation in Detection of Negative Obstacles
NASA Technical Reports Server (NTRS)
Rankin, Arturo L.; Matthies, Larry H.
2009-01-01
A method of automated detection of negative obstacles (potholes, ditches, and the like) ahead of ground vehicles at night involves processing of imagery from thermal-infrared cameras aimed at the terrain ahead of the vehicles. The method is being developed as part of an overall obstacle-avoidance scheme for autonomous and semi-autonomous offroad robotic vehicles. The method could also be applied to help human drivers of cars and trucks avoid negative obstacles -- a development that may entail only modest additional cost inasmuch as some commercially available passenger cars are already equipped with infrared cameras as aids for nighttime operation.
Liang, Kun; Yang, Cailan; Peng, Li; Zhou, Bo
2017-02-01
In uncooled long-wave IR camera systems, the temperature of the focal plane array (FPA) varies with the environmental temperature as well as the operating time. The spatial nonuniformity of the FPA, which is partly determined by the FPA temperature, changes accordingly, resulting in reduced image quality. This study presents a real-time nonuniformity correction algorithm based on FPA temperature to compensate for nonuniformity caused by FPA temperature fluctuation. First, gain coefficients are calculated using a two-point correction technique. Then offset parameters at different FPA temperatures are obtained and stored in tables. When the camera operates, the offset tables are used to update the current offset parameters via a temperature-dependent interpolation. Finally, the gain coefficients and offset parameters are used to correct the output of the IR camera in real time. The proposed algorithm is evaluated and compared with two representative shutterless algorithms [the minimizing-the-sum-of-the-squares-of-errors algorithm (MSSE) and the template-based solution algorithm (TBS)] using IR images captured by a 384×288 pixel uncooled IR camera with a 17 μm pitch. Experimental results show that this method can quickly track the response drift of the detector units when the FPA temperature changes. The correction quality of the proposed algorithm is as good as MSSE, while its processing time is as short as TBS, making it well suited for real-time operation while maintaining a strong correction effect.
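The Python sketch below illustrates the offset-table interpolation described above: offset maps calibrated at a few FPA temperatures are linearly interpolated at the current FPA temperature and combined with fixed two-point gain coefficients. The table temperatures, array shape and identity gain map are placeholders, not the paper's calibration data.

import numpy as np

class TemperatureAdaptiveNUC:
    """Non-uniformity correction whose offset map is interpolated from tables
    calibrated at several FPA temperatures, as described in the abstract above.
    Table temperatures and array shapes here are hypothetical."""

    def __init__(self, gain, table_temps, offset_tables):
        self.gain = gain                                # per-pixel gain map
        self.table_temps = np.asarray(table_temps)      # e.g. [10, 20, 30, 40] deg C
        self.offset_tables = np.asarray(offset_tables)  # shape (n_temps, H, W)

    def offsets_at(self, fpa_temp):
        """Linear interpolation between the two calibrated tables that bracket
        the current FPA temperature."""
        t = np.clip(fpa_temp, self.table_temps[0], self.table_temps[-1])
        i = np.searchsorted(self.table_temps, t)
        if i == 0:
            return self.offset_tables[0]
        w = (t - self.table_temps[i - 1]) / (self.table_temps[i] - self.table_temps[i - 1])
        return (1 - w) * self.offset_tables[i - 1] + w * self.offset_tables[i]

    def correct(self, raw_frame, fpa_temp):
        return self.gain * raw_frame + self.offsets_at(fpa_temp)

# Usage with synthetic data (384x288 array, matching the camera in the abstract)
H, W = 288, 384
nuc = TemperatureAdaptiveNUC(gain=np.ones((H, W)),
                             table_temps=[10, 20, 30, 40],
                             offset_tables=np.zeros((4, H, W)))
frame = np.random.default_rng(3).integers(0, 16383, (H, W)).astype(np.float64)
corrected = nuc.correct(frame, fpa_temp=27.5)
print("corrected frame mean:", corrected.mean())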
CATAVIÑA: new infrared camera for OAN-SPM
NASA Astrophysics Data System (ADS)
Iriarte, Arturo; Cruz-González, Irene; Martínez, Luis A.; Tinoco, Silvio; Lara, Gerardo; Ruiz, Elfego; Sohn, Erika; Bernal, Abel; Angeles, Fernando; Moreno, Arturo; Murillo, Francisco; Langarica, Rosalía; Luna, Esteban; Salas, Luis; Cajero, Vicente
2006-06-01
CATAVIÑA is a near-infrared camera system to be operated in conjunction with the existing multi-purpose near-infrared optical bench "CAMALEON" at the OAN-SPM. Observing modes include direct imaging, spectroscopy, Fabry-Perot interferometry and polarimetry. This contribution focuses on the optomechanics and detector controller of CATAVIÑA, which is planned to start operating later in 2006. The camera consists of an 8 inch LN2 dewar containing a 10-filter carousel, a radiation baffle and the detector circuit board mount. The system is based on a Rockwell 1024x1024 HgCdTe (HAWAII-I) FPA, operating in the 1 to 2.5 micron window. The detector controller/readout system was designed and developed at the UNAM Instituto de Astronomia. It is based on five Texas Instruments DSK digital signal processor (DSP) modules. One module generates the detector and ADC-system control, while the remaining four are in charge of the acquisition of each of the detector's quadrants. Each DSP has a built-in expanded memory module in order to store more than one image. The detector readout and signal driver subsystems are mounted onto the dewar in a "back-pack" fashion, each containing four independent pre-amplifiers, converters and signal drivers that communicate through fiber optics with their respective DSPs. The system allows programming of the offset input voltage and converter gain. The controller software architecture is based on a client/server model. The client sends commands through the TCP/IP protocol and acquires the image. The server consists of a microcomputer with an embedded Linux operating system, which runs the main program that receives the user commands and interacts with the timing and acquisition DSPs. The observer's interface allows for several readout and image processing modes.
Physics teaching by infrared remote sensing of vegetation
NASA Astrophysics Data System (ADS)
Schüttler, Tobias; Maman, Shimrit; Girwidz, Raimund
2018-05-01
Context- and project-based teaching has proven to foster different affective and cognitive aspects of learning. As a versatile and multidisciplinary scientific research area with diverse applications in everyday life, satellite remote sensing is an interesting context for physics education. In this paper we give a brief overview of satellite remote sensing of vegetation and explain how to obtain your own infrared remote sensing data with affordable converted digital cameras. This novel technique provides the opportunity to conduct individual remote sensing measurement projects with students in their respective environment. The data can be compared to real satellite data and are of sufficient accuracy for educational purposes.
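Converted-camera vegetation projects of this kind typically evaluate a vegetation index from the near-infrared and red bands; the abstract does not name a specific index, but the standard NDVI computation, sketched below in Python, is the usual choice for comparison with satellite products.

import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from co-registered near-infrared
    and red bands, e.g. from a converted consumer camera (NIR) and a standard
    camera (red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Synthetic example: vegetation reflects strongly in the NIR, weakly in the red
nir = np.array([[180, 40], [170, 60]], dtype=np.uint8)
red = np.array([[ 60, 35], [ 50, 55]], dtype=np.uint8)
print(ndvi(nir, red))    # high values (~0.5) indicate healthy vegetation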
Background-Limited Infrared-Submillimeter Spectroscopy (BLISS)
NASA Technical Reports Server (NTRS)
Bradford, Charles Matt
2004-01-01
The bulk of the cosmic far-infrared background light will soon be resolved into its individual sources with Spitzer, Astro-F, Herschel, and submillimeter/millimeter ground-based cameras. The sources will be dusty galaxies at z approximately equal to 1-4. The physical conditions and processes in these galaxies are directly probed with moderate-resolution spectroscopy from 20 micrometers to 1 mm. Currently, large cold telescopes are being combined with sensitive direct detectors, offering the potential for mid- to far-IR spectroscopy at the background limit (BLISS). This capability will allow routine observations of even modest high-redshift galaxies in a variety of lines. The BLISS instrument's capabilities are described in this presentation.
Apollo 9 Mission image - S0-65 Multispectral Photography - New Mexico and Texas
1969-03-12
AS09-26A-3807A (12 March 1969) --- Color infrared photograph of the Texas-New Mexico border area, between Lubbock and Roswell, taken on March 12, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey (SO65). At 11:30 a.m. (EST) when this picture was made the Apollo 9 spacecraft was at an altitude of 119 nautical miles, and the sun elevation was 38 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 33 degrees 42 minutes north latitude, and 103 degrees 1 minute west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.
More than Meets the Eye - Infrared Cameras in Open-Ended University Thermodynamics Labs
NASA Astrophysics Data System (ADS)
Melander, Emil; Haglund, Jesper; Weiszflog, Matthias; Andersson, Staffan
2016-12-01
Educational research has found that students have challenges understanding thermal science. Undergraduate physics students have difficulties differentiating basic thermal concepts, such as heat, temperature, and internal energy. Engineering students have been found to have difficulties grasping surface emissivity as a thermal material property. One potential source of students' challenges with thermal science is the lack of opportunity to visualize energy transfer in intuitive ways with traditional measurement equipment. Thermodynamics laboratories have typically depended on point measures of temperature by use of thermometers (detecting heat conduction) or pyrometers (detecting heat radiation). In contrast, thermal imaging by means of an infrared (IR) camera provides a real-time, holistic image. Here we provide some background on IR cameras and their uses in education, and summarize five qualitative investigations that we have used in our courses.
Apollo 9 Mission image - S0-65 Multispectral Photography - Georgia
2009-02-19
AS09-26A-3816A (12 March 1969) --- Color infrared photograph of the Atlantic coast of Georgia, Brunswick area, taken on March 12, 1969, by one of the four synchronized cameras of the Apollo 9 Earth Resources Survey (SO65) Experiment. At 11:35 a.m. (EST) when this picture was made the Apollo 9 spacecraft was at an altitude of 102 nautical miles, and the sun elevation was 51 degrees above the horizon. The location of the point on Earth's surface at which the four-camera combination was aimed was 31 degrees 16 minutes north latitude, and 81 degrees 17 minutes west longitude. The other three cameras used: (B) black and white film with a red filter; (C) black and white infrared film; and (D) black and white film with a green filter.
Diffraction experiments with infrared remote controls
NASA Astrophysics Data System (ADS)
Kuhn, Jochen; Vogt, Patrik
2012-02-01
In this paper we describe an experiment in which radiation emitted by an infrared remote control is passed through a diffraction grating. An image of the diffraction pattern is captured using a cell phone camera and then used to determine the wavelength of the radiation.
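The wavelength follows from the grating equation d sin(theta) = m*lambda, with the diffraction angle inferred from the geometry measured on the cell phone image. The Python sketch below shows the calculation; the grating pitch and measured distances are placeholder values, not those of the experiment.

import math

def wavelength_from_grating(lines_per_mm, distance_to_screen_m, offset_m, order=1):
    """Grating equation d*sin(theta) = m*lambda, with theta inferred from the
    measured lateral offset of the m-th order maximum on the screen/image."""
    d = 1e-3 / lines_per_mm                       # grating period in metres
    theta = math.atan2(offset_m, distance_to_screen_m)
    return d * math.sin(theta) / order

# Placeholder geometry: 600 lines/mm grating, screen 0.30 m away, first-order
# spot measured 0.20 m off axis on the camera image
lam = wavelength_from_grating(600, 0.30, 0.20)
print(f"estimated wavelength: {lam*1e9:.0f} nm")  # ~920 nm, typical of an IR LED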
2001 Mars Odyssey Images Earth (Visible and Infrared)
NASA Technical Reports Server (NTRS)
2001-01-01
2001 Mars Odyssey's Thermal Emission Imaging System (THEMIS) acquired these images of the Earth using its visible and infrared cameras as it left the Earth. The visible image shows the thin crescent viewed from Odyssey's perspective. The infrared image was acquired at exactly the same time, but shows the entire Earth using the infrared's 'night-vision' capability. In visible light the instrument sees only reflected sunlight and therefore sees nothing on the night side of the planet. In infrared light the camera observes the light emitted by all regions of the Earth. The coldest ground temperatures seen correspond to the nighttime regions of Antarctica; the warmest temperatures occur in Australia. The low temperature in Antarctica is minus 50 degrees Celsius (minus 58 degrees Fahrenheit); the high temperature at night in Australia is 9 degrees Celsius (48.2 degrees Fahrenheit). These temperatures agree remarkably well with observed temperatures of minus 63 degrees Celsius at Vostok Station in Antarctica, and 10 degrees Celsius in Australia. The images were taken at a distance of 3,563,735 kilometers (more than 2 million miles) on April 19, 2001 as the Odyssey spacecraft left Earth.
Strategic options towards an affordable high-performance infrared camera
NASA Astrophysics Data System (ADS)
Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.
2016-05-01
The promise of infrared (IR) imaging attaining the low cost associated with the success of CMOS sensors has been hampered by the inability to achieve the cost advantages necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm, despite well documented advantages. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640x512 pixel InGaAs uncooled system with high sensitivity and low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second (FPS)) at full resolution, and low power consumption (<1 W) in a compact system. This camera paves the way towards mass-market adoption not only by demonstrating the high-performance IR imaging capability demanded by military and industrial applications, but also by illuminating a path towards the justifiable price points essential for adoption in consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably towards automation, readout electronics compatible with multiple focal plane arrays, and dense or ultra-small pixel pitch devices.
A near-Infrared SETI Experiment: Alignment and Astrometric precision
NASA Astrophysics Data System (ADS)
Duenas, Andres; Maire, Jerome; Wright, Shelley; Drake, Frank D.; Marcy, Geoffrey W.; Siemion, Andrew; Stone, Remington P. S.; Tallis, Melisa; Treffers, Richard R.; Werthimer, Dan
2016-06-01
Beginning in March 2015, a Near-InfraRed Optical SETI (NIROSETI) instrument, aiming to search for fast nanosecond laser pulses, has been commissioned on the Nickel 1 m telescope at Lick Observatory. The NIROSETI instrument makes use of an optical guide camera, a SONY ICX694 CCD from PointGrey, to align selected sources onto two 200 µm near-infrared Avalanche Photodiodes (APDs) with a field of view of 2.5"x2.5" each. These APD detectors operate at very high bandwidths and are able to detect pulse widths extending down into the nanosecond range. Aligning sources onto these relatively small detectors requires characterizing the guide camera plate scale, the static optical distortion solution, and the relative orientation with respect to the APD detectors. We determined the guide camera plate scale to be 55.9 ± 2.7 milliarcseconds/pixel and the magnitude limit to be 18.15 mag (+1.07/-0.58) in V-band. We will present the full distortion solution of the guide camera, its orientation, and our alignment method between the camera and the two APDs, and will discuss target selection within the NIROSETI observational campaign, including coordination with Breakthrough Listen.
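A plate scale of this kind can be estimated from the measured pixel separation of catalogued star pairs of known angular separation, as in the Python sketch below; the pixel coordinates and separation used here are hypothetical and merely chosen to give a value of the same order as the one reported.

import numpy as np

def plate_scale(pix_xy_1, pix_xy_2, sep_arcsec):
    """Plate scale (arcsec/pixel) from the measured pixel positions of two
    catalogued stars with known angular separation."""
    pix_sep = np.hypot(pix_xy_2[0] - pix_xy_1[0], pix_xy_2[1] - pix_xy_1[1])
    return sep_arcsec / pix_sep

# Hypothetical measurement: two stars 30.0 arcsec apart on the sky, measured
# 536.7 pixels apart on the guide camera image
scale = plate_scale((412.3, 518.9), (901.1, 740.2), sep_arcsec=30.0)
print(f"plate scale: {scale*1000:.1f} mas/pixel")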
Integrated Lloyd's mirror on planar waveguide facet as a spectrometer.
Morand, Alain; Benech, Pierre; Gri, Martine
2017-12-10
A low-cost and simple Fourier transform spectrometer based on the Lloyd's mirror configuration is proposed in order to obtain a very stable interferogram. A planar waveguide coupled to a fiber injection is used to spatially disperse the optical beam. A second beam, superposed on the first, is obtained by total reflection of the incident beam on a vertical glass face integrated in the chip by dicing with a specific circular precision saw. The interferogram at the waveguide output is imaged onto a near-infrared camera with an objective lens. The contrast and the fringe period thus depend on the fiber type and position and can be optimized to the pixel size and the length of the camera. A spectral resolution close to λ/Δλ=80 is reached with a camera with 320 pixels of 25 μm width in a wavelength range from the O to the L band.
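The spectrum is recovered from the spatial interferogram by a Fourier transform, with the spectral resolution set by the maximum optical path difference sampled across the 320 pixels. The Python sketch below illustrates the recovery on a synthetic monochromatic input; the optical path difference increment per pixel is an assumed value, not the device's actual fringe geometry.

import numpy as np

def spectrum_from_interferogram(interferogram, opd_step_m):
    """Recover a spectrum from a spatially sampled interferogram by Fourier
    transform. opd_step_m is the optical path difference increment per pixel,
    set by the fringe geometry of the Lloyd's mirror interferometer."""
    signal = interferogram - interferogram.mean()             # remove DC term
    amplitude = np.abs(np.fft.rfft(signal))
    wavenumber = np.fft.rfftfreq(signal.size, d=opd_step_m)   # cycles per metre
    return wavenumber, amplitude

# Synthetic example: monochromatic 1550 nm input sampled on 320 pixels whose
# OPD increases by 0.5 um per pixel (hypothetical geometry)
opd = np.arange(320) * 0.5e-6
interferogram = 1 + np.cos(2 * np.pi * opd / 1550e-9)
sigma, amp = spectrum_from_interferogram(interferogram, 0.5e-6)
peak = sigma[amp.argmax()]
print(f"recovered wavelength: {1/peak*1e9:.0f} nm")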
Cryogenic solid Schmidt camera as a base for future wide-field IR systems
NASA Astrophysics Data System (ADS)
Yudin, Alexey N.
2011-11-01
This work studies the capability of a solid Schmidt camera to serve as a wide-field infrared lens for an aircraft system with whole-sphere coverage, working in the 8-14 um spectral range and coupled with a spherical focal array of megapixel class. Designs of a 16 mm f/0.2 lens with 60 and 90 degree sensor diagonals are presented, and their image quality is compared with a conventional solid design. An achromatic design with significantly improved performance, containing an enclosed soft correcting lens behind the protective front lens, is proposed. One of the main goals of the work is to estimate the benefits of curved detector arrays in wide-field systems for the 8-14 um spectral range. Coupling of the photodetector with the solid Schmidt camera by means of frustrated total internal reflection is considered, with the corresponding tolerance analysis. The whole lens, except the front element, is considered to be cryogenic, with the solid Schmidt unit to be flown by hydrogen for improvement of bulk transmission.
NASA Astrophysics Data System (ADS)
Colbert, Fred
2013-05-01
There has been a significant increase in the number of in-house infrared thermographic predictive maintenance programs for electrical/mechanical inspections as compared to out-sourced programs using hired consultants. In addition, the number of infrared consulting companies offering out-sourced programs has also grown exponentially. These market segments include building envelope (commercial and residential), refractory, boiler evaluations, and others. These surges are driven by two main factors: 1. The low cost of investment in the equipment (the cost of cameras and peripherals continues to decline). 2. Novel marketing campaigns by the camera manufacturers, who are looking to sell more cameras into an otherwise saturated market. The key characteristic of these campaigns is to oversimplify the applications and understate the significance of the technical training, specific skills and experience needed to obtain the risk-lowering information that a facility manager needs. These camera-selling campaigns focus on the simplicity of taking a thermogram, but ignore the critical factors of what it takes to actually perform and manage a credible, valid IR program, which in turn exposes everyone to tremendous liability. As in-house programs and out-sourced consulting services compete head to head for share of a constricted market, the price of out-sourced consulting services drops. The consequence of this approach is that something must be compromised to stay competitive on price, and that compromise is the knowledge, technical skill and experience of the thermographer. This compromise also ends up being reflected in the skill sets of the in-house thermographer. This oversimplification of skill and experience is producing the "perfect storm" for infrared thermography, for both in-house and out-sourced programs.
NASA Astrophysics Data System (ADS)
Romanishkin, Igor D.; Grachev, Pavel V.; Pominova, Daria V.; Burmistrov, Ivan A.; Sildos, Ilmo; Vanetsev, Alexander S.; Orlovskaya, Elena O.; Orlovskii, Yuri V.; Loschenov, Victor B.; Ryabova, Anastasia V.
2018-04-01
In this work we investigated the use of composite crystalline core/shell nanoparticles LaF3:Nd3+(1%)@DyPO4 for fluorescence-based contactless thermometry, as well as the laser-induced hyperthermia effect in an optical model of biological tissue with a modeled neoplasm. In preparation for this, a thermal calibration of the nanoparticles' luminescence spectra was carried out. The results of the spectroscopic temperature measurement were compared to infrared thermal camera measurements. The comparison showed a significant difference between the temperature recorded with the IR camera and the actual temperature of the nanoparticles at depth in the tissue model: the temperature calculated using the spectral method was up to 10 °C higher.
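As an illustration of how a luminescence-spectrum calibration can be turned into a temperature readout, the sketch below inverts a Boltzmann-type two-band intensity ratio. The calibration constants A_CAL and DELTA_E are hypothetical placeholders, not the calibration reported for LaF3:Nd3+@DyPO4.

```python
import numpy as np

# Hypothetical two-band luminescence-intensity-ratio (LIR) thermometry sketch.
# The Boltzmann form R(T) = A * exp(-dE / (k_B * T)) and the constants A_CAL
# and DELTA_E are assumptions for illustration only.
K_B = 8.617e-5          # Boltzmann constant, eV/K
A_CAL = 2.5             # pre-exponential factor from a (hypothetical) calibration
DELTA_E = 0.12          # energy gap between the two emitting levels, eV (assumed)

def temperature_from_ratio(i_band_1, i_band_2):
    """Convert the measured intensity ratio of two emission bands to a
    temperature (K) by inverting the Boltzmann calibration curve."""
    ratio = i_band_1 / i_band_2
    return DELTA_E / (K_B * np.log(A_CAL / ratio))

print(temperature_from_ratio(1.0, 2.0))   # example ratio -> temperature in K
```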
Practical aspects of modern interferometry for optical manufacturing quality control: Part 2
NASA Astrophysics Data System (ADS)
Smythe, Robert
2012-07-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space based satellite imaging and DVD and Blu-Ray disks are all enabled by phase shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful towards the practical use of interferometers. An understanding of the parameters that drive system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Practical aspects of modern interferometry for optical manufacturing quality control, Part 3
NASA Astrophysics Data System (ADS)
Smythe, Robert A.
2012-09-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space-based satellite imaging, and DVD and Blu-Ray disks are all enabled by phase-shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful toward the practical use of interferometers. An understanding of the parameters that drive the system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Small Unmanned Aerial Vehicles; DHS’s Answer to Border Surveillance Requirements
2013-03-01
... of more than 4,000 illegal aliens, including the seizure of more than 15,000 pounds of marijuana. In addition to the Predator UAVs being ... payload includes two color video cameras, an infrared camera that offers night vision capability, and synthetic aperture radar that provides high ...
1996-01-01
... used to locate and characterize a magnetic dipole source, and this finding accelerated the development of superconducting tensor gradiometers for ... superconducting magnetic field gradiometer, two-color infrared camera, synthetic aperture radar, and a visible spectrum camera. The combination of these ...
NASA Astrophysics Data System (ADS)
Nugent, P. W.; Shaw, J. A.; Piazzolla, S.
2013-02-01
The continuous demand for high data return in deep space and near-Earth satellite missions has led NASA and international institutions to consider alternative technologies for high-data-rate communications. One solution is the establishment of wide-bandwidth Earth-space optical communication links, which require (among other things) a nearly obstruction-free atmospheric path. Considering the atmospheric channel, the most common and most apparent impairments on Earth-space optical communication paths arise from clouds. Therefore, the characterization of the statistical behavior of cloud coverage for optical communication ground station candidate sites is of vital importance. In this article, we describe the development and deployment of a ground-based, long-wavelength infrared cloud imaging system able to monitor and characterize the cloud coverage. This system is based on a commercially available camera with a 62-deg diagonal field of view. A novel internal-shutter-based calibration technique allows radiometric calibration of the camera, which operates without a thermoelectric cooler. This cloud imaging system provides continuous day-night cloud detection with constant sensitivity. The cloud imaging system also includes data-processing algorithms that calculate and remove atmospheric emission to isolate cloud signatures, and enable classification of clouds according to their optical attenuation. Measurements of long-wavelength infrared cloud radiance are used to retrieve the optical attenuation (cloud optical depth due to absorption and scattering) in the wavelength range of interest from visible to near-infrared, where the cloud attenuation is quite constant. This article addresses the specifics of the operation, calibration, and data processing of the imaging system that was deployed at the NASA/JPL Table Mountain Facility (TMF) in California. Data are reported from July 2008 to July 2010. These data describe seasonal variability in cloud cover at the TMF site, with cloud amount (percentage of cloudy pixels) peaking at just over 51 percent during February, of which more than 60 percent had optical attenuation exceeding 12 dB at wavelengths in the range from the visible to the near-infrared. The lowest cloud amount was found during August, averaging 19.6 percent, and these clouds were mostly optically thin, with low attenuation.
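The step from calibrated cloud radiance to an attenuation class can be illustrated with a short sketch. The mapping below, including the thick-cloud reference radiance and the 12 dB class boundary, is an assumed simplification rather than the deployed algorithm.

```python
import numpy as np

# Minimal sketch (not the authors' algorithm): once the clear-sky atmospheric
# emission has been removed, the residual LWIR cloud radiance can be mapped to
# a transmittance estimate and then to attenuation in dB for classification.
# The saturation radiance `l_thick` of an optically thick cloud is an assumed
# calibration input.

def cloud_attenuation_db(cloud_radiance, l_thick):
    """Approximate optical attenuation (dB) from residual cloud radiance,
    assuming radiance scales with cloud emissivity (1 - transmittance)."""
    transmittance = np.clip(1.0 - cloud_radiance / l_thick, 1e-6, 1.0)
    return -10.0 * np.log10(transmittance)

def classify(att_db):
    """Bin pixels into clear / thin / thick cloud classes (assumed thresholds)."""
    return np.digitize(att_db, bins=[0.5, 12.0])   # 0 = clear, 1 = thin, 2 = thick (>12 dB)
```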
Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker
Naqvi, Rizwan Ali; Arsalan, Muhammad; Park, Kang Ryoung
2017-01-01
Gaze-based interaction (GBI) techniques have been a popular subject of research in the last few decades. Among other applications, GBI can be used by persons with disabilities to perform everyday tasks, as a game interface, and can play a pivotal role in the human computer interface (HCI) field. While gaze tracking systems have shown high accuracy in GBI, detecting a user's gaze for target selection is a challenging problem that needs to be considered while using a gaze detection system. Past research has used the blinking of the eyes for this purpose as well as dwell-time-based methods, but these techniques are either inconvenient for the user or require a long time for target selection. Therefore, in this paper, we propose a method for fuzzy system-based target selection for near-infrared (NIR) camera-based gaze trackers. The results of experiments performed in addition to tests of the usability and on-screen keyboard use of the proposed method show that it is better than previous methods. PMID:28420114
Infrared Instrument for Detecting Hydrogen Fires
NASA Technical Reports Server (NTRS)
Youngquist, Robert; Ihlefeld, Curtis; Immer, Christopher; Oostdyk, Rebecca; Cox, Robert; Taylor, John
2006-01-01
The figure shows an instrument incorporating an infrared camera for detecting small hydrogen fires. The instrument has been developed as an improved replacement for prior infrared and ultraviolet instruments used to detect hydrogen fires. The need for this or any such instrument arises because hydrogen fires (e.g., those associated with leaks from tanks, valves, and ducts) pose a great danger, yet they emit so little visible light that they are mostly undetectable by the unaided human eye. The main performance advantage offered by the present instrument over prior hydrogen-fire-detecting instruments lies in its greater ability to avoid false alarms by discriminating against reflected infrared light, including that originating in (1) the Sun, (2) welding torches, and (3) deliberately ignited hydrogen flames (e.g., ullage-burn-off flames) that are nearby but outside the field of view intended to be monitored by the instrument. Like prior such instruments, this instrument is based mostly on the principle of detecting infrared emission above a threshold level. However, in addition, this instrument utilizes information on the spatial distribution of infrared light from a source that it detects. Because the combination of spatial and threshold information about a flame tends to constitute a unique signature that differs from that of reflected infrared light originating in a source not in the field of view, the incidence of false alarms is reduced substantially below that of related prior threshold-based instruments.
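A minimal sketch of combining a threshold test with spatial-distribution information is given below; the threshold, region-size and extent limits are invented for illustration and do not reproduce the instrument's actual discrimination logic.

```python
import numpy as np
from scipy import ndimage

# Illustrative sketch only (not the flight instrument's algorithm): combine a
# radiometric threshold with simple spatial-distribution features so that a
# compact, structured flame signature is distinguished from diffuse reflected
# infrared light. All numeric limits are assumptions.

def detect_flame(ir_frame, intensity_threshold=0.8, min_pixels=20, max_extent=0.4):
    hot = ir_frame > intensity_threshold          # threshold test
    labels, n = ndimage.label(hot)                # connected hot regions
    for region in range(1, n + 1):
        mask = labels == region
        size = mask.sum()
        rows, cols = np.nonzero(mask)
        extent = max(rows.ptp(), cols.ptp()) / max(ir_frame.shape)
        # A flame candidate: enough hot pixels, but not smeared across the
        # frame the way broad reflections tend to be.
        if size >= min_pixels and extent <= max_extent:
            return True
    return False
```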
Field trials for determining the visible and infrared transmittance of screening smoke
NASA Astrophysics Data System (ADS)
Sánchez Oliveros, Carmen; Santa-María Sánchez, Guillermo; Rosique Pérez, Carlos
2009-09-01
In order to evaluate the concealment capability of smoke, the Countermeasures Laboratory of the Institute of Technology "Marañosa" (ITM) has carried out a set of tests to measure the transmittance of multispectral smoke tins in several bands of the electromagnetic spectrum. The smoke composition, based on red phosphorus, has been developed and patented by this laboratory as part of a projectile development. The smoke transmittance was measured by means of thermography as well as spectroradiometry. Black bodies and halogen lamps were used as infrared and visible sources of radiation. The measurements were carried out in June 2008 at the Marañosa field (Spain) with two MWIR cameras, two LWIR cameras, one CCD visible camera, one CVF IR spectroradiometer covering the interval 1.5 to 14 microns, and one silicon-array spectroradiometer for the 0.2 to 1.1 μm spectral range. The transmittance and dimensions of the smoke screen were characterized in the visible band and in the MWIR (3-5 μm) and LWIR (8-12 μm) regions. The size of the screen was about 30 meters wide and 5 meters high. The transmittances were about 0.3 in the IR bands and better than 0.1 in the visible band. The screens proved effective over their persistence time in all of the tests. The results obtained from the imaging and non-imaging systems were in good agreement. Meteorological conditions during the tests, such as wind speed, are decisive for the use of this kind of optical countermeasure.
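A common way to turn such imagery into transmittance is a ratio of background-subtracted source signals measured with and without the screen; the sketch below assumes this scheme and is not necessarily the exact reduction used in the trials.

```python
import numpy as np

# Hedged sketch of a two-image transmittance estimate for a screening-smoke
# trial: a blackbody (or lamp) is imaged through the screen and compared with
# a clear-path reference frame, after subtracting the background radiance.
# Variable names and the background-subtraction scheme are assumptions.

def screen_transmittance(frame_smoke, frame_clear, frame_background):
    """Per-pixel transmittance of the smoke screen over the source region."""
    signal_smoke = frame_smoke - frame_background
    signal_clear = frame_clear - frame_background
    safe_clear = np.where(signal_clear == 0, np.nan, signal_clear)
    return np.clip(signal_smoke / safe_clear, 0.0, 1.0)
```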
NASA Astrophysics Data System (ADS)
Croll, Bryce; Albert, Loic; Jayawardhana, Ray; Cushing, Michael; Moutou, Claire; Lafreniere, David; Johnson, John Asher; Bonomo, Aldo S.; Deleuil, Magali; Fortney, Jonathan
2015-03-01
We present detections of the near-infrared thermal emission of three hot Jupiters and one brown dwarf using the Wide-field Infrared Camera (WIRCam) on the Canada-France-Hawaii Telescope (CFHT). These include Ks-band secondary eclipse detections of the hot Jupiters WASP-3b and Qatar-1b and the brown dwarf KELT-1b. We also report Y-band, K CONT-band, and two new and one reanalyzed Ks-band detections of the thermal emission of the hot Jupiter WASP-12b. We present a new reduction pipeline for CFHT/WIRCam data, which is optimized for high precision photometry. We also describe novel techniques for constraining systematic errors in ground-based near-infrared photometry, so as to return reliable secondary eclipse depths and uncertainties. We discuss the noise properties of our ground-based photometry for wavelengths spanning the near-infrared (the YJHK bands), for faint and bright stars, and for the same object on several occasions. For the hot Jupiters WASP-3b and WASP-12b we demonstrate the repeatability of our eclipse depth measurements in the Ks band; we therefore place stringent limits on the systematics of ground-based, near-infrared photometry, and also rule out violent weather changes in the deep, high pressure atmospheres of these two hot Jupiters at the epochs of our observations. Based on observations obtained with WIRCam, a joint project of Canada-France-Hawaii Telescope (CFHT), Taiwan, Korea, Canada, France, at the CFHT, which is operated by the National Research Council (NRC) of Canada, the Institute National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France, and the University of Hawaii.
Status of the JWST Science Instrument Payload
NASA Technical Reports Server (NTRS)
Greenhouse, Matt
2016-01-01
The James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) system consists of five sensors (four of them science instruments): the Mid-Infrared Instrument (MIRI), the Near Infrared Imager and Slitless Spectrograph (NIRISS), the Fine Guidance Sensor (FGS), the Near InfraRed Camera (NIRCam), and the Near InfraRed Spectrograph (NIRSpec); and nine instrument support systems: the Optical Metering Structure System, Electrical Harness System, Harness Radiator System, ISIM Electronics Compartment, ISIM Remote Services Unit, Cryogenic Thermal Control System, Command and Data Handling System, Flight Software System, and Operations Scripts System.
CIRCE: The Canarias InfraRed Camera Experiment for the Gran Telescopio Canarias
NASA Astrophysics Data System (ADS)
Eikenberry, Stephen S.; Charcos, Miguel; Edwards, Michelle L.; Garner, Alan; Lasso-Cabrera, Nestor; Stelter, Richard D.; Marin-Franch, Antonio; Raines, S. Nicholas; Ackley, Kendall; Bennett, John G.; Cenarro, Javier A.; Chinn, Brian; Donoso, H. Veronica; Frommeyer, Raymond; Hanna, Kevin; Herlevich, Michael D.; Julian, Jeff; Miller, Paola; Mullin, Scott; Murphey, Charles H.; Packham, Chris; Varosi, Frank; Vega, Claudia; Warner, Craig; Ramaprakash, A. N.; Burse, Mahesh; Punnadi, Sunjit; Chordia, Pravin; Gerarts, Andreas; Martín, Héctor De Paz; Calero, María Martín; Scarpa, Riccardo; Acosta, Sergio Fernandez; Sánchez, William Miguel Hernández; Siegel, Benjamin; Pérez, Francisco Francisco; Martín, Himar D. Viera; Losada, José A. Rodríguez; Nuñez, Agustín; Tejero, Álvaro; González, Carlos E. Martín; Rodríguez, César Cabrera; Sendra, Jordi Molgó; Rodriguez, J. Esteban; Cáceres, J. Israel Fernádez; García, Luis A. Rodríguez; Lopez, Manuel Huertas; Dominguez, Raul; Gaggstatter, Tim; Lavers, Antonio Cabrera; Geier, Stefan; Pessev, Peter; Sarajedini, Ata; Castro-Tirado, A. J.
The Canarias InfraRed Camera Experiment (CIRCE) is a near-infrared (1-2.5 μm) imager, polarimeter and low-resolution spectrograph operating as a visitor instrument for the Gran Telescopio Canarias (GTC) 10.4-m telescope. It was designed and built largely by graduate students and postdocs, with help from the University of Florida (UF) astronomy engineering group, and is funded by UF and the US National Science Foundation. CIRCE is intended to help fill the gap in near-infrared capabilities prior to the arrival of the Espectrógrafo Multiobjeto Infrarrojo (EMIR) at the GTC, and will also provide the following scientific capabilities to complement EMIR after its arrival: high-resolution imaging, narrowband imaging, high-time-resolution photometry, imaging polarimetry, and low-resolution spectroscopy. In this paper, we review the design, fabrication, integration, lab testing, and on-sky performance results for CIRCE. These include a novel approach to the opto-mechanical design, fabrication, and alignment.
Statistical Analysis of an Infrared Thermography Inspection of Reinforced Carbon-Carbon
NASA Technical Reports Server (NTRS)
Comeaux, Kayla
2011-01-01
Each piece of flight hardware being used on the shuttle must be analyzed and pass NASA requirements before the shuttle is ready for launch. One tool used to detect cracks that lie within flight hardware is Infrared Flash Thermography. This is a non-destructive testing technique which uses an intense flash of light to heat up the surface of a material after which an Infrared camera is used to record the cooling of the material. Since cracks within the material obstruct the natural heat flow through the material, they are visible when viewing the data from the Infrared camera. We used Ecotherm, a software program, to collect data pertaining to the delaminations and analyzed the data using Ecotherm and University of Dayton Log Logistic Probability of Detection (POD) Software. The goal was to reproduce the statistical analysis produced by the University of Dayton software, by using scatter plots, log transforms, and residuals to test the assumption of normality for the residuals.
Generative technique for dynamic infrared image sequences
NASA Astrophysics Data System (ADS)
Zhang, Qian; Cao, Zhiguo; Zhang, Tianxu
2001-09-01
The generation of dynamic infrared image sequences is discussed in this paper. Because an infrared sensor differs from a CCD camera in its imaging mechanism, it forms the infrared image by receiving the infrared radiation of the scene (including target and background). Infrared imaging is strongly affected by atmospheric radiation, environmental radiation, and the attenuation of radiation during atmospheric transfer. Therefore, the paper first analyzes the imaging influence of these radiation sources and provides the radiation calculation formulas, treating passive and active scenes separately. The calculation methods for the passive scene are then given, and the roles of the scene model, the atmospheric transmission model, and the material physical attribute databases are explained. Next, based on the infrared imaging model, the design concept, implementation approach, and software framework of the infrared image sequence simulation software on an SGI workstation are introduced. Following this approach, an example of simulated infrared image sequences is presented, using sea and sky as the background, a warship as the target, and an aircraft as the viewpoint. Finally, the simulation is evaluated and an improvement scheme is proposed.
Auto-converging stereo cameras for 3D robotic tele-operation
NASA Astrophysics Data System (ADS)
Edmondson, Richard; Aycock, Todd; Chenault, David
2012-06-01
Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow for operator control of the camera convergence angle adjustment. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an Automatic Convergence algorithm in a field programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than adjustment of the vision system. The autoconvergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
NASA Astrophysics Data System (ADS)
Gutschwager, Berndt; Hollandt, Jörg
2017-01-01
We present a novel method of nonuniformity correction (NUC) for infrared cameras and focal plane arrays (FPA) over a wide optical spectral range, based on reading radiance temperatures and applying a radiation source with an unknown and spatially nonhomogeneous radiance temperature distribution. The benefit of this novel method is that it works with the display and calculation of radiance temperatures, it can be applied to radiation sources of arbitrary spatial radiance temperature distribution, and it only requires sufficient temporal stability of this distribution during the measurement process. In contrast, an earlier method was based on the reading of monitored radiance values. Both methods rely on recording several (at least three) images of a radiation source with purposeful row- and line-shifts of the subsequent images relative to the first, primary image. The mathematical procedure is explained in detail. Its numerical verification with a source of predefined nonhomogeneous radiance temperature distribution and a thermal imager of predefined nonuniform FPA responsivity is presented.
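The shift-based idea can be illustrated with a minimal numpy sketch that estimates per-pixel offsets from two frames of a temporally stable scene recorded with a known row shift; it assumes a pure offset non-uniformity and omits the radiance-temperature handling and the multi-image solution of the published method.

```python
import numpy as np

def estimate_offsets(img_ref, img_shifted, shift_rows=1):
    """Estimate per-pixel offset non-uniformity from two frames of the same
    (temporally stable) scene, the second recorded with the view shifted by
    `shift_rows` rows. Hypothetical sketch: pixel (r, c) in the shifted frame
    views the scene element that pixel (r - shift_rows, c) viewed in the
    reference frame, so the reading difference equals the offset difference."""
    rows, cols = img_ref.shape
    offsets = np.zeros((rows, cols))
    # Integrate the offset differences down each column, taking the first
    # `shift_rows` rows as the (arbitrary) reference.
    for r in range(shift_rows, rows):
        diff = img_shifted[r, :].astype(float) - img_ref[r - shift_rows, :].astype(float)
        offsets[r, :] = offsets[r - shift_rows, :] + diff
    return offsets - offsets.mean()   # remove the global offset ambiguity
```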
Fast reconstruction of optical properties for complex segmentations in near infrared imaging
NASA Astrophysics Data System (ADS)
Jiang, Jingjing; Wolf, Martin; Sánchez Majos, Salvador
2017-04-01
The intrinsic ill-posed nature of the inverse problem in near infrared imaging makes the reconstruction of fine details of objects deeply embedded in turbid media challenging even for the large amounts of data provided by time-resolved cameras. In addition, most reconstruction algorithms for this type of measurements are only suitable for highly symmetric geometries and rely on a linear approximation to the diffusion equation since a numerical solution of the fully non-linear problem is computationally too expensive. In this paper, we will show that a problem of practical interest can be successfully addressed making efficient use of the totality of the information supplied by time-resolved cameras. We set aside the goal of achieving high spatial resolution for deep structures and focus on the reconstruction of complex arrangements of large regions. We show numerical results based on a combined approach of wavelength-normalized data and prior geometrical information, defining a fully parallelizable problem in arbitrary geometries for time-resolved measurements. Fast reconstructions are obtained using a diffusion approximation and Monte-Carlo simulations, parallelized in a multicore computer and a GPU respectively.
Estimating Clothing Thermal Insulation Using an Infrared Camera
Lee, Jeong-Hoon; Kim, Young-Keun; Kim, Kyung-Soo; Kim, Soohyun
2016-01-01
In this paper, a novel algorithm for estimating clothing insulation is proposed to assess thermal comfort, based on non-contact, real-time measurements of the face and clothing temperatures by an infrared camera. The proposed method can accurately measure the clothing insulation of various garments under different clothing fits and sitting postures. A paired t-test at the 99% confidence level shows that the proposed method measures clothing insulation effectively under different seasonal clothing conditions. Temperatures simulated with the estimated insulation values are closer to the actual temperatures than those simulated with individual clothing insulation values: in indoor working scenarios the upper-clothing temperature is accurate to within 3% error and the lower-clothing temperature to within 3.7%-6.2% error. Because it estimates clothing insulation from the measured face and clothing temperatures, the proposed algorithm can account for the effect of the air layer, which changes the insulation. In the future, the proposed method is expected to be applied to evaluate customized passenger comfort effectively. PMID:27005625
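A hedged illustration of how non-contact temperatures can yield an insulation estimate is a steady-state heat balance across the clothing layer; the combined surface coefficient H and the use of face temperature as a skin-temperature proxy are assumptions, not the paper's exact model.

```python
# Hedged sketch: the heat conducted through the clothing is balanced against
# the heat released from the clothing surface. H and the skin-temperature
# proxy are assumptions for illustration only.

H = 9.0   # combined convective + radiative surface coefficient, W/(m^2 K), assumed

def clothing_insulation_clo(t_skin, t_clothing, t_air):
    """Estimate clothing insulation in clo units (1 clo = 0.155 m^2 K / W)."""
    q_surface = H * (t_clothing - t_air)            # W/m^2 leaving the clothing surface
    r_cl = (t_skin - t_clothing) / q_surface        # thermal resistance of the clothing
    return r_cl / 0.155

# Example: light indoor clothing (values are illustrative)
print(clothing_insulation_clo(t_skin=34.0, t_clothing=30.0, t_air=24.0))
```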
Iwasaki, Yoichiro; Misumi, Masato; Nakamiya, Toshiyuki
2015-01-01
To realize road traffic flow surveillance under various environments, including poor visibility conditions, we have previously proposed two vehicle detection methods using thermal images taken with an infrared thermal camera. The first method uses pattern recognition of the windshields and their surroundings to detect vehicles. However, the first method loses detection accuracy in the winter season. To maintain high vehicle detection accuracy in all seasons, we developed the second method, which uses the tires' thermal energy reflection areas on the road as the detection targets. The second method did not achieve high detection accuracy for vehicles on the left-hand and right-hand lanes other than the two center lanes. Therefore, we have developed a new method based on the second method to increase the vehicle detection accuracy. This paper proposes the new method and shows that the detection accuracy for vehicles on all lanes is 92.1%. Therefore, by combining the first method and the new method, high vehicle detection accuracy is maintained under various environments, and road traffic flow surveillance can be realized. PMID:25763384
A Wide-field Camera and Fully Remote Operations at the Wyoming Infrared Observatory
NASA Astrophysics Data System (ADS)
Findlay, Joseph R.; Kobulnicky, Henry A.; Weger, James S.; Bucher, Gerald A.; Perry, Marvin C.; Myers, Adam D.; Pierce, Michael J.; Vogel, Conrad
2016-11-01
Upgrades at the 2.3 meter Wyoming Infrared Observatory telescope have provided the capability for fully remote operations by a single operator from the University of Wyoming campus. A line-of-sight 300 Mbit/s, 11 GHz radio link provides high-speed internet for data transfer and remote operations that include several realtime video feeds. Uninterruptible power is ensured by a 10 kVA battery supply for critical systems and a 55 kW autostart diesel generator capable of running the entire observatory for up to a week. The construction of a new four-element prime-focus corrector with fused-silica elements allows imaging over a 40′ field of view with a new 4096 x 4096 pixel UV-sensitive prime-focus camera and filter wheel. A new telescope control system facilitates the remote operations model and provides 20″ rms pointing over the usable sky. Taken together, these improvements pave the way for a new generation of sky surveys supporting space-based missions and flexible-cadence observations advancing emerging astrophysical priorities such as planet detection, quasar variability, and long-term time-domain campaigns.
An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates
2017-01-01
Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing. PMID:28981533
Upgrade of the infrared camera diagnostics for the JET ITER-like wall divertor.
Balboa, I; Arnoux, G; Eich, T; Sieglin, B; Devaux, S; Zeidner, W; Morlock, C; Kruezi, U; Sergienko, G; Kinna, D; Thomas, P D; Rack, M
2012-10-01
For the new ITER-like wall at JET, two new infrared diagnostics (KL9B, KL3B) have been installed. These diagnostics can operate between 3.5 and 5 μm and up to sampling frequencies of ∼20 kHz. KL9B and KL3B image the horizontal and vertical tiles of the divertor. The divertor tiles are tungsten coated carbon fiber composite except the central tile which is bulk tungsten and consists of lamella segments. The thermal emission between lamellae affects the surface temperature measurement and therefore KL9A has been upgraded to achieve a higher spatial resolution (by a factor of 2). A technical description of KL9A, KL9B, and KL3B and cross correlation with a near infrared camera and a two-color pyrometer is presented.
C-RED One: the infrared camera using the Saphira e-APD detector
NASA Astrophysics Data System (ADS)
Greffe, Timothée.; Feautrier, Philippe; Gach, Jean-Luc; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Boutolleau, David; Baker, Ian
2016-08-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise and very low background. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, which is a truly disruptive technology in imagery. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on board thanks to an FPGA. We will show its performances and expose its main features. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No. 673944.
Wavelet subspace decomposition of thermal infrared images for defect detection in artworks
NASA Astrophysics Data System (ADS)
Ahmad, M. Z.; Khan, A. A.; Mezghani, S.; Perrin, E.; Mouhoubi, K.; Bodnar, J. L.; Vrabie, V.
2016-07-01
The health of ancient artworks must be routinely monitored to ensure their adequate preservation. Faults in these artworks may develop over time and must be identified as precisely as possible. Classical acoustic testing techniques, being invasive, risk causing permanent damage during periodic inspections. Infrared thermometry offers a promising solution to map faults in artworks. It involves heating the artwork and recording its thermal response using an infrared camera. A novel strategy based on the pseudo-random binary excitation principle is used in this work to suppress the risks associated with prolonged heating. The objective of this work is to develop an automatic scheme for detecting faults in the captured images. An efficient scheme based on wavelet subspace decomposition is developed, which favors identification of the otherwise invisible, weaker faults. Two major problems addressed in this work are the selection of the optimal wavelet basis and the selection of the subspace level. A novel criterion based on regional mutual information is proposed for the latter. The approach is successfully tested on a laboratory-based sample as well as real artworks. A new contrast enhancement metric is developed to demonstrate the quantitative efficiency of the algorithm. The algorithm is successfully deployed for both laboratory-based and real artworks.
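The subspace idea can be sketched with PyWavelets: decompose the thermogram, keep the detail coefficients of one level, and reconstruct. Here the basis and level are fixed by hand, whereas the paper selects them automatically (the level via a regional mutual-information criterion).

```python
import numpy as np
import pywt

# Hedged sketch of wavelet subspace decomposition for defect enhancement:
# zero out everything except one detail level, then reconstruct the image.
# The wavelet name and the kept level are arbitrary choices for illustration.

def subspace_image(thermogram, wavelet="db4", keep_level=2, max_level=4):
    coeffs = pywt.wavedec2(thermogram.astype(float), wavelet, level=max_level)
    filtered = [np.zeros_like(coeffs[0])]                    # drop the approximation
    for i, detail in enumerate(coeffs[1:], start=1):
        level = max_level - i + 1                            # wavedec2 orders coarse -> fine
        if level == keep_level:
            filtered.append(detail)                          # keep this subspace
        else:
            filtered.append(tuple(np.zeros_like(d) for d in detail))
    return pywt.waverec2(filtered, wavelet)
```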
Binocular Multispectral Adaptive Imaging System (BMAIS)
2010-07-26
... system for pilots that adaptively integrates shortwave infrared (SWIR), visible, near-IR (NIR), off-head thermal, and computer symbology/imagery into ... respective areas. BMAIS is a binocular helmet-mounted imaging system that features dual shortwave infrared (SWIR) cameras, embedded image processors and ... algorithms and fusion of other sensor sites such as forward looking infrared (FLIR) and other aircraft subsystems. BMAIS is attached to the helmet ...
NIRAC: Near Infrared Airglow Camera for the International Space Station
NASA Astrophysics Data System (ADS)
Gelinas, L. J.; Rudy, R. J.; Hecht, J. H.
2017-12-01
NIRAC is a space-based infrared airglow imager that will be deployed to the International Space Station in late 2018, under the auspices of the Space Test Program. NIRAC will survey OH airglow emissions in the 1.6 micron wavelength regime, exploring the spatial and temporal variability of emission intensities at latitudes from 51° south to 51° north. Atmospheric perturbations in the 80-100 km altitude range, including those produced by atmospheric gravity waves (AGWs), are observable in the OH airglow. The objective of the NIRAC experiment is to make near-global measurements of the OH airglow and airglow perturbations. These emissions also provide a bright source of illumination at night, allowing for nighttime detection of clouds and surface characteristics. The instrument, developed by the Aerospace Space Science Applications Laboratory, employs a space-compatible FPGA for camera control and data collection and a novel, custom optical system to eliminate image smear due to orbital motion. NIRAC utilizes a high-performance, large-format infrared focal plane array, transitioning technology used in the existing Aerospace Corporation ground-based airglow imager to a space-based platform. The high-sensitivity, four-megapixel imager has a native spatial resolution of 100 meters at ISS altitudes. The 23° x 23° FOV sweeps out a 150 km swath of the OH airglow layer as viewed from the ISS, and is sensitive to OH intensity perturbations down to 0.1%. The detector has a 1.7 micron cutoff that precludes the need for cold optics and reduces cooling requirements (to 180 K). Detector cooling is provided by a compact, lightweight cryocooler capable of reaching 120 K, providing a great deal of margin.
An Automatic Image-Based Modelling Method Applied to Forensic Infography
Zancajo-Blazquez, Sandra; Gonzalez-Aguilera, Diego; Gonzalez-Jorge, Higinio; Hernandez-Lopez, David
2015-01-01
This paper presents a new method based on 3D reconstruction from images that demonstrates the utility and integration of close-range photogrammetry and computer vision as an efficient alternative to modelling complex objects and scenarios of forensic infography. The results obtained confirm the validity of the method compared to other existing alternatives as it guarantees the following: (i) flexibility, permitting work with any type of camera (calibrated and non-calibrated, smartphone or tablet) and image (visible, infrared, thermal, etc.); (ii) automation, allowing the reconstruction of three-dimensional scenarios in the absence of manual intervention, and (iii) high quality results, sometimes providing higher resolution than modern laser scanning systems. As a result, each ocular inspection of a crime scene with any camera performed by the scientific police can be transformed into a scaled 3d model. PMID:25793628
NASA Astrophysics Data System (ADS)
Naqvi, Rizwan Ali; Park, Kang Ryoung
2016-06-01
Gaze tracking systems are widely used in human-computer interfaces, interfaces for the disabled, game interfaces, and for controlling home appliances. Most studies on gaze detection have focused on enhancing its accuracy, whereas few have considered the discrimination of intentional gaze fixation (looking at a target to activate or select it) from unintentional fixation while using gaze detection systems. Previous research methods based on the use of a keyboard or mouse button, eye blinking, and the dwell time of gaze position have various limitations. Therefore, we propose a method for discriminating between intentional and unintentional gaze fixation using a multimodal fuzzy logic algorithm applied to a gaze tracking system with a near-infrared camera sensor. Experimental results show that the proposed method outperforms the conventional method for determining gaze fixation.
Li, Jin; Liu, Zilong
2017-07-24
Remote sensing cameras in the visible/near-infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e. image quality, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself (its optical system, image sensor, and electronics) limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which is stable and invariant to changes in ground targets, atmosphere, and environment on orbit or on the ground, and depends only on the camera itself, is extracted using a pixel optical focal plane (PFP). The PFP produces multiple spatial-frequency targets, which are used to calculate the IMTF at different frequencies. The resulting IMTF, in combination with a constrained least-squares filter, compensates for the IMTF, i.e. removes the imaging degradation caused by the camera itself. This method is experimentally confirmed. Experiments on an on-orbit panchromatic camera indicate that the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56 compared to the case when the IMTF is not used. This opens a door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
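A generic frequency-domain version of MTF compensation with a constrained least-squares filter is sketched below; the Laplacian constraint and the regularization weight `gamma` are standard textbook choices and are not claimed to match the authors' filter.

```python
import numpy as np

# Hedged sketch of constrained least-squares (CLS) restoration using a
# measured, camera-invariant MTF sampled on the image's 2-D FFT grid
# (zero frequency at index [0, 0]).

def cls_restore(image, mtf, gamma=0.01):
    """Deconvolve `image` with the camera MTF using a CLS filter."""
    G = np.fft.fft2(image)
    # Discrete Laplacian kernel placed on the full grid, then transformed,
    # acts as the smoothness constraint.
    lap = np.zeros(image.shape)
    lap[0, 0] = 4.0
    lap[0, 1] = lap[1, 0] = lap[0, -1] = lap[-1, 0] = -1.0
    P = np.fft.fft2(lap)
    F = np.conj(mtf) * G / (np.abs(mtf) ** 2 + gamma * np.abs(P) ** 2)
    return np.real(np.fft.ifft2(F))
```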
New-style defect inspection system of film
NASA Astrophysics Data System (ADS)
Liang, Yan; Liu, Wenyao; Liu, Ming; Lee, Ronggang
2002-09-01
An inspection system has been developed for on-line detection of film defects, based on a combination of photoelectric imaging and digital image processing. The system runs at a high speed of up to 60 m/min. The moving film is illuminated by an LED array that emits uniform infrared light (peak wavelength λp = 940 nm), and infrared images are captured with a high-quality, high-speed CCD camera. The application software, based on Visual C++ 6.0 under Windows, processes the images in real time using algorithms such as median filtering, edge detection, and projection. The system is made up of four modules, which are introduced in detail in the paper. On-line experimental results show that the inspection system can recognize defects precisely at high speed and run reliably in practical applications.
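In the spirit of the processing chain named above (median filter, edge detection, projection), the following sketch flags image columns whose edge-energy projection exceeds an assumed threshold; it is illustrative, not the deployed Visual C++ implementation.

```python
import numpy as np
from scipy import ndimage

# Illustrative defect-detection pipeline; the 3x3 median window and the
# mean + 3*sigma threshold rule are assumptions.

def find_defect_columns(ir_frame, edge_threshold=None):
    """Return indices of image columns whose edge-energy projection suggests
    a film defect crossing the line-scan direction."""
    smoothed = ndimage.median_filter(ir_frame.astype(float), size=3)   # noise removal
    edges = np.hypot(ndimage.sobel(smoothed, axis=0),
                     ndimage.sobel(smoothed, axis=1))                  # edge magnitude
    profile = edges.sum(axis=0)                                        # column projection
    if edge_threshold is None:
        edge_threshold = profile.mean() + 3.0 * profile.std()          # assumed rule
    return np.flatnonzero(profile > edge_threshold)
```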
Comparison of parameters of modern cooled and uncooled thermal cameras
NASA Astrophysics Data System (ADS)
Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał
2017-10-01
During the design of a system employing thermal cameras one always faces the problem of choosing the camera types best suited for the task. In many cases the choice is far from optimal, and there are several reasons for that. System designers often favor tried and tested solutions they are used to. They do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice rather than facts. The paper presents the results of measurements of the basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive for the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards. However, the camera settings were not optimized for specific test conditions or parameter measurements; instead, the real settings used in normal camera operation were applied to obtain realistic camera performance figures. For example, there were significant differences between the measured noise parameters and the catalogue data provided by manufacturers, due to the application of edge-detection filters to increase detection and recognition ranges. The purpose of this paper is to help in choosing the optimal thermal camera for a particular application, answering the question whether to opt for a cheaper microbolometer device or a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing thermal camera systems with both cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best performing devices were selected.
BIG MAC: A bolometer array for mid-infrared astronomy, Center Director's Discretionary Fund
NASA Technical Reports Server (NTRS)
Telesco, C. M.; Decher, R.; Baugher, C.
1985-01-01
The infrared array referred to as Big Mac (for Marshall Array Camera) was designed for ground-based astronomical observations in the wavelength range 5 to 35 microns. It contains 20 discrete gallium-doped germanium bolometer detectors at a temperature of 1.4 K. Each bolometer is irradiated by a square field mirror constituting a single pixel of the array. The mirrors are arranged contiguously in four columns and five rows, thus defining the array configuration. Big Mac utilized cold reimaging optics and an up-looking dewar. The total Big Mac system also contains a telescope interface tube for mounting the dewar and a computer for data acquisition and processing. Initial astronomical observations at a major infrared observatory indicate that Big Mac's performance is excellent, having achieved the design specifications and making this instrument an outstanding tool for astrophysics.
A hidden view of wildlife conservation: How camera traps aid science, research and management
O'Connell, Allan F.
2015-01-01
Camera traps — remotely activated cameras with infrared sensors — first gained measurable popularity in wildlife conservation in the early 1990s. Today, they’re used for a variety of activities, from species-specific research to broad-scale inventory or monitoring programs that, in some cases, attempt to detect biodiversity across vast landscapes. As this modern tool continues to evolve, it’s worth examining its uses and benefits for wildlife management and conservation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinwiddie, Ralph Barton; Parris, Larkin S.; Lindal, John M.
This paper explores the temperature range extension of long-wavelength infrared (LWIR) cameras by placing an aperture in front of the lens. An aperture smaller than the lens will reduce the radiance reaching the sensor, allowing the camera to image targets much hotter than normally allowable. These higher temperatures were accurately determined after developing a correction factor which was applied to the built-in temperature calibration. The relationship between aperture diameter and temperature range is linear. The effect of pre-lens apertures on image uniformity is a form of anti-vignetting, meaning the corners appear brighter (hotter) than the rest of the image. An example of using this technique to measure the temperatures of high-melting-point polymers during 3D printing provides valuable information on the time required for the weld-line temperature to fall below the glass transition temperature.
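The correction idea can be sketched in the radiance domain: the aperture passes only a fraction of the lens area, and the blocked fraction is replaced by room-temperature radiation. The Stefan-Boltzmann simplification and the parameter names below are assumptions; the paper derives an empirical linear correction instead.

```python
# Hedged radiance-domain sketch of the pre-lens aperture correction.
# All parameters are illustrative assumptions.

def corrected_temperature_K(indicated_temp_K, aperture_diam_mm, lens_diam_mm,
                            ambient_temp_K=295.0):
    """Scale the indicated radiance by the inverse of the aperture area ratio,
    after removing the contribution of the room-temperature aperture plate."""
    fill = (aperture_diam_mm / lens_diam_mm) ** 2      # fraction of flux passed
    l_indicated = indicated_temp_K ** 4                # ~ radiance (arbitrary units)
    l_ambient = ambient_temp_K ** 4                    # aperture-plate contribution
    l_target = (l_indicated - (1.0 - fill) * l_ambient) / fill
    return l_target ** 0.25

print(corrected_temperature_K(indicated_temp_K=400.0, aperture_diam_mm=10.0, lens_diam_mm=20.0))
```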
Augmented reality in laser laboratories
NASA Astrophysics Data System (ADS)
Quercioli, Franco
2018-05-01
Laser safety glasses block visibility of the laser light. This is a big nuisance when a clear view of the beam path is required. A headset made up of a smartphone and a viewer can overcome this problem. The user looks at the image of the real world on the cellphone display, captured by its rear camera. An unimpeded and safe sight of the laser beam is then achieved. If the infrared blocking filter of the smartphone camera is removed, the spectral sensitivity of the CMOS image sensor extends in the near infrared region up to 1100 nm. This substantial improvement widens the usability of the device to many laser systems for industrial and medical applications, which are located in this spectral region. The paper describes this modification of a phone camera to extend its sensitivity beyond the visible and make a true augmented reality laser viewer.
Basic temperature correction of QWIP cameras in thermoelastic/plastic tests of composite materials.
Boccardi, Simone; Carlomagno, Giovanni Maria; Meola, Carosena
2016-12-01
The present work is concerned with the use of a quantum well infrared photodetector (QWIP) infrared camera to measure very small temperature variations, related to thermoelastic/plastic effects, that develop on composites under relatively low loads, either periodic or due to impact. As is evident from previous work, some of these temperature variations are difficult to measure, being at the edge of the IR camera resolution and/or affected by the instrument noise. Nevertheless, they can be valuable either to obtain information about the material characteristics and its behavior under periodic load (thermoelastic), or to assess the overall extent of delaminations due to impact (thermo-plastic). An image post-processing procedure is described herein that, with the help of a reference signal, allows suppression of the instrument noise and better discrimination of the thermal signatures induced by the two different loads.
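One generic way to use a reference signal for noise suppression is synchronous (lock-in) demodulation of each pixel's time series at the loading frequency, sketched below; this is an assumed stand-in, not necessarily the authors' post-processing procedure.

```python
import numpy as np

# Hedged sketch of reference-assisted noise suppression for small periodic
# (thermoelastic) temperature signals: demodulating against the known load
# reference rejects camera noise that is uncorrelated with the load.

def lockin_amplitude(frames, frame_rate_hz, load_freq_hz):
    """frames: array of shape (n_frames, rows, cols). Returns the per-pixel
    amplitude of the temperature component at the loading frequency."""
    n = frames.shape[0]
    t = np.arange(n) / frame_rate_hz
    ref_cos = np.cos(2 * np.pi * load_freq_hz * t)
    ref_sin = np.sin(2 * np.pi * load_freq_hz * t)
    in_phase = np.tensordot(ref_cos, frames, axes=(0, 0)) * 2.0 / n
    quadrature = np.tensordot(ref_sin, frames, axes=(0, 0)) * 2.0 / n
    return np.hypot(in_phase, quadrature)
```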
The Hubble Space Telescope: UV, Visible, and Near-Infrared Pursuits
NASA Technical Reports Server (NTRS)
Wiseman, Jennifer
2010-01-01
The Hubble Space Telescope continues to push the limits on world-class astrophysics. Cameras including the Advanced Camera for Surveys and the new panchromatic Wide Field Camera 3, which was installed in last year's successful servicing mission (SM4), offer imaging from near-infrared through ultraviolet wavelengths. Spectroscopic studies of sources from black holes to exoplanet atmospheres are making great advances through the versatile use of STIS, the Space Telescope Imaging Spectrograph. The new Cosmic Origins Spectrograph, also installed last year, is the most sensitive UV spectrograph to fly in space and is uniquely suited to address particular scientific questions on galaxy halos, the intergalactic medium, and the cosmic web. With these outstanding capabilities on HST come complex needs for laboratory astrophysics support, including atomic and line identification data. I will provide an overview of Hubble's current capabilities and the scientific programs and goals that particularly benefit from the studies of laboratory astrophysics.
Feasibility evaluation of a motion detection system with face images for stereotactic radiosurgery.
Yamakawa, Takuya; Ogawa, Koichi; Iyatomi, Hitoshi; Kunieda, Etsuo
2011-01-01
In stereotactic radiosurgery we can irradiate a targeted volume precisely with a narrow high-energy x-ray beam, and thus motion of the targeted area may cause side effects in normal organs. This paper describes our motion detection system based on three USB cameras. To reduce the effect of changes in illuminance in the tracking area we used infrared light and USB cameras that were sensitive to the infrared light. Motion detection of a patient was performed by tracking his/her ears and nose with the three USB cameras, where pattern matching between a predefined template image for each view and the acquired images was done by an exhaustive search method with general-purpose computing on a graphics processing unit (GPGPU). The results of the experiments showed that the measurement accuracy of our system was less than 0.7 mm, amounting to less than half of that of our previous system.
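The exhaustive template search can be illustrated on the CPU with normalized cross-correlation; the sketch below shows the matching criterion only and omits the GPGPU parallelization that makes it real-time.

```python
import numpy as np

# Hedged sketch of exhaustive template matching by normalized cross-correlation
# (NCC); the real system evaluates this search in parallel on a GPU.

def match_template_ncc(image, template):
    """Return the (row, col) of the best NCC match and its score."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```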
Tracking a Head-Mounted Display in a Room-Sized Environment with Head-Mounted Cameras
1990-04-01
... poor resolution and a very limited working volume [Wan90]. OPTOTRAK [Nor88] uses one camera with two dual-axis CCD infrared position sensors. Each ... [Nor88] Northern Digital. Trade literature on Optotrak - Northern Digital's Three Dimensional Optical Motion Tracking and Analysis System. Northern Digital ...
The diagnosing of plasmas using spectroscopy and imaging on Proto-MPEX
NASA Astrophysics Data System (ADS)
Baldwin, K. A.; Biewer, T. M.; Crouse Powers, J.; Hardin, R.; Johnson, S.; McCleese, A.; Shaw, G. C.; Showers, M.; Skeen, C.
2015-11-01
The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device being developed at Oak Ridge National Laboratory (ORNL). This machine plans to study plasma-material interaction (PMI) physics relevant to future fusion reactors. We tested and learned to use tools of spectroscopy and imaging. These tools consist of a spectrometer, a high-speed camera, an infrared camera, and thermocouples. The spectrometer measures the color of the light from the plasma and its intensity. We also used the high-speed camera to see how the magnetic field acts on the plasma, and how the gas is heated into the fourth state of matter. The thermocouples measure the temperature of the objects they are placed against, which in this case are the end plates of the machine. We also used the infrared camera to see the heat pattern of the plasma on the end plates. Data from these instruments will be shown. This work was supported by the US D.O.E. contract DE-AC05-00OR22725, and the Oak Ridge Associated Universities ARC program.
Hubble Space Telescope Resolves Volcanoes on Io
NASA Technical Reports Server (NTRS)
1994-01-01
This picture is a composite of a black and white near infrared image of Jupiter and its satellite Io and a color image of Io at shorter wavelengths taken at almost the same time on March 5, 1994. These are the first images of a giant planet or its satellites taken by NASA's Hubble Space Telescope (HST) since the repair mission in December 1993.
Io is too small for ground-based telescopes to see the surface details. The moon's angular diameter of one arc second is at the resolution limit of ground-based telescopes. Many of these markings correspond to volcanoes that were first revealed in 1979 during the Voyager spacecraft flyby of Jupiter. Several of the volcanoes are periodically active because Io is heated by tides raised by Jupiter's powerful gravity. The volcano Pele appears as a dark spot surrounded by an irregular orange oval in the lower part of the image. The orange material has been ejected from the volcano and spread over a huge area. Though the volcano was first discovered by Voyager, the distinctive orange color of the volcanic deposits is a new discovery in these HST images. (Voyager missed it because its cameras were not sensitive to the near-infrared wavelengths where the color is apparent.) The sulfur and sulfur dioxide that probably dominate Io's surface composition cannot produce this orange color, so the Pele volcano must be generating material with a more unusual composition, possibly rich in sodium. The Jupiter image, taken in near-infrared light, was obtained with HST's Wide Field and Planetary Camera in wide field mode. High-altitude ammonia crystal clouds are bright in this image because they reflect infrared light before it is absorbed by methane in Jupiter's atmosphere. The most prominent feature is the Great Red Spot, which is conspicuous because of its high clouds. A cap of high-altitude haze appears at Jupiter's south pole. The Wide Field/Planetary Camera 2 was developed by the Jet Propulsion Laboratory and managed by the Goddard Space Flight Center for NASA's Office of Space Science. This image and other images and data received from the Hubble Space Telescope are posted on the World Wide Web on the Space Telescope Science Institute home page at URL http://oposite.stsci.edu/pubinfo/
Studies of coronal lines with electronic cameras during the eclipse of 7 March 1970.
Fort, B
1970-12-01
The experimental design described here allows us to study with 2-Å bandpass filters the brightness distribution of the green coronal line, the two infrared lines of Fe XIII, and the neighboring coronal continuum. For the first time in an eclipse expedition, electrostatic cameras derived from the Lallemand type are used; full advantage was taken of their speed, especially in the near-infrared spectral range, and their good photometric qualities. They permit the measurement of the intensity and polarization of the lines in the corona to a height of 1.25 solar radii above the limb of the sun, with a spatial resolution ≥ (10″)².
NASA Technical Reports Server (NTRS)
Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.
2002-01-01
The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.
2005-06-20
One of the two pictures of Tempel 1 (see also PIA02101) taken by Deep Impact's medium-resolution camera is shown next to data of the comet taken by the spacecraft's infrared spectrometer. This instrument breaks apart light like a prism to reveal the "fingerprints," or signatures, of chemicals. Even though the spacecraft was over 10 days away from the comet when these data were acquired, it detected some of the molecules making up the comet's gas and dust envelope, or coma. The signatures of these molecules -- including water, hydrocarbons, carbon dioxide and carbon monoxide -- can be seen in the graph, or spectrum. Deep Impact's impactor spacecraft is scheduled to collide with Tempel 1 at 10:52 p.m. Pacific time on July 3 (1:52 a.m. Eastern time, July 4). The mission's flyby spacecraft will use its infrared spectrometer to sample the ejected material, providing the first look at the chemical composition of a comet's nucleus. These data were acquired from June 20 to 21, 2005. The picture of Tempel 1 was taken by the flyby spacecraft's medium-resolution instrument camera. The infrared spectrometer uses the same telescope as the high-resolution instrument camera. http://photojournal.jpl.nasa.gov/catalog/PIA02100
NASA Technical Reports Server (NTRS)
2005-01-01
Figure 1: Temperature Map (figure not reproduced here). This image composite shows comet Tempel 1 in visible (left) and infrared (right) light (figure 1). The infrared picture highlights the warm, or sunlit, side of the comet, where NASA's Deep Impact probe later hit. These data were acquired about six minutes before impact. The visible image was taken by the medium-resolution camera on the mission's flyby spacecraft, and the infrared data were acquired by the flyby craft's infrared spectrometer.
PyEmir: Data Reduction Pipeline for EMIR, the GTC Near-IR Multi-Object Spectrograph
NASA Astrophysics Data System (ADS)
Pascual, S.; Gallego, J.; Cardiel, N.; Eliche-Moral, M. C.
2010-12-01
EMIR is the near-infrared wide-field camera and multi-slit spectrograph being built for the Gran Telescopio Canarias. We present here the work being done on its data processing pipeline. PyEmir is based on Python and will automatically process data taken in both imaging and spectroscopy modes. PyEmir is being developed by the UCM Group of Extragalactic Astrophysics and Astronomical Instrumentation.
A Portable Burn Pan for the Disposal of Excess Propellants
2016-06-01
[List-of-figures excerpt] ... of Vegetation in Vicinity of Burn Pan Caused by Radiant Heat; Figure 12. Wet Propellant (120 kg) and Dry Propellant (460 kg) Burn ...; Figure 14. Graph of Component Temperatures During an HUTS Burn Pan Test; Figure 15. IR Camera Thermal ...; HUTS: Howitzer Unit Training System; IR: Infrared; JBER: Joint Base Elmendorf Richardson (AK).
Low-cost panoramic infrared surveillance system
NASA Astrophysics Data System (ADS)
Kecskes, Ian; Engel, Ezra; Wolfe, Christopher M.; Thomson, George
2017-05-01
A nighttime surveillance concept consisting of a single-surface omnidirectional mirror assembly and an uncooled Vanadium Oxide (VOx) longwave infrared (LWIR) camera has been developed. This configuration provides a continuous field of view spanning 360° in azimuth and more than 110° in elevation. Both the camera and the mirror are readily available, off-the-shelf, inexpensive products. The mirror assembly is marketed for use in the visible spectrum and requires only minor modifications to function in the LWIR spectrum. The compactness and portability of this optical package offer significant advantages over many existing infrared surveillance systems. The developed system was evaluated on its ability to detect moving, human-sized heat sources at ranges between 10 m and 70 m. Raw camera images captured by the system are converted from rectangular coordinates in the camera focal plane to polar coordinates and then unwrapped into the user's azimuth and elevation system. Digital background subtraction and color mapping are applied to the images to increase the user's ability to extract moving items from background clutter. A second optical system consisting of a commercially available 50 mm f/1.2 ATHERM lens and a second LWIR camera is used to examine the details of objects of interest identified using the panoramic imager. A description of the components of the proof of concept is given, followed by a presentation of raw images taken by the panoramic LWIR imager. A description of the method by which these images are analyzed is given, along with a presentation of these results side by side with the output of the 50 mm LWIR imager and a panoramic visible-light imager. Finally, a discussion of the concept and its future development is given.
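The coordinate conversion and background subtraction steps can be sketched as follows; the mirror centre, radius range, output size and nearest-neighbour sampling are assumptions standing in for the calibrated mapping of the actual system.

```python
import numpy as np

# Hedged sketch of the polar-to-panorama unwrapping and the simple background
# subtraction described above. A fielded system would calibrate the mirror
# geometry and use proper interpolation.

def unwrap_panorama(frame, center, r_min, r_max, out_width=1440, out_height=180):
    """Map a circular omnidirectional LWIR frame to an (elevation, azimuth) grid."""
    cy, cx = center
    az = np.linspace(0.0, 2.0 * np.pi, out_width, endpoint=False)
    r = np.linspace(r_min, r_max, out_height)
    rr, aa = np.meshgrid(r, az, indexing="ij")          # rows: elevation, cols: azimuth
    src_rows = np.clip((cy + rr * np.sin(aa)).astype(int), 0, frame.shape[0] - 1)
    src_cols = np.clip((cx + rr * np.cos(aa)).astype(int), 0, frame.shape[1] - 1)
    return frame[src_rows, src_cols]                    # nearest-neighbour lookup

def highlight_motion(current, background, k=3.0):
    """Simple background subtraction to pull moving heat sources from clutter."""
    diff = current.astype(float) - background.astype(float)
    return np.abs(diff) > k * diff.std()
```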
A protection system for the JET ITER-like wall based on imaging diagnostics.
Arnoux, G; Devaux, S; Alves, D; Balboa, I; Balorin, C; Balshaw, N; Beldishevski, M; Carvalho, P; Clever, M; Cramp, S; de Pablos, J-L; de la Cal, E; Falie, D; Garcia-Sanchez, P; Felton, R; Gervaise, V; Goodyear, A; Horton, A; Jachmich, S; Huber, A; Jouve, M; Kinna, D; Kruezi, U; Manzanares, A; Martin, V; McCullen, P; Moncada, V; Obrejan, K; Patel, K; Lomas, P J; Neto, A; Rimini, F; Ruset, C; Schweer, B; Sergienko, G; Sieglin, B; Soleto, A; Stamp, M; Stephen, A; Thomas, P D; Valcárcel, D F; Williams, J; Wilson, J; Zastrow, K-D
2012-10-01
The new JET ITER-like wall (made of beryllium and tungsten) is more fragile than the former carbon fiber composite wall and requires active protection to prevent excessive heat loads on the plasma facing components (PFC). Analog CCD cameras operating in the near infrared wavelength are used to measure surface temperature of the PFCs. Region of interest (ROI) analysis is performed in real time and the maximum temperature measured in each ROI is sent to the vessel thermal map. The protection of the ITER-like wall system started in October 2011 and has already successfully led to a safe landing of the plasma when hot spots were observed on the Be main chamber PFCs. Divertor protection is more of a challenge due to dust deposits that often generate false hot spots. In this contribution we describe the camera, data capture and real time processing systems. We discuss the calibration strategy for the temperature measurements with cross validation with thermal IR cameras and bi-color pyrometers. Most importantly, we demonstrate that a protection system based on CCD cameras can work and show examples of hot spot detections that stop the plasma pulse. The limits of such a design and the associated constraints on the operations are also presented.
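The real-time region-of-interest analysis described above can be pictured with a minimal Python sketch; the ROI boxes, thresholds and temperature values below are illustrative assumptions, not the JET protection-system settings.

    # Illustrative sketch (not the JET real-time code) of region-of-interest analysis:
    # extract the maximum apparent temperature in each ROI and flag hot spots against
    # a per-ROI protection threshold. ROI boxes and thresholds here are made up.
    import numpy as np

    def roi_maxima(temperature_map, rois):
        """Return {roi_name: max temperature} for rectangular ROIs (y0, y1, x0, x1)."""
        return {name: float(temperature_map[y0:y1, x0:x1].max())
                for name, (y0, y1, x0, x1) in rois.items()}

    def hot_spot_alarms(maxima, thresholds):
        """Compare ROI maxima with protection thresholds; True means 'stop the pulse'."""
        return {name: maxima[name] > thresholds[name] for name in maxima}

    if __name__ == "__main__":
        frame = 300.0 + 900.0 * np.random.rand(256, 320)   # fake surface-temperature frame [K]
        rois = {"Be_limiter": (10, 60, 20, 80), "divertor": (180, 240, 100, 300)}
        limits = {"Be_limiter": 1100.0, "divertor": 1400.0}
        maxima = roi_maxima(frame, rois)
        print(maxima, hot_spot_alarms(maxima, limits))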
The Stratospheric Observatory for Infrared Astronomy (SOFIA)
NASA Astrophysics Data System (ADS)
Wolf, J.
2004-05-01
The Stratospheric Observatory for Infrared Astronomy, SOFIA, will carry a 3-meter-class telescope onboard a Boeing 747SP aircraft to altitudes of 41,000 to 45,000 ft, above most of the atmosphere's IR-absorbing water vapor. The telescope was developed and built in Germany and was delivered to the U.S. in September 2002. The integration into the B747SP has been completed and functional tests are under way in Waco, Texas. In early 2005 flight-testing of the observatory will initially be dedicated to the re-certification of the modified aircraft, then performance tests of the telescope and the electronics and data systems will commence. Later in 2005, after transferring to its home base, NASA's Ames Research Center in Moffett Field, California, SOFIA will start astrophysical observations. A suite of specialized infrared cameras and spectrometers covering wavelengths between 1 and 600 μm is being developed by U.S. and German science institutions. In addition to the infrared instruments, a high-speed visible range CCD camera will use the airborne observatory to chase the shadows of celestial bodies during occultations. Once SOFIA is in routine operations with a planned observing schedule of up to 960 hours at altitude per year, it might also be available as a platform for serendipitous observations not using the main telescope, such as recordings of meteor streams or the search for extra-solar planets transiting their central stars. These are areas of research in which amateur astronomers with relatively small telescopes and state-of-the-art imaging equipment can contribute.
Near infrared observations of S 155. Evidence of induced star formation?
NASA Astrophysics Data System (ADS)
Hunt, L. K.; Lisi, F.; Felli, M.; Tofani, G.
In order to investigate the possible existence of embedded objects of recent formation in the area of the Cepheus B - Sh2-155 interface, the authors have observed the region of the compact radio continuum source with the new near infrared camera ARNICA and the TIRGO telescope.
...surface temperature profile of a sandbox containing buried objects using a long-wave infrared camera. Images were recorded for several days under ambient... time of day. Best detection of buried objects corresponded to shallow depths for observed intervals where maxima/minima ambient temperatures coincided
Discovery of the near-infrared counterpart to the luminous neutron-star low-mass X-ray binary GX 3+1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van den Berg, Maureen; Fridriksson, Joel K.; Homan, Jeroen
2014-10-01
Using the High Resolution Camera on board the Chandra X-ray Observatory, we have measured an accurate position for the bright persistent neutron star X-ray binary and atoll source GX 3+1. At a location that is consistent with this new position, we have discovered the near-infrared (NIR) counterpart to GX 3+1 in images taken with the PANIC and FourStar cameras on the Magellan Baade Telescope. The identification of this K_s = 15.8 ± 0.1 mag star as the counterpart is based on the presence of a Br γ emission line in an NIR spectrum taken with the Folded-port InfraRed Echelette spectrograph on the Baade Telescope. The absolute magnitude derived from the best available distance estimate to GX 3+1 indicates that the mass donor in the system is not a late-type giant. We find that the NIR light in GX 3+1 is likely dominated by the contribution from a heated outer accretion disk. This is similar to what has been found for the NIR flux from the brighter class of Z sources, but unlike the behavior of atolls fainter (L_X ≈ 10^36-10^37 erg s^-1) than GX 3+1, where optically thin synchrotron emission from a jet probably dominates the NIR flux.
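The absolute-magnitude check mentioned above follows the standard distance modulus; the short Python sketch below works it through with a placeholder distance and extinction (only the apparent Ks magnitude is taken from the abstract).

    # Hedged numerical sketch of the kind of check described above: converting an apparent
    # Ks magnitude into an absolute magnitude for an assumed distance and extinction.
    # The distance and A_Ks values below are placeholders, not the paper's adopted numbers.
    import math

    def absolute_magnitude(apparent_mag, distance_pc, extinction_mag=0.0):
        """M = m - 5*log10(d / 10 pc) - A."""
        return apparent_mag - 5.0 * math.log10(distance_pc / 10.0) - extinction_mag

    if __name__ == "__main__":
        m_Ks = 15.8                 # measured apparent magnitude (from the abstract)
        d_pc = 6000.0               # assumed distance in parsecs (illustrative only)
        A_Ks = 0.6                  # assumed K-band extinction (illustrative only)
        print("M_Ks ~ %.2f" % absolute_magnitude(m_Ks, d_pc, A_Ks))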
Additive Manufacturing Infrared Inspection
NASA Technical Reports Server (NTRS)
Gaddy, Darrell; Nettles, Mindy
2015-01-01
The Additive Manufacturing Infrared Inspection Task started the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques. This project will benefit additive manufacturing by providing real-time inspection of internal geometry that is not currently possible, and by reducing the time and cost of additively manufactured parts through automated real-time dimensional inspections that eliminate the need for post-production inspections.
Calibration procedures of the Tore-Supra infrared endoscopes
NASA Astrophysics Data System (ADS)
Desgranges, C.; Jouve, M.; Balorin, C.; Reichle, R.; Firdaouss, M.; Lipa, M.; Chantant, M.; Gardarein, J. L.; Saille, A.; Loarer, T.
2018-01-01
Five endoscopes equipped with infrared cameras working in the medium infrared range (3-5 μm) are installed on the controlled thermonuclear fusion research device Tore-Supra. These endoscopes aim at monitoring the plasma facing components' surface temperature to prevent overheating. Signals delivered by the infrared cameras through the endoscopes are analysed and used, on the one hand, in a real-time feedback control loop acting on the plasma heating systems to decrease plasma facing component surface temperatures when necessary, and on the other hand for physics studies such as determination of the incoming heat flux. To fulfil these two roles, a very accurate knowledge of the absolute surface temperatures is mandatory, so the infrared endoscopes must be calibrated through a very careful procedure. This means determining their transmission coefficients, which is a delicate operation. Methods to calibrate the infrared endoscopes during the shutdown period of the Tore-Supra machine are presented. As these methods cannot track possible changes in transmittance during operation, an in-situ method is also presented. It permits validation of the calibration performed in the laboratory as well as monitoring of the transmittance evolution during machine operation, using the endoscope shutter and a dedicated plasma scenario developed to heat it. Possible improvements of this method are briefly outlined.
Optimal design of an earth observation optical system with dual spectral and high resolution
NASA Astrophysics Data System (ADS)
Yan, Pei-pei; Jiang, Kai; Liu, Kai; Duan, Jing; Shan, Qiusha
2017-02-01
With the increasing demand for high-resolution remote sensing images from both military and civilian users, countries around the world are optimistic about the prospects of higher-resolution remote sensing imagery. Moreover, designing an integrated visible/infrared optical system has important value in Earth observation: because a visible system cannot identify camouflage or operate at night, the visible camera should be paired with an infrared camera. An Earth observation optical system with dual spectral bands and high resolution is designed. The paper mainly investigates the integrated design of the visible and infrared optical system, which makes the system lighter and smaller and achieves two uses with one satellite. The working waveband of the system covers the visible and the middle infrared (3-5 μm). Clear imaging in both wavebands is achieved with a dispersive RC system. The focal length of the visible system is 3056 mm with an F/# of 10.91, and the focal length of the middle infrared system is 1120 mm with an F/# of 4. In order to suppress middle infrared thermal radiation and stray light, a second imaging stage is used and the narcissus phenomenon is analyzed. A key characteristic of the system is its simple structure, and the special requirements on Modulation Transfer Function (MTF), spot size, energy concentration, distortion, etc. are all satisfied.
Monitoring machining conditions by infrared images
NASA Astrophysics Data System (ADS)
Borelli, Joao E.; Gonzaga Trabasso, Luis; Gonzaga, Adilson; Coelho, Reginaldo T.
2001-03-01
During the machining process, knowledge of the temperature is the most important factor in tool analysis. It allows control of the main factors that influence tool use, life time and waste. The temperature in the contact area between the work piece and the tool results from material removal during the cutting operation, and it is difficult to obtain because the tool and the work piece are in motion. One way to measure the temperature in this situation is to detect the infrared radiation. This work presents a new methodology for diagnosis and monitoring of machining processes with the use of infrared images. The infrared image provides a map in gray tones of the elements in the process: tool, work piece and chips. Each gray tone in the image corresponds to a certain temperature for each of those materials, and the relationship between gray tones and temperature is obtained by prior calibration of the infrared camera. The system developed in this work uses an infrared camera, a frame grabber board and software composed of three modules. The first module performs image acquisition and processing. The second module extracts image features and assembles the feature vector. Finally, the third module uses fuzzy logic to evaluate the feature vector and outputs the tool state diagnosis.
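A minimal Python sketch of the two processing ideas in this abstract, gray-level-to-temperature conversion via a prior calibration and reduction of each frame to a feature vector, is given below; the calibration coefficients and feature choices are hypothetical, not the authors' values.

    # Minimal sketch, under assumed calibration coefficients, of the two ideas in the
    # abstract: mapping image gray levels to temperature via a prior camera calibration,
    # and reducing each frame to a small feature vector for the fuzzy classifier.
    import numpy as np

    # Hypothetical per-material calibration: T = a0 + a1*g + a2*g^2 (gray level g in 0..255)
    CALIBRATION = {"tool": (20.0, 1.8, 0.004), "chip": (20.0, 2.1, 0.003)}

    def gray_to_temperature(gray, material):
        a0, a1, a2 = CALIBRATION[material]
        g = gray.astype(float)
        return a0 + a1 * g + a2 * g ** 2

    def feature_vector(temperature_map):
        """Simple features: max, mean and spread of the temperature field."""
        return np.array([temperature_map.max(), temperature_map.mean(), temperature_map.std()])

    if __name__ == "__main__":
        frame = np.random.randint(0, 256, size=(120, 160))   # stand-in for a gray-level IR image
        temps = gray_to_temperature(frame, "tool")
        print(feature_vector(temps))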
NASA Astrophysics Data System (ADS)
Xu, Weichao; Shen, Jingling; Zhang, Cunlin; Tao, Ning; Feng, Lichun
2008-03-01
This work addresses the application of ultrasonic infrared thermal wave nondestructive evaluation to crack detection in materials often used in aviation structures, for instance steel and carbon fiber. Cracks at interfaces or perpendicular to a structure's surface are difficult to test with traditional nondestructive testing methods. Ultrasonic infrared thermal wave nondestructive testing uses high-power, low-frequency ultrasound as the heat source to excite the sample and an infrared video camera as the detector to measure the surface temperature. The ultrasonic emitter launches pulses of ultrasound into the skin of the sample, which causes the crack interfaces to rub and dissipate energy as heat, producing a local increase in temperature at one of the specimen surfaces. The infrared camera images the returning thermal wave reflections from subsurface cracks. A computer collects and processes the thermal images according to the properties of each sample to obtain a satisfactory result. In this paper, a steel plate with a fatigue crack that we designed and a joint of carbon fiber composite that has been used in a space probe were tested, with satisfying results. Ultrasonic infrared thermal wave nondestructive detection is fast and sensitive to cracks, especially cracks perpendicular to a structure's surface. It is significant for nondestructive testing in manufacturing and in aviation, aerospace and optoelectronics applications.
HUBBLE SPIES BROWN DWARFS IN NEARBY STELLAR NURSERY
NASA Technical Reports Server (NTRS)
2002-01-01
Probing deep within a neighborhood stellar nursery, NASA's Hubble Space Telescope uncovered a swarm of newborn brown dwarfs. The orbiting observatory's near-infrared camera revealed about 50 of these objects throughout the Orion Nebula's Trapezium cluster [image at right], about 1,500 light-years from Earth. Appearing like glistening precious stones surrounding a setting of sparkling diamonds, more than 300 fledgling stars and brown dwarfs surround the brightest, most massive stars [center of picture] in Hubble's view of the Trapezium cluster's central region. All of the celestial objects in the Trapezium were born together in this hotbed of star formation. The cluster is named for the trapezoidal alignment of those central massive stars. Brown dwarfs are gaseous objects with masses so low that their cores never become hot enough to fuse hydrogen, the thermonuclear fuel stars like the Sun need to shine steadily. Instead, these gaseous objects fade and cool as they grow older. Brown dwarfs around the age of the Sun (5 billion years old) are very cool and dim, and therefore are difficult for telescopes to find. The brown dwarfs discovered in the Trapezium, however, are youngsters (1 million years old). So they're still hot and bright, and easier to see. This finding, along with observations from ground-based telescopes, is further evidence that brown dwarfs, once considered exotic objects, are nearly as abundant as stars. The image and results appear in the Sept. 20 issue of the Astrophysical Journal. The brown dwarfs are too dim to be seen in a visible-light image taken by the Hubble telescope's Wide Field and Planetary Camera 2 [picture at left]. This view also doesn't show the assemblage of infant stars seen in the near-infrared image. That's because the young stars are embedded in dense clouds of dust and gas. The Hubble telescope's near-infrared camera, the Near Infrared Camera and Multi-Object Spectrometer, penetrated those clouds to capture a view of those objects. The brown dwarfs are the faintest objects in the image. Surveying the cluster's central region, the Hubble telescope spied brown dwarfs with masses equaling 10 to 80 Jupiters. Researchers think there may be less massive brown dwarfs that are beyond the limits of Hubble's vision. The near-infrared image was taken Jan. 17, 1998. Two near-infrared filters were used to obtain information on the colors of the stars at two wavelengths (1.1 and 1.6 microns). The Trapezium picture is 1 light-year across. This composite image was made from a 'mosaic' of nine separate, but adjoining images. In this false-color image, blue corresponds to warmer, more massive stars, and red to cooler, less massive stars and brown dwarfs, and stars that are heavily obscured by dust. The visible-light data were taken in 1994 and 1995. Credits for near-infrared image: NASA; K.L. Luhman (Harvard-Smithsonian Center for Astrophysics, Cambridge, Mass.); and G. Schneider, E. Young, G. Rieke, A. Cotera, H. Chen, M. Rieke, R. Thompson (Steward Observatory, University of Arizona, Tucson, Ariz.) Credits for visible-light picture: NASA, C.R. O'Dell and S.K. Wong (Rice University)
NASA Astrophysics Data System (ADS)
Dumoulin, J.; Averty, R.
2012-04-01
One of the objectives of the ISTIMES project is to evaluate the potential offered by the integration of different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, the uncooled infrared camera is a promising technique due to its dissemination potential and its relatively low cost on the market. Infrared thermography, when used in quantitative mode (not in laboratory conditions) rather than in qualitative mode (vision applied to survey), requires real-time thermal radiative corrections on the raw acquired data to take into account the influence of the evolving natural environment. The camera sensor therefore has to be smart enough to apply the calibration law and radiometric corrections in real time in a varying atmosphere. A complete measurement system was consequently studied and developed with low-cost infrared cameras available on the market. In the system developed, the infrared camera is coupled with other sensors to feed simplified radiative models running, in real time, on a GPU available on a small PC. The system uses a fast Ethernet camera FLIR A320 [1] coupled with a VAISALA WXT520 [2] weather station and a light GPS unit [3] for positioning and dating. It can be used with other Ethernet cameras (infrared or visible) but requires access to the measured data at raw level; in the present study this was made possible thanks to a specific agreement signed with the FLIR Company. The prototype system is implemented on a low-cost small computer that integrates a GPU card to allow real-time parallel computing [4] of a simplified radiometric [5] heat balance using information measured with the weather station. An HMI was developed under Linux using open-source components and complementary pieces of software developed at IFSTTAR. This new HMI, called "IrLaW", has various functionalities that make it suitable for long-term monitoring on real sites. It can be remotely controlled in wired or wireless communication mode depending on the context of measurement and the degree of accessibility to the system when it is running on site. Finally, thanks to the development of a high-level library and the deployment of a daemon, the measurement system was tuned to be compatible with OGC standards. Complementary functionalities were also developed to allow the system to self-declare to 52North; for that, a specific plugin was developed to be inserted beforehand at the 52North level. Data are also accessible by tasking the system when required, for instance by using the web portal developed in the ISTIMES framework. ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663.
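The simplified radiometric heat balance referred to above can be sketched with the standard single-band thermography equation; the Python example below (not IFSTTAR's GPU code) inverts it for the object temperature using assumed emissivity, reflected-ambient temperature and atmospheric transmittance of the kind the weather-station data would help estimate.

    # Simplified radiometric correction sketch in the spirit of the heat balance mentioned
    # above. It inverts the usual single-band thermography equation using emissivity,
    # reflected ambient temperature and an assumed atmospheric transmittance.
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant [W m^-2 K^-4]

    def measured_radiance(t_obj_K, t_amb_K, t_atm_K, emissivity, tau):
        """Forward model: object, reflected-ambient and atmospheric contributions."""
        return (tau * emissivity * SIGMA * t_obj_K ** 4
                + tau * (1.0 - emissivity) * SIGMA * t_amb_K ** 4
                + (1.0 - tau) * SIGMA * t_atm_K ** 4)

    def object_temperature(radiance, t_amb_K, t_atm_K, emissivity, tau):
        """Invert the forward model for the object temperature."""
        obj_term = (radiance
                    - tau * (1.0 - emissivity) * SIGMA * t_amb_K ** 4
                    - (1.0 - tau) * SIGMA * t_atm_K ** 4) / (tau * emissivity * SIGMA)
        return obj_term ** 0.25

    if __name__ == "__main__":
        # Illustrative ambient conditions as a weather station might report them.
        t_amb, t_atm, eps, tau = 293.15, 290.15, 0.92, 0.85
        L = measured_radiance(310.15, t_amb, t_atm, eps, tau)
        print("recovered object temperature [K]:",
              round(object_temperature(L, t_amb, t_atm, eps, tau), 2))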
IGR J19294+1816: a new Be-X-ray binary revealed through infrared spectroscopy
NASA Astrophysics Data System (ADS)
Rodes-Roca, J. J.; Bernabeu, G.; Magazzù, A.; Torrejón, J. M.; Solano, E.
2018-05-01
The aim of this work is to characterize the counterpart to the INTErnational Gamma-Ray Astrophysics Laboratory high-mass X-ray binary candidate IGR J19294+1816 so as to establish its true nature. We obtained H-band spectra of the selected counterpart acquired with the Near Infrared Camera and Spectrograph instrument mounted on the Telescopio Nazionale Galileo 3.5-m telescope, which represents the first infrared spectrum ever taken of this source. We complement the spectral analysis with infrared photometry from the UKIDSS, 2MASS, WISE, and NEOWISE databases. We classify the mass donor as a Be star. Subsequently, we compute its distance by properly taking into account the contamination produced by the circumstellar envelope. The findings indicate that IGR J19294+1816 is a transient source with a B1Ve donor at a distance of d = 11 ± 1 kpc, and luminosities of the order of 10^36-10^37 erg s^-1, displaying the typical behaviour of a Be-X-ray binary.
NASA Astrophysics Data System (ADS)
Meola, Joseph; Absi, Anthony; Islam, Mohammed N.; Peterson, Lauren M.; Ke, Kevin; Freeman, Michael J.; Ifaraguerri, Agustin I.
2014-06-01
Hyperspectral imaging systems are currently used for numerous activities related to spectral identification of materials. These passive imaging systems rely on naturally reflected/emitted radiation as the source of the signal. Thermal infrared systems measure radiation emitted from objects in the scene. As such, they can operate at both day and night. However, visible through shortwave infrared systems measure solar illumination reflected from objects. As a result, their use is limited to daytime applications. Omni Sciences has produced high powered broadband shortwave infrared super-continuum laser illuminators. A 64-watt breadboard system was recently packaged and tested at Wright-Patterson Air Force Base to gauge beam quality and to serve as a proof-of-concept for potential use as an illuminator for a hyperspectral receiver. The laser illuminator was placed in a tower and directed along a 1.4km slant path to various target materials with reflected radiation measured with both a broadband camera and a hyperspectral imaging system to gauge performance.
Thermographic imaging for high-temperature composite materials: A defect detection study
NASA Technical Reports Server (NTRS)
Roth, Don J.; Bodis, James R.; Bishop, Chip
1995-01-01
The ability of a thermographic imaging technique for detecting flat-bottom hole defects of various diameters and depths was evaluated in four composite systems (two types of ceramic matrix composites, one metal matrix composite, and one polymer matrix composite) of interest as high-temperature structural materials. The holes ranged from 1 to 13 mm in diameter and 0.1 to 2.5 mm in depth in samples approximately 2-3 mm thick. The thermographic imaging system utilized a scanning mirror optical system and infrared (IR) focusing lens in conjunction with a mercury cadmium telluride infrared detector element to obtain high resolution infrared images. High intensity flash lamps located on the same side as the infrared camera were used to heat the samples. After heating, up to 30 images were sequentially acquired at 70-150 msec intervals. Limits of detectability based on depth and diameter of the flat-bottom holes were defined for each composite material. Ultrasonic and radiographic images of the samples were obtained and compared with the thermographic images.
Early Science Results from SOFIA, the World's Largest Airborne Observatory
NASA Astrophysics Data System (ADS)
De Buizer, J.
2012-09-01
The Stratospheric Observatory for Infrared Astronomy, or SOFIA, is the largest flying observatory ever built, consisting of a 2.7-meter diameter telescope embedded in a modified Boeing 747-SP aircraft. SOFIA is a joint project between NASA and the German Aerospace Center, Deutsches Zentrum für Luft- und Raumfahrt. By flying at altitudes up to 45,000 feet, the observatory gets above 99.9% of the infrared-absorbing water vapor in the Earth's atmosphere. This opens up an almost uninterrupted wavelength range from 0.3-1600 microns that is in large part obscured from ground-based observatories. Since its 'Initial Science Flight' in December 2010, SOFIA has flown several dozen science flights, and has observed a wide array of objects from Solar System bodies, to stellar nurseries, to distant galaxies. This talk will review some of the exciting new science results from these first flights which were made by three instruments: the mid-infrared camera FORCAST, the far-infrared heterodyne spectrometer GREAT, and the optical occultation photometer HIPO.
The PALM-3000 high-order adaptive optics system for Palomar Observatory
NASA Astrophysics Data System (ADS)
Bouchez, Antonin H.; Dekany, Richard G.; Angione, John R.; Baranec, Christoph; Britton, Matthew C.; Bui, Khanh; Burruss, Rick S.; Cromer, John L.; Guiwits, Stephen R.; Henning, John R.; Hickey, Jeff; McKenna, Daniel L.; Moore, Anna M.; Roberts, Jennifer E.; Trinh, Thang Q.; Troy, Mitchell; Truong, Tuan N.; Velur, Viswa
2008-07-01
Deployed as a multi-user shared facility on the 5.1 meter Hale Telescope at Palomar Observatory, the PALM-3000 high-order upgrade to the successful Palomar Adaptive Optics System will deliver extreme AO correction in the near-infrared, and diffraction-limited images down to visible wavelengths, using both natural and sodium laser guide stars. Wavefront control will be provided by two deformable mirrors, a 3368 active actuator woofer and a 349 active actuator tweeter, controlled at up to 3 kHz using an innovative wavefront processor based on a cluster of 17 graphics processing units. A Shack-Hartmann wavefront sensor with selectable pupil sampling will provide high-order wavefront sensing, while an infrared tip/tilt sensor and visible truth wavefront sensor will provide low-order LGS control. Four back-end instruments are planned at first light: the PHARO near-infrared camera/spectrograph, the SWIFT visible light integral field spectrograph, Project 1640, a near-infrared coronagraphic integral field spectrograph, and 888Cam, a high-resolution visible light imager.
Variable field-of-view visible and near-infrared polarization compound-eye endoscope.
Kagawa, K; Shogenji, R; Tanaka, E; Yamada, K; Kawahito, S; Tanida, J
2012-01-01
A multi-functional compound-eye endoscope enabling variable field-of-view and polarization imaging as well as extremely deep focus is presented, based on a compact compound-eye camera called TOMBO (thin observation module by bound optics). Fixed and movable mirrors are introduced to control the field of view. A metal-wire-grid polarizer thin film, applicable to both visible and near-infrared light, is attached to the lenses in TOMBO and to the light sources. Control of the field of view, polarization and wavelength of the illumination enables several observation modes such as three-dimensional shape measurement, wide field-of-view imaging, and close-up observation of the superficial tissues and structures beneath the skin.
Maturity assessment of harumanis mango using thermal camera sensor
NASA Astrophysics Data System (ADS)
Sa'ad, F. S. A.; Shakaff, A. Y. Md.; Zakaria, A.; Abdullah, A. H.; Ibrahim, M. F.
2017-03-01
The perceived quality of fruits such as mangoes is greatly dependent on many parameters such as ripeness, shape and size, and is influenced by other factors such as harvesting time. Unfortunately, manual fruit grading has several drawbacks such as subjectivity, tediousness and inconsistency. Automating the procedure, as well as developing new classification techniques, may solve these problems. This paper presents novel work on using infrared imaging as a tool for quality monitoring of Harumanis mangoes. The histogram of the infrared image was used to distinguish and classify the level of ripeness of the fruits, based on the colour spectrum, by week. The proposed approach using thermal data was able to achieve 90.5% correct classification.
NASA Astrophysics Data System (ADS)
Olson, Craig; Theisen, Michael; Pace, Teresa; Halford, Carl; Driggers, Ronald
2016-05-01
The mission of an Infrared Search and Track (IRST) system is to detect and locate (sometimes called find and fix) enemy aircraft at significant ranges. Two extreme opposite examples of IRST applications are 1) long range offensive aircraft detection when electronic warfare equipment is jammed, compromised, or intentionally turned off, and 2) distributed aperture systems where enemy aircraft may be in the proximity of the host aircraft. Past IRST systems have been primarily long range offensive systems that were based on the LWIR second generation thermal imager. The new IRST systems are primarily based on staring infrared focal planes and sensors. In the same manner that FLIR92 did not work well in the design of staring infrared cameras (NVTherm was developed to address staring infrared sensor performance), current modeling techniques do not adequately describe the performance of a staring IRST sensor. There are no standard military IRST models (per AFRL and NAVAIR), and each program appears to perform their own modeling. For this reason, L-3 has decided to develop a corporate model, working with AFRL and NAVAIR, for the analysis, design, and evaluation of IRST concepts, programs, and solutions. This paper provides some of the first analyses in the L-3 IRST model development program for the optimization of staring IRST sensors.
A mobile laboratory for surface and subsurface imaging in geo-hazard monitoring activity
NASA Astrophysics Data System (ADS)
Cornacchia, Carmela; Bavusi, Massimo; Loperte, Antonio; Pergola, Nicola; Pignatti, Stefano; Ponzo, Felice; Lapenna, Vincenzo
2010-05-01
A new research infrastructure for supporting ground-based remote sensing observations in the different phases of the geo-risk management cycle is presented. This instrumental facility has been designed and realised by TeRN, a public-private consortium on Earth observation and natural risks, in the frame of the project "ImpresAmbiente" funded by the Italian Ministry of Research and University. The new infrastructure is equipped with ground-based sensors (hyperspectral cameras, thermal cameras, laser scanning and electromagnetic antennae) able to remotely map physical parameters and/or earth-surface properties (temperature, soil moisture, land cover, etc.) and to illuminate near-surface geological structures (faults, groundwater tables, landslide bodies, etc.). Furthermore, the system can be used for non-invasive investigations of architectural buildings and civil infrastructures (bridges, tunnels, road pavements, etc.) affected by natural and man-made hazards. The hyperspectral cameras can acquire high-resolution images of the earth surface and of cultural objects. They operate in the Visible Near InfraRed (0.4-1.0 μm) with 1600 spatial pixels and 3.7 nm spectral sampling, and in the Short Wave InfraRed (1.3-2.5 μm) spectral region with 320 spatial pixels and 5 nm spectral sampling. The IR cameras operate in the Medium Wavelength InfraRed (3-5 μm; 640x512; NETD < 20 mK) and in the Very Long Wavelength InfraRed region (7.7-11.5 μm; 320x256; NETD < 25 mK) with a frame rate higher than 100 Hz, and both are equipped with a set of optical filters in order to operate in multi-spectral configuration. The technological innovation of ground-based laser scanning equipment has led to increased survey resolution, with applications in several fields such as geology, architecture, environmental monitoring and cultural heritage; as a consequence, laser data can be usefully integrated with traditional monitoring techniques. The laser scanner is characterized by a very high data acquisition rate, up to 500,000 pxl/sec, with a range resolution of 0.1 mm and vertical and horizontal fields of view of 310° and 360° respectively, with an angular resolution of 0.0018°. The system is also equipped with a metric camera that allows georeferencing of the acquired high-resolution images. The electromagnetic sensors allow obtaining, in near real time, high-resolution 2D and 3D subsurface tomographic images. The main components are a fully automatic resistivity meter for DC electrical (resistivity) and induced polarization surveys, a ground penetrating radar with antennas covering the range from 400 MHz to 1.5 GHz, and a gradiometric magnetometric system. All the sensors can be installed on a mobile van and remotely controlled using wi-fi technologies. An all-time network connection capability is guaranteed by a self-configurable satellite link for data communication, which allows near-real-time transmission of experimental data coming from the field surveys and sharing of other geospatial information. This ICT facility is well suited for emergency response activities during and after catastrophic events. Sensor synergy and multi-temporal, multi-scale resolutions of surface and sub-surface imaging are the key technical features of this instrumental facility. Finally, in this work we briefly present some first preliminary results obtained during the emergency phase of the Abruzzo earthquake (central Italy).
The Sensor Irony: How Reliance on Sensor Technology is Limiting Our View of the Battlefield
2010-05-10
[Report excerpt fragments: both sensor systems (including the Wescam 14TS) carry an electro-optical (daylight) TV camera, an infrared (thermal) camera, and a laser illuminator/range finder; similar to the MQ-1, the MQ-9 Reaper is primarily a strike asset for emerging targets.]
1994-07-10
TEMPUS, an electromagnetic levitation facility that allows containerless processing of metallic samples in microgravity, first flew on the IML-2 Spacelab mission. The principle of electromagnetic levitation is commonly used in ground-based experiments to melt and then cool metallic melts below their freezing points without solidification occurring. The TEMPUS operation is controlled by its own microprocessor system, although commands may be sent remotely from the ground and real-time adjustments may be made by the crew. Two video cameras, a two-color pyrometer for measuring sample temperatures, and a fast infrared detector for monitoring solidification spikes will be mounted to the process chamber to facilitate observation and analysis. In addition, a dedicated high-resolution video camera can be attached to the TEMPUS to measure the sample volume precisely.
Stop outbreak of SARS with infrared cameras
NASA Astrophysics Data System (ADS)
Wu, Yigang M.
2004-04-01
SARS (Severe Acute Respiratory Syndrome, commonly known as Atypical Pneumonia in mainland China) affected 8422 people and caused 918 deaths worldwide within half a year. The disease can be transmitted by respiratory droplets or by contact with a patient's respiratory secretions, which means it can spread very rapidly through public transportation by travelers carrying the syndrome. The challenge was to stop SARS carriers from traveling by trains, airplanes, coaches, etc. It is impractical with traditional oral thermometers or spot infrared thermometers to screen, within hours, the tens of travelers with elevated body temperature out of thousands of normal travelers. A thermal imager with a temperature measurement function is a logical choice for this special application, although there are some limitations and drawbacks. This paper discusses real SARS screening applications of industrial infrared cameras in China from April to July 2003.
Forest fire autonomous decision system based on fuzzy logic
NASA Astrophysics Data System (ADS)
Lei, Z.; Lu, Jianhua
2010-11-01
The proposed system integrates GPS/pseudolite/IMU and a thermal camera in order to autonomously process the images through identification, extraction and tracking of forest fires or hot spots. The airborne detection platform, the graph-based algorithms and the signal processing framework are analyzed in detail; in particular, the rules of the decision function are expressed in terms of fuzzy logic, which is an appropriate method to express imprecise knowledge. The membership functions and the weights of the rules are fixed through a supervised learning process. The perception system in this paper is based on a network of sensorial stations and central stations. The sensorial stations collect data including infrared and visual images and meteorological information. The central stations exchange data to perform distributed analysis. The experimental results show that the working procedure of the detection system is reasonable and that it can accurately output the detection alarm and the computation of infrared oscillations.
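A generic Python sketch of the weighted fuzzy-rule evaluation described above is given below; the input variables, triangular membership breakpoints and rule weights are invented for illustration and would in practice come from the supervised learning step.

    # A small, generic sketch of the fuzzy decision idea described above (membership
    # functions plus rule weights learned by supervision). The variables, membership
    # breakpoints and weights are invented for illustration only.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fire_alarm_degree(ir_intensity, smoke_index, temperature_C):
        # Rule activations (min as fuzzy AND)
        hot = tri(temperature_C, 40.0, 80.0, 120.0)
        bright_ir = tri(ir_intensity, 0.4, 0.8, 1.0)
        smoky = tri(smoke_index, 0.3, 0.7, 1.0)
        rules = [
            (0.6, min(hot, bright_ir)),      # rule 1: hot AND bright in IR
            (0.3, min(bright_ir, smoky)),    # rule 2: bright in IR AND smoky
            (0.1, hot),                      # rule 3: hot alone (weak evidence)
        ]
        total_w = sum(w for w, _ in rules)
        return sum(w * act for w, act in rules) / total_w   # weighted-average defuzzification

    if __name__ == "__main__":
        print("alarm degree:", round(fire_alarm_degree(0.85, 0.6, 95.0), 3))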
Cloud cameras at the Pierre Auger Observatory
NASA Astrophysics Data System (ADS)
Winnick, Michael G.
2010-06-01
This thesis presents the results of measurements made by infrared cloud cameras installed at the Pierre Auger Observatory in Argentina. These cameras were used to record cloud conditions during operation of the observatory's fluorescence detectors. As cloud may affect the measurement of fluorescence from cosmic ray extensive air showers, the cloud cameras provide a record of which measurements have been interfered with by cloud. Several image processing algorithms were developed, along with a methodology for the detection of cloud within infrared images taken by the cloud cameras. A graphical user interface (GUI) was developed to expedite this, as a large number of images need to be checked for cloud. A cross-check between images recorded by three of the observatory's cloud cameras is presented, along with a comparison with independent cloud measurements made by LIDAR. Despite the cloud cameras and LIDAR observing different areas of the sky, a good agreement is observed in the measured cloud fraction between the two instruments, particularly on very clear and overcast nights. Cloud information recorded by the cloud cameras, with cloud height information measured by the LIDAR, was used to identify those extensive air showers that were obscured by cloud. These events were used to study the effectiveness of standard quality cuts at removing cloud-afflicted events. Of all of the standard quality cuts studied in this thesis, the LIDAR cloud fraction cut was the most effective at preferentially removing cloud-obscured events. A 'cloudy pixel' veto is also presented, whereby cloud-obscured measurements are excluded during the standard hybrid analysis and new extensive air shower reconstruction parameters are determined. The application of such a veto would provide a slight increase in the number of events available for higher level analysis.
The LST scientific instruments
NASA Technical Reports Server (NTRS)
Levin, G. M.
1975-01-01
Seven scientific instruments are presently being studied for use with the Large Space Telescope (LST). These instruments are the F/24 Field Camera, the F/48-F/96 Planetary Camera, the High Resolution Spectrograph, the Faint Object Spectrograph, the Infrared Photometer, and the Astrometer. These instruments are being designed as facility instruments to be replaceable during the life of the Observatory.
Intraoperative near-infrared autofluorescence imaging of parathyroid glands.
Ladurner, Roland; Sommerey, Sandra; Arabi, Nora Al; Hallfeldt, Klaus K J; Stepp, Herbert; Gallwas, Julia K S
2017-08-01
To identify parathyroid glands intraoperatively by exposing their autofluorescence using near-infrared light. Fluorescence imaging was carried out during minimally invasive and open parathyroid and thyroid surgery. After identification, the parathyroid glands as well as the surrounding tissue were exposed to near-infrared (NIR) light with a wavelength of 690-770 nm using a modified Karl Storz near-infrared/indocyanine green (NIR/ICG) endoscopic system. Parathyroid tissue was expected to show near-infrared autofluorescence, captured in the blue channel of the camera. Whenever possible the visual identification of parathyroid tissue was confirmed histologically. In preliminary investigations, using the original NIR/ICG endoscopic system we noticed considerable interference of light in the blue channel overlying the autofluorescence. Therefore, we modified the light source by interposing additional filters. In a second series, we investigated 35 parathyroid glands from 25 patients. Twenty-seven glands were identified correctly based on NIR autofluorescence. Regarding the extent of autofluorescence, there were no noticeable differences between parathyroid adenomas, hyperplasia and normal parathyroid glands. In contrast, thyroid tissue, lymph nodes and adipose tissue revealed no substantial autofluorescence. Parathyroid tissue is characterized by showing autofluorescence in the near-infrared spectrum. This effect can be used to distinguish parathyroid glands from other cervical tissue entities.
Imaging spectroscopy using embedded diffractive optical arrays
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford
2017-09-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera based on diffractive optic arrays. This approach to hyperspectral imaging has been demonstrated in all three infrared bands: SWIR, MWIR and LWIR. The hyperspectral optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of this infrared hyperspectral sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics made up of an area array of diffractive optical elements, where each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system small enough to serve as a payload on a small satellite, mini-UAV or commercial quadcopter, or to be man-portable. We also describe an application in which this spectral imaging technology can easily be used to quantify the mass and volume flow rates of hydrocarbon gases. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. The detector array is divided into sub-images, one per lenslet. We have developed systems using different numbers of lenslets in the area array; the size of the focal plane and the diameter of the lenslet array determine the number of different spectral images collected simultaneously in each frame of the camera. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame. This system spans the SWIR and MWIR bands with a single optical array and focal plane array.
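The sub-image arithmetic quoted above (2 x 2 lenslets on a 512 x 512 FPA, 4 x 4 on a 1024 x 1024 FPA) reduces to a one-line calculation, sketched below in Python for clarity.

    # Back-of-the-envelope sketch of the sub-image arithmetic quoted above: an N x N
    # lenslet array over an FPA of P x P pixels yields N*N simultaneous spectral images,
    # each of (P // N) x (P // N) pixels.
    def sub_image_layout(fpa_pixels, lenslets_per_side):
        bands = lenslets_per_side ** 2
        sub_res = fpa_pixels // lenslets_per_side
        return bands, (sub_res, sub_res)

    if __name__ == "__main__":
        for fpa, n in [(512, 2), (1024, 4)]:
            bands, res = sub_image_layout(fpa, n)
            print(f"{n} x {n} lenslets on {fpa} x {fpa} FPA -> "
                  f"{bands} spectral images of {res[0]} x {res[1]} pixels")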
Winter precipitation particle size distribution measurement by Multi-Angle Snowflake Camera
NASA Astrophysics Data System (ADS)
Huang, Gwo-Jong; Kleinkort, Cameron; Bringi, V. N.; Notaroš, Branislav M.
2017-12-01
From the radar meteorology viewpoint, the most important properties for quantitative precipitation estimation of winter events are the 3D shape, size, and mass of precipitation particles, as well as the particle size distribution (PSD). In order to measure these properties precisely, optical instruments may be the best choice. The Multi-Angle Snowflake Camera (MASC) is a relatively new instrument equipped with three high-resolution cameras to capture winter precipitation particle images from three non-parallel angles, in addition to measuring the particle fall speed using two pairs of infrared motion sensors. However, the results from the MASC have so far usually been presented monthly or seasonally, with particle sizes given as histograms; no previous studies have used the MASC for a single-storm study, and none have used the MASC to measure the PSD. We propose a methodology for obtaining the winter precipitation PSD measured by the MASC, and present and discuss the development, implementation, and application of the new technique for PSD computation based on MASC images. Overall, this is the first study of the MASC-based PSD. We present PSD MASC experiments and results for segments of two snow events to demonstrate the performance of our PSD algorithm. The results show that the self-consistency of the MASC-measured single-camera PSDs is good. To cross-validate PSD measurements, we compare the MASC mean PSD (averaged over three cameras) with the collocated 2D Video Disdrometer, and observe good agreement between the two sets of results.
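A generic disdrometer-style PSD computation, not the authors' MASC algorithm, can be sketched as follows: bin the measured particle sizes and normalize by an assumed sampling volume and the bin width.

    # Generic sketch of a disdrometer-style PSD computation (not the authors' MASC
    # algorithm): bin the measured particle maximum dimensions and normalize by an
    # assumed sampling volume and the bin width to get N(D) in m^-3 mm^-1.
    import numpy as np

    def compute_psd(diameters_mm, bin_edges_mm, sample_volume_m3):
        counts, edges = np.histogram(diameters_mm, bins=bin_edges_mm)
        widths = np.diff(edges)                       # bin widths [mm]
        return counts / (sample_volume_m3 * widths)   # N(D) [m^-3 mm^-1]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        sizes = rng.gamma(shape=2.0, scale=1.5, size=500)        # fake particle sizes [mm]
        edges = np.arange(0.0, 12.5, 0.5)                        # 0.5 mm bins
        psd = compute_psd(sizes, edges, sample_volume_m3=0.02)   # assumed sampling volume
        print(psd[:6])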
High-performance camera module for fast quality inspection in industrial printing applications
NASA Astrophysics Data System (ADS)
Fürtler, Johannes; Bodenstorfer, Ernst; Mayer, Konrad J.; Brodersen, Jörg; Heiss, Dorothea; Penz, Harald; Eckel, Christian; Gravogl, Klaus; Nachtnebel, Herbert
2007-02-01
Today, printing products which must meet the highest quality standards, e.g., banknotes, stamps, or vouchers, are automatically checked by optical inspection systems. Typically, the examination of fine details of the print or security features demands images taken from various perspectives, with different spectral sensitivity (visible, infrared, ultraviolet), and with high resolution. Consequently, the inspection system is equipped with several cameras and has to cope with an enormous data rate to be processed in real time. Hence, it is desirable to move image processing tasks into the camera to reduce the amount of data which has to be transferred to the (central) image processing system. The idea is to transfer relevant information only, i.e., features of the image instead of the raw image data from the sensor. These features are then further processed. In this paper a color line-scan camera for line rates up to 100 kHz is presented. The camera is based on a commercial CMOS (complementary metal oxide semiconductor) area image sensor and a field programmable gate array (FPGA). It implements extraction of image features which are well suited to detect print flaws like blotches of ink, color smears, splashes, spots and scratches. The camera design and several image processing methods implemented on the FPGA are described, including flat field correction, compensation of geometric distortions, color transformation, as well as decimation and neighborhood operations.
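Of the FPGA preprocessing steps listed above, flat-field correction is the most standard; the Python sketch below shows a common textbook form of it under the assumption that dark and white reference lines have been captured beforehand (the camera's actual implementation may differ).

    # Illustrative flat-field correction sketch (one of the preprocessing steps named in
    # the abstract). Dark and bright reference frames are assumed to have been captured
    # beforehand; the gain normalization shown is a common textbook form, not necessarily
    # the camera's exact FPGA implementation.
    import numpy as np

    def flat_field_correct(raw, dark, bright, eps=1e-6):
        """corrected = (raw - dark) / (bright - dark), rescaled to the mean response."""
        gain = bright.astype(float) - dark.astype(float)
        corrected = (raw.astype(float) - dark) / np.maximum(gain, eps)
        return corrected * gain.mean()

    if __name__ == "__main__":
        dark = np.full((1, 2048), 12.0)                    # dark reference line
        bright = 200.0 + 20.0 * np.random.rand(1, 2048)    # white reference line
        raw = dark + 0.5 * (bright - dark)                 # a line at 50% reflectance
        print(flat_field_correct(raw, dark, bright).mean())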
A GRAND VIEW OF THE BIRTH OF 'HEFTY' STARS - 30 DORADUS NEBULA MONTAGE
NASA Technical Reports Server (NTRS)
2002-01-01
This picture, taken in visible light with the Hubble Space Telescope's Wide Field and Planetary Camera 2 (WFPC2), represents a sweeping view of the 30 Doradus Nebula. But Hubble's infrared camera - the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) - has probed deeper into smaller regions of this nebula to unveil the stormy birth of massive stars. The montages of images in the upper left and upper right represent this deeper view. Each square in the montages is 15.5 light-years (19 arcseconds) across. The brilliant cluster R136, containing dozens of very massive stars, is at the center of this image. The infrared and visible-light views reveal several dust pillars that point toward R136, some with bright stars at their tips. One of them, at left in the visible-light image, resembles a fist with an extended index finger pointing directly at R136. The energetic radiation and high-speed material emitted by the massive stars in R136 are responsible for shaping the pillars and causing the heads of some of them to collapse, forming new stars. The infrared montage at upper left is enlarged in an accompanying image. Credits for NICMOS montages: NASA/Nolan Walborn (Space Telescope Science Institute, Baltimore, Md.) and Rodolfo Barba' (La Plata Observatory, La Plata, Argentina) Credits for WFPC2 image: NASA/John Trauger (Jet Propulsion Laboratory, Pasadena, Calif.) and James Westphal (California Institute of Technology, Pasadena, Calif.)
The Earth and Moon As Seen by 2001 Mars Odyssey's Thermal Emission Imaging System
NASA Technical Reports Server (NTRS)
2001-01-01
2001 Mars Odyssey's Thermal Emission Imaging System (THEMIS) took this portrait of the Earth and its companion Moon, using the infrared camera, one of two cameras in the instrument. It was taken at a distance of 3,563,735 kilometers (more than 2 million miles) on April 19, 2001 as the 2001 Mars Odyssey spacecraft left the Earth. From this distance and perspective the camera was able to acquire an image that directly shows the true distance from the Earth to the Moon. The Earth's diameter is about 12,750 km, and the distance from the Earth to the Moon is about 385,000 km, corresponding to 30 Earth diameters. The dark region seen on Earth in the infrared temperature image is the cold south pole, with a temperature of minus 50 degrees Celsius (minus 58 degrees Fahrenheit). The small bright region above it is warm Australia. This image was acquired using the 9.1 μm infrared filter, one of nine filters that the instrument will use to map the mineral composition and temperature of the martian surface. From this great distance, each picture element (pixel) in the image corresponds to a region 900 by 900 kilometers or greater in size, or about the size of the state of Texas. Once Odyssey reaches Mars orbit, each infrared pixel will cover a region only 100 by 100 meters on the surface, about the size of a major league baseball field.
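The quoted pixel footprints are consistent with simple proportional scaling of the ground sample distance with range; the sketch below checks this under the assumption that the 100 m pixel applies from a roughly 400 km mapping orbit.

    # Quick arithmetic sketch of the pixel-footprint statement above: the ground sample
    # distance scales linearly with range for a fixed instantaneous field of view (IFOV).
    # The IFOV implied here (100 m pixels from an assumed ~400 km orbit) is used to check
    # the quoted ~900 km footprint at the Earth-Moon portrait distance.
    def ground_sample_distance(range_km, ifov_rad):
        return range_km * ifov_rad          # footprint in km

    if __name__ == "__main__":
        ifov = 0.100 / 400.0                # ~0.25 mrad, assuming 100 m pixels from 400 km
        for r_km in (400.0, 3_563_735.0):
            gsd = ground_sample_distance(r_km, ifov)
            print(f"range {r_km:>12.0f} km -> pixel footprint {gsd:8.1f} km")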
NASA Astrophysics Data System (ADS)
Miller, R.; Lintz, H. E.; Thomas, C. K.; Salino-Hugg, M. J.; Niemeier, J. J.; Kruger, A.
2014-12-01
Budburst, the initiation of annual growth in plants, is sensitive to climate and is used to monitor physiological responses to climate change. Accurately forecasting budburst response to these changes demands an understanding of the drivers of budburst. Current research and predictive models focus on population or landscape-level drivers, yet fundamental questions regarding drivers of budburst diversity within an individual tree remain unanswered. We hypothesize that foliar temperature, an important physiological property, may be a dominant driver of differences in the timing of budburst within a single tree. Studying these differences facilitates development of high throughput phenotyping technology used to improve predictive budburst models. We present spatial and temporal variation in foliar temperature as a function of physical drivers culminating in a single-tree budburst model based on foliar temperature. We use a novel remote sensing approach, combined with on-site meteorological measurements, to demonstrate important intra-canopy differences between air and foliar temperature. We mounted a thermal infrared camera within an old-growth canopy at the H.J. Andrews LTER forest and imaged an 8m by 10.6m section of a Douglas-fir crown. Sampling one image per minute, approximately 30,000 thermal infrared images were collected over a one-month period to approximate foliar temperature before, during and after budburst. Using time-lapse photography in the visible spectrum, we documented budburst at fifteen-minute intervals with eight cameras stratified across the thermal infrared camera's field of view. Within the imaged tree's crown, we installed a pyranometer, 2D sonic anemometer and fan-aspirated thermohygrometer and collected 3,000 measurements of net shortwave radiation, wind speed, air temperature and relative humidity. We documented a difference of several days in the timing of budburst across both vertical and horizontal gradients. We also observed clear spatial and temporal foliar temperature gradients. In addition to exploring physical drivers of budburst, this remote sensing approach provides insight into intra-canopy structural complexity and opportunities to advance our understanding of vegetation-atmospheric interactions.
Night vision imaging system design, integration and verification in spacecraft vacuum thermal test
NASA Astrophysics Data System (ADS)
Shang, Yonghong; Wang, Jing; Gong, Zhe; Li, Xiyuan; Pei, Yifei; Bai, Tingzhu; Zhen, Haijing
2015-08-01
The purposes of a spacecraft vacuum thermal test are to characterize the thermal control systems of the spacecraft and its components in the cruise configuration and to allow early retirement of risks associated with mission-specific and novel thermal designs. The orbital heat flux is simulated by infrared lamps, infrared cages or electric heaters. As infrared cages and electric heaters do not emit visible light, and infrared lamps emit only limited visible light, an ordinary camera cannot operate during the test due to the low luminous density. Moreover, some special instruments such as satellite-borne infrared sensors are sensitive to visible light, so supplementary lighting cannot be used during the test. To improve fine monitoring of the spacecraft and visualization of test progress under ultra-low luminous density conditions, a night vision imaging system was designed and integrated by BISEE. The system consists of a high-gain image-intensified ICCD camera, an assistant illumination system, a glare protection system, a thermal control system and a computer control system. Multi-frame accumulation target detection technology is adopted for high-quality image recognition in the captive test. The optical, mechanical and electrical systems are designed and integrated to be highly adaptable to the vacuum environment, and a molybdenum/polyimide thin-film electrical heater controls the temperature of the ICCD camera. The results of the performance validation test show that the system can operate in a vacuum thermal environment of 1.33×10^-3 Pa and 100 K shroud temperature in the space environment simulator, with its working temperature maintained at 5° during the two-day test. The night vision imaging system achieves a resolving power of 60 lp/mm in the video imagery.
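Multi-frame accumulation, the technique named above for low-light image recognition, can be illustrated with a minimal Python sketch; the frame count and noise level are arbitrary assumptions, and real systems must also register the frames.

    # Minimal sketch of multi-frame accumulation as described above: averaging N aligned
    # low-light frames suppresses uncorrelated noise by roughly sqrt(N). The noise level
    # and frame count are illustrative.
    import numpy as np

    def accumulate(frames):
        """Average a stack of co-registered frames to raise the signal-to-noise ratio."""
        return np.mean(np.stack(frames, axis=0), axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        scene = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))          # synthetic low-light scene
        frames = [scene + rng.normal(0.0, 0.2, scene.shape) for _ in range(16)]
        single_noise = np.std(frames[0] - scene)
        stacked_noise = np.std(accumulate(frames) - scene)
        print(f"single-frame noise {single_noise:.3f} -> 16-frame accumulation {stacked_noise:.3f}")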
Influence of coolant tube curvature on film cooling effectiveness as detected by infrared imagery
NASA Technical Reports Server (NTRS)
Papell, S. S.; Graham, R. W.; Cageao, R. P.
1979-01-01
Thermal film cooling footprints observed by infrared imagery from straight, curved, and looped coolant tube geometries are compared. It was hypothesized that the differences in secondary flow and in the turbulence structure of flow through these three tubes should influence the mixing properties between the coolant and the main stream. A flow visualization tunnel, an infrared camera and detector, and a Hilsch tube were employed to test the hypothesis.
Multimodel Kalman filtering for adaptive nonuniformity correction in infrared sensors.
Pezoa, Jorge E; Hayat, Majeed M; Torres, Sergio N; Rahman, Md Saifur
2006-06-01
We present an adaptive technique for the estimation of nonuniformity parameters of infrared focal-plane arrays that is robust with respect to changes and uncertainties in scene and sensor characteristics. The proposed algorithm is based on using a bank of Kalman filters in parallel. Each filter independently estimates state variables comprising the gain and the bias matrices of the sensor, according to its own dynamic-model parameters. The supervising component of the algorithm then generates the final estimates of the state variables by forming a weighted superposition of all the estimates rendered by each Kalman filter. The weights are computed and updated iteratively, according to the a posteriori-likelihood principle. The performance of the estimator and its ability to compensate for fixed-pattern noise is tested using both simulated and real data obtained from two cameras operating in the mid- and long-wave infrared regime.
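A toy scalar illustration of the multimodel idea, not the paper's estimator, is sketched below: several Kalman filters with different process-noise settings run in parallel, and their state estimates are fused with weights derived from the likelihood of their innovations.

    # A toy, scalar illustration (not the paper's estimator) of the multimodel idea:
    # run several Kalman filters with different dynamic-model parameters in parallel,
    # weight each by the likelihood of its innovations, and fuse the state estimates.
    import math
    import random

    class ScalarKalman:
        def __init__(self, q, r, x0=0.0, p0=1.0):
            self.q, self.r, self.x, self.p = q, r, x0, p0
            self.log_like = 0.0

        def step(self, z):
            # Predict (random-walk state model), then update with measurement z.
            self.p += self.q
            s = self.p + self.r                       # innovation variance
            innov = z - self.x
            k = self.p / s                            # Kalman gain
            self.x += k * innov
            self.p *= (1.0 - k)
            self.log_like += -0.5 * (math.log(2.0 * math.pi * s) + innov ** 2 / s)

    def fused_estimate(filters):
        m = max(f.log_like for f in filters)
        weights = [math.exp(f.log_like - m) for f in filters]      # a posteriori weights
        total = sum(weights)
        return sum(w * f.x for w, f in zip(weights, filters)) / total

    if __name__ == "__main__":
        random.seed(0)
        truth, bank = 0.3, [ScalarKalman(q, r=0.05) for q in (1e-4, 1e-2, 1e-1)]
        for _ in range(200):
            truth += random.gauss(0.0, 0.01)                        # slowly drifting bias
            z = truth + random.gauss(0.0, 0.05 ** 0.5)
            for f in bank:
                f.step(z)
        print("fused bias estimate:", round(fused_estimate(bank), 3), "truth:", round(truth, 3))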
NASA Technical Reports Server (NTRS)
Leviton, Douglas; Frey, Bradley
2005-01-01
The current refractive optical design of the James Webb Space Telescope (JWST) Near Infrared Camera (NIRCam) uses three infrared materials in its lenses: LiF, BaF2, and ZnSe. In order to provide the instrument's optical designers with accurate, heretofore unavailable data for absolute refractive index based on actual cryogenic measurements, two prismatic samples of each material were measured using the cryogenic, high accuracy, refraction measuring system (CHARMS) at NASA GSFC, densely covering the temperature range from 15 to 320 K and the wavelength range from 0.4 to 5.6 microns. Measurement methods are discussed, and graphical and tabulated data for absolute refractive index, dispersion, and thermo-optic coefficient for these three materials are presented along with estimates of uncertainty. Coefficients for second-order polynomial fits of measured index to temperature are provided for many wavelengths to allow accurate interpolation of index to other wavelengths and temperatures.
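A minimal sketch of how such second-order polynomial fits could be used, assuming one has tabulated (temperature, index) pairs for a given material and wavelength (the sample values below are placeholders, not CHARMS data): fit n(T) = a + bT + cT² and interpolate to an intermediate temperature.

```python
import numpy as np

# Placeholder calibration points (temperature in K, absolute index); not CHARMS data.
T = np.array([20.0, 60.0, 100.0, 150.0, 200.0, 250.0, 300.0])
n = np.array([1.4330, 1.4331, 1.4334, 1.4340, 1.4349, 1.4360, 1.4373])

coeffs = np.polyfit(T, n, deg=2)          # second-order polynomial in temperature
n_at_35K = np.polyval(coeffs, 35.0)       # interpolate index at 35 K
print(f"n(35 K) = {n_at_35K:.5f}")
```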
Analysis of the temperature of the hot tool in the cut of woven fabric using infrared images
NASA Astrophysics Data System (ADS)
Borelli, Joao E.; Verderio, Leonardo A.; Gonzaga, Adilson; Ruffino, Rosalvo T.
2001-03-01
Textile manufacturing occupies a prominent place in the national economy. Owing to its importance, research has been conducted on the development of new materials, equipment and methods used in the production process. The cutting of fabric is a basic stage, followed by the remaining steps in the making of clothes and other articles. In the hot cutting of fabric, one of the most important variables for process control is the contact temperature between the tool and the fabric. This work presents a technique for measuring that temperature based on the processing of infrared images. For this purpose a system was developed, composed of an infrared camera, a frame-grabber PC board and software that analyzes the point temperature in the cut area, enabling the operator to achieve the necessary control of the other variables involved in the process.
Flight test comparison of film type SO-289 and film type 2424 in the AMPS camera
NASA Technical Reports Server (NTRS)
Perry, L.
1975-01-01
A flight test was conducted to determine the suitability of SO-289 multispectral infrared aerial film for Earth Resources' use. It was directly compared to film type 2424, infrared aerographic film, the IR film in current use. The exposure parameters for both films are given.
Taking on the Heat--A Narrative Account of How Infrared Cameras Invite Instant Inquiry
ERIC Educational Resources Information Center
Haglund, Jesper; Jeppsson, Fredrik; Schönborn, Konrad J.
2016-01-01
Integration of technology, social learning and scientific models offers pedagogical opportunities for science education. A particularly interesting area is thermal science, where students often struggle with abstract concepts, such as heat. In taking on this conceptual obstacle, we explore how hand-held infrared (IR) visualization technology can…
Bispectral infrared forest fire detection and analysis using classification techniques
NASA Astrophysics Data System (ADS)
Aranda, Jose M.; Melendez, Juan; de Castro, Antonio J.; Lopez, Fernando
2004-01-01
Infrared cameras are well established as a useful tool for fire detection, but their use for quantitative forest fire measurements faces difficulties due to the complex spatial and spectral structure of fires. In this work it is shown that some of these difficulties can be overcome by applying classification techniques, a standard tool for the analysis of satellite multispectral images, to bi-spectral images of fires. Images were acquired by two cameras operating in the medium infrared (MIR) and thermal infrared (TIR) bands. They provide simultaneous, co-registered images calibrated in brightness temperature. The MIR-TIR scatterplot of these images can be used to classify the scene into different fire regions (background, ashes, and several ember and flame regions). It is shown that classification makes it possible to obtain quantitative measurements of physical fire parameters such as rate of spread, ember temperature, and radiated power in the MIR and TIR bands. An estimate of the total radiated power and heat release per unit area is also made and compared with values derived from the heat of combustion and fuel consumption.
Application of infrared uncooled cameras in surveillance systems
NASA Astrophysics Data System (ADS)
Dulski, R.; Bareła, J.; Trzaskawka, P.; Piątkowski, T.
2013-10-01
The recent need to protect military bases, convoys and patrols has strongly driven the development of multisensor security systems for perimeter protection. One of the most important devices used in such systems is the IR camera. The paper discusses the technical possibilities and limitations of using an uncooled IR camera in a multi-sensor surveillance system for perimeter protection. Effective detection ranges depend on the class of the sensor used and on the observed scene itself. An IR camera increases the probability of intruder detection regardless of the time of day or weather conditions, while simultaneously decreasing the false alarm rate produced by the surveillance system. The role of IR cameras in the system is discussed, together with the technical possibilities for detecting a human being. Commercially available IR cameras capable of achieving the desired ranges are compared. The spatial resolution required for detection, recognition and identification was calculated. Detection ranges were simulated using a new model for predicting target acquisition performance based on the Targeting Task Performance (TTP) metric. Like its predecessor, the Johnson criteria, the new model relates range performance to image quality. The analysis is limited to the estimation of detection, recognition and identification ranges for typical thermal cameras with uncooled microbolometer focal plane arrays, the type most widely used in security systems because of its competitive price-to-performance ratio. Detection, recognition and identification ranges were calculated for devices with selected technical specifications, and the results were compared and discussed.
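As a rough illustration of how geometric range limits are commonly estimated for such cameras (a simplified Johnson-criteria-style calculation, not the TTP model used in the paper; all parameter values are illustrative): the range at which a target of critical dimension h spans N line pairs is R = h·f / (2·N·p), with focal length f and detector pitch p.

```python
def geometric_range_m(target_size_m, focal_length_m, pixel_pitch_m, cycles_required):
    """Range at which `cycles_required` line pairs fit across the target.

    One cycle (line pair) spans two detector pixels; this ignores optics blur,
    atmosphere and noise, so it is an upper bound only.
    """
    return (target_size_m * focal_length_m) / (2.0 * cycles_required * pixel_pitch_m)

# Illustrative numbers: 1.8 m person, 50 mm lens, 17 um microbolometer pitch.
for task, n_cycles in [("detection", 0.75), ("recognition", 3.0), ("identification", 6.0)]:
    print(task, round(geometric_range_m(1.8, 0.050, 17e-6, n_cycles)), "m")
```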
Image Processing Occupancy Sensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Image Processing Occupancy Sensor, or IPOS, is a novel sensor technology developed at the National Renewable Energy Laboratory (NREL). The sensor is based on low-cost embedded microprocessors widely used by the smartphone industry and leverages mature open-source computer vision software libraries. Compared to traditional passive infrared and ultrasonic-based motion sensors currently used for occupancy detection, IPOS has shown the potential for improved accuracy and a richer set of feedback signals for occupant-optimized lighting, daylighting, temperature setback, ventilation control, and other occupancy and location-based uses. Unlike traditional passive infrared (PIR) or ultrasonic occupancy sensors, which infer occupancy based only on motion, IPOS uses digital image-based analysis to detect and classify various aspects of occupancy, including the presence of occupants regardless of motion, their number, location, and activity levels, as well as the illuminance properties of the monitored space. The IPOS software leverages the recent availability of low-cost embedded computing platforms, computer vision software libraries, and camera elements.
Robert, Clélia; Michau, Vincent; Fleury, Bruno; Magli, Serge; Vial, Laurent
2012-07-02
Adaptive optics provide real-time compensation for atmospheric turbulence. The correction quality relies on a key element: the wavefront sensor. We have designed an adaptive optics system in the mid-infrared range providing high spatial resolution for ground-to-air applications, integrating a Shack-Hartmann infrared wavefront sensor operating on an extended source. This paper describes and justifies the design of the infrared wavefront sensor, while defining and characterizing the Shack-Hartmann wavefront sensor camera. Performance and illustration of field tests are also reported.
Bach, Aaron J E; Stewart, Ian B; Disher, Alice E; Costello, Joseph T
2015-01-01
Skin temperature assessment has historically been undertaken with conductive devices affixed to the skin. With the development of technology, infrared devices are increasingly utilised in the measurement of skin temperature. Therefore, our purpose was to evaluate the agreement between four skin temperature devices at rest, during exercise in the heat, and recovery. Mean skin temperature ([Formula: see text]) was assessed in thirty healthy males during 30 min rest (24.0 ± 1.2°C, 56 ± 8%), 30 min cycle in the heat (38.0 ± 0.5°C, 41 ± 2%), and 45 min recovery (24.0 ± 1.3°C, 56 ± 9%). [Formula: see text] was assessed at four sites using two conductive devices (thermistors, iButtons) and two infrared devices (infrared thermometer, infrared camera). Bland-Altman plots demonstrated mean bias ± limits of agreement between the thermistors and iButtons as follows (rest, exercise, recovery): -0.01 ± 0.04, 0.26 ± 0.85, -0.37 ± 0.98°C; thermistors and infrared thermometer: 0.34 ± 0.44, -0.44 ± 1.23, -1.04 ± 1.75°C; thermistors and infrared camera (rest, recovery): 0.83 ± 0.77, 1.88 ± 1.87°C. Pairwise comparisons of [Formula: see text] found significant differences (p < 0.05) between thermistors and both infrared devices during resting conditions, and significant differences between the thermistors and all other devices tested during exercise in the heat and recovery. These results indicate poor agreement between conductive and infrared devices at rest, during exercise in the heat, and subsequent recovery. Infrared devices may not be suitable for monitoring [Formula: see text] in the presence of, or following, metabolic and environmental induced heat stress.
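A minimal sketch of the Bland-Altman agreement statistics reported above, assuming paired mean skin temperature readings from two devices are available as arrays (the arrays below are placeholders, not the study data): the bias is the mean difference and the limits of agreement are bias ± 1.96 SD of the differences.

```python
import numpy as np

def bland_altman(a, b):
    """Return (mean bias, lower LoA, upper LoA) for paired measurements a, b."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = d.mean()
    spread = 1.96 * d.std(ddof=1)        # 95% limits-of-agreement half-width
    return bias, bias - spread, bias + spread

# Placeholder example: thermistor vs. infrared thermometer mean skin temperatures (deg C).
thermistor = [33.1, 33.4, 32.9, 33.6, 33.2]
ir_thermo  = [32.8, 33.1, 32.7, 33.0, 32.9]
print(bland_altman(thermistor, ir_thermo))
```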
NASA Astrophysics Data System (ADS)
Orton, Glenn S.; Yanamandra-Fisher, P. A.; Parrish, P. D.; Mousis, O.; Pantin, E.; Fuse, T.; Fujiyoshi, T.; Simon-Miller, A.; Morales-Juberias, R.; Tollestrup, E.; Connelley, M.; Trujillo, C.; Hora, J.; Irwin, P.; Fletcher, L.; Hill, D.; Kollmansberger, S.
2006-09-01
White Oval BA, constituted from three predecessor vortices (known as Jupiter's "classical" White Ovals) after successive mergers in 1998 and 2000, became the second-largest vortex in the atmosphere of Jupiter (and possibly the solar system) at the time of its formation. While it retains this distinction, it required a name change after a transformation between December 2005 and February 2006 that made it appear visually the same color as the Great Red Spot. Our campaign to understand the changes involved examination of the detailed color and wind field using Hubble Space Telescope instrumentation over several orbits in April. The fields of temperature, ammonia distribution and clouds were also examined using the mid-infrared VISIR camera/spectrometer on ESO's 8.2-m Very Large Telescope, the NASA Infrared Telescope Facility with the mid-infrared MIRSI instrument, and the refurbished near-infrared facility camera NSFCam2. High-resolution images of the Oval were made before the color change with the COMICS mid-infrared instrument on the 8.2-m Subaru telescope. We are using these images, together with images acquired at the IRTF and with the Gemini/North NIRI near-infrared camera between January 2005 and August 2006, to characterize the extent to which changes in storm strength (vorticity, positive vertical motion) influenced (i) the depth from which colored cloud particles may have been "dredged up", or (ii) the altitude to which particles may have been lofted and subjected to high-energy UV radiation that caused a color change, as alternative explanations for the phenomenon. The answer will provide clues to the chemistry of Jupiter's cloud system and its well-known colors in general. The behavior of Oval BA, and its interaction with the Great Red Spot in particular, is also being compared with dynamical models run with the EPIC code.
Viking site selection and certification
NASA Technical Reports Server (NTRS)
Masursky, H.; Crabill, N. L.
1981-01-01
The landing site selection and certification effort for the Viking mission to Mars is reviewed from the premission phase through the acquisition of data and decisions during mission operations and the immediate postlanding evaluation. The utility and limitations of the orbital television and infrared data and ground based radar observation of candidate and actual landing sites are evaluated. Additional instruments and types of observations which would have been useful include higher resolution cameras, radar altimeters, and terrain hazard avoidance capability in the landing system. Suggestions based on this experience that might be applied to future missions are included.
A high-sensitivity EM-CCD camera for the open port telescope cavity of SOFIA
NASA Astrophysics Data System (ADS)
Wiedemann, Manuel; Wolf, Jürgen; McGrotty, Paul; Edwards, Chris; Krabbe, Alfred
2016-08-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) has three target acquisition and tracking cameras. All three imagers originally used the same cameras, which did not meet the sensitivity requirements due to low quantum efficiency and high dark current. The Focal Plane Imager (FPI) suffered the most from high dark current, since it operated in the aircraft cabin at room temperature without active cooling. In early 2013 the FPI was upgraded with an iXon3 888 from Andor Technology. Compared to the original cameras, the iXon3 has a factor of five higher QE, thanks to its back-illuminated sensor, and orders of magnitude lower dark current, due to a thermo-electric cooler and "inverted mode operation." This leads to an increase in sensitivity of about five stellar magnitudes. The Wide Field Imager (WFI) and Fine Field Imager (FFI) shall now be upgraded with equally sensitive cameras. However, they are exposed to stratospheric conditions in flight (typical conditions: T ≈ -40 °C, p ≈ 0.1 atm), and there are no off-the-shelf CCD cameras with the performance of an iXon3 suited for these conditions. Therefore, Andor Technology and the Deutsches SOFIA Institut (DSI) are jointly developing and qualifying a camera for these conditions, based on the iXon3 888. The changes include replacement of electrical components with MIL-SPEC or industrial-grade components, various system optimizations, a new data interface that allows image data transmission over 30 m of cable from the camera to the controller, a new power converter in the camera to generate all necessary operating voltages locally, and a new housing that fulfills airworthiness requirements. A prototype of this camera has been built and tested in an environmental test chamber at temperatures down to T = -62 °C and at a pressure equivalent to 50,000 ft altitude. In this paper, we report on the development of the camera and present results from the environmental testing.
Detailed in situ laser calibration of the infrared imaging video bolometer for the JT-60U tokamak
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parchamy, H.; Peterson, B. J.; Konoshima, S.
2006-10-15
The infrared imaging video bolometer (IRVB) in JT-60U includes a single graphite-coated gold foil with an effective area of 9×7 cm² and a thickness of 2.5 μm. The thermal images of the foil resulting from the plasma radiation are provided by an IR camera. The calibration technique of the IRVB gives confidence in the absolute levels of the measured values of the plasma radiation. The in situ calibration is carried out in order to obtain local foil properties such as the thermal diffusivity κ and the product of the thermal conductivity k and the thickness t_f of the foil. These quantities are necessary for solving the two-dimensional heat diffusion equation of the foil which is used in the experiments. These parameters are determined by comparing the measured temperature profiles (for k·t_f) and their decays (for κ) with the corresponding results of a finite element model using the measured HeNe laser power profile as a known radiation power source. The infrared camera (Indigo/Omega) is calibrated by fitting the temperature rise of a heated plate to the resulting camera data using the Stefan-Boltzmann law.
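A minimal sketch of the kind of Stefan-Boltzmann fit mentioned in the last sentence, assuming camera counts recorded against known plate temperatures are available (placeholder data, not the JT-60U calibration): fit counts = a·T⁴ + b and invert it to map counts to temperature.

```python
import numpy as np

# Placeholder calibration points: plate temperature (K) vs. camera signal (counts).
T_plate = np.array([300.0, 320.0, 340.0, 360.0, 380.0])
counts  = np.array([1020.0, 1345.0, 1742.0, 2221.0, 2793.0])

# Stefan-Boltzmann-like model: signal is linear in T^4 (gain a, offset b).
A = np.vstack([T_plate**4, np.ones_like(T_plate)]).T
(a, b), *_ = np.linalg.lstsq(A, counts, rcond=None)

def counts_to_temperature(c):
    """Invert the fitted model to recover plate temperature from camera counts."""
    return ((c - b) / a) ** 0.25

print(counts_to_temperature(1500.0))
```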
A practical indoor context-aware surveillance system with multi-Kinect sensors
NASA Astrophysics Data System (ADS)
Jia, Lili; You, Ying; Li, Tiezhu; Zhang, Shun
2014-11-01
In this paper we develop a novel practical application that provides scalable services to end users when abnormal activities occur. The architecture of the application consists of networked infrared cameras and a communication module. In this intelligent surveillance system we use Kinect sensors as the input cameras. The Kinect is an infrared laser camera whose raw infrared sensor stream is accessible to the user. We install several Kinect sensors in one room to track human skeletons. Each sensor returns body positions as 15 coordinates in its own coordinate system. We use calibration algorithms to transform all body position points into one unified coordinate system. With the body position points we can infer the surveillance context. Furthermore, messages from the metadata index matrix are sent to a mobile phone through the communication module, so the user is instantly aware of an abnormal event in the room without having to check a website. In conclusion, the theoretical analysis and experimental results in this paper show that the proposed system is reasonable and efficient. The application method introduced here not only discourages criminals and assists police in the apprehension of suspects, but also enables end users to monitor indoor environments anywhere and anytime from their phones.
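A minimal sketch of one way the per-sensor coordinate frames could be unified, assuming corresponding 3D points (e.g. the same skeleton joints seen by two Kinects) are available; the paper does not specify its calibration algorithm, so this standard least-squares rigid alignment is a generic stand-in: estimate the rotation R and translation t minimizing ||R·p + t − q||².

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, t mapping points P (Nx3) onto corresponding points Q (Nx3)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

# Illustrative use: joints seen by sensor A mapped into sensor B's frame.
# P = joints_in_A, Q = same joints_in_B;  R, t = rigid_transform(P, Q)
```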
Cryogenic optical systems for the rapid infrared imager/spectrometer (RIMAS)
NASA Astrophysics Data System (ADS)
Capone, John I.; Content, David A.; Kutyrev, Alexander S.; Robinson, Frederick D.; Lotkin, Gennadiy N.; Toy, Vicki L.; Veilleux, Sylvain; Moseley, Samuel H.; Gehrels, Neil A.; Vogel, Stuart N.
2014-07-01
The Rapid Infrared Imager/Spectrometer (RIMAS) is designed to perform follow-up observations of transient astronomical sources at near-infrared (NIR) wavelengths (0.9 - 2.4 microns). In particular, RIMAS will be used to perform photometric and spectroscopic observations of gamma-ray burst (GRB) afterglows to complement the Swift satellite's science goals. Upon completion, RIMAS will be installed on Lowell Observatory's 4.3 meter Discovery Channel Telescope (DCT) located in Happy Jack, Arizona. The instrument's optical design includes a collimator lens assembly, a dichroic to divide the wavelength coverage into two optical arms (0.9 - 1.4 microns and 1.4 - 2.4 microns, respectively), and a camera lens assembly for each optical arm. Because the wavelength coverage extends out to 2.4 microns, all optical elements are cooled to ~70 K. Filters and transmission gratings are located on wheels ahead of each camera, allowing the instrument to be quickly configured for photometry or spectroscopy. An athermal optomechanical design is being implemented to prevent the lenses from losing their room-temperature alignment as the system is cooled. The thermal expansion of the materials used in this design has been measured in the lab. Additionally, RIMAS has a guide camera consisting of four lenses to aid observers in passing light from target sources through the spectroscopic slits. Efforts to align these optics are ongoing.
Spectral measurements of muzzle flash with multispectral and hyperspectral sensor
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Trzaskawka, P.; Piątkowski, T.; Polakowski, H.
2011-08-01
The paper presents some practical aspects of the measurement of muzzle flash signatures. Selected signatures of sniper shots in typical scenarios are presented. Signatures registered during all phases of the muzzle flash were analyzed. High-precision laboratory measurements were made in a special ballistic laboratory, and as a result several flash patterns were registered. Field measurements of muzzle flash were also performed. During the tests several infrared cameras were used, including measurement-class devices with high accuracy and frame rates. The registrations were made in the NWIR, SWIR and LWIR spectral bands simultaneously. An ultra-fast visual camera was also used for visible-spectrum registration. Some typical infrared shot signatures are presented. Besides the cameras, the LWIR imaging spectroradiometer HyperCam was also used during the laboratory experiments and the field tests. The signatures collected by the HyperCam device were useful for determining the spectral characteristics of the muzzle flash, whereas the analysis of thermal images registered during the tests provided data on the temperature distribution in the flash area. As a result of the measurement sessions, signatures of several types of handguns, machine guns and sniper rifles were obtained, which will be used in the development of passive infrared systems for sniper detection.
A dual-band adaptor for infrared imaging.
McLean, A G; Ahn, J-W; Maingi, R; Gray, T K; Roquemore, A L
2012-05-01
A novel imaging adaptor providing the capability to extend a standard single-band infrared (IR) camera into a two-color or dual-band device has been developed for application to high-speed IR thermography on the National Spherical Tokamak Experiment (NSTX). Temperature measurement with two-band infrared imaging has the advantage of being mostly independent of surface emissivity, which may vary significantly in the liquid lithium divertor installed on NSTX as compared to that of an all-carbon first wall. In order to take advantage of the high-speed capability of the existing IR camera at NSTX (1.6-6.2 kHz frame rate), a commercial visible-range optical splitter was extensively modified to operate in the medium wavelength and long wavelength IR. This two-band IR adapter utilizes a dichroic beamsplitter, which reflects 4-6 μm wavelengths and transmits 7-10 μm wavelength radiation, each with >95% efficiency and projects each IR channel image side-by-side on the camera's detector. Cutoff filters are used in each IR channel, and ZnSe imaging optics and mirrors optimized for broadband IR use are incorporated into the design. In-situ and ex-situ temperature calibration and preliminary data of the NSTX divertor during plasma discharges are presented, with contrasting results for dual-band vs. single-band IR operation.
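A minimal sketch of why a two-band ratio is largely emissivity-independent, under a graybody assumption (this is not the NSTX calibration itself; band centers and the test temperature are illustrative): the ratio of Planck radiances at two wavelengths depends on temperature while the common emissivity cancels, so T can be recovered by inverting the ratio numerically.

```python
import numpy as np

H, C, KB = 6.626e-34, 3.0e8, 1.381e-23   # Planck const., speed of light, Boltzmann const.

def planck(lam_m, T):
    """Spectral radiance (arbitrary scale) at wavelength lam_m (m) and temperature T (K)."""
    return 1.0 / (lam_m**5 * (np.exp(H * C / (lam_m * KB * T)) - 1.0))

def temperature_from_ratio(ratio, lam1=5e-6, lam2=8.5e-6,
                           T_grid=np.linspace(300.0, 1500.0, 5000)):
    """Invert the two-band radiance ratio; a graybody emissivity cancels in the ratio."""
    model = planck(lam1, T_grid) / planck(lam2, T_grid)
    return T_grid[np.argmin(np.abs(model - ratio))]

# Illustrative round trip at 700 K:
r = planck(5e-6, 700.0) / planck(8.5e-6, 700.0)
print(temperature_from_ratio(r))   # ~700
```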
Kong, Soo-Keun; Chon, Kyong-Myong; Goh, Eui-Kyung; Lee, Il-Woo; Wang, Soo-Geun
2014-05-01
High-resolution computed tomography has been used mainly in the diagnosis of middle ear diseases such as high jugular bulb, congenital cholesteatoma, and ossicular disruption. However, certain diagnoses are confirmed only through exploratory tympanotomy, and there are few noninvasive methods available to observe the middle ear. The purpose of this study was to investigate the effect of glycerol as a refractive index matching material together with an infrared (IR) camera system for extratympanic observation. 30% glycerol was used as a refractive index matching material in five fresh cadavers. The specimens were divided into four subgroups: the GN (glycerol no) group, GO (glycerol out) group, GI (glycerol in) group, and GB (glycerol both) group. A printed letter and the middle ear structures inside the tympanic membrane were observed using visible and IR camera systems. In the GB group, a transilluminated letter or an ossicle inside the tympanic membrane was clearly visible; in particular, the footplate of the stapes was even transilluminated using the IR camera system. This method can be useful in the diagnosis of diseases of the middle ear if it is clinically applied through further studies.
Low-cost telepresence for collaborative virtual environments.
Rhee, Seon-Min; Ziegler, Remo; Park, Jiyoung; Naef, Martin; Gross, Markus; Kim, Myoung-Hee
2007-01-01
We present a novel low-cost method for visual communication and telepresence in a CAVE-like environment, relying on 2D stereo-based video avatars. The system combines a selection of proven efficient algorithms and approximations in a unique way, resulting in a convincing stereoscopic real-time representation of a remote user acquired in a spatially immersive display. The system was designed to extend existing projection systems with acquisition capabilities requiring minimal hardware modifications and cost. The system uses infrared-based image segmentation to enable concurrent acquisition and projection in an immersive environment without a static background. It consists of two color cameras and two additional b/w cameras used for segmentation in the near-IR spectrum. There is no need for special optics, as the mask and color image are merged using image warping based on a depth estimation. The resulting stereo image stream is compressed, streamed across a network, and displayed as a frame-sequential stereo texture on a billboard in the remote virtual environment.
Color Infrared view of Houston, TX, USA
1991-09-18
This color infrared view of Houston (29.5N, 95.0W) was taken with a dual camera mount. Compare this scene with STS048-78-034 for an analysis of the unique properties of each film type. Comparative tests such as this aid in determining the kinds of information unique to each film system and in evaluating and comparing photography taken through hazy atmospheres. Infrared film is best at penetrating haze, detecting vegetation and producing a sharp image.
1997-01-18
KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility inspect the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) on its handling fixture. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.
1997-01-18
KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lift the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) prior to its installation in the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.
1997-01-18
KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lower the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) into the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.
1997-01-16
KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lower the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) into the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS is HST's first cryogenic instrument -- its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.
Thermoelectric infrared imaging sensors for automotive applications
NASA Astrophysics Data System (ADS)
Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto
2004-07-01
This paper describes three low-cost thermoelectric infrared imaging sensors having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs), respectively, and two experimental automotive application systems. The FPAs are fabricated with a conventional IC process and micromachining technologies and have low-cost potential. Among these sensors, the 2,304-element sensor provides a high responsivity of 5,500 V/W and a very small size, adopting a vacuum-sealed package integrated with a wide-angle ZnS lens. One experimental system incorporated in the Nissan ASV-2 is a blind-spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body, and it can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function, consisting of a visible camera and infrared sensors, which helps alert the driver to the presence of a pedestrian in a rear blind spot. Various issues that will need to be addressed in order to expand the automotive applications of IR imaging sensors in the future are also summarized. The performance achieved is suitable for consumer electronics as well as automotive applications.
Preliminary optical design of PANIC, a wide-field infrared camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Rodríguez Gómez, J.; Lenzen, R.; Sánchez-Blanco, E.
2008-07-01
In this paper, we present the preliminary optical design of PANIC (PAnoramic Near Infrared Camera for Calar Alto), a wide-field infrared imager for the Calar Alto 2.2 m telescope. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. A mosaic of four Teledyne Hawaii-2RG 2k x 2k arrays is used as the detector, giving a field of view of 31.9 arcmin x 31.9 arcmin. This cryogenic instrument has been optimized for the Y, J, H and K bands. Special care has been taken in the selection of the standard IR materials used for the optics in order to maximize the instrument throughput and to include the z band. The main challenges of this design are: to produce a well-defined internal pupil that allows the thermal background to be reduced by a cryogenic pupil stop; the correction of off-axis aberrations due to the large field available; the correction of chromatic aberration over the wide spectral coverage; and the ability to introduce narrow-band filters (~1%) into the system, without a collimated stage in the camera, while minimizing the degradation of the filter passband. We show the optomechanical error budget and compensation strategy that allow our as-built design to meet the required performance from an optical point of view. Finally, we demonstrate the flexibility of the design by showing the performance of PANIC at the CAHA 3.5 m telescope.
High resolution imaging of the Venus night side using a Rockwell 128x128 HgCdTe array
NASA Technical Reports Server (NTRS)
Hodapp, K.-W.; Sinton, W.; Ragent, B.; Allen, D.
1989-01-01
The University of Hawaii operates an infrared camera with a 128x128 HgCdTe detector array on loan from JPL's High Resolution Imaging Spectrometer (HIRIS) project. The characteristics of this camera system are discussed. The infrared camera was used to obtain images of the night side of Venus prior to and after inferior conjunction in 1988. The images confirm Allen and Crawford's (1984) discovery of bright features on the dark hemisphere of Venus visible in the H and K bands. Our images of these features are the best obtained to date. We derive a pseudo-rotation period of 6.5 days for these features and 1.74 micron brightness temperatures between 425 K and 480 K. The features are produced by nonuniform absorption in the middle cloud layer (47 to 57 km altitude) of thermal radiation from the lower Venus atmosphere (20 to 30 km altitude). A more detailed analysis of the data is in progress.
3D medical thermography device
NASA Astrophysics Data System (ADS)
Moghadam, Peyman
2015-05-01
In this paper, a novel handheld 3D medical thermography system is introduced. The proposed system consists of a thermal-infrared camera, a color camera and a depth camera rigidly attached in close proximity and mounted on an ergonomic handle. As a practitioner holding the device smoothly moves it around the human body parts, the proposed system generates and builds up a precise 3D thermogram model by incorporating information from each new measurement in real time. The data are acquired in motion and thus provide multiple points of view. When processed, these multiple points of view are adaptively combined by taking into account the reliability of each individual measurement, which can vary due to a variety of factors such as the angle of incidence, the distance between the device and the subject, environmental sensor data, or other factors influencing the confidence of the thermal-infrared data when captured. Finally, several case studies are presented to support the usability and performance of the proposed system.
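A minimal sketch of adaptively combining overlapping measurements by reliability, as described above (the weighting function here is an illustrative choice, not the one used by the cited system): weight each temperature sample by factors penalizing oblique viewing angles and large sensor-to-subject distances, then take the weighted mean per surface point.

```python
import numpy as np

def fuse_measurements(temps, angles_deg, distances_m, d_ref=1.0):
    """Confidence-weighted mean of repeated temperature samples for one surface point.

    temps:       temperature samples (deg C) from different viewpoints
    angles_deg:  angle of incidence per sample (0 = surface viewed head-on)
    distances_m: camera-to-subject distance per sample
    The weighting (cosine of incidence / squared normalized distance) is illustrative.
    """
    temps = np.asarray(temps, float)
    w = np.cos(np.radians(angles_deg)) / (np.asarray(distances_m, float) / d_ref) ** 2
    w = np.clip(w, 0.0, None)
    return float(np.sum(w * temps) / np.sum(w))

print(fuse_measurements([33.2, 33.6, 32.9], [10, 45, 70], [0.8, 1.2, 1.5]))
```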
Calibration of Contactless Pulse Oximetry
Bartula, Marek; Bresch, Erik; Rocque, Mukul; Meftah, Mohammed; Kirenko, Ihor
2017-01-01
BACKGROUND: Contactless, camera-based photoplethysmography (PPG) interrogates shallower skin layers than conventional contact probes, either transmissive or reflective. This raises questions on the calibratability of camera-based pulse oximetry. METHODS: We made video recordings of the foreheads of 41 healthy adults at 660 and 840 nm, and remote PPG signals were extracted. Subjects were in normoxic, hypoxic, and low temperature conditions. Ratio-of-ratios were compared to reference Spo2 from 4 contact probes. RESULTS: A calibration curve based on artifact-free data was determined for a population of 26 individuals. For an Spo2 range of approximately 83% to 100% and discarding short-term errors, a root mean square error of 1.15% was found with an upper 99% one-sided confidence limit of 1.65%. Under normoxic conditions, a decrease in ambient temperature from 23 to 7°C resulted in a calibration error of 0.1% (±1.3%, 99% confidence interval) based on measurements for 3 subjects. PPG signal strengths varied strongly among individuals from about 0.9 × 10−3 to 4.6 × 10−3 for the infrared wavelength. CONCLUSIONS: For healthy adults, the results present strong evidence that camera-based contactless pulse oximetry is fundamentally feasible because long-term (eg, 10 minutes) error stemming from variation among individuals expressed as A*rms is significantly lower (<1.65%) than that required by the International Organization for Standardization standard (<4%) with the notion that short-term errors should be added. A first illustration of such errors has been provided with A**rms = 2.54% for 40 individuals, including 6 with dark skin. Low signal strength and subject motion present critical challenges that will have to be addressed to make camera-based pulse oximetry practically feasible. PMID:27258081
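A minimal sketch of the ratio-of-ratios computation that underlies such a calibration, assuming the pulsatile (AC) and baseline (DC) components of the PPG signals at the red (660 nm) and infrared (840 nm) wavelengths have already been extracted (the linear calibration coefficients below are placeholders, not the curve derived in the study): R = (AC_red/DC_red)/(AC_ir/DC_ir), and SpO2 is mapped from R by an empirically calibrated curve.

```python
import numpy as np

def ratio_of_ratios(red, ir):
    """Compute R from raw red and infrared PPG traces (1-D arrays)."""
    red, ir = np.asarray(red, float), np.asarray(ir, float)
    ac = lambda x: x.std()        # pulsatile component (simple proxy)
    dc = lambda x: x.mean()       # baseline component
    return (ac(red) / dc(red)) / (ac(ir) / dc(ir))

def spo2_from_r(r, a=110.0, b=25.0):
    """Placeholder linear calibration SpO2 = a - b*R (coefficients are illustrative only)."""
    return a - b * r
```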
Autonomous stress imaging cores: from concept to reality
NASA Astrophysics Data System (ADS)
van der Velden, Stephen; Rajic, Nik; Brooks, Chris; Galea, Steve
2016-04-01
The historical reliance of thermoelastic stress analysis on cooled infrared detection has created significant cost and practical impediments to the widespread use of this powerful full-field stress measurement technique. The emergence of low-cost microbolometers as a practical alternative has allowed for an expansion of the traditional role of thermoelastic stress analysis, and raises the possibility that it may in future become a viable structural health monitoring modality. Experimental results are shown to confirm that high resolution stress imagery can be obtained from an uncooled thermal camera core significantly smaller than any infrared imaging device previously applied to TSA. The paper provides a summary of progress toward the development of an autonomous stress-imaging capability based on this core.
Optimal trajectory planning for a UAV glider using atmospheric thermals
NASA Astrophysics Data System (ADS)
Kagabo, Wilson B.
An Unmanned Aerial Vehicle glider (UAV glider) uses atmospheric energy in its different forms to remain aloft for extended flight durations. This UAV glider aims to extract atmospheric thermal energy and use it to supplement its battery energy and extend the mission period. Given an atmospheric thermal of known strength and location identified by an infrared camera; the current wind speed and direction; the current battery level; the altitude and location of the UAV glider; and an estimate of the expected altitude gain from the thermal, is it possible to make an energy-efficiency-based decision to fly to the atmospheric thermal so as to extend the UAV glider's flight time? In this work, an infrared thermal camera aboard the UAV glider takes continuous forward-looking ground images of "hot spots". Through image processing, a candidate atmospheric thermal's strength and location are estimated. An Intelligent Decision Model combines this information with the current UAV glider status and weather conditions to provide an energy-based recommendation to modify the flight path of the UAV glider. Research, development, and simulation of the Intelligent Decision Model is the primary focus of this work. Three models are developed: (1) a Battery Usage Model, (2) an Intelligent Decision Model, and (3) an Altitude Gain Model. The Battery Usage Model is derived from the candidate flight trajectory, the wind speed and direction, and the aircraft dynamic model. The Intelligent Decision Model uses a fuzzy-logic-based approach. The Altitude Gain Model requires the strength and size of the thermal, which are found a priori.
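A minimal sketch of the kind of energy trade-off such a decision model weighs (a crude energy-balance heuristic, not the fuzzy-logic model described above; all parameter values are illustrative): compare the battery energy needed to reach the thermal against the potential-energy equivalent of the expected altitude gain.

```python
def worth_diverting(dist_to_thermal_m, expected_alt_gain_m, mass_kg=3.0,
                    cruise_power_w=60.0, cruise_speed_mps=12.0, efficiency=0.5):
    """Crude go/no-go check: divert if the recoverable potential energy exceeds
    the battery energy spent flying to the thermal. Illustrative numbers only."""
    g = 9.81
    # Battery energy (J) to fly to the thermal at cruise power.
    e_spent = cruise_power_w * (dist_to_thermal_m / cruise_speed_mps)
    # Potential energy (J) gained by climbing in the thermal, derated for losses.
    e_gained = efficiency * mass_kg * g * expected_alt_gain_m
    return e_gained > e_spent

print(worth_diverting(dist_to_thermal_m=800.0, expected_alt_gain_m=300.0))
```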
High-Resolution Surface Reconstruction from Imagery for Close Range Cultural Heritage Applications
NASA Astrophysics Data System (ADS)
Wenzel, K.; Abdel-Wahab, M.; Cefalu, A.; Fritsch, D.
2012-07-01
The recording of high resolution point clouds with sub-mm resolution is a demanding and cost-intensive task, especially with current equipment like handheld laser scanners. We present an image-based approach, where techniques of image matching and dense surface reconstruction are combined with a compact and affordable rig of off-the-shelf industrial cameras. Such cameras provide high spatial resolution with low radiometric noise, which enables a one-shot solution and thus efficient data acquisition while satisfying high accuracy requirements. However, the largest drawback of image-based solutions is often the acquisition of surfaces with low texture, where the image matching process might fail. Thus, an additional structured light projector is employed, represented here by the pseudo-random pattern projector of the Microsoft Kinect, whose infrared laser projects speckles of different sizes. By using dense image matching techniques on the acquired images, a 3D point can be derived for almost every pixel. The use of multiple cameras enables the acquisition of a high resolution point cloud with high accuracy for each shot. For the proposed system, up to 3.5 million 3D points with sub-mm accuracy can be derived per shot. The registration of multiple shots is performed by Structure and Motion reconstruction techniques, where feature points are used to derive the camera positions and rotations automatically without initial information.
Berkeley Lab Scientists to Play Role in New Space Telescope
Berkeley Lab scientists will contribute to a new space telescope designed to study dark energy and to image planets circling distant suns, among other science aims. The Wide Field Infrared Survey Telescope (WFIRST) will have a field of view far larger than that of the Hubble Space Telescope's Wide Field Camera 3 infrared imager: a Hubble large-scale mapping survey of the M31 galaxy (shown here) required 432 "pointings" of its imager, while only two would be needed with WFIRST.
Pre-discovery detections and progenitor candidate for SPIRITS17qm in NGC 1365
NASA Astrophysics Data System (ADS)
Jencson, J. E.; Bond, H. E.; Adams, S. M.; Kasliwal, M. M.
2018-04-01
We report the detection of a pre-discovery outburst of SPIRITS17qm, discovered as part of the ongoing Spitzer InfraRed Intensive Transients Survey (SPIRITS) using the 3.6 and 4.5 micron imaging channels ([3.6] and [4.5]) of the Infrared Array Camera (IRAC) on the Spitzer Space Telescope (ATel #11575).
Pre-discovery detections and progenitor candidate for SPIRITS17pc in NGC 4388
NASA Astrophysics Data System (ADS)
Jencson, J. E.; Bond, H. E.; Adams, S. M.; Kasliwal, M. M.
2018-04-01
We report detections of pre-discovery outbursts of SPIRITS17pc, discovered as part of the ongoing Spitzer InfraRed Intensive Transients Survey (SPIRITS) using the 3.6 and 4.5 micron imaging channels ([3.6] and [4.5]) of the Infrared Array Camera (IRAC) on the Spitzer Space Telescope (ATel #11575).
NASA Astrophysics Data System (ADS)
Basilevsky, A. T.; Shalygina, O. S.; Bondarenko, N. V.; Shalygin, E. V.; Markiewicz, W. J.
2017-09-01
The aim of this work is a comparative study of several typical radar-dark parabolas, the neighboring plains and some other geologic units seen in the study areas, which include the craters Adivar, Bassi, Bathsheba, du Chatelet and Sitwell, at two depth scales: the upper several meters of the surface, probed through the Magellan-based microwave (12.6 cm wavelength) properties (microwave emissivity, Fresnel reflectivity, large-scale surface roughness, and radar cross-section), and the upper hundreds of microns, characterized by the 1 micron emissivity resulting from the analysis of the near-infrared (NIR) radiation from the night side of the Venusian surface measured by the Venus Monitoring Camera (VMC) on board Venus Express (VEx).
A New Calibration Method for Commercial RGB-D Sensors.
Darwish, Walid; Tang, Shenjun; Li, Wenbin; Chen, Wu
2017-05-24
Commercial RGB-D sensors such as Kinect and Structure Sensors have been widely used in the game industry, where geometric fidelity is not of utmost importance. For applications in which high quality 3D is required, i.e., 3D building models of centimeter‑level accuracy, accurate and reliable calibrations of these sensors are required. This paper presents a new model for calibrating the depth measurements of RGB-D sensors based on the structured light concept. Additionally, a new automatic method is proposed for the calibration of all RGB-D parameters, including internal calibration parameters for all cameras, the baseline between the infrared and RGB cameras, and the depth error model. When compared with traditional calibration methods, this new model shows a significant improvement in depth precision for both near and far ranges.
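A minimal sketch of one common form of depth-error calibration for structured-light RGB-D sensors (the paper proposes its own model; this global polynomial correction fitted against reference depths is a generic stand-in with placeholder data): fit corrected_depth = a·d² + b·d + c against ground-truth distances and apply it to raw depths.

```python
import numpy as np

# Placeholder calibration data: raw sensor depths vs. reference depths (meters),
# e.g. from a planar target measured at several known distances.
raw_depth = np.array([0.52, 1.01, 1.55, 2.10, 2.63, 3.20])
ref_depth = np.array([0.50, 1.00, 1.50, 2.00, 2.50, 3.00])

# Quadratic depth-error model (a generic choice, not the model proposed in the paper).
a, b, c = np.polyfit(raw_depth, ref_depth, deg=2)

def correct_depth(d):
    """Map a raw depth reading to a corrected depth using the fitted model."""
    return a * d**2 + b * d + c

print(correct_depth(1.8))
```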
Sensor Fusion Based Model for Collision Free Mobile Robot Navigation
Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar
2015-01-01
Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each of them are equipped with various types of sensors such as GPS, camera, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail and have inaccurate readings. Therefore, the integration of sensor fusion will help to solve this dilemma and enhance the overall performance. This paper presents a collision free mobile robot navigation based on the fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach where three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs which are the eight distance sensors and the camera, two outputs which are the left and right velocities of the mobile robot’s wheels, and 24 fuzzy rules for the robot’s movement. Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on fuzzy logic fusion model and line following robot, has been implemented and tested through simulation and real time experiments. Various scenarios have been presented with static and dynamic obstacles using one robot and two robots while avoiding obstacles in different shapes and sizes. PMID:26712766
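A minimal sketch of the flavor of fuzzy fusion described above (a toy two-rule controller over two distance readings, not the 24-rule, nine-input system in the paper; membership shapes and rule outputs are illustrative): fuzzify the inputs with triangular memberships, fire the rules, and defuzzify to wheel speeds by a weighted average.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_wheel_speeds(dist_left_m, dist_right_m):
    """Toy avoidance: an obstacle near on one side steers the robot away from it."""
    near_l = tri(dist_left_m, 0.0, 0.0, 0.5)    # degree obstacle is near on the left
    near_r = tri(dist_right_m, 0.0, 0.0, 0.5)   # degree obstacle is near on the right
    clear = max(0.0, 1.0 - max(near_l, near_r)) # degree the path ahead is clear

    # Rule consequents (left wheel, right wheel) in m/s, defuzzified by weighted mean.
    rules = [(clear, (0.3, 0.3)),     # clear path        -> go straight
             (near_l, (0.3, -0.1)),   # obstacle on left  -> turn right
             (near_r, (-0.1, 0.3))]   # obstacle on right -> turn left
    total = sum(w for w, _ in rules) or 1.0
    left = sum(w * out[0] for w, out in rules) / total
    right = sum(w * out[1] for w, out in rules) / total
    return left, right

print(fuzzy_wheel_speeds(0.2, 1.0))
```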
Adaptive illumination source for multispectral vision system applied to material discrimination
NASA Astrophysics Data System (ADS)
Conde, Olga M.; Cobo, Adolfo; Cantero, Paulino; Conde, David; Mirapeix, Jesús; Cubillas, Ana M.; López-Higuera, José M.
2008-04-01
A multispectral system based on a monochrome camera and an adaptive illumination source is presented in this paper. Its preliminary application is focused on material discrimination for the food and beverage industries, where monochrome, color and infrared imaging have been successfully applied for this task. This work proposes a different approach, in which the relevant wavelengths for the required discrimination task are selected in advance using a Sequential Forward Floating Selection (SFFS) algorithm. A light source based on Light Emitting Diodes (LEDs) at these wavelengths is then used to sequentially illuminate the material under analysis, and the resulting images are captured by a CCD camera with spectral response over the entire range of the selected wavelengths. Finally, the multispectral planes obtained are processed using a Spectral Angle Mapping (SAM) algorithm, whose output is the desired material classification. Among other advantages, this approach of controlled and specific illumination produces multispectral imaging with a simple monochrome camera and cold illumination restricted to specific relevant wavelengths, which is desirable for the food and beverage industry. The proposed system has been tested successfully for the automatic detection of foreign objects in the tobacco processing industry.
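A minimal sketch of the Spectral Angle Mapping step described above, assuming each pixel's multispectral response is a vector across the selected LED wavelengths (the reference spectra and angle threshold are placeholders): classify each pixel to the reference material with the smallest spectral angle.

```python
import numpy as np

def spectral_angle(x, r):
    """Angle (radians) between a pixel spectrum x and a reference spectrum r."""
    x, r = np.asarray(x, float), np.asarray(r, float)
    cos = np.dot(x, r) / (np.linalg.norm(x) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_pixel(x, references, max_angle=0.15):
    """Return the name of the closest reference material, or 'unknown'."""
    name, angle = min(((n, spectral_angle(x, r)) for n, r in references.items()),
                      key=lambda item: item[1])
    return name if angle <= max_angle else "unknown"

# Placeholder reference spectra over, e.g., four LED wavelengths.
refs = {"tobacco": [0.42, 0.55, 0.61, 0.70], "plastic": [0.80, 0.78, 0.75, 0.74]}
print(classify_pixel([0.40, 0.52, 0.60, 0.69], refs))
```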
A miniature low-cost LWIR camera with a 160×120 microbolometer FPA
NASA Astrophysics Data System (ADS)
Tepegoz, Murat; Kucukkomurler, Alper; Tankut, Firat; Eminoglu, Selim; Akin, Tayfun
2014-06-01
This paper presents the development of a miniature LWIR thermal camera, MSE070D, which targets value-performance infrared imaging applications and utilizes a 160x120 CMOS-based microbolometer FPA. MSE070D features a universal USB interface that can communicate with computers and some mobile devices on the market. In addition, it offers high flexibility and mobility because it is USB powered, eliminating the need for any external power source thanks to its low-power requirement. MSE070D provides thermal imaging in a 1.65 in³ volume using a vacuum-packaged CMOS-based microbolometer thermal sensor, MS1670A-VP, achieving moderate performance at a very low production cost. MSE070D allows 30 fps thermal video imaging with the 160x120 FPA while achieving an NETD lower than 350 mK with f/1 optics. Test electronics and software, miniature camera cores, complete Application Programming Interfaces (APIs) and relevant documentation are available with MSE070D, as MikroSens wants to help its customers evaluate its products and to ensure quick time-to-market for system manufacturers.
Scheme for predictive fault diagnosis in photo-voltaic modules using thermal imaging
NASA Astrophysics Data System (ADS)
Jaffery, Zainul Abdin; Dubey, Ashwani Kumar; Irshad; Haque, Ahteshamul
2017-06-01
Degradation of PV modules can cause excessive overheating, which results in reduced power output and eventually in failure of the solar panel. To maintain the long-term reliability of solar modules and maximize power output, faults in modules need to be diagnosed at an early stage. This paper provides a comprehensive algorithm for fault diagnosis in solar modules using infrared thermography. Infrared thermography (IRT) is a reliable, non-destructive, fast and cost-effective technique which is widely used to identify where and how faults occur in an electrical installation. Infrared images were used for condition monitoring of solar modules, and fuzzy logic was used to provide intelligent classification of faults. An automatic approach is suggested for fault detection, classification and analysis. IR images were acquired using an IR camera. To estimate the thermal condition of a PV module, the images of faulty panels were compared with the thermal image of a healthy PV module. A fuzzy rule base was used to classify faults automatically, and maintenance actions are advised based on the type of fault.
Infrared Thermal Imaging for Automated Detection of Diabetic Foot Complications
van Netten, Jaap J.; van Baal, Jeff G.; Liu, Chanjuan; van der Heijden, Ferdi; Bus, Sicco A.
2013-01-01
Background Although thermal imaging can be a valuable technology in the prevention and management of diabetic foot disease, it is not yet widely used in clinical practice. Technological advancement in infrared imaging increases its application range. The aim was to explore the first steps in the applicability of high-resolution infrared thermal imaging for noninvasive automated detection of signs of diabetic foot disease. Methods The plantar foot surfaces of 15 diabetes patients were imaged with an infrared camera (resolution, 1.2 mm/pixel): 5 patients had no visible signs of foot complications, 5 patients had local complications (e.g., abundant callus or neuropathic ulcer), and 5 patients had diffuse complications (e.g., Charcot foot, infected ulcer, or critical ischemia). Foot temperature was calculated as mean temperature across pixels for the whole foot and for specified regions of interest (ROIs). Results No differences in mean temperature >1.5 °C between the ipsilateral and the contralateral foot were found in patients without complications. In patients with local complications, mean temperatures of the ipsilateral and the contralateral foot were similar, but temperature at the ROI was >2 °C higher compared with the corresponding region in the contralateral foot and to the mean of the whole ipsilateral foot. In patients with diffuse complications, mean temperature differences of >3 °C between ipsilateral and contralateral foot were found. Conclusions With an algorithm based on parameters that can be captured and analyzed with a high-resolution infrared camera and a computer, it is possible to detect signs of diabetic foot disease and to discriminate between no, local, or diffuse diabetic foot complications. As such, an intelligent telemedicine monitoring system for noninvasive automated detection of signs of diabetic foot disease is one step closer. Future studies are essential to confirm and extend these promising early findings. PMID:24124937
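A minimal sketch of the decision logic implied by the temperature criteria above (the thresholds are taken from the abstract; the input values are illustrative, and this is not the authors' implemented algorithm): compare whole-foot and ROI temperatures between the ipsilateral and contralateral foot and assign "none", "local", or "diffuse".

```python
def classify_foot(mean_ipsi_c, mean_contra_c, roi_ipsi_c=None, roi_contra_c=None):
    """Rule-of-thumb classification from the reported temperature differences:
    whole-foot difference >3 C -> diffuse; ROI difference >2 C with similar
    whole-foot means -> local; otherwise no complication indicated."""
    whole_diff = abs(mean_ipsi_c - mean_contra_c)
    if whole_diff > 3.0:
        return "diffuse complications"
    if roi_ipsi_c is not None and roi_contra_c is not None:
        if (roi_ipsi_c - roi_contra_c) > 2.0 and whole_diff <= 1.5:
            return "local complication"
    return "no complication indicated"

print(classify_foot(30.1, 29.8, roi_ipsi_c=33.0, roi_contra_c=30.2))
```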
Lee, Hoonsoo; Kim, Moon S; Lohumi, Santosh; Cho, Byoung-Kwan
2018-06-05
Extensive research has been conducted on non-destructive and rapid detection of melamine in powdered foods in the last decade. While Raman and near-infrared hyperspectral imaging techniques have been successful in terms of non-destructive and rapid measurement, they have limitations with respect to measurement time and detection capability, respectively. Therefore, the objective of this study was to develop a mercury cadmium telluride (MCT)-based short-wave infrared (SWIR) hyperspectral imaging system and algorithm to detect melamine quantitatively in milk powder. The SWIR hyperspectral imaging system consisted of a custom-designed illumination system, a SWIR hyperspectral camera, a data acquisition module and a sample transfer table. SWIR hyperspectral images were obtained for melamine-milk samples with different melamine concentrations, pure melamine and pure milk powder. Analysis of variance and the partial least squares regression method over the 1000-2500 nm wavelength region were used to develop an optimal model for detection. The results showed that a melamine concentration as low as 50 ppm in melamine-milk powder samples could be detected. Thus, the MCT-based SWIR hyperspectral imaging system has the potential for quantitative and qualitative detection of adulterants in powder samples.
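A minimal sketch of the partial least squares regression step described above, run on synthetic spectra so the example is self-contained; the wavelength grid, absorption feature, concentration values, and component count are placeholders, not the study's calibration data:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Synthetic stand-in for SWIR reflectance spectra (1000-2500 nm) of
    # melamine-milk powder mixtures; the feature position is arbitrary.
    rng = np.random.default_rng(0)
    wavelengths = np.linspace(1000, 2500, 256)
    concentration = rng.uniform(0, 1000, 60)                      # ppm
    feature = np.exp(-0.5 * ((wavelengths - 2100) / 40) ** 2)
    spectra = (1.0 - 1e-4 * concentration[:, None] * feature
               + 0.002 * rng.standard_normal((60, wavelengths.size)))

    pls = PLSRegression(n_components=8)   # component count chosen by cross-validation in practice
    predicted = cross_val_predict(pls, spectra, concentration, cv=5).ravel()
    rmsecv = np.sqrt(np.mean((predicted - concentration) ** 2))
    print(f"RMSECV: {rmsecv:.1f} ppm")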
SOFIA science instruments: commissioning, upgrades and future opportunities
NASA Astrophysics Data System (ADS)
Smith, Erin C.; Miles, John W.; Helton, L. Andrew; Sankrit, Ravi; Andersson, B. G.; Becklin, Eric E.; De Buizer, James M.; Dowell, C. D.; Dunham, Edward W.; Güsten, Rolf; Harper, Doyal A.; Herter, Terry L.; Keller, Luke D.; Klein, Randolf; Krabbe, Alfred; Logsdon, Sarah; Marcum, Pamela M.; McLean, Ian S.; Reach, William T.; Richter, Matthew J.; Roellig, Thomas L.; Sandell, Göran; Savage, Maureen L.; Temi, Pasquale; Vacca, William D.; Vaillancourt, John E.; Van Cleve, Jeffrey E.; Young, Erick T.
2014-07-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is the world's largest airborne observatory, featuring a 2.5 meter effective aperture telescope housed in the aft section of a Boeing 747SP aircraft. SOFIA's current instrument suite includes: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), a 5-40 μm dual-band imager/grism spectrometer developed at Cornell University; HIPO (High-speed Imaging Photometer for Occultations), a 0.3-1.1 μm imager built by Lowell Observatory; GREAT (German Receiver for Astronomy at Terahertz Frequencies), a multichannel heterodyne spectrometer from 60-240 μm, developed by a consortium led by the Max Planck Institute for Radio Astronomy; FLITECAM (First Light Infrared Test Experiment CAMera), a 1-5 μm wide-field imager/grism spectrometer developed at UCLA; FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), a 42-200 μm IFU grating spectrograph completed by the University of Stuttgart; and EXES (Echelon-Cross-Echelle Spectrograph), a 5-28 μm high-resolution spectrometer designed at the University of Texas and being completed by UC Davis and NASA Ames Research Center. HAWC+ (High-resolution Airborne Wideband Camera) is a 50-240 μm imager that was originally developed at the University of Chicago as a first-generation instrument (HAWC), and is being upgraded at JPL to add polarimetry and new detectors developed at Goddard Space Flight Center (GSFC). SOFIA will continually update its instrument suite with new instrumentation, technology demonstration experiments, and upgrades to existing instruments. This paper details the current instrument capabilities and status, as well as the plans for future instrumentation.
High resolution multispectral photogrammetric imagery: enhancement, interpretation and evaluations
NASA Astrophysics Data System (ADS)
Roberts, Arthur; Haefele, Martin; Bostater, Charles; Becker, Thomas
2007-10-01
A variety of aerial mapping cameras were adapted and developed into simulated multiband digital photogrammetric mapping systems. Direct digital multispectral cameras, two multiband cameras (IIS 4-band and Itek 9-band), and paired mapping and reconnaissance cameras were evaluated for digital spectral performance and photogrammetric mapping accuracy in an aquatic environment. Aerial films (24 cm × 24 cm format) tested were: Agfa color negative and extended red (visible and near infrared) panchromatic, and Kodak color infrared and B&W (visible and near infrared) infrared. All films were negative processed to published standards and digitally converted at either 16 (color) or 10 (B&W) microns. Excellent precision in the digital conversions was obtained, with scanning errors of less than one micron. Radiometric data conversion was undertaken using linear density conversion and centered 8-bit histogram exposure. This resulted in multiple 8-bit spectral image bands that were unaltered (not radiometrically enhanced) "optical count" conversions of film density. This provided the best film density conversion to a digital product while retaining the original film density characteristics. Data covering water depth, water quality, surface roughness, and bottom substrate were acquired using different measurement techniques as well as different techniques to locate sampling points on the imagery. Despite extensive efforts to obtain accurate ground truth data, location errors, measurement errors, and variations in the correlation between water depth and remotely sensed signal persisted. These errors must be considered endemic and may not be removed through even the most elaborate sampling setup. Results indicate that multispectral photogrammetric systems offer improved feature mapping capability.
A control system of a mini survey facility for photometric monitoring
NASA Astrophysics Data System (ADS)
Tsutsui, Hironori; Yanagisawa, Kenshi; Izumiura, Hideyuki; Shimizu, Yasuhiro; Hanaue, Takumi; Ita, Yoshifusa; Ichikawa, Takashi; Komiyama, Takahiro
2016-08-01
We have built a control system for a mini survey facility dedicated to photometric monitoring of nearby bright (K<5) stars in the near-infrared region. The facility comprises a 4-m-diameter rotating dome and a small (30-mm aperture) wide-field (5 × 5 sq. deg. field of view) infrared (1.0-2.5 microns) camera on an equatorial fork mount, as well as power sources and other associated equipment. All the components other than the camera are controlled by microcomputer-based I/O boards that were developed in-house and are used in many of the open-use instruments in our observatory. We present the specifications and configuration of the facility hardware, as well as the structure of its control software.
Robot Towed Shortwave Infrared Camera for Specific Surface Area Retrieval of Surface Snow
NASA Astrophysics Data System (ADS)
Elliott, J.; Lines, A.; Ray, L.; Albert, M. R.
2017-12-01
Optical grain size and specific surface area (SSA) are key parameters for measuring the atmospheric interactions of snow, as well as for tracking metamorphism and enabling the ground-truthing of remote sensing data. We describe a device using a shortwave infrared camera with changeable optical bandpass filters (centered at 1300 nm and 1550 nm) that can be used to quickly measure the average SSA over an area of 0.25 m². The device and method are compared with calculations made from measurements taken with a field spectral radiometer. The instrument is designed to be towed by a small autonomous ground vehicle, and therefore rides above the snow surface on ultra-high-molecular-weight polyethylene (UHMW) skis.
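A minimal sketch of one way the two-band reflectance measurement described above could be turned into an SSA estimate; the panel normalization and the calibration-curve interpolation are assumptions, since the abstract does not give the retrieval equation:

    import numpy as np

    def mean_reflectance(image, panel_image, panel_reflectance=0.99):
        """Average reflectance over the 0.25 m^2 footprint, normalised by a
        reference panel of known reflectance imaged through the same filter."""
        return panel_reflectance * image.mean() / panel_image.mean()

    def ssa_from_reflectance(r_1300, calib_reflectance, calib_ssa):
        """Interpolate a 1300 nm reflectance-to-SSA calibration curve
        (e.g., from a radiative-transfer model or co-located field data);
        the curve itself must be supplied, as it is not given in the abstract."""
        calib_reflectance = np.asarray(calib_reflectance, dtype=float)
        calib_ssa = np.asarray(calib_ssa, dtype=float)
        order = np.argsort(calib_reflectance)
        return np.interp(r_1300, calib_reflectance[order], calib_ssa[order])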
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokovikov, Mikhail, E-mail: sokovikov@icmm.ru; Chudinov, Vasiliy; Bilalov, Dmitry
2015-10-27
The behavior of specimens dynamically loaded during split Hopkinson (Kolsky) bar tests in a regime close to simple shear conditions was studied. The lateral surface of the specimens was investigated in situ using a high-speed infrared camera (CEDIP Silver 450M). The temperature field distribution obtained at different times allowed us to trace the evolution of plastic strain localization. The process of target perforation involving plug formation and ejection was examined using a high-speed infrared camera and a VISAR velocity measurement system. The microstructure of tested specimens was analyzed using an optical interferometer-profiler and a scanning electron microscope. The development of plastic shear instability regions was simulated numerically.
Deep Near-Infrared Surveys and Young Brown Dwarf Populations in Star-Forming Regions
NASA Astrophysics Data System (ADS)
Tamura, M.; Naoi, T.; Oasa, Y.; Nakajima, Y.; Nagashima, C.; Nagayama, T.; Baba, D.; Nagata, T.; Sato, S.; Kato, D.; Kurita, M.; Sugitani, K.; Itoh, Y.; Nakaya, H.; Pickles, A.
2003-06-01
We are currently conducting three kinds of IR surveys of star-forming regions (SFRs) in order to search for very low-mass young stellar populations. The first is a deep JHKs-band (simultaneous) survey with the SIRIUS camera on the IRSF 1.4 m or UH 2.2 m telescopes. The second is a very deep JHKs survey with the CISCO IR camera on the Subaru 8.2 m telescope. The third is a high-resolution companion search around nearby YSOs with the CIAO adaptive optics coronagraphic IR camera on Subaru. In this contribution, we describe our SIRIUS camera and present preliminary results of the ongoing surveys with this new instrument.
NASA Astrophysics Data System (ADS)
Wang, Sheng; Bandini, Filippo; Jakobsen, Jakob; Zarco-Tejada, Pablo J.; Köppl, Christian Josef; Haugård Olesen, Daniel; Ibrom, Andreas; Bauer-Gottwein, Peter; Garcia, Monica
2017-04-01
Unmanned Aerial Systems (UAS) can collect optical and thermal hyperspatial (<1 m) imagery at low cost and with flexible revisit times, even under cloudy conditions. The reflectance and radiometric temperature signatures of the land surface, closely linked with vegetation structure and functioning, are already part of models that predict Evapotranspiration (ET) and Gross Primary Productivity (GPP) from satellites. However, operational monitoring with UAS still faces challenges compared to satellites: the payload capacity of most commercial UAS is less than 2 kg, miniaturized sensors have low signal-to-noise ratios, and their small field of view requires mosaicking hundreds of images and accurate orthorectification. In addition, wind gusts and lower platform stability require appropriate geometric and radiometric corrections. Finally, modeling fluxes on days without images is still an issue for both satellite and UAS applications. This study focuses on designing an operational UAS-based monitoring system, including payload design and sensor calibration, based on routine collection of optical and thermal images over a Danish willow field to jointly monitor ET and GPP dynamics continuously at daily time steps. The payload (<2 kg) consists of a multispectral camera (Tetra Mini-MCA6), a thermal infrared camera (FLIR Tau 2), a digital camera (Sony RX-100) used to retrieve accurate digital elevation models (DEMs) for multispectral and thermal image orthorectification, and either a standard single-frequency GNSS receiver (UBlox) or a real-time kinematic dual-frequency system (Novatel Inc. flexpack6+OEM628). Geometric calibration of the digital and multispectral cameras was conducted to recover intrinsic camera parameters. After geometric calibration, accurate DEMs with vertical errors of about 10 cm could be retrieved. Radiometric calibration of the multispectral camera was conducted with an integrating sphere (Labsphere CSTM-USS-2000C), and the laboratory calibration showed that the camera-measured radiance had a bias within ±4.8%. The thermal camera was calibrated using a blackbody at varying target and ambient temperatures, resulting in a laboratory accuracy (RMSE) of 0.95 K. A joint model of ET and GPP was applied using two parsimonious, physiologically based models: a modified version of the Priestley-Taylor Jet Propulsion Laboratory model (Fisher et al., 2008; Garcia et al., 2013) and a Light Use Efficiency approach (Potter et al., 1993). Both models estimate ET and GPP under optimum potential conditions, down-regulated by the same biophysical constraints, which depend on remote sensing and atmospheric data to reflect multiple stresses. Vegetation indices were calculated from the multispectral data to assess vegetation conditions, while thermal infrared imagery was used to compute a thermal inertia index to infer soil moisture constraints. To interpolate radiometric temperature between flights, a prognostic Surface Energy Balance model (Margulis et al., 2001) based on the force-restore method was applied in a data assimilation scheme to obtain continuous ET and GPP fluxes. With this operational system, regular flight campaigns with a hexacopter (DJI S900) have been conducted at a Danish willow flux site (Risø) over the 2016 growing season. The observed energy, water and carbon fluxes from the Risø eddy covariance flux tower were used to validate the model simulations. This UAS monitoring system is suitable for agricultural management and land-atmosphere interaction studies.
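A minimal sketch of the light-use-efficiency idea referenced above (Potter et al., 1993), in which GPP is a maximum efficiency down-regulated by stress scalars; the parameter values and scalar forms are illustrative placeholders, not the study's configuration:

    def gpp_light_use_efficiency(par, fapar, eps_max=1.8,
                                 temp_scalar=1.0, water_scalar=1.0):
        """GPP ~= eps_max * f(T) * f(W) * fAPAR * PAR.
        par: incident photosynthetically active radiation (MJ m-2 d-1),
        fapar: fraction of PAR absorbed by the canopy (0-1),
        eps_max: maximum light-use efficiency (g C per MJ APAR, placeholder),
        temp_scalar / water_scalar: stress scalars in [0, 1]."""
        return eps_max * temp_scalar * water_scalar * fapar * par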
Infrared-thermographic screening of the activity and enantioselectivity of enzymes.
Reetz, M T; Hermes, M; Becker, M H
2001-05-01
The infrared radiation caused by the heat of reaction of an enantioselective enzyme-catalyzed transformation can be detected by modern photovoltaic infrared (IR)-thermographic cameras equipped with focal-plane array detectors. Specifically, in the lipase-catalyzed enantioselective acylation of racemic 1-phenylethanol, the (R)- and (S)-substrates are allowed to react separately in the wells of microtiter plates, with the (R)-alcohol showing hot spots in the IR-thermographic images. Thus, highly enantioselective enzymes can be identified in kinetic resolution.
Noise-cancellation-based nonuniformity correction algorithm for infrared focal-plane arrays.
Godoy, Sebastián E; Pezoa, Jorge E; Torres, Sergio N
2008-10-10
The spatial fixed-pattern noise (FPN) inherently generated in infrared (IR) imaging systems severely compromises the quality of the acquired imagery, even making such images inappropriate for some applications. The FPN refers to the inability of the photodetectors in the focal-plane array to render a uniform output image when a uniform-intensity scene is being imaged. We present a noise-cancellation-based algorithm that compensates for the additive component of the FPN. The proposed method relies on the assumption that a source of noise correlated to the additive FPN is available to the IR camera. An important feature of the algorithm is that all the calculations are reduced to a simple equation, which allows for the bias compensation of the raw imagery. The algorithm performance is tested using real IR image sequences and is compared to some classical methodologies. © 2008 Optical Society of America
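A minimal sketch of a generic least-mean-squares noise canceller in the spirit of the approach described above: a reference signal correlated with the additive fixed-pattern noise is scaled by per-pixel adaptive weights and subtracted from each raw frame. The array shapes, step size, and update rule are illustrative assumptions; the paper's closed-form equation is not reproduced in the abstract:

    import numpy as np

    def lms_bias_cancel(frames, reference, mu=1e-3):
        """frames: (n_frames, rows, cols) raw IR frames.
        reference: (n_frames,) per-frame signal assumed to be correlated with
        the additive FPN (e.g., an FPA-temperature readout; hypothetical).
        Returns bias-compensated frames."""
        weights = np.zeros(frames.shape[1:])             # one adaptive weight per pixel
        corrected = np.empty(frames.shape, dtype=float)
        for k in range(frames.shape[0]):
            estimate = weights * reference[k]            # estimated additive bias
            corrected[k] = frames[k] - estimate          # cancelled output (error signal)
            weights += mu * corrected[k] * reference[k]  # LMS weight update
        return corrected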
Remote optical observations of actively burning biomass fires using potassium line spectral emission
NASA Astrophysics Data System (ADS)
Magidimisha, Edwin; Griffith, Derek J.
2016-02-01
Wildland fires are a widespread, seasonal, and largely man-made hazard with a broad range of negative effects. These wildfires not only destroy homes, infrastructure, cultivated forests, and natural habitats but also contribute to climate change through greenhouse gas emissions and aerosol particle production. Global satellite-based monitoring of biomass burning using thermal infrared sensors is currently a powerful tool that helps establish suppression strategies and improves understanding of the role fires play in global climate change. Advances in silicon-based camera technology present opportunities to resolve the challenge of ubiquitous wildfire early detection in a cost-effective manner. This study investigated several feasibility aspects of detecting wildland fires using near-infrared (NIR) spectral line emissions from electronically excited potassium (K) atoms at wavelengths of 766.5 and 769.9 nm during biomass burning.
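As an illustration of how the narrowband potassium emission discussed above might be exploited, a minimal band-ratio sketch; the continuum band, threshold value, and flagging rule are assumptions rather than results from the study:

    import numpy as np

    def potassium_fire_mask(k_band, continuum_band, threshold=1.2):
        """Flag pixels where a narrowband image centred on the K doublet
        (766.5/769.9 nm) is bright relative to a nearby continuum band,
        as would be expected over actively burning biomass."""
        ratio = k_band.astype(float) / np.maximum(continuum_band.astype(float), 1e-6)
        return ratio > threshold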
NASA Technical Reports Server (NTRS)
Barry, R. K.; Satyapal, S.; Greenhouse, M. A.; Barclay, R.; Amato, D.; Arritt, B.; Brown, G.; Harvey, V.; Holt, C.; Kuhn, J.
2000-01-01
We discuss work in progress on a near-infrared tunable bandpass filter for the Goddard baseline wide-field camera concept of the Next Generation Space Telescope (NGST) Integrated Science Instrument Module (ISIM). This filter, the Demonstration Unit for Low Order Cryogenic Etalon (DULCE), is designed to demonstrate a high-efficiency scanning Fabry-Perot etalon operating in interference orders 1-4 at 30 K with a high-stability, DSP-based servo control system. DULCE is currently the only available tunable filter for low-order cryogenic operation in the near infrared. In this application, scanning etalons will illuminate the focal-plane arrays with a single order of interference to enable wide-field, lower-resolution hyperspectral imaging over a wide range of redshifts. We discuss why tunable filters are an important instrument component in future space-based observatories.
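A minimal sketch of the ideal Fabry-Perot (Airy) transmission that underlies a scanning etalon like the one described above; the parameter values are generic, not DULCE's design numbers:

    import numpy as np

    def etalon_transmission(wavelength, gap, reflectivity, n=1.0, theta=0.0):
        """Ideal Airy transmission T = 1 / (1 + F * sin^2(delta / 2)),
        with round-trip phase delta = 4 * pi * n * d * cos(theta) / lambda
        and coefficient of finesse F = 4R / (1 - R)^2.  A low-order etalon
        (order m = 2 * n * d * cos(theta) / lambda of roughly 1-4) passes a
        single broad interference order, as used for wide-field imaging."""
        delta = 4.0 * np.pi * n * gap * np.cos(theta) / wavelength
        F = 4.0 * reflectivity / (1.0 - reflectivity) ** 2
        return 1.0 / (1.0 + F * np.sin(delta / 2.0) ** 2)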