A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.
Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi
2016-08-30
This paper proposes a novel infrared camera array guidance system capable of tracking a fixed-wing unmanned aerial vehicle (UAV) and providing its real-time position and speed during landing. The system mainly includes three novel parts: (1) a cooperative long-range optical imaging module based on an infrared camera array and a near-infrared laser lamp; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with a fixed-wing aircraft demonstrate that our infrared camera array system has the unique ability to guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for automatic, accurate UAV landing in Global Positioning System (GPS)-denied environments.
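As a rough illustration of the 3D tracking step (the abstract does not publish the algorithm itself), the sketch below triangulates a single laser-marker detection seen by several calibrated ground cameras using linear least squares (DLT). The projection matrices, pixel coordinates, and function names are hypothetical placeholders, not the authors' code.

    import numpy as np

    def triangulate_marker(projection_mats, pixel_coords):
        """Linear (DLT) triangulation of one marker seen by two or more calibrated cameras.

        projection_mats: list of 3x4 matrices P_i = K_i [R_i | t_i] from the array calibration
        pixel_coords:    list of (u, v) marker detections, one per camera
        Returns the estimated 3D point in world coordinates.
        """
        rows = []
        for P, (u, v) in zip(projection_mats, pixel_coords):
            # Each view contributes two linear constraints on the homogeneous point X.
            rows.append(u * P[2] - P[0])
            rows.append(v * P[2] - P[1])
        A = np.vstack(rows)
        _, _, vt = np.linalg.svd(A)   # solution: right singular vector of smallest singular value
        X = vt[-1]
        return X[:3] / X[3]

Differencing successive positions over the known frame interval would then give the speed estimate mentioned in the abstract.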
ARNICA, the Arcetri Near-Infrared Camera
NASA Astrophysics Data System (ADS)
Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.
1996-04-01
ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December, 1992 to December 1993), and an observing run at the William Herschel Telescope, Canary Islands (December, 1993). System performance is defined in terms of efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources. (SECTION: Astronomical Instrumentation)
The development of large-aperture test system of infrared camera and visible CCD camera
NASA Astrophysics Data System (ADS)
Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying
2015-10-01
Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many types of equipment and applications. If such a system is tested with a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed. The large-aperture test system for infrared cameras and visible CCD cameras shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position as the environmental temperature changes, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, so it has good market prospects.
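The abstract mentions multiple-frame averaging to suppress random noise. A minimal sketch of the idea, assuming the frames are already registered and the noise is zero-mean (so averaging N frames reduces its standard deviation by roughly sqrt(N)):

    import numpy as np

    def average_frames(frames):
        """Average N aligned frames captured from the frame grabber."""
        stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
        return stack.mean(axis=0)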
SPARTAN Near-IR Camera — System Overview (instrumentation at SOAR). The Spartan Infrared Camera is a high spatial resolution near-IR imager. Spartan has a focal plane consisting of four
Electro-optical system for gunshot detection: analysis, concept, and performance
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Madura, H.; Trzaskawka, P.; Bieszczad, G.; Sosnowski, T.
2011-08-01
The paper discusses the technical possibilities of building an effective electro-optical sensor unit for sniper detection using infrared cameras. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is within a multi-sensor sniper and shot detection system. First, an analysis of three distinct phases of sniper activity is presented: before, during, and after the shot. On the basis of experimental data, the parameters defining the relevant sniper signatures were determined, which are essential in assessing the capability of an infrared camera to detect sniper activity. A sniper's body and the muzzle flash were analyzed as targets, the phenomena that make it possible to detect sniper activity in the infrared spectrum were described, and the physical limitations were analyzed. The infrared systems considered were simulated using NVTherm software, and calculations were performed for several cameras equipped with different lenses and detector types. Detection ranges were simulated for selected sniper detection scenarios. After analysis of the simulation results, the technical specifications of an infrared sniper detection system required to provide the assumed detection range were discussed. Finally, an infrared camera setup capable of detecting a sniper at a range of 1000 meters was proposed.
Space-based infrared sensors of space target imaging effect analysis
NASA Astrophysics Data System (ADS)
Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang
2018-02-01
Target identification is one of the core problems of a ballistic missile defense system, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a point-source imaging model of a ballistic target above the atmosphere for space-based infrared sensors; it then simulates the infrared imaging of the ballistic target from two aspects, the space-based sensor's camera parameters and the target characteristics, and analyzes the imaging effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target.
Low-cost thermo-electric infrared FPAs and their automotive applications
NASA Astrophysics Data System (ADS)
Hirota, Masaki; Ohta, Yoshimi; Fukuyama, Yasuhiro
2008-04-01
This paper describes three low-cost infrared focal plane arrays (FPAs) with 1,536, 2,304, and 10,800 elements, along with experimental vehicle systems. They have low-cost potential because each element consists of p-n polysilicon thermocouples, which allows the use of the low-cost ultra-fine microfabrication technology commonly employed in conventional semiconductor manufacturing processes. To increase the responsivity of the FPA, we have developed a precisely patterned Au-black absorber with infrared absorptivity of more than 90%. The FPA with 2,304 elements achieved a high responsivity of 4,300 V/W. To reduce package cost, we developed a vacuum-sealed package integrated with a molded ZnS lens. The camera, intended for temperature measurement of the passenger cabin, is a compact, lightweight device that measures 45 x 45 x 30 mm and weighs 190 g. It achieves a noise equivalent temperature difference (NETD) of less than 0.7°C over 0 to 40°C. In this paper, we also present several experimental systems that use infrared cameras. One is a blind-spot pedestrian warning system that employs four infrared cameras. It detects the infrared radiation emitted by a human body and alerts the driver when a pedestrian is in a blind spot; the system can also prevent the vehicle from moving toward the pedestrian. Another system uses a visible-light camera and infrared sensors to detect the presence of a pedestrian in a rear blind spot and alert the driver. The third system is a new type of human-machine interface that enables the driver to control the car's audio system without letting go of the steering wheel. Uncooled infrared cameras are still costly, which limits their automotive use to high-end luxury cars at present. To promote widespread use of IR imaging sensors on vehicles, their cost must be reduced further.
Mitigation of Atmospheric Effects on Imaging Systems
2004-03-31
focal length. The imaging system had two cameras: an Electrim camera sensitive in the visible (0.6 µm) waveband and an Amber QWIP infrared camera...sensitive in the 9-micron region. The Amber QWIP infrared camera had 256x256 pixels, pixel pitch 38 µm, focal length of 1.8 m, FOV of 5.4 x 5.4 mrad...each day. Unfortunately, signals from the different read ports of the Electrim camera picked up noise on their way to the digitizer, and this resulted
Infrared detectors and test technology of cryogenic camera
NASA Astrophysics Data System (ADS)
Yang, Xiaole; Liu, Xingxin; Xing, Mailing; Ling, Long
2016-10-01
Cryogenic cameras, which are widely used in deep-space detection, cool down the optical system and support structure by cryogenic refrigeration technology, thereby improving sensitivity. The characteristics and design points of the infrared detector are discussed in combination with the camera's characteristics. At the same time, cryogenic-background test systems for the chip and for the detector assembly are established: the chip test system is based on a variable-temperature, multilayer Dewar, and the assembly test system is based on a target and background simulator in a thermal vacuum environment. The core of the test is to establish a cryogenic background. Finally, the measured non-uniformity, dead-pixel ratio, and noise are given. The established test system supports the design and calculation of infrared systems.
NASA Technical Reports Server (NTRS)
Gazanik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Jenkins, Rusty; Yates, Rusty; Stephan, Ryan;
2005-01-01
In November 2004, NASA's Space Shuttle Program approved the development of the Extravehicular (EVA) Infrared (IR) Camera to test the application of infrared thermography to on-orbit reinforced carbon-carbon (RCC) damage detection. A multi-center team composed of members from NASA's Johnson Space Center (JSC), Langley Research Center (LaRC), and Goddard Space Flight Center (GSFC) was formed to develop the camera system and plan a flight test. The initial development schedule called for the delivery of the system in time to support STS-115 in late 2005. At the request of Shuttle Program managers and the flight crews, the team accelerated its schedule and delivered a certified EVA IR Camera system in time to support STS-114 in July 2005 as a contingency. The development of the camera system, led by LaRC, was based on the Commercial-Off-the-Shelf (COTS) FLIR S65 handheld infrared camera. An assessment of the S65 system with regard to space-flight operation was critical to the project. This paper discusses the space-flight assessment and describes the significant modifications required for EVA use by the astronaut crew. The on-orbit inspection technique will be demonstrated during the third EVA of STS-121 in September 2005 by imaging damaged RCC samples mounted in a box in the Shuttle's cargo bay.
Variation in detection among passive infrared triggered-cameras used in wildlife research
Damm, Philip E.; Grand, James B.; Barnett, Steven W.
2010-01-01
Precise and accurate estimates of demographics such as age structure, productivity, and density are necessary in determining habitat and harvest management strategies for wildlife populations. Surveys using automated cameras are becoming an increasingly popular tool for estimating these parameters. However, most camera studies fail to incorporate detection probabilities, leading to parameter underestimation. The objective of this study was to determine the sources of heterogeneity in detection for trail cameras that incorporate a passive infrared (PIR) triggering system sensitive to heat and motion. Images were collected at four baited sites within the Conecuh National Forest, Alabama, using three cameras at each site operating continuously over the same seven-day period. Detection was estimated for four groups of animals based on taxonomic group and body size. Our hypotheses of detection considered variation among bait sites and cameras. The best model (w=0.99) estimated different rates of detection for each camera in addition to different detection rates for four animal groupings. Factors that explain this variability might include poor manufacturing tolerances, variation in PIR sensitivity, animal behavior, and species-specific infrared radiation. Population surveys using trail cameras with PIR systems must incorporate detection rates for individual cameras. Incorporating time-lapse triggering systems into survey designs should eliminate issues associated with PIR systems.
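The best-supported model assigned a separate detection rate to each camera and animal group. As a toy illustration only (not the authors' model or data), camera-specific detection probabilities can be estimated as simple binomial proportions when the number of known visitation events is available:

    import numpy as np

    def detection_rates(detections, trials):
        """MLE of per-camera detection probability from binomial counts.

        detections[i]: events recorded by camera i
        trials[i]:     known visitation events at camera i's site
        """
        return np.asarray(detections, dtype=float) / np.asarray(trials, dtype=float)

    # Hypothetical example: three cameras at one baited site observing the same 40 visits.
    print(detection_rates([31, 24, 37], [40, 40, 40]))   # -> [0.775 0.6 0.925]

A full analysis would instead embed these probabilities in an occupancy-style model with AIC-based model selection, as the authors did.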
Performance and Calibration of H2RG Detectors and SIDECAR ASICs for the RATIR Camera
NASA Technical Reports Server (NTRS)
Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Klein, Christopher R.; Butler, Nathaniel R.; Bloom, Josh; de Diego, José A.; Simón Farah, Alejandro D.; Gehrels, Neil A.; Georgiev, Leonid;
2012-01-01
The Reionization And Transient InfraRed (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) followup and will provide simultaneous optical and infrared photometric capabilities. The infrared portion of this camera incorporates two Teledyne HgCdTe HAWAII-2RG detectors, controlled by Teledyne's SIDECAR ASICs. While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 interface card and IDE development environment. Together, this setup comprises Teledyne's Development Kit, which is a bundled solution that can be efficiently integrated into future ground-based systems. In this presentation, we characterize the system's read noise, dark current, and conversion gain.
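Conversion gain and read noise of detectors such as the HAWAII-2RG are commonly characterized with a photon-transfer (mean-variance) analysis. The sketch below is a generic version of that method, not necessarily the RATIR team's pipeline; the data layout and names are assumptions.

    import numpy as np

    def photon_transfer(flat_pairs):
        """Estimate conversion gain (e-/ADU) and read noise (ADU rms) from flat-field pairs.

        flat_pairs: list of (flat_a, flat_b) bias-subtracted frame pairs taken at
                    increasing illumination levels.
        """
        means, variances = [], []
        for a, b in flat_pairs:
            means.append(0.5 * (a.mean() + b.mean()))
            # Differencing two flats removes fixed-pattern noise; /2 gives per-frame variance.
            variances.append(np.var(a.astype(np.float64) - b) / 2.0)
        # Shot-noise model: variance = mean / gain + read_noise**2
        slope, intercept = np.polyfit(means, variances, 1)
        return 1.0 / slope, np.sqrt(max(intercept, 0.0))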
Design of an infrared camera based aircraft detection system for laser guide star installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, H.; Macintosh, B.
1996-03-05
There have been incidents in which the irradiance from laser guide stars has temporarily blinded pilots or passengers of aircraft. An aircraft detection system based on passive near-infrared cameras (instead of active radar) is described in this report.
Standoff aircraft IR characterization with ABB dual-band hyper spectral imager
NASA Astrophysics Data System (ADS)
Prel, Florent; Moreau, Louis; Lantagne, Stéphane; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc
2012-09-01
Remote-sensing infrared characterization of rapidly evolving events generally involves combining a spectro-radiometer and one or more infrared cameras as separate instruments. Time synchronization, spatial co-registration, consistent radiometric calibration, and the management of several systems are important challenges to overcome; they complicate the processing of target infrared characterization data and increase the sources of error affecting the final radiometric accuracy. MR-i is a dual-band hyperspectral imaging spectro-radiometer that combines two 256 x 256 pixel infrared cameras and an infrared spectro-radiometer in a single instrument. This field instrument generates spectral datacubes in the MWIR and LWIR and is designed to acquire the spectral signatures of rapidly evolving events. The design is modular: the spectrometer has two output ports configured with two simultaneously operated cameras to either widen the spectral coverage or increase the dynamic range of the measured amplitudes, and various telescope options are available for the input port. Recent platform developments and field-trial measurement performance are presented for a system configuration dedicated to the characterization of airborne targets.
Multi-channel automotive night vision system
NASA Astrophysics Data System (ADS)
Lu, Gang; Wang, Li-jun; Zhang, Yi
2013-09-01
A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image processing and display unit; the cameras are placed at the front, left, right, and rear of the automobile. The system uses a near-infrared laser light source whose beam is collimated; the source contains a thermoelectric cooler (TEC), can be synchronized with camera focusing, and has automatic light-intensity adjustment, which ensures image quality. The composition principle of the system is described in detail; on this basis, beam collimation, the LD driving and temperature control of the near-infrared laser light source, and the four-channel image processing and display are discussed. The system can be used for driver assistance, blind spot information (BLIS), parking assistance, and alarm functions by day and night.
Sniper detection using infrared camera: technical possibilities and limitations
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.
2010-04-01
The paper discusses the technical possibilities of building an effective system for sniper detection using infrared cameras. The phenomena that make it possible to detect sniper activity in the infrared spectrum are described, and the physical limitations are analyzed. Cooled and uncooled detectors were considered. Three phases of sniper activity were taken into consideration: before, during, and after the shot. On the basis of experimental data, the parameters defining the target were determined, which are essential in assessing the capability of an infrared camera to detect sniper activity. A sniper's body and the muzzle flash were analyzed as targets. Detection ranges were simulated for the assumed sniper detection scenario. An infrared sniper detection system capable of fulfilling the requirements was discussed. Finally, the results of the analysis and simulations were presented and discussed.
Robust Behavior Recognition in Intelligent Surveillance Environments.
Batchuluun, Ganbayar; Kim, Yeong Gon; Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung
2016-06-30
Intelligent surveillance systems have been studied by many researchers. These systems should operate both day and night, but objects are invisible in images captured by visible-light cameras at night. Therefore, near-infrared (NIR) cameras and thermal cameras (based on medium-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) light) have been considered as alternatives for nighttime use. Because the system must operate both day and night, and because NIR cameras require an additional NIR illuminator that must cover a wide area over a great distance at night, a dual system of visible-light and thermal cameras is used in our research, and we propose a new behavior recognition method for intelligent surveillance environments. Twelve datasets were compiled by collecting data in various environments, and they were used to obtain experimental results. The recognition accuracy of our method was found to be 97.6%, confirming that it outperforms previous methods.
Design of a Remote Infrared Images and Other Data Acquisition Station for outdoor applications
NASA Astrophysics Data System (ADS)
Béland, M.-A.; Djupkep, F. B. D.; Bendada, A.; Maldague, X.; Ferrarini, G.; Bison, P.; Grinzato, E.
2013-05-01
The Infrared Images and Other Data Acquisition Station enables a user, who is located inside a laboratory, to acquire visible and infrared images and distances in an outdoor environment with the help of an Internet connection. This station can acquire data using an infrared camera, a visible camera, and a rangefinder. The system can be used through a web page or through Python functions.
Space imaging infrared optical guidance for autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu
2008-08-01
We have developed the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, based on an uncooled infrared camera and a focusing technique, to detect objects to be avoided and to set the drive path. For this purpose we built a servomotor drive system to control the focus of the infrared camera lens. To determine the best focus position we use autofocus image processing based on the Daubechies wavelet transform with four terms. The determined best focus position is then converted into the distance to the object. We built an aluminum-frame ground vehicle, 900 mm long and 800 mm wide, to carry the autofocus infrared unit; it is equipped with Ackermann front steering and a rear motor drive. To confirm the guidance ability of the system, we carried out experiments on the ability of the infrared autofocus unit to detect an actual car on the road and a roadside wall. As a result, the autofocus image processing based on the Daubechies wavelet transform clearly identifies the best-focused image and gives the depth of the object from the infrared camera unit.
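A minimal sketch of a wavelet-based focus metric along these lines, using PyWavelets; the 'db4' wavelet is used here as a stand-in because the abstract's "four terms" does not map unambiguously onto a library wavelet name, and the lens-position-to-distance calibration is omitted:

    import numpy as np
    import pywt  # PyWavelets

    def wavelet_focus_measure(image, wavelet="db4"):
        """Energy of the detail coefficients; a sharper image gives a larger value."""
        _, (cH, cV, cD) = pywt.dwt2(image.astype(np.float64), wavelet)
        return float((cH**2).sum() + (cV**2).sum() + (cD**2).sum())

    def best_focus_position(images_by_lens_position):
        """Return the lens position whose frame maximizes the focus metric.

        images_by_lens_position: dict {lens_position: image array} from a focus sweep.
        A calibration curve would then map this position to object distance.
        """
        return max(images_by_lens_position,
                   key=lambda p: wavelet_focus_measure(images_by_lens_position[p]))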
Measurement of reach envelopes with a four-camera Selective Spot Recognition (SELSPOT) system
NASA Technical Reports Server (NTRS)
Stramler, J. H., Jr.; Woolford, B. J.
1983-01-01
The basic Selective Spot Recognition (SELSPOT) system is essentially a system which uses infrared LEDs and a 'camera' with an infrared-sensitive photodetector, a focusing lens, and some A/D electronics to produce a digital output representing an X and Y coordinate for each LED for each camera. When the data are synthesized across all cameras with appropriate calibrations, an XYZ set of coordinates is obtained for each LED at a given point in time. Attention is given to the operating modes, a system checkout, and reach envelopes and software. The Video Recording Adapter (VRA) represents the main addition to the basic SELSPOT system. The VRA contains a microprocessor and other electronics which permit user selection of several options and some interaction with the system.
Uncooled infrared sensors: rapid growth and future perspective
NASA Astrophysics Data System (ADS)
Balcerak, Raymond S.
2000-07-01
Uncooled infrared cameras are now available for both military and commercial markets. The current camera technology incorporates the fruits of many years of development, focusing on the details of pixel design, novel material processing, and low noise read-out electronics. The rapid insertion of cameras into systems is testimony to the successful completion of this 'first phase' of development. In the military market, the first uncooled infrared cameras will be used for weapon sights, driver's viewers and helmet mounted cameras. Major commercial applications include night driving, security, police and fire fighting, and thermography, primarily for preventive maintenance and process control. The technology for the next generation of cameras is even more demanding, but within reach. The paper outlines the technology program planned for the next generation of cameras, and the approaches to further enhance performance, even to the radiation limit of thermal detectors.
Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda
2018-01-01
Due to the thermal hazard during composite restorations, this study was designed to monitor pulp temperature with a thermocouple and an infrared camera during photopolymerization of different composites. A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1 mm increments of each composite (four composite types were included) were inserted and photopolymerized using either an LED or a QTH system for 60 s while the temperature was recorded at 10 s intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. The data were then analyzed by repeated measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α = 0.05). The pulp temperature increased significantly (repeated measures) during photopolymerization (P = 0.000), while there was no significant difference between the results recorded by the thermocouple and those recorded by the infrared camera (P > 0.05). Moreover, different composite materials and LCUs led to similar outcomes (P > 0.05). Although the various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record parallel results for dental pulp temperature.
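The statistical step (repeated measures ANOVA on temperatures sampled every 10 s) can be reproduced in outline with statsmodels; the table below is synthetic and purely illustrative of the data layout, not the study's measurements:

    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    # Hypothetical long-format data: one row per reading, curing run as the "subject",
    # 10-s sampling time as the repeated (within-subject) factor.
    df = pd.DataFrame({
        "run":  [1, 1, 1, 2, 2, 2, 3, 3, 3],
        "time": [10, 20, 30, 10, 20, 30, 10, 20, 30],
        "temp": [36.9, 37.4, 38.1, 37.0, 37.6, 38.3, 36.8, 37.5, 38.2],
    })

    # Tests whether pulp temperature changes across sampling times.
    print(AnovaRM(df, depvar="temp", subject="run", within=["time"]).fit())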
A new high-speed IR camera system
NASA Technical Reports Server (NTRS)
Travis, Jeffrey W.; Shu, Peter K.; Jhabvala, Murzy D.; Kasten, Michael S.; Moseley, Samuel H.; Casey, Sean C.; Mcgovern, Lawrence K.; Luers, Philip J.; Dabney, Philip W.; Kaipa, Ravi C.
1994-01-01
A multi-organizational team at the Goddard Space Flight Center is developing a new far infrared (FIR) camera system which furthers the state of the art for this type of instrument by incorporating recent advances in several technological disciplines. All aspects of the camera system are optimized for operation at the high data rates required for astronomical observations in the far infrared. The instrument is built around a Blocked Impurity Band (BIB) detector array, which exhibits responsivity over a broad wavelength band and is capable of operating at 1000 frames/sec; the system consists of a focal plane dewar, a compact camera head electronics package, and a Digital Signal Processor (DSP)-based data system residing in a standard 486 personal computer. In this paper we discuss the overall system architecture, the focal plane dewar, and advanced features and design considerations for the electronics. This system, or one derived from it, may prove useful for many commercial and/or industrial infrared imaging or spectroscopic applications, including thermal machine vision for robotic manufacturing, photographic observation of short-duration thermal events such as combustion or chemical reactions, and high-resolution surveillance imaging.
Exploring the imaging properties of thin lenses for cryogenic infrared cameras
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura
2016-05-01
Designing a cryogenic camera is a good strategy for miniaturizing and simplifying an infrared camera that uses a cooled detector. Indeed, integrating the optics inside the cold shield makes it simple to athermalize the design, guarantees a cold pupil, and relaxes the constraint of a long back focal length for short-focal-length systems. In this way, cameras made of a single lens or two lenses are viable systems with good optical features and good stability in image correction. However, this involves a relatively significant additional optical mass inside the dewar and thus increases the cool-down time of the camera. ONERA is currently exploring a minimalist strategy consisting of giving an imaging function to the thin optical plates already found in conventional dewars. In this way, we could make a cryogenic camera that has the same cool-down time as a traditional dewar without an imaging function. Two examples are presented. The first is a camera using a dual-band infrared detector, made of a lens outside the dewar and a lens inside the cold shield, the latter carrying the main optical power of the system; we were able to design a cold plano-convex lens with a thickness lower than 1 mm. The second example is an evolution of a former cryogenic camera called SOIE: we replaced the cold meniscus with a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performances of both cameras are compared.
Yoon, Se Jin; Noh, Si Cheol; Choi, Heung Ho
2007-01-01
Using infrared cameras, infrared diagnostic devices provide two-dimensional images and patient-oriented results that can be easily understood by the person being examined; however, they have disadvantages such as large size, high price, and inconvenient maintenance. In this regard, this study proposed a small-sized body-heat diagnostic device using a single infrared sensor, and implemented an infrared detection system based on a single infrared sensor together with an algorithm that renders thermography from the acquired point-source temperature data. The developed system had a temperature resolution of 0.1 degree and a reproducibility of +/-0.1 degree. The accuracy was 90.39% at an error bound of +/-0 degree and 99.98% at an error bound of +/-0.1 degree. To evaluate the proposed algorithm and system, the results were compared with infrared images obtained by the camera method. Clinically meaningful thermal images were obtained from a patient with a lesion to verify the system's clinical applicability.
Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda
2018-01-01
Introduction: Due to the thermal hazard during composite restorations, this study was designed to monitor pulp temperature with a thermocouple and an infrared camera during photopolymerization of different composites. Methods and Materials: A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1 mm increments of each composite (four composite types were included) were inserted and photopolymerized using either an LED or a QTH system for 60 s while the temperature was recorded at 10 s intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. The data were then analyzed by repeated measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α = 0.05). Results: The pulp temperature increased significantly (repeated measures) during photopolymerization (P = 0.000), while there was no significant difference between the results recorded by the thermocouple and those recorded by the infrared camera (P > 0.05). Moreover, different composite materials and LCUs led to similar outcomes (P > 0.05). Conclusion: Although the various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record parallel results for dental pulp temperature. PMID:29707014
A low-cost dual-camera imaging system for aerial applicators
USDA-ARS?s Scientific Manuscript database
Agricultural aircraft provide a readily available remote sensing platform as low-cost and easy-to-use consumer-grade cameras are being increasingly used for aerial imaging. In this article, we report on a dual-camera imaging system we recently assembled that can capture RGB and near-infrared (NIR) i...
Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA
NASA Astrophysics Data System (ADS)
Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki
2017-11-01
SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid- and far-infrared astronomy with a cryogenically cooled 3-m class telescope, envisioned for launch in the early 2020s. The Mid-infrared Camera and Spectrometer (MCS) is a focal-plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38 μm. MCS consists of two relay optical modules and the following four scientific optical modules: WFC (Wide Field Camera; 5'x 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2'.5 long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000) and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here, we present the optical design and expected optical performance of MCS. Most of the MCS optics adopt an off-axis reflective design to cover the wide wavelength range of 5-38 μm without chromatic aberration and to minimize problems due to changes in the shapes and refractive indices of materials from room temperature to cryogenic temperature. In order to achieve the demanding requirements of wide field of view, small F-number, and large spectral resolving power in a compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]), a design method using free-form surfaces for compact reflective optics such as head-mounted displays. As a result, we have successfully designed compact reflective optics for MCS with as-built, diffraction-limited image resolution.
A portable W-band radar system for enhancement of infrared vision in fire fighting operations
NASA Astrophysics Data System (ADS)
Klenner, Mathias; Zech, Christian; Hülsmann, Axel; Kühn, Jutta; Schlechtweg, Michael; Hahmann, Konstantin; Kleiner, Bernhard; Ulrich, Michael; Ambacher, Oliver
2016-10-01
In this paper, we present a millimeter wave radar system which will enhance the performance of infrared cameras used for fire-fighting applications. The radar module is compact and lightweight such that the system can be combined with inertial sensors and integrated in a hand-held infrared camera. This allows for precise distance measurements in harsh environmental conditions, such as tunnel or industrial fires, where optical sensors are unreliable or fail. We discuss the design of the RF front-end, the antenna and a quasi-optical lens for beam shaping as well as signal processing and demonstrate the performance of the system by in situ measurements in a smoke filled environment.
NASA Astrophysics Data System (ADS)
Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott
2003-09-01
A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product enabling easy, drop-in camera designs for quick time-to-market product introductions.
Development of plenoptic infrared camera using low dimensional material based photodetectors
NASA Astrophysics Data System (ADS)
Chen, Liangliang
Infrared (IR) sensors have extended imaging from the submicron visible spectrum to wavelengths of tens of microns and are widely used in military and civilian applications. Conventional IR cameras based on bulk semiconductor materials suffer from low frame rate, low resolution, temperature dependence, and high cost, whereas nanotechnology based on low-dimensional materials such as the carbon nanotube (CNT) has made much progress in research and industry. The unique properties of CNTs motivate the investigation of CNT-based IR photodetectors and imaging systems to address the sensitivity, speed, and cooling difficulties of state-of-the-art IR imaging. Reliability and stability are critical to the transition from nanoscience to nanoengineering, especially for infrared sensing: not only for the fundamental understanding of CNT photoresponse processes, but also for the development of a novel infrared-sensitive material with unique optical and electrical features. In the proposed research, a sandwich-structured sensor was fabricated between two polymer layers: the polyimide substrate isolates the sensor from background noise, and the top parylene packing blocks humidity and other environmental factors. The fabrication process was optimized with real-time, electrically monitored dielectrophoresis and multiple annealing steps to improve fabrication yield and sensor performance. The nanoscale infrared photodetector was characterized with digital microscopy and a precise linear stage in order to understand it fully. In addition, a low-noise, high-gain readout system was designed together with the CNT photodetector to make a nano-sensor IR camera feasible. To explore more of the infrared light field, we employ compressive sensing algorithms for light-field sampling, 3D imaging, and compressive video sensing: the redundancy of the whole light field, including angular images for the light field, binocular images for the 3D camera, and temporal information of the video streams, is extracted and expressed in a compressive framework, and computational algorithms are then applied to reconstruct images beyond 2D static information. Super-resolution signal processing was then used to enhance the spatial resolution of the images. The whole camera system provides richly detailed content for infrared spectrum sensing.
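As a generic illustration of the compressive-sensing reconstruction step mentioned above (not the dissertation's actual algorithm or data), the following ISTA sketch recovers a sparse vector x from underdetermined measurements y = A x:

    import numpy as np

    def ista(A, y, lam=0.01, n_iter=500):
        """Iterative soft-thresholding for min ||Ax - y||^2 + lam * ||x||_1."""
        L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            z = x - A.T @ (A @ x - y) / L        # gradient step
            x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
        return x

    # Synthetic demo: recover an 8-sparse signal from 4x fewer random measurements.
    rng = np.random.default_rng(0)
    x_true = np.zeros(256)
    x_true[rng.choice(256, 8, replace=False)] = rng.normal(size=8)
    A = rng.normal(size=(64, 256)) / np.sqrt(64)
    x_hat = ista(A, A @ x_true)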
Status of the JWST Science Instrument Payload
NASA Technical Reports Server (NTRS)
Greenhouse, Matt
2016-01-01
The James Webb Space Telescope (JWST) Integrated Science Instrument Module (ISIM) system consists of five sensors (4 science): Mid-Infrared Instrument (MIRI), Near Infrared Imager and Slitless Spectrograph (NIRISS), Fine Guidance Sensor (FGS), Near InfraRed Camera (NIRCam), Near InfraRed Spectrograph (NIRSpec); and nine instrument support systems: Optical metering structure system, Electrical Harness System; Harness Radiator System, ISIM Electronics Compartment, ISIM Remote Services Unit, Cryogenic Thermal Control System, Command and Data Handling System, Flight Software System, Operations Scripts System.
Infrared imaging spectrometry by the use of bundled chalcogenide glass fibers and a PtSi CCD camera
NASA Astrophysics Data System (ADS)
Saito, Mitsunori; Kikuchi, Katsuhiro; Tanaka, Chinari; Sone, Hiroshi; Morimoto, Shozo; Yamashita, Toshiharu T.; Nishii, Junji
1999-10-01
A coherent fiber bundle for infrared image transmission was prepared by arranging 8400 chalcogenide (AsS) glass fibers. The fiber bundle, 1 m in length, is transmissive in the infrared spectral region of 1 - 6 micrometer. A remote spectroscopic imaging system was constructed with the fiber bundle and an infrared PtSi CCD camera. The system was used for the real-time observation (frame time: 1/60 s) of gas distribution. Infrared light from a SiC heater was delivered to a gas cell through a chalcogenide fiber, and transmitted light was observed through the fiber bundle. A band-pass filter was used for the selection of gas species. A He-Ne laser of 3.4 micrometer wavelength was also used for the observation of hydrocarbon gases. Gases bursting from a nozzle were observed successfully by a remote imaging system.
Thermal-depth matching in dynamic scene based on affine projection and feature registration
NASA Astrophysics Data System (ADS)
Wang, Hongyu; Jia, Tong; Wu, Chengdong; Li, Yongqiang
2018-03-01
This paper studies the construction of a 3D temperature distribution reconstruction system based on depth and thermal infrared information. A traditional calibration method cannot be used directly, because depth and thermal infrared cameras are not sensitive to a color calibration board; therefore, a calibration board suitable for depth and thermal infrared cameras is designed to complete their calibration. A local feature descriptor for thermal and depth images is also proposed, and a belief-propagation matching algorithm is investigated that combines spatial affine-transformation matching with local feature matching. The 3D temperature distribution model is built by matching the 3D point cloud with the 2D thermal infrared information. Experimental results show that the method can accurately construct the 3D temperature distribution model and has strong robustness.
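A minimal sketch of the affine-registration step between matched thermal and depth keypoints, using OpenCV; the feature detection, descriptor matching, and belief-propagation refinement described in the paper are omitted, and all variable names are placeholders:

    import numpy as np
    import cv2

    def thermal_to_depth_affine(pts_thermal, pts_depth):
        """Robustly fit a 2x3 affine map from matched thermal->depth points (N >= 3)."""
        src = np.asarray(pts_thermal, dtype=np.float32).reshape(-1, 1, 2)
        dst = np.asarray(pts_depth, dtype=np.float32).reshape(-1, 1, 2)
        M, inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC,
                                          ransacReprojThreshold=3.0)
        return M, inliers

    # The thermal frame can then be warped into the depth image geometry:
    # aligned = cv2.warpAffine(thermal_img, M, (depth_width, depth_height))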
High-frame-rate infrared and visible cameras for test range instrumentation
NASA Astrophysics Data System (ADS)
Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.
1995-09-01
Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.
Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras
NASA Technical Reports Server (NTRS)
Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.
2011-01-01
The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 μm) or long-wave infrared (LWIR) radiation (8-12 μm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24 hour water and 12 hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.
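The stereo ranging underlying this work follows the usual rectified-stereo geometry, range = focal_length x baseline / disparity, independent of waveband. A toy sketch with hypothetical parameters (not JPL's values):

    import numpy as np

    def disparity_to_range(disparity_px, focal_px, baseline_m):
        """Convert a rectified stereo disparity map (pixels) to range (meters)."""
        d = np.asarray(disparity_px, dtype=np.float64)
        rng = np.full(d.shape, np.inf)
        valid = d > 0
        rng[valid] = focal_px * baseline_m / d[valid]
        return rng

    # Example: 25 px disparity, 800 px focal length, 0.5 m baseline -> 16 m range.
    print(disparity_to_range(np.array([25.0]), focal_px=800.0, baseline_m=0.5))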
NASA Technical Reports Server (NTRS)
Tueller, Jack (Technical Monitor); Fazio, Giovanni G.; Tolls, Volker
2004-01-01
The purpose of this study was to investigate the feasibility of developing a daytime star tracker for ULDB flights using a commercially available off-the-shelf infrared array camera. This report describes the system used for ground-based tests, the observations, the test results, and gives recommendations for continued development.
Teaching physics and understanding infrared thermal imaging
NASA Astrophysics Data System (ADS)
Vollmer, Michael; Möllmann, Klaus-Peter
2017-08-01
Infrared thermal imaging is a very rapidly evolving field. The latest trends are small smartphone IR camera accessories, making infrared imaging a widespread and well-known consumer product. Applications range from medical diagnosis and building inspection to industrial predictive maintenance and visualization in the natural sciences. Infrared cameras allow not only qualitative imaging and visualization but also quantitative measurements of the surface temperatures of objects. On the one hand, they are a particularly suitable tool for teaching optics, radiation physics, and selected topics in many fields of physics; on the other hand, there is an increasing need for engineers and physicists who understand these complex, state-of-the-art photonics systems. Therefore students must also learn and understand the physics underlying these systems.
NASA Technical Reports Server (NTRS)
Harper, D. A.
1996-01-01
The objective of this grant was to construct a series of far infrared photometers, cameras, and supporting systems for use in astronomical observations in the Kuiper Airborne Observatory. The observations have included studies of galaxies, star formation regions, and objects within the Solar System.
Optimal design of an earth observation optical system with dual spectral and high resolution
NASA Astrophysics Data System (ADS)
Yan, Pei-pei; Jiang, Kai; Liu, Kai; Duan, Jing; Shan, Qiusha
2017-02-01
With the increasing demand from military and civilian users for high-resolution remote sensing images, countries around the world are optimistic about the prospects of ever higher resolution remote sensing imagery, and designing an integrated visible/infrared optical system has important value for Earth observation. Because a visible-band system cannot identify camouflage or observe at night, the visible camera should be combined with an infrared camera. An Earth-observation optical system with dual spectral bands and high resolution is designed. This paper mainly investigates the integrated design of the visible and infrared optical system, which makes the system lighter and smaller and achieves one satellite with two uses. The working wavebands of the system cover the visible and the middle infrared (3-5 um). Clear dual-waveband imaging is achieved with a dispersive RC system. The focal length of the visible system is 3056 mm with F/# of 10.91, and the focal length of the middle infrared system is 1120 mm with F/# of 4. In order to suppress middle-infrared thermal radiation and stray light, a second imaging (re-imaging) stage is used and the narcissus phenomenon is analyzed. A key characteristic of the system is its simple structure, and the special requirements on the Modulation Transfer Function (MTF), spot size, energy concentration, distortion, etc. are all satisfied.
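A quick check from the quoted figures: the entrance pupil diameter D = f / F# works out to about 280 mm for both channels, which is consistent with a shared aperture (this inference is ours; the abstract does not state it explicitly):

    # Entrance pupil diameter D = focal_length / F-number, using the abstract's values.
    d_visible = 3056.0 / 10.91   # ~280 mm
    d_mwir    = 1120.0 / 4.0     # = 280 mm
    print(d_visible, d_mwir)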
Yang, Xiaofeng; Wu, Wei; Wang, Guoan
2015-04-01
This paper presents a surgical optical navigation system with non-invasive, real-time positioning capability for open surgical procedures. The design is based on the principle of near-infrared fluorescence molecular imaging and uses in vivo fluorescence excitation, a multi-channel spectral camera, and image fusion software. A ring-shaped LED excitation source emitting visible and near-infrared light, multi-channel band-pass filters, a two-CCD spectral camera, and a computer system were integrated, and as a result a new surgical optical navigation system was successfully developed. When a near-infrared fluorescent agent is injected, the system can simultaneously display anatomical images of the tissue surface and near-infrared fluorescence functional images of the surgical field. The system can identify lymphatic vessels, lymph nodes, and tumor margins that the surgeon cannot see with the naked eye intraoperatively. It effectively guides the surgeon in removing tumor tissue and can significantly improve the success rate of surgery. The technologies have obtained a national patent, with patent No. ZI. 2011 1 0292374. 1.
Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera
NASA Astrophysics Data System (ADS)
Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu
2016-09-01
We perform an achromatic imaging experiment with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under visible and infrared illumination. As a result, the removal of chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration setup simulating the use of a night-vision camera in an automobile and showed the effect of the WFC system.
New gonioscopy system using only infrared light.
Sugimoto, Kota; Ito, Kunio; Matsunaga, Koichi; Miura, Katsuya; Esaki, Koji; Uji, Yukitaka
2005-08-01
To describe an infrared gonioscopy system designed to observe the anterior chamber angle under natural mydriasis in a completely darkened room. An infrared light filter was used to modify the light source of the slit-lamp microscope. A television monitor connected to a CCD monochrome camera was used to indirectly observe the angle. Use of the infrared system enabled observation of the angle under natural mydriasis in a completely darkened room. Infrared gonioscopy is a useful procedure for the observation of the angle under natural mydriasis.
Near-infrared transillumination photography of intraocular tumours.
Krohn, Jørgen; Ulltang, Erlend; Kjersem, Bård
2013-10-01
To present a technique for near-infrared transillumination imaging of intraocular tumours based on the modifications of a conventional digital slit lamp camera system. The Haag-Streit Photo-Slit Lamp BX 900 (Haag-Streit AG) was used for transillumination photography by gently pressing the tip of the background illumination cable against the surface of the patient's eye. Thus the light from the flash unit was transmitted into the eye, leading to improved illumination and image resolution. The modification for near-infrared photography was done by replacing the original camera with a Canon EOS 30D (Canon Inc) converted by Advanced Camera Services Ltd. In this camera, the infrared blocking filter was exchanged for a 720 nm long-pass filter, so that the near-infrared part of the spectrum was recorded by the sensor. The technique was applied in eight patients: three with anterior choroidal melanoma, three with ciliary body melanoma and two with ocular pigment alterations. The good diagnostic quality of the photographs made it possible to evaluate the exact location and extent of the lesions in relation to pigmented intraocular landmarks such as the ora serrata and ciliary body. The photographic procedure did not lead to any complications. We recommend near-infrared transillumination photography as a supplementary diagnostic tool for the evaluation and documentation of anteriorly located intraocular tumours.
Development of a portable multispectral thermal infrared camera
NASA Technical Reports Server (NTRS)
Osterwisch, Frederick G.
1991-01-01
The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument, designated AA465, has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0 - 13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5 micron wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display, which the operator uses to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument was performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The instrument was delivered to NASA in March 1991. The primary potential commercial and research use for this instrument is as a field geologist's exploration tool. Other applications have been suggested but not investigated in depth; these include process control measurements in commercial materials processing and quality control functions that require information on surface heterogeneity.
Multi-spectral imaging with infrared sensitive organic light emitting diode
Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky
2014-01-01
Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxial grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589
Multi-spectral imaging with infrared sensitive organic light emitting diode
NASA Astrophysics Data System (ADS)
Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky
2014-08-01
Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxial grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions.
NASA Astrophysics Data System (ADS)
Gogler, Slawomir; Bieszczad, Grzegorz; Krupinski, Michal
2013-10-01
Thermal imagers and the infrared array sensors used in them undergo a calibration procedure, and their voltage response to incident radiation is evaluated, during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities given in physical units are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represent temperature variations across the scene. The detectors in a thermal camera are illuminated by infrared radiation transmitted through an infrared-transmitting optical system, and an optical system exposed to a uniform Lambertian source often forms a non-uniform irradiance distribution in its image plane. In order to carry out an accurate non-uniformity correction, it is therefore essential to correctly predict the irradiance distribution produced by a uniform source. In this article, a non-uniformity correction method is presented that takes the optical system's radiometry into account. Predictions of the irradiance distribution are compared with measured irradiance values. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
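A simplified two-point non-uniformity correction in the spirit described above, where the optical model's predicted irradiance falloff is divided out so that genuine vignetting is not absorbed into the detector gain map. This is an illustrative sketch under assumed inputs, not the authors' exact method:

    import numpy as np

    def two_point_nuc(raw, flat_cold, flat_hot, rel_irradiance):
        """Two-point NUC combined with a radiometric model of the optics.

        flat_cold / flat_hot: responses to uniform blackbody sources at two temperatures.
        rel_irradiance: per-pixel relative irradiance (1.0 on axis) predicted by the
                        optical model for a uniform Lambertian source.
        """
        cold = flat_cold / rel_irradiance
        hot = flat_hot / rel_irradiance
        gain = (hot.mean() - cold.mean()) / (hot - cold)   # per-pixel gain
        offset = cold.mean() - gain * cold                 # per-pixel offset
        return gain * (raw / rel_irradiance) + offset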
Night vision imaging system design, integration and verification in spacecraft vacuum thermal test
NASA Astrophysics Data System (ADS)
Shang, Yonghong; Wang, Jing; Gong, Zhe; Li, Xiyuan; Pei, Yifei; Bai, Tingzhu; Zhen, Haijing
2015-08-01
The purposes of a spacecraft vacuum thermal test are to characterize the thermal control systems of the spacecraft and its components in the cruise configuration and to allow for early retirement of risks associated with mission-specific and novel thermal designs. The orbital heat flux is simulated by an infrared lamp, an infrared cage or an electric heater. Because the infrared cage and electric heater emit no visible light, and the infrared lamp emits only limited visible light, an ordinary camera cannot operate under the low luminous density of the test. Moreover, some special instruments such as satellite-borne infrared sensors are sensitive to visible light, so supplementary lighting cannot be used during the test. To improve the ability to finely monitor the spacecraft and to document test progress under ultra-low luminous density, a night vision imaging system was designed and integrated by BISEE. The system consists of a high-gain image-intensifier ICCD camera, an assistant luminance system, a glare protection system, a thermal control system and a computer control system. Multi-frame accumulation target detection technology is adopted for high-quality image recognition in the captive test. The optical, mechanical and electrical systems are designed and integrated to be highly adaptable to the vacuum environment. A molybdenum/polyimide thin-film electrical heater controls the temperature of the ICCD camera. The performance validation test showed that the system can operate in a vacuum thermal environment of 1.33×10⁻³ Pa and a 100 K shroud temperature in the space environment simulator, and its working temperature was maintained at 5 °C during the two-day test. The night vision imaging system obtained video with a resolving power of 60 lp/mm.
Binocular Multispectral Adaptive Imaging System (BMAIS)
2010-07-26
system for pilots that adaptively integrates shortwave infrared (SWIR), visible, near-IR (NIR), off-head thermal, and computer symbology/imagery into...respective areas. BMAIS is a binocular helmet mounted imaging system that features dual shortwave infrared (SWIR) cameras, embedded image processors and...algorithms and fusion of other sensor sites such as forward looking infrared (FLIR) and other aircraft subsystems. BMAIS is attached to the helmet
SLR digital camera for forensic photography
NASA Astrophysics Data System (ADS)
Har, Donghwan; Son, Youngho; Lee, Sungwon
2004-06-01
Forensic photography, which was systematically established in the late 19th century by Alphonse Bertillon of France, has developed considerably over the past 100 years. This development will accelerate further with advances in high technology, in particular digital technology. This paper reviews three studies to answer the question: can the SLR digital camera replace traditional silver-halide ultraviolet and infrared photography? 1. Comparison of the relative ultraviolet and infrared sensitivity of the SLR digital camera with silver-halide photography. 2. How much is ultraviolet or infrared sensitivity improved when the UV/IR cutoff filter built into the SLR digital camera is removed? 3. Comparison of the relative sensitivity of CCD and CMOS sensors for ultraviolet and infrared. The tests showed that the SLR digital camera has very low sensitivity for ultraviolet and infrared. The cause was found to be the UV/IR cutoff filter mounted in front of the image sensor. Removing the UV/IR cutoff filter significantly improved the sensitivity for ultraviolet and infrared. Particularly for infrared, the sensitivity of the SLR digital camera was better than that of silver-halide film. This shows the possibility of replacing silver-halide ultraviolet and infrared photography with the SLR digital camera. Thus, the SLR digital camera appears useful for forensic photography, which involves a large number of ultraviolet and infrared photographs.
NASA Technical Reports Server (NTRS)
Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.
2002-01-01
The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.
A new spherical scanning system for infrared reflectography of paintings
NASA Astrophysics Data System (ADS)
Gargano, M.; Cavaliere, F.; Viganò, D.; Galli, A.; Ludwig, N.
2017-03-01
Infrared reflectography is an imaging technique used to visualize the underdrawings of ancient paintings; it relies on the fact that most pigment layers are quite transparent to infrared radiation in the spectral band between 0.8 μm and 2.5 μm. InGaAs cameras are nowadays the devices most used to visualize underdrawings, but due to the small size of the detectors these cameras are usually mounted on scanning systems to record high-resolution reflectograms. This work describes a portable scanning-system prototype based on a spherical scanning geometry implemented with a lightweight, low-cost motorized head. The motorized head was built to allow the refocusing adjustment needed to compensate for the variable camera-painting distance during the rotation of the camera. The prototype was tested first in the laboratory and then in situ on the Giotto panel "God the Father with Angels" at a resolution of 256 pixels per inch. The system performance is comparable with that of other reflectographic devices, with the advantage of extending the scanned area up to 1 m × 1 m with a 40 min scanning time. The present configuration can easily be modified to increase the resolution up to 560 pixels per inch or to extend the scanned area up to 2 m × 2 m.
A DirtI Application for LBT Commissioning Campaigns
NASA Astrophysics Data System (ADS)
Borelli, J. L.
2009-09-01
In order to characterize the Gregorian focal stations and test the performance achieved by the Large Binocular Telescope (LBT) adaptive optics system, two infrared test cameras were constructed within a joint project between INAF (Osservatorio Astronomico di Bologna, Italy) and the Max Planck Institute for Astronomy (Germany). We describe here the functionality of, and the successful results obtained with, the Daemon for the Infrared Test Camera Interface (DirtI) during commissioning campaigns.
Characterization and optimization for detector systems of IGRINS
NASA Astrophysics Data System (ADS)
Jeong, Ueejeong; Chun, Moo-Young; Oh, Jae Sok; Park, Chan; Yuk, In-Soo; Oh, Heeyoung; Kim, Kang-Min; Ko, Kyeong Yeon; Pavel, Michael D.; Yu, Young Sam; Jaffe, Daniel T.
2014-07-01
IGRINS (Immersion GRating INfrared Spectrometer) is a high resolution wide-band infrared spectrograph developed by the Korea Astronomy and Space Science Institute (KASI) and the University of Texas at Austin (UT). This spectrograph has H-band and K-band science cameras and a slit viewing camera, all three of which use Teledyne's λc~2.5μm 2k×2k HgCdTe HAWAII-2RG CMOS detectors. The two spectrograph cameras employ science grade detectors, while the slit viewing camera includes an engineering grade detector. Teledyne's cryogenic SIDECAR ASIC boards and JADE2 USB interface cards were installed to control those detectors. We performed experiments to characterize and optimize the detector systems in the IGRINS cryostat. We present measurements and optimization of noise, dark current, and reference-level stability obtained under dark conditions. We also discuss well depth, linearity and conversion gain measurements obtained using an external light source.
Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung
2017-07-08
A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
Feasibility evaluation of a motion detection system with face images for stereotactic radiosurgery.
Yamakawa, Takuya; Ogawa, Koichi; Iyatomi, Hitoshi; Kunieda, Etsuo
2011-01-01
In stereotactic radiosurgery we can irradiate a targeted volume precisely with a narrow high-energy x-ray beam, and thus the motion of a targeted area may cause side effects to normal organs. This paper describes our motion detection system with three USB cameras. To reduce the effect of change in illuminance in a tracking area we used an infrared light and USB cameras that were sensitive to the infrared light. The motion detection of a patient was performed by tracking his/her ears and nose with three USB cameras, where pattern matching between a predefined template image for each view and acquired images was done by an exhaustive search method with a general-purpose computing on a graphics processing unit (GPGPU). The results of the experiments showed that the measurement accuracy of our system was less than 0.7 mm, amounting to less than half of that of our previous system.
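The GPU implementation and the actual landmark templates are not reproduced in the abstract; the sketch below is a CPU-only illustration of the exhaustive template-matching step using OpenCV's normalized cross-correlation, with a hypothetical template file and camera index.

```python
import cv2

def track_landmark(frame_gray, template_gray):
    """Exhaustive normalized cross-correlation search for one landmark
    (ear or nose) template in a single camera view."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val   # top-left corner of the best match and its score

# Hypothetical usage with one IR-sensitive USB camera and a stored template
cap = cv2.VideoCapture(0)
template = cv2.imread("nose_template.png", cv2.IMREAD_GRAYSCALE)
ok, frame = cap.read()
if ok and template is not None:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    loc, score = track_landmark(gray, template)
    print("best match at", loc, "score", score)
cap.release()
```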
Gundle, Kenneth R; White, Jedediah K; Conrad, Ernest U; Ching, Randal P
2017-01-01
Surgical navigation systems are increasingly used to aid resection and reconstruction of osseous malignancies. In the process of implementing image-based surgical navigation systems, there are numerous opportunities for error that may impact surgical outcome. This study aimed to examine modifiable sources of error in an idealized scenario, when using a bidirectional infrared surgical navigation system. Accuracy and precision were assessed using a computerized-numerical-controlled (CNC) machined grid with known distances between indentations while varying: 1) the distance from the grid to the navigation camera (range 150 to 247cm), 2) the distance from the grid to the patient tracker device (range 20 to 40cm), and 3) whether the minimum or maximum number of bidirectional infrared markers were actively functioning. For each scenario, distances between grid points were measured at 10-mm increments between 10 and 120mm, with twelve measurements made at each distance. The accuracy outcome was the root mean square (RMS) error between the navigation system distance and the actual grid distance. To assess precision, four indentations were recorded six times for each scenario while also varying the angle of the navigation system pointer. The outcome for precision testing was the standard deviation of the distance between each measured point to the mean three-dimensional coordinate of the six points for each cluster. Univariate and multiple linear regression revealed that as the distance from the navigation camera to the grid increased, the RMS error increased (p<0.001). The RMS error also increased when not all infrared markers were actively tracking (p=0.03), and as the measured distance increased (p<0.001). In a multivariate model, these factors accounted for 58% of the overall variance in the RMS error. Standard deviations in repeated measures also increased when not all infrared markers were active (p<0.001), and as the distance between navigation camera and physical space increased (p=0.005). Location of the patient tracker did not affect accuracy (0.36) or precision (p=0.97). In our model laboratory test environment, the infrared bidirectional navigation system was more accurate and precise when the distance from the navigation camera to the physical (working) space was minimized and all bidirectional markers were active. These findings may require alterations in operating room setup and software changes to improve the performance of this system.
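The two outcome metrics described above are straightforward to compute; a minimal sketch follows, with purely illustrative numbers rather than the study's data.

```python
import numpy as np

def rms_error(measured_mm, actual_mm):
    """Accuracy metric: root-mean-square error between navigation-system
    distances and the known CNC grid distances."""
    d = np.asarray(measured_mm, float) - np.asarray(actual_mm, float)
    return np.sqrt(np.mean(d ** 2))

def cluster_precision(points_mm):
    """Precision metric: standard deviation of the distance from each
    repeated measurement to the cluster's mean 3D coordinate."""
    pts = np.asarray(points_mm, float)          # shape (n, 3)
    dists = np.linalg.norm(pts - pts.mean(axis=0), axis=1)
    return dists.std(ddof=1)

# Illustrative values only (not the study's data)
print(rms_error([10.4, 20.7, 30.1], [10, 20, 30]))
print(cluster_precision([[0, 0, 0], [0.2, -0.1, 0.1],
                         [0.1, 0.1, -0.2], [-0.1, 0.2, 0.0],
                         [0.0, -0.2, 0.1], [0.1, 0.0, 0.2]]))
```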
Near infra-red astronomy with adaptive optics and laser guide stars at the Keck Observatory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Max, C.E.; Gavel, D.T.; Olivier, S.S.
1995-08-03
A laser guide star adaptive optics system is being built for the W. M. Keck Observatory's 10-meter Keck II telescope. Two new near infra-red instruments will be used with this system: a high-resolution camera (NIRC 2) and an echelle spectrometer (NIRSPEC). The authors describe the expected capabilities of these instruments for high-resolution astronomy, using adaptive optics with either a natural star or a sodium-layer laser guide star as a reference. They compare the expected performance of these planned Keck adaptive optics instruments with that predicted for the NICMOS near infra-red camera, which is scheduled to be installed on the Hubble Space Telescope in 1997.
A fuzzy automated object classification by infrared laser camera
NASA Astrophysics Data System (ADS)
Kanazawa, Seigo; Taniguchi, Kazuhiko; Asari, Kazunari; Kuramoto, Kei; Kobashi, Syoji; Hata, Yutaka
2011-06-01
Home security at night is very important, and a system that watches a person's movements is useful for security. This paper describes a system for classifying adults, children and other objects from the distance distribution measured by an infrared laser camera. The camera radiates near-infrared waves, receives the reflected waves and converts the time of flight into a distance distribution. Our method consists of four steps. First, we perform background subtraction and noise rejection on the distance distribution. Second, we apply fuzzy clustering to the distance distribution and form several clusters. Third, we extract features such as the height, thickness, aspect ratio and area ratio of each cluster. We then construct fuzzy if-then rules from knowledge of adults, children and other objects so as to classify each cluster as adult, child or other object, defining a fuzzy membership function for each feature. Finally, we assign each cluster to the class with the highest fuzzy degree among adult, child and other object. In our experiment, we set up the camera in a room and tested three cases. The method successfully classified them in real-time processing.
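The authors' actual membership functions and rule base are not given in the abstract; the sketch below illustrates the rule-evaluation step with made-up triangular membership functions over two of the named features (height and thickness).

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak at b."""
    return float(np.clip(min((x - a) / (b - a + 1e-9),
                             (c - x) / (c - b + 1e-9)), 0.0, 1.0))

def classify(height_m, thickness_m):
    """Evaluate one illustrative if-then rule per class (min as the AND
    operator) and return the class with the highest fuzzy degree."""
    degrees = {
        "adult": min(tri(height_m, 1.4, 1.7, 2.1), tri(thickness_m, 0.2, 0.35, 0.6)),
        "child": min(tri(height_m, 0.7, 1.1, 1.5), tri(thickness_m, 0.1, 0.25, 0.45)),
        "other": min(tri(height_m, 0.0, 0.4, 0.9), tri(thickness_m, 0.0, 0.3, 1.0)),
    }
    return max(degrees, key=degrees.get), degrees

label, degrees = classify(height_m=1.65, thickness_m=0.3)
print(label, degrees)
```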
Low-cost panoramic infrared surveillance system
NASA Astrophysics Data System (ADS)
Kecskes, Ian; Engel, Ezra; Wolfe, Christopher M.; Thomson, George
2017-05-01
A nighttime surveillance concept consisting of a single-surface omnidirectional mirror assembly and an uncooled vanadium oxide (VOx) longwave infrared (LWIR) camera has been developed. This configuration provides a continuous field of view spanning 360° in azimuth and more than 110° in elevation. Both the camera and the mirror are readily available, off-the-shelf, inexpensive products. The mirror assembly is marketed for use in the visible spectrum and requires only minor modifications to function in the LWIR spectrum. The compactness and portability of this optical package offer significant advantages over many existing infrared surveillance systems. The developed system was evaluated on its ability to detect moving, human-sized heat sources at ranges between 10 m and 70 m. Raw camera images captured by the system are converted from rectangular coordinates in the camera focal plane to polar coordinates and then unwrapped into the user's azimuth and elevation system. Digital background subtraction and color mapping are applied to the images to increase the user's ability to extract moving items from background clutter. A second optical system, consisting of a commercially available 50 mm f/1.2 ATHERM lens and a second LWIR camera, is used to examine the details of objects of interest identified with the panoramic imager. A description of the components of the proof of concept is given, followed by a presentation of raw images taken by the panoramic LWIR imager. The method by which these images are analyzed is described, and the results are presented side-by-side with the output of the 50 mm LWIR imager and a panoramic visible-light imager. Finally, the concept and its future development are discussed.
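A rough sketch of the rectangular-to-polar unwrapping and background-subtraction steps is given below; it assumes the image radius maps linearly to elevation, which only approximates the real mirror profile, and uses nearest-neighbour sampling for brevity.

```python
import numpy as np

def unwrap_panorama(img, center, r_min, r_max, n_az=1440, n_el=240):
    """Map an omnidirectional LWIR frame from focal-plane (x, y) coordinates
    to an azimuth/elevation panorama by sampling along radial lines."""
    az = np.linspace(0, 2 * np.pi, n_az, endpoint=False)
    r = np.linspace(r_min, r_max, n_el)
    rr, aa = np.meshgrid(r, az, indexing="ij")           # shape (n_el, n_az)
    x = np.clip((center[0] + rr * np.cos(aa)).astype(int), 0, img.shape[1] - 1)
    y = np.clip((center[1] + rr * np.sin(aa)).astype(int), 0, img.shape[0] - 1)
    return img[y, x]                                     # nearest-neighbour sample

def highlight_motion(frame, background, threshold=2.0):
    """Simple digital background subtraction to pull moving heat sources
    out of static clutter."""
    return np.abs(frame.astype(float) - background) > threshold

raw = np.random.rand(480, 640)                           # stand-in LWIR frame
pano = unwrap_panorama(raw, center=(320, 240), r_min=40, r_max=230)
```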
Lock-in thermography using a cellphone attachment infrared camera
NASA Astrophysics Data System (ADS)
Razani, Marjan; Parkhimchyk, Artur; Tabatabaei, Nima
2018-03-01
Lock-in thermography (LIT) is a thermal-wave-based, non-destructive testing technique which has been widely utilized in research settings for the characterization and evaluation of biological and industrial materials. However, despite promising research outcomes, the widespread adoption of LIT in industry, and its commercialization, is hindered by the high cost of the infrared cameras used in LIT setups. In this paper, we report on the feasibility of using inexpensive cellphone-attachment infrared cameras for performing LIT. While the cost of such cameras is over two orders of magnitude lower than that of their research-grade counterparts, our experimental results on a block sample with subsurface defects and a tooth with early dental caries suggest that acceptable performance can be achieved through careful instrumentation and implementation of proper data acquisition and image processing steps. We anticipate that this study will pave the way for the development of low-cost thermography systems and their commercialization as inexpensive tools for non-destructive testing of industrial samples as well as affordable clinical devices for diagnostic imaging of biological tissues.
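The core of any LIT processing chain is per-pixel demodulation of the recorded frame stack at the excitation frequency; a minimal sine/cosine correlation sketch on a synthetic stack is shown below (the authors' exact acquisition and processing steps are not reproduced here).

```python
import numpy as np

def lockin_demodulate(frames, f_mod, fs):
    """Per-pixel lock-in demodulation: correlate the thermal frame stack
    with sine/cosine references at the modulation frequency and return
    amplitude and phase images."""
    frames = np.asarray(frames, dtype=float)          # shape (n_frames, h, w)
    t = np.arange(frames.shape[0]) / fs
    ref_s = np.sin(2 * np.pi * f_mod * t)[:, None, None]
    ref_c = np.cos(2 * np.pi * f_mod * t)[:, None, None]
    s = (frames * ref_s).mean(axis=0)
    c = (frames * ref_c).mean(axis=0)
    return 2.0 * np.hypot(s, c), np.arctan2(s, c)     # amplitude, phase

# Synthetic check: 200 frames at 10 Hz, 0.5 Hz modulation, 1.5 K swing
fs, f_mod = 10.0, 0.5
t = np.arange(200) / fs
stack = 300 + 1.5 * np.sin(2 * np.pi * f_mod * t)[:, None, None] * np.ones((1, 32, 32))
amp, phase = lockin_demodulate(stack, f_mod, fs)      # amp is ~1.5 everywhere
```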
Infrared cameras are potential traceable "fixed points" for future thermometry studies.
Yap Kannan, R; Keresztes, K; Hussain, S; Coats, T J; Bown, M J
2015-01-01
The National Physical Laboratory (NPL) requires that "fixed points" whose temperatures have been established by the International Temperature Scale of 1990 (ITS-90) be used for device calibration. In practice, a "near" blackbody radiator together with the standard platinum resistance thermometer is accepted as a standard. The aim of this study was to report the correlation and limits of agreement (LOA) of a thermal infrared camera and a non-contact infrared temporal thermometer against each other and against the "near" blackbody radiator. Temperature readings from an infrared thermography camera (FLIR T650sc) and a non-contact infrared temporal thermometer (Hubdic FS-700) were compared to a near blackbody (Hyperion R blackbody model 982) at 0.5 °C increments between 20 and 40 °C. At each increment, the blackbody cavity temperature was confirmed with the platinum resistance thermometer. Measurements were taken first with the thermal infrared camera and then with the infrared thermometer, with each device mounted in turn on a stand at a fixed distance of 20 cm and 5 cm from the blackbody aperture, respectively. The platinum thermometer under-estimated the blackbody temperature by 0.015 °C (95% LOA: -0.08 °C to 0.05 °C), in contrast to the thermal infrared camera and infrared thermometer, which over-estimated the blackbody temperature by 0.16 °C (95% LOA: 0.03 °C to 0.28 °C) and 0.75 °C (95% LOA: -0.30 °C to 1.79 °C), respectively. The infrared thermometer over-estimated the thermal infrared camera measurements by 0.6 °C (95% LOA: -0.46 °C to 1.65 °C). In conclusion, the thermal infrared camera is a potential temperature reference "fixed point" that could substitute for mercury thermometers. However, further repeatability and reproducibility studies will be required with different models of thermal infrared cameras.
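The bias and 95% limits of agreement quoted above follow from a standard Bland-Altman calculation; a short sketch with synthetic readings (not the study's data) is shown below.

```python
import numpy as np

def bland_altman(device, reference):
    """Mean bias and 95% limits of agreement between a device under test
    and the blackbody reference readings."""
    diff = np.asarray(device, float) - np.asarray(reference, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic readings in deg C at 0.5 deg increments from 20 to 40
blackbody = np.arange(20.0, 40.5, 0.5)
ir_camera = blackbody + np.random.normal(0.16, 0.06, blackbody.size)
bias, loa = bland_altman(ir_camera, blackbody)
print(f"bias {bias:+.2f} C, 95% LOA {loa[0]:+.2f} to {loa[1]:+.2f} C")
```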
Infrared Camera Diagnostic for Heat Flux Measurements on NSTX
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Mastrovito; R. Maingi; H.W. Kugel
2003-03-25
An infrared imaging system has been installed on NSTX (National Spherical Torus Experiment) at the Princeton Plasma Physics Laboratory to measure the surface temperatures on the lower divertor and center stack. The imaging system is based on an Indigo Alpha 160 × 128 microbolometer camera with 12 bits/pixel operating in the 7-13 μm range with a 30 Hz frame rate and a dynamic temperature range of 0-700 °C. From these data and knowledge of graphite thermal properties, the heat flux is derived with a classic one-dimensional conduction model. Preliminary results of heat flux scaling are reported.
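The abstract does not specify which one-dimensional scheme is used on NSTX; one classic semi-infinite-solid inversion that converts a surface-temperature history into surface heat flux is the Cook-Felderman formula, sketched below with illustrative (not NSTX) material properties and data.

```python
import numpy as np

def cook_felderman_heat_flux(T_surface, dt, rho, c, k):
    """Classic 1-D semi-infinite-solid inversion (Cook-Felderman form):
    surface heat flux from a sampled surface-temperature history.
    This is a generic textbook formula, not necessarily the NSTX code."""
    T = np.asarray(T_surface, dtype=float)
    t = np.arange(T.size) * dt
    coeff = 2.0 * np.sqrt(rho * c * k / np.pi)
    q = np.zeros(T.size)
    for j in range(1, T.size):
        i = np.arange(1, j + 1)
        denom = np.sqrt(t[j] - t[i]) + np.sqrt(t[j] - t[i - 1])
        q[j] = coeff * np.sum((T[i] - T[i - 1]) / denom)
    return q

# Graphite-like properties (order of magnitude only) and a synthetic ramp
rho, c, k = 1700.0, 710.0, 100.0           # kg/m^3, J/(kg K), W/(m K)
T_hist = 300 + 50 * np.linspace(0, 1, 30) ** 0.5
q = cook_felderman_heat_flux(T_hist, dt=1 / 30.0, rho=rho, c=c, k=k)
```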
C-RED One : the infrared camera using the Saphira e-APD detector
NASA Astrophysics Data System (ADS)
Greffe, Timothée.; Feautrier, Philippe; Gach, Jean-Luc; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Boutolleau, David; Baker, Ian
2016-08-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise and very low background. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array, a truly disruptive technology in imaging. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on-board thanks to an FPGA. We show its performance and describe its main features. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
NASA Astrophysics Data System (ADS)
Dumoulin, J.; Averty, R.
2012-04-01
One of the objectives of the ISTIMES project is to evaluate the potential offered by the integration of different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, the uncooled infrared camera is a promising technique owing to its dissemination potential and its relatively low cost on the market. Infrared thermography, when used in quantitative mode outside laboratory conditions rather than in qualitative mode (vision applied to survey), requires real-time thermal radiative corrections of the raw data to account for the evolution of the natural environment over time. The camera sensor therefore has to be smart enough to apply the calibration law and radiometric corrections in real time in a varying atmosphere. A complete measurement system was therefore studied and developed around low-cost infrared cameras available on the market. In the developed system, the infrared camera is coupled with other sensors to feed simplified radiative models running in real time on the GPU of a small PC. The system uses a Fast Ethernet FLIR A320 camera [1] coupled with a VAISALA WXT520 weather station [2] and a light GPS unit [3] for positioning and dating. It can be used with other Ethernet cameras (including visible ones) provided the measured data are accessible at raw level; in the present study this was made possible thanks to a specific agreement signed with the FLIR Company. The prototype is implemented on a low-cost small computer that integrates a GPU card to allow real-time parallel computing [4] of a simplified radiometric [5] heat balance using the information measured by the weather station. An HMI was developed under Linux using open-source components and complementary software developed at IFSTTAR. This new HMI, called "IrLaW", has various functionalities that make it suitable for on-site long-term monitoring. It can be controlled remotely over wired or wireless links, depending on the measurement context and the degree of accessibility to the system when it is running on a real site. Finally, thanks to the development of a high-level library and the deployment of a daemon, the measurement system was made compatible with OGC standards. Complementary functionality was also developed to allow the system to self-declare to 52North through a specific plugin installed beforehand at the 52North level. Data are also accessible by tasking the system when required, for instance through the web portal developed in the ISTIMES framework. ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663.
NASA Astrophysics Data System (ADS)
Ghionis, George; Trygonis, Vassilis; Karydis, Antonis; Vousdoukas, Michalis; Alexandrakis, George; Drakopoulos, Panos; Amdreadis, Olympos; Psarros, Fotis; Velegrakis, Antonis; Poulos, Serafim
2016-04-01
Effective beach management requires environmental assessments that are based on sound science, are cost-effective and are available to beach users and managers in an accessible, timely and transparent manner. The most common problems are: 1) The available field data are scarce and of sub-optimal spatio-temporal resolution and coverage, 2) our understanding of local beach processes needs to be improved in order to accurately model/forecast beach dynamics under a changing climate, and 3) the information provided by coastal scientists/engineers in the form of data, models and scientific interpretation is often too complicated to be of direct use by coastal managers/decision makers. A multispectral video system has been developed, consisting of one or more video cameras operating in the visible part of the spectrum, a passive near-infrared (NIR) camera, an active NIR camera system, a thermal infrared camera and a spherical video camera, coupled with innovative image processing algorithms and a telemetric system for the monitoring of coastal environmental parameters. The complete system has the capability to record, process and communicate (in quasi-real time) high frequency information on shoreline position, wave breaking zones, wave run-up, erosion hot spots along the shoreline, nearshore wave height, turbidity, underwater visibility, wind speed and direction, air and sea temperature, solar radiation, UV radiation, relative humidity, barometric pressure and rainfall. An innovative, remotely-controlled interactive visual monitoring system, based on the spherical video camera (with 360°field of view), combines the video streams from all cameras and can be used by beach managers to monitor (in real time) beach user numbers, flow activities and safety at beaches of high touristic value. The high resolution near infrared cameras permit 24-hour monitoring of beach processes, while the thermal camera provides information on beach sediment temperature and moisture, can detect upwelling in the nearshore zone, and enhances the safety of beach users. All data can be presented in real- or quasi-real time and are stored for future analysis and training/validation of coastal processes models. Acknowledgements: This work was supported by the project BEACHTOUR (11SYN-8-1466) of the Operational Program "Cooperation 2011, Competitiveness and Entrepreneurship", co-funded by the European Regional Development Fund and the Greek Ministry of Education and Religious Affairs.
Thermoelectric infrared imaging sensors for automotive applications
NASA Astrophysics Data System (ADS)
Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto
2004-07-01
This paper describes three low-cost thermoelectric infrared imaging sensors having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs), respectively, and two experimental automotive application systems. The FPAs are fabricated with a conventional IC process and micromachining technologies and have low-cost potential. Among these sensors, the 2,304-element sensor provides a high responsivity of 5,500 V/W and a very small size by adopting a vacuum-sealed package integrated with a wide-angle ZnS lens. One experimental system, incorporated in the Nissan ASV-2, is a blind-spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body, and it can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function. It consists of a visible camera and infrared sensors, and it helps alert the driver to the presence of a pedestrian in a rear blind spot. The sensors' performance is suitable for consumer electronics as well as automotive applications. Various issues that will need to be addressed in order to expand the automotive applications of IR imaging sensors in the future are also summarized.
AMICA (Antarctic Multiband Infrared CAmera) project
NASA Astrophysics Data System (ADS)
Dolci, Mauro; Straniero, Oscar; Valentini, Gaetano; Di Rico, Gianluca; Ragni, Maurizio; Pelusi, Danilo; Di Varano, Igor; Giuliani, Croce; Di Cianno, Amico; Valentini, Angelo; Corcione, Leonardo; Bortoletto, Favio; D'Alessandro, Maurizio; Bonoli, Carlotta; Giro, Enrico; Fantinel, Daniela; Magrin, Demetrio; Zerbi, Filippo M.; Riva, Alberto; Molinari, Emilio; Conconi, Paolo; De Caprio, Vincenzo; Busso, Maurizio; Tosti, Gino; Nucciarelli, Giuliano; Roncella, Fabio; Abia, Carlos
2006-06-01
The Antarctic Plateau offers unique opportunities for ground-based infrared astronomy. AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging from Dome C in the near- (1 - 5 μm) and mid- (5 - 27 μm) infrared wavelength regions. The camera consists of two channels, equipped with a Raytheon InSb 256 array detector and a DRS MF-128 Si:As IBC array detector, cryocooled at 35 and 7 K respectively. Cryogenic devices will move a filter wheel and a sliding mirror, used to feed the two detectors alternately. Fast control and readout, synchronized with the chopping secondary mirror of the telescope, will be required because of the large background expected at these wavelengths, especially beyond 10 μm. An environmental control system is needed to ensure the correct start-up, shut-down and housekeeping of the camera. The main technical challenge is represented by the extreme environmental conditions of Dome C (T about -90 °C, p around 640 mbar) and the need for complete automation of the overall system. AMICA will be mounted at the Nasmyth focus of the 80 cm IRAIT telescope and will perform survey-mode automatic observations of selected regions of the Southern sky. The first goal will be a direct estimate of the observational quality of this new, highly promising site for infrared astronomy. In addition, IRAIT, equipped with AMICA, is expected to provide a significant improvement in the knowledge of fundamental astrophysical processes, such as the late stages of stellar evolution (especially AGB and post-AGB stars) and star formation.
Firefly: A HOT camera core for thermal imagers with enhanced functionality
NASA Astrophysics Data System (ADS)
Pillans, Luke; Harmer, Jack; Edwards, Tim
2015-06-01
Raising the operating temperature of mercury cadmium telluride infrared detectors from 80K to above 160K creates new applications for high performance infrared imagers by vastly reducing the size, weight and power consumption of the integrated cryogenic cooler. Realizing the benefits of Higher Operating Temperature (HOT) requires a new kind of infrared camera core with the flexibility to address emerging applications in handheld, weapon mounted and UAV markets. This paper discusses the Firefly core developed to address these needs by Selex ES in Southampton UK. Firefly represents a fundamental redesign of the infrared signal chain reducing power consumption and providing compatibility with low cost, low power Commercial Off-The-Shelf (COTS) computing technology. This paper describes key innovations in this signal chain: a ROIC purpose built to minimize power consumption in the proximity electronics, GPU based image processing of infrared video, and a software customisable infrared core which can communicate wirelessly with other Battlespace systems.
Infrared stereo calibration for unmanned ground vehicle navigation
NASA Astrophysics Data System (ADS)
Harguess, Josh; Strange, Shawn
2014-06-01
The problem of calibrating two color cameras as a stereo pair has been heavily researched, and many off-the-shelf software packages, such as the Robot Operating System and OpenCV, include calibration routines that work in most cases. However, the problem of calibrating two infrared (IR) cameras for the purposes of sensor fusion and point cloud generation is relatively new, and many challenges exist. We present a comparison of color-camera and IR-camera stereo calibration using data from an unmanned ground vehicle. There are two main challenges in IR stereo calibration: the calibration board (material, design, etc.) and the accuracy of calibration pattern detection. We present our analysis of these challenges along with our IR stereo calibration methodology. Finally, we present our results both visually and analytically with computed reprojection errors.
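The paper's own board design and detection pipeline are not reproduced here; as a baseline, the OpenCV stereo-calibration flow below could be applied to IR image pairs of a chessboard that shows thermal contrast (for example, a heated board). Directory names, pattern size and square size are hypothetical, and IR corner detection often needs extra preprocessing before findChessboardCorners succeeds.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)          # inner corners of the (hypothetical) board
square = 0.04             # square size in metres
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("left_ir/*.png")), sorted(glob.glob("right_ir/*.png"))):
    l = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    r = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, cl = cv2.findChessboardCorners(l, pattern)
    ok_r, cr = cv2.findChessboardCorners(r, pattern)
    if ok_l and ok_r:
        obj_pts.append(objp); left_pts.append(cl); right_pts.append(cr)

# Per-camera intrinsics, then stereo extrinsics; rms is the reprojection error.
# Assumes at least one valid pair was found.
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, l.shape[::-1], None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, r.shape[::-1], None, None)
rms, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, l.shape[::-1],
    flags=cv2.CALIB_FIX_INTRINSIC)
print("stereo reprojection error:", rms)
```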
An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates
Hobbs, Michael T.; Brehme, Cheryl S.
2017-01-01
Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.
Improved calibration-based non-uniformity correction method for uncooled infrared camera
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao
2017-08-01
With the latest improvements in microbolometer focal plane arrays (FPAs), uncooled infrared (IR) cameras are becoming the most widely used devices in thermography, especially in handheld devices. However, the influence of changing ambient conditions and the non-uniform response of the sensors make it difficult to correct the non-uniformity of an uncooled infrared camera. In this paper, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera, a novel model is proposed for calibration-based non-uniformity correction (NUC). In this model, we introduce the FPA temperature, together with the responses of the microbolometers under different ambient temperatures, to calculate the correction parameters. Based on the proposed model, the correction parameters can be worked out from calibration measurements under controlled ambient conditions and a uniform blackbody. All correction parameters are determined after the calibration process and then used to correct the non-uniformity of the infrared camera in real time. This paper presents the details of the compensation procedure and the performance of the proposed calibration-based non-uniformity correction method. Our method was evaluated on real IR images obtained by a 384×288-pixel uncooled long-wave infrared (LWIR) camera operated under changing ambient conditions. The results show that our method can exclude the influence of the changing ambient conditions and ensure that the infrared camera has a stable performance.
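For context, the sketch below shows the plain two-point (gain/offset) calibration-based correction that such methods build on; the paper's contribution of folding the FPA temperature into the correction parameters is not reproduced here, and the data are synthetic.

```python
import numpy as np

def fit_nuc(frames_low, frames_high, t_low, t_high):
    """Per-pixel gain/offset from two uniform blackbody calibration points."""
    mean_low = frames_low.mean(axis=0)
    mean_high = frames_high.mean(axis=0)
    gain = (t_high - t_low) / (mean_high - mean_low)
    offset = t_low - gain * mean_low
    return gain, offset

def apply_nuc(raw, gain, offset):
    """Map raw detector counts to a uniform response."""
    return gain * raw + offset

# Synthetic 384 x 288 example with pixel-to-pixel gain/offset spread
rng = np.random.default_rng(0)
g = rng.normal(1.0, 0.05, (288, 384))
o = rng.normal(0.0, 20.0, (288, 384))
low = np.stack([g * 1000 + o for _ in range(8)])
high = np.stack([g * 3000 + o for _ in range(8)])
gain, offset = fit_nuc(low, high, t_low=1000.0, t_high=3000.0)
corrected = apply_nuc(g * 2000 + o, gain, offset)   # ~2000 at every pixel
```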
Emergency positioning system accuracy with infrared LEDs in high-security facilities
NASA Astrophysics Data System (ADS)
Knoch, Sierra N.; Nelson, Charles; Walker, Owens
2017-05-01
Instantaneous personnel location presents a challenge in Department of Defense applications where high levels of security restrict real-time tracking of crew members. During emergency situations, command and control requires immediate accountability of all personnel. Current radio frequency (RF) based indoor positioning systems can be unsuitable due to RF leakage and electromagnetic interference with sensitively calibrated machinery on platforms such as ships, submarines and high-security facilities. Infrared light provides a possible solution to this problem. This paper proposes and evaluates an indoor line-of-sight positioning system comprising IR LEDs and high-sensitivity CMOS camera receivers. In this system the movement of the LEDs is captured by the camera, uploaded and analyzed; the point of highest power is located and plotted to create a blueprint of crew-member locations. The results evaluate accuracy as a function of both wavelength and environmental conditions. Further research will evaluate the accuracy of the LED transmitter and CMOS camera receiver system. Transmissions at both 780 and 850 nm are analyzed.
Infrared-enhanced TV for fire detection
NASA Technical Reports Server (NTRS)
Hall, J. R.
1978-01-01
Closed-circuit television is superior to conventional smoke or heat sensors for detecting fires in large open spaces. A single TV camera scans the entire area, whereas many conventional sensors and a maze of interconnecting wiring might be required to achieve the same coverage. The camera is monitored by a person who trips the alarm if a fire is detected, or electronic circuitry can process the camera signal for a fully automatic alarm system.
Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena
NASA Astrophysics Data System (ADS)
Pei Wong, Choun; Subramaniam, R.
2018-05-01
The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.
Opto-mechanical system design of test system for near-infrared and visible target
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Zhu, Guodong; Wang, Yuchao
2014-12-01
Guidance precision is a key index of guided-weapon accuracy. The factors affecting guidance precision include information-processing precision, control-system accuracy and laser irradiation accuracy, among which laser irradiation precision is particularly important. To meet the demand for precision testing of laser irradiators, a laser precision test system was developed. The system consists of a modified Cassegrain telescope, a wide-dynamic-range CCD camera, a tracking turntable and an industrial PC, and it images the visible-light and near-infrared target simultaneously with a near-IR camera. Analysis of the design results shows that, for a target at 1000 m, the system measurement precision is 43 mm, which fully meets the needs of the laser precision test.
A multipurpose camera system for monitoring Kīlauea Volcano, Hawai'i
Patrick, Matthew R.; Orr, Tim R.; Lee, Lopaka; Moniz, Cyril J.
2015-01-01
We describe a low-cost, compact multipurpose camera system designed for field deployment at active volcanoes that can be used either as a webcam (transmitting images back to an observatory in real-time) or as a time-lapse camera system (storing images onto the camera system for periodic retrieval during field visits). The system also has the capability to acquire high-definition video. The camera system uses a Raspberry Pi single-board computer and a 5-megapixel low-light (near-infrared sensitive) camera, as well as a small Global Positioning System (GPS) module to ensure accurate time-stamping of images. Custom Python scripts control the webcam and GPS unit and handle data management. The inexpensive nature of the system allows it to be installed at hazardous sites where it might be lost. Another major advantage of this camera system is that it provides accurate internal timing (independent of network connection) and, because a full Linux operating system and the Python programming language are available on the camera system itself, it has the versatility to be configured for the specific needs of the user. We describe example deployments of the camera at Kīlauea Volcano, Hawai‘i, to monitor ongoing summit lava lake activity.
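The USGS scripts themselves are not reproduced in the abstract; the sketch below is a minimal time-lapse capture loop in the same spirit, assuming a Raspberry Pi with the picamera library. The output path, capture interval, filename prefix and 'night' exposure mode are illustrative choices, and the GPS time-stamping and telemetry pieces are omitted.

```python
#!/usr/bin/env python
"""Minimal Raspberry Pi time-lapse loop (illustrative, not the USGS scripts)."""
import time
from datetime import datetime, timezone
from picamera import PiCamera

INTERVAL_S = 300                      # one image every five minutes
OUTDIR = "/data/timelapse"            # hypothetical storage path

camera = PiCamera(resolution=(2592, 1944))
camera.exposure_mode = "night"        # favour the low-light/NIR-sensitive sensor
time.sleep(2)                         # let auto-exposure settle

while True:
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    camera.capture(f"{OUTDIR}/KIL_{stamp}.jpg")
    time.sleep(INTERVAL_S)
```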
A practical indoor context-aware surveillance system with multi-Kinect sensors
NASA Astrophysics Data System (ADS)
Jia, Lili; You, Ying; Li, Tiezhu; Zhang, Shun
2014-11-01
In this paper we develop a novel practical application that gives scalable services to end users when abnormal activities occur. The architecture of the application consists of networked infrared cameras and a communication module. In this intelligent surveillance system we use Kinect sensors as the input cameras. The Kinect is an infrared laser camera whose raw infrared sensor stream is accessible to the user. We install several Kinect sensors in one room to track human skeletons. Each sensor returns body positions as 15 joint coordinates in its own coordinate system. We use calibration algorithms to transform all body position points into one unified coordinate system, and from these points we infer the surveillance context. Furthermore, messages from the metadata index matrix are sent to a mobile phone through the communication module, so the user is instantly aware of an abnormal event in the room without having to check the website. In conclusion, theoretical analysis and experimental results show that the proposed system is reasonable and efficient. The approach introduced in this paper not only helps deter criminals and assist police in apprehending suspects, but also enables end users to monitor indoor environments anywhere and anytime from their phones.
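The calibration algorithm is not spelled out in the abstract; a common least-squares choice for aligning corresponding 3D joint sets from two sensors is the Kabsch (Procrustes) rigid fit, sketched below on synthetic data.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping joints seen by one
    Kinect onto the unified frame, given corresponding joints from the
    reference sensor (Kabsch algorithm)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)   # (n, 3) each
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: recover a known 30-degree rotation about the vertical axis
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), 0, np.sin(theta)],
                   [0, 1, 0],
                   [-np.sin(theta), 0, np.cos(theta)]])
joints = np.random.rand(15, 3)                   # 15 joints, as in the paper
R, t = rigid_align(joints, joints @ R_true.T + np.array([0.5, 0.0, 1.0]))
```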
Recent developments in space shuttle remote sensing, using hand-held film cameras
NASA Technical Reports Server (NTRS)
Amsbury, David L.; Bremer, Jeffrey M.
1992-01-01
The authors report on the advantages and disadvantages of a number of camera systems currently employed for space shuttle remote sensing operations. Systems discussed include the modified Hasselblad, the Rolleiflex 6008, the Linhof 5-inch format system, and the Nikon F3/F4 systems. Film/filter combinations (color positive films, color infrared films, color negative films and polarization filters) are presented.
Framework for 2D-3D image fusion of infrared thermography with preoperative MRI.
Hoffmann, Nico; Weidner, Florian; Urban, Peter; Meyer, Tobias; Schnabel, Christian; Radev, Yordan; Schackert, Gabriele; Petersohn, Uwe; Koch, Edmund; Gumhold, Stefan; Steiner, Gerald; Kirsch, Matthias
2017-11-27
Multimodal medical image fusion combines information from two or more images in order to improve diagnostic value. While previous applications mainly focus on merging images from computed tomography, magnetic resonance imaging (MRI), ultrasound and single-photon emission computed tomography, we propose a novel approach for the registration and fusion of preoperative 3D MRI with intraoperative 2D infrared thermography. Image-guided neurosurgeries are based on neuronavigation systems, which further allow us to track the position and orientation of arbitrary cameras. We are thereby able to relate the 2D coordinate system of the infrared camera to the 3D MRI coordinate system. The registered image data are then combined by calibration-based image fusion in order to map our intraoperative 2D thermographic images onto the respective brain surface recovered from the preoperative MRI. In extensive accuracy measurements, we found that the proposed framework achieves a mean accuracy of 2.46 mm.
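A minimal pinhole-projection sketch of the 2D-3D mapping step is given below; it assumes the neuronavigation tracker supplies the thermal camera's pose as a 4×4 matrix and that the camera intrinsics are known, with purely illustrative values.

```python
import numpy as np

def project_to_thermal(vertices_mri, T_cam_from_mri, K):
    """Project 3D brain-surface vertices (MRI coordinates, mm) into the
    2D infrared image, given the camera pose from the navigation tracker
    and the thermal camera intrinsic matrix K (pinhole model)."""
    V = np.asarray(vertices_mri, float)                  # shape (n, 3)
    R, t = T_cam_from_mri[:3, :3], T_cam_from_mri[:3, 3]
    cam = V @ R.T + t                                    # MRI -> camera frame
    uvw = cam @ K.T
    return uvw[:, :2] / uvw[:, 2:3]                      # pixel coordinates

# Illustrative intrinsics and pose (not calibrated values)
K = np.array([[610.0, 0.0, 160.0],
              [0.0, 610.0, 120.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)
T[2, 3] = 300.0                                          # camera ~300 mm from surface
uv = project_to_thermal(np.random.rand(100, 3) * 50, T, K)
```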
Super Resolution Algorithm for CCTVs
NASA Astrophysics Data System (ADS)
Gohshi, Seiichi
2015-03-01
Recently, security cameras and CCTV systems have become an important part of our daily lives. The rising demand for such systems has created business opportunities in this field, especially in big cities. Analogue CCTV systems are being replaced by digital systems, and HDTV CCTV has become quite common. HDTV CCTV can produce images with high contrast and decent quality if they are captured in daylight. However, an image captured at night does not always have sufficient contrast and resolution because of poor lighting conditions. CCTV systems depend on infrared light at night to compensate for insufficient lighting, producing monochrome images and videos; these images and videos are blurred and lack contrast. We propose a nonlinear signal processing technique that significantly improves the visual and image quality (contrast and resolution) of low-contrast infrared images. The proposed method enables the use of infrared cameras for various purposes, such as night shots and other poor-lighting environments.
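The authors' nonlinear algorithm is not disclosed in the abstract; as a generic stand-in for this class of enhancement, the sketch below applies local histogram equalization (CLAHE) followed by unsharp masking with OpenCV to a hypothetical night frame.

```python
import cv2

def enhance_ir_frame(gray_u8, clip=3.0, sharpen=1.5, sigma=3.0):
    """Generic nonlinear enhancement of a low-contrast infrared night frame:
    CLAHE for local contrast, then unsharp masking for apparent resolution.
    A stand-in illustration, not the authors' proprietary algorithm."""
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=(8, 8))
    eq = clahe.apply(gray_u8)
    blur = cv2.GaussianBlur(eq, (0, 0), sigma)
    return cv2.addWeighted(eq, 1.0 + sharpen, blur, -sharpen, 0)

frame = cv2.imread("night_ir_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
if frame is not None:
    cv2.imwrite("night_ir_enhanced.png", enhance_ir_frame(frame))
```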
3D medical thermography device
NASA Astrophysics Data System (ADS)
Moghadam, Peyman
2015-05-01
In this paper, a novel handheld 3D medical thermography system is introduced. The proposed system consists of a thermal-infrared camera, a color camera and a depth camera rigidly attached in close proximity and mounted on an ergonomic handle. As a practitioner holding the device smoothly moves it around the human body parts, the proposed system generates and builds up a precise 3D thermogram model by incorporating information from each new measurement in real time. The data are acquired in motion and thus provide multiple points of view. When processed, these multiple points of view are adaptively combined by taking into account the reliability of each individual measurement, which can vary due to factors such as the angle of incidence, the distance between the device and the subject, environmental sensor data, or other factors influencing the confidence of the thermal-infrared data when captured. Finally, several case studies are presented to support the usability and performance of the proposed system.
Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei
2012-01-01
We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profiles of patients. A pixel-coregistered, simultaneously reading mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements, especially for objects with unknown emissivity. Even the dual-band measurement can give inaccurate results because emissivity is a function of wavelength. We have therefore been developing a four-band QWIP camera for accurate temperature measurement of remote objects.
NASA Astrophysics Data System (ADS)
Kadosh, Itai; Sarusi, Gabby
2017-10-01
The use of dual cameras in parallax to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept in which the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night vision capability while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. In order to maintain commonality between the two cameras, we propose to attach to one of them a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and the additional upconversion layer, whose thickness is <1 μm. Such a layer should be attached in close proximity to the visible-range camera sensor (the CMOS sensor) of the mobile device. This paper presents such a SWIR objective optical design and optimization, which is formed and fitted mechanically to the visible objective design but with different lenses, in order to maintain commonality and as a proof of concept. Such a SWIR objective design is very challenging since it requires mimicking the original visible mobile-camera lens sizes and the mechanical housing, so that we can adhere to the visible optical and mechanical design. We present an in-depth feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore-optics design.
Camera Systems Rapidly Scan Large Structures
NASA Technical Reports Server (NTRS)
2013-01-01
Needing a method to quickly scan large structures like an aircraft wing, Langley Research Center developed the line scanning thermography (LST) system. LST works in tandem with a moving infrared camera to capture how a material responds to changes in temperature. Princeton Junction, New Jersey-based MISTRAS Group Inc. now licenses the technology and uses it in power stations and industrial plants.
Development of an Infrared Remote Sensing System for Continuous Monitoring of Stromboli Volcano
NASA Astrophysics Data System (ADS)
Harig, R.; Burton, M.; Rausch, P.; Jordan, M.; Gorgas, J.; Gerhard, J.
2009-04-01
In order to monitor gases emitted by Stromboli volcano in the Eolian archipelago, Italy, a remote sensing system based on Fourier-transform infrared spectroscopy has been developed and installed on the summit of Stromboli volcano. Hot rocks and lava are used as sources of infrared radiation. The system is based on an interferometer with a single detector element in combination with an azimuth-elevation scanning mirror system. The mirror system is used to align the field of view of the instrument. In addition, the system is equipped with an infrared camera. Two basic modes of operation have been implemented: the user may use the infrared image to align the system to a vent that is to be examined, or the scanning system may be used for (hyperspectral) imaging of the scene. In the latter mode, the scanning mirror is set to move sequentially to all positions within a region of interest, which is defined by the operator using the image generated from the infrared camera. The spectral range used for the measurements is 1600 - 4200 cm-1, allowing the quantification of many gases such as CO, CO2, SO2, and HCl. The spectral resolution is 0.5 cm-1. In order to protect the optical, mechanical and electrical parts of the system from the volcanic gases, all components are contained in a gas-tight aluminium housing. The system is controlled via TCP/IP (data transfer by WLAN), allowing the user to operate it from a remote PC. The infrared image of the scene and measured spectra are transferred to and displayed by a remote PC at INGV or TUHH in real time. However, the system is capable of autonomous operation on the volcano once a measurement has been started. Measurements are stored by an internal embedded PC.
Selecting among competing models of electro-optic, infrared camera system range performance
Nichols, Jonathan M.; Hines, James E.; Nichols, James D.
2013-01-01
Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range performance models. Model selection based on the Akaike’s Information Criterion (AIC) is presented for the type of data collected during a typical human observer and target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions for some distances other than the specific set for which experimental trials were conducted.
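As a sketch of the model-selection machinery referred to above (mine, with invented log-likelihood values, not the paper's actual models or data), AIC and Akaike weights can be computed as follows.

    import math

    def aic(log_likelihood, n_params):
        """Akaike's Information Criterion: AIC = 2k - 2 ln(L_max)."""
        return 2 * n_params - 2 * log_likelihood

    # Hypothetical fitted range-performance models: (name, max log-likelihood, #parameters).
    candidates = [("model_A", -412.3, 2), ("model_B", -405.8, 4), ("model_C", -404.9, 7)]
    scores = {name: aic(ll, k) for name, ll, k in candidates}
    best = min(scores, key=scores.get)

    # Akaike weights express the relative support for each candidate model.
    min_aic = scores[best]
    weights = {n: math.exp(-0.5 * (a - min_aic)) for n, a in scores.items()}
    total = sum(weights.values())
    weights = {n: w / total for n, w in weights.items()}
    print(best, weights)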
QWIP technology for both military and civilian applications
NASA Astrophysics Data System (ADS)
Gunapala, Sarath D.; Kukkonen, Carl A.; Sirangelo, Mark N.; McQuiston, Barbara K.; Chehayeb, Riad; Kaufmann, M.
2001-10-01
Advanced thermal imaging infrared cameras have been a cost-effective and reliable method to obtain the temperature of objects. Quantum Well Infrared Photodetector (QWIP) based thermal imaging systems have advanced the state of the art and are the most sensitive commercially available thermal systems. QWIP Technologies LLC, under exclusive agreement with Caltech, is currently manufacturing the QWIP-Chip(TM), a 320 x 256 element, bound-to-quasibound QWIP FPA. The camera performance falls within the long-wave IR band, spectrally peaked at 8.5 μm. The camera is equipped with a 32-bit floating-point digital signal processor combined with multitasking software, delivering a digital acquisition resolution of 12 bits at a nominal power consumption of less than 50 W. With a variety of video interface options, remote control capability via an RS-232 connection, and an integrated control driver circuit to support motorized zoom and focus-compatible lenses, this camera design has excellent application in both the military and commercial sectors. In the area of remote sensing, high-performance QWIP systems can be used for high-resolution target recognition as part of a new system of airborne platforms (including UAVs). Such systems also have direct application in law enforcement, surveillance, industrial monitoring and road hazard detection systems. This presentation will cover the current performance of the commercial QWIP cameras, conceptual platform systems and advanced image processing for use in both military remote sensing and civilian applications currently being developed in road hazard monitoring.
Conception of a cheap infrared camera using a Fresnel lens
NASA Astrophysics Data System (ADS)
Grulois, Tatiana; Druart, Guillaume; Guérineau, Nicolas; Crastes, Arnaud; Sauer, Hervé; Chavel, Pierre
2014-09-01
Today huge efforts are made in the research and industrial areas to design compact and cheap uncooled infrared optical systems for low-cost imagery applications. Indeed, infrared cameras are currently too expensive to be widespread. If we manage to cut their cost, we expect to open new types of markets. In this paper, we will present the cheap broadband microimager we have designed. It operates in the long-wavelength infrared range and uses only one silicon lens at a minimal cost for the manufacturing process. Our concept is based on the use of thin optics; therefore, inexpensive unconventional materials can be used because some absorption can be tolerated. Our imager uses a thin Fresnel lens. Up to now, Fresnel lenses have not been used for broadband imagery applications because of their disastrous chromatic properties. However, we show that working in a high diffraction order can significantly reduce chromatism. A prototype has been made and the performance of our camera will be discussed. Its characterization has been carried out in terms of modulation transfer function (MTF) and noise equivalent temperature difference (NETD). Finally, experimental images will be presented.
Forward-Looking Infrared Cameras for Micrometeorological Applications within Vineyards
Katurji, Marwan; Zawar-Reza, Peyman
2016-01-01
We apply the principles of atmospheric surface layer dynamics within a vineyard canopy to demonstrate the use of forward-looking infrared cameras measuring surface brightness temperature (spectrum bandwidth of 7.5 to 14 μm) at a relatively high temporal rate of 10 s. The temporal surface brightness signal over a few hours of the stable nighttime boundary layer, intermittently interrupted by periods of turbulent heat flux surges, was shown to be related to the observed meteorological measurements by an in situ eddy-covariance system, and reflected the above-canopy wind variability. The infrared raster images were collected and the resultant self-organized spatial cluster provided the meteorological context when compared to in situ data. The spatial brightness temperature pattern was explained in terms of the presence or absence of nighttime cloud cover and down-welling of long-wave radiation and the canopy turbulent heat flux. Time sequential thermography as demonstrated in this research provides positive evidence behind the application of thermal infrared cameras in the domain of micrometeorology, and to enhance our spatial understanding of turbulent eddy interactions with the surface. PMID:27649208
A goggle navigation system for cancer resection surgery
NASA Astrophysics Data System (ADS)
Xu, Junbin; Shao, Pengfei; Yue, Ting; Zhang, Shiwu; Ding, Houzhu; Wang, Jinkun; Xu, Ronald
2014-02-01
We describe a portable fluorescence goggle navigation system for cancer margin assessment during oncologic surgeries. The system consists of a computer, a head mount display (HMD) device, a near infrared (NIR) CCD camera, a miniature CMOS camera, and a 780 nm laser diode excitation light source. The fluorescence and the background images of the surgical scene are acquired by the CCD camera and the CMOS camera respectively, co-registered, and displayed on the HMD device in real-time. The spatial resolution and the co-registration deviation of the goggle navigation system are evaluated quantitatively. The technical feasibility of the proposed goggle system is tested in an ex vivo tumor model. Our experiments demonstrate the feasibility of using a goggle navigation system for intraoperative margin detection and surgical guidance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lomanowski, B. A., E-mail: b.a.lomanowski@durham.ac.uk; Sharples, R. M.; Meigs, A. G.
2014-11-15
The mirror-linked divertor spectroscopy diagnostic on JET has been upgraded with a new visible and near-infrared grating and filtered spectroscopy system. New capabilities include extended near-infrared coverage up to 1875 nm, capturing the hydrogen Paschen series, as well as a 2 kHz frame rate filtered imaging camera system for fast measurements of impurity (Be II) and deuterium Dα, Dβ, Dγ line emission in the outer divertor. The expanded system provides unique capabilities for studying spatially resolved divertor plasma dynamics at near-ELM resolved timescales as well as a test bed for feasibility assessment of near-infrared spectroscopy.
Strategic options towards an affordable high-performance infrared camera
NASA Astrophysics Data System (ADS)
Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.
2016-05-01
The promise of infrared (IR) imaging reaching the low cost that CMOS sensors achieved has been hampered by the inability to attain the cost advantages necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm, despite well documented advantages. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640x512 pixel InGaAs uncooled system with high sensitivity, low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second (FPS)) at full resolution, and low power consumption (<1 W) in a compact system. This camera paves the way towards mass-market adoption not only by demonstrating the high-performance IR imaging capability demanded by military and industrial applications, but also by illuminating a path towards the justifiable price points essential for consumer-facing industries such as automotive, medical, and security imaging. Among the strategic options presented are new sensor manufacturing technologies that scale favorably towards automation, multi-focal plane array compatible readout electronics, and dense or ultra-small pixel pitch devices.
NASA Astrophysics Data System (ADS)
Hata, Yutaka; Kanazawa, Seigo; Endo, Maki; Tsuchiya, Naoki; Nakajima, Hiroshi
2012-06-01
This paper proposes a heart rate monitoring system that assesses the autonomic nervous system through heart rate variability, using an air pressure sensor, in order to diagnose mental disease. In addition, we propose a human behavior monitoring system that detects the human trajectory in the home with an infrared camera. During both day and night, the human behavior monitoring system detects human movement in the home. The heart rate monitoring system detects the heart rate at night while the subject is in bed. The air pressure sensor consists of a rubber tube, a cushion cover and a pressure sensor, and it detects the heart rate when placed on the bed. It detects the RR intervals without constraining the subject, so the autonomic nervous system can be assessed, and analysis of the autonomic nervous system can then be used to examine mental disease. Meanwhile, the human behavior monitoring system obtains a distance distribution image from the infrared camera. It classifies adults, children and other objects from the distance distribution obtained by the camera, and records their trajectories. This behavior, i.e., the trajectory in the home, corresponds strongly to cognitive disorders. Thus, the total system can detect mental disease and cognitive disorders with sensors that do not contact the human body.
Multisensory System for the Detection and Localization of Peripheral Subcutaneous Veins
Fernández, Roemi; Armada, Manuel
2017-01-01
This paper proposes a multisensory system for the detection and localization of peripheral subcutaneous veins, as a first step for achieving automatic robotic insertion of catheters in the near future. The multisensory system is based on the combination of a SWIR (Short-Wave Infrared) camera, a TOF (Time-Of-Flight) camera and a NIR (Near Infrared) lighting source. The associated algorithm consists of two main parts: one devoted to the features extraction from the SWIR image, and another envisaged for the registration of the range data provided by the TOF camera, with the SWIR image and the results of the peripheral veins detection. In this way, the detected subcutaneous veins are mapped onto the 3D reconstructed surface, providing a full representation of the region of interest for the automatic catheter insertion. Several experimental tests were carried out in order to evaluate the capabilities of the presented approach. Preliminary results demonstrate the feasibility of the proposed design and highlight the potential benefits of the solution. PMID:28422075
Navigating surgical fluorescence cameras using near-infrared optical tracking.
van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs
2018-05-01
Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera in reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced an interference during NIR fluorescence imaging. In an attempt to support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought after. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one OTS simulating light-emitting diode setup. Following the outcome of these measurements, OTS settings were optimized. Measurements indicated the OTS interference was caused by: (1) spectral overlap between the OTS light and camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of the OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
2001-11-29
KENNEDY SPACE CENTER, Fla. -- Fully unwrapped, the Advanced Camera for Surveys, which is suspended by an overhead crane, is checked over by workers. Part of the payload on the Hubble Space Telescope Servicing Mission, STS-109, the ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. Tasks for the mission include replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
In-situ calibration of nonuniformity in infrared staring and modulated systems
NASA Astrophysics Data System (ADS)
Black, Wiley T.
Infrared cameras can directly measure the apparent temperature of objects, providing thermal imaging. However, the raw output from most infrared cameras suffers from a strong, often limiting noise source called nonuniformity. Manufacturing imperfections in infrared focal planes lead to high pixel-to-pixel sensitivity to electronic bias, focal plane temperature, and other effects. The resulting imagery can only provide useful thermal imaging after a nonuniformity calibration has been performed. Traditionally, these calibrations are performed by momentarily blocking the field of view with a flat temperature plate or blackbody cavity. However, because the pattern is a coupling of manufactured sensitivities with operational variations, periodic recalibration is required, sometimes on the order of tens of seconds. A class of computational methods called Scene-Based Nonuniformity Correction (SBNUC) has been researched for over 20 years, in which the nonuniformity calibration is estimated in digital processing by analysis of the video stream in the presence of camera motion. The most sophisticated SBNUC methods can completely and robustly eliminate the high-spatial-frequency component of nonuniformity with only an initial reference calibration or potentially no physical calibration. I will demonstrate a novel algorithm that advances these SBNUC techniques to support all spatial frequencies of nonuniformity correction. Long-wave infrared microgrid polarimeters are a class of camera that incorporate a microscale per-pixel wire-grid polarizer directly affixed to each pixel of the focal plane. These cameras have the capability of simultaneously measuring thermal imagery and polarization in a robust integrated package with no moving parts. I will describe the necessary adaptations of my SBNUC method to operate on this class of sensor, as well as demonstrate SBNUC performance in LWIR polarimetry video collected on the UA mall.
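The following is a rough sketch of one classical scene-based correction (a constant-statistics style estimate), offered only to illustrate the idea of estimating the calibration from the video stream itself; it is not the novel algorithm described in the dissertation, and the frame sizes and noise levels are invented.

    import numpy as np

    def constant_statistics_nuc(frames):
        """Estimate per-pixel offset/gain from a motion-rich frame stack and correct it.

        frames: ndarray of shape (n_frames, rows, cols). Assumes every pixel sees
        comparable scene statistics over the sequence (requires camera motion).
        """
        frames = np.asarray(frames, dtype=np.float64)
        pix_mean = frames.mean(axis=0)            # per-pixel temporal mean -> offset estimate
        pix_std = frames.std(axis=0) + 1e-9       # per-pixel temporal std  -> gain estimate
        gain = pix_std.mean() / pix_std           # normalize each pixel's responsivity
        offset = pix_mean                         # remove each pixel's fixed-pattern level
        corrected = (frames - offset) * gain + pix_mean.mean()
        return corrected, gain, offset

    # Hypothetical use: 500 frames of 128x128 LWIR video with simulated fixed-pattern noise.
    rng = np.random.default_rng(0)
    scene = rng.normal(300.0, 2.0, size=(500, 128, 128))
    fpn = rng.normal(0.0, 5.0, size=(128, 128))   # pixel-to-pixel offset nonuniformity
    video = scene + fpn
    clean, g, o = constant_statistics_nuc(video)
    print(np.std(video[0]), np.std(clean[0]))      # spatial noise before vs. after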
NASA Astrophysics Data System (ADS)
Dumoulin, Jean
2013-04-01
One of the objectives of the ISTIMES project was to evaluate the potential offered by the integration of different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, we focused our research and development efforts on uncooled infrared camera techniques, due to their promising level of dissemination linked to their relatively low cost on the market. On the other hand, work was also carried out to identify well adapted implementation protocols and key limits of the Pulse Phase Thermography (PPT) and Principal Component Thermography (PCT) processing methods used to analyse a thermal image sequence and retrieve information about the inner structure. The first part of this research work addresses infrared thermography measurement when it is used in quantitative mode (not in laboratory conditions) and not in qualitative mode (vision applied to survey). In such a context, thermal radiative corrections have to be applied in real time to the acquired raw data, thanks to additional measurements, to take into account the influence of the evolving natural environment. The camera sensor therefore has to be smart enough to apply the calibration law and radiometric corrections in real time in a varying atmosphere. A complete measurement system was therefore studied and developed [1] with low cost infrared cameras available on the market. In the system developed, the infrared camera is coupled with other sensors to feed simplified radiative models running, in real time, on the GPU of a small PC. The whole measurement system was implemented on the "Musmeci" bridge located in Potenza (Italy). No traffic interruption was required during the mounting of our measurement system. The infrared camera was fixed on top of a mast at 6 m elevation from the surface of the bridge deck. A small weather station was added on the same mast, 1 m below the camera. A GPS antenna was also fixed at the base of the mast, at the same elevation as the bridge deck surface. This trial took place over 4 days, but our system was left in stand-alone acquisition mode for only 3 days. Thanks to the software developed and the small computer hardware used, thermal images were acquired at a frame rate of 0.1 Hz by averaging 50 thermal images, leaving the original camera frame rate fixed at 5 Hz. Each hour, a thermal image sequence was stored on the internal hard drive, and data were also retrieved, on demand, by using a wireless connection and a tablet PC. In the second part of this work, thermal image sequence analysis was carried out. Two analysis approaches were studied: one based on the use of the Fast Fourier Transform [2] and the second based on Principal Component Analysis [3-4]. The results obtained show that the inner structure of the deck was identified, even though the thermal images were affected by the fact that the bridge was open to traffic for the whole duration of the experiment. ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663. References [1] Dumoulin J. and Averty R., « Development of an infrared system coupled with a weather station for real time atmospheric corrections using GPU computing: Application to bridge monitoring", QIRT 2012, Naples, Italy, June 2012. [2] Cooley J.W., Tukey J.W., "An algorithm for the machine calculation of complex Fourier series", Mathematics of Computation, vol.
19, n° 90, 1965, p. 297-301. [3] Rajic N., "Principal component thermography for flaw contrast enhancement and flaw depth characterization in composite structures", Composite Structures, vol 58, pp 521-528, 2002. [4] Marinetti S., Grinzato E., Bison P. G., Bozzi E., Chimenti M., Pieri G. and Salvetti O. "Statistical analysis of IR thermographic sequences by PCA," Infrared Physics & Technology vol 46 pp 85-91, 2004.
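As an illustration of the two processing routes cited above (PPT-style FFT analysis and PCT), the following sketch applies a per-pixel FFT phase map and an SVD-based principal-component decomposition to a thermal image stack; it is my own simplified rendering under assumed array shapes, not the authors' implementation.

    import numpy as np

    def phase_image(sequence, bin_index=1):
        """Pulse-phase-style processing: per-pixel FFT along time, return phase of one bin."""
        seq = np.asarray(sequence, dtype=np.float64)    # shape (n_frames, rows, cols)
        spectrum = np.fft.rfft(seq, axis=0)
        return np.angle(spectrum[bin_index])            # phase map at the chosen frequency bin

    def pct_components(sequence, n_components=3):
        """Principal component thermography via SVD of the (time x pixels) data matrix."""
        seq = np.asarray(sequence, dtype=np.float64)
        n_frames, rows, cols = seq.shape
        data = seq.reshape(n_frames, rows * cols)
        data = data - data.mean(axis=0, keepdims=True)  # remove per-pixel temporal mean
        u, s, vt = np.linalg.svd(data, full_matrices=False)
        # Each row of vt is an empirical orthogonal function; reshape back to image form.
        return vt[:n_components].reshape(n_components, rows, cols)

    # Hypothetical 1-hour sequence at 0.1 Hz: 360 frames of 64x64 pixels.
    seq = np.random.default_rng(1).normal(290.0, 1.0, size=(360, 64, 64))
    print(phase_image(seq).shape, pct_components(seq).shape)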
Kong, Soo-Keun; Chon, Kyong-Myong; Goh, Eui-Kyung; Lee, Il-Woo; Wang, Soo-Geun
2014-05-01
High-resolution computed tomography has been used mainly in the diagnosis of middle ear disease, such as high jugular bulb, congenital cholesteatoma, and ossicular disruption. However, certain diagnoses are confirmed only through exploratory tympanotomy. There are few noninvasive methods available to observe the middle ear. The purpose of this study was to investigate the effect of glycerol as a refractive index matching material, used with an infrared (IR) camera system, for extratympanic observation. 30% glycerol was used as a refractive index matching material in five fresh cadavers. Each specimen was divided into four subgroups: the GN (glycerol no) group, the GO (glycerol out) group, the GI (glycerol in) group, and the GB (glycerol both) group. A printed letter and middle ear structures on the inner side of the tympanic membrane were observed using visible and IR camera systems. In the GB group, a transilluminated letter or ossicle on the inner side of the tympanic membrane was clearly observed. In particular, the footplate of the stapes was even transilluminated using the IR camera system in the GB group. This method can be useful in the diagnosis of diseases of the middle ear if it is clinically applied through further studies.
[Evaluation of Iris Morphology Viewed through Stromal Edematous Corneas by Infrared Camera].
Kobayashi, Masaaki; Morishige, Naoyuki; Morita, Yukiko; Yamada, Naoyuki; Kobayashi, Motomi; Sonoda, Koh-Hei
2016-02-01
We previously reported that the application of an infrared camera enables observation of iris morphology through edematous corneas in Peters' anomaly. The aim of this study was to observe iris morphology in bullous keratopathy or failed grafts with an infrared camera. Eleven subjects with bullous keratopathy or failed grafts (6 men and 5 women, mean age ± SD: 72.7 ± 13.0 years old) were enrolled in this study. The iris morphology was observed by applying the visible light mode and the near infrared light mode of an infrared camera (MeibomPen). The detectability of pupil shapes, iris patterns and the presence of iridectomy was evaluated. Infrared mode observation enabled us to detect the pupil shape in 11 out of 11 cases, iris patterns in 3 out of 11 cases, and the presence of iridectomy in 9 out of 11 cases, whereas visible light mode observation could not detect any iris morphological features. Applying infrared optics was valuable for observation of the iris morphology through stromal edematous corneas.
Tracking a Head-Mounted Display in a Room-Sized Environment with Head-Mounted Cameras
1990-04-01
poor resolution and a very limited working volume [Wan90]. OPTOTRAK [Nor88] uses one camera with two dual-axis CCD infrared position sensors. Each... [Nor88] Northern Digital. Trade literature on OPTOTRAK - Northern Digital's Three Dimensional Optical Motion Tracking and Analysis System. Northern Digital.
Coherent infrared imaging camera (CIRIC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.
1995-07-01
New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.
Data acquisition system for operational earth observation missions
NASA Technical Reports Server (NTRS)
Deerwester, J. M.; Alexander, D.; Arno, R. D.; Edsinger, L. E.; Norman, S. M.; Sinclair, K. F.; Tindle, E. L.; Wood, R. D.
1972-01-01
The data acquisition system capabilities expected to be available in the 1980 time period as part of operational Earth observation missions are identified. By data acquisition system is meant the sensor platform (spacecraft or aircraft), the sensors themselves and the communication system. Future capabilities and support requirements are projected for the following sensors: film camera, return beam vidicon, multispectral scanner, infrared scanner, infrared radiometer, microwave scanner, microwave radiometer, coherent side-looking radar, and scatterometer.
High-Resolution Mars Camera Test Image of Moon Infrared
2005-09-13
This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The image was taken by the High Resolution Imaging Science Experiment camera on Sept. 8, 2005.
Zhu, Banghe; Rasmussen, John C.; Sevick-Muraca, Eva M.
2014-01-01
Purpose: Although fluorescence molecular imaging is rapidly evolving as a new combinational drug/device technology platform for molecularly guided surgery and noninvasive imaging, there remain no performance standards for efficient translation of “first-in-humans” fluorescent imaging agents using these devices. Methods: The authors employed a stable, solid phantom designed to exaggerate the confounding effects of tissue light scattering and to mimic the low concentrations (nM–pM) of near-infrared fluorescent dyes expected clinically for molecular imaging, in order to evaluate and compare the charge coupled device (CCD) camera systems commonly employed in preclinical studies and in human investigational studies. Results: The results show that intensified CCD systems offer greater contrast with larger signal-to-noise ratios in comparison to unintensified CCD systems operated at clinically reasonable, subsecond acquisition times. Conclusions: Camera imaging performance could impact the success of future “first-in-humans” near-infrared fluorescence imaging agent studies. PMID:24506637
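For readers unfamiliar with the figures of merit compared in the abstract above, the sketch below computes a simple target-to-background contrast and signal-to-noise ratio from a phantom image; the definitions and region masks are illustrative assumptions rather than the authors' exact protocol.

    import numpy as np

    def contrast_and_snr(image, target_mask, background_mask):
        """Contrast = mean(target)/mean(background); SNR = (mean(target)-mean(bg))/std(bg)."""
        img = np.asarray(image, dtype=float)
        t = img[target_mask]
        b = img[background_mask]
        contrast = t.mean() / b.mean()
        snr = (t.mean() - b.mean()) / b.std(ddof=1)
        return contrast, snr

    # Hypothetical phantom image: dim fluorescent inclusion on a scattering background.
    rng = np.random.default_rng(2)
    img = rng.normal(100.0, 5.0, size=(256, 256))
    img[100:120, 100:120] += 30.0                    # fluorescent target region
    target = np.zeros_like(img, dtype=bool)
    target[100:120, 100:120] = True
    background = ~target
    print(contrast_and_snr(img, target, background))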
Research on Geometric Calibration of Spaceborne Linear Array Whiskbroom Camera
Sheng, Qinghong; Wang, Qi; Xiao, Hui; Wang, Qing
2018-01-01
The geometric calibration of a spaceborne thermal-infrared camera with a high spatial resolution and wide coverage can set benchmarks for providing an accurate geographical coordinate for the retrieval of land surface temperature. The practice of using linear array whiskbroom Charge-Coupled Device (CCD) arrays to image the Earth can help get thermal-infrared images of a large breadth with high spatial resolutions. Focusing on the whiskbroom characteristics of equal time intervals and unequal angles, the present study proposes a spaceborne linear-array-scanning imaging geometric model, whilst calibrating temporal system parameters and whiskbroom angle parameters. With the help of the YG-14—China’s first satellite equipped with thermal-infrared cameras of high spatial resolution—China’s Anyang Imaging and Taiyuan Imaging are used to conduct an experiment of geometric calibration and a verification test, respectively. Results have shown that the plane positioning accuracy without ground control points (GCPs) is better than 30 pixels and the plane positioning accuracy with GCPs is better than 1 pixel. PMID:29337885
AMICA: The First camera for Near- and Mid-Infrared Astronomical Imaging at Dome C
NASA Astrophysics Data System (ADS)
Straniero, O.; Dolci, M.; Valentini, A.; Valentini, G.; di Rico, G.; Ragni, M.; Giuliani, C.; di Cianno, A.; di Varano, I.; Corcione, L.; Bortoletto, F.; D'Alessandro, M.; Magrin, D.; Bonoli, C.; Giro, E.; Fantinel, D.; Zerbi, F. M.; Riva, A.; de Caprio, V.; Molinari, E.; Conconi, P.; Busso, M.; Tosti, G.; Abia, C. A.
AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging in the near- (1-5 μm) and mid- (5-27 μm) infrared wavelength regions. Equipped with two detectors, an InSb 256x256 and a Si:As 128x128 IBC, cooled at 35 and 7 K respectively, it will be the first instrument to investigate the potential of the Italian-French base Concordia for IR astronomy. The main technical challenge is represented by the extreme conditions of Dome C (T ≈ -90 °C, p ≈ 640 mbar). An environmental control system ensures the correct start-up, shut-down and housekeeping of the various components of the camera. AMICA will be mounted on the IRAIT telescope and will perform survey-mode observations in the Southern sky. The first task is to provide important site-quality data. Substantial contributions to the solution of fundamental astrophysical quests, such as those related to late phases of stellar evolution and to star formation processes, are also expected.
Infrared needle mapping to assist biopsy procedures and training.
Shar, Bruce; Leis, John; Coucher, John
2018-04-01
A computed tomography (CT) biopsy is a radiological procedure which involves using a needle to withdraw tissue or a fluid specimen from a lesion of interest inside a patient's body. The needle is progressively advanced into the patient's body, guided by the most recent CT scan. CT guided biopsies invariably expose patients to high dosages of radiation, due to the number of scans required whilst the needle is advanced. This study details the design of a novel method to aid biopsy procedures using infrared cameras. Two cameras are used to image the biopsy needle area, from which the proposed algorithm computes an estimate of the needle endpoint, which is projected onto the CT image space. This estimated position may be used to guide the needle between scans, and results in a reduction in the number of CT scans that need to be performed during the biopsy procedure. The authors formulate a 2D augmentation system which compensates for camera pose, and show that multiple low-cost infrared imaging devices provide a promising approach.
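The abstract does not give the estimation details, but a standard way to recover a 3D point estimate, such as a needle tip, from two calibrated cameras is linear (DLT) triangulation; the sketch below is an illustration of that step under assumed calibrated pinhole cameras, with invented projection matrices, and is not the authors' algorithm.

    import numpy as np

    def triangulate_point(P1, P2, uv1, uv2):
        """Linear (DLT) triangulation of a 3D point from two camera views.

        P1, P2: 3x4 camera projection matrices. uv1, uv2: (u, v) pixel coordinates
        of the same physical point (e.g. the needle tip) in each image.
        """
        u1, v1 = uv1
        u2, v2 = uv2
        A = np.stack([u1 * P1[2] - P1[0],
                      v1 * P1[2] - P1[1],
                      u2 * P2[2] - P2[0],
                      v2 * P2[2] - P2[1]])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]                      # homogeneous -> Euclidean coordinates

    # Hypothetical rig: two cameras with identity intrinsics, second one shifted 100 mm in x.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])
    tip = np.array([20.0, 10.0, 500.0, 1.0])
    uv = lambda P: (P @ tip)[:2] / (P @ tip)[2]
    print(triangulate_point(P1, P2, uv(P1), uv(P2)))   # should recover ~[20, 10, 500]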
Concept of electro-optical sensor module for sniper detection system
NASA Astrophysics Data System (ADS)
Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz
2010-10-01
The paper presents an initial concept of an electro-optical sensor unit for sniper detection. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. As part of a larger system it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data and sensor fusion techniques. Additionally, it is expected to provide some pre-shot detection capability. Acoustic (or radar) systems used for shot detection generally offer only "after-the-shot" information and cannot prevent an enemy attack, which in the case of a skilled sniper opponent usually means trouble. The passive imaging sensors presented in this paper, together with active systems detecting pointed optics, are capable of detecting specific shooter signatures or at least the presence of suspicious objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters such as focal plane array size and type, focal length and aperture were chosen on the basis of the assumed tactical characteristics of the system (mainly detection range) and the current technology level. In order to provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image acquisition rate. The daylight camera operates as a support, providing a corresponding visual image that is easier for a human operator to comprehend. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot recording sequences are presented.
Near-infrared high-resolution real-time omnidirectional imaging platform for drone detection
NASA Astrophysics Data System (ADS)
Popovic, Vladan; Ott, Beat; Wellig, Peter; Leblebici, Yusuf
2016-10-01
Recent technological advancements in hardware systems have made higher-quality cameras possible. State-of-the-art panoramic systems use them to produce videos with a resolution of 9000 x 2400 pixels at a rate of 30 frames per second (fps) [1]. Many modern applications use object tracking to determine the speed and the path taken by each object moving through a scene. The detection requires detailed pixel analysis between two frames. In fields like surveillance systems or crowd analysis, this must be achieved in real time [2]. In this paper, we focus on the system-level design of a multi-camera sensor acquiring the near-infrared (NIR) spectrum and its ability to detect mini-UAVs in a representative rural Swiss environment. The presented results show the UAV detection performance from a field trial that we conducted in August 2015.
A low-cost video-oculography system for vestibular function testing.
Jihwan Park; Youngsun Kong; Yunyoung Nam
2017-07-01
In order to keep vision in focus during head movements, the vestibulo-ocular reflex causes the eyes to move in the opposite direction to the head movement. Disorders of the vestibular system degrade vision, causing abnormal nystagmus and dizziness. To diagnose abnormal nystagmus, various approaches have been reported, including the use of rotating chair tests and videonystagmography. However, these tests are unsuitable for home use due to their high costs. Thus, a low-cost video-oculography system is necessary to obtain clinical features at home. In this paper, we present a low-cost video-oculography system using an infrared camera and a Raspberry Pi board for tracking the pupils and evaluating the vestibular system. Horizontal eye movement is derived from video data obtained from an infrared camera and infrared light-emitting diodes, and the velocity of head rotation is obtained from a gyroscope sensor. Each pupil was extracted using a morphology operation and a contour detection method. Rotatory chair tests were conducted with our developed device. To evaluate our system, gain, asymmetry, and phase were measured and compared with System 2000. The average IQR errors of gain, phase and asymmetry were 0.81, 2.74 and 17.35, respectively. We showed that our system is able to measure these clinical features.
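As a rough illustration of the pupil-extraction step mentioned above (a morphology operation followed by contour detection), the following OpenCV sketch locates a dark pupil blob; the threshold, kernel size and file name are placeholders, not the authors' values.

    import cv2
    import numpy as np

    def pupil_center(gray_frame):
        """Locate the pupil centre in an IR eye image via thresholding, morphology and contours."""
        # The pupil is the darkest large blob under IR illumination.
        _, binary = cv2.threshold(gray_frame, 40, 255, cv2.THRESH_BINARY_INV)
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
        cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)    # remove specular glints
        cleaned = cv2.morphologyEx(cleaned, cv2.MORPH_CLOSE, kernel)  # fill small holes
        # OpenCV 4.x return signature: (contours, hierarchy).
        contours, _ = cv2.findContours(cleaned, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        pupil = max(contours, key=cv2.contourArea)                    # largest dark blob
        m = cv2.moments(pupil)
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])             # (x, y) centroid

    # Hypothetical use on one frame from the Raspberry Pi IR camera.
    frame = cv2.imread("eye_frame.png", cv2.IMREAD_GRAYSCALE)         # placeholder file name
    if frame is not None:
        print(pupil_center(frame))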
Application of infrared camera to bituminous concrete pavements: measuring vehicle
NASA Astrophysics Data System (ADS)
Janků, Michal; Stryk, Josef
2017-09-01
Infrared thermography (IRT) has been used for decades in certain fields. However, the technological level of the measuring devices has not been sufficient for some applications. Over recent years, good-quality thermal cameras with high resolution and very high thermal sensitivity have started to appear on the market. This development in measuring technology has allowed the use of infrared thermography in new fields and by a larger number of users. This article describes research in progress at the Transport Research Centre focused on the use of infrared thermography for diagnostics of bituminous road pavements. A measuring vehicle, equipped with a thermal camera, a digital camera and a GPS sensor, was designed for the diagnostics of pavements. New, highly sensitive thermal cameras allow very small temperature differences to be measured from the moving vehicle. This study shows the potential of high-speed inspection without lane closures when using IR thermography.
A control system of a mini survey facility for photometric monitoring
NASA Astrophysics Data System (ADS)
Tsutsui, Hironori; Yanagisawa, Kenshi; Izumiura, Hideyuki; Shimizu, Yasuhiro; Hanaue, Takumi; Ita, Yoshifusa; Ichikawa, Takashi; Komiyama, Takahiro
2016-08-01
We have built a control system for a mini survey facility dedicated to photometric monitoring of nearby bright (K<5) stars in the near-infrared region. The facility comprises a 4-m-diameter rotating dome and a small (30-mm aperture) wide-field (5 × 5 sq. deg. field of view) infrared (1.0-2.5 microns) camera on an equatorial fork mount, as well as power sources and other associated equipment. All the components other than the camera are controlled by microcomputer-based I/O boards that were developed in-house and are used in many of the open-use instruments in our observatory. We present the specifications and configuration of the facility hardware, as well as the structure of its control software.
NASA Astrophysics Data System (ADS)
Vainer, Boris G.; Morozov, Vitaly V.
A particular branch of biophotonics is the measurement, visualisation and quantitative analysis of infrared (IR) radiation emitted from the surfaces of living objects. Focal plane array (FPA)-based IR cameras make it possible to realize in medicine so-called interventional infrared thermal diagnostics. An integrated technique aimed at advancing this new approach in biomedical science and practice is described in the paper. The assembled system includes a high-performance short-wave (2.45-3.05 μm) or long-wave (8-14 μm) IR camera, two laser Doppler flowmeters (LDF) and additional equipment and complementary facilities implementing the monitoring of human cardiovascular status. All these means operate synchronously. The relationship between infrared thermography (IRT) and LDF data in humans, with regard to systemic cardiovascular reactivity, is ascertained here for the first time. The real-time dynamics of blood supply in an anesthetized patient are visualized and quantitatively represented for the first time during surgery, in order to observe how general hyperoxia influences thermoregulatory mechanisms; an abrupt increase in the temperature of the upper limb is observed using IRT. It is outlined that the IRT-based integrated technique may serve as a starting point for the elaboration of informative new methods directly applicable to medicine and the biomedical sciences.
Unstructured Facility Navigation by Applying the NIST 4D/RCS Architecture
2006-07-01
control, and the planner); wireless data and emergency stop radios; GPS receiver; inertial navigation unit; dual stereo cameras; infrared sensors... used in the sensory processing module include the two pairs of stereo color cameras, the physical bumper and infrared bumper sensors, the motor
Research on a solid state-streak camera based on an electro-optic crystal
NASA Astrophysics Data System (ADS)
Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang
2006-06-01
With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, a streak camera is widely utilized in measuring ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and making measurements of laser-induced discharge. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity vs. position (or wavelength) information on the ultrafast process. The current streak camera is based on a sweep electric pulse and an image converting tube with a wavelength-sensitive photocathode ranging from the x-ray to the near infrared region. This kind of streak camera is comparatively costly and complex. This paper describes the design and performance of a new-style streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera achieves time resolution by direct photon beam deflection using the electro-optic effect, and can replace the current streak camera from the visible to the near infrared region. After computer-aided simulation, we designed a crystal streak camera which has a potential time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits, a crystal with a larger electro-optic coefficient, for example LiNbO3 (LN, γ33 = 33.6×10^-12 m/V), and an optimized optical system may lead to a time resolution better than 1 ns.
1999-05-12
to an infrared television camera AVTO TVS-2100. The detector in the camera was an InSb crystal having peak sensitivity in the wavelength region between 3.0... (Navy Case 79,823)
Students' Framing of Laboratory Exercises Using Infrared Cameras
ERIC Educational Resources Information Center
Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.
2015-01-01
Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the…
Preliminary Study of UAS Equipped with Thermal Camera for Volcanic Geothermal Monitoring in Taiwan
Chio, Shih-Hong; Lin, Cheng-Horng
2017-01-01
Thermal infrared cameras sense the temperature information of the sensed scene. With the development of UASs (Unmanned Aircraft Systems), thermal infrared cameras can now be carried on a quadcopter UAV (Unmanned Aircraft Vehicle) to collect high-resolution thermal images for volcanic geothermal monitoring over a local area. A quadcopter UAS for acquiring thermal images for volcanic geothermal monitoring has therefore been developed in Taiwan as part of this study, to overcome the difficult terrain with highly variable topography and extreme environmental conditions. An XM6 thermal infrared camera was employed in this thermal image collection system. A Trimble BD970 GNSS (Global Navigation Satellite System) OEM (Original Equipment Manufacturer) board was also carried on the quadcopter UAV to gather dual-frequency GNSS observations in order to determine the flying trajectory data by using the Post-Processed Kinematic (PPK) technique; this is used to establish the position and orientation of the collected thermal images with fewer ground control points (GCPs). The digital surface model (DSM) and thermal orthoimages were then produced from the collected thermal images. Tests conducted in the Hsiaoyukeng area of Taiwan's Yangmingshan National Park show that about 37% of the differences between the produced DSM and airborne LIDAR (Light Detection and Ranging) data fall between −1 m and 1 m, and about 66% between −2 m and 2 m, in the area surrounded by GCPs. As the accuracy of the thermal orthoimages is about 1.78 m, it is deemed sufficient for volcanic geothermal monitoring. In addition, the thermal orthoimages not only reveal some phenomena more globally than the traditional methods for volcanic geothermal monitoring, but also show that the developed system can be further employed in Taiwan in the future. PMID:28718790
NASA Astrophysics Data System (ADS)
Castro Marín, J. M.; Brown, V. J. G.; López Jiménez, A. C.; Rodríguez Gómez, J.; Rodrigo, R.
2001-05-01
The Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS) is an instrument carried on board the European Space Agency spacecraft Rosetta, which will be launched in January 2003 to study the comet Wirtanen in situ. The electronic design of the mechanism controller board (MCB) system of the two OSIRIS optical cameras, the narrow angle camera and the wide angle camera, is described here. The system comprises two boards mounted on an aluminum frame as part of an electronics box that contains the power supply and the digital processor unit of the instrument. The mechanisms controlled by the MCB for each camera are the front door assembly and a filter wheel assembly. The front door assembly for each camera is driven by a four-phase, permanent magnet stepper motor. Each filter wheel assembly consists of two eight-filter wheels, each driven by a four-phase, variable reluctance stepper motor. Each motor, for all the assemblies, also contains a redundant set of four stator phase windings that can be energized separately or in parallel with the main windings. All stepper motors are driven in both directions using the full-step unipolar mode of operation. The MCB also performs general housekeeping data acquisition for the OSIRIS instrument, i.e., mechanism position encoders and temperature measurements. The electronic design is quite new in that it uses field programmable gate array (FPGA) devices, avoiding the now traditional approach of a system controlled by microcontrollers and software. Electrical tests of the engineering model have been performed successfully and the system is ready for space qualification after environmental testing. This system may be of interest to institutions involved in future space experiments with similar needs for mechanism control.
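To make the drive mode concrete, the sketch below shows a full-step, one-phase-on sequence for a four-phase unipolar stepper, as it might be exercised from a bench-test script; it is an illustration only, not the flight MCB logic, and the write_phases callback is a placeholder.

    from typing import Callable

    # Full-step, one-phase-on unipolar sequence for a four-phase stepper motor:
    # energize phases A, B, C, D in turn (reverse the order to reverse direction).
    FULL_STEP_SEQUENCE = [
        (1, 0, 0, 0),   # phase A
        (0, 1, 0, 0),   # phase B
        (0, 0, 1, 0),   # phase C
        (0, 0, 0, 1),   # phase D
    ]

    def step(write_phases: Callable[[int, int, int, int], None], n_steps: int, forward: bool = True):
        """Issue n_steps full steps through the four phase windings."""
        seq = FULL_STEP_SEQUENCE if forward else FULL_STEP_SEQUENCE[::-1]
        for i in range(n_steps):
            write_phases(*seq[i % 4])

    # Hypothetical bench driver that just logs the energized winding pattern.
    step(lambda a, b, c, d: print(a, b, c, d), n_steps=8, forward=True)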
Of Detection Limits and Effective Mitigation: The Use of Infrared Cameras for Methane Leak Detection
NASA Astrophysics Data System (ADS)
Ravikumar, A. P.; Wang, J.; McGuire, M.; Bell, C.; Brandt, A. R.
2017-12-01
Mitigating emissions of methane, a short-lived and potent greenhouse gas, is critical to limiting global temperature rise to two degrees Celsius as outlined in the Paris Agreement. A major source of anthropogenic methane emissions in the United States is the oil and gas sector. To this end, state and federal governments have recommended the use of optical gas imaging systems in periodic leak detection and repair (LDAR) surveys to detect fugitive emissions or leaks. The most commonly used optical gas imaging (OGI) systems are infrared cameras. In this work, we systematically evaluate the limits of infrared (IR) camera-based OGI systems for use in methane leak detection programs. We analyze the effect of various parameters that influence the minimum detectable leak rates of infrared cameras. Blind leak detection tests were carried out at the Department of Energy's MONITOR natural gas test facility in Fort Collins, CO. Leak sources included natural gas wellheads, separators, and tanks. With an EPA-mandated 60 g/hr leak detection threshold for IR cameras, we test leak rates ranging from 4 g/hr to over 350 g/hr at imaging distances between 5 ft and 70 ft from the leak source. We perform these experiments over the course of a week, encompassing a wide range of wind and weather conditions. Using repeated measurements at a given leak rate and imaging distance, we generate detection probability curves as a function of leak size for various imaging distances and measurement conditions. In addition, we estimate the median detection threshold - the leak size at which the probability of detection is 50% - under various scenarios to reduce uncertainty in mitigation effectiveness. Preliminary analysis shows that the median detection threshold varies from 3 g/hr at an imaging distance of 5 ft to over 150 g/hr at 50 ft (ambient temperature: 80 F, winds < 4 m/s). Results from this study can be directly used to improve OGI-based LDAR protocols and reduce uncertainty in estimated mitigation effectiveness. Furthermore, detection limits determined in this study can be used as standards against which new detection technologies are compared.
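A minimal sketch of how repeated blind-test outcomes can be turned into a detection-probability curve and a median (50%) detection threshold; the leak rates, outcomes and logistic fit below are invented placeholders, not the study's data or analysis code.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(log_rate, midpoint, slope):
        """Probability of detection as a logistic function of log10(leak rate)."""
        return 1.0 / (1.0 + np.exp(-slope * (log_rate - midpoint)))

    # Hypothetical blind-test data at one imaging distance: leak rate (g/hr), detected (1) or missed (0).
    rates = np.array([4, 4, 10, 10, 25, 25, 60, 60, 150, 150, 350, 350], dtype=float)
    detected = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1], dtype=float)

    # Quick least-squares fit of the logistic curve (not a formal logistic regression).
    params, _ = curve_fit(logistic, np.log10(rates), detected,
                          p0=[np.log10(30.0), 2.0], maxfev=10000)
    midpoint, slope = params
    median_threshold = 10 ** midpoint           # leak rate with 50% detection probability
    print(f"median detection threshold ~ {median_threshold:.1f} g/hr, slope = {slope:.2f}")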
Design and Calibration of a Dispersive Imaging Spectrometer Adaptor for a Fast IR Camera on NSTX-U
NASA Astrophysics Data System (ADS)
Reksoatmodjo, Richard; Gray, Travis; Princeton Plasma Physics Laboratory Team
2017-10-01
A dispersive spectrometer adaptor was designed, constructed and calibrated for use on a fast infrared camera employed to measure temperatures on the lower divertor tiles of the NSTX-U tokamak. This adaptor efficiently and evenly filters and distributes long-wavelength infrared photons between 8.0 and 12.0 microns across the 128x128 pixel detector of the fast IR camera. By determining the width of these separated wavelength bands across the camera detector, and then determining the corresponding average photon count for each photon wavelength, a very accurate measurement of the temperature, and thus heat flux, of the divertor tiles can be calculated using Planck's law. This approach of designing an exterior dispersive adaptor for the fast IR camera allows accurate temperature measurements to be made of materials with unknown emissivity. Further, the relative simplicity and affordability of this adaptor design provides an attractive option over more expensive, slower, dispersive IR camera systems. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No. DE-AC02-09CH11466.
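The adaptor above uses several wavelength bands precisely so that the emissivity need not be known in advance; purely to illustrate the underlying Planck relation, the sketch below inverts Planck's spectral radiance law at a single wavelength for an assumed grey emitter (my own simplification, not the diagnostic's calibration code).

    import numpy as np

    H = 6.62607015e-34   # Planck constant, J s
    C = 2.99792458e8     # speed of light, m/s
    KB = 1.380649e-23    # Boltzmann constant, J/K

    def planck_radiance(wavelength_m, temperature_k):
        """Spectral radiance B(lambda, T), W / (m^2 sr m)."""
        x = H * C / (wavelength_m * KB * temperature_k)
        return (2.0 * H * C**2 / wavelength_m**5) / np.expm1(x)

    def brightness_temperature(wavelength_m, radiance, emissivity=1.0):
        """Invert Planck's law at one wavelength for the temperature of a grey emitter."""
        b = radiance / emissivity
        x = 2.0 * H * C**2 / (wavelength_m**5 * b)
        return H * C / (wavelength_m * KB * np.log1p(x))

    # Hypothetical 10 micron band measurement of a ~600 K divertor tile with emissivity 0.8.
    lam = 10e-6
    measured = 0.8 * planck_radiance(lam, 600.0)
    print(brightness_temperature(lam, measured, emissivity=0.8))   # ~600.0 K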
NASA Astrophysics Data System (ADS)
Tabuchi, Toru; Yamagata, Shigeki; Tamura, Tetsuo
2003-04-01
There are increasing demands for information that helps avoid accidents as automobile traffic increases. We will show that an infrared camera can identify three conditions of the road surface (dry, aquaplaning, frozen). The principles of this method are: (1) we have found that a 3-color infrared camera can distinguish those conditions using proper data processing; (2) the emissivity of the materials on the road surface (concrete, water, ice) differs in the three wavelength regions; (3) the sky's temperature is lower than the road's. The emissivity of the road depends on the road surface condition. Therefore, the 3-color infrared camera measures both the sky energy reflected from the road surface and the self-radiation of the road surface. The road condition can be distinguished by processing the energy pattern measured in the three wavelength regions. We collected experimental results showing that the emissivity of concrete differs from that of water. An infrared camera whose NETD (Noise Equivalent Temperature Difference) in each of the three wavelength bands is 1.0 °C or less can distinguish the road conditions by using the emissivity differences.
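The classification idea can be sketched as a nearest-pattern match on the three-band signal; the signatures and measurement below are invented placeholders rather than the emissivities reported in the paper.

    import numpy as np

    # Hypothetical normalized three-band signatures for each surface condition
    # (values are illustrative placeholders, not measured emissivities).
    SIGNATURES = {
        "dry":    np.array([0.95, 0.96, 0.94]),
        "wet":    np.array([0.98, 0.93, 0.96]),
        "frozen": np.array([0.97, 0.98, 0.92]),
    }

    def classify_road_condition(band_signal):
        """Return the condition whose three-band pattern is closest to the measurement."""
        s = np.asarray(band_signal, dtype=float)
        s = s / np.linalg.norm(s)                 # compare the pattern, not the absolute level
        best, best_dist = None, np.inf
        for name, ref in SIGNATURES.items():
            d = np.linalg.norm(s - ref / np.linalg.norm(ref))
            if d < best_dist:
                best, best_dist = name, d
        return best

    print(classify_road_condition([0.97, 0.94, 0.955]))   # hypothetical measurement -> "wet"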
C-RED One and C-RED2: SWIR high-performance cameras using Saphira e-APD and Snake InGaAs detectors
NASA Astrophysics Data System (ADS)
Gach, Jean-Luc; Feautrier, Philippe; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Carmignani, Thomas; Wanwanscappel, Yann; Boutolleau, David
2018-02-01
After the development of the OCAM2 EMCCD fast visible camera dedicated to advanced adaptive optics wavefront sensing, First Light Imaging moved to the SWIR fast cameras with the development of the C-RED One and the C-RED 2 cameras. First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with a subelectron readout noise and very low background. C-RED One is based on the last version of the SAPHIRA detector developed by Leonardo UK. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array which is a real disruptive technology in imagery. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of read out techniques and processes video on-board thanks to an FPGA. We will show its performances and expose its main features. In addition to this project, First Light Imaging developed an InGaAs 640x512 fast camera with unprecedented performances in terms of noise, dark and readout speed based on the SNAKE SWIR detector from Sofradir. The camera was called C-RED 2. The C-RED 2 characteristics and performances will be described. The C-RED One project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944. The C-RED 2 development is supported by the "Investments for the future" program and the Provence Alpes Côte d'Azur Region, in the frame of the CPER.
The Sensor Irony: How Reliance on Sensor Technology is Limiting Our View of the Battlefield
2010-05-10
thermal) camera, as well as a laser illuminator/range finder. Similar to the MQ-1, the MQ-9 Reaper is primarily a strike asset for emerging targets... Wescam 14TS. Both systems have an electro-optical (daylight) TV camera, an infrared (thermal) camera, as well as a laser illuminator/range finder...
NASA Technical Reports Server (NTRS)
Simpson, C.; Eisenhardt, P.
1998-01-01
We investigate the ability of the Space Infrared Telescope Facility's Infrared Array Camera to detect distant (z ≳ 3) galaxies and measure their photometric redshifts. Our analysis shows that changing the original long-wavelength filter specifications provides significant improvements in performance in this and other areas.
NASA Astrophysics Data System (ADS)
Markham, James; Cosgrove, Joseph; Scire, James; Haldeman, Charles; Agoos, Ian
2014-12-01
This paper announces the implementation of a long wavelength infrared camera to obtain high-speed thermal images of an aircraft engine's in-service thermal barrier coated turbine blades. Long wavelength thermal images were captured of first-stage blades. The achieved temporal and spatial resolutions allowed for the identification of cooling-hole locations. The software and synchronization components of the system allowed for the selection of any blade on the turbine wheel, with tuning capability to image from leading edge to trailing edge. Its first application delivered calibrated thermal images as a function of turbine rotational speed at both steady state conditions and during engine transients. In advance of presenting these data for the purpose of understanding engine operation, this paper focuses on the components of the system, verification of high-speed synchronized operation, and the integration of the system with the commercial jet engine test bed.
2001-11-26
KENNEDY SPACE CENTER, Fla. -- A piece of equipment for the Hubble Space Telescope servicing mission is moved inside Hangar AE, Cape Canaveral. In the canister is the Advanced Camera for Surveys (ACS). The ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. The goal of the mission, STS-109, is to service the HST, replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002.
2001-11-29
KENNEDY SPACE CENTER, Fla. -- In Hangar A&E, workers watch as an overhead crane lifts the Advanced Camera for Surveys out of its transportation container. Part of the payload on the Hubble Space Telescope Servicing Mission, STS-109, the ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. Tasks for the mission include replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
2001-11-26
KENNEDY SPACE CENTER, Fla. - A piece of equipment for the Hubble Space Telescope servicing mission arrives at Hangar AE, Cape Canaveral. Inside the canister is the Advanced Camera for Surveys (ACS). The ACS will increase the discovery efficiency of the HST by a factor of ten. It consists of three electronic cameras and a complement of filters and dispersers that detect light from the ultraviolet to the near infrared (1200 - 10,000 angstroms). The ACS was built through a collaborative effort between Johns Hopkins University, Goddard Space Flight Center, Ball Aerospace Corporation and Space Telescope Science Institute. The goal of the mission, STS-109, is to service the HST, replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the ACS, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002.
ARNICA, the Arcetri near-infrared camera: Astronomical performance assessment.
NASA Astrophysics Data System (ADS)
Hunt, L. K.; Lisi, F.; Testi, L.; Baffa, C.; Borelli, S.; Maiolino, R.; Moriondo, G.; Stanga, R. M.
1996-01-01
The Arcetri near-infrared camera ARNICA was built as a users' instrument for the Infrared Telescope at Gornergrat (TIRGO), and is based on a 256x256 NICMOS 3 detector. In this paper, we discuss ARNICA's optical and astronomical performance at the TIRGO and at the William Herschel Telescope on La Palma. Optical performance is evaluated in terms of plate scale, distortion, point spread function, and ghosting. Astronomical performance is characterized by camera efficiency, sensitivity, and spatial uniformity of the photometry.
Development of an Extra-vehicular (EVA) Infrared (IR) Camera Inspection System
NASA Technical Reports Server (NTRS)
Gazarik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Pandolf, John; Jenkins, Rusty; Yates, Rusty
2006-01-01
Designed to fulfill a critical inspection need for the Space Shuttle Program, the EVA IR Camera System can detect cracks and subsurface defects in the Reinforced Carbon-Carbon (RCC) sections of the Space Shuttle's Thermal Protection System (TPS). The EVA IR Camera performs this detection by taking advantage of the natural thermal gradients induced in the RCC by solar flux and thermal emission from the Earth. This instrument is a compact, low-mass, low-power solution (1.2 cm3, 1.5 kg, 5.0 W) for TPS inspection that exceeds existing requirements for feature detection. Taking advantage of ground-based IR thermography techniques, the EVA IR Camera System provides the Space Shuttle Program with a solution that can be accommodated by the existing inspection system. The EVA IR Camera System augments the visible and laser inspection systems and finds cracks and subsurface damage that are not measurable by the other sensors, and thus fills a critical gap in the Space Shuttle's inspection needs. This paper discusses the on-orbit RCC inspection measurement concept and requirements, and then presents a detailed description of the EVA IR Camera System design.
NASA Technical Reports Server (NTRS)
Howell, Patricia A.; Winfree, William P.; Cramer, K. Elliott
2008-01-01
On July 12, 2006, British-born astronaut Piers Sellers became the first person to conduct thermal nondestructive evaluation experiments in space, demonstrating the feasibility of a new tool for detecting damage to the reinforced carbon-carbon (RCC) structures of the Shuttle. This new tool was an EVA (Extravehicular Activity, or spacewalk) compatible infrared camera developed by NASA engineers. Data was collected both on the wing leading edge of the Orbiter and on pre-damaged samples mounted in the Shuttle's cargo bay. A total of 10 infrared movies were collected during the EVA, totaling over 250 megabytes of data. Images were downloaded from the orbiting Shuttle to Johnson Space Center for analysis and processing. Results are shown to be comparable to ground-based thermal inspections performed in the laboratory with the same type of camera and simulated solar heating. The EVA camera system detected flat-bottom holes as small as 2.54 cm in diameter with 50% material loss from the back (hidden) surface in RCC during this first test of the EVA IR Camera. Data for the time history of the specimen temperature and the capability of the inspection system for imaging impact damage are presented.
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao; Gu, Guohua; Chen, Qian
2018-02-01
For the uncooled long-wave infrared (LWIR) camera, the infrared (IR) irradiation the focal plane array (FPA) receives is a crucial factor that affects the image quality. Ambient temperature fluctuation as well as system power consumption can result in changes of FPA temperature and radiation characteristics inside the IR camera; these will further degrade the imaging performance. In this paper, we present a novel shutterless non-uniformity correction method to compensate for non-uniformity derived from the variation of ambient temperature. Our method combines a calibration-based method and the properties of a scene-based method to obtain correction parameters at different ambient temperature conditions, so that the IR camera performance is less influenced by ambient temperature fluctuation or system power consumption. The calibration process is carried out in a temperature chamber with slowly changing ambient temperature and a black body as a uniform radiation source. Enough uniform images are captured and the gain coefficients are calculated during this period. Then, in practical application, the offset parameters are calculated via the least squares method based on the gain coefficients, the captured uniform images and the actual scene. Thus we can get a corrected output through the gain coefficients and offset parameters. The performance of our proposed method is evaluated on realistic IR images and compared with two existing methods. The images used in the experiments were obtained with a 384 × 288 pixel uncooled LWIR camera. Results show that our proposed method can adaptively update correction parameters as the actual target scene changes and is more stable to temperature fluctuation than the other two methods.
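For orientation, here is a minimal sketch of the classic two-point (gain/offset) non-uniformity correction that such methods build on; the paper's shutterless contribution, re-estimating the offsets from the actual scene by least squares as the ambient temperature drifts, is not reproduced here. Frame sizes and blackbody temperatures below are assumptions.

```python
import numpy as np

def two_point_nuc(frame_cold, frame_hot, t_cold=20.0, t_hot=40.0):
    """Per-pixel gain and offset from two uniform blackbody frames (assumed temperatures)."""
    gain = (t_hot - t_cold) / np.maximum(frame_hot - frame_cold, 1e-6)
    offset = t_cold - gain * frame_cold
    return gain, offset

def correct(raw_frame, gain, offset):
    """Map raw counts onto a uniform radiometric scale."""
    return gain * raw_frame + offset

# Example with synthetic 384 x 288 frames:
rng = np.random.default_rng(0)
cold = 1000 + rng.normal(0, 5, (288, 384))
hot = 1400 + rng.normal(0, 5, (288, 384))
gain, offset = two_point_nuc(cold, hot)
print(correct(cold, gain, offset).mean())  # ~20.0 by construction
```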
Combined hostile fire and optics detection
NASA Astrophysics Data System (ADS)
Brännlund, Carl; Tidström, Jonas; Henriksson, Markus; Sjöqvist, Lars
2013-10-01
Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (Short Wave Infrared) camera-based system with the capability to detect and locate snipers both before and after a shot over a large field-of-view. The high-frame-rate SWIR camera resolves the temporal profile of muzzle flashes, the infrared signature associated with the ejection of the bullet from the rifle. The capability to detect and discriminate sniper muzzle flashes with this system has been verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with a slit-shaped beam profile is scanned over the camera field-of-view to detect retro-reflection from optical sights. The optics detection system has been tested at various distances up to 1.15 km, showing the feasibility of detecting rifle scopes in full daylight. The high-speed camera makes it possible to discriminate false alarms by analyzing the temporal data. The intensity variation caused by atmospheric turbulence enables discrimination of small sights from larger reflectors due to aperture averaging, even though the targets only cover a single pixel. It is shown that optics detection can be integrated in combination with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability by continuous surveillance of a relatively large field-of-view looks promising. This type of multifunctional system may become an important tool to detect snipers before and after a shot.
An Inexpensive Digital Infrared Camera
ERIC Educational Resources Information Center
Mills, Allan
2012-01-01
Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)
33 CFR 117.993 - Lake Champlain.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) A sufficient number of infrared cameras shall be maintained in good working order at all times with... infrared cameras to verify that the channel is clear of all approaching vessel traffic. All approaching...
33 CFR 117.993 - Lake Champlain.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) A sufficient number of infrared cameras shall be maintained in good working order at all times with... infrared cameras to verify that the channel is clear of all approaching vessel traffic. All approaching...
A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer.
Shen, Bailey Y; Mukai, Shizuo
2017-01-01
Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient.
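The paper does not publish the device's control code; as a rough illustration of the capture sequence it describes (an infrared-illuminated view for alignment, then a brief white-light flash for the photograph), here is a minimal sketch assuming the standard picamera and RPi.GPIO libraries. The GPIO pin assignments and resolution are hypothetical.

```python
import time
import RPi.GPIO as GPIO          # standard Raspberry Pi GPIO library
from picamera import PiCamera    # standard Raspberry Pi camera library

IR_LED_PIN = 17      # hypothetical pin driving the infrared LED
WHITE_LED_PIN = 27   # hypothetical pin driving the white-light LED

GPIO.setmode(GPIO.BCM)
GPIO.setup([IR_LED_PIN, WHITE_LED_PIN], GPIO.OUT, initial=GPIO.LOW)

camera = PiCamera(resolution=(1640, 1232))

# Align on the infrared-illuminated preview (pupil stays undilated under IR)...
GPIO.output(IR_LED_PIN, GPIO.HIGH)
camera.start_preview()
time.sleep(5)                      # operator aligns the condensing lens and pupil
camera.stop_preview()
GPIO.output(IR_LED_PIN, GPIO.LOW)

# ...then flash the white LED briefly and capture the fundus image.
GPIO.output(WHITE_LED_PIN, GPIO.HIGH)
camera.capture('fundus.jpg')
GPIO.output(WHITE_LED_PIN, GPIO.LOW)
GPIO.cleanup()
```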
Reflective all-sky thermal infrared cloud imager.
Redman, Brian J; Shaw, Joseph A; Nugent, Paul W; Clark, R Trevor; Piazzolla, Sabino
2018-04-30
A reflective all-sky imaging system has been built using a long-wave infrared microbolometer camera and a reflective metal sphere. This compact system was developed for measuring spatial and temporal patterns of clouds and their optical depth in support of applications including Earth-space optical communications. The camera is mounted to the side of the reflective sphere to leave the zenith sky unobstructed. The resulting geometric distortion is removed through an angular map derived from a combination of checkerboard-target imaging, geometric ray tracing, and sun-location-based alignment. A tape of high-emissivity material on the side of the reflector acts as a reference that is used to estimate and remove thermal emission from the metal sphere. Once a bias that is under continuing study was removed, sky radiance measurements from the all-sky imager in the 8-14 μm wavelength range agreed to within 0.91 W/(m² sr) of measurements from a previously calibrated, lens-based infrared cloud imager over its 110° field of view.
C-RED one: ultra-high speed wavefront sensing in the infrared made possible
NASA Astrophysics Data System (ADS)
Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian
2016-07-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with subelectron readout noise. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array, a truly disruptive technology in imaging. We will show the performance of the camera and its main features, and compare them to other high-performance wavefront sensing cameras such as OCAM2 in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
Cryogenic optical systems for the rapid infrared imager/spectrometer (RIMAS)
NASA Astrophysics Data System (ADS)
Capone, John I.; Content, David A.; Kutyrev, Alexander S.; Robinson, Frederick D.; Lotkin, Gennadiy N.; Toy, Vicki L.; Veilleux, Sylvain; Moseley, Samuel H.; Gehrels, Neil A.; Vogel, Stuart N.
2014-07-01
The Rapid Infrared Imager/Spectrometer (RIMAS) is designed to perform follow-up observations of transient astronomical sources at near infrared (NIR) wavelengths (0.9 - 2.4 microns). In particular, RIMAS will be used to perform photometric and spectroscopic observations of gamma-ray burst (GRB) afterglows to complement the Swift satellite's science goals. Upon completion, RIMAS will be installed on Lowell Observatory's 4.3 meter Discovery Channel Telescope (DCT) located in Happy Jack, Arizona. The instrument's optical design includes a collimator lens assembly, a dichroic to divide the wavelength coverage into two optical arms (0.9 - 1.4 microns and 1.4 - 2.4 microns, respectively), and a camera lens assembly for each optical arm. Because the wavelength coverage extends out to 2.4 microns, all optical elements are cooled to ~70 K. Filters and transmission gratings are located on wheels prior to each camera, allowing the instrument to be quickly configured for photometry or spectroscopy. An athermal optomechanical design is being implemented to prevent lenses from losing their room temperature alignment as the system is cooled. The thermal expansion of materials used in this design has been measured in the lab. Additionally, RIMAS has a guide camera consisting of four lenses to aid observers in passing light from target sources through spectroscopic slits. Efforts to align these optics are ongoing.
The Earth and Moon As Seen by 2001 Mars Odyssey's Thermal Emission Imaging System
NASA Technical Reports Server (NTRS)
2001-01-01
2001 Mars Odyssey's Thermal Emission Imaging System (THEMIS) took this portrait of the Earth and its companion Moon, using the infrared camera, one of two cameras in the instrument. It was taken at a distance of 3,563,735 kilometers (more than 2 million miles) on April 19, 2001 as the 2001 Mars Odyssey spacecraft left the Earth. From this distance and perspective the camera was able to acquire an image that directly shows the true distance from the Earth to the Moon. The Earth's diameter is about 12,750 km, and the distance from the Earth to the Moon is about 385,000 km, corresponding to 30 Earth diameters. The dark region seen on Earth in the infrared temperature image is the cold south pole, with a temperature of minus 50 degrees Celsius (minus 58 degrees Fahrenheit). The small bright region above it is warm Australia. This image was acquired using the 9.1 um infrared filter, one of nine filters that the instrument will use to map the mineral composition and temperature of the martian surface. From this great distance, each picture element (pixel) in the image corresponds to a region 900 by 900 kilometers or greater in size, or about the size of the state of Texas. Once Odyssey reaches Mars orbit each infrared pixel will cover a region only 100 by 100 meters on the surface, about the size of a major league baseball field.
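As a quick consistency check on the quoted pixel footprints (a back-of-the-envelope sketch; the ~400 km Mars mapping-orbit altitude used below is an assumption, not stated in the caption):

```python
# If a THEMIS IR pixel covers ~100 m from a ~400 km mapping orbit (assumed
# altitude), the instantaneous field of view is ~0.25 mrad per pixel. The
# footprint scales linearly with range, so from the Earth-Moon portrait
# distance it comes out close to the ~900 km quoted above.
ifov_rad = 100.0 / 400_000.0          # ~2.5e-4 rad per pixel (assumed orbit)
range_km = 3_563_735.0                # distance at which the portrait was taken
print(ifov_rad * range_km)            # ≈ 891 km per pixel
```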
Enhanced Video-Oculography System
NASA Technical Reports Server (NTRS)
Moore, Steven T.; MacDougall, Hamish G.
2009-01-01
A previously developed video-oculography system has been enhanced for use in measuring vestibulo-ocular reflexes of a human subject in a centrifuge, motor vehicle, or other setting. The system as previously developed included a lightweight digital video camera mounted on goggles. The left eye was illuminated by an infrared light-emitting diode via a dichroic mirror, and the camera captured images of the left eye in infrared light. To extract eye-movement data, the digitized video images were processed by software running in a laptop computer. Eye movements were calibrated by having the subject view a target pattern, fixed with respect to the subject's head, generated by a goggle-mounted laser with a diffraction grating. The system as enhanced includes a second camera for imaging the scene from the subject's perspective, and two inertial measurement units (IMUs) for measuring linear accelerations and rates of rotation for computing head movements. One IMU is mounted on the goggles, the other on the centrifuge or vehicle frame. All eye-movement and head-motion data are time-stamped. In addition, the subject's point of regard is superimposed on each scene image to enable analysis of patterns of gaze in real time.
Portable Long-Wavelength Infrared Camera for Civilian Application
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Krabach, T. N.; Bandara, S. V.; Liu, J. K.
1997-01-01
In this paper, we discuss the performance of this portable long-wavelength infrared camera in terms of quantum efficiency, NEΔT, minimum resolvable temperature difference (MRTD), uniformity, etc., and its applications in science, medicine and defense.
Infrared thermography for detection of laminar-turbulent transition in low-speed wind tunnel testing
NASA Astrophysics Data System (ADS)
Joseph, Liselle A.; Borgoltz, Aurelien; Devenport, William
2016-05-01
This work presents the details of a system for experimentally identifying laminar-to-turbulent transition using infrared thermography applied to large, metal models in low-speed wind tunnel tests. Key elements of the transition detection system include infrared cameras with sensitivity in the 7.5- to 14.0-µm spectral range and a thin, insulating coat for the model. The fidelity of the system was validated through experiments on two wind-turbine blade airfoil sections tested at Reynolds numbers between Re = 1.5 × 10⁶ and 3 × 10⁶. Results compare well with measurements from surface pressure distributions and stethoscope observations. However, the infrared-based system provides data over a much broader range of conditions and locations on the model. This paper chronicles the design, implementation and validation of the infrared transition detection system, a subject which has not been widely detailed in the literature to date.
Low-cost uncooled VOx infrared camera development
NASA Astrophysics Data System (ADS)
Li, Chuan; Han, C. J.; Skidmore, George D.; Cook, Grady; Kubala, Kenny; Bates, Robert; Temple, Dorota; Lannon, John; Hilton, Allan; Glukh, Konstantin; Hardy, Busbee
2013-06-01
The DRS Tamarisk® 320 camera, introduced in 2011, is a low cost commercial camera based on the 17 µm pixel pitch 320×240 VOx microbolometer technology. A higher resolution 17 µm pixel pitch 640×480 Tamarisk®640 has also been developed and is now in production serving the commercial markets. Recently, under the DARPA sponsored Low Cost Thermal Imager-Manufacturing (LCTI-M) program and internal project, DRS is leading a team of industrial experts from FiveFocal, RTI International and MEMSCAP to develop a small form factor uncooled infrared camera for the military and commercial markets. The objective of the DARPA LCTI-M program is to develop a low SWaP camera (<3.5 cm3 in volume and <500 mW in power consumption) that costs less than US $500 based on a 10,000 units per month production rate. To meet this challenge, DRS is developing several innovative technologies including a small pixel pitch 640×512 VOx uncooled detector, an advanced digital ROIC and low power miniature camera electronics. In addition, DRS and its partners are developing innovative manufacturing processes to reduce production cycle time and costs including wafer scale optic and vacuum packaging manufacturing and a 3-dimensional integrated camera assembly. This paper provides an overview of the DRS Tamarisk® project and LCTI-M related uncooled technology development activities. Highlights of recent progress and challenges will also be discussed. It should be noted that BAE Systems and Raytheon Vision Systems are also participants of the DARPA LCTI-M program.
Auto-converging stereo cameras for 3D robotic tele-operation
NASA Astrophysics Data System (ADS)
Edmondson, Richard; Aycock, Todd; Chenault, David
2012-06-01
Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow for operator control of the camera convergence angle adjustment. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an Automatic Convergence algorithm in a field programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than adjustment of the vision system. The autoconvergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor.
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-03-23
Recent developments in intelligence surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background cause detection to be a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works.
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-01-01
Recent developments in intelligence surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras have utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background cause detection to be a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works. PMID:29570690
NASA Astrophysics Data System (ADS)
Schimert, Thomas R.; Ratcliff, David D.; Brady, John F., III; Ropson, Steven J.; Gooch, Roland W.; Ritchey, Bobbi; McCardel, P.; Rachels, K.; Wand, Marty; Weinstein, M.; Wynn, John
1999-07-01
Low power and low cost are primary requirements for an imaging infrared camera used in unattended ground sensor arrays. In this paper, an amorphous silicon (a-Si) microbolometer-based uncooled infrared camera technology offering a low cost, low power solution to infrared surveillance for UGS applications is presented. A 15 X 31 micro infrared camera (MIRC) has been demonstrated which exhibits an f/1 noise equivalent temperature difference sensitivity of approximately 67 mK. This sensitivity has been achieved without the use of a thermoelectric cooler for array temperature stabilization, thereby significantly reducing the power requirements. The chopperless camera is capable of operating from snapshot mode (1 Hz) to video frame rate (30 Hz). Power consumption of 0.4 W without display and 0.75 W with display has been demonstrated at 30 Hz operation. The demonstrated 15 X 31 camera has a 35 mm camera form factor, employing a low-cost f/1 singlet optic and LED display, as well as low-cost vacuum packaging. A larger 120 X 160 version of the MIRC is also in development and will be discussed. The 120 X 160 MIRC exhibits a substantially smaller form factor and incorporates all the low cost, low power features demonstrated in the 15 X 31 MIRC prototype. In this paper, the a-Si microbolometer technology for the MIRC is presented, along with the camera's key features and performance parameters.
NASA Astrophysics Data System (ADS)
Sumriddetchkajorn, Sarun; Chaitavon, Kosom
2009-07-01
This paper introduces a parallel measurement approach for fast infrared-based human temperature screening suitable for use in a large public area. Our key idea is based on the combination of simple image processing algorithms, infrared technology, and human flow management. With this multidisciplinary concept, we arrange as many people as possible in a two-dimensional space in front of a thermal imaging camera and then highlight all human facial areas through simple image filtering, image morphological, and particle analysis processes. In this way, an individual's face in the live thermal image can be located and the maximum facial skin temperature can be monitored and displayed. Our experiment shows a measured 1 ms processing time in highlighting all human face areas. With a thermal imaging camera having an FOV lens of 24° × 18° and 320 × 240 active pixels, the maximum facial skin temperatures from three people's faces located at 1.3 m from the camera can also be simultaneously monitored and displayed at a measured rate of 31 fps, limited by the looping process in determining the coordinates of all faces. For our 3-day test under ambient temperatures of 24-30 °C, 57-72% relative humidity, and weak wind from outside the hospital building, hyperthermic patients could be identified with 100% sensitivity and 36.4% specificity when the temperature threshold level and the offset temperature value were appropriately chosen. Appropriately locating our system away from building doors, air conditioners and electric fans, in order to eliminate wind blowing toward the camera lens, can significantly improve the system's specificity.
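As a rough illustration of the face-highlighting chain described above (thresholding, morphological cleanup, particle/blob analysis, per-blob maximum temperature), here is a hedged OpenCV-style sketch; the thresholds, kernel size and minimum blob area are invented placeholders rather than the authors' parameters.

```python
import cv2
import numpy as np

def screen_frame(thermal_celsius, face_threshold_c=34.0, alarm_c=37.5):
    """thermal_celsius: 2-D array of per-pixel temperatures from the thermal camera."""
    mask = (thermal_celsius > face_threshold_c).astype(np.uint8)      # candidate skin pixels
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)              # remove speckle
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)             # fill small gaps
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)       # "particle" analysis
    alerts = []
    for i in range(1, n):                                              # label 0 is background
        if stats[i, cv2.CC_STAT_AREA] < 50:                            # ignore tiny blobs
            continue
        t_max = float(thermal_celsius[labels == i].max())              # max facial temperature
        alerts.append((i, t_max, t_max >= alarm_c))                    # (blob id, temp, alarm?)
    return alerts
```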
Robert, Clélia; Michau, Vincent; Fleury, Bruno; Magli, Serge; Vial, Laurent
2012-07-02
Adaptive optics provide real-time compensation for atmospheric turbulence. The correction quality relies on a key element: the wavefront sensor. We have designed an adaptive optics system in the mid-infrared range providing high spatial resolution for ground-to-air applications, integrating a Shack-Hartmann infrared wavefront sensor operating on an extended source. This paper describes and justifies the design of the infrared wavefront sensor, while defining and characterizing the Shack-Hartmann wavefront sensor camera. Performance and illustration of field tests are also reported.
Tsunoda, Koichi; Tsunoda, Atsunobu; Ishimoto, ShinnIchi; Kimura, Satoko
2006-01-01
Dedicated charge-coupled device (CCD) camera systems for endoscopes and electronic fiberscopes are in widespread use. However, both are usually stationary in an office or examination room, and a wheeled cart is needed for mobility. The total costs of the CCD camera system and electronic fiberscopy system are at least US Dollars 10,000 and US Dollars 30,000, respectively. Recently, the performance of audio and visual instruments has improved dramatically, with a concomitant reduction in their cost. Commercially available CCD video cameras with small monitors have become common. They provide excellent image quality and are much smaller and less expensive than previous models. The authors have developed adaptors for the popular mini-digital video (mini-DV) camera. The camera also provides video and acoustic output signals; therefore, the endoscopic images can be viewed on a large monitor simultaneously. The new system (a mini-DV video camera and an adaptor) costs only US Dollars 1,000. Therefore, the system is both cost-effective and useful for the outpatient clinic or casualty setting, or on house calls for the purpose of patient education. In the future, the authors plan to introduce the clinical application of a high-vision camera and an infrared camera as medical instruments for clinical and research situations.
Wen, Feng; Yu, Minzhong; Wu, Dezheng; Ma, Juanmei; Wu, Lezheng
2002-07-01
To observe the effect of indocyanine green angiography (ICGA) with infrared fundus camera on subsequent dark adaptation and the Ganzfeld electroretinogram (ERG), the ERGs of 38 eyes with different retinal diseases were recorded before and after ICGA during a 40-min dark adaptation period. ICGA was performed with Topcon 50IA retina camera. Ganzfeld ERG was recorded with Neuropack II evoked response recorder. The results showed that ICGA did not affect the latencies and the amplitudes in ERG of rod response, cone response and mixed maximum response (p>0.05). It suggests that ICGA using infrared fundus camera could be performed prior to the recording of the Ganzfeld ERG.
Color Infrared view of Houston, TX, USA
1991-09-18
This color infrared view of Houston (29.5N, 95.0W) was taken with a dual camera mount. Compare this scene with STS048-78-034 for an analysis of the unique properties of each film type. Comparative tests such as this aid in determining the kinds of information unique to each film system and in evaluating and comparing photography taken through hazy atmospheres. Infrared film is best at penetrating haze, detecting vegetation and producing a sharp image.
Linear array of photodiodes to track a human speaker for video recording
NASA Astrophysics Data System (ADS)
DeTone, D.; Neal, H.; Lougheed, R.
2012-12-01
Communication and collaboration using stored digital media have garnered increasing interest from many areas of business, government and education in recent years. This is due primarily to improvements in the quality of cameras and the speed of computers. An advantage of digital media is that it can serve as an effective alternative when physical interaction is not possible. Video recordings that allow viewers to discern a presenter's facial features, lips and hand motions are more effective than videos that do not. To attain this, one must maintain a video capture in which the speaker occupies a significant portion of the captured pixels. However, camera operators are costly, and often do an imperfect job of tracking presenters in unrehearsed situations. This creates motivation for a robust, automated system that directs a video camera to follow a presenter as he or she walks anywhere in the front of a lecture hall or large conference room. Such a system is presented. The system consists of a commercial, off-the-shelf pan/tilt/zoom (PTZ) color video camera, a necklace of infrared LEDs and a linear photodiode array detector. Electronic output from the photodiode array is processed to generate the location of the LED necklace, which is worn by a human speaker. The computer controls the video camera movements to record video of the speaker. The speaker's vertical position and depth are assumed to remain relatively constant; the video camera is sent only panning (horizontal) movement commands. The LED necklace is flashed at 70 Hz with a 50% duty cycle to provide noise-filtering capability. The benefit of using a photodiode array versus a standard video camera is its higher frame rate (4 kHz vs. 60 Hz). The higher frame rate allows for the filtering of infrared noise such as sunlight and indoor lighting, a capability absent from other tracking technologies. The system has been tested in a large lecture hall and is shown to be effective.
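The abstract does not give the detection algorithm in detail; one plausible way to exploit the 70 Hz, 50%-duty modulation for ambient-light rejection is to demodulate each photodiode channel at 70 Hz and take the strongest responder, as sketched below. The sample rate and array layout are assumptions, not the paper's specification.

```python
import numpy as np

FS = 4000.0          # assumed photodiode array sampling rate (Hz)
F_LED = 70.0         # necklace modulation frequency (Hz)

def locate_led(samples):
    """samples: (n_frames, n_pixels) raw photodiode readings over a short window.
    Returns the index of the pixel with the strongest 70 Hz component, which
    suppresses slowly varying infrared background such as sunlight and lamps."""
    n = samples.shape[0]
    t = np.arange(n) / FS
    ref = np.exp(-2j * np.pi * F_LED * t)                        # complex reference at 70 Hz
    response = np.abs(ref @ (samples - samples.mean(axis=0)))    # per-pixel 70 Hz magnitude
    return int(np.argmax(response))
```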
Automatic forest-fire measuring using ground stations and Unmanned Aerial Systems.
Martínez-de Dios, José Ramiro; Merino, Luis; Caballero, Fernando; Ollero, Anibal
2011-01-01
This paper presents a novel system for automatic forest-fire measurement using cameras distributed at ground stations and mounted on Unmanned Aerial Systems (UAS). It can obtain geometrical measurements of forest fires in real-time such as the location and shape of the fire front, flame height and rate of spread, among others. Measurement of forest fires is a challenging problem that is affected by numerous potential sources of error. The proposed system addresses them by exploiting the complementarities between infrared and visual cameras located at different ground locations together with others onboard Unmanned Aerial Systems (UAS). The system applies image processing and geo-location techniques to obtain forest-fire measurements individually from each camera and then integrates the results from all the cameras using statistical data fusion techniques. The proposed system has been extensively tested and validated in close-to-operational conditions in field fire experiments with controlled safety conditions carried out in Portugal and Spain from 2001 to 2006.
2001 Mars Odyssey Images Earth (Visible and Infrared)
NASA Technical Reports Server (NTRS)
2001-01-01
2001 Mars Odyssey's Thermal Emission Imaging System (THEMIS) acquired these images of the Earth using its visible and infrared cameras as it left the Earth. The visible image shows the thin crescent viewed from Odyssey's perspective. The infrared image was acquired at exactly the same time, but shows the entire Earth using the infrared's 'night-vision' capability. In visible light the instrument sees only reflected sunlight and therefore sees nothing on the night side of the planet. In infrared light the camera observes the light emitted by all regions of the Earth. The coldest ground temperatures seen correspond to the nighttime regions of Antarctica; the warmest temperatures occur in Australia. The low temperature in Antarctica is minus 50 degrees Celsius (minus 58 degrees Fahrenheit); the high temperature at night in Australia is 9 degrees Celsius (48.2 degrees Fahrenheit). These temperatures agree remarkably well with observed temperatures of minus 63 degrees Celsius at Vostok Station in Antarctica, and 10 degrees Celsius in Australia. The images were taken at a distance of 3,563,735 kilometers (more than 2 million miles) on April 19, 2001 as the Odyssey spacecraft left Earth.
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-08-31
Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Based on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can be different, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
A user-friendly technical set-up for infrared photography of forensic findings.
Rost, Thomas; Kalberer, Nicole; Scheurer, Eva
2017-09-01
Infrared photography is interesting for a use in forensic science and forensic medicine since it reveals findings that normally are almost invisible to the human eye. Originally, infrared photography has been made possible by the placement of an infrared light transmission filter screwed in front of the camera objective lens. However, this set-up is associated with many drawbacks such as the loss of the autofocus function, the need of an external infrared source, and long exposure times which make the use of a tripod necessary. These limitations prevented up to now the routine application of infrared photography in forensics. In this study the use of a professional modification inside the digital camera body was evaluated regarding camera handling and image quality. This permanent modification consisted of the replacement of the in-built infrared blocking filter by an infrared transmission filter of 700nm and 830nm, respectively. The application of this camera set-up for the photo-documentation of forensically relevant post-mortem findings was investigated in examples of trace evidence such as gunshot residues on the skin, in external findings, e.g. hematomas, as well as in an exemplary internal finding, i.e., Wischnewski spots in a putrefied stomach. The application of scattered light created by indirect flashlight yielded a more uniform illumination of the object, and the use of the 700nm filter resulted in better pictures than the 830nm filter. Compared to pictures taken under visible light, infrared photographs generally yielded better contrast. This allowed for discerning more details and revealed findings which were not visible otherwise, such as imprints on a fabric and tattoos in mummified skin. The permanent modification of a digital camera by building in a 700nm infrared transmission filter resulted in a user-friendly and efficient set-up which qualified for the use in daily forensic routine. Main advantages were a clear picture in the viewfinder, an auto-focus usable over the whole range of infrared light, and the possibility of using short shutter speeds which allows taking infrared pictures free-hand. The proposed set-up with a modification of the camera allows a user-friendly application of infrared photography in post-mortem settings. Copyright © 2017 Elsevier B.V. All rights reserved.
Real Time Eye Tracking and Hand Tracking Using Regular Video Cameras for Human Computer Interaction
2011-01-01
...understand us. More specifically, the computer should be able to infer what we wish to see, do, and interact with through our movements, gestures, and... in depth freedom. Our system differs from the majority of other systems in that we do not use infrared, stereo cameras, specially constructed...
Volcanic Cloud and Aerosol Monitor (VOLCAM) for Deep Space Gateway
NASA Astrophysics Data System (ADS)
Krotkov, N.; Bhartia, P. K.; Torres, O.; Li, C.; Sander, S.; Realmuto, V.; Carn, S.; Herman, J.
2018-02-01
We propose complementary ultraviolet (UV) and thermal Infrared (TIR) filter cameras for a dual-purpose whole Earth imaging with complementary natural hazards applications and Earth system science goals.
Non Contacting Evaluation of Strains and Cracking Using Optical and Infrared Imaging Techniques
1988-08-22
Compatible Zenith Z-386 microcomputer with plotter. II. 3-D Motion Measuring System: 1. Complete OPTOTRAK three-dimensional digitizing system; system includes... acquisition unit with 16 single-ended analog input channels; 3. Data Analysis Package software (KINEPLOT); 4. Extra OPTOTRAK camera (max. 224 per system)...
Augmented reality in laser laboratories
NASA Astrophysics Data System (ADS)
Quercioli, Franco
2018-05-01
Laser safety glasses block visibility of the laser light. This is a big nuisance when a clear view of the beam path is required. A headset made up of a smartphone and a viewer can overcome this problem. The user looks at the image of the real world on the cellphone display, captured by its rear camera. An unimpeded and safe sight of the laser beam is then achieved. If the infrared blocking filter of the smartphone camera is removed, the spectral sensitivity of the CMOS image sensor extends in the near infrared region up to 1100 nm. This substantial improvement widens the usability of the device to many laser systems for industrial and medical applications, which are located in this spectral region. The paper describes this modification of a phone camera to extend its sensitivity beyond the visible and make a true augmented reality laser viewer.
TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope
NASA Astrophysics Data System (ADS)
Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.
Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512 × 512 pixel InSb Aladdin III Quadrant focal plane array (FPA) having sensitivity to photons in the 1-5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on the calibration observations carried out during 2017 May 11-14 and 2017 October 7-31. After the preliminary characterization, the camera has been released to the Indian and Belgian astronomical community for science observations since 2017 May. The camera offers a field-of-view (FoV) of ~86.5″ × 86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands comparable to other 4 m class telescopes available world-wide. Another highlight of this camera is the observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λcen ~ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.
Mobile viewer system for virtual 3D space using infrared LED point markers and camera
NASA Astrophysics Data System (ADS)
Sakamoto, Kunio; Taneji, Shoto
2006-09-01
The authors have developed a 3D workspace system using collaborative imaging devices. A stereoscopic display enables this system to project 3D information. In this paper, we describe the position detecting system for a see-through 3D viewer. A 3D display system is a useful technology for virtual reality, mixed reality and augmented reality, and we have been researching spatial imaging and interaction systems. We have previously proposed 3D displays using a slit as a parallax barrier, a lenticular screen and holographic optical elements (HOEs) for displaying active images 1)2)3)4). The purpose of this paper is to propose an interactive system using these 3D imaging technologies. The observer can view virtual images in the real world when watching the screen of a see-through 3D viewer. The goal of our research is to build a display system that works as follows: when users see the real world through the mobile viewer, the display system presents virtual 3D images floating in the air, and the observers can touch and interact with these floating images much as children shape modeling clay. The key technologies of this system are the position recognition system and the spatial imaging display. The 3D images are presented by an improved parallax barrier 3D display. Here the authors discuss the measuring method for the mobile viewer using infrared LED point markers and a camera in the 3D workspace (augmented reality world). The authors show the geometric analysis of the proposed measuring method, which is the simplest method using a single camera rather than a stereo camera, and the results of our viewer system.
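The paper's own geometric analysis is not reproduced here; as a hedged illustration of single-camera pose recovery from known infrared LED marker positions, a standard PnP solve can be used, as in the sketch below. The marker layout, camera matrix and detected centroids are invented placeholder values, not data from the paper.

```python
import cv2
import numpy as np

# Recover the viewer's pose from IR LED point markers seen by one calibrated camera.
object_points = np.array([[0, 0, 0], [0.1, 0, 0], [0.1, 0.1, 0], [0, 0.1, 0]],
                         dtype=np.float64)           # known 3D LED positions (m), assumed layout
image_points = np.array([[320, 240], [400, 238], [402, 318], [322, 320]],
                        dtype=np.float64)            # detected LED centroids (px), assumed values
camera_matrix = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)
dist_coeffs = np.zeros(5)                            # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    print("viewer rotation (Rodrigues):", rvec.ravel(), "translation (m):", tvec.ravel())
```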
MIRIADS: miniature infrared imaging applications development system description and operation
NASA Astrophysics Data System (ADS)
Baxter, Christopher R.; Massie, Mark A.; McCarley, Paul L.; Couture, Michael E.
2001-10-01
A cooperative effort between the U.S. Air Force Research Laboratory, Nova Research, Inc., the Raytheon Infrared Operations (RIO) and Optics 1, Inc. has successfully produced a miniature infrared camera system that offers significant real-time signal and image processing capabilities by virtue of its modular design. This paper will present an operational overview of the system as well as results from initial testing of the 'Modular Infrared Imaging Applications Development System' (MIRIADS) configured as a missile early-warning detection system. The MIRIADS device can operate virtually any infrared focal plane array (FPA) that currently exists. Programmable on-board logic applies user-defined processing functions to the real-time digital image data for a variety of functions. Daughterboards may be plugged onto the system to expand the digital and analog processing capabilities of the system. A unique full hemispherical infrared fisheye optical system designed and produced by Optics 1, Inc. is utilized by the MIRIADS in a missile warning application to demonstrate the flexibility of the overall system to be applied to a variety of current and future AFRL missions.
TIRCAM2: The TIFR near infrared imaging camera
NASA Astrophysics Data System (ADS)
Naik, M. B.; Ojha, D. K.; Ghosh, S. K.; Poojary, S. S.; Jadhav, R. B.; Meshram, G. S.; Sandimani, P. R.; Bhagat, S. B.; D'Costa, S. L. A.; Gharat, S. M.; Bakalkar, C. B.; Ninan, J. P.; Joshi, J. S.
2012-12-01
TIRCAM2 (TIFR near infrared imaging camera - II) is a closed-cycle cooled imager that has been developed by the Infrared Astronomy Group at the Tata Institute of Fundamental Research for observations in the near infrared band of 1 to 3.7 μm with existing Indian telescopes. In this paper, we describe some of the technical details of TIRCAM2 and report its observing capabilities, measured performance and limiting magnitudes with the 2-m IUCAA Girawali telescope and the 1.2-m PRL Gurushikhar telescope. The main highlight is the camera's capability of observing in the nbL (3.59 μm) band, enabling our primary motivation of mapping polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm.
Calibration and verification of thermographic cameras for geometric measurements
NASA Astrophysics Data System (ADS)
Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.
2011-03-01
Infrared thermography is a technique with an increasing degree of development and a growing range of applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire both temperature and geometric information, although calibration and verification procedures are usual only for the thermal data; black bodies are used for these purposes. Moreover, the geometric information is important for many fields such as architecture, civil engineering and industry. This work presents a calibration procedure that enables photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of companies. A grid based on burning lamps is used for the geometric calibration of the thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible; reflectivity was chosen as the material property because both thermographic and visual cameras are able to detect it. Two thermographic cameras from the manufacturers Flir and Nec, and one visible camera from Jai, are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm in all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between the thermographic cameras shows slightly better results for the Nec camera.
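A hedged sketch of the interior-orientation step described above is given below: a thermographic camera is calibrated from frames of a regular grid of hot targets, treated like a planar calibration pattern. The grid geometry, the hot-spot detector and the requirement that detected centres come back in grid order are simplifying assumptions, not the authors' procedure.

```python
import numpy as np
import cv2

def detect_hot_spots(thermal_frame, min_area=5):
    """Centroids of bright (hot) blobs in an 8-bit thermal frame."""
    _, mask = cv2.threshold(thermal_frame, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n_labels, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    keep = stats[1:, cv2.CC_STAT_AREA] >= min_area      # label 0 is the background
    return centroids[1:][keep]

def calibrate_thermal_camera(thermal_frames, grid_cols=7, grid_rows=5, spacing_m=0.10):
    """Interior orientation from frames showing a regular grid of hot lamps.

    Assumes the detected centres come back in the same row-major order as the
    grid points; a real pipeline must sort/match them explicitly.
    """
    grid = np.zeros((grid_rows * grid_cols, 3), np.float32)
    grid[:, :2] = np.mgrid[0:grid_cols, 0:grid_rows].T.reshape(-1, 2) * spacing_m

    obj_pts, img_pts = [], []
    for frame in thermal_frames:
        centres = detect_hot_spots(frame)
        if len(centres) == grid_rows * grid_cols:        # keep only complete views
            obj_pts.append(grid)
            img_pts.append(centres.astype(np.float32))

    h, w = thermal_frames[0].shape[:2]
    # Returns (rms reprojection error, camera matrix, distortion, rvecs, tvecs).
    return cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)
```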
The Near-Earth Object Camera: A Next-Generation Minor Planet Survey
NASA Astrophysics Data System (ADS)
Mainzer, Amy K.; Wright, Edward L.; Bauer, James; Grav, Tommy; Cutri, Roc M.; Masiero, Joseph; Nugent, Carolyn R.
2015-11-01
The Near-Earth Object Camera (NEOCam) is a next-generation asteroid and comet survey designed to discover, characterize, and track large numbers of minor planets using a 50 cm infrared telescope located at the Sun-Earth L1 Lagrange point. Proposed to NASA's Discovery program, NEOCam is designed to carry out a comprehensive inventory of the small bodies in the inner regions of our solar system. It addresses three themes: 1) quantify the potential hazard that near-Earth objects may pose to Earth; 2) study the origins and evolution of our solar system as revealed by its small body populations; and 3) identify the best destinations for future robotic and human exploration. With a dual-channel infrared imager that observes in the 4-5 and 6-10 micron bands simultaneously through the use of a beamsplitter, NEOCam enables measurements of asteroid diameters and thermal inertia. NEOCam complements existing and planned visible light surveys in terms of orbital element phase space and wavelengths, since albedos can be determined for objects with both visible and infrared flux measurements. NEOCam was awarded technology development funding in 2011 to mature the necessary megapixel infrared detectors.
Enhancement of High-Speed Infrared Array Electronics (Center Director's Discretionary Fund)
NASA Technical Reports Server (NTRS)
Sutherland, W. T.
1996-01-01
A state-of-the-art infrared detector was to be used as the sensor in a new spectrometer-camera for astronomical observations. The sensitivity of the detector required the use of low-noise, high-speed electronics in the system design. The key component of the electronic system was the pre-amplifier that amplified the low-voltage signal coming from the detector. The system design was based on the selection of this amplifier, which was in turn driven by the maximum noise level that would still yield the desired sensitivity for the telescope system.
Method and apparatus for coherent imaging of infrared energy
Hutchinson, D.P.
1998-05-12
A coherent camera system performs ranging, spectroscopy, and thermal imaging. Local oscillator radiation is combined with target scene radiation to enable heterodyne detection by the coherent camera's two-dimensional photodetector array. Versatility enables deployment of the system in either a passive mode (where no laser energy is actively transmitted toward the target scene) or an active mode (where a transmitting laser is used to actively illuminate the target scene). The two-dimensional photodetector array eliminates the need to mechanically scan the detector. Each element of the photodetector array produces an intermediate frequency signal that is amplified, filtered, and rectified by the coherent camera's integrated circuitry. By spectroscopic examination of the frequency components of each pixel of the detector array, a high-resolution, three-dimensional or holographic image of the target scene is produced for applications such as air pollution studies, atmospheric disturbance monitoring, and military weapons targeting. 8 figs.
Synchronized Electronic Shutter System (SESS) for Thermal Nondestructive Evaluation
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.
2001-01-01
The purpose of this paper is to describe a new method for thermal nondestructive evaluation. This method uses a synchronized electronic shutter system (SESS) to remove the heat lamp's influence on the thermal data during and after flash heating. There are two main concerns when using flash heating. The first concern is during the flash when the photons are reflected back into the camera. This tends to saturate the detectors and potentially introduces unknown and uncorrectable errors when curve fitting the data to a model. To address this, an electronically controlled shutter was placed over the infrared camera lens. Before firing the flash lamps, the shutter is opened to acquire the necessary background data for offset calibration. During flash heating, the shutter is closed to prevent the photons from the high intensity flash from saturating the camera's detectors. The second concern is after the flash heating where the lamps radiate heat after firing. This residual cooling introduces an unwanted transient thermal response into the data. To remove this residual effect, a shutter was placed over the flash lamps to block the infrared heat radiating from the flash head after heating. This helped to remove the transient contribution of the flash. The flash lamp shutters were synchronized electronically with the camera shutter. Results are given comparing the use of the thermal inspection with and without the shutter system.
2002-03-07
STS-109 Astronaut Michael J. Massimino, mission specialist, perched on the Shuttle's robotic arm, is preparing to install the Electronic Support Module (ESM) in the aft shroud of the Hubble Space Telescope (HST), with the assistance of astronaut James H. Newman (out of frame). The module will support a new experimental cooling system to be installed during the next day's fifth and final space walk of the mission. That cooling system is designed to bring the telescope's Near Infrared Camera and Multi-Object Spectrometer (NICMOS) back to life; the instrument had been dormant since January 1999, when its original coolant ran out. The Space Shuttle Columbia STS-109 mission lifted off March 1, 2002 with goals of repairing and upgrading the Hubble Space Telescope (HST). The Marshall Space Flight Center in Huntsville, Alabama had the responsibility for the design, development, and construction of the HST, which is the most powerful and sophisticated telescope ever built. In addition to the installation of the experimental cooling system for NICMOS, STS-109 upgrades to the HST included replacement of the solar array panels, replacement of the power control unit (PCU), and replacement of the Faint Object Camera (FOC) with the new Advanced Camera for Surveys (ACS). Lasting 10 days, 22 hours, and 11 minutes, the STS-109 mission was the 108th flight overall in NASA's Space Shuttle Program.
Evolution of the SOFIA tracking control system
NASA Astrophysics Data System (ADS)
Fiebig, Norbert; Jakob, Holger; Pfüller, Enrico; Röser, Hans-Peter; Wiedemann, Manuel; Wolf, Jürgen
2014-07-01
The airborne observatory SOFIA (Stratospheric Observatory for Infrared Astronomy) is undergoing a modernization of its tracking system. This includes new, highly sensitive tracking cameras, control computers, filter wheels and other equipment, as well as a major redesign of the control software. The experiences along the migration path from an aged 19" VMEbus-based control system to modern industrial PCs, and from the VxWorks real-time operating system to embedded Linux and a state-of-the-art software architecture, are presented. Further, the concept of operating the new camera also as a scientific instrument, in parallel with tracking, is presented.
Observation of runaway electrons by infrared camera in J-TEXT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tong, R. H.; Chen, Z. Y., E-mail: zychen@hust.edu.cn; Zhang, M.
2016-11-15
When the energy of confined runaway electrons approaches several tens of MeV, the runaway electrons can emit synchrotron radiation in the infrared wavelength range. An infrared camera working in the 3-5 μm wavelength range has been developed to study the runaway electrons in the Joint Texas Experimental Tokamak (J-TEXT). The camera is located in the equatorial plane looking tangentially into the direction of electron approach. The runaway electron beam inside the plasma has been observed at the flattop phase. With the fast acquisition of the camera, the behavior of the runaway electron beam has been observed directly during the runaway current plateau following massive gas injection triggered disruptions.
InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications
NASA Technical Reports Server (NTRS)
Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon
1996-01-01
In 1994, Inframetrics introduced the InfraCAM(TM), a high resolution hand-held thermal imager. As the world's smallest, lightest and lowest power PtSi-based infrared camera, the InfraCAM is ideal for a wide range of industrial, nondestructive testing, surveillance and scientific applications. In addition to numerous commercial applications, the light weight and low power consumption of the InfraCAM make it extremely valuable for adaptation to spaceborne applications. Consequently, the InfraCAM has been selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) spaceborne experiment. In this experiment, a solid fuel is ignited in a low gravity environment. The combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground-based testing in preparation for space flight in the fall of 1995.
Field Characterization | Concentrating Solar Power | NREL
receivers for performance issues. It uses an infrared (IR) camera and a global positioning system (GPS) along each row of a parabolic trough plant, using the GPS data to automate IR imaging and analysis
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Matallah, Noura; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Jenouvrier, Pierre; Mallet, Eric; Reibel, Yann
2014-06-01
Today, both military and civilian applications require miniaturized optical systems in order to give an imaging capability to vehicles with small payload capacity. After the development of megapixel focal plane arrays (FPAs) with micro-sized pixels, this miniaturization will become feasible with the integration of optical functions in the detector area. In the field of cooled infrared imaging systems, the detector area is the Detector-Dewar-Cooler Assembly (DDCA). SOFRADIR and ONERA have launched a new research and innovation partnership, called OSMOSIS, to develop disruptive technologies for the DDCA in order to improve the performance and compactness of optronic systems. Through this collaboration, we will break down the technological barriers of the DDCA, a sealed and cooled environment dedicated to the infrared detectors, to explore Dewar-level integration of optics. This technological breakthrough will bring more compact multipurpose thermal imaging products, as well as new thermal capabilities such as 3D imagery or multispectral imagery. Previous developments will be recalled (the SOIE and FISBI cameras) and new developments will be presented. In particular, we will focus on a dual-band MWIR-LWIR camera and a multichannel camera.
Broadband image sensor array based on graphene-CMOS integration
NASA Astrophysics Data System (ADS)
Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank
2017-06-01
Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty of combining semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into the next generation of microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.
The Texas Thermal Interface: A real-time computer interface for an Inframetrics infrared camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Storek, D.J.; Gentle, K.W.
1996-03-01
The Texas Thermal Interface (TTI) offers an advantageous alternative to the conventional video path for computer analysis of infrared images from Inframetrics cameras. The TTI provides real-time computer data acquisition of 48 consecutive fields (version described here) with 8-bit pixels. The alternative requires time-consuming individual frame grabs from video tape with frequent loss of resolution in the D/A/D conversion. Within seconds after the event, the TTI temperature files may be viewed and processed to infer heat fluxes or other quantities as needed. The system cost is far less than commercial units which offer less capability. The system was developed for and is being used to measure heat fluxes to the plasma-facing components in a tokamak. © 1996 American Institute of Physics.
Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing
2015-01-01
This paper describes an airborne high resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible band images and near-infrared band images in cases lacking manmade objects, we presented an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high quality multispectral images and the band-to-band alignment error of the composed multiple spectral images is less than 2.5 pixels. PMID:26205264
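The registration step described above can be illustrated with a short OpenCV sketch: SIFT keypoints are matched between two bands, a homography is estimated robustly with RANSAC, and one band is warped onto the other. File names and thresholds are illustrative, and the paper's additional handling of NIR-to-visible registration in scenes without man-made objects is not reproduced here.

```python
import cv2
import numpy as np

vis = cv2.imread("band_green.png", cv2.IMREAD_GRAYSCALE)   # reference band
nir = cv2.imread("band_nir.png", cv2.IMREAD_GRAYSCALE)     # band to be registered

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(nir, None)
kp2, des2 = sift.detectAndCompute(vis, None)

# Match descriptors and keep only good matches via Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# Robustly estimate the band-to-band homography with RANSAC, then warp the NIR band.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
nir_registered = cv2.warpPerspective(nir, H, (vis.shape[1], vis.shape[0]))
```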
Gyrocopter-Based Remote Sensing Platform
NASA Astrophysics Data System (ADS)
Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.
2015-04-01
In this paper the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications utilizing a gyrocopter as a carrier platform is described. The current sensor configuration consists of a high resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover a custom-developed thermal imaging system composed of a VIS-PAN camera and a LWIR-camera is used for aerial recordings in the thermal infrared range. Furthermore another custom-developed highly flexible imaging system for high resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities and the precalculated demands with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched to mosaics.
NASA Astrophysics Data System (ADS)
Orton, Glenn S.; Yanamandra-Fisher, P. A.; Parrish, P. D.; Mousis, O.; Pantin, E.; Fuse, T.; Fujiyoshi, T.; Simon-Miller, A.; Morales-Juberias, R.; Tollestrup, E.; Connelley, M.; Trujillo, C.; Hora, J.; Irwin, P.; Fletcher, L.; Hill, D.; Kollmansberger, S.
2006-09-01
White Oval BA, constituted from three predecessor vortices (known as Jupiter's "classical" White Ovals) after successive mergers in 1998 and 2000, became the second-largest vortex in the atmosphere of Jupiter (and possibly the solar system) at the time of its formation. While it retains this distinction, it required a name change after a transformation between December 2005 and February 2006 that made it appear visually the same color as the Great Red Spot. Our campaign to understand the changes involved examination of the detailed color and wind field using Hubble Space Telescope instrumentation on several orbits in April. The temperature field, ammonia distribution and clouds were also examined using the mid-infrared VISIR camera/spectrometer on ESO's 8.2-m Very Large Telescope, the NASA Infrared Telescope Facility with the mid-infrared MIRSI instrument, and the refurbished near-infrared facility camera NSFCam2. High-resolution images of the Oval were made before the color change with the COMICS mid-infrared instrument on the 8.2-m Subaru telescope. We are using these images, together with images acquired at the IRTF and with the Gemini/North NIRI near-infrared camera between January 2005 and August 2006, to characterize the extent to which changes in storm strength (vorticity, positive vertical motion) influenced (i) the depth from which colored cloud particles may have been "dredged up" or (ii) the altitude to which particles may have been lofted and subjected to high-energy UV radiation that caused a color change, as alternative explanations for the phenomenon. The answer will provide clues to the chemistry of Jupiter's cloud system and its well-known colors in general. The behavior of Oval BA, and its interaction with the Great Red Spot in particular, is also being compared with dynamical models run with the EPIC code.
Remote sensing technologies are a class of instrument and sensor systems that include laser imageries, imaging spectrometers, and visible to thermal infrared cameras. These systems have been successfully used for gas phase chemical compound identification in a variety of field e...
High resolution multispectral photogrammetric imagery: enhancement, interpretation and evaluations
NASA Astrophysics Data System (ADS)
Roberts, Arthur; Haefele, Martin; Bostater, Charles; Becker, Thomas
2007-10-01
A variety of aerial mapping cameras were adapted and developed into simulated multiband digital photogrammetric mapping systems. Direct digital multispectral cameras, two multiband cameras (the IIS 4-band and the Itek 9-band) and paired mapping and reconnaissance cameras were evaluated for digital spectral performance and photogrammetric mapping accuracy in an aquatic environment. Aerial films (24 cm x 24 cm format) tested were: Agfa color negative and extended red (visible and near infrared) panchromatic; and Kodak color infrared and B&W (visible and near infrared) infrared. All films were negative processed to published standards and digitally converted at either 16 (color) or 10 (B&W) microns. Excellent precision in the digital conversions was obtained, with scanning errors of less than one micron. Radiometric data conversion was undertaken using linear density conversion and centered 8-bit histogram exposure. This resulted in multiple 8-bit spectral image bands that were unaltered (not radiometrically enhanced) "optical count" conversions of film density. This provided the best film density conversion to a digital product while retaining the original film density characteristics. Data covering water depth, water quality, surface roughness, and bottom substrate were acquired using different measurement techniques, as well as different techniques to locate sampling points on the imagery. Despite extensive efforts to obtain accurate ground truth data, location errors, measurement errors, and variations in the correlation between water depth and remotely sensed signal persisted. These errors must be considered endemic and may not be removed through even the most elaborate sampling set-up. Results indicate that multispectral photogrammetric systems offer improved feature mapping capability.
Nighttime Foreground Pedestrian Detection Based on Three-Dimensional Voxel Surface Model.
Li, Jing; Zhang, Fangbing; Wei, Lisong; Yang, Tao; Lu, Zhaoyang
2017-10-16
Pedestrian detection is among the most frequently-used preprocessing tasks in many surveillance application fields, from low-level people counting to high-level scene understanding. Even though many approaches perform well in the daytime with sufficient illumination, pedestrian detection at night is still a critical and challenging problem for video surveillance systems. To respond to this need, in this paper, we provide an affordable solution with a near-infrared stereo network camera, as well as a novel three-dimensional foreground pedestrian detection model. Specifically, instead of using an expensive thermal camera, we build a near-infrared stereo vision system with two calibrated network cameras and near-infrared lamps. The core of the system is a novel voxel surface model, which is able to estimate the dynamic changes of three-dimensional geometric information of the surveillance scene and to segment and locate foreground pedestrians in real time. A free update policy for unknown points is designed for model updating, and the extracted shadow of the pedestrian is adopted to remove foreground false alarms. To evaluate the performance of the proposed model, the system is deployed in several nighttime surveillance scenes. Experimental results demonstrate that our method is capable of nighttime pedestrian segmentation and detection in real time under heavy occlusion. In addition, the qualitative and quantitative comparison results show that our work outperforms classical background subtraction approaches and a recent RGB-D method, as well as achieving comparable performance with the state-of-the-art deep learning pedestrian detection method even with a much lower hardware cost.
The Nimbus 4 data catalog. Volume 8: Data orbits 5206-10,120, 1 May 1971 - 30 April 1972
NASA Technical Reports Server (NTRS)
1972-01-01
Data from various instruments onboard the Nimbus 4 are presented, including the image dissector camera system, the temperature-humidity infrared radiometer, infrared interferometer spectrometer, and monitor of ultraviolet solar energy experiments. This data was collected from 1 May 1971 to 30 Apr. 1972. Orbital elements and daily sensor data are presented in tabular form.
NASA Technical Reports Server (NTRS)
Mcree, Griffith J., Jr.; Roberts, A. Sidney, Jr.
1991-01-01
An experimental program aimed at identifying areas in low speed aerodynamic research where infrared imaging systems can make significant contributions is discussed. In a new technique, a long electrically heated wire was placed across a laminar flow. By measuring the temperature distribution along the wire with the IR imaging camera, the flow behavior was identified.
Near-Infrared Imaging for Detecting Caries and Structural Deformities in Teeth.
Angelino, Keith; Edlund, David A; Shah, Pratik
2017-01-01
2-D radiographs, while commonly used for evaluating sub-surface hard structures of teeth, have low sensitivity for early caries lesions, particularly those on tooth occlusal surfaces. Radiographs are also frequently refused by patients over safety concerns. Translucency of teeth in the near-infrared (NIR) range offers a non-ionizing and safe approach to detect dental caries. We report the construction of an NIR (850 nm) LED imaging system, comprised of an NIR source and an intraoral camera for rapid dental evaluations. The NIR system was used to image teeth of ten consenting human subjects and successfully detected secondary, amalgam-occluded and early caries lesions without supplementary image processing. The camera-wand system was also capable of revealing demineralized areas, deep and superficial cracks, and other clinical features of teeth usually visualized by X-rays. The NIR system's clinical utility, simplistic design, low cost, and user friendliness make it an effective dental caries screening technology in conjunction or in place of radiographs.
NASA Astrophysics Data System (ADS)
Laychak, M. B.
2008-06-01
In addition to the optical camera Megacam, the Canada-France-Hawaii Telescope operates a large-field infrared camera, Wircam, and a spectrograph/spectropolarimeter, Espadons. When these instruments were commissioned, the challenge arose to create educational outreach programmes incorporating the concepts of infrared astronomy and spectroscopy. We integrated spectroscopy into discussions of extrasolar planets and the search for life, two topics routinely requested by teachers for classroom talks. Making the infrared accessible to students provided a unique challenge, one that we met through the implementation and use of webcams modified for infrared use.
Model of an optical system's influence on sensitivity of microbolometric focal plane array
NASA Astrophysics Data System (ADS)
Gogler, Sławomir; Bieszczad, Grzegorz; Zarzycka, Alicja; Szymańska, Magdalena; Sosnowski, Tomasz
2012-10-01
Thermal imagers, and the infrared array sensors used in them, are subject to a calibration procedure and to evaluation of their voltage sensitivity to incident radiation during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represents temperature variations across the scene. The detectors used in a thermal camera are illuminated by infrared radiation transmitted through a specialized optical system, and each optical system influences the irradiance distribution across the sensor array. This article proposes a model describing the irradiance distribution across an array sensor working with the optical system used in the calibration set-up. The model takes into account the optical and geometrical properties of the array set-up. By means of Monte Carlo simulation, a large number of rays was traced to the sensor plane, which made it possible to determine the irradiance distribution across the image plane for different aperture-limiting configurations. The simulated results have been compared with the proposed analytical expression. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
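In the spirit of the ray-based model described above, the following minimal Monte-Carlo sketch estimates the relative irradiance across a detector array produced by a uniformly radiant circular aperture parallel to the focal plane; the geometry values are illustrative, and real-lens effects such as obscurations and vignetting are ignored.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 5000            # aperture sample points ("rays") traced to the FPA
aperture_radius = 10.0      # mm, radius of the limiting aperture
aperture_distance = 20.0    # mm, aperture-to-focal-plane distance
pitch = 0.017 * 4           # mm, coarse grid of 4x4-binned 17 um pixels
nx, ny = 80, 60             # binned array format used for this sketch

# Uniform samples over the aperture disk.
r = aperture_radius * np.sqrt(rng.random(n_samples))
phi = 2.0 * np.pi * rng.random(n_samples)
ax, ay = r * np.cos(phi), r * np.sin(phi)

# Pixel-centre coordinates, focal plane centred on the optical axis.
px = (np.arange(nx) - nx / 2 + 0.5) * pitch
py = (np.arange(ny) - ny / 2 + 0.5) * pitch
X, Y = np.meshgrid(px, py)

# Accumulate the per-sample geometric factor d^2 / r^4, i.e.
# cos(theta_pixel) * cos(theta_aperture) / distance^2 for two parallel planes.
irradiance = np.zeros((ny, nx))
for sx, sy in zip(ax, ay):
    r2 = (X - sx) ** 2 + (Y - sy) ** 2 + aperture_distance ** 2
    irradiance += aperture_distance ** 2 / (r2 * r2)

irradiance /= irradiance.max()   # relative irradiance map (1.0 near the optical axis)
print("corner-to-centre falloff:", irradiance[0, 0])
```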
ERIC Educational Resources Information Center
Jeppsson, Fredrik; Frejd, Johanna; Lundmark, Frida
2017-01-01
This study focuses on investigating how students make use of their bodily experiences in combination with infrared (IR) cameras, as a way to make meaning in learning about heat, temperature, and friction. A class of 20 primary students (age 7-8 years), divided into three groups, took part in three IR camera laboratory experiments. The qualitative…
NASA Astrophysics Data System (ADS)
Sáez-Cano, G.; Morales de los Ríos, J. A.; del Peral, L.; Neronov, A.; Wada, S.; Rodríguez Frías, M. D.
2015-03-01
The origin of cosmic rays has remained a mystery for more than a century. JEM-EUSO is a pioneering space-based telescope that will be located at the International Space Station (ISS); its aim is to detect Ultra High Energy Cosmic Rays (UHECR) and Extremely High Energy Cosmic Rays (EHECR) by observing the atmosphere. Unlike ground-based telescopes, JEM-EUSO will observe from above, and therefore, for proper UHECR reconstruction under cloudy conditions, a key element of JEM-EUSO is an Atmospheric Monitoring System (AMS). This AMS consists of a space-qualified bi-spectral infrared camera, which will provide the cloud coverage and cloud top height in the JEM-EUSO Field of View (FoV), and a LIDAR, which will measure the atmospheric optical depth along the direction in which it is fired. In this paper we explain the effects of clouds on the determination of the UHECR arrival direction. Moreover, since the cloud top height retrieval is crucial for analyzing UHECR and EHECR events under cloudy conditions, the retrieval algorithm that fulfills the technical requirements of the JEM-EUSO infrared camera for reconstructing the cloud top height is reported.
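As a deliberately simplified illustration (not the JEM-EUSO retrieval algorithm itself), a cloud-top height can be estimated from an infrared brightness temperature by assuming an opaque cloud, a known surface temperature and a constant lapse rate; all values below are illustrative.

```python
def cloud_top_height_km(bt_cloud_k, t_surface_k=288.0, lapse_rate_k_per_km=6.5):
    """Height (km) at which an assumed linear temperature profile matches the
    measured cloud-top brightness temperature (opaque cloud assumed)."""
    return max(0.0, (t_surface_k - bt_cloud_k) / lapse_rate_k_per_km)

# Example: a 249 K brightness temperature corresponds, under these assumptions,
# to a cloud top near 6 km.
print(cloud_top_height_km(249.0))   # -> 6.0
```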
Reflective all-sky thermal infrared cloud imager
DOE Office of Scientific and Technical Information (OSTI.GOV)
Redman, Brian J.; Shaw, Joseph A.; Nugent, Paul W.
A reflective all-sky imaging system has been built using a long-wave infrared microbolometer camera and a reflective metal sphere. This compact system was developed for measuring spatial and temporal patterns of clouds and their optical depth in support of applications including Earth-space optical communications. The camera is mounted to the side of the reflective sphere to leave the zenith sky unobstructed. The resulting geometric distortion is removed through an angular map derived from a combination of checkerboard-target imaging, geometric ray tracing, and sun-location-based alignment. A tape of high-emissivity material on the side of the reflector acts as a reference that is used to estimate and remove thermal emission from the metal sphere. In conclusion, once a bias that is under continuing study was removed, sky radiance measurements from the all-sky imager in the 8-14 μm wavelength range agreed to within 0.91 W/(m^2 sr) of measurements from a previously calibrated, lens-based infrared cloud imager over its 110° field of view.
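A minimal radiometric sketch of the emission-removal idea described above, assuming the sphere is opaque (emissivity = 1 - reflectance), that the high-emissivity tape radiates essentially as a blackbody at the sphere temperature, and that all quantities are band-integrated radiances; the reflectance and example radiances are illustrative, not values from the paper.

```python
def sky_radiance(measured, tape_radiance, reflectance=0.95):
    """Recover the sky radiance reflected by the metal sphere.

    measured       -- radiance observed off the sphere (sky reflection + self-emission)
    tape_radiance  -- radiance observed on the high-emissivity reference tape,
                      taken as an estimate of the sphere's blackbody emission
    reflectance    -- assumed band-averaged reflectance r of the metal sphere
    """
    emission = (1.0 - reflectance) * tape_radiance    # sphere self-emission term
    return (measured - emission) / reflectance        # L_meas = r*L_sky + (1-r)*B(T)

print(sky_radiance(measured=28.0, tape_radiance=40.0))   # example values, W/(m^2 sr)
```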
Preclinical Evaluation of Robotic-Assisted Sentinel Lymph Node Fluorescence Imaging
Liss, Michael A.; Farshchi-Heydari, Salman; Qin, Zhengtao; Hickey, Sean A.; Hall, David J.; Kane, Christopher J.; Vera, David R.
2015-01-01
An ideal substance to provide convenient and accurate targeting for sentinel lymph node (SLN) mapping during robotic-assisted surgery has yet to be found. We used an animal model to determine the ability of the FireFly camera system to detect fluorescent SLNs after administration of a dual-labeled molecular imaging agent. Methods: We injected the footpads of New Zealand White rabbits with 1.7 or 8.4 nmol of tilmanocept labeled with 99mTc and a near-infrared fluorophore, IRDye800CW. One and 36 h after injection, popliteal lymph nodes, representing the SLNs, were dissected with the assistance of the FireFly camera system, a fluorescence-capable endoscopic imaging system. After excision of the paraaortic lymph nodes, which represented non-SLNs, we assayed all lymph nodes for radioactivity and fluorescence intensity. Results: Fluorescence within all popliteal lymph nodes was easily detected by the FireFly camera system. Fluorescence within the lymph channel could be imaged during the 1-h studies. When compared with the paraaortic lymph nodes, the popliteal lymph nodes retain greater than 95% of the radioactivity at both 1 and 36 h after injection. At both doses (1.7 and 8.4 nmol), the popliteal nodes had higher (P < 0.050) optical fluorescence intensity than the paraaortic nodes at the 1- and 36-h time points. Conclusion: The FireFly camera system can easily detect tilmanocept labeled with a near-infrared fluorophore at least 36 h after administration. This ability will permit image acquisition and subsequent verification of fluorescence-labeled SLNs during robotic-assisted surgery. PMID:25024425
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokovikov, Mikhail, E-mail: sokovikov@icmm.ru; Chudinov, Vasiliy; Bilalov, Dmitry
2015-10-27
The behavior of specimens dynamically loaded during split Hopkinson (Kolsky) bar tests in a regime close to simple shear conditions was studied. The lateral surface of the specimens was investigated in-situ using a high-speed infrared camera, a CEDIP Silver 450M. The temperature field distribution obtained at different times allowed one to trace the evolution of plastic strain localization. The process of target perforation involving plug formation and ejection was examined using a high-speed infrared camera and a VISAR velocity measurement system. The microstructure of tested specimens was analyzed using an optical interferometer-profiler and a scanning electron microscope. The development of plastic shear instability regions has been simulated numerically.
Prototype of microbolometer thermal infrared camera for forest fire detection from space
NASA Astrophysics Data System (ADS)
Guerin, Francois; Dantes, Didier; Bouzou, Nathalie; Chorier, Philippe; Bouchardy, Anne-Marie; Rollin, Joël.
2017-11-01
The thermal infrared (TIR) camera contributes to the Earth-observation FUEGO mission by helping to discriminate clouds and smoke, to reject false forest-fire alarms, and to monitor forest fires. Consequently, the camera needs a large dynamic range of detectable radiances. Small volume, low mass and low power are required by the small FUEGO payload. These specifications can also be attractive for other, similar missions.
Report Of The HST Strategy Panel: A Strategy For Recovery
1991-01-01
orbit change out: the Wide Field/Planetary Camera II (WFPC II), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Space ... are the Space Telescope Imaging Spectrograph (STIS), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS), and the second Wide Field and ... expected to fail to lock due to duplicity was 20%; on-orbit data indicates that 10% may be a better estimate, but the guide stars were preselected
NASA Astrophysics Data System (ADS)
Gouverneur, B.; Verstockt, S.; Pauwels, E.; Han, J.; de Zeeuw, P. M.; Vermeiren, J.
2012-10-01
Various visible and infrared cameras have been tested for the early detection of wildfires to protect archeological treasures. This analysis was possible thanks to the EU Firesense project (FP7-244088). Although visible cameras are low cost and give good results for smoke detection during daytime, they fall short under bad visibility conditions. In order to improve the fire detection probability and reduce false alarms, several infrared bands ranging from the NIR to the LWIR are tested. The SWIR and LWIR bands are helpful for locating the fire through smoke when there is a direct line of sight. Emphasis is also put on physical and electro-optical system modeling for forest fire detection at short and longer ranges. Fusion of the three bands (visible, SWIR, LWIR) at the pixel level is discussed for image enhancement and for fire detection.
Real time capable infrared thermography for ASDEX Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sieglin, B., E-mail: Bernhard.Sieglin@ipp.mpg.de; Faitsch, M.; Herrmann, A.
2015-11-15
Infrared (IR) thermography is widely used in fusion research to study power exhaust and incident heat load onto the plasma facing components. Due to the short pulse duration of today's fusion experiments, IR systems have mostly been designed for off-line data analysis. For future long pulse devices (e.g., Wendelstein 7-X, ITER), a real time evaluation of the target temperature and heat flux is mandatory. This paper shows the development of a real time capable IR system for ASDEX Upgrade. A compact IR camera has been designed incorporating the necessary magnetic and electric shielding for the detector-cooler assembly. The camera communication is based on the Camera Link industry standard. The data acquisition hardware is based on National Instruments hardware, consisting of a PXIe chassis inside and a fibre-optically connected industry computer outside the torus hall. Image processing and data evaluation are performed using real time LabVIEW.
Calibration procedures of the Tore-Supra infrared endoscopes
NASA Astrophysics Data System (ADS)
Desgranges, C.; Jouve, M.; Balorin, C.; Reichle, R.; Firdaouss, M.; Lipa, M.; Chantant, M.; Gardarein, J. L.; Saille, A.; Loarer, T.
2018-01-01
Five endoscopes equipped with infrared cameras working in the medium infrared range (3-5 μm) are installed on the controlled thermonuclear fusion research device Tore-Supra. These endoscopes are intended to monitor the surface temperature of the plasma-facing components in order to prevent their overheating. The signals delivered by the infrared cameras through the endoscopes are analysed and used, on the one hand, in a real-time feedback control loop acting on the plasma heating systems to decrease the plasma-facing component surface temperatures when necessary and, on the other hand, for physics studies such as the determination of the incoming heat flux. To fulfil these two roles, very accurate knowledge of the absolute surface temperatures is mandatory. Consequently, the infrared endoscopes must be calibrated through a very careful procedure, which means determining their transmission coefficients, a delicate operation. Methods to calibrate the infrared endoscopes during the shutdown period of the Tore-Supra machine are presented. As these methods cannot track the possible evolution of the transmittances during operation, an in-situ method is also presented; it allows validation of the calibration performed in the laboratory as well as monitoring of the transmittance evolution during machine operation. This is made possible by the use of the endoscope shutter and a dedicated plasma scenario developed to heat it. Possible improvements of this method are briefly discussed.
Monitoring machining conditions by infrared images
NASA Astrophysics Data System (ADS)
Borelli, Joao E.; Gonzaga Trabasso, Luis; Gonzaga, Adilson; Coelho, Reginaldo T.
2001-03-01
During the machining process, knowledge of the temperature is the most important factor in tool analysis. It makes it possible to control the main factors that influence tool use, lifetime and waste. The temperature in the contact area between the workpiece and the tool results from the material removal in the cutting operation, and it is difficult to obtain because the tool and the workpiece are in motion. One way to measure the temperature in this situation is to detect the infrared radiation. This work presents a new methodology for diagnosis and monitoring of machining processes with the use of infrared images. The infrared image provides a map in gray tones of the elements in the process: tool, workpiece and chips. Each gray tone in the image corresponds to a certain temperature for each of those materials, and the relationship between gray tones and temperature is obtained from a prior calibration of the infrared camera. The system developed in this work uses an infrared camera, a frame grabber board and software composed of three modules. The first module performs image acquisition and processing. The second module extracts image features and builds the feature vector. Finally, the third module uses fuzzy logic to evaluate the feature vector and supplies the tool-state diagnosis as output.
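A hedged sketch of the gray-tone-to-temperature step described above: a monotonic calibration curve measured beforehand with the infrared camera is inverted by interpolation so that each gray tone maps to a temperature. The calibration pairs below are illustrative, not the authors' calibration data.

```python
import numpy as np

# Calibration pairs (gray level, temperature in deg C) from a prior camera calibration.
cal_gray = np.array([30, 80, 130, 180, 230], dtype=float)
cal_temp = np.array([100.0, 250.0, 400.0, 550.0, 700.0])

def gray_to_temperature(ir_image_gray):
    """Map an array of gray tones to temperatures by linear interpolation."""
    return np.interp(ir_image_gray.astype(float), cal_gray, cal_temp)

frame = np.array([[45, 200], [130, 90]], dtype=np.uint8)   # toy 2x2 "image"
print(gray_to_temperature(frame))
```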
Single Pixel Black Phosphorus Photodetector for Near-Infrared Imaging.
Miao, Jinshui; Song, Bo; Xu, Zhihao; Cai, Le; Zhang, Suoming; Dong, Lixin; Wang, Chuan
2018-01-01
Infrared imaging systems have a wide range of military and civil applications, and 2D nanomaterials have recently emerged as potential sensing materials that may outperform conventional ones such as HgCdTe, InGaAs, and InSb. As an example, 2D black phosphorus (BP) thin film has a thickness-dependent direct bandgap with low shot noise and noncryogenic operation for visible to mid-infrared photodetection. In this paper, the use of a single-pixel photodetector made with few-layer BP thin film for near-infrared imaging applications is demonstrated. The imaging is achieved by combining the photodetector with a digital micromirror device to encode and subsequently reconstruct the image based on a compressive sensing algorithm. Stationary images of a near-infrared laser spot (λ = 830 nm) with up to 64 × 64 pixels are captured using this single-pixel BP camera with 2,000 measurements, which is only half of the total number of pixels. The imaging platform demonstrated in this work circumvents the grand challenge of scalable BP material growth for photodetector array fabrication and shows the efficacy of utilizing the outstanding performance of BP photodetectors for future high-speed infrared camera applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
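The single-pixel measurement-and-reconstruction idea can be sketched as follows, under simplifying assumptions: random 0/1 mask patterns stand in for the digital micromirror device, a small 16 x 16 scene stands in for the laser spot, and a basic ISTA solver with a DCT sparsity prior stands in for the paper's compressive-sensing reconstruction.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(1)
n = 16                       # toy 16 x 16 scene (the paper images up to 64 x 64)
N, M = n * n, (n * n) // 2   # number of pixels and number of measurements (half)

# Toy scene: a bright "laser spot" on a dark background.
scene = np.zeros((n, n))
scene[5:8, 9:12] = 1.0
x_true = scene.ravel()

# Random 0/1 mask patterns (stand-in for DMD patterns) and measurements y = A x.
A = rng.integers(0, 2, size=(M, N)).astype(float)
y = A @ x_true

# ISTA: find DCT-sparse coefficients c (x = IDCT(c)) minimizing
# 0.5 * ||A x - y||^2 + lam * ||c||_1.
def idct_img(c):
    return idctn(c.reshape(n, n), norm="ortho").ravel()

def dct_img(x):
    return dctn(x.reshape(n, n), norm="ortho").ravel()

lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant of the gradient
c = np.zeros(N)
for _ in range(300):
    grad = dct_img(A.T @ (A @ idct_img(c) - y))
    c = c - step * grad
    c = np.sign(c) * np.maximum(np.abs(c) - step * lam, 0.0)   # soft threshold

x_rec = idct_img(c).reshape(n, n)          # reconstructed 16 x 16 image
print("reconstruction error:", np.linalg.norm(x_rec - scene))
```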
Pattern recognition applied to infrared images for early alerts in fog
NASA Astrophysics Data System (ADS)
Boucher, Vincent; Marchetti, Mario; Dumoulin, Jean; Cord, Aurélien
2014-09-01
Fog conditions are the cause of severe car accidents in western countries because of the poor visibility they induce. Their occurrence and intensity are still very difficult for weather services to forecast. Infrared cameras make it possible to detect and identify objects in fog when visibility is too low for detection by eye. Over the past years, the implementation of cost-effective infrared cameras on some vehicles has enabled such detection. Pattern recognition algorithms based on Canny filters and the Hough transform are, on the other hand, a common tool applied to images. Based on these facts, a joint research program between IFSTTAR and Cerema has been developed to study the benefit of infrared images obtained in a fog tunnel during its natural dissipation. Pattern recognition algorithms have been applied, specifically to road signs, whose shape is usually associated with a specific meaning (circular for a speed limit, triangular for a warning, ...). It has been shown that road signs were detected early enough in the infrared images, with respect to images in the visible spectrum, to trigger useful alerts for Advanced Driver Assistance Systems.
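A minimal OpenCV sketch in the spirit of the pipeline described above: a circular Hough transform (which applies a Canny edge detector internally) picks out circular sign candidates in an infrared frame. The file name and thresholds are illustrative and would need tuning for real fog-tunnel imagery.

```python
import cv2
import numpy as np

frame = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)   # one infrared frame
blurred = cv2.GaussianBlur(frame, (5, 5), 1.5)

# Circular Hough transform; param1 is the upper threshold of the Canny edge
# detector applied internally, so the pipeline is Canny edges + Hough voting.
circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=30,
                           param1=120, param2=40, minRadius=8, maxRadius=60)
if circles is not None:
    for x, y, r in np.around(circles[0]).astype(int):
        cv2.circle(frame, (x, y), r, 255, 2)               # mark candidate circular signs
cv2.imwrite("ir_frame_detections.png", frame)
```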
Cheng, Victor S; Bai, Jinfen; Chen, Yazhu
2009-11-01
As the needs for various kinds of body surface information are wide-ranging, we developed an imaging-sensor integrated system that can synchronously acquire high-resolution three-dimensional (3D) far-infrared (FIR) thermal and true-color images of the body surface. The proposed system integrates one FIR camera and one color camera with a 3D structured light binocular profilometer. To eliminate the emotion disturbance of the inspector caused by the intensive light projection directly into the eye from the LCD projector, we have developed a gray encoding strategy based on the optimum fringe projection layout. A self-heated checkerboard has been employed to perform the calibration of different types of cameras. Then, we have calibrated the structured light emitted by the LCD projector, which is based on the stereo-vision idea and the least-squares quadric surface-fitting algorithm. Afterwards, the precise 3D surface can fuse with undistorted thermal and color images. To enhance medical applications, the region-of-interest (ROI) in the temperature or color image representing the surface area of clinical interest can be located in the corresponding position in the other images through coordinate system transformation. System evaluation demonstrated a mapping error between FIR and visual images of three pixels or less. Experiments show that this work is significantly useful in certain disease diagnoses.
The spacecraft control laboratory experiment optical attitude measurement system
NASA Technical Reports Server (NTRS)
Welch, Sharon S.; Montgomery, Raymond C.; Barsky, Michael F.
1991-01-01
A stereo camera tracking system was developed to provide a near real-time measure of the position and attitude of the Spacecraft Control Laboratory Experiment (SCOLE). The SCOLE is a mockup of a shuttle-like vehicle with an attached flexible mast and (simulated) antenna, and was designed to provide a laboratory environment for the verification and testing of control laws for large flexible spacecraft. Actuators and sensors located on the shuttle and antenna sense the states of the spacecraft and allow the position and attitude to be controlled. The stereo camera tracking system which was developed consists of two position-sensitive detector cameras which sense the locations of small infrared LEDs attached to the surface of the shuttle. Information on shuttle position and attitude is provided in six degrees of freedom. The design, calibration, and tracking algorithm of this optical system are described. The performance of the system is evaluated for yaw only.
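The stereo measurement idea can be illustrated by triangulating the 3D position of a single infrared LED from its pixel coordinates in two calibrated cameras; the projection matrices, baseline and pixel coordinates below are illustrative values, not SCOLE calibration data.

```python
import numpy as np
import cv2

# 3x4 projection matrices P = K [R | t] of the two position-sensitive cameras.
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                  # left camera at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.3], [0.0], [0.0]])])  # right camera, 0.3 m baseline

# Pixel coordinates of one LED in the left and right images (2x1 column vectors).
uv_left = np.array([[350.0], [260.0]])
uv_right = np.array([[310.0], [260.0]])

X_h = cv2.triangulatePoints(P1, P2, uv_left, uv_right)   # homogeneous 4x1 result
X = (X_h[:3] / X_h[3]).ravel()                            # LED position in metres
print("LED position:", X)                                 # ~ [0.225, 0.15, 5.25]
```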
Zhu, Banghe; Rasmussen, John C.; Litorja, Maritoni
2017-01-01
To date, no emerging preclinical or clinical near-infrared fluorescence (NIRF) imaging devices for non-invasive and/or surgical guidance have their performances validated on working standards with SI units of radiance that enable comparison or quantitative quality assurance. In this work, we developed and deployed a methodology to calibrate a stable, solid phantom for emission radiance with units of mW·sr^-1·cm^-2 for use in characterizing the measurement sensitivity of ICCD and IsCMOS detection, signal-to-noise ratio, and contrast. In addition, at calibrated radiances, we assess the transverse and lateral resolution of the ICCD and IsCMOS camera systems. The methodology allowed determination of the superior SNR of the ICCD over the IsCMOS camera system and the superior resolution of the IsCMOS over the ICCD camera system. Contrast depended upon the camera settings (binning and integration time) and the gain of the intensifier. Finally, because the architectures of CMOS and CCD camera systems result in vastly different performance, we comment on the utility of these systems for small animal imaging as well as clinical applications for non-invasive and surgical guidance. PMID:26552078
NASA Astrophysics Data System (ADS)
Jantzen, Connie; Slagle, Rick
1997-05-01
The distinction between exposure time and sample rate is often the first point raised in any discussion of high speed imaging. Many high speed events require exposure times considerably shorter than those that can be achieved solely by the sample rate of the camera, where exposure time equals 1/sample rate. Gating, a method of achieving short exposure times in digital cameras, is often difficult to achieve for exposure time requirements shorter than 100 microseconds. This paper discusses the advantages and limitations of using the short duration light pulse of a near infrared laser with high speed digital imaging systems. By closely matching the output wavelength of the pulsed laser to the peak near infrared response of current sensors, high speed image capture can be accomplished at very low (visible) light levels of illumination. By virtue of the short duration light pulse, adjustable to as short as two microseconds, image capture of very high speed events can be achieved at relatively low sample rates of less than 100 pictures per second, without image blur. For our initial investigations, we chose a ballistic subject. The results of early experimentation revealed the limitations of applying traditional ballistic imaging methods when using a pulsed infrared lightsource with a digital imaging system. These early disappointing results clarified the need to further identify the unique system characteristics of the digital imager and pulsed infrared combination. It was also necessary to investigate how the infrared reflectance and transmittance of common materials affects the imaging process. This experimental work yielded a surprising, successful methodology which will prove useful in imaging ballistic and weapons tests, as well as forensics, flow visualizations, spray pattern analyses, and nocturnal animal behavioral studies.
Low-cost camera modifications and methodologies for very-high-resolution digital images
USDA-ARS?s Scientific Manuscript database
Aerial color and color-infrared photography are usually acquired at high altitude so the ground resolution of the photographs is < 1 m. Moreover, current color-infrared cameras and manned aircraft flight time are expensive, so the objective is the development of alternative methods for obtaining ve...
High resolution imaging of the Venus night side using a Rockwell 128x128 HgCdTe array
NASA Technical Reports Server (NTRS)
Hodapp, K.-W.; Sinton, W.; Ragent, B.; Allen, D.
1989-01-01
The University of Hawaii operates an infrared camera with a 128x128 HgCdTe detector array on loan from JPL's High Resolution Imaging Spectrometer (HIRIS) project. The characteristics of this camera system are discussed. The infrared camera was used to obtain images of the night side of Venus prior to and after inferior conjunction in 1988. The images confirm Allen and Crawford's (1984) discovery of bright features on the dark hemisphere of Venus visible in the H and K bands. Our images of these features are the best obtained to date. We derive a pseudo-rotation period of 6.5 days for these features and 1.74-micron brightness temperatures between 425 K and 480 K. The features are produced by nonuniform absorption, in the middle cloud layer (47 to 57 km altitude), of thermal radiation from the lower Venus atmosphere (20 to 30 km altitude). A more detailed analysis of the data is in progress.
Non-optically combined multispectral source for IR, visible, and laser testing
NASA Astrophysics Data System (ADS)
Laveigne, Joe; Rich, Brian; McHugh, Steve; Chua, Peter
2010-04-01
Electro Optical technology continues to advance, incorporating developments in infrared and laser technology into smaller, more tightly-integrated systems that can see and discriminate military targets at ever-increasing distances. New systems incorporate laser illumination and ranging with gated sensors that allow unparalleled vision at a distance. These new capabilities augment existing all-weather performance in the mid-wave infrared (MWIR) and long-wave infrared (LWIR), as well as low light level visible and near infrared (VNIR), giving the user multiple means of looking at targets of interest. There is a need in the test industry to generate imagery in the relevant spectral bands, and to provide temporal stimulus for testing range-gated systems. Santa Barbara Infrared (SBIR) has developed a new means of combining a uniform infrared source with uniform laser and visible sources for electro-optics (EO) testing. The source has been designed to allow laboratory testing of surveillance systems incorporating an infrared imager and a range-gated camera; and for field testing of emerging multi-spectral/fused sensor systems. A description of the source will be presented along with performance data relating to EO testing, including output in pertinent spectral bands, stability and resolution.
Application of infrared uncooled cameras in surveillance systems
NASA Astrophysics Data System (ADS)
Dulski, R.; Bareła, J.; Trzaskawka, P.; Piątkowski, T.
2013-10-01
The recent necessity to protect military bases, convoys and patrols gave serious impetus to the development of multisensor security systems for perimeter protection. One of the most important devices used in such systems is the IR camera. The paper discusses the technical possibilities and limitations of using an uncooled IR camera in a multi-sensor surveillance system for perimeter protection. Effective detection ranges depend on the class of the sensor used and the observed scene itself. Application of an IR camera increases the probability of intruder detection regardless of the time of day or weather conditions, while simultaneously decreasing the false alarm rate produced by the surveillance system. The role of IR cameras in the system is discussed, as well as the technical possibilities for detecting a human being. A comparison of commercially available IR cameras capable of achieving the desired ranges was made. The required spatial resolution for detection, recognition and identification was calculated. The simulation of detection ranges was done using a new model for predicting target acquisition performance which uses the Targeting Task Performance (TTP) metric. Like its predecessor, the Johnson criteria, the new model bounds range performance with image quality. The scope of the presented analysis is limited to the estimation of detection, recognition and identification ranges for typical thermal cameras with uncooled microbolometer focal plane arrays. This type of camera is most widely used in security systems because of its competitive price-to-performance ratio. Detection, recognition and identification range calculations were made, and the results for devices with selected technical specifications were compared and discussed.
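As a rough illustration of how such range estimates are bounded by image quality, the sketch below uses the older Johnson-criteria bookkeeping that the abstract mentions as the predecessor of the TTP metric; it is not the TTP model itself. The lens, pixel pitch, target size and N50 cycle counts are assumed, textbook-style values.

    import math

    # Johnson-criteria style range estimate (the older predecessor of the TTP
    # metric mentioned in the abstract).  Sensor numbers are assumptions.

    focal_length_mm = 75.0     # assumed lens
    pixel_pitch_mm  = 0.017    # 17 um microbolometer pitch
    target_h_m, target_w_m = 1.8, 0.5            # standing person
    critical_dim_m = math.sqrt(target_h_m * target_w_m)

    # Nyquist-limited spatial frequency of the sensor, in cycles per milliradian.
    f_cyc_per_mrad = focal_length_mm / (2.0 * pixel_pitch_mm) / 1000.0

    # Classical 50%-probability cycle criteria across the critical dimension.
    n50 = {"detection": 0.75, "recognition": 3.0, "identification": 6.0}

    for task, cycles in n50.items():
        # Range (km) at which the target subtends exactly 'cycles' resolvable cycles.
        range_km = f_cyc_per_mrad * critical_dim_m / cycles
        print(f"{task:15s}: {range_km:5.2f} km")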
A Navigation System for the Visually Impaired: A Fusion of Vision and Depth Sensor
Kanwal, Nadia; Bostanci, Erkan; Currie, Keith; Clark, Adrian F.
2015-01-01
For a number of years, scientists have been trying to develop aids that can make visually impaired people more independent and aware of their surroundings. Computer-based automatic navigation tools are one example of this, motivated by the increasing miniaturization of electronics and the improvement in processing power and sensing capabilities. This paper presents a complete navigation system based on low cost and physically unobtrusive sensors such as a camera and an infrared sensor. The system is based around corners and depth values from Kinect's infrared sensor. Obstacles are found in images from a camera using corner detection, while input from the depth sensor provides the corresponding distance. The combination is both efficient and robust. The system not only identifies hurdles but also suggests a safe path (if available) to the left or right side and tells the user to stop, move left, or move right. The system has been tested in real time by both blindfolded and blind people at different indoor and outdoor locations, demonstrating that it operates adequately. PMID:27057135
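A minimal sketch of the corner-plus-depth idea described above is given below, assuming OpenCV for corner detection and a depth map already registered to the colour image (as a Kinect provides); the file names and the 1.5 m proximity threshold are illustrative assumptions, not the paper's parameters.

    import cv2
    import numpy as np

    # Sketch: flag image corners that lie closer than a safety distance, using a
    # depth map aligned with the image.  File names and the 1.5 m threshold are
    # illustrative assumptions.

    frame = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
    depth_m = np.load("depth.npy")        # float32 depth in metres, same size as frame

    corners = cv2.goodFeaturesToTrack(frame, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)
    corners = corners.reshape(-1, 2) if corners is not None else np.empty((0, 2))

    obstacles = []
    for x, y in corners:
        d = float(depth_m[int(y), int(x)])
        if 0.0 < d < 1.5:                 # corner belongs to something nearby
            obstacles.append((int(x), int(y), d))

    # Crude steering hint: compare how many near obstacles sit left vs right of centre.
    left = sum(1 for x, _, _ in obstacles if x < frame.shape[1] // 2)
    right = len(obstacles) - left
    if not obstacles:
        print("path clear")
    else:
        print("obstacle ahead - move", "right" if left >= right else "left")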
LWIR NUC using an uncooled microbolometer camera
NASA Astrophysics Data System (ADS)
Laveigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian; McHugh, Steve
2010-04-01
Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector. Ideally, NUC will be performed in the same band in which the scene projector will be used. Cooled, large format MWIR cameras are readily available and have been successfully used to perform NUC; however, cooled large format LWIR cameras are not as common and are prohibitively expensive. Large format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an IRSP. Santa Barbara Infrared, Inc. reports progress on a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. Camera instability, temporal response, and thermal resolution are the main difficulties. A discussion of the processes developed to mitigate these issues follows.
NASA Technical Reports Server (NTRS)
Garbeff, Theodore J., II; Baerny, Jennifer K.
2017-01-01
The following details recent efforts undertaken at the NASA Ames Unitary Plan wind tunnels to design and deploy an advanced, production-level infrared (IR) flow visualization data system. Highly sensitive IR cameras, coupled with in-line image processing, have enabled the visualization of wind tunnel model surface flow features as they develop in real time. Boundary layer transition, shock impingement, junction flow, vortex dynamics, and buffet are routinely observed in both transonic and supersonic flow regimes, all without the need for dedicated ramps in test-section total temperature. Successful measurements have been performed on wing-body sting-mounted test articles, semi-span floor-mounted aircraft models, and sting-mounted launch vehicle configurations. The unique requirements of imaging in production wind tunnel testing have led to advancements in the deployment of advanced IR cameras in a harsh test environment, robust data acquisition storage and workflow, real-time image processing algorithms, and the evaluation of optimal surface treatments. The addition of a multi-camera IR flow visualization data system to the Ames UPWT has proven to be a valuable analysis tool in the study of new and old aircraft/launch vehicle aerodynamics and has provided new insight for the evaluation of computational techniques.
Low-speed flowfield characterization by infrared measurements of surface temperatures
NASA Technical Reports Server (NTRS)
Gartenberg, E.; Roberts, A. S., Jr.; Mcree, G. J.
1989-01-01
An experimental program was aimed at identifying areas in low speed aerodynamic research where infrared imaging systems can make significant contributions. Implementing a new technique, a long electrically heated wire was placed across a laminar jet. By measuring the temperature distribution along the wire with the IR imaging camera, the flow behavior was identified. Furthermore, using Nusselt number correlations, the velocity distribution could be deduced. The same approach was used to survey wakes behind cylinders in a wind-tunnel. This method is suited to investigate flows with position dependent velocities, e.g., boundary layers, confined flows, jets, wakes, and shear layers. It was found that the IR imaging camera cannot accurately track high gradient temperature fields. A correlation procedure was devised to account for this limitation. Other wind-tunnel experiments included tracking the development of the laminar boundary layer over a warmed flat plate by measuring the chordwise temperature distribution. This technique was applied also to the flow downstream from a rearward facing step. Finally, the IR imaging system was used to study boundary layer behavior over an airfoil at angles of attack from zero up to separation. The results were confirmed with tufts observable both visually and with the IR imaging camera.
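The velocity-from-temperature step relies on inverting a forced-convection Nusselt-number correlation for a cylinder in cross-flow. The sketch below does this with the Hilpert correlation and assumed wire power, diameter and air properties; it illustrates the principle rather than reproducing the study's calibration.

    import math

    # Sketch: deduce local air velocity from the measured temperature of a heated
    # wire by inverting a forced-convection Nusselt correlation (Hilpert, cylinder
    # in cross-flow).  Wire power and air properties are assumed illustrative
    # values, not those of the experiment.

    q_per_len = 30.0        # W/m, electrical heating per unit wire length (assumed)
    d = 0.5e-3              # m, wire diameter (assumed)
    t_air = 22.0            # deg C, ambient air
    k_air = 0.026           # W/(m K), thermal conductivity of air
    nu_air = 1.6e-5         # m^2/s, kinematic viscosity of air
    pr_air = 0.71           # Prandtl number of air
    C, m = 0.683, 0.466     # Hilpert constants, valid roughly for Re = 40..4000

    def velocity_from_wire_temp(t_wire_c):
        """Return the flow velocity (m/s) implied by the measured wire temperature."""
        h = q_per_len / (math.pi * d * (t_wire_c - t_air))   # convection coefficient
        nusselt = h * d / k_air                               # Nusselt number
        re = (nusselt / (C * pr_air ** (1.0 / 3.0))) ** (1.0 / m)
        return re * nu_air / d

    for t in (60.0, 80.0, 120.0):     # a cooler wire implies faster flow
        print(f"wire at {t:5.1f} C  ->  U = {velocity_from_wire_temp(t):5.2f} m/s")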
Pettit holds cameras in the U.S. Laboratory
2012-01-15
ISS030-E-175788 (15 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, is pictured with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.
Video System Highlights Hydrogen Fires
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Gleman, Stuart M.; Moerk, John S.
1992-01-01
Video system combines images from visible spectrum and from three bands in infrared spectrum to produce color-coded display in which hydrogen fires distinguished from other sources of heat. Includes linear array of 64 discrete lead selenide mid-infrared detectors operating at room temperature. Images overlaid on black and white image of same scene from standard commercial video camera. In final image, hydrogen fires appear red; carbon-based fires, blue; and other hot objects, mainly green and combinations of green and red. Where no thermal source present, image remains in black and white. System enables high degree of discrimination between hydrogen flames and other thermal emitters.
BAE Systems' 17μm LWIR camera core for civil, commercial, and military applications
NASA Astrophysics Data System (ADS)
Lee, Jeffrey; Rodriguez, Christian; Blackwell, Richard
2013-06-01
Seventeen (17) µm pixel Long Wave Infrared (LWIR) sensors based on vanadium oxide (VOx) micro-bolometers have been in full rate production at BAE Systems' Night Vision Sensors facility in Lexington, MA for the past five years.[1] We introduce here a commercial camera core product, the Airia-M™ imaging module, in a VGA format that reads out in 30 and 60 Hz progressive modes. The camera core is architected to conserve power, with all-digital interfaces from the readout integrated circuit through video output. The architecture enables a variety of input/output interfaces including Camera Link, USB 2.0, micro-display drivers and an optional RS-170 analog output supporting legacy systems. The modular board architecture of the electronics facilitates hardware upgrades, allowing us to capitalize on the latest high performance, low power electronics developed for mobile phones. Software and firmware are field upgradeable through a USB 2.0 port. The USB port also gives users access to up to 100 digitally stored (lossless) images.
NASA Astrophysics Data System (ADS)
Meola, Joseph; Absi, Anthony; Islam, Mohammed N.; Peterson, Lauren M.; Ke, Kevin; Freeman, Michael J.; Ifaraguerri, Agustin I.
2014-06-01
Hyperspectral imaging systems are currently used for numerous activities related to spectral identification of materials. These passive imaging systems rely on naturally reflected/emitted radiation as the source of the signal. Thermal infrared systems measure radiation emitted from objects in the scene. As such, they can operate at both day and night. However, visible through shortwave infrared systems measure solar illumination reflected from objects. As a result, their use is limited to daytime applications. Omni Sciences has produced high powered broadband shortwave infrared super-continuum laser illuminators. A 64-watt breadboard system was recently packaged and tested at Wright-Patterson Air Force Base to gauge beam quality and to serve as a proof-of-concept for potential use as an illuminator for a hyperspectral receiver. The laser illuminator was placed in a tower and directed along a 1.4km slant path to various target materials with reflected radiation measured with both a broadband camera and a hyperspectral imaging system to gauge performance.
NASA Astrophysics Data System (ADS)
Chady, Tomasz; Gorący, Krzysztof
2018-04-01
Active infrared thermography is increasingly used for nondestructive testing of various materials. The properties of this method make it particularly well suited to the inspection of composites. In active thermography, an external energy source is usually used to induce a thermal contrast inside the tested objects. Conventional heating methods (such as halogen lamps or flash lamps) are typically utilized for this purpose. In this study, we propose to use a cooling unit instead. The proposed system consists of a thermal imaging infrared camera, which is used to observe the surface of the inspected specimen, and a specially designed cooling unit with thermoelectric modules (Peltier modules).
Obstacle Detection and Avoidance of a Mobile Robotic Platform Using Active Depth Sensing
2014-06-01
At a price of nearly one tenth of a laser range finder, the Xbox Kinect uses an infrared projector and camera to capture images of its environment in three dimensions.
NASA Technical Reports Server (NTRS)
Gunapala, S.; Bandara, S. V.; Liu, J. K.; Hong, W.; Sundaram, M.; Maker, P. D.; Muller, R. E.
1997-01-01
In this paper, we discuss the development of this very sensitive long wavelength infrared (LWIR) camera based on a GaAs/AlGaAs QWIP focal plane array (FPA) and its performance in quantum efficiency, NEΔT, uniformity, and operability.
NASA Technical Reports Server (NTRS)
Watson, Dan M.
1997-01-01
Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO), and use of this camera for observations of star-formation regions. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.
Experimental and numerical study of plastic shear instability under high-speed loading conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokovikov, Mikhail; Chudinov, Vasiliy; Bilalov, Dmitry
2014-11-14
The behavior of specimens dynamically loaded during split Hopkinson (Kolsky) bar tests in a regime close to simple shear conditions was studied. The lateral surface of the specimens was investigated in real time with the aid of a high-speed infrared camera (CEDIP Silver 450M). The temperature field distributions obtained at different times made it possible to trace the evolution of plastic strain localization. The process of target perforation involving plug formation and ejection was examined using a high-speed infrared camera and a VISAR velocity measurement system. The microstructure of the tested specimens was analyzed using an optical interferometer-profilometer and a scanning electron microscope. The development of plastic shear instability regions has been simulated numerically.
A TV Camera System Which Extracts Feature Points For Non-Contact Eye Movement Detection
NASA Astrophysics Data System (ADS)
Tomono, Akira; Iida, Muneo; Kobayashi, Yukio
1990-04-01
This paper proposes a highly efficient camera system which extracts, irrespective of background, feature points such as the pupil, the corneal reflection image and dot-marks pasted on a human face in order to detect human eye movement by image processing. Two eye movement detection methods are suggested: one utilizing face orientation as well as pupil position, the other utilizing the pupil and corneal reflection images. A method of extracting these feature points using LEDs as illumination devices and a new TV camera system designed to record eye movement are proposed. Two kinds of infra-red LEDs are used. These LEDs are set up a short distance apart and emit polarized light of different wavelengths. One light source beams from near the optical axis of the lens and the other is some distance from the optical axis. The LEDs are operated in synchronization with the camera. The camera includes 3 CCD image pick-up sensors and a prism system with 2 boundary layers. Incident rays are separated into 2 wavelengths by the first boundary layer of the prism. One set of rays forms an image on CCD-3. The other set is split by the half-mirror layer of the prism and forms an image including the regularly reflected component by placing a polarizing filter in front of CCD-1, or another image not including that component by not placing a polarizing filter in front of CCD-2. Thus, three images with different reflection characteristics are obtained by the three CCDs. Through the experiment, it is shown that two kinds of subtraction operations between the three images output from the CCDs accentuate three kinds of feature points: the pupil and corneal reflection images and the dot-marks. Since the S/N ratio of the subtracted image is extremely high, the thresholding process is simple and allows reducing the intensity of the infra-red illumination. A high speed image processing apparatus using this camera system is described. Real-time processing of the subtraction, thresholding and gravity position calculation of the feature points is possible.
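A minimal sketch of the subtraction/threshold/centroid pipeline described above is shown below, operating on two co-registered frames with OpenCV; the file names, threshold and minimum blob area are illustrative assumptions.

    import cv2
    import numpy as np

    # Sketch: subtract two co-registered frames taken under the two
    # illumination/polarisation states, threshold the difference, then take the
    # centroid ("gravity position") of each bright blob.

    img_a = cv2.imread("ccd1.png", cv2.IMREAD_GRAYSCALE).astype(np.int16)
    img_b = cv2.imread("ccd2.png", cv2.IMREAD_GRAYSCALE).astype(np.int16)

    diff = np.clip(img_a - img_b, 0, 255).astype(np.uint8)   # keep positive contrast only
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)

    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    for i in range(1, n):                                     # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] > 5:                    # ignore single-pixel noise
            cx, cy = centroids[i]
            print(f"feature point at ({cx:.1f}, {cy:.1f}), area {stats[i, cv2.CC_STAT_AREA]}")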
Spectral measurements of muzzle flash with multispectral and hyperspectral sensor
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Trzaskawka, P.; Piątkowski, T.; Polakowski, H.
2011-08-01
The paper presents some practical aspects of the measurement of muzzle flash signatures. Selected signatures of sniper shots in typical scenarios are presented. Signatures registered during all phases of the muzzle flash were analyzed. High precision laboratory measurements were made in a special ballistic laboratory and, as a result, several flash patterns were registered. Field measurements of muzzle flash were also performed. During the tests several infrared cameras were used, including measurement-class devices with high accuracy and frame rates. The registrations were made in the NWIR, SWIR and LWIR spectral bands simultaneously. An ultra-fast visual camera was also used for visible-spectrum registration. Some typical infrared shot signatures are presented. Beside the cameras, the LWIR imaging spectroradiometer HyperCam was also used during the laboratory experiments and the field tests. The signatures collected by the HyperCam device were useful for the determination of the spectral characteristics of the muzzle flash, whereas the analysis of thermal images registered during the tests provided data on the temperature distribution in the flash area. As a result of the measurement sessions, the signatures of several types of handguns, machine guns and sniper rifles were obtained, which will be used in the development of passive infrared systems for sniper detection.
Infrared Imaging Camera Final Report CRADA No. TC02061.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, E. V.; Nebeker, S.
This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100 nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera, and the test results revealed that the camera exceeded the performance of presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two year project. The project was not started on time due to changes in the IPP project funding conditions; the project funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes within the Russian government export regulations. These changes were directed by export control regulations on the export of high technology items that can be used to develop military weapons; the IR camera was on the list of items subject to these controls. The ISTC and the Russian government, after negotiations, allowed the delivery of the camera to LLNL. There were no significant technical or business changes to the original project.
Investigation of the influence of spatial degrees of freedom on thermal infrared measurement
NASA Astrophysics Data System (ADS)
Fleuret, Julien R.; Yousefi, Bardia; Lei, Lei; Djupkep Dizeu, Frank Billy; Zhang, Hai; Sfarra, Stefano; Ouellet, Denis; Maldague, Xavier P. V.
2017-05-01
Long Wavelength Infrared (LWIR) cameras provide a representation of the part of the light spectrum that is sensitive to temperature. These cameras, also named Thermal Infrared (TIR) cameras, are powerful tools to detect features that cannot be seen by other imaging technologies. For instance, they enable defect detection in materials, detection of fever and anxiety in mammals, and many other features for numerous applications. However, the accuracy of thermal cameras can be affected by many parameters; the most critical involves the relative position of the camera with respect to the object of interest. Several models have been proposed in order to minimize the influence of some of these parameters, but they are mostly related to specific applications. Because such models are based on prior information related to context, their applicability to other contexts cannot be easily assessed. The few remaining models are mostly associated with a specific device. In this paper the authors study the influence of the camera position on measurement accuracy. Modeling the position of the camera relative to the object of interest depends on many parameters. In order to propose a study which is as accurate as possible, the position of the camera is represented as a five-dimensional model. The aim of this study is to investigate and attempt to introduce a model which is as independent from the device as possible.
Imaging spectroscopy using embedded diffractive optical arrays
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford
2017-09-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera based on diffractive optic arrays. This approach to hyperspectral imaging has been demonstrated in all three infrared bands: SWIR, MWIR and LWIR. The hyperspectral optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of this infrared hyperspectral sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics that are made up of an area array of diffractive optical elements, where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough for a payload on a small satellite, mini-UAV, or commercial quadcopter, or for man-portable use. We also describe an application in which this spectral imaging technology is used to quantify the mass and volume flow rates of hydrocarbon gases. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. The detector array is divided into sub-images, one covered by each lenslet. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the number of different spectral images collected simultaneously in each frame of the camera. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame. This system spans the SWIR and MWIR bands with a single optical array and focal plane array.
A Real-Time Optical 3D Tracker for Head-Mounted Display Systems
1990-03-01
OPTOTRAK [Nor88] uses one camera with two dual-axis CCD infrared position sensors; each position sensor has a dedicated processor board. [Nor88] Northern Digital, trade literature on OPTOTRAK, Northern Digital's three-dimensional tracking system.
Camera traps can be heard and seen by animals.
Meek, Paul D; Ballard, Guy-Anthony; Fleming, Peter J S; Schaefer, Michael; Williams, Warwick; Falzon, Greg
2014-01-01
Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured the ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing ranges (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine whether animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.
Raspberry Pi camera with intervalometer used as crescograph
NASA Astrophysics Data System (ADS)
Albert, Stefan; Surducan, Vasile
2017-12-01
An intervalometer is an attachment or facility on a camera that operates the shutter regularly at set intervals over a period of time. Professional cameras with built-in intervalometers are expensive and quite difficult to find. The Canon CHDK open source operating system allows intervalometer implementation on Canon cameras only; however, finding a Canon camera with a near infra-red (NIR) photographic lens at an affordable price is impossible. For experiments requiring several cameras (crescographs used to measure growth in plants, but also for coarse evaluation of the water content of leaves), the cost of the equipment is often over budget. Using two Raspberry Pi modules, each equipped with a low cost NIR camera and a WIFI adapter (for downloading pictures stored on the SD card), and some freely available software, we have implemented two low budget intervalometer cameras. The shooting interval, the number of pictures to be taken, the image resolution and some other parameters can be fully programmed. The cameras have been in use continuously for three months (July-October 2017) in a relevant environment (outdoors), proving the functionality of the concept.
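A minimal intervalometer sketch in the spirit of the setup above is given below, assuming the Python picamera library on the Raspberry Pi; the interval, frame count, resolution and output path are placeholders, not the values used in the experiment.

    import time
    from picamera import PiCamera

    # Minimal intervalometer sketch for a Raspberry Pi NIR camera module.
    # Interval, count, resolution and output path are placeholders.

    INTERVAL_S = 300          # one frame every 5 minutes
    NUM_FRAMES = 288          # roughly 24 hours of coverage
    RESOLUTION = (2592, 1944)
    OUT_DIR = "/home/pi/crescograph"

    camera = PiCamera()
    camera.resolution = RESOLUTION
    time.sleep(2)             # let exposure and gain settle before the first frame

    for i in range(NUM_FRAMES):
        camera.capture(f"{OUT_DIR}/frame_{i:05d}.jpg")
        time.sleep(INTERVAL_S)

    camera.close()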
Pettit works with two still cameras mounted together in the U.S. Laboratory
2012-01-21
ISS030-E-049636 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.
Pettit works with two still cameras mounted together in the U.S. Laboratory
2012-01-21
ISS030-E-049643 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.
Attitude identification for SCOLE using two infrared cameras
NASA Technical Reports Server (NTRS)
Shenhar, Joram
1991-01-01
An algorithm is presented that incorporates real time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs), mounted on the SCOLE platform, permitting arbitrary camera placement and an on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six degrees of freedom attitude tracking, regardless of the attitude time history. A description is provided of the algorithm in the camera identification mode as well as the mode of target tracking. Experimental data from a reduced size SCOLE-like lab model, reflecting the performance of the camera identification and the tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
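One standard way to implement the attitude-extraction step is sketched below: once the three LEDs have been triangulated from the two cameras into lab-frame 3-D points, the rotation and translation that map the known LED layout onto those points are found by a Kabsch (orthogonal Procrustes) fit. This is a generic approach under those assumptions, not necessarily the paper's exact algorithm; the LED layout and measured points are made-up numbers.

    import numpy as np

    # Sketch: recover platform attitude from three tracked LEDs, assuming their
    # lab-frame 3-D positions have already been triangulated from the two cameras.

    leds_body = np.array([[0.0, 0.0, 0.0],     # LED positions on the platform (m)
                          [0.5, 0.0, 0.0],
                          [0.0, 0.3, 0.0]])

    leds_lab = np.array([[1.00, 2.00, 3.00],   # triangulated positions in the lab frame
                         [1.35, 2.35, 3.00],
                         [0.79, 2.21, 3.00]])

    def rigid_fit(p_body, p_lab):
        """Rotation matrix and translation mapping body-frame points to lab frame."""
        cb, cl = p_body.mean(axis=0), p_lab.mean(axis=0)
        h = (p_body - cb).T @ (p_lab - cl)
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        return r, cl - r @ cb

    R, t = rigid_fit(leds_body, leds_lab)
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    print("rotation matrix:\n", np.round(R, 3))
    print(f"yaw about the lab z-axis: {yaw:.1f} deg")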
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and are limited by low throughput, since only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, together with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). The method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
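For the phase-change case, one simple way to read a melting temperature from a single pixel's temperature-time trace is to look for the plateau where dT/dt collapses while the latent heat is absorbed. The sketch below does this on a synthetic trace; the ramp rate, plateau level and noise are invented for illustration.

    import numpy as np

    # Sketch: find the melting plateau in one pixel's temperature-time trace.
    # The trace below is synthetic, for illustration only.

    t = np.linspace(0.0, 120.0, 600)                       # s
    temp = np.piecewise(t,
                        [t < 40, (t >= 40) & (t < 70), t >= 70],
                        [lambda t: 25 + 0.8 * t,           # heating ramp
                         lambda t: 57.0,                   # melting plateau
                         lambda t: 57 + 0.8 * (t - 70)])   # ramp resumes
    temp += np.random.normal(0.0, 0.05, temp.size)         # camera noise

    rate = np.gradient(temp, t)                            # dT/dt
    smooth = np.convolve(rate, np.ones(15) / 15, mode="same")
    plateau = smooth < 0.1 * np.median(smooth)             # where heating nearly stops

    print(f"estimated melting temperature: {temp[plateau].mean():.1f} C")
    print(f"plateau duration: {t[plateau][-1] - t[plateau][0]:.1f} s "
          f"(proportional to the latent heat for a fixed heat flux)")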
An Overview of the CBERS-2 Satellite and Comparison of the CBERS-2 CCD Data with the L5 TM Data
NASA Technical Reports Server (NTRS)
Chandler, Gyanesh
2007-01-01
The CBERS satellite carries on board a multi-sensor payload with different spatial resolutions and collection frequencies: the HRCCD (High Resolution CCD Camera), the IRMSS (Infrared Multispectral Scanner), and the WFI (Wide-Field Imager). The CCD and the WFI cameras operate in the VNIR region, while the IRMSS operates in the SWIR and thermal regions. In addition to the imaging payload, the satellite carries a Data Collection System (DCS) and a Space Environment Monitor (SEM).
MS Grunsfeld wearing EMU in Airlock
2002-03-08
STS109-E-5721 (8 March 2002) --- Astronaut John M. Grunsfeld, STS-109 payload commander, attired in the extravehicular mobility unit (EMU) space suit, is pictured fully suited in the Space Shuttle Columbia's airlock. Grunsfeld and Richard M. Linnehan, mission specialist, were about to participate in STS-109's fifth space walk. Activities for EVA-5 centered around the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) to install a Cryogenic Cooler and its Cooling System Radiator. The image was recorded with a digital still camera.
MS Grunsfeld wearing EMU in Airlock joined by MS Newman and Massimino
2002-03-08
STS109-E-5722 (8 March 2002) --- Astronaut John M. Grunsfeld (center), STS-109 payload commander, attired in the extravehicular mobility unit (EMU) space suit, is photographed with astronauts James H. Newman (left) and Michael J. Massimino, both mission specialists, prior to the fifth space walk. Activities for EVA-5 centered around the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) to install a Cryogenic Cooler and its Cooling System Radiator. The image was recorded with a digital still camera.
Uses of infrared thermography in the low-cost solar array program
NASA Technical Reports Server (NTRS)
Glazer, S. D.
1982-01-01
The Jet Propulsion Laboratory has used infrared thermography extensively in the Low-Cost Solar Array (LSA) photovoltaics program. A two-dimensional scanning infrared radiometer has been used to make field inspections of large free-standing photovoltaic arrays and smaller demonstration sites consisting of integrally mounted rooftop systems. These field inspections have proven especially valuable in the research and early development phases of the program, since certain types of module design flaws and environmental degradation manifest themselves in unique thermal patterns. The infrared camera was also used extensively in a series of laboratory tests on photovoltaic cells to obtain peak cell temperatures and thermal patterns during off-design operating conditions. The infrared field inspections and the laboratory experiments are discussed, and sample results are presented.
James Webb Space Telescope Project (JWST) Overview
NASA Technical Reports Server (NTRS)
Dutta, Mitra
2008-01-01
This presentation provides an overview of the James Webb Space Telescope (JWST) Project. The JWST is an infrared telescope designed to collect data in the cosmic dark zone. Specifically, the mission of the JWST is to study the origin and evolution of galaxies, stars and planetary systems. It is a deployable telescope with a 6.5 m diameter, segmented, adjustable primary mirror, outfitted with a cryogenic-temperature telescope and instruments for infrared performance. The JWST is several times more sensitive than previous telescopes and other photographic and electronic detection methods. It hosts a near infrared camera, a near infrared spectrometer, a mid-infrared instrument and a fine guidance sensor. The JWST mission objectives and architecture, integrated science payload, instrument overview, and operational orbit are described.
Development of infrared scene projectors for testing fire-fighter cameras
NASA Astrophysics Data System (ADS)
Neira, Jorge E.; Rice, Joseph P.; Amon, Francine K.
2008-04-01
We have developed two types of infrared scene projectors for hardware-in-the-loop testing of thermal imaging cameras such as those used by fire-fighters. In one, direct projection, images are projected directly into the camera. In the other, indirect projection, images are projected onto a diffuse screen, which is then viewed by the camera. Both projectors use a digital micromirror array as the spatial light modulator, in the form of a Micromirror Array Projection System (MAPS) engine having a resolution of 800 x 600 with aluminum-coated mirrors on a 17 micrometer pitch and a ZnSe protective window. Fire-fighter cameras are often based upon uncooled microbolometer arrays and typically have resolutions of 320 x 240 or lower. For direct projection, we use an argon-arc source, which provides spectral radiance equivalent to a 10,000 kelvin blackbody over the 7 micrometer to 14 micrometer wavelength range, to illuminate the micromirror array. For indirect projection, an expanded 4 watt CO2 laser beam at a wavelength of 10.6 micrometers illuminates the micromirror array, and the scene formed by the first-order diffracted light from the array is projected onto a diffuse aluminum screen. In both projectors, a well-calibrated reference camera is used to provide non-uniformity correction and brightness calibration of the projected scenes, and the fire-fighter cameras alternately view the same scenes. In this paper, we compare the two methods for this application and report on our quantitative results. Indirect projection has the advantage of being able to more easily fill the wide field of view of the fire-fighter cameras, which typically is about 50 degrees. Direct projection more efficiently utilizes the available light, which will become important in emerging multispectral and hyperspectral applications.
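The reference-camera calibration step can be illustrated with a classic two-point non-uniformity correction: per-pixel gain and offset are derived from images of two uniform projected levels and then applied to later frames. The sketch below uses synthetic data and assumed command levels; it is a generic NUC illustration, not the projectors' actual calibration procedure.

    import numpy as np

    # Sketch of a two-point non-uniformity correction (NUC): per-pixel gain and
    # offset come from reference-camera images of two uniform projected levels.
    # Synthetic data stand in for the measured images.

    rng = np.random.default_rng(0)
    shape = (240, 320)

    low_cmd, high_cmd = 20.0, 200.0                       # commanded radiance units
    gain_true = 1.0 + 0.1 * rng.standard_normal(shape)    # hidden pixel-to-pixel gain
    offset_true = 5.0 * rng.standard_normal(shape)        # hidden pixel offsets
    flat_low = gain_true * low_cmd + offset_true          # reading of the low flat field
    flat_high = gain_true * high_cmd + offset_true        # reading of the high flat field

    # Two-point NUC coefficients.
    gain = (high_cmd - low_cmd) / (flat_high - flat_low)
    offset = low_cmd - gain * flat_low

    def correct(raw):
        """Map a raw frame back onto the commanded radiance scale."""
        return gain * raw + offset

    test = gain_true * 120.0 + offset_true                # a mid-level flat field
    print("residual non-uniformity (std):", correct(test).std())   # near zero after NUC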
NASA Astrophysics Data System (ADS)
Saari, H.; Akujärvi, A.; Holmlund, C.; Ojanen, H.; Kaivosoja, J.; Nissinen, A.; Niemeläinen, O.
2017-10-01
The accurate determination of the quality parameters of crops requires a spectral range from 400 nm to 2500 nm (Kawamura et al., 2010, Thenkabail et al., 2002). Presently, the hyperspectral imaging systems that cover this wavelength range consist of several separate hyperspectral imagers, and the system weight is from 5 to 15 kg. In addition, the cost of Short Wave Infrared (SWIR) cameras is high (~50 k€). VTT has previously developed compact hyperspectral imagers for drones and CubeSats for the Visible and Very Near Infrared (VNIR) spectral ranges (Saari et al., 2013, Mannila et al., 2013, Näsilä et al., 2016). Recently VTT has started to develop a hyperspectral imaging system that will enable imaging simultaneously in the Visible, VNIR, and SWIR spectral bands. The system can be operated from a drone, on a camera stand, or attached to a tractor. The targeted main applications of the DroneKnowledge hyperspectral system are grass, peas, and cereals. In this paper the characteristics of the built system are briefly described. The system was used for spectral measurements of wheat, several grass species and pea plants, fixed to the camera mount in test fields in Southern Finland and in the greenhouse. The wheat, grass and pea field measurements were also carried out with the system mounted on the tractor. The work is part of the Finnish nationally funded DroneKnowledge - Towards knowledge based export of small UAS remote sensing technology project.
Analysis of data from the thermal imaging inspection system project.
DOT National Transportation Integrated Search
2009-12-01
The goal of this study was to use temperature measurements derived from infrared cameras to identify trucks with potential brake, tire, or hub defects. Data were collected at inspection sites on six different days and vehicles were subjected to CVSA ...
Near-infrared face recognition utilizing OpenCV software
NASA Astrophysics Data System (ADS)
Sellami, Louiza; Ngo, Hau; Fowler, Chris J.; Kearney, Liam M.
2014-06-01
Commercially available hardware, freely available algorithms, and the authors' own software are successfully synergized to detect and recognize subjects in an environment without visible light. This project integrates three major components: an illumination device operating in the near infrared (NIR) spectrum, an NIR-capable camera, and a software algorithm capable of performing image manipulation, facial detection and recognition. Focusing our efforts in the near infrared spectrum allows the low budget system to operate covertly while still allowing for accurate face recognition. In doing so, a valuable capability has been developed which presents potential benefits in future civilian and military security and surveillance operations.
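A minimal sketch of the detection stage is shown below, using the frontal-face Haar cascade bundled with the opencv-python distribution; it operates on grayscale intensity, so NIR frames work directly. The input file name is an assumption, and the recognition stage (for example the LBPH recognizer in the opencv-contrib build) would be trained on the face crops this step produces.

    import cv2

    # Sketch of NIR face detection with OpenCV's bundled Haar cascade.
    # The input file name is an assumption.

    frame = cv2.imread("nir_frame.png", cv2.IMREAD_GRAYSCALE)
    frame = cv2.equalizeHist(frame)            # stretch the low-light NIR contrast

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    faces = cascade.detectMultiScale(frame, scaleFactor=1.1, minNeighbors=5,
                                     minSize=(60, 60))
    for (x, y, w, h) in faces:
        crop = frame[y:y + h, x:x + w]         # candidate face for the recognizer
        print(f"face candidate at ({x}, {y}), size {w}x{h}")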
ERIC Educational Resources Information Center
Haglund, Jesper; Melander, Emil; Weiszflog, Matthias; Andersson, Staffan
2017-01-01
Background: University physics students were engaged in open-ended thermodynamics laboratory activities with a focus on understanding a chosen phenomenon or the principle of laboratory apparatus, such as thermal radiation and a heat pump. Students had access to handheld infrared (IR) cameras for their investigations. Purpose: The purpose of the…
NASA Technical Reports Server (NTRS)
Defrere, D.; Hinz, P.; Downey, E.; Boehm, M.; Danchi, W. C.; Durney, O.; Ertel, S.; Hill, J. M.; Hoffmann, W. F.; Mennesson, B.;
2016-01-01
The Large Binocular Telescope Interferometer uses a near-infrared camera to measure the optical path length variations between the two AO-corrected apertures and provide high-angular resolution observations for all its science channels (1.5-13 microns). There is however a wavelength dependent component to the atmospheric turbulence, which can introduce optical path length errors when observing at a wavelength different from that of the fringe sensing camera. Water vapor in particular is highly dispersive and its effect must be taken into account for high-precision infrared interferometric observations as described previously for VLTI/MIDI or the Keck Interferometer Nuller. In this paper, we describe the new sensing approach that has been developed at the LBT to measure and monitor the optical path length fluctuations due to dry air and water vapor separately. After reviewing the current performance of the system for dry air seeing compensation, we present simultaneous H-, K-, and N-band observations that illustrate the feasibility of our feed forward approach to stabilize the path length fluctuations seen by the LBTI nuller.
Autonomous Docking Based on Infrared System for Electric Vehicle Charging in Urban Areas
Pérez, Joshué; Nashashibi, Fawzi; Lefaudeux, Benjamin; Resende, Paulo; Pollard, Evangeline
2013-01-01
Electric vehicles are progressively being introduced in urban areas because of their ability to reduce air pollution, fuel consumption and noise nuisance. Nowadays, some big cities are launching the first electric car-sharing projects to clear traffic jams and enhance urban mobility, as an alternative to classic public transportation systems. However, there are still some problems to be solved related to energy storage, electric charging and autonomy. In this paper, we present an autonomous docking system for electric vehicle recharging based on an embarked infrared camera performing detection of infrared beacons installed in the infrastructure. A visual servoing system coupled with an automatic controller allows the vehicle to dock accurately to the recharging booth in a street parking area. The results show good behavior of the implemented system, which is currently deployed as a real prototype system in the city of Paris. PMID:23429581
Autonomous docking based on infrared system for electric vehicle charging in urban areas.
Pérez, Joshué; Nashashibi, Fawzi; Lefaudeux, Benjamin; Resende, Paulo; Pollard, Evangeline
2013-02-21
Electric vehicles are progressively being introduced in urban areas because of their ability to reduce air pollution, fuel consumption and noise nuisance. Nowadays, some big cities are launching the first electric car-sharing projects to clear traffic jams and enhance urban mobility, as an alternative to classic public transportation systems. However, there are still some problems to be solved related to energy storage, electric charging and autonomy. In this paper, we present an autonomous docking system for electric vehicle recharging based on an embarked infrared camera performing detection of infrared beacons installed in the infrastructure. A visual servoing system coupled with an automatic controller allows the vehicle to dock accurately to the recharging booth in a street parking area. The results show good behavior of the implemented system, which is currently deployed as a real prototype system in the city of Paris.
Land-based infrared imagery for marine mammal detection
NASA Astrophysics Data System (ADS)
Graber, Joseph; Thomson, Jim; Polagye, Brian; Jessup, Andrew
2011-09-01
A land-based infrared (IR) camera is used to detect endangered Southern Resident killer whales in Puget Sound, Washington, USA. The observations are motivated by a proposed tidal energy pilot project, which will be required to monitor for environmental effects. Potential monitoring methods also include visual observation, passive acoustics, and active acoustics. The effectiveness of observations in the infrared spectrum is compared to observations in the visible spectrum to assess the viability of infrared imagery for cetacean detection and classification. Imagery was obtained at Lime Kiln Park, Washington from 7/6/10-7/9/10 using a FLIR Thermovision A40M infrared camera (7.5-14μm, 37°HFOV, 320x240 pixels) under ideal atmospheric conditions (clear skies, calm seas, and wind speed 0-4 m/s). Whales were detected during both day (9 detections) and night (75 detections) at distances ranging from 42 to 162 m. The temperature contrast between dorsal fins and the sea surface ranged from 0.5 to 4.6 °C. Differences in emissivity from sea surface to dorsal fin are shown to aid detection at high incidence angles (near grazing). A comparison to theory is presented, and observed deviations from theory are investigated. A guide for infrared camera selection based on site geometry and desired target size is presented, with specific considerations regarding marine mammal detection. Atmospheric conditions required to use visible and infrared cameras for marine mammal detection are established and compared with 2008 meteorological data for the proposed tidal energy site. Using conservative assumptions, infrared observations are predicted to provide a 74% increase in hours of possible detection, compared with visual observations.
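The fin-to-sea contrast argument can be made quantitative with a small radiometric sketch: integrate Planck's law over the camera band and weight by emissivity for the fin and the sea surface. The temperatures and emissivities below are illustrative, and reflected sky radiance and atmospheric transmission along the path are ignored.

    import numpy as np

    # Sketch: in-band (7.5-14 um) radiance contrast between a dorsal fin and the
    # sea surface, from Planck's law weighted by emissivity.  Values are assumed.

    H = 6.626e-34; C = 2.998e8; KB = 1.381e-23   # Planck, speed of light, Boltzmann

    def band_radiance(temp_k, lam_lo=7.5e-6, lam_hi=14e-6, n=2000):
        """Blackbody radiance (W m^-2 sr^-1) integrated over the camera band."""
        lam = np.linspace(lam_lo, lam_hi, n)
        spectral = 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * temp_k)) - 1)
        return np.trapz(spectral, lam)

    t_sea, t_fin = 284.0, 286.5       # K: sea surface and (warmer) dorsal fin
    eps_sea, eps_fin = 0.96, 0.98     # assumed emissivities near normal incidence

    contrast = eps_fin * band_radiance(t_fin) - eps_sea * band_radiance(t_sea)
    print(f"in-band radiance contrast: {contrast:.3f} W m^-2 sr^-1")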
Water Plume Temperature Measurements by an Unmanned Aerial System (UAS)
DeMario, Anthony; Lopez, Pete; Plewka, Eli; Wix, Ryan; Xia, Hai; Zamora, Emily; Gessler, Dan; Yalin, Azer P.
2017-01-01
We report on the development and testing of a proof of principle water temperature measurement system deployed on an unmanned aerial system (UAS), for field measurements of thermal discharges into water. The primary elements of the system include a quad-copter UAS to which has been integrated, for the first time, both a thermal imaging infrared (IR) camera and an immersible probe that can be dipped below the water surface to obtain vertical water temperature profiles. The IR camera is used to take images of the overall water surface to geo-locate the plume, while the immersible probe provides quantitative temperature depth profiles at specific locations. The full system has been tested including the navigation of the UAS, its ability to safely carry the sensor payload, and the performance of both the IR camera and the temperature probe. Finally, the UAS sensor system was successfully deployed in a pilot field study at a coal burning power plant, and obtained images and temperature profiles of the thermal effluent. PMID:28178215
Water Plume Temperature Measurements by an Unmanned Aerial System (UAS).
DeMario, Anthony; Lopez, Pete; Plewka, Eli; Wix, Ryan; Xia, Hai; Zamora, Emily; Gessler, Dan; Yalin, Azer P
2017-02-07
We report on the development and testing of a proof of principle water temperature measurement system deployed on an unmanned aerial system (UAS), for field measurements of thermal discharges into water. The primary elements of the system include a quad-copter UAS to which has been integrated, for the first time, both a thermal imaging infrared (IR) camera and an immersible probe that can be dipped below the water surface to obtain vertical water temperature profiles. The IR camera is used to take images of the overall water surface to geo-locate the plume, while the immersible probe provides quantitative temperature depth profiles at specific locations. The full system has been tested including the navigation of the UAS, its ability to safely carry the sensor payload, and the performance of both the IR camera and the temperature probe. Finally, the UAS sensor system was successfully deployed in a pilot field study at a coal burning power plant, and obtained images and temperature profiles of the thermal effluent.
Invisible marker based augmented reality system
NASA Astrophysics Data System (ADS)
Park, Hanhoon; Park, Jong-Il
2005-07-01
Augmented reality (AR) has recently gained significant attention. Previous AR techniques usually need a fiducial marker with known geometry, or objects whose structure can be easily estimated, such as a cube. Placing a marker in the workspace of the user can be intrusive. To overcome this limitation, we present an AR system using invisible markers which are created/drawn with an infrared (IR) fluorescent pen. Two cameras are used: an IR camera and a visible camera, which are positioned on either side of a cold mirror so that their optical centers coincide with each other. We track the invisible markers using the IR camera and visualize AR in the view of the visible camera. Additional algorithms are employed for the system to achieve reliable performance against cluttered backgrounds. Experimental results are given to demonstrate the viability of the proposed system. As an application of the proposed system, the invisible marker can act as a Vision-Based Identity and Geometry (VBIG) tag, which can significantly extend the functionality of RFID. The invisible tag is the same as RFID in that it is not perceivable, while more powerful in that the tag information can be presented to the user by direct projection using a mobile projector or by visualizing AR on the screen of a mobile PDA.
Simultaneous digital super-resolution and nonuniformity correction for infrared imaging systems.
Meza, Pablo; Machuca, Guillermo; Torres, Sergio; Martin, Cesar San; Vera, Esteban
2015-07-20
In this article, we present a novel algorithm to achieve simultaneous digital super-resolution and nonuniformity correction from a sequence of infrared images. We propose to use spatial regularization terms that exploit nonlocal means and the absence of spatial correlation between the scene and the nonuniformity noise sources. We derive an iterative optimization algorithm based on a gradient descent minimization strategy. Results from infrared image sequences corrupted with simulated and real fixed-pattern noise show a competitive performance compared with state-of-the-art methods. A qualitative analysis on the experimental results obtained with images from a variety of infrared cameras indicates that the proposed method provides super-resolution images with significantly less fixed-pattern noise.
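A much-simplified sketch of the idea of estimating fixed-pattern noise by gradient descent is given below: it estimates an offset-only non-uniformity by minimizing the high-frequency energy of the corrected frames with a plain Laplacian penalty, in place of the paper's nonlocal-means regularization and without the super-resolution step. The data are synthetic and the step size and iteration count are arbitrary choices.

    import numpy as np

    # Offset-only scene-based NUC by gradient descent: minimize
    # E(b) = sum_t ||Laplacian(y_t - b)||^2 over the per-pixel offset b.
    # Illustrative only; not the algorithm of the paper.

    rng = np.random.default_rng(1)
    H, W, T = 64, 64, 20

    def laplacian(img):
        return (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)

    # Smooth moving scene plus a fixed-pattern offset.
    yy, xx = np.mgrid[0:H, 0:W]
    offset_true = rng.standard_normal((H, W))
    frames = np.array([np.exp(-(((xx - 10 - 2 * t) ** 2 + (yy - 32) ** 2) / 200.0)) * 50
                       + offset_true for t in range(T)])

    b = np.zeros((H, W))
    step = 0.01                                   # small enough for stable descent
    for _ in range(300):
        grad = np.zeros_like(b)
        for y in frames:
            grad += -2.0 * laplacian(laplacian(y - b))   # d/db of ||L(y - b)||^2
        b -= step * grad / T

    residual = laplacian(offset_true - b)         # remaining high-frequency pattern
    print("fixed-pattern (high-frequency) residual std:",
          f"{residual.std():.3f}  vs  {laplacian(offset_true).std():.3f} before correction")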
NASA Technical Reports Server (NTRS)
Holleman, Elizabeth; Sharp, David; Sheller, Richard; Styron, Jason
2007-01-01
This paper describes the application of a FLIR Systems A40M infrared (IR) digital camera for thermal monitoring of a Liquid Oxygen (LOX) and Ethanol bi-propellant Reaction Control Engine (RCE) during Auxiliary Propulsion System (APS) testing at the National Aeronautics & Space Administration's (NASA) White Sands Test Facility (WSTF) near Las Cruces, New Mexico. Typically, NASA has relied mostly on the use of thermocouples (TC) for this type of thermal monitoring due to the variability of constraints required to accurately map rapidly changing temperatures from ambient to glowing hot chamber material. Obtaining accurate real-time temperatures in the IR spectrum is made even more elusive by the changing emissivity of the chamber material as it begins to glow. The parameters evaluated prior to APS testing included: (1) remote operation of the A40M camera using fiber optic Firewire signal sender and receiver units; (2) operation of the camera inside a Pelco explosion proof enclosure with a germanium window; (3) remote analog signal display for real-time monitoring; (4) remote digital data acquisition of the A40M's sensor information using FLIR's ThermaCAM Researcher Pro 2.8 software; and (5) overall reliability of the system. An initial characterization report was prepared after the A40M characterization tests at Marshall Space Flight Center (MSFC) to document controlled heat source comparisons to calibrated TCs. Summary IR digital data recorded from WSTF's APS testing is included within this document along with findings, lessons learned, and recommendations for further usage as a monitoring tool for the development of rocket engines.
HUBBLE PROVIDES 'ONE-TWO PUNCH' TO SEE BIRTH OF STARS IN GALACTIC WRECKAGE
NASA Technical Reports Server (NTRS)
2002-01-01
Two powerful cameras aboard NASA's Hubble Space Telescope teamed up to capture the final stages in the grand assembly of galaxies. The photograph, taken by the Advanced Camera for Surveys (ACS) and the revived Near Infrared Camera and Multi-Object Spectrometer (NICMOS), shows a tumultuous collision between four galaxies located 1 billion light-years from Earth. The galactic car wreck is creating a torrent of new stars. The tangled up galaxies, called IRAS 19297-0406, are crammed together in the center of the picture. IRAS 19297-0406 is part of a class of galaxies known as ultraluminous infrared galaxies (ULIRGs). ULIRGs are considered the progenitors of massive elliptical galaxies. ULIRGs glow fiercely in infrared light, appearing 100 times brighter than our Milky Way Galaxy. The large amount of dust in these galaxies produces the brilliant infrared glow. The dust is generated by a firestorm of star birth triggered by the collisions. IRAS 19297-0406 is producing about 200 new Sun-like stars every year -- about 100 times more stars than our Milky Way creates. The hotbed of this star formation is the central region [the yellow objects]. This area is swamped in the dust created by the flurry of star formation. The bright blue material surrounding the central region corresponds to the ultraviolet glow of new stars. The ultraviolet light is not obscured by dust. Astronomers believe that this area is creating fewer new stars and therefore not as much dust. The colliding system [yellow and blue regions] has a diameter of about 30,000 light-years, or about half the size of the Milky Way. The tail [faint blue material at left] extends out for another 20,000 light-years. Astronomers used both cameras to witness the flocks of new stars that are forming from the galactic wreckage. NICMOS penetrated the dusty veil that masks the intense star birth in the central region. ACS captured the visible starlight of the colliding system's blue outer region. IRAS 19297-0406 may be similar to the so-called Hickson compact groups -- clusters of at least four galaxies in a tight configuration that are isolated from other galaxies. The galaxies are so close together that they lose energy from the relentless pull of gravity. Eventually, they fall into each other and form one massive galaxy. This color-composite image was made by combining photographs taken in near-infrared light with NICMOS and ultraviolet and visible light with ACS. The pictures were taken with these filters: the H-band and J-band on NICMOS; the V-band on the ACS wide-field camera; and the U-band on the ACS high-resolution camera. The images were taken on May 13 and 14. Credits: NASA, the NICMOS Group (STScI, ESA), and the NICMOS Science Team (University of Arizona)
Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio; Rispoli, Attilio
2010-01-01
This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements under static conditions taken by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms). PMID:22315559
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chemerisov, S.; Bailey, J.; Heltemes, T.
A series of four one-day irradiations was conducted with 100Mo-enriched disk targets. After irradiation, the enriched disks were removed from the target and dissolved. The resulting solution was processed using a NorthStar RadioGenix™ 99mTc generator either at Argonne National Laboratory or at the NorthStar Medical Radioisotopes facility. Runs on the RadioGenix system produced inconsistent analytical results for 99mTc in the Tc/Mo solution. These inconsistencies were attributed to impurities in the solution or improper column packing. During the irradiations, the performance of the optical transition radiation (OTR) and infrared cameras was tested in a high-radiation field. The OTR cameras survived all irradiations, while the IR cameras failed every time. The addition of X-ray and neutron shielding improved camera survivability and decreased the number of upsets.
Cryogenic solid Schmidt camera as a base for future wide-field IR systems
NASA Astrophysics Data System (ADS)
Yudin, Alexey N.
2011-11-01
Work is focused on studying the capability of a solid Schmidt camera to serve as a wide-field infrared lens for an aircraft system with whole-sphere coverage, working in the 8-14 um spectral range and coupled with a spherical focal array of megapixel class. Designs of a 16 mm f/0.2 lens with 60 and 90 degrees sensor diagonal are presented, and their image quality is compared with a conventional solid design. An achromatic design with significantly improved performance, containing an enclosed soft correcting lens behind the protective front lens, is proposed. One of the main goals of the work is to estimate the benefits of curved detector arrays in wide-field systems for the 8-14 um spectral range. Coupling of the photodetector with the solid Schmidt camera by means of frustrated total internal reflection is considered, with a corresponding tolerance analysis. The whole lens, except the front element, is considered to be cryogenic, with the solid Schmidt unit cooled by flowing hydrogen to improve bulk transmission.
Long-Wavelength 640 x 486 GaAs/AlGaAs Quantum Well Infrared Photodetector Snap-Shot Camera
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Bandara, Sumith V.; Liu, John K.; Hong, Winn; Sundaram, Mani; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Carralejo, Ronald
1998-01-01
A 9-micrometer cutoff 640 x 486 snap-shot quantum well infrared photodetector (QWIP) camera has been demonstrated. The performance of this QWIP camera is reported including indoor and outdoor imaging. A noise equivalent differential temperature (NEΔT) of 36 mK has been achieved at 300 K background with f/2 optics. This is in good agreement with the expected focal plane array sensitivity due to the practical limitations on charge handling capacity of the multiplexer, read noise, bias voltage, and operating temperature.
ERIC Educational Resources Information Center
Ballard, David M.
1990-01-01
Examines the characteristics of three types of motion detectors: Doppler radar, infrared, and ultrasonic wave, and how they are used on school buses to prevent students from being killed by their own school bus. Other safety devices cited are bus crossing arms and a camera monitor system. (MLF)
Preliminary optical design of PANIC, a wide-field infrared camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Rodríguez Gómez, J.; Lenzen, R.; Sánchez-Blanco, E.
2008-07-01
In this paper, we present the preliminary optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Calar Alto 2.2 m telescope. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. A mosaic of four Hawaii 2RG detectors of 2k x 2k made by Teledyne is used and will give a field of view of 31.9 arcmin x 31.9 arcmin. This cryogenic instrument has been optimized for the Y, J, H and K bands. Special care has been taken in the selection of the standard IR materials used for the optics in order to maximize the instrument throughput and to include the z band. The main challenges of this design are: to produce a well defined internal pupil which allows reducing the thermal background by a cryogenic pupil stop; the correction of off-axis aberrations due to the large field available; the correction of chromatic aberration because of the wide spectral coverage; and the capability of introducing narrow band filters (~1%) in the system while minimizing the degradation in the filter passband without a collimated stage in the camera. We show the optomechanical error budget and compensation strategy that allows our as-built design to meet the required performance from an optical point of view. Finally, we demonstrate the flexibility of the design by showing the performance of PANIC at the CAHA 3.5 m telescope.
University of Virginia suborbital infrared sensing experiment
NASA Astrophysics Data System (ADS)
Holland, Stephen; Nunnally, Clayton; Armstrong, Sarah; Laufer, Gabriel
2002-03-01
An Orion sounding rocket launched from Wallops Flight Facility carried a University of Virginia payload to an altitude of 47 km and returned infrared measurements of the Earth's upper atmosphere and video images of the ocean. The payload launch was the result of a three-year undergraduate design project by a multi-disciplinary student group from the University of Virginia and James Madison University. As part of a new multi-year design course, undergraduate students designed, built, tested, and participated in the launch of a suborbital platform from which atmospheric remote sensors and other scientific experiments could operate. The first launch included a simplified atmospheric measurement system intended to demonstrate full system operation and remote sensing capabilities during suborbital flight. A thermoelectrically cooled HgCdTe infrared detector, with peak sensitivity at 10 micrometers, measured upwelling radiation, and a small camera and VCR system, aligned with the infrared sensor, provided a ground reference. Additionally, a simple orientation sensor, consisting of three photodiodes equipped with red, green, and blue dichroic filters, was tested. Temperature measurements of the upper atmosphere were successfully obtained during the flight. Video images were successfully recorded on-board the payload and proved a valuable tool in the data analysis process. The photodiode system, intended as a replacement for the camera and VCR system, functioned well, despite low signal amplification. This fully integrated and flight-tested payload will serve as a platform for future atmospheric sensing experiments. It is currently being modified for a second suborbital flight that will incorporate a gas filter correlation radiometry (GFCR) instrument to measure the distribution of stratospheric methane and imaging capabilities to record the chlorophyll distribution in the Metompkin Bay as an indicator of pollution runoff.
Simultaneous Tracking of Multiple Points Using a Wiimote
ERIC Educational Resources Information Center
Skeffington, Alex; Scully, Kyle
2012-01-01
This paper reviews the construction of an inexpensive motion tracking and data logging system, which can be used for a wide variety of teaching experiments ranging from entry-level physics courses to advanced courses. The system utilizes an affordable infrared camera found in a Nintendo Wiimote to track IR LEDs mounted to the objects to be…
Emteborg, Håkan; Zeleny, Reinhard; Charoud-Got, Jean; Martos, Gustavo; Lüddeke, Jörg; Schellin, Holger; Teipel, Katharina
2014-01-01
Coupling an infrared (IR) camera to a freeze dryer for on-line monitoring of freeze-drying cycles is described for the first time. Normally, product temperature is measured using a few invasive Pt-100 probes, resulting in poor spatial resolution. To overcome this, an IR camera was placed on a process-scale freeze dryer. Imaging took place every 120 s through a Germanium window comprising 30,000 measurement points obtained contact-free from −40°C to 25°C. Results are presented for an empty system, bulk drying of cheese slurry, and drying of 1 mL human serum in 150 vials. During freezing of the empty system, differences of more than 5°C were measured on the shelf. Adding a tray to the empty system, a difference of more than 8°C was observed. These temperature differences probably cause different ice structures affecting the drying speed during sublimation. A temperature difference of maximum 13°C was observed in bulk mode during sublimation. When drying in vials, differences of more than 10°C were observed. Gradually, the large temperature differences disappeared during secondary drying and products were transformed into uniformly dry cakes. The experimental data show that the IR camera is a highly versatile on-line monitoring tool for different kinds of freeze-drying processes. © 2014 European Union 103:2088–2097, 2014 PMID:24902839
NASA Technical Reports Server (NTRS)
Anderson, James E.; Tepper, Edward H.; Trevino, Louis A.
1991-01-01
Manned tests in Chamber B at NASA JSC were conducted in May and June of 1990 to better quantify the Space Shuttle Extravehicular Mobility Unit's (EMU) thermal performance in the cold environmental extremes of space. Use of an infrared imaging camera with real-time video monitoring of the output significantly added to the scope, quality and interpretation of the test conduct and data acquisition. Results of this test program have been effective in the thermal certification of a new insulation configuration and the '5000 Series' glove. In addition, the acceptable thermal performance of flight garments with visually deteriorated insulation was successfully demonstrated, thereby saving significant inspection and garment replacement cost. This test program also established a new method for collecting data vital to improving crew thermal comfort in a cold environment.
ASPIRE - Airborne Spectro-Polarization InfraRed Experiment
NASA Astrophysics Data System (ADS)
DeLuca, E.; Cheimets, P.; Golub, L.; Madsen, C. A.; Marquez, V.; Bryans, P.; Judge, P. G.; Lussier, L.; McIntosh, S. W.; Tomczyk, S.
2017-12-01
Direct measurements of coronal magnetic fields are critical for taking the next step in active region and solar wind modeling and for building the next generation of physics-based space-weather models. We are proposing a new airborne instrument to make these key observations. Building on the successful Airborne InfraRed Spectrograph (AIR-Spec) experiment for the 2017 eclipse, we will design and build a spectro-polarimeter to measure coronal magnetic field during the 2019 South Pacific eclipse. The new instrument will use the AIR-Spec optical bench and the proven pointing, tracking, and stabilization optics. A new cryogenic spectro-polarimeter will be built focusing on the strongest emission lines observed during the eclipse. The AIR-Spec IR camera, slit jaw camera and data acquisition system will all be reused. The poster will outline the optical design and the science goals for ASPIRE.
Practical aspects of modern interferometry for optical manufacturing quality control: Part 2
NASA Astrophysics Data System (ADS)
Smythe, Robert
2012-07-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space based satellite imaging and DVD and Blu-Ray disks are all enabled by phase shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful towards the practical use of interferometers. An understanding of the parameters that drive system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Practical aspects of modern interferometry for optical manufacturing quality control, Part 3
NASA Astrophysics Data System (ADS)
Smythe, Robert A.
2012-09-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space-based satellite imaging, and DVD and Blu-Ray disks are all enabled by phase-shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful toward the practical use of interferometers. An understanding of the parameters that drive the system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Design and development of an airborne multispectral imaging system
NASA Astrophysics Data System (ADS)
Kulkarni, Rahul R.; Bachnak, Rafic; Lyle, Stacey; Steidley, Carl W.
2002-08-01
Advances in imaging technology and sensors have made airborne remote sensing systems viable for many applications that require reasonably good resolution at low cost. Digital cameras are making their mark on the market by providing high resolution at very high rates. This paper describes an aircraft-mounted imaging system (AMIS) that is being designed and developed at Texas A&M University-Corpus Christi (A&M-CC) with the support of a grant from NASA. The approach is to first develop and test a one-camera system that will be upgraded into a five-camera system that offers multi-spectral capabilities. AMIS will be low cost, rugged, and portable, and will have its own battery power source. Its immediate use will be to acquire images of the coastal area in the Gulf of Mexico for a variety of studies covering a broad spectral range from the near-ultraviolet to the near-infrared. This paper describes AMIS and its characteristics, discusses the process for selecting the major components, and presents the progress.
NASA Astrophysics Data System (ADS)
Bratcher, Tim; Kroutil, Robert; Lanouette, André; Lewis, Paul E.; Miller, David; Shen, Sylvia; Thomas, Mark
2013-05-01
The development concept paper for the MSIC system was first introduced in August 2012 by these authors. This paper describes the final assembly, testing, and commercial availability of the Mapping System Interface Card (MSIC). The 2.3kg MSIC is a self-contained, compact variable configuration, low cost real-time precision metadata annotator with embedded INS/GPS designed specifically for use in small aircraft. The MSIC was specifically designed to convert commercial-off-the-shelf (COTS) digital cameras and imaging/non-imaging spectrometers with Camera Link standard data streams into mapping systems for airborne emergency response and scientific remote sensing applications. COTS digital cameras and imaging/non-imaging spectrometers covering the ultraviolet through long-wave infrared wavelengths are important tools now readily available and affordable for use by emergency responders and scientists. The MSIC will significantly enhance the capability of emergency responders and scientists by providing a direct transformation of these important COTS sensor tools into low-cost real-time aerial mapping systems.
ERIC Educational Resources Information Center
Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol
2011-01-01
The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)
2009-03-01
Understanding the true capabilities and limitations of the ALAN camera and its applicability as an option to more expensive infrared, thermal, or night vision applications. Ultimately, it will be clear whether the configuration of the Kestrel camera meets these needs.
NIRCam: Development and Testing of the JWST Near-Infrared Camera
NASA Technical Reports Server (NTRS)
Greene, Thomas; Beichman, Charles; Gully-Santiago, Michael; Jaffe, Daniel; Kelly, Douglas; Krist, John; Rieke, Marcia; Smith, Eric H.
2011-01-01
The Near Infrared Camera (NIRCam) is one of the four science instruments of the James Webb Space Telescope (JWST). Its high sensitivity, high spatial resolution images over the 0.6 - 5 microns wavelength region will be essential for making significant findings in many science areas as well as for aligning the JWST primary mirror segments and telescope. The NIRCam engineering test unit was recently assembled and has undergone successful cryogenic testing. The NIRCam collimator and camera optics and their mountings are also progressing, with a brass-board system demonstrating relatively low wavefront error across a wide field of view. The flight model's long-wavelength Si grisms have been fabricated, and its coronagraph masks are now being made. Both the short (0.6 - 2.3 microns) and long (2.4 - 5.0 microns) wavelength flight detectors show good performance and are undergoing final assembly and testing. The flight model subsystems should all be completed later this year through early 2011, and NIRCam will be cryogenically tested in the first half of 2011 before delivery to the JWST integrated science instrument module (ISIM).
Method for enhanced control of welding processes
Sheaffer, Donald A.; Renzi, Ronald F.; Tung, David M.; Schroder, Kevin
2000-01-01
Method and system for producing high quality welds in welding processes, in general, and gas tungsten arc (GTA) welding, in particular, by controlling weld penetration. Light emitted from a weld pool is collected from the backside of a workpiece by optical means during welding and transmitted to a digital video camera for further processing, after the emitted light is first passed through a short wavelength pass filter to remove infrared radiation. By filtering out the infrared component of the light emitted from the backside weld pool image, the present invention provides for the accurate determination of the weld pool boundary. Data from the digital camera is fed to an imaging board which focuses on a 100 x 100 pixel portion of the image. The board performs a thresholding operation and provides this information to a digital signal processor to compute the backside weld pool dimensions and area. This information is used by a control system, in a dynamic feedback mode, to automatically adjust appropriate parameters of a welding system, such as the welding current, to control weld penetration and thus, create a uniform weld bead and high quality weld.
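The thresholding, area computation, and feedback step described above lend themselves to a compact illustration. The following is a minimal sketch, not the patent's implementation: it assumes a NumPy array holding the filtered backside image, a hypothetical fixed threshold, and a simple proportional current correction standing in for the patent's control system.

```python
import numpy as np

def backside_pool_area(frame: np.ndarray, threshold: float, pixel_area_mm2: float) -> float:
    """Threshold a 100 x 100 pixel window of the backside image and return the bright weld-pool area."""
    window = frame[:100, :100]           # region of interest, mirroring the 100 x 100 pixel crop above
    mask = window > threshold            # fixed-level thresholding of the short-wavelength-filtered image
    return float(mask.sum()) * pixel_area_mm2

def adjust_current(current_a: float, measured_area_mm2: float, target_area_mm2: float,
                   gain_a_per_mm2: float = 0.5) -> float:
    """Proportional correction of the welding current from the weld-pool area error (illustrative only)."""
    return current_a + gain_a_per_mm2 * (target_area_mm2 - measured_area_mm2)
```

In a real controller the gain, sample rate, and saturation limits would be tuned to the specific welding process.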
Camera Traps Can Be Heard and Seen by Animals
Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg
2014-01-01
Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammals species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals’ hearing and produce illumination that can be seen by many species. PMID:25354356
Khokhlova, Vera A.; Shmeleva, Svetlana M.; Gavrilov, Leonid R.; Martin, Eleanor; Sadhoo, Neelaksh; Shaw, Adam
2013-01-01
Considerable progress has been achieved in the use of infrared (IR) techniques for qualitative mapping of acoustic fields of high intensity focused ultrasound (HIFU) transducers. The authors have previously developed and demonstrated a method based on IR camera measurement of the temperature rise induced in an absorber less than 2 mm thick by ultrasonic bursts of less than 1 s duration. The goal of this paper was to make the method more quantitative and estimate the absolute intensity distributions by determining an overall calibration factor for the absorber and camera system. The implemented approach involved correlating the temperature rise measured in an absorber using an IR camera with the pressure distribution measured in water using a hydrophone. The measurements were conducted for two HIFU transducers and a flat physiotherapy transducer of 1 MHz frequency. Corresponding correction factors between the free field intensity and temperature were obtained and allowed the conversion of temperature images to intensity distributions. The system described here was able to map in good detail focused and unfocused ultrasound fields with sub-millimeter structure and with local time average intensity from below 0.1 W/cm2 to at least 50 W/cm2. Significantly higher intensities could be measured simply by reducing the duty cycle. PMID:23927199
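A minimal sketch of the calibration idea follows, assuming co-registered samples of hydrophone-derived free-field intensity and IR-measured temperature rise; the single scale factor and the function names are illustrative and are not the authors' processing chain.

```python
import numpy as np

def calibration_factor(delta_t_k: np.ndarray, intensity_w_cm2: np.ndarray) -> float:
    """Least-squares slope (through the origin) relating absorber temperature rise (K)
    to free-field intensity (W/cm^2) at matched points of the field."""
    return float(np.dot(delta_t_k, intensity_w_cm2) / np.dot(delta_t_k, delta_t_k))

def temperature_to_intensity(delta_t_map_k: np.ndarray, k_w_cm2_per_k: float) -> np.ndarray:
    """Convert an IR temperature-rise image into an estimated intensity distribution."""
    return k_w_cm2_per_k * delta_t_map_k
```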
2001-12-01
KENNEDY SPACE CENTER, Fla. - STS-109 Mission Specialist Richard Linnehan (left) and Payload Commander John Grunsfeld get a feel for tools and equipment that will be used on the mission. The crew is at KSC to take part in Crew Equipment Interface Test activities that include familiarization with the orbiter and equipment. The goal of the mission is to service the HST, replacing Solar Array 2 with Solar Array 3, replacing the Power Control Unit, removing the Faint Object Camera and installing the Advanced Camera for Surveys, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002.
High spatial resolution infrared camera as ISS external experiment
NASA Astrophysics Data System (ADS)
Eckehard, Lorenz; Frerker, Hap; Fitch, Robert Alan
A high spatial resolution infrared camera as an ISS external experiment for monitoring global climate changes uses ISS internal and external resources (e.g., data storage). The optical experiment will consist of an infrared camera for monitoring global climate changes from the ISS. This technology was evaluated by the German small satellite mission BIRD and further developed in different ESA projects. Compared to BIRD, the presented instrument uses proven advanced sensor technologies (ISS external) and ISS on-board processing and storage capabilities (internal). The instrument will be equipped with a serial interface for TM/TC and several relay commands for the power supply. For data processing and storage a mass memory is required. Access to actual attitude data is highly desired to produce geo-referenced maps, if possible by on-board processing.
Bennett, C.L.
1996-07-23
An imaging Fourier transform spectrometer is described having a Fourier transform infrared spectrometer providing a series of images to a focal plane array camera. The focal plane array camera is clocked to a multiple of zero crossing occurrences as caused by a moving mirror of the Fourier transform infrared spectrometer and as detected by a laser detector such that the frame capture rate of the focal plane array camera corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer. The images are transmitted to a computer for processing such that representations of the images as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor or otherwise stored and manipulated by the computer. 2 figs.
Advanced imaging research and development at DARPA
NASA Astrophysics Data System (ADS)
Dhar, Nibir K.; Dat, Ravi
2012-06-01
Advances in imaging technology have a huge impact on our daily lives. Innovations in optics, focal plane arrays (FPA), microelectronics and computation have revolutionized camera design. As a result, new approaches to camera design and low cost manufacturing are now possible. These advances are clearly evident in the visible wavelength band due to pixel scaling and improvements in silicon material and CMOS technology. CMOS cameras are available in cell phones and many other consumer products. Advances in infrared imaging technology have been slow due to market volume and many technological barriers in detector materials, optics and fundamental limits imposed by the scaling laws of optics. There is of course much room for improvement in both visible and infrared imaging technology. This paper highlights various technology development projects at DARPA to advance imaging technology for both visible and infrared. Challenges and potential solutions are highlighted in areas related to wide field-of-view camera design, small pitch pixels, broadband and multiband detectors, and focal plane arrays.
Comparison of parameters of modern cooled and uncooled thermal cameras
NASA Astrophysics Data System (ADS)
Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał
2017-10-01
During the design of a system employing thermal cameras, one always faces the problem of choosing the camera types best suited for the task. In many cases such a choice is far from optimal, and there are several reasons for that. System designers often favor tried and tested solutions they are used to. They do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice and not on facts. The paper presents the results of measurements of basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive in terms of the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards. However, the camera settings were not optimized for specific test conditions or parameter measurements. Instead, the real settings used in normal camera operation were applied to obtain realistic camera performance figures. For example, there were significant differences between measured values of noise parameters and catalogue data provided by manufacturers, due to the application of edge detection filters to increase detection and recognition ranges. The purpose of this paper is to help in choosing the optimal thermal camera for a particular application, answering the question whether to opt for a cheaper microbolometer device or to apply a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing thermal camera systems with both cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best performing devices were selected.
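As one example of the laboratory parameters mentioned above, NETD is commonly estimated from blackbody measurements as temporal noise divided by responsivity. The sketch below assumes two stacks of frames recorded against uniform blackbodies a known temperature difference apart; it is a generic estimate, not the testing laboratory's certified procedure.

```python
import numpy as np

def netd_mk(frames_t1: np.ndarray, frames_t2: np.ndarray, delta_t_k: float) -> float:
    """Estimate NETD in millikelvin from two frame stacks (time, rows, cols) of uniform blackbodies
    whose temperatures differ by delta_t_k kelvin."""
    responsivity = (frames_t2.mean() - frames_t1.mean()) / delta_t_k    # signal counts per kelvin
    temporal_noise = frames_t1.std(axis=0).mean()                       # mean per-pixel rms noise in counts
    return 1000.0 * temporal_noise / responsivity
```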
VizieR Online Data Catalog: The hot Jupiter Kepler-13Ab planet's occultation (Shporer+, 2014)
NASA Astrophysics Data System (ADS)
Shporer, A.; O'Rourke, J. G.; Knutson, H. A.; Szabo, G. M.; Zhao, M.; Burrows, A.; Fortney, J.; Agol, E.; Cowan, N. B.; Desert, J.-M.; Howard, A. W.; Isaacson, H.; Lewis, N. K.; Showman, A. P.; Todorov, K. O.
2017-07-01
Here we carry out an atmospheric characterization of Kepler-13Ab by measuring its occultation in four different wavelength bands, from the infrared (IR; Spitzer/Infrared Array Camera (IRAC) 4.5 um and 3.6 um), through the near-IR (NIR; Ks band), to the optical (Kepler). We also analyze the Kepler phase curve and obtain Keck/High Resolution Echelle Spectrometer (HIRES) spectra that result in revised parameters for the objects in the system. (4 data files).
A Low Power Cryogenic Shutter Mechanism for Use in Infrared Imagers
NASA Technical Reports Server (NTRS)
Schwinger, D. Scott; Hakun, Claef F.
2000-01-01
This paper discusses the requirements, design, operation, and testing of the shutter mechanism for the Infrared Array Camera (IRAC). The shutter moves a mirror panel into or out of the incoming light path transitioning IRAC between data acquisition and calibration modes. The mechanism features a torsion flexure suspension system, two low-power rotary actuators, a balanced shaft, and a variable reluctance position sensor. Each of these items is discussed along with problems encountered during development and the implemented solutions.
Development of infrared goggles and prototype
NASA Astrophysics Data System (ADS)
Tsuchimoto, Kouzou; Komatsubara, Shigeyuki; Fujikawa, Masaru; Otsuka, Toshiaki; Kan, Moriyasu; Matsumura, Norihide
2006-05-01
We aimed to develop a practical, hands-free wearable thermography system that does not hinder the wearer from walking or working. We installed a recently developed small-format camera core module into the fire fighter's helmet and incorporated a radio image transmission function into the equipment. We combined this thermography with a see-through type head mounted display, and called it "Infrared Goggles". A prototype was developed for a verification test of a lifesaving support system in fire fighting activities.
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Park, Jin S.; Sarusi, Gabby; Lin, True-Lon; Liu, John K.; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Hoelter, Ted
1997-01-01
In this paper, we discuss the development of very sensitive, very long wavelength infrared GaAs/Al(x)Ga(1-x)As quantum well infrared photodetectors (QWIP's) based on bound-to-quasi-bound intersubband transition, fabrication of random reflectors for efficient light coupling, and the demonstration of a 15 micro-m cutoff 128 x 128 focal plane array imaging camera. Excellent imagery, with a noise equivalent differential temperature (NEΔT) of 30 mK has been achieved.
The NASA - Arc 10/20 micron camera
NASA Technical Reports Server (NTRS)
Roellig, T. L.; Cooper, R.; Deutsch, L. K.; Mccreight, C.; Mckelvey, M.; Pendleton, Y. J.; Witteborn, F. C.; Yuen, L.; Mcmahon, T.; Werner, M. W.
1994-01-01
A new infrared camera (AIR Camera) has been developed at NASA - Ames Research Center for observations from ground-based telescopes. The heart of the camera is a Hughes 58 x 62 pixel Arsenic-doped Silicon detector array that has the spectral sensitivity range to allow observations in both the 10 and 20 micron atmospheric windows.
Overview of LBTI: A Multipurpose Facility for High Spatial Resolution Observations
NASA Technical Reports Server (NTRS)
Hinz, P. M.; Defrere, D.; Skemer, A.; Bailey, V.; Stone, J.; Spalding, E.; Vaz, A.; Pinna, E.; Puglisi, A.; Esposito, S.;
2016-01-01
The Large Binocular Telescope Interferometer (LBTI) is a high spatial resolution instrument developed for coherent imaging and nulling interferometry using the 14.4 m baseline of the 2x8.4 m LBT. The unique telescope design, comprising dual apertures on a common elevation-azimuth mount, enables a broad range of observing modes. The full system is comprised of dual adaptive optics systems, a near-infrared phasing camera, a 1-5 micrometer camera (called LMIRCam), and an 8-13 micrometer camera (called NOMIC). The key program for LBTI is the Hunt for Observable Signatures of Terrestrial planetary Systems (HOSTS), a survey using nulling interferometry to constrain the typical brightness from exozodiacal dust around nearby stars. Additional observations focus on the detection and characterization of giant planets in the thermal infrared, high spatial resolution imaging of complex scenes such as Jupiter's moon Io, planets forming in transition disks, and the structure of active Galactic Nuclei (AGN). Several instrumental upgrades are currently underway to improve and expand the capabilities of LBTI. These include: improving the performance and limiting magnitude of the parallel adaptive optics systems; quadrupling the field of view of LMIRcam (increasing to 20"x20"); adding an integral field spectrometry mode; and implementing a new algorithm for path length correction that accounts for dispersion due to atmospheric water vapor. We present the current architecture and performance of LBTI, as well as an overview of the upgrades.
High resolution infrared acquisitions droning over the LUSI mud eruption.
NASA Astrophysics Data System (ADS)
Di Felice, Fabio; Romeo, Giovanni; Di Stefano, Giuseppe; Mazzini, Adriano
2016-04-01
The use of low-cost hand-held infrared (IR) thermal cameras based on uncooled micro-bolometer detector arrays became more widespread during recent years. Thermal cameras have the ability to estimate temperature values without contact and therefore can be used in circumstances where objects are difficult or dangerous to reach, such as volcanic eruptions. Since May 2006 the Indonesian LUSI mud eruption continues to spew boiling mud, water, aqueous vapor, CO2 and CH4, and covers a surface of nearly 7 km2. At this locality we performed surveys over the unreachable erupting crater. In the framework of the LUSI Lab project (ERC grant n° 308126), in 2014 and 2015, we acquired high resolution infrared images using a specifically equipped remote-controlled drone flying at an altitude of 100 m. The drone is equipped with GPS and an autopilot system that allows pre-programming the flying path or designing grids. The mounted thermal camera has a peak spectral sensitivity at LW wavelengths (10 μm), a region characterized by low water vapor and CO2 absorption. The low-distance (high-resolution) acquisitions provide a temperature detail every 40 cm, therefore it is possible to detect and observe physical phenomena such as thermodynamic behavior, the locations of hot mud and fluid emissions, and their shifts in time. Despite the harsh logistics and the continuously varying gas concentrations, we managed to collect thermal images to estimate the spatial thermal variations of the crater zone. We applied atmospheric corrections to account for infrared absorption by the high concentration of water vapor. Thousands of images have been stitched together to obtain a mosaic of the crater zone. Regular monitoring with heat variation measurements collected, e.g., every six months, could give important information about the volcanic activity and its evolution. A future database of high resolution infrared and visible images stored on a web server could be a useful monitoring tool. An interesting development will be to use a multi-spectral thermal camera to perform complete near remote sensing, detecting not only temperature but also gases that absorb at particular wavelengths.
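The atmospheric correction mentioned above can be illustrated with the standard single-band thermography measurement equation. The sketch below uses a grey-body, band-integrated Stefan-Boltzmann approximation rather than the camera's calibration curve, and the emissivity and transmittance values would have to be supplied, so it is only indicative of the kind of processing involved, not the authors' code.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def object_temperature_k(t_measured_k: float, emissivity: float, tau_atm: float,
                         t_reflected_k: float, t_atm_k: float) -> float:
    """Invert W_meas = tau*eps*W_obj + tau*(1-eps)*W_refl + (1-tau)*W_atm,
    using the grey-body approximation W = sigma*T^4 over the camera band."""
    w_meas = SIGMA * t_measured_k ** 4
    w_refl = SIGMA * t_reflected_k ** 4
    w_atm = SIGMA * t_atm_k ** 4
    w_obj = (w_meas - tau_atm * (1.0 - emissivity) * w_refl
             - (1.0 - tau_atm) * w_atm) / (tau_atm * emissivity)
    return (w_obj / SIGMA) ** 0.25
```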
NASA Astrophysics Data System (ADS)
Georgiou, Giota; Verdaasdonk, Rudolf M.; van der Veen, Albert; Klaessens, John H.
2017-02-01
In the development of new near-infrared (NIR) fluorescence dyes for image guided surgery, there is a need for new NIR-sensitive camera systems that can easily be adjusted to specific wavelength ranges, in contrast to the present clinical systems that are optimized only for ICG. To test alternative camera systems, a setup was developed to mimic the fluorescence light in a tissue phantom in order to measure sensitivity and resolution. Selected narrow-band NIR LEDs were used to illuminate a 6 mm diameter circular diffuse plate to create a uniform, intensity-controllable light spot (μW-mW) as a target/source for NIR cameras. Layers of (artificial) tissue with controlled thickness could be placed on the spot to mimic a fluorescent 'cancer' embedded in tissue. This setup was used to compare a range of NIR-sensitive consumer cameras for potential use in image guided surgery. The image of the spot obtained with the cameras was captured and analyzed using ImageJ software. Enhanced CCD night vision cameras were the most sensitive, capable of showing intensities < 1 μW through 5 mm of tissue. However, there was no control over the automatic gain and hence the noise level. NIR-sensitive DSLR cameras proved relatively less sensitive but could be fully manually controlled as to gain (ISO 25600) and exposure time and are therefore preferred for a clinical setting in combination with Wi-Fi remote control. The NIR fluorescence testing setup proved to be useful for camera testing and can be used for development and quality control of new NIR fluorescence guided surgery equipment.
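The ImageJ spot analysis can be reduced to a simple figure of merit. Below is a minimal sketch assuming boolean masks for the 6 mm spot and a background region; it is one reasonable way to rank camera sensitivity, not necessarily the authors' exact metric.

```python
import numpy as np

def spot_snr(image: np.ndarray, spot_mask: np.ndarray, background_mask: np.ndarray) -> float:
    """Signal-to-noise ratio of the test spot: mean spot signal above background,
    divided by the background standard deviation."""
    signal = image[spot_mask].mean() - image[background_mask].mean()
    return float(signal / image[background_mask].std())
```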
Development of blood vessel searching system for HMS
NASA Astrophysics Data System (ADS)
Kandani, Hirofumi; Uenoya, Toshiyuki; Uetsuji, Yasutomo; Nakamachi, Eiji
2008-08-01
In this study, we develop a new 3D miniature blood vessel searching system using near-infrared LED light and a CMOS camera module with an image processing unit, for a health monitoring system (HMS) and a drug delivery system (DDS), which require very high performance for automatic micro blood volume extraction and automatic blood examination. Our objective is to fabricate a highly reliable micro detection system by utilizing image capturing, image processing, and micro blood extraction devices. For the searching system to determine the 3D blood vessel location, we employ the stereo method, a common photogrammetric method that uses the disparity between two camera views to recover 3D location. The principle for blood vessel visualization is derived from the absorption ratio of hemoglobin for the near-infrared LED light. To get a high quality blood vessel image, we adopted an LED with a peak wavelength of 940 nm. The LED is set on the dorsal side of the finger and irradiates the human finger. A blood vessel image is captured by a CMOS camera module, which is set below the palmar side of the finger. The 2D blood vessel location can be detected from the luminance distribution along a one-pixel line. To examine the accuracy of our detecting system, we carried out experiments using finger phantoms with blood vessel diameters of 0.5, 0.75, and 1.0 mm, at depths of 0.5 ~ 2.0 mm from the phantom's surface. The depths estimated by our detecting system show good agreement with the given depths, and the viability of this system is confirmed.
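A minimal sketch of the two ingredients named above: locating the vessel along a one-pixel luminance line (taken here as the darkest column, since hemoglobin absorbs the 940 nm light) and recovering depth from stereo disparity with the standard pinhole relation. Parameter names are illustrative, the calibration constants are assumed known, and this is not the authors' implementation.

```python
import numpy as np

def vessel_column(luminance_row: np.ndarray) -> int:
    """Locate the vessel along one image row as the most absorbing (darkest) column."""
    return int(np.argmin(luminance_row))

def depth_from_disparity(disparity_px: float, focal_length_px: float, baseline_mm: float) -> float:
    """Depth (mm) of a vessel point from its pixel disparity between the two camera views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px
```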
PNIC - A near infrared camera for testing focal plane arrays
NASA Astrophysics Data System (ADS)
Hereld, Mark; Harper, D. A.; Pernic, R. J.; Rauscher, Bernard J.
1990-07-01
This paper describes the design and the performance of the Astrophysical Research Consortium prototype near-infrared camera (pNIC) designed to test focal plane arrays both on and off the telescope. Special attention is given to the detector in pNIC, the mechanical and optical designs, the electronics, and the instrument interface. Experiments performed to illustrate the most salient aspects of pNIC are described.
Thermographic measurements of high-speed metal cutting
NASA Astrophysics Data System (ADS)
Mueller, Bernhard; Renz, Ulrich
2002-03-01
Thermographic measurements of a high-speed cutting process have been performed with an infrared camera. To realize images without motion blur, the integration times were reduced to a few microseconds. Since the high tool wear influences the measured temperatures, a set-up has been realized which enables small cutting lengths. Only single images have been recorded because the process is too fast to acquire a sequence of images even with the frame rate of the very fast infrared camera which has been used. To expose the camera when the rotating tool is in the middle of the camera image, an experimental set-up with a light barrier and a digital delay generator with a time resolution of 1 ns has been realized. This enables very exact triggering of the camera at the desired position of the tool in the image. Since the cutting depth is between 0.1 and 0.2 mm, a high spatial resolution was also necessary, which was obtained by a special close-up lens allowing a resolution of approx. 45 microns. The experimental set-up will be described and infrared images and evaluated temperatures of a titanium alloy and a carbon steel will be presented for cutting speeds up to 42 m/s.
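The delay-generator setting follows from simple kinematics: the tool travels at the cutting speed, so the delay between the light-barrier pulse and the camera trigger is the remaining travel distance divided by that speed. The sketch below is only an illustration; the actual distance depends on the geometry of the set-up.

```python
def trigger_delay_ns(travel_distance_mm: float, cutting_speed_m_s: float) -> int:
    """Delay (ns) between the light-barrier pulse and the camera trigger so the tool reaches the image centre."""
    return round((travel_distance_mm / 1000.0) / cutting_speed_m_s * 1e9)

# Example: 10 mm of remaining travel at 42 m/s corresponds to roughly 238,000 ns.
```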
Development of the compact infrared camera (CIRC) for Earth observation
NASA Astrophysics Data System (ADS)
Naitoh, Masataka; Katayama, Haruyoshi; Harada, Masatomo; Nakamura, Ryoko; Kato, Eri; Tange, Yoshio; Sato, Ryota; Nakau, Koji
2017-11-01
The Compact Infrared Camera (CIRC) is an instrument equipped with an uncooled infrared array detector (microbolometer). We adopted the microbolometer because it does not require a cooling system such as a mechanical cooler, together with athermal optics, which do not require active thermal control. This can reduce the size, cost, and electrical power consumption of the sensor. The main mission of the CIRC is to demonstrate the technology for detecting wildfires, which are major and chronic disasters affecting many countries in the Asia-Pacific region. It is possible to increase the observational frequency of wildfires if CIRCs are carried on various satellites, taking advantage of their small size and light weight. We have developed two CIRCs. The first will be launched in JFY 2013 onboard the Advanced Land Observing Satellite-2 (ALOS-2), and the second will be launched in JFY 2014 onboard the CALorimetric Electron Telescope (CALET) of the Japanese Experiment Module (JEM) at the International Space Station (ISS). We have finished the ground calibration of the first CIRC onboard ALOS-2. In this paper, we provide an overview of the CIRC and the results of its ground calibration.
VizieR Online Data Catalog: Galaxies morphology and IR photometry II. (Gavazzi+ 1996)
NASA Astrophysics Data System (ADS)
Gavazzi, G.; Pierini, D.; Baffa, C.; Lisi, F.; Hunt, L. K.; Randone, I.; Boselli, A.
1996-05-01
We present near-infrared H-band (1.65μm) surface photometry of 297 galaxies (mostly) in the Coma Supercluster obtained with the Arcetri NICMOS3 camera, ARNICA, mounted on the Gornergrat Infrared Telescope. Magnitudes and diameters within the 21.5mag/arcsec2 isophote, concentration indices, and total H magnitudes are derived. Combining these observations with those obtained similarly using the Calar Alto telescopes (Paper I,
Instrumentation for Infrared Airglow Clutter.
1987-03-10
The control electronics send gain and filter position settings to the Camera Head and monitor these parameters as well as the preamp video. GAZER is equipped with a LENZAR Intensicon-8 LLLTV wide-angle, low-light camera using a 2nd-generation micro-channel intensifier and a proprietary camera tube.
Confocal retinal imaging using a digital light projector with a near infrared VCSEL source
NASA Astrophysics Data System (ADS)
Muller, Matthew S.; Elsner, Ann E.
2018-02-01
A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1" LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging.
2005-11-01
Existing optical sensors are too large for the proposed system, which motivates an approach utilizing biomimicry. The Predator, a medium-altitude system cruising at 70 knots and equipped with electro-optical and infrared cameras, exemplifies the vehicles that exist today, but such vehicles are also platforms for new concepts outside the status quo.
THE VARIABLE NEAR-INFRARED COUNTERPART OF THE MICROQUASAR GRS 1758–258
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luque-Escamilla, Pedro L.; Martí, Josep; Muñoz-Arjonilla, Álvaro J., E-mail: peter@ujaen.es, E-mail: jmarti@ujaen.es, E-mail: ajmunoz@ujaen.es
2014-12-10
We present a new study of the microquasar system GRS 1758–258 in the near-infrared domain based on archival observations with the Hubble Space Telescope and the NICMOS camera. In addition to confirming the near-infrared counterpart pointed out by Muñoz-Arjonilla et al., we show that this object displays significant photometric variability. From its average magnitudes, we also find that GRS 1758–258 fits well within the correlation between the optical/near-infrared and X-ray luminosity known to exist for low-mass, black-hole candidate X-ray binaries in a hard state. Moreover, the spectral energy distribution built using all radio, near-infrared, and X-ray data available closest in time to the NICMOS observations can be reasonably interpreted in terms of a self-absorbed radio jet and an irradiated accretion disk model around a stellar-mass black hole. All these facts match the expected behavior of a compact binary system and strengthen our confidence in the counterpart identification.
NASA Astrophysics Data System (ADS)
Yamamoto, Naoyuki; Saito, Tsubasa; Ogawa, Satoru; Ishimaru, Ichiro
2016-05-01
We developed a palm-size (optical unit: 73 mm × 102 mm × 66 mm) and lightweight (total weight with electrical controller: 1.7 kg) middle-infrared (wavelength range: 8-14 μm) 2-dimensional spectroscopy system for UAVs (Unmanned Air Vehicles) such as drones. We successfully demonstrated flights with the developed hyperspectral camera mounted on a multi-copter, a so-called drone, on 15 Sep. 2015 in Kagawa prefecture, Japan. We had previously proposed 2-dimensional imaging-type Fourier spectroscopy based on a near-common-path temporal phase-shift interferometer. We install a variable phase shifter on the optical Fourier transform plane of an infinity-corrected imaging optical system. The variable phase shifter is configured with a movable mirror and a fixed mirror. The movable mirror is actuated by an impact-drive piezoelectric device (stroke: 4.5 mm, resolution: 0.01 μm, maker: Technohands Co., Ltd., type: XDT50-45, price: around 1,000 USD). We realized wavefront-division, near-common-path interferometry that has strong robustness against mechanical vibrations. Without anti-vibration systems, the palm-size Fourier spectroscopy unit was realized. We were also able to utilize a small and low-cost middle-infrared camera, an uncooled VOx microbolometer array (pixel array: 336 × 256, pixel pitch: 17 μm, frame rate: 60 Hz, maker: FLIR, type: Quark 336, price: around 5,000 USD). This apparatus can be operated by a single-board computer (Raspberry Pi). Thus, the total cost was less than 10,000 USD. We joined the KAMOME-PJ (Kanagawa Advanced MOdule for Material Evaluation Project) with DRONE FACTORY Corp., KUUSATSU Corp., and Fuji Imvac Inc., and we successfully obtained middle-infrared spectroscopic imaging with the multi-copter drone.
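The spectral recovery step of such an imaging Fourier spectrometer can be sketched per pixel as a discrete Fourier transform of the sampled interferogram. The sampling step and variable names below are assumptions for illustration, not the instrument's actual processing code.

```python
import numpy as np

def interferogram_to_spectrum(interferogram: np.ndarray, opd_step_um: float):
    """Return (wavenumber in cm^-1, magnitude spectrum) for one pixel's interferogram
    sampled at a constant optical-path-difference step given in micrometres."""
    signal = interferogram - interferogram.mean()                         # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    wavenumber_cm = np.fft.rfftfreq(signal.size, d=opd_step_um * 1e-4)    # step converted from um to cm
    return wavenumber_cm, spectrum
```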
Network Centric Operations NCO Case Study. The British Approach to Low-Intensity Operations: Part I
2007-02-12
The Army's institutional memory of jungle warfare (from WW2) had dissipated by 1948, although individuals within the Army who had experienced such operations remained. A specially stabilised TV camera mounting was fitted, and infrared surveillance systems were added to the Beaver spotter planes, which helped detection.
Kim, Byeong Hak; Kim, Min Young; Chae, You Seong
2017-01-01
Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970
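The RPCA step at the heart of BRANF can be sketched with the standard inexact augmented Lagrange multiplier algorithm: a matrix whose columns are registered frames is split into a low-rank background and a sparse component carrying impulsive noise and moving content. The parameters below are common defaults, not the authors' settings, and the decomposition is only a generic sketch of the technique.

```python
import numpy as np

def rpca(M: np.ndarray, max_iter: int = 100, tol: float = 1e-7):
    """Decompose M (pixels x frames) into low-rank L plus sparse S via inexact ALM."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))
    norm_m = np.linalg.norm(M, 'fro')
    spectral_norm = np.linalg.norm(M, 2)
    mu = 1.25 / spectral_norm
    mu_bar = mu * 1e7
    Y = M / max(spectral_norm, np.abs(M).max() / lam)   # dual variable initialisation
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # Low-rank update: singular-value thresholding.
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Sparse update: element-wise soft thresholding.
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Y = Y + mu * (M - L - S)
        mu = min(mu * 1.5, mu_bar)
        if np.linalg.norm(M - L - S, 'fro') < tol * norm_m:
            break
    return L, S
```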
Initial Checkout Results of the Compact Infrared Camera (circ) for Earth Observation
NASA Astrophysics Data System (ADS)
Kato, E.; Katayama, H.; Sakai, M.; Nakajima, Y.; Kimura, T.; Nakau, K.; Tonooka, H.
2015-04-01
Compact Infrared Camera (CIRC) is a technology-demonstration instrument equipped with an uncooled infrared array detector (microbolometer) for space application. CIRC is the first microbolometer sensor in orbit without a calibration function, such as a shutter system or an onboard blackbody. The main objective of the CIRC is to detect wildfires, which are major and chronic disasters affecting various countries of Southeast Asia, particularly considering the effects of global warming and climate change. The CIRC achieves a small size (approximately 200 mm), light mass (approximately 3 kg), and low electrical power consumption (<20 W) by employing athermal optics and a shutterless system. The CIRC can consequently be mounted on multiple satellites to enable high-frequency observation. Installation of CIRCs on the ALOS-2 and on the JEM/CALET is expected to increase observation frequency. We present the initial check-out results of the CIRC onboard ALOS-2. Since the initial check-out phase (July 4-14, 2014), the CIRC has acquired images of Earth. CIRC was demonstrated to function according to its intended design. After the early calibration and validation phase, which confirmed the temperature accuracy of the observed data, CIRC data has been available to the public from January 2015 onward. We also introduce a few observational results concerning wildfires, volcanoes, and heat islands.
Low Cost and Efficient 3d Indoor Mapping Using Multiple Consumer Rgb-D Cameras
NASA Astrophysics Data System (ADS)
Chen, C.; Yang, B. S.; Song, S.
2016-06-01
Driven by the miniaturization and light weight of positioning and remote sensing sensors, as well as the urgent needs for fusing indoor and outdoor maps for next generation navigation, 3D indoor mapping from mobile scanning is a hot research and application topic. The point clouds with auxiliary data such as colour and infrared images derived from a 3D indoor mobile mapping suite can be used in a variety of novel applications, including indoor scene visualization, automated floorplan generation, gaming, reverse engineering, navigation, simulation and so on. State-of-the-art 3D indoor mapping systems equipped with multiple laser scanners produce accurate point clouds of building interiors containing billions of points. However, these laser scanner based systems are mostly expensive and not portable. Low cost consumer RGB-D cameras provide an alternative way to solve the core challenge of indoor mapping, that is, capturing the detailed underlying geometry of building interiors. Nevertheless, RGB-D cameras have a very limited field of view, resulting in low efficiency in the data collection stage and incomplete datasets missing major building structures (e.g. ceilings, walls). Trying to collect a complete scene without data blanks using a single RGB-D camera is not technically sound because of the large amount of human labour and the number of position parameters that need to be solved. To find an efficient and low cost way to solve 3D indoor mapping, in this paper we present an indoor mapping suite prototype that is built upon a novel calibration method which calibrates the internal and external parameters of multiple RGB-D cameras. Three Kinect sensors are mounted on a rig with different view directions to form a large field of view. The calibration procedure has three steps: (1) the internal parameters of the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern, respectively; (2) the external parameters between the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern; (3) the external parameters between the Kinects are first calculated using a pre-set calibration field and further refined by an iterative closest point algorithm. Experiments are carried out to validate the proposed method on RGB-D datasets collected by the indoor mapping suite prototype. The effectiveness and accuracy of the proposed method are evaluated by comparing the point clouds derived from the prototype with ground truth data collected by a commercial terrestrial laser scanner at ultra-high density. The overall analysis of the results shows that the proposed method achieves seamless integration of multiple point clouds from different RGB-D cameras collected at 30 frames per second.
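Steps 1 and 2 of the calibration procedure map directly onto OpenCV's standard chessboard routines. The sketch below is a generic illustration under assumed file names, board size, and equal image sizes for the colour and IR streams; it is not the authors' code, and the cross-Kinect calibration-field and ICP refinement of step 3 is omitted.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)                                     # inner-corner layout of the chessboard (assumed)
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * 25.0   # 25 mm squares (assumed)

def paired_corners(colour_paths, ir_paths):
    """Collect chessboard corners from frames where both the colour and IR view detect the board."""
    obj_pts, img_colour, img_ir, size = [], [], [], None
    for pc, pi in zip(colour_paths, ir_paths):
        gc = cv2.imread(pc, cv2.IMREAD_GRAYSCALE)
        gi = cv2.imread(pi, cv2.IMREAD_GRAYSCALE)
        ok_c, corners_c = cv2.findChessboardCorners(gc, PATTERN)
        ok_i, corners_i = cv2.findChessboardCorners(gi, PATTERN)
        if ok_c and ok_i:
            obj_pts.append(objp)
            img_colour.append(corners_c)
            img_ir.append(corners_i)
            size = gc.shape[::-1]
    return obj_pts, img_colour, img_ir, size

obj_pts, img_c, img_i, size = paired_corners(sorted(glob.glob("kinect0_color_*.png")),
                                             sorted(glob.glob("kinect0_ir_*.png")))

# Step 1: intrinsic parameters of the colour and infrared cameras.
_, K_c, d_c, _, _ = cv2.calibrateCamera(obj_pts, img_c, size, None, None)
_, K_i, d_i, _, _ = cv2.calibrateCamera(obj_pts, img_i, size, None, None)

# Step 2: extrinsic rotation R and translation T between the colour and infrared camera.
_, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(obj_pts, img_c, img_i, K_c, d_c, K_i, d_i, size,
                                                flags=cv2.CALIB_FIX_INTRINSIC)
```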
Thermographic imaging for high-temperature composite materials: A defect detection study
NASA Technical Reports Server (NTRS)
Roth, Don J.; Bodis, James R.; Bishop, Chip
1995-01-01
The ability of a thermographic imaging technique for detecting flat-bottom hole defects of various diameters and depths was evaluated in four composite systems (two types of ceramic matrix composites, one metal matrix composite, and one polymer matrix composite) of interest as high-temperature structural materials. The holes ranged from 1 to 13 mm in diameter and 0.1 to 2.5 mm in depth in samples approximately 2-3 mm thick. The thermographic imaging system utilized a scanning mirror optical system and infrared (IR) focusing lens in conjunction with a mercury cadmium telluride infrared detector element to obtain high resolution infrared images. High intensity flash lamps located on the same side as the infrared camera were used to heat the samples. After heating, up to 30 images were sequentially acquired at 70-150 msec intervals. Limits of detectability based on depth and diameter of the flat-bottom holes were defined for each composite material. Ultrasonic and radiographic images of the samples were obtained and compared with the thermographic images.
Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.
We present the final optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well-defined internal pupil, available for reducing the thermal background with a cryogenic pupil stop. A mosaic of four 2k × 2k Hawaii-2RG detectors, made by Teledyne, will give a field of view of 31.9 arcmin × 31.9 arcmin.
Standardized rendering from IR surveillance motion imagery
NASA Astrophysics Data System (ADS)
Prokoski, F. J.
2014-06-01
Government agencies, including defense and law enforcement, increasingly make use of video from surveillance systems and camera phones owned by non-government entities. Making advanced and standardized motion imaging technology available to private and commercial users at cost-effective prices would benefit all parties. In particular, incorporating thermal infrared into commercial surveillance systems offers substantial benefits beyond night vision capability. Face rendering is a process to facilitate exploitation of thermal infrared surveillance imagery from the general area of a crime scene, to assist investigations with and without cooperating eyewitnesses. Face rendering automatically generates greyscale representations similar to police artist sketches for faces in surveillance imagery collected from locations and times proximate to a crime under investigation. Near-real-time generation of face renderings can provide law enforcement with an investigation tool to assess witness memory and credibility, and to integrate reports from multiple eyewitnesses. Renderings can be quickly disseminated through social media to warn of a person who may pose an immediate threat, and to solicit the public's help in identifying possible suspects and witnesses. Renderings are pose-standardized so as not to divulge the presence and location of eyewitnesses and surveillance cameras. Incorporation of thermal infrared imaging into commercial surveillance systems will significantly improve system performance, and reduce manual review times, at an incremental cost that will continue to decrease. Benefits to criminal justice would include improved reliability of eyewitness testimony and improved accuracy of distinguishing among minority groups in eyewitness and surveillance identifications.
Femtowatt incoherent image conversion from mid-infrared light to near-infrared light
NASA Astrophysics Data System (ADS)
Huang, Nan; Liu, Hongjun; Wang, Zhaolu; Han, Jing; Zhang, Shuan
2017-03-01
We report on the experimental conversion imaging of an incoherent continuous-wave dim source from mid-infrared light to near-infrared light with a lowest input power of 31 femtowatts (fW). Incoherent mid-infrared images of light emission from a heat-lamp bulb with an adjustable power supply, at window wavelengths ranging from 2.9 µm to 3.5 µm, are used for upconversion. The sum-frequency generation is realized in a laser cavity, resonant at 1064 nm and pumped by a laser diode at 806 nm, built around a periodically poled lithium niobate (PPLN) crystal. The converted image, at a wavelength of about 785 nm and with a resolution of about 120 × 70, is detected with low noise using a silicon-based camera. By optimizing the system parameters, the upconversion quantum efficiency is predicted to be 28% for correctly polarized, on-axis, phase-matched light.
Near-infrared autofluorescence imaging to detect parathyroid glands in thyroid surgery.
Ladurner, R; Al Arabi, N; Guendogar, U; Hallfeldt, Kkj; Stepp, H; Gallwas, Jks
2018-01-01
Objective To identify and save parathyroid glands during thyroidectomy by displaying their autofluorescence. Methods Autofluorescence imaging was carried out during thyroidectomy with and without central lymph node dissection. After visual recognition by the surgeon, the parathyroid glands and the surrounding tissue were exposed to near-infrared light with a wavelength of 690-770 nm using a modified Karl Storz near infrared/indocyanine green endoscopic system. Parathyroid tissue was expected to show near-infrared autofluorescence at 820 nm, captured in the blue channel of the camera. Results We investigated 41 parathyroid glands from 20 patients; 37 glands were identified correctly based on near-infrared autofluorescence. Neither lymph nodes nor thyroid revealed substantial autofluorescence, and neither did adipose tissue. Conclusions Parathyroid tissue is characterised by autofluorescence in the near-infrared spectrum. This effect can be used to identify and preserve parathyroid glands during thyroidectomy.
2018-02-22
Colors in this image of the Martian moon Deimos indicate a range of surface temperatures detected by observing the moon on February 15, 2018, with the Thermal Emission Imaging System (THEMIS) camera on NASA's Mars Odyssey orbiter. The left edge of the small moon is in darkness, and the right edge in sunlight. Temperature information was derived from thermal-infrared imaging such as the grayscale image shown smaller at lower left with the moon in the same orientation. The color-coding merges information from THEMIS observations made in 10 thermal-infrared wavelength bands. This was the first observation of Deimos by Mars Odyssey; the spacecraft first imaged Mars' other moon, Phobos, on September 29, 2017. Researchers have been using THEMIS to examine Mars since early 2002, but the maneuver turning the orbiter around to point the camera at Phobos was developed only recently. https://photojournal.jpl.nasa.gov/catalog/PIA22250
2018-02-22
Colors in this image of the Martian moon Phobos indicate a range of surface temperatures detected by observing the moon on February 15, 2018, with the Thermal Emission Imaging System (THEMIS) camera on NASA's Mars Odyssey orbiter. The left edge of the small moon is in darkness, and the right edge in sunlight. Phobos has an oblong shape with average diameter of about 14 miles (22 kilometers). Temperature information was derived from thermal-infrared imaging such as the grayscale image shown smaller at lower left with the moon in the same orientation. The color-coding merges information from THEMIS observations made in 10 thermal-infrared wavelength bands. This was the second observation of Phobos by Mars Odyssey; the first was on September 29, 2017. Researchers have been using THEMIS to examine Mars since early 2002, but the maneuver turning the orbiter around to point the camera at Phobos was developed only recently. https://photojournal.jpl.nasa.gov/catalog/PIA22249
Imaging of breast cancer with mid- and long-wave infrared camera.
Joro, R; Lääperi, A-L; Dastidar, P; Soimakallio, S; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Järvenpää, R
2008-01-01
In this novel study the breasts of 15 women with palpable breast cancer were preoperatively imaged with three technically different infrared (IR) cameras - microbolometer (MB), quantum well (QWIP) and photovoltaic (PV) - to compare their ability to differentiate breast cancer from normal tissue. The IR images were processed; the data for frequency analysis were collected from dynamic IR images by pixel-based analysis, and a selectively windowed regional analysis was carried out on each image, based on the angiogenesis and nitric oxide production of cancer tissue causing vasomotor and cardiogenic frequency differences compared to normal tissue. Our results show that the GaAs QWIP camera and the InSb PV camera demonstrate the frequency difference between normal and cancerous breast tissue, the PV camera more clearly. With selected image-processing operations, more detailed frequency analyses could be applied to the suspicious area. The MB camera was not suitable for tissue differentiation, as the difference between noise and effective signal was unsatisfactory.
NASA Astrophysics Data System (ADS)
Källhammer, Jan-Erik; Pettersson, Håkan; Eriksson, Dick; Junique, Stéphane; Savage, Susan; Vieider, Christian; Andersson, Jan Y.; Franks, John; Van Nylen, Jan; Vercammen, Hans; Kvisterøy, Terje; Niklaus, Frank; Stemme, Göran
2006-04-01
Pedestrian fatalities account for around 15% of traffic fatalities in Europe. A proposed EU regulation requires the automotive industry to develop technologies that will substantially decrease the risk to Vulnerable Road Users when hit by a vehicle. Automatic Brake Assist systems, activated by a suitable sensor, will reduce the speed of the vehicle before the impact, independent of any driver interaction. Long Wavelength Infrared technology is an ideal candidate for such sensors, but requires a significant cost reduction. The target cost necessary for automotive series applications is well below the cost of systems available today. Uncooled bolometer arrays are the most mature Long Wave Infrared technology with low-cost potential. Analyses show that sensor size and production yield, along with vacuum packaging and the optical components, are the main cost drivers. A project has been started to design a new Long Wave Infrared system with a tenfold cost-reduction potential, optimized for the pedestrian protection requirement. It will take advantage of progress in Micro Electro-Mechanical Systems and Long Wave Infrared optics to keep the cost down. Deployable and pre-impact braking systems can become effective alternatives to passive impact protection solutions fulfilling the EU pedestrian protection regulation. Low-cost Long Wave Infrared sensors will be an important enabler to make such systems cost competitive, allowing high market penetration.
Optimising Camera Traps for Monitoring Small Mammals
Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce
2013-01-01
Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790
Don't get burned: thermal monitoring of vessel sealing using a miniature infrared camera
NASA Astrophysics Data System (ADS)
Lin, Shan; Fichera, Loris; Fulton, Mitchell J.; Webster, Robert J.
2017-03-01
Miniature infrared cameras have recently come to market in a form factor that facilitates packaging in endoscopic or other minimally invasive surgical instruments. If absolute temperature measurements can be made with these cameras, they may be useful for non-contact monitoring of electrocautery-based vessel sealing, or other thermal surgical processes like thermal ablation of tumors. As a first step in evaluating the feasibility of optical medical thermometry with these new cameras, in this paper we explore how well thermal measurements can be made with them. These cameras measure the raw flux of incoming IR radiation, and we perform a calibration procedure to map their readings to absolute temperature values in the range between 40 and 150 °C. Furthermore, we propose and validate a method to estimate the spatial extent of heat spread created by a cautery tool based on the thermal images.
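The paper does not publish its calibration data, but the general idea of mapping raw microbolometer counts to absolute temperature against a blackbody reference can be sketched as a simple regression; the counts, temperatures and polynomial order below are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

# Hypothetical calibration data: raw camera counts recorded while viewing a
# blackbody reference at known temperatures (all values below are made up).
ref_temp_c = np.array([40, 60, 80, 100, 120, 150], dtype=float)
raw_counts = np.array([3100, 3900, 4800, 5800, 6900, 8700], dtype=float)

# Fit a low-order polynomial mapping raw counts -> temperature in °C.
coeffs = np.polyfit(raw_counts, ref_temp_c, deg=2)
counts_to_temp = np.poly1d(coeffs)

def frame_to_temperature(frame_counts: np.ndarray) -> np.ndarray:
    """Convert a raw thermal frame (counts) to temperature in °C."""
    return counts_to_temp(frame_counts)

# Example: estimate the peak temperature in a (fake) 8x8 frame.
frame = np.full((8, 8), 5200.0)
print(frame_to_temperature(frame).max())
```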
2002-01-17
KENNEDY SPACE CENTER, FLA. -- Workers in the Vertical Processing Facility look over the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. The NICMOS Cooling System is a new experimental system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. It could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of mission STS-109 is scheduled for Feb. 28, 2002
2002-01-17
KENNEDY SPACE CENTER, FLA. -- A closeup view of the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. The NICMOS Cooling System is a new experimental system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. It could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of mission STS-109 is scheduled for Feb. 28, 2002
High-performance camera module for fast quality inspection in industrial printing applications
NASA Astrophysics Data System (ADS)
Fürtler, Johannes; Bodenstorfer, Ernst; Mayer, Konrad J.; Brodersen, Jörg; Heiss, Dorothea; Penz, Harald; Eckel, Christian; Gravogl, Klaus; Nachtnebel, Herbert
2007-02-01
Today, printing products that must meet the highest quality standards, e.g., banknotes, stamps, or vouchers, are automatically checked by optical inspection systems. Typically, the examination of fine details of the print or of security features demands images taken from various perspectives, with different spectral sensitivity (visible, infrared, ultraviolet), and with high resolution. Consequently, the inspection system is equipped with several cameras and has to cope with an enormous data rate to be processed in real time. Hence, it is desirable to move image processing tasks into the camera to reduce the amount of data that has to be transferred to the (central) image processing system. The idea is to transfer only the relevant information, i.e., features of the image instead of the raw image data from the sensor. These features are then further processed. In this paper a color line-scan camera for line rates up to 100 kHz is presented. The camera is based on a commercial CMOS (complementary metal oxide semiconductor) area image sensor and a field programmable gate array (FPGA). It implements the extraction of image features that are well suited to detect print flaws such as blotches of ink, color smears, splashes, spots and scratches. The camera design and several image processing methods implemented on the FPGA are described, including flat field correction, compensation of geometric distortions, color transformation, as well as decimation and neighborhood operations.
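Flat field correction, one of the preprocessing steps listed above, can be sketched off-line as follows; this is a generic dark/flat correction, not the camera's FPGA implementation, and the reference frames, clipping range and synthetic data are assumed for illustration.

```python
import numpy as np

def flat_field_correct(raw: np.ndarray,
                       dark: np.ndarray,
                       flat: np.ndarray) -> np.ndarray:
    """Per-pixel flat field correction: remove fixed-pattern offset and gain.

    raw  -- image to correct
    dark -- frame captured with the sensor unexposed (offset pattern)
    flat -- frame of a uniformly lit target (gain pattern)
    """
    gain = flat.astype(np.float64) - dark
    gain[gain <= 0] = np.nan                      # guard against dead pixels
    corrected = (raw.astype(np.float64) - dark) * np.nanmean(gain) / gain
    return np.clip(np.nan_to_num(corrected), 0, 255).astype(np.uint8)

# Example with synthetic 8-bit data.
rng = np.random.default_rng(0)
dark = rng.integers(2, 6, (4, 4)).astype(np.float64)
flat = 200 + rng.integers(-20, 20, (4, 4)).astype(np.float64)
raw  = rng.integers(50, 180, (4, 4)).astype(np.uint8)
print(flat_field_correct(raw, dark, flat))
```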
Snapshot hyperspectral fovea vision system (HyperVideo)
NASA Astrophysics Data System (ADS)
Kriesel, Jason; Scriven, Gordon; Gat, Nahum; Nagaraj, Sheela; Willson, Paul; Swaminathan, V.
2012-06-01
The development and demonstration of a new snapshot hyperspectral sensor is described. The system is a significant extension of the four dimensional imaging spectrometer (4DIS) concept, which resolves all four dimensions of hyperspectral imaging data (2D spatial, spectral, and temporal) in real-time. The new sensor, dubbed "4×4DIS" uses a single fiber optic reformatter that feeds into four separate, miniature visible to near-infrared (VNIR) imaging spectrometers, providing significantly better spatial resolution than previous systems. Full data cubes are captured in each frame period without scanning, i.e., "HyperVideo". The current system operates up to 30 Hz (i.e., 30 cubes/s), has 300 spectral bands from 400 to 1100 nm (~2.4 nm resolution), and a spatial resolution of 44×40 pixels. An additional 1.4 Megapixel video camera provides scene context and effectively sharpens the spatial resolution of the hyperspectral data. Essentially, the 4×4DIS provides a 2D spatially resolved grid of 44×40 = 1760 separate spectral measurements every 33 ms, which is overlaid on the detailed spatial information provided by the context camera. The system can use a wide range of off-the-shelf lenses and can either be operated so that the fields of view match, or in a "spectral fovea" mode, in which the 4×4DIS system uses narrow field of view optics, and is cued by a wider field of view context camera. Unlike other hyperspectral snapshot schemes, which require intensive computations to deconvolve the data (e.g., Computed Tomographic Imaging Spectrometer), the 4×4DIS requires only a linear remapping, enabling real-time display and analysis. The system concept has a range of applications including biomedical imaging, missile defense, infrared counter measure (IRCM) threat characterization, and ground based remote sensing.
High-Resolution Mars Camera Test Image of Moon (Infrared)
NASA Technical Reports Server (NTRS)
2005-01-01
This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The mission's High Resolution Imaging Science Experiment camera took the image on Sept. 8, 2005, while at a distance of about 10 million kilometers (6 million miles) from the Moon. The dark feature on the right is Mare Crisium. From that distance, the Moon would appear as a star-like point of light to the unaided eye. The test verified the camera's focusing capability and provided an opportunity for calibration. The spacecraft's Context Camera and Optical Navigation Camera also performed as expected during the test. The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across.
Huber, V; Huber, A; Kinna, D; Balboa, I; Collins, S; Conway, N; Drewelow, P; Maggi, C F; Matthews, G F; Meigs, A G; Mertens, Ph; Price, M; Sergienko, G; Silburn, S; Wynn, A; Zastrow, K-D
2016-11-01
The in situ absolute calibration of the JET real-time protection imaging system has been performed for the first time by means of a radiometric light source placed inside the JET vessel and operated by remote handling. The high accuracy of the calibration is confirmed by cross-validation of the near infrared (NIR) cameras against each other, with the thermal IR cameras, and with the beryllium evaporator, which led to the successful protection of the JET first wall during the last campaign. The operation temperature ranges of the NIR protection cameras for the materials used on JET are Be 650-1600 °C, W coating 600-1320 °C, and W 650-1500 °C.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huber, V., E-mail: V.Huber@fz-juelich.de; Huber, A.; Mertens, Ph.
The in situ absolute calibration of the JET real-time protection imaging system has been performed for the first time by means of a radiometric light source placed inside the JET vessel and operated by remote handling. The high accuracy of the calibration is confirmed by cross-validation of the near infrared (NIR) cameras against each other, with the thermal IR cameras, and with the beryllium evaporator, which led to the successful protection of the JET first wall during the last campaign. The operation temperature ranges of the NIR protection cameras for the materials used on JET are Be 650-1600 °C, W coating 600-1320 °C, and W 650-1500 °C.
Colors of active regions on comet 67P
NASA Astrophysics Data System (ADS)
Oklay, N.; Vincent, J.-B.; Sierks, H.; Besse, S.; Fornasier, S.; Barucci, M. A.; Lara, L.; Scholten, F.; Preusker, F.; Lazzarin, M.; Pajola, M.; La Forgia, F.
2015-10-01
The OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) scientific imager (Keller et al. 2007) has been successfully delivering images of comet 67P/Churyumov-Gerasimenko from both its wide angle camera (WAC) and narrow angle camera (NAC) since ESA's Rosetta spacecraft arrived at the comet. Both cameras are equipped with filters covering the wavelength range of about 200 nm to 1000 nm. The comet nucleus has been mapped with different combinations of the filters at resolutions up to 15 cm/px. Besides the determination of the surface morphology in great detail (Thomas et al. 2015), such high-resolution images provided us a means to unambiguously link some activity in the coma to a series of pits on the nucleus surface (Vincent et al. 2015).
Embedded mobile farm robot for identification of diseased plants
NASA Astrophysics Data System (ADS)
Sadistap, S. S.; Botre, B. A.; Pandit, Harshavardhan; Chandrasekhar; Rao, Adesh
2013-07-01
This paper presents the development of a mobile robot used on farms for the identification of diseased plants. It addresses two major aspects of robotics, namely automated navigation and image processing. The robot navigates on the basis of GPS (Global Positioning System) positions and data obtained from IR (infrared) sensors to avoid any obstacles in its path. It uses an image processing algorithm to differentiate between diseased and non-diseased plants. A robotic platform consisting of an ARM9 processor, motor drivers, the robot mechanical assembly, a camera and infrared sensors has been used, built around a Mini2440 board running an embedded Linux OS (Operating System).
Method and apparatus for coherent imaging of infrared energy
Hutchinson, Donald P.
1998-01-01
A coherent camera system performs ranging, spectroscopy, and thermal imaging. Local oscillator radiation is combined with target scene radiation to enable heterodyne detection by the coherent camera's two-dimensional photodetector array. Versatility enables deployment of the system in either a passive mode (where no laser energy is actively transmitted toward the target scene) or an active mode (where a transmitting laser is used to actively illuminate the target scene). The two-dimensional photodetector array eliminates the need to mechanically scan the detector. Each element of the photodetector array produces an intermediate frequency signal that is amplified, filtered, and rectified by the coherent camera's integrated circuitry. By spectroscopic examination of the frequency components of each pixel of the detector array, a high-resolution, three-dimensional or holographic image of the target scene is produced for applications such as air pollution studies, atmospheric disturbance monitoring, and military weapons targeting.
Temperature measurement with industrial color camera devices
NASA Astrophysics Data System (ADS)
Schmidradler, Dieter J.; Berndorfer, Thomas; van Dyck, Walter; Pretschuh, Juergen
1999-05-01
This paper discusses color-camera-based temperature measurement. Usually, visual imaging and infrared image sensing are treated as two separate disciplines. We show that a well-selected color camera device can be a cheaper, more robust and more sophisticated solution for optical temperature measurement in several cases. Herein, only implementation fragments and important restrictions for the sensing element are discussed. Our aim is to draw the reader's attention to the use of visual image sensors for measuring thermal radiation and temperature, and to give reasons for the need for improved technologies for infrared camera devices. With AVL List, our industrial partner, we successfully used the proposed sensor to perform temperature measurements of flames inside the combustion chamber of diesel engines, which finally led to the presented insights.
First light observations with TIFR Near Infrared Imaging Camera (TIRCAM-II)
NASA Astrophysics Data System (ADS)
Ojha, D. K.; Ghosh, S. K.; D'Costa, S. L. A.; Naik, M. B.; Sandimani, P. R.; Poojary, S. S.; Bhagat, S. B.; Jadhav, R. B.; Meshram, G. S.; Bakalkar, C. B.; Ramaprakash, A. N.; Mohan, V.; Joshi, J.
TIFR near infrared imaging camera (TIRCAM-II) is based on the Aladdin III Quadrant InSb focal plane array (512×512 pixels; 27.6 μm pixel size; sensitive between 1 - 5.5 μm). TIRCAM-II had its first engineering run with the 2 m IUCAA telescope at Girawali during February - March 2011. The first light observations with TIRCAM-II were quite successful. Several infrared standard stars, the Trapezium Cluster in the Orion region, McNeil's nebula, etc., were observed in the J and K bands and in a narrow band at 3.6 μm (nbL). In the nbL band, some bright stars could be detected from the Girawali site. The performance of TIRCAM-II is discussed in the light of preliminary observations in near-infrared bands.
NASA Astrophysics Data System (ADS)
Olson, Craig; Theisen, Michael; Pace, Teresa; Halford, Carl; Driggers, Ronald
2016-05-01
The mission of an Infrared Search and Track (IRST) system is to detect and locate (sometimes called find and fix) enemy aircraft at significant ranges. Two extreme opposite examples of IRST applications are 1) long range offensive aircraft detection when electronic warfare equipment is jammed, compromised, or intentionally turned off, and 2) distributed aperture systems where enemy aircraft may be in the proximity of the host aircraft. Past IRST systems have been primarily long range offensive systems that were based on the LWIR second generation thermal imager. The new IRST systems are primarily based on staring infrared focal planes and sensors. In the same manner that FLIR92 did not work well in the design of staring infrared cameras (NVTherm was developed to address staring infrared sensor performance), current modeling techniques do not adequately describe the performance of a staring IRST sensor. There are no standard military IRST models (per AFRL and NAVAIR), and each program appears to perform their own modeling. For this reason, L-3 has decided to develop a corporate model, working with AFRL and NAVAIR, for the analysis, design, and evaluation of IRST concepts, programs, and solutions. This paper provides some of the first analyses in the L-3 IRST model development program for the optimization of staring IRST sensors.
Enhancing swimming pool safety by the use of range-imaging cameras
NASA Astrophysics Data System (ADS)
Geerardyn, D.; Boulanger, S.; Kuijk, M.
2015-05-01
Drowning causes the death of 372,000 people each year worldwide, according to the November 2014 report of the World Health Organization. Currently, most swimming pools rely only on lifeguards to detect drowning people. In some modern swimming pools, camera-based detection systems are being integrated; however, these systems have to be mounted underwater, mostly as a replacement of the underwater lighting. In contrast, we are interested in range-imaging cameras mounted on the ceiling of the swimming pool, which allow swimmers at the surface to be distinguished from drowning people underwater, while keeping a large field of view and minimizing occlusions. However, we have to take into account that the water surface of a swimming pool is not flat but mostly rippled, and that the water is transparent for visible light but less transparent for infrared or ultraviolet light. We investigated the use of different types of 3D cameras to detect objects underwater at different depths and with different amplitudes of surface perturbations. Specifically, we performed measurements with a commercial Time-of-Flight camera, a commercial structured-light depth camera and our own Time-of-Flight system. Our own system uses pulsed Time-of-Flight and emits light at 785 nm. The distances measured between the camera and the object are influenced by the perturbations on the water surface. Due to the timing of our Time-of-Flight camera, our system is theoretically able to minimize the influence of the reflections of a partially reflecting surface. The combination of a post-acquisition filter compensating for the perturbations and the use of a light source with shorter wavelengths to enlarge the depth range can improve on current commercial cameras. As a result, we conclude that low-cost range imagers can increase swimming pool safety by adding a post-processing filter and using a different light source.
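For context, the depth measured by a pulsed Time-of-Flight camera follows directly from the round-trip travel time of the light pulse. The sketch below states only that general relation; the timing values are invented and it is not a model of the authors' specific sensor.

```python
# Minimal sketch of pulsed time-of-flight ranging.
C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_s: float, n_medium: float = 1.0) -> float:
    """Depth in metres from the measured round-trip time of a light pulse.

    n_medium -- refractive index of the medium the pulse travels through
                (about 1.33 for water, which shortens the apparent depth).
    """
    return (C / n_medium) * round_trip_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m in air.
print(tof_depth(20e-9))          # ~3.0 m in air
print(tof_depth(20e-9, 1.33))    # ~2.25 m if the path were entirely in water
```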
Dworak, Volker; Selbeck, Joern; Dammer, Karl-Heinz; Hoffmann, Matthias; Zarezadeh, Ali Akbar; Bobda, Christophe
2013-01-24
The application of (smart) cameras for process control, mapping, and advanced imaging in agriculture has become an element of precision farming that facilitates the conservation of fertilizer, pesticides, and machine time. This technique additionally reduces the amount of energy required in terms of fuel. Although research activities have increased in this field, high camera prices reflect low adaptation to applications in all fields of agriculture. Smart, low-cost cameras adapted for agricultural applications can overcome this drawback. The normalized difference vegetation index (NDVI) for each image pixel is an applicable algorithm to discriminate plant information from the soil background enabled by a large difference in the reflectance between the near infrared (NIR) and the red channel optical frequency band. Two aligned charge coupled device (CCD) chips for the red and NIR channel are typically used, but they are expensive because of the precise optical alignment required. Therefore, much attention has been given to the development of alternative camera designs. In this study, the advantage of a smart one-chip camera design with NDVI image performance is demonstrated in terms of low cost and simplified design. The required assembly and pixel modifications are described, and new algorithms for establishing an enhanced NDVI image quality for data processing are discussed.
Dworak, Volker; Selbeck, Joern; Dammer, Karl-Heinz; Hoffmann, Matthias; Zarezadeh, Ali Akbar; Bobda, Christophe
2013-01-01
The application of (smart) cameras for process control, mapping, and advanced imaging in agriculture has become an element of precision farming that facilitates the conservation of fertilizer, pesticides, and machine time. This technique additionally reduces the amount of energy required in terms of fuel. Although research activities have increased in this field, high camera prices reflect low adaptation to applications in all fields of agriculture. Smart, low-cost cameras adapted for agricultural applications can overcome this drawback. The normalized difference vegetation index (NDVI) for each image pixel is an applicable algorithm to discriminate plant information from the soil background enabled by a large difference in the reflectance between the near infrared (NIR) and the red channel optical frequency band. Two aligned charge coupled device (CCD) chips for the red and NIR channel are typically used, but they are expensive because of the precise optical alignment required. Therefore, much attention has been given to the development of alternative camera designs. In this study, the advantage of a smart one-chip camera design with NDVI image performance is demonstrated in terms of low cost and simplified design. The required assembly and pixel modifications are described, and new algorithms for establishing an enhanced NDVI image quality for data processing are discussed. PMID:23348037
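Both records above describe the same single-chip NDVI camera; the per-pixel NDVI computation they build on can be sketched as follows, with synthetic reflectance values used purely for illustration.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Per-pixel normalized difference vegetation index.

    nir, red -- co-registered near-infrared and red channel images.
    eps      -- small constant to avoid division by zero on dark pixels.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Synthetic example: vegetation pixels (high NIR, low red) score near +1,
# bare soil (similar NIR and red) scores near 0.
nir = np.array([[0.60, 0.55], [0.30, 0.28]])
red = np.array([[0.08, 0.10], [0.25, 0.26]])
print(np.round(ndvi(nir, red), 2))
```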
Confocal Retinal Imaging Using a Digital Light Projector with a Near Infrared VCSEL Source
Muller, Matthew S.; Elsner, Ann E.
2018-01-01
A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1″ LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging. PMID:29899586
The infrared imaging radiometer for PICASSO-CENA
NASA Astrophysics Data System (ADS)
Corlay, Gilles; Arnolfo, Marie-Christine; Bret-Dibat, Thierry; Lifferman, Anne; Pelon, Jacques
2017-11-01
Microbolometers are infrared detectors of an emerging technology, developed mainly in the US and a few other countries over the past few years. The main targets of these developments are low-performance, low-cost military and civilian applications such as survey cameras. Applications in space are now arising thanks to the design simplification and the associated cost reduction allowed by this new technology. Among the four instruments of the PICASSO-CENA payload, the Imaging Infrared Radiometer (IIR) is based on microbolometer technology. An infrared camera in development for the IASI instrument is the core of the IIR. The aim of this paper is to recall the PICASSO-CENA mission goal, to describe the IIR instrument architecture, to highlight its main features and performance, and to give its development status.
NASA Astrophysics Data System (ADS)
Klein, Christopher R.; Kubánek, Petr; Butler, Nathaniel R.; Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Bloom, Joshua S.; Farah, Alejandro; Gehrels, Neil; Georgiev, Leonid; González, J. Jesús; Lee, William H.; Lotkin, Gennadiy N.; Moseley, Samuel H.; Prochaska, J. Xavier; Ramirez-Ruiz, Enrico; Richer, Michael G.; Robinson, Frederick D.; Román-Zúñiga, Carlos; Samuel, Mathew V.; Sparr, Leroy M.; Tucker, Corey; Watson, Alan M.
2012-07-01
The Reionization And Transients InfraRed (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) followup and will provide quasi-simultaneous imaging in ugriZY JH. The optical component uses two 2048 × 2048 pixel Finger Lakes Imaging ProLine detectors, one optimized for the SDSS u, g, and r bands and one optimized for the SDSS i band. The infrared portion incorporates two 2048 × 2048 pixel Teledyne HgCdTe HAWAII-2RG detectors, one with a 1.7-micron cutoff and one with a 2.5-micron cutoff. The infrared detectors are controlled by Teledyne's SIDECAR (System for Image Digitization Enhancement Control And Retrieval) ASICs (Application Specific Integrated Circuits). While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 (JWST ASIC Drive Electronics) interface card and IDE (Integrated Development Environment). Here we present a summary of the software developed to interface the RATIR detectors with Remote Telescope System, 2nd Version (RTS2) software. RTS2 is an integrated open source package for remote observatory control under the Linux operating system and will autonomously coordinate observatory dome, telescope pointing, detector, filter wheel, focus stage, and dewar vacuum compressor operations. Where necessary we have developed custom interfaces between RTS2 and RATIR hardware, most notably for cryogenic focus stage motor drivers and temperature controllers. All detector and hardware interface software developed for RATIR is freely available and open source as part of the RTS2 distribution.
Detection of Humans and Light Vehicles Using Acoustic-to-Seismic Coupling
2009-08-31
microphones, video cameras (regular and infrared), magnetic sensors, and active Doppler radar and sonar systems. These sensors could be located at... sonar systems due to dramatic absorption/reflection of electromagnetic/ultrasonic waves [8,9]. ...engine was turned off, and the car continued moving. This eliminated the engine sound. A PCB microphone, 377B41, with preamplifier, 426A30, and with
CATAVIÑA: new infrared camera for OAN-SPM
NASA Astrophysics Data System (ADS)
Iriarte, Arturo; Cruz-González, Irene; Martínez, Luis A.; Tinoco, Silvio; Lara, Gerardo; Ruiz, Elfego; Sohn, Erika; Bernal, Abel; Angeles, Fernando; Moreno, Arturo; Murillo, Francisco; Langarica, Rosalía; Luna, Esteban; Salas, Luis; Cajero, Vicente
2006-06-01
CATAVIÑA is a near-infrared camera system to be operated in conjunction with the existing multi-purpose near-infrared optical bench "CAMALEON" at the OAN-SPM. Observing modes include direct imaging, spectroscopy, Fabry-Perot interferometry and polarimetry. This contribution focuses on the optomechanics and the detector controller of CATAVIÑA, which is planned to start operating later in 2006. The camera consists of an 8 inch LN2 dewar containing a 10-filter carousel, a radiation baffle and the detector circuit board mount. The system is based on a Rockwell 1024x1024 HgCdTe (HAWAII-I) FPA, operating in the 1 to 2.5 micron window. The detector controller/readout system was designed and developed at the UNAM Instituto de Astronomia. It is based on five Texas Instruments DSK digital signal processor (DSP) modules. One module generates the detector and ADC-system control signals, while the remaining four are in charge of the acquisition of each of the detector's quadrants. Each DSP has a built-in expanded memory module in order to store more than one image. The detector read-out and signal driver subsystems are mounted onto the dewar in a "back-pack" fashion, each containing four independent pre-amplifiers, converters and signal drivers that communicate through fiber optics with their respective DSPs. This system has the possibility of programming the offset input voltage and the converter gain. The controller software architecture is based on a client/server model. The client sends commands through the TCP/IP protocol and acquires the image. The server consists of a microcomputer with an embedded Linux operating system, which runs the main program that receives the user commands and interacts with the timing and acquisition DSPs. The observer's interface allows for several readout and image processing modes.
NASA Technical Reports Server (NTRS)
Gilbrech, Richard J.; McManamen, John P.; Wilson, Timmy R.; Robinson, Frank; Schoren, William R.
2004-01-01
CALIPSO is a joint science mission between the CNES, LaRC and GSFC. It was selected as an Earth System Science Pathfinder satellite mission in December 1998 to address the role of clouds and aerosols in the Earth's radiation budget. The spacecraft includes a NASA light detecting and ranging (LIDAR) instrument, a NASA wide-field camera and a CNES imaging infrared radiometer. The scope of this effort was a review of the Proteus propulsion bus design and an assessment of the potential for personnel exposure to hydrazine propellant.
NASA Technical Reports Server (NTRS)
Gilbrech, Richard J.; McManamen, John P.; Wilson, Timmy R.; Robinson, Frank; Schoren, William R.
2005-01-01
CALIPSO is a joint science mission between the CNES, LaRC and GSFC. It was selected as an Earth System Science Pathfinder satellite mission in December 1998 to address the role of clouds and aerosols in the Earth's radiation budget. The spacecraft includes a NASA light detecting and ranging (LIDAR) instrument, a NASA wide-field camera and a CNES imaging infrared radiometer. The scope of this effort was a review of the Proteus propulsion bus design and an assessment of the potential for personnel exposure to hydrazine propellant.
Research on camera on orbit radial calibration based on black body and infrared calibration stars
NASA Astrophysics Data System (ADS)
Wang, YuDu; Su, XiaoFeng; Zhang, WanYing; Chen, FanSheng
2018-05-01
Affected by the launch process and the space environment, the response of a space camera is inevitably attenuated, so it is necessary for a space camera to undergo on-orbit radiometric calibration. In this paper, we propose a calibration method based on accurate infrared standard stars to increase the precision of infrared radiation measurements. As stars can be considered point targets, we use them as the radiometric calibration source and establish a Taylor expansion method and an energy extrapolation model based on the WISE and 2MASS catalogs. We then update the calibration results obtained from the blackbody. Finally, the calibration mechanism is designed and the design is verified by an on-orbit test. The experimental calibration results show that the irradiance extrapolation error is about 3% and the accuracy of the calibration methods is about 10%, which satisfies the requirements of on-orbit calibration.
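One ingredient of any star-based radiometric calibration is converting catalogue magnitudes into flux densities before extrapolating to the camera's band; the sketch below shows that step only, using approximate published 2MASS zero points, and the example magnitude is made up. It is not the paper's Taylor-expansion extrapolation model.

```python
# Approximate 2MASS zero-point flux densities (Cohen et al. 2003), in Jy.
ZERO_POINT_JY = {"J": 1594.0, "H": 1024.0, "Ks": 666.7}

def flux_density_jy(mag: float, band: str) -> float:
    """In-band flux density (Jy) of a star from its catalogue magnitude."""
    return ZERO_POINT_JY[band] * 10.0 ** (-0.4 * mag)

# A hypothetical Ks = 5.0 calibration star:
print(f"{flux_density_jy(5.0, 'Ks'):.3f} Jy")   # about 6.7 Jy
```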
Ensuring long-term stability of infrared camera absolute calibration.
Kattnig, Alain; Thetas, Sophie; Primot, Jérôme
2015-07-13
Absolute calibration of cryogenic 3-5 µm and 8-10 µm infrared cameras is notoriously unstable and thus has to be repeated before actual measurements. Moreover, the signal-to-noise ratio of the imagery is lowered, decreasing its quality. These performance degradations strongly lessen the suitability of infrared imaging. These defects are often blamed on detectors reaching a different "response state" after each return to cryogenic conditions, even after accounting for the detrimental effects of imperfect stray-light management. We show here that the detectors are not to blame and that the culprit can also dwell in the proximity electronics. We identify an unexpected source of instability in the initial voltage of the integrating capacitor of the detectors. We then show that this parameter can be easily measured and taken into account. In this way we demonstrate that a one-month-old calibration of a 3-5 µm camera has retained its validity.
2016-04-15
The newest instrument, an infrared camera called the High-resolution Airborne Wideband Camera-Plus (HAWC+), was installed on the Stratospheric Observatory for Infrared Astronomy, SOFIA, in April of 2016. This is the only currently operating astronomical camera that makes images using far-infrared light, allowing studies of low-temperature early stages of star and planet formation. HAWC+ includes a polarimeter, a device that measures the alignment of incoming light waves. With the polarimeter, HAWC+ can map magnetic fields in star forming regions and in the environment around the supermassive black hole at the center of the Milky Way galaxy. These new maps can reveal how the strength and direction of magnetic fields affect the rate at which interstellar clouds condense to form new stars. A team led by C. Darren Dowell at NASA’s Jet Propulsion Laboratory and including participants from more than a dozen institutions developed the instrument.
Bennett, Charles L.
1996-01-01
An imaging Fourier transform spectrometer (10, 210) having a Fourier transform infrared spectrometer (12) providing a series of images (40) to a focal plane array camera (38). The focal plane array camera (38) is clocked to a multiple of zero crossing occurrences as caused by a moving mirror (18) of the Fourier transform infrared spectrometer (12) and as detected by a laser detector (50) such that the frame capture rate of the focal plane array camera (38) corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer (12). The images (40) are transmitted to a computer (45) for processing such that representations of the images (40) as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor (60) or otherwise stored and manipulated by the computer (45).
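The frame-clocking idea in this record (capturing frames at a multiple of the laser zero-crossing rate) can be illustrated numerically; the fringe frequency and the clock multiple below are assumptions for illustration, and this is not the patented electronics.

```python
import numpy as np

# Simulate a laser reference fringe signal from the moving mirror and count
# its zero crossings; the camera frame clock is then a multiple of that rate.
t = np.linspace(0.0, 1e-2, 100_000)                 # 10 ms of samples
fringe = np.sin(2 * np.pi * 5_000 * t)              # assumed 5 kHz fringe signal

signs = np.signbit(fringe).astype(np.int8)
crossings = np.where(np.diff(signs) != 0)[0]         # sample indices of sign changes
zero_crossing_rate = len(crossings) / t[-1]          # crossings per second

frames_per_crossing = 2                              # assumed clock multiple
print(f"zero-crossing rate ~ {zero_crossing_rate:.0f}/s, "
      f"camera clock ~ {frames_per_crossing * zero_crossing_rate:.0f} frames/s")
```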
CANICA: The Cananea Near-Infrared Camera at the 2.1 m OAGH Telescope
NASA Astrophysics Data System (ADS)
Carrasco, L.; Hernández Utrera, O.; Vázquez, S.; Mayya, Y. D.; Carrasco, E.; Pedraza, J.; Castillo-Domínguez, E.; Escobedo, G.; Devaraj, R.; Luna, A.
2017-10-01
The Cananea near-infrared camera (CANICA) is an instrument commissioned at the 2.12 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA operates in the near-infrared in multiple bands, including the J (1.24 μm), H (1.63 μm) and K' (2.12 μm) broad bands. CANICA is located at the Ritchey-Chrétien focal plane of the telescope, reimaging the f/12 beam into an f/6 beam. The detector is a 1024 × 1024 HgCdTe HAWAII array with an 18.5 μm pixel size, covering a field of view of 5.5 × 5.5 arcmin2, for a plate scale of 0.32 arcsec/pixel. The camera is enclosed in a cryostat, cooled with liquid nitrogen to 77 K. The cryostat contains the collimator, two 15-position filter wheels, a single fixed reimaging optic and the detector.
Yang, Hualei; Yang, Xi; Heskel, Mary; Sun, Shucun; Tang, Jianwu
2017-04-28
Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance in the visible and near-infrared (NIR) bands at high spatial and temporal resolution, and found that the infrared-camera-based NDVI (camera-NDVI) agreed well with the leaf expansion process measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). We found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though the relationship between camera-NDVI and LAI was weaker. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near-surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
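The linear relationships reported above are ordinary least-squares fits between camera-NDVI and leaf properties; a minimal sketch is shown below, with fabricated example values standing in for the measured data.

```python
import numpy as np
from scipy import stats

# Fabricated paired observations of camera-NDVI and leaf chlorophyll
# concentration, used only to illustrate the regression step.
camera_ndvi = np.array([0.21, 0.35, 0.48, 0.55, 0.63, 0.70, 0.74])
chlorophyll = np.array([11.0, 18.5, 24.0, 28.0, 33.5, 37.0, 40.5])  # e.g. µg/cm^2

fit = stats.linregress(camera_ndvi, chlorophyll)
print(f"slope={fit.slope:.1f}, intercept={fit.intercept:.1f}, "
      f"r^2={fit.rvalue**2:.3f}")
```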
Students' framing of laboratory exercises using infrared cameras
NASA Astrophysics Data System (ADS)
Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.
2015-12-01
Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the predict-observe-explain approach of White and Gunstone. The activities involved central thermal concepts that focused on heat conduction and dissipative processes such as friction and collisions. Students' interactions within each activity were videotaped and the analysis focuses on how a purposefully selected group of three students engaged with the exercises. As the basis for an interpretative study, a "thick" narrative description of the students' epistemological and conceptual framing of the exercises and how they took advantage of the disciplinary affordance of IR cameras in the thermal domain is provided. Findings include that the students largely shared their conceptual framing of the four activities, but differed among themselves in their epistemological framing, for instance, in how far they found it relevant to digress from the laboratory instructions when inquiring into thermal phenomena. In conclusion, the study unveils the disciplinary affordances of infrared cameras, in the sense of their use in providing access to knowledge about macroscopic thermal science.
Final Technical Report. Training in Building Audit Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brosemer, Kathleen
In 2011, the Tribe proposed and was awarded the Training in Building Audit Technologies grant from the DOE in the amount of $55,748 to contract for training programs for infrared cameras, blower door technology applications and building systems. The coursework consisted of: Infrared Camera Training: Level I - Thermal Imaging for Energy Audits; Blower Door Analysis and Building-As-A-System Training, Building Performance Institute (BPI) Building Analyst; Building Envelope Training, Building Performance Institute (BPI) Envelope Professional; and Audit/JobFLEX Tablet Software. Competitive procurement of the training contractor resulted in lower costs, allowing the Tribe to request and receive DOE approval to additionally purchase energy audit equipment and contract for residential energy audits of 25 low-income Tribal Housing units. Sault Tribe personnel received field training to supplement the classroom instruction on proper use of the energy audit equipment. Field experience was provided through the second DOE energy audits grant, allowing Sault Tribe personnel to join the contractor, Building Science Academy, in conducting 25 residential energy audits of low-income Tribal Housing units.
Fiber-Optic Surface Temperature Sensor Based on Modal Interference.
Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc
2016-07-28
Spatially-integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation, by means of cost-effective image-processing of few-mode interference patterns. New developments in the field of sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on the image processing of interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system, consisting of an infrared camera and precision temperature probe. As a result, the optical technique is in line with the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: low heat capacity temperature measurement errors, easier spatial deployment, and replacement of multiple angle infrared camera shooting and the continuous monitoring of surfaces that are not visually accessible.
A Wide-field Camera and Fully Remote Operations at the Wyoming Infrared Observatory
NASA Astrophysics Data System (ADS)
Findlay, Joseph R.; Kobulnicky, Henry A.; Weger, James S.; Bucher, Gerald A.; Perry, Marvin C.; Myers, Adam D.; Pierce, Michael J.; Vogel, Conrad
2016-11-01
Upgrades at the 2.3 meter Wyoming Infrared Observatory telescope have provided the capability for fully remote operations by a single operator from the University of Wyoming campus. A line-of-sight 300 Megabit s-1 11 GHz radio link provides high-speed internet for data transfer and remote operations that include several realtime video feeds. Uninterruptable power is ensured by a 10 kVA battery supply for critical systems and a 55 kW autostart diesel generator capable of running the entire observatory for up to a week. The construction of a new four-element prime-focus corrector with fused-silica elements allows imaging over a 40 arcmin field of view with a new 4096 x 4096 UV-sensitive prime-focus camera and filter wheel. A new telescope control system facilitates the remote operations model and provides 20 arcsec rms pointing over the usable sky. Taken together, these improvements pave the way for a new generation of sky surveys supporting space-based missions and flexible-cadence observations advancing emerging astrophysical priorities such as planet detection, quasar variability, and long-term time-domain campaigns.
NASA Astrophysics Data System (ADS)
Cabib, Dario; Lavi, Moshe; Gil, Amir; Milman, Uri
2011-06-01
Since the early '90s CI has been involved in the development of FTIR hyperspectral imagers based on a Sagnac or similar type of interferometer. CI also pioneered the commercialization of such hyperspectral imagers in those years. After having developed a visible version based on a CCD in the early '90s (taken on by a spin-off company for biomedical applications) and a 3 to 5 micron infrared version based on a cooled InSb camera in 2008, it is now developing an LWIR version based on an uncooled camera for the 8 to 14 micron range. In this paper we will present design features and expected performance of the system. The instrument is designed to be rugged for field use and to yield a relatively high spectral resolution of 8 cm-1, an IFOV of 0.5 mrad, a 640x480 pixel spectral cube in less than a minute, and a noise equivalent spectral radiance of 40 nW/cm2/sr/cm-1 at 10 μm. The actually measured performance will be presented in a future paper.
Progress with the lick adaptive optics system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavel, D T; Olivier, S S; Bauman, B
2000-03-01
Progress and results of observations with the Lick Observatory Laser Guide Star Adaptive Optics System are presented. This system is optimized for diffraction-limited imaging in the near infrared, 1-2 micron wavelength bands. We describe our development efforts in a number of component areas, including a redesign of the optical bench layout, the commissioning of a new infrared science camera, and improvements to the software and user interface. There is also an ongoing effort to characterize the system performance with both natural and laser guide stars and to fold this data into a refined system model. Such a model can be used to help plan future observations, for example, predicting the point-spread function as a function of seeing and guide star magnitude.
New-style defect inspection system of film
NASA Astrophysics Data System (ADS)
Liang, Yan; Liu, Wenyao; Liu, Ming; Lee, Ronggang
2002-09-01
An inspection system has been developed for on-line detection of film defects, based on a combination of photoelectric imaging and digital image processing. The system runs at speeds of up to 60 m/min. The moving film is illuminated by an LED array emitting uniform infrared light (peak wavelength λp = 940 nm), and infrared images are obtained with a high-quality, high-speed CCD camera. The application software, based on Visual C++ 6.0 under Windows, processes images in real time by means of algorithms such as median filtering, edge detection, and projection. The system is made up of four modules, which are introduced in detail in the paper. On-line experimental results show that the inspection system can recognize defects precisely at high speed and run reliably in practical applications.
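A minimal sketch of the processing chain named in the abstract (median filtering, edge detection, projection), using generic scipy/numpy operations; the kernel size and the peak threshold are assumptions for illustration, not parameters from the paper.

import numpy as np
from scipy import ndimage

def detect_defect_columns(img: np.ndarray) -> np.ndarray:
    """Return indices of image columns likely to contain film defects."""
    smoothed = ndimage.median_filter(img.astype(float), size=3)   # suppress impulse noise
    edges = np.hypot(ndimage.sobel(smoothed, axis=0),
                     ndimage.sobel(smoothed, axis=1))             # edge magnitude
    profile = edges.sum(axis=0)                                   # project edge energy onto columns
    return np.flatnonzero(profile > profile.mean() + 3.0 * profile.std())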
Li, Jin; Liu, Zilong
2017-07-24
Remote sensing cameras in the visible/near infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e. image quality, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself, including its optical system, image sensor, and electronics, limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which is stable and invariant to changes in ground targets, atmosphere, and environment on orbit or on the ground and depends only on the camera itself, is extracted using a pixel optical focal plane (PFP). The PFP produces multiple spatial frequency targets, which are used to calculate the IMTF at different frequencies. The resulting IMTF, in combination with a constrained least-squares filter, compensates for the IMTF, which amounts to removing the imaging degradation caused by the camera itself. This method is experimentally confirmed. Experiments on an on-orbit panchromatic camera indicate that the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56 compared to the case where the IMTF is not used. This opens a door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
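The IMTF extraction from the pixel optical focal plane is specific to the authors' hardware, but the compensation step, a constrained least-squares filter applied in the frequency domain, is a standard restoration operation and can be sketched as follows; the Laplacian regularizer and the weight gamma are generic textbook choices, not values from the paper.

import numpy as np

def cls_restore(image: np.ndarray, mtf: np.ndarray, gamma: float = 0.01) -> np.ndarray:
    """Constrained least-squares compensation of a measured MTF.
    image : observed camera frame
    mtf   : system MTF sampled on the same 2-D frequency grid as fft2(image)
    gamma : weight of the Laplacian smoothness constraint
    """
    lap = np.zeros_like(image, dtype=float)
    lap[:3, :3] = [[0, -1, 0], [-1, 4, -1], [0, -1, 0]]           # high-pass constraint kernel
    P = np.fft.fft2(np.roll(lap, (-1, -1), axis=(0, 1)))          # center the kernel at the origin
    G = np.fft.fft2(image)
    H = mtf.astype(complex)
    F = np.conj(H) * G / (np.abs(H) ** 2 + gamma * np.abs(P) ** 2)
    return np.real(np.fft.ifft2(F))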
LIFTING THE VEIL OF DUST TO REVEAL THE SECRETS OF SPIRAL GALAXIES
NASA Technical Reports Server (NTRS)
2002-01-01
Astronomers have combined information from the NASA Hubble Space Telescope's visible- and infrared-light cameras to show the hearts of four spiral galaxies peppered with ancient populations of stars. The top row of pictures, taken by a ground-based telescope, represents complete views of each galaxy. The blue boxes outline the regions observed by the Hubble telescope. The bottom row represents composite pictures from Hubble's visible- and infrared-light cameras, the Wide Field and Planetary Camera 2 (WFPC2) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS). Astronomers combined views from both cameras to obtain the true ages of the stars surrounding each galaxy's bulge. The Hubble telescope's sharper resolution allows astronomers to study the intricate structure of a galaxy's core. The galaxies are ordered by the size of their bulges. NGC 5838, an 'S0' galaxy, is dominated by a large bulge and has no visible spiral arms; NGC 7537, an 'Sbc' galaxy, has a small bulge and loosely wound spiral arms. Astronomers think that the structure of NGC 7537 is very similar to our Milky Way. The galaxy images are composites made from WFPC2 images taken with blue (4445 Angstroms) and red (8269 Angstroms) filters, and NICMOS images taken in the infrared (16,000 Angstroms). They were taken in June, July, and August of 1997. Credits for the ground-based images: Allan Sandage (The Observatories of the Carnegie Institution of Washington) and John Bedke (Computer Sciences Corporation and the Space Telescope Science Institute) Credits for WFPC2 and NICMOS composites: NASA, ESA, and Reynier Peletier (University of Nottingham, United Kingdom)
Focal plane arrays based on Type-II indium arsenide/gallium antimonide superlattices
NASA Astrophysics Data System (ADS)
Delaunay, Pierre-Yves
The goal of this work is to demonstrate that Type-II InAs/GaSb superlattices can perform high quality infrared imaging from the middle (MWIR) to the long (LWIR) wavelength infrared range. Theoretically, focal plane arrays (FPAs) based on this technology could be operated at higher temperatures, with lower dark currents, than the leading HgCdTe platform. This effort focuses on the fabrication of MWIR and LWIR FPAs with performance similar to existing infrared cameras. Some applications in the MWIR require fast, sensitive imagers able to sustain frame rates up to 100 Hz. Such speed can only be achieved with photon detectors. However, these cameras need to be operated below 170 K. Current research in this spectral band focuses on increasing the operating temperature of the FPA to a point where cooling could be performed with compact and reliable thermoelectric coolers. A Type-II superlattice was used to demonstrate a camera that presented performance similar to HgCdTe and that could be operated up to room temperature. At 80 K, the camera could detect temperature differences as low as 10 mK for an integration time shorter than 25 ms. In the LWIR, the electrical performance of Type-II photodiodes is mainly limited by surface leakage. Aggressive processing steps such as hybridization and underfill can increase the dark current of the devices by several orders of magnitude. New cleaning and passivation techniques were used to reduce the dark current of FPA diodes by two orders of magnitude. The absorbing GaSb substrate was also removed to increase the quantum efficiency of the devices up to 90%. At 80 K, an FPA with a 9.6 μm 50%-cutoff in responsivity was able to detect temperature differences as low as 19 mK, limited only by the performance of the testing system. The non-uniformity in responsivity reached 3.8% for a 98.2% operability. The third generation of infrared cameras is based on multi-band imaging in order to improve the recognition capabilities of the imager. Preliminary detectors based on back-to-back diodes presented performance similar to single-color devices; the quantum efficiency was measured to be higher than 40% for both bands. Preliminary imaging results were demonstrated in the LWIR.
Very low cost real time histogram-based contrast enhancer utilizing fixed-point DSP processing
NASA Astrophysics Data System (ADS)
McCaffrey, Nathaniel J.; Pantuso, Francis P.
1998-03-01
A real-time contrast enhancement system utilizing histogram-based algorithms has been developed to operate on standard composite video signals. This low-cost DSP-based system is designed with fixed-point algorithms and an off-chip look-up table (LUT) to reduce the cost considerably over other contemporary approaches. This paper describes several real-time contrast enhancing systems developed at the Sarnoff Corporation for high-speed visible and infrared cameras. The fixed-point enhancer was derived from these high performance cameras. The enhancer digitizes analog video and spatially subsamples the stream to quantify the scene's luminance. Simultaneously, the video is streamed through a LUT that has been programmed with the previous calculation. Reducing division operations by subsampling reduces calculation cycles and also allows the processor to be used with cameras of nominal resolutions. All values are written to the LUT during blanking so no frames are lost. The enhancer measures 13 cm × 6.4 cm × 3.2 cm, operates off 9 VAC and consumes 12 W. This processor is small and inexpensive enough to be mounted with field-deployed security cameras and can be used for surveillance, video forensics and real-time medical imaging.
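The abstract does not spell out the histogram algorithm beyond the subsample-then-LUT flow, so the sketch below uses plain histogram equalization on a subsampled 8-bit frame to build the table; the subsampling factor and bit depth are assumptions.

import numpy as np

def build_equalization_lut(frame: np.ndarray, subsample: int = 4) -> np.ndarray:
    """Build a 256-entry LUT from a subsampled luminance histogram
    (frame assumed to be uint8), analogous to reprogramming the
    off-chip LUT during video blanking."""
    pixels = frame[::subsample, ::subsample].ravel()
    hist = np.bincount(pixels, minlength=256).astype(float)
    cdf = np.cumsum(hist) / pixels.size
    return np.round(255.0 * cdf).astype(np.uint8)

def enhance(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Stream the full-resolution frame through the previously built LUT."""
    return lut[frame]

# The LUT built from frame N would be applied to frame N+1, so no frames are lost.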
A New Era in Solar Thermal-IR Astronomy: the NSO Array Camera (NAC) on the McMath-Pierce Telescope
NASA Astrophysics Data System (ADS)
Ayres, T.; Penn, M.; Plymate, C.; Keller, C.
2008-09-01
The U.S. National Solar Observatory Array Camera (NAC) is a cryogenically cooled 1K×1K InSb "Aladdin" array that recently became operational at the McMath-Pierce facility on Kitt Peak, a high dry site in the southwest U.S. (Arizona). The new camera is similar to those already incorporated into instruments on nighttime telescopes, and has unprecedented sensitivity, low noise, and excellent cosmetics compared with the Amber Engineering (AE) device it replaces. (The latter was scavenged from a commercial surveillance camera in the 1990s: only 256×256 format, high noise, and annoying flatfield structure.) The NAC focal plane is maintained at 30 K by a mechanical closed-cycle helium cooler, dispensing with the cumbersome pumped solid-N2 40 K system used previously with the AE camera. The NAC linearity has been verified for exposures as short as 1 ms, although latency in the data recording holds the maximum frame rate to about 8 Hz (in "streaming mode"). The camera is run in tandem with the Infrared Adaptive Optics (IRAO) system. Utilizing a 37-actuator deformable mirror, IRAO can, under moderate seeing conditions, correct the telescope image to the diffraction limit longward of 2.3 μm (if a suitable high-contrast target is available: the IR granulation has proven too bland to reliably track). IRAO also provides fine control over the solar image for spatial scanning in long-slit mode with the 14 m vertical "Main" spectrograph (MS). A 1′×1′ area scan, with 0.5″ steps orthogonal to the slit direction, requires less than half a minute, much shorter than p-mode and granulation evolution time scales. A recent engineering test run, in April 2008, utilized NAC/IRAO/MS to capture the fundamental (4.6 μm) and first-overtone (2.3 μm) rovibrational bands of CO, including maps of quiet regions, drift scans along the equatorial limbs (to measure the off-limb molecular emissions), and imaging of a fortuitous small sunspot pair, a final gasp, perhaps, of Cycle 23. Future work with the NAC will emphasize pathfinding toward the next generation of IR imaging spectrometers for the Advanced Technology Solar Telescope, whose 4 m aperture finally will bring sorely needed high spatial resolution to daytime infrared astronomy. In the meantime, the NAC is available to qualified solar physicists from around the world to conduct forefront research in the 1-5 μm region, on the venerable, but infrared friendly, McMath-Pierce telescope.
Low cost infrared and near infrared sensors for UAVs
NASA Astrophysics Data System (ADS)
Aden, S. T.; Bialas, J. P.; Champion, Z.; Levin, E.; McCarty, J. L.
2014-11-01
Thermal remote sensing has a wide range of applications, though the extent of its use is inhibited by cost. Robotic and computer components are now widely available to consumers on a scale that makes thermal data a readily accessible resource. In this project, thermal imagery collected via a lightweight remote sensing Unmanned Aerial Vehicle (UAV) was used to create a surface temperature map for the purpose of providing wildland firefighting crews with a cost-effective and time-saving resource. The UAV system proved to be flexible, allowing for customized sensor packages to be designed that could include visible or infrared cameras, GPS, temperature sensors, and rangefinders, in addition to many data management options. Altogether, such a UAV system could be used to rapidly collect thermal and aerial data, with a geographic accuracy of less than one meter.
Method and apparatus for implementing material thermal property measurement by flash thermal imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Jiangang
A method and apparatus are provided for implementing measurement of material thermal properties, including measurement of the thermal effusivity of a coating and/or film or of a bulk material of uniform property. The test apparatus includes an infrared camera; a data acquisition and processing computer coupled to the infrared camera for acquiring and processing thermal image data; and a flash lamp that provides an input of heat onto the surface of a two-layer sample, with an enhanced optical filter covering the flash lamp to attenuate the entire infrared wavelength range while a series of thermal images is taken of the surface of the two-layer sample.
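For context on how surface temperature decay relates to effusivity, the sketch below evaluates the standard one-dimensional semi-infinite flash-heating model; the two-layer analysis described in the record extends this basic relation, and the symbols here are generic, not taken from the patent.

import numpy as np

def surface_temperature_rise(t: np.ndarray, Q: float, effusivity: float) -> np.ndarray:
    """Semi-infinite 1-D model after an instantaneous flash of energy Q
    (J/m^2): dT(t) = Q / (e * sqrt(pi * t)), where e = sqrt(k * rho * c_p)
    is the thermal effusivity. Fitting measured infrared-camera cooling
    curves to this form yields an effusivity estimate."""
    return Q / (effusivity * np.sqrt(np.pi * t))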
Handheld hyperspectral imager system for chemical/biological and environmental applications
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Piatek, Bob
2004-08-01
A small, handheld, battery-operated imaging infrared spectrometer, Sherlock, has been developed by Pacific Advanced Technology and was field tested in early 2003. The Sherlock spectral imaging camera has been designed for remote gas leak detection; however, the architecture of the camera is versatile enough that it can be applied to numerous other applications such as homeland security, chemical/biological agent detection, medical and pharmaceutical applications, as well as standard research and development. This paper describes the Sherlock camera and its theory of operation, shows current applications, and touches on potential future applications for the camera. The Sherlock has an embedded PowerPC and performs real-time image-processing functions in an embedded FPGA. The camera has a built-in LCD display as well as output to a standard monitor or NTSC display. It has several I/O ports (Ethernet, FireWire, RS232) and thus can be easily controlled from a remote location. In addition, software upgrades can be performed over Ethernet, eliminating the need to send the camera back to the factory for a retrofit. Using the USB port, a mouse and keyboard can be connected and the camera can be used in a laboratory environment as a stand-alone imaging spectrometer.
Performance of Backshort-Under-Grid Kilopixel TES Arrays for HAWC+
NASA Technical Reports Server (NTRS)
Staguhn, J. G.; Benford, D. J.; Dowell, C. D.; Fixsen, D. J.; Hilton, G. C.; Irwin, K. D.; Jhabvala, C. A.; Maher, S. F.; Miller, T. M.; Moseley, S. H.;
2016-01-01
We present results from laboratory detector characterizations of the first kilopixel BUG arrays for the High-resolution Wideband Camera Plus (HAWC+), which is the imaging far-infrared polarimeter camera for the Stratospheric Observatory for Infrared Astronomy (SOFIA). Our tests demonstrate that the array performance is consistent with the predicted properties. Here, we highlight results obtained for the thermal conductivity, noise performance, detector speed, and first optical results demonstrating the pixel yield of the arrays.
Sentinel lymph node detection in gynecologic malignancies by a handheld fluorescence camera
NASA Astrophysics Data System (ADS)
Hirsch, Ole; Szyc, Lukasz; Muallem, Mustafa Zelal; Ignat, Iulia; Chekerov, Radoslav; Macdonald, Rainer; Sehouli, Jalid; Braicu, Ioana; Grosenick, Dirk
2017-02-01
Near-infrared fluorescence imaging using indocyanine green (ICG) as a tracer is a promising technique for mapping the lymphatic system and for detecting sentinel lymph nodes (SLN) during cancer surgery. In our feasibility study we have investigated the application of a custom-made handheld fluorescence camera system for the detection of lymph nodes in gynecological malignancies. It comprises a low cost CCD camera with enhanced NIR sensitivity and two groups of LEDs emitting at wavelengths of 735 nm and 830 nm for interlaced recording of fluorescence and reflectance images of the tissue, respectively. With the help of our system, surgeons can observe fluorescent tissue structures overlaid onto the anatomical image on a monitor in real-time. We applied the camera system for intraoperative lymphatic mapping in 5 patients with vulvar cancer, 5 patients with ovarian cancer, 3 patients with cervical cancer, and 3 patients with endometrial cancer. ICG was injected at four loci around the primary malignant tumor during surgery. After a residence time of typically 15 min fluorescence images were taken in order to visualize the lymph nodes closest to the carcinomas. In cases with vulvar cancer about half of the lymph nodes detected by routinely performed radioactive SLN mapping have shown fluorescence in vivo as well. In the other types of carcinomas several lymph nodes could be detected by fluorescence during laparotomy. We conclude that our low cost camera system has sufficient sensitivity for lymphatic mapping during surgery.
Low-cost far infrared bolometer camera for automotive use
NASA Astrophysics Data System (ADS)
Vieider, Christian; Wissmar, Stanley; Ericsson, Per; Halldin, Urban; Niklaus, Frank; Stemme, Göran; Källhammer, Jan-Erik; Pettersson, Håkan; Eriksson, Dick; Jakobsen, Henrik; Kvisterøy, Terje; Franks, John; VanNylen, Jan; Vercammen, Hans; VanHulsel, Annick
2007-04-01
A new low-cost long-wavelength infrared bolometer camera system is under development. It is designed for use with an automatic vision algorithm system as a sensor to detect vulnerable road users in traffic. Looking 15 m in front of the vehicle, it can, in the case of an unavoidable impact, activate a brake assist system or other deployable protection system. To achieve our cost target of below €100 for the sensor system we evaluated the required performance and were able to relax the sensitivity to 150 mK and the pixel resolution to 80 × 30. We address all the main cost drivers, such as sensor size and production yield, along with vacuum packaging, optical components and large-volume manufacturing technologies. The detector array is based on a new type of high-performance thermistor material: very thin Si/SiGe single-crystal multi-layers grown epitaxially. Due to the resulting valence barriers a high temperature coefficient of resistance is achieved (3.3%/K). Simultaneously, the high-quality crystalline material provides very low 1/f-noise characteristics and uniform material properties. The thermistor material is transferred from the original substrate wafer to the read-out circuit using adhesive wafer bonding and subsequent thinning. Bolometer arrays can then be fabricated using industry-standard MEMS processes and materials. The inherently good detector performance allows us to relax the vacuum requirement, and we can implement the wafer-level vacuum packaging technology used in established automotive sensor fabrication. The optical design is reduced to a single-lens camera. We are developing a low-cost molding process using a novel chalcogenide glass (GASIR®3) and integrating anti-reflective and anti-erosion properties using diamond-like carbon coating.
2002-01-17
KENNEDY SPACE CENTER, FLA. -- Workers in the Vertical Processing Facility help guide the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System onto a payload carrier. NICMOS II is part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. It is a new experimental cooling system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. It could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of mission STS-109 is scheduled for Feb. 28, 2002
2002-01-22
KENNEDY SPACE CENTER, FLA. -- The NICMOS II radiator is ready for checkout in the Vertical Processing Facility. The Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System is part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. NICMOS is a new experimental cooling system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. NICMOS could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of Columbia is scheduled for Feb. 28, 2002
2002-01-17
KENNEDY SPACE CENTER, FLA. -- Workers in the Vertical Processing Facility test the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. The worker at right is using a black light. NICMOS II is a new experimental cooling system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. It could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of mission STS-109 is scheduled for Feb. 28, 2002
2002-01-17
KENNEDY SPACE CENTER, FLA. -- The Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System rests inside a protective enclosure on a payload carrier. NICMOS II is part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. It is a new experimental cooling system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. It could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of mission STS-109 is scheduled for Feb. 28, 2002
2002-01-17
KENNEDY SPACE CENTER, FLA. -- Workers in the Vertical Processing Facility wheel a container with the NICMOS II across the floor. The Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System is part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. NICMOS is a new experimental cooling system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. It could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of mission STS-109 is scheduled for Feb. 28, 2002
A Biocompatible Near-Infrared 3D Tracking System
Decker, Ryan S.; Shademan, Azad; Opfermann, Justin D.; Leonard, Simon; Kim, Peter C. W.; Krieger, Axel
2017-01-01
A fundamental challenge in soft-tissue surgery is that target tissue moves and deforms, becomes occluded by blood or other tissue, and is difficult to differentiate from surrounding tissue. We developed small biocompatible near-infrared fluorescent (NIRF) markers with a novel fused plenoptic and NIR camera tracking system, enabling 3D tracking of tools and target tissue while overcoming blood and tissue occlusion in the uncontrolled, rapidly changing surgical environment. In this work, we present the tracking system and marker design and compare tracking accuracies to standard optical tracking methods using robotic experiments. At speeds of 1 mm/s, we observe tracking accuracies of 1.61 mm, degrading only to 1.71 mm when the markers are covered in blood and tissue. PMID:28129145
Biocompatible Near-Infrared Three-Dimensional Tracking System.
Decker, Ryan S; Shademan, Azad; Opfermann, Justin D; Leonard, Simon; Kim, Peter C W; Krieger, Axel
2017-03-01
A fundamental challenge in soft-tissue surgery is that target tissue moves and deforms, becomes occluded by blood or other tissue, and is difficult to differentiate from surrounding tissue. We developed small biocompatible near-infrared fluorescent (NIRF) markers with a novel fused plenoptic and NIR camera tracking system, enabling three-dimensional tracking of tools and target tissue while overcoming blood and tissue occlusion in the uncontrolled, rapidly changing surgical environment. In this work, we present the tracking system and marker design and compare tracking accuracies to standard optical tracking methods using robotic experiments. At speeds of 1 mm/s, we observe tracking accuracies of 1.61 mm, degrading only to 1.71 mm when the markers are covered in blood and tissue.
Near infrared photography with a vacuum-cold camera [Orion nebula observation]
NASA Technical Reports Server (NTRS)
Rossano, G. S.; Russell, R. W.; Cornett, R. H.
1980-01-01
Sensitized cooled plates have been obtained of the Orion nebula region and of Sh2-149 in the wavelength ranges 8000 A-9000 A and 9,000 A-11,000 A with a recently designed and constructed vacuum-cold camera. Sensitization procedures are described and the camera design is presented.
Darmanis, Spyridon; Toms, Andrew; Durman, Robert; Moore, Donna; Eyres, Keith
2007-07-01
The aim was to reduce the operating time in computer-assisted navigated total knee replacement (TKR) by improving communication between the infrared camera and the trackers placed on the patient. The innovation involves placing a routinely used laser pointer on top of the camera, so that the infrared camera focuses precisely on the trackers located on the knee being operated on. A prospective randomized study was performed involving 40 patients divided into two groups, A and B. Both groups underwent navigated TKR, but for group B patients a laser pointer was used to improve the targeting capabilities of the cameras. Without the laser pointer, the camera had to move a mean of 9.2 times in order to identify the trackers. With the introduction of the laser pointer, this was reduced to 0.9 times. Accordingly, the additional mean time required without the laser pointer was 11.6 minutes. Time delays are a major problem in computer-assisted surgery, and our technical suggestion can contribute towards reducing the delays associated with this particular application.
NASA Astrophysics Data System (ADS)
Nugent, P. W.; Shaw, J. A.; Piazzolla, S.
2013-02-01
The continuous demand for high data return in deep space and near-Earth satellite missions has led NASA and international institutions to consider alternative technologies for high-data-rate communications. One solution is the establishment of wide-bandwidth Earth-space optical communication links, which require (among other things) a nearly obstruction-free atmospheric path. Considering the atmospheric channel, the most common and most apparent impairments on Earth-space optical communication paths arise from clouds. Therefore, the characterization of the statistical behavior of cloud coverage for optical communication ground station candidate sites is of vital importance. In this article, we describe the development and deployment of a ground-based, long-wavelength infrared cloud imaging system able to monitor and characterize the cloud coverage. This system is based on a commercially available camera with a 62-deg diagonal field of view. A novel internal-shutter-based calibration technique allows radiometric calibration of the camera, which operates without a thermoelectric cooler. This cloud imaging system provides continuous day-night cloud detection with constant sensitivity. The cloud imaging system also includes data-processing algorithms that calculate and remove atmospheric emission to isolate cloud signatures, and enable classification of clouds according to their optical attenuation. Measurements of long-wavelength infrared cloud radiance are used to retrieve the optical attenuation (cloud optical depth due to absorption and scattering) in the wavelength range of interest from visible to near-infrared, where the cloud attenuation is quite constant. This article addresses the specifics of the operation, calibration, and data processing of the imaging system that was deployed at the NASA/JPL Table Mountain Facility (TMF) in California. Data are reported from July 2008 to July 2010. These data describe seasonal variability in cloud cover at the TMF site, with cloud amount (percentage of cloudy pixels) peaking at just over 51 percent during February, of which more than 60 percent had optical attenuation exceeding 12 dB at wavelengths in the range from the visible to the near-infrared. The lowest cloud amount was found during August, averaging 19.6 percent, and these clouds were mostly optically thin, with low attenuation.
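The full radiometric calibration and atmospheric-emission model used at TMF are not reproduced here, but the final classification step described, subtracting modeled clear-sky emission and binning the residual cloud radiance into attenuation classes, can be sketched as below; the threshold values are placeholders, not the ones used by the deployed system.

import numpy as np

def classify_clouds(radiance, clear_sky, thresholds=(0.5, 2.0, 6.0)):
    """Classify each pixel of a calibrated LWIR sky image.
    radiance   : measured radiance image
    clear_sky  : modeled atmospheric emission for the same viewing geometry
    thresholds : residual-radiance boundaries separating attenuation classes
    Returns the per-pixel class map (0 = clear) and the cloud amount,
    i.e. the percentage of cloudy pixels."""
    residual = np.asarray(radiance) - np.asarray(clear_sky)
    classes = np.digitize(residual, thresholds)
    cloud_amount = 100.0 * np.count_nonzero(classes) / classes.size
    return classes, cloud_amount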
IRAIT project: future mid-IR operations at Dome C during summer
NASA Astrophysics Data System (ADS)
Tosti, Gino; IRAIT Collaboration
The IRAIT project consists of a robotic mid-infrared telescope that will be hosted at Dome C, at the Italian-French Concordia station on the Antarctic Plateau. The telescope was built in collaboration with the PNRA (Technology, and Earth-Sun Interaction and Astrophysics sectors). Its focal plane instrumentation is a mid-infrared camera (5-25 μm), based on the TIRCAM II prototype, which is the result of a joint effort between institutes of CNR and INAF. International collaborations with French and Spanish institutes for the construction of a near-infrared spectrographic camera have also been started. We present the status of the project and the ongoing developments that will make it possible to start infrared observations at Dome C during the summer Antarctic campaign 2005-2006.
NASA Astrophysics Data System (ADS)
Hatfield, M. C.; Webley, P.; Saiet, E., II
2014-12-01
Remote Sensing of Arctic Environmental Conditions and Critical Infrastructure using Infra-Red (IR) Cameras and Unmanned Air Vehicles (UAVs). Numerous scientific and logistical applications exist in Alaska and other arctic regions requiring analysis of expansive, remote areas in the near infrared (NIR) and thermal infrared (TIR) bands. These include characterization of wildland fire plumes and volcanic ejecta, detailed mapping of lava flows, and inspection of lengthy segments of critical infrastructure, such as the Alaska pipeline and railroad system. Obtaining timely, repeatable, calibrated measurements of these extensive features and infrastructure networks requires localized, taskable assets such as UAVs. The Alaska Center for Unmanned Aircraft Systems Integration (ACUASI) provides practical solutions to these problem sets by pairing various IR sensors with a combination of fixed-wing and multi-rotor air vehicles. Fixed-wing assets, such as the Insitu ScanEagle, offer long reach and extended duration capabilities to quickly access remote locations and provide enduring surveillance of the target of interest. Rotary-wing assets, such as the Aeryon Scout or the ACUASI-built Ptarmigan hexacopter, provide a precision capability for detailed horizontal mapping or vertical stratification of atmospheric phenomena. When combined with other ground capabilities, they can assist in decision support and hazard assessment, and give emergency managers a new ability to increase knowledge of the event at hand while reducing the risk to all involved. Here, in this presentation, we illustrate how UAVs can provide an ideal tool to map and analyze hazardous events and critical infrastructure under extreme environmental conditions.
A new method of field MRTD test
NASA Astrophysics Data System (ADS)
Chen, Zhibin; Song, Yan; Liu, Xianhong; Xiao, Wenjian
2014-09-01
MRTD is an important indicator of the imaging performance of an infrared camera. In the traditional laboratory test, a blackbody is used as the simulated heat source, which is not only expensive and bulky but also ill-suited to the requirements of on-line, automatic field testing of infrared camera MRTD. To solve this problem, this paper introduces a new MRTD test device, which uses an LED as the simulated heat source and a four-bar target engraved in coated zinc sulfide glass as the simulated target. Using a Cassegrain collimating system designed to tolerate high temperatures, the target is projected at infinity so that it can be observed by eye to complete the subjective test, or captured and processed by image-processing software to complete an objective measurement. The LED thus replaces the blackbody: the color temperature of the LED is calibrated with a thermal imager, establishing the relation curve between the LED drive current and the simulated blackbody temperature difference and thereby providing accurate temperature control of the infrared target. Experimental results show that the accuracy of the device in field testing of thermal imager MRTD is within 0.1 K, which greatly reduces the cost of meeting the project requirements and gives the device wide application value.
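A minimal sketch of how the calibration curve described above could be used in software: interpolate the LED drive current needed to reproduce a requested equivalent temperature difference. The numbers below are placeholders, not measured values from the paper.

import numpy as np

# Calibration pairs measured with a thermal imager: LED drive current (mA)
# versus equivalent blackbody temperature difference (K). Placeholder values.
cal_current_mA = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
cal_delta_T_K = np.array([0.05, 0.10, 0.25, 0.60, 1.50])

def current_for_delta_T(target_dT: float) -> float:
    """Interpolate the LED current that reproduces a requested simulated
    temperature difference, using the measured calibration curve."""
    return float(np.interp(target_dT, cal_delta_T_K, cal_current_mA))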
Developing Short Films of Geoscience Research
NASA Astrophysics Data System (ADS)
Shipman, J. S.; Webley, P. W.; Dehn, J.; Harrild, M.; Kienenberger, D.; Salganek, M.
2015-12-01
In today's prevalence of social media and networking, video products are becoming increasingly useful to communicate research quickly and effectively to a diverse audience, including outreach activities as well as within the research community and to funding agencies. Due to the observational nature of geoscience, researchers often take photos and video footage to document fieldwork or to record laboratory experiments. Here we present how researchers can become more effective storytellers by collaborating with filmmakers to produce short documentary films of their research. We will focus on the use of traditional high-definition (HD) camcorders and HD DSLR cameras to record the scientific story while our research topic focuses on the use of remote sensing techniques, specifically thermal infrared imaging that is often used to analyze time-varying natural processes such as volcanic hazards. By capturing the story in the thermal infrared wavelength range, in addition to traditional red-green-blue (RGB) color space, the audience is able to experience the world differently. We will develop a short film specifically designed using thermal infrared cameras that illustrates how visual storytellers can use these new tools to capture unique and important aspects of their research, convey their passion for earth systems science, as well as engage and captivate the viewer.
2001-11-27
KENNEDY SPACE CENTER, Fla. -- In the Vertical Processing Facility, members of the STS-109 crew look over the Solar Array 3 panels that will be replacing Solar Array 2 panels on the Hubble Space Telescope (HST). Trainers, at left, point to the panels while Mission Specialist Nancy Currie (second from right) and Commander Scott Altman (far right) look on. Other crew members are Pilot Duane Carey, Payload Commander John Grunsfeld and Mission Specialists James Newman, Richard Linnehan and Michael Massimino. The other goals of the mission are replacing the Power Control Unit, removing the Faint Object Camera and installing the Advanced Camera for Surveys, installing the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System, and installing New Outer Blanket Layer insulation on bays 5 through 8. Mission STS-109 is scheduled for launch Feb. 14, 2002
NASA Astrophysics Data System (ADS)
Naqvi, Rizwan Ali; Park, Kang Ryoung
2016-06-01
Gaze tracking systems are widely used in human-computer interfaces, interfaces for the disabled, game interfaces, and for controlling home appliances. Most studies on gaze detection have focused on enhancing its accuracy, whereas few have considered the discrimination of intentional gaze fixation (looking at a target to activate or select it) from unintentional fixation while using gaze detection systems. Previous research methods based on the use of a keyboard or mouse button, eye blinking, and the dwell time of gaze position have various limitations. Therefore, we propose a method for discriminating between intentional and unintentional gaze fixation using a multimodal fuzzy logic algorithm applied to a gaze tracking system with a near-infrared camera sensor. Experimental results show that the proposed method outperforms the conventional method for determining gaze fixation.
EARLY SCIENCE WITH SOFIA, THE STRATOSPHERIC OBSERVATORY FOR INFRARED ASTRONOMY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, E. T.; Becklin, E. E.; De Buizer, J. M.
The Stratospheric Observatory For Infrared Astronomy (SOFIA) is an airborne observatory consisting of a specially modified Boeing 747SP with a 2.7 m telescope, flying at altitudes as high as 13.7 km (45,000 ft). Designed to observe at wavelengths from 0.3 μm to 1.6 mm, SOFIA operates above 99.8% of the water vapor that obscures much of the infrared and submillimeter. SOFIA has seven science instruments under development, including an occultation photometer; near-, mid-, and far-infrared cameras; infrared spectrometers; and heterodyne receivers. SOFIA, a joint project between NASA and the German Aerospace Center Deutsches Zentrum für Luft- und Raumfahrt, began initial science flights in December 2010, and has conducted 30 science flights in the subsequent year. During this early science period three instruments have flown: the mid-infrared camera FORCAST, the heterodyne spectrometer GREAT, and the occultation photometer HIPO. This Letter provides an overview of the observatory and its early performance.
Tracking multiple surgical instruments in a near-infrared optical system.
Cai, Ken; Yang, Rongqian; Lin, Qinyong; Wang, Zhigang
2016-12-01
Surgical navigation systems can assist doctors in performing more precise and more efficient surgical procedures and in avoiding various accidents. The near-infrared optical system (NOS) is an important component of surgical navigation systems. However, several surgical instruments are used during surgery, and effectively tracking all of them is challenging. A stereo matching algorithm using two intersecting lines and surgical instrument codes is proposed in this paper. In our NOS, the markers on the surgical instruments are captured by two near-infrared cameras. After automatically searching for and extracting their subpixel coordinates in the left and right images, the coordinates of the real and pseudo markers are determined by the two intersecting lines. Finally, the pseudo markers are removed to achieve accurate stereo matching by summing the codes for the distances between a specific marker and the other two markers on the surgical instrument. Experimental results show that the markers on the different surgical instruments can be automatically and accurately recognized. The NOS can accurately track multiple surgical instruments.
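The two-intersecting-lines reconstruction itself is not reproduced here, but the disambiguation step described, summing inter-marker distances as a per-instrument code, can be sketched as follows; the three-marker assumption and the nearest-code assignment rule are illustrative choices, not details from the paper.

import numpy as np
from itertools import combinations

def distance_code(markers: np.ndarray) -> float:
    """Sum of pairwise distances between the markers of one instrument,
    used as a simple geometric signature (markers: shape (3, 3))."""
    return float(sum(np.linalg.norm(a - b) for a, b in combinations(markers, 2)))

def assign_instrument(candidate: np.ndarray, known_codes: dict) -> str:
    """Pick the instrument whose stored code best matches the candidate
    triplet of reconstructed 3-D marker positions."""
    code = distance_code(candidate)
    return min(known_codes, key=lambda name: abs(known_codes[name] - code))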
The infrared imaging spectrograph (IRIS) for TMT: latest science cases and simulations
NASA Astrophysics Data System (ADS)
Wright, Shelley A.; Walth, Gregory; Do, Tuan; Marshall, Daniel; Larkin, James E.; Moore, Anna M.; Adamkovics, Mate; Andersen, David; Armus, Lee; Barth, Aaron; Cote, Patrick; Cooke, Jeff; Chisholm, Eric M.; Davidge, Timothy; Dunn, Jennifer S.; Dumas, Christophe; Ellerbroek, Brent L.; Ghez, Andrea M.; Hao, Lei; Hayano, Yutaka; Liu, Michael; Lopez-Rodriguez, Enrique; Lu, Jessica R.; Mao, Shude; Marois, Christian; Pandey, Shashi B.; Phillips, Andrew C.; Schoeck, Matthias; Subramaniam, Annapurni; Subramanian, Smitha; Suzuki, Ryuji; Tan, Jonathan C.; Terai, Tsuyoshi; Treu, Tommaso; Simard, Luc; Weiss, Jason L.; Wincentsen, James; Wong, Michael; Zhang, Kai
2016-07-01
The Thirty Meter Telescope (TMT) first light instrument IRIS (Infrared Imaging Spectrograph) will complete its preliminary design phase in 2016. The IRIS instrument design includes a near-infrared (0.85 - 2.4 micron) integral field spectrograph (IFS) and imager that are able to conduct simultaneous diffraction-limited observations behind the advanced adaptive optics system NFIRAOS. The IRIS science cases have continued to be developed and new science studies have been investigated to aid in technical performance and design requirements. In this development phase, the IRIS science team has paid particular attention to the selection of filters, gratings, sensitivities of the entire system, and science cases that will benefit from the parallel mode of the IFS and imaging camera. We present new science cases for IRIS using the latest end-to-end data simulator on the following topics: Solar System bodies, the Galactic center, active galactic nuclei (AGN), and distant gravitationally-lensed galaxies. We then briefly discuss the necessity of an advanced data management system and data reduction pipeline.
ARNICA, the NICMOS 3 imaging camera of TIRGO.
NASA Astrophysics Data System (ADS)
Lisi, F.; Baffa, C.; Hunt, L.; Stanga, R.
ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 μm that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1″ per pixel, with sky coverage of more than 4′×4′ on the NICMOS 3 (256×256 pixels, 40 μm side) detector array. The camera is remotely controlled by a PC 486, connected to the array control electronics via a fiber-optic link. A C-language package, running under MS-DOS on the PC 486, acquires and stores the frames, and controls the timing of the array. The camera is intended for imaging of large extra-galactic and Galactic fields; a large effort has been dedicated to exploring the possibility of achieving precise photometric measurements in the J, H, K astronomical bands, with very promising results.
Toslak, Devrim; Liu, Changgeng; Alam, Minhaj Nur; Yao, Xincheng
2018-06-01
A portable fundus imager is essential for emerging telemedicine screening and point-of-care examination of eye diseases. However, existing portable fundus cameras have limited field of view (FOV) and frequently require pupillary dilation. We report here a miniaturized indirect ophthalmoscopy-based nonmydriatic fundus camera with a snapshot FOV up to 67° external angle, which corresponds to a 101° eye angle. The wide-field fundus camera consists of a near-infrared light source (LS) for retinal guidance and a white LS for color retinal imaging. By incorporating digital image registration and glare elimination methods, a dual-image acquisition approach was used to achieve reflection artifact-free fundus photography.
Experience with the UKIRT InSb array camera
NASA Technical Reports Server (NTRS)
Mclean, Ian S.; Casali, Mark M.; Wright, Gillian S.; Aspin, Colin
1989-01-01
The cryogenic infrared camera, IRCAM, has been operating routinely on the 3.8 m UK Infrared Telescope on Mauna Kea, Hawaii for over two years. The camera, which uses a 62×58-element indium antimonide array from Santa Barbara Research Center, was designed and built at the Royal Observatory, Edinburgh, which operates UKIRT on behalf of the UK Science and Engineering Research Council. Over the past two years at least 60% of the available time on UKIRT has been allocated for IRCAM observations. Described here are some of the properties of this instrument and its detector which influence astronomical performance. Observational techniques and the power of IR arrays with some recent astronomical results are discussed.
High definition infrared chemical imaging of colorectal tissue using a Spero QCL microscope.
Bird, B; Rowlette, J
2017-04-10
Mid-infrared microscopy has become a key technique in the field of biomedical science and spectroscopy. This label-free, non-destructive technique permits the visualisation of a wide range of intrinsic biochemical markers in tissues, cells and biofluids by detection of the vibrational modes of the constituent molecules. Together, infrared microscopy and chemometrics form a widely accepted method that can distinguish healthy and diseased states with high accuracy. However, despite the exponential growth of the field and its research worldwide, several barriers currently exist to its full translation into the clinical sphere, namely sample throughput and data management. The advent and incorporation of quantum cascade lasers (QCLs) into infrared microscopes could help propel the field over these remaining hurdles. Such systems offer several advantages over their FT-IR counterparts: a simpler instrument architecture, improved photon flux, use of room-temperature camera systems, and the flexibility of a tunable illumination source. In this study we explore the use of a QCL infrared microscope to produce high-definition, high-throughput chemical images useful for the screening of biopsied colorectal tissue.
Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing
NASA Technical Reports Server (NTRS)
Crooke, Julie A.
2003-01-01
The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure the azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral density (ND) filters in front of the theodolite's telescope. One could then safely view and measure the laser's boresight looking through the theodolite's telescope without great risk to one's eyes. This method, while acceptable for a Class II visible-wavelength laser, must not even be contemplated for a Class IV laser and is not applicable to an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available. It is a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera includes power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. Again, the additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as to visible light. Hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.
Subaru Near Infrared Coronagraphic Images of T Tauri
NASA Astrophysics Data System (ADS)
Mayama, Satoshi; Tamura, Motohide; Hayashi, Masahiko; Itoh, Yoichi; Fukagawa, Misato; Suto, Hiroshi; Ishii, Miki; Murakawa, Koji; Oasa, Yumiko; Hayashi, Saeko S.; Yamashita, Takuya; Morino, Junichi; Oya, Shin; Naoi, Takahiro; Pyo, Tae-Soo; Nishikawa, Takayuki; Kudo, Tomoyuki; Usuda, Tomonori; Ando, Hiroyasu; Miyama, Shoken M.; Kaifu, Norio
2006-04-01
High angular resolution near-infrared (JHK) adaptive optics images of T Tau were obtained with the infrared camera Coronagraphic Imager with Adaptive Optics (CIAO) mounted on the 8.2 m Subaru Telescope in 2002 and 2004. The images resolve a complex circumstellar structure around a multiple system. We resolved T Tau Sa and Sb as well as T Tau N and S. The estimated orbit of T Tau Sb indicates that it is probably bound to T Tau Sa. The K band flux of T Tau S decreased by ~1.7 Jy in 2002 November compared with that in 2001, mainly because T Tau Sa became fainter. The arc-like ridge detected in our near-infrared images is consistent with what is seen at visible wavelengths, supporting the interpretation in previous studies that the arc is part of the cavity wall seen relatively pole-on. Halo emission is detected out to ~2″ from T Tau N. This may be light scattered off the common envelope surrounding the T Tauri multiple system.
The PALM-3000 high-order adaptive optics system for Palomar Observatory
NASA Astrophysics Data System (ADS)
Bouchez, Antonin H.; Dekany, Richard G.; Angione, John R.; Baranec, Christoph; Britton, Matthew C.; Bui, Khanh; Burruss, Rick S.; Cromer, John L.; Guiwits, Stephen R.; Henning, John R.; Hickey, Jeff; McKenna, Daniel L.; Moore, Anna M.; Roberts, Jennifer E.; Trinh, Thang Q.; Troy, Mitchell; Truong, Tuan N.; Velur, Viswa
2008-07-01
Deployed as a multi-user shared facility on the 5.1 meter Hale Telescope at Palomar Observatory, the PALM-3000 high-order upgrade to the successful Palomar Adaptive Optics System will deliver extreme AO correction in the near-infrared, and diffraction-limited images down to visible wavelengths, using both natural and sodium laser guide stars. Wavefront control will be provided by two deformable mirrors, a woofer with 3368 active actuators and a tweeter with 349 active actuators, controlled at up to 3 kHz using an innovative wavefront processor based on a cluster of 17 graphics processing units. A Shack-Hartmann wavefront sensor with selectable pupil sampling will provide high-order wavefront sensing, while an infrared tip/tilt sensor and visible truth wavefront sensor will provide low-order LGS control. Four back-end instruments are planned at first light: the PHARO near-infrared camera/spectrograph, the SWIFT visible light integral field spectrograph, Project 1640, a near-infrared coronagraphic integral field spectrograph, and 888Cam, a high-resolution visible light imager.
A protection system for the JET ITER-like wall based on imaging diagnostics.
Arnoux, G; Devaux, S; Alves, D; Balboa, I; Balorin, C; Balshaw, N; Beldishevski, M; Carvalho, P; Clever, M; Cramp, S; de Pablos, J-L; de la Cal, E; Falie, D; Garcia-Sanchez, P; Felton, R; Gervaise, V; Goodyear, A; Horton, A; Jachmich, S; Huber, A; Jouve, M; Kinna, D; Kruezi, U; Manzanares, A; Martin, V; McCullen, P; Moncada, V; Obrejan, K; Patel, K; Lomas, P J; Neto, A; Rimini, F; Ruset, C; Schweer, B; Sergienko, G; Sieglin, B; Soleto, A; Stamp, M; Stephen, A; Thomas, P D; Valcárcel, D F; Williams, J; Wilson, J; Zastrow, K-D
2012-10-01
The new JET ITER-like wall (made of beryllium and tungsten) is more fragile than the former carbon fiber composite wall and requires active protection to prevent excessive heat loads on the plasma facing components (PFC). Analog CCD cameras operating in the near infrared wavelength are used to measure surface temperature of the PFCs. Region of interest (ROI) analysis is performed in real time and the maximum temperature measured in each ROI is sent to the vessel thermal map. The protection of the ITER-like wall system started in October 2011 and has already successfully led to a safe landing of the plasma when hot spots were observed on the Be main chamber PFCs. Divertor protection is more of a challenge due to dust deposits that often generate false hot spots. In this contribution we describe the camera, data capture and real time processing systems. We discuss the calibration strategy for the temperature measurements with cross validation with thermal IR cameras and bi-color pyrometers. Most importantly, we demonstrate that a protection system based on CCD cameras can work and show examples of hot spot detections that stop the plasma pulse. The limits of such a design and the associated constraints on the operations are also presented.
Real-time Enhancement, Registration, and Fusion for an Enhanced Vision System
NASA Technical Reports Server (NTRS)
Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.
2006-01-01
Over the last few years NASA Langley Research Center (LaRC) has been developing an Enhanced Vision System (EVS) to aid pilots while flying in poor visibility conditions. The EVS captures imagery using two infrared video cameras. The cameras are placed in an enclosure that is mounted and flown forward-looking underneath the NASA LaRC ARIES 757 aircraft. The data streams from the cameras are processed in real-time and displayed on monitors on-board the aircraft. With proper processing the camera system can provide better-than-human-observed imagery particularly during poor visibility conditions. However, to obtain this goal requires several different stages of processing including enhancement, registration, and fusion, and specialized processing hardware for real-time performance. We are using a real-time implementation of the Retinex algorithm for image enhancement, affine transformations for registration, and weighted sums to perform fusion. All of the algorithms are executed on a single TI DM642 digital signal processor (DSP) clocked at 720 MHz. The image processing components were added to the EVS system, tested, and demonstrated during flight tests in August and September of 2005. In this paper we briefly discuss the EVS image processing hardware and algorithms. We then discuss implementation issues and show examples of the results obtained during flight tests.
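The Retinex enhancement stage and the DSP implementation are beyond a short sketch, but the registration-plus-weighted-sum fusion described can be illustrated with generic scipy/numpy calls; the affine parameters are assumed to come from a prior calibration, and the equal weighting is an arbitrary choice, not the values used in the flight system.

import numpy as np
from scipy import ndimage

def register_and_fuse(img_a, img_b, matrix, offset, w=0.5):
    """Warp img_a onto img_b with a known 2x2 linear map and translation
    (from calibration), then combine the two bands with a weighted sum."""
    registered = ndimage.affine_transform(np.asarray(img_a, dtype=float),
                                          matrix, offset=offset)
    return w * registered + (1.0 - w) * np.asarray(img_b, dtype=float)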
A protection system for the JET ITER-like wall based on imaging diagnosticsa)
NASA Astrophysics Data System (ADS)
Arnoux, G.; Devaux, S.; Alves, D.; Balboa, I.; Balorin, C.; Balshaw, N.; Beldishevski, M.; Carvalho, P.; Clever, M.; Cramp, S.; de Pablos, J.-L.; de la Cal, E.; Falie, D.; Garcia-Sanchez, P.; Felton, R.; Gervaise, V.; Goodyear, A.; Horton, A.; Jachmich, S.; Huber, A.; Jouve, M.; Kinna, D.; Kruezi, U.; Manzanares, A.; Martin, V.; McCullen, P.; Moncada, V.; Obrejan, K.; Patel, K.; Lomas, P. J.; Neto, A.; Rimini, F.; Ruset, C.; Schweer, B.; Sergienko, G.; Sieglin, B.; Soleto, A.; Stamp, M.; Stephen, A.; Thomas, P. D.; Valcárcel, D. F.; Williams, J.; Wilson, J.; Zastrow, K.-D.; JET-EFDA Contributors
2012-10-01
The new JET ITER-like wall (made of beryllium and tungsten) is more fragile than the former carbon fiber composite wall and requires active protection to prevent excessive heat loads on the plasma facing components (PFC). Analog CCD cameras operating in the near infrared wavelength are used to measure surface temperature of the PFCs. Region of interest (ROI) analysis is performed in real time and the maximum temperature measured in each ROI is sent to the vessel thermal map. The protection of the ITER-like wall system started in October 2011 and has already successfully led to a safe landing of the plasma when hot spots were observed on the Be main chamber PFCs. Divertor protection is more of a challenge due to dust deposits that often generate false hot spots. In this contribution we describe the camera, data capture and real time processing systems. We discuss the calibration strategy for the temperature measurements with cross validation with thermal IR cameras and bi-color pyrometers. Most importantly, we demonstrate that a protection system based on CCD cameras can work and show examples of hot spot detections that stop the plasma pulse. The limits of such a design and the associated constraints on the operations are also presented.
Multi-viewer tracking integral imaging system and its viewing zone analysis.
Park, Gilbae; Jung, Jae-Hyun; Hong, Keehoon; Kim, Yunhee; Kim, Young-Hoon; Min, Sung-Wook; Lee, Byoungho
2009-09-28
We propose a multi-viewer tracking integral imaging system for viewing angle and viewing zone improvement. In the tracking integral imaging system, the pickup angles for each elemental lens in the lens array are determined by the positions of the viewers, which means the elemental image can be generated for each viewer to provide a wider viewing angle and a larger viewing zone. Our tracking integral imaging system is implemented with an infrared camera and infrared light-emitting diodes which can track the viewers' exact positions robustly. For multiple viewers to watch integrated three-dimensional images in the tracking integral imaging system, it is necessary to formulate the relationship between the multiple viewers' positions and the elemental images. We analyzed this relationship and the conditions for multiple viewers, and verified them by implementing a two-viewer tracking integral imaging system.
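As an illustration of the viewer-position/elemental-image relationship the authors formulate, a paraxial pinhole-lens sketch is given below: the elemental image for a tracked viewer is centered where the line from the viewer through each lens center meets the display plane behind the array. This simplified model is an assumption for illustration, not the paper's exact formulation.

import numpy as np

def elemental_image_centers(lens_xy: np.ndarray, viewer_xyz: np.ndarray, gap: float) -> np.ndarray:
    """Pinhole-lens approximation.
    lens_xy    : (N, 2) lens-center coordinates in the array plane
    viewer_xyz : (3,) tracked viewer position, z measured from the array
    gap        : lens-to-display distance (same units)
    """
    vx, vy, vz = viewer_xyz
    shift = (lens_xy - np.array([vx, vy])) * (gap / vz)   # lateral offset behind each lens
    return lens_xy + shift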
Verification of the test stand for microbolometer camera in accredited laboratory
NASA Astrophysics Data System (ADS)
Krupiński, Michal; Bareła, Jaroslaw; Chmielewski, Krzysztof; Kastek, Mariusz
2017-10-01
A microbolometer belongs to the group of thermal detectors and consists of a temperature-sensitive resistor exposed to the measured radiation flux. A bolometer array employs a pixel structure fabricated in silicon technology. The detecting area is defined by the size of a thin membrane, usually made of amorphous silicon (a-Si) or vanadium oxide (VOx). FPAs are made of a multitude of detector elements (for example 384 × 288), where each individual detector has a different sensitivity and offset due to detector-to-detector spread in the FPA fabrication process; these can additionally change with sensor operating temperature, biasing voltage variation, or the temperature of the observed scene. The difference in sensitivity and offset among detectors (called non-uniformity), together with the high sensitivity of the detectors, produces fixed-pattern noise (FPN) in the produced image. Fixed-pattern noise degrades parameters of infrared cameras such as sensitivity and NETD. Additionally, it degrades image quality, radiometric accuracy and temperature resolution. In order to objectively compare two infrared cameras, one must measure and compare their parameters on a laboratory test stand. One of the basic parameters for the evaluation of a designed camera is NETD. In order to determine the NETD, parameters such as sensitivity and pixel noise must be measured. To do so, one should record the output signal from the camera in response to the radiation of blackbodies at two different temperatures. The article presents an application and a measuring stand for determining the parameters of microbolometer cameras. The measurements were compared with the results of measurements performed at the Institute of Optoelectronics, MUT, on a METS test stand by CI SYSTEM. This test stand consists of an IR collimator, an IR standard source, a rotating wheel with test patterns, a computer with a video grabber card, and specialized software. The parameters of the thermal cameras were measured according to the norms and methods described in the literature.
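A minimal per-pixel version of the NETD measurement described above: responsivity from the mean signal difference between two blackbody temperatures, temporal noise from the frame-to-frame standard deviation, and NETD as their ratio. Frame counts, temperatures, and the use of the median as the camera-level figure are assumptions, not the test-stand's exact procedure.

import numpy as np

def netd_map(frames_T1: np.ndarray, frames_T2: np.ndarray, delta_T: float) -> np.ndarray:
    """Per-pixel NETD from two stacks of frames (stack axis first) recorded
    against blackbodies separated by delta_T kelvin."""
    responsivity = (frames_T2.mean(axis=0) - frames_T1.mean(axis=0)) / delta_T
    noise = frames_T1.std(axis=0)                         # temporal noise at the lower temperature
    return noise / np.abs(responsivity)

# A single camera-level figure is often quoted as the median of the map:
# netd_camera = np.median(netd_map(stack_t1, stack_t2, 10.0))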
NASA Astrophysics Data System (ADS)
Daly, Michael J.; Muhanna, Nidal; Chan, Harley; Wilson, Brian C.; Irish, Jonathan C.; Jaffray, David A.
2014-02-01
A freehand, non-contact diffuse optical tomography (DOT) system has been developed for multimodal imaging with intraoperative cone-beam CT (CBCT) during minimally-invasive cancer surgery. The DOT system is configured for near-infrared fluorescence imaging with indocyanine green (ICG) using a collimated 780 nm laser diode and a near-infrared CCD camera (PCO Pixelfly USB). Depending on the intended surgical application, the camera is coupled to either a rigid 10 mm diameter endoscope (Karl Storz) or a 25 mm focal length lens (Edmund Optics). A prototype flat-panel CBCT C-Arm (Siemens Healthcare) acquires low-dose 3D images with sub-mm spatial resolution. A 3D mesh is extracted from CBCT for finite-element DOT implementation in NIRFAST (Dartmouth College), with the capability for soft/hard imaging priors (e.g., segmented lymph nodes). A stereoscopic optical camera (NDI Polaris) provides real-time 6D localization of reflective spheres mounted to the laser and camera. Camera calibration combined with tracking data is used to estimate intrinsic (focal length, principal point, non-linear distortion) and extrinsic (translation, rotation) lens parameters. Source/detector boundary data are computed from the tracked laser/camera positions using radiometry models. Target registration errors (TRE) between real and projected boundary points are ~1-2 mm for typical acquisition geometries. Pre-clinical studies using tissue phantoms are presented to characterize 3D imaging performance. This translational research system is under investigation for clinical applications in head-and-neck surgery including oral cavity tumour resection, lymph node mapping, and free-flap perforator assessment.
Volcano monitoring with an infrared camera: first insights from Villarrica Volcano
NASA Astrophysics Data System (ADS)
Rosas Sotomayor, Florencia; Amigo Ramos, Alvaro; Velasquez Vargas, Gabriela; Medina, Roxana; Thomas, Helen; Prata, Fred; Geoffroy, Carolina
2015-04-01
This contribution focuses on the first trials of the almost 24/7 monitoring of Villarrica volcano with an infrared camera. The results must be compared with other SO2 remote sensing instruments, such as DOAS and the UV camera, for the daytime measurements. Infrared remote sensing of volcanic emissions is a fast and safe method to obtain gas abundances in volcanic plumes, in particular when access to the vent is difficult, during volcanic crises, and at night. In recent years, a ground-based infrared camera (Nicair) has been developed by Nicarnica Aviation, which quantifies SO2 and ash in volcanic plumes based on the infrared radiance at specific wavelengths through the application of filters. Three Nicair1 (first model) cameras have been acquired by the Geological Survey of Chile in order to study the degassing of active volcanoes. Several trials with the instruments have been performed at northern Chilean volcanoes, and have shown that the intervals of retrieved SO2 concentrations and fluxes are as expected. Measurements were also performed at Villarrica volcano, and a location to install a "fixed" camera was found 8 km from the crater: a coffee house with electrical power, a wifi network, polite and committed owners, and a full view of the volcano summit. The first measurements are being made and processed in order to obtain full days and weeks of SO2 emissions, analyze data transfer and storage, improve the remote control of the instrument and notebook in case of breakdown, and add web-cam/GoPro support, with the overall goal of the project being to implement a fixed station to monitor and study Villarrica volcano with a Nicair1, integrating and comparing these results with other remote sensing instruments. This work also aims to strengthen bonds with the community by developing teaching material and giving talks to communicate volcanic hazards and other geoscience topics to the people who live "just around the corner" from one of the most active volcanoes in Chile.
Euro Banknote Recognition System for Blind People.
Dunai Dunai, Larisa; Chillarón Pérez, Mónica; Peris-Fajarnés, Guillermo; Lengua Lengua, Ismael
2017-01-20
This paper presents the development of a portable system that allows blind people to detect and recognize Euro banknotes. The device is based on a Raspberry Pi and a Raspberry Pi camera, the Pi NoIR (No Infrared filter), fitted with an additional infrared light source and embedded into a pair of sunglasses, permitting blind and visually impaired people to independently handle Euro banknotes, especially when receiving their cash back when shopping. Banknote detection is based on modified Viola-Jones algorithms, while banknote value recognition relies on the Speeded-Up Robust Features (SURF) technique. The accuracies of banknote detection and banknote value recognition are 84% and 97.5%, respectively.
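A minimal sketch of the detect-then-recognize pipeline described above (Viola-Jones detection followed by SURF-based value recognition), written with OpenCV. The cascade file, reference banknote images, and thresholds are hypothetical placeholders, not the authors' trained models, and SURF requires an opencv-contrib build.

```python
import cv2

def detect_and_recognize(frame_gray, cascade_path, reference_imgs, min_matches=25):
    """Detect banknote candidates with a Viola-Jones cascade, then identify the
    denomination by SURF feature matching against reference images."""
    detector = cv2.CascadeClassifier(cascade_path)             # hypothetical trained cascade
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # needs opencv-contrib
    matcher = cv2.BFMatcher(cv2.NORM_L2)

    results = []
    for (x, y, w, h) in detector.detectMultiScale(frame_gray, 1.1, 5):
        roi = frame_gray[y:y + h, x:x + w]
        kp_roi, des_roi = surf.detectAndCompute(roi, None)
        if des_roi is None:
            continue
        best_label, best_score = None, 0
        for label, ref in reference_imgs.items():               # e.g. {"EUR10": img10, ...}
            kp_ref, des_ref = surf.detectAndCompute(ref, None)
            matches = matcher.knnMatch(des_roi, des_ref, k=2)
            # Lowe ratio test to keep only distinctive matches
            good = [m for pair in matches if len(pair) == 2
                    for m, n in [pair] if m.distance < 0.7 * n.distance]
            if len(good) > best_score:
                best_label, best_score = label, len(good)
        if best_score >= min_matches:
            results.append(((x, y, w, h), best_label))
    return results
```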
A GRAND VIEW OF THE BIRTH OF 'HEFTY' STARS - 30 DORADUS NEBULA DETAILS
NASA Technical Reports Server (NTRS)
2002-01-01
These are two views of a highly active region of star birth located northeast of the central cluster, R136, in 30 Doradus. The orientation and scale are identical for both views. The top panel is a composite of images in two colors taken with the Hubble Space Telescope's visible-light camera, the Wide Field and Planetary Camera 2 (WFPC2). The bottom panel is a composite of pictures taken through three infrared filters with Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS). In both cases the colors of the displays were chosen to correlate with the nebula's and stars' true colors. Seven very young objects are identified with numbered arrows in the infrared image. Number 1 is a newborn, compact cluster dominated by a triple system of 'hefty' stars. It has formed within the head of a massive dust pillar pointing toward R136. The energetic outflows from R136 have shaped the pillar and triggered the collapse of clouds within its summit to form the new stars. The radiation and outflows from these new stars have in turn blown off the top of the pillar, so they can be seen in the visible-light as well as the infrared image. Numbers 2 and 3 also pinpoint newborn stars or stellar systems inside an adjacent, bright-rimmed pillar, likewise oriented toward R136. These objects are still immersed within their natal dust and can be seen only as very faint, red points in the visible-light image. They are, however, among the brightest objects in the infrared image, since dust does not block infrared light as much as visible light. Thus, numbers 2 and 3 and number 1 correspond respectively to two successive stages in the birth of massive stars. Number 4 is a very red star that has just formed within one of several very compact dust clouds nearby. Number 5 is another very young triple-star system with a surrounding cluster of fainter stars. They also can be seen in the visible-light picture. Most remarkable are the glowing patches numbered 6 and 7, which astronomers have interpreted as 'impact points' produced by twin jets of material slamming into surrounding dust clouds. These 'impact points' are perfectly aligned on opposite sides of number 5 (the triple-star system), and each is separated from the star system by about 5 light-years. The jets probably originate from a circumstellar disk around one of the young stars in number 5. They may be rotating counterclockwise, thus producing moving, luminous patches on the surrounding dust, like a searchlight creating spots on clouds. These infrared patches produced by jets from a massive, young star are a new astronomical phenomenon. Credits for NICMOS image: NASA/Nolan Walborn (Space Telescope Science Institute, Baltimore, Md.) and Rodolfo Barba' (La Plata Observatory, La Plata, Argentina) Credits for WFPC2 image: NASA/John Trauger (Jet Propulsion Laboratory, Pasadena, Calif.) and James Westphal (California Institute of Technology, Pasadena, Calif.)
Rosetta/OSIRIS - Nucleus morphology and activity of comet 67P/Churyumov-Gerasimenko
NASA Astrophysics Data System (ADS)
Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rickman, Hans; Rodrigo, Rafael; Koschny, Detlef
2015-04-01
ESA's Rosetta mission arrived at its target, comet 67P/Churyumov-Gerasimenko, on August 6, 2014, after 10 years of cruise. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. It comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field coma investigations. OSIRIS imaged the nucleus and coma of the comet from arrival throughout the mapping phase, PHILAE landing, early escort phase, and close fly-by. The overview paper discusses the surface morphology and activity of the nucleus as seen in gas, dust, and local jets, as well as small-scale structures in the local topography.
Investigation of small solar system objects with the space telescope
NASA Technical Reports Server (NTRS)
Morrison, D.
1979-01-01
The application of the space telescope (ST) to study small objects in the solar system in order to understand the birth and the early evolution of the solar system is discussed. The upper size limit of the small bodies is defined as approximately 5000 km and includes planetary satellites, planetary rings, asteroids, and comets. The use of the astronomical instruments aboard the ST, such as the faint object camera, ultraviolet and infrared spectrometers, and spectrophotometers, to study the small solar system objects is discussed.
ORAC-DR: A generic data reduction pipeline infrastructure
NASA Astrophysics Data System (ADS)
Jenness, Tim; Economou, Frossie
2015-03-01
ORAC-DR is a general purpose data reduction pipeline system designed to be instrument and observatory agnostic. The pipeline works with instruments as varied as infrared integral field units, imaging arrays and spectrographs, and sub-millimeter heterodyne arrays and continuum cameras. This paper describes the architecture of the pipeline system and the implementation of the core infrastructure. We finish by discussing the lessons learned since the initial deployment of the pipeline system in the late 1990s.
Advanced Video Data-Acquisition System For Flight Research
NASA Technical Reports Server (NTRS)
Miller, Geoffrey; Richwine, David M.; Hass, Neal E.
1996-01-01
An advanced video data-acquisition system (AVDAS) was developed to satisfy a variety of requirements for in-flight video documentation. Requirements range from providing images for visualization of airflows around fighter airplanes at high angles of attack to obtaining safety-of-flight documentation. The F/A-18 AVDAS takes advantage of very capable systems like the NITE Hawk forward-looking infrared (FLIR) pod and recent video developments like miniature charge-coupled-device (CCD) color video cameras and other flight-qualified video hardware.
Multiple-frame IR photo-recorder KIT-3M
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, E; Wilkins, P; Nebeker, N
2006-05-15
This paper reports the experimental results of a high-speed multi-frame infrared camera which has been developed in Sarov at VNIIEF. Earlier [1] we discussed the possibility of creating a multi-frame infrared radiation photo-recorder with a framing frequency of about 1 MHz. The basis of the photo-recorder is a semiconductor ionization camera [2, 3], which converts IR radiation in the spectral range 1-10 micrometers into a visible image. Several sequential thermal images are registered by using the IR converter in conjunction with a multi-frame electron-optical camera. In the present report we discuss the performance characteristics of a prototype commercial 9-frame high-speed IR photo-recorder. The image converter records infrared images of thermal fields corresponding to temperatures ranging from 300 °C to 2000 °C with an exposure time of 1-20 µs at a frame frequency of up to 500 kHz. The IR photo-recorder camera is useful for recording the time evolution of thermal fields in fast processes such as gas dynamics, ballistics, pulsed welding, thermal processing, the automotive industry, aircraft construction, and pulsed-power electric experiments, and for the measurement of spatial mode characteristics of IR-laser radiation.
Yang, Hualei; Yang, Xi; Heskel, Mary; ...
2017-04-28
Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the camera-based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). We found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near-surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
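For reference, camera-NDVI is computed per pixel from the red (visible) and NIR bands in the usual way. The sketch below assumes two co-registered band images already converted to reflectance; it is a generic illustration, not the authors' exact processing chain.

```python
import numpy as np

def camera_ndvi(nir, red, eps=1e-6):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel from two
    co-registered reflectance images (arrays of the same shape)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# hypothetical canopy-average time-series point:
# ndvi_map = camera_ndvi(nir_img, red_img)
# daily_value = ndvi_map[canopy_mask].mean()
```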
Study of optical techniques for the Ames unitary wind tunnel. Part 5: Infrared imagery
NASA Technical Reports Server (NTRS)
Lee, George
1992-01-01
A survey of infrared thermography for aerodynamics was made. Particular attention was paid to boundary layer transition detection. IR thermography flow visualization of 2-D and 3-D separation was surveyed. Heat transfer measurements and surface temperature measurements were also covered. Comparisons of several commercial IR cameras were made. The use of a recently purchased IR camera in the Ames Unitary Plan Wind Tunnels was studied. Optical access for these facilities and the methods to scan typical models was investigated.
NASA Astrophysics Data System (ADS)
Matras, A.
2017-08-01
The paper discusses the impact of feed screw heating on machining accuracy. The test stand was built around a HAAS Mini Mill 2 CNC milling machine and a FLIR SC620 infrared camera. Measurements of the workpiece were performed on a Talysurf Intra 50 Taylor Hobson profilometer. The research showed that 60 minutes of intensive milling machine operation caused thermal expansion of the feed screw, which influenced the dimensional error of the workpiece.
Intraoperative near-infrared autofluorescence imaging of parathyroid glands.
Ladurner, Roland; Sommerey, Sandra; Arabi, Nora Al; Hallfeldt, Klaus K J; Stepp, Herbert; Gallwas, Julia K S
2017-08-01
To identify parathyroid glands intraoperatively by exposing their autofluorescence using near-infrared light. Fluorescence imaging was carried out during minimally invasive and open parathyroid and thyroid surgery. After identification, the parathyroid glands as well as the surrounding tissue were exposed to near-infrared (NIR) light with a wavelength of 690-770 nm using a modified Karl Storz near-infrared/indocyanine green (NIR/ICG) endoscopic system. Parathyroid tissue was expected to show near-infrared autofluorescence, captured in the blue channel of the camera. Whenever possible the visual identification of parathyroid tissue was confirmed histologically. In preliminary investigations, using the original NIR/ICG endoscopic system we noticed considerable interference of light in the blue channel overlying the autofluorescence. Therefore, we modified the light source by interposing additional filters. In a second series, we investigated 35 parathyroid glands from 25 patients. Twenty-seven glands were identified correctly based on NIR autofluorescence. Regarding the extent of autofluorescence, there were no noticeable differences between parathyroid adenomas, hyperplasia and normal parathyroid glands. In contrast, thyroid tissue, lymph nodes and adipose tissue revealed no substantial autofluorescence. Parathyroid tissue is characterized by showing autofluorescence in the near-infrared spectrum. This effect can be used to distinguish parathyroid glands from other cervical tissue entities.
NASA Astrophysics Data System (ADS)
Haakenaasen, Randi; Lovold, Stian
2003-01-01
Infrared technology in Norway started at the Norwegian Defense Research Establishment (FFI) in the 1960s, and has since then spread to universities, other research institutes and industry. FFI has a large, integrated IR activity that includes research and development in IR detectors, optics design, optical coatings, advanced dewar design, modelling/simulation of IR scenes, and image analysis. Part of the integrated activity is a laboratory for more basic research in materials science and semiconductor physics, in which thin films of CdHgTe are grown by molecular beam epitaxy and processed into IR detectors by various techniques. FFI also has a lot of experience in research and development of tunable infrared lasers for various applications. Norwegian industrial activities include production of infrared homing anti-ship missiles, laser rangefinders, various infrared gas sensors, hyperspectral cameras, and fiberoptic sensor systems for structural health monitoring and offshore oil well diagnostics.
Location precision analysis of stereo thermal anti-sniper detection system
NASA Astrophysics Data System (ADS)
He, Yuqing; Lu, Ya; Zhang, Xiaoyan; Jin, Weiqi
2012-06-01
Anti-sniper detection devices are an urgent requirement in modern warfare, and the precision of the anti-sniper detection system is especially important. This paper analyzes the location precision of an anti-sniper detection system based on a dual thermal imaging system. It mainly discusses two sources of error: the digital quantization effects of the camera, and the error in estimating the coordinates of the bullet trajectory from the infrared images during image matching. The error-analysis formula is derived from the stereovision model and the digital quantization effects of the camera. From this, we obtain the relationship between the detection accuracy and the system's parameters. The analysis in this paper provides the theoretical basis for error compensation algorithms that are put forward to improve the accuracy of 3D reconstruction of the bullet trajectory in anti-sniper detection devices.
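The abstract does not reproduce the error formula itself; a standard stereo-vision relation of the kind it refers to is sketched below, where f is the focal length in pixels, B the baseline between the two thermal cameras, d the disparity, and δd the disparity error arising from pixel quantization and matching. This is the textbook form, not necessarily the exact expression derived in the paper.

```latex
Z = \frac{fB}{d}, \qquad
\delta Z \approx \left|\frac{\partial Z}{\partial d}\right| \delta d
        = \frac{fB}{d^{2}}\,\delta d
        = \frac{Z^{2}}{fB}\,\delta d .
```

The range error therefore grows quadratically with target distance and shrinks with a longer baseline, a longer focal length, or sub-pixel disparity estimation, which is consistent with the parameter dependence the paper sets out to quantify.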
Development of low-cost high-performance multispectral camera system at Banpil
NASA Astrophysics Data System (ADS)
Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.
2014-05-01
Banpil Photonics (Banpil) has developed a low-cost, high-performance multispectral camera system for visible to short-wave infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial, and industrial applications. The 640x512-pixel uncooled InGaAs camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity of less than 100 electrons, a high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and low power consumption below 1 W. These are practically all the features highly desirable in military imaging applications to expand deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor with an innovation integrating advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g., the focal plane array (FPA) and read-out integrated circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial, and industrial applications that will benefit from this high-performance imaging system and their forecast cost structure is presented.
NASA Astrophysics Data System (ADS)
Gicquel, Adeline; Vincent, Jean-Baptiste; Sierks, Holger; Rose, Martin; Agarwal, Jessica; Deller, Jakob; Guettler, Carsten; Hoefner, Sebastian; Hofmann, Marc; Hu, Xuanyu; Kovacs, Gabor; Oklay Vincent, Nilda; Shi, Xian; Tubiana, Cecilia; Barbieri, Cesare; Lamy, Phylippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team
2016-10-01
Images of the nucleus and the coma (gas and dust) of comet 67P/Churyumov-Gerasimenko have been acquired by the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) camera system since March 2014, using both the wide angle camera (WAC) and the narrow angle camera (NAC). We use the NAC camera to study the bright outburst observed on July 29th, 2015 in the southern hemisphere. The NAC covers wavelengths between 250-1000 nm with a combination of 12 filters. High spatial resolution is needed to localize the source point of the outburst on the surface of the nucleus. At the time of the observations, the heliocentric distance was 1.25 AU and the distance between the spacecraft and the comet was 126 km. We aim to understand the physics leading to such outgassing: is the jet associated with the outburst controlled by the micro-topography, or by suddenly exposed ice? We use the Direct Simulation Monte Carlo (DSMC) method to study the gas flow close to the nucleus. The goal of the DSMC code is to reproduce the opening angle of the jet and constrain the outgassing ratio between the outburst source and the local region. The results of this model are compared to the images obtained with the NAC camera.
VizieR Online Data Catalog: Antennae galaxies (NGC 4038/4039) revisited (Whitmore+, 2010)
NASA Astrophysics Data System (ADS)
Whitmore, B. C.; Chandar, R.; Schweizer, F.; Rothberg, B.; Leitherer, C.; Rieke, M.; Rieke, G.; Blair, W. P.; Mengel, S.; Alonso-Herrero, A.
2012-06-01
Observations of the main bodies of NGC 4038/39 were made with the Hubble Space Telescope (HST), using the ACS, as part of Program GO-10188. Multi-band photometry was obtained in the following optical broadband filters: F435W (~B), F550M (~V), and F814W (~I). Archival F336W photometry of the Antennae (Program GO-5962) was used to supplement our optical ACS/WFC observations. Infrared observations were made using the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) camera on HST as part of Program GO-10188. Observations were made using the NIC2 camera with the F160W, F187N, and F237M filters, and the NIC3 camera with the F110W, F160W, F164W, F187N, and F222M filters. (10 data files).
Barabino, G; Klein, J P; Porcheron, J; Grichine, A; Coll, J-L; Cottier, M
2016-12-01
This study assesses the value of using Intraoperative Near Infrared Fluorescence Imaging and Indocyanine green to detect colorectal carcinomatosis during oncological surgery. In colorectal carcinomatosis cancer, two of the most important prognostic factors are completeness of staging and completeness of cytoreductive surgery. Presently, intraoperative assessment of tumoral margins relies on palpation and visual inspection. The recent introduction of Near Infrared fluorescence image guidance provides new opportunities for surgical roles, particularly in cancer surgery. The study was a non-randomized, monocentric, pilot "ex vivo" blinded clinical trial validated by the ethical committee of University Hospital of Saint Etienne. Ten patients with colorectal carcinomatosis cancer scheduled for cytoreductive surgery were included. Patients received 0.25 mg/kg of Indocyanine green intravenously 24 h before surgery. A Near Infrared camera was used to detect "ex-vivo" fluorescent lesions. There was no surgical mortality. Each analysis was done blindly. In a total of 88 lesions analyzed, 58 were classified by a pathologist as cancerous and 30 as non-cancerous. Among the 58 cancerous lesions, 42 were correctly classified by the Intraoperative Near-Infrared camera (sensitivity of 72.4%). Among the 30 non-cancerous lesions, 18 were correctly classified by the Intraoperative Near-Infrared camera (specificity of 60.0%). Near Infrared fluorescence imaging is a promising technique for intraoperative tumor identification. It could help the surgeon to determine resection margins and reduce the risk of locoregional recurrence. Copyright © 2016 Elsevier Ltd, BASO ~ the Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
NASA Astrophysics Data System (ADS)
Olafsen, L. J.; Olafsen, J. S.; Eaves, I. K.
2018-06-01
We report on an experimental investigation of the time-dependent spatial intensity distribution of near-infrared idler pulses from an optical parametric oscillator measured using an infrared (IR) camera, in contrast to beam profiles obtained using traditional knife-edge techniques. Comparisons show the information gained by utilizing the thermal camera provides more detail than the spatially- or time-averaged measurements from a knife-edge profile. Synchronization, averaging, and thresholding techniques are applied to enhance the images acquired. The additional information obtained can improve the process by which semiconductor devices and other IR lasers are characterized for their beam quality and output response and thereby result in IR devices with higher performance.
2017-12-08
NASA image release September 17, 2010. In preparation for a cryogenic test, NASA Goddard technicians install instrument mass simulators onto the James Webb Space Telescope ISIM structure. The ISIM structure supports and holds the four Webb telescope science instruments: the Mid-Infrared Instrument (MIRI), the Near-Infrared Camera (NIRCam), the Near-Infrared Spectrograph (NIRSpec) and the Fine Guidance Sensor (FGS). Credit: NASA/GSFC/Chris Gunn. To learn more about the James Webb Space Telescope go to: www.jwst.nasa.gov/
A Gender Identification System for Customers in a Shop Using Infrared Area Scanners
NASA Astrophysics Data System (ADS)
Tajima, Takuya; Kimura, Haruhiko; Abe, Takehiko; Abe, Koji; Nakamoto, Yoshinori
Information about customers in shops plays an important role in marketing analysis. Currently, in convenience stores and supermarkets, the gender of customers is judged by clerks. Gender identification systems using camera images have also been investigated; however, these systems have the problem of invading customers' privacy when identifying their attributes. The proposed system identifies gender using infrared area scanners and a Bayesian network. Since the infrared area scanners do not capture customers' images directly, privacy is not invaded. The proposed method uses three parameters: height, walking speed, and pace. In general, these parameters carry information about a person's sex, and the Bayesian network is designed over these three parameters. The proposed method resolves the existing problems of restricted installation locations and privacy invasion. Experimental results using data obtained from 450 people show that the identification rate of the proposed method was 91.3% on average over both male and female identifications.
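A minimal sketch of the kind of probabilistic classifier described above. The actual paper uses a Bayesian network over height, walking speed, and pace; the naive-Bayes simplification and the Gaussian parameters below are invented placeholders, not the study's learned values.

```python
import math

# hypothetical per-class Gaussian parameters (mean, std) for
# height [cm], walking speed [m/s], and pace [steps/s]
PARAMS = {
    "male":   [(171.0, 6.0), (1.35, 0.15), (1.85, 0.12)],
    "female": [(158.0, 6.0), (1.25, 0.15), (1.95, 0.12)],
}
PRIOR = {"male": 0.5, "female": 0.5}

def gaussian_logpdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def classify(height, speed, pace):
    """Naive-Bayes style gender estimate from the three gait parameters."""
    scores = {}
    for label, params in PARAMS.items():
        score = math.log(PRIOR[label])
        for x, (mu, sigma) in zip((height, speed, pace), params):
            score += gaussian_logpdf(x, mu, sigma)
        scores[label] = score
    return max(scores, key=scores.get)

print(classify(175.0, 1.4, 1.8))   # -> "male" with these placeholder parameters
```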
Cameras Reveal Elements in the Short Wave Infrared
NASA Technical Reports Server (NTRS)
2010-01-01
Goodrich ISR Systems Inc. (formerly Sensors Unlimited Inc.), based out of Princeton, New Jersey, received Small Business Innovation Research (SBIR) contracts from the Jet Propulsion Laboratory, Marshall Space Flight Center, Kennedy Space Center, Goddard Space Flight Center, Ames Research Center, Stennis Space Center, and Langley Research Center to assist in advancing and refining indium gallium arsenide imaging technology. Used on the Lunar Crater Observation and Sensing Satellite (LCROSS) mission in 2009 for imaging the short wave infrared wavelengths, the technology has dozens of applications in military, security and surveillance, machine vision, medical, spectroscopy, semiconductor inspection, instrumentation, thermography, and telecommunications.
High-speed mid-infrared hyperspectral imaging using quantum cascade lasers
NASA Astrophysics Data System (ADS)
Kelley, David B.; Goyal, Anish K.; Zhu, Ninghui; Wood, Derek A.; Myers, Travis R.; Kotidis, Petros; Murphy, Cara; Georgan, Chelsea; Raz, Gil; Maulini, Richard; Müller, Antoine
2017-05-01
We report on a standoff chemical detection system using widely tunable external-cavity quantum cascade lasers (ECQCLs) to illuminate target surfaces in the mid infrared (λ = 7.4 - 10.5 μm). Hyperspectral images (hypercubes) are acquired by synchronously operating the EC-QCLs with a LN2-cooled HgCdTe camera. The use of rapidly tunable lasers and a high-frame-rate camera enables the capture of hypercubes with 128 x 128 pixels and >100 wavelengths in <0.1 s. Furthermore, raster scanning of the laser illumination allowed imaging of a 100-cm2 area at 5-m standoff. Raw hypercubes are post-processed to generate a hypercube that represents the surface reflectance relative to that of a diffuse reflectance standard. Results will be shown for liquids (e.g., silicone oil) and solid particles (e.g., caffeine, acetaminophen) on a variety of surfaces (e.g., aluminum, plastic, glass). Signature spectra are obtained for particulate loadings of RDX on glass of <1 μg/cm2.
NASA Astrophysics Data System (ADS)
Jeong, Mira; Nam, Jae-Yeal; Ko, Byoung Chul
2017-09-01
In this paper, we focus on pupil center detection in video sequences that include head poses and changes in illumination. To detect the pupil center, we first find four eye landmarks in each eye using cascaded local regression based on a regression forest. Based on the rough location of the pupil, a fast radial symmetry transform is applied around the previously found pupil location to refine the pupil center. As the final step, the pupil displacement between the previous frame and the current frame is estimated to maintain accuracy against false localization results occurring in particular frames. We generated a new face dataset, called Keimyung University pupil detection (KMUPD), with an infrared camera. The proposed method was successfully applied to the KMUPD dataset, and the results indicate that its pupil center detection capability is better than that of other methods, with a shorter processing time.
Far-infrared and 3D imaging for doneness assessment in chicken breast
NASA Astrophysics Data System (ADS)
Tao, Yang; Ibarra, Juan G.
2001-03-01
Sensor fusion of infrared imaging and range imaging is proposed to estimate the internal temperature of just-cooked chicken breasts. An infrared camera operating at 8-12 microns registered the surface temperature of cooked meat samples, while a single-line structured-light system located the thickest region of the meat target. In this region of interest, a combined time-series/neural-network method is applied to correlate the internal and external temperatures during the cool-down process. Experimental verification in a pilot-plant oven is presented. To ensure food safety, a mandatory regulation requires all poultry processors in the U.S.A. to verify that all ready-to-eat products reach a minimum endpoint temperature (71 °C for chicken breast), but no current assay can non-invasively inspect all the samples. The proposed system has the potential for on-line inspection of ready-to-eat meat for food quality and safety.
High speed Infrared imaging method for observation of the fast varying temperature phenomena
NASA Astrophysics Data System (ADS)
Moghadam, Reza; Alavi, Kambiz; Yuan, Baohong
With recent improvements in high-end commercial R&D camera technologies, many challenges in high-speed IR camera imaging have been overcome. The core benefits of this technology are the ability to capture fast-varying phenomena without image blur, to acquire enough data to properly characterize dynamic energy, and to increase the dynamic range without compromising the number of frames per second. This study presents a noninvasive method for determining the intensity field of a High Intensity Focused Ultrasound (HIFU) beam using infrared imaging. A high-speed infrared camera was placed above the tissue-mimicking material that was heated by HIFU, with no other sensors present in the HIFU axial beam. A MATLAB simulation code was used to perform a finite-element solution of the pressure-wave propagation and heat equations within the phantom, and the temperature rise in the phantom was computed. Three different power levels of HIFU transducers were tested, and the predicted temperature increases were within about 25% of the IR measurements. The fundamental theory and methods developed in this research can be used to detect fast-varying temperature phenomena in combination with infrared filters.
Continuous All-Sky Cloud Measurements: Cloud Fraction Analysis Based on a Newly Developed Instrument
NASA Astrophysics Data System (ADS)
Aebi, C.; Groebner, J.; Kaempfer, N.; Vuilleumier, L.
2017-12-01
Clouds play an important role in the climate system and are also a crucial parameter for the Earth's surface energy budget. Ground-based measurements of clouds provide data at high temporal resolution in order to quantify their influence on radiation. The newly developed all-sky cloud camera at PMOD/WRC in Davos (Switzerland), the infrared cloud camera (IRCCAM), is a microbolometer sensitive in the 8-14 μm wavelength range. To obtain all-sky information, the camera is mounted on top of a frame looking downward onto a spherical gold-plated mirror. The IRCCAM has been measuring continuously (day and night) with a time resolution of one minute in Davos since September 2015. To assess the performance of the IRCCAM, two different visible all-sky cameras (Mobotix Q24M and Schreder VIS-J1006), which can only operate during daytime, are installed in Davos. All three camera systems use different software for calculating fractional cloud coverage from images. Our study mainly analyzes the fractional cloud coverage of the IRCCAM and compares it with the fractional cloud coverage calculated from the two visible cameras. Preliminary results on the measurement accuracy of the IRCCAM compared to the visible cameras indicate that 78% of the data are within ±1 octa and 93% within ±2 octas. An uncertainty of 1-2 octas corresponds to the measurement uncertainty of human observers. Therefore, the IRCCAM shows similar performance in the detection of cloud coverage as the visible cameras and human observers, with the advantage that continuous measurements with high temporal resolution are possible.
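The ±1 and ±2 octa agreement figures quoted above can be reproduced from paired cloud-fraction time series as in the sketch below. The array names are placeholders, and the image-processing chains that produce the cloud fractions for each camera are not shown.

```python
import numpy as np

def octa_agreement(cf_ir, cf_vis):
    """Compare two cloud-fraction series (values in 0..1) after converting
    to octas (0..8); return the fraction of samples within +/-1 and +/-2 octas."""
    octa_ir = np.rint(np.asarray(cf_ir) * 8)
    octa_vis = np.rint(np.asarray(cf_vis) * 8)
    diff = np.abs(octa_ir - octa_vis)
    return (diff <= 1).mean(), (diff <= 2).mean()

# hypothetical usage:
# within1, within2 = octa_agreement(irccam_cloud_fraction, mobotix_cloud_fraction)
```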
Real-time Enhancement, Registration, and Fusion for a Multi-Sensor Enhanced Vision System
NASA Technical Reports Server (NTRS)
Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.
2006-01-01
Over the last few years NASA Langley Research Center (LaRC) has been developing an Enhanced Vision System (EVS) to aid pilots while flying in poor visibility conditions. The EVS captures imagery using two infrared video cameras. The cameras are placed in an enclosure that is mounted and flown forward-looking underneath the NASA LaRC ARIES 757 aircraft. The data streams from the cameras are processed in real time and displayed on monitors on board the aircraft. With proper processing the camera system can provide better-than-human-observed imagery, particularly during poor visibility conditions. However, obtaining this goal requires several different stages of processing, including enhancement, registration, and fusion, and specialized processing hardware for real-time performance. We use a real-time implementation of the Retinex algorithm for image enhancement, affine transformations for registration, and weighted sums to perform fusion. All of the algorithms are executed on a single TI DM642 digital signal processor (DSP) clocked at 720 MHz. The image processing components were added to the EVS system, tested, and demonstrated during flight tests in August and September of 2005. In this paper we briefly discuss the EVS image processing hardware and algorithms. We then discuss implementation issues and show examples of the results obtained during flight tests. Keywords: enhanced vision system, image enhancement, retinex, digital signal processing, sensor fusion
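A minimal sketch of the registration-plus-weighted-sum fusion step described above, written with OpenCV. The affine matrix and the weights are placeholders, and the real EVS pipeline additionally runs Retinex enhancement on a DSP before this stage.

```python
import cv2
import numpy as np

def register_and_fuse(img_a, img_b, affine_2x3, w_a=0.6, w_b=0.4):
    """Warp sensor B's frame into sensor A's frame with a pre-calibrated
    affine transform, then fuse the two single-channel images by weighted sum."""
    h, w = img_a.shape[:2]
    img_b_reg = cv2.warpAffine(img_b, affine_2x3, (w, h))   # registration
    return cv2.addWeighted(img_a, w_a, img_b_reg, w_b, 0.0) # fusion

# hypothetical pre-calibrated transform (identity plus a small pixel shift):
M = np.float32([[1, 0, 3.5], [0, 1, -2.0]])
# fused = register_and_fuse(longwave_frame, shortwave_frame, M)
```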
Benmiloud, Fares; Rebaudet, Stanislas; Varoquaux, Arthur; Penaranda, Guillaume; Bannier, Marie; Denizot, Anne
2018-01-01
The clinical impact of intraoperative autofluorescence-based identification of parathyroids using a near-infrared camera remains unknown. In a before-and-after controlled study, we compared all patients who underwent total thyroidectomy by the same surgeon during Period 1 (January 2015 to January 2016) without near-infrared (near-infrared- group) and those operated on during Period 2 (February 2016 to September 2016) using a near-infrared camera (near-infrared+ group). In parallel, we also compared all patients who underwent surgery without near-infrared during those same periods by another surgeon in the same unit (control groups). Main outcomes included postoperative hypocalcemia, parathyroid identification, autotransplantation, and inadvertent resection. The near-infrared+ group displayed significantly lower postoperative hypocalcemia rates (5.2%) than the near-infrared- group (20.9%; P < .001). Compared with the near-infrared- patients, the near-infrared+ group exhibited an increased mean number of identified parathyroids and reduced parathyroid autotransplantation rates, although no difference was observed in inadvertent resection rates. Parathyroids were identified via near-infrared before they were visualized by the surgeon in 68% of patients. In the control groups, parathyroid identification improved significantly from Period 1 to Period 2, although autotransplantation, inadvertent resection and postoperative hypocalcemia rates did not differ. Near-infrared use during total thyroidectomy significantly reduced postoperative hypocalcemia, improved parathyroid identification and reduced their autotransplantation rate. Copyright © 2017 Elsevier Inc. All rights reserved.
Near-surface Thermal Infrared Imaging of a Mixed Forest
NASA Astrophysics Data System (ADS)
Aubrecht, D. M.; Helliker, B. R.; Richardson, A. D.
2014-12-01
Measurement of an organism's temperature is of basic physiological importance and therefore necessary for ecosystem modeling, yet most models derive leaf temperature from energy balance arguments or assume it is equal to air temperature. This is because continuous, direct measurement of leaf temperature outside of a controlled environment is difficult and rarely done. Of even greater challenge is measuring leaf temperature with the resolution required to understand the underlying energy balance and regulation of plant processes. To measure leaf temperature through the year, we have mounted a high-resolution, thermal infrared camera overlooking the canopy of a temperate deciduous forest. The camera is co-located with an eddy covariance system and a suite of radiometric sensors. Our camera measures longwave thermal infrared (λ = 7.5-14 microns) using a microbolometer array. Suspended in the canopy within the camera FOV is a matte black copper plate instrumented with fine wire thermocouples that acts as a thermal reference for each image. In this presentation, I will discuss the challenges of continuous, long-term field operation of the camera, as well as measurement sensitivity to physical and environmental parameters. Based on this analysis, I will show that the uncertainties in converting radiometric signal to leaf temperature are well constrained. The key parameter for minimizing uncertainty is the emissivity of the objects being imaged: measuring the emissivity to within 0.01 enables leaf temperature to be calculated to within 0.5°C. Finally, I will present differences in leaf temperature observed amongst species. From our two-year record, we characterize high frequency, daily, and seasonal thermal signatures of leaves and crowns, in relation to environmental conditions. Our images are taken with sufficient spatial and temporal resolution to quantify the preferential heating of sunlit portions of the canopy and the cooling effect of wind gusts. Future work will be focused on correlations between hyperspectral vegetation indices, fluxes, and thermal signatures to characterize vegetation stress. As water stress increases, causing photosynthesis and transpiration to shutdown, heat fluxes, leaf temperature, and narrow band vegetation indices should report signatures of the affected processes.
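As a rough illustration of the emissivity sensitivity quoted above, the following simplified broadband (Stefan-Boltzmann) sketch converts a measured band-integrated signal to surface temperature while correcting for reflected background radiation. The real camera calibration uses the sensor's band-limited (7.5-14 micron) response and the in-scene reference plate, so the numbers here are only indicative.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temperature(l_measured, emissivity, t_background_k):
    """Invert L = eps*sigma*T^4 + (1 - eps)*sigma*Tbg^4 for the surface temperature T.
    l_measured is a band-integrated radiant exitance in W/m^2 (broadband approximation)."""
    l_reflected = (1.0 - emissivity) * SIGMA * t_background_k ** 4
    return ((l_measured - l_reflected) / (emissivity * SIGMA)) ** 0.25

# sensitivity check: how much does a 0.01 emissivity error shift the retrieved temperature?
# synthetic leaf at 25 C (eps = 0.98) under a cold sky background of -10 C
l_leaf = 0.98 * SIGMA * 298.15 ** 4 + 0.02 * SIGMA * 263.15 ** 4
shift = abs(surface_temperature(l_leaf, 0.98, 263.15) - surface_temperature(l_leaf, 0.97, 263.15))
print(f"temperature shift for 0.01 emissivity error: {shift:.2f} K")  # a few tenths of a kelvin
```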
Miniature infrared hyperspectral imaging sensor for airborne applications
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl
2017-05-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, both MWIR and LWIR, small enough to serve as a payload on miniature unmanned aerial vehicles. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics are made up of an area array of diffractive optical elements where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system small enough for a payload on a mini-UAV or commercial quadcopter. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.
Infrared hyperspectral imaging miniaturized for UAV applications
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl
2017-02-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, both MWIR and LWIR, small enough to serve as a payload on miniature unmanned aerial vehicles. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics are made up of an area array of diffractive optical elements where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system small enough for a payload on a mini-UAV or commercial quadcopter. We also give an example of how this technology can easily be used to quantify a hydrocarbon gas leak's volume and mass flow rates. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.
NASA Astrophysics Data System (ADS)
Arvidson, R. E.; Squyres, S. W.; Baumgartner, E. T.; Schenker, P. S.; Niebur, C. S.; Larsen, K. W.; Seelos, F. P., IV; Snider, N. O.; Jolliff, B. L.
2002-08-01
The Field Integration Design and Operations (FIDO) prototype Mars rover was deployed and operated remotely for 2 weeks in May 2000 in the Black Rock Summit area of Nevada. The blind science operation trials were designed to evaluate the extent to which FIDO-class rovers can be used to conduct traverse science and collect samples. FIDO-based instruments included stereo cameras for navigation and imaging, an infrared point spectrometer, a color microscopic imager for characterization of rocks and soils, and a rock drill for core acquisition. Body-mounted "belly" cameras aided drill deployment, and front and rear hazard cameras enabled terrain hazard avoidance. Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data, a high spatial resolution IKONOS orbital image, and a suite of descent images were used to provide regional- and local-scale terrain and rock type information, from which hypotheses were developed for testing during operations. The rover visited three sites, traversed 30 m, and acquired 1.3 gigabytes of data. The relatively small traverse distance resulted from a geologically rich site in which materials identified on a regional scale from remote-sensing data could be identified on a local scale using rover-based data. Results demonstrate the synergy of mapping terrain from orbit and during descent using imaging and spectroscopy, followed by a rover mission to test inferences and to make discoveries that can be accomplished only with surface mobility systems.
Near-infrared fluorescence imaging with a mobile phone (Conference Presentation)
NASA Astrophysics Data System (ADS)
Ghassemi, Pejhman; Wang, Bohan; Wang, Jianting; Wang, Quanzeng; Chen, Yu; Pfefer, T. Joshua
2017-03-01
Mobile phone cameras employ sensors with near-infrared (NIR) sensitivity, yet this capability has not been exploited for biomedical purposes. Removing the IR-blocking filter from a phone-based camera opens the door to a wide range of techniques and applications for inexpensive, point-of-care biophotonic imaging and sensing. This study provides proof of principle for one of these modalities - phone-based NIR fluorescence imaging. An imaging system was assembled using a 780 nm light source along with excitation and emission filters with 800 nm and 825 nm cut-off wavelengths, respectively. Indocyanine green (ICG) was used as an NIR fluorescence contrast agent in an ex vivo rodent model, a resolution test target and a 3D-printed, tissue-simulating vascular phantom. Raw and processed images for red, green and blue pixel channels were analyzed for quantitative evaluation of fundamental performance characteristics including spectral sensitivity, detection linearity and spatial resolution. Mobile phone results were compared with a scientific CCD. The spatial resolution of the CCD system was consistently superior to the phone's, and the green phone-camera pixels showed better resolution than the blue or red channels. The CCD exhibited similar sensitivity to the processed red and blue pixel channels, yet a greater degree of detection linearity. Raw phone pixel data showed lower sensitivity but greater linearity than processed data. Overall, both qualitative and quantitative results provided strong evidence of the potential of phone-based NIR imaging, which may lead to a wide range of applications from cancer detection to glucose sensing.
X-ray ‘ghost images’ could cut radiation doses
NASA Astrophysics Data System (ADS)
Chen, Sophia
2018-03-01
On its own, a single-pixel camera captures pictures that are pretty dull: squares that are completely black, completely white, or some shade of gray in between. All it does, after all, is detect brightness. Yet by connecting a single-pixel camera to a patterned light source, a team of physicists in China has made detailed x-ray images using a statistical technique called ghost imaging, first pioneered 20 years ago in infrared and visible light. Researchers in the field say future versions of this system could take clear x-ray photographs with cheap cameras—no need for lenses and multipixel detectors—and less cancer-causing radiation than conventional techniques.
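A toy version of the correlation step at the heart of computational ghost imaging: the scene is reconstructed by correlating the single-pixel ("bucket") readings with the known illumination patterns. This is a generic illustration with synthetic data, not the x-ray implementation described in the article.

```python
import numpy as np

rng = np.random.default_rng(1)
H = W = 32
n_patterns = 4000

# hypothetical object: a bright rectangle on a dark background
obj = np.zeros((H, W))
obj[10:22, 12:20] = 1.0

patterns = rng.random((n_patterns, H, W))       # known structured illumination patterns
bucket = (patterns * obj).sum(axis=(1, 2))      # single-pixel ("bucket") detector readings

# ghost image: <B_i * P_i> - <B> * <P>, averaged over the pattern ensemble
ghost = (bucket[:, None, None] * patterns).mean(axis=0) - bucket.mean() * patterns.mean(axis=0)
# 'ghost' now shows the rectangle, recovered without any spatially resolving detector
```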
EVA 5 - MS Grunsfeld and Linnehan in payload bay
2002-03-08
STS109-E-5750 (8 March 2002) --- Astronaut John M. Grunsfeld, STS-109 payload commander, floats near the giant Hubble Space Telescope (HST) temporarily hosted in the Space Shuttle Columbia's cargo bay. Astronaut Richard M. Linnehan (lower right), mission specialist, works in tandem with Grunsfeld during this fifth and final scheduled space walk. Activities for EVA-5 centered around the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) to install a Cryogenic Cooler and its Cooling System Radiator. The space walk was completed at 10:06 a.m. CST (1606 GMT), March 8, 2002. The image was recorded with a digital still camera.
Attempt of Serendipitous Science During the Mojave Volatile Prospector Field Expedition
NASA Technical Reports Server (NTRS)
Roush, T. L.; Colaprete, A.; Heldmann, J.; Lim, D. S. S.; Cook, A.; Elphic, R.; Deans, M.; Fluckiger, L.; Fritzler, E.; Hunt, David
2015-01-01
On 23 October a partial solar eclipse occurred across parts of the southwest United States between approximately 21:09 and 23:40 (UT), with maximum obscuration, 36%, occurring at 22:29 (UT). During 21-26 October 2014 the Mojave Volatile Prospector (MVP) field expedition deployed and operated the NASA Ames Krex2 rover in the Mojave Desert west of Baker, California (Fig. 1, bottom). The MVP field expedition's primary goal was to characterize the surface and sub-surface soil moisture properties within desert alluvial fans, and its secondary goal was to provide mission operations simulations of the Resource Prospector (RP) mission to a lunar pole. The partial solar eclipse provided an opportunity during MVP operations for serendipitous science. Science instruments on Krex2 included a neutron spectrometer, a near-infrared spectrometer with an associated imaging camera, and an independent camera coupled with software to characterize the surface textures of the areas encountered. All of these devices are focused on the surface and as a result are downward looking. In addition to these science instruments, two hazard cameras are mounted on Krex2. The chief device used to monitor the partial solar eclipse was the engineering development unit of the Near-Infrared Volatile Spectrometer System (NIRVSS) near-infrared spectrometer. This device uses two separate fiber-optic-fed Hadamard transform spectrometers. The short-wave and long-wave spectrometers measure the 1600-2400 and 2300-3400 nm wavelength regions with resolutions of 10 and 13 nm, respectively. Data are obtained approximately every 8 seconds. The NIRVSS stares in the direction opposite to the front of the Krex2 rover.
NASA Technical Reports Server (NTRS)
Georgieva, E. M.; Huang, W.; Heaps, W. S.
2012-01-01
A portable remote sensing system for precision column measurements of methane has been developed, built, and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera, and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions, which incorporates an uncooled InGaAs detector array sensitive up to 1.7 micrometers. Custom software was developed, in addition to the basic graphical user interface X-Control provided by the company, to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes of the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.
Design, demonstration and testing of low F-number LWIR panoramic imaging relay optics
NASA Astrophysics Data System (ADS)
Furxhi, Orges; Frascati, Joe; Driggers, Ronald
2018-04-01
Panoramic imaging is inherently wide field of view. High-sensitivity uncooled Long Wave Infrared (LWIR) imaging requires low F-number optics. These two requirements result in short back working distance designs that, in addition to being costly, are challenging to integrate with commercially available uncooled LWIR cameras and cores. Common challenges include the relocation of the shutter flag, custom calibration of the camera dynamic range and NUC tables, focusing, and athermalization. Solutions to these challenges add to the system cost and make panoramic uncooled LWIR cameras commercially unattractive. In this paper, we present the design of Panoramic Imaging Relay Optics (PIRO) and show imagery and test results with one of the first prototypes. PIRO designs use several reflective surfaces (generally two) to relay a panoramic scene onto a real, donut-shaped image. The PIRO donut is imaged on the focal plane of the camera using a commercial off-the-shelf (COTS) low F-number lens. This approach results in low component cost and effortless integration with pre-calibrated commercially available cameras and lenses.
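Before display or analysis, the donut-shaped PIRO image is typically unwrapped into a conventional rectangular panorama. The abstract does not describe that step; the following is a minimal sketch of one common approach using OpenCV's polar unwarping, with a synthetic annulus standing in for a real LWIR frame and the donut geometry assumed for illustration.

```python
import cv2
import numpy as np

# Synthetic stand-in for a LWIR frame containing the donut-shaped PIRO image
# (in practice this would be a frame grabbed from the uncooled camera).
h, w = 480, 640
yy, xx = np.mgrid[0:h, 0:w]
center = (w // 2, h // 2)
r = np.hypot(xx - center[0], yy - center[1])
frame = ((r > 80) & (r < 220)).astype(np.uint8) * 200   # bright annulus = "scene"

# Unwrap the annulus into a rectangular image. In cv2.warpPolar the output
# columns correspond to radius and the rows to azimuth angle.
outer_radius = 220.0
unwrapped = cv2.warpPolar(frame, (220, 720), center, outer_radius, cv2.WARP_POLAR_LINEAR)

# Transpose so that azimuth runs horizontally, as in a conventional panorama.
panorama = cv2.transpose(unwrapped)
print(panorama.shape)   # (220, 720): radial samples x azimuth samples
```

In a real system the donut center and radii would come from a calibration of the relay optics rather than being hard-coded.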
Distance determination method of dust particles using Rosetta OSIRIS NAC and WAC data
NASA Astrophysics Data System (ADS)
Drolshagen, E.; Ott, T.; Koschny, D.; Güttler, C.; Tubiana, C.; Agarwal, J.; Sierks, H.; Barbieri, C.; Lamy, P. I.; Rodrigo, R.; Rickman, H.; A'Hearn, M. F.; Barucci, M. A.; Bertaux, J.-L.; Bertini, I.; Cremonese, G.; da Deppo, V.; Davidsson, B.; Debei, S.; de Cecco, M.; Deller, J.; Feller, C.; Fornasier, S.; Fulle, M.; Gicquel, A.; Groussin, O.; Gutiérrez, P. J.; Hofmann, M.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Knollenberg, J.; Kramm, J. R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Lopez Moreno, J. J.; Marzari, F.; Naletto, G.; Oklay, N.; Shi, X.; Thomas, N.; Poppe, B.
2017-09-01
The ESA Rosetta spacecraft has been tracking its target, the Jupiter-family comet 67P/Churyumov-Gerasimenko, in close vicinity for over two years. It hosts the OSIRIS instrument: the Optical, Spectroscopic, and Infrared Remote Imaging System, composed of two cameras, see e.g. Keller et al. (2007). In some imaging sequences dedicated to observing dust particles in the comet's coma, the two cameras took images at the same time. The aim of this work is to use these simultaneous double-camera observations to calculate the dust particles' distances to the spacecraft. As the two cameras are mounted on the spacecraft with an offset of 70 cm, the distance of a particle observed by both cameras can be determined from the shift of the particle's apparent trail between the two images. This paper presents first results of the ongoing work, introducing the distance determination method for the OSIRIS instrument and the analysis of an example particle. We note that this method works for particles in the range of about 500-6000 m from the spacecraft.
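The distance recovery described above is simple parallax: with a 70 cm baseline between the two cameras, a particle's range follows from the angular shift of its trail between the simultaneous images. A minimal sketch of that calculation is given below; the pixel scale and shift are placeholder values, not actual OSIRIS numbers.

```python
import math

def particle_distance(shift_px, pixel_scale_rad, baseline_m=0.70):
    """Estimate particle range from the apparent trail shift between two
    simultaneous images taken by cameras separated by `baseline_m`.

    shift_px        -- shift of the particle trail between the two images, in pixels
    pixel_scale_rad -- angular size of one pixel in radians (instrument-specific)
    """
    parallax = shift_px * pixel_scale_rad      # small-angle parallax [rad]
    return baseline_m / math.tan(parallax)     # range to the particle [m]

# Placeholder numbers: a 20-pixel shift at 1e-5 rad/pixel corresponds to about
# 3500 m, inside the ~500-6000 m working range quoted in the abstract.
print(particle_distance(shift_px=20, pixel_scale_rad=1e-5))
```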
Optical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S; Seppala, L; Gilmore, K
2008-07-16
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultraviolet to near-infrared wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.
Strickland, Matt; Tremaine, Jamie; Brigley, Greg; Law, Calvin
2013-06-01
As surgical procedures become increasingly dependent on equipment and imaging, the need for sterile members of the surgical team to have unimpeded access to the nonsterile technology in their operating room (OR) is of growing importance. To our knowledge, our team is the first to use an inexpensive infrared depth-sensing camera (a component of the Microsoft Kinect) and software developed in-house to give surgeons a touchless, gestural interface with which to navigate their picture archiving and communication systems intraoperatively. The system was designed and developed with feedback from surgeons and OR personnel and with consideration of the principles of aseptic technique and gestural controls in mind. Simulation was used for basic validation before trialing in a pilot series of 6 hepatobiliary-pancreatic surgeries. The interface was used extensively in 2 laparoscopic and 4 open procedures. Surgeons primarily used the system for anatomic correlation, real-time comparison of intraoperative ultrasound with preoperative computed tomography and magnetic resonance imaging scans, and for teaching residents and fellows. The system worked well in a wide range of lighting conditions and procedures. It led to a perceived increase in the use of intraoperative image consultation. Further research should be focused on investigating the usefulness of touchless gestural interfaces in different types of surgical procedures and their effects on operative time.
Early forest fire detection using principal component analysis of infrared video
NASA Astrophysics Data System (ADS)
Saghri, John A.; Radjabi, Ryan; Jacobs, John T.
2011-09-01
A land-based early forest fire detection scheme which exploits the infrared (IR) temporal signature of the fire plume is described. Unlike common land-based and/or satellite-based techniques which rely on measurement and discrimination of the fire plume directly from its infrared and/or visible reflectance imagery, this scheme is based on exploitation of the fire plume's temporal signature, i.e., temperature fluctuations over the observation period. The method is simple and relatively inexpensive to implement. The false alarm rate is expected to be lower than that of existing methods. Land-based infrared (IR) cameras are installed in a step-stare-mode configuration in potential fire-prone areas. The sequence of IR video frames from each camera is digitally processed to determine if there is a fire within the camera's field of view (FOV). The process involves applying a principal component transformation (PCT) to each nonoverlapping sequence of video frames from the camera to produce a corresponding sequence of temporally-uncorrelated principal component (PC) images. Since pixels that form a fire plume exhibit statistically similar temporal variation (i.e., have a unique temporal signature), PCT conveniently renders the footprint/trace of the fire plume in low-order PC images. The PC image which best reveals the trace of the fire plume is then selected and spatially filtered via simple threshold and median filter operations to remove the background clutter, such as traces of moving tree branches due to wind.
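A minimal sketch of the processing chain described above, assuming a stack of co-registered IR frames held in a NumPy array; the selected PC index, threshold, and filter size are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.ndimage import median_filter

def fire_plume_trace(frames, pc_index=0, n_sigma=3.0, median_size=3):
    """Temporal principal component transform over a stack of IR frames.

    frames  -- array of shape (T, H, W): T co-registered frames from a staring camera
    returns -- binary map of pixels whose temporal behaviour loads strongly on the
               selected PC image (candidate fire-plume trace)
    """
    T, H, W = frames.shape
    X = frames.reshape(T, H * W).astype(np.float64)

    # Remove each pixel's temporal mean, then decompose the temporal structure
    # with an SVD (equivalent to the PCT over the frame sequence).
    X -= X.mean(axis=0, keepdims=True)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)

    # Each row of Vt is a "PC image": how strongly every pixel follows that
    # temporal pattern. Plume pixels share a common temperature fluctuation,
    # so they light up together in one of the low-order PC images.
    pc_image = Vt[pc_index].reshape(H, W)

    # Threshold and median-filter to suppress clutter such as wind-blown branches.
    candidate = (np.abs(pc_image) > n_sigma * pc_image.std()).astype(np.uint8)
    return median_filter(candidate, size=median_size) > 0

# Tiny synthetic usage: 60 noisy frames with a flickering 5x5 "plume" patch.
rng = np.random.default_rng(0)
frames = rng.normal(20.0, 0.2, size=(60, 64, 64))
frames[:, 30:35, 30:35] += 2.0 * np.sin(np.linspace(0, 12 * np.pi, 60))[:, None, None]
print(int(fire_plume_trace(frames).sum()), "plume-like pixels flagged")
```

In practice the PC image that best reveals the plume would be selected by inspection or a simple ranking rather than fixed in advance.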
Airborne laser systems for atmospheric sounding in the near infrared
NASA Astrophysics Data System (ADS)
Sabatini, Roberto; Richardson, Mark A.; Jia, Huamin; Zammit-Mangion, David
2012-06-01
This paper presents new techniques for atmospheric sounding using Near Infrared (NIR) laser sources, direct detection electro-optics and passive infrared imaging systems. These techniques allow a direct determination of atmospheric extinction and, through the adoption of suitable inversion algorithms, the indirect measurement of some important natural and man-made atmospheric constituents, including Carbon Dioxide (CO2). The proposed techniques are suitable for remote sensing missions performed using aircraft, satellites, Unmanned Aerial Vehicles (UAV), parachute/gliding vehicles, Roving Surface Vehicles (RSV), or Permanent Surface Installations (PSI). The various techniques proposed offer relative advantages in different scenarios. All are based on measurements of the laser energy/power incident on target surfaces of known geometric and reflective characteristics, by means of infrared detectors and/or infrared cameras calibrated for radiance. Experimental results are presented from ground and flight trials performed with laser systems operating in the near infrared (NIR) at λ = 1064 nm and λ = 1550 nm. This includes ground tests performed with 10 Hz and 20 kHz PRF NIR laser systems in a variety of atmospheric conditions, and flight trials performed with a 10 Hz airborne NIR laser system installed on a TORNADO aircraft, flying up to altitudes of 22,000 ft above ground level. Future activities are planned to validate the atmospheric retrieval algorithms developed for CO2 column density measurements, with emphasis on aircraft-related emissions at airports and other high air-traffic density environments.
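The core retrieval in these techniques is the atmospheric extinction coefficient obtained by comparing the laser energy actually received from a target of known reflectance with the energy expected in a lossless atmosphere, a Beer-Lambert relation. A minimal sketch follows; all numerical values and the one-way-path assumption are illustrative only.

```python
import math

def extinction_coefficient(received_energy, expected_energy, path_length_m):
    """Mean extinction coefficient [1/m] over a one-way path, from the ratio of
    measured to expected laser energy on a target of known geometry/reflectance.
    Beer-Lambert form assumed: E_received = E_expected * exp(-gamma * L)."""
    transmittance = received_energy / expected_energy
    return -math.log(transmittance) / path_length_m

# Illustrative numbers only: 65% of the expected 1064 nm energy received over 2 km.
gamma = extinction_coefficient(received_energy=0.65, expected_energy=1.0,
                               path_length_m=2000.0)
print(f"extinction ~ {gamma:.2e} 1/m")
```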
Tissue-equivalent TL sheet dosimetry system for X- and gamma-ray dose mapping.
Nariyama, N; Konnai, A; Ohnishi, S; Odano, N; Yamaji, A; Ozasa, N; Ishikawa, Y
2006-01-01
To measure dose distributions for X- and gamma rays simply and accurately, a tissue-equivalent thermoluminescent (TL) sheet-type dosemeter and reader system were developed. The TL sheet is composed of LiF:Mg,Cu,P and ETFE polymer, and the thickness is 0.2 mm. For the TL reading, a square heating plate, 20 cm on each side, was developed, and the temperature distribution was measured with an infrared thermal imaging camera. As a result, linearity within 2% and homogeneity within 3% were confirmed. The TL signal emitted is detected using a CCD camera and displayed as a spatial dose distribution. Irradiation using synchrotron radiation between 10 and 100 keV and (60)Co gamma rays showed that the TL sheet dosimetry system was promising for radiation dose mapping for various purposes.
Multi-band infrared camera systems
NASA Astrophysics Data System (ADS)
Davis, Tim; Lang, Frank; Sinneger, Joe; Stabile, Paul; Tower, John
1994-12-01
The program resulted in an IR camera system that utilizes a unique MOS addressable focal plane array (FPA) with full TV resolution, electronic control capability, and windowing capability. Two systems were delivered, each with two different camera heads: a Stirling-cooled 3-5 micron band head and a liquid nitrogen-cooled, filter-wheel-based, 1.5-5 micron band head. Signal processing features include averaging up to 16 frames, flexible compensation modes, gain and offset control, and real-time dither. The primary digital interface is a Hewlett-Packard standard GPIB (IEEE-488) port that is used to upload and download data. The FPA employs an X-Y addressed PtSi photodiode array, CMOS horizontal and vertical scan registers, horizontal signal line (HSL) buffers followed by a high-gain preamplifier and a depletion NMOS output amplifier. The 640 x 480 MOS X-Y addressed FPA has a high degree of flexibility in operational modes. By changing the digital data pattern applied to the vertical scan register, the FPA can be operated in either an interlaced or noninterlaced format. The thermal sensitivity performance of the second system's Stirling-cooled head was the best of the systems produced.
Multi-Angle Snowflake Camera Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shkurko, Konstantin; Garrett, T.; Gaustad, K
The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of depth of field for its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: images rely on the OpenCV image processing library and derived aggregated statistics rely on some clever averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
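The fallspeed measurement reduces to the vertical separation of the two trigger arrays divided by the time between the upper- and lower-array trigger events. A minimal sketch, with the timestamp source assumed:

```python
def fallspeed_m_per_s(upper_trigger_time_s, lower_trigger_time_s, array_gap_mm=32.0):
    """Hydrometeor fallspeed from MASC-style dual trigger arrays.

    The arrays are separated vertically by `array_gap_mm` (32 mm in the text);
    a falling particle crosses the upper array first, then the lower one.
    """
    dt = lower_trigger_time_s - upper_trigger_time_s
    if dt <= 0:
        raise ValueError("lower-array trigger must follow the upper-array trigger")
    return (array_gap_mm / 1000.0) / dt

# Example: 40 ms between triggers -> 0.8 m/s, a plausible snowflake fallspeed.
print(fallspeed_m_per_s(0.000, 0.040))
```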
An infra-red imaging system for the analysis of tropisms in Arabidopsis thaliana seedlings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orbovic, V.; Poff, K.L.
1990-05-01
Since blue and green light will induce phototropism and red light is absorbed by phytochrome, no wavelength of visible radiation should be considered safe for any study of tropisms in etiolated seedlings. For this reason, we have developed an infra-red imaging system with a video camera with which we can monitor seedlings using radiation at wavelengths longer than 800 nm. The image of the seedlings can be observed in real time, recorded on a VCR and subsequently analyzed using the Java image analysis system. The time courses for curvature of seedlings differ in shape, amplitude, and lag time. This variability accounts for much of the noise in the measurement of curvature for a population of seedlings.
2002-01-17
KENNEDY SPACE CENTER, FLA. -- In the Vertical Processing Facility, workers help guide the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) Cooling System into a protective enclosure on a payload carrier. NICMOS II is part of the payload on mission STS-109, the Hubble Space Telescope servicing mission. It is a new experimental cooling system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. It could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of mission STS-109 is scheduled for Feb. 28, 2002.
Reitsamer, H; Groiss, H P; Franz, M; Pflug, R
2000-01-31
We present a computer-guided microelectrode positioning system that is routinely used in our laboratory for intracellular electrophysiology and functional staining of retinal neurons. Wholemount preparations of isolated retina are kept in a superfusion chamber on the stage of an inverted microscope. Cells and layers of the retina are visualized by Nomarski interference contrast using infrared light in combination with a CCD camera system. After a five-point calibration has been performed, the electrode can be guided to any point inside the calibrated volume without moving the retina. Electrode deviations from target cells can be corrected by the software, further improving the precision of this system. The good visibility of cells avoids prelabeling with fluorescent dyes and makes it possible to work under completely dark-adapted conditions.
A fast infrared scanning technique for nondestructive testing
NASA Astrophysics Data System (ADS)
Hartikainen, Jari
1989-04-01
A simple and fast thermal NDT measurement system is described and its usefulness is demonstrated using a honeycomb structure as a test sample. The sample is heated with a hot air jet and the surface temperature differences due to subsurface defects are detected with a single HgCdTe detector. An image of the sample is formed by scanning over the sample surface with a deflection mirror in the y direction while moving the sample in the x direction. The measurement time is typically 6 s per image and several images are averaged to improve the signal-to-noise ratio. The main advantages of this system compared to conventional infrared camera techniques are considerably reduced cost and the ease with which the system can be adapted to various applications.
Non-destructive 3D shape measurement of transparent and black objects with thermal fringes
NASA Astrophysics Data System (ADS)
Brahm, Anika; Rößler, Conrad; Dietrich, Patrick; Heist, Stefan; Kühmstedt, Peter; Notni, Gunther
2016-05-01
Fringe projection is a well-established optical method for the non-destructive contactless three-dimensional (3D) measurement of object surfaces. Typically, fringe sequences in the visible wavelength range (VIS) are projected onto the surfaces of objects to be measured and are observed by two cameras in a stereo vision setup. The reconstruction is done by finding corresponding pixels in both cameras followed by triangulation. Problems can occur if the properties of some materials disturb the measurements. If the objects are transparent, translucent, reflective, or strongly absorbing in the VIS range, the projected patterns cannot be recorded properly. To overcome these challenges, we present a new alternative approach in the infrared (IR) region of the electromagnetic spectrum. For this purpose, two long-wavelength infrared (LWIR) cameras (7.5 - 13 μm) are used to detect the emitted heat radiation from surfaces which is induced by a pattern projection unit driven by a CO2 laser (10.6 μm). Thus, materials like glass or black objects, e.g. carbon fiber materials, can be measured non-destructively without the need of any additional paintings. We will demonstrate the basic principles of this heat pattern approach and show two types of 3D systems based on a freeform mirror and a GOBO wheel (GOes Before Optics) projector unit.
Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D.
Lasnier, C J; Allen, S L; Ellis, R E; Fenstermacher, M E; McLean, A G; Meyer, W H; Morris, K; Seppala, L G; Crabtree, K; Van Zeeland, M A
2014-11-01
An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.
NASA Technical Reports Server (NTRS)
Wiseman, Jennifer
2010-01-01
The Wasp-Waist Nebula was discovered in the IRAC c2d survey of the Ophiuchus starforming clouds. It is powered by a well-isolated, low-luminosity, low-mass Class 0 object. Its weak outflow has been mapped in the CO (3-2) transition with the JCMT, in 2.12 micron H2 emission with WIRC (the Wide-Field Infrared Camera) on the Hale 5-meter, and, most recently, in six H2 mid-infrared lines with the IRS (InfraRed Spectrograph) on-board the Spitzer Space Telescope; possible jet twisting structure may be evidence of unique core dynamics. Here, we report results of recent VLA ammonia mapping observations of the dense gas envelope feeding the central core protostellar system. We describe the morphology, kinematics, and angular momentum characteristics of this unique system. The results are compared with the envelope structure deduced from IRAC 8-micron absorption of the PAH (polycyclic aromatic hydrocarbon) background emission from the cloud.
Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D
Lasnier, Charles J.; Allen, Steve L.; Ellis, Ronald E.; ...
2014-08-26
An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.
NASA Astrophysics Data System (ADS)
Bachche, Shivaji; Oka, Koichi
2013-06-01
This paper presents a comparative study of various color space models to determine the most suitable color space model for the detection of green sweet peppers. The images were captured by using CCD cameras and infrared cameras and processed by using Halcon image processing software. An LED ring around the camera neck was used as artificial lighting to enhance the feature parameters. For color images, the CieLab, YIQ, YUV, HSI and HSV color space models were selected for image processing, whereas for infrared images the grayscale color space was used. For the color images, the HSV color space model was found to be the most effective, with a high percentage of green sweet pepper detections, followed by the HSI color space model, as both provide information in terms of hue/lightness/chroma or hue/lightness/saturation, which is often more relevant for discriminating the fruit from the image at a specific threshold value. Overlapping fruits or fruits covered by leaves could be detected more reliably using the HSV color space model, as the reflection from fruits produced higher histogram values than the reflection from leaves. The IR 80 optical filter failed to distinguish fruits in the images because the filter blocks useful feature information. Computation of the 3D coordinates of the recognized green sweet peppers was also conducted, in which the Halcon image processing software provided the location and orientation of the fruits accurately. The depth accuracy along the Z axis was also examined; a camera-to-fruit distance of 500 to 600 mm was found to yield precise depth estimates when the baseline between the two cameras was maintained at 100 mm.
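A minimal sketch of HSV-based segmentation of the kind described, written with OpenCV rather than the Halcon software the authors used; the synthetic test image and the hue/saturation/value bounds are illustrative assumptions, not the study's thresholds.

```python
import cv2
import numpy as np

# Synthetic stand-in for a CCD frame: a bright green "fruit" blob on a duller
# green leafy background (a real frame would be loaded from the camera instead).
bgr = np.full((240, 320, 3), (40, 90, 40), dtype=np.uint8)       # dull green background
cv2.circle(bgr, (160, 120), 40, (60, 200, 80), thickness=-1)     # brighter green "fruit"

# Convert to HSV, where a hue/saturation/value threshold separates the fruit
# from foliage more readily than raw RGB values.
hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)

# Illustrative bounds only (OpenCV hue range is 0-179).
mask = cv2.inRange(hsv, np.array([35, 80, 120]), np.array([85, 255, 255]))

# Clean up the mask and count candidate fruit regions.
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
fruit_candidates = [c for c in contours if cv2.contourArea(c) > 300]
print(f"{len(fruit_candidates)} candidate fruit region(s)")
```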
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandya, Shwetang N., E-mail: pandya.shwetang@LHD.nifs.ac.jp; Sano, Ryuichi; Peterson, Byron J.
An Infrared imaging Video Bolometer (IRVB) diagnostic is currently being used in the Large Helical Device (LHD) for studying the localization of radiation structures near the magnetic island and helical divertor X-points during plasma detachment and for 3D tomography. This research demands a high signal-to-noise ratio (SNR) and sensitivity to improve the temporal resolution for studying the evolution of radiation structures during plasma detachment, and a wide IRVB field of view (FoV) for tomography. Introduction of an infrared periscope allows achievement of a higher SNR and higher sensitivity, which, in turn, permits a twofold improvement in the temporal resolution of the diagnostic. Higher SNR along with wide FoV is achieved simultaneously by reducing the separation of the IRVB detector (metal foil) from the bolometer's aperture and the LHD plasma. Altering the distances to meet the aforesaid requirements results in an increased separation between the foil and the IR camera. This leads to a degradation of the diagnostic performance in terms of its sensitivity by 1.5-fold. Using an infrared periscope to image the IRVB foil results in a 7.5-fold increase in the number of IR camera pixels imaging the foil. This improves the IRVB sensitivity, which depends on the square root of the number of IR camera pixels being averaged per bolometer channel. Despite the slower f-number (f/# = 1.35) and reduced transmission (τ0 = 89%, due to an increased number of lens elements) for the periscope, the diagnostic with an infrared periscope operational on LHD has improved in terms of sensitivity and SNR by factors of 1.4 and 4.5, respectively, as compared to the original diagnostic without a periscope (i.e., the IRVB foil being directly imaged by the IR camera through conventional optics). The bolometer's field of view has also increased by two times. The paper discusses these improvements in detail.
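The quoted sensitivity gain follows largely from the stated square-root dependence on the number of IR camera pixels averaged per bolometer channel, offset by the penalties of the periscope. A back-of-the-envelope sketch of those two stated factors only (the remaining difference from the reported 1.4x factor is attributed in the text to the slower f-number and reduced transmission, whose exact combination is not spelled out here):

```python
import math

# Stated in the text: a 7.5x increase in the number of IR camera pixels imaging
# the foil, and a 1.5-fold sensitivity degradation from the larger separations
# required by the new geometry.
pixel_gain = math.sqrt(7.5)     # sensitivity scales with sqrt(pixels averaged)
geometry_penalty = 1.5

net_gain = pixel_gain / geometry_penalty
print(f"sensitivity improvement from these factors alone ~ {net_gain:.1f}x")  # ~1.8x
```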
Huang, Lu-Mao; DU, Pei-Yan; Chen, Lan; Zhang, Sa; Zhou, Di-Fu; Chen, Chun-Lin; Xin, Xue-Gang
2018-04-20
The aim of this study was to develop a near-infrared fluorescence imaging system based on the fluorescence properties of methylene blue. According to the optical properties of methylene blue, we used a custom-made specific LED light source and an interference filter, a CCD camera and other relevant components to construct the near-infrared fluorescence imaging system. We tested the signal-to-background ratio (SBR) of this imaging system for detecting methylene blue under different experimental conditions and analyzed the SBR in urine samples collected from 15 Wistar rats with intravenous injection of methylene blue at doses of 0, 1.4, 1.6, 1.8, or 2.0 mg/kg. The SBR of this imaging system for detecting methylene blue was affected by the concentration of methylene blue and the distance from the sample (P<0.05). In the urine samples from Wistar rats, the SBR varied with the injection dose, and the rats injected with 1.6 mg/kg methylene blue showed the highest SBR (8.71±0.20) in the urine (P<0.05). This near-infrared fluorescence imaging system is useful for fluorescence detection of methylene blue and can be used for real-time recognition of ureters during abdominal surgery.
NASA Astrophysics Data System (ADS)
Nugent, Paul Winston
Cloud cover is an important but poorly understood component of current climate models, and although climate change is most easily observed in the Arctic, cloud data in the Arctic are unreliable or simply unavailable. Ground-based infrared cloud imaging has the potential to fill this gap. This technique uses a thermal infrared camera to observe cloud amount, cloud optical depth, and cloud spatial distribution at a particular location. The Montana State University Optical Remote Sensor Laboratory has developed the ground-based Infrared Cloud Imager (ICI) instrument to measure spatial and temporal cloud data. Building an ICI for Arctic sites required the system to be engineered to overcome the challenges of this environment. A particular challenge was keeping the system calibration and data processing accurate through the severe temperature changes. Another significant challenge was that weak emission from the cold, dry Arctic atmosphere pushed the camera used in the instrument to its operational limits. To gain an understanding of the operation of the ICI systems for the Arctic and to gather critical data on Arctic clouds, a prototype Arctic ICI was deployed in Barrow, AK, from July 2012 through July 2014. To understand the long-term operation of an ICI in the Arctic, a study was conducted of the ICI system accuracy in relation to co-located active and passive sensors. Understanding the operation of this system in the Arctic environment required careful characterization of the full optical system, including the lens, filter, and detector. Alternative data processing techniques using decision trees and support vector machines were studied to improve data accuracy and reduce dependence on auxiliary instrument data, and the resulting accuracy is reported here. The work described in this project was part of the effort to develop a fourth-generation ICI ready to be deployed in the Arctic. This system will serve a critical role in developing our understanding of cloud cover in the Arctic, an important but poorly understood region of the world.
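The classifier-based processing mentioned above (decision trees and support vector machines) amounts to learning a per-pixel cloudy/clear decision from labeled sky radiance data. A minimal sketch of one of those options, an SVM, on entirely synthetic placeholder features; the feature choice and labels are assumptions, not the instrument's actual processing.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical training set: one row per sky pixel, with simple features such as
# calibrated radiance and local spatial variability; labels 0 = clear, 1 = cloud.
# In practice labels might come from a co-located active sensor.
rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 2))                              # placeholder features
labels = (features[:, 0] + 0.5 * features[:, 1] > 0).astype(int)   # synthetic labels

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```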
A CMOS camera-based system for clinical photoplethysmographic applications
NASA Astrophysics Data System (ADS)
Humphreys, Kenneth; Markham, Charles; Ward, Tomas E.
2005-06-01
In this work an image-based photoplethysmography (PPG) system is developed and tested against a conventional finger-based system as commonly used in clinical practice. A PPG is essentially an optical instrument consisting of a near infrared (NIR) source and detector that is capable of tracking blood flow changes in body tissue. When used with a number of wavelengths in the NIR band, blood oxygenation changes, as well as other blood chemical signatures, can be ascertained, yielding a very useful device in the clinical realm. Conventionally, such a device requires direct contact with the tissue under investigation, which eliminates the possibility of its use for applications like wound management, where the tissue oxygenation measurement could be extremely useful. To circumvent this shortcoming we have developed a CMOS camera-based system, which can successfully extract the PPG signal without contact with the tissue under investigation. A comparison of our system with conventional techniques has yielded excellent results.
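A non-contact PPG signal of this kind can be approximated by averaging the pixel intensities inside a tissue region of interest on every NIR video frame and band-pass filtering the resulting time series around the cardiac band. A minimal sketch, with the frame source, ROI, and filter band assumed for illustration (the paper's actual pipeline may differ):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def ppg_from_frames(frames, roi, fps):
    """frames -- iterable of 2D grayscale NIR frames (NumPy arrays)
    roi    -- (row_start, row_stop, col_start, col_stop) over the tissue of interest
    fps    -- camera frame rate in Hz
    Returns the band-pass-filtered mean-intensity time series (raw PPG waveform)."""
    r0, r1, c0, c1 = roi
    trace = np.array([frame[r0:r1, c0:c1].mean() for frame in frames])

    # Keep the cardiac band (~0.7-3.5 Hz, i.e. roughly 40-210 beats per minute).
    b, a = butter(3, [0.7, 3.5], btype="bandpass", fs=fps)
    return filtfilt(b, a, trace - trace.mean())

# Synthetic usage: a 1.2 Hz "pulse" modulating a noisy 30 fps image stack.
t = np.arange(0, 10, 1 / 30.0)
frames = [np.full((64, 64), 100.0) + 2.0 * np.sin(2 * np.pi * 1.2 * ti)
          + np.random.normal(0, 0.5, (64, 64)) for ti in t]
ppg = ppg_from_frames(frames, roi=(16, 48, 16, 48), fps=30.0)
```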
Intelligent imaging systems for automotive applications
NASA Astrophysics Data System (ADS)
Thompson, Chris; Huang, Yingping; Fu, Shan
2004-03-01
In common with many other application areas, visual signals are becoming an increasingly important information source for many automotive applications. For several years CCD cameras have been used as research tools for a range of automotive applications. Infrared cameras, RADAR and LIDAR are other types of imaging sensors that have also been widely investigated for use in cars. This paper will describe work in this field performed in C2VIP over the last decade - starting with Night Vision Systems and looking at various other Advanced Driver Assistance Systems. Emerging from this experience, we make the following observations, which are crucial for "intelligent" imaging systems: (1) careful arrangement of the sensor array; (2) dynamic self-calibration; (3) networking and processing; and (4) fusion with other imaging sensors, at both the image and feature levels, which provides much more flexibility and reliability in complex situations. We will discuss how these problems can be addressed and what the outstanding issues are.
Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6
NASA Technical Reports Server (NTRS)
Lee, George
1993-01-01
A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.
Reliability Analysis of the MSC System
NASA Astrophysics Data System (ADS)
Kim, Young-Soo; Lee, Do-Kyoung; Lee, Chang-Ho; Woo, Sun-Hee
2003-09-01
The MSC (Multi-Spectral Camera) is the payload of KOMPSAT-2, which is being developed for Earth imaging in the optical and near-infrared regions. The design of the MSC has been completed and its reliability has been assessed from the part level up to the MSC system level. The reliability was analyzed for the worst case, and the analysis showed that the value complies with the required value of 0.9. In this paper, a method for calculating the reliability of the MSC system is described, and the assessment results are presented and discussed.
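For a system assessed from part level up, the simplest roll-up is a series reliability model in which the system reliability is the product of the subsystem reliabilities. A minimal sketch with made-up subsystem values (not the MSC figures):

```python
from math import prod

# Hypothetical worst-case subsystem reliabilities for an imaging payload
# (optics, focal plane electronics, power, thermal control) -- illustrative only.
subsystem_reliability = {
    "optics": 0.995,
    "focal_plane_electronics": 0.97,
    "power": 0.98,
    "thermal_control": 0.985,
}

# Series model: every subsystem must work for the system to work.
system_reliability = prod(subsystem_reliability.values())
print(f"system reliability = {system_reliability:.3f}")  # ~0.932, above a 0.9 requirement
```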
The Nimbus 4 data catalog. Volume 3: Data orbits 1124-1956, 1 July - 31 August 1970
NASA Technical Reports Server (NTRS)
1971-01-01
The Nimbus 4 satellite catalog for the period of 1 July through 31 August, 1970 is presented. The subjects discussed are: (1) summary of operations, (2) orbital elements and daily sensors on table, (3) image dissector camera system montages, and (4) temperature-humidity infrared radiometer montages. Data are presented as tables and photographs.
USDA-ARS?s Scientific Manuscript database
A small, fixed-wing UAS was used to survey a replicated small plot field experiment designed to estimate sorghum damage caused by an invasive aphid. Plant stress varied among 40 plots through manipulation of aphid densities. Equipped with a consumer-grade near-infrared camera, the UAS was flown on...
Alternatives for Military Space Radar
2007-01-01
transmitted microwaves to produce images of the Earth's surface (somewhat akin to photographs produced by optical imaging). By providing their own ... microwaves for illumination (rather than sunlight, as in an optical imaging system). By providing their own illumination, radars can produce ... carry a variety of payloads, including electro-optical, infrared, and SAR imagers; a film camera; and signals-intelligence equipment. The aircraft's
Infrared hyperspectral imaging sensor for gas detection
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele
2000-11-01
A small, lightweight, man-portable imaging spectrometer has many applications: gas leak detection, flare analysis, threat warning, and chemical agent detection, just to name a few. With support from the US Air Force and Navy, Pacific Advanced Technology has developed a small man-portable hyperspectral imaging sensor with an embedded DSP processor for real-time processing that is capable of remotely imaging various targets such as gas plumes, flames and camouflaged targets. Based upon their spectral signatures, the species and concentrations of gases can be determined. This system has been field tested at numerous places including White Mountain, CA, Edwards AFB, and Vandenberg AFB. Recently, an evaluation of the system for gas detection has been performed. This paper presents these results. The system uses a conventional infrared camera fitted with a diffractive optic that images as well as disperses the incident radiation to form spectral images that are collected in band-sequential mode. Because the diffractive optic performs both imaging and spectral filtering, the lens system consists of only a single element that is small, lightweight and robust, thus allowing man portability. The number of spectral bands is programmable such that only those bands of interest need to be collected. The system is entirely passive and therefore easily used in covert operations. Currently Pacific Advanced Technology is working on the next generation of this camera system, which will have both an embedded processor and an embedded digital signal processor in a small hand-held camera configuration. This will allow the implementation of signal and image processing algorithms for gas detection and identification in real time. This paper presents field test data on gas detection and identification and discusses the signal and image processing used to enhance gas visibility. Flow rates as low as 0.01 cubic feet per minute have been imaged with this system.
Space infrared telescope facility wide field and diffraction limited array camera (IRAC)
NASA Technical Reports Server (NTRS)
Fazio, Giovanni G.
1988-01-01
The wide-field and diffraction limited array camera (IRAC) is capable of two-dimensional photometry in either a wide-field or diffraction-limited mode over the wavelength range from 2 to 30 microns with a possible extension to 120 microns. A low-doped indium antimonide detector was developed for 1.8 to 5.0 microns, detectors were tested and optimized for the entire 1.8 to 30 micron range, beamsplitters were developed and tested for the 1.8 to 30 micron range, and tradeoff studies of the camera's optical system were performed. Data are presented on the performance of InSb, Si:In, Si:Ga, and Si:Sb array detectors bump-bonded to a multiplexed CMOS readout chip of the source-follower type at SIRTF operating backgrounds (equal to or less than 1 x 10^8 photons/cm^2/s) and temperatures (4 to 12 K). Some results at higher temperatures are also presented for comparison to SIRTF temperature results. Data are also presented on the performance of IRAC beamsplitters at room temperature at both 0 and 45 deg angle of incidence and on the performance of the all-reflecting optical system baselined for the camera.
2002-01-22
KENNEDY SPACE CENTER, FLA. -- Workers in the Vertical Processing Facility oversee the installation of the NICMOS radiator onto the MULE (Multi-Use Lightweight Equipment) carrier. Part of the payload on mission STS-109, the cooling system for the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) is a new experimental system consisting of a compressor and tiny turbines. With the experimental cryogenic system, NASA hopes to re-cool the infrared detectors to below -315 degrees F (-193 degrees Celsius). NICMOS II was previously tested aboard STS-95 in 1998. NICMOS could extend the life of the Hubble Space Telescope by several years. Astronauts aboard Columbia on mission STS-109 will be replacing the original NICMOS with the newer version. Launch of Columbia on mission STS-109 is scheduled for Feb. 28, 2002.
Advances in shutter drive technology to enhance man-portable infrared cameras
NASA Astrophysics Data System (ADS)
Durfee, David
2012-06-01
With an emphasis on highest reliability, infrared (IR) imagers have traditionally used simplest-possible shutters and field-proven technology. Most commonly, single-step rotary or linear magnetic actuators have been used with good success. However, several newer shutter drive technologies offer benefits in size and power reduction, enabling man-portable imagers that are more compact, lighter, and more durable. This paper will discuss improvements in shutter and shutter drive technology, which enable smaller and more power-efficient imagers. Topics will transition from single-step magnetic actuators to multi-stepping magnetic drives, latching vs. balanced systems for blade position shock-resistance, motor and geared motor drives, and associated stepper driver electronics. It will highlight performance tradeoffs pertinent to man-portable military systems.
Cao, Yanpeng; Tisse, Christel-Loic
2014-02-01
In this Letter, we propose an efficient and accurate solution to remove temperature-dependent nonuniformity effects introduced by the imaging optics. This single-image-based approach computes optics-related fixed pattern noise (FPN) by fitting the derivatives of the correction model to the gradient components locally computed on an infrared image. A modified bilateral filtering algorithm is applied to local pixel output variations, so that the refined gradients are most likely caused by the nonuniformity associated with the optics. The estimated bias field is subtracted from the raw infrared imagery to compensate for the intensity variations caused by the optics. The proposed method is fundamentally different from the existing nonuniformity correction (NUC) techniques developed for focal plane arrays (FPAs) and provides an essential image processing functionality to achieve completely shutterless NUC for uncooled long-wave infrared (LWIR) imaging systems.
Star Formation as Seen by the Infrared Array Camera on Spitzer
NASA Technical Reports Server (NTRS)
Smith, Howard A.; Allen, L.; Megeath, T.; Barmby, P.; Calvet, N.; Fazio, G.; Hartmann, L.; Myers, P.; Marengo, M.; Gutermuth, R.
2004-01-01
The Infrared Array Camera (IRAC) onboard Spitzer has imaged regions of star formation (SF) in its four IR bands with spatial resolutions of approximately 2"/pixel. IRAC is sensitive enough to detect very faint, embedded young stars at levels of tens of μJy, and IRAC photometry can categorize their stages of development: from young protostars with infalling envelopes (Class 0/I) to stars whose infrared excesses derive from accreting circumstellar disks (Class II) to evolved stars dominated by photospheric emission. The IRAC images also clearly reveal and help diagnose associated regions of shocked and/or PDR emission in the clouds; we find existing models provide a good start at explaining the continuum of the SF regions IRAC observes.
Hyperspectral imaging spectroradiometer improves radiometric accuracy
NASA Astrophysics Data System (ADS)
Prel, Florent; Moreau, Louis; Bouchard, Robert; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc
2013-06-01
Reliable and accurate infrared characterization is necessary to measure the specific spectral signatures of aircraft and associated infrared countermeasure protections (i.e., flares). Infrared characterization is essential to improve countermeasure efficiency, improve friend-or-foe identification, and reduce the risk of friendly fire. Typical infrared characterization measurement setups include a variety of panchromatic cameras and spectroradiometers. Each instrument brings essential information; cameras measure the spatial distribution of targets and spectroradiometers provide the spectral distribution of the emitted energy. However, the combination of separate instruments introduces possible radiometric errors and uncertainties that can be reduced with hyperspectral imagers. These instruments combine spectral and spatial information in the same data, measuring both distributions at the same time and thereby ensuring the temporal and spatial cohesion of the collected information. This paper presents a quantitative analysis of the main contributors to radiometric uncertainty and shows how a hyperspectral imager can reduce these uncertainties.
Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt
NASA Technical Reports Server (NTRS)
Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.
1977-01-01
Airborne remote sensing surveys of the Suez Canal Zone were conducted, as part of the rehabilitation plan, using an I2S multispectral camera and a Bendix LN-3 passive infrared scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near infrared bands. The scanner was operated in the 8 to 14 micron thermal band, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were utilized in the construction of new geological, structural lineation and drainage maps for the Suez Canal Zone at a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.
24/7 security system: 60-FPS color EMCCD camera with integral human recognition
NASA Astrophysics Data System (ADS)
Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.
2007-04-01
An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and monochrome under quarter-moonlight to overcast starlight illumination. Sixty-frame-per-second operation and progressive scanning minimize motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms to detect/localize/track targets and reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors that are used are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars and trucks. Detection and tracking of targets too small for template-based detection is achieved. For face and vehicle targets, the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.
Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion
NASA Astrophysics Data System (ADS)
Qiao, Tiezhu; Chen, Lulu; Pang, Yusong; Yan, Gaowei
2018-06-01
Infrared and visible light image fusion has been a hot spot in multi-sensor fusion research in recent years. Existing infrared and visible light fusion technologies require image registration before fusion because they use two separate cameras; however, the performance of the registration step still leaves room for improvement. Hence, a novel integrative multi-spectral sensor device is proposed for infrared and visible light fusion: by using a beam splitter prism, the coaxial light incident through a single lens is projected onto an infrared charge coupled device (CCD) and a visible light CCD, respectively. In this paper, the imaging mechanism of the proposed sensor device is studied along with the signal acquisition and fusion process. A simulation experiment, which covers the entire chain of the optical system, signal acquisition, and signal fusion, is constructed based on an imaging effect model. Additionally, a quality evaluation index is adopted to analyze the simulation result. The experimental results demonstrate that the proposed sensor device is effective and feasible.
A near-infrared tip-tilt sensor for the Keck I laser guide star adaptive optics system
NASA Astrophysics Data System (ADS)
Wizinowich, Peter; Smith, Roger; Biasi, Roberto; Cetre, Sylvain; Dekany, Richard; Femenia-Castella, Bruno; Fucik, Jason; Hale, David; Neyman, Chris; Pescoller, Dietrich; Ragland, Sam; Stomski, Paul; Andrighettoni, Mario; Bartos, Randy; Bui, Khanh; Cooper, Andrew; Cromer, John; van Dam, Marcos; Hess, Michael; James, Ean; Lyke, Jim; Rodriguez, Hector; Stalcup, Thomas
2014-07-01
The sky coverage and performance of laser guide star (LGS) adaptive optics (AO) systems are limited by the natural guide star (NGS) used for low order correction. This limitation can be dramatically reduced by measuring the tip and tilt of the NGS in the near-infrared, where the NGS is partially corrected by the LGS AO system and where stars are generally several magnitudes brighter than at visible wavelengths. We present the design of a near-infrared tip-tilt sensor that has recently been integrated with the Keck I telescope's LGS AO system along with some initial on-sky results. The implementation involved modifications to the AO bench, real-time control system, and higher level controls and operations software that will also be discussed. The tip-tilt sensor is an H2RG-based near-infrared camera with 0.05 arc second pixels. Low noise at high sample rates is achieved by only reading a small region of interest, from 2×2 to 16×16 pixels, centered on an NGS anywhere in the 100 arc second diameter field. The sensor operates at either Ks or H-band using light reflected by a choice of dichroic beamsplitters located in front of the OSIRIS integral field spectrograph.
A projective surgical navigation system for cancer resection
NASA Astrophysics Data System (ADS)
Gan, Qi; Shao, Pengfei; Wang, Dong; Ye, Jian; Zhang, Zeshu; Wang, Xinrui; Xu, Ronald
2016-03-01
Near-infrared (NIR) fluorescence imaging can provide precise and real-time information about tumor location during a cancer resection surgery. However, many intraoperative fluorescence imaging systems are based on wearable devices or stand-alone displays, leading to distraction of the surgeons and suboptimal outcomes. To overcome these limitations, we design a projective fluorescence imaging system for surgical navigation. The system consists of a LED excitation light source, a monochromatic CCD camera, a host computer, a mini projector and a CMOS camera. A software program written in C++ calls OpenCV functions to calibrate and correct the fluorescence images captured by the CCD camera under excitation illumination from the LED source. The images are projected back onto the surgical field by the mini projector. Imaging performance of this projective navigation system is characterized in a tumor-simulating phantom. Image-guided surgical resection is demonstrated in an ex-vivo chicken tissue model. In all the experiments, the images projected by the projector matched well with the locations of fluorescence emission. Our experimental results indicate that the proposed projective navigation system can be a powerful tool for pre-operative surgical planning, intraoperative surgical guidance, and postoperative assessment of surgical outcome. We have integrated the optoelectronic elements into a compact and miniaturized system in preparation for further clinical validation.
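The calibration step described, mapping a fluorescence image captured by the CCD camera into projector coordinates so the projected overlay lands on the correct tissue, can be approximated with a planar homography estimated from corresponding points (for example, corners of a projected calibration pattern seen by the camera). A minimal OpenCV sketch; the point correspondences, resolutions, and the synthetic fluorescence frame are assumptions, not the system's actual calibration data.

```python
import cv2
import numpy as np

# Assumed correspondences between camera pixels and projector pixels, e.g. from
# the corners of a projected calibration pattern imaged by the CCD camera.
camera_pts = np.array([[102, 85], [540, 92], [530, 410], [110, 402]], dtype=np.float32)
projector_pts = np.array([[0, 0], [799, 0], [799, 599], [0, 599]], dtype=np.float32)

# Planar homography mapping camera coordinates into projector coordinates.
H, _ = cv2.findHomography(camera_pts, projector_pts)

# Synthetic stand-in for a fluorescence frame: a bright spot marking "tumor".
fluorescence = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(fluorescence, (320, 240), 25, 255, thickness=-1)

# Warp the fluorescence image into the projector frame so the projected overlay
# lands back on the fluorescing tissue in the surgical field.
overlay = cv2.warpPerspective(fluorescence, H, (800, 600))
print("overlay max:", overlay.max())  # non-zero where the spot will be projected
```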
Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras
NASA Technical Reports Server (NTRS)
Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellut, Paolo; Sherwin, Gary
2011-01-01
TIR cameras can be used for day/night Unmanned Ground Vehicle (UGV) autonomous navigation when stealth is required. The quality of uncooled TIR cameras has significantly improved over the last decade, making them a viable option at low speeds. Limiting factors for stereo ranging with uncooled LWIR cameras are image blur and low-texture scenes. TIR perception capabilities JPL has explored include: (1) single- and dual-band TIR terrain classification; (2) obstacle detection (pedestrians, vehicles, tree trunks, ditches, and water); and (3) perception through obscurants.
NASA Astrophysics Data System (ADS)
Vincent, Mark B.; Chanover, Nancy J.; Beebe, Reta F.; Huber, Lyle
2005-10-01
The NASA Infrared Telescope Facility (IRTF) on Mauna Kea, Hawaii, set aside some time on about 500 nights from 1995 to 2002, when the NSFCAM facility infrared camera was mounted and Jupiter was visible, for a standardized set of observations of Jupiter in support of the Galileo mission. The program included observations of Jupiter, nearby reference stars, and dome flats in five filters: narrowband filters centered at 1.58, 2.28, and 3.53 μm, and broader L' and M' bands that probe the atmosphere from the stratosphere to below the main cloud layer. The reference stars were not cross-calibrated against standards. We performed follow-up observations to calibrate these stars and Jupiter in 2003 and 2004. We present a summary of the calibration of the Galileo support monitoring program data set. We present calibrated magnitudes of the six most frequently observed stars, calibrated reflectivities, and brightness temperatures of Jupiter from 1995 to 2004, and a simple method of normalizing the Jovian brightness to the 2004 results. Our study indicates that the NSFCAM's zero-point magnitudes were not stable from 1995 to early 1997, and that the best Jovian calibration possible with this data set is limited to about +/-10%. The raw images and calibration data have been deposited in the Planetary Data System.
An infrared image based methodology for breast lesions screening
NASA Astrophysics Data System (ADS)
Morais, K. C. C.; Vargas, J. V. C.; Reisemberger, G. G.; Freitas, F. N. P.; Oliari, S. H.; Brioschi, M. L.; Louveira, M. H.; Spautz, C.; Dias, F. G.; Gasperin, P.; Budel, V. M.; Cordeiro, R. A. G.; Schittini, A. P. P.; Neto, C. D.
2016-05-01
The objective of this paper is to evaluate the potential of a structured methodology for breast lesion screening, based on infrared imaging temperature measurements of a healthy control group, used to establish expected normality ranges, and of breast cancer patients previously diagnosed through biopsies of the affected regions. An analysis of the systematic error of the infrared camera skin temperature measurements was conducted in several different regions of the body, by direct comparison with high-precision thermistor temperature measurements, showing that infrared camera temperatures are consistently around 2 °C above the thermistor temperatures. Therefore, a method of conjugated gradients is proposed to eliminate the imprecision of direct infrared camera temperature measurements by calculating the temperature difference between two points, so that the error cancels out. The method exploits the approximate bilateral symmetry of the human body and compares measured dimensionless temperature difference values (Δθ̄) between two symmetric regions of the patient's breasts; the dimensionless difference accounts for the breast region, the surrounding ambient and the individual core temperatures, so that the interpretation of results for different individuals becomes simple and non-subjective. The range of normal whole-breast average dimensionless temperature differences was determined for 101 healthy individuals, and, assuming that the breast temperatures exhibit a unimodal normal distribution, the healthy normal range for each region was taken as the mean dimensionless temperature difference plus or minus twice the standard deviation of the measurements, i.e., mean(Δθ̄) ± 2σ(Δθ̄), in order to represent 95% of the population. Forty-seven patients with breast cancer previously diagnosed through biopsies were examined with the method, which was capable of detecting breast abnormalities in 45 cases (96%). Therefore, the conjugated gradients method was considered effective for breast lesion screening through infrared imaging, in order to recommend a biopsy, even with the use of a low optical resolution camera (160 × 120 pixels) with a thermal resolution of 0.1 °C, whose results were compared to those of a higher resolution camera (320 × 240 pixels). The main conclusion is that the method has potential for use as a noninvasive screening exam for individuals with breast complaints, indicating whether the patient should be submitted to a biopsy.
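A minimal sketch of the screening decision implied by the method: compute a dimensionless temperature difference between symmetric breast regions and compare it against a normal range built as the healthy-group mean plus or minus two standard deviations. The normalization form and all numerical values below are assumptions for illustration, not the study's data.

```python
import numpy as np

def dimensionless_delta(t_region_left, t_region_right, t_core, t_ambient):
    """Dimensionless temperature difference between two symmetric regions,
    normalized by the span between core and ambient temperature (assumed form)."""
    return (t_region_left - t_region_right) / (t_core - t_ambient)

def flag_for_biopsy(delta, healthy_deltas):
    """Flag a measurement outside mean +/- 2*sigma of the healthy group, i.e.
    outside the range expected to cover ~95% of healthy individuals."""
    mu, sigma = np.mean(healthy_deltas), np.std(healthy_deltas)
    return abs(delta - mu) > 2.0 * sigma

# Illustrative numbers only.
healthy = np.random.default_rng(1).normal(0.0, 0.01, size=101)
patient_delta = dimensionless_delta(34.8, 33.9, t_core=37.0, t_ambient=24.0)
print("recommend biopsy:", flag_for_biopsy(patient_delta, healthy))
```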