Sample records for object infrared camera

  1. A fuzzy automated object classification by infrared laser camera

    NASA Astrophysics Data System (ADS)

    Kanazawa, Seigo; Taniguchi, Kazuhiko; Asari, Kazunari; Kuramoto, Kei; Kobashi, Syoji; Hata, Yutaka

    2011-06-01

    Home security at night is very important, and a system that watches a person's movements is useful for security. This paper describes a system for classifying adults, children and other objects from the distance distribution measured by an infrared laser camera. The camera emits near-infrared waves and receives the reflections, then converts the time of flight into a distance distribution. Our method consists of four steps. First, we perform background subtraction and noise rejection on the distance distribution. Second, we apply fuzzy clustering to the distance distribution, forming several clusters. Third, we extract features of each cluster such as height, thickness, aspect ratio and area ratio. We then construct fuzzy if-then rules from knowledge of adults, children and other objects, with a fuzzy membership function for each feature. Finally, we assign each cluster to the class (adult, child or other object) with the highest fuzzy degree. In our experiment, we set up the camera in a room and tested three cases. The method classified all of them successfully in real time.
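
    The rule-based classification step can be sketched as follows. This is an illustrative sketch only: the triangular membership breakpoints and the single "height" feature are invented stand-ins for the paper's hand-crafted fuzzy rules.

```python
# Triangular fuzzy membership: degree of x in a fuzzy set (a, b, c),
# peaking at b and falling to zero at a and c.
def tri_membership(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical membership functions over a single "height" feature
# (metres); the breakpoints are invented, not the paper's.
HEIGHT_RULES = {
    "adult": (1.4, 1.7, 2.1),
    "child": (0.7, 1.1, 1.5),
    "other": (0.0, 0.4, 0.9),
}

def classify(height_m):
    # Compute the fuzzy degree for each class; the cluster is assigned
    # to the class with the highest degree.
    degrees = {label: tri_membership(height_m, *abc)
               for label, abc in HEIGHT_RULES.items()}
    return max(degrees, key=degrees.get)

print(classify(1.65))  # adult
```

A real system would combine degrees from several features (thickness, aspect ratio, area ratio) before taking the maximum.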

  2. ARNICA, the Arcetri Near-Infrared Camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.

    1996-04-01

    ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December 1992 to December 1993) and an observing run at the William Herschel Telescope, Canary Islands (December 1993). System performance is defined in terms of the efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources. (SECTION: Astronomical Instrumentation)

  3. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.

    2011-01-01

    The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive-only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 μm) or long-wave infrared (LWIR) radiation (8-12 μm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24-hour water and 12-hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.

  4. Optimization of a miniature short-wavelength infrared objective optics of a short-wavelength infrared to visible upconversion layer attached to a mobile-devices visible camera

    NASA Astrophysics Data System (ADS)

    Kadosh, Itai; Sarusi, Gabby

    2017-10-01

    The use of dual cameras in parallax to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept in which the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night-vision capability while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. To maintain commonality between the two cameras, we propose to attach to one of them a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and for the additional upconversion layer, whose thickness is <1 μm. Such a layer should be attached in close proximity to the mobile device's visible-range camera sensor (the CMOS sensor). This paper presents a SWIR objective optical design and optimization that mechanically fits the visible objective design but uses different lenses, in order to maintain commonality and as a proof of concept. Such a SWIR objective design is very challenging since it requires mimicking the original visible mobile camera lenses' sizes and mechanical housing, so that we can adhere to the visible optical and mechanical design. We present an in-depth feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore-optics design.

  5. Coherent infrared imaging camera (CIRIC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP), coupled with Monolithic Microwave Integrated Circuit (MMIC) technology, are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras, which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry also provides spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

  6. Real-time moving objects detection and tracking from airborne infrared camera

    NASA Astrophysics Data System (ADS)

    Zingoni, Andrea; Diani, Marco; Corsini, Giovanni

    2017-10-01

    Detecting and tracking moving objects in real time from an airborne infrared (IR) camera offers interesting possibilities in video surveillance, remote sensing and computer vision applications, such as monitoring large areas simultaneously, quickly changing the point of view on the scene and pursuing objects of interest. To fully exploit this potential, versatile solutions are needed, but most of those in the literature work only under specific conditions regarding the considered scenario, the characteristics of the moving objects or the aircraft movements. To overcome these limitations, we propose a novel approach based on a cheap inertial navigation system (INS) mounted on the aircraft. To jointly exploit the information contained in the acquired video sequence and the data provided by the INS, a specific detection and tracking algorithm has been developed. It consists of three main stages performed iteratively on each acquired frame. In the detection stage, a coarse detection map is computed using a local statistic that is both fast to calculate and robust to noise and to self-deletion of the targeted objects. In the registration stage, the positions of the detected objects are coherently reported on a common reference frame by exploiting the INS data. In the tracking stage, steady objects are rejected, moving objects are tracked, and an estimate of their future position is computed for use in the subsequent iteration. The algorithm has been tested on a large dataset of simulated IR video sequences recreating different environments and different movements of the aircraft. Promising results have been obtained, both in terms of detection and false alarm rate, and in terms of accuracy in the estimation of the position and velocity of the objects. In addition, for each frame, the detection and tracking map was generated by the algorithm before the acquisition of the subsequent frame, proving its real-time capability.
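
    The detection stage of such a pipeline might look like the following sketch, which uses block-wise mean and standard deviation as a stand-in for the paper's unspecified local statistic; the block size, the threshold k and the synthetic frame are all assumptions.

```python
import numpy as np

# Coarse detection map from a single IR frame: flag pixels that deviate
# from their local neighbourhood by more than k standard deviations.
def coarse_detection_map(frame, block=16, k=3.0):
    h, w = frame.shape
    detections = np.zeros_like(frame, dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = frame[y:y + block, x:x + block]
            mu, sigma = tile.mean(), tile.std()
            if sigma > 0:
                detections[y:y + block, x:x + block] = (
                    np.abs(tile - mu) > k * sigma)
    return detections

# Synthetic frame: uniform background with noise plus one hot target.
frame = np.full((64, 64), 100.0) + np.random.default_rng(0).normal(0, 1, (64, 64))
frame[30:34, 30:34] += 50.0  # hot target well above the local statistics
print(coarse_detection_map(frame)[31, 31])  # True
```

In the full algorithm, this map would be registered to a common reference frame using the INS data before the tracking stage runs.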

  7. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many types of equipment. If such a system is tested using a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed. The large-aperture test system for infrared and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame-grabber and computer, which reduces both the cost and the time of installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position with changing environmental temperature, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of foreign counterparts at a much lower cost, and it should find a good market.
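
    The multiple-frame averaging step is simple to illustrate: averaging N frames of a static target reduces uncorrelated temporal noise by roughly a factor of sqrt(N). The frame size, noise level and frame count below are arbitrary choices for the demonstration.

```python
import numpy as np

# Average a stack of frames of the same static scene to suppress
# random (temporal) noise.
def average_frames(frames):
    return np.mean(np.stack(frames), axis=0)

rng = np.random.default_rng(1)
truth = np.full((8, 8), 50.0)                  # noiseless target signal
frames = [truth + rng.normal(0, 2.0, truth.shape) for _ in range(64)]

avg = average_frames(frames)
single_err = np.abs(frames[0] - truth).mean()  # error of one raw frame
avg_err = np.abs(avg - truth).mean()           # error after averaging
print(avg_err < single_err)  # True
```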

  8. Variation in detection among passive infrared triggered-cameras used in wildlife research

    USGS Publications Warehouse

    Damm, Philip E.; Grand, James B.; Barnett, Steven W.

    2010-01-01

    Precise and accurate estimates of demographics such as age structure, productivity, and density are necessary in determining habitat and harvest management strategies for wildlife populations. Surveys using automated cameras are becoming an increasingly popular tool for estimating these parameters. However, most camera studies fail to incorporate detection probabilities, leading to parameter underestimation. The objective of this study was to determine the sources of heterogeneity in detection for trail cameras that incorporate a passive infrared (PIR) triggering system sensitive to heat and motion. Images were collected at four baited sites within the Conecuh National Forest, Alabama, using three cameras at each site operating continuously over the same seven-day period. Detection was estimated for four groups of animals based on taxonomic group and body size. Our hypotheses of detection considered variation among bait sites and cameras. The best model (w=0.99) estimated different rates of detection for each camera in addition to different detection rates for four animal groupings. Factors that explain this variability might include poor manufacturing tolerances, variation in PIR sensitivity, animal behavior, and species-specific infrared radiation. Population surveys using trail cameras with PIR systems must incorporate detection rates for individual cameras. Incorporating time-lapse triggering systems into survey designs should eliminate issues associated with PIR systems.
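
    The correction the authors argue for can be illustrated with a toy calculation: estimate a per-camera detection probability from trials with a known number of animal visits, then inflate raw counts by 1/p. The counts below are invented for illustration.

```python
# Per-camera detection probability from (detections, known visits),
# and a simple count correction for imperfect detection.
def detection_rate(detected, visits):
    return detected / visits

def corrected_count(raw_count, p):
    # If only a fraction p of animals present trigger the camera,
    # the true count is estimated as N_hat = n / p.
    return raw_count / p

# Invented trial data: each camera saw 50 known visits to a bait site.
cameras = {"cam_A": (37, 50), "cam_B": (21, 50), "cam_C": (44, 50)}
for cam, (d, v) in cameras.items():
    p = detection_rate(d, v)
    print(f"{cam}: p = {p:.2f}, 30 raw detections -> {corrected_count(30, p):.1f}")
```

The spread in p across cameras is the heterogeneity the study reports: ignoring it underestimates abundance for the least sensitive cameras.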

  9. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  10. Technical considerations for designing low-cost, long-wave infrared objectives

    NASA Astrophysics Data System (ADS)

    Desroches, Gerard; Dalzell, Kristy; Robitaille, Blaise

    2014-06-01

    With the growth of uncooled infrared imaging in the consumer market, the balance between cost implications and performance criteria in the objective lens must be examined carefully. The increased availability of consumer-grade, long-wave infrared cameras is partly related to a decrease in military usage, but it is also due to the decreasing cost of the cameras themselves. This has driven up demand for low-cost, long-wave objectives that can resolve smaller pixels while maintaining high performance. Smaller pixels are traditionally associated with high-cost objectives because of higher resolution requirements, but with careful consideration of all the requirements and proper selection of materials, costs can be moderated. This paper examines the cost/performance trade-offs associated with the optical and mechanical requirements of long-wave infrared objectives. Optical performance, f-number, field of view, distortion, focus range and thermal range all affect the cost of the objective. Because raw lens material is often the most expensive item in the construction, the selection of lens material and lens shape was explored while maintaining acceptable performance and cost targets. As a result of these considerations, a low-cost, lightweight, well-performing objective was successfully designed, manufactured and tested.

  11. Infrared cameras are potential traceable "fixed points" for future thermometry studies.

    PubMed

    Yap Kannan, R; Keresztes, K; Hussain, S; Coats, T J; Bown, M J

    2015-01-01

    The National Physical Laboratory (NPL) requires that "fixed points" whose temperatures have been established by the International Temperature Scale of 1990 (ITS-90) be used for device calibration. In practice, a "near" blackbody radiator together with a standard platinum resistance thermometer is accepted as the standard. The aim of this study was to report the correlation and limits of agreement (LOA) of a thermal infrared camera and a non-contact infrared temporal thermometer against each other and against the "near" blackbody radiator. Temperature readings from an infrared thermography camera (FLIR T650sc) and a non-contact infrared temporal thermometer (Hubdic FS-700) were compared to a near blackbody (Hyperion R blackbody model 982) at 0.5 °C increments between 20 and 40 °C. At each increment, the blackbody cavity temperature was confirmed with the platinum resistance thermometer. Measurements were taken first with the thermal infrared camera and then with the infrared thermometer, with each device mounted in turn on a stand at a fixed distance of 20 cm and 5 cm from the blackbody aperture, respectively. The platinum thermometer under-estimated the blackbody temperature by 0.015 °C (95% LOA: -0.08 °C to 0.05 °C), whereas the thermal infrared camera and the infrared thermometer over-estimated it by 0.16 °C (95% LOA: 0.03 °C to 0.28 °C) and 0.75 °C (95% LOA: -0.30 °C to 1.79 °C), respectively. The infrared thermometer over-estimated the thermal infrared camera measurements by 0.6 °C (95% LOA: -0.46 °C to 1.65 °C). In conclusion, the thermal infrared camera is a potential temperature reference "fixed point" that could substitute for mercury thermometers. However, further repeatability and reproducibility studies with different models of thermal infrared cameras will be required.
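
    The bias and 95% limits of agreement quoted above follow the standard Bland-Altman calculation, sketched here on synthetic paired readings (not the study's data).

```python
import numpy as np

# Bland-Altman bias and 95% limits of agreement between two instruments
# measuring the same quantity.
def limits_of_agreement(a, b):
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()                       # mean difference
    half_width = 1.96 * diff.std(ddof=1)     # 1.96 sample SDs of the diffs
    return bias, bias - half_width, bias + half_width

# Synthetic paired readings: camera vs blackbody reference temperature.
camera = [20.1, 25.2, 30.2, 35.1, 40.2]
blackbody = [20.0, 25.0, 30.0, 35.0, 40.0]
bias, lo, hi = limits_of_agreement(camera, blackbody)
print(round(bias, 2))  # 0.16
```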

  12. Improved calibration-based non-uniformity correction method for uncooled infrared camera

    NASA Astrophysics Data System (ADS)

    Liu, Chengwei; Sui, Xiubao

    2017-08-01

    With the latest improvements in microbolometer focal plane arrays (FPA), uncooled infrared (IR) cameras are becoming the most widely used devices in thermography, especially in handheld devices. However, the influence of changing ambient conditions and the non-uniform response of the sensors makes it difficult to correct the non-uniformity of uncooled infrared cameras. In this paper, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera, a novel model is proposed for calibration-based non-uniformity correction (NUC). In this model, we introduce the FPA temperature, together with the responses of the microbolometers under different ambient temperatures, to calculate the correction parameters. Based on the proposed model, the correction parameters can be worked out from calibration measurements under controlled ambient conditions with a uniform blackbody. All correction parameters are determined after the calibration process and then used to correct the non-uniformity of the infrared camera in real time. This paper presents the details of the compensation procedure and the performance of the proposed calibration-based non-uniformity correction method. Our method was evaluated on realistic IR images obtained by a 384×288-pixel uncooled long-wave infrared (LWIR) camera operated under changing ambient conditions. The results show that our method can exclude the influence of changing ambient conditions and ensure that the infrared camera performs stably.
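
    For context, the classic two-point calibration-based NUC that this paper extends solves a per-pixel gain and offset from frames of a uniform blackbody at two temperatures. The sketch below shows only that baseline scheme; the paper's FPA-temperature and ambient-condition terms are omitted, and the linear pixel model is an assumption.

```python
import numpy as np

# Two-point NUC: per-pixel gain and offset from two uniform-blackbody
# frames, so that corrected = gain * raw + offset maps responses back
# to scene temperature.
def two_point_nuc(resp_low, resp_high, target_low, target_high):
    gain = (target_high - target_low) / (resp_high - resp_low)
    offset = target_low - gain * resp_low
    return gain, offset

rng = np.random.default_rng(2)
true_gain = rng.uniform(0.8, 1.2, (4, 4))   # non-uniform pixel gains
true_off = rng.uniform(-5, 5, (4, 4))       # non-uniform pixel offsets

def raw_response(t):
    # Linear pixel model: each pixel responds with its own gain/offset.
    return true_gain * t + true_off

gain, offset = two_point_nuc(raw_response(20.0), raw_response(40.0), 20.0, 40.0)
corrected = gain * raw_response(30.0) + offset
print(np.allclose(corrected, 30.0))  # True
```

With a strictly linear pixel model the correction is exact at any temperature; real microbolometers drift with FPA temperature, which is what the paper's extra calibration terms address.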

  13. AMICA (Antarctic Multiband Infrared CAmera) project

    NASA Astrophysics Data System (ADS)

    Dolci, Mauro; Straniero, Oscar; Valentini, Gaetano; Di Rico, Gianluca; Ragni, Maurizio; Pelusi, Danilo; Di Varano, Igor; Giuliani, Croce; Di Cianno, Amico; Valentini, Angelo; Corcione, Leonardo; Bortoletto, Favio; D'Alessandro, Maurizio; Bonoli, Carlotta; Giro, Enrico; Fantinel, Daniela; Magrin, Demetrio; Zerbi, Filippo M.; Riva, Alberto; Molinari, Emilio; Conconi, Paolo; De Caprio, Vincenzo; Busso, Maurizio; Tosti, Gino; Nucciarelli, Giuliano; Roncella, Fabio; Abia, Carlos

    2006-06-01

    The Antarctic Plateau offers unique opportunities for ground-based Infrared Astronomy. AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging from Dome-C in the near- (1 - 5 μm) and mid- (5 - 27 μm) infrared wavelength regions. The camera consists of two channels, equipped with a Raytheon InSb 256 array detector and a DRS MF-128 Si:As IBC array detector, cryocooled at 35 and 7 K respectively. Cryogenic devices will move a filter wheel and a sliding mirror, used to feed the two detectors alternately. Fast control and readout, synchronized with the chopping secondary mirror of the telescope, will be required because of the large background expected at these wavelengths, especially beyond 10 μm. An environmental control system is needed to ensure the correct start-up, shut-down and housekeeping of the camera. The main technical challenge is represented by the extreme environmental conditions of Dome C (T about -90 °C, p around 640 mbar) and the need for complete automation of the overall system. AMICA will be mounted at the Nasmyth focus of the 80 cm IRAIT telescope and will perform survey-mode automatic observations of selected regions of the Southern sky. The first goal will be a direct estimate of the observational quality of this new highly promising site for Infrared Astronomy. In addition, IRAIT, equipped with AMICA, is expected to provide a significant improvement in the knowledge of fundamental astrophysical processes, such as the late stages of stellar evolution (especially AGB and post-AGB stars) and star formation.

  14. Low-cost uncooled VOx infrared camera development

    NASA Astrophysics Data System (ADS)

    Li, Chuan; Han, C. J.; Skidmore, George D.; Cook, Grady; Kubala, Kenny; Bates, Robert; Temple, Dorota; Lannon, John; Hilton, Allan; Glukh, Konstantin; Hardy, Busbee

    2013-06-01

    The DRS Tamarisk® 320 camera, introduced in 2011, is a low-cost commercial camera based on the 17 µm pixel pitch 320×240 VOx microbolometer technology. A higher-resolution 17 µm pixel pitch 640×480 Tamarisk® 640 has also been developed and is now in production serving the commercial markets. Recently, under the DARPA-sponsored Low Cost Thermal Imager-Manufacturing (LCTI-M) program and an internal project, DRS is leading a team of industrial experts from FiveFocal, RTI International and MEMSCAP to develop a small-form-factor uncooled infrared camera for the military and commercial markets. The objective of the DARPA LCTI-M program is to develop a low-SWaP camera (<3.5 cm3 in volume and <500 mW in power consumption) that costs less than US $500 at a 10,000 units per month production rate. To meet this challenge, DRS is developing several innovative technologies, including a small pixel pitch 640×512 VOx uncooled detector, an advanced digital ROIC and low-power miniature camera electronics. In addition, DRS and its partners are developing innovative manufacturing processes to reduce production cycle time and costs, including wafer-scale optics and vacuum packaging manufacturing and a 3-dimensional integrated camera assembly. This paper provides an overview of the DRS Tamarisk® project and LCTI-M related uncooled technology development activities. Highlights of recent progress and challenges are also discussed. It should be noted that BAE Systems and Raytheon Vision Systems are also participants in the DARPA LCTI-M program.

  15. A Study of Planetary Nebulae using the Faint Object Infrared Camera for the SOFIA Telescope

    NASA Technical Reports Server (NTRS)

    Davis, Jessica

    2012-01-01

    A planetary nebula forms when an intermediate-mass (1-8 solar mass) star evolves off the main sequence; it undergoes a phase of mass loss whereby the stellar envelope is ejected and the core is converted into a white dwarf. Planetary nebulae often display complex morphologies such as waists or tori, rings, collimated jet-like outflows, and bipolar symmetry, but exactly how these features form is unclear. To study how the distribution of dust in the interstellar medium affects their morphology, we utilize the Faint Object InfraRed CAmera for the SOFIA Telescope (FORCAST) to obtain well-resolved images of four planetary nebulae--NGC 7027, NGC 6543, M2-9, and the Frosty Leo Nebula--at wavelengths where they radiate most of their energy. We retrieve mid-infrared images at wavelengths ranging from 6.3 to 37.1 microns for each of our targets. IDL (Interactive Data Language) is used to perform basic analysis. We select M2-9 for further investigation; analyzing cross sections of the southern lobe reveals a slight limb-brightening effect. Modeling the dust distribution within the lobes reveals that the lobe walls are thicker than anticipated, or rather that, instead of surrounding a vacuum, they surround a low-density region of tenuous dust. Further analysis of this and other planetary nebulae is needed before drawing more specific conclusions.

  16. TIRCAM2: The TIFR near infrared imaging camera

    NASA Astrophysics Data System (ADS)

    Naik, M. B.; Ojha, D. K.; Ghosh, S. K.; Poojary, S. S.; Jadhav, R. B.; Meshram, G. S.; Sandimani, P. R.; Bhagat, S. B.; D'Costa, S. L. A.; Gharat, S. M.; Bakalkar, C. B.; Ninan, J. P.; Joshi, J. S.

    2012-12-01

    TIRCAM2 (TIFR near infrared imaging camera - II) is a closed cycle cooled imager that has been developed by the Infrared Astronomy Group at the Tata Institute of Fundamental Research for observations in the near infrared band of 1 to 3.7 μm with existing Indian telescopes. In this paper, we describe some of the technical details of TIRCAM2 and report its observing capabilities, measured performance and limiting magnitudes with the 2-m IUCAA Girawali telescope and the 1.2-m PRL Gurushikhar telescope. The main highlight is the camera's capability of observing in the nbL (3.59 μm) band enabling our primary motivation of mapping of Polycyclic Aromatic Hydrocarbon (PAH) emission at 3.3 μm.

  17. Object tracking using multiple camera video streams

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusions that could leave an object in partial or full view in one camera while the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases the robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, in each video stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in the direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that the scenes from at least two nearby cameras overlap. An object can then be tracked continuously over long distances or across multiple cameras, applicable, for example, in wireless sensor networks for surveillance or navigation.
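
    The per-camera motion-detection step can be sketched with simple frame differencing between consecutive synchronized frames; the threshold and synthetic frames below are assumptions, and a real system would add filtering and per-camera registration before fusing the streams.

```python
import numpy as np

# Frame differencing: pixels whose intensity changes by more than a
# threshold between consecutive frames are treated as moving, and the
# centroid of the moving region gives a crude object position.
def moving_object_centroid(prev, curr, thresh=10):
    moved = np.abs(curr.astype(float) - prev.astype(float)) > thresh
    if not moved.any():
        return None  # no motion detected in this frame pair
    ys, xs = np.nonzero(moved)
    return float(ys.mean()), float(xs.mean())

prev = np.zeros((32, 32))
curr = np.zeros((32, 32))
curr[10:14, 20:24] = 255          # object appears in this region
print(moving_object_centroid(prev, curr))  # (11.5, 21.5)
```

Running the same step on the second camera's stream and matching the two centroids across views is the registration step the abstract describes.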

  18. The Near-Earth Object Camera: A Next-Generation Minor Planet Survey

    NASA Astrophysics Data System (ADS)

    Mainzer, Amy K.; Wright, Edward L.; Bauer, James; Grav, Tommy; Cutri, Roc M.; Masiero, Joseph; Nugent, Carolyn R.

    2015-11-01

    The Near-Earth Object Camera (NEOCam) is a next-generation asteroid and comet survey designed to discover, characterize, and track large numbers of minor planets using a 50 cm infrared telescope located at the Sun-Earth L1 Lagrange point. Proposed to NASA's Discovery program, NEOCam is designed to carry out a comprehensive inventory of the small bodies in the inner regions of our solar system. It addresses three themes: 1) quantify the potential hazard that near-Earth objects may pose to Earth; 2) study the origins and evolution of our solar system as revealed by its small body populations; and 3) identify the best destinations for future robotic and human exploration. With a dual channel infrared imager that observes at 4-5 and 6-10 micron bands simultaneously through the use of a beamsplitter, NEOCam enables measurements of asteroid diameters and thermal inertia. NEOCam complements existing and planned visible light surveys in terms of orbital element phase space and wavelengths, since albedos can be determined for objects with both visible and infrared flux measurements. NEOCam was awarded technology development funding in 2011 to mature the necessary megapixel infrared detectors.

  19. Students' Framing of Laboratory Exercises Using Infrared Cameras

    ERIC Educational Resources Information Center

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-01-01

    Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the…

  20. Passive Infrared Thermographic Imaging for Mobile Robot Object Identification

    NASA Astrophysics Data System (ADS)

    Hinders, M. K.; Fehlman, W. L.

    2010-02-01

    The usefulness of thermal infrared imaging as a mobile robot sensing modality is explored, and a set of thermal-physical features used to characterize passive thermal objects in outdoor environments is described. Objects that extend laterally beyond the thermal camera's field of view, such as brick walls, hedges, picket fences, and wood walls, as well as compact objects that are laterally within the thermal camera's field of view, such as metal poles and tree trunks, are considered. Classification of passive thermal objects is a subtle process since they are not a source of their own thermal emission. A detailed analysis is included of the acquisition and preprocessing of thermal images, as well as the generation and selection of thermal-physical features from these objects within thermal images. Classification performance using these features is discussed, as a precursor to the design of a physics-based model to automatically classify these objects.

  1. Application of infrared camera to bituminous concrete pavements: measuring vehicle

    NASA Astrophysics Data System (ADS)

    Janků, Michal; Stryk, Josef

    2017-09-01

    Infrared thermography (IR) has been used for decades in certain fields. However, the technological level of measuring devices has not been sufficient for some applications. In recent years, good-quality thermal cameras with high resolution and very high thermal sensitivity have started to appear on the market. This development in measuring technology has opened infrared thermography to new fields and to a larger number of users. This article describes research in progress at the Transport Research Centre focused on the use of infrared thermography for the diagnostics of bituminous road pavements. A measuring vehicle equipped with a thermal camera, digital camera and GPS sensor was designed for pavement diagnostics. New, highly sensitive thermal cameras make it possible to measure very small temperature differences from the moving vehicle. This study shows the potential of high-speed inspection without lane closures using IR thermography.

  2. ARNICA, the Arcetri near-infrared camera: Astronomical performance assessment.

    NASA Astrophysics Data System (ADS)

    Hunt, L. K.; Lisi, F.; Testi, L.; Baffa, C.; Borelli, S.; Maiolino, R.; Moriondo, G.; Stanga, R. M.

    1996-01-01

    The Arcetri near-infrared camera ARNICA was built as a users' instrument for the Infrared Telescope at Gornergrat (TIRGO), and is based on a 256x256 NICMOS 3 detector. In this paper, we discuss ARNICA's optical and astronomical performance at the TIRGO and at the William Herschel Telescope on La Palma. Optical performance is evaluated in terms of plate scale, distortion, point spread function, and ghosting. Astronomical performance is characterized by camera efficiency, sensitivity, and spatial uniformity of the photometry.

  3. High-Resolution Mars Camera Test Image of Moon (Infrared)

    NASA Image and Video Library

    2005-09-13

    This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The image was taken by the High Resolution Imaging Science Experiment camera on Sept. 8, 2005.

  4. Sniper detection using infrared camera: technical possibilities and limitations

    NASA Astrophysics Data System (ADS)

    Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.

    2010-04-01

    The paper discusses the technical feasibility of building an effective sniper-detection system using infrared cameras. The phenomena that make it possible to detect sniper activity in the infrared spectrum are described, and the physical limitations are analyzed. Both cooled and uncooled detectors were considered, and three phases of sniper activity were taken into account: before, during and after the shot. On the basis of experimental data, the target-defining parameters essential for assessing the capability of an infrared camera to detect sniper activity were determined. The sniper's body and the muzzle flash were analyzed as targets. Detection ranges were simulated for an assumed sniper-detection scenario. An infrared sniper-detection system capable of fulfilling these requirements is discussed, and the results of the analyses and simulations are presented.

  5. Infrared detectors and test technology of cryogenic camera

    NASA Astrophysics Data System (ADS)

    Yang, Xiaole; Liu, Xingxin; Xing, Mailing; Ling, Long

    2016-10-01

    Cryogenic cameras, which are widely used in deep-space detection, cool down the optical system and support structure by cryogenic refrigeration technology, thereby improving sensitivity. The characteristics and design points of the infrared detector are discussed in combination with the characteristics of the camera. In addition, cryogenic-background test systems for the chip and for the detector assembly were established: the chip test system is based on a variable-temperature multilayer Dewar, and the assembly test system is based on a target and background simulator in a thermal-vacuum environment. The core of the test is establishing the cryogenic background. The non-uniformity, dead-pixel ratio, and noise obtained in the tests are given. The establishment of these test systems supports the design and calculation of infrared systems.
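
    The acceptance metrics mentioned above (non-uniformity, dead-pixel ratio, noise) can be sketched in a few lines. This is an illustrative computation over a stack of frames viewing a uniform blackbody scene; the 3-sigma dead-pixel threshold and all names are our assumptions, not the paper's definitions.

```python
import numpy as np

def fpa_metrics(stack, dead_sigma=3.0):
    """stack: (n_frames, H, W) frames of a uniform blackbody scene."""
    mean_img = stack.mean(axis=0)                 # per-pixel mean response
    m = mean_img.mean()
    s = mean_img.std()
    nonuniformity = s / m                         # spatial spread / mean response
    dead_ratio = (np.abs(mean_img - m) > dead_sigma * s).mean()
    temporal_noise = stack.std(axis=0).mean()     # mean per-pixel temporal std
    return nonuniformity, dead_ratio, temporal_noise

# Synthetic check: a flat 10x10 array with one stuck-high pixel.
stack = np.full((20, 10, 10), 100.0)
stack[:, 0, 0] = 200.0
nu, dead_ratio, noise = fpa_metrics(stack)
```

With one outlier pixel in 100, the sketch reports a dead-pixel ratio of 1% and zero temporal noise, as expected for identical frames.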

  6. [Evaluation of Iris Morphology Viewed through Stromal Edematous Corneas by Infrared Camera].

    PubMed

    Kobayashi, Masaaki; Morishige, Naoyuki; Morita, Yukiko; Yamada, Naoyuki; Kobayashi, Motomi; Sonoda, Koh-Hei

    2016-02-01

    We previously reported that an infrared camera enables observation of iris morphology through the edematous corneas of Peters' anomaly. The aim of this study was to observe iris morphology in bullous keratopathy or failed grafts with an infrared camera. Eleven subjects with bullous keratopathy or failed grafts (6 men and 5 women; mean age ± SD, 72.7 ± 13.0 years) were enrolled in this study. The iris morphology was observed using the visible-light mode and the near-infrared mode of an infrared camera (MeibomPen). The detectability of pupil shape, iris pattern, and the presence of an iridectomy was evaluated. Infrared-mode observation enabled us to detect the pupil shape in 11 of 11 cases, the iris pattern in 3 of 11 cases, and the presence of an iridectomy in 9 of 11 cases, whereas visible-light observation could not detect any iris morphological changes. Infrared optics are valuable for observing iris morphology through stromal edematous corneas.

  7. High-frame-rate infrared and visible cameras for test range instrumentation

    NASA Astrophysics Data System (ADS)

    Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.

    1995-09-01

    Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.
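
    As a rough illustration of what the selectable modes above imply for the data path, the pixel and bit rates of the quoted modes can be computed directly. This is a back-of-envelope sketch: the function name is ours, and 12 bits per pixel is taken from the digital video output described above.

```python
def data_rate(width, height, fps, bits_per_pixel=12):
    """Return (pixels per second, bits per second) for a readout mode."""
    pix_per_s = width * height * fps
    return pix_per_s, pix_per_s * bits_per_pixel

# IR camera: full-frame 30 Hz mode vs. fast sub-frame 300 Hz mode.
full_px, full_bps = data_rate(640, 480, 30)       # 9.216 Mpixel/s
fast_px, fast_bps = data_rate(133, 142, 300)      # 5.666 Mpixel/s
# Visible camera at 300 Hz: 2:1 vertical binning halves the output rows.
vis_px, vis_bps = data_rate(1024, 512, 300)
```

Note that the fast sub-frame IR mode actually moves fewer pixels per second than the full-frame mode; the higher frame rate is bought by reading a much smaller window.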

  8. Observation of runaway electrons by infrared camera in J-TEXT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tong, R. H.; Chen, Z. Y., E-mail: zychen@hust.edu.cn; Zhang, M.

    2016-11-15

    When the energy of confined runaway electrons approaches several tens of MeV, they emit synchrotron radiation at infrared wavelengths. An infrared camera working in the 3-5 μm band has been developed to study runaway electrons in the Joint Texas Experimental Tokamak (J-TEXT). The camera is located in the equatorial plane, looking tangentially into the direction of electron approach. The runaway electron beam inside the plasma has been observed at the flattop phase. With the camera's fast acquisition, the behavior of the runaway electron beam has been observed directly during the runaway current plateau following massive gas injection triggered disruptions.

  9. Attitude identification for SCOLE using two infrared cameras

    NASA Technical Reports Server (NTRS)

    Shenhar, Joram

    1991-01-01

    An algorithm is presented that incorporates real time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs), mounted on the SCOLE platform, permitting arbitrary camera placement and an on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six degrees of freedom attitude tracking, regardless of the attitude time history. A description is provided of the algorithm in the camera identification mode as well as the mode of target tracking. Experimental data from a reduced size SCOLE-like lab model, reflecting the performance of the camera identification and the tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
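
    The core of such an attitude extraction, once the two cameras have triangulated the three LED positions, is a rigid-body fit between the LEDs' known body-frame coordinates and their measured lab-frame coordinates. Below is a minimal sketch using the standard Kabsch/Procrustes least-squares fit; it illustrates the general technique, not the SCOLE code listed in the report.

```python
import numpy as np

def rigid_fit(body_pts, lab_pts):
    """Least-squares R, t with lab_pts[i] ≈ R @ body_pts[i] + t (Kabsch)."""
    cb = body_pts.mean(axis=0)
    cl = lab_pts.mean(axis=0)
    H = (body_pts - cb).T @ (lab_pts - cl)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cl - R @ cb
    return R, t

# Synthetic check: three "LEDs" rotated by a known yaw and translated.
yaw = 0.3
Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
               [np.sin(yaw),  np.cos(yaw), 0.0],
               [0.0,          0.0,         1.0]])
leds = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])
lab = leds @ Rz.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_fit(leds, lab)
```

Three non-collinear markers are the minimum for a unique six-degree-of-freedom solution, which matches the three-LED arrangement described in the abstract.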

  10. UKIRT's Wide Field Camera and the Detection of 10 MJupiter Objects

    NASA Astrophysics Data System (ADS)

    WFCAM Team; UKIDSS Team

    2004-06-01

    In mid-2004 a near-infrared wide field camera will be commissioned on UKIRT. About 40% of all UKIRT time will go into sky surveys and one of these, the Large Area Survey using YJHK filters, will extend the field brown dwarf population to temperatures and masses significantly lower than those of the T dwarf population discovered by the Sloan and 2MASS surveys. The LAS should find objects as cool as 450 K and as low mass as 10 MJupiter at 10 pc. These planetary-mass objects will possibly require a new spectral type designation.

  11. Development of a portable multispectral thermal infrared camera

    NASA Technical Reports Server (NTRS)

    Osterwisch, Frederick G.

    1991-01-01

    The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument, designated AA465, has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0-13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5-micron-wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display, which the operator uses to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-person operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single-exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) exposure times; as such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument were performed from June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program.

  12. Students' framing of laboratory exercises using infrared cameras

    NASA Astrophysics Data System (ADS)

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-12-01

    Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the predict-observe-explain approach of White and Gunstone. The activities involved central thermal concepts that focused on heat conduction and dissipative processes such as friction and collisions. Students' interactions within each activity were videotaped and the analysis focuses on how a purposefully selected group of three students engaged with the exercises. As the basis for an interpretative study, a "thick" narrative description of the students' epistemological and conceptual framing of the exercises and how they took advantage of the disciplinary affordance of IR cameras in the thermal domain is provided. Findings include that the students largely shared their conceptual framing of the four activities, but differed among themselves in their epistemological framing, for instance, in how far they found it relevant to digress from the laboratory instructions when inquiring into thermal phenomena. In conclusion, the study unveils the disciplinary affordances of infrared cameras, in the sense of their use in providing access to knowledge about macroscopic thermal science.

  13. Portable Long-Wavelength Infrared Camera for Civilian Application

    NASA Technical Reports Server (NTRS)

    Gunapala, S. D.; Krabach, T. N.; Bandara, S. V.; Liu, J. K.

    1997-01-01

    In this paper, we discuss the performance of this portable long-wavelength infrared camera in terms of quantum efficiency, NEΔT, minimum resolvable temperature difference (MRTD), uniformity, etc., and its applications in science, medicine and defense.

  14. Measuring Positions of Objects using Two or More Cameras

    NASA Technical Reports Server (NTRS)

    Klinko, Steve; Lane, John; Nelson, Christopher

    2008-01-01

    An improved method of computing positions of objects from digitized images acquired by two or more cameras (see figure) has been developed for use in tracking debris shed by a spacecraft during and shortly after launch. The method is also readily adaptable to such applications as (1) tracking moving and possibly interacting objects in other settings in order to determine causes of accidents and (2) measuring positions of stationary objects, as in surveying. Images acquired by cameras fixed to the ground and/or cameras mounted on tracking telescopes can be used in this method. In this method, processing of image data starts with creation of detailed computer- aided design (CAD) models of the objects to be tracked. By rotating, translating, resizing, and overlaying the models with digitized camera images, parameters that characterize the position and orientation of the camera can be determined. The final position error depends on how well the centroids of the objects in the images are measured; how accurately the centroids are interpolated for synchronization of cameras; and how effectively matches are made to determine rotation, scaling, and translation parameters. The method involves use of the perspective camera model (also denoted the point camera model), which is one of several mathematical models developed over the years to represent the relationships between external coordinates of objects and the coordinates of the objects as they appear on the image plane in a camera. The method also involves extensive use of the affine camera model, in which the distance from the camera to an object (or to a small feature on an object) is assumed to be much greater than the size of the object (or feature), resulting in a truly two-dimensional image. The affine camera model does not require advance knowledge of the positions and orientations of the cameras. 
This is because ultimately, positions and orientations of the cameras and of all objects are computed in a coordinate
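
    The geometric core of computing positions from two or more cameras can be illustrated with linear (DLT) triangulation: each camera's projection contributes two linear constraints on the homogeneous 3-D point. This is a generic sketch under the point (perspective) camera model mentioned above, with illustrative camera matrices, not the method's actual CAD-matching pipeline.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear DLT: 3-D point from pixel coords in two calibrated cameras."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)          # null vector = homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Pinhole projection of a 3-D point to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic check: two cameras 1 m apart viewing a point 5 m away.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 5.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noisy centroids, the same least-squares formulation extends naturally to more than two cameras by stacking two rows per view.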

  15. Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera

    NASA Astrophysics Data System (ADS)

    Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu

    2016-09-01

    We performed an achromatic imaging experiment with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under both visible and infrared light. As a result, the removal of chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration setup simulating the use of a night-vision camera in an automobile and showed the effect of the WFC system.

  16. Finite Element Modeling and Long Wave Infrared Imaging for Detection and Identification of Buried Objects

    DTIC Science & Technology

    surface temperature profile of a sandbox containing buried objects using a long-wave infrared camera. Images were recorded for several days under ambient...time of day. Best detection of buried objects corresponded to shallow depths for observed intervals where maxima/minima ambient temperatures coincided

  17. Development of plenoptic infrared camera using low dimensional material based photodetectors

    NASA Astrophysics Data System (ADS)

    Chen, Liangliang

    Infrared (IR) sensors have extended imaging from the submicron visible spectrum to wavelengths of tens of microns, and have been widely used in military and civilian applications. Conventional IR cameras based on bulk semiconductor materials suffer from low frame rate, low resolution, temperature dependence, and high cost, while nanotechnology based on low-dimensional materials such as the carbon nanotube (CNT) has made much progress in research and industry. The unique properties of the CNT motivate the investigation of CNT-based IR photodetectors and imaging systems, addressing the sensitivity, speed, and cooling difficulties of state-of-the-art IR imaging. Reliability and stability are critical to the transition from nanoscience to nanoengineering, especially for infrared sensing: not only for a fundamental understanding of the processes induced by the CNT photoresponse, but also for the development of a novel infrared-sensitive material with unique optical and electrical features. In the proposed research, a sandwich-structured sensor was fabricated between two polymer layers: the polyimide substrate isolated the sensor from background noise, and the parylene top packing blocked environmental humidity. At the same time, the fabrication process was optimized by real-time electrically monitored dielectrophoresis and multiple annealing steps to improve the fabrication yield and sensor performance. The nanoscale infrared photodetector was characterized with a digital microscope and a precise linear stage in order to understand it fully. In addition, a low-noise, high-gain readout system was designed together with the CNT photodetector to realize the nano-sensor IR camera. To explore more of the infrared light field, we employ compressive sensing algorithms for light-field sampling, 3-D imaging, and compressive video sensing. The redundancy of the whole light field, including angular images for the light field, binocular images for the 3-D camera, and temporal information of video streams, is extracted and

  18. Infrared Imaging Camera Final Report CRADA No. TC02061.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, E. V.; Nebeker, S.

    This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100-nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera, and the test results revealed that the camera exceeded presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This was originally designated as a two-year project. The project was not started on time due to changes in the IPP project funding conditions; the funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes in the Russian government's export regulations, driven by export controls on high-technology items that can be used to develop military weapons; the IR camera was on the list that these controls covered. After negotiations, the ISTC and the Russian government allowed the delivery of the camera to LLNL. There were no significant technical or business changes to the original project.

  19. Lock-in thermography using a cellphone attachment infrared camera

    NASA Astrophysics Data System (ADS)

    Razani, Marjan; Parkhimchyk, Artur; Tabatabaei, Nima

    2018-03-01

    Lock-in thermography (LIT) is a thermal-wave-based non-destructive testing technique that has been widely utilized in research settings for the characterization and evaluation of biological and industrial materials. However, despite promising research outcomes, the widespread adoption of LIT in industry, and its commercialization, is hindered by the high cost of the infrared cameras used in LIT setups. In this paper, we report on the feasibility of using inexpensive cellphone-attachment infrared cameras for performing LIT. While the cost of such cameras is more than two orders of magnitude less than that of their research-grade counterparts, our experimental results on a block sample with subsurface defects and on a tooth with early dental caries suggest that acceptable performance can be achieved through careful instrumentation and the implementation of proper data acquisition and image processing steps. We anticipate this study will pave the way for the development of low-cost thermography systems and their commercialization as inexpensive tools for the non-destructive testing of industrial samples as well as affordable clinical devices for the diagnostic imaging of biological tissues.
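
    The demodulation step at the heart of LIT can be sketched as a per-pixel correlation of the frame stack against in-phase and quadrature references at the modulation frequency. This is a generic lock-in sketch with illustrative names and parameters, not the authors' processing chain.

```python
import numpy as np

def lockin_demodulate(frames, fs, f_mod):
    """Per-pixel amplitude and phase at f_mod from a (n, H, W) frame stack."""
    n = frames.shape[0]
    t = np.arange(n) / fs
    ref_i = np.cos(2 * np.pi * f_mod * t)         # in-phase reference
    ref_q = np.sin(2 * np.pi * f_mod * t)         # quadrature reference
    s = frames.reshape(n, -1)
    i_img = (2.0 / n) * (ref_i @ s)               # correlate each pixel series
    q_img = (2.0 / n) * (ref_q @ s)
    amp = np.hypot(i_img, q_img).reshape(frames.shape[1:])
    phase = np.arctan2(q_img, i_img).reshape(frames.shape[1:])
    return amp, phase

# Synthetic check: one pixel oscillating with known amplitude and phase lag.
fs, f_mod, n = 100.0, 5.0, 400
t = np.arange(n) / fs
pix = 20.0 + 3.0 * np.cos(2 * np.pi * f_mod * t - 0.7)   # DC offset + signal
amp, phase = lockin_demodulate(pix.reshape(n, 1, 1), fs, f_mod)
```

The phase image is what typically reveals subsurface defects, since phase is largely insensitive to non-uniform surface emissivity and illumination.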

  20. High spatial resolution infrared camera as ISS external experiment

    NASA Astrophysics Data System (ADS)

    Eckehard, Lorenz; Frerker, Hap; Fitch, Robert Alan

    The high spatial resolution infrared camera, an ISS external experiment for monitoring global climate change, uses ISS internal and external resources (e.g., data storage). The optical experiment will consist of an infrared camera for monitoring global climate change from the ISS. The technology was evaluated by the German small-satellite mission BIRD and further developed in several ESA projects. Compared to BIRD, the presented instrument uses proven advanced sensor technologies (ISS external) and ISS on-board processing and storage capabilities (internal). The instrument will be equipped with a serial interface for TM/TC and several relay commands for the power supply. For data processing and storage, a mass memory is required. Access to current attitude data is highly desired in order to produce geo-referenced maps, if possible by on-board processing.

  1. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or the object is further away from the observer, increasing the recording device's resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or the use of adaptive optics. However, most of these methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects are suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky-image algorithmic analysis with fewer frames, which is more efficient. The details of our modified plenoptic cameras and image processing algorithms are introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated objects. Our results show that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer, and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" from ordinary cameras is not achievable.
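
    The "lucky image" selection mentioned above can be illustrated with a simple sharpness metric: score each frame by its mean gradient energy and keep the sharpest. This is a generic sketch (the metric and the synthetic test pattern are our assumptions), not the authors' reconstruction algorithm.

```python
import numpy as np

def sharpness(img):
    """Mean gradient energy: higher for sharper frames."""
    gy, gx = np.gradient(img.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def lucky_frame(frames):
    """Index of the sharpest frame in a sequence."""
    return int(np.argmax([sharpness(f) for f in frames]))

def box_blur(img, passes):
    """Crude horizontal 3-tap box blur (circular), standing in for turbulence blur."""
    out = img.astype(float)
    for _ in range(passes):
        out = (np.roll(out, 1, axis=1) + out + np.roll(out, -1, axis=1)) / 3.0
    return out

# Synthetic check: a vertical sinusoidal grating and two blurred copies.
x = np.arange(32)
sharp = np.tile(np.sin(2 * np.pi * x / 8.0), (32, 1))
frames = [box_blur(sharp, 3), sharp, box_blur(sharp, 6)]
idx = lucky_frame(frames)   # the unblurred frame scores highest
```

The plenoptic approach described above would run a selection like this over sub-aperture views within a single exposure, rather than over a long time series of full frames.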

  2. Hubble Space Telescope/Near-Infrared Camera and Multi-Object Spectrometer Observations of the GLIMPSE9 Stellar Cluster

    NASA Astrophysics Data System (ADS)

    Messineo, Maria; Figer, Donald F.; Davies, Ben; Kudritzki, R. P.; Rich, R. Michael; MacKenty, John; Trombley, Christine

    2010-01-01

    We present Hubble Space Telescope/Near-Infrared Camera and Multi-Object Spectrometer photometry, and low-resolution K-band spectra, of the GLIMPSE9 stellar cluster. The newly obtained color-magnitude diagram shows a cluster sequence with H - Ks ≈ 1 mag, indicating an interstellar extinction A_Ks = 1.6 ± 0.2 mag. The spectra of the three brightest stars show deep CO band heads, which indicate red supergiants with spectral type M1-M2. Two O9-B2 supergiants are also identified, which yield a spectrophotometric distance of 4.2 ± 0.4 kpc. Presuming that the population is coeval, we derive an age between 15 and 27 Myr, and a total cluster mass of 1600 ± 400 M_sun, integrated down to 1 M_sun. In the vicinity of GLIMPSE9 are several H II regions and supernova remnants, all of which (including GLIMPSE9) are probably associated with a giant molecular cloud (GMC) in the inner Galaxy. GLIMPSE9 probably represents one episode of massive star formation in this GMC. We have identified several other candidate stellar clusters of the same complex.
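
    A spectrophotometric distance like the one quoted above follows from the standard distance modulus, m - M = 5 log10(d / 10 pc) + A. A minimal sketch of the arithmetic, with illustrative magnitudes rather than the paper's measured values:

```python
def distance_pc(m_app, M_abs, A=0.0):
    """Distance in parsecs from apparent magnitude, absolute magnitude, extinction."""
    return 10.0 ** ((m_app - M_abs - A + 5.0) / 5.0)

# Illustrative values (not the paper's measurements): a star with apparent
# Ks = 8.0, absolute M_Ks = -6.0, and A_Ks = 1.6 lies at about 3.0 kpc.
d = distance_pc(8.0, -6.0, A=1.6)
```

Neglecting the extinction term would inflate this illustrative distance by a factor of 10^(1.6/5), roughly 2, which is why the A_Ks estimate above matters for the quoted 4.2 kpc.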

  3. AKARI Infrared Camera Survey of the Large Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Shimonishi, Takashi; Kato, Daisuke; Ita, Yoshifusa; Onaka, Takashi

    2015-08-01

    The Large Magellanic Cloud (LMC) is one of the closest external galaxies to the Milky Way and has been playing a central role in various fields of modern astronomy and astrophysics. We conducted an unbiased near- to mid-infrared imaging and spectroscopic survey of the LMC with the infrared satellite AKARI. An area of about 10 square degrees of the LMC was observed by five imaging bands (each centered at 3.2, 7, 11, 15, and 24 micron) and the low-resolution slitless prism spectroscopy mode (2--5 micron, R~20) equipped with the Infrared Camera on board AKARI. Based on the data obtained in the survey, we constructed the photometric and spectroscopic catalogues of point sources in the LMC. The photometric catalogue includes about 650,000, 90,000, 49,000, 17,000, 7,000 sources at 3.2, 7, 11, 15, and 24 micron, respectively (Ita et al. 2008, PASJ, 60, 435; Kato et al. 2012, AJ, 144, 179), while the spectroscopic catalogue includes 1,757 sources (Shimonishi et al. 2013, AJ, 145, 32). Both catalogs are publicly released and available through a website (AKARI Observers Page, http://www.ir.isas.ac.jp/AKARI/Observation/). The catalog includes various infrared sources such as young stellar objects, asymptotic giant branch stars, giants/supergiants, and many other cool or dust-enshrouded stars. A large number of near-infrared spectral data, coupled with complementary broadband photometric data, allow us to investigate infrared spectral features of sources by comparison with their spectral energy distributions. Combined use of the present AKARI LMC catalogues with other infrared catalogues such as SAGE and HERITAGE possesses scientific potential that can be applied to various astronomical studies. In this presentation, we report the details of the AKARI photometric and spectroscopic catalogues of the LMC.

  4. AKARI's infrared view on nearby stars. Using AKARI infrared camera all-sky survey, 2MASS, and Hipparcos catalogs

    NASA Astrophysics Data System (ADS)

    Ita, Y.; Matsuura, M.; Ishihara, D.; Oyabu, S.; Takita, S.; Kataza, H.; Yamamura, I.; Matsunaga, N.; Tanabé, T.; Nakada, Y.; Fujiwara, H.; Wada, T.; Onaka, T.; Matsuhara, H.

    2010-05-01

    Context. AKARI, a Japanese infrared space mission, has performed an all-sky survey in six infrared bands from 9 to 180 μm with higher spatial resolution and better sensitivity than IRAS. Aims: We investigate the mid-infrared (9 and 18 μm) point source catalog (PSC) obtained with the infrared camera (IRC) onboard AKARI, in order to understand the infrared nature of known objects and to identify previously unknown objects. Methods: Color-color diagrams and a color-magnitude diagram were plotted from the AKARI-IRC PSC and other available all-sky survey catalogs. We combined the Hipparcos astrometric catalog and the 2MASS all-sky survey catalog with the AKARI-IRC PSC. We furthermore searched the literature and the SIMBAD astronomical database for object types, spectral types, and luminosity classes. We identified the locations of representative stars and objects in the color-magnitude and color-color diagrams. The properties of unclassified sources can be inferred from their locations in these diagrams. Results: We found that the (B-V) vs. (V-S9W) color-color diagram is useful for identifying stars with infrared excess arising from circumstellar envelopes or disks. Be stars with infrared excess are well separated from other types of stars in this diagram, whereas the (J-L18W) vs. (S9W-L18W) diagram is a powerful tool for classifying several object types. Carbon-rich asymptotic giant branch (AGB) stars and OH/IR stars form distinct sequences in this color-color diagram. Young stellar objects (YSOs), pre-main-sequence (PMS) stars, post-AGB stars, and planetary nebulae (PNe) have the largest mid-infrared color excesses and can be identified in the infrared catalog. Finally, we plot the L18W vs. (S9W-L18W) color-magnitude diagram, using the AKARI data together with Hipparcos parallaxes. This diagram can be used to identify low-mass YSOs and AGB stars. We found that this diagram is comparable to the [24] vs. ([8.0]-[24]) diagram of Large Magellanic Cloud sources

  5. Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena

    NASA Astrophysics Data System (ADS)

    Pei Wong, Choun; Subramaniam, R.

    2018-05-01

    The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at the undergraduate physics level, using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  6. Seeing in a Different Light--Using an Infrared Camera to Teach Heat Transfer and Optical Phenomena

    ERIC Educational Resources Information Center

    Wong, Choun Pei; Subramaniam, R.

    2018-01-01

    The infrared camera is a useful tool in physics education to 'see' in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at the undergraduate physics level, using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  7. A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.

    PubMed

    Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi

    2016-08-30

    This paper proposes a novel infrared camera array guidance system capable of tracking and providing the real-time position and speed of a fixed-wing unmanned aerial vehicle (UAV) during the landing process. The system mainly includes three novel parts: (1) a cooperative long-range optical imaging module based on an infrared camera array and near-infrared laser lamps; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with a fixed-wing aircraft demonstrate that our infrared camera array system has the unique ability to guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for automatic, accurate UAV landing in Global Positioning System (GPS)-denied environments.

  8. Exploring the imaging properties of thin lenses for cryogenic infrared cameras

    NASA Astrophysics Data System (ADS)

    Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura

    2016-05-01

    Designing a cryogenic camera is a good strategy for miniaturizing and simplifying an infrared camera that uses a cooled detector. Indeed, integrating the optics inside the cold shield makes it simple to athermalize the design, guarantees a cold pupil, and relaxes the constraint of a long back focal length in short-focal-length systems. In this way, cameras made of a single lens or two lenses are viable systems with good optical features and good stability in image correction. However, this involves a relatively significant additional optical mass inside the dewar and thus increases the cool-down time of the camera. ONERA is currently exploring a minimalist strategy that consists of giving an imaging function to the thin optical plates found in conventional dewars. In this way, we could make a cryogenic camera that has the same cool-down time as a traditional dewar without an imaging function. Two examples will be presented. The first is a camera using a dual-band infrared detector, made of a lens outside the dewar and a lens inside the cold shield, the latter carrying the main optical power of the system; we were able to design a cold plano-convex lens with a thickness of less than 1 mm. The second example is an evolution of a former cryogenic camera called SOIE: we replaced the cold meniscus with a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performances of both cameras will be compared.

  9. Forward-Looking Infrared Cameras for Micrometeorological Applications within Vineyards

    PubMed Central

    Katurji, Marwan; Zawar-Reza, Peyman

    2016-01-01

    We apply the principles of atmospheric surface-layer dynamics within a vineyard canopy to demonstrate the use of forward-looking infrared cameras measuring surface brightness temperature (spectral bandwidth of 7.5 to 14 μm) at a relatively high temporal rate of 10 s. The temporal surface brightness signal over a few hours of the stable nighttime boundary layer, intermittently interrupted by periods of turbulent heat flux surges, was shown to be related to the meteorological measurements of an in situ eddy-covariance system and reflected the above-canopy wind variability. The infrared raster images were collected, and the resultant self-organized spatial clusters provided the meteorological context when compared to the in situ data. The spatial brightness temperature pattern was explained in terms of the presence or absence of nighttime cloud cover, the down-welling of long-wave radiation, and the canopy turbulent heat flux. Time-sequential thermography, as demonstrated in this research, provides positive evidence for the application of thermal infrared cameras in the domain of micrometeorology and can enhance our spatial understanding of turbulent eddy interactions with the surface. PMID:27649208

  10. Ensuring long-term stability of infrared camera absolute calibration.

    PubMed

    Kattnig, Alain; Thetas, Sophie; Primot, Jérôme

    2015-07-13

    Absolute calibration of cryogenic 3-5 µm and 8-10 µm infrared cameras is notoriously unstable and thus has to be repeated before actual measurements. Moreover, the signal-to-noise ratio of the imagery is lowered, decreasing its quality. These performance degradations strongly lessen the suitability of infrared imaging. The defects are often blamed on detectors reaching a different "response state" after each return to cryogenic conditions, after accounting for the detrimental effects of imperfect stray-light management. We show here that the detectors are not to blame and that the culprit can also dwell in the proximity electronics. We identify an unexpected source of instability in the initial voltage of the integrating capacitor of the detectors. We then show that this parameter can be easily measured and taken into account. In this way we demonstrate that a one-month-old calibration of a 3-5 µm camera has retained its validity.
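
    The correction hinted at above can be modeled very simply. The sketch below is an assumption for illustration (invented parameter names and a linear model, not the authors' published procedure): it subtracts an offset proportional to the drift of the integrating capacitor's initial voltage from its value at calibration time.

    ```python
    def corrected_counts(raw_dn, v0_measured, v0_reference, counts_per_volt):
        """Remove the response offset caused by a drift of the detector's
        integrating-capacitor initial voltage between cool-downs.
        All names and the linear model are illustrative assumptions."""
        return raw_dn - counts_per_volt * (v0_measured - v0_reference)
    ```

    With a gain of 5000 DN/V, for example, a +20 mV drift shifts the raw signal by 100 DN, which the correction removes, so an old radiometric calibration stays usable.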

  11. Calibration of asynchronous smart phone cameras from moving objects

    NASA Astrophysics Data System (ADS)

    Hagen, Oksana; Istenič, Klemen; Bharti, Vibhav; Dhali, Maruf Ahmed; Barmaimon, Daniel; Houssineau, Jérémie; Clark, Daniel

    2015-04-01

    Calibrating multiple cameras is a fundamental prerequisite for many computer vision applications. Typically this involves using a pair of identical synchronized industrial or high-end consumer cameras. This paper considers an application using a pair of low-cost portable cameras with different parameters, as found in smart phones. It addresses the issues of acquisition, detection of moving objects, dynamic camera registration, and tracking of an arbitrary number of targets. The data are acquired using two standard smart phone cameras and later processed using detections of moving objects in the scene. The registration of the cameras onto the same world reference frame is performed using a recently developed method for camera calibration based on a disparity-space parameterisation and the single-cluster PHD filter.

  12. Selecting among competing models of electro-optic, infrared camera system range performance

    USGS Publications Warehouse

    Nichols, Jonathan M.; Hines, James E.; Nichols, James D.

    2013-01-01

    Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range-performance models. Model selection based on Akaike's Information Criterion (AIC) is presented for the type of data collected during a typical human-observer target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions for distances other than the specific set at which experimental trials were conducted.
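
    Akaike's Information Criterion for a fitted model is AIC = 2k − 2 ln L̂, where k is the number of parameters and L̂ the maximized likelihood; candidates are ranked by ΔAIC relative to the best model. A minimal sketch with invented model names and made-up log-likelihood values (not the paper's fits):

    ```python
    def aic(log_likelihood, n_params):
        """Akaike's Information Criterion: 2k - 2 ln(L-hat)."""
        return 2 * n_params - 2 * log_likelihood

    def rank_models(models):
        """models: name -> (maximized log-likelihood, number of parameters).
        Returns (name, delta-AIC) pairs sorted from best to worst."""
        scores = {name: aic(ll, k) for name, (ll, k) in models.items()}
        best = min(scores.values())
        return sorted(((name, score - best) for name, score in scores.items()),
                      key=lambda pair: pair[1])

    # Hypothetical fits of three competing range-performance models:
    ranking = rank_models({"model_A": (-120.4, 3),
                           "model_B": (-125.9, 2),
                           "model_C": (-121.0, 4)})
    ```

    Note how the penalty term works here: model_A wins even though model_C's likelihood is nearly as high, because the extra parameter costs 2 AIC units.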

  13. Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA

    NASA Astrophysics Data System (ADS)

    Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki

    2017-11-01

    SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid- and far-infrared astronomy with a cryogenically cooled 3-m-class telescope, envisioned for launch in the early 2020s. The Mid-infrared Camera and Spectrometer (MCS) is a focal-plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38 μm. MCS consists of two relay optical modules and the following four scientific optical modules: WFC (Wide Field Camera; 5' x 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2'.5-long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000) and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here we present the optical design and expected optical performance of MCS. Most parts of the MCS optics adopt an off-axis reflective system to cover the wide wavelength range of 5-38 μm without chromatic aberration and to minimize problems due to changes in the shapes and refractive indices of materials from room temperature to cryogenic temperature. In order to achieve the demanding specifications of wide field of view, small F-number and large spectral resolving power in a compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]), a design method using free-form surfaces for compact reflective optics such as head-mounted displays. As a result, we have successfully designed compact reflective optics for MCS with as-built performance at the diffraction-limited image resolution.

  14. TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope

    NASA Astrophysics Data System (ADS)

    Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.

    Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512×512-pixel InSb Aladdin III quadrant focal plane array (FPA) sensitive to photons in the 1-5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on the calibration observations carried out during 2017 May 11-14 and 2017 October 7-31. After the preliminary characterization, the camera has been released to the Indian and Belgian astronomical community for science observations since 2017 May. The camera offers a field-of-view (FoV) of ~86.5″ × 86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands comparable to other 4 m class telescopes available worldwide. Another highlight of this camera is the observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λcen ~ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.

  15. High-Resolution Mars Camera Test Image of Moon (Infrared)

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The mission's High Resolution Imaging Science Experiment camera took the image on Sept. 8, 2005, while at a distance of about 10 million kilometers (6 million miles) from the Moon. The dark feature on the right is Mare Crisium. From that distance, the Moon would appear as a star-like point of light to the unaided eye. The test verified the camera's focusing capability and provided an opportunity for calibration. The spacecraft's Context Camera and Optical Navigation Camera also performed as expected during the test.

    The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high-resolution camera will be able to discern features as small as one meter (about one yard) across.

  16. Navigating surgical fluorescence cameras using near-infrared optical tracking.

    PubMed

    van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs

    2018-05-01

    Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera within reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced interference during NIR fluorescence imaging. To support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one light-emitting-diode setup simulating an OTS. Following the outcome of these measurements, the OTS settings were optimized. Measurements indicated the OTS interference was caused by: (1) spectral overlap between the OTS light and the camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  17. InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications

    NASA Technical Reports Server (NTRS)

    Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon

    1996-01-01

    In 1994, Inframetrics introduced the InfraCAM(TM), a high-resolution hand-held thermal imager. As the world's smallest, lightest and lowest-power PtSi-based infrared camera, the InfraCAM is ideal for a wide range of industrial, non-destructive testing, surveillance and scientific applications. In addition to numerous commercial applications, the light weight and low power consumption of the InfraCAM make it extremely valuable for adaptation to spaceborne applications. Consequently, the InfraCAM has been selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) spaceborne experiment. In this experiment, a solid fuel is ignited in a low-gravity environment. The combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground-based testing in preparation for space flight in the fall of 1995.

  18. In vitro near-infrared imaging of occlusal dental caries using germanium enhanced CMOS camera.

    PubMed

    Lee, Chulsung; Darling, Cynthia L; Fried, Daniel

    2010-03-01

    The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
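
    The lesion contrast the study relies on can be expressed, in one common formulation (our assumption; the abstract does not give the formula), as the normalized difference between sound and carious transillumination intensities:

    ```python
    def lesion_contrast(i_sound, i_lesion):
        """Normalized NIR transillumination contrast, in [0, 1] when the
        carious region transmits less light than sound enamel."""
        if i_sound <= 0:
            raise ValueError("sound-region intensity must be positive")
        return (i_sound - i_lesion) / i_sound
    ```

    Under this definition, a severe lesion transmitting a quarter of the sound-enamel intensity gives a contrast of 0.75, while identical intensities give 0; the study's finding is that this quantity grows with histological lesion severity.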

  19. Distinguishing the road conditions of dry, aquaplane, and frozen by using a three-color infrared camera

    NASA Astrophysics Data System (ADS)

    Tabuchi, Toru; Yamagata, Shigeki; Tamura, Tetsuo

    2003-04-01

    Demand is increasing for information that helps avoid accidents as automobile traffic grows. We discuss how an infrared camera can identify three conditions (dry, aquaplane, frozen) of the road surface. The principles of the method are: 1. We have found that a three-color infrared camera can distinguish these conditions using proper data processing. 2. The emissivity of the materials on the road surface (concrete, water, ice) differs among the three wavelength regions. 3. The sky's temperature is lower than the road's, and the emissivity of the road depends on the road surface condition. The three-color infrared camera therefore measures the energy reflected from the sky off the road surface together with the self-radiation of the road surface, and the road condition can be distinguished by processing the energy pattern measured in the three wavelength regions. Our experimental results confirm that the emissivity of concrete differs from that of water. An infrared camera whose NETD (Noise Equivalent Temperature Difference) in each of the three wavelengths is 1.0 °C or less can distinguish the road conditions by using this emissivity difference.
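
    The classification step described above amounts to matching a measured three-band emissivity pattern against reference signatures. A toy nearest-neighbor sketch with invented emissivity values (the paper's actual measurements are not reproduced here):

    ```python
    # Illustrative three-band emissivity signatures (invented values, not
    # the paper's measurements) for the three road-surface states.
    SIGNATURES = {
        "dry":       (0.95, 0.93, 0.92),   # bare concrete
        "aquaplane": (0.98, 0.96, 0.90),   # water film
        "frozen":    (0.97, 0.92, 0.88),   # ice
    }

    def classify_road(measured):
        """Return the state whose signature is nearest to the measured
        three-band emissivity triplet (squared Euclidean distance)."""
        def dist(name):
            return sum((m - s) ** 2
                       for m, s in zip(measured, SIGNATURES[name]))
        return min(SIGNATURES, key=dist)
    ```

    A deployed system would derive the measured triplet from the three-band radiances (reflected sky plus self-emission) rather than receive emissivities directly; the sketch only shows the pattern-matching idea.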

  20. Conception of a cheap infrared camera using a Fresnel lens

    NASA Astrophysics Data System (ADS)

    Grulois, Tatiana; Druart, Guillaume; Guérineau, Nicolas; Crastes, Arnaud; Sauer, Hervé; Chavel, Pierre

    2014-09-01

    Today, huge efforts are made in research and industry to design compact and cheap uncooled infrared optical systems for low-cost imaging applications. Indeed, infrared cameras are currently too expensive to be widespread; if their cost can be cut, we expect new types of markets to open. In this paper, we present the cheap broadband microimager we have designed. It operates in the long-wavelength infrared range and uses only one silicon lens, at a minimal cost for the manufacturing process. Our concept is based on the use of thin optics, so inexpensive unconventional materials can be used because some absorption can be tolerated. Our imager uses a thin Fresnel lens. Up to now, Fresnel lenses have not been used for broadband imaging applications because of their disastrous chromatic properties. However, we show that working in a high diffraction order can significantly reduce chromatism. A prototype has been made, and the performance of our camera is discussed. Its characterization has been carried out in terms of modulation transfer function (MTF) and noise equivalent temperature difference (NETD). Finally, experimental images are presented.
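
    The chromatism reduction from a high diffraction order can be illustrated with the standard multi-order diffractive-lens relation f(λ) = p·λ0·f0/(m·λ), where p is the design order and m the nearest integer to p·λ0/λ; this simple scalar model and the numbers below are our assumptions, not the authors' design code:

    ```python
    def focal_length(lmbda, lambda0=10e-6, f0=0.05, p=1):
        """Focal length of a diffractive (Fresnel) lens designed for order p
        at wavelength lambda0, operating at lmbda in the nearest order m."""
        m = max(1, round(p * lambda0 / lmbda))
        return p * lambda0 * f0 / (m * lmbda)

    def focal_spread(p, band=(8e-6, 12e-6), samples=41):
        """Peak-to-peak chromatic focal shift across the band."""
        lo, hi = band
        step = (hi - lo) / (samples - 1)
        focals = [focal_length(lo + i * step, p=p) for i in range(samples)]
        return max(focals) - min(focals)
    ```

    With these illustrative numbers, the focal spread across the 8-12 μm band at order p = 25 is roughly an order of magnitude smaller than at p = 1, because the operating order m tracks the wavelength and keeps f(λ) oscillating close to f0.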

  1. Volcano monitoring with an infrared camera: first insights from Villarrica Volcano

    NASA Astrophysics Data System (ADS)

    Rosas Sotomayor, Florencia; Amigo Ramos, Alvaro; Velasquez Vargas, Gabriela; Medina, Roxana; Thomas, Helen; Prata, Fred; Geoffroy, Carolina

    2015-04-01

    This contribution focuses on the first trials of the almost 24/7 monitoring of Villarrica volcano with an infrared camera. For the daytime measurements, results must be compared with other SO2 remote sensing instruments such as DOAS and the UV camera. Infrared remote sensing of volcanic emissions is a fast and safe method to obtain gas abundances in volcanic plumes, in particular when access to the vent is difficult, during volcanic crises, and at night. In recent years, a ground-based infrared camera (Nicair) has been developed by Nicarnica Aviation, which quantifies SO2 and ash in volcanic plumes based on the infrared radiance at specific wavelengths, selected through the application of filters. Three Nicair1 (first model) cameras have been acquired by the Geological Survey of Chile in order to study the degassing of active volcanoes. Several trials with the instruments have been performed on volcanoes in northern Chile, and have proven that the intervals of retrieved SO2 concentrations and fluxes are as expected. Measurements were also performed at Villarrica volcano, and a location to install a "fixed" camera, 8 km from the crater, was found: a coffee house with electrical power, a wifi network, polite and committed owners, and a full view of the volcano summit. The first measurements are being made and processed in order to obtain full days and weeks of SO2 emissions, analyze data transfer and storage, improve remote control of the instrument and notebook in case of breakdown, add web-cam/GoPro support, and reach the goal of the project: to implement a fixed station to monitor and study Villarrica volcano with a Nicair1, integrating and comparing these results with other remote sensing instruments.
    This work also seeks to strengthen bonds with the community by developing teaching material and giving talks to communicate volcanic hazards and other geoscience topics to the people who live "just around the corner" from one of the most active volcanoes

  2. Low-cost low-power uncooled a-Si-based micro infrared camera for unattended ground sensor applications

    NASA Astrophysics Data System (ADS)

    Schimert, Thomas R.; Ratcliff, David D.; Brady, John F., III; Ropson, Steven J.; Gooch, Roland W.; Ritchey, Bobbi; McCardel, P.; Rachels, K.; Wand, Marty; Weinstein, M.; Wynn, John

    1999-07-01

    Low power and low cost are primary requirements for an imaging infrared camera used in unattended ground sensor arrays. In this paper, an amorphous silicon (a-Si) microbolometer-based uncooled infrared camera technology offering a low-cost, low-power solution to infrared surveillance for UGS applications is presented. A 15 × 31 micro infrared camera (MIRC) has been demonstrated which exhibits an f/1 noise equivalent temperature difference of approximately 67 mK. This sensitivity has been achieved without a thermoelectric cooler for array temperature stabilization, thereby significantly reducing the power requirements. The chopperless camera can operate from snapshot mode (1 Hz) to video frame rate (30 Hz). Power consumption of 0.4 W without display and 0.75 W with display has been demonstrated at 30 Hz operation. The demonstrated 15 × 31 camera has a 35 mm camera form factor and employs a low-cost f/1 singlet optic and LED display, as well as low-cost vacuum packaging. A larger 120 × 160 version of the MIRC is also in development and is discussed; it exhibits a substantially smaller form factor and incorporates all the low-cost, low-power features demonstrated in the 15 × 31 MIRC prototype. The a-Si microbolometer technology for the MIRC, along with its key features and performance parameters, is presented.

  3. Effect of indocyanine green angiography using infrared fundus camera on subsequent dark adaptation and electroretinogram.

    PubMed

    Wen, Feng; Yu, Minzhong; Wu, Dezheng; Ma, Juanmei; Wu, Lezheng

    2002-07-01

    To observe the effect of indocyanine green angiography (ICGA) with an infrared fundus camera on subsequent dark adaptation and the Ganzfeld electroretinogram (ERG), the ERGs of 38 eyes with different retinal diseases were recorded before and after ICGA during a 40-min dark adaptation period. ICGA was performed with a Topcon 50IA retina camera. The Ganzfeld ERG was recorded with a Neuropack II evoked response recorder. The results showed that ICGA did not affect the latencies or the amplitudes of the rod response, cone response, or mixed maximum response in the ERG (p>0.05). This suggests that ICGA using an infrared fundus camera can be performed prior to the recording of the Ganzfeld ERG.

  4. Stop outbreak of SARS with infrared cameras

    NASA Astrophysics Data System (ADS)

    Wu, Yigang M.

    2004-04-01

    SARS (Severe Acute Respiratory Syndrome, commonly known as Atypical Pneumonia in mainland China) affected 8422 people and caused 918 deaths worldwide in half a year. The disease can be transmitted by respiratory droplets or by contact with a patient's respiratory secretions, which means it can spread very rapidly through public transportation by travelers carrying the syndrome. The challenge was to stop SARS carriers from traveling by train, airplane, coach, etc. Screening the tens of travelers with elevated body temperature out of thousands of normal travelers within hours is impractical with traditional oral thermometers or spot infrared thermometers. A thermal imager with a temperature measurement function is a logical choice for this special application, although it has some limitations and drawbacks. This paper discusses the real SARS applications of industrial infrared cameras in China from April to July 2003.
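
    At its core, such mass screening compares a radiometrically corrected face temperature against a fever threshold. A toy sketch, with an invented blackbody-reference correction and threshold values (the paper does not specify its processing):

    ```python
    def screen_travelers(readings, blackbody_measured, blackbody_true=35.0,
                         fever_threshold=38.0):
        """Flag travelers whose corrected face temperature reaches the
        fever threshold.

        readings -- dict: traveler id -> apparent face temperature (deg C)
        blackbody_measured -- the imager's reading of a co-viewed blackbody
        reference, used to remove per-session camera offset (an assumed
        correction scheme, for illustration only).
        """
        offset = blackbody_true - blackbody_measured
        return sorted(tid for tid, t in readings.items()
                      if t + offset >= fever_threshold)
    ```

    In practice thresholds are set on skin (not core) temperature and tuned to trade false alarms against missed fevers, which is one of the limitations the paper alludes to.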

  5. In vitro near-infrared imaging of occlusal dental caries using a germanium-enhanced CMOS camera

    NASA Astrophysics Data System (ADS)

    Lee, Chulsung; Darling, Cynthia L.; Fried, Daniel

    2010-02-01

    The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.

  6. Prototype of microbolometer thermal infrared camera for forest fire detection from space

    NASA Astrophysics Data System (ADS)

    Guerin, Francois; Dantes, Didier; Bouzou, Nathalie; Chorier, Philippe; Bouchardy, Anne-Marie; Rollin, Joël.

    2017-11-01

    The contribution of the thermal infrared (TIR) camera to the Earth observation FUEGO mission is to discriminate clouds and smoke, to detect false alarms of forest fires, and to monitor forest fires. Consequently, the camera needs a large dynamic range of detectable radiances. Small volume, low mass and low power are required by the small FUEGO payload. These specifications can be attractive for other similar missions.

  7. Imaging of breast cancer with mid- and long-wave infrared camera.

    PubMed

    Joro, R; Lääperi, A-L; Dastidar, P; Soimakallio, S; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Järvenpää, R

    2008-01-01

    In this novel study, the breasts of 15 women with palpable breast cancer were preoperatively imaged with three technically different infrared (IR) cameras - microbolometer (MB), quantum well (QWIP) and photovoltaic (PV) - to compare their ability to differentiate breast cancer from normal tissue. The IR images were processed; the data for frequency analysis were collected from dynamic IR images by pixel-based analysis, and a selectively windowed regional analysis of each image was carried out, based on the angiogenesis and nitric oxide production of cancer tissue, which cause vasomotor and cardiogenic frequency differences compared to normal tissue. Our results show that the GaAs QWIP camera and the InSb PV camera demonstrate the frequency difference between normal and cancerous breast tissue, the PV camera more clearly. With selected image-processing operations, more detailed frequency analyses could be applied to the suspicious area. The MB camera was not suitable for tissue differentiation, as the difference between noise and effective signal was unsatisfactory.

  8. SCC500: next-generation infrared imaging camera core products with highly flexible architecture for unique camera designs

    NASA Astrophysics Data System (ADS)

    Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott

    2003-09-01

    A new 4th-generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS, offering an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration, including multiple resolutions, an industry-standard Real-Time Operating System (RTOS) for customer-specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer-based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product, enabling easy drop-in camera designs for quick time-to-market product introductions.

  9. The Detection and Photometric Redshift Determination of Distant Galaxies using SIRTF's Infrared Array Camera

    NASA Technical Reports Server (NTRS)

    Simpson, C.; Eisenhardt, P.

    1998-01-01

    We investigate the ability of the Space Infrared Telescope Facility's Infrared Array Camera to detect distant (z ~ 3) galaxies and measure their photometric redshifts. Our analysis shows that changing the original long-wavelength filter specifications provides significant improvements in performance in this and other areas.

  10. Moving Object Detection on a Vehicle Mounted Back-Up Camera

    PubMed Central

    Kim, Dong-Sun; Kwon, Jinsan

    2015-01-01

    In the detection of moving objects from vision sources, one usually assumes that the scene has been captured by stationary cameras. When backing up a vehicle, however, the camera mounted on the vehicle moves with the vehicle, producing ego-motion in the background. This results in mixed motion in the scene and makes it difficult to distinguish between target objects and background motion. Without further treatment of the mixed motion, traditional fixed-viewpoint object detection methods lead to many false-positive detections. In this paper, we suggest a procedure to be used with traditional moving object detection methods that relaxes the stationary-camera restriction by introducing additional steps before and after detection. We also describe the implementation on an FPGA platform along with the algorithm. The target application is a road vehicle's rear-view camera system. PMID:26712761
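
    The pre-detection compensation step can be illustrated in one dimension (a toy sketch, not the authors' FPGA implementation): estimate the global background shift by sum-of-absolute-differences minimization, warp one frame by that shift, then difference the frames so that only the independently moving object remains.

    ```python
    def estimate_shift(prev, curr, max_shift=2):
        """Global (background) shift between two 1-D frames, found by
        exhaustive SAD minimization over candidate shifts."""
        best_shift, best_sad = 0, float("inf")
        for s in range(-max_shift, max_shift + 1):
            pairs = [(curr[i], prev[i + s]) for i in range(len(curr))
                     if 0 <= i + s < len(prev)]
            sad = sum(abs(c - p) for c, p in pairs) / len(pairs)
            if sad < best_sad:
                best_sad, best_shift = sad, s
        return best_shift

    background = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]
    frame1 = background
    # Camera pans one pixel; a moving object brightens pixel 5.
    frame2 = [background[i + 1] for i in range(11)] + [background[11]]
    frame2[5] += 10

    shift = estimate_shift(frame1, frame2)
    # Ego-motion removed before differencing (assumes shift >= 0 here).
    residual = [abs(frame2[i] - frame1[i + shift])
                for i in range(len(frame2) - shift)]
    object_pos = max(range(len(residual)), key=residual.__getitem__)
    ```

    After compensation the background cancels and the residual peaks only at the object, which is exactly why the naive fixed-camera difference (shift forced to zero) would instead flag edges everywhere in the scene.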

  11. Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA

    NASA Astrophysics Data System (ADS)

    Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.

    We present the final optical design of PANIC (PAnoramic Near Infrared Camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera's optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well-defined internal pupil, available for reducing the thermal background with a cryogenic pupil stop. A mosaic of four Teledyne Hawaii-2RG 2k × 2k detectors will give a field of view of 31.9 arcmin × 31.9 arcmin.

  12. Strategic options towards an affordable high-performance infrared camera

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.

    2016-05-01

The promise of infrared (IR) imaging attaining the low cost of successful CMOS sensors has been hampered, despite well-documented advantages, by the inability to achieve the cost reductions necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640x512 pixel uncooled InGaAs system with high sensitivity, low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second (FPS)) at full resolution, and low power consumption (<1 W) in a compact package. This camera paves the way toward mass-market adoption by demonstrating not only the high-performance IR imaging capability demanded by military and industrial applications, but also a path toward the price points essential for consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably toward automation, readout electronics compatible with multiple focal plane arrays, and dense, ultra-small pixel pitch devices.

  13. Temperature measurements on fast-rotating objects using a thermographic camera with an optomechanical image derotator

    NASA Astrophysics Data System (ADS)

    Altmann, Bettina; Pape, Christian; Reithmeier, Eduard

    2017-08-01

Increasing requirements concerning the quality and lifetime of machine components in industrial and automotive applications call for comprehensive investigation of the components under conditions close to the application. Irregular heating of mechanical parts reveals regions with increased loading from pressure, draft or friction; in the long run this leads to damage and total failure of the machine. Thermographic measurement of rotating objects, e.g., rolling bearings, brakes, and clutches, provides an approach to investigating such defects. However, it is challenging to measure fast-rotating objects accurately. Currently one contact-free approach is to perform stroboscopic measurements with an infrared sensor: data acquisition is triggered so that an image is taken once per revolution. This entails a large loss of information over the majority of the movement, as well as motion blur. The objective of this research is to show the potential of using an optomechanical image derotator together with a thermographic camera. The derotator follows the rotation of the measurement object so that quasi-stationary thermal images can be acquired by the infrared sensor during motion. Unlike conventional derotators, which use a glass prism to achieve this effect, the derotator in this work is equipped with a sophisticated reflector assembly. The reflectors are made of aluminum to transfer infrared radiation emitted by the rotating object. Because of the resulting stationary thermal image, operation can be monitored continuously even for fast-rotating objects. The field of view can also be set to a small off-axis region of interest, which can then be investigated with higher resolution or frame rate. To demonstrate the potential of this approach, thermographic measurements on rolling bearings in different operating states are presented.

  14. Hubble Space Telescope, Faint Object Camera

    NASA Technical Reports Server (NTRS)

    1981-01-01

This drawing illustrates the Hubble Space Telescope's (HST's) Faint Object Camera (FOC). The FOC reflects light down one of two optical pathways. The light enters a detector after passing through filters or through devices that can block out light from bright objects. Light from bright objects is blocked out to enable the FOC to see background images. The detector intensifies the image, then records it much like a television camera. For faint objects, images can be built up over long exposure times. The total image is translated into digital data, transmitted to Earth, and then reconstructed. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from low-Earth orbit. By placing the telescope in space, astronomers are able to collect data free of the distorting effects of the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than that visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors.

  15. First light observations with TIFR Near Infrared Imaging Camera (TIRCAM-II)

    NASA Astrophysics Data System (ADS)

    Ojha, D. K.; Ghosh, S. K.; D'Costa, S. L. A.; Naik, M. B.; Sandimani, P. R.; Poojary, S. S.; Bhagat, S. B.; Jadhav, R. B.; Meshram, G. S.; Bakalkar, C. B.; Ramaprakash, A. N.; Mohan, V.; Joshi, J.

TIFR near infrared imaging camera (TIRCAM-II) is based on the Aladdin III Quadrant InSb focal plane array (512×512 pixels; 27.6 μm pixel size; sensitive between 1 - 5.5 μm). TIRCAM-II had its first engineering run with the 2 m IUCAA telescope at Girawali during February - March 2011. The first light observations with TIRCAM-II were quite successful. Several infrared standard stars, the Trapezium Cluster in the Orion region, McNeil's nebula, etc., were observed in the J and K bands and in a narrow band at 3.6 μm (nbL). In the nbL band, some bright stars could be detected from the Girawali site. The performance of TIRCAM-II is discussed in the light of preliminary observations in near infrared bands.

  16. Winter risk estimations through infrared cameras and principal component analysis

    NASA Astrophysics Data System (ADS)

    Marchetti, M.; Dumoulin, J.; Ibos, L.

    2012-04-01

Thermal mapping has been implemented since the late eighties to measure road pavement temperature, along with other atmospheric parameters, to establish a winter risk index describing the susceptibility of a road network to ice occurrence. Measurements are made with a vehicle circulating on the road network in various road weather conditions. When the road surface temperature drops below the dew point temperature, a risk of ice occurs, and therefore a loss-of-grip risk for circulating vehicles. To limit the influence of the sun and to enhance the thermal behavior of the pavement, thermal mapping is usually done before dawn during winter, when the energy accumulated by the road during daytime has mostly been dissipated (by radiation, conduction and convection) and before the road structure starts a new cycle. This analysis is mainly done when a new road network is built, when major pavement changes are made, or when modifications in the road surroundings take place that might affect the thermal heat balance. It helps road managers install sensors to monitor road status at specific locations identified as dangerous, or simply to install specific road signs. Measurements are nevertheless time-consuming: a whole road network can hardly be analysed at once, and has to be partitioned into stretches that can be covered within the open time window, to avoid temperature artefacts due to the rising sun. The LRPC Nancy has been operating a thermal mapping vehicle, now fitted with two infrared cameras. Road events were logged by the operator to help the analysis of the network's thermal response. A conventional radiometer with appropriate performance was used as a reference. The objective of the work was to compare results from the radiometer and the cameras. All the atmospheric parameters measured by the different sensors, such as air temperature and relative humidity, were used as input parameters for the infrared cameras when recording thermal images. Road thermal
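The icing criterion described in this record — road surface temperature at or below both the dew point and freezing — can be sketched with the standard Magnus dew-point approximation. The constants and the freezing threshold below are illustrative textbook values, not parameters from the paper:

```python
import math

# Magnus formula constants (a common variant, valid roughly -45..60 °C)
A, B = 17.62, 243.12

def dew_point_c(air_temp_c, rel_humidity_pct):
    """Dew point in °C from air temperature (°C) and relative humidity (%)."""
    gamma = math.log(rel_humidity_pct / 100.0) + A * air_temp_c / (B + air_temp_c)
    return B * gamma / (A - gamma)

def ice_risk(surface_temp_c, air_temp_c, rel_humidity_pct):
    """Flag risk when the road surface is at or below both the dew point and 0 °C."""
    return (surface_temp_c <= dew_point_c(air_temp_c, rel_humidity_pct)
            and surface_temp_c <= 0.0)
```

For example, a surface at -1 °C under saturated near-freezing air is flagged, while a dry road at +5 °C is not.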

  17. PNIC - A near infrared camera for testing focal plane arrays

    NASA Astrophysics Data System (ADS)

    Hereld, Mark; Harper, D. A.; Pernic, R. J.; Rauscher, Bernard J.

    1990-07-01

    This paper describes the design and the performance of the Astrophysical Research Consortium prototype near-infrared camera (pNIC) designed to test focal plane arrays both on and off the telescope. Special attention is given to the detector in pNIC, the mechanical and optical designs, the electronics, and the instrument interface. Experiments performed to illustrate the most salient aspects of pNIC are described.

  18. Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery

    NASA Technical Reports Server (NTRS)

    Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei

    2012-01-01

We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profiles of patients. A pixel-coregistered, simultaneously read mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements, especially for objects of unknown emissivity. Even the dual-band measurement can give inaccurate results because emissivity is a function of wavelength. We have therefore been developing a four-band QWIP camera for accurate temperature measurement of remote objects.
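The motivation for multi-band measurement can be made concrete with ratio (two-color) pyrometry in the Wien approximation: for a gray body the unknown emissivity cancels in the ratio of two bands, so temperature can be recovered without knowing ε. This is a generic textbook sketch with illustrative wavelengths, not the actual QWIP band responses:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m·K

def wien_radiance(wavelength_m, temp_k, emissivity=1.0):
    """Spectral radiance in the Wien approximation (arbitrary units)."""
    return emissivity * wavelength_m ** -5 * math.exp(-C2 / (wavelength_m * temp_k))

def two_color_temperature(L1, L2, lam1, lam2):
    """Recover T from the radiance ratio of two bands; gray-body emissivity cancels.

    From L1/L2 = (lam2/lam1)^5 * exp(-(C2/T)(1/lam1 - 1/lam2)):
    """
    return C2 * (1.0 / lam1 - 1.0 / lam2) / (5.0 * math.log(lam2 / lam1) - math.log(L1 / L2))
```

If emissivity varies between the bands (as the abstract notes), the gray-body assumption fails — which is precisely why more bands help.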

  19. Cosmic Infrared Background Fluctuations in Deep Spitzer Infrared Array Camera Images: Data Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Arendt, Richard G.; Kashlinsky, A.; Moseley, S. H.; Mather, J.

    2010-01-01

This paper provides a detailed description of the data reduction and analysis procedures that have been employed in our previous studies of spatial fluctuation of the cosmic infrared background (CIB) using deep Spitzer Infrared Array Camera observations. The self-calibration we apply removes a strong instrumental signal from the fluctuations that would otherwise corrupt the results. The procedures and results for masking bright sources and modeling faint sources down to levels set by the instrumental noise are presented. Various tests are performed to demonstrate that the resulting power spectra of these fields are not dominated by instrumental or procedural effects. These tests indicate that the large-scale (≳30') fluctuations that remain in the deepest fields are not directly related to the galaxies that are bright enough to be individually detected. We provide the parameterization of these power spectra in terms of separate instrument noise, shot noise, and power-law components. We discuss the relationship between fluctuations measured at different wavelengths and depths, and the relations between constraints on the mean intensity of the CIB and its fluctuation spectrum. Consistent with growing evidence that the ~1-5 μm mean intensity of the CIB may not be as far above the integrated emission of resolved galaxies as has been reported in some analyses of DIRBE and IRTS observations, our measurements of spatial fluctuations of the CIB intensity indicate the mean emission from the objects producing the fluctuations is quite low (≳1 nW m-2 sr-1 at 3-5 μm), and thus consistent with current γ-ray absorption constraints. The source of the fluctuations may be high-z Population III objects, or a more local component of very low luminosity objects with clustering properties that differ from the resolved galaxies. Finally, we discuss the prospects of the upcoming space-based surveys to directly measure the epochs inhabited by the populations producing these sources.

  20. Cosmic Infrared Background Fluctuations in Deep Spitzer Infrared Array Camera Images: Data Processing and Analysis

    NASA Technical Reports Server (NTRS)

    Arendt, Richard; Kashlinsky, A.; Moseley, S.; Mather, J.

    2010-01-01

This paper provides a detailed description of the data reduction and analysis procedures that have been employed in our previous studies of spatial fluctuation of the cosmic infrared background (CIB) using deep Spitzer Infrared Array Camera observations. The self-calibration we apply removes a strong instrumental signal from the fluctuations that would otherwise corrupt the results. The procedures and results for masking bright sources and modeling faint sources down to levels set by the instrumental noise are presented. Various tests are performed to demonstrate that the resulting power spectra of these fields are not dominated by instrumental or procedural effects. These tests indicate that the large-scale (≳30') fluctuations that remain in the deepest fields are not directly related to the galaxies that are bright enough to be individually detected. We provide the parameterization of these power spectra in terms of separate instrument noise, shot noise, and power-law components. We discuss the relationship between fluctuations measured at different wavelengths and depths, and the relations between constraints on the mean intensity of the CIB and its fluctuation spectrum. Consistent with growing evidence that the ~1-5 μm mean intensity of the CIB may not be as far above the integrated emission of resolved galaxies as has been reported in some analyses of DIRBE and IRTS observations, our measurements of spatial fluctuations of the CIB intensity indicate the mean emission from the objects producing the fluctuations is quite low (≳1 nW m-2 sr-1 at 3-5 μm), and thus consistent with current γ-ray absorption constraints. The source of the fluctuations may be high-z Population III objects, or a more local component of very low luminosity objects with clustering properties that differ from the resolved galaxies. Finally, we discuss the prospects of the upcoming space-based surveys to directly measure the epochs inhabited by the populations producing these sources.

  1. Thermal Scanning of Dental Pulp Chamber by Thermocouple System and Infrared Camera during Photo Curing of Resin Composites.

    PubMed

    Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda

    2018-01-01

Due to thermal hazard during composite restorations, this study was designed to scan the pulp temperature by thermocouple and infrared camera during photo polymerization of different composites. A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1 mm increments of each composite (four composite types were included) were inserted and photo polymerized using either LED or QTH systems for 60 sec while the temperature was recorded at 10 sec intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. The data were analyzed by repeated measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α=0.05). The pulp temperature increased significantly (repeated measures) during photo polymerization (P=0.000), while there was no significant difference between the results recorded by the thermocouple and the infrared camera (P>0.05). Moreover, different composite materials and LCUs led to similar outcomes (P>0.05). Although various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record parallel results of dental pulp temperature.

  2. Development of infrared scene projectors for testing fire-fighter cameras

    NASA Astrophysics Data System (ADS)

    Neira, Jorge E.; Rice, Joseph P.; Amon, Francine K.

    2008-04-01

We have developed two types of infrared scene projectors for hardware-in-the-loop testing of thermal imaging cameras such as those used by fire-fighters. In one, direct projection, images are projected directly into the camera. In the other, indirect projection, images are projected onto a diffuse screen, which is then viewed by the camera. Both projectors use a digital micromirror array as the spatial light modulator, in the form of a Micromirror Array Projection System (MAPS) engine with a resolution of 800 x 600, aluminum-coated mirrors on a 17 micrometer pitch, and a ZnSe protective window. Fire-fighter cameras are often based on uncooled microbolometer arrays and typically have resolutions of 320 x 240 or lower. For direct projection, we use an argon-arc source, which provides spectral radiance equivalent to a 10,000 K blackbody over the 7 micrometer to 14 micrometer wavelength range, to illuminate the micromirror array. For indirect projection, an expanded 4 W CO2 laser beam at a wavelength of 10.6 micrometers illuminates the micromirror array, and the scene formed by the first-order diffracted light from the array is projected onto a diffuse aluminum screen. In both projectors, a well-calibrated reference camera is used to provide non-uniformity correction and brightness calibration of the projected scenes, and the fire-fighter cameras alternately view the same scenes. In this paper, we compare the two methods for this application and report our quantitative results. Indirect projection has the advantage of more easily filling the wide field of view of the fire-fighter cameras, typically about 50 degrees. Direct projection utilizes the available light more efficiently, which will become important in emerging multispectral and hyperspectral applications.

  3. Thermal Scanning of Dental Pulp Chamber by Thermocouple System and Infrared Camera during Photo Curing of Resin Composites

    PubMed Central

    Hamze, Faeze; Ganjalikhan Nasab, Seyed Abdolreza; Eskandarizadeh, Ali; Shahravan, Arash; Akhavan Fard, Fatemeh; Sinaee, Neda

    2018-01-01

Introduction: Due to thermal hazard during composite restorations, this study was designed to scan the pulp temperature by thermocouple and infrared camera during photo polymerization of different composites. Methods and Materials: A mesio-occluso-distal (MOD) cavity was prepared in an extracted tooth and a K-type thermocouple was fixed in its pulp chamber. Subsequently, 1 mm increments of each composite (four composite types were included) were inserted and photo polymerized using either LED or QTH systems for 60 sec while the temperature was recorded at 10 sec intervals. Ultimately, the same tooth was hemisected bucco-lingually and the amalgam was removed. The same composite curing procedure was repeated while the thermogram was recorded using an infrared camera. The data were analyzed by repeated measures ANOVA followed by Tukey's HSD post hoc test for multiple comparisons (α=0.05). Results: The pulp temperature increased significantly (repeated measures) during photo polymerization (P=0.000), while there was no significant difference between the results recorded by the thermocouple and the infrared camera (P>0.05). Moreover, different composite materials and LCUs led to similar outcomes (P>0.05). Conclusion: Although various composites have significantly different chemical compositions, they lead to similar pulp thermal changes. Moreover, both the infrared camera and the thermocouple record parallel results of dental pulp temperature. PMID:29707014

  4. Initial Checkout Results of the Compact Infrared Camera (circ) for Earth Observation

    NASA Astrophysics Data System (ADS)

    Kato, E.; Katayama, H.; Sakai, M.; Nakajima, Y.; Kimura, T.; Nakau, K.; Tonooka, H.

    2015-04-01

The Compact Infrared Camera (CIRC) is a technology-demonstration instrument equipped with an uncooled infrared array detector (microbolometer) for space applications. CIRC is the first microbolometer sensor in orbit without a calibration function such as a shutter system or an onboard blackbody. The main objective of the CIRC is to detect wildfires, which are major and chronic disasters affecting various countries of Southeast Asia, particularly considering the effects of global warming and climate change. The CIRC achieves small size (approximately 200 mm), low mass (approximately 3 kg), and low electrical power consumption (<20 W) by employing athermal optics and a shutterless system. The CIRC can consequently be mounted on multiple satellites to enable high-frequency observation; installation of CIRCs on ALOS-2 and on the JEM/CALET is expected to increase observation frequency. We present the initial check-out results of the CIRC onboard ALOS-2. Since the initial check-out phase (July 4-14, 2014), the CIRC has acquired images of Earth and was demonstrated to function according to its intended design. After the early calibration and validation phase, which confirmed the temperature accuracy of the observed data, CIRC data have been available to the public from January 2015 onward. We also present observational results on wildfires, volcanoes, and heat islands.

  5. Planetcam: A Visible And Near Infrared Lucky-imaging Camera To Study Planetary Atmospheres And Solar System Objects

    NASA Astrophysics Data System (ADS)

    Sanchez-Lavega, Agustin; Rojas, J.; Hueso, R.; Perez-Hoyos, S.; de Bilbao, L.; Murga, G.; Ariño, J.; Mendikoa, I.

    2012-10-01

PlanetCam is a two-channel fast-acquisition and low-noise camera designed for multispectral study of the atmospheres of the planets (Venus, Mars, Jupiter, Saturn, Uranus and Neptune) and the satellite Titan at high temporal and spatial resolution, simultaneously in visible (0.4-1 μm) and NIR (1-2.5 μm) channels. This is accomplished by means of a dichroic beam splitter that separates the two beams and directs them onto two different detectors. Each detector has filter wheels matched to the characteristic absorption bands of each planetary atmosphere. Images are acquired and processed using the "lucky imaging" technique, in which several thousand images of the same object are obtained in a short time interval, coregistered, and ranked by image quality to reconstruct a high-resolution, ideally diffraction-limited image of the object. These images will also be calibrated in terms of intensity and absolute reflectivity. The camera will be tested at the 50.2 cm telescope of the Aula EspaZio Gela (Bilbao) and then commissioned at the 1.05 m telescope at Pic du Midi Observatory (France) and at the 1.23 m telescope at Calar Alto Observatory in Spain. Among the initially planned research targets are: (1) the vertical structure of the clouds and hazes in the planets and their scales of variability; (2) the meteorology, dynamics and global winds in the planets and their scales of variability. PlanetCam is also expected to perform studies of other Solar System and astrophysical objects. Acknowledgments: This work was supported by the Spanish MICIIN project AYA2009-10701 with FEDER funds, by Grupos Gobierno Vasco IT-464-07 and by Universidad País Vasco UPV/EHU through program UFI11/55.

  6. Infrared On-Orbit RCC Inspection With the EVA IR Camera: Development of Flight Hardware From a COTS System

    NASA Technical Reports Server (NTRS)

    Gazanik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Jenkins, Rusty; Yates, Rusty; Stephan, Ryan

    2005-01-01

In November 2004, NASA's Space Shuttle Program approved the development of the Extravehicular Activity (EVA) Infrared (IR) Camera to test the application of infrared thermography to on-orbit reinforced carbon-carbon (RCC) damage detection. A multi-center team composed of members from NASA's Johnson Space Center (JSC), Langley Research Center (LaRC), and Goddard Space Flight Center (GSFC) was formed to develop the camera system and plan a flight test. The initial development schedule called for delivery of the system in time to support STS-115 in late 2005. At the request of Shuttle Program managers and the flight crews, the team accelerated its schedule and delivered a certified EVA IR Camera system in time to support STS-114 in July 2005 as a contingency. The development of the camera system, led by LaRC, was based on the commercial off-the-shelf (COTS) FLIR S65 handheld infrared camera. An assessment of the S65 system with regard to space-flight operation was critical to the project. This paper discusses the space-flight assessment and describes the significant modifications required for EVA use by the astronaut crew. The on-orbit inspection technique will be demonstrated during the third EVA of STS-121 in September 2005 by imaging damaged RCC samples mounted in a box in the Shuttle's cargo bay.

  7. Space imaging infrared optical guidance for autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu

    2008-08-01

We have developed the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, based on an uncooled infrared camera and a focusing technique, to detect objects to be avoided and to set the drive path. For this purpose we built a servomotor drive system to control the focus of the infrared camera lens. To determine the best focus position we use auto-focus image processing based on the four-term Daubechies wavelet transform, and the best focus position is then converted to the distance of the object. We built an aluminum-frame ground vehicle, 900 mm long and 800 mm wide, to carry the auto-focus infrared unit. The vehicle has Ackermann front steering and a rear motor drive. To confirm the guidance ability of the system, we conducted experiments on the detection of an actual car on the road and of a roadside wall by the infrared auto-focus unit. The auto-focus image processing based on the Daubechies wavelet transform detects the best-focus image clearly and gives the depth of the object from the infrared camera unit.
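The wavelet focus metric can be illustrated with the four-tap Daubechies filter the authors mention: the energy of the high-pass (detail) coefficients peaks when the image is in focus. This is a simplified row-wise 1-D sketch, not the vehicle's actual processing chain:

```python
import numpy as np

SQRT3 = np.sqrt(3.0)
# 4-tap Daubechies scaling filter and its quadrature-mirror high-pass filter
H = np.array([1 + SQRT3, 3 + SQRT3, 3 - SQRT3, 1 - SQRT3]) / (4 * np.sqrt(2))
G = H[::-1] * np.array([1, -1, 1, -1])

def focus_measure(img):
    """Energy of row-wise Daubechies-4 detail coefficients: high for sharp images."""
    detail = np.apply_along_axis(lambda r: np.convolve(r, G, mode='valid')[::2], 1, img)
    return float((detail ** 2).sum())

def best_focus(images):
    """Index of the sharpest image in a focus sweep (the best lens position)."""
    return int(np.argmax([focus_measure(im) for im in images]))
```

Sweeping the lens with the servomotor, evaluating this measure per frame, and taking the maximum gives the best focus position, which a lens calibration then maps to object distance.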

  8. Star Formation as Seen by the Infrared Array Camera on Spitzer

    NASA Technical Reports Server (NTRS)

    Smith, Howard A.; Allen, L.; Megeath, T.; Barmby, P.; Calvet, N.; Fazio, G.; Hartmann, L.; Myers, P.; Marengo, M.; Gutermuth, R.

    2004-01-01

The Infrared Array Camera (IRAC) onboard Spitzer has imaged regions of star formation (SF) in its four IR bands with spatial resolution of approximately 2"/pixel. IRAC is sensitive enough to detect very faint, embedded young stars at levels of tens of μJy, and IRAC photometry can categorize their stages of development: from young protostars with infalling envelopes (Class 0/I), to stars whose infrared excesses derive from accreting circumstellar disks (Class II), to evolved stars dominated by photospheric emission. The IRAC images also clearly reveal and help diagnose associated regions of shocked and/or PDR emission in the clouds; we find that existing models provide a good start at explaining the continuum of the SF regions IRAC observes.

  9. CANICA: The Cananea Near-Infrared Camera at the 2.1 m OAGH Telescope

    NASA Astrophysics Data System (ADS)

    Carrasco, L.; Hernández Utrera, O.; Vázquez, S.; Mayya, Y. D.; Carrasco, E.; Pedraza, J.; Castillo-Domínguez, E.; Escobedo, G.; Devaraj, R.; Luna, A.

    2017-10-01

The Cananea near-infrared camera (CANICA) is an instrument commissioned at the 2.12 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA operates in the near-infrared in multiple bands, including the J (1.24 μm), H (1.63 μm) and K' (2.12 μm) broad bands. CANICA is located at the Ritchey-Chrétien focal plane of the telescope, reimaging the f/12 beam into an f/6 beam. The detector is a 1024 × 1024 HgCdTe HAWAII array with 18.5 μm pixels, covering a field of view of 5.5 × 5.5 arcmin2 at a plate scale of 0.32 arcsec/pixel. The camera is enclosed in a cryostat cooled with liquid nitrogen to 77 K. The cryostat contains the collimator, two 15-position filter wheels, a single fixed reimaging optic, and the detector.

  10. Object Occlusion Detection Using Automatic Camera Calibration for a Wide-Area Video Surveillance System

    PubMed Central

    Jung, Jaehoon; Yoon, Inhye; Paik, Joonki

    2016-01-01

This paper presents an object occlusion detection algorithm using object depth information estimated by automatic camera calibration. The object occlusion problem is a major factor degrading the performance of object tracking and recognition. To detect an object occlusion, the proposed algorithm consists of three steps: (i) automatic camera calibration using both moving objects and a background structure; (ii) object depth estimation; and (iii) detection of occluded regions. The proposed algorithm estimates the depth of the object without extra sensors, using only a generic red, green and blue (RGB) camera. As a result, the proposed algorithm can be applied to improve the performance of object tracking and object recognition algorithms for video surveillance systems. PMID:27347978

  11. SLR digital camera for forensic photography

    NASA Astrophysics Data System (ADS)

    Har, Donghwan; Son, Youngho; Lee, Sungwon

    2004-06-01

Forensic photography, systematically established in the late 19th century by Alphonse Bertillon of France, has developed considerably over roughly 100 years. That development will accelerate further with high technologies, in particular digital technology. This paper reviews three studies to answer the question: can the SLR digital camera replace traditional silver halide ultraviolet and infrared photography? 1. Comparison of the relative ultraviolet and infrared sensitivity of the SLR digital camera to silver halide photography. 2. How much is ultraviolet or infrared sensitivity improved when the UV/IR cutoff filter built into the SLR digital camera is removed? 3. Comparison of the relative sensitivity of CCD and CMOS sensors for ultraviolet and infrared. The tests showed that the SLR digital camera has very low sensitivity for ultraviolet and infrared; the cause was found to be the UV/IR cutoff filter mounted in front of the image sensor. Removing the UV/IR cutoff filter significantly improved the sensitivity for ultraviolet and infrared. For infrared in particular, the sensitivity of the SLR digital camera was better than that of silver halide film. This shows the possibility of replacing silver halide ultraviolet and infrared photography with the SLR digital camera. Thus, the SLR digital camera seems useful for forensic photography, which deals with many ultraviolet and infrared photographs.

  12. Design of an infrared camera based aircraft detection system for laser guide star installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, H.; Macintosh, B.

    1996-03-05

There have been incidents in which the irradiance from laser guide stars has temporarily blinded pilots or passengers of aircraft. An aircraft detection system based on passive near-infrared cameras (instead of active radar) is described in this report.

  13. The Near-Earth Object Camera

    NASA Astrophysics Data System (ADS)

    Mainzer, Amy K.; NEOCam Science Team

    2017-10-01

The Near-Earth Object Camera (NEOCam) is a NASA mission in formulation designed to find, track, and provide basic physical characterization of asteroids and comets that make close approaches to Earth. Its goal is to reduce the risk of impacts from undetected near-Earth objects (NEOs) capable of causing global and regional disasters. NEOCam consists of a 50 cm telescope operating in two channels dominated by NEO thermal emission, 4.2-5.0 μm and 6-10 μm, in order to better constrain the objects' temperatures and diameters. Orbiting the Sun-Earth L1 Lagrange point, the mission would find hundreds of thousands of NEOs and would make significant progress toward the Congressional objective of discovering more than 90% of NEOs larger than 140 m during its five-year lifetime. The mission uses novel 2048x2048 HgCdTe detectors that extend the wavelength cutoff beyond 10 μm at an operating temperature of 40 K (Dorn et al. 2016). Both the optical system and the detectors are cooled passively using radiators and thermal shields to enable long mission life and to avoid the complexity of cryocoolers or cryogens. NEOCam is currently in an extended Phase A.

  14. Infrared Camera Diagnostic for Heat Flux Measurements on NSTX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. Mastrovito; R. Maingi; H.W. Kugel

    2003-03-25

An infrared imaging system has been installed on NSTX (National Spherical Torus Experiment) at the Princeton Plasma Physics Laboratory to measure the surface temperatures on the lower divertor and center stack. The imaging system is based on an Indigo Alpha 160 x 128 microbolometer camera with 12 bits/pixel, operating in the 7-13 μm range with a 30 Hz frame rate and a dynamic temperature range of 0-700 °C. From these data and knowledge of graphite thermal properties, the heat flux is derived with a classic one-dimensional conduction model. Preliminary results of heat flux scaling are reported.
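
    The one-dimensional conduction inversion mentioned above can be sketched with the classic Cook-Felderman discretization, which recovers surface heat flux from a sampled surface-temperature history. This is an illustrative sketch, not the NSTX analysis code; the graphite thermal properties are assumed round numbers.

    ```python
    import numpy as np

    def heat_flux_1d(t, T, k=100.0, rho=1800.0, c=700.0):
        """Cook-Felderman inversion: surface heat flux (W/m^2) of a
        semi-infinite solid from its surface-temperature history.
        k [W/m/K], rho [kg/m^3], c [J/kg/K] are assumed graphite values."""
        krc = k * rho * c
        q = np.zeros_like(T, dtype=float)
        for n in range(1, len(t)):
            dT = T[1:n + 1] - T[:n]                       # temperature increments
            denom = np.sqrt(t[n] - t[:n]) + np.sqrt(t[n] - t[1:n + 1])
            q[n] = 2.0 * np.sqrt(krc / np.pi) * np.sum(dT / denom)
        return q

    # Consistency check: a constant flux q0 on a semi-infinite solid gives
    # T(t) = 2*q0*sqrt(t / (pi*k*rho*c)); the inversion should recover q0.
    t = np.linspace(0.0, 1.0, 1001)
    q0 = 1.0e6
    T = 2.0 * q0 * np.sqrt(t / (np.pi * 100.0 * 1800.0 * 700.0))
    q = heat_flux_1d(t, T)
    ```

    The check at the bottom inverts the analytic constant-flux solution; agreement of the recovered flux with q0 to within a few percent validates the discretization.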

  15. SPARTAN Near-IR Camera | SOAR

    Science.gov Websites

The Spartan Infrared Camera is a high spatial resolution near-IR imager at SOAR. Spartan has a focal plane consisting of four …

  16. Alignment and Performance of the Infrared Multi-Object Spectrometer

    NASA Technical Reports Server (NTRS)

Connelly, Joseph A.; Ohl, Raymond G.; Mentzell, J. Eric; Madison, Timothy J.; Hylan, Jason E.; Mink, Ronald G.; Saha, Timo T.; Tveekrem, June L.; Sparr, Leroy M.; Chambers, V. John; et al.

    2004-01-01

The Infrared Multi-Object Spectrometer (IRMOS) is a principal investigator class instrument for the Kitt Peak National Observatory 4 m and 2.1 m telescopes. IRMOS is a near-IR (0.8-2.5 micron) spectrometer with low- to mid-resolving power (R = 300-3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arc-min field of view (4 m telescope) using a commercial Micro Electro-Mechanical Systems (MEMS) micro-mirror array (MMA) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems: the focal reducer images the focal plane of the telescope onto the MMA field stop, and the spectrograph images the MMA onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and ambient imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve as a qualitative alignment guide, and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides verification of the optomechanical alignment method and a measurement of near-angle scattered light due to mirror small-scale surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides a spectral line at 546.1 nanometers, a blackbody source provides a line at 1550 nanometers, and a CCD camera and an IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard and instrument-level test results validate this prediction. We conclude with an instrument performance prediction for cryogenic operation and first light in late 2003.

  17. Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera

    NASA Astrophysics Data System (ADS)

    Dziri, Aziz; Duranton, Marc; Chapuis, Roland

    2016-07-01

    Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.
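
    The data-association step at the heart of such a tracking pipeline can be illustrated with a greedy IoU matcher between existing track boxes and new detections. This is a generic sketch under an assumed (x1, y1, x2, y2) box format, not the authors' pipeline, which additionally handles occlusions between objects.

    ```python
    def iou(a, b):
        """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
        area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
        union = area(a) + area(b) - inter
        return inter / union if union > 0 else 0.0

    def associate(tracks, detections, min_iou=0.3):
        """Greedily match track boxes to detection boxes by descending IoU.
        Returns a list of (track_index, detection_index) pairs."""
        pairs = sorted(((iou(t, d), i, j)
                        for i, t in enumerate(tracks)
                        for j, d in enumerate(detections)), reverse=True)
        matches, used_t, used_d = [], set(), set()
        for score, i, j in pairs:
            if score < min_iou:
                break                      # remaining pairs overlap too little
            if i not in used_t and j not in used_d:
                matches.append((i, j))
                used_t.add(i)
                used_d.add(j)
        return matches

    # Two tracks, two detections that have swapped list order between frames.
    tracks = [(0, 0, 10, 10), (20, 20, 30, 30)]
    dets = [(21, 21, 31, 31), (1, 1, 11, 11)]
    m = associate(tracks, dets)
    ```

    Greedy matching is O(n² log n) per frame, which is one reason such pipelines can stay real-time on hardware like a Raspberry Pi; the Hungarian algorithm is the heavier, globally optimal alternative.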

  18. Don't get burned: thermal monitoring of vessel sealing using a miniature infrared camera

    NASA Astrophysics Data System (ADS)

    Lin, Shan; Fichera, Loris; Fulton, Mitchell J.; Webster, Robert J.

    2017-03-01

    Miniature infrared cameras have recently come to market in a form factor that facilitates packaging in endoscopic or other minimally invasive surgical instruments. If absolute temperature measurements can be made with these cameras, they may be useful for non-contact monitoring of electrocautery-based vessel sealing, or other thermal surgical processes like thermal ablation of tumors. As a first step in evaluating the feasibility of optical medical thermometry with these new cameras, in this paper we explore how well thermal measurements can be made with them. These cameras measure the raw flux of incoming IR radiation, and we perform a calibration procedure to map their readings to absolute temperature values in the range between 40 and 150 °C. Furthermore, we propose and validate a method to estimate the spatial extent of heat spread created by a cautery tool based on the thermal images.
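
    A calibration of the kind described, mapping raw readings to absolute temperature against a blackbody reference, might be sketched as a least-squares polynomial fit. The counts and reference temperatures below are hypothetical, and the quadratic model order is an assumption, not the paper's procedure.

    ```python
    import numpy as np

    # Hypothetical calibration pairs: raw sensor counts recorded while imaging
    # a blackbody reference held at known temperatures in the 40-150 C range.
    raw_counts = np.array([5400., 7000., 9500., 12000., 15000., 18500.])
    ref_temp_C = np.array([40.4, 56.7, 80.5, 102.4, 126.3, 150.7])

    # Fit a quadratic counts -> temperature map (model order is an assumption).
    coeffs = np.polyfit(raw_counts, ref_temp_C, deg=2)

    def counts_to_temp(counts):
        """Map raw IR readings to absolute temperature (C) via the fitted map."""
        return np.polyval(coeffs, counts)
    ```

    In practice the fit would be repeated across ambient conditions, since microbolometer response drifts with camera body temperature.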

  19. Preliminary optical design of PANIC, a wide-field infrared camera for CAHA

    NASA Astrophysics Data System (ADS)

    Cárdenas, M. C.; Rodríguez Gómez, J.; Lenzen, R.; Sánchez-Blanco, E.

    2008-07-01

In this paper, we present the preliminary optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Calar Alto 2.2 m telescope. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. A mosaic of four Teledyne Hawaii-2RG 2k x 2k arrays is used as the detector and will give a field of view of 31.9 arcmin x 31.9 arcmin. This cryogenic instrument has been optimized for the Y, J, H and K bands. Special care has been taken in the selection of the standard IR materials used for the optics in order to maximize the instrument throughput and to include the z band. The main challenges of this design are: producing a well-defined internal pupil that allows the thermal background to be reduced by a cryogenic pupil stop; correcting the off-axis aberrations due to the large field available; correcting the chromatic aberration arising from the wide spectral coverage; and allowing the introduction of narrow-band filters (~1%) into the system while minimizing degradation of the filter passband, without a collimated stage in the camera. We show the optomechanical error budget and compensation strategy that allow our as-built design to meet the required performance from an optical point of view. Finally, we demonstrate the flexibility of the design by showing the performance of PANIC at the CAHA 3.5 m telescope.

  20. AMICA: The First camera for Near- and Mid-Infrared Astronomical Imaging at Dome C

    NASA Astrophysics Data System (ADS)

    Straniero, O.; Dolci, M.; Valentini, A.; Valentini, G.; di Rico, G.; Ragni, M.; Giuliani, C.; di Cianno, A.; di Varano, I.; Corcione, L.; Bortoletto, F.; D'Alessandro, M.; Magrin, D.; Bonoli, C.; Giro, E.; Fantinel, D.; Zerbi, F. M.; Riva, A.; de Caprio, V.; Molinari, E.; Conconi, P.; Busso, M.; Tosti, G.; Abia, C. A.

AMICA (Antarctic Multiband Infrared CAmera) is an instrument designed to perform astronomical imaging in the near- (1-5 μm) and mid- (5-27 μm) infrared wavelength regions. Equipped with two detectors, an InSb 256 x 256 array and a Si:As 128 x 128 IBC array, cooled to 35 and 7 K respectively, it will be the first instrument to investigate the potential of the Italian-French base Concordia for IR astronomy. The main technical challenge is represented by the extreme conditions of Dome C (T ~ -90 °C, p ~ 640 mbar). An environmental control system ensures the correct start-up, shut-down, and housekeeping of the various components of the camera. AMICA will be mounted on the IRAIT telescope and will perform survey-mode observations of the Southern sky. Its first task is to provide important site-quality data; substantial contributions to fundamental astrophysical questions, such as those related to the late phases of stellar evolution and to star formation processes, are also expected.

  1. NIRAC: Near Infrared Airglow Camera for the International Space Station

    NASA Astrophysics Data System (ADS)

    Gelinas, L. J.; Rudy, R. J.; Hecht, J. H.

    2017-12-01

NIRAC is a space-based infrared airglow imager that will be deployed to the International Space Station in late 2018 under the auspices of the Space Test Program. NIRAC will survey OH airglow emissions in the 1.6 micron wavelength regime, exploring the spatial and temporal variability of emission intensities at latitudes from 51° south to 51° north. Atmospheric perturbations in the 80-100 km altitude range, including those produced by atmospheric gravity waves (AGWs), are observable in the OH airglow. The objective of the NIRAC experiment is to make near-global measurements of the OH airglow and airglow perturbations. These emissions also provide a bright source of illumination at night, allowing for nighttime detection of clouds and surface characteristics. The instrument, developed by the Aerospace Space Science Applications Laboratory, employs a space-compatible FPGA for camera control and data collection and a novel, custom optical system to eliminate image smear due to orbital motion. NIRAC utilizes a high-performance, large-format infrared focal plane array, transitioning technology used in the existing Aerospace Corporation ground-based airglow imager to a space-based platform. The high-sensitivity, four-megapixel imager has a native spatial resolution of 100 meters at ISS altitudes. The 23° x 23° FOV sweeps out a 150 km swath of the OH airglow layer as viewed from the ISS, and is sensitive to OH intensity perturbations down to 0.1%. The detector has a 1.7 micron cutoff that precludes the need for cold optics and reduces cooling requirements (to 180 K). Detector cooling is provided by a compact, lightweight cryocooler capable of reaching 120 K, providing a great deal of margin.

  2. Expanded opportunities of THz passive camera for the detection of concealed objects

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

    2013-10-01

Among security problems, the detection of objects implanted into the human or animal body is a pressing one. At present the main tool for detecting such objects is X-ray imaging, but X-rays are ionizing radiation and therefore cannot be used frequently. An alternative approach is passive THz imaging. In our opinion, a passive THz camera can help detect objects implanted in the human body under certain conditions. The physical basis for this possibility is the temperature trace on the human skin that results from the difference in temperature between the object and the surrounding parts of the body. Modern passive THz cameras do not have sufficient temperature resolution to see this difference; we therefore use computer processing to enhance the effective resolution of the passive THz camera for this application. After computer processing of images captured by the TS4 passive THz camera, developed by ThruVision Systems Ltd., we can see a pronounced temperature trace on the skin from water drunk, or food eaten, by a person. Nevertheless, many difficulties remain before this problem is fully solved. We also illustrate the improvement in quality of images captured by commercially available passive THz cameras using computer processing: in some cases, noise can be fully suppressed without loss of image quality, and THz images of objects concealed on the human body can be improved many times over. Consequently, the effective resolution of such a device may be increased without additional engineering effort.

  3. Systems and methods for maintaining multiple objects within a camera field-of-view

    DOEpatents

    Gans, Nicholas R.; Dixon, Warren

    2016-03-15

In one embodiment, a system and method for maintaining objects within a camera's field of view involve identifying constraints to be enforced, each relating to an attribute of the viewed objects; assigning the constraints priority ranks such that more important constraints have higher priority than less important ones; and determining the set of solutions that satisfy the constraints in order of priority rank, such that solutions satisfying lower-ranking constraints are considered viable only if they also satisfy all higher-ranking constraints. Each solution indicates how to control the camera to keep the objects within its field of view.
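
    The priority-ranked filtering described in the abstract can be sketched as a sequential narrowing of the candidate set, where a lower-ranking constraint is only allowed to restrict solutions that already satisfy every higher-ranking one. The zoom-level candidates and predicates below are hypothetical, for illustration only.

    ```python
    def prioritized_filter(candidates, constraints):
        """Filter candidate camera commands through constraints ordered from
        highest to lowest priority. A lower-ranking constraint may only narrow
        the set of solutions that satisfy every higher-ranking constraint; if
        no candidate satisfies it, that constraint is skipped."""
        viable = list(candidates)
        for satisfies in constraints:
            narrowed = [c for c in viable if satisfies(c)]
            if narrowed:
                viable = narrowed
        return viable

    # Hypothetical example: candidate zoom levels, with "keep all objects in
    # frame" ranked above a conflicting "prefer close-up" preference.
    zooms = [1.0, 1.5, 2.0, 2.5, 3.0]
    keeps_all_in_view = lambda z: z <= 2.0   # higher-priority constraint
    close_up = lambda z: z >= 2.5            # lower-priority, conflicts
    result = prioritized_filter(zooms, [keeps_all_in_view, close_up])
    ```

    Here the close-up preference cannot be met without violating the higher-ranking visibility constraint, so it is dropped and the higher-ranking constraint alone determines the viable set.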

  4. Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Che; Huang, Chung-Lin

    2013-03-01

This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human subjects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. The three main concerns of the algorithm are (1) imagery of each subject's face for biometric purposes, (2) optimal video quality of the human subjects, and (3) minimum hand-off time. We define an objective function based on expected capture conditions such as the camera-subject distance, pan-tilt angles of capture, face visibility, and others. This objective function serves to balance the number of captures per subject against the quality of the captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
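
    An objective function trading off capture conditions of the kind listed above might look like the following sketch. The weights, normalizations, and score terms are hypothetical, not the paper's actual function; the last term illustrates one way to balance capture count against capture quality.

    ```python
    import math

    def capture_score(dist_m, pan_deg, tilt_deg, face_visible, n_captures,
                      w_dist=0.4, w_angle=0.3, w_face=0.2, w_fair=0.1):
        """Illustrative capture-condition objective (higher is better).
        All weights and normalization constants are hypothetical."""
        s_dist = math.exp(-dist_m / 10.0)                        # prefer nearby subjects
        s_angle = 1.0 - (abs(pan_deg) + abs(tilt_deg)) / 180.0   # prefer frontal views
        s_face = 1.0 if face_visible else 0.0                    # biometric value
        s_fair = 1.0 / (1.0 + n_captures)                        # prefer under-captured subjects
        return (w_dist * s_dist + w_angle * s_angle
                + w_face * s_face + w_fair * s_fair)

    # A nearby, frontal, face-visible subject should outscore a distant,
    # averted, already well-captured one.
    good = capture_score(5.0, 10.0, 5.0, True, 0)
    poor = capture_score(30.0, 60.0, 20.0, False, 3)
    ```

    The camera scheduler would then assign each PTZ camera to the subject maximizing this score, subject to hand-off time limits.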

  5. Research on camera on orbit radial calibration based on black body and infrared calibration stars

    NASA Astrophysics Data System (ADS)

    Wang, YuDu; Su, XiaoFeng; Zhang, WanYing; Chen, FanSheng

    2018-05-01

The response of a space camera is attenuated by the launch process and the space environment, so on-orbit radiometric calibration is necessary. In this paper, we propose a calibration method based on accurate infrared standard stars to increase the precision of infrared radiation measurement. Because stars can be treated as point targets, we use them as the radiometric calibration source and establish a Taylor-expansion method and an energy-extrapolation model based on the WISE and 2MASS catalogs; we then update the calibration results obtained from the blackbody. Finally, the calibration mechanism is designed and the design is verified by an on-orbit test. The experimental result shows that the irradiance extrapolation error is about 3% and the accuracy of the calibration methods is about 10%, indicating that the methods satisfy the requirements of on-orbit calibration.

  6. Quantifying seasonal variation of leaf area index using near-infrared digital camera in a rice paddy

    NASA Astrophysics Data System (ADS)

    Hwang, Y.; Ryu, Y.; Kim, J.

    2017-12-01

Digital cameras have been widely used to quantify leaf area index (LAI), and numerous simple, automatic methods have been proposed to improve digital-camera-based LAI estimates. However, most studies in rice paddies have relied on arbitrary thresholds or complex radiative transfer models to make binary images, and only a few studies have reported continuous, automatic observation of LAI over the season in a rice paddy. The objective of this study is to quantify seasonal variations of LAI using raw near-infrared (NIR) images coupled with a histogram shape-based algorithm in a rice paddy. As vegetation strongly reflects NIR light, we installed a NIR digital camera 1.8 m above the ground surface and acquired unsaturated raw-format images at one-hour intervals at solar zenith angles between 15° and 80° over the entire growing season in 2016 (May to September). We applied a sub-pixel classification combined with a light-scattering correction method. Finally, to confirm the accuracy of the quantified LAI, we also conducted direct (destructive sampling) and indirect (LAI-2200) manual observations of LAI once per ten days on average. Preliminary results show that NIR-derived LAI agreed well with in-situ observations, but divergence tended to appear once the rice canopy was fully developed. Continuous monitoring of LAI in rice paddies will help us better understand carbon and water fluxes and evaluate satellite-based LAI products.
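
    A histogram shape-based binarization followed by a gap-fraction inversion can be sketched as follows. Otsu's method stands in here for the paper's histogram shape-based algorithm, and the Beer-Lambert extinction coefficient k is an assumed value, not one from the study.

    ```python
    import numpy as np

    def otsu_threshold(img, nbins=256):
        """Histogram shape-based threshold: maximize between-class variance."""
        hist, edges = np.histogram(img, bins=nbins)
        p = hist.astype(float) / hist.sum()
        centers = 0.5 * (edges[:-1] + edges[1:])
        w0 = np.cumsum(p)                       # weight of the dark class
        m = np.cumsum(p * centers)              # cumulative first moment
        mT = m[-1]                              # global mean
        with np.errstate(divide="ignore", invalid="ignore"):
            var_between = (mT * w0 - m) ** 2 / (w0 * (1.0 - w0))
        var_between[~np.isfinite(var_between)] = 0.0
        return centers[np.argmax(var_between)]

    def lai_from_nir(img, k=0.5):
        """LAI from a NIR frame via gap fraction and Beer-Lambert inversion.
        k is an assumed canopy extinction coefficient."""
        thr = otsu_threshold(img)
        gap_fraction = max(float(np.mean(img <= thr)), 1e-6)  # dark pixels = gaps
        return -np.log(gap_fraction) / k

    # Synthetic bimodal "NIR frame": 70% dark background, 30% bright canopy.
    rng = np.random.default_rng(0)
    img = np.concatenate([rng.normal(50, 5, 7000), rng.normal(200, 10, 3000)])
    thr = otsu_threshold(img)
    lai = lai_from_nir(img)
    ```

    With well-separated modes the threshold falls in the histogram valley, so the gap fraction equals the dark-pixel fraction and the LAI follows directly from the inversion.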

  7. CIRCE: The Canarias InfraRed Camera Experiment for the Gran Telescopio Canarias

    NASA Astrophysics Data System (ADS)

    Eikenberry, Stephen S.; Charcos, Miguel; Edwards, Michelle L.; Garner, Alan; Lasso-Cabrera, Nestor; Stelter, Richard D.; Marin-Franch, Antonio; Raines, S. Nicholas; Ackley, Kendall; Bennett, John G.; Cenarro, Javier A.; Chinn, Brian; Donoso, H. Veronica; Frommeyer, Raymond; Hanna, Kevin; Herlevich, Michael D.; Julian, Jeff; Miller, Paola; Mullin, Scott; Murphey, Charles H.; Packham, Chris; Varosi, Frank; Vega, Claudia; Warner, Craig; Ramaprakash, A. N.; Burse, Mahesh; Punnadi, Sunjit; Chordia, Pravin; Gerarts, Andreas; Martín, Héctor De Paz; Calero, María Martín; Scarpa, Riccardo; Acosta, Sergio Fernandez; Sánchez, William Miguel Hernández; Siegel, Benjamin; Pérez, Francisco Francisco; Martín, Himar D. Viera; Losada, José A. Rodríguez; Nuñez, Agustín; Tejero, Álvaro; González, Carlos E. Martín; Rodríguez, César Cabrera; Sendra, Jordi Molgó; Rodriguez, J. Esteban; Cáceres, J. Israel Fernádez; García, Luis A. Rodríguez; Lopez, Manuel Huertas; Dominguez, Raul; Gaggstatter, Tim; Lavers, Antonio Cabrera; Geier, Stefan; Pessev, Peter; Sarajedini, Ata; Castro-Tirado, A. J.

The Canarias InfraRed Camera Experiment (CIRCE) is a near-infrared (1-2.5 μm) imager, polarimeter, and low-resolution spectrograph operating as a visitor instrument for the 10.4 m Gran Telescopio Canarias (GTC). It was designed and built largely by graduate students and postdocs, with help from the University of Florida (UF) astronomy engineering group, and is funded by UF and the US National Science Foundation. CIRCE is intended to help fill the gap in near-infrared capabilities prior to the arrival of the Espectrógrafo Multiobjeto Infrarrojo (EMIR) at the GTC, and will also provide the following scientific capabilities to complement EMIR after its arrival: high-resolution imaging, narrowband imaging, high-time-resolution photometry, imaging polarimetry, and low-resolution spectroscopy. In this paper, we review the design, fabrication, integration, lab testing, and on-sky performance results for CIRCE. These include a novel approach to the opto-mechanical design, fabrication, and alignment.

  8. Low-cost camera modifications and methodologies for very-high-resolution digital images

    USDA-ARS?s Scientific Manuscript database

    Aerial color and color-infrared photography are usually acquired at high altitude so the ground resolution of the photographs is < 1 m. Moreover, current color-infrared cameras and manned aircraft flight time are expensive, so the objective is the development of alternative methods for obtaining ve...

  9. NIRCam: Development and Testing of the JWST Near-Infrared Camera

    NASA Technical Reports Server (NTRS)

    Greene, Thomas; Beichman, Charles; Gully-Santiago, Michael; Jaffe, Daniel; Kelly, Douglas; Krist, John; Rieke, Marcia; Smith, Eric H.

    2011-01-01

The Near Infrared Camera (NIRCam) is one of the four science instruments of the James Webb Space Telescope (JWST). Its high-sensitivity, high-spatial-resolution images over the 0.6-5 micron wavelength region will be essential for making significant findings in many science areas as well as for aligning the JWST primary mirror segments and telescope. The NIRCam engineering test unit was recently assembled and has undergone successful cryogenic testing. The NIRCam collimator and camera optics and their mountings are also progressing, with a brass-board system demonstrating relatively low wavefront error across a wide field of view. The flight model's long-wavelength Si grisms have been fabricated, and its coronagraph masks are now being made. Both the short- (0.6-2.3 micron) and long- (2.4-5.0 micron) wavelength flight detectors show good performance and are undergoing final assembly and testing. The flight model subsystems should all be completed later this year through early 2011, and NIRCam will be cryogenically tested in the first half of 2011 before delivery to the JWST Integrated Science Instrument Module (ISIM).

  10. The Utility of Using a Near-Infrared (NIR) Camera to Measure Beach Surface Moisture

    NASA Astrophysics Data System (ADS)

    Nelson, S.; Schmutz, P. P.

    2017-12-01

Surface moisture content is an important factor that must be considered when studying aeolian sediment transport in a beach environment. A few different instruments and procedures are available for measuring surface moisture content (i.e. moisture probes, LiDAR, and gravimetric moisture data from surface scrapings); however, these methods can be inaccurate, costly, and impractical, particularly in the field. Near-infrared (NIR) spectral band imagery is another technique used to obtain moisture data. NIR imagery has predominantly been used in remote sensing and has yet to be used for ground-based measurements. Dry sand reflects infrared radiation from the sun, whereas wet sand absorbs it. This study therefore assesses the utility of measuring the surface moisture content of beach sand with a modified NIR camera. A traditional point-and-shoot digital camera was internally modified with the placement of a visible light-blocking filter. Images were taken of three different types of beach sand at controlled moisture content values, with sunlight as the source of infrared radiation. A technique was established through trial and error by comparing resultant histogram values in Adobe Photoshop across the various moisture conditions. The resultant IR absorption histogram values were calibrated to actual gravimetric moisture content from surface scrapings of the samples. Overall, the results illustrate that the NIR-spectrum modified camera does not provide the ability to adequately measure beach surface moisture content. There were, however, noted differences in IR absorption histogram values among the different sediment types. Sediment with darker quartz mineralogy provided larger variations in histogram values, but the technique is not sensitive enough to accurately represent low moisture percentages, which are of most importance when studying aeolian sediment transport.

  11. Photometry of Galactic and Extragalactic Far-Infrared Sources using the 91.5 cm Airborne Infrared Telescope

    NASA Technical Reports Server (NTRS)

    Harper, D. A.

    1996-01-01

    The objective of this grant was to construct a series of far infrared photometers, cameras, and supporting systems for use in astronomical observations in the Kuiper Airborne Observatory. The observations have included studies of galaxies, star formation regions, and objects within the Solar System.

  12. University Physics Students' Ideas of Thermal Radiation Expressed in Open Laboratory Activities Using Infrared Cameras

    ERIC Educational Resources Information Center

    Haglund, Jesper; Melander, Emil; Weiszflog, Matthias; Andersson, Staffan

    2017-01-01

    Background: University physics students were engaged in open-ended thermodynamics laboratory activities with a focus on understanding a chosen phenomenon or the principle of laboratory apparatus, such as thermal radiation and a heat pump. Students had access to handheld infrared (IR) cameras for their investigations. Purpose: The purpose of the…

  13. An optical design of the wide-field imaging and multi-object spectrograph for an Antarctic infrared telescope

    NASA Astrophysics Data System (ADS)

    Ichikawa, Takashi; Obata, Tomokazu

    2016-08-01

A design of the wide-field infrared camera (AIRC) for the Antarctic 2.5 m infrared telescope (AIRT) is presented. The off-axis design provides a 7'.5 x 7'.5 field of view with 0".22 pixel-1 over the 1-5 μm wavelength range for three simultaneous color bands, using cooled optics and three 2048x2048 InSb focal plane arrays. Good image quality is obtained over the entire field of view with practically no chromatic aberration; the image size corresponds to the diffraction limit of a 2.5 m telescope at wavelengths of 2 μm and longer. To exploit the stable atmosphere, the extremely low precipitable water vapor (PWV), the superb seeing quality, and the cadence of the polar winter at Dome Fuji on the Antarctic plateau, the camera will be dedicated to transit observations of exoplanets. A multi-object spectroscopic mode with low spectral resolution (R ~ 50-100) will be added for spectroscopic transit observations at 1-5 μm. This spectroscopic capability in the extremely low-PWV environment of Antarctica will be very effective for studying water vapor in the atmospheres of super-Earths.

  14. More than Meets the Eye - Infrared Cameras in Open-Ended University Thermodynamics Labs

    NASA Astrophysics Data System (ADS)

    Melander, Emil; Haglund, Jesper; Weiszflog, Matthias; Andersson, Staffan

    2016-12-01

    Educational research has found that students have challenges understanding thermal science. Undergraduate physics students have difficulties differentiating basic thermal concepts, such as heat, temperature, and internal energy. Engineering students have been found to have difficulties grasping surface emissivity as a thermal material property. One potential source of students' challenges with thermal science is the lack of opportunity to visualize energy transfer in intuitive ways with traditional measurement equipment. Thermodynamics laboratories have typically depended on point measures of temperature by use of thermometers (detecting heat conduction) or pyrometers (detecting heat radiation). In contrast, thermal imaging by means of an infrared (IR) camera provides a real-time, holistic image. Here we provide some background on IR cameras and their uses in education, and summarize five qualitative investigations that we have used in our courses.

  15. C-RED One : the infrared camera using the Saphira e-APD detector

    NASA Astrophysics Data System (ADS)

    Greffe, Timothée.; Feautrier, Philippe; Gach, Jean-Luc; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Boutolleau, David; Baker, Ian

    2016-08-01

First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise and very low background. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, a genuinely disruptive technology in imaging. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on board using an FPGA. We show its performance and describe its main features. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No. 673944.

  16. Estimating the Infrared Radiation Wavelength Emitted by a Remote Control Device Using a Digital Camera

    ERIC Educational Resources Information Center

    Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol

    2011-01-01

    The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)
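
    The estimate rests on the grating relation d sin(θ) = mλ: at small angles, fringe spacing on the sensor is proportional to wavelength, so the unknown wavelength follows from the ratio of the two measured spacings in the same image. A sketch with hypothetical pixel spacings:

    ```python
    def infer_wavelength_nm(lambda_ref_nm, spacing_ref_px, spacing_unknown_px):
        """Small-angle grating estimate: since fringe spacing is proportional
        to wavelength, the unknown wavelength follows from the spacing ratio
        measured against a reference of known wavelength."""
        return lambda_ref_nm * spacing_unknown_px / spacing_ref_px

    # Hypothetical measurements from one image containing both patterns:
    # red-laser (650 nm) fringes 88 px apart, remote-control fringes 127 px apart.
    lam_ir = infer_wavelength_nm(650.0, 88.0, 127.0)
    ```

    With these assumed numbers the estimate lands near 940 nm, consistent with the near-infrared LEDs commonly used in TV remote controls.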

  17. A user-friendly technical set-up for infrared photography of forensic findings.

    PubMed

    Rost, Thomas; Kalberer, Nicole; Scheurer, Eva

    2017-09-01

Infrared photography is of interest for use in forensic science and forensic medicine since it reveals findings that are almost invisible to the human eye. Originally, infrared photography was made possible by placing an infrared light transmission filter in front of the camera objective lens. However, this set-up is associated with many drawbacks, such as the loss of the autofocus function, the need for an external infrared source, and long exposure times that make a tripod necessary. These limitations have so far prevented the routine application of infrared photography in forensics. In this study, the use of a professional modification inside the digital camera body was evaluated with regard to camera handling and image quality. This permanent modification consisted of replacing the built-in infrared blocking filter with an infrared transmission filter of 700 nm or 830 nm, respectively. The application of this camera set-up to the photo-documentation of forensically relevant post-mortem findings was investigated in examples of trace evidence such as gunshot residues on the skin, in external findings, e.g. hematomas, as well as in an exemplary internal finding, i.e., Wischnewski spots in a putrefied stomach. The application of scattered light created by indirect flash yielded a more uniform illumination of the object, and the 700 nm filter produced better pictures than the 830 nm filter. Compared to pictures taken under visible light, infrared photographs generally yielded better contrast, allowing more details to be discerned and revealing findings that were not otherwise visible, such as imprints on a fabric and tattoos in mummified skin. The permanent modification of a digital camera by building in a 700 nm infrared transmission filter resulted in a user-friendly and efficient set-up suitable for daily forensic routine. Main advantages were a clear picture in the viewfinder and a working autofocus.

  18. Of Detection Limits and Effective Mitigation: The Use of Infrared Cameras for Methane Leak Detection

    NASA Astrophysics Data System (ADS)

    Ravikumar, A. P.; Wang, J.; McGuire, M.; Bell, C.; Brandt, A. R.

    2017-12-01

Mitigating emissions of methane, a short-lived and potent greenhouse gas, is critical to limiting global temperature rise to two degrees Celsius as outlined in the Paris Agreement. A major source of anthropogenic methane emissions in the United States is the oil and gas sector. To this end, state and federal governments have recommended the use of optical gas imaging (OGI) systems in periodic leak detection and repair (LDAR) surveys to detect fugitive emissions, or leaks. The most commonly used OGI systems are infrared cameras. In this work, we systematically evaluate the limits of infrared (IR) camera-based OGI systems for use in methane leak detection programs. We analyze the effect of various parameters that influence the minimum detectable leak rates of infrared cameras. Blind leak detection tests were carried out at the Department of Energy's MONITOR natural gas test facility in Fort Collins, CO. Leak sources included natural gas wellheads, separators, and tanks. With an EPA-mandated 60 g/hr leak detection threshold for IR cameras, we test leak rates ranging from 4 g/hr to over 350 g/hr at imaging distances between 5 ft and 70 ft from the leak source. We performed these experiments over the course of a week, encompassing a wide range of wind and weather conditions. Using repeated measurements at a given leak rate and imaging distance, we generate detection probability curves as a function of leak size for various imaging distances and measurement conditions. In addition, we estimate the median detection threshold - the leak size at which the probability of detection is 50% - under various scenarios to reduce uncertainty in mitigation effectiveness. Preliminary analysis shows that the median detection threshold varies from 3 g/hr at an imaging distance of 5 ft to over 150 g/hr at 50 ft (ambient temperature: 80 F, winds < 4 m/s). Results from this study can be directly used to improve OGI-based LDAR protocols and reduce uncertainty in estimated
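The detection probability curve and median detection threshold described in this abstract can be sketched from repeated pass/fail trials: bin the trials by leak rate, compute the empirical probability of detection per bin, and interpolate the 50% crossing. The trial counts below are illustrative, not the study's measurements:

```python
from collections import defaultdict

def detection_curve(trials):
    """Group repeated (leak_rate, detected) trials and return the empirical
    probability of detection for each leak rate, sorted by rate."""
    counts = defaultdict(lambda: [0, 0])  # rate -> [detections, trials]
    for rate, detected in trials:
        counts[rate][0] += int(detected)
        counts[rate][1] += 1
    return sorted((rate, hits / n) for rate, (hits, n) in counts.items())

def median_detection_threshold(curve):
    """Leak rate at which the probability of detection crosses 50%,
    by linear interpolation between the bracketing points."""
    for (r0, p0), (r1, p1) in zip(curve, curve[1:]):
        if p0 < 0.5 <= p1:
            return r0 + (0.5 - p0) * (r1 - r0) / (p1 - p0)
    return None

# Hypothetical trials at a single imaging distance (rates in g/hr).
trials = ([(5, False)] * 9 + [(5, True)] * 1 +
          [(20, False)] * 7 + [(20, True)] * 3 +
          [(50, False)] * 4 + [(50, True)] * 6 +
          [(100, False)] * 1 + [(100, True)] * 9)
curve = detection_curve(trials)
print(curve)                               # empirical detection probabilities
print(median_detection_threshold(curve))   # 0.5 crossing, here ~40 g/hr
```

Repeating this per imaging distance reproduces the family of curves the abstract describes.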

  19. Design of polarized infrared athermal telephoto objective for penetrating the fog

    NASA Astrophysics Data System (ADS)

    Gao, Duorui; Fu, Qiang; Zhao, Zhao; Zhao, Bin; Zhong, Lijun; Zhan, Juntong

    2014-11-01

Polarized infrared imaging is a new detection technique with the ability to see through fog, highlight targets, and recognize decoys; these characteristics give it a clear advantage in extending working distance in fog. Compared to traditional infrared imaging, polarized infrared imaging can distinguish target from background easily, which is its most distinctive feature. Owing to the large refractive index of infrared materials, temperature changes cause serious defocus, so an athermal infrared objective is necessary. On the other hand, conventional athermal objectives have a large total length and are hard to integrate because of their volume, whereas a telephoto objective offers a small volume and short total length. This paper introduces a design for a polarized, athermal infrared telephoto objective capable of imaging through fog. First, the optical power is assigned to the fore group and the rear group according to the telephoto principle, with positive power in the fore group and negative power in the rear group; then the optical power is distributed within each group to achieve athermalization; finally, computer-aided software is used to correct aberrations. To prove the feasibility of the scheme, an athermal optical system working at 8-12 µm was designed in ZEMAX, with a focal length of 150 mm, an F-number of 2, and a total objective length of 120 mm. Analysis over the environmental temperature range shows that the optical system has stable imaging quality, with MTF close to the diffraction limit. This telephoto objective is suitable for infrared polarized imaging.
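The fore/rear power split can be checked with a thin-lens sketch. The group focal lengths and separation below are illustrative values chosen to reproduce the stated 150 mm focal length and 120 mm total length; they are not the paper's actual design data:

```python
def telephoto(f1, f2, d):
    """Thin-lens model of a two-group telephoto objective.

    f1: focal length of the positive fore group (mm)
    f2: focal length of the negative rear group (mm)
    d:  separation between the two groups (mm)
    Returns (effective focal length, total length from fore group to image).
    """
    phi = 1.0 / f1 + 1.0 / f2 - d / (f1 * f2)  # combined optical power
    efl = 1.0 / phi
    bfl = efl * (1.0 - d / f1)                 # back focal distance behind rear group
    return efl, d + bfl

# Illustrative split: positive fore group, negative rear group.
efl, total = telephoto(f1=100.0, f2=-120.0, d=60.0)
print(efl, total)    # effective focal length 150 mm, total length 120 mm
print(total / efl)   # telephoto ratio 0.8 (< 1 is the compactness gain)
```

The telephoto ratio below 1 is exactly the "short total length" property the abstract relies on.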

  20. Long-Wavelength 640 x 486 GaAs/AlGaAs Quantum Well Infrared Photodetector Snap-Shot Camera

    NASA Technical Reports Server (NTRS)

    Gunapala, Sarath D.; Bandara, Sumith V.; Liu, John K.; Hong, Winn; Sundaram, Mani; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Carralejo, Ronald

    1998-01-01

A 9-micrometer-cutoff 640 x 486 snap-shot quantum well infrared photodetector (QWIP) camera has been demonstrated. The performance of this QWIP camera is reported, including indoor and outdoor imaging. A noise equivalent differential temperature (NEΔT) of 36 mK has been achieved at a 300 K background with f/2 optics. This is in good agreement with the expected focal plane array sensitivity given the practical limitations on charge handling capacity of the multiplexer, read noise, bias voltage, and operating temperature.

  1. Using turbulence scintillation to assist object ranging from a single camera viewpoint.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Coffaro, Joseph; Paulson, Daniel A; Rzasa, John R; Andrews, Larry C; Phillips, Ronald L; Crabbs, Robert; Davis, Christopher C

    2018-03-20

Image distortions caused by atmospheric turbulence are often treated as unwanted noise or errors in image processing studies. Our study, however, shows that in certain scenarios turbulence distortion can be very helpful in enhancing image processing results. This paper describes a novel approach that uses the scintillation traits recorded in a video clip to perform object ranging with reasonable accuracy from a single camera viewpoint. Conventionally, a single camera is confounded by the perspective viewing problem, where a large object far away looks the same as a small object close by. When the atmospheric turbulence phenomenon is considered, the edge or texture pixels of an object tend to scintillate and vary more with increased distance. This turbulence-induced signature can be quantitatively analyzed to achieve object ranging with reasonable accuracy. Despite the inevitable fact that turbulence causes random blurring and deformation of imaging results, it also offers convenient solutions to some remote sensing and machine vision problems that would otherwise be difficult.
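The ranging idea - edge pixels scintillate more with distance - can be sketched as a two-step procedure: measure the temporal intensity variance at edge pixels over a video clip, then invert a monotonic calibration curve. The frames, edge pixels, and calibration table here are all hypothetical:

```python
import statistics

def edge_scintillation_index(frames, edge_pixels):
    """Mean temporal intensity variance over the object's edge pixels.
    frames: list of 2-D intensity grids; edge_pixels: list of (row, col)."""
    per_pixel = []
    for (r, c) in edge_pixels:
        series = [frame[r][c] for frame in frames]
        per_pixel.append(statistics.pvariance(series))
    return statistics.fmean(per_pixel)

def range_from_scintillation(sigma2, calibration):
    """Invert a monotonic (scintillation variance -> distance) calibration
    table by linear interpolation between bracketing entries."""
    for (s0, d0), (s1, d1) in zip(calibration, calibration[1:]):
        if s0 <= sigma2 <= s1:
            return d0 + (sigma2 - s0) * (d1 - d0) / (s1 - s0)
    return None

# Toy clip: one edge pixel whose intensity flickers between frames.
frames = [[[1.0]], [[3.0]]]
sigma2 = edge_scintillation_index(frames, [(0, 0)])
calibration = [(0.5, 100.0), (2.0, 400.0), (5.0, 800.0)]  # hypothetical
print(range_from_scintillation(sigma2, calibration))       # distance estimate
```

The calibration table would in practice be built from objects at known distances under comparable turbulence conditions, which is where the "reasonable accuracy" caveat in the abstract comes from.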

  2. Space infrared telescope facility wide field and diffraction limited array camera (IRAC)

    NASA Technical Reports Server (NTRS)

    Fazio, G. G.

    1986-01-01

    IRAC focal plane detector technology was developed and studies of alternate focal plane configurations were supported. While any of the alternate focal planes under consideration would have a major impact on the Infrared Array Camera, it was possible to proceed with detector development and optical analysis research based on the proposed design since, to a large degree, the studies undertaken are generic to any SIRTF imaging instrument. Development of the proposed instrument was also important in a situation in which none of the alternate configurations has received the approval of the Science Working Group.

  3. Fixed-focus camera objective for small remote sensing satellites

    NASA Astrophysics Data System (ADS)

    Topaz, Jeremy M.; Braun, Ofer; Freiman, Dov

    1993-09-01

An athermalized objective has been designed for a compact, lightweight push-broom camera under development at El-Op Ltd. for use in small remote-sensing satellites. The high-performance objective has a fixed focus setting but maintains focus passively over the full range of temperatures encountered in small satellites. The lens is an F/5.0, 320 mm focal length Tessar type operating over the range 0.5-0.9 µm. It has a 16° field of view and accommodates various state-of-the-art silicon detector arrays. The design and performance of the objective are described in this paper.

  4. Determination of feature generation methods for PTZ camera object tracking

    NASA Astrophysics Data System (ADS)

    Doyle, Daniel D.; Black, Jonathan T.

    2012-06-01

Object detection and tracking using computer vision (CV) techniques have been widely applied to sensor fusion applications. Many papers continue to be written that speed up performance and increase the learning of artificially intelligent systems through improved algorithms, workload distribution, and information fusion. Military application of real-time tracking systems is becoming more and more complex, with an ever-increasing need for fusion and CV techniques to actively track and control dynamic systems. Examples include the use of metrology systems for tracking and measuring micro air vehicles (MAVs) and autonomous navigation systems for controlling MAVs. This paper seeks to contribute to the determination of the tracking algorithms that best track a moving object using a pan/tilt/zoom (PTZ) camera, applicable to both of the examples presented. The feature generation algorithms compared in this paper are the trained Scale-Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF), the Mixture of Gaussians (MoG) background subtraction method, the Lucas-Kanade optical flow method (2000), and the Farneback optical flow method (2003). The matching algorithm used for the trained feature generation algorithms is the Fast Library for Approximate Nearest Neighbors (FLANN). The BSD-licensed OpenCV library is used extensively to demonstrate the viability of each algorithm and its performance. Initial testing is performed on a sequence of images from a stationary camera. Further testing is performed on a sequence of images in which the PTZ camera moves in order to capture the moving object. Comparisons are made based upon accuracy, speed, and memory.
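As a rough illustration of the background-subtraction family compared in this record, here is a per-pixel running-average model in plain Python. It is a much-simplified stand-in for OpenCV's Mixture-of-Gaussians subtractor (a single mean per pixel instead of a mixture), not the paper's implementation:

```python
class RunningBackground:
    """Per-pixel running-average background model: a pixel is foreground
    when it deviates from the slowly updated background by more than a
    threshold. Frames are plain 2-D lists of intensities."""

    def __init__(self, first_frame, alpha=0.05, threshold=25):
        self.bg = [row[:] for row in first_frame]  # background estimate
        self.alpha = alpha                         # model update rate
        self.threshold = threshold                 # foreground cutoff

    def apply(self, frame):
        """Return a 0/1 foreground mask and update the background model."""
        mask = []
        for r, row in enumerate(frame):
            mask_row = []
            for c, v in enumerate(row):
                mask_row.append(1 if abs(v - self.bg[r][c]) > self.threshold else 0)
                # Blend the current frame into the background estimate.
                self.bg[r][c] = (1 - self.alpha) * self.bg[r][c] + self.alpha * v
            mask.append(mask_row)
        return mask

model = RunningBackground([[100, 100]])
print(model.apply([[200, 110]]))  # only the strongly changed pixel is flagged
```

With a stationary camera this is directly usable; for the PTZ case the paper studies, the frames would first have to be registered to compensate for camera motion.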

  5. Development of the compact infrared camera (CIRC) for Earth observation

    NASA Astrophysics Data System (ADS)

    Naitoh, Masataka; Katayama, Haruyoshi; Harada, Masatomo; Nakamura, Ryoko; Kato, Eri; Tange, Yoshio; Sato, Ryota; Nakau, Koji

    2017-11-01

The Compact Infrared Camera (CIRC) is an instrument equipped with an uncooled infrared array detector (microbolometer). We adopted a microbolometer because it does not require a cooling system such as a mechanical cooler, together with athermal optics, which do not require active thermal control. This reduces the size, cost, and electrical power consumption of the sensor. The main mission of the CIRC is to demonstrate technology for detecting wildfires, which are major and chronic disasters affecting many countries in the Asia-Pacific region. Carrying CIRCs on various satellites, taking advantage of their small size and light weight, would increase the observational frequency of wildfires. We have developed two CIRCs: the first will be launched in JFY 2013 onboard the Advanced Land Observing Satellite-2 (ALOS-2), and the second in JFY 2014 onboard the CALorimetric Electron Telescope (CALET) on the Japanese Experiment Module (JEM) of the International Space Station (ISS). We have finished the ground calibration of the first CIRC onboard ALOS-2. In this paper, we provide an overview of the CIRC and the results of its ground calibration.

  6. Teaching physics and understanding infrared thermal imaging

    NASA Astrophysics Data System (ADS)

    Vollmer, Michael; Möllmann, Klaus-Peter

    2017-08-01

Infrared thermal imaging is a rapidly evolving field. The latest trend is small smartphone IR camera accessories, making infrared imaging a widespread and well-known consumer product. Applications range from medical diagnosis and building inspections to industrial predictive maintenance and visualization in the natural sciences. Infrared cameras allow not only qualitative imaging and visualization but also quantitative measurements of the surface temperatures of objects. On the one hand, they are a particularly suitable tool for teaching optics, radiation physics, and selected topics in other fields of physics; on the other hand, there is an increasing need for engineers and physicists who understand these complex state-of-the-art photonics systems. Therefore students must also learn and understand the physics underlying these systems.

  7. Infrared Camera System for Visualization of IR-Absorbing Gas Leaks

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert; Immer, Christopher; Cox, Robert

    2010-01-01

    Leak detection and location remain a common problem in NASA and industry, where gas leaks can create hazardous conditions if not quickly detected and corrected. In order to help rectify this problem, this design equips an infrared (IR) camera with the means to make gas leaks of IR-absorbing gases more visible for leak detection and location. By comparing the output of two IR cameras (or two pictures from the same camera under essentially identical conditions and very closely spaced in time) on a pixel-by-pixel basis, one can cancel out all but the desired variations that correspond to the IR absorption of the gas of interest. This can be simply done by absorbing the IR lines that correspond to the gas of interest from the radiation received by one of the cameras by the intervention of a filter that removes the particular wavelength of interest from the "reference" picture. This can be done most sensitively with a gas filter (filled with the gas of interest) placed in front of the IR detector array, or (less sensitively) by use of a suitable line filter in the same location. This arrangement would then be balanced against the unfiltered "measurement" picture, which will have variations from IR absorption from the gas of interest. By suitable processing of the signals from each pixel in the two IR pictures, the user can display only the differences in the signals. Either a difference or a ratio output of the two signals is feasible. From a gas concentration viewpoint, the ratio could be processed to show the column depth of the gas leak. If a variation in the background IR light intensity is present in the field of view, then large changes in the difference signal will occur for the same gas column concentration between the background and the camera. By ratioing the outputs, the same signal ratio is obtained for both high- and low-background signals, even though the low-signal areas may have greater noise content due to their smaller signal strength. Thus, one
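A minimal sketch of the pixel-by-pixel comparison this record describes, using hypothetical intensity grids for the unfiltered "measurement" image and the gas-filtered "reference" image. Note how the ratio output yields the same value for the same gas column over bright and dark backgrounds, while the difference output does not, which is the abstract's argument for ratioing:

```python
def gas_absorption_map(measurement, reference, mode="ratio", eps=1e-6):
    """Pixel-by-pixel comparison of two co-registered IR images.
    'ratio' normalizes out background intensity variations;
    'difference' leaves them in."""
    rows = []
    for m_row, r_row in zip(measurement, reference):
        if mode == "ratio":
            rows.append([m / (r + eps) for m, r in zip(m_row, r_row)])
        else:
            rows.append([m - r for m, r in zip(m_row, r_row)])
    return rows

# Same 20% gas absorption over a bright pixel (100) and a dark pixel (50).
reference   = [[100.0, 50.0]]  # gas wavelengths removed by the filter
measurement = [[80.0, 40.0]]   # unfiltered image, attenuated by the gas
print(gas_absorption_map(measurement, reference))                      # ~0.8, ~0.8
print(gas_absorption_map(measurement, reference, mode="difference"))   # -20, -10
```

Real use would require the two frames to be acquired nearly simultaneously and co-registered, exactly as the design in the record specifies.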

  8. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO), and use of this camera for observations of star-formation regions 1. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  9. Auto-converging stereo cameras for 3D robotic tele-operation

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Aycock, Todd; Chenault, David

    2012-06-01

Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow operator control of the camera convergence angle. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an automatic convergence algorithm in a field-programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than on adjustment of the vision system. The auto-convergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.

  10. "Wow, It Turned out Red! First, a Little Yellow, and Then Red!" 1st-Graders' Work with an Infrared Camera

    ERIC Educational Resources Information Center

    Jeppsson, Fredrik; Frejd, Johanna; Lundmark, Frida

    2017-01-01

    This study focuses on investigating how students make use of their bodily experiences in combination with infrared (IR) cameras, as a way to make meaning in learning about heat, temperature, and friction. A class of 20 primary students (age 7-8 years), divided into three groups, took part in three IR camera laboratory experiments. The qualitative…

  11. Shutterless non-uniformity correction for the long-term stability of an uncooled long-wave infrared camera

    NASA Astrophysics Data System (ADS)

    Liu, Chengwei; Sui, Xiubao; Gu, Guohua; Chen, Qian

    2018-02-01

For an uncooled long-wave infrared (LWIR) camera, the infrared (IR) irradiation received by the focal plane array (FPA) is a crucial factor affecting image quality. Ambient temperature fluctuation as well as system power consumption can change the FPA temperature and the radiation characteristics inside the IR camera; these further degrade imaging performance. In this paper, we present a novel shutterless non-uniformity correction method to compensate for non-uniformity caused by variation of the ambient temperature. Our method combines a calibration-based method with the properties of a scene-based method to obtain correction parameters at different ambient temperatures, so that camera performance is less influenced by ambient temperature fluctuation or system power consumption. The calibration is carried out in a temperature chamber with slowly changing ambient temperature, using a black body as a uniform radiation source. A sufficient number of uniform images are captured and the gain coefficients are calculated during this period. In practical application, the offset parameters are then calculated via the least-squares method based on the gain coefficients, the captured uniform images, and the actual scene, yielding a corrected output from the gain coefficients and offset parameters. The performance of the proposed method is evaluated on realistic IR images and compared with two existing methods. The images used in the experiments were obtained with a 384 × 288 pixel uncooled LWIR camera. Results show that the proposed method adaptively updates correction parameters as the target scene changes and is more stable under temperature fluctuation than the other two methods.
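The two-stage parameter estimation this record describes can be sketched with a toy model: per-pixel gains from two uniform black-body levels (the chamber calibration), then per-pixel offsets by least squares. The per-frame target used here - the spatial mean of the gain-corrected frame - is a simplifying assumption standing in for the paper's scene-based criterion:

```python
def per_pixel_gain(raw_low, raw_high, level_low, level_high):
    """Two-point gain: maps each pixel's raw response onto two uniform
    black-body radiometric levels (the calibration-chamber phase)."""
    return [[(level_high - level_low) / (h - l)
             for l, h in zip(row_l, row_h)]
            for row_l, row_h in zip(raw_low, raw_high)]

def per_pixel_offset(frames, gain):
    """Least-squares offsets for corrected = g*x + o: the o minimizing
    sum_k (g*x_k + o - t_k)^2 is mean_k(t_k - g*x_k), with the per-frame
    target t_k taken as the spatial mean of the gain-corrected frame."""
    n_pix = sum(len(row) for row in gain)
    offsets = [[0.0] * len(row) for row in gain]
    for frame in frames:
        t = sum(g * x for g_row, x_row in zip(gain, frame)
                for g, x in zip(g_row, x_row)) / n_pix
        for r, (g_row, x_row) in enumerate(zip(gain, frame)):
            for c, (g, x) in enumerate(zip(g_row, x_row)):
                offsets[r][c] += (t - g * x) / len(frames)
    return offsets

# Toy 1x2 array: pixel response x = a*s + b for a uniform scene level s.
gain = per_pixel_gain(raw_low=[[10.0, 25.0]], raw_high=[[20.0, 45.0]],
                      level_low=10.0, level_high=20.0)   # -> [[1.0, 0.5]]
frames = [[[30.0, 65.0]], [[40.0, 85.0]]]                # s = 30, then s = 40
off = per_pixel_offset(frames, gain)
corrected = [[g * x + o for g, x, o in zip(gain[0], frames[0][0], off[0])]]
print(corrected)  # both pixels equal after correction: non-uniformity removed
```

The point of the sketch is that fixing the gains first turns offset estimation into a one-parameter-per-pixel least-squares problem, which is what makes shutterless in-field updating tractable.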

  12. In-flight performance of the Faint Object Camera of the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Greenfield, P.; Paresce, F.; Baxter, D.; Hodge, P.; Hook, R.; Jakobsen, P.; Jedrzejewski, R.; Nota, A.; Sparks, W. B.; Towers, N.

    1991-01-01

    An overview of the Faint Object Camera and its performance to date is presented. In particular, the detector's efficiency, the spatial uniformity of response, distortion characteristics, detector and sky background, detector linearity, spectrography, and operation are discussed. The effect of the severe spherical aberration of the telescope's primary mirror on the camera's point spread function is reviewed, as well as the impact it has on the camera's general performance. The scientific implications of the performance and the spherical aberration are outlined, with emphasis on possible remedies for spherical aberration, hardware remedies, and stellar population studies.

  13. Upgrade of the infrared camera diagnostics for the JET ITER-like wall divertor.

    PubMed

    Balboa, I; Arnoux, G; Eich, T; Sieglin, B; Devaux, S; Zeidner, W; Morlock, C; Kruezi, U; Sergienko, G; Kinna, D; Thomas, P D; Rack, M

    2012-10-01

For the new ITER-like wall at JET, two new infrared diagnostics (KL9B, KL3B) have been installed. These diagnostics operate between 3.5 and 5 μm at sampling frequencies of up to ∼20 kHz. KL9B and KL3B image the horizontal and vertical tiles of the divertor. The divertor tiles are tungsten-coated carbon fiber composite, except for the central tile, which is bulk tungsten and consists of lamella segments. The thermal emission between lamellae affects the surface temperature measurement, and therefore KL9A has been upgraded to achieve a higher spatial resolution (by a factor of 2). A technical description of KL9A, KL9B, and KL3B and cross-correlation with a near-infrared camera and a two-color pyrometer are presented.

  14. Ambient and Cryogenic Alignment Verification and Performance of the Infrared Multi-Object Spectrometer

    NASA Technical Reports Server (NTRS)

    Connelly, Joseph A.; Ohl, Raymond G.; Mink, Ronald G.; Mentzell, J. Eric; Saha, Timo T.; Tveekrem, June L.; Hylan, Jason E.; Sparr, Leroy M.; Chambers, V. John; Hagopian, John G.

    2003-01-01

The Infrared Multi-Object Spectrometer (IRMOS) is a facility instrument for the Kitt Peak National Observatory 4 and 2.1 meter telescopes. IRMOS is a near-IR (0.8-2.5 micron) spectrometer with low- to mid-resolving power (R = 300-3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arc-min field of view using a commercial Micro-Electro-Mechanical Systems (MEMS) Digital Micro-mirror Device (DMD) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems: the focal reducer images the focal plane of the telescope onto the DMD field stop, and the spectrograph images the DMD onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and the ambient and cryogenic imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve to verify alignment and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides further verification of the optomechanical alignment method and a measurement of near-angle scattered light due to mirror small-scale surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides spectral lines at 546.1 nm and 1550 nm, and a CCD camera and an IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard test results validate this prediction. We conclude with an instrument performance prediction for first light.

  15. DMDs for multi-object near-infrared spectrographs in astronomy

    NASA Astrophysics Data System (ADS)

    Smee, Stephen A.; Barkhouser, Robert; Hope, Stephen; Conley, Devin; Gray, Aidan; Hope, Gavin; Robberto, Massimo

    2018-02-01

The Digital Micromirror Device (DMD), typically used in projection technology, has utility in astronomical instrumentation as a digitally programmable slit in a spectrograph. When placed at an imaging focal plane, the device can selectively direct light from astronomical targets into the optical path of a spectrograph while directing the remaining light into an imaging camera, which can be used for slit alignment, science imaging, or both. To date, the use of DMDs in astronomy has been limited, especially for instruments that operate in the near infrared (1-2.5 μm). This limitation is due in part to a host of technical challenges with DMDs that have not yet been thoroughly explored. Those challenges include operation at cryogenic temperatures, control electronics that facilitate DMD use at these temperatures, windows properly coated for the near-infrared bandpass, and scattered light. This paper discusses these technical challenges and presents progress towards understanding and mitigating them.

  16. Estimating Clothing Thermal Insulation Using an Infrared Camera

    PubMed Central

    Lee, Jeong-Hoon; Kim, Young-Keun; Kim, Kyung-Soo; Kim, Soohyun

    2016-01-01

In this paper, a novel algorithm for estimating clothing insulation is proposed to assess thermal comfort, based on non-contact, real-time measurements of face and clothing temperatures by an infrared camera. The proposed method can accurately measure the clothing insulation of various garments under different clothing fits and sitting postures. A paired t-test at the 99% confidence level shows that the method effectively measures clothing insulation under different seasonal clothing conditions. Temperatures simulated with the estimated insulation values are closer to the actual temperatures than those simulated with individual clothing insulation values: in indoor working scenarios, upper-clothing temperature is accurate to within 3% error and lower-clothing temperature to within 3.7%-6.2% error. By using the temperatures of the face and clothing, the proposed algorithm accounts for the air layer, which alters insulation, in the estimation. In the future, the proposed method is expected to be applied to evaluating customized passenger comfort effectively. PMID:27005625

  17. Real object-based 360-degree integral-floating display using multiple depth camera

    NASA Astrophysics Data System (ADS)

    Erdenebat, Munkh-Uchral; Dashdavaa, Erkhembaatar; Kwon, Ki-Chul; Wu, Hui-Ying; Yoo, Kwan-Hee; Kim, Young-Seok; Kim, Nam

    2015-03-01

A novel 360-degree integral-floating display based on a real object is proposed. The general procedure of the display system is similar to that of conventional 360-degree integral-floating displays. Unlike previously presented 360-degree displays, the proposed system displays a 3D image generated from a real object in a 360-degree viewing zone. To do so, multiple depth cameras are utilized to acquire depth information around the object. The 3D point cloud representations of the real object are then reconstructed from the acquired depth information. Using a special point cloud registration method, the multiple virtual 3D point cloud representations captured by each depth camera are combined into a single synthetic 3D point cloud model, and elemental image arrays are generated for the newly synthesized model from the anamorphic optic system's angular step. The approach has been verified experimentally, showing that the proposed 360-degree integral-floating display is an excellent way to display a real object in a 360-degree viewing zone.
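The merge step - bringing each depth camera's points into a shared world frame before combining them into one synthetic cloud - can be sketched with a rigid transform per camera. The yaw-plus-translation extrinsics below are an illustrative assumption (cameras arranged around the object on a level ring), not the paper's registration method:

```python
import math

def to_world(points, yaw_deg, t):
    """Transform one depth camera's points (x, y, z) into the shared world
    frame, assuming its extrinsics are a yaw rotation about the vertical
    (y) axis followed by a translation t = (tx, ty, tz)."""
    a = math.radians(yaw_deg)
    ca, sa = math.cos(a), math.sin(a)
    out = []
    for x, y, z in points:
        out.append((ca * x + sa * z + t[0],   # rotation-about-y matrix rows
                    y + t[1],
                    -sa * x + ca * z + t[2]))
    return out

def merge_clouds(clouds_with_poses):
    """Concatenate several cameras' clouds after transforming each into the
    world frame; a registration step would refine the poses afterwards."""
    merged = []
    for points, yaw_deg, t in clouds_with_poses:
        merged.extend(to_world(points, yaw_deg, t))
    return merged

# Two toy cameras, 90 degrees apart, each seeing one point one unit ahead.
merged = merge_clouds([([(0.0, 0.0, 1.0)], 0.0, (0.0, 0.0, 0.0)),
                       ([(0.0, 0.0, 1.0)], 90.0, (0.0, 0.0, 0.0))])
print(merged)  # the two views land at different world positions
```

In the actual system the initial poses would come from calibration, and the "special point cloud registration method" the abstract mentions would then align the overlapping regions.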

  18. Standoff Mid-Infrared Emissive Imaging Spectroscopy for Identification and Mapping of Materials in Polychrome Objects.

    PubMed

    Gabrieli, Francesca; Dooley, Kathryn A; Zeibel, Jason G; Howe, James D; Delaney, John K

    2018-06-18

Microscale mid-infrared (mid-IR) imaging spectroscopy is used for the mapping of chemical functional groups. Its extension to the macroscale requires that either the mid-IR radiation reflected off the object or that emitted by it be greater than the radiation from the thermal background. Reflectance spectra can be obtained using an active IR source to increase the amount of radiation reflected off the object, but rapid heating of greater than 4 °C can occur, which is a problem for paintings. Rather than using an active source, placing a highly reflective tube between the painting and the camera and introducing a low-temperature source reduces the thermal radiation from the room, allowing the IR radiation emitted by the painting to dominate. Thus, emissivity spectra of the object can be recovered. Using this technique, mid-IR emissivity image cubes of paintings were collected at high collection rates with a low-noise, line-scanning imaging spectrometer, allowing pigments and paint binders to be identified and mapped. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Infrared imaging spectrometry by the use of bundled chalcogenide glass fibers and a PtSi CCD camera

    NASA Astrophysics Data System (ADS)

    Saito, Mitsunori; Kikuchi, Katsuhiro; Tanaka, Chinari; Sone, Hiroshi; Morimoto, Shozo; Yamashita, Toshiharu T.; Nishii, Junji

    1999-10-01

A coherent fiber bundle for infrared image transmission was prepared by arranging 8400 chalcogenide (As-S) glass fibers. The fiber bundle, 1 m in length, is transmissive in the infrared spectral region of 1-6 µm. A remote spectroscopic imaging system was constructed with the fiber bundle and an infrared PtSi CCD camera. The system was used for real-time observation (frame time: 1/60 s) of gas distribution. Infrared light from a SiC heater was delivered to a gas cell through a chalcogenide fiber, and the transmitted light was observed through the fiber bundle. A band-pass filter was used for the selection of gas species. A He-Ne laser at a wavelength of 3.4 µm was also used for the observation of hydrocarbon gases. Gases bursting from a nozzle were successfully observed with the remote imaging system.

  20. United Kingdom Infrared Telescope's Spectrograph Observations of Human-Made Space Objects

    NASA Technical Reports Server (NTRS)

    Buckalew, Brent; Abercromby, Kira; Lederer, Susan; Frith, James; Cowardin, Heather

    2017-01-01

Presented here are the results of the United Kingdom Infrared Telescope (UKIRT) spectral observations of human-made space objects taken from 2014 to 2015. The data collected using the UIST infrared spectrograph cover the wavelength range 0.7-2.5 micrometers. Overall, data were collected on 18 different orbiting objects at or near the geosynchronous (GEO) regime. Thirteen of the objects are spacecraft, one is a rocket body, and four are cataloged as debris pieces. The remotely collected data are compared to laboratory-collected reflectance data on typical spacecraft materials; thereby general material classes are identified, though not specific types. These results highlight the usefulness of observations in the infrared by focusing on features from hydrocarbons and silicon. The spacecraft show distinct features due to the presence of solar panels. Signature variations between rocket bodies, due to the presence of various metals and paints on their surfaces, show a clear distinction from those objects with solar panels, demonstrating that one can distinguish most spacecraft from rocket bodies through infrared spectrum analysis. Finally, the debris pieces tend to show featureless, dark spectra. These results show that the laboratory data in their current state give excellent indications as to the nature of the surface materials on the objects. Further telescopic data collection and model updates to include more materials, noise, surface roughness, and material degradation are necessary to make better assessments of orbital object material types. A comparison conducted between objects observed previously with the NASA Infrared Telescope Facility (IRTF) shows similar materials and trends from the two telescopes and from the two distinct data sets. However, based on the current state of the model, infrared spectroscopic data are adequate to classify objects in GEO as spacecraft, rocket bodies, or debris.

  1. The Example of Using the Xiaomi Cameras in Inventory of Monumental Objects - First Results

    NASA Astrophysics Data System (ADS)

    Markiewicz, J. S.; Łapiński, S.; Bienkowski, R.; Kaliszewska, A.

    2017-11-01

At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. Today, photogrammetry is becoming more and more popular and is becoming the standard of documentation in many projects involving the recording of all possible spatial data on landscape, architecture, or even single objects. Low-cost sensors allow for the creation of reliable and accurate three-dimensional models of investigated objects. This paper presents the results of a comparison between the outcomes obtained from three image sources: low-cost Xiaomi cameras, a full-frame camera (Canon 5D Mark II) and a medium-format camera (Hasselblad H4D). In order to check how the results obtained from these sensors differ, the following parameters were analysed: the accuracy of the orientation of the ground-level photos on the control and check points, the distribution of the distortion determined in the self-calibration process, the flatness of the walls, and the discrepancies between the point clouds from the low-cost cameras and the reference data. The results presented below are the outcome of co-operation between researchers from three institutions: the Systems Research Institute PAS, the Department of Geodesy and Cartography at the Warsaw University of Technology, and the National Museum in Warsaw.

  2. Detecting Target Objects by Natural Language Instructions Using an RGB-D Camera

    PubMed Central

    Bao, Jiatong; Jia, Yunyi; Cheng, Yu; Tang, Hongru; Xi, Ning

    2016-01-01

Controlling robots by natural language (NL) is increasingly attracting attention for its versatility, its convenience, and the fact that it requires no extensive training for users. Grounding, which enables robots to understand NL instructions from humans, is a crucial challenge of this problem. This paper mainly explores the object grounding problem and concretely studies how to detect target objects specified by NL instructions using an RGB-D camera in robotic manipulation applications. In particular, a simple yet robust vision algorithm is applied to segment objects of interest. With the metric information of all segmented objects, the object attributes and relations between objects are further extracted. The NL instructions that incorporate multiple cues for object specifications are parsed into domain-specific annotations. The annotations from NL and the extracted information from the RGB-D camera are matched in a computational state estimation framework to search all possible object grounding states. The final grounding is accomplished by selecting the states which have the maximum probabilities. An RGB-D scene dataset associated with different groups of NL instructions, based on different cognition levels of the robot, is collected. Quantitative evaluations on the dataset illustrate the advantages of the proposed method. The experiments of NL-controlled object manipulation and NL-based task programming using a mobile manipulator show its effectiveness and practicability in robotic applications. PMID:27983604

  3. Infrared Telescope Facility's Spectrograph Observations of Human-Made Space Objects

    NASA Technical Reports Server (NTRS)

    Abercromby, K.; Buckalew, B.; Abell, P.; Cowardin, H.

    2015-01-01

Presented here are the results of the Infrared Telescope Facility (IRTF) spectral observations of human-made space objects taken from 2006 to 2008. The data collected using the SpeX infrared spectrograph cover the wavelength range 0.7-2.5 micrometers. Overall, data were collected on 20 different orbiting objects at or near the geosynchronous (GEO) regime. Four of the objects were controlled spacecraft, seven were non-controlled spacecraft, five were rocket bodies, and the final four were cataloged as debris pieces. The remotely collected data are compared to laboratory-collected reflectance data on typical spacecraft materials; thereby, general materials are identified but not specific types. These results highlight the usefulness of observations in the infrared by focusing on features from hydrocarbons, silicon, and thermal emission. The spacecraft, both the controlled and non-controlled, show distinct features due to the presence of solar panels, whereas the rocket bodies do not. Signature variations between rocket bodies, due to the presence of various metals and paints on their surfaces, show a clear distinction from those objects with solar panels, demonstrating that one can distinguish most spacecraft from rocket bodies through infrared spectrum analysis. Finally, the debris pieces tend to show featureless, dark spectra. These results show that the laboratory data in their current state give excellent indications as to the nature of the surface materials on the objects. Further telescopic data collection and model updates to include noise, surface roughness, and material degradation are necessary to make better assessments of orbital object material types. However, based on the current state of the comparison between the observations and the laboratory data, infrared spectroscopic data are adequate to classify objects in GEO as spacecraft, rocket bodies, or debris.

  4. On-orbit performance of the Compact Infrared Camera (CIRC) onboard ALOS-2

    NASA Astrophysics Data System (ADS)

    Sakai, Michito; Katayama, Haruyoshi; Kato, Eri; Nakajima, Yasuhiro; Kimura, Toshiyoshi; Nakau, Koji

    2015-10-01

The Compact Infrared Camera (CIRC) is a technology demonstration instrument equipped with an uncooled infrared array detector (microbolometer) for space applications. Microbolometers have the advantage of not requiring a cooling system such as a mechanical cooler and are suitable for resource-limited sensor systems. Another characteristic of the CIRC is its use of an athermal optical system and a shutterless system. Owing to these characteristics, the CIRC is small in size (approximately 200 mm), light in weight (approximately 3 kg), and has low electrical power consumption (<20 W). The main objective of the CIRC is to detect wildfires, which are major and chronic disasters affecting various countries of Southeast Asia, particularly considering the effects of global warming and climate change. One of the CIRCs was launched on May 24, 2014 as a technology demonstration payload of the Advanced Land Observation Satellite-2 (ALOS-2). Since the initial functional verification phase (July 4-14, 2014), the CIRC has demonstrated functions according to its intended design. In the calibration validation phase that followed, we also confirmed that the noise equivalent differential temperature of the CIRC observation data is less than 0.2 K, the temperature accuracy is within ±4 K, and the spatial resolution is less than 210 m. In the operational phase, the CIRC has also detected wildfires in various areas and observed volcanic activity and urban heat islands. The other CIRC will be launched in 2015 onboard the CALorimetric Electron Telescope (CALET) of the Japanese Experiment Module (JEM) of the International Space Station. Installation of the CIRCs on the ALOS-2 and on the JEM/CALET is expected to increase the observation frequency. In this study, we present the on-orbit performance, including observational results, of the CIRC onboard the ALOS-2 and the current status of the CIRC onboard the JEM/CALET.

  5. CATAVIÑA: new infrared camera for OAN-SPM

    NASA Astrophysics Data System (ADS)

    Iriarte, Arturo; Cruz-González, Irene; Martínez, Luis A.; Tinoco, Silvio; Lara, Gerardo; Ruiz, Elfego; Sohn, Erika; Bernal, Abel; Angeles, Fernando; Moreno, Arturo; Murillo, Francisco; Langarica, Rosalía; Luna, Esteban; Salas, Luis; Cajero, Vicente

    2006-06-01

CATAVIÑA is a near-infrared camera system to be operated in conjunction with the existing multi-purpose near-infrared optical bench "CAMALEON" at the OAN-SPM. Observing modes include direct imaging, spectroscopy, Fabry-Perot interferometry and polarimetry. This contribution focuses on the optomechanics and the detector controller of CATAVIÑA, which is planned to start operating later in 2006. The camera consists of an 8-inch LN2 dewar containing a 10-filter carousel, a radiation baffle and the detector circuit board mount. The system is based on a Rockwell 1024 × 1024 HgCdTe (HAWAII-I) FPA operating in the 1 to 2.5 micron window. The detector controller/readout system was designed and developed at the UNAM Instituto de Astronomia. It is based on five Texas Instruments DSK digital signal processor (DSP) modules. One module generates the detector and ADC-system control signals, while the remaining four are in charge of the acquisition of each of the detector's quadrants. Each DSP has a built-in expanded memory module in order to store more than one image. The detector read-out and signal driver subsystems are mounted onto the dewar in a "back-pack" fashion, each containing four independent pre-amplifiers, converters and signal drivers that communicate through fiber optics with their respective DSPs. This system allows programming of the offset input voltage and converter gain. The controller software architecture is based on a client/server model. The client sends commands through the TCP/IP protocol and acquires the image. The server consists of a microcomputer with an embedded Linux operating system, which runs the main program that receives the user commands and interacts with the timing and acquisition DSPs. The observer's interface allows for several readout and image processing modes.

  6. Near-infrared spectroscopy of primitive solar system objects

    NASA Technical Reports Server (NTRS)

    Luu, Jane; Jewitt, David; Cloutis, Edward

    1994-01-01

We have obtained near-infrared (H and K band at lambda/Delta(lambda) of approximately 480 to 600) spectra of a sample of primitive objects including 2 Centaur objects (2060 Chiron and 5145 Pholus) and 16 P- and D-type asteroids. The spectra were obtained at the United Kingdom Infrared Telescope using the cooled grating spectrometer CGS4, and were used to search for chemically diagnostic vibrational features in these primitive objects. Pholus exhibits broad absorption features at 2.07 and 2.27 micrometers, as well as a weak feature at 1.72 micrometers. The 1.72- and 2.27-micrometer features are similar to those seen in a laboratory tar sand sample. No distinct absorption features are found in the other objects, including Chiron, which displays a spectrally neutral continuum. A comparison of the P- and D-type asteroid spectra with laboratory measurements of organic solids shows no compelling evidence for the hydrocarbon overtones seen in terrestrial bituminous tar sands.

  7. Automatic Camera Calibration for Cultural Heritage Applications Using Unstructured Planar Objects

    NASA Astrophysics Data System (ADS)

    Adam, K.; Kalisperakis, I.; Grammatikopoulos, L.; Karras, G.; Petsa, E.

    2013-07-01

As a rule, image-based documentation of cultural heritage today relies on ordinary digital cameras and commercial software. As such projects often involve researchers not familiar with photogrammetry, the question of camera calibration is important. Freely available, open-source, user-friendly software for automatic camera calibration, often based on simple 2D chess-board patterns, is an answer to the demand for simplicity and automation. However, such tools cannot respond to all requirements met in cultural heritage conservation regarding possible imaging distances and focal lengths. Here we investigate the practical possibility of camera calibration from unknown planar objects, i.e. any planar surface with adequate texture; we have focused on the example of urban walls covered with graffiti. Images are connected pair-wise with inter-image homographies, which are estimated automatically through a RANSAC-based approach after extracting and matching interest points with the SIFT operator. All valid points are identified on all images on which they appear. Provided that the image set includes a "fronto-parallel" view, inter-image homographies with this image are regarded as emulations of image-to-world homographies and allow computing initial estimates for the interior and exterior orientation elements. Following this initialization step, the estimates are introduced into a final self-calibrating bundle adjustment. Measures are taken to discard unsuitable images and verify object planarity. Results from practical experimentation indicate that this method may produce satisfactory results. The authors intend to incorporate the described approach into their freely available user-friendly software tool, which relies on chess-boards, to assist non-experts in their image-based projects.
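
The pair-wise homography estimation step described above (SIFT matching followed by a RANSAC-based fit) can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: the matched point pairs that SIFT would supply are assumed to be given, and the threshold and iteration count are arbitrary choices.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct Linear Transform: fit a 3x3 H so that dst ~ H @ src.
    src, dst: (N, 2) arrays of matched point coordinates, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)       # null-space vector of the design matrix
    return H / H[2, 2]

def project(H, pts):
    """Apply homography H to (N, 2) inhomogeneous points."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

def ransac_homography(src, dst, iters=500, thresh=2.0, seed=0):
    """Robust inter-image homography from noisy matches via RANSAC."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), size=4, replace=False)
        H = estimate_homography(src[idx], dst[idx])
        err = np.linalg.norm(project(H, src) - dst, axis=1)
        inliers = err < thresh
        if inliers.sum() > best.sum():
            best = inliers
    # refit on all inliers of the best minimal-sample model
    return estimate_homography(src[best], dst[best]), best
```

In practice the resulting inlier sets would feed the initialization and the final self-calibrating bundle adjustment mentioned in the abstract.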

  8. An infrared modular panoramic imaging objective

    NASA Astrophysics Data System (ADS)

    Palmer, Troy A.; Alexay, Christopher C.

    2004-08-01

We describe the optical and mechanical design of an athermal infrared objective lens with an afocal anamorphic adapter. The lens presented consists of two modules: an athermal 25 mm F/2.3 mid-wave IR objective lens and an optional panoramic adapter. The adapter utilizes anamorphic lenses to create unique image control: it enables an independently wide horizontal field of view while preserving the original narrow vertical field. We have designed, fabricated and tested two such lenses. A summary of the assembly and testing process is also presented.

  9. PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory

    NASA Astrophysics Data System (ADS)

    Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.

    2018-02-01

PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto, a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks and can also be equipped with narrowband filters. The instrument was delivered to the observatory in 2014 October and was commissioned at both telescopes between 2014 November and 2015 June. Science verification at the 2.2 m telescope was carried out during the second semester of 2015, and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests, and the PANIC instrument performance. We also present first-light data obtained during the commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was carried out at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements. The commissioning phase on sky has been accomplished.

  10. Possibility of passive THz camera using for a temperature difference observing of objects placed inside the human body

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

    2014-06-01

As is well known, the passive THz camera is a very promising tool for security applications: it can reveal concealed objects without physical contact and poses no danger to the person being screened. We demonstrate a new use of the passive THz camera: observing temperature differences on the human skin when those differences are caused by different temperatures inside the body. We discuss physical experiments in which a person drinks hot, warm, or cold water, or eats. After computer processing of the images captured by the passive THz camera TS4, a pronounced temperature trace can be seen on the skin of the human body. To validate this claim, we carried out a similar physical experiment using an IR camera. Our investigation extends the applicability of the passive THz camera to the detection of objects concealed inside the human body, because the difference in temperature between the object and the surrounding body is reflected on the skin. However, modern passive THz cameras do not have sufficient temperature resolution to see this difference directly; we therefore use computer processing to enhance the camera resolution for this application. We consider images produced by passive THz cameras manufactured by Microsemi Corp. and ThruVision Corp.

  11. Water ingress detection in honeycomb sandwich panels by passive infrared thermography using a high-resolution thermal imaging camera

    NASA Astrophysics Data System (ADS)

    Ibarra-Castanedo, C.; Brault, L.; Marcotte, F.; Genest, M.; Farley, V.; Maldague, X.

    2012-06-01

Water ingress in honeycomb structures is of great concern for the civil and military aerospace industries. Pressure and temperature variations during take-off and landing produce considerable stress on aircraft structures, promoting moisture ingress (by diffusion through fibers or by direct ingress through voids, cracks or unsealed joints) into the core. The presence of water (or other fluids such as kerosene, hydraulic fluid and de-icing agents) in any of its forms (gas vapor, liquid or ice) promotes corrosion and cell breakage, and induces composite-layer delaminations and skin disbonds. In this study, test specimens were produced from unserviceable parts of military aircraft. In order to simulate atmospheric conditions during landing, selected core areas were filled with measured quantities of water and then frozen in a cold chamber. The specimens were then removed from the chamber and monitored for over 20 minutes as they warmed up, using a cooled high-resolution infrared camera. Results have shown that detection and quantification of water ingress in honeycomb sandwich structures by passive infrared thermography is possible using an HD mid-wave infrared camera for volumes of water as low as 0.2 ml and from distances as far as 20 m from the target.

  12. United Kingdom Infrared Telescope's Spectrograph Observations of Human-Made Space Objects

    NASA Technical Reports Server (NTRS)

    Buckalew, Brent; Abercromby, Kira; Lederer, Susan; Cowardin, Heather; Frith, James

    2017-01-01

Presented here are the results of the United Kingdom Infrared Telescope (UKIRT) spectral observations of human-made space objects taken from 2014 to 2015. The data collected using the UKIRT 1-5 micron Imager Spectrometer (UIST) cover the wavelength range 0.7-2.5 micrometers. Overall, data were collected on 18 different orbiting objects at or near geosynchronous orbit (GEO). Two of the objects are controlled spacecraft, twelve are non-controlled spacecraft, one is a rocket body, and three are cataloged as debris. The remotely collected data are compared to laboratory-collected reflectance data on typical spacecraft materials; thereby, general materials are identified but not specific types. These results highlight the usefulness of observations in the infrared by focusing on features from hydrocarbons and silicon. The spacecraft, both the controlled and non-controlled, show distinct features due to the presence of solar panels, whereas the rocket bodies do not. Signature variations between rocket bodies, due to the presence of various metals and paints on their surfaces, show a clear distinction from those objects with solar panels, demonstrating that one can distinguish most spacecraft from rocket bodies through infrared spectrum analysis. Finally, the debris pieces tend to show featureless, dark spectra. These results show that the laboratory data in their current state give well-correlated indications as to the nature of the surface materials on the objects. Further telescopic data collection and model updates to include noise, surface roughness, and material degradation are necessary to make better assessments of orbital object material types. A comparison conducted between objects observed previously with the NASA Infrared Telescope Facility (IRTF) shows similar materials and trends from the two telescopes at different times. However, based on the current state of the model, infrared spectroscopic data are adequate to classify objects in GEO as spacecraft, rocket bodies, or debris.

  13. Low-cost far infrared bolometer camera for automotive use

    NASA Astrophysics Data System (ADS)

    Vieider, Christian; Wissmar, Stanley; Ericsson, Per; Halldin, Urban; Niklaus, Frank; Stemme, Göran; Källhammer, Jan-Erik; Pettersson, Håkan; Eriksson, Dick; Jakobsen, Henrik; Kvisterøy, Terje; Franks, John; VanNylen, Jan; Vercammen, Hans; VanHulsel, Annick

    2007-04-01

A new low-cost long-wavelength infrared bolometer camera system is under development. It is designed for use with an automatic vision algorithm system as a sensor to detect vulnerable road users in traffic. Looking 15 m in front of the vehicle, it can, in the case of an unavoidable impact, activate a brake assist system or other deployable protection system. To achieve our cost target of below €100 for the sensor system, we evaluated the required performance and were able to relax the sensitivity to 150 mK and the pixel resolution to 80 x 30. We address all the main cost drivers, such as sensor size and production yield, along with vacuum packaging, optical components and large-volume manufacturing technologies. The detector array is based on a new type of high-performance thermistor material: very thin Si/SiGe single-crystal multi-layers grown epitaxially. Due to the resulting valence barriers, a high temperature coefficient of resistance is achieved (3.3%/K). Simultaneously, the high-quality crystalline material provides very low 1/f-noise characteristics and uniform material properties. The thermistor material is transferred from the original substrate wafer to the read-out circuit using adhesive wafer bonding and subsequent thinning. Bolometer arrays can then be fabricated using industry-standard MEMS processes and materials. The inherently good detector performance allows us to relax the vacuum requirement, and we can implement the wafer-level vacuum packaging technology used in established automotive sensor fabrication. The optical design is reduced to a single-lens camera. We are developing a low-cost molding process using a novel chalcogenide glass (GASIR®3) and integrating anti-reflective and anti-erosion properties using a diamond-like carbon coating.
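
To connect the figures quoted above: with a TCR of 3.3%/K, a pixel temperature change of ΔT produces a fractional resistance change of about 0.033·ΔT. The sketch below is a linearized, illustrative model only; a real bolometer's response also depends on thermal mass, bias and readout, and the 150 mK figure is a scene-referred sensitivity, not the pixel's own temperature excursion.

```python
def fractional_signal(tcr, delta_t):
    """Linearized fractional resistance change dR/R for a pixel whose
    own temperature rises by delta_t (K); tcr is per kelvin (e.g. 0.033)."""
    return tcr * delta_t

def bolometer_resistance(r0, tcr, delta_t):
    """Resistance of a pixel of nominal resistance r0 (ohms) after a
    small temperature change delta_t, using the same linearized model."""
    return r0 * (1.0 + fractional_signal(tcr, delta_t))
```

For an illustrative 0.15 K pixel temperature change, dR/R is about 0.5%, i.e. roughly 495 ohms on a hypothetical 100 kΩ pixel.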

  14. Flame colour characterization in the visible and infrared spectrum using a digital camera and image processing

    NASA Astrophysics Data System (ADS)

    Huang, Hua-Wei; Zhang, Yang

    2008-08-01

An attempt has been made to characterize the colour spectrum of methane flames under various burning conditions using RGB and HSV colour models instead of resolving the real physical spectrum. The results demonstrate that each type of flame has its own characteristic distribution in both the RGB and HSV space. It has also been observed that the averaged B and G values in the RGB model represent well the CH* and C2* emission of a premixed methane flame. These features may be utilized for flame measurement and monitoring. The great advantage of using a conventional camera for monitoring flame properties based on the colour spectrum is that it is readily available, easy to interface with a computer, cost effective and has a certain spatial resolution. Furthermore, it has been demonstrated that a conventional digital camera is able to image flames not only in the visible spectrum but also in the infrared. This feature is useful in avoiding the problem of image saturation typically encountered in capturing very bright sooty flames. As a result, further digital image processing and quantitative information extraction are possible. It has been identified that an infrared image also has its own distribution in both the RGB and HSV colour space, in comparison with a flame image in the visible spectrum.
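
The RGB/HSV analysis described above can be illustrated with the standard library alone. This sketch is not the authors' procedure: the channel averaging follows the idea that mean B and G track CH* and C2* emission, but the classification rule and its threshold are purely hypothetical examples.

```python
import colorsys

def colour_stats(pixels):
    """Mean R, G, B over an iterable of (r, g, b) tuples (0-255 each),
    plus the HSV coordinates of that mean colour (each in 0-1)."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    hsv = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return (r, g, b), hsv

def classify_flame(pixels, blue_to_red=1.2):
    """Hypothetical rule: strong mean blue relative to red suggests a
    premixed (CH*-dominated) flame; red dominance suggests a sooty flame."""
    (r, g, b), _ = colour_stats(pixels)
    return "premixed" if b > blue_to_red * r else "sooty"
```

On a real frame the pixel list would come from the camera image, optionally restricted to the flame region.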

  15. Infrared imaging of the crime scene: possibilities and pitfalls.

    PubMed

    Edelman, Gerda J; Hoveling, Richelle J M; Roos, Martin; van Leeuwen, Ton G; Aalders, Maurice C G

    2013-09-01

    All objects radiate infrared energy invisible to the human eye, which can be imaged by infrared cameras, visualizing differences in temperature and/or emissivity of objects. Infrared imaging is an emerging technique for forensic investigators. The rapid, nondestructive, and noncontact features of infrared imaging indicate its suitability for many forensic applications, ranging from the estimation of time of death to the detection of blood stains on dark backgrounds. This paper provides an overview of the principles and instrumentation involved in infrared imaging. Difficulties concerning the image interpretation due to different radiation sources and different emissivity values within a scene are addressed. Finally, reported forensic applications are reviewed and supported by practical illustrations. When introduced in forensic casework, infrared imaging can help investigators to detect, to visualize, and to identify useful evidence nondestructively. © 2013 American Academy of Forensic Sciences.

  16. Optical design of MEMS-based infrared multi-object spectrograph concept for the Gemini South Telescope

    NASA Astrophysics Data System (ADS)

    Chen, Shaojie; Sivanandam, Suresh; Moon, Dae-Sik

    2016-08-01

We discuss the optical design of an infrared multi-object spectrograph (MOS) concept that is designed to take advantage of the multi-conjugate adaptive optics (MCAO) corrected field at the Gemini South telescope. This design employs a unique, cryogenic MEMS-based focal plane mask to select target objects for spectroscopy by utilizing the Micro-Shutter Array (MSA) technology originally developed for the Near Infrared Spectrometer (NIRSpec) of the James Webb Space Telescope (JWST). The optical design is based on all-spherical refractive optics, which serves both imaging and spectroscopic modes across the wavelength range of 0.9-2.5 μm. The optical system consists of a reimaging system, the MSA, a collimator, volume phase holographic (VPH) grisms, and spectrograph camera optics. The VPH grisms, which are VPH gratings sandwiched between two prisms, provide high dispersing efficiencies, and a set of several VPH grisms provides broad spectral coverage at high throughput. The imaging mode is implemented by moving the MSA and the dispersing unit out of the beam. We optimize both the imaging and spectroscopic modes simultaneously, while paying special attention to the performance of the pupil imaging at the cold stop. Our current design provides 1' × 1' and 0.5' × 1' fields of view for the imaging and spectroscopic modes, respectively, on a 2048 × 2048 pixel HAWAII-2RG detector array. The spectrograph's slit width and spectral resolving power are 0.18'' and 3,000, respectively, and spectra of up to 100 objects can be obtained simultaneously. We present the overall results of simulated performance using the optical model we designed.

  17. Investigation of the influence of spatial degrees of freedom on thermal infrared measurement

    NASA Astrophysics Data System (ADS)

    Fleuret, Julien R.; Yousefi, Bardia; Lei, Lei; Djupkep Dizeu, Frank Billy; Zhang, Hai; Sfarra, Stefano; Ouellet, Denis; Maldague, Xavier P. V.

    2017-05-01

Long Wavelength Infrared (LWIR) cameras can provide a representation of a part of the light spectrum that is sensitive to temperature. These cameras, also named Thermal Infrared (TIR) cameras, are powerful tools for detecting features that cannot be seen by other imaging technologies. For instance, they enable the detection of defects in materials, of fever and anxiety in mammals, and of many other features for numerous applications. However, the accuracy of thermal cameras can be affected by many parameters; the most critical involves the relative position of the camera with respect to the object of interest. Several models have been proposed in order to minimize the influence of some of these parameters, but they are mostly related to specific applications. Because such models are based on prior information related to context, their applicability to other contexts cannot be easily assessed. The few remaining models are mostly associated with a specific device. In this paper the authors study the influence of the camera position on the measurement accuracy. Modeling the position of the camera relative to the object of interest depends on many parameters. In order to propose a study that is as accurate as possible, the position of the camera is represented with a five-dimensional model. The aim of this study is to investigate and attempt to introduce a model that is as independent of the device as possible.

  18. Application of infrared uncooled cameras in surveillance systems

    NASA Astrophysics Data System (ADS)

Dulski, R.; Bareła, J.; Trzaskawka, P.; Piątkowski, T.

    2013-10-01

The recent necessity to protect military bases, convoys and patrols has given a serious impetus to the development of multi-sensor security systems for perimeter protection. Among the most important devices used in such systems are IR cameras. The paper discusses the technical possibilities and limitations of using an uncooled IR camera in a multi-sensor surveillance system for perimeter protection. Effective detection ranges depend on the class of the sensor used and on the observed scene itself. Application of an IR camera increases the probability of intruder detection regardless of the time of day or weather conditions, while simultaneously decreasing the false alarm rate produced by the surveillance system. The role of IR cameras in the system is discussed, as well as the technical possibilities for detecting a human being. A comparison of commercially available IR cameras capable of achieving the desired ranges was made. The required spatial resolution for detection, recognition and identification was calculated. The simulation of detection ranges was done using a new model for predicting target acquisition performance based on the Targeting Task Performance (TTP) metric. Like its predecessor, the Johnson criteria, the new model ties range performance to image quality. The scope of the presented analysis is limited to the estimation of detection, recognition and identification ranges for typical thermal cameras with uncooled microbolometer focal plane arrays. This type of camera is most widely used in security systems because of its competitive price-to-performance ratio. Detection, recognition and identification range calculations were made, and the results for devices with selected technical specifications were compared and discussed.
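
For orientation, the older Johnson-style geometry that the TTP metric refines can be sketched as below. This is a deliberate simplification (the TTP model also folds in blur, noise and contrast, which are omitted here), and the cycle criteria and IFOV value used are illustrative round numbers, not figures from the paper.

```python
# Approximate Johnson cycle criteria (50% task probability) across a
# target's critical dimension; commonly quoted round numbers.
JOHNSON_CYCLES = {"detection": 1.0, "recognition": 4.0, "identification": 8.0}

def acquisition_range(target_size_m, ifov_mrad, cycles_required):
    """Range (m) at which `cycles_required` bar pairs fit across a target
    of critical dimension target_size_m (m), for a sensor whose per-pixel
    instantaneous field of view is ifov_mrad (two pixels per cycle)."""
    cycle_subtense_mrad = 2.0 * ifov_mrad     # one cycle spans two pixels
    return target_size_m * 1000.0 / (cycles_required * cycle_subtense_mrad)
```

With a hypothetical 0.1 mrad IFOV and a 0.5 m critical dimension, this gives 2500 m for detection but only 625 m for recognition, reproducing the familiar ordering of the three ranges.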

  19. Characterization and Performance of the Cananea Near-infrared Camera (CANICA)

    NASA Astrophysics Data System (ADS)

    Devaraj, R.; Mayya, Y. D.; Carrasco, L.; Luna, A.

    2018-05-01

We present details of the characterization and imaging performance of the Cananea Near-infrared Camera (CANICA) at the 2.1 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA has a HAWAII array with a HgCdTe detector of 1024 × 1024 pixels covering a field of view of 5.5 × 5.5 arcmin2 with a plate scale of 0.32 arcsec/pixel. The camera characterization involved measuring key detector parameters: conversion gain, dark current, readout noise, and linearity. The pixels in the detector have a full-well depth of 100,000 e‑, with the conversion gain measured to be 5.8 e‑/ADU. The dark current was estimated to be 1.2 e‑/sec. The readout noise for the correlated double sampling (CDS) technique was measured to be 30 e‑/pixel. The detector shows 10% non-linearity close to the full-well depth; the non-linearity was corrected to within 1% for the CDS images. Full-field imaging performance was evaluated by measuring the point spread function, zeropoints, throughput, and limiting magnitude. The average zeropoint values are J = 20.52, H = 20.63, and K = 20.23. The saturation limit of the detector is about sixth magnitude in all the primary broadbands. CANICA on the 2.1 m OAGH telescope reaches background-limited magnitudes of J = 18.5, H = 17.6, and K = 16.0 for a signal-to-noise ratio of 10 with an integration time of 900 s.
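
The conversion gain quoted above is conventionally measured with the mean-variance (photon transfer) method. The following is a generic sketch of that method, not CANICA's actual pipeline; it assumes two flat fields at identical illumination, two matching darks, and a shot-noise-limited signal.

```python
import numpy as np

def conversion_gain(flat_a, flat_b, dark_a, dark_b):
    """Photon-transfer estimate of conversion gain in e-/ADU.
    Differencing the two flats cancels fixed-pattern noise; for a
    shot-noise-limited signal, gain = mean_signal / signal_variance."""
    signal = 0.5 * (flat_a.mean() + flat_b.mean()) \
           - 0.5 * (dark_a.mean() + dark_b.mean())
    # the variance of the difference frame is twice the per-frame variance
    var = np.var(flat_a.astype(float) - flat_b.astype(float)) / 2.0
    return signal / var
```

Repeating the measurement at several illumination levels and fitting the mean-variance line gives a more robust estimate than a single pair.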

  20. Demonstration of First 9 Micron cutoff 640 x 486 GaAs Based Quantum Well Infrared PhotoDetector (QWIP) Snap-Shot Camera

    NASA Technical Reports Server (NTRS)

    Gunapala, S.; Bandara, S. V.; Liu, J. K.; Hong, W.; Sundaram, M.; Maker, P. D.; Muller, R. E.

    1997-01-01

    In this paper, we discuss the development of this very sensitive long wavelength infrared (LWIR) camera based on a GaAs/AlGaAs QWIP focal plane array (FPA) and its performance in quantum efficiency, NEΔT, uniformity, and operability.

  1. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellut, Paolo; Sherwin, Gary

    2011-01-01

    TIR cameras can be used for day/night Unmanned Ground Vehicle (UGV) autonomous navigation when stealth is required. The quality of uncooled TIR cameras has significantly improved over the last decade, making them a viable option at low speed. Limiting factors for stereo ranging with uncooled LWIR cameras are image blur and low-texture scenes. TIR perception capabilities JPL has explored include: (1) single- and dual-band TIR terrain classification; (2) obstacle detection (pedestrians, vehicles, tree trunks, ditches, and water); (3) perception through obscurants.

  2. LWIR NUC using an uncooled microbolometer camera

    NASA Astrophysics Data System (ADS)

    Laveigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian; McHugh, Steve

    2010-04-01

    Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector. Ideally, NUC will be performed in the same band in which the scene projector will be used. Cooled, large-format MWIR cameras are readily available and have been successfully used to perform NUC; however, cooled large-format LWIR cameras are not as common and are prohibitively expensive. Large-format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an IRSP. Santa Barbara Infrared, Inc. reports progress on a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. The main difficulties are camera instability, temporal response, and thermal resolution. A discussion of the processes developed to mitigate these issues follows.
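
    The abstract does not detail the correction itself; the simplest common form is a two-point NUC, where per-pixel gain and offset are solved from frames of two uniform reference levels and then applied to flatten the response. A self-contained simulated sketch (all values assumed, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-point NUC sketch: per-pixel gain/offset coefficients are derived
# from the camera's response to two uniform reference levels, then
# applied so every pixel reports the same value for a uniform input.
shape = (4, 4)
gain = rng.normal(1.0, 0.05, shape)   # simulated pixel responsivities
offset = rng.normal(0.0, 2.0, shape)  # simulated pixel offsets

def raw(level):
    """Simulated camera response to a spatially uniform input level."""
    return gain * level + offset

low, high = 20.0, 200.0
r_low, r_high = raw(low), raw(high)

# Per-pixel correction coefficients from the two reference frames
g = (high - low) / (r_high - r_low)
o = low - g * r_low

corrected = g * raw(120.0) + o        # apply NUC to a third, unseen level
print(np.allclose(corrected, 120.0))  # response is uniform after correction
```

In practice the reference "frames" come from flood sources at two temperatures, and residual non-linearity limits how well two points flatten the full dynamic range, which is part of why the paper's problem is hard.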

  3. Space Infrared Telescope Facility (SIRTF) science instruments

    NASA Technical Reports Server (NTRS)

    Ramos, R.; Hing, S. M.; Leidich, C. A.; Fazio, G.; Houck, J. R.

    1989-01-01

    Concepts of scientific instruments designed to perform infrared astronomical tasks such as imaging, photometry, and spectroscopy are discussed as part of the Space Infrared Telescope Facility (SIRTF) project under definition study at NASA/Ames Research Center. The instruments are: the multiband imaging photometer, the infrared array camera, and the infrared spectrograph. SIRTF, a cryogenically cooled infrared telescope in the 1-meter class, operating at wavelengths as short as 2.5 microns and carrying multiple instruments with high sensitivity and low-background performance, provides the capability to carry out basic astronomical investigations such as a deep search for very distant protogalaxies, quasi-stellar objects, and missing mass; infrared emission from galaxies; star formation and the interstellar medium; and the composition and structure of the atmospheres of the outer planets in the solar system.

  4. Thermal infrared data of active lava surfaces using a newly-developed camera system

    NASA Astrophysics Data System (ADS)

    Thompson, J. O.; Ramsey, M. S.

    2017-12-01

    Our ability to acquire accurate data during lava flow emplacement greatly improves models designed to predict flow dynamics and down-flow hazard potential. For example, better constraint on the physical property of emissivity as a lava cools improves the accuracy of the derived temperature, a critical parameter for flow models that estimate at-vent eruption rate, flow length, and distribution. Thermal infrared (TIR) data are increasingly used as a tool to determine eruption styles and cooling regimes by measuring temperatures at high temporal resolutions. Factors that control the accurate measurement of surface temperatures include both material properties (e.g., emissivity and surface texture) as well as external factors (e.g., camera geometry and the intervening atmosphere). We present a newly-developed, field-portable miniature multispectral thermal infrared camera (MMT-Cam) to measure both temperature and emissivity of basaltic lava surfaces at up to 7 Hz. The MMT-Cam acquires emitted radiance in six wavelength channels in addition to the broadband temperature. The instrument was laboratory calibrated for systematic errors and fully field tested at the Overlook Crater lava lake (Kilauea, HI) in January 2017. The data show that the major emissivity absorption feature (around 8.5 to 9.0 µm) shifts to longer wavelengths and the depth of the feature decreases as a lava surface cools, forming a progressively thicker crust. This transition occurs over a temperature range of 758 to 518 K. Constraining the relationship between this spectral change and temperature derived from these data will provide more accurate temperatures and therefore more accurate modeling results. This is the first time that emissivity and its link to temperature have been measured in situ on active lava surfaces, which will improve input parameters of flow propagation models and possibly improve flow forecasting.

  5. Privacy Protection by Masking Moving Objects for Security Cameras

    NASA Astrophysics Data System (ADS)

    Yabuta, Kenichi; Kitazawa, Hitoshi; Tanaka, Toshihisa

    Because of the increasing number of security cameras, it is crucial to establish a system that protects the privacy of objects in the recorded images. To this end, we propose a framework of image processing and data hiding for security monitoring and privacy protection. First, we state the requirements of the proposed monitoring systems and suggest a possible implementation that satisfies those requirements. The underlying concept of our proposed framework is as follows: (1) in the recorded images, the objects whose privacy should be protected are deteriorated by appropriate image processing; (2) the original objects are encrypted and watermarked into the output image, which is encoded using an image compression standard; (3) real-time processing is performed such that no future frame is required to generate an output bitstream. It should be noted that in this framework, anyone can observe the decoded image that includes the deteriorated objects that are unrecognizable or invisible. On the other hand, for crime investigation, this system allows a limited number of users to observe the original objects by using a special viewer that decrypts and decodes the watermarked objects with a decoding password. Moreover, the special viewer allows us to select the objects to be decoded and displayed. We provide an implementation example, experimental results, and performance evaluations to support our proposed framework.
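
    Step (1) of the framework can be sketched in isolation. The following is a minimal illustration (not the paper's implementation): moving-object pixels found by simple background subtraction are pixelated so they are unrecognizable in the publicly viewable stream; the encryption and watermarking of the original pixels, steps (2)-(3), are omitted. The block size and threshold are assumed values.

```python
import numpy as np

def pixelate_foreground(frame, background, block=8, thresh=25):
    """Pixelate blocks of `frame` that differ from `background`.

    A block is flattened to its mean value if any pixel in it moved,
    making the moving object unrecognizable while leaving the rest of
    the scene intact.
    """
    out = frame.copy()
    mask = np.abs(frame.astype(int) - background.astype(int)) > thresh
    h, w = frame.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            if mask[y:y+block, x:x+block].any():
                out[y:y+block, x:x+block] = frame[y:y+block, x:x+block].mean()
    return out

bg = np.full((16, 16), 100, dtype=np.uint8)
fr = bg.copy()
fr[4:8, 4:8] = 200                  # a bright "moving object"
masked = pixelate_foreground(fr, bg)
print(masked[5, 5], masked[0, 12])  # pixelated block vs. untouched background
```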

  6. Robot Towed Shortwave Infrared Camera for Specific Surface Area Retrieval of Surface Snow

    NASA Astrophysics Data System (ADS)

    Elliott, J.; Lines, A.; Ray, L.; Albert, M. R.

    2017-12-01

    Optical grain size and specific surface area (SSA) are key parameters for measuring the atmospheric interactions of snow, as well as for tracking metamorphism and ground-truthing remote sensing data. We describe a device using a shortwave infrared camera with changeable optical bandpass filters (centered at 1300 nm and 1550 nm) that can be used to quickly measure the average SSA over an area of 0.25 m^2. The device and method are compared with calculations made from measurements taken with a field spectral radiometer. The instrument is designed to be towed by a small autonomous ground vehicle, and therefore rides above the snow surface on ultra-high-molecular-weight polyethylene (UHMW) skis.

  7. Pettit holds cameras in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-15

    ISS030-E-175788 (15 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, is pictured with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.

  8. Optimising Camera Traps for Monitoring Small Mammals

    PubMed Central

    Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

  9. Cygnids and Taurids - Two classes of infrared objects.

    NASA Technical Reports Server (NTRS)

    Strecker, D. W.; Ney, E. P.; Murdock, T. L.

    1973-01-01

    In a study of the anonymous objects from the IRC Survey, we have found that about 10 percent have large long wave excesses. These infrared stars seem to belong to two classes, one group like NML Cygni (Cygnids) and the other like NML Tauri (Taurids).

  10. Feasibility Study of Utilizing Existing Infrared Array Cameras for Daylight Star Tracking on NASA's Ultra Long Duration Balloon (ULDB) Missions

    NASA Technical Reports Server (NTRS)

    Tueller, Jack (Technical Monitor); Fazio, Giovanni G.; Tolls, Volker

    2004-01-01

    The purpose of this study was to investigate the feasibility of developing a daytime star tracker for ULDB flights using a commercially available off-the-shelf infrared array camera. This report describes the system used for ground-based tests, the observations, the test results, and gives recommendations for continued development.

  11. Infrared camera assessment of skin surface temperature--effect of emissivity.

    PubMed

    Bernard, V; Staffa, E; Mornstein, V; Bourek, A

    2013-11-01

    Infrared thermoimaging is one of the options for object temperature analysis. It is unique due to its non-contact principle of measurement, so it is often used in medicine and for scientific experimental measurements. The presented work aims to determine whether the measurement results can be influenced by topical treatment of the hand surface with various substances. The authors attempted to determine whether the emissivity can be neglected in situations of topical application of substances such as ultrasound gel, ointment, disinfection, etc. The results of the experiments showed that the value of the surface temperature is more or less distorted by the topically applied substance. Our findings demonstrate the effect of the emissivity of applied substances on the resulting temperature and show the necessity of integrating the emissivity into the calculation of the final surface temperature. Infrared thermoimaging can be an appropriate method for determining the temperature of organisms, if this is understood as the surface temperature, and the surrounding environment and its temperature are taken into account. Copyright © 2012 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
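
    The effect described can be illustrated with a simple total-radiance (Stefan-Boltzmann) model, which is a coarser approximation than a real camera's band-limited radiometry: the camera receives eps*T_obj^4 plus reflected ambient radiation (1-eps)*T_amb^4, so assuming eps = 1 biases the reported temperature whenever eps < 1. All numeric values below (skin at 307 K, an assumed coating emissivity of 0.92) are illustrative, not from the paper:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def apparent_temp(t_obj_k, eps, t_amb_k):
    """Temperature reported by a camera that assumes emissivity 1."""
    radiance = eps * SIGMA * t_obj_k**4 + (1 - eps) * SIGMA * t_amb_k**4
    return (radiance / SIGMA) ** 0.25

def true_temp(t_apparent_k, eps, t_amb_k):
    """Invert the same model, given the substance's actual emissivity."""
    radiance = SIGMA * t_apparent_k**4
    return ((radiance - (1 - eps) * SIGMA * t_amb_k**4) / (eps * SIGMA)) ** 0.25

t_app = apparent_temp(307.0, 0.92, 295.0)        # illustrative skin + coating
print(round(t_app - 307.0, 2))                   # bias in kelvin (negative)
print(round(true_temp(t_app, 0.92, 295.0), 2))   # recovers 307.0
```

The inversion only works if the coating's emissivity is known, which is the paper's point: the emissivity of the applied substance must enter the calculation.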

  12. NIFTE: The Near Infrared Faint-Object Telescope Experiment

    NASA Technical Reports Server (NTRS)

    Bock, James J.; Lange, Andrew E.; Matsumoto, T.; Eisenhardt, Peter B.; Hacking, Perry B.; Schember, Helene R.

    1994-01-01

    The high sensitivity of large-format InSb arrays can be used to obtain deep images of the sky at 3-5 micrometers. In this spectral range, cool or highly redshifted objects (e.g. brown dwarfs and protogalaxies) which are not visible at shorter wavelengths may be observed. Sensitivity at these wavelengths in ground-based observations is severely limited by the thermal flux from the telescope and from the earth's atmosphere. The Near Infrared Faint-Object Telescope Experiment (NIFTE), a 50 cm cooled rocket-borne telescope combined with large-format, high-performance InSb arrays, can reach a limiting flux of less than 1 micro-Jy (1-sigma) over a large field of view in a single flight. In comparison, the Infrared Space Observatory (ISO) will require days of observation to reach a sensitivity more than one order of magnitude worse over a similar area of the sky. The deep 3-5 micrometer images obtained by the rocket-borne telescope will assist in determining the nature of faint red objects detected by ground-based telescopes at 2 micrometers, and by ISO at wavelengths longer than 5 micrometers.

  13. Multi-band infrared camera systems

    NASA Astrophysics Data System (ADS)

    Davis, Tim; Lang, Frank; Sinneger, Joe; Stabile, Paul; Tower, John

    1994-12-01

    The program resulted in an IR camera system that utilizes a unique MOS addressable focal plane array (FPA) with full TV resolution, electronic control capability, and windowing capability. Two systems were delivered, each with two different camera heads: a Stirling-cooled 3-5 micron band head and a liquid nitrogen-cooled, filter-wheel-based, 1.5-5 micron band head. Signal processing features include averaging of up to 16 frames, flexible compensation modes, gain and offset control, and real-time dither. The primary digital interface is a Hewlett-Packard standard GPIB (IEEE-488) port that is used to upload and download data. The FPA employs an X-Y addressed PtSi photodiode array, CMOS horizontal and vertical scan registers, and horizontal signal line (HSL) buffers followed by a high-gain preamplifier and a depletion NMOS output amplifier. The 640 x 480 MOS X-Y addressed FPA has a high degree of flexibility in operational modes. By changing the digital data pattern applied to the vertical scan register, the FPA can be operated in either an interlaced or noninterlaced format. The thermal sensitivity performance of the second system's Stirling-cooled head was the best of the systems produced.

  14. 15-micro-m 128 x 128 GaAs/Al(x)Ga(1-x) As Quantum Well Infrared Photodetector Focal Plane Array Camera

    NASA Technical Reports Server (NTRS)

    Gunapala, Sarath D.; Park, Jin S.; Sarusi, Gabby; Lin, True-Lon; Liu, John K.; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Hoelter, Ted

    1997-01-01

    In this paper, we discuss the development of very sensitive, very long wavelength infrared GaAs/Al(x)Ga(1-x)As quantum well infrared photodetectors (QWIPs) based on a bound-to-quasi-bound intersubband transition, the fabrication of random reflectors for efficient light coupling, and the demonstration of a 15 micro-m cutoff 128 x 128 focal plane array imaging camera. Excellent imagery, with a noise equivalent differential temperature (NEΔT) of 30 mK, has been achieved.

  15. Visibility through the gaseous smoke in airborne remote sensing using a DSLR camera

    NASA Astrophysics Data System (ADS)

    Chabok, Mirahmad; Millington, Andrew; Hacker, Jorg M.; McGrath, Andrew J.

    2016-08-01

    Visibility and clarity of remotely sensed images acquired by consumer grade DSLR cameras, mounted on an unmanned aerial vehicle or a manned aircraft, are critical factors in obtaining accurate and detailed information from any area of interest. The presence of substantial haze, fog or gaseous smoke particles; caused, for example, by an active bushfire at the time of data capture, will dramatically reduce image visibility and quality. Although most modern hyperspectral imaging sensors are capable of capturing a large number of narrow range bands of the shortwave and thermal infrared spectral range, which have the potential to penetrate smoke and haze, the resulting images do not contain sufficient spatial detail to enable locating important objects or assist search and rescue or similar applications which require high resolution information. We introduce a new method for penetrating gaseous smoke without compromising spatial resolution using a single modified DSLR camera in conjunction with image processing techniques which effectively improves the visibility of objects in the captured images. This is achieved by modifying a DSLR camera and adding a custom optical filter to enable it to capture wavelengths from 480-1200nm (R, G and Near Infrared) instead of the standard RGB bands (400-700nm). With this modified camera mounted on an aircraft, images were acquired over an area polluted by gaseous smoke from an active bushfire. Processed data using our proposed method shows significant visibility improvements compared with other existing solutions.

  16. The NASA - Arc 10/20 micron camera

    NASA Technical Reports Server (NTRS)

    Roellig, T. L.; Cooper, R.; Deutsch, L. K.; Mccreight, C.; Mckelvey, M.; Pendleton, Y. J.; Witteborn, F. C.; Yuen, L.; Mcmahon, T.; Werner, M. W.

    1994-01-01

    A new infrared camera (AIR Camera) has been developed at NASA - Ames Research Center for observations from ground-based telescopes. The heart of the camera is a Hughes 58 x 62 pixel Arsenic-doped Silicon detector array that has the spectral sensitivity range to allow observations in both the 10 and 20 micron atmospheric windows.

  17. Minimising back reflections from the common path objective in a fundus camera

    NASA Astrophysics Data System (ADS)

    Swat, A.

    2016-11-01

    Eliminating back reflections is critical in the design of a fundus camera with an internal illuminating system. Because very little light is reflected from the retina, even excellent antireflective coatings do not sufficiently suppress ghost reflections; therefore, the number of surfaces in the optics common to the illuminating and imaging paths shall be minimised. Typically a single aspheric objective is used. In this paper an alternative approach, an objective with all spherical surfaces, is presented. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Back-reflection analysis typically comprises treating subsequent objective surfaces as mirrors and tracing the reflections from the objective surfaces back through the imaging path. This approach can be applied in both sequential and non-sequential ray tracing. It is good enough for a system check but not very suitable for the early optimisation process in the optical system design phase. Standard ghost-control merit function operands are also available in sequential ray tracing, for example in the Zemax system, but these do not allow a back ray-trace along an alternative optical path (illumination vs. imaging). What is proposed in this paper is a complete method of incorporating ghost-reflected energy into the ray-tracing system merit function in sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical.

  18. The infrared camera application for calculating the impact of the feed screw thermal expansion on machining accuracy

    NASA Astrophysics Data System (ADS)

    Matras, A.

    2017-08-01

    The paper discusses the impact of feed screw heating on machining accuracy. The test stand was built around a HAAS Mini Mill 2 CNC milling machine and a FLIR SC620 infrared camera. Measurements of the workpiece were performed on a Talysurf Intra 50 (Taylor Hobson) profilometer. The research showed that 60 minutes of intensive milling machine operation caused thermal expansion of the feed screw, which influenced the dimensional error of the workpiece.

  19. Traffic intensity monitoring using multiple object detection with traffic surveillance cameras

    NASA Astrophysics Data System (ADS)

    Hamdan, H. G. Muhammad; Khalifah, O. O.

    2017-11-01

    Object detection and tracking is a field of research with many applications in the current generation, given the increasing number of cameras on the streets and the lower cost of the Internet of Things (IoT). In this paper, a traffic intensity monitoring system based on the macroscopic urban traffic model is proposed, using computer vision as its source. The input of the program is extracted from a traffic surveillance camera, on which another program running a neural network classifier identifies and differentiates vehicle types. The neural network toolbox is trained with positive and negative inputs to increase accuracy. The accuracy of the program is compared with other related works, and the trend of traffic intensity on a road is also calculated. Lastly, limitations and future work are discussed.
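
    The final step, turning per-vehicle detections into a traffic intensity trend, can be sketched independently of the detector. The following illustration (not the paper's code; the event data and window length are assumed) counts vehicles of each type per fixed time window from detections that cross a virtual counting line:

```python
from collections import Counter

def intensity_per_window(crossing_events, fps=30, window_s=60):
    """Count line-crossing vehicles per (time window, vehicle type).

    crossing_events: list of (frame_index, vehicle_type) tuples,
    i.e. the detector's output, abstracted away here.
    """
    counts = Counter()
    for frame, vtype in crossing_events:
        window = int(frame / (fps * window_s))  # which minute the event falls in
        counts[(window, vtype)] += 1
    return dict(counts)

events = [(10, "car"), (45, "truck"), (1900, "car"), (2000, "car")]
print(intensity_per_window(events))
```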

  20. Infrared Fiber Imager

    DTIC Science & Technology

    1999-05-12

    …to an infrared television camera AVTO TVS-2100. The detector in the camera was an InSb crystal having peak sensitivity in the wavelength region between 3.0…

  1. Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor.

    PubMed

    Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung

    2018-03-23

    Recent developments in intelligent surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of the background make detection a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works.

  2. Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor

    PubMed Central

    Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung

    2018-01-01

    Recent developments in intelligent surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of the background make detection a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works. PMID:29570690

  3. Infrared detection, recognition and identification of handheld objects

    NASA Astrophysics Data System (ADS)

    Adomeit, Uwe

    2012-10-01

    A main criterion for the comparison and selection of thermal imagers for military applications is their nominal range performance. This nominal range performance is calculated for a defined task and for standardized target and environmental conditions. The only standardization available to date is STANAG 4347. The target defined there is based on a main battle tank in front view. Because of modified military requirements, this target is no longer up-to-date. Today, different topics are of interest, especially differentiation between friend and foe and identification of humans. There is no direct way to differentiate between friend and foe in asymmetric scenarios, but one clue can be that someone is carrying a weapon. This clue can be transformed into the observer tasks of detection: a person is or is not carrying an object; recognition: the object is a long / medium / short range weapon or civil equipment; and identification: the object can be named (e.g. AK-47, M-4, G36, RPG7, axe, shovel, etc.). These tasks can be assessed experimentally, and from the results of such an assessment a standard target for handheld objects may be derived. For a first assessment, a human carrying 13 different handheld objects in front of his chest was recorded at four different ranges with an IR dual-band camera. From the recorded data, a perception experiment was prepared. It was conducted with 17 observers in a 13-alternative forced-choice, unlimited-observation-time arrangement. The results of the test, together with Minimum Temperature Difference Perceived measurements of the camera and the temperature difference and critical dimension derived from the recorded imagery, allowed defining a first standard target according to the above tasks. This standard target consists of 2.5 / 3.5 / 5 DRI line pairs on target, 0.24 m critical size and 1 K temperature difference. The values are preliminary and have to be refined in the future. Necessary are different aspect angles, different

  4. Extratympanic observation of middle ear structure using a refractive index matching material (glycerol) and an infrared camera.

    PubMed

    Kong, Soo-Keun; Chon, Kyong-Myong; Goh, Eui-Kyung; Lee, Il-Woo; Wang, Soo-Geun

    2014-05-01

    High-resolution computed tomography has been used mainly in the diagnosis of middle ear disease, such as high jugular bulb, congenital cholesteatoma, and ossicular disruption. However, certain diagnoses are confirmed only through exploratory tympanotomy, and there are few noninvasive methods available to observe the middle ear. The purpose of this study was to investigate the effect of glycerol as a refractive index matching material, combined with an infrared (IR) camera system, for extratympanic observation. 30% glycerol was used as a refractive index matching material in five fresh cadavers. The specimens were divided into four subgroups: GN (glycerol no) group, GO (glycerol out) group, GI (glycerol in) group, and GB (glycerol both) group. A printed letter and the middle ear structures behind the tympanic membrane were observed using visible and IR camera systems. In the GB group, a transilluminated letter or ossicle was clearly observed through the tympanic membrane. In particular, even the stapes footplate was transilluminated using the IR camera system in the GB group. This method could be useful in the diagnosis of diseases of the middle ear if it is clinically applied through further studies.

  5. Extratympanic observation of middle ear structure using a refractive index matching material (glycerol) and an infrared camera

    NASA Astrophysics Data System (ADS)

    Kong, Soo-Keun; Chon, Kyong-Myong; Goh, Eui-Kyung; Lee, Il-Woo; Wang, Soo-Geun

    2014-05-01

    High-resolution computed tomography has been used mainly in the diagnosis of middle ear disease, such as high jugular bulb, congenital cholesteatoma, and ossicular disruption. However, certain diagnoses are confirmed only through exploratory tympanotomy, and there are few noninvasive methods available to observe the middle ear. The purpose of this study was to investigate the effect of glycerol as a refractive index matching material, combined with an infrared (IR) camera system, for extratympanic observation. 30% glycerol was used as a refractive index matching material in five fresh cadavers. The specimens were divided into four subgroups: GN (glycerol no) group, GO (glycerol out) group, GI (glycerol in) group, and GB (glycerol both) group. A printed letter and the middle ear structures behind the tympanic membrane were observed using visible and IR camera systems. In the GB group, a transilluminated letter or ossicle was clearly observed through the tympanic membrane. In particular, even the stapes footplate was transilluminated using the IR camera system in the GB group. This method could be useful in the diagnosis of diseases of the middle ear if it is clinically applied through further studies.

  6. Research into a Single-aperture Light Field Camera System to Obtain Passive Ground-based 3D Imagery of LEO Objects

    NASA Astrophysics Data System (ADS)

    Bechis, K.; Pitruzzello, A.

    2014-09-01

    This presentation describes our ongoing research into using a ground-based light field camera to obtain passive, single-aperture 3D imagery of LEO objects. Light field cameras are an emerging and rapidly evolving technology for passive 3D imaging with a single optical sensor. The cameras use an array of lenslets placed in front of the camera focal plane, which provides angle of arrival information for light rays originating from across the target, allowing range to target and 3D image to be obtained from a single image using monocular optics. The technology, which has been commercially available for less than four years, has the potential to replace dual-sensor systems such as stereo cameras, dual radar-optical systems, and optical-LIDAR fused systems, thus reducing size, weight, cost, and complexity. We have developed a prototype system for passive ranging and 3D imaging using a commercial light field camera and custom light field image processing algorithms. Our light field camera system has been demonstrated for ground-target surveillance and threat detection applications, and this paper presents results of our research thus far into applying this technology to the 3D imaging of LEO objects. The prototype 3D imaging camera system developed by Northrop Grumman uses a Raytrix R5 C2GigE light field camera connected to a Windows computer with an nVidia graphics processing unit (GPU). The system has a frame rate of 30 Hz, and a software control interface allows for automated camera triggering and light field image acquisition to disk. Custom image processing software then performs the following steps: (1) image refocusing, (2) change detection, (3) range finding, and (4) 3D reconstruction. In Step (1), a series of 2D images are generated from each light field image; the 2D images can be refocused at up to 100 different depths. Currently, steps (1) through (3) are automated, while step (4) requires some user interaction. A key requirement for light field camera

  7. Taking on the Heat—a Narrative Account of How Infrared Cameras Invite Instant Inquiry

    NASA Astrophysics Data System (ADS)

    Haglund, Jesper; Jeppsson, Fredrik; Schönborn, Konrad J.

    2016-10-01

    Integration of technology, social learning and scientific models offers pedagogical opportunities for science education. A particularly interesting area is thermal science, where students often struggle with abstract concepts, such as heat. In taking on this conceptual obstacle, we explore how hand-held infrared (IR) visualization technology can strengthen students' understanding of thermal phenomena. Grounded in the Swedish physics curriculum and part of a broader research programme on educational uses of IR cameras, we have developed laboratory exercises around a thermal storyline, in conjunction with the teaching of a heat-flow model. We report a narrative analysis of how a group of five fourth-graders, facilitated by a researcher, predicts, observes and explains (POE) how the temperatures change when they pour hot water into a ceramic coffee mug and a thin plastic cup. Four chronological episodes are described and analysed as group interaction unfolded. Results revealed that the students engaged cognitively and emotionally with the POE task and, in particular, held a sustained focus on making observations and offering explanations for the scenarios. A compelling finding was the group's spontaneous generation of multiple "what-ifs" in relation to thermal phenomena, such as blowing on the water surface, or submerging a pencil into the hot water. This was followed by immediate interrogation with the IR camera, a learning event we label instant inquiry. The students' expressions largely reflected adoption of the heat-flow model. In conclusion, IR cameras could serve as an access point for even very young students to develop complex thermal concepts.

  8. Spatiotemporal motion boundary detection and motion boundary velocity estimation for tracking moving objects with a moving camera: a level sets PDEs approach with concurrent camera motion compensation.

    PubMed

    Feghali, Rosario; Mitiche, Amar

    2004-11-01

    The purpose of this study is to investigate a method of tracking moving objects with a moving camera. This method simultaneously estimates the motion induced by camera movement. The problem is formulated as a Bayesian motion-based partitioning problem in the spatiotemporal domain of the image sequence. An energy functional is derived from the Bayesian formulation. The Euler-Lagrange descent equations determine simultaneously an estimate of the image motion field induced by camera motion and an estimate of the spatiotemporal motion boundary surface. The Euler-Lagrange equation corresponding to the surface is expressed as a level-set partial differential equation for topology independence and numerically stable implementation. The method can be initialized simply and can track multiple objects with nonsimultaneous motions. Velocities on motion boundaries can be estimated from geometrical properties of the motion boundary. Several examples of experimental verification are given using synthetic and real-image sequences.

  9. Camera traps can be heard and seen by animals.

    PubMed

    Meek, Paul D; Ballard, Guy-Anthony; Fleming, Peter J S; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.

  10. Camera Traps Can Be Heard and Seen by Animals

    PubMed Central

    Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals’ hearing and produce illumination that can be seen by many species. PMID:25354356

  11. Change detection and characterization of volcanic activity using ground based low-light and near infrared cameras to monitor incandescence and thermal signatures

    NASA Astrophysics Data System (ADS)

    Harrild, M.; Webley, P.; Dehn, J.

    2014-12-01

    Knowledge and understanding of precursory events and thermal signatures are vital for monitoring volcanogenic processes, as activity can often range from low level lava effusion to large explosive eruptions, easily capable of ejecting ash up to aircraft cruise altitudes. Using ground based remote sensing techniques to monitor and detect this activity is essential, but often the required equipment and its maintenance are expensive. Our investigation explores the use of low-light cameras to image volcanic activity in the visible to near infrared (NIR) portion of the electromagnetic spectrum. These cameras are ideal for monitoring as they are cheap, consume little power, are easily replaced and can provide near real-time data. We focus here on the early detection of volcanic activity, using automated scripts that capture streaming online webcam imagery and evaluate image pixel brightness values to determine relative changes and flag increases in activity. The script is written in Python, an open source programming language, to reduce the overall cost to potential consumers and increase the application of these tools across the volcanological community. In addition, by performing laboratory tests to determine the spectral response of these cameras, a direct comparison of collocated low-light and thermal infrared cameras has allowed approximate eruption temperatures and effusion rates to be determined from pixel brightness. The results of a field campaign in June 2013 to Stromboli volcano, Italy, are also presented here. Future field campaigns to Latin America will include collaborations with INSIVUMEH in Guatemala, to apply our techniques to Fuego and Santiaguito volcanoes.
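
    The brightness-thresholding idea behind such monitoring scripts can be sketched in a few lines of Python. This is a hypothetical simplification of the authors' automated scripts; the function and parameter names are illustrative only.

```python
import numpy as np

def flag_activity(frame, baseline, k=3.0):
    """Flag increased incandescence when the current frame's mean pixel
    brightness exceeds the rolling baseline by k standard deviations.

    frame    -- 2D array of pixel brightness values from a webcam image
    baseline -- sequence of mean brightness values from recent frames
    """
    mean_now = float(np.mean(frame))
    mu, sigma = np.mean(baseline), np.std(baseline)
    flagged = mean_now > mu + k * max(sigma, 1e-6)
    return flagged, mean_now
```

    In practice the frame would come from a downloaded webcam image, and the baseline would be updated only with unflagged frames so that a persistent eruption does not raise its own detection threshold.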

  12. Change detection and characterization of volcanic activity using ground based low-light and near infrared cameras to monitor incandescence and thermal signatures

    NASA Astrophysics Data System (ADS)

    Harrild, Martin; Webley, Peter; Dehn, Jonathan

    2015-04-01

    Knowledge and understanding of precursory events and thermal signatures are vital for monitoring volcanogenic processes, as activity can often range from low level lava effusion to large explosive eruptions, easily capable of ejecting ash up to aircraft cruise altitudes. Using ground based remote sensing techniques to monitor and detect this activity is essential, but often the required equipment and maintenance is expensive. Our investigation explores the use of low-light cameras to image volcanic activity in the visible to near infrared (NIR) portion of the electromagnetic spectrum. These cameras are ideal for monitoring as they are cheap, consume little power, are easily replaced and can provide near real-time data. We focus here on the early detection of volcanic activity, using automated scripts, that capture streaming online webcam imagery and evaluate image pixel brightness values to determine relative changes and flag increases in activity. The script is written in Python, an open source programming language, to reduce the overall cost to potential consumers and increase the application of these tools across the volcanological community. In addition, by performing laboratory tests to determine the spectral response of these cameras, a direct comparison of collocated low-light and thermal infrared cameras has allowed approximate eruption temperatures and effusion rates to be determined from pixel brightness. The results of a field campaign in June, 2013 to Stromboli volcano, Italy, are also presented here. Future field campaigns to Latin America will include collaborations with INSIVUMEH in Guatemala, to apply our techniques to Fuego and Santiaguito volcanoes.

  13. Thin and thick cloud top height retrieval algorithm with the Infrared Camera and LIDAR of the JEM-EUSO Space Mission

    NASA Astrophysics Data System (ADS)

    Sáez-Cano, G.; Morales de los Ríos, J. A.; del Peral, L.; Neronov, A.; Wada, S.; Rodríguez Frías, M. D.

    2015-03-01

    The origin of cosmic rays has remained a mystery for more than a century. JEM-EUSO is a pioneering space-based telescope that will be located at the International Space Station (ISS); its aim is to detect Ultra High Energy Cosmic Rays (UHECR) and Extremely High Energy Cosmic Rays (EHECR) by observing the atmosphere. Unlike ground-based telescopes, JEM-EUSO will observe from above, and therefore, for proper UHECR reconstruction under cloudy conditions, a key element of JEM-EUSO is an Atmospheric Monitoring System (AMS). This AMS consists of a space-qualified bi-spectral Infrared Camera, which will provide the cloud coverage and cloud top height in the JEM-EUSO Field of View (FoV), and a LIDAR, which will measure the atmospheric optical depth in the direction in which it is shot. In this paper we explain the effects of clouds on the determination of the UHECR arrival direction. Moreover, since cloud top height retrieval is crucial for analyzing UHECR and EHECR events under cloudy conditions, the retrieval algorithm that fulfills the technical requirements of the Infrared Camera of JEM-EUSO to reconstruct the cloud top height is reported here.

  14. Time-Resolved Near-Infrared Photometry of Extreme Kuiper Belt Object Haumea

    NASA Astrophysics Data System (ADS)

    Lacerda, Pedro

    2009-02-01

    We present time-resolved near-infrared (J and H) photometry of the extreme Kuiper belt object (136108) Haumea (formerly 2003 EL61) taken to further investigate rotational variability of this object. The new data show that the near-infrared peak-to-peak photometric range is similar to the value at visible wavelengths, ΔmR = 0.30 ± 0.02 mag. Detailed analysis of the new and previous data reveals subtle visible/near-infrared color variations across the surface of Haumea. The color variations are spatially correlated with a previously identified surface region, redder in B - R and darker than the mean surface. Our photometry indicates that the J - H colors of Haumea (J - H = -0.057 ± 0.016 mag) and its brightest satellite Hi'iaka (J - H = -0.399 ± 0.034 mag) are significantly (greater than 9σ) different. The satellite Hi'iaka is unusually blue in J - H, consistent with strong 1.5 μm water-ice absorption. The phase coefficient of Haumea is found to increase monotonically with wavelength in the range 0.4 < λ < 1.3. We compare our findings with other solar system objects and discuss implications regarding the surface of Haumea.

  15. A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer.

    PubMed

    Shen, Bailey Y; Mukai, Shizuo

    2017-01-01

    Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient.

  16. A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer

    PubMed Central

    Shen, Bailey Y.

    2017-01-01

    Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133mm × 91mm × 45mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient. PMID:28396802

  17. Near infrared observations of S 155. Evidence of induced star formation?

    NASA Astrophysics Data System (ADS)

    Hunt, L. K.; Lisi, F.; Felli, M.; Tofani, G.

    In order to investigate the possible existence of embedded objects of recent formation in the area of the Cepheus B - Sh2-155 interface, the authors have observed the region of the compact radio continuum source with the new near infrared camera ARNICA and the TIRGO telescope.

  18. Infrared and Radio Observations of a Small Group of Protostellar Objects in the Molecular Core, L1251-C

    NASA Astrophysics Data System (ADS)

    Kim, Jungha; Lee, Jeong-Eun; Choi, Minho; Bourke, Tyler L.; Evans, Neal J., II; Di Francesco, James; Cieza, Lucas A.; Dunham, Michael M.; Kang, Miju

    2015-05-01

    We present a multi-wavelength observational study of a low-mass star-forming region, L1251-C, with observational results at wavelengths from the near-infrared to the millimeter. Spitzer Space Telescope observations confirmed that IRAS 22343+7501 is a small group of protostellar objects. The extended emission in the east-west direction with its intensity peak at the center of L1251A has been detected at 350 and 850 μm with the Caltech Submillimeter Observatory and James Clerk Maxwell telescopes, tracing dense envelope material around L1251A. The single-dish data from the Korean VLBI Network and TRAO telescopes show inconsistencies between the intensity peaks of several molecular emission lines and that of the continuum emission, suggesting complex distributions of molecular abundances around L1251A. The Submillimeter Array interferometer data, however, show intensity peaks of CO 2-1 and 13CO 2-1 located at the position of IRS 1, which is both the brightest source in the Infrared Array Camera image and the weakest source in the 1.3 mm dust-continuum map. IRS 1 is the strongest candidate for the driving source of the newly detected compact CO 2-1 outflow. Over the entire region (14′ × 14′) of L1251-C, 3 Class I and 16 Class II sources have been detected, including three young stellar objects (YSOs) in L1251A. A comparison between the average projected distance among the 19 YSOs in L1251-C and that among the 3 YSOs in L1251A suggests that L1251-C is an example of low-mass cluster formation where protostellar objects form in a small group.

  19. Comparison of two surface temperature measurement methods using thermocouples and an infrared camera

    NASA Astrophysics Data System (ADS)

    Michalski, Dariusz; Strąk, Kinga; Piasecka, Magdalena

    This paper compares two methods applied to measure surface temperatures in an experimental setup designed to analyse flow boiling heat transfer. The temperature measurements were performed in two parallel rectangular minichannels, both 1.7 mm deep, 16 mm wide and 180 mm long. The heating element for the fluid flowing in each minichannel was a thin foil made of Haynes-230. The two measurement methods employed to determine the surface temperature of the foil were: the contact method, which involved mounting thermocouples at several points in one minichannel, and the contactless method, used to study the other minichannel, where the results were provided by an infrared camera. Calculations were necessary to compare the temperature results. Two sets of measurement data obtained for different values of the heat flux were analysed using basic statistical methods, taking into account the experimental error and the method accuracy. The comparative analysis showed that, although the values and distributions of the surface temperatures obtained with the two methods were similar, both methods had certain limitations.

  20. Temperature measurement with industrial color camera devices

    NASA Astrophysics Data System (ADS)

    Schmidradler, Dieter J.; Berndorfer, Thomas; van Dyck, Walter; Pretschuh, Juergen

    1999-05-01

    This paper discusses color-camera-based temperature measurement. Usually, visual imaging and infrared image sensing are treated as two separate disciplines. We show that a well-selected color camera device can be a cheaper, more robust and more sophisticated solution for optical temperature measurement in several cases. Herein, only implementation fragments and important restrictions for the sensing element are discussed. Our aim is to draw the reader's attention to the use of visual image sensors for measuring thermal radiation and temperature, and to give reasons for the need for improved technologies for infrared camera devices. With AVL-List, our industry partner, we successfully used the proposed sensor to perform temperature measurements of flames inside the combustion chamber of diesel engines, which finally led to the presented insights.

  1. Built-in hyperspectral camera for smartphone in visible, near-infrared and middle-infrared lights region (second report): sensitivity improvement of Fourier-spectroscopic imaging to detect diffuse reflection lights from internal human tissues for healthcare sensors

    NASA Astrophysics Data System (ADS)

    Kawashima, Natsumi; Hosono, Satsuki; Ishimaru, Ichiro

    2016-05-01

    We proposed the snapshot-type Fourier spectroscopic imaging for smartphones described in the first report at this conference. For spectroscopic component analysis, such as non-invasive blood glucose sensing, the diffuse reflection light from internal human skin is very weak for conventional hyperspectral cameras, such as the AOTF (Acousto-Optic Tunable Filter) type. Furthermore, it is well known that the spectral absorption of mid-infrared light, or Raman spectroscopy especially in the long-wavelength region, is effective for distinguishing specific biomedical components quantitatively, such as glucose concentration. The main issue, however, is that the photon energies of middle-infrared light and the intensities of Raman scattering are extremely weak. To improve the sensitivity of our spectroscopic imager, the wide-field-stop & beam-expansion method was proposed. Our line spectroscopic imager introduced a single slit as a field stop on the conjugate objective plane. Obviously, to increase the detected light intensity, a wider slit width of the field stop makes light intensities higher, regardless of the deterioration of spatial resolution. Because our method is based on wavefront-division interferometry, a problem arises: the wider the single slit, the narrower the diffraction angle. This means that the narrower diameter of the collimated objective beams deteriorates the visibility of the interferograms. By installing the relative inclined phaseshifter onto the optical Fourier transform plane of an infinity-corrected optical system, the collimated half-fluxes of the objective beams derived from single bright points on the objective surface penetrate the wedge prism and the cuboid glass, respectively. These two beams interfere with each other and form the interferogram as a spatial fringe pattern. Thus, we installed a concave-cylindrical lens between the wider slit and the objective lens as a beam expander.
We successfully obtained the spectroscopic characteristics of hemoglobin from reflected light from
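
    The core of Fourier-spectroscopic imaging is that each wavelength maps to one spatial fringe frequency, so the spectrum can be recovered from the interferogram by a Fourier transform. The sketch below is a generic illustration of that principle, not the authors' implementation.

```python
import numpy as np

def spectrum_from_interferogram(interferogram):
    """Recover an (unnormalized) spectrum from a spatial-fringe
    interferogram: a single wavelength appears as a single spatial
    frequency in the fringe pattern, so its spectral line is one peak
    in the Fourier magnitude."""
    ig = interferogram - np.mean(interferogram)  # remove the DC bias
    return np.abs(np.fft.rfft(ig))
```

    For a monochromatic source producing, say, 10 fringe cycles across the detector line, the recovered spectrum peaks at frequency bin 10.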

  2. KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility inspect the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) on its handling fixture. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.

    NASA Image and Video Library

    1997-01-18

    KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility inspect the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) on its handling fixture. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.

  3. KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lower the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) into the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.

    NASA Image and Video Library

    1997-01-18

    KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lower the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) into the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.

  4. KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lower the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) into the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS is HST's first cryogenic instrument -- its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.

    NASA Image and Video Library

    1997-01-16

    KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lower the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) into the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS is HST's first cryogenic instrument -- its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.

  5. The Effect of a Pre-Lens Aperture on the Temperature Range and Image Uniformity of Microbolometer Infrared Cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinwiddie, Ralph Barton; Parris, Larkin S.; Lindal, John M.

    This paper explores the temperature range extension of long-wavelength infrared (LWIR) cameras by placing an aperture in front of the lens. An aperture smaller than the lens will reduce the radiance to the sensor, allowing the camera to image targets much hotter than typically allowable. These higher temperatures were accurately determined after developing a correction factor which was applied to the built-in temperature calibration. The relationship between aperture diameter and temperature range is linear. The effect of pre-lens apertures on image uniformity is a form of anti-vignetting, meaning the corners appear brighter (hotter) than the rest of the image. An example of using this technique to measure temperatures of high-melting-point polymers during 3D printing provides valuable information on the time required for the weld-line temperature to fall below the glass transition temperature.

  6. Verification of the test stand for microbolometer camera in accredited laboratory

    NASA Astrophysics Data System (ADS)

    Krupiński, Michal; Bareła, Jaroslaw; Chmielewski, Krzysztof; Kastek, Mariusz

    2017-10-01

    A microbolometer belongs to the group of thermal detectors and consists of a temperature-sensitive resistor exposed to the measured radiation flux. A bolometer array employs a pixel structure prepared in silicon technology. The detecting area is defined by the size of a thin membrane, usually made of amorphous silicon (a-Si) or vanadium oxide (VOx). FPAs are made of a multitude of detector elements (for example 384 × 288), where each individual detector has a different sensitivity and offset due to detector-to-detector spread in the FPA fabrication process; these can additionally change with sensor operating temperature, bias voltage variation or the temperature of the observed scene. The difference in sensitivity and offset among detectors (called non-uniformity), combined with their high sensitivity, produces fixed pattern noise (FPN) in the image. Fixed pattern noise degrades parameters of infrared cameras such as sensitivity or NETD. Additionally, it degrades image quality, radiometric accuracy and temperature resolution. In order to objectively compare two infrared cameras, one must measure and compare their parameters on a laboratory test stand. One of the basic parameters for the evaluation of a designed camera is NETD. In order to examine the NETD, parameters such as sensitivity and pixel noise must be measured. To do so, one should record the output signal from the camera in response to the radiation of blackbodies at two different temperatures. This article presents an application and a measurement stand for determining the parameters of microbolometer cameras. The measurements were compared with the results of measurements at the Institute of Optoelectronics, MUT, on a METS test stand by CI SYSTEM. This test stand consists of an IR collimator, an IR standard source, a rotating wheel with test patterns, a computer with a video grabber card and specialized software. The parameters of the thermal cameras were measured according to the norms and methods described.
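
    The two-blackbody NETD measurement described above reduces to dividing per-pixel temporal noise by per-pixel responsivity. A minimal sketch under stated assumptions (hypothetical variable names; frame stacks shaped [n_frames, rows, cols]):

```python
import numpy as np

def netd(frames_t1, frames_t2, t1, t2):
    """Estimate NETD from frame stacks recorded against blackbodies at
    temperatures t1 and t2.

    Responsivity is the per-pixel signal difference per kelvin between
    the two blackbody levels; temporal noise is the per-pixel standard
    deviation over the frame stack.  The median over pixels gives a
    single camera-level figure.
    """
    resp = (frames_t2.mean(axis=0) - frames_t1.mean(axis=0)) / (t2 - t1)
    noise = frames_t1.std(axis=0)
    return float(np.median(noise / resp))
```

    For example, a pixel whose signal rises by 2 counts per kelvin and whose temporal noise is 1 count has an NETD of 0.5 K.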

  7. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
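
    The weighted k-nearest-neighbour step can be illustrated independently of the 45 texture features. The sketch below uses inverse-distance weighting, one common choice; the paper's exact weighting scheme and feature set are not reproduced here, and all names are illustrative.

```python
import numpy as np

def weighted_knn(train_feats, train_labels, query, k=5):
    """Classify a query feature vector by an inverse-distance-weighted
    vote among its k nearest training samples (Euclidean distance)."""
    d = np.linalg.norm(train_feats - query, axis=1)
    idx = np.argsort(d)[:k]
    votes = {}
    for i in idx:
        w = 1.0 / (d[i] + 1e-9)   # closer neighbours vote more strongly
        votes[train_labels[i]] = votes.get(train_labels[i], 0.0) + w
    return max(votes, key=votes.get)
```

    With texture feature vectors labelled by a trained observer (e.g. "CB", "TCU"), a query image is assigned the label whose nearby training samples carry the most weight.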

  8. In-situ calibration of nonuniformity in infrared staring and modulated systems

    NASA Astrophysics Data System (ADS)

    Black, Wiley T.

Infrared cameras can directly measure the apparent temperature of objects, providing thermal imaging. However, the raw output from most infrared cameras suffers from a strong, often limiting noise source called nonuniformity. Manufacturing imperfections in infrared focal planes lead to high pixel-to-pixel sensitivity to electronic bias, focal plane temperature, and other effects. The resulting imagery can only provide useful thermal imaging after a nonuniformity calibration has been performed. Traditionally, these calibrations are performed by momentarily blocking the field of view with a flat plate of uniform temperature or a blackbody cavity. However, because the pattern is a coupling of manufactured sensitivities with operational variations, periodic recalibration is required, sometimes on the order of tens of seconds. A class of computational methods called Scene-Based Nonuniformity Correction (SBNUC) has been researched for over 20 years, in which the nonuniformity calibration is estimated in digital processing by analysis of the video stream in the presence of camera motion. The most sophisticated SBNUC methods can completely and robustly eliminate the high-spatial-frequency component of nonuniformity with only an initial reference calibration, or potentially no physical calibration. I will demonstrate a novel algorithm that advances these SBNUC techniques to support nonuniformity correction at all spatial frequencies. Long-wave infrared microgrid polarimeters are a class of cameras that incorporate a microscale per-pixel wire-grid polarizer directly affixed to each pixel of the focal plane. These cameras are capable of simultaneously measuring thermal imagery and polarization in a robust integrated package with no moving parts. I will describe the necessary adaptations of my SBNUC method to operate on this class of sensor, as well as demonstrate SBNUC performance on LWIR polarimetry video collected on the UA mall.
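For contrast with the scene-based approach, the traditional reference-based calibration the abstract mentions can be sketched as a per-pixel two-point (gain/offset) correction derived from two uniform blackbody views. All numbers below are synthetic:

```python
import numpy as np

def two_point_nuc(raw_cold, raw_hot, T_cold, T_hot):
    """Per-pixel gain and offset from two uniform blackbody views."""
    gain = (T_hot - T_cold) / (raw_hot - raw_cold)
    offset = T_cold - gain * raw_cold
    return gain, offset

rng = np.random.default_rng(0)
true_gain = rng.uniform(0.9, 1.1, (4, 4))       # simulated fixed-pattern gain
true_off = rng.uniform(-5.0, 5.0, (4, 4))       # simulated fixed-pattern offset
raw_at = lambda T: (T - true_off) / true_gain   # what the detector reports

g, o = two_point_nuc(raw_at(20.0), raw_at(40.0), 20.0, 40.0)
corrected = g * raw_at(30.0) + o                # uniform 30 °C scene, recovered
```

SBNUC methods estimate essentially the same per-pixel correction, but from scene motion rather than from a physical reference.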

  9. Balancing Science Objectives and Operational Constraints: A Mission Planner's Challenge

    NASA Technical Reports Server (NTRS)

    Weldy, Michelle

    1996-01-01

The Air Force Miniature Sensor Technology Integration (MSTI-3) satellite's primary mission is to characterize Earth's atmospheric background clutter. MSTI-3 will use three cameras for data collection: a mid-wave infrared imager, a short-wave infrared imager, and a visible imaging spectrometer. Mission science objectives call for the collection of over 2 million images within the one-year mission life. In addition, operational constraints limit camera usage to four operations of twenty minutes per day, with no more than 10,000 data and calibration images collected per day. To balance the operational constraints and science objectives, the mission planning team has designed a planning process to generate event schedules and sensor operation timelines. Each set of constraints, including spacecraft performance capabilities, the camera filters, the geographical regions and spacecraft-Sun-Earth geometries of interest, and remote tracking station deconfliction, has been accounted for in this methodology. To aid in this process, the mission planning team is building a series of tools from commercial off-the-shelf software. These include the mission manifest, which builds a daily schedule of events, and the MSTI Scene Simulator, which helps build geometrically correct scans. These tools provide an efficient, responsive, and highly flexible architecture that maximizes data collection while minimizing mission planning time.

  10. Faint Object Camera observations of a globular cluster nova field

    NASA Technical Reports Server (NTRS)

    Margon, Bruce; Anderson, Scott F.; Downes, Ronald A.; Bohlin, Ralph C.; Jakobsen, Peter

    1991-01-01

    The Faint Object Camera onboard Hubble Space Telescope has obtained U and B images of the field of Nova Ophiuchi 1938 in the globular cluster M14 (NGC 6402). The candidate for the quiescent nova suggested by Shara et al. (1986) is clearly resolved into at least six separate images, probably all stellar, in a region of 0.5 arcsec. Although two of these objects are intriguing as they are somewhat ultraviolet, the actual nova counterpart remains ambiguous, as none of the images in the field has a marked UV excess. Many stars within the 1.4 arcsec (2 sigma) uncertainty of the nova outburst position are viable counterparts if only astrometric criteria are used for selection. The 11 x 11 arcsec frames easily resolve several hundred stars in modest exposures, implying that HST even in its current optical configuration will be unique for studies of very crowded fields at moderate (B = 22) limiting magnitudes.

  11. The Beagle 2 Stereo Camera System: Scientific Objectives and Design Characteristics

    NASA Astrophysics Data System (ADS)

    Griffiths, A.; Coates, A.; Josset, J.; Paar, G.; Sims, M.

    2003-04-01

The Stereo Camera System (SCS) will provide wide-angle (48 degree) multi-spectral stereo imaging of the Beagle 2 landing site in Isidis Planitia with an angular resolution of 0.75 milliradians. Based on the SpaceX Modular Micro-Imager, the SCS is composed of twin cameras (each with a 1024 by 1024 pixel frame-transfer CCD) and twin filter wheel units (with a combined total of 24 filters). The primary mission objective is to construct a digital elevation model of the area in reach of the lander’s robot arm. The SCS specifications are described, together with the following baseline studies: panoramic RGB colour imaging of the landing site; panoramic multi-spectral imaging at 12 distinct wavelengths to study the mineralogy of the landing site; and solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged are multi-spectral observations of Phobos & Deimos (observations of the moons relative to background stars will be used to determine the lander’s location and orientation relative to the Martian surface), monitoring of the landing site to detect temporal changes, observation of the actions and effects of the other PAW experiments (including rock texture studies with a close-up lens) and collaborative observations with the Mars Express orbiter instrument teams. Due to be launched in May of this year, the total system mass is 360 g, the required volume envelope is 747 cm^3 and the average power consumption is 1.8 W. A 10 Mbit/s RS422 bus connects each camera to the lander common electronics.

  12. The Texas Thermal Interface: A real-time computer interface for an Inframetrics infrared camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Storek, D.J.; Gentle, K.W.

    1996-03-01

The Texas Thermal Interface (TTI) offers an advantageous alternative to the conventional video path for computer analysis of infrared images from Inframetrics cameras. The TTI provides real-time computer data acquisition of 48 consecutive fields (version described here) with 8-bit pixels. The alternative requires time-consuming individual frame grabs from video tape with frequent loss of resolution in the D/A/D conversion. Within seconds after the event, the TTI temperature files may be viewed and processed to infer heat fluxes or other quantities as needed. The system cost is far less than commercial units which offer less capability. The system was developed for, and is being used in, measurements of heat fluxes to the plasma-facing components in a tokamak. © 1996 American Institute of Physics.

  13. A new concept of real-time security camera monitoring with privacy protection by masking moving objects

    NASA Astrophysics Data System (ADS)

    Yabuta, Kenichi; Kitazawa, Hitoshi; Tanaka, Toshihisa

    2006-02-01

Recently, the number of security monitoring cameras has been increasing rapidly. However, it is normally difficult to know when and where we are being monitored by these cameras, and how the recorded images are stored and used. Therefore, how to protect privacy in the recorded images is a crucial issue. In this paper, we address this problem and introduce a framework for security monitoring systems that considers privacy protection. We state requirements for monitoring systems in this framework and propose a possible implementation that satisfies them. To protect the privacy of recorded objects, they are made invisible by appropriate image processing techniques. Moreover, the original objects are encrypted and watermarked into the image with the "invisible" objects, which is coded by the JPEG standard. Therefore, the image decoded by a normal JPEG viewer includes only the unrecognizable or invisible objects. We also introduce a so-called "special viewer" that decrypts and displays the original objects. This special viewer can be used by a limited set of users when necessary, e.g. for crime investigation. The special viewer allows us to choose which objects to decode and display. Moreover, the proposed system supports real-time processing, since no future frame is needed to generate a bitstream.
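The first step of such a system, making moving objects "invisible" before encoding, can be sketched with simple background subtraction. This is a minimal illustration only; the paper's encryption and JPEG watermarking stages are omitted:

```python
import numpy as np

def mask_moving_objects(frame, background, thresh=20, fill=0):
    """Blank out pixels that differ from the background by more than thresh."""
    mask = np.abs(frame.astype(int) - background.astype(int)) > thresh
    out = frame.copy()
    out[mask] = fill            # objects become "invisible" in the public image
    return out, mask            # the mask marks which pixels to encrypt/watermark

bg = np.full((5, 5), 100, dtype=np.uint8)   # static background frame
frm = bg.copy()
frm[1:3, 1:3] = 200                         # a moving object enters the scene
masked, m = mask_moving_objects(frm, bg)
```

Because only the current frame and a background model are needed, this step is causal and therefore compatible with the real-time requirement stated above.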

  14. An infrared image based methodology for breast lesions screening

    NASA Astrophysics Data System (ADS)

    Morais, K. C. C.; Vargas, J. V. C.; Reisemberger, G. G.; Freitas, F. N. P.; Oliari, S. H.; Brioschi, M. L.; Louveira, M. H.; Spautz, C.; Dias, F. G.; Gasperin, P.; Budel, V. M.; Cordeiro, R. A. G.; Schittini, A. P. P.; Neto, C. D.

    2016-05-01

The objective of this paper is to evaluate the potential of utilizing a structured methodology for breast lesion screening, based on infrared imaging temperature measurements of a healthy control group, to establish expected normality ranges, and of breast cancer patients previously diagnosed through biopsies of the affected regions. An analysis of the systematic error of the infrared camera skin temperature measurements was conducted in several different regions of the body, by direct comparison to high-precision thermistor temperature measurements, showing that infrared camera temperatures are consistently around 2 °C above the thermistor temperatures. Therefore, a method of conjugated gradients is proposed to eliminate the imprecision of direct infrared camera temperature measurements, by calculating the temperature difference between two points so that the error cancels out. The method exploits the approximate bilateral symmetry of the human body and compares measured dimensionless temperature differences (Δθ̄) between two symmetric regions of the patient's breast; since Δθ̄ accounts for the breast region, the surrounding ambient and the individual core temperatures, the interpretation of results for different individuals becomes simple and non-subjective. The range of normal whole-breast average dimensionless temperature differences was determined for 101 healthy individuals, and, assuming that the breast temperatures exhibit a unimodal normal distribution, the healthy normal range for each region was taken to be the mean dimensionless temperature difference plus or minus twice the standard deviation of the measurements, Δθ̄ ± 2σ(Δθ̄), in order to represent 95% of the population. Forty-seven patients with breast cancer previously diagnosed through biopsies were examined with the method, which was capable of detecting breast abnormalities in 45 cases (96%). Therefore, the conjugated gradients method was considered effective
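The screening rule can be illustrated numerically: a region is flagged when its dimensionless temperature difference falls outside the healthy mean plus or minus two standard deviations. The control values below are invented for illustration, not the study's data:

```python
import numpy as np

def outside_normal_range(delta_theta, healthy_deltas):
    """Flag a region whose dimensionless temperature difference lies outside
    the healthy mean +/- two standard deviations (the 95% normality range)."""
    mu = healthy_deltas.mean()
    sigma = healthy_deltas.std()
    return abs(delta_theta - mu) > 2.0 * sigma

# Invented healthy-control values of the dimensionless difference
healthy = np.array([0.01, -0.02, 0.00, 0.02, -0.01, 0.01, -0.01, 0.00])
normal_case = outside_normal_range(0.005, healthy)   # within the range
suspect_case = outside_normal_range(0.30, healthy)   # far outside it
```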

  15. ARNICA, the NICMOS 3 imaging camera of TIRGO.

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Hunt, L.; Stanga, R.

ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 μm that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1″ per pixel, with sky coverage of more than 4 arcmin × 4 arcmin on the NICMOS 3 (256×256 pixels, 40 μm side) detector array. The camera is remotely controlled by a PC 486, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the PC 486, acquires and stores the frames, and controls the timing of the array. The camera is intended for imaging of large extra-galactic and Galactic fields; a large effort has been dedicated to explore the possibility of achieving precise photometric measurements in the J, H, K astronomical bands, with very promising results.

  16. Characterization and Application of a Grazing Angle Objective for Quantitative Infrared Reflection Microspectroscopy

    NASA Technical Reports Server (NTRS)

    Pepper, Stephen V.

    1995-01-01

A grazing angle objective on an infrared microspectrometer is studied for quantitative spectroscopy by considering the angular dependence of the incident intensity within the objective's angular aperture. The assumption that there is no angular dependence is tested by comparing the experimental reflectance of Si and KBr surfaces with the reflectance calculated by integrating the Fresnel reflection coefficient over the angular aperture under this assumption. Good agreement was found, indicating that the specular reflectance of surfaces can be treated quantitatively in a straightforward way by integrating over the angular aperture, without modelling a non-uniform incident intensity. This quantitative approach is applied to the thickness determination of dip-coated Krytox on gold. The infrared optical constants of both materials are known, allowing the integration to be carried out. The thickness obtained is in fair agreement with the value determined by ellipsometry in the visible. This paper therefore illustrates a method for more quantitative use of a grazing angle objective in infrared reflectance microspectroscopy.
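The calculation described can be reproduced in outline: average the unpolarized Fresnel reflectance over the objective's angular aperture under the uniform-intensity assumption. The 65°-85° aperture and the mid-infrared index n ≈ 3.42 for Si are illustrative values, not the paper's exact parameters:

```python
import numpy as np

def fresnel_R(theta, n):
    """Unpolarized power reflectance of an air/dielectric interface (real n)."""
    ct = np.cos(theta)
    root = np.sqrt(n**2 - np.sin(theta) ** 2)
    rs = (ct - root) / (ct + root)                  # s-polarized amplitude
    rp = (n**2 * ct - root) / (n**2 * ct + root)    # p-polarized amplitude
    return 0.5 * (rs**2 + rp**2)

def aperture_avg_R(n, th_min, th_max, npts=1000):
    """Average reflectance over the angular aperture, assuming (as the paper
    tests) an incident intensity that is uniform in angle."""
    th = np.linspace(th_min, th_max, npts)
    return fresnel_R(th, n).mean()

# Illustrative values: a 65°-85° grazing aperture and n ≈ 3.42 for Si
R = aperture_avg_R(3.42, np.radians(65), np.radians(85))
```

For an absorbing film stack such as Krytox on gold, complex indices and the thin-film transfer matrix replace the single-interface formula, but the aperture average is taken the same way.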

  17. A new high-speed IR camera system

    NASA Technical Reports Server (NTRS)

    Travis, Jeffrey W.; Shu, Peter K.; Jhabvala, Murzy D.; Kasten, Michael S.; Moseley, Samuel H.; Casey, Sean C.; Mcgovern, Lawrence K.; Luers, Philip J.; Dabney, Philip W.; Kaipa, Ravi C.

    1994-01-01

A multi-organizational team at the Goddard Space Flight Center is developing a new far infrared (FIR) camera system which furthers the state of the art for this type of instrument by incorporating recent advances in several technological disciplines. All aspects of the camera system are optimized for operation at the high data rates required for astronomical observations in the far infrared. The instrument is built around a Blocked Impurity Band (BIB) detector array which exhibits responsivity over a broad wavelength band and is capable of operating at 1000 frames/sec; it consists of a focal plane dewar, a compact camera head electronics package, and a Digital Signal Processor (DSP)-based data system residing in a standard 486 personal computer. In this paper we discuss the overall system architecture, the focal plane dewar, and advanced features and design considerations for the electronics. This system, or one derived from it, may prove useful for many commercial and/or industrial infrared imaging or spectroscopic applications, including thermal machine vision for robotic manufacturing, photographic observation of short-duration thermal events such as combustion or chemical reactions, and high-resolution surveillance imaging.

18. A Transplantable Compensation Scheme for the Effect of the Radiance from the Interior of a Camera on the Accuracy of Temperature Measurement

    NASA Astrophysics Data System (ADS)

    Dong, Shidu; Yang, Xiaofan; He, Bo; Liu, Guojin

    2006-11-01

Radiance coming from the interior of an uncooled infrared camera has a significant effect on the measured temperature of an object. This paper presents a three-phase compensation scheme for coping with this effect. The first phase acquires the calibration data and forms the calibration function by least-squares fitting. Likewise, the second phase obtains the compensation data and builds the compensation function by fitting. With the aid of these functions, the third phase determines the temperature of the object concerned at any given ambient temperature. Acquiring the compensation data of a camera is known to be very time-consuming. For the purpose of obtaining the compensation data at a reasonable time cost, we propose a transplantable scheme. The idea is to calculate the ratio between the central pixel's responsivity to the radiance from the interior in the child camera and that in the mother camera, and then to determine the compensation data of the child camera using this ratio and the compensation data of the mother camera. Experimental results show that both the child and the mother cameras can measure the temperature of the object with an error of no more than 2°C.
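The three-phase scheme can be sketched as follows: a least-squares fit of the calibration function (phase 1), its use to read off an object temperature (phase 3), and the transplant of compensation data via a responsivity ratio. All numbers are hypothetical:

```python
import numpy as np

# Phase 1: fit the calibration function T = f(counts) by least squares
T_bb = np.array([10.0, 20.0, 30.0, 40.0, 50.0])            # blackbody temps, °C
counts = np.array([812.0, 905.0, 1001.0, 1099.0, 1203.0])  # detector output
calib = np.poly1d(np.polyfit(counts, T_bb, deg=2))

# Phase 3 (sketch): object temperature from its measured counts
T_obj = calib(950.0)

# Transplant (sketch): scale the mother camera's compensation data by the
# ratio of the cameras' central-pixel responsivities to interior radiance
ratio = 1.07                                    # hypothetical responsivity ratio
comp_child = ratio * np.array([1.5, 2.0, 2.6])  # mother camera compensation data
```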

  19. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

Infrared (IR) camera modules for the LWIR (8-12 μm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and on the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods such as minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured include image quality via the modulation transfer function (MTF), for broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing of the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high-resolution sensors. Other important points discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces, and, last but not least, its suitability for fully automated measurements in mass production.
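One common way to obtain the MTF of an assembled module is from an edge image: differentiate the edge profile to obtain the line spread function and take the magnitude of its Fourier transform. The sketch below is a simplified, noise-free illustration of that idea, not the authors' actual test procedure:

```python
import numpy as np

def mtf_from_edge(edge_profile):
    """MTF as the normalised FFT magnitude of the line spread function,
    i.e. the derivative of a measured edge profile."""
    lsf = np.diff(edge_profile)            # line spread function
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                    # normalise to unity at DC

# Simulated edge profile: an ideal step blurred into a linear ramp
x = np.linspace(-1.0, 1.0, 101)
edge = np.clip((x + 0.1) / 0.2, 0.0, 1.0)  # 10-sample-wide transition
m = mtf_from_edge(edge)                    # m[0] = 1, falling towards Nyquist
```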

  20. Implementation and performance of the metrology system for the multi-object optical and near-infrared spectrograph MOONS

    NASA Astrophysics Data System (ADS)

    Drass, Holger; Vanzi, Leonardo; Torres-Torriti, Miguel; Dünner, Rolando; Shen, Tzu-Chiang; Belmar, Francisco; Dauvin, Lousie; Staig, Tomás.; Antognini, Jonathan; Flores, Mauricio; Luco, Yerko; Béchet, Clémentine; Boettger, David; Beard, Steven; Montgomery, David; Watson, Stephen; Cabral, Alexandre; Hayati, Mahmoud; Abreu, Manuel; Rees, Phil; Cirasuolo, Michele; Taylor, William; Fairley, Alasdair

    2016-08-01

The Multi-Object Optical and Near-infrared Spectrograph (MOONS) will cover the Very Large Telescope's (VLT) field of view with 1000 fibres. The fibres will be mounted on fibre positioning units (FPU) implemented as two-DOF robot arms to ensure a homogeneous coverage of the 500 square arcmin field of view. To determine the positions of the 1000 fibres quickly and accurately, a metrology system has been designed. This paper presents the hardware and software design and the performance of the metrology system. The metrology system is based on the analysis of images taken by a circular array of 12 cameras located close to the VLT's derotator ring around the Nasmyth focus. The system includes 24 individually adjustable lamps. The fibre positions are measured through dedicated metrology targets mounted on top of the FPUs and fiducial markers connected to the FPU support plate, which are imaged at the same time. A flexible pipeline based on VLT standards is used to process the images. The position accuracy was determined to be 5 μm in the central region of the images; including the outer regions, the overall positioning accuracy is 25 μm. The MOONS metrology system is fully set up with a working prototype, and the results in parts of the images are already excellent. By using upcoming hardware and improving the calibration, the system is expected to fulfil the accuracy requirement over the complete field of view for all metrology cameras.

  1. Non-destructive 3D shape measurement of transparent and black objects with thermal fringes

    NASA Astrophysics Data System (ADS)

    Brahm, Anika; Rößler, Conrad; Dietrich, Patrick; Heist, Stefan; Kühmstedt, Peter; Notni, Gunther

    2016-05-01

    Fringe projection is a well-established optical method for the non-destructive contactless three-dimensional (3D) measurement of object surfaces. Typically, fringe sequences in the visible wavelength range (VIS) are projected onto the surfaces of objects to be measured and are observed by two cameras in a stereo vision setup. The reconstruction is done by finding corresponding pixels in both cameras followed by triangulation. Problems can occur if the properties of some materials disturb the measurements. If the objects are transparent, translucent, reflective, or strongly absorbing in the VIS range, the projected patterns cannot be recorded properly. To overcome these challenges, we present a new alternative approach in the infrared (IR) region of the electromagnetic spectrum. For this purpose, two long-wavelength infrared (LWIR) cameras (7.5 - 13 μm) are used to detect the emitted heat radiation from surfaces which is induced by a pattern projection unit driven by a CO2 laser (10.6 μm). Thus, materials like glass or black objects, e.g. carbon fiber materials, can be measured non-destructively without the need of any additional paintings. We will demonstrate the basic principles of this heat pattern approach and show two types of 3D systems based on a freeform mirror and a GOBO wheel (GOes Before Optics) projector unit.
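The fringe decoding behind such systems, whether the fringes are projected in visible light or written as heat patterns, is typically phase shifting. A four-step sketch at a single camera pixel, using synthetic intensities rather than data from the paper:

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Wrapped fringe phase from four images phase-shifted by 90° each."""
    return np.arctan2(i3 - i1, i0 - i2)

# Simulated intensities at one pixel for a true fringe phase of 0.7 rad:
# I_k = A + B*cos(phi + k*pi/2), here with A = 1.0 and B = 0.5
phi = 0.7
I = [1.0 + 0.5 * np.cos(phi + k * np.pi / 2) for k in range(4)]
est = four_step_phase(*I)   # recovers phi
```

The recovered phase at each pixel of the two cameras is what establishes the stereo correspondences for triangulation.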

  2. Space-based infrared sensors of space target imaging effect analysis

    NASA Astrophysics Data System (ADS)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

Target identification is one of the core problems of a ballistic missile defense system, and infrared imaging simulation is an important means of studying target detection and recognition. This paper first establishes a space-based infrared sensor imaging model of a ballistic target, treated as a point source above the atmosphere; it then simulates the infrared imaging of the target from two aspects, the space-based sensor's camera parameters and the target characteristics, and analyzes the effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target image.

  3. Mobile viewer system for virtual 3D space using infrared LED point markers and camera

    NASA Astrophysics Data System (ADS)

    Sakamoto, Kunio; Taneji, Shoto

    2006-09-01

The authors have developed a 3D workspace system using collaborative imaging devices. A stereoscopic display enables this system to project 3D information. In this paper, we describe the position detecting system for a see-through 3D viewer. A 3D display system is a useful technology for virtual reality, mixed reality and augmented reality. We have researched spatial imaging and interaction systems, and have previously proposed 3D displays using a slit as a parallax barrier, a lenticular screen and holographic optical elements (HOEs) for displaying active images.1)2)3)4) The purpose of this paper is to propose an interactive system using these 3D imaging technologies. The observer can view virtual images in the real world while watching the screen of a see-through 3D viewer. The goal of our research is to build a display system as follows: when users see the real world through the mobile viewer, the display system gives them virtual 3D images that float in the air, and the observers can touch and interact with these floating images, much as children can shape modelling clay. The key technologies of this system are the position recognition system and the spatial imaging display. The 3D images are presented by the improved parallax-barrier 3D display. Here the authors discuss the measuring method for the mobile viewer using infrared LED point markers and a camera in the 3D workspace (augmented reality world). The authors show the geometric analysis of the proposed measuring method, which is the simplest method since it uses a single camera rather than a stereo camera, and the results of our viewer system.

  4. Evaluation of Moving Object Detection Based on Various Input Noise Using Fixed Camera

    NASA Astrophysics Data System (ADS)

    Kiaee, N.; Hashemizadeh, E.; Zarrinpanjeh, N.

    2017-09-01

Detecting and tracking objects in video has long been a research area of interest in the fields of image processing and computer vision. This paper evaluates the performance of a novel object detection algorithm on video sequences; the evaluation clarifies the advantages of the method in use. The proposed framework compares the correct and wrong detection percentages of the algorithm. The method was evaluated with data collected in the field of urban transport, which include cars and pedestrians in a fixed-camera situation. The results show that the accuracy of the algorithm decreases as image resolution is reduced.

  5. Novel fast catadioptric objective with wide field of view

    NASA Astrophysics Data System (ADS)

    Muñoz, Fernando; Infante Herrero, José M.; Benítez, Pablo; Miñano, Juan C.; Lin, Wang; Vilaplana, Juan; Biot, Guillermo; de la Fuente, Marta

    2010-08-01

Using the Simultaneous Multiple Surface method in 2D (SMS2D), we present a fast catadioptric objective with a wide field of view (125°×96°) designed for a microbolometer detector with 640×480 pixels and a 25 μm pixel pitch. Keywords: infrared lens design, thermal imaging, Schwarzschild configuration, SMS2D, wide field of view, driving cameras, panoramic systems

  6. Use of infrared camera to understand bats' access to date palm sap: implications for preventing Nipah virus transmission.

    PubMed

    Khan, M Salah Uddin; Hossain, Jahangir; Gurley, Emily S; Nahar, Nazmun; Sultana, Rebeca; Luby, Stephen P

    2010-12-01

Pteropus bats are commonly infected with Nipah virus, but show no signs of illness. Human Nipah outbreaks in Bangladesh coincide with the date palm sap harvesting season. In epidemiologic studies, drinking raw date palm sap is a risk factor for human Nipah infection. We conducted a study to evaluate bats' access to date palm sap. We mounted infrared cameras that silently captured images upon detection of motion on date palm trees from 5:00 pm to 6:00 am. Additionally, we placed two locally used preventative techniques, bamboo skirts and lime (CaCO₃) smeared on date palm trees, to assess their effectiveness in preventing bats' access to sap. Out of 20 camera-nights of observations, 14 identified 132 visits of bats around the tree, 91 to the shaved surface of the tree where the sap flow originates, 4 at the stream of sap moving toward the collection pot, and no bats at the tap or on the collection pots; the remaining 6 camera-nights recorded no visits. Of the preventative techniques, the bamboo skirt placed for four camera-nights prevented bats' access to sap. This study confirmed that bats commonly visited date palm trees and physically contacted the sap collected for human consumption. This is further evidence that date palm sap is an important link between Nipah virus in bats and Nipah virus in humans. Efforts that prevent bat access to the shaved surface and the sap stream of the tree could reduce Nipah spillovers to the human population.

  7. Investigation of small solar system objects with the space telescope

    NASA Technical Reports Server (NTRS)

    Morrison, D.

    1979-01-01

The application of the space telescope (ST) to the study of small objects in the solar system, in order to understand the birth and early evolution of the solar system, is discussed. The upper size limit of the small bodies is defined as approximately 5000 km, which includes planetary satellites, planetary rings, asteroids, and comets. The use of the astronomical instruments aboard the ST, such as the faint object camera, ultraviolet and infrared spectrometers, and spectrophotometers, to study the small solar system objects is discussed.

  8. Low-cost panoramic infrared surveillance system

    NASA Astrophysics Data System (ADS)

    Kecskes, Ian; Engel, Ezra; Wolfe, Christopher M.; Thomson, George

    2017-05-01

A nighttime surveillance concept consisting of a single-surface omnidirectional mirror assembly and an uncooled vanadium oxide (VOx) longwave infrared (LWIR) camera has been developed. This configuration provides a continuous field of view spanning 360° in azimuth and more than 110° in elevation. Both the camera and the mirror are readily available, off-the-shelf, inexpensive products. The mirror assembly is marketed for use in the visible spectrum and requires only minor modifications to function in the LWIR spectrum. The compactness and portability of this optical package offer significant advantages over many existing infrared surveillance systems. The developed system was evaluated on its ability to detect moving, human-sized heat sources at ranges between 10 m and 70 m. Raw camera images captured by the system are converted from rectangular coordinates in the camera focal plane to polar coordinates and then unwrapped into the user's azimuth and elevation system. Digital background subtraction and color mapping are applied to the images to increase the user's ability to extract moving items from background clutter. A second optical system, consisting of a commercially available 50 mm f/1.2 ATHERM lens and a second LWIR camera, is used to examine the details of objects of interest identified using the panoramic imager. A description of the components of the proof of concept is given, followed by a presentation of raw images taken by the panoramic LWIR imager. The method by which these images are analyzed is described, and the results are presented side-by-side with the output of the 50 mm LWIR imager and a panoramic visible-light imager. Finally, the concept and its future development are discussed.
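The rectangular-to-polar unwrapping step can be sketched with nearest-neighbour sampling. The image size, centre and radii below are arbitrary placeholders, not the system's calibration values:

```python
import numpy as np

def unwrap(img, cx, cy, r_in, r_out, n_az=360, n_el=60):
    """Map a donut-shaped omnidirectional image to an azimuth/elevation
    panorama: radius maps to elevation, polar angle maps to azimuth."""
    az = np.linspace(0, 2 * np.pi, n_az, endpoint=False)
    r = np.linspace(r_in, r_out, n_el)
    R, AZ = np.meshgrid(r, az, indexing="ij")
    x = np.clip(np.round(cx + R * np.cos(AZ)).astype(int), 0, img.shape[1] - 1)
    y = np.clip(np.round(cy + R * np.sin(AZ)).astype(int), 0, img.shape[0] - 1)
    return img[y, x]            # nearest-neighbour lookup into the raw frame

frame = np.zeros((200, 200))
frame[100, 150] = 1.0           # a warm target on the +x axis of the mirror
pano = unwrap(frame, 100, 100, 20, 90)   # rows = elevation, cols = azimuth
```

A production implementation would use bilinear interpolation and the mirror's measured projection curve rather than a linear radius-to-elevation mapping.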

  9. Calibration of the Infrared Telescope Facility National Science Foundation Camera Jupiter Galileo Data Set

    NASA Astrophysics Data System (ADS)

    Vincent, Mark B.; Chanover, Nancy J.; Beebe, Reta F.; Huber, Lyle

    2005-10-01

    The NASA Infrared Telescope Facility (IRTF) on Mauna Kea, Hawaii, set aside some time on about 500 nights from 1995 to 2002, when the NSFCAM facility infrared camera was mounted and Jupiter was visible, for a standardized set of observations of Jupiter in support of the Galileo mission. The program included observations of Jupiter, nearby reference stars, and dome flats in five filters: narrowband filters centered at 1.58, 2.28, and 3.53 μm, and broader L' and M' bands that probe the atmosphere from the stratosphere to below the main cloud layer. The reference stars were not cross-calibrated against standards. We performed follow-up observations to calibrate these stars and Jupiter in 2003 and 2004. We present a summary of the calibration of the Galileo support monitoring program data set. We present calibrated magnitudes of the six most frequently observed stars, calibrated reflectivities, and brightness temperatures of Jupiter from 1995 to 2004, and a simple method of normalizing the Jovian brightness to the 2004 results. Our study indicates that the NSFCAM's zero-point magnitudes were not stable from 1995 to early 1997, and that the best Jovian calibration possible with this data set is limited to about +/-10%. The raw images and calibration data have been deposited in the Planetary Data System.

  10. Conditions for the use of infrared camera diagnostics in energy auditing of the objects exposed to open air space at isothermal sky

    NASA Astrophysics Data System (ADS)

    Kruczek, Tadeusz

    2015-03-01

    Convective and radiative heat transfer take place between various objects placed in open air space and their surroundings. These phenomena bring about heat losses from pipelines, building walls, roofs and other objects, and one of the main tasks in energy auditing is the reduction of excessive heat losses. In the case of a low sky temperature, the radiative heat exchange is very intensive, and the temperature of the top part of a horizontal pipeline or wall is lower than that of its bottom part; quite often it is also lower than the temperature of the surrounding atmospheric air. For overhead heat pipelines placed in open air space, the surroundings consist of the ground and the sky, which usually have different temperatures. These circumstances cause difficulties during infrared inspections, because a thermovision measurement requires a single ambient temperature that represents the radiation of all surrounding elements. This work aims to develop a method for determining an equivalent ambient temperature representing the thermal radiation of the elements surrounding the object under consideration in open air space; the method can be applied, at a fairly uniform sky temperature, both during thermovision measurements and in the calculation of radiative heat losses.
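
    A common way to define such an equivalent ambient temperature is a view-factor-weighted radiative mean of the surrounding temperatures (here, sky and ground). The sketch below illustrates that generic idea and is not the specific method developed in the paper:

```python
def equivalent_ambient_temperature(view_factors, temperatures):
    """Equivalent radiative ambient temperature: fourth root of the
    view-factor-weighted mean of T^4 over the surrounding elements
    (e.g. sky and ground). Temperatures in kelvin; view factors
    must sum to 1."""
    assert abs(sum(view_factors) - 1.0) < 1e-6
    t4 = sum(f * t ** 4 for f, t in zip(view_factors, temperatures))
    return t4 ** 0.25
```

    With a cold sky and a warmer ground, the result lies between the two temperatures, weighted toward the element the surface "sees" most.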

  11. Pattern recognition applied to infrared images for early alerts in fog

    NASA Astrophysics Data System (ADS)

    Boucher, Vincent; Marchetti, Mario; Dumoulin, Jean; Cord, Aurélien

    2014-09-01

    Fog conditions cause severe car accidents in western countries because of the poor visibility they induce, and fog onset and intensity remain very difficult for weather services to predict. Infrared cameras can detect and identify objects in fog when visibility is too low for the human eye, and over the past years the implementation of cost-effective infrared cameras on some vehicles has enabled such detection. Pattern recognition algorithms based on Canny filters and the Hough transformation are a common tool applied to images. Based on these facts, a joint research program between IFSTTAR and Cerema has been developed to study the benefit of infrared images obtained in a fog tunnel during its natural dissipation. Pattern recognition algorithms have been applied, specifically to road signs, whose shape is usually associated with a specific meaning (circular for a speed limit, triangle for an alert, …). Road signs were detected in infrared images early enough, with respect to images in the visible spectrum, to trigger useful alerts for Advanced Driver Assistance Systems.
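
    As a rough illustration of the edge-plus-Hough approach named above (not the project's actual code), the sketch below uses a simplified gradient edge map in place of a full Canny filter, followed by a fixed-radius circle Hough transform; all names and thresholds are hypothetical:

```python
import numpy as np

def edge_map(gray, thresh=50.0):
    """Gradient-magnitude edge detector (a simplified stand-in
    for the Canny filter named in the abstract)."""
    gy, gx = np.gradient(gray.astype(float))
    return np.hypot(gx, gy) > thresh

def hough_circle(edges, radius):
    """Hough transform for circles of known radius: every edge pixel
    votes for candidate centres at distance `radius`; the accumulator
    maximum is the most likely circle centre (row, col)."""
    h, w = edges.shape
    acc = np.zeros((h, w))
    thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
    ys, xs = np.nonzero(edges)
    for th in thetas:
        cy = (ys - radius * np.sin(th)).round().astype(int)
        cx = (xs - radius * np.cos(th)).round().astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return np.unravel_index(np.argmax(acc), acc.shape)
```

    A triangular-sign detector would vote for line parameters with a straight-line Hough transform instead and look for three intersecting peaks.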

  12. Results of shuttle EMU thermal vacuum tests incorporating an infrared imaging camera data acquisition system

    NASA Technical Reports Server (NTRS)

    Anderson, James E.; Tepper, Edward H.; Trevino, Louis A.

    1991-01-01

    Manned tests in Chamber B at NASA JSC were conducted in May and June of 1990 to better quantify the Space Shuttle Extravehicular Mobility Unit's (EMU) thermal performance in the cold environmental extremes of space. Use of an infrared imaging camera with real-time video monitoring of the output significantly added to the scope, quality and interpretation of the test conduct and data acquisition. Results of this test program have been effective in the thermal certification of a new insulation configuration and the '5000 Series' glove. In addition, the acceptable thermal performance of flight garments with visually deteriorated insulation was successfully demonstrated, thereby saving significant inspection and garment replacement cost. This test program also established a new method for collecting data vital to improving crew thermal comfort in a cold environment.

  13. Formulation of image quality prediction criteria for the Viking lander camera

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Jobson, D. J.; Taylor, E. J.; Wall, S. D.

    1973-01-01

    Image quality criteria are defined and mathematically formulated for the prediction computer program to be developed for the Viking lander imaging experiment. The general objective of broad-band (black and white) imagery to resolve small spatial details and slopes is formulated as the detectability of a right-circular cone with the surface properties of the surrounding terrain. The general objective of narrow-band (color and near-infrared) imagery to observe spectral characteristics is formulated as the minimum detectable albedo variation. The general goal to encompass, but not exceed, the range of the scene radiance distribution within a single, commandable camera dynamic-range setting is also considered.

  14. Near-infrared transillumination photography of intraocular tumours.

    PubMed

    Krohn, Jørgen; Ulltang, Erlend; Kjersem, Bård

    2013-10-01

    To present a technique for near-infrared transillumination imaging of intraocular tumours based on the modifications of a conventional digital slit lamp camera system. The Haag-Streit Photo-Slit Lamp BX 900 (Haag-Streit AG) was used for transillumination photography by gently pressing the tip of the background illumination cable against the surface of the patient's eye. Thus the light from the flash unit was transmitted into the eye, leading to improved illumination and image resolution. The modification for near-infrared photography was done by replacing the original camera with a Canon EOS 30D (Canon Inc) converted by Advanced Camera Services Ltd. In this camera, the infrared blocking filter was exchanged for a 720 nm long-pass filter, so that the near-infrared part of the spectrum was recorded by the sensor. The technique was applied in eight patients: three with anterior choroidal melanoma, three with ciliary body melanoma and two with ocular pigment alterations. The good diagnostic quality of the photographs made it possible to evaluate the exact location and extent of the lesions in relation to pigmented intraocular landmarks such as the ora serrata and ciliary body. The photographic procedure did not lead to any complications. We recommend near-infrared transillumination photography as a supplementary diagnostic tool for the evaluation and documentation of anteriorly located intraocular tumours.

  15. Uncooled infrared sensors: rapid growth and future perspective

    NASA Astrophysics Data System (ADS)

    Balcerak, Raymond S.

    2000-07-01

    Uncooled infrared cameras are now available for both military and commercial markets. The current camera technology incorporates the fruits of many years of development, focusing on the details of pixel design, novel material processing, and low-noise readout electronics. The rapid insertion of cameras into systems is testimony to the successful completion of this 'first phase' of development. In the military market, the first uncooled infrared cameras will be used for weapon sights, drivers' viewers and helmet-mounted cameras. Major commercial applications include night driving, security, police and fire fighting, and thermography, primarily for preventive maintenance and process control. The technology for the next generation of cameras is even more demanding, but within reach. The paper outlines the technology program planned for the next generation of cameras, and the approaches to further enhance performance, even to the radiation limit of thermal detectors.

  16. Pettit works with two still cameras mounted together in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-21

    ISS030-E-049636 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.

  17. Pettit works with two still cameras mounted together in the U.S. Laboratory

    NASA Image and Video Library

    2012-01-21

    ISS030-E-049643 (21 Jan. 2012) --- NASA astronaut Don Pettit, Expedition 30 flight engineer, works with two still cameras mounted together in the Destiny laboratory of the International Space Station. One camera is an infrared modified still camera.

  18. Far-infrared observations of the exciting stars of Herbig-Haro objects. III - Circumstellar disks

    NASA Technical Reports Server (NTRS)

    Cohen, M.; Harvey, P. M.; Schwartz, R. D.

    1985-01-01

    Far-infrared observations of the exciting stars of Herbig-Haro objects are presented that (1) show these stars to be of low luminosity; (2) indicate that it is not usual for these objects themselves to be visible at far-infrared wavelengths; and (3) demonstrate the existence of spatially resolved, physically large, potentially disklike structures. These latter structures are resolved perpendicular to the directions of flow from the stars, but not parallel to the flows. In addition to these general properties, two new HH-exciting stars were discovered by searching along the extrapolated proper motion vectors for these HHs; and the jetlike object 'DG Tau B' was also detected.

  19. Detection and tracking of drones using advanced acoustic cameras

    NASA Astrophysics Data System (ADS)

    Busset, Joël; Perrodin, Florian; Wellig, Peter; Ott, Beat; Heutschi, Kurt; Rühl, Torben; Nussbaumer, Thomas

    2015-10-01

    Recent events of drones flying over city centers, official buildings and nuclear installations have stressed the growing threat of uncontrolled drone proliferation and the lack of real countermeasures. Indeed, detecting and tracking drones can be difficult with traditional techniques. A system to acoustically detect and track small moving objects, such as drones or ground robots, using acoustic cameras is presented. The sensor described is completely passive and is composed of a 120-element microphone array and a video camera. The acoustic imaging algorithm determines in real time the sound power level coming from all directions, using the phase of the sound signals. A tracking algorithm is then able to follow the sound sources, and a beamforming algorithm selectively extracts the sound coming from each tracked source. This extracted sound signal can be used to identify sound signatures and determine the type of object. The described techniques can detect and track any object that produces noise (engines, propellers, tires, etc.). They are a good complement to more traditional techniques such as (i) optical and infrared cameras, for which the object may represent only a few pixels and may be hidden by the blooming of a bright background, and (ii) radar or other echo-localization techniques, which suffer from the weakness of the echo signal coming back to the sensor. The distance of detection depends on the type (frequency range) and volume of the noise emitted by the object, and on the background noise of the environment. Detection range and resilience to background noise were tested in both laboratory environments and outdoor conditions. Drones could be tracked up to 160 to 250 meters, depending on their type. Speech extraction was also experimentally investigated: the speech signal of a person 80 to 100 meters away can be captured with acceptable speech intelligibility.

  20. Workers in KSC's Vertical Processing Facility lift the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) prior to its installation in the Second Axial Carrier

    NASA Image and Video Library

    1997-01-18

    KENNEDY SPACE CENTER, FLA. - Workers in KSC's Vertical Processing Facility lift the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) prior to its installation in the Second Axial Carrier. NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument — its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is targeted Feb. 11 aboard Discovery with a crew of seven.

  1. Enhancing swimming pool safety by the use of range-imaging cameras

    NASA Astrophysics Data System (ADS)

    Geerardyn, D.; Boulanger, S.; Kuijk, M.

    2015-05-01

    Drowning causes the death of 372,000 people each year worldwide, according to the November 2014 report of the World Health Organization.1 Currently, most swimming pools rely only on lifeguards to detect drowning people. In some modern swimming pools, camera-based detection systems are being integrated, but these systems have to be mounted underwater, mostly as a replacement of the underwater lighting. In contrast, we are interested in range-imaging cameras mounted on the ceiling of the swimming pool, which allow swimmers at the surface to be distinguished from drowning people underwater while keeping a large field of view and minimizing occlusions. We have to take into account, however, that the water surface of a swimming pool is not flat but mostly rippled, and that water is transparent for visible light but less transparent for infrared or ultraviolet light. We investigated the use of different types of 3D cameras to detect objects underwater at different depths and with different amplitudes of surface perturbation. Specifically, we performed measurements with a commercial Time-of-Flight camera, a commercial structured-light depth camera and our own Time-of-Flight system, which uses pulsed Time-of-Flight and emits light at 785 nm. The measured distances between the camera and the object are influenced by the perturbations on the water surface. Due to its timing, our Time-of-Flight camera is theoretically able to minimize the influence of reflections from a partially reflecting surface. Combining a post-acquisition filter that compensates for the perturbations with a light source of shorter wavelength to enlarge the depth range can improve on the current commercial cameras. We conclude that low-cost range imagers can increase swimming pool safety through the addition of a post-processing filter and the use of another light source.
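
    For context, pulsed Time-of-Flight converts the round-trip travel time of a light pulse into distance. The sketch below is our own illustration, not the authors' system; the refractive-index handling for the underwater part of the path is an assumption (n ≈ 1.33 for water near 785 nm):

```python
C = 299_792_458.0  # vacuum speed of light, m/s

def tof_distance(round_trip_s, n=1.0):
    """Convert a pulsed time-of-flight measurement to distance:
    d = (c / n) * t / 2. In a medium of refractive index n light
    travels slower, so the same round-trip time corresponds to a
    shorter geometric path than in air."""
    return (C / n) * round_trip_s / 2.0
```

    Ignoring the index of the water therefore overestimates the depth of a submerged target, one of the systematic effects such a ceiling-mounted system would need to correct.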

  2. Development of an Extra-vehicular (EVA) Infrared (IR) Camera Inspection System

    NASA Technical Reports Server (NTRS)

    Gazarik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Pandolf, John; Jenkins, Rusty; Yates, Rusty

    2006-01-01

    Designed to fulfill a critical inspection need for the Space Shuttle Program, the EVA IR Camera System can detect cracks and subsurface defects in the Reinforced Carbon-Carbon (RCC) sections of the Space Shuttle's Thermal Protection System (TPS). The EVA IR Camera performs this detection by taking advantage of the natural thermal gradients induced in the RCC by solar flux and thermal emission from the Earth. This instrument is a compact, low-mass, low-power solution (1.2 cm3, 1.5 kg, 5.0 W) for TPS inspection that exceeds existing requirements for feature detection. Taking advantage of ground-based IR thermography techniques, the EVA IR Camera System provides the Space Shuttle program with a solution that can be accommodated by the existing inspection system. The EVA IR Camera System augments the visible and laser inspection systems and finds cracks and subsurface damage that are not measurable by the other sensors, and thus fills a critical gap in the Space Shuttle's inspection needs. This paper discusses the on-orbit RCC inspection measurement concept and requirements, and then presents a detailed description of the EVA IR Camera System design.

  3. First results from the faint object camera - High-resolution observations of the central object R136 in the 30 Doradus nebula

    NASA Technical Reports Server (NTRS)

    Weigelt, G.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Boksenberg, A.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.; Kamperman, T. M.

    1991-01-01

    R136 is the luminous central object of the giant H II region 30 Doradus in the LMC. The first high-resolution observations of R136 with the Faint Object Camera on board the Hubble Space Telescope are reported. The physical nature of the brightest component R136a has been a matter of some controversy over the last few years. The UV images obtained show that R136a is a very compact star cluster consisting of more than eight stars within 0.7 arcsec diameter. From these high-resolution images a mass upper limit can be derived for the most luminous stars observed in R136.

  4. Frontal lobe activation during object permanence: data from near-infrared spectroscopy.

    PubMed

    Baird, Abigail A; Kagan, Jerome; Gaudette, Thomas; Walz, Kathryn A; Hershlag, Natalie; Boas, David A

    2002-08-01

    The ability to create and hold a mental schema of an object is one of the milestones in cognitive development. Developmental scientists have named the behavioral manifestation of this competence object permanence. Convergent evidence indicates that frontal lobe maturation plays a critical role in the display of object permanence, but methodological and ethical constraints have made it difficult to collect neurophysiological evidence from awake, behaving infants. Near-infrared spectroscopy provides a noninvasive assessment of changes in oxy- and deoxyhemoglobin and total hemoglobin concentration within a prescribed region. The evidence described in this report reveals that the emergence of object permanence is related to an increase in hemoglobin concentration in the frontal cortex.

  5. Calibration and verification of thermographic cameras for geometric measurements

    NASA Astrophysics Data System (ADS)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire both temperature and geometric information, yet calibration and verification procedures, performed with black bodies, are usual only for the thermal data. The geometric information, however, is important for many fields, such as architecture, civil engineering and industry. This work presents a calibration procedure that allows photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras; these results allow the incorporation of this information into companies' quality control processes. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes; metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both thermographic and visible cameras are able to detect it. Two thermographic cameras, from the Flir and Nec manufacturers, and one visible camera from Jai are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm in all cases, and better than 0.5 mm for the visible camera. As must be expected, accuracy is also higher for the visible camera, and the geometric comparison between thermographic cameras shows slightly better

  6. Experience with the UKIRT InSb array camera

    NASA Technical Reports Server (NTRS)

    Mclean, Ian S.; Casali, Mark M.; Wright, Gillian S.; Aspin, Colin

    1989-01-01

    The cryogenic infrared camera, IRCAM, has been operating routinely on the 3.8 m UK Infrared Telescope on Mauna Kea, Hawaii for over two years. The camera, which uses a 62x58 element Indium Antimonide array from Santa Barbara Research Center, was designed and built at the Royal Observatory, Edinburgh which operates UKIRT on behalf of the UK Science and Engineering Research Council. Over the past two years at least 60% of the available time on UKIRT has been allocated for IRCAM observations. Described here are some of the properties of this instrument and its detector which influence astronomical performance. Observational techniques and the power of IR arrays with some recent astronomical results are discussed.

  7. Detection of unknown targets from aerial camera and extraction of simple object fingerprints for the purpose of target reacquisition

    NASA Astrophysics Data System (ADS)

    Mundhenk, T. Nathan; Ni, Kang-Yu; Chen, Yang; Kim, Kyungnam; Owechko, Yuri

    2012-01-01

    An aerial multiple-camera tracking paradigm needs not only to spot unknown targets and track them, but also to handle target reacquisition as well as target handoff to other cameras in the operating theater. Here we discuss such a system, designed to spot unknown targets, track them, segment the useful features and then create a signature fingerprint for the object so that it can be reacquired or handed off to another camera. The tracking system spots unknown objects by subtracting background motion from observed motion, allowing it to find targets in motion even if the camera platform itself is moving. The area of motion is then matched to segmented regions returned by the EDISON mean-shift segmentation tool. Whole segments which have common motion and which are contiguous to each other are grouped into a master object. Once master objects are formed, we have a tight bound from which to extract features for the purpose of forming a fingerprint. This is done using color and simple entropy features, which can be placed into a myriad of different fingerprints. To keep data transmission and storage size low for camera handoff of targets, we try several simple techniques: Histogram, Spatiogram and Single Gaussian Model. These are tested by simulating a very large number of target losses in six videos, over an interval of 1000 frames each, from the DARPA VIVID video set. Since the fingerprints are very simple, they are not expected to be valid for long periods of time. As such, we test the shelf life of fingerprints: how long a fingerprint remains good when stored away between target appearances. Shelf life gives us a second metric of goodness and tells us whether a fingerprint method has better accuracy over longer periods. In videos which contain multiple vehicle occlusions and vehicles of highly similar appearance we obtain a reacquisition rate for automobiles of over 80% using the simple single Gaussian model compared
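
    Of the fingerprint variants mentioned, the color histogram is the simplest. The sketch below is a generic illustration, not the authors' implementation; the bin count and the Bhattacharyya coefficient used for matching are our assumptions:

```python
import numpy as np

def histogram_fingerprint(patch, bins=8):
    """Compact color-histogram fingerprint of a target patch
    (H x W x 3, uint8), normalized to sum to 1 so it can be
    compared across re-appearances."""
    hist, _ = np.histogramdd(
        patch.reshape(-1, 3).astype(float),
        bins=(bins,) * 3, range=((0, 256),) * 3)
    hist = hist.ravel()
    return hist / hist.sum()

def bhattacharyya(h1, h2):
    """Similarity in [0, 1] between two normalized histograms;
    1 means identical distributions."""
    return float(np.sum(np.sqrt(h1 * h2)))
```

    A reacquisition step would compare a candidate patch's fingerprint against the stored one and accept the match above a similarity threshold; the "shelf life" experiments above measure how quickly such a threshold test degrades.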

  8. A Wide-field Camera and Fully Remote Operations at the Wyoming Infrared Observatory

    NASA Astrophysics Data System (ADS)

    Findlay, Joseph R.; Kobulnicky, Henry A.; Weger, James S.; Bucher, Gerald A.; Perry, Marvin C.; Myers, Adam D.; Pierce, Michael J.; Vogel, Conrad

    2016-11-01

    Upgrades at the 2.3 meter Wyoming Infrared Observatory telescope have provided the capability for fully remote operations by a single operator from the University of Wyoming campus. A line-of-sight 300 Mbit/s 11 GHz radio link provides high-speed internet for data transfer and remote operations that include several realtime video feeds. Uninterruptable power is ensured by a 10 kVA battery supply for critical systems and a 55 kW autostart diesel generator capable of running the entire observatory for up to a week. The construction of a new four-element prime-focus corrector with fused-silica elements allows imaging over a 40′ field of view with a new 4096 × 4096 UV-sensitive prime-focus camera and filter wheel. A new telescope control system facilitates the remote operations model and provides 20″ rms pointing over the usable sky. Taken together, these improvements pave the way for a new generation of sky surveys supporting space-based missions and flexible-cadence observations advancing emerging astrophysical priorities such as planet detection, quasar variability, and long-term time-domain campaigns.

  9. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

    The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition, because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral-density (ND) filtering in front of the theodolite's telescope; one could then safely view and measure the laser's boresight through the theodolite's telescope without great risk to one's eyes. This method, tolerable for a Class II visible-wavelength laser, is not acceptable even to contemplate for a Class IV laser, and it is not applicable to an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available: a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera comprises power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. An additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as to visible light; hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.

  10. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras, consisting of a nadir and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed with the program system BLUH. Dense matching was provided by Pix4Dmapper, with 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated at the block centres, while the inclined images outside the block centre are satisfactorily but not very strongly connected. This leads to very high values of the Student test (t-test) for the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions for the inclined cameras as well, with a size exceeding 5 μm, even if mentioned as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors for corresponding
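
    The radial symmetric distortion discussed above is commonly modeled with even-order polynomial terms in the radius (the Brown model). The following is a generic sketch in normalized image coordinates, not the BLUH parameterization; coefficient names are illustrative:

```python
def apply_radial_distortion(x, y, k1, k2=0.0):
    """Brown radial-distortion model: the ideal point (x, y) is
    scaled by 1 + k1*r^2 + k2*r^4, where r^2 = x^2 + y^2.
    Positive k1 pushes points outward (pincushion); negative
    pulls them inward (barrel)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

    A 5 μm residual at the image corner, as reported for the inclined sub-cameras, corresponds to a small but non-zero k1 left uncorrected by the laboratory calibration.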

  11. Ambient-Light-Canceling Camera Using Subtraction of Frames

    NASA Technical Reports Server (NTRS)

    Morookian, John Michael

    2004-01-01

    The ambient-light-canceling camera (ALCC) is a proposed near-infrared electronic camera that would utilize a combination of (1) synchronized illumination during alternate frame periods and (2) subtraction of readouts from consecutive frames to obtain images without a background component of ambient light. The ALCC is intended especially for use in tracking the motion of an eye by the pupil-center corneal reflection (PCCR) method. Eye tracking by the PCCR method has shown potential for application in human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological deficiencies. In the PCCR method, an eye is illuminated by near-infrared light from a light-emitting diode (LED). Some of the infrared light is reflected from the surface of the cornea. Some of the infrared light enters the eye through the pupil and is reflected from the back of the eye out through the pupil, a phenomenon commonly observed as the red-eye effect in flash photography. An electronic camera is oriented to image the user's eye. The output of the camera is digitized and processed by algorithms that locate the two reflections. Then, from the locations of the centers of the two reflections, the direction of gaze is computed. As described thus far, the PCCR method is susceptible to errors caused by reflections of ambient light. Although a near-infrared band-pass optical filter can be used to discriminate against ambient light, some sources of ambient light have enough in-band power to compete with the LED signal. The mode of operation of the ALCC would complement or supplant spectral filtering by providing more nearly complete cancellation of the effect of ambient light. In the operation of the ALCC, a near-infrared LED would be pulsed on during one camera frame period and off during the next frame period; thus, the scene would be illuminated by both the LED (signal) light and the ambient (background) light during one frame, and by the ambient light alone during the next.
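
    The core frame-subtraction idea is simple to sketch. The fragment below is our illustration of the principle, not flight code: subtracting the LED-off frame from the LED-on frame cancels static ambient illumination, leaving only the LED's contribution:

```python
import numpy as np

def cancel_ambient(frame_led_on, frame_led_off):
    """Subtract consecutive frames of an alternating LED-on/LED-off
    sequence. Widening to int16 avoids uint8 wraparound; negative
    residuals (sensor noise, scene motion) are clipped to zero."""
    diff = (frame_led_on.astype(np.int16)
            - frame_led_off.astype(np.int16))
    return np.clip(diff, 0, 255).astype(np.uint8)
```

    The scheme assumes the ambient level is nearly constant across the two frame periods; fast-flickering ambient sources would leave residuals.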

  12. Effects of camera location on the reconstruction of 3D flare trajectory with two cameras

    NASA Astrophysics Data System (ADS)

    Özsaraç, Seçkin; Yeşilkaya, Muhammed

    2015-05-01

    Flares are used as valuable electronic warfare assets for the battle against infrared guided missiles. The trajectory of the flare is one of the most important factors that determine the effectiveness of the countermeasure. Reconstruction of the three-dimensional (3D) position of a point that is seen by multiple cameras is a common problem. Camera placement, camera calibration, determination of corresponding pixels between the images of different cameras, and the triangulation algorithm all affect the performance of 3D position estimation. In this paper, we specifically investigate the effects of camera placement on the flare trajectory estimation performance by simulations. First, the 3D trajectories of a flare and of the aircraft that dispenses the flare are generated with simple motion models. Then, we place two virtual ideal pinhole camera models at different locations. Assuming the cameras track the aircraft perfectly, the view vectors of the cameras are computed. Afterwards, using the view vector of each camera and the 3D position of the flare, the image plane coordinates of the flare on both cameras are computed using the field of view (FOV) values. To increase the fidelity of the simulation, we use two sources of error. One models the uncertainties in the determination of the camera view vectors, i.e. the orientations of the cameras are measured with noise. The second noise source models the imperfections in determining the corresponding pixels of the flare between the two cameras. Finally, the 3D position of the flare is estimated from the corresponding pixel indices, the view vectors, and the FOV of the cameras by triangulation. All the processes mentioned so far are repeated for different relative camera placements so that the optimum estimation error performance is found for the given aircraft and flare trajectories.
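    The triangulation step can be illustrated with the standard midpoint method: given each camera's position and its unit view vector toward the flare, the estimate is the point halfway between the closest points of the two rays. A sketch (the paper does not specify which triangulation variant it uses; names here are illustrative):

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint triangulation of a point seen along two rays.

    c1, c2: camera centers (3-vectors); d1, d2: unit direction vectors
    from each camera toward the target (in practice derived from pixel
    indices and the FOV).  Returns the point halfway between the
    closest points of the two rays.
    """
    c1, d1 = np.asarray(c1, float), np.asarray(d1, float)
    c2, d2 = np.asarray(c2, float), np.asarray(d2, float)
    # Solve for ray parameters t1, t2 minimizing |(c1+t1*d1)-(c2+t2*d2)|^2
    b = c2 - c1
    a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a11 * a22 - a12 * a12          # ~0 when the rays are parallel
    t1 = (a22 * (b @ d1) - a12 * (b @ d2)) / denom
    t2 = (a12 * (b @ d1) - a11 * (b @ d2)) / denom
    return 0.5 * ((c1 + t1 * d1) + (c2 + t2 * d2))

# Two cameras 200 m apart observing a flare at (50, 80, 120) m.
flare = np.array([50.0, 80.0, 120.0])
c1, c2 = np.array([0.0, 0.0, 0.0]), np.array([200.0, 0.0, 0.0])
d1 = (flare - c1) / np.linalg.norm(flare - c1)
d2 = (flare - c2) / np.linalg.norm(flare - c2)
est = triangulate_midpoint(c1, d1, c2, d2)
```

    With noisy view vectors the two rays no longer intersect, and the midpoint's distance from the true position is exactly the kind of error the paper studies as a function of camera placement.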

  13. A low-cost dual-camera imaging system for aerial applicators

    USDA-ARS?s Scientific Manuscript database

    Agricultural aircraft provide a readily available remote sensing platform as low-cost and easy-to-use consumer-grade cameras are being increasingly used for aerial imaging. In this article, we report on a dual-camera imaging system we recently assembled that can capture RGB and near-infrared (NIR) i...

  14. Multi-Angle Snowflake Camera Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stuefer, Martin; Bailey, J.

    2016-07-01

    The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36º. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' fields of view within a 10º angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
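    Deriving fall speed from successive triggers reduces to distance over time difference. A sketch under the assumption of two stacked trigger planes a known distance apart (the geometry and names are illustrative, not the MASC specification):

```python
def fall_speed(separation_m, t_upper_s, t_lower_s):
    """Fall speed from two stacked IR trigger planes.

    separation_m: vertical distance between the upper and lower
    emitter/detector planes.
    t_upper_s, t_lower_s: times at which the hydrometeor crossed
    each plane.
    """
    dt = t_lower_s - t_upper_s
    if dt <= 0:
        raise ValueError("lower trigger must follow upper trigger")
    return separation_m / dt

# A particle crossing planes 0.032 m apart 0.016 s later falls at 2 m/s.
v = fall_speed(0.032, 10.000, 10.016)
```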

  15. Cloud cameras at the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Winnick, Michael G.

    2010-06-01

    This thesis presents the results of measurements made by infrared cloud cameras installed at the Pierre Auger Observatory in Argentina. These cameras were used to record cloud conditions during operation of the observatory's fluorescence detectors. As cloud may affect the measurement of fluorescence from cosmic ray extensive air showers, the cloud cameras provide a record of which measurements have been interfered with by cloud. Several image processing algorithms were developed, along with a methodology for the detection of cloud within infrared images taken by the cloud cameras. A graphical user interface (GUI) was developed to expedite this, as a large number of images need to be checked for cloud. A cross-check between images recorded by three of the observatory's cloud cameras is presented, along with a comparison with independent cloud measurements made by LIDAR. Despite the cloud cameras and LIDAR observing different areas of the sky, a good agreement is observed in the measured cloud fraction between the two instruments, particularly on very clear and overcast nights. Cloud information recorded by the cloud cameras, with cloud height information measured by the LIDAR, was used to identify those extensive air showers that were obscured by cloud. These events were used to study the effectiveness of standard quality cuts at removing cloud afflicted events. Of all of the standard quality cuts studied in this thesis, the LIDAR cloud fraction cut was the most effective at preferentially removing cloud obscured events. A 'cloudy pixel' veto is also presented, whereby cloud obscured measurements are excluded during the standard hybrid analysis, and new extensive air shower reconstructed parameters determined. The application of such a veto would provide a slight increase to the number of events available for higher level analysis.

  16. Hubble Space Telescope: Faint object camera instrument handbook. Version 2.0

    NASA Technical Reports Server (NTRS)

    Paresce, Francesco (Editor)

    1990-01-01

    The Faint Object Camera (FOC) is a long focal ratio, photon counting device designed to take high resolution two dimensional images of areas of the sky up to 44 by 44 arcseconds squared in size, with pixel dimensions as small as 0.0007 by 0.0007 arcseconds squared in the 1150 to 6500 A wavelength range. The basic aim of the handbook is to make relevant information about the FOC available to a wide range of astronomers, many of whom may wish to apply for HST observing time. The FOC, as presently configured, is briefly described, and some basic performance parameters are summarized. Also included are detailed performance parameters and instructions on how to derive approximate FOC exposure times for the proposed targets.

  17. The Stratospheric Observatory for Infrared Astronomy - A New Tool for Planetary Science

    NASA Astrophysics Data System (ADS)

    Ruzek, M. J.; Becklin, E.; Burgdorf, M. J.; Reach, W.

    2010-12-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a joint US/German effort to fly a 2.5 meter telescope on a modified Boeing 747SP aircraft at stratospheric altitudes where the atmosphere is largely transparent at infrared wavelengths. Key goals of the observatory include understanding the formation of stars and planets; the origin and evolution of the interstellar medium; the star formation history of galaxies; and planetary science. SOFIA offers the convenient accessibility of a ground-based observatory coupled with performance advantages of a space-based telescope. SOFIA's scientific instruments can be exchanged regularly for repairs, to accommodate changing scientific requirements, and to incorporate new technologies. SOFIA's portability will enable specialized observations of transient and location-specific events such as stellar occultations of Trans-Neptunian Objects. Unlike many spaceborne observatories, SOFIA can observe bright planets and moons directly, and can observe objects closer to the sun than Earth, e.g. comets in their most active phase, and the planet Venus. SOFIA's first generation instruments cover the spectral range of 0.3 to 240 microns and have been designed with planetary science in mind. The High-speed Imaging Photometer for Occultations (HIPO) is designed to measure occultations of stars by Kuiper Belt Objects, with SOFIA flying into the predicted shadows and timing the occultation ingress and egress to determine the size of the occulting body. HIPO will also enable transit observations of extrasolar planets. The Faint Object Infrared Camera for the SOFIA Telescope (FORCAST) and the High-resolution Airborne Wideband Camera (HAWC) will enable mid-infrared and far-infrared (respectively) imaging with a wide range of filters for comets and giant planets, and colorimetric observations of small, unresolved bodies to measure the spectral energy distribution of their thermal emission. The German Receiver for Astronomy at

  18. Near-infrared autofluorescence imaging to detect parathyroid glands in thyroid surgery.

    PubMed

    Ladurner, R; Al Arabi, N; Guendogar, U; Hallfeldt, Kkj; Stepp, H; Gallwas, Jks

    2018-01-01

    Objective To identify and save parathyroid glands during thyroidectomy by displaying their autofluorescence. Methods Autofluorescence imaging was carried out during thyroidectomy with and without central lymph node dissection. After visual recognition by the surgeon, the parathyroid glands and the surrounding tissue were exposed to near-infrared light with a wavelength of 690-770 nm using a modified Karl Storz near infrared/indocyanine green endoscopic system. Parathyroid tissue was expected to show near infrared autofluorescence at 820 nm, captured in the blue channel of the camera. Results We investigated 41 parathyroid glands from 20 patients; 37 glands were identified correctly based on near-infrared autofluorescence. Neither lymph nodes nor thyroid revealed substantial autofluorescence, nor did adipose tissue. Conclusions Parathyroid tissue is characterised by showing autofluorescence in the near-infrared spectrum. This effect can be used to identify and preserve parathyroid glands during thyroidectomy.

  19. Land-based infrared imagery for marine mammal detection

    NASA Astrophysics Data System (ADS)

    Graber, Joseph; Thomson, Jim; Polagye, Brian; Jessup, Andrew

    2011-09-01

    A land-based infrared (IR) camera is used to detect endangered Southern Resident killer whales in Puget Sound, Washington, USA. The observations are motivated by a proposed tidal energy pilot project, which will be required to monitor for environmental effects. Potential monitoring methods also include visual observation, passive acoustics, and active acoustics. The effectiveness of observations in the infrared spectrum is compared to observations in the visible spectrum to assess the viability of infrared imagery for cetacean detection and classification. Imagery was obtained at Lime Kiln Park, Washington from 7/6/10-7/9/10 using a FLIR Thermovision A40M infrared camera (7.5-14μm, 37°HFOV, 320x240 pixels) under ideal atmospheric conditions (clear skies, calm seas, and wind speed 0-4 m/s). Whales were detected during both day (9 detections) and night (75 detections) at distances ranging from 42 to 162 m. The temperature contrast between dorsal fins and the sea surface ranged from 0.5 to 4.6 °C. Differences in emissivity from sea surface to dorsal fin are shown to aid detection at high incidence angles (near grazing). A comparison to theory is presented, and observed deviations from theory are investigated. A guide for infrared camera selection based on site geometry and desired target size is presented, with specific considerations regarding marine mammal detection. Atmospheric conditions required to use visible and infrared cameras for marine mammal detection are established and compared with 2008 meteorological data for the proposed tidal energy site. Using conservative assumptions, infrared observations are predicted to provide a 74% increase in hours of possible detection, compared with visual observations.

  20. Heated Surface Temperatures Measured by Infrared Detector in a Cascade Environment

    NASA Technical Reports Server (NTRS)

    Boyle, Robert J.

    2002-01-01

    Investigators have used infrared devices to accurately measure heated surface temperatures. Several of these applications have been for turbine heat transfer studies involving film cooling and surface roughness; typically, these measurements use an infrared camera positioned externally to the test section. In cascade studies, where several blades are used to ensure periodic flow, adjacent blades block the externally positioned camera's views of the test blade. To obtain a more complete mapping of the surface temperatures, researchers at the NASA Glenn Research Center fabricated a probe with an infrared detector to sense the blade temperatures. The probe size was kept small to minimize the flow disturbance. By traversing and rotating the probe, using the same approach as for total pressure surveys, one can find the blade surface temperatures. Probe-mounted infrared detectors are appropriate for measuring surface temperatures where an externally positioned infrared camera is unable to completely view the test object. This probe consists of an 8-mm gallium arsenide (GaAs) lens mounted in front of a mercury-cadmium-zinc-tellurium (HgCdZnTe) detector. This type of photovoltaic detector was chosen because of its high sensitivity to temperature when the detector is uncooled. The particular application is for relatively low surface temperatures, typically ambient to 100 C. This requires a detector sensitive at long wavelengths. The detector is a commercial product enclosed in a 9-mm-diameter package. The GaAs lens material was chosen because of its glass-like hardness and its good long-wavelength transmission characteristics. When assembled, the 6.4-mm probe stem is held in the traversing actuator. Since the entire probe is above the measurement plane, the flow field disturbance in the measurement plane is minimized. This particular probe body is somewhat wider than necessary, because it was designed to have replaceable detectors and lenses.
The signal for the detector is

  1. High resolution infrared acquisitions droning over the LUSI mud eruption.

    NASA Astrophysics Data System (ADS)

    Di Felice, Fabio; Romeo, Giovanni; Di Stefano, Giuseppe; Mazzini, Adriano

    2016-04-01

    The use of low-cost hand-held infrared (IR) thermal cameras based on uncooled micro-bolometer detector arrays became more widespread during recent years. Thermal cameras have the ability to estimate temperature values without contact and therefore can be used in circumstances where objects are difficult or dangerous to reach, such as volcanic eruptions. Since May 2006 the Indonesian LUSI mud eruption has continued to spew boiling mud, water, aqueous vapor, CO2 and CH4, and covers a surface of nearly 7 km2. At this locality we performed surveys over the unreachable erupting crater. In the framework of the LUSI Lab project (ERC grant n° 308126), in 2014 and 2015, we acquired high resolution infrared images using a specifically equipped remote-controlled drone flying at an altitude of 100 m. The drone is equipped with GPS and an autopilot system that allows pre-programming the flying path or designing grids. The mounted thermal camera has its peak spectral sensitivity at LW wavelengths (around 10 μm), which are characterized by low water vapor and CO2 absorption. The low-distance (high resolution) acquisitions provide a temperature detail every 40 cm, therefore it is possible to detect and observe physical phenomena such as thermodynamic behavior, locations of hot mud and fluid emissions, and their time shifts. Despite the harsh logistics and the continuously varying gas concentrations we managed to collect thermal images to estimate the spatial thermal variations of the crater zone. We applied atmospheric corrections to account for infrared absorption by high concentrations of water vapor. Thousands of images have been stitched together to obtain a mosaic of the crater zone. Regular monitoring with heat variation measurements collected, e.g. every six months, could give important information about the volcano activity, estimating its evolution. A future database of infrared high resolution and visible images stored in a web server could be a useful monitoring tool. An interesting development will be

  2. New high spectral resolution spectrograph and mid-IR camera for the NASA Infrared Telescope Facility

    NASA Astrophysics Data System (ADS)

    Tokunaga, Alan T.; Bus, Schelte J.; Connelley, Michael; Rayner, John

    2016-10-01

    The NASA Infrared Telescope Facility (IRTF) is a 3.0 m infrared telescope located at an altitude of 4.2 km near the summit of Mauna Kea on the island of Hawaii. The IRTF was established by NASA to support planetary science missions. We show new observational capabilities resulting from the completion of iSHELL, a 1-5 μm echelle spectrograph with resolving power of 70,000 using a 0.375 arcsec slit. This instrument will be commissioned starting in August 2016. The spectral grasp of iSHELL is enormous due to the cross-dispersed design and use of a 2Kx2K HgCdTe array. Raw FITS files will be publicly archived, allowing for more effective use of the large amount of spectral data that will be collected. The preliminary observing manual for iSHELL, containing the instrument description, observing procedures and estimates of sensitivity can be downloaded at http://irtfweb.ifa.hawaii.edu/~ishell/iSHELL_observing_manual.pdf. This manual and instrument description papers can be downloaded at http://bit.ly/28NFiMj. We are also working to restore to service our 8-25 μm camera, MIRSI. It will be upgraded with a closed cycle cooler that will eliminate the need for liquid helium and allow continuous use of MIRSI on the telescope. This will enable a wider range of Solar System studies at mid-IR wavelengths, with particular focus on thermal observations of NEOs. The MIRSI upgrade includes plans to integrate a visible CCD camera that will provide simultaneous imaging and guiding capabilities. This visible imager will utilize similar hardware and software as the MORIS system on SpeX. The MIRSI upgrade is being done in collaboration with David Trilling (NAU) and Joseph Hora (CfA). For further information on the IRTF and its instruments including visitor instruments, see: http://irtfweb.ifa.hawaii.edu/. We gratefully acknowledge the support of NASA contract NNH14CK55B, NASA Science Mission Directorate, and NASA grant NNX15AF81G (Trilling, Hora) for the upgrade of MIRSI.

  3. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification.

    PubMed

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-07-08

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
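    The adaptive selection idea can be sketched as follows. This is a toy stand-in for the paper's fuzzy inference system: the inputs, membership ramps, and thresholds below are illustrative assumptions, not the published rule base, and the real FIS would feed the selected candidate to a CNN for verification.

```python
def select_camera(ambient_lux, thermal_contrast_c):
    """Choose which camera's pedestrian candidate to pass on.

    ambient_lux: ambient illumination level.
    thermal_contrast_c: pedestrian/background temperature difference.
    """
    # Degree to which visible-light imaging is viable (ramp 10-200 lux).
    mu_visible = min(max((ambient_lux - 10.0) / 190.0, 0.0), 1.0)
    # Degree to which thermal contrast helps the FIR camera (ramp 0-5 C).
    mu_fir = min(max(thermal_contrast_c / 5.0, 0.0), 1.0)
    return "visible" if mu_visible >= mu_fir else "FIR"

# Night: almost no ambient light but strong thermal contrast -> FIR.
choice_night = select_camera(ambient_lux=1.0, thermal_contrast_c=8.0)
# Bright day with background near body temperature -> visible light.
choice_day = select_camera(ambient_lux=500.0, thermal_contrast_c=0.5)
```

    Selecting one candidate before verification is what lets the system run a single CNN pass per detection instead of processing both camera images.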

  4. A customizable commercial miniaturized 320×256 indium gallium arsenide shortwave infrared camera

    NASA Astrophysics Data System (ADS)

    Huang, Shih-Che; O'Grady, Matthew; Groppe, Joseph V.; Ettenberg, Martin H.; Brubaker, Robert M.

    2004-10-01

    The design and performance of a commercial short-wave-infrared (SWIR) InGaAs microcamera engine is presented. The 0.9-to-1.7 micron SWIR imaging system consists of a room-temperature-TEC-stabilized, 320x256 (25 μm pitch) InGaAs focal plane array (FPA) and a high-performance, highly customizable image-processing set of electronics. The detectivity, D*, of the system is greater than 10¹³ cm·√Hz/W at 1.55 μm, and this sensitivity may be adjusted in real-time over 100 dB. It features snapshot-mode integration with a minimum exposure time of 130 μs. The digital video processor provides real time pixel-to-pixel, 2-point dark-current subtraction and non-uniformity compensation along with defective-pixel substitution. Other features include automatic gain control (AGC), gamma correction, 7 preset configurations, adjustable exposure time, external triggering, and windowing. The windowing feature is highly flexible; the region of interest (ROI) may be placed anywhere on the imager and can be varied at will. Windowing allows for high-speed readout enabling such applications as target acquisition and tracking; for example, a 32x32 ROI window may be read out at over 3500 frames per second (fps). Output video is provided as EIA170-compatible analog, or as 12-bit CameraLink-compatible digital. All the above features are accomplished in a small volume < 28 cm3, weight < 70 g, and with low power consumption < 1.3 W at room temperature using this new microcamera engine. Video processing is based on a field-programmable gate array (FPGA) platform with a soft-embedded processor that allows for ease of integration/addition of customer-specific algorithms, processes, or design requirements. The camera was developed with the high-performance, space-restricted, power-conscious application in mind, such as robotic or UAV deployment.

  5. Near infrared photography with a vacuum-cold camera. [Orion nebula observation

    NASA Technical Reports Server (NTRS)

    Rossano, G. S.; Russell, R. W.; Cornett, R. H.

    1980-01-01

    Sensitized cooled plates have been obtained of the Orion nebula region and of Sh2-149 in the wavelength ranges 8000 A-9000 A and 9,000 A-11,000 A with a recently designed and constructed vacuum-cold camera. Sensitization procedures are described and the camera design is presented.

  6. First results from the infrared Juno spectral/imager JIRAM at Jupiter

    NASA Astrophysics Data System (ADS)

    Adriani, Alberto; Mura, Alessandro; Grassi, Davide; Altieri, Francesca; Dinelli, Bianca M.; Sindoni, Giuseppe; Bolton, Scott J.; Connerney, Jack E. P.; Atreya, Sushil K.; Bagenal, Fran; Gladstone, G. Randall; Hansen, Candice J.; Ingersoll, Andrew P.; Jansen, Michael A.; Kurth, William S.; Levin, Steven M.; Lunine, Jonathan I.; Mauk, Barry H.; McComas, David J.; Orton, Glenn S.

    2017-04-01

    JIRAM, the Jovian InfraRed Auroral Mapper on board Juno, is equipped with an infrared camera and a spectrometer working in the spectral range 2-5 μm. The primary scientific objectives of the instrument are the study of the infrared aurora and of the concentrations of some atmospheric compounds such as water, ammonia and phosphine in the Jupiter troposphere, in particular in the hot spots and below the cloud deck. Secondary JIRAM objectives are the study of Jupiter's clouds and, to some extent, the dynamics of the atmosphere. So far the instrument has obtained observations during the first fly-by (PJ1) in which JIRAM was operating. Results on the auroras and the atmosphere from data collected during PJ1 will be presented. We will also show data from the PJ4 pass if that fly-by, which will take place in February, is successful. A complete coverage of the planet will be obtained after PJ4.

  7. Low-cost thermo-electric infrared FPAs and their automotive applications

    NASA Astrophysics Data System (ADS)

    Hirota, Masaki; Ohta, Yoshimi; Fukuyama, Yasuhiro

    2008-04-01

    This paper describes three low-cost infrared focal plane arrays (FPAs) having 1,536, 2,304, and 10,800 elements, and experimental vehicle systems. They have low-cost potential because each element consists of p-n polysilicon thermocouples, which allows the use of low-cost ultra-fine microfabrication technology commonly employed in conventional semiconductor manufacturing processes. To increase the responsivity of the FPA, we have developed a precisely patterned Au-black absorber that has high infrared absorptivity of more than 90%. The FPA with 2,304 elements achieved a high responsivity of 4,300 V/W. In order to reduce package cost, we developed a vacuum-sealed package integrated with a molded ZnS lens. The camera aimed at temperature measurement of a passenger cabin is a compact and lightweight device that measures 45 x 45 x 30 mm and weighs 190 g. The camera achieves a noise equivalent temperature difference (NETD) of less than 0.7°C from 0 to 40°C. In this paper, we also present several experimental systems that use infrared cameras. One experimental system is a blind spot pedestrian warning system that employs four infrared cameras. It can detect the infrared radiation emitted from a human body and alerts the driver when a pedestrian is in a blind spot. The system can also prevent the vehicle from moving in the direction of the pedestrian. Another system uses a visible-light camera and infrared sensors to detect the presence of a pedestrian in a rear blind spot and alerts the driver. The third system is a new type of human-machine interface system that enables the driver to control the car's audio system without letting go of the steering wheel. Uncooled infrared cameras are still costly, which limits their automotive use to high-end luxury cars at present. To promote widespread use of IR imaging sensors on vehicles, we need to reduce their cost further.

  8. Optical design and athermalization analysis of infrared dual band refractive-diffractive telephoto objective

    NASA Astrophysics Data System (ADS)

    Dong, Jianing; Zhang, Yinchao; Chen, Siying; Chen, He; Guo, Pan

    2017-02-01

    In order to improve the remote target detection ability of infrared (IR) imaging systems effectively, an infrared telephoto objective for the 3-5 μm and 8-12 μm dual wavebands is designed for a 640 pixel × 512 pixel infrared CCD detector. The effects of the surrounding environmental temperature are analyzed and refractive-diffractive hybrid thermal compensation is discussed. The focal length of the system is 200 mm, the relative aperture is 1:2.2 and the field of view is 7°. The infrared dual-band telephoto system, with small volume and compact structure, is designed for a large range of temperatures. The system is composed of four lenses using only three materials, zinc sulfide, zinc selenide and germanium, to compensate for the temperature. The image quality of the system is evaluated with the ZEMAX optical design software. The results show that the modulation transfer function (MTF) for each field of view at the cut-off frequency of 17 lp/mm is greater than 0.6 and 0.4 in the two bands, respectively, which approaches the diffraction limit. The telephoto objective has favorable performance over the working temperature range of -40°C to +60°C. The relative aperture, field of view, and focal length are the same for both spectral regions. The system meets the requirements of the technical specification.

  9. Observations of Young Stellar Objects with Infrared Interferometry: Recent Results from PTI, KI and IOTA

    NASA Astrophysics Data System (ADS)

    Akeson, Rachel

    Young stellar objects have been one of the favorite targets of infrared interferometers for many years. In this contribution I will briefly review some of the first results and their contributions to the field and then describe some of the recent results from the Keck Interferometer (KI), the Palomar Testbed Interferometer (PTI) and the Infrared-Optical Telescope Array (IOTA). This conference also saw many exciting new results from the VLTI at both near and mid-infrared wavelengths that are covered by other contributions.

  10. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  11. Raspberry Pi camera with intervalometer used as crescograph

    NASA Astrophysics Data System (ADS)

    Albert, Stefan; Surducan, Vasile

    2017-12-01

    The intervalometer is an attachment or facility on a photo-camera that operates the shutter regularly at set intervals over a period. Professional cameras with built-in intervalometers are expensive and quite difficult to find. The Canon CHDK open source operating system allows intervalometer implementation on Canon cameras only. However, finding a Canon camera with a near infra-red (NIR) photographic lens at an affordable price is impossible. In experiments requiring several cameras (used to measure growth in plants - the crescographs - but also for coarse evaluation of the water content of leaves), the costs of the equipment are often over budget. Using two Raspberry Pi modules, each equipped with a low cost NIR camera and a WIFI adapter (for downloading pictures stored on the SD card), and some freely available software, we have implemented two low budget intervalometer cameras. The shooting interval, the number of pictures to be taken, image resolution and some other parameters can be fully programmed. Cameras have been in use continuously for three months (July-October 2017) in a relevant environment (outside), proving the concept functionality.
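    The core of such an intervalometer is a simple timed capture loop. A minimal sketch in Python (the capture callable is injected so the loop can run without camera hardware; in practice it would wrap the Raspberry Pi camera API):

```python
import time

def run_intervalometer(capture, interval_s, n_frames, sleep=time.sleep):
    """Minimal intervalometer loop.

    Calls capture(index) n_frames times, waiting interval_s seconds
    between shots.  capture should return an identifier (e.g. the
    saved filename); sleep is injectable for dry runs and tests.
    """
    filenames = []
    for i in range(n_frames):
        filenames.append(capture(i))
        if i < n_frames - 1:
            sleep(interval_s)
    return filenames

# Dry run with a fake capture function and no real waiting:
shots = run_intervalometer(lambda i: f"img_{i:04d}.jpg",
                           interval_s=600, n_frames=3,
                           sleep=lambda s: None)
```

    For plant-growth imaging, an interval of minutes to tens of minutes and sequential filenames are usually all that is needed; the downloaded frames are then compared offline.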

  12. Using the Standard Deviation of a Region of Interest in an Image to Estimate Camera to Emitter Distance

    PubMed Central

    Cano-García, Angel E.; Lazaro, José Luis; Infante, Arturo; Fernández, Pedro; Pompa-Chacón, Yamilet; Espinoza, Felipe

    2012-01-01

    In this study, a camera to infrared diode (IRED) distance estimation problem was analyzed. The main objective was to define an alternative way to measure depth using only the information extracted from the pixel grey levels of the IRED image to estimate the distance between the camera and the IRED. In this paper, the standard deviation of the pixel grey level in the region of interest containing the IRED image is proposed as an empirical parameter to define a model for estimating camera to emitter distance. This model includes the camera exposure time, IRED radiant intensity and the distance between the camera and the IRED. An expression for the standard deviation model related to these magnitudes was also derived and calibrated using different images taken under different conditions. From this analysis, we determined the optimum parameters to ensure the best accuracy provided by this alternative. Once the model calibration had been carried out, a differential method to estimate the distance between the camera and the IRED was defined and applied, considering that the camera was aligned with the IRED. The results indicate that this method represents a useful alternative for determining the depth information. PMID:22778608

  13. Using the standard deviation of a region of interest in an image to estimate camera to emitter distance.

    PubMed

    Cano-García, Angel E; Lazaro, José Luis; Infante, Arturo; Fernández, Pedro; Pompa-Chacón, Yamilet; Espinoza, Felipe

    2012-01-01

    In this study, a camera to infrared diode (IRED) distance estimation problem was analyzed. The main objective was to define an alternative way of measuring depth, using only the information extracted from the pixel grey levels of the IRED image to estimate the distance between the camera and the IRED. In this paper, the standard deviation of the pixel grey levels in the region of interest containing the IRED image is proposed as an empirical parameter for a model that estimates camera to emitter distance. This model includes the camera exposure time, the IRED radiant intensity and the distance between the camera and the IRED. An expression for the standard deviation model relating these magnitudes was derived and calibrated using images taken under different conditions. From this analysis, we determined the optimum parameters that ensure the best accuracy of this alternative. Once the model had been calibrated, a differential method for estimating the distance between the camera and the IRED was defined and applied, assuming that the camera was aligned with the IRED. The results indicate that this method is a useful alternative for determining depth information.

  14. ARNICA: the Arcetri Observatory NICMOS3 imaging camera

    NASA Astrophysics Data System (ADS)

    Lisi, Franco; Baffa, Carlo; Hunt, Leslie K.

    1993-10-01

    ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4 arcmin x 4 arcmin on the NICMOS 3 (256 x 256 pixels, 40 micrometer side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature is 76 K. The camera is remotely controlled by a 486 PC, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the 486 PC, acquires and stores the frames, and controls the timing of the array. We give an estimate of performance, in terms of sensitivity with an assigned observing time, along with some details on the main parameters of the NICMOS 3 detector.

  15. Dermoscopy-guided reflectance confocal microscopy of skin using high-NA objective lens with integrated wide-field color camera

    NASA Astrophysics Data System (ADS)

    Dickensheets, David L.; Kreitinger, Seth; Peterson, Gary; Heger, Michael; Rajadhyaksha, Milind

    2016-02-01

    Reflectance Confocal Microscopy, or RCM, is being increasingly used to guide diagnosis of skin lesions. The combination of widefield dermoscopy (WFD) with RCM is highly sensitive (~90%) and specific (~ 90%) for noninvasively detecting melanocytic and non-melanocytic skin lesions. The combined WFD and RCM approach is being implemented on patients to triage lesions into benign (with no biopsy) versus suspicious (followed by biopsy and pathology). Currently, however, WFD and RCM imaging are performed with separate instruments, while using an adhesive ring attached to the skin to sequentially image the same region and co-register the images. The latest small handheld RCM instruments offer no provision yet for a co-registered wide-field image. This paper describes an innovative solution that integrates an ultra-miniature dermoscopy camera into the RCM objective lens, providing simultaneous wide-field color images of the skin surface and RCM images of the subsurface cellular structure. The objective lens (0.9 NA) includes a hyperhemisphere lens and an ultra-miniature CMOS color camera, commanding a 4 mm wide dermoscopy view of the skin surface. The camera obscures the central portion of the aperture of the objective lens, but the resulting annular aperture provides excellent RCM optical sectioning and resolution. Preliminary testing on healthy volunteers showed the feasibility of combined WFD and RCM imaging to concurrently show the skin surface in wide-field and the underlying microscopic cellular-level detail. The paper describes this unique integrated dermoscopic WFD/RCM lens, and shows representative images. The potential for dermoscopy-guided RCM for skin cancer diagnosis is discussed.

  16. Dermoscopy-guided reflectance confocal microscopy of skin using high-NA objective lens with integrated wide-field color camera.

    PubMed

    Dickensheets, David L; Kreitinger, Seth; Peterson, Gary; Heger, Michael; Rajadhyaksha, Milind

    2016-02-01

    Reflectance Confocal Microscopy, or RCM, is being increasingly used to guide diagnosis of skin lesions. The combination of widefield dermoscopy (WFD) with RCM is highly sensitive (~90%) and specific (~ 90%) for noninvasively detecting melanocytic and non-melanocytic skin lesions. The combined WFD and RCM approach is being implemented on patients to triage lesions into benign (with no biopsy) versus suspicious (followed by biopsy and pathology). Currently, however, WFD and RCM imaging are performed with separate instruments, while using an adhesive ring attached to the skin to sequentially image the same region and co-register the images. The latest small handheld RCM instruments offer no provision yet for a co-registered wide-field image. This paper describes an innovative solution that integrates an ultra-miniature dermoscopy camera into the RCM objective lens, providing simultaneous wide-field color images of the skin surface and RCM images of the subsurface cellular structure. The objective lens (0.9 NA) includes a hyperhemisphere lens and an ultra-miniature CMOS color camera, commanding a 4 mm wide dermoscopy view of the skin surface. The camera obscures the central portion of the aperture of the objective lens, but the resulting annular aperture provides excellent RCM optical sectioning and resolution. Preliminary testing on healthy volunteers showed the feasibility of combined WFD and RCM imaging to concurrently show the skin surface in wide-field and the underlying microscopic cellular-level detail. The paper describes this unique integrated dermoscopic WFD/RCM lens, and shows representative images. The potential for dermoscopy-guided RCM for skin cancer diagnosis is discussed.

  17. Target discrimination of man-made objects using passive polarimetric signatures acquired in the visible and infrared spectral bands

    NASA Astrophysics Data System (ADS)

    Lavigne, Daniel A.; Breton, Mélanie; Fournier, Georges; Charette, Jean-François; Pichette, Mario; Rivet, Vincent; Bernier, Anne-Pier

    2011-10-01

    Surveillance operations and search and rescue missions regularly exploit electro-optic imaging systems to detect targets of interest in both the civilian and military communities. By incorporating the polarization of light as supplementary information to such electro-optic imaging systems, it is possible to increase their target discrimination capabilities, considering that man-made objects are known to depolarize light in a different manner than natural backgrounds. As it is known that electromagnetic radiation emitted and reflected from a smooth surface observed near a grazing angle becomes partially polarized in the visible and infrared wavelength bands, additional information about the shape, roughness, shading, and surface temperatures of difficult targets can be extracted by effectively processing such reflected/emitted polarized signatures. This paper presents a set of polarimetric image processing algorithms devised to extract meaningful information from a broad range of man-made objects. Passive polarimetric signatures are acquired in the visible, shortwave infrared, midwave infrared, and longwave infrared bands using a fully automated imaging system developed at DRDC Valcartier. A fusion algorithm is used to enable the discrimination of some objects lying in shadowed areas. Performance metrics, derived from the computed Stokes parameters, characterize the degree of polarization of man-made objects. Field experiments conducted during winter and summer time demonstrate: 1) the utility of the imaging system to collect polarized signatures of different objects in the visible and infrared spectral bands, and 2) the enhanced performance of target discrimination and fusion algorithms to exploit the polarized signatures of man-made objects against cluttered backgrounds.

  18. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

    Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions to applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have enabled highly reliable, low-cost, hand-held instruments. Initially these instruments displayed only relative temperature differences, which limited their usefulness in applications such as thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR Thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperature ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high resolution video imaging creates some unique challenges when using uncooled detectors. A temperature controlled, field-of-view limiting aperture (cold shield) is not typically included in the small volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.
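
    A first-order sketch of the kind of ambient-temperature compensation described above (lens transmission drifting with ambient temperature, plus a stray-radiation offset from the dewar walls and lens body). All coefficients are hypothetical calibration values for illustration, not the actual camera's:

```python
def compensate_signal(raw, ambient_c, tau_ref=0.95, tau_slope=-1e-3,
                      offset_slope=2.0):
    """Recover the scene-only signal from a raw detector reading.

    Assumed first-order model: lens transmission falls linearly with
    ambient temperature, and stray radiation adds a linear offset, i.e.
    raw = tau(T_amb) * scene + offset(T_amb). Invert for scene.
    """
    tau = tau_ref + tau_slope * ambient_c      # effective lens transmission
    offset = offset_slope * ambient_c          # stray-radiation contribution
    return (raw - offset) / tau
```

    In a real radiometric camera the coefficients would come from factory calibration over the environmental temperature range quoted in the abstract.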

  19. First experiences with ARNICA, the ARCETRI observatory imaging camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Hunt, L.; Maiolino, R.; Moriondo, G.; Stanga, R.

    1994-03-01

    ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a common use instrument for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4 arcmin x 4 arcmin on the NICMOS 3 (256 x 256 pixels, 40 micrometer side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature of detector and optics is 76 K. We give an estimate of performance, in terms of sensitivity with an assigned observing time, along with some preliminary considerations on photometric accuracy.

  20. Machine learning in infrared object classification - an all-sky selection of YSO candidates

    NASA Astrophysics Data System (ADS)

    Marton, Gabor; Zahorecz, Sarolta; Toth, L. Viktor; Magnus McGehee, Peregrine; Kun, Maria

    2015-08-01

    Object classification is a fundamental and challenging problem in the era of big data. I will discuss up-to-date methods and their application to the classification of infrared point sources. We analysed the ALLWISE catalogue, the most recent public source catalogue of the Wide-field Infrared Survey Explorer (WISE), to compile a reliable list of Young Stellar Object (YSO) candidates. We tested and compared both classical and up-to-date statistical methods to discriminate source types such as extragalactic objects, evolved stars, main sequence stars, objects related to the interstellar medium and YSO candidates, using their mid-IR WISE properties and associated near-IR 2MASS data. In this particular classification problem the Support Vector Machine (SVM), a class of supervised learning algorithms, turned out to be the best tool. As a result we classify Class I and II YSOs with >90% accuracy, while the fraction of contaminating extragalactic objects remains well below 1%, based on the number of known objects listed in the SIMBAD and VizieR databases. We compare our results to other classification schemes from the literature and show that the SVM outperforms methods that apply linear cuts in colour-colour and colour-magnitude space. Our homogeneous YSO candidate catalogue can serve as an excellent pathfinder for future detailed observations of individual objects and as a starting point for statistical studies that aim to add pieces to the big picture of star formation theory.
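
    The linear colour-cut baseline that the SVM is compared against can be sketched as a set of half-plane tests in a colour-colour diagram. The cut coefficients in the usage below are illustrative only, not those of any published scheme:

```python
def passes_cuts(colours, cuts):
    """Linear colour-cut selection in a colour-colour diagram.

    colours: (x, y) position of a source in the diagram.
    cuts: list of (a, b, c) half-planes; the source is selected only if
    a*x + b*y <= c holds for every cut (a convex selection region).
    An SVM with a non-linear kernel can carve out more flexible regions
    than any such set of straight-line cuts.
    """
    x, y = colours
    return all(a * x + b * y <= c for (a, b, c) in cuts)
```

    For example, four half-planes form a rectangular selection box, and a source is kept or rejected by testing its two colours against each cut in turn.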

  1. Near-infrared high-resolution real-time omnidirectional imaging platform for drone detection

    NASA Astrophysics Data System (ADS)

    Popovic, Vladan; Ott, Beat; Wellig, Peter; Leblebici, Yusuf

    2016-10-01

    Recent technological advancements in hardware systems have made higher-quality cameras possible. State-of-the-art panoramic systems use them to produce videos with a resolution of 9000 x 2400 pixels at a rate of 30 frames per second (fps). Many modern applications use object tracking to determine the speed and the path taken by each object moving through a scene. The detection requires detailed pixel analysis between two frames. In fields like surveillance systems or crowd analysis, this must be achieved in real time. In this paper, we focus on the system-level design of a multi-camera sensor acquiring the near-infrared (NIR) spectrum and its ability to detect mini-UAVs in a representative rural Swiss environment. The presented results show UAV detection from a field trial conducted in August 2015.
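
    A quick sanity check on the quoted figure: at 9000 x 2400 pixels and 30 fps the pipeline must sustain roughly 648 million pixels per second (about 648 MB/s at one byte per pixel), which is why real-time pixel-level analysis between frames is demanding. A trivial sketch:

```python
def pixel_rate(width, height, fps):
    """Raw pixel throughput (pixels per second) a real-time pipeline
    must sustain for a video stream of the given geometry and rate."""
    return width * height * fps
```
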

  2. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  3. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  4. LOITA: Lunar Optical/Infrared Telescope Array

    NASA Technical Reports Server (NTRS)

    1993-01-01

    LOITA (Lunar Optical/Infrared Telescope Array) is a lunar-based interferometer composed of 18 alt-azimuth telescopes arranged in a circular geometry. This geometry results in excellent uv coverage and allows baselines up to 5 km long. The angular resolution will be 25 micro-arcsec at 500 nm and the main spectral range of the array will be 200 to 1100 nm. For infrared planet detection, the spectral range may be extended to nearly 10 mu m. The telescope mirrors have a Cassegrain configuration using a 1.75 m diameter primary mirror and a 0.24 m diameter secondary mirror. A three-stage (coarse, intermediate, and fine) optical delay system, controlled by laser metrology, is used to equalize path lengths from different telescopes to within a few wavelengths. All instruments and the fine delay system are located within the instrument room. Upon exiting the fine delay system, all beams enter the beam combiner and are then directed to the various scientific instruments and detectors. The array instrumentation will consist of CCD detectors optimized for both the visible and infrared as well as specially designed cameras and spectrographs. For direct planet detection, a beam combiner employing achromatic nulling interferometry will be used to reduce star light (by several orders of magnitude) while passing the planet light. A single telescope will be capable of autonomous operation. This telescope will be equipped with four instruments: wide field and planetary camera, faint object camera, high resolution spectrograph, and faint object spectrograph. These instruments will be housed beneath the telescope. The array pointing and control system is designed to meet the fine pointing requirement of one micro-arcsec stability and to allow precise tracking of celestial objects for up to 12 days. During the lunar night, the optics and the detectors will be passively cooled to 70-80 K temperature. 
To maintain a continuous communication with the earth a relay satellite placed at the L4
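
    The quoted 25 micro-arcsecond resolution at 500 nm is consistent with the Rayleigh diffraction limit for the 5 km maximum baseline, as a quick check shows (a generic interferometry formula, not LOITA-specific code):

```python
import math

# micro-arcseconds per radian
RAD_TO_UAS = 180.0 / math.pi * 3600.0 * 1e6

def diffraction_limit_uas(wavelength_m, baseline_m):
    """Rayleigh-criterion angular resolution, 1.22 * lambda / B,
    expressed in micro-arcseconds."""
    return 1.22 * wavelength_m / baseline_m * RAD_TO_UAS
```

    Evaluating at 500 nm and a 5 km baseline gives about 25 micro-arcseconds, matching the figure in the abstract.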

  5. Hubble Space Telescope faint object camera instrument handbook (Post-COSTAR), version 5.0

    NASA Technical Reports Server (NTRS)

    Nota, A. (Editor); Jedrzejewski, R. (Editor); Greenfield, P. (Editor); Hack, W. (Editor)

    1994-01-01

    The faint object camera (FOC) is a long-focal-ratio, photon-counting device capable of taking high-resolution two-dimensional images of the sky up to 14 by 14 arc seconds squared in size with pixel dimensions as small as 0.014 by 0.014 arc seconds squared in the 1150 to 6500 A wavelength range. Its performance approaches that of an ideal imaging system at low light levels. The FOC is the only instrument on board the Hubble Space Telescope (HST) to fully use the spatial resolution capabilities of the optical telescope assembly (OTA) and is one of the European Space Agency's contributions to the HST program.

  6. Infrared Thermography-based Biophotonics: Integrated Diagnostic Technique for Systemic Reaction Monitoring

    NASA Astrophysics Data System (ADS)

    Vainer, Boris G.; Morozov, Vitaly V.

    A peculiar branch of biophotonics is the measurement, visualisation and quantitative analysis of infrared (IR) radiation emitted from the surfaces of living objects. Focal plane array (FPA)-based IR cameras make it possible to realize in medicine so-called interventional infrared thermal diagnostics. An integrated technique aimed at the advancement of this new approach in biomedical science and practice is described in the paper. The assembled system includes a high-performance short-wave (2.45-3.05 μm) or long-wave (8-14 μm) IR camera, two laser Doppler flowmeters (LDF) and additional equipment and complementary facilities implementing the monitoring of human cardiovascular status. All of these means operate synchronously. The relationship between infrared thermography (IRT) and LDF data in humans, with regard to their systemic cardiovascular reactivity, is ascertained for the first time. The real-time dynamics of blood supply in a narcotized patient are visualized and quantitatively represented for the first time during surgery, in order to observe how general hyperoxia influences thermoregulatory mechanisms; an abrupt increase in the temperature of the upper limb is observed using IRT. It is suggested that the IRT-based integrated technique may serve as a starting point for the elaboration of informative new methods directly applicable to medicine and the biomedical sciences.

  7. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters

    PubMed Central

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-01-01

    This paper describes an airborne high-resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method is proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible-band images and near-infrared-band images in scenes lacking man-made objects, we present an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high-quality multispectral images, and the band-to-band alignment error of the composed multispectral images is less than 2.5 pixels. PMID:26205264
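
    The RANSAC consensus loop used in such registration can be sketched as follows. For brevity, this toy version fits a 2-D translation from a single correspondence rather than the full homography (which needs at least four correspondences and a linear solve), but the sample-score-keep structure is the same:

```python
import random

def ransac_translation(matches, n_iter=200, tol=2.0, seed=0):
    """Minimal RANSAC over point correspondences.

    matches: list of ((x1, y1), (x2, y2)) matched keypoint pairs.
    Each iteration samples one pair, hypothesizes a translation (dx, dy),
    counts inliers within tol pixels, and keeps the largest consensus set.
    Returns (best_model, inliers).
    """
    rng = random.Random(seed)
    best, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.choice(matches)
        dx, dy = x2 - x1, y2 - y1
        inliers = [m for m in matches
                   if abs(m[1][0] - m[0][0] - dx) <= tol
                   and abs(m[1][1] - m[0][1] - dy) <= tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = (dx, dy), inliers
    return best, best_inliers
```

    Swapping the translation hypothesis for a four-point homography estimate recovers the scheme the paper's registration pipeline relies on.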

  8. Prism-based single-camera system for stereo display

    NASA Astrophysics Data System (ADS)

    Zhao, Yue; Cui, Xiaoyu; Wang, Zhiguo; Chen, Hongsheng; Fan, Heyu; Wu, Teresa

    2016-06-01

    This paper combines a prism with a single camera and puts forward a low-cost method of stereo imaging. First, according to the principles of geometrical optics, we derive the relationship between the prism single-camera system and a dual-camera system; then, according to the principles of binocular vision, we derive the relationship between binocular viewing and the dual-camera system. We can thus establish the relationship between the prism single-camera system and binocular vision, and obtain the positional relationship of prism, camera, and object that gives the best stereo display. Finally, using the active shutter stereo glasses of NVIDIA Company, we realize the three-dimensional (3-D) display of the object. The experimental results show that the proposed approach can use the prism single-camera system to simulate the various observation manners of the eyes. The stereo imaging system designed by the proposed method can faithfully restore the 3-D shape of the photographed object.
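
    In the simplest small-angle picture, the binocular relationship the paper derives reduces to standard triangulation with a virtual baseline created by the prism. The biprism model below is an illustrative assumption for the sketch, not the paper's exact derivation, and the numbers are hypothetical:

```python
import math

def virtual_baseline(prism_deviation_deg, prism_to_camera_mm):
    """Rough virtual baseline of the two half-images created by a biprism:
    each half of the aperture is deviated by +/- the prism deviation angle,
    acting like two virtual cameras (illustrative small-angle model)."""
    delta = math.radians(prism_deviation_deg)
    return 2.0 * prism_to_camera_mm * math.tan(delta)

def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Standard binocular triangulation: Z = f * B / d."""
    return focal_px * baseline_mm / disparity_px
```

    Once the virtual baseline is known, depth recovery proceeds exactly as for a conventional two-camera rig.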

  9. Measurement of reach envelopes with a four-camera Selective Spot Recognition (SELSPOT) system

    NASA Technical Reports Server (NTRS)

    Stramler, J. H., Jr.; Woolford, B. J.

    1983-01-01

    The basic Selective Spot Recognition (SELSPOT) system uses infrared LEDs and a 'camera' with an infrared-sensitive photodetector, a focusing lens, and some A/D electronics to produce a digital output representing X and Y coordinates for each LED for each camera. When the data are synthesized across all cameras with appropriate calibrations, an XYZ set of coordinates is obtained for each LED at a given point in time. Attention is given to the operating modes, a system checkout, and reach envelopes and software. The Video Recording Adapter (VRA) is the main addition to the basic SELSPOT system. The VRA contains a microprocessor and other electronics which permit user selection of several options and some interaction with the system.

  10. Design of a Remote Infrared Images and Other Data Acquisition Station for outdoor applications

    NASA Astrophysics Data System (ADS)

    Béland, M.-A.; Djupkep, F. B. D.; Bendada, A.; Maldague, X.; Ferrarini, G.; Bison, P.; Grinzato, E.

    2013-05-01

    The Infrared Images and Other Data Acquisition Station enables a user, who is located inside a laboratory, to acquire visible and infrared images and distances in an outdoor environment with the help of an Internet connection. This station can acquire data using an infrared camera, a visible camera, and a rangefinder. The system can be used through a web page or through Python functions.

  11. C-RED one: ultra-high speed wavefront sensing in the infrared made possible

    NASA Astrophysics Data System (ADS)

    Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian

    2016-07-01

    First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, a truly disruptive technology in imaging. We show the performance of the camera and its main features, and compare them to other high-performance wavefront sensing cameras such as OCAM2, in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 673944.

  12. Research on a solid state-streak camera based on an electro-optic crystal

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang

    2006-06-01

    With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, the streak camera is widely utilized in measuring ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and making measurements of laser-induced discharge. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity vs. position (or wavelength) information on the ultrafast process. Current streak cameras are based on a sweep electric pulse and an image converting tube with a wavelength-sensitive photocathode ranging from the x-ray to the near-infrared region. This kind of streak camera is comparatively costly and complex. This paper describes the design and performance of a new-style streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera achieves time resolution by direct photon beam deflection using the electro-optic effect, and can replace the current streak camera from the visible to the near-infrared region. After computer-aided simulation, we designed a crystal streak camera which has the potential for time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits and a crystal with a larger electro-optic coefficient, for example LN (γ33 = 33.6×10⁻¹² m/V), together with an optimized optical system, may lead to a time resolution of better than 1 ns.

  13. Convolutional Neural Network-Based Human Detection in Nighttime Images Using Visible Light Camera Sensors.

    PubMed

    Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung

    2017-05-08

    Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. The existing research using visible light cameras has mainly focused on methods of human detection for daytime hours when there is outside light, but human detection during nighttime hours when there is no outside light is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras, or thermal cameras, have been used. However, in the case of NIR illuminators, there are limitations in terms of the illumination angle and distance. There are also difficulties because the illuminator power must be adaptively adjusted depending on whether the object is close or far away. In the case of thermal cameras, their cost is still high, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but this has focused on objects at a short distance in an indoor environment or on video-based methods that capture and process multiple images, which increases the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night by a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results using a self-constructed Dongguk nighttime human detection database (DNHD-DB1) and two open databases (the Korea Advanced Institute of Science and Technology (KAIST) and Computer Vision Center (CVC) databases) show high-accuracy human detection in a variety of environments and excellent performance compared to existing methods.

  14. The infrared imaging radiometer for PICASSO-CENA

    NASA Astrophysics Data System (ADS)

    Corlay, Gilles; Arnolfo, Marie-Christine; Bret-Dibat, Thierry; Lifferman, Anne; Pelon, Jacques

    2017-11-01

    Microbolometers are infrared detectors based on an emerging technology developed mainly in the US and a few other countries over the past few years. The main targets of these developments are low-performance, low-cost military and civilian applications such as surveillance cameras. Applications in space are now arising thanks to the design simplification and the associated cost reduction allowed by this new technology. Among the four instruments of the PICASSO-CENA payload, the Imaging Infrared Radiometer (IIR) is based on microbolometer technology. An infrared camera in development for the IASI instrument is the core of the IIR. The aim of this paper is to recall the PICASSO-CENA mission goals, to describe the IIR instrument architecture, to highlight its main features and performance, and to give its development status.

  15. The use of far infra-red radiation for the detection of concealed metal objects.

    DOT National Transportation Integrated Search

    1971-11-01

    The use of infrared radiation for the detection of concealed metal objects has been investigated both theoretically and experimentally. The investigation was divided into two phases, one which considered passive techniques, and anoth...

  16. Performance and Calibration of H2RG Detectors and SIDECAR ASICs for the RATIR Camera

    NASA Technical Reports Server (NTRS)

    Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Klein, Christopher R.; Butler, Nathaniel R.; Bloom, Josh; de Diego, José A.; Simn Farah, Alejandro D.; Gehrels, Neil A.; Georgiev, Leonid; et al.

    2012-01-01

    The Reionization And Transient Infra-Red (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) follow-up and will provide simultaneous optical and infrared photometric capabilities. The infrared portion of this camera incorporates two Teledyne HgCdTe HAWAII-2RG detectors, controlled by Teledyne's SIDECAR ASICs. While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 interface card and IDE development environment. Together, this setup comprises Teledyne's Development Kit, a bundled solution that can be efficiently integrated into future ground-based systems. In this presentation, we characterize the system's read noise, dark current, and conversion gain.

  17. The use of far infra-red radiation for the detection of concealed metal objects

    DOT National Transportation Integrated Search

    1971-11-01

    The use of infrared radiation for the detection of concealed metal objects has been investigated both theoretically and experimentally. The investigation was divided into two phases, one which considered passive techniques, and another which involved...

  18. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell; Nettles, Mindy

    2015-01-01

    The Additive Manufacturing Infrared Inspection Task started the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques. This project will benefit additive manufacturing by providing real-time inspection of internal geometry that is not currently possible, and by reducing the time and cost of additively manufactured parts through automated real-time dimensional inspections that eliminate post-production inspection.

  19. Extended spectrum SWIR camera with user-accessible Dewar

    NASA Astrophysics Data System (ADS)

    Benapfl, Brendan; Miller, John Lester; Vemuri, Hari; Grein, Christoph; Sivananthan, Siva

    2017-02-01

    Episensors has developed a series of extended short wavelength infrared (eSWIR) cameras based on high-Cd concentration Hg1-xCdxTe absorbers. The cameras have a bandpass extending to 3 microns cutoff wavelength, opening new applications relative to traditional InGaAs-based cameras. Applications and uses are discussed and examples given. A liquid nitrogen pour-filled version was initially developed. This was followed by a compact Stirling-cooled version with detectors operating at 200 K. Each camera has unique sensitivity and performance characteristics. The cameras' size, weight and power specifications are presented along with images captured with bandpass filters and eSWIR sources to demonstrate spectral response beyond 1.7 microns. The soft seal Dewars of the cameras are designed for accessibility, and can be opened and modified in a standard laboratory environment. This modular approach allows user flexibility for swapping internal components such as cold filters and cold stops. The core electronics of the Stirling-cooled camera are based on a single commercial field programmable gate array (FPGA) that also performs on-board non-uniformity corrections, bad pixel replacement, and directly drives any standard HDMI display.

  20. Engineers Install Near Infrared Camera into the Heart of Webb Telescope

    NASA Image and Video Library

    2014-03-31

    Inside the world's largest clean room at NASA's Goddard Space Flight Center in Greenbelt, Md., engineers worked tirelessly to install another essential part of the James Webb Space Telescope, the Near Infrared Camera (NIRCam), into the heart of the telescope. To complete this installation, the engineers needed to carefully move NIRCam inside the ISIM, or Integrated Science Instrument Module, that will house all of the science instruments. "Installing NIRCam into the center of the structure is nerve wracking because of the tight clearances," said Marcia J. Rieke, Professor of Astronomy at the University of Arizona, and principal investigator for the NIRCam. "I'm glad nothing bumped, and all the bolts are in place." NIRCam is a unique machine because in addition to being one of the four science instruments on the Webb, it also serves as the wavefront sensor, which means it will provide vital information for shaping the telescope mirrors and aligning its optics so that they can function properly and see into the distant universe. The NIRCam instrument will operate at very cold temperatures, and will be tested to ensure that it will be able to withstand the environment of space. The NIRCam is Webb's primary imager and will cover the infrared wavelength range 0.6 to 5 microns. It will detect light from the earliest stars and galaxies in the process of formation, the population of stars in nearby galaxies, as well as young stars and exoplanets in the Milky Way. NIRCam is provided by the University of Arizona and Lockheed Martin Advanced Technology Center. Webb is an international project led by NASA with its partners the European Space Agency and the Canadian Space Agency. The James Webb Space Telescope is the successor to NASA's Hubble Space Telescope. It will be the most powerful space telescope ever built. For more information about the Webb telescope, visit: www.jwst.nasa.gov or www.nasa.gov/webb Credit: NASA/Goddard/Chris Gunn

  1. Correlating objective and subjective evaluation of texture appearance with applications to camera phone imaging

    NASA Astrophysics Data System (ADS)

    Phillips, Jonathan B.; Coppola, Stephen M.; Jin, Elaine W.; Chen, Ying; Clark, James H.; Mauer, Timothy A.

    2009-01-01

    Texture appearance is an important component of photographic image quality as well as object recognition. Noise cleaning algorithms are used to decrease sensor noise of digital images, but can hinder texture elements in the process. The Camera Phone Image Quality (CPIQ) initiative of the International Imaging Industry Association (I3A) is developing metrics to quantify texture appearance. Objective and subjective experimental results of the texture metric development are presented in this paper. Eight levels of noise cleaning were applied to ten photographic scenes that included texture elements such as faces, landscapes, architecture, and foliage. Four companies (Aptina Imaging, LLC, Hewlett-Packard, Eastman Kodak Company, and Vista Point Technologies) have performed psychophysical evaluations of overall image quality using one of two methods of evaluation. Both methods presented paired comparisons of images on thin film transistor liquid crystal displays (TFT-LCD), but the display pixel pitch and viewing distance differed. CPIQ has also been developing objective texture metrics and targets that were used to analyze the same eight levels of noise cleaning. The correlation of the subjective and objective test results indicates that texture perception can be modeled with an objective metric. The two methods of psychophysical evaluation exhibited high correlation despite the differences in methodology.
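
    The reported agreement between subjective ratings and objective texture metrics comes down to a correlation between two score vectors. A minimal sketch with entirely made-up numbers for the eight noise-cleaning levels (illustrative only, not the study's data):

```python
import numpy as np

# Hypothetical scores for eight noise-cleaning levels: an objective
# texture-metric value and a mean subjective quality rating per level.
objective = np.array([0.95, 0.90, 0.84, 0.75, 0.66, 0.55, 0.43, 0.30])
subjective = np.array([4.6, 4.5, 4.2, 3.9, 3.3, 2.8, 2.1, 1.5])

# Pearson correlation between the two evaluations; values near 1 indicate
# that the objective metric tracks perceived texture quality well.
r = np.corrcoef(objective, subjective)[0, 1]
print(round(r, 3))
```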

  2. Nonuniformity correction of infrared cameras by reading radiance temperatures with a spatially nonhomogeneous radiation source

    NASA Astrophysics Data System (ADS)

    Gutschwager, Berndt; Hollandt, Jörg

    2017-01-01

    We present a novel method of nonuniformity correction (NUC) of infrared cameras and focal plane arrays (FPA) in a wide optical spectral range by reading radiance temperatures and by applying a radiation source with an unknown and spatially nonhomogeneous radiance temperature distribution. The benefit of this novel method is that it works with the display and calculation of radiance temperatures, it can be applied to radiation sources of arbitrary spatial radiance temperature distribution, and it only requires sufficient temporal stability of this distribution during the measurement process. In contrast, a previously presented method calculated the NUC from the reading of monitored radiance values. Both methods are based on the recording of several (at least three) images of a radiation source and a purposeful row- and line-shift of these subsequent images in relation to the first primary image. The mathematical procedure is explained in detail. Its numerical verification with a source of a predefined nonhomogeneous radiance temperature distribution and a thermal imager of a predefined nonuniform FPA responsivity is presented.
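
    The row- and line-shift idea can be sketched under strong simplifying assumptions: purely additive (offset-only) nonuniformity, exact one-pixel shifts, and a perfectly stable source. When the shifted frames view the same scene points through different pixels, neighboring-pixel offset differences become observable and can be integrated up. This is an illustration of the principle only, not the authors' full procedure (which also handles responsivity and works in radiance temperatures):

```python
import numpy as np

def shift_nuc_offsets(f0, f_right, f_down):
    """Recover per-pixel additive offsets (up to one global constant) from
    three frames of a static nonuniform source: f0 unshifted, f_right with
    the camera shifted one column, f_down shifted one row."""
    h, w = f0.shape
    o = np.zeros((h, w))
    # f_right[i, j] and f0[i, j+1] see the same scene point, so their
    # difference isolates the offset difference o[i, j] - o[i, j+1].
    d_col = f_right[:, :-1] - f0[:, 1:]
    # likewise down the first column: o[i, 0] - o[i+1, 0]
    d_row = f_down[:-1, 0] - f0[1:, 0]
    o[1:, 0] = -np.cumsum(d_row)                      # chain rows together
    o[:, 1:] = o[:, [0]] - np.cumsum(d_col, axis=1)   # integrate along rows
    return o

# synthetic check: unknown scene plus unknown per-pixel offsets
rng = np.random.default_rng(1)
h, w = 6, 8
S = rng.normal(25.0, 3.0, (h + 1, w + 1))   # scene radiance temperatures
o_true = rng.normal(0.0, 1.5, (h, w))       # fixed-pattern offsets
f0 = S[:h, :w] + o_true
f_right = S[:h, 1:w + 1] + o_true
f_down = S[1:h + 1, :w] + o_true
o_est = shift_nuc_offsets(f0, f_right, f_down)
print(np.allclose(o_est - o_est[0, 0], o_true - o_true[0, 0]))  # True
```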

  3. Faint Object Camera observations of M87 - The jet and nucleus

    NASA Technical Reports Server (NTRS)

    Boksenberg, A.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.; Kamperman, T. M.

    1992-01-01

    UV and optical images of the central region and jet of the nearby elliptical galaxy M87 have been obtained with about 0.1 arcsec resolution in several spectral bands with the Faint Object Camera (FOC) on the HST, including polarization images. Deconvolution enhances the contrast of the complex structure and filamentary patterns in the jet already evident in the aberrated images. Morphologically there is close similarity between the FOC images of the extended jet and the best 2-cm radio maps obtained at similar resolution, and the magnetic field vectors from the UV and radio polarimetric data also correspond well. We observe structure in the inner jet within a few tenths arcsec of the nucleus which also has been well studied at radio wavelengths. Our UV and optical photometry of regions along the jet shows little variation in spectral index from the value 1.0 between markedly different regions and no trend to a steepening spectrum with distance along the jet.
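
    The spectral index quoted above follows the usual convention S_ν ∝ ν^(−α), so α follows directly from two flux densities at two frequencies. A small helper with illustrative numbers (not the paper's measurements):

```python
import math

def spectral_index(s1, nu1, s2, nu2):
    """Spectral index alpha in the S_nu proportional to nu**(-alpha)
    convention, from flux densities s1, s2 at frequencies nu1, nu2."""
    return -math.log(s1 / s2) / math.log(nu1 / nu2)

# illustrative: flux density falling 4x over a 4x rise in frequency
# corresponds to alpha = 1, the value reported along the M87 jet
print(round(spectral_index(1.0, 2.0e15, 4.0, 5.0e14), 3))  # 1.0
```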

  4. Relationship among eye temperature measured using digital infrared thermal imaging and vaginal and rectal temperatures in hair sheep and cattle

    USDA-ARS?s Scientific Manuscript database

    Digital infrared thermal imaging (DITI) using a thermal camera has potential to be a useful tool for the production animal industry. Thermography has been used in both humans and a wide range of animal species to measure body temperature as a method to detect injury or inflammation. The objective of...

  5. The use of far infra-red radiation for the detection of concealed metal objects.

    DOT National Transportation Integrated Search

    1971-11-01

    The use of infra-red radiation for the detection of concealed metal objects has been investigated both theoretically and experimentally. The investigation was divided into two phases, one which considered passive techniques, and another which...

  6. Vector-Based Ground Surface and Object Representation Using Cameras

    DTIC Science & Technology

    2009-12-01

    representations and it is a digital data structure used for the representation of a ground surface in geographical information systems (GIS). Figure... Vision API library, and the OpenCV library. Also, the POSIX thread library was utilized to quickly capture the source images from cameras. Both

  7. Multi-spectral imaging with infrared sensitive organic light emitting diode

    PubMed Central

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-01-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589

  8. Multi-spectral imaging with infrared sensitive organic light emitting diode

    NASA Astrophysics Data System (ADS)

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-08-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions.

  9. Design and Validation of an Infrared Badal Optometer for Laser Speckle (IBOLS)

    PubMed Central

    Teel, Danielle F. W.; Copland, R. James; Jacobs, Robert J.; Wells, Thad; Neal, Daniel R.; Thibos, Larry N.

    2009-01-01

    Purpose To validate the design of an infrared wavefront aberrometer with a Badal optometer employing the principle of laser speckle generated by a spinning disk and infrared light. The instrument was designed for subjective meridional refraction in infrared light by human patients. Methods Validation employed a model eye with known refractive error determined with an objective infrared wavefront aberrometer. The model eye was used to produce a speckle pattern on an artificial retina with controlled amounts of ametropia introduced with auxiliary ophthalmic lenses. A human observer performed the psychophysical task of observing the speckle pattern (with the aid of a video camera sensitive to infrared radiation) formed on the artificial retina. Refraction was performed by adjusting the vergence of incident light with the Badal optometer to nullify the motion of laser speckle. Validation of the method was performed for different levels of spherical ametropia and for various configurations of an astigmatic model eye. Results Subjective measurements of meridional refractive error over the range −4D to + 4D agreed with astigmatic refractive errors predicted by the power of the model eye in the meridian of motion of the spinning disk. Conclusions Use of a Badal optometer to control laser speckle is a valid method for determining subjective refractive error at infrared wavelengths. Such an instrument will be useful for comparing objective measures of refractive error obtained for the human eye with autorefractors and wavefront aberrometers that employ infrared radiation. PMID:18772719

  10. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    USGS Publications Warehouse

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  11. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    PubMed

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  12. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    PubMed Central

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing. PMID:28981533

  13. Science, conservation, and camera traps

    USGS Publications Warehouse

    Nichols, James D.; Karanth, K. Ullas; O'Connel, Allan F.; O'Connell, Allan F.; Nichols, James D.; Karanth, K. Ullas

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  14. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras.

    PubMed

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-06-24

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system.

  15. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras

    PubMed Central

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-01-01

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system. PMID:27347961

  16. Camera Systems Rapidly Scan Large Structures

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Needing a method to quickly scan large structures like an aircraft wing, Langley Research Center developed the line scanning thermography (LST) system. LST works in tandem with a moving infrared camera to capture how a material responds to changes in temperature. Princeton Junction, New Jersey-based MISTRAS Group Inc. now licenses the technology and uses it in power stations and industrial plants.

  17. SUBARU near-infrared multi-color images of Class II Young Stellar Object, RNO91

    NASA Astrophysics Data System (ADS)

    Mayama, Satoshi; Tamura, Motohide; Hayashi, Masahiko

    RNO91 is a Class II source currently in a transition phase between a protostar and a main-sequence star. It is known as a source of complex molecular outflows. Previous studies suggested that RNO91 is associated with a reflection nebula, a CO outflow, shock-excited H2 emission, and a disk-like structure. However, the geometry of RNO91, especially its inner region, is not yet well constrained. High-resolution imaging is needed to understand the nature of RNO91 and its interaction with the outflow. Furthermore, RNO91 is an important candidate for studying YSOs in a transition phase. We therefore conducted near-infrared imaging observations of RNO91 with the infrared camera CIAO mounted on the Subaru 8.2 m Telescope. We present JHK-band and optical images which resolve a complex asymmetrical circumstellar structure. We examined the color of the RNO91 nebula and compared the geometry of the system suggested by our data with that already proposed on the basis of other studies. Our main results are as follows: 1. At J band and optical wavelengths, several bluer clumps are detected, aligned nearly perpendicular to the outflow axis. 2. The NIR images show significant halo emission within 2'' of the peak position, while less halo emission is seen in the optical image. The nebula appears to become more circular and more diffuse with increasing wavelength. The power-law dependence of the radial surface brightness profile is shallower than that of normal stars, indicating that RNO91 is still an optically thick object. We suggest that the halo emission is NIR light scattered by an optically thick disk or envelope surrounding RNO91. 3. In the shorter-wavelength images, the nebula appears more extended (2".3 long) to the southwest. This extended emission might trace the bottom of an outflow emanating in the southwest direction. 4. The color composite image of RNO91 reveals that the emission extending to the north and to the east through RNO91 is interpreted as a part of the cavity wall seen

  18. Electronic camera-management system for 35-mm and 70-mm film cameras

    NASA Astrophysics Data System (ADS)

    Nielsen, Allan

    1993-01-01

    Military and commercial test facilities have been tasked with the need for increasingly sophisticated data collection and data reduction. A state-of-the-art electronic control system for high speed 35 mm and 70 mm film cameras designed to meet these tasks is described. Data collection in today's test range environment is difficult at best. The need for a completely integrated image and data collection system is mandated by the increasingly complex test environment. Instrumentation film cameras have been used on test ranges to capture images for decades. Their high frame rates coupled with exceptionally high resolution make them an essential part of any test system. In addition to documenting test events, today's camera system is required to perform many additional tasks. Data reduction to establish TSPI (time-space-position information) may be performed after a mission and is subject to all of the variables present in documenting the mission. A typical scenario would consist of multiple cameras located on tracking mounts capturing the event along with azimuth and elevation position data. Corrected data can then be reduced using each camera's time and position deltas and calculating the TSPI of the object using triangulation. An electronic camera control system designed to meet these requirements has been developed by Photo-Sonics, Inc. The feedback received from test technicians at range facilities throughout the world led Photo-Sonics to design the features of this control system. These prominent new features include: a comprehensive safety management system, full local or remote operation, frame rate accuracy of less than 0.005 percent, and phase locking capability to IRIG-B. In fact, IRIG-B phase lock operation of multiple cameras can reduce the time-distance delta of a test object traveling at Mach 1 to less than one inch during data reduction.
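
    The triangulation step described above amounts to intersecting the two cameras' line-of-sight rays, usually taken as the closest point between the two (nearly intersecting) lines. The sketch below illustrates this under assumed conventions (azimuth from +x toward +y, elevation from the horizontal plane); positions and angles are invented for the demonstration:

```python
import numpy as np

def los_vector(az_deg, el_deg):
    """Unit line-of-sight vector for azimuth (from +x toward +y) and elevation."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el)])

def triangulate(p1, az1, el1, p2, az2, el2):
    """Midpoint of the shortest segment between the two camera rays."""
    d1, d2 = los_vector(az1, el1), los_vector(az2, el2)
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                 # zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return (p1 + t1 * d1 + p2 + t2 * d2) / 2.0

# demo: recover a known target from two tracking-mount observations
target = np.array([1000.0, 500.0, 300.0])
p1, p2 = np.zeros(3), np.array([0.0, 2000.0, 0.0])

def az_el_to(p, t):
    v = t - p
    return (np.degrees(np.arctan2(v[1], v[0])),
            np.degrees(np.arcsin(v[2] / np.linalg.norm(v))))

az1, el1 = az_el_to(p1, target)
az2, el2 = az_el_to(p2, target)
est = triangulate(p1, az1, el1, p2, az2, el2)
print(np.allclose(est, target))  # True
```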

  19. Evaluation of fiber reinforced polymers using active infrared thermography system with thermoelectric cooling modules

    NASA Astrophysics Data System (ADS)

    Chady, Tomasz; Gorący, Krzysztof

    2018-04-01

    Active infrared thermography is increasingly used for the nondestructive testing of various materials. The properties of this method create a unique opportunity to use it for the inspection of composites. In active thermography, an external energy source is usually used to induce a thermal contrast inside the tested object, with conventional heating methods (such as halogen lamps or flash lamps) typically utilized for this purpose. In this study, we propose to use a cooling unit instead. The proposed system consists of a thermal imaging infrared camera, which is used to observe the surface of the inspected specimen, and a specially designed cooling unit with thermoelectric (Peltier) modules.

  20. New gonioscopy system using only infrared light.

    PubMed

    Sugimoto, Kota; Ito, Kunio; Matsunaga, Koichi; Miura, Katsuya; Esaki, Koji; Uji, Yukitaka

    2005-08-01

    To describe an infrared gonioscopy system designed to observe the anterior chamber angle under natural mydriasis in a completely darkened room. An infrared light filter was used to modify the light source of the slit-lamp microscope. A television monitor connected to a CCD monochrome camera was used to indirectly observe the angle. Use of the infrared system enabled observation of the angle under natural mydriasis in a completely darkened room. Infrared gonioscopy is a useful procedure for the observation of the angle under natural mydriasis.

  1. Far-infrared photometry of compact extragalactic objects - Detection of 3C 345

    NASA Technical Reports Server (NTRS)

    Harvey, P. M.; Wilking, B. A.; Joy, M.

    1982-01-01

    The first detection of a quasar between 10 and 1000 microns is reported. The observation permits (1) the determination of the intersection of the optical/infrared and millimeter continua; (2) more precise determination of the total luminosity; (3) the placing of limits on the contribution of any thermal dust emission to the total luminosity. The quasar is the first object ever to have been observed whose energy distribution peaks at a wavelength of about 100 microns without a large contribution to the total luminosity from thermal dust emission. The observed flux density of 2.2 ± 0.5 Jy at 100 microns and an upper limit of 0.5 ± 0.6 Jy at 50 microns clearly define the overall energy distribution and show the quasar to be a powerful far-infrared source.

  2. Remote Sensing of Arctic Environmental Conditions and Critical Infrastructure using Infra-Red (IR) Cameras and Unmanned Air Vehicles (UAVs)

    NASA Astrophysics Data System (ADS)

    Hatfield, M. C.; Webley, P.; Saiet, E., II

    2014-12-01

    Numerous scientific and logistical applications exist in Alaska and other arctic regions requiring analysis of expansive, remote areas in the near infrared (NIR) and thermal infrared (TIR) bands. These include characterization of wild land fire plumes and volcanic ejecta, detailed mapping of lava flows, and inspection of lengthy segments of critical infrastructure, such as the Alaska pipeline and railroad system. Obtaining timely, repeatable, calibrated measurements of these extensive features and infrastructure networks requires localized, taskable assets such as UAVs. The Alaska Center for Unmanned Aircraft Systems Integration (ACUASI) provides practical solutions to these problem sets by pairing various IR sensors with a combination of fixed-wing and multi-rotor air vehicles. Fixed-wing assets, such as the Insitu ScanEagle, offer long reach and extended duration capabilities to quickly access remote locations and provide enduring surveillance of the target of interest. Rotary-wing assets, such as the Aeryon Scout or the ACUASI-built Ptarmigan hexcopter, provide a precision capability for detailed horizontal mapping or vertical stratification of atmospheric phenomena. When included with other ground capabilities, we will show how they can assist in decision support and hazard assessment, giving those in emergency management a new ability to increase knowledge of the event at hand while reducing the risk to all involved. Here, in this presentation, we illustrate how UAVs can provide the ideal tool to map and analyze hazardous events and critical infrastructure under extreme environmental conditions.

  3. Atlas of low-mass young stellar object disks from mid-infrared interferometry

    NASA Astrophysics Data System (ADS)

    Varga, J.; Ábrahám, P.; Ratzka, Th.; Menu, J.; Gabányi, K.; Kóspál, Á.; van Boekel, R.; Mosoni, L.; Henning, Th.

    We present our approach to visibility modeling of disks around low-mass (< 2 M⊙) young stellar objects (YSOs). We compiled an atlas based on mid-infrared interferometric observations from the MIDI instrument at the VLTI. We use three different models to fit the data. These models allow us to determine the overall sizes (and the extent of the inner gaps) of the modeled circumstellar disks.

  4. Enhanced LWIR NUC using an uncooled microbolometer camera

    NASA Astrophysics Data System (ADS)

    LaVeigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian

    2011-06-01

    Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector, and the best NUC is performed in the band of interest for the sensor being tested. While cooled, large format MWIR cameras are readily available and have been successfully used to perform NUC, similar cooled, large format LWIR cameras are not as common and are prohibitively expensive. Large format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an infrared scene projector (IRSP). Some of these challenges were discussed in a previous paper. In this discussion, we report results from a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. Camera instability, temporal response, and thermal resolution were the main problems; these have been solved by the implementation of several compensation strategies as well as hardware used to stabilize the camera. In addition, other processes have been developed to allow iterative improvement as well as to support changes of the post-NUC lookup table (LUT) without requiring re-collection of the pre-NUC data with the new LUT in use.
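
    Band-specific non-uniformity correction of the kind discussed above is commonly built on a two-point (gain/offset) model. The sketch below is a minimal, hypothetical illustration of that model, not the authors' procedure; the array names and the synthetic camera model are assumptions:

    ```python
    import numpy as np

    def two_point_nuc(raw, flat_lo, flat_hi, t_lo, t_hi):
        """Per-pixel gain/offset correction derived from two uniform
        (flat-field) reference frames at known temperatures."""
        gain = (t_hi - t_lo) / (flat_hi - flat_lo)  # per-pixel responsivity
        offset = t_lo - gain * flat_lo              # per-pixel fixed-pattern offset
        return gain * raw + offset                  # corrected frame

    # Synthetic camera: every pixel has its own gain and offset error.
    rng = np.random.default_rng(0)
    true_gain = 1.0 + 0.1 * rng.standard_normal((4, 4))
    true_off = 5.0 * rng.standard_normal((4, 4))
    scene = lambda t: true_gain * t + true_off      # raw response to a uniform scene
    corrected = two_point_nuc(scene(40.0), scene(20.0), scene(60.0), 20.0, 60.0)
    # corrected recovers the uniform 40-degree scene at every pixel
    ```

    In practice the pre-NUC data collection and iterative refinement described in the record replace the two idealized flat-field frames used here.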

  5. Monitoring the body temperature of cows and calves using video recordings from an infrared thermography camera.

    PubMed

    Hoffmann, Gundula; Schmidt, Mariana; Ammon, Christian; Rose-Meierhöfer, Sandra; Burfeind, Onno; Heuwieser, Wolfgang; Berg, Werner

    2013-06-01

    The aim of this study was to assess the variability of temperatures measured by a video-based infrared camera (IRC) in comparison to rectal and vaginal temperatures. The body surface temperatures of cows and calves were measured without contact at different body regions using videos from the IRC. Altogether, 22 cows and 9 calves were examined. The differences among the measured IRC temperatures of the body regions, i.e. eye (mean: 37.0 °C), back of the ear (35.6 °C), shoulder (34.9 °C) and vulva (37.2 °C), were significant (P < 0.01), except between eye and vulva (P = 0.99). The quartile ranges of the measured IRC temperatures at the 4 above-mentioned regions were between 1.2 and 1.8 K. Of the investigated body regions, the eye and the back of the ear proved to be suitable as practical regions for temperature monitoring. The temperatures of these 2 regions could be obtained by using the maximum temperatures of the head and body areas. Therefore, only the maximum temperatures of both areas were used for further analysis. The data analysis showed that the maximum temperatures measured by IRC at the head and body areas increased with rectal temperature in cows and calves. The use of infrared thermography videos has the advantage of allowing more than 1 picture per animal to be analyzed in a short period of time, and shows potential as a monitoring system for body temperatures in cattle.

  6. Accuracy, resolution, and cost comparisons between small format and mapping cameras for environmental mapping

    NASA Technical Reports Server (NTRS)

    Clegg, R. H.; Scherz, J. P.

    1975-01-01

    Successful aerial photography depends on aerial cameras providing acceptable photographs within the cost restrictions of the job. For topographic mapping, where ultimate accuracy is required, only large-format mapping cameras will suffice. For mapping environmental patterns of vegetation, soils, or water pollution, 9-inch cameras often exceed accuracy and cost requirements, and small formats may be better. In choosing the best camera for environmental mapping, relative capabilities and costs must be understood. This study compares the resolution, photo interpretation potential, metric accuracy, and cost of 9-inch, 70mm, and 35mm cameras for obtaining simultaneous color and color infrared photography for environmental mapping purposes.

  7. Slitless spectroscopy with the James Webb Space Telescope Near-Infrared Camera (JWST NIRCam)

    NASA Astrophysics Data System (ADS)

    Greene, Thomas P.; Chu, Laurie; Egami, Eiichi; Hodapp, Klaus W.; Kelly, Douglas M.; Leisenring, Jarron; Rieke, Marcia; Robberto, Massimo; Schlawin, Everett; Stansberry, John

    2016-07-01

    The James Webb Space Telescope near-infrared camera (JWST NIRCam) has two 2.2' x 2.2' fields of view that are capable of either imaging or spectroscopic observations. Either of two R ~ 1500 grisms with orthogonal dispersion directions can be used for slitless spectroscopy over λ = 2.4 - 5.0 μm in each module, and shorter wavelength observations of the same fields can be obtained simultaneously. We present the latest predicted grism sensitivities, saturation limits, resolving power, and wavelength coverage values based on component measurements, instrument tests, and end-to-end modeling. Short wavelength (0.6 - 2.3 μm) imaging observations of the 2.4 - 5.0 μm spectroscopic field can be performed in one of several different filter bands, either in-focus or defocused via weak lenses internal to NIRCam. Alternatively, the possibility of 1.0 - 2.0 μm spectroscopy (simultaneously with 2.4 - 5.0 μm) using dispersed Hartmann sensors (DHSs) is being explored. The grisms, weak lenses, and DHS elements were included in NIRCam primarily for wavefront sensing purposes, but all have significant science applications. Operational considerations, including subarray sizes and data volume limits, are also discussed. Finally, we describe spectral simulation tools and illustrate potential scientific uses of the grisms by presenting simulated observations of deep extragalactic fields, galactic dark clouds, and transiting exoplanets.

  8. Research on Geometric Calibration of Spaceborne Linear Array Whiskbroom Camera

    PubMed Central

    Sheng, Qinghong; Wang, Qi; Xiao, Hui; Wang, Qing

    2018-01-01

    The geometric calibration of a spaceborne thermal-infrared camera with high spatial resolution and wide coverage can set benchmarks for providing accurate geographical coordinates for the retrieval of land surface temperature. Using linear array whiskbroom Charge-Coupled Device (CCD) arrays to image the Earth makes it possible to acquire thermal-infrared images of large breadth with high spatial resolution. Focusing on the whiskbroom characteristics of equal time intervals and unequal angles, the present study proposes a spaceborne linear-array-scanning imaging geometric model and calibrates the temporal system parameters and whiskbroom angle parameters. With the help of YG-14, China's first satellite equipped with thermal-infrared cameras of high spatial resolution, China's Anyang imagery and Taiyuan imagery are used to conduct a geometric calibration experiment and a verification test, respectively. Results have shown that the plane positioning accuracy without ground control points (GCPs) is better than 30 pixels and the plane positioning accuracy with GCPs is better than 1 pixel. PMID:29337885

  9. Non-contact local temperature measurement inside an object using an infrared point detector

    NASA Astrophysics Data System (ADS)

    Hisaka, Masaki

    2017-04-01

    Local temperature measurement in deep areas of objects is an important technique in biomedical measurement. We have investigated a non-contact method for measuring the temperature inside an object using a point detector for infrared (IR) light. An IR point detector with a pinhole was constructed, and the radiant IR light emitted from the local interior of the object is detected only at the position of the pinhole, which is placed in an imaging relation with the source point. We measured the thermal structure of the filament inside a miniature bulb using the IR point detector, and investigated the temperature dependence near human body temperature using a glass plate positioned in front of the heat source.

  10. Leveraging traffic and surveillance video cameras for urban traffic.

    DOT National Transportation Integrated Search

    2014-12-01

    The objective of this project was to investigate the use of existing video resources, such as traffic : cameras, police cameras, red light cameras, and security cameras for the long-term, real-time : collection of traffic statistics. An additional ob...

  11. Optical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Seppala, L; Gilmore, K

    2008-07-16

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band, filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.

  12. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC)and Submillemeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  13. NEAR-INFRARED THERMAL EMISSION DETECTIONS OF A NUMBER OF HOT JUPITERS AND THE SYSTEMATICS OF GROUND-BASED NEAR-INFRARED PHOTOMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croll, Bryce; Albert, Loic; Lafreniere, David

    We present detections of the near-infrared thermal emission of three hot Jupiters and one brown dwarf using the Wide-field Infrared Camera (WIRCam) on the Canada-France-Hawaii Telescope (CFHT). These include Ks-band secondary eclipse detections of the hot Jupiters WASP-3b and Qatar-1b and the brown dwarf KELT-1b. We also report Y-band, K {sub CONT}-band, and two new and one reanalyzed Ks-band detections of the thermal emission of the hot Jupiter WASP-12b. We present a new reduction pipeline for CFHT/WIRCam data, which is optimized for high precision photometry. We also describe novel techniques for constraining systematic errors in ground-based near-infrared photometry, so as to return reliable secondary eclipse depths and uncertainties. We discuss the noise properties of our ground-based photometry for wavelengths spanning the near-infrared (the YJHK bands), for faint and bright stars, and for the same object on several occasions. For the hot Jupiters WASP-3b and WASP-12b we demonstrate the repeatability of our eclipse depth measurements in the Ks band; we therefore place stringent limits on the systematics of ground-based, near-infrared photometry, and also rule out violent weather changes in the deep, high pressure atmospheres of these two hot Jupiters at the epochs of our observations.

  14. Mid-Infrared Reflectance Imaging of Thermal-Barrier Coatings

    NASA Technical Reports Server (NTRS)

    Eldridge, Jeffrey I.; Martin, Richard E.

    2009-01-01

    An apparatus for mid-infrared reflectance imaging has been developed as a means of inspecting for subsurface damage in thermal-barrier coatings (TBCs). The apparatus is designed, more specifically, for imaging the progression of buried delamination cracks in plasma-sprayed yttria-stabilized zirconia coatings on turbine-engine components. Progression of TBC delamination occurs by the formation of buried cracks that grow and then link together to produce eventual TBC spallation. The mid-infrared reflectance imaging system described here makes it possible to see delamination progression that is invisible to the unaided eye, and therefore to give sufficient advance warning before delamination progression adversely affects engine performance and safety. The apparatus (see figure) includes a commercial mid-infrared camera that contains a liquid-nitrogen-cooled indium antimonide focal-plane photodetector array, and imaging is restricted by a narrow bandpass centered at a wavelength of 4 microns. This narrow wavelength range centered at 4 microns was chosen because (1) it enables avoidance of interfering absorptions by atmospheric OH and CO2 at 3 and 4.25 microns, respectively; and (2) the coating material exhibits maximum transparency in this wavelength range. Delamination contrast is produced in the mid-infrared reflectance images because the introduction of cracks into the TBC creates an internal TBC/air-gap interface with a high diffuse reflectivity of 0.81, resulting in substantially higher reflectance of mid-infrared radiation in regions that contain buried delamination cracks. The camera is positioned a short distance (~12 cm) from the specimen. The mid-infrared illumination is generated by a 50-watt silicon carbide source positioned to the side of the mid-infrared camera, and the illumination is collimated and reflected onto the specimen by a 6.35-cm-diameter off-axis paraboloidal mirror. Because the collected images are of a steady-state reflected intensity (in

  15. Holographic motion picture camera with Doppler shift compensation

    NASA Technical Reports Server (NTRS)

    Kurtz, R. L. (Inventor)

    1976-01-01

    A holographic motion picture camera for producing three-dimensional images by employing an elliptical optical system is reported. In one of the beam paths (the object or reference beam path) there is provided a motion compensator which enables the camera to photograph faster moving objects.

  16. KENNEDY SPACE CENTER, FLA. - STS-82 crew members and workers at KSC's Vertical Processing Facility get a final look at the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) in its flight configuration for the STS-82 mission.

    NASA Image and Video Library

    1997-01-22

    KENNEDY SPACE CENTER, FLA. - STS-82 crew members and workers at KSC's Vertical Processing Facility get a final look at the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) in its flight configuration for the STS-82 mission. The crew is participating in the Crew Equipment Integration Test (CEIT). NICMOS is one of two new scientific instruments that will replace two outdated instruments on the Hubble Space Telescope (HST). NICMOS will provide HST with the capability for infrared imaging and spectroscopic observations of astronomical targets. The refrigerator-sized NICMOS also is HST's first cryogenic instrument - its sensitive infrared detectors must operate at very cold temperatures of minus 355 degrees Fahrenheit or 58 degrees Kelvin. NICMOS will be installed in Hubble during STS-82, the second Hubble Space Telescope servicing mission. Liftoff is scheduled Feb. 11 aboard Discovery with a crew of seven.

  17. SWIR, VIS and LWIR observer performance against handheld objects: a comparison

    NASA Astrophysics Data System (ADS)

    Adomeit, Uwe

    2016-10-01

    The short wave infrared (SWIR) spectral range has attracted interest for day- and night-time military and security applications in recent years. This necessitates performance assessment of SWIR imaging equipment in comparison to equipment operating in the visual (VIS) and thermal infrared (LWIR) spectral ranges. In the military context, (nominal) range is the main performance criterion. Discriminating friend from foe is one of the main tasks in today's asymmetric scenarios, and so personnel, human activities and handheld objects are used as targets to estimate ranges. The latter were also used for an experiment at Fraunhofer IOSB to get a first impression of how the SWIR performs compared to VIS and LWIR. A human consecutively carrying one of nine different civil or military objects was recorded from five different ranges in the three spectral ranges. For the visual spectral range a 3-chip color camera was used; the SWIR range was covered by an InGaAs camera and the LWIR by an uncooled bolometer. It was ascertained that the nominal spatial resolution of the three cameras was of the same magnitude in order to enable an unbiased assessment. Daytime conditions were selected for data acquisition to separate the observer performance from illumination conditions and, to some extent, also camera performance. From the recorded data, a perception experiment was prepared. It was conducted as a nine-alternative forced choice, unlimited observation time test with 15 observers participating. Before the experiment, the observers were trained on close range target data. The outcome of the experiment was the average probability of identification versus range between camera and target. The comparison of the range performance achieved in the three spectral bands gave a mixed result: on the one hand a ranking VIS / SWIR / LWIR in decreasing order can be seen in the data, but on the other hand only the difference between VIS and the other bands is statistically significant. Additionally it was not possible

  18. EARLY SCIENCE WITH SOFIA, THE STRATOSPHERIC OBSERVATORY FOR INFRARED ASTRONOMY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, E. T.; Becklin, E. E.; De Buizer, J. M.

    The Stratospheric Observatory For Infrared Astronomy (SOFIA) is an airborne observatory consisting of a specially modified Boeing 747SP with a 2.7 m telescope, flying at altitudes as high as 13.7 km (45,000 ft). Designed to observe at wavelengths from 0.3 {mu}m to 1.6 mm, SOFIA operates above 99.8% of the water vapor that obscures much of the infrared and submillimeter. SOFIA has seven science instruments under development, including an occultation photometer, near-, mid-, and far-infrared cameras, infrared spectrometers, and heterodyne receivers. SOFIA, a joint project between NASA and the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt), began initial science flights in 2010 December, and has conducted 30 science flights in the subsequent year. During this early science period three instruments have flown: the mid-infrared camera FORCAST, the heterodyne spectrometer GREAT, and the occultation photometer HIPO. This Letter provides an overview of the observatory and its early performance.

  19. Infrared techniques for comet observations

    NASA Technical Reports Server (NTRS)

    Hanner, Martha S.; Tokunaga, Alan T.

    1991-01-01

    The infrared spectral region (1-1000 microns) is important for studies of both molecules and solid grains in comets. Infrared astronomy is in the midst of a technological revolution, with the development of sensitive 2D arrays leading to IR cameras and spectrometers with vastly improved sensitivity and resolution. The Halley campaign gave us tantalizing first glimpses of the comet science possible with this new technology, evidenced, for example, by the many new spectral features detected in the infrared. The techniques of photometry, imaging, and spectroscopy are reviewed in this chapter and their status at the time of the Halley observations is described.

  20. Traffic monitoring with distributed smart cameras

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin; Ulm, Michael; Schwingshackl, Gert

    2012-01-01

    The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has a big potential. Today the automated analysis of traffic situations is still in its infancy: the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully captured and interpreted by a vision system. In this work we present steps towards a visual monitoring system which is designed to detect potentially dangerous traffic situations around a pedestrian crossing at a street intersection. The camera system is specifically designed to detect incidents in which the interaction of pedestrians and vehicles might develop into safety critical encounters. The proposed system has been field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in a weatherproof housing. Two cameras run vehicle detection and tracking software; one camera runs a pedestrian detection and tracking module based on the HOG detection principle. All 3 cameras use sparse optical flow computation in a low-resolution video stream in order to estimate the motion path and speed of objects. Geometric calibration of the cameras allows us to estimate the real-world coordinates of detected objects and to link the cameras together into one common reference system. This work describes the foundation for all the different object detection modalities (pedestrians, vehicles), and explains the system setup, its design, and the evaluation results which we have achieved so far.
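
    The geometric calibration step, mapping detected objects from pixel coordinates to real-world road-plane coordinates, is commonly expressed as a ground-plane homography. The sketch below is a minimal illustration with a made-up calibration matrix, not the deployed system's code:

    ```python
    import numpy as np

    def image_to_ground(H, px):
        """Project a pixel (u, v) onto the road plane using a 3x3
        homography H obtained from geometric calibration."""
        u, v = px
        x, y, w = H @ np.array([u, v, 1.0])
        return np.array([x / w, y / w])  # metres in the common reference frame

    # Hypothetical calibration: 1 px = 2 cm on the road, origin offset (5 m, 3 m).
    H = np.array([[0.02, 0.00, 5.00],
                  [0.00, 0.02, 3.00],
                  [0.00, 0.00, 1.00]])
    pt = image_to_ground(H, (100, 50))  # -> [7.0, 4.0] metres
    ```

    Differencing such ground-plane positions between frames and dividing by the frame interval yields the object speed estimates mentioned above.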

  1. Non-contact measurement of rotation angle with solo camera

    NASA Astrophysics Data System (ADS)

    Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun

    2015-02-01

    For the purpose of measuring a rotation angle around the axis of an object, a non-contact rotation angle measurement method based on a single camera is proposed. The intrinsic parameters of the camera were calibrated using a chessboard pattern according to the plane-based calibration theory. The translation matrix and rotation matrix between the object coordinate system and the camera coordinate system were calculated from the relationship between the corners' positions on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed +/- 0.01 degree.
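
    Once the rotation matrix has been recovered, the rotation angle follows from the trace identity trace(R) = 1 + 2 cos(theta). A minimal sketch of that last step (not the paper's implementation):

    ```python
    import numpy as np

    def rotation_angle_deg(R):
        """Rotation angle of a 3x3 rotation matrix via trace(R) = 1 + 2*cos(theta)."""
        c = (np.trace(R) - 1.0) / 2.0
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))  # clip guards rounding

    # Example: a 30-degree rotation about the z axis (the object's rotation axis).
    th = np.radians(30.0)
    Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
    angle = rotation_angle_deg(Rz)  # -> 30.0
    ```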

  2. Objectives for the Space Infrared Telescope Facility

    NASA Technical Reports Server (NTRS)

    Spehalski, Richard J.; Werner, Michael W.

    1991-01-01

    The Space Infrared Telescope Facility (SIRTF) is a one-meter-class, liquid-helium-cooled, earth-orbiting astronomical observatory that will be the infrared component of NASA's family of Great Observatories. SIRTF will investigate numerous scientific areas including formation and evolution of galaxies, stars, and other solar systems; supernovae; phenomena in our own solar system; and, undoubtedly, topics that are outside today's scientific domain. SIRTF's three instruments will permit imaging at all infrared wavelengths from 1.8 to 1200 microns and spectroscopy from 2.5 to 200 microns. The observatory will operate at an altitude of 100,000 km where it will achieve a five-year lifetime and operate with better than 80 percent on-target efficiency. The scientific importance and technical and programmatic readiness of SIRTF has been recognized by the 1991 report of the National Research Council's Astronomy and Astrophysics Survey Committee which recently identified SIRTF as the highest priority major new initiative in all of astronomy for the coming decade.

  3. James Webb Telescope's Near Infrared Camera: Making Models, Building Understanding

    NASA Astrophysics Data System (ADS)

    Lebofsky, Larry A.; McCarthy, D. W.; Higgins, M. L.; Lebofsky, N. R.

    2010-10-01

    The Astronomy Camp for Girl Scout Leaders is a science education program sponsored by NASA's next large space telescope: the James Webb Space Telescope (JWST). The E/PO team for JWST's Near Infrared Camera (NIRCam), in collaboration with the Sahuaro Girl Scout Council, has developed a long-term relationship with adult leaders from all GSUSA Councils that directly benefits troops of all ages, not only in general science education but also specifically in the astronomical and technology concepts relating to JWST. We have been training and equipping these leaders so they can in turn teach young women essential concepts in astronomy, i.e., the night sky environment. We model what astronomers do by engaging trainers in the process of scientific inquiry, and we equip them to host troop-level astronomy-related activities. It is GSUSA's goal to foster girls' interest and creativity in Science, Technology, Engineering, and Math, creating an environment that encourages their interests early in their lives while creating a safe place for girls to try and fail, and then try again and succeed. To date, we have trained over 158 leaders in 13 camps. These leaders have come from 24 states, DC, Guam, and Japan. While many of the camp activities are related to the "First Light" theme, many of the background activities relate to two of the other JWST and NIRCam themes: "Birth of Stars and Protoplanetary Systems" and "Planetary Systems and the Origin of Life." The latter includes our own Solar System. Our poster will highlight the Planetary Systems theme: 1. Earth and Moon: Day and Night; Rotation and Revolution. 2. Earth/Moon Comparisons. 3. Size Model: The Diameters of the Planets. 4. Macramé Planetary (Solar) Distance Model. 5. What is a Planet? 6. Planet Sorting Cards. 7. Human Orrery. 8. Lookback Time in Our Daily Lives. NIRCam E/PO website: http://zeus.as.arizona.edu/ dmccarthy/GSUSA

  4. Aircraft engine-mounted camera system for long wavelength infrared imaging of in-service thermal barrier coated turbine blades

    NASA Astrophysics Data System (ADS)

    Markham, James; Cosgrove, Joseph; Scire, James; Haldeman, Charles; Agoos, Ian

    2014-12-01

    This paper announces the implementation of a long wavelength infrared camera to obtain high-speed thermal images of an aircraft engine's in-service thermal barrier coated turbine blades. Long wavelength thermal images were captured of first-stage blades. The achieved temporal and spatial resolutions allowed for the identification of cooling-hole locations. The software and synchronization components of the system allowed for the selection of any blade on the turbine wheel, with tuning capability to image from leading edge to trailing edge. Its first application delivered calibrated thermal images as a function of turbine rotational speed at both steady state conditions and during engine transients. In advance of presenting these data for the purpose of understanding engine operation, this paper focuses on the components of the system, verification of high-speed synchronized operation, and the integration of the system with the commercial jet engine test bed.

  5. Aircraft engine-mounted camera system for long wavelength infrared imaging of in-service thermal barrier coated turbine blades.

    PubMed

    Markham, James; Cosgrove, Joseph; Scire, James; Haldeman, Charles; Agoos, Ian

    2014-12-01

    This paper announces the implementation of a long wavelength infrared camera to obtain high-speed thermal images of an aircraft engine's in-service thermal barrier coated turbine blades. Long wavelength thermal images were captured of first-stage blades. The achieved temporal and spatial resolutions allowed for the identification of cooling-hole locations. The software and synchronization components of the system allowed for the selection of any blade on the turbine wheel, with tuning capability to image from leading edge to trailing edge. Its first application delivered calibrated thermal images as a function of turbine rotational speed at both steady state conditions and during engine transients. In advance of presenting these data for the purpose of understanding engine operation, this paper focuses on the components of the system, verification of high-speed synchronized operation, and the integration of the system with the commercial jet engine test bed.

  6. Objective assessment of biomagnetic devices and alternative clinical therapies using infrared thermal imaging

    NASA Astrophysics Data System (ADS)

    Rockley, Graham J.

    2001-03-01

    The overwhelming introduction of magnetic devices and other alternative therapies into the health care market prompts the need for objective evaluation of these techniques through the use of infrared thermal imaging. Many of these therapies are reported to promote the stimulation of blood flow or the relief of pain conditions. Infrared imaging is an efficient tool to assess such changes in the physiological state. Therefore, a thermal imager can help document and substantiate whether these therapies are in fact providing an effective change to the local circulation. Thermal images may also indicate whether the change is temporary or sustained. As a specific case example, preliminary findings will be presented concerning the use of magnets and the effect they have on peripheral circulation. This will include a discussion of the recommended protocols for this type of infrared testing. This test model can be applied to the evaluation of other devices and therapeutic procedures which are reputed to affect circulation such as electro acupuncture, orthopedic footwear and topical ointments designed to relieve pain or inflammation.

  7. Monitoring machining conditions by infrared images

    NASA Astrophysics Data System (ADS)

    Borelli, Joao E.; Gonzaga Trabasso, Luis; Gonzaga, Adilson; Coelho, Reginaldo T.

    2001-03-01

    During the machining process, knowledge of the temperature is the most important factor in tool analysis. It allows control of the main factors that influence tool use, lifetime, and waste. The temperature in the contact area between the workpiece and the tool results from material removal during the cutting operation, and it is difficult to obtain because the tool and the workpiece are in motion. One way to measure the temperature in this situation is to detect the infrared radiation. This work presents a new methodology for diagnosis and monitoring of machining processes with the use of infrared images. The infrared image provides a map in gray tones of the elements in the process: tool, workpiece and chips. Each gray tone in the image corresponds to a certain temperature for each of those materials, and the relationship between the gray tones and the temperature is obtained by prior calibration of the infrared camera. The system developed in this work uses an infrared camera, a frame grabber board and a software package composed of three modules. The first module performs image acquisition and processing. The second module extracts image features and forms the feature vector. Finally, the third module uses fuzzy logic to evaluate the feature vector and supplies the tool state diagnostic as output.
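
    The gray-tone-to-temperature relationship obtained from camera calibration can be applied as a simple lookup with interpolation. The calibration points below are hypothetical, purely to illustrate the mapping:

    ```python
    import numpy as np

    def gray_to_temperature(img, gray_ref, temp_ref):
        """Map gray tones to temperatures by interpolating a calibration
        curve measured beforehand against a reference source."""
        return np.interp(np.ravel(img), gray_ref, temp_ref).reshape(np.shape(img))

    gray_ref = np.array([0, 64, 128, 192, 255])              # gray levels
    temp_ref = np.array([20.0, 150.0, 320.0, 540.0, 800.0])  # degrees C (assumed)
    img = np.array([[64, 128],
                    [192, 255]])                             # a tiny infrared image
    temps = gray_to_temperature(img, gray_ref, temp_ref)
    # temps -> [[150., 320.], [540., 800.]]
    ```

    A separate curve would be needed for each material (tool, workpiece, chips), since their emissivities differ.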

  8. Implementation and performance of shutterless uncooled micro-bolometer cameras

    NASA Astrophysics Data System (ADS)

    Das, J.; de Gaspari, D.; Cornet, P.; Deroo, P.; Vermeiren, J.; Merken, P.

    2015-06-01

    A shutterless algorithm is implemented in the Xenics LWIR thermal cameras and modules. Based on a calibration set and a global temperature coefficient, the optimal non-uniformity correction is calculated on board the camera. The camera's limited resources require a compact algorithm, so coding efficiency is important. The performance of the shutterless algorithm is studied by comparing the residual non-uniformity (RNU) and signal-to-noise ratio (SNR) of the shutterless and shuttered correction algorithms. From this comparison we conclude that the shutterless correction performs only slightly worse than the standard shuttered algorithm, making it very interesting for thermal infrared applications where small weight and size and continuous operation are important.
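    A minimal sketch of such a shutterless correction, assuming the common model in which each pixel's fixed offset drifts linearly with camera temperature through a single global coefficient (the model and names are illustrative, not Xenics' onboard implementation):

    ```python
    import numpy as np

    # Offset/gain non-uniformity correction with one global temperature
    # coefficient (illustrative model, not the camera's actual algorithm).
    rng = np.random.default_rng(0)
    offset_cal = rng.normal(0.0, 5.0, (4, 4))  # per-pixel offsets at T_CAL
    gain_cal = np.ones((4, 4))                 # per-pixel gains (flat here)
    T_CAL = 25.0                               # calibration temperature, deg C
    ALPHA = 0.8                                # global temperature coefficient

    def correct(raw, t_cam):
        """Remove fixed-pattern noise, with offsets shifted to t_cam."""
        offset = offset_cal + ALPHA * (t_cam - T_CAL)
        return (raw - offset) / gain_cal

    # A uniform 100-count scene viewed through the fixed pattern at 30 deg C:
    raw = gain_cal * 100.0 + offset_cal + ALPHA * (30.0 - T_CAL)
    corrected = correct(raw, 30.0)
    ```

    The appeal of the approach is that only the calibration set and the camera temperature are needed at run time, so no shutter actuation interrupts the video stream.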

  9. Infrared Spectrometer for ExoMars: A Mast-Mounted Instrument for the Rover.

    PubMed

    Korablev, Oleg I; Dobrolensky, Yurii; Evdokimova, Nadezhda; Fedorova, Anna A; Kuzmin, Ruslan O; Mantsevich, Sergei N; Cloutis, Edward A; Carter, John; Poulet, Francois; Flahaut, Jessica; Griffiths, Andrew; Gunn, Matthew; Schmitz, Nicole; Martín-Torres, Javier; Zorzano, Maria-Paz; Rodionov, Daniil S; Vago, Jorge L; Stepanov, Alexander V; Titov, Andrei Yu; Vyazovetsky, Nikita A; Trokhimovskiy, Alexander Yu; Sapgir, Alexander G; Kalinnikov, Yurii K; Ivanov, Yurii S; Shapkin, Alexei A; Ivanov, Andrei Yu

    ISEM (Infrared Spectrometer for ExoMars) is a pencil-beam infrared spectrometer that will measure reflected solar radiation in the near-infrared range for context assessment of the surface mineralogy in the vicinity of the ExoMars rover. The instrument will be accommodated on the mast of the rover and will be operated together with the panoramic camera (PanCam) and its high-resolution camera (HRC). ISEM will study the mineralogical and petrographic composition of the martian surface in the vicinity of the rover and, in combination with the other remote sensing instruments, will aid in the selection of potential targets for close-up investigations and drilling sites. Of particular scientific interest are water-bearing minerals, such as phyllosilicates, sulfates, and carbonates, and minerals indicative of astrobiological potential, such as borates, nitrates, and ammonium-bearing minerals. The instrument has an ∼1° field of view and covers the spectral range between 1.15 and 3.30 μm with a spectral resolution varying from 3.3 nm at 1.15 μm to 28 nm at 3.30 μm. The ISEM optical head is mounted on the mast, and its electronics box is located inside the rover's body. The spectrometer uses an acousto-optic tunable filter and a Peltier-cooled InAs detector. The mass of ISEM is 1.74 kg, including the electronics and harness. The science objectives of the experiment, the instrument design, and operational scenarios are described. Key Words: ExoMars-ISEM-Mars-Surface-Mineralogy-Spectroscopy-AOTF-Infrared. Astrobiology 17, 542-564.

  10. Tower testing of a 64W shortwave infrared supercontinuum laser for use as a hyperspectral imaging illuminator

    NASA Astrophysics Data System (ADS)

    Meola, Joseph; Absi, Anthony; Islam, Mohammed N.; Peterson, Lauren M.; Ke, Kevin; Freeman, Michael J.; Ifaraguerri, Agustin I.

    2014-06-01

    Hyperspectral imaging systems are currently used for numerous activities related to the spectral identification of materials. These passive imaging systems rely on naturally reflected or emitted radiation as the source of the signal. Thermal infrared systems measure radiation emitted from objects in the scene and can therefore operate both day and night. Visible through shortwave infrared systems, however, measure solar illumination reflected from objects, so their use is limited to daytime applications. Omni Sciences has produced high-power broadband shortwave infrared supercontinuum laser illuminators. A 64-watt breadboard system was recently packaged and tested at Wright-Patterson Air Force Base to gauge beam quality and to serve as a proof of concept for potential use as an illuminator for a hyperspectral receiver. The laser illuminator was placed in a tower and directed along a 1.4 km slant path to various target materials, with the reflected radiation measured by both a broadband camera and a hyperspectral imaging system to gauge performance.

  11. Near-infrared images of MG 1131+0456 with the W. M. Keck telescope: Another dusty gravitational lens?

    NASA Technical Reports Server (NTRS)

    Larkin, J. E.; Matthews, K.; Lawrence, C. R.; Graham, J. R.; Harrison, W.; Jernigan, G.; Lin, S.; Nelson, J.; Neugebauer, G.; Smith, G.

    1994-01-01

    Images of the gravitational lens system MG 1131+0456 taken with the near-infrared camera on the W. M. Keck telescope in the J and K(sub s) bands show that the infrared counterparts of the compact radio structure are exceedingly red, with J - K greater than 4.2 mag. The J image reveals only the lensing galaxy, while the K(sub s) image shows both the lens and the infrared counterparts of the compact radio components. After subtracting the lensing galaxy from the K(sub s) image, the positions and orientation of the compact components agree with their radio counterparts. The broad-band spectrum and observed brightness of the lens suggest a giant galaxy at a redshift of approximately 0.75, while the color of the quasar images suggests significant extinction by dust in the lens. There is a significant excess of faint objects within 20 arcsec of MG 1131+0456. Depending on their masses and redshifts, these objects could complicate the lensing potential considerably.

  12. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects

    PubMed Central

    Lambers, Martin; Kolb, Andreas

    2017-01-01

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference and motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important for improving camera and algorithm development. This paper presents a physically based, interactive simulation technique for amplitude-modulated continuous-wave (AMCW) ToF cameras which, among other error sources, includes single-bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison of ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials, and quantitatively compare the range sensor data. PMID:29271888

  13. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.

    PubMed

    Bulczak, David; Lambers, Martin; Kolb, Andreas

    2017-12-22

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference and motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important for improving camera and algorithm development. This paper presents a physically based, interactive simulation technique for amplitude-modulated continuous-wave (AMCW) ToF cameras which, among other error sources, includes single-bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison of ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials, and quantitatively compare the range sensor data.
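    For context, the per-pixel quantity such a simulator must reproduce is the standard four-phase AMCW depth estimate; the sketch below uses the textbook formula, not the paper's code, and the modulation frequency is an assumed value:

    ```python
    import math

    # Textbook four-phase AMCW ToF depth estimate. a0..a3 are correlation
    # samples taken at phase offsets of 0, 90, 180 and 270 degrees.
    C = 299_792_458.0   # speed of light, m/s
    F_MOD = 20e6        # modulation frequency, Hz (assumed)

    def tof_depth(a0, a1, a2, a3):
        """Depth from the wrapped phase of the returned modulation signal."""
        phase = math.atan2(a3 - a1, a0 - a2)
        if phase < 0.0:
            phase += 2.0 * math.pi
        return C * phase / (4.0 * math.pi * F_MOD)

    # Samples whose phase is pi/2, i.e. a quarter of the ~7.49 m ambiguity
    # range at 20 MHz:
    d = tof_depth(0.0, 0.0, 0.0, 1.0)   # about 1.87 m
    ```

    Multipath interference corrupts the four samples, which is exactly why a charge-level simulation of this pipeline is useful for algorithm development.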

  14. Hardware accelerator design for tracking in smart camera

    NASA Astrophysics Data System (ADS)

    Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Vohra, Anil

    2011-10-01

    Smart cameras are important components in video analysis: they need to detect interesting moving objects, track such objects from frame to frame, and analyze the object tracks in real time. The use of real-time tracking is therefore prominent in smart cameras. A software implementation of a tracking algorithm on a general-purpose processor (such as a PowerPC) achieves a low frame rate, far from real-time requirements. This paper presents a SIMD-based hardware accelerator designed for real-time tracking of objects in a scene. The system was designed and simulated in VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA. The resulting frame rate is 30 frames per second for 250×200-resolution grayscale video.

  15. Design of the front end electronics for the infrared camera of JEM-EUSO, and manufacturing and verification of the prototype model

    NASA Astrophysics Data System (ADS)

    Maroto, Oscar; Diez-Merino, Laura; Carbonell, Jordi; Tomàs, Albert; Reyes, Marcos; Joven-Alvarez, Enrique; Martín, Yolanda; Morales de los Ríos, J. A.; del Peral, Luis; Rodríguez-Frías, M. D.

    2014-07-01

    The Japanese Experiment Module (JEM) Extreme Universe Space Observatory (EUSO) will be launched and attached to the Japanese module of the International Space Station (ISS). Its aim is to observe UV photon tracks produced by ultra-high-energy cosmic rays developing in the atmosphere and producing extensive air showers. The key element of the instrument is a very wide-field, very fast, large-lens telescope that can detect extreme-energy particles with energy above 10^19 eV. The Atmospheric Monitoring System (AMS), comprising, among others, the Infrared Camera (IRCAM), which is the Spanish contribution, plays a fundamental role in the understanding of the atmospheric conditions in the Field of View (FoV) of the telescope. It is used to detect the temperature of clouds and to obtain the cloud coverage and cloud-top altitude during the observation period of the JEM-EUSO main instrument. SENER is responsible for the preliminary design of the Front End Electronics (FEE) of the Infrared Camera, based on an uncooled microbolometer, and for the manufacturing and verification of the prototype model. This paper describes the flight design drivers and key factors to achieve the target features, namely: detector biasing with electrical noise better than 100 μV from 1 Hz to 10 MHz; temperature control of the microbolometer from 10 °C to 40 °C with stability better than 10 mK over 4.8 hours; low-noise, high-bandwidth amplifier adaptation of the microbolometer output to a differential input before analog-to-digital conversion; housekeeping generation; microbolometer control; and image accumulation for noise reduction. It also shows the modifications implemented in the FEE prototype design to perform a trade-off of different technologies, such as the convenience of using linear or switched regulation for the temperature control, the possibility to check the camera performance when both the microbolometer and the analog electronics are moved further away from the power and digital electronics, and

  16. Selecting a digital camera for telemedicine.

    PubMed

    Patricoski, Chris; Ferguson, A Stewart

    2009-06-01

    The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment.

  17. THE VARIABLE NEAR-INFRARED COUNTERPART OF THE MICROQUASAR GRS 1758–258

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luque-Escamilla, Pedro L.; Martí, Josep; Muñoz-Arjonilla, Álvaro J., E-mail: peter@ujaen.es, E-mail: jmarti@ujaen.es, E-mail: ajmunoz@ujaen.es

    2014-12-10

    We present a new study of the microquasar system GRS 1758–258 in the near-infrared domain based on archival observations with the Hubble Space Telescope and the NICMOS camera. In addition to confirming the near-infrared counterpart pointed out by Muñoz-Arjonilla et al., we show that this object displays significant photometric variability. From its average magnitudes, we also find that GRS 1758–258 fits well within the correlation between the optical/near-infrared and X-ray luminosity known to exist for low-mass, black-hole candidate X-ray binaries in a hard state. Moreover, the spectral energy distribution built using all radio, near-infrared, and X-ray data available closest in time to the NICMOS observations can be reasonably interpreted in terms of a self-absorbed radio jet and an irradiated accretion disk model around a stellar-mass black hole. All these facts match the expected behavior of a compact binary system and strengthen our confidence in the counterpart identification.

  18. Plenoptic camera image simulation for reconstruction algorithm verification

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2014-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light-field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. In the first, the camera image is focused onto the lenslet array, which is placed over the camera sensor such that each lenslet forms an image of the exit pupil on the sensor. In the second form, the lenslet array relays the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to yield a color and exposure level for that pixel. To speed processing, three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.
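    The backward pixel-cone integration described above can be sketched as follows; the toy scene (disks on fixed-depth planes), the geometry, and all names are assumptions for the sketch, not the authors' raytracer:

    ```python
    import numpy as np

    # Backward ray integration for one pixel: sample rays across the entrance
    # pupil, intersect each with the depth planes (nearest first), and
    # average the colors the ray cone picks up.
    rng = np.random.default_rng(1)

    def pixel_value(pixel_dir, planes, n_rays=64, pupil_radius=0.01):
        """Average color over the pixel's cone; nearest plane hit wins."""
        # Uniformly sample ray origins over the circular entrance pupil.
        r = pupil_radius * np.sqrt(rng.random(n_rays))
        t = 2.0 * np.pi * rng.random(n_rays)
        origins = np.stack([r * np.cos(t), r * np.sin(t)], axis=1)
        total = 0.0
        for ox, oy in origins:
            for depth, color, center, radius in planes:  # sorted by depth
                # Where this ray crosses the plane at z = depth:
                x = ox + pixel_dir[0] * depth
                y = oy + pixel_dir[1] * depth
                if (x - center[0]) ** 2 + (y - center[1]) ** 2 <= radius ** 2:
                    total += color
                    break   # nearer planes occlude farther ones
        return total / n_rays

    # One gray disk at depth 2.0 that covers this pixel's whole cone:
    planes = [(2.0, 0.5, (0.0, 0.0), 1.0)]
    v = pixel_value((0.0, 0.0), planes)
    ```

    Repeating this per pixel, with the lenslet geometry folded into `pixel_dir` and the pupil sampling, yields the simulated plenoptic image the reconstruction algorithms are tested on.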

  19. POLICAN: A near-infrared imaging polarimeter at OAGH

    NASA Astrophysics Data System (ADS)

    Devaraj, R.; Luna, A.; Carrasco, L.; Mayya, Y. D.; Serrano-Bernal, O.

    2017-07-01

    We present POLICAN, a near-infrared linear imaging polarimeter developed for the Cananea near-infrared camera (CANICA) at the 2.1 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located at Cananea, Sonora, México. POLICAN reaches a limiting magnitude of about 16 mag, with a polarimetric accuracy of about 1% for bright sources.
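    For reference, a hedged sketch of how linear Stokes parameters are typically recovered from four half-wave-plate positions (0, 22.5, 45, 67.5 degrees) in near-infrared imaging polarimetry; the actual POLICAN reduction pipeline may differ:

    ```python
    import math

    # Standard linear-Stokes recovery from four intensity images taken at
    # the four half-wave-plate positions (generic textbook scheme).
    def linear_polarization(i0, i22, i45, i67):
        """Return degree of linear polarization and position angle (deg)."""
        q = (i0 - i45) / (i0 + i45)      # normalized Stokes q
        u = (i22 - i67) / (i22 + i67)    # normalized Stokes u
        p = math.hypot(q, u)             # degree of linear polarization
        theta = 0.5 * math.degrees(math.atan2(u, q))
        return p, theta

    # A source polarized at the quoted ~1% level, aligned with the q axis:
    p, theta = linear_polarization(1.01, 1.00, 0.99, 1.00)
    ```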

  20. Portable, stand-off spectral imaging camera for detection of effluents and residues

    NASA Astrophysics Data System (ADS)

    Goldstein, Neil; St. Peter, Benjamin; Grot, Jonathan; Kogan, Michael; Fox, Marsha; Vujkovic-Cvijin, Pajo; Penny, Ryan; Cline, Jason

    2015-06-01

    A new, compact, and portable spectral imaging camera, employing a MEMS-based encoded-imaging approach, has been built and demonstrated for detection of hazardous contaminants, including gaseous effluents and solid or liquid residues on surfaces. The camera is called the Thermal infrared Reconfigurable Analysis Camera for Effluents and Residues (TRACER). TRACER operates in the long-wave infrared and has the potential to detect a wide variety of materials with characteristic spectral signatures in that region. The 30 lb camera is tripod mounted and battery powered. A touch-screen control panel provides a simple user interface for most operations. The MEMS spatial light modulator is a Texas Instruments digital micromirror array with custom electronics and firmware control. One spatial and one spectral dimension are collected simultaneously, with the second spatial dimension obtained by scanning the internal spectrometer slit. The sensor can be configured to collect data in several modes, including full hyperspectral imagery using Hadamard multiplexing, panchromatic thermal imagery, and chemical-specific contrast imagery, switched with simple user commands. Matched filters and other analog filters can be generated internally on the fly and applied in hardware, substantially reducing detection time and improving SNR over HSI software processing, while reducing storage requirements. Results of a preliminary instrument evaluation and measurements of flame exhaust are presented.
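    The Hadamard multiplexing mentioned above can be illustrated with a small sketch: rows of a Hadamard matrix play the role of the spatial-light-modulator patterns (idealized here as ±1 weights), and the spectrum is recovered by the inverse transform. The matrix size and spectrum are toy assumptions:

    ```python
    import numpy as np

    # Hadamard-multiplexed spectral measurement: each pattern mixes many
    # channels into one reading; the full set of readings is inverted to
    # recover the spectrum, with a multiplex (Fellgett) SNR advantage.
    def sylvester_hadamard(n):
        """Build an n x n Hadamard matrix (n a power of two)."""
        h = np.array([[1.0]])
        while h.shape[0] < n:
            h = np.block([[h, h], [h, -h]])
        return h

    n = 8
    H = sylvester_hadamard(n)
    spectrum = np.array([0.0, 1.0, 3.0, 2.0, 0.5, 0.0, 4.0, 1.5])

    measurements = H @ spectrum            # one multiplexed reading per pattern
    recovered = (H.T @ measurements) / n   # H.T @ H == n * I
    ```

    A real DMD can only open or close mirrors, so practical systems use the related 0/1 S-matrix encoding, but the decoding principle is the same.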

  1. Measuring and Estimating Normalized Contrast in Infrared Flash Thermography

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2013-01-01

    Infrared flash thermography (IRFT) is used to detect void-like flaws in a test object. The IRFT technique involves heating the part surface with a flash from flash lamps. The post-flash evolution of the part surface temperature is sensed by an IR camera in terms of the pixel intensities of the image. The IR video data are recorded and analyzed using the normalized pixel-intensity and temperature-contrast analysis method to characterize void-like flaws for depth and width. This work introduces a new definition of the normalized IR pixel intensity contrast and the normalized surface temperature contrast. A procedure is provided to compute the pixel intensity contrast from the camera's pixel-intensity evolution data. The pixel intensity contrast and the corresponding surface temperature contrast differ but are related. This work provides a method to estimate the temperature evolution and the normalized temperature contrast from the measured pixel-intensity evolution data and some additional measurements made during data acquisition.
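    One common definition of normalized contrast in flash thermography compares a pixel over a suspected flaw with a nearby sound (reference) region, normalized by the reference's post-flash rise; the sketch below uses that generic definition with assumed names, not the paper's new variant:

    ```python
    import numpy as np

    # Generic normalized contrast: (flaw - reference) / (reference - pre-flash),
    # evaluated frame by frame over the post-flash cooling sequence.
    def normalized_contrast(i_flaw, i_ref, i_pre):
        """i_flaw, i_ref: pixel-intensity evolutions (arrays over frames);
        i_pre: pre-flash baseline intensity."""
        i_flaw = np.asarray(i_flaw, dtype=float)
        i_ref = np.asarray(i_ref, dtype=float)
        return (i_flaw - i_ref) / (i_ref - i_pre)

    # A void slows cooling, so the flaw pixel stays brighter than the reference:
    ref = np.array([200.0, 150.0, 120.0])
    flaw = np.array([210.0, 165.0, 130.0])
    c = normalized_contrast(flaw, ref, i_pre=100.0)
    ```

    The time at which this contrast peaks is what flaw-depth characterization methods typically key on.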

  2. High-resolution imaging of the Pluto-Charon system with the Faint Object Camera of the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Albrecht, R.; Barbieri, C.; Adorf, H.-M.; Corrain, G.; Gemmo, A.; Greenfield, P.; Hainaut, O.; Hook, R. N.; Tholen, D. J.; Blades, J. C.

    1994-01-01

    Images of the Pluto-Charon system were obtained with the Faint Object Camera (FOC) of the Hubble Space Telescope (HST) after the refurbishment of the telescope. The images are of superb quality, allowing the determination of radii, fluxes, and albedos. Attempts were made to improve the resolution of the already diffraction limited images by image restoration. These yielded indications of surface albedo distributions qualitatively consistent with models derived from observations of Pluto-Charon mutual eclipses.

  3. Method for separating video camera motion from scene motion for constrained 3D displacement measurements

    NASA Astrophysics Data System (ADS)

    Gauthier, L. R.; Jansen, M. E.; Meyer, J. R.

    2014-09-01

    Camera motion is a potential problem when a video camera is used to perform dynamic displacement measurements. If the scene camera moves at the wrong time, the apparent motion of the object under study can easily be confused with the real motion of the object. In some cases, it is practically impossible to prevent camera motion, as for instance, when a camera is used outdoors in windy conditions. A method to address this challenge is described that provides an objective means to measure the displacement of an object of interest in the scene, even when the camera itself is moving in an unpredictable fashion at the same time. The main idea is to synchronously measure the motion of the camera and to use those data ex post facto to subtract out the apparent motion in the scene that is caused by the camera motion. The motion of the scene camera is measured by using a reference camera that is rigidly attached to the scene camera and oriented towards a stationary reference object. For instance, this reference object may be on the ground, which is known to be stationary. It is necessary to calibrate the reference camera by simultaneously measuring the scene images and the reference images at times when it is known that the scene object is stationary and the camera is moving. These data are used to map camera movement data to apparent scene movement data in pixel space and subsequently used to remove the camera movement from the scene measurements.
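    The calibrate-then-subtract idea above can be sketched in one pixel dimension: fit a linear map from reference-camera displacement to apparent scene displacement while the scene object is known to be stationary, then subtract the predicted camera-induced component from later measurements. The 1-D axis and least-squares fit are simplifying assumptions, not the paper's exact method:

    ```python
    import numpy as np

    # 1-D sketch of the ex post facto camera-motion correction.
    rng = np.random.default_rng(2)

    # Calibration phase: the camera shakes while the scene object is known
    # to be stationary, so all apparent scene motion is camera-induced.
    cam_cal = rng.normal(0.0, 3.0, 50)       # reference-camera shift, pixels
    apparent_cal = -1.6 * cam_cal            # induced apparent scene shift
    k = np.polyfit(cam_cal, apparent_cal, 1)[0]   # fitted pixel-space coupling

    # Measurement phase: the object really moves by true_disp while the
    # camera also moves; subtract the predicted camera-induced component.
    true_disp = np.array([0.0, 1.0, 2.0, 3.0])
    cam_motion = np.array([0.5, -2.0, 1.0, 0.0])
    measured = true_disp + (-1.6) * cam_motion    # what the scene camera sees
    corrected = measured - k * cam_motion         # camera motion removed
    ```

    With synchronized timestamps, the same subtraction applies sample by sample to full displacement time series.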

  4. Visual Positioning Indoors: Human Eyes vs. Smartphone Cameras

    PubMed Central

    Wu, Dewen; Chen, Ruizhi; Chen, Liang

    2017-01-01

    Artificial Intelligence (AI) technologies and their related applications are now developing at a rapid pace. Indoor positioning will be one of the core technologies that enable AI applications, because people spend 80% of their time indoors. Humans can locate themselves relative to a visually well-defined object, e.g., a door, based on their visual observations. Can a smartphone camera do a similar job when it points at an object? In this paper, a visual positioning solution was developed based on a single image captured by a smartphone camera pointing at a well-defined object. The smartphone camera simulates the process by which human eyes locate themselves relative to a well-defined object. Extensive experiments were conducted with five types of smartphones in three different indoor settings: a meeting room, a library, and a reading room. Experimental results show that the average positioning accuracy of the solution based on five smartphone cameras is 30.6 cm, while that of the human-observed solution, with 300 samples from 10 different people, is 73.1 cm. PMID:29144420

  5. Visual Positioning Indoors: Human Eyes vs. Smartphone Cameras.

    PubMed

    Wu, Dewen; Chen, Ruizhi; Chen, Liang

    2017-11-16

    Artificial Intelligence (AI) technologies and their related applications are now developing at a rapid pace. Indoor positioning will be one of the core technologies that enable AI applications, because people spend 80% of their time indoors. Humans can locate themselves relative to a visually well-defined object, e.g., a door, based on their visual observations. Can a smartphone camera do a similar job when it points at an object? In this paper, a visual positioning solution was developed based on a single image captured by a smartphone camera pointing at a well-defined object. The smartphone camera simulates the process by which human eyes locate themselves relative to a well-defined object. Extensive experiments were conducted with five types of smartphones in three different indoor settings: a meeting room, a library, and a reading room. Experimental results show that the average positioning accuracy of the solution based on five smartphone cameras is 30.6 cm, while that of the human-observed solution, with 300 samples from 10 different people, is 73.1 cm.

  6. Merged infrared catalogue

    NASA Technical Reports Server (NTRS)

    Schmitz, M.; Brown, L. W.; Mead, J. M.; Nagy, T. A.

    1978-01-01

    A compilation of equatorial coordinates, spectral types, magnitudes, and fluxes from five catalogues of infrared observations is presented. This first edition of the Merged Infrared Catalogue contains 11,201 observations from the Two-Micron Sky Survey, Observations of Infrared Radiation from Cool Stars, the Air Force Geophysics Laboratory Four Color Infrared Sky Survey and its Supplemental Catalog, and the Catalog of 10 micron Celestial Objects (HALL). This compilation is a by-product of a computerized infrared data base under development at Goddard Space Flight Center; the objective is to maintain a complete and current record of all infrared observations from 1 μm to 1000 μm of non-solar-system objects. These observations are being placed into a standardized system.

  7. Medium-resolution near-infrared spectroscopy of massive young stellar objects

    NASA Astrophysics Data System (ADS)

    Pomohaci, R.; Oudmaijer, R. D.; Lumsden, S. L.; Hoare, M. G.; Mendigutía, I.

    2017-12-01

    We present medium-resolution (R ∼ 7000) near-infrared echelle spectroscopic data for 36 massive young stellar objects (MYSOs) drawn from the Red MSX Source survey. This is the largest sample of MYSOs observed to date at this resolution and these wavelengths. The spectra are characterized mostly by emission from hydrogen recombination lines and accretion diagnostic lines. One MYSO shows photospheric H I absorption; a comparison with spectral standards indicates that the star is an A-type star with low surface gravity, implying that MYSOs are probably swollen, as also suggested by evolutionary calculations. An investigation of the Brγ line profiles shows that most are in pure emission, while 13 ± 5 per cent display P Cygni profiles, indicative of outflow, and fewer than 8 ± 4 per cent have inverse P Cygni profiles, indicative of infall. These values are comparable with investigations of the optically bright Herbig Be stars, but not with those of Herbig Ae and T Tauri stars, consistent with the notion that more massive stars undergo accretion in a different fashion than lower mass objects, which undergo magnetospheric accretion. Accretion luminosities and rates derived from the Brγ line luminosities agree with results for lower mass sources, providing tentative evidence for massive star formation theories based on scaling of low-mass scenarios. We present Brγ/Br12 line-profile ratios, exploiting the fact that optical-depth effects can be traced as a function of Doppler shift across the lines. These show that the winds of the MYSOs in this sample are nearly equally split between constant, accelerating, and decelerating velocity structures. There are no trends between the types of features we see and bolometric luminosities or near-infrared colours.

  8. Time-resolved infrared spectrophotometric observations of high area to mass ratio (HAMR) objects in GEO

    NASA Astrophysics Data System (ADS)

    Skinner, Mark A.; Russell, Ray W.; Rudy, Richard J.; Gutierrez, David J.; Kim, Daryl L.; Crawford, Kirk; Gregory, Steve; Kelecy, Tom

    2011-12-01

    Optical surveys have identified a class of high area-to-mass ratio (HAMR) objects in the vicinity of the Geostationary Earth Orbit (GEO) ring [1]. The exact origin and nature of these objects are not well known, although their proximity to the GEO ring poses a hazard to active GEO satellites. Due to their high area-to-mass ratios, solar radiation pressure perturbs their orbits in ways that make it difficult to predict their orbital trajectories over periods of time exceeding a week. To better understand these objects and their origins, observations that allow us to derive physical characteristics are required in order to improve the non-conservative force modeling for orbit determination and prediction. Information on their temperatures, areas, emissivities, and albedos may be obtained from thermal infrared, mid-wave infrared (MWIR), and visible measurements. Spectral features may help to identify the composition of the material, and thus possible origins for these objects. We have collected observational data on various HAMR objects from the AMOS observatory 3.6 m AEOS telescope. The thermal-IR spectra of these objects acquired by the Broadband Array Spectrograph System (BASS) span wavelengths of 3-13 μm and constitute a unique data set, providing a means of measuring object fluxes as a function of time. These, in turn, allow temperatures and emissivity-area products to be calculated. In some instances we have also collected simultaneous filtered visible photometric data on the observed objects. The multi-wavelength observations provide possible clues as to the nature of the observed objects. We briefly describe the nature and status of the instrumental programs used to acquire the data, our data of record, our data analysis techniques, and our current results, as well as future plans.
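    One way temperatures can be derived from multi-band thermal fluxes is a graybody color-temperature fit, in which the emissivity-area product cancels in the band ratio; the wavelengths and fluxes below are illustrative assumptions, not the BASS data:

    ```python
    import math

    # Graybody color temperature from a two-band flux ratio: emissivity-area
    # scales both bands equally, so it drops out of the ratio.
    H = 6.62607015e-34   # Planck constant, J s
    C = 299792458.0      # speed of light, m/s
    KB = 1.380649e-23    # Boltzmann constant, J/K

    def planck(wl, t):
        """Spectral radiance at wavelength wl (m) and temperature t (K)."""
        return (2.0 * H * C**2 / wl**5) / math.expm1(H * C / (wl * KB * t))

    def color_temperature(f1, f2, wl1=5e-6, wl2=10e-6):
        """Grid-search the temperature whose Planck ratio matches f1/f2."""
        target = f1 / f2
        best_t, best_err = None, float("inf")
        for t10 in range(1500, 10000):       # 150 K .. 1000 K, 0.1 K steps
            t = t10 / 10.0
            err = abs(planck(wl1, t) / planck(wl2, t) - target)
            if err < best_err:
                best_t, best_err = t, err
        return best_t

    # Synthesize band fluxes for a 300 K graybody with emissivity-area 0.2:
    ea = 0.2
    t_est = color_temperature(ea * planck(5e-6, 300.0),
                              ea * planck(10e-6, 300.0))
    ```

    Once the temperature is fixed this way, the emissivity-area product follows from the absolute flux level in either band.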

  9. PMMW Camera TRP. Phase 1

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Passive millimeter wave (PMMW) sensors have the ability to see through fog, clouds, dust and sandstorms and thus have the potential to support all-weather operations, both military and commercial. Many of the applications, such as military transport or commercial aircraft landing, are technologically stressing in that they require imaging of a scene with a large field of view in real time and with high spatial resolution. The development of a low cost PMMW focal plane array camera is essential to obtain real-time video images to fulfill the above needs. The overall objective of this multi-year project (Phase 1) was to develop and demonstrate the capabilities of a W-band PMMW camera with a microwave/millimeter wave monolithic integrated circuit (MMIC) focal plane array (FPA) that can be manufactured at low cost for both military and commercial applications. This overall objective was met in July 1997 when the first video images from the camera were generated of an outdoor scene. In addition, our consortium partner McDonnell Douglas was to develop a real-time passive millimeter wave flight simulator to permit pilot evaluation of a PMMW-equipped aircraft in a landing scenario. A working version of this simulator was completed. This work was carried out under the DARPA-funded PMMW Camera Technology Reinvestment Project (TRP), also known as the PMMW Camera DARPA Joint Dual-Use Project. In this final report for the Phase 1 activities, a year by year description of what the specific objectives were, the approaches taken, and the progress made is presented, followed by a description of the validation and imaging test results obtained in 1997.

  10. OT1_mhuang01_1: GRB Afterglow Photometry with Herschel Infrared Cameras

    NASA Astrophysics Data System (ADS)

    Huang, M.

    2010-07-01

    GRB Afterglow Photometry with Herschel Infrared Cameras (GRAPHICS). Gamma-ray bursts (GRBs) are the most luminous explosions in the universe. It has been difficult to obtain a full spectral picture of the phenomena during the short period when GRBs are ``alive'', i.e. when they generate bursts in gamma rays and produce afterglows at other wavelengths. Between the NIR (12 μm) and the submillimeter (850 μm) lie nearly two orders of magnitude of spectral range in which GRB afterglows have never been detected. Herschel is unique in its cutting-edge sensitivity, efficiency, and readiness for FIR observations, and is capable of detecting GRB afterglows. Observing GRB afterglows with Herschel would greatly enrich our understanding of GRB physics and the conditions of the Universe in early epochs. We propose Target of Opportunity studies using the SPIRE and PACS instruments of Herschel to observe 3 bright GRB afterglows, each within a few hours to a few tens of days after the burst. We will make follow-up observations after the initial one to photometrically measure GRB light curves and IR SEDs. We will make ground-based optical observations to complement the Herschel data, and keep the GRB community informed. Observing the forward-shock peak in the FIR light curve and comparing it (both flux and time) with those in the optical and radio bands would give an unambiguous test of the fireball model and offer a direct measurement of the density profile of the circumburst material. Catching the short-lived reverse-shock emission and measuring its magnitude would constrain some important parameters of the GRB ejecta and address the unknown composition of GRBs, baryonic vs. magnetic.

  11. Streak camera imaging of single photons at telecom wavelength

    NASA Astrophysics Data System (ADS)

    Allgaier, Markus; Ansari, Vahid; Eigner, Christof; Quiring, Viktor; Ricken, Raimund; Donohue, John Matthew; Czerniuk, Thomas; Aßmann, Marc; Bayer, Manfred; Brecht, Benjamin; Silberhorn, Christine

    2018-01-01

Streak cameras are powerful tools for temporal characterization of ultrafast light pulses, even at the single-photon level. However, a low signal-to-noise ratio in the infrared range prevents measurements on weak light sources in the telecom regime. We present an approach to circumvent this problem, utilizing an up-conversion process in periodically poled lithium niobate waveguides. We convert single photons from a parametric down-conversion source in order to reach the point of maximum detection efficiency of commercially available streak cameras. We explore phase-matching configurations to apply the up-conversion scheme in real-world applications.

  12. In-line particle measurement in a recovery boiler using high-speed infrared imaging

    NASA Astrophysics Data System (ADS)

    Siikanen, Sami; Miikkulainen, Pasi; Kaarre, Marko; Juuti, Mikko

    2012-06-01

Black liquor is the fuel of Kraft recovery boilers. It is sprayed into the furnace of a recovery boiler through splashplate nozzles. The operation of a recovery boiler is largely influenced by the particle size and particle size distribution of the black liquor. When entrained by the upward-flowing flue gas, small droplet particles may form carry-over and cause fouling of the heat transfer surfaces. Large droplet particles hit the char bed and the walls of the furnace without being dried. In this study, particles of black liquor sprays were imaged using a high-speed infrared camera. Measurements were done in an operating recovery boiler in a pulp mill. The objective was to find a suitable wavelength range and camera settings such as integration time, frame rate and averaging.

  13. Long wave infrared (8 to 14 microns) hyperspectral imager based on an uncooled thermal camera and the traditional CI block interferometer (SI-LWIR-UC)

    NASA Astrophysics Data System (ADS)

    Cabib, Dario; Lavi, Moshe; Gil, Amir; Milman, Uri

    2011-06-01

Since the early '90s CI has been involved in the development of FTIR hyperspectral imagers based on a Sagnac or similar type of interferometer. CI also pioneered the commercialization of such hyperspectral imagers in those years. After having developed a visible version based on a CCD in the early '90s (taken on by a spin-off company for biomedical applications) and a 3 to 5 micron infrared version based on a cooled InSb camera in 2008, it is now developing an LWIR version based on an uncooled camera for the 8 to 14 micron range. In this paper we present the design features and expected performance of the system. The instrument is designed to be rugged for field use, and to yield a relatively high spectral resolution of 8 cm-1, an IFOV of 0.5 mrad, a 640x480 pixel spectral cube in less than a minute, and a noise equivalent spectral radiance of 40 nW/cm2/sr/cm-1 at 10 microns. The actually measured performance will be presented in a future paper.

  14. Infrared Thermography Sensor for Temperature and Speed Measurement of Moving Material.

    PubMed

    Usamentiaga, Rubén; García, Daniel Fernando

    2017-05-18

    Infrared thermography offers significant advantages in monitoring the temperature of objects over time, but crucial aspects need to be addressed. Movements between the infrared camera and the inspected material seriously affect the accuracy of the calculated temperature. These movements can be the consequence of solid objects that are moved, molten metal poured, material on a conveyor belt, or just vibrations. This work proposes a solution for monitoring the temperature of material in these scenarios. In this work both real movements and vibrations are treated equally, proposing a unified solution for both problems. The three key steps of the proposed procedure are image rectification, motion estimation and motion compensation. Image rectification calculates a front-parallel projection of the image that simplifies the estimation and compensation of the movement. Motion estimation describes the movement using a mathematical model, and estimates the coefficients using robust methods adapted to infrared images. Motion is finally compensated for in order to produce the correct temperature time history of the monitored material regardless of the movement. The result is a robust sensor for temperature of moving material that can also be used to measure the speed of the material. Different experiments are carried out to validate the proposed method in laboratory and real environments. Results show excellent performance.
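The three-step pipeline described above (image rectification, motion estimation, motion compensation) can be illustrated for the simplest case: a purely translational motion model estimated by phase correlation. This is a hedged sketch on synthetic data, not the authors' implementation, which uses a full mathematical motion model with robust estimators adapted to infrared images.

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) translation between two frames
    via phase correlation (a simple translational motion model)."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks beyond half the frame size correspond to negative shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

def compensate(img, dy, dx):
    """Undo the estimated motion so each pixel tracks the same material point."""
    return np.roll(img, (dy, dx), axis=(0, 1))

# Synthetic "thermal" frame and a vibrated copy, shifted by (-3, +5) pixels.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
moved = np.roll(ref, (-3, 5), axis=(0, 1))   # apparent motion between frames
dy, dx = estimate_shift(ref, moved)          # recovers (3, -5)
aligned = compensate(moved, dy, dx)          # temperature history is now stable
```

In a real deployment the rectification step (a front-parallel projective warp) would precede this, and the motion model would be richer than a pure translation.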

  15. Infrared Thermography Sensor for Temperature and Speed Measurement of Moving Material

    PubMed Central

    Usamentiaga, Rubén; García, Daniel Fernando

    2017-01-01

    Infrared thermography offers significant advantages in monitoring the temperature of objects over time, but crucial aspects need to be addressed. Movements between the infrared camera and the inspected material seriously affect the accuracy of the calculated temperature. These movements can be the consequence of solid objects that are moved, molten metal poured, material on a conveyor belt, or just vibrations. This work proposes a solution for monitoring the temperature of material in these scenarios. In this work both real movements and vibrations are treated equally, proposing a unified solution for both problems. The three key steps of the proposed procedure are image rectification, motion estimation and motion compensation. Image rectification calculates a front-parallel projection of the image that simplifies the estimation and compensation of the movement. Motion estimation describes the movement using a mathematical model, and estimates the coefficients using robust methods adapted to infrared images. Motion is finally compensated for in order to produce the correct temperature time history of the monitored material regardless of the movement. The result is a robust sensor for temperature of moving material that can also be used to measure the speed of the material. Different experiments are carried out to validate the proposed method in laboratory and real environments. Results show excellent performance. PMID:28524110

  16. Robust vehicle detection under various environments to realize road traffic flow surveillance using an infrared thermal camera.

    PubMed

    Iwasaki, Yoichiro; Misumi, Masato; Nakamiya, Toshiyuki

    2015-01-01

    To realize road traffic flow surveillance under various environments which contain poor visibility conditions, we have already proposed two vehicle detection methods using thermal images taken with an infrared thermal camera. The first method uses pattern recognition for the windshields and their surroundings to detect vehicles. However, the first method decreases the vehicle detection accuracy in winter season. To maintain high vehicle detection accuracy in all seasons, we developed the second method. The second method uses tires' thermal energy reflection areas on a road as the detection targets. The second method did not achieve high detection accuracy for vehicles on left-hand and right-hand lanes except for two center-lanes. Therefore, we have developed a new method based on the second method to increase the vehicle detection accuracy. This paper proposes the new method and shows that the detection accuracy for vehicles on all lanes is 92.1%. Therefore, by combining the first method and the new method, high vehicle detection accuracies are maintained under various environments, and road traffic flow surveillance can be realized.

  17. Robust Vehicle Detection under Various Environments to Realize Road Traffic Flow Surveillance Using an Infrared Thermal Camera

    PubMed Central

    Iwasaki, Yoichiro; Misumi, Masato; Nakamiya, Toshiyuki

    2015-01-01

    To realize road traffic flow surveillance under various environments which contain poor visibility conditions, we have already proposed two vehicle detection methods using thermal images taken with an infrared thermal camera. The first method uses pattern recognition for the windshields and their surroundings to detect vehicles. However, the first method decreases the vehicle detection accuracy in winter season. To maintain high vehicle detection accuracy in all seasons, we developed the second method. The second method uses tires' thermal energy reflection areas on a road as the detection targets. The second method did not achieve high detection accuracy for vehicles on left-hand and right-hand lanes except for two center-lanes. Therefore, we have developed a new method based on the second method to increase the vehicle detection accuracy. This paper proposes the new method and shows that the detection accuracy for vehicles on all lanes is 92.1%. Therefore, by combining the first method and the new method, high vehicle detection accuracies are maintained under various environments, and road traffic flow surveillance can be realized. PMID:25763384

  18. Space infrared telescope facility wide field and diffraction limited array camera (IRAC)

    NASA Technical Reports Server (NTRS)

    Fazio, Giovanni G.

    1988-01-01

The wide-field and diffraction limited array camera (IRAC) is capable of two-dimensional photometry in either a wide-field or diffraction-limited mode over the wavelength range from 2 to 30 microns with a possible extension to 120 microns. A low-doped indium antimonide detector was developed for 1.8 to 5.0 microns, detectors were tested and optimized for the entire 1.8 to 30 micron range, beamsplitters were developed and tested for the 1.8 to 30 micron range, and tradeoff studies of the camera's optical system were performed. Data are presented on the performance of InSb, Si:In, Si:Ga, and Si:Sb array detectors bump-bonded to a multiplexed CMOS readout chip of the source-follower type at SIRTF operating backgrounds (equal to or less than 1 x 10 to the 8th ph/sq cm/sec) and temperatures (4 to 12 K). Some results at higher temperatures are also presented for comparison to the SIRTF temperature results. Data are also presented on the performance of IRAC beamsplitters at room temperature at both 0 and 45 deg angle of incidence, and on the performance of the all-reflecting optical system baselined for the camera.

  19. NV-CMOS HD camera for day/night imaging

    NASA Astrophysics Data System (ADS)

    Vogelsong, T.; Tower, J.; Sudol, Thomas; Senko, T.; Chodelka, D.

    2014-06-01

SRI International (SRI) has developed a new multi-purpose day/night video camera with low-light imaging performance comparable to an image intensifier, while offering the size, weight, ruggedness, and cost advantages enabled by the use of SRI's NV-CMOS HD digital image sensor chip. The digital video output is ideal for image enhancement, sharing with others through networking, video capture for data analysis, or fusion with thermal cameras. The camera provides Camera Link output with HD/WUXGA resolution of 1920 x 1200 pixels operating at 60 Hz. Windowing to smaller sizes enables operation at higher frame rates. High sensitivity is achieved through use of backside illumination, providing high Quantum Efficiency (QE) across the visible and near infrared (NIR) bands (peak QE >90%), as well as projected low-noise (<2 h+) readout. Power consumption is minimized in the camera, which operates from a single 5V supply. The NV-CMOS HD camera provides a substantial reduction in size, weight, and power (SWaP), ideal for SWaP-constrained day/night imaging platforms such as UAVs, ground vehicles, and fixed-mount surveillance, and may be reconfigured for mobile soldier operations such as night vision goggles and weapon sights. In addition, the camera with the NV-CMOS HD imager is suitable for high-performance digital cinematography/broadcast systems, biofluorescence/microscopy imaging, day/night security and surveillance, and other high-end applications which require HD video imaging with high sensitivity and wide dynamic range. The camera comes with an array of lens mounts including C-mount and F-mount. The latest test data from the NV-CMOS HD camera will be presented.

  20. Thermoelectric infrared imaging sensors for automotive applications

    NASA Astrophysics Data System (ADS)

    Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto

    2004-07-01

This paper describes three low-cost thermoelectric infrared imaging sensors having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs), respectively, and two experimental automotive application systems. The FPAs are fabricated with a conventional IC process and micromachining technologies and have low-cost potential. Among these sensors, the 2,304-element sensor provides a high responsivity of 5,500 V/W and a very small size by adopting a vacuum-sealed package integrated with a wide-angle ZnS lens. One experimental system, incorporated in the Nissan ASV-2, is a blind spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body. The system can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function. This system consists of a visible camera and infrared sensors, and it helps alert the driver to the presence of a pedestrian in a rear blind spot. Various issues that will need to be addressed in order to expand the automotive applications of IR imaging sensors in the future are also summarized. This performance is suitable for consumer electronics as well as automotive applications.

  1. Hubble Space Telescope Planetary Camera observations of Arp 220

    NASA Technical Reports Server (NTRS)

    Shaya, Edward J.; Dowling, Daniel M.; Currie, Douglas G.; Faber, S. M.; Groth, Edward J.

    1994-01-01

Planetary Camera images of the peculiar galaxy Arp 220 taken with V, R, and I band filters reveal a very luminous object near the position of the western radio continuum source, assumed to be the major nucleus, and seven lesser objects within 2 sec of this position. The most luminous object is formally coincident with the radio source to within the errors of Hubble Space Telescope (HST) pointing accuracy, but we have found an alternate, more compelling alignment of the maps in which the eastern radio source coincides with one of the lesser objects and the OH radio sources reside near the surfaces of other optical objects. The proposed centering places the most luminous object 150 pc (0.4 sec) away from the western radio source. We explore the possibilities that the objects are either holes in the dense dust distribution, dusty clouds reflecting a hidden bright nucleus, or associations of bright young stars. We favor the interpretation that at least the brightest two objects are massive young star associations with luminosities of 10(exp 9) to 10(exp 11) solar luminosity, but highly extinguished by intervening dust. These massive associations should fall into the nucleus on a time scale of 10(exp 8) yr. About 10% of the enigmatic far-IR flux arises from the observed objects. In addition, if the diffuse starlight out to a radius of 8 sec is dominated by stars with typical ages of order 10(exp 8) yr (the time since the alleged merger of two galaxies), as indicated by the blue colors at larger radius, then the lower limit to the reradiation of diffuse starlight contributes 3 x 10(exp 11) solar luminosity to the far-infrared flux, or greater than or equal to 25% of the total far-IR flux. Three additional bright objects (M(sub V) approximately equals -13) located about 6 sec from the core are likely young globular clusters, but any of these could be recently exploded supernovae instead. The expected supernova rate, if the dominant energy source is young stars, is about one per

  2. Infrared Spectrometer for ExoMars: A Mast-Mounted Instrument for the Rover

    NASA Astrophysics Data System (ADS)

    Korablev, Oleg I.; Dobrolensky, Yurii; Evdokimova, Nadezhda; Fedorova, Anna A.; Kuzmin, Ruslan O.; Mantsevich, Sergei N.; Cloutis, Edward A.; Carter, John; Poulet, Francois; Flahaut, Jessica; Griffiths, Andrew; Gunn, Matthew; Schmitz, Nicole; Martín-Torres, Javier; Zorzano, Maria-Paz; Rodionov, Daniil S.; Vago, Jorge L.; Stepanov, Alexander V.; Titov, Andrei Yu.; Vyazovetsky, Nikita A.; Trokhimovskiy, Alexander Yu.; Sapgir, Alexander G.; Kalinnikov, Yurii K.; Ivanov, Yurii S.; Shapkin, Alexei A.; Ivanov, Andrei Yu.

    2017-07-01

ISEM (Infrared Spectrometer for ExoMars) is a pencil-beam infrared spectrometer that will measure reflected solar radiation in the near infrared range for context assessment of the surface mineralogy in the vicinity of the ExoMars rover. The instrument will be accommodated on the mast of the rover and will be operated together with the panoramic camera (PanCam) and the high-resolution camera (HRC). ISEM will study the mineralogical and petrographic composition of the martian surface in the vicinity of the rover, and in combination with the other remote sensing instruments, it will aid in the selection of potential targets for close-up investigations and drilling sites. Of particular scientific interest are water-bearing minerals, such as phyllosilicates, sulfates, carbonates, and minerals indicative of astrobiological potential, such as borates, nitrates, and ammonium-bearing minerals. The instrument has an ˜1° field of view and covers the spectral range between 1.15 and 3.30 μm with a spectral resolution varying from 3.3 nm at 1.15 μm to 28 nm at 3.30 μm. The ISEM optical head is mounted on the mast, and its electronics box is located inside the rover's body. The spectrometer uses an acousto-optic tunable filter and a Peltier-cooled InAs detector. The mass of ISEM is 1.74 kg, including the electronics and harness. The science objectives of the experiment, the instrument design, and operational scenarios are described.
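As a back-of-the-envelope check (not from the paper itself), the quoted resolution figures are consistent with an acousto-optic tunable filter whose passband is roughly constant in wavenumber, since the wavelength passband scales as the square of the wavelength: dnu [cm^-1] = 1e7 * dlambda[nm] / lambda[nm]^2.

```python
def dnu_cm1(dlam_nm, lam_nm):
    """Convert a wavelength passband (nm) at wavelength lam (nm)
    into the equivalent wavenumber resolution in cm^-1."""
    return 1e7 * dlam_nm / lam_nm**2

# Quoted ISEM figures: 3.3 nm at 1.15 um, 28 nm at 3.30 um.
short_end = dnu_cm1(3.3, 1150)   # ~25 cm^-1
long_end = dnu_cm1(28.0, 3300)   # ~26 cm^-1, i.e. nearly constant
```

Both ends of the band come out near 25 cm^-1, as expected for an AOTF with a fixed acoustic bandwidth.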

  3. Near-surface Thermal Infrared Imaging of a Mixed Forest

    NASA Astrophysics Data System (ADS)

    Aubrecht, D. M.; Helliker, B. R.; Richardson, A. D.

    2014-12-01

Measurement of an organism's temperature is of basic physiological importance and therefore necessary for ecosystem modeling, yet most models derive leaf temperature from energy balance arguments or assume it is equal to air temperature. This is because continuous, direct measurement of leaf temperature outside of a controlled environment is difficult and rarely done. An even greater challenge is measuring leaf temperature with the resolution required to understand the underlying energy balance and regulation of plant processes. To measure leaf temperature through the year, we have mounted a high-resolution, thermal infrared camera overlooking the canopy of a temperate deciduous forest. The camera is co-located with an eddy covariance system and a suite of radiometric sensors. Our camera measures longwave thermal infrared (λ = 7.5-14 microns) using a microbolometer array. Suspended in the canopy within the camera FOV is a matte black copper plate instrumented with fine wire thermocouples that acts as a thermal reference for each image. In this presentation, I will discuss the challenges of continuous, long-term field operation of the camera, as well as measurement sensitivity to physical and environmental parameters. Based on this analysis, I will show that the uncertainties in converting radiometric signal to leaf temperature are well constrained. The key parameter for minimizing uncertainty is the emissivity of the objects being imaged: measuring the emissivity to within 0.01 enables leaf temperature to be calculated to within 0.5°C. Finally, I will present differences in leaf temperature observed amongst species. From our two-year record, we characterize high frequency, daily, and seasonal thermal signatures of leaves and crowns, in relation to environmental conditions. Our images are taken with sufficient spatial and temporal resolution to quantify the preferential heating of sunlit portions of the canopy and the cooling effect of wind gusts. Future work will
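The stated emissivity sensitivity can be reproduced with a crude graybody model: the measured signal is emitted radiation plus reflected background, and an error in the assumed emissivity propagates into the retrieved temperature. This sketch uses a broadband Stefan-Boltzmann approximation rather than the camera's 7.5-14 micron band, and the leaf and background temperatures are illustrative assumptions.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def exitance(T, eps, T_bg):
    """Graybody signal: emitted term plus background reflected off the leaf."""
    return eps * SIGMA * T**4 + (1 - eps) * SIGMA * T_bg**4

def retrieve_T(M, eps, T_bg):
    """Invert the graybody model for surface temperature, given assumed eps."""
    return ((M - (1 - eps) * SIGMA * T_bg**4) / (eps * SIGMA)) ** 0.25

T_leaf, T_bg = 298.0, 270.0           # leaf at 25 C, cold effective sky
M = exitance(T_leaf, 0.96, T_bg)      # "measured" signal, true eps = 0.96
exact = retrieve_T(M, 0.96, T_bg)     # correct eps recovers T_leaf
err = retrieve_T(M, 0.95, T_bg) - T_leaf  # eps misjudged by 0.01
```

With these numbers the 0.01 emissivity error shifts the retrieved temperature by roughly a quarter of a degree, comfortably inside the 0.5°C bound quoted in the abstract.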

  4. A near-infrared SETI experiment: instrument overview

    NASA Astrophysics Data System (ADS)

    Wright, Shelley A.; Werthimer, Dan; Treffers, Richard R.; Maire, Jérôme; Marcy, Geoffrey W.; Stone, Remington P. S.; Drake, Frank; Meyer, Elliot; Dorval, Patrick; Siemion, Andrew

    2014-07-01

We are designing and constructing a new SETI (Search for Extraterrestrial Intelligence) instrument to search for direct evidence of interstellar communications via pulsed laser signals at near-infrared wavelengths. The new instrument design builds upon our past optical SETI experiences, and is the first step toward a new, more versatile and sophisticated generation of very fast optical and near-infrared pulse search devices. We present our instrumental design by giving an overview of the opto-mechanical design, detector selection and characterization, signal processing, and integration procedure. This project makes use of near-infrared (950-1650 nm) discrete amplification Avalanche Photodiodes (APD) that have > 1 GHz bandwidths with low noise characteristics and moderate gain (~10^4). We have investigated the use of single versus multiple detectors in our instrument (see Maire et al., this conference), and have optimized the system to have both high sensitivity and low false coincidence rates. Our design is optimized for use behind a 1m telescope and includes an optical camera for acquisition and guiding. A goal is to make our instrument relatively economical and easy to duplicate. We describe our observational setup and our initial search strategies for SETI targets, and for potential interesting compact astrophysical objects.

  5. Application of Terrestrial Laser Scanner with an Integrated Thermal Camera in Non-Destructive Evaluation of Concrete Surface of Hydrotechnical Objects

    NASA Astrophysics Data System (ADS)

    Kaczmarek, Łukasz Dominik; Dobak, Paweł Józef; Kiełbasiński, Kamil

    2017-12-01

The authors present possible applications of thermal data as an additional source of information on an object's behaviour during the technical assessment of the condition of a concrete surface. For the study one of the most recent propositions introduced by the Zoller + Fröhlich company was used, which is an integration of a thermal camera with a terrestrial laser scanner. This solution enables an acquisition of geometric and spectral data on the surveyed object and also provides information on the surface's temperature at selected points. A section of the dam's downstream concrete wall was selected as the subject of the study, for which a number of scans were carried out and a number of thermal images were taken at different times of the day. The obtained thermal data was confronted with the acquired spectral information for the specified points. This made it possible to carry out a broader analysis of the surface and an inspection of the revealed fissure. The thermal analysis of said fissure indicated that the temperature changes within it are slower, which may affect the way the concrete works and may require further elaboration by the appropriate experts. Through the integration of a thermal camera with a terrestrial laser scanner one can analyse changes of temperature not only at discretely selected points but on the whole surface as well. Moreover, it is also possible to accurately determine the range and the area of the change affecting the surface. The authors note the limitations of the presented solution, such as the resolution of the thermal camera.

  6. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

A fundus camera for ophthalmology is a high-definition device which must provide low-light illumination of the human retina, high resolution at the retina, and a reflection-free image. Those constraints make its optical design very sophisticated, but the most difficult requirements to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is performed by an LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with autofocus and zoom built in, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.

  7. Multi-Angle Snowflake Camera Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shkurko, Konstantin; Garrett, T.; Gaustad, K

The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of depth of field for its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: images rely on the OpenCV image processing library and derived aggregated statistics rely on some clever averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
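The fallspeed derivation described above reduces to dividing the 32 mm array separation by the time between the upper- and lower-array triggers; a minimal sketch (the trigger timestamps below are hypothetical):

```python
ARRAY_SEPARATION_M = 0.032  # vertical gap between the two IR trigger arrays (32 mm)

def fallspeed(t_upper, t_lower):
    """Fallspeed (m/s) from the timestamps (s) at which the hydrometeor
    crosses the upper and then the lower emitter-detector array."""
    dt = t_lower - t_upper
    if dt <= 0:
        raise ValueError("lower array must trigger after the upper array")
    return ARRAY_SEPARATION_M / dt

v = fallspeed(0.000, 0.032)  # traverses 32 mm in 32 ms -> 1.0 m/s
```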

  8. Femtowatt incoherent image conversion from mid-infrared light to near-infrared light

    NASA Astrophysics Data System (ADS)

    Huang, Nan; Liu, Hongjun; Wang, Zhaolu; Han, Jing; Zhang, Shuan

    2017-03-01

We report on the experimental conversion imaging of an incoherent continuous-wave dim source from mid-infrared light to near-infrared light with a lowest input power of 31 femtowatts (fW). Incoherent mid-infrared images of light emission from a heat lamp bulb with an adjustable power supply, at window wavelengths ranging from 2.9 µm to 3.5 µm, are used for upconversion. The sum-frequency generation is realized in a laser cavity resonant at 1064 nm, pumped by a laser diode at 806 nm, built around a periodically poled lithium niobate (PPLN) crystal. The converted infrared image, at wavelengths around 785 nm and with a resolution of about 120 × 70, is detected with low noise using a silicon-based camera. By optimizing the system parameters, the upconversion quantum efficiency is predicted to be 28% for correctly polarized, on-axis and phase-matched light.
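The reported output wavelength follows from photon-energy conservation in sum-frequency generation, 1/lam_sfg = 1/lam_pump + 1/lam_ir; mixing the 1064 nm intracavity field with a mid-band 3.0 µm input indeed lands near 785 nm. A quick check:

```python
def sfg_wavelength_nm(lam_pump_nm, lam_ir_nm):
    """Sum-frequency output wavelength from photon-energy conservation:
    1/lam_sfg = 1/lam_pump + 1/lam_ir (all wavelengths in nm)."""
    return 1.0 / (1.0 / lam_pump_nm + 1.0 / lam_ir_nm)

out = sfg_wavelength_nm(1064, 3000)  # mid-band 3.0 um input -> ~785 nm
```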

  9. Application of PLZT electro-optical shutter to diaphragm of visible and mid-infrared cameras

    NASA Astrophysics Data System (ADS)

    Fukuyama, Yoshiyuki; Nishioka, Shunji; Chonan, Takao; Sugii, Masakatsu; Shirahata, Hiromichi

    1997-04-01

PLZT ((Pb0.91La0.09)(Zr0.65Ti0.35)0.9775O3, composition 9/65/35), commonly used as an electro-optical shutter, exhibits large phase retardation at low applied voltage. The shutter features: (1) high shutter speed, (2) wide optical transmittance, and (3) high optical density in the 'OFF' state. If the shutter is applied to the diaphragm of a video camera, it can protect the sensor from intense light. We have tested the basic characteristics of the PLZT electro-optical shutter and its imaging resolving power. The ratio of optical transmittance between the 'ON' and 'OFF' states was 1.1 x 10^3. The response time of the PLZT shutter from the 'ON' state to the 'OFF' state was 10 microseconds. The MTF reduction when putting the PLZT shutter in front of the visible video camera lens was only 12 percent at a spatial frequency of 38 cycles/mm, which is the sensor resolution of the video camera. Moreover, we took visible images with the Si-CCD video camera: the He-Ne laser ghost image was observed in the 'ON' state, whereas it was totally shut out in the 'OFF' state. From these tests, it has been found that the PLZT shutter is useful as the diaphragm of a visible video camera. The measured optical transmittance of a PLZT wafer with no antireflection coating was 78 percent over the range from 2 to 6 microns.

  10. Using DSLR cameras in digital holography

    NASA Astrophysics Data System (ADS)

    Hincapié-Zuluaga, Diego; Herrera-Ramírez, Jorge; García-Sucerquia, Jorge

    2017-08-01

In Digital Holography (DH), the size of the two-dimensional image sensor that records the digital hologram plays a key role in the performance of this imaging technique; the larger the camera sensor, the better the quality of the final reconstructed image. Scientific cameras with large formats are offered on the market, but their cost and availability limit their use as a first option when implementing DH. Nowadays, DSLR cameras provide an easy-access alternative that is worthwhile to explore. DSLR cameras are a widely available commercial option that, in comparison with traditional scientific cameras, offers a much lower cost per effective pixel over a large sensing area. However, in DSLR cameras, with their RGB pixel distribution, the sampling of information differs from the sampling in the monochrome cameras usually employed in DH. This fact has implications for their performance. In this work, we discuss why DSLR cameras are not extensively used for DH, taking into account the object-replication problem reported by different authors. Simulations of DH using monochrome and DSLR cameras are presented, and a theoretical derivation of the replication problem using Fourier theory is also shown. Experimental results of a DH implementation using a DSLR camera show the replication problem.

  11. Camera network video summarization

    NASA Astrophysics Data System (ADS)

    Panda, Rameswar; Roy-Chowdhury, Amit K.

    2017-05-01

Networks of vision sensors are deployed in many settings, ranging from security needs to disaster response to environmental monitoring. Many of these setups have hundreds of cameras and tens of thousands of hours of video. The difficulty of analyzing such a massive volume of video data is apparent whenever there is an incident that requires foraging through vast video archives to identify events of interest. As a result, video summarization, which automatically extracts a brief yet informative summary of these videos, has attracted intense attention in recent years. Much progress has been made in developing a variety of ways to summarize a single video in the form of a key sequence or video skim. However, generating a summary from a set of videos captured in a multi-camera network remains a novel and largely under-addressed problem. In this paper, with the aim of summarizing videos in a camera network, we introduce a novel representative selection approach via joint embedding and capped l21-norm minimization. The objective function is two-fold. The first is to capture the structural relationships of data points in a camera network via an embedding, which helps in characterizing the outliers and also in extracting a diverse set of representatives. The second is to use a capped l21-norm to model the sparsity and to suppress the influence of data outliers in representative selection. We propose to jointly optimize both of the objectives, such that the embedding can not only characterize the structure, but also indicate the requirements of sparse representative selection. Extensive experiments on standard multi-camera datasets well demonstrate the efficacy of our method over state-of-the-art methods.
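The capped l21-norm regularizer mentioned above can be sketched directly: each row of the selection-coefficient matrix contributes its Euclidean norm, clipped at a cap so that outlier rows cannot dominate the objective. A minimal illustration (the matrix values are made up, and this shows only the norm itself, not the authors' full joint-embedding objective):

```python
import numpy as np

def capped_l21(C, cap):
    """Capped l2,1 norm: sum over rows of min(||row||_2, cap).
    Rows whose norm exceeds the cap (likely outliers) contribute only
    the constant cap, limiting their influence during selection."""
    row_norms = np.linalg.norm(C, axis=1)
    return np.minimum(row_norms, cap).sum()

C = np.array([[3.0, 4.0],    # row norm 5.0 -> clipped to the cap (2.0)
              [0.6, 0.8],    # row norm 1.0 -> kept as-is
              [0.0, 0.0]])   # zero row     -> contributes nothing
val = capped_l21(C, cap=2.0)  # 2.0 + 1.0 + 0.0 = 3.0
```

The ordinary l2,1 norm is the special case cap = infinity; the cap is what makes the regularizer robust to gross outliers.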

  12. IMAX camera in payload bay

    NASA Image and Video Library

    1995-12-20

    STS074-361-035 (12-20 Nov 1995) --- This medium close-up view centers on the IMAX Cargo Bay Camera (ICBC) and its associated IMAX Camera Container Equipment (ICCE) at its position in the cargo bay of the Earth-orbiting Space Shuttle Atlantis. With its own "space suit," or protective covering to protect it from the rigors of space, this version of the IMAX was able to record scenes not accessible with the in-cabin cameras. For docking and undocking activities involving Russia's Mir Space Station and the Space Shuttle Atlantis, the camera joined a variety of in-cabin camera hardware in recording the historical events. IMAX's secondary objectives were to film Earth views. The IMAX project is a collaboration between NASA, the Smithsonian Institution's National Air and Space Museum (NASM), IMAX Systems Corporation, and the Lockheed Corporation to document significant space activities and promote NASA's educational goals using the IMAX film medium.

  13. 2001 Mars Odyssey Images Earth (Visible and Infrared)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    2001 Mars Odyssey's Thermal Emission Imaging System (THEMIS) acquired these images of the Earth using its visible and infrared cameras as it left the Earth. The visible image shows the thin crescent viewed from Odyssey's perspective. The infrared image was acquired at exactly the same time, but shows the entire Earth using the infrared's 'night-vision' capability. In visible light the instrument sees only reflected sunlight and therefore sees nothing on the night side of the planet. In infrared light the camera observes the light emitted by all regions of the Earth. The coldest ground temperatures seen correspond to the nighttime regions of Antarctica; the warmest temperatures occur in Australia. The low temperature in Antarctica is minus 50 degrees Celsius (minus 58 degrees Fahrenheit); the high temperature at night in Australia is 9 degrees Celsius (48.2 degrees Fahrenheit). These temperatures agree remarkably well with observed temperatures of minus 63 degrees Celsius at Vostok Station in Antarctica, and 10 degrees Celsius in Australia. The images were taken at a distance of 3,563,735 kilometers (more than 2 million miles) on April 19, 2001 as the Odyssey spacecraft left Earth.

  14. Active learning in camera calibration through vision measurement application

    NASA Astrophysics Data System (ADS)

    Li, Xiaoqin; Guo, Jierong; Wang, Xianchun; Liu, Changqing; Cao, Binfang

    2017-08-01

    Since cameras are increasingly used in scientific applications as well as in applications requiring precise visual information, effective calibration of such cameras is becoming more important. There are many reasons why measurements of objects are not accurate. The largest is lens distortion. Another detrimental influence on evaluation accuracy is caused by perspective distortions in the image, which occur whenever the camera cannot be mounted perpendicular to the objects being measured. Overall, it is very important for students to understand how to correct lens distortions, that is, camera calibration. Once the camera is calibrated, the images can be rectified, and it is then possible to obtain undistorted measurements in world coordinates. This paper presents how students can develop a sense of active learning for the mathematical camera model alongside the theoretical scientific basics. The authors present theoretical and practical lectures with the goal of deepening the students' understanding of the mathematical models of area-scan cameras and having them build a practical vision measurement process by themselves.
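
    As an illustration of the kind of model such lectures cover, the simplest radial lens-distortion model uses a single coefficient k1 acting on normalized image coordinates; undistortion inverts it by fixed-point iteration. This is a generic sketch (real calibrations add tangential and higher-order terms), not the paper's specific curriculum:

```python
import numpy as np

def distort(xy, k1):
    # Apply a one-parameter radial distortion to normalized coords
    x, y = xy
    r2 = x * x + y * y
    f = 1.0 + k1 * r2
    return np.array([x * f, y * f])

def undistort(xy_d, k1, iters=10):
    # Invert the model by fixed-point iteration: x = x_d / f(x)
    xy_d = np.asarray(xy_d, dtype=float)
    xy = xy_d.copy()
    for _ in range(iters):
        r2 = xy[0] ** 2 + xy[1] ** 2
        xy = xy_d / (1.0 + k1 * r2)
    return xy
```

    For the small k1 values typical of real lenses, a few iterations recover the undistorted point to machine precision.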

  15. A multipurpose camera system for monitoring Kīlauea Volcano, Hawai'i

    USGS Publications Warehouse

    Patrick, Matthew R.; Orr, Tim R.; Lee, Lopaka; Moniz, Cyril J.

    2015-01-01

    We describe a low-cost, compact multipurpose camera system designed for field deployment at active volcanoes that can be used either as a webcam (transmitting images back to an observatory in real-time) or as a time-lapse camera system (storing images onto the camera system for periodic retrieval during field visits). The system also has the capability to acquire high-definition video. The camera system uses a Raspberry Pi single-board computer and a 5-megapixel low-light (near-infrared sensitive) camera, as well as a small Global Positioning System (GPS) module to ensure accurate time-stamping of images. Custom Python scripts control the webcam and GPS unit and handle data management. The inexpensive nature of the system allows it to be installed at hazardous sites where it might be lost. Another major advantage of this camera system is that it provides accurate internal timing (independent of network connection) and, because a full Linux operating system and the Python programming language are available on the camera system itself, it has the versatility to be configured for the specific needs of the user. We describe example deployments of the camera at Kīlauea Volcano, Hawai‘i, to monitor ongoing summit lava lake activity. 
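
    The camera's control scripts are not reproduced in the abstract; as a hedged illustration of the GPS-disciplined time-stamping they describe, archived images might be named with an embedded UTC timestamp along these lines (the station code and filename format here are invented for the example):

```python
from datetime import datetime, timezone

def image_filename(station, dt):
    # Embed a UTC timestamp in the stored image name so that archives
    # stay correctly ordered even with no network connection available
    return "{}_{}.jpg".format(station, dt.strftime("%Y%m%d_%H%M%S"))
```

    For example, `image_filename("KIcam", datetime(2015, 1, 1, 12, 30, 0, tzinfo=timezone.utc))` yields `KIcam_20150101_123000.jpg`.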

  16. KWIC: A Widefield Mid-Infrared Array Camera/Spectrometer for the KAO

    NASA Technical Reports Server (NTRS)

    Stacey, Gordon J.

    1999-01-01

    This grant covered a one-year data analysis period for the data we obtained with the Kuiper Widefield Infrared Camera (KWIC) on the KAO during CY94 and CY95. A fairly complete list of scientific papers produced, or soon to be produced, under this award is contained at the end of this report. Below we summarize some highlights of the work we did under this grant. KWIC Imaging of the Orion Nebula. KWIC was successfully developed under the KAO grants program (NASA grant NAG2-800). First funding arrived in November of 1992, and we flew our first two flights in February of 1994 - just 15 months later. These flights were very successful. We imaged the Orion Nebula in the 37.7 micron continuum and the [SiII] 35 micron line, and imaged M82 and Arp299 in the 37.7 micron continuum. Our Orion image demonstrates that the 37.7 micron continuum arises in the warm dust associated with the photodissociated surfaces (photodissociation regions, or PDRs) of molecular clouds. We use the brightness and color temperature distribution to ascertain the morphology of the Orion PDR. The [SiII] image of Orion encompassed the entire Orion A HII region and its enveloping PDR. Most of the emission in the PDR regions of the map coincides very well with our 37.7 micron continuum map, indicating a PDR origin for the [SiII] in agreement with theoretical predictions. The [SiII] line emission in the PDR is very clumpy, directly imaging the clump structure that had been indirectly ascertained by examining the distribution and flux ratios of the [CII] and [OI] far-IR fine-structure lines and high-J CO emission. We also detected very strong [SiII] line emission from the embedded BN-KL star formation region, tracing the morphology and physical conditions of the high-velocity shock from these very young stars.

  17. Minimal camera networks for 3D image based modeling of cultural heritage objects.

    PubMed

    Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

    2014-03-25

    3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This is important for reducing image capture and processing time when documenting large and complex sites. Moreover, such a minimal camera network design is desirable when imaging non-digitally-documented artifacts in museums and other archeological sites, to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue "Lamassu", a human-headed winged bull over 4.25 m in height from the era of Ashurnasirpal II (883-859 BC). Close-range photogrammetry is used for the 3D modeling task: a dense, ordered imaging network of 45 high-resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling while preserving a pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured using a total station for external validation and scaling purposes. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and a final accuracy of 1 mm.
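
    The paper's two filtering methods are not specified in the abstract. A generic sketch of the underlying idea, keeping the fewest cameras while every object point remains covered, is a greedy set-cover selection (this is an assumption-laden illustration, not the authors' algorithm; `visibility` is a hypothetical camera-to-point visibility matrix):

```python
import numpy as np

def select_cameras(visibility, min_cov=1):
    """Greedy subset selection: visibility[c, p] is True if camera c
    sees object point p; repeatedly add the camera covering the most
    still-uncovered points until each point is seen min_cov times."""
    n_cam, n_pts = visibility.shape
    cov = np.zeros(n_pts, dtype=int)
    chosen, remaining = [], list(range(n_cam))
    while (cov < min_cov).any() and remaining:
        gains = [int(visibility[c, cov < min_cov].sum()) for c in remaining]
        if max(gains) == 0:
            break  # remaining cameras add no new coverage
        best = remaining[int(np.argmax(gains))]
        chosen.append(best)
        remaining.remove(best)
        cov += visibility[best].astype(int)
    return chosen
```

    A real minimal-network design additionally enforces intersection-angle and accuracy constraints on each point, not just visibility.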

  18. Minimal Camera Networks for 3D Image Based Modeling of Cultural Heritage Objects

    PubMed Central

    Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

    2014-01-01

    3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This is important for reducing image capture and processing time when documenting large and complex sites. Moreover, such a minimal camera network design is desirable when imaging non-digitally-documented artifacts in museums and other archeological sites, to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue “Lamassu”, a human-headed winged bull over 4.25 m in height from the era of Ashurnasirpal II (883–859 BC). Close-range photogrammetry is used for the 3D modeling task: a dense, ordered imaging network of 45 high-resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling while preserving a pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured using a total station for external validation and scaling purposes. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and a final accuracy of 1 mm. PMID:24670718

  19. Design and Calibration of a Dispersive Imaging Spectrometer Adaptor for a Fast IR Camera on NSTX-U

    NASA Astrophysics Data System (ADS)

    Reksoatmodjo, Richard; Gray, Travis; Princeton Plasma Physics Laboratory Team

    2017-10-01

    A dispersive spectrometer adaptor was designed, constructed and calibrated for use on a fast infrared camera employed to measure temperatures on the lower divertor tiles of the NSTX-U tokamak. This adaptor efficiently and evenly filters and distributes long-wavelength infrared photons between 8.0 and 12.0 microns across the 128x128 pixel detector of the fast IR camera. By determining the width of these separated wavelength bands across the camera detector, and then determining the corresponding average photon count for each wavelength, a very accurate measurement of the temperature, and thus heat flux, of the divertor tiles can be calculated using Planck's law. This approach of designing an exterior dispersive adaptor for the fast IR camera allows accurate temperature measurements to be made of materials with unknown emissivity. Further, the relative simplicity and affordability of this adaptor design provides an attractive option over more expensive, slower, dispersive IR camera systems. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No. DE-AC02-09CH11466.
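
    The forward model behind such a measurement is Planck's law: the in-band radiance for a given tile temperature follows from integrating the spectral radiance over the 8-12 micron window. A sketch of the forward computation (unit emissivity assumed for brevity; inverting it for temperature is the calibration step):

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    # Blackbody spectral radiance, W / (m^2 sr m)
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = np.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

# In-band radiance over the 8-12 micron window (rectangle rule)
wl = np.linspace(8e-6, 12e-6, 400)
dwl = wl[1] - wl[0]
L300 = planck_radiance(wl, 300.0).sum() * dwl
L400 = planck_radiance(wl, 400.0).sum() * dwl
# hotter tiles emit more in-band radiance, so the map L(T) is invertible
```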

  20. Diffraction experiments with infrared remote controls

    NASA Astrophysics Data System (ADS)

    Kuhn, Jochen; Vogt, Patrik

    2012-02-01

    In this paper we describe an experiment in which radiation emitted by an infrared remote control is passed through a diffraction grating. An image of the diffraction pattern is captured using a cell phone camera and then used to determine the wavelength of the radiation.
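
    The wavelength follows from the grating equation d sin(theta) = m*lambda, with the diffraction angle obtained from the measured position of the order on the screen in the photograph. A short sketch of that calculation (grating pitch and geometry here are example values, not the paper's data):

```python
import math

def wavelength_from_grating(d_m, screen_dist_m, offset_m, order=1):
    # d * sin(theta) = m * lambda; theta from the order's lateral
    # offset on the screen at distance screen_dist_m from the grating
    theta = math.atan2(offset_m, screen_dist_m)
    return d_m * math.sin(theta) / order
```

    With a 1000 lines/mm grating (d = 1.0e-6 m), an offset consistent with sin(theta) = 0.95 recovers a wavelength of 950 nm, typical of IR remote-control LEDs.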

  1. Semi-autonomous wheelchair system using stereoscopic cameras.

    PubMed

    Nguyen, Jordan S; Nguyen, Thanh H; Nguyen, Hung T

    2009-01-01

    This paper is concerned with the design and development of a semi-autonomous wheelchair system using stereoscopic cameras to assist hands-free control technologies for severely disabled people. The stereoscopic cameras capture an image from both the left and right cameras, which are then processed with a Sum of Absolute Differences (SAD) correlation algorithm to establish correspondence between image features in the different views of the scene. This is used to produce a stereo disparity image containing information about the depth of objects away from the camera in the image. A geometric projection algorithm is then used to generate a 3-Dimensional (3D) point map, placing pixels of the disparity image in 3D space. This is then converted to a 2-Dimensional (2D) depth map allowing objects in the scene to be viewed and a safe travel path for the wheelchair to be planned and followed based on the user's commands. This assistive technology utilising stereoscopic cameras has the purpose of automated obstacle detection, path planning and following, and collision avoidance during navigation. Experimental results obtained in an indoor environment displayed the effectiveness of this assistive technology.
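
    The SAD correlation step can be sketched for a single pixel as follows: slide a window from the left image along the same row of the right image and keep the offset (disparity) with the smallest sum of absolute differences. This is a minimal illustration; a real implementation vectorizes over the whole image:

```python
import numpy as np

def sad_disparity(left, right, x, y, win, max_disp):
    # Best disparity for pixel (x, y): minimize the sum of absolute
    # differences over a (2*win+1) x (2*win+1) window
    patch = left[y - win:y + win + 1, x - win:x + win + 1].astype(int)
    best_d, best_cost = 0, None
    for d in range(max_disp + 1):
        if x - d - win < 0:
            break
        cand = right[y - win:y + win + 1,
                     x - d - win:x - d + win + 1].astype(int)
        cost = np.abs(patch - cand).sum()
        if best_cost is None or cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

    Larger disparities correspond to objects closer to the camera, which is exactly the depth cue used for obstacle detection.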

  2. A technical innovation for improving identification of the trackers by the LED cameras in navigation-assisted total knee arthroplasty.

    PubMed

    Darmanis, Spyridon; Toms, Andrew; Durman, Robert; Moore, Donna; Eyres, Keith

    2007-07-01

    The aim was to reduce the operating time in computer-assisted navigated total knee replacement (TKR) by improving communication between the infrared camera and the trackers placed on the patient. The innovation involves placing a routinely used laser pointer on top of the camera, so that the infrared cameras focus precisely on the trackers located on the knee being operated on. A prospective randomized study was performed involving 40 patients divided into two groups, A and B. Both groups underwent navigated TKR, but for group B patients a laser pointer was used to improve the targeting capabilities of the cameras. Without the laser pointer, the camera had to be moved a mean of 9.2 times in order to identify the trackers. With the introduction of the laser pointer, this was reduced to 0.9 times. Accordingly, the additional mean time required without the laser pointer was 11.6 minutes. Time delays are a major problem in computer-assisted surgery, and our technical suggestion can contribute towards reducing the delays associated with this particular application.

  3. 24/7 security system: 60-FPS color EMCCD camera with integral human recognition

    NASA Astrophysics Data System (ADS)

    Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.

    2007-04-01

    An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron-multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and monochrome under quarter-moonlight to overcast starlight illumination. Sixty-frame-per-second operation and progressive scanning minimize motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms to detect/localize/track targets and reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars and trucks. Detection and tracking of targets too small for template-based detection is also achieved. For face and vehicle targets, the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.

  4. Single Pixel Black Phosphorus Photodetector for Near-Infrared Imaging.

    PubMed

    Miao, Jinshui; Song, Bo; Xu, Zhihao; Cai, Le; Zhang, Suoming; Dong, Lixin; Wang, Chuan

    2018-01-01

    Infrared imaging systems have a wide range of military and civil applications, and 2D nanomaterials have recently emerged as potential sensing materials that may outperform conventional ones such as HgCdTe, InGaAs, and InSb. As an example, 2D black phosphorus (BP) thin film has a thickness-dependent direct bandgap with low shot noise and noncryogenic operation for visible to mid-infrared photodetection. In this paper, the use of a single-pixel photodetector made with few-layer BP thin film for near-infrared imaging applications is demonstrated. The imaging is achieved by combining the photodetector with a digital micromirror device to encode and subsequently reconstruct the image based on a compressive sensing algorithm. Stationary images of a near-infrared laser spot (λ = 830 nm) with up to 64 × 64 pixels are captured using this single-pixel BP camera with 2,000 measurements, roughly half the total number of pixels. The imaging platform demonstrated in this work circumvents the grand challenge of scalable BP material growth for photodetector array fabrication and shows the efficacy of utilizing the outstanding performance of the BP photodetector for future high-speed infrared camera applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
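
    The single-pixel measurement model is linear: each micromirror pattern a_i yields one detector reading y_i = a_i . x, and the scene x is recovered from the stack of readings. True compressive recovery with fewer measurements than pixels additionally exploits sparsity solvers; the toy sketch below uses enough random patterns for a plain least-squares solve (random binary masks stand in for the DMD patterns; scene size and layout are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                   # 8x8 scene, flattened
scene = np.zeros(n)
scene[27] = 1.0                          # a single bright laser spot

A = rng.integers(0, 2, size=(96, n)).astype(float)  # DMD mask patterns
y = A @ scene                            # one photodiode reading per mask

recovered, *_ = np.linalg.lstsq(A, y, rcond=None)
```

    Replacing the least-squares solve with an l1-regularized solver is what lets the real system get away with roughly half as many measurements as pixels.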

  5. Firefly: A HOT camera core for thermal imagers with enhanced functionality

    NASA Astrophysics Data System (ADS)

    Pillans, Luke; Harmer, Jack; Edwards, Tim

    2015-06-01

    Raising the operating temperature of mercury cadmium telluride infrared detectors from 80K to above 160K creates new applications for high performance infrared imagers by vastly reducing the size, weight and power consumption of the integrated cryogenic cooler. Realizing the benefits of Higher Operating Temperature (HOT) requires a new kind of infrared camera core with the flexibility to address emerging applications in handheld, weapon mounted and UAV markets. This paper discusses the Firefly core developed to address these needs by Selex ES in Southampton UK. Firefly represents a fundamental redesign of the infrared signal chain reducing power consumption and providing compatibility with low cost, low power Commercial Off-The-Shelf (COTS) computing technology. This paper describes key innovations in this signal chain: a ROIC purpose built to minimize power consumption in the proximity electronics, GPU based image processing of infrared video, and a software customisable infrared core which can communicate wirelessly with other Battlespace systems.

  6. Calibration procedures of the Tore-Supra infrared endoscopes

    NASA Astrophysics Data System (ADS)

    Desgranges, C.; Jouve, M.; Balorin, C.; Reichle, R.; Firdaouss, M.; Lipa, M.; Chantant, M.; Gardarein, J. L.; Saille, A.; Loarer, T.

    2018-01-01

    Five endoscopes equipped with infrared cameras working in the medium infrared range (3-5 μm) are installed on the controlled thermonuclear fusion research device Tore-Supra. These endoscopes monitor the plasma facing components' surface temperature to prevent their overheating. Signals delivered by the infrared cameras through the endoscopes are analysed and used, on the one hand, in a real-time feedback control loop acting on the plasma heating systems to decrease plasma facing component surface temperatures when necessary, and on the other hand for physics studies such as determination of the incoming heat flux. To fulfil these two roles, a very accurate knowledge of the absolute surface temperatures is mandatory. Consequently, the infrared endoscopes must be calibrated through a very careful procedure. This means determining their transmission coefficients, which is a delicate operation. Methods to calibrate the infrared endoscopes during the shutdown period of the Tore-Supra machine are presented. As these methods cannot track possible changes in transmittance during operation, an in-situ method is also presented. It permits validation of the calibration performed in the laboratory as well as monitoring of the transmittance evolution during machine operation. This is possible through the use of the endoscope shutter and a dedicated plasma scenario developed to heat it. Possible improvements of this method are briefly discussed.

  7. Performance Characteristics For The Orbiter Camera Payload System's Large Format Camera (LFC)

    NASA Astrophysics Data System (ADS)

    Mollberg, Bernard H.

    1981-11-01

    The Orbiter Camera Payload System, the OCPS, is an integrated photographic system which is carried into Earth orbit as a payload in the Shuttle Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. The primary design objective for the LFC was to maximize all system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment.

  8. Phase-stepped fringe projection by rotation about the camera's perspective center.

    PubMed

    Huddart, Y R; Valera, J D; Weston, N J; Featherstone, T C; Moore, A J

    2011-09-12

    A technique to produce phase steps in a fringe projection system for shape measurement is presented. Phase steps are produced by introducing relative rotation between the object and the fringe projection probe (comprising a projector and camera) about the camera's perspective center. Relative motion of the object in the camera image can be compensated, because it is independent of the distance of the object from the camera, whilst the phase of the projected fringes is stepped due to the motion of the projector with respect to the object. The technique was validated with a static fringe projection system by moving an object on a coordinate measuring machine (CMM). The alternative approach, of rotating a lightweight and robust CMM-mounted fringe projection probe, is discussed. An experimental accuracy of approximately 1.5% of the projected fringe pitch was achieved, limited by the standard phase-stepping algorithms used rather than by the accuracy of the phase steps produced by the new technique.
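
    The phase steps produced by the rotation feed a standard phase-stepping algorithm. As one common example (not necessarily the specific algorithm evaluated in the paper), the four-step formula for frames with nominal 0, 90, 180 and 270 degree shifts is:

```python
import numpy as np

def four_step_phase(I0, I1, I2, I3):
    # I_k = A + B*cos(phi + k*pi/2)
    # => I0 - I2 = 2B*cos(phi), I3 - I1 = 2B*sin(phi)
    # => phi = atan2(I3 - I1, I0 - I2)
    return np.arctan2(I3 - I1, I0 - I2)
```

    Errors in the nominal shift values propagate into the recovered phase, which is consistent with the paper's finding that accuracy was limited by the standard algorithms rather than by the step-generation technique.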

  9. Color Infrared view of Houston, TX, USA

    NASA Image and Video Library

    1991-09-18

    This color infrared view of Houston (29.5N, 95.0W) was taken with a dual camera mount. Compare this scene with STS048-78-034 for an analysis of the unique properties of each film type. Comparative tests such as this aid in determining the kinds of information unique to each film system and in evaluating and comparing photography taken through hazy atmospheres. Infrared film is best at penetrating haze, detecting vegetation, and producing sharp images.

  10. Children's exposure to alcohol marketing within supermarkets: An objective analysis using GPS technology and wearable cameras.

    PubMed

    Chambers, T; Pearson, A L; Stanley, J; Smith, M; Barr, M; Ni Mhurchu, C; Signal, L

    2017-07-01

    Exposure to alcohol marketing within alcohol retailers has been associated with higher rates of childhood drinking, brand recognition, and marketing recall. This study aimed to objectively measure children's everyday exposure to alcohol marketing within supermarkets. Children aged 11-13 (n = 167) each wore a wearable camera and GPS device for four consecutive days. Micro-spatial analyses were used to examine exposures within supermarkets. In alcohol-retailing supermarkets (n = 30), children encountered alcohol marketing on 85% of their visits (n = 78). Alcohol marketing was frequently located near everyday goods (bread and milk) or the entrance/exit. Alcohol sales in supermarkets should be banned in order to protect children from alcohol marketing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Confocal Retinal Imaging Using a Digital Light Projector with a Near Infrared VCSEL Source

    PubMed Central

    Muller, Matthew S.; Elsner, Ann E.

    2018-01-01

    A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1″ LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging. PMID:29899586

  12. Confocal retinal imaging using a digital light projector with a near infrared VCSEL source

    NASA Astrophysics Data System (ADS)

    Muller, Matthew S.; Elsner, Ann E.

    2018-02-01

    A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1" LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging.

  13. Towards next generation 3D cameras

    NASA Astrophysics Data System (ADS)

    Gupta, Mohit

    2017-03-01

    We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that addresses these long-standing problems. This includes designing 'all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover the shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed (<100 micron resolution) scans in extremely demanding scenarios with low-cost components. Several of these cameras are making a practical impact in industrial automation, being adopted in robotic inspection and assembly systems.

  14. A hidden view of wildlife conservation: How camera traps aid science, research and management

    USGS Publications Warehouse

    O'Connell, Allan F.

    2015-01-01

    Camera traps — remotely activated cameras with infrared sensors — first gained measurable popularity in wildlife conservation in the early 1990s. Today, they’re used for a variety of activities, from species-specific research to broad-scale inventory or monitoring programs that, in some cases, attempt to detect biodiversity across vast landscapes. As this modern tool continues to evolve, it’s worth examining its uses and benefits for wildlife management and conservation.

  15. Geometric Calibration and Radiometric Correction of the Maia Multispectral Camera

    NASA Astrophysics Data System (ADS)

    Nocerino, E.; Dubbini, M.; Menna, F.; Remondino, F.; Gattelli, M.; Covi, D.

    2017-10-01

    Multispectral imaging is a widely used remote sensing technique, whose applications range from agriculture to environmental monitoring, from food quality checks to cultural heritage diagnostics. A variety of multispectral imaging sensors are available on the market, many of them designed to be mounted on different platforms, especially small drones. This work focuses on the geometric and radiometric characterization of a brand-new, lightweight, low-cost multispectral camera, called MAIA. The MAIA camera is equipped with nine sensors, allowing for the acquisition of images in the visible and near-infrared parts of the electromagnetic spectrum. Two versions are available, characterised by different sets of band-pass filters, inspired by the sensors mounted on the WorldView-2 and Sentinel-2 satellites, respectively. The camera details and the developed procedures for geometric calibration and radiometric correction are presented in the paper.
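
    The abstract does not detail the radiometric procedure; a common two-point "empirical line" correction, shown here as a generic sketch rather than the MAIA-specific method, maps raw digital numbers to reflectance per band using a dark and a white reference target:

```python
import numpy as np

def empirical_line(dn, dn_dark, dn_white, refl_white):
    # Linear per-band map: dark reference -> 0 reflectance,
    # white reference -> its known reflectance refl_white
    gain = refl_white / (dn_white - dn_dark)
    return gain * (np.asarray(dn, dtype=float) - dn_dark)
```

    With dark and white panel readings of 50 and 250 DN and a white-panel reflectance of 0.8, a pixel at 150 DN maps to a reflectance of 0.4.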

  16. Basic temperature correction of QWIP cameras in thermoelastic/plastic tests of composite materials.

    PubMed

    Boccardi, Simone; Carlomagno, Giovanni Maria; Meola, Carosena

    2016-12-01

    The present work is concerned with the use of a quantum well infrared photodetector (QWIP) infrared camera to measure very small temperature variations, related to thermoelastic/plastic effects, that develop on composites under relatively low loads, either periodic or due to impact. As is evident from previous work, some temperature variations are difficult to measure, being at the edge of the IR camera resolution and/or affected by instrument noise. Nevertheless, they can be valuable either for obtaining information about the material characteristics and its behavior under periodic load (thermoelastic), or for assessing the overall extension of delaminations due to impact (thermo-plastic). An image post-processing procedure is described herein that, with the help of a reference signal, allows for suppression of the instrument noise and better discrimination of the thermal signatures induced by the two different loads.

  17. Object localization in handheld thermal images for fireground understanding

    NASA Astrophysics Data System (ADS)

    Vandecasteele, Florian; Merci, Bart; Jalalvand, Azarakhsh; Verstockt, Steven

    2017-05-01

    Despite the broad application of handheld thermal imaging cameras in firefighting, their usage is mostly limited to subjective interpretation by the person carrying the device. To overcome this limitation, object localization and classification mechanisms could assist fireground understanding and help with the automated localization, characterization and spatio-temporal (spreading) analysis of the fire. An automated understanding of thermal images can enrich conventional knowledge-based firefighting techniques by providing information from data- and sensing-driven approaches. In this work, transfer learning is applied on multi-labeling convolutional neural network architectures for object localization and recognition in monocular visual, infrared and multispectral dynamic images. Furthermore, the possibility of analyzing fire scene images is studied and the current limitations are discussed. Finally, the understanding of the room configuration (i.e., object locations) for indoor localization in reduced visibility environments and the linking with Building Information Models (BIM) are investigated.

  18. Improved signal to noise ratio and sensitivity of an infrared imaging video bolometer on large helical device by using an infrared periscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Shwetang N., E-mail: pandya.shwetang@LHD.nifs.ac.jp; Sano, Ryuichi; Peterson, Byron J.

    An Infrared imaging Video Bolometer (IRVB) diagnostic is currently being used in the Large Helical Device (LHD) for studying the localization of radiation structures near the magnetic island and helical divertor X-points during plasma detachment and for 3D tomography. This research demands a high signal to noise ratio (SNR) and sensitivity to improve the temporal resolution for studying the evolution of radiation structures during plasma detachment, and a wide IRVB field of view (FoV) for tomography. Introduction of an infrared periscope allows achievement of a higher SNR and higher sensitivity, which in turn permits a twofold improvement in the temporal resolution of the diagnostic. Higher SNR along with wide FoV is achieved simultaneously by reducing the separation of the IRVB detector (metal foil) from the bolometer's aperture and the LHD plasma. Altering the distances to meet the aforesaid requirements results in an increased separation between the foil and the IR camera. This leads to a degradation of the diagnostic performance in terms of its sensitivity by 1.5-fold. Using an infrared periscope to image the IRVB foil results in a 7.5-fold increase in the number of IR camera pixels imaging the foil. This improves the IRVB sensitivity, which depends on the square root of the number of IR camera pixels being averaged per bolometer channel. Despite the slower f-number (f/# = 1.35) and reduced transmission (τ0 = 89%, due to an increased number of lens elements) for the periscope, the diagnostic with an infrared periscope operational on LHD has improved in terms of sensitivity and SNR by a factor of 1.4 and 4.5, respectively, as compared to the original diagnostic without a periscope (i.e., the IRVB foil being directly imaged by the IR camera through conventional optics). The bolometer's field of view has also increased by two times. The paper discusses these improvements in detail.

  19. Mars Cameras Make Panoramic Photography a Snap

    NASA Technical Reports Server (NTRS)

    2008-01-01

    If you wish to explore a Martian landscape without leaving your armchair, a few simple clicks around the NASA Web site will lead you to panoramic photographs taken from the Mars Exploration Rovers, Spirit and Opportunity. Many of the technologies that enable this spectacular Mars photography have also inspired advancements in photography here on Earth, including the panoramic camera (Pancam) and its housing assembly, designed by the Jet Propulsion Laboratory and Cornell University for the Mars missions. Mounted atop each rover, the Pancam mast assembly (PMA) can tilt a full 180 degrees and swivel 360 degrees, allowing for a complete, highly detailed view of the Martian landscape. The rover Pancams take small, 1 megapixel (1 million pixel) digital photographs, which are stitched together into large panoramas that sometimes measure 4 by 24 megapixels. The Pancam software performs some image correction and stitching after the photographs are transmitted back to Earth. Different lens filters and a spectrometer also assist scientists in their analyses of infrared radiation from the objects in the photographs. These photographs from Mars spurred developers to begin thinking in terms of larger and higher quality images: super-sized digital pictures, or gigapixels, which are images composed of 1 billion or more pixels. Gigapixel images are more than 200 times the size captured by today's standard 4 megapixel digital camera. Although originally created for the Mars missions, the detail provided by these large photographs allows for many purposes, not all of which are limited to extraterrestrial photography.

  20. A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks.

    PubMed

    Su, Po-Chang; Shen, Ju; Xu, Wanxin; Cheung, Sen-Ching S; Luo, Ying

    2018-01-15

    From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds.
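    For the rigid-transformation baseline the paper tests, the closed-form least-squares alignment of matched 3-D points (for instance, sphere centers seen from two cameras) is the Kabsch algorithm. A sketch on synthetic correspondences, not the paper's data:

```python
import numpy as np

def rigid_align(src, dst):
    """Closed-form least-squares rotation R and translation t such that
    dst ~ R @ src + t (Kabsch algorithm). src, dst: (N, 3) arrays."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    H = (src - mu_s).T @ (dst - mu_d)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Synthetic check: recover a known 30-degree rotation plus a translation.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -1.0, 2.0])
src = np.random.default_rng(1).normal(size=(20, 3))
dst = src @ R_true.T + t_true
R, t = rigid_align(src, dst)
```

    With noiseless correspondences the recovery is exact up to floating point; with real sphere-center detections it is the least-squares optimum.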

  1. High-Resolution Near-Infrared Spectroscopy of FU Orionis Objects

    NASA Astrophysics Data System (ADS)

    Hartmann, Lee; Hinkle, Kenneth; Calvet, Nuria

    2004-07-01

    We present an analysis of recent near-infrared, high-resolution spectra of the variable FU Ori objects. During a phase of rapid fading in optical brightness during 1997, V1057 Cyg exhibited shell absorption in first-overtone (v''-v'=2-0) CO lines, blueshifted by about 50 km s-1 from the system velocity. This shell component had not been seen previously, nor was it present in 1999, although some blueshifted absorption asymmetry is seen at the latter epoch. The appearance of this CO absorption shell is connected with the roughly contemporaneous appearance of blueshifted, low-excitation optical absorption lines with comparable low velocities; we suggest that this shell was also responsible for some of the peculiar emission features seen in red-optical spectra of V1057 Cyg. FU Ori continues to exhibit broad CO lines, with some evidence for the double-peaked profiles characteristic of an accretion disk; the line profiles are consistent with previous observations. Both FU Ori and V1057 Cyg continue to exhibit lower rotational broadening at 2.3 μm than at optical wavelengths, in agreement with the prediction of differentially rotating disk models; we have a marginal detection of the same effect in V1515 Cyg. The relative population of the first-overtone CO rotational levels in the FU Ori objects suggests low excitation temperatures. We compare disk models to the observations and find agreement with overall line strengths and rotational broadening, but the observed line profiles are generally less double-peaked than predicted. We suggest that the discrepancy in line profiles is due to turbulent motions in FU Ori disks, an effect qualitatively predicted by recent simulations of the magnetorotational instability in vertically stratified accretion disks. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy (AURA), Inc., under a cooperative agreement with the NSF, on behalf of the Gemini

  2. Cameras Reveal Elements in the Short Wave Infrared

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Goodrich ISR Systems Inc. (formerly Sensors Unlimited Inc.), based out of Princeton, New Jersey, received Small Business Innovation Research (SBIR) contracts from the Jet Propulsion Laboratory, Marshall Space Flight Center, Kennedy Space Center, Goddard Space Flight Center, Ames Research Center, Stennis Space Center, and Langley Research Center to assist in advancing and refining indium gallium arsenide imaging technology. Used on the Lunar Crater Observation and Sensing Satellite (LCROSS) mission in 2009 for imaging the short wave infrared wavelengths, the technology has dozens of applications in military, security and surveillance, machine vision, medical, spectroscopy, semiconductor inspection, instrumentation, thermography, and telecommunications.

  3. Infrared Spectroscopy Data Reduction with ORAC-DR

    NASA Astrophysics Data System (ADS)

    Economou, F.; Jenness, T.; Cavanagh, B.; Wright, G. S.; Bridger, A. B.; Kerr, T. H.; Hirst, P.; Adamson, A. J.

    ORAC-DR is a flexible and extensible data reduction pipeline suitable for both on-line and off-line use. Since its development it has been in use on-line at UKIRT for data from the infrared cameras UFTI and IRCAM and at JCMT for data from the sub-millimetre bolometer array SCUBA. We have now added a suite of on-line reduction recipes that produces publication quality (or nearly so) data from the CGS4 near-infrared spectrometer and the MICHELLE mid-infrared Echelle spectrometer. As an example, this paper briefly describes some pipeline features for one of the more commonly used observing modes.

  4. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
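    The paper's full method is not reproduced in this truncated abstract; the pinhole relation that such distance calculations rest on, d = f·H/h, can be sketched directly (the lens and sensor values below are illustrative, not from the paper):

```python
def object_distance(focal_mm, object_height_m, image_height_px, pixel_pitch_mm):
    """Pinhole estimate of the distance to an object of known real-world
    height H: d = f * H / h, where h is the object's projected height
    on the sensor (pixel count times pixel pitch)."""
    h_mm = image_height_px * pixel_pitch_mm
    return focal_mm * object_height_m / h_mm

# A 1.8 m tall object spanning 300 px on a sensor with 6 um pixels,
# photographed through a 50 mm lens:
d = object_distance(50.0, 1.8, 300, 0.006)   # -> 50.0 m
```

    The assumption of an object plane parallel to the image plane, stated in the abstract, is exactly what makes this single-ratio form valid.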

  5. Infrared stereo calibration for unmanned ground vehicle navigation

    NASA Astrophysics Data System (ADS)

    Harguess, Josh; Strange, Shawn

    2014-06-01

    The problem of calibrating two color cameras as a stereo pair has been heavily researched, and many off-the-shelf software packages, such as Robot Operating System and OpenCV, include calibration routines that work in most cases. However, the problem of calibrating two infrared (IR) cameras for the purposes of sensor fusion and point cloud generation is relatively new, and many challenges exist. We present a comparison of color camera and IR camera stereo calibration using data from an unmanned ground vehicle. There are two main challenges in IR stereo calibration: the calibration board (material, design, etc.) and the accuracy of calibration pattern detection. We present our analysis of these challenges along with our IR stereo calibration methodology. Finally, we present our results both visually and analytically with computed reprojection errors.
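    The reprojection error used to score such a calibration is conventionally the RMS distance between detected pattern corners and corners reprojected through the estimated camera model. A minimal illustration with made-up corner coordinates:

```python
import numpy as np

def rms_reprojection_error(detected, reprojected):
    """RMS Euclidean distance (in pixels) between detected calibration
    corners and corners reprojected with the estimated parameters.
    Both arguments are (N, 2) pixel-coordinate arrays."""
    d = np.linalg.norm(detected - reprojected, axis=1)
    return np.sqrt(np.mean(d ** 2))

detected = np.array([[10.0, 10.0], [20.0, 10.0], [10.0, 20.0]])
reproj = detected + np.array([[0.3, 0.4], [0.0, 0.5], [-0.3, -0.4]])
err = rms_reprojection_error(detected, reproj)   # 0.5 px
```

    Sub-pixel values indicate a good fit; blurred or low-contrast IR pattern detections inflate this number, which is one of the challenges the paper discusses.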

  6. Infrared needle mapping to assist biopsy procedures and training.

    PubMed

    Shar, Bruce; Leis, John; Coucher, John

    2018-04-01

    A computed tomography (CT) biopsy is a radiological procedure which involves using a needle to withdraw tissue or a fluid specimen from a lesion of interest inside a patient's body. The needle is progressively advanced into the patient's body, guided by the most recent CT scan. CT guided biopsies invariably expose patients to high dosages of radiation, due to the number of scans required whilst the needle is advanced. This study details the design of a novel method to aid biopsy procedures using infrared cameras. Two cameras are used to image the biopsy needle area, from which the proposed algorithm computes an estimate of the needle endpoint, which is projected onto the CT image space. This estimated position may be used to guide the needle between scans, and results in a reduction in the number of CT scans that need to be performed during the biopsy procedure. The authors formulate a 2D augmentation system which compensates for camera pose, and show that multiple low-cost infrared imaging devices provide a promising approach.
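    The authors' estimation algorithm is not detailed in the abstract; a standard way to recover a 3-D point (such as a needle endpoint) from two calibrated cameras is linear (DLT) triangulation, sketched here with synthetic cameras rather than the paper's setup:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from its pixel
    coordinates x1, x2 in two cameras with 3x4 projection matrices."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                      # null vector of A, homogeneous point
    return X[:3] / X[3]

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two synthetic cameras: identity pose and a 1 m baseline along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 3.0])          # the "needle tip"
x1, x2 = project(P1, X_true), project(P2, X_true)
X = triangulate(P1, P2, x1, x2)
```

    The recovered point can then be projected into the CT image space, as the abstract describes, to guide the needle between scans.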

  7. Automatic target recognition and detection in infrared imagery under cluttered background

    NASA Astrophysics Data System (ADS)

    Gundogdu, Erhan; Koç, Aykut; Alatan, A. Aydın.

    2017-10-01

    Visual object classification has long been studied in the visible spectrum by utilizing conventional cameras. Since labeled images have recently increased in number, it is possible to train deep Convolutional Neural Networks (CNN) with a significant number of parameters. As infrared (IR) sensor technology has improved during the last two decades, labeled images extracted from IR sensors have started to be used for object detection and recognition tasks. We address the problem of infrared object recognition and detection by exploiting 15K real-field images acquired with long-wave and mid-wave IR sensors. For feature learning, a stacked denoising autoencoder is trained on this IR dataset. To recognize the objects, the trained stacked denoising autoencoder is fine-tuned according to the binary classification loss of the target object. Once the training is completed, the test samples are propagated through the network, and the probability of the test sample belonging to a class is computed. Moreover, the trained classifier is utilized in a detect-by-classification method, where classification is performed on a set of candidate object boxes and the maximum confidence score in a particular location is accepted as the score of the detected object. To decrease the computational complexity, the detection step at every frame is avoided by running an efficient correlation filter based tracker. The detection part is performed only when the tracker confidence falls below a pre-defined threshold. The experiments conducted on real field images demonstrate that the proposed detection and tracking framework presents satisfactory results for detecting tanks under cluttered background.

  8. Early forest fire detection using principal component analysis of infrared video

    NASA Astrophysics Data System (ADS)

    Saghri, John A.; Radjabi, Ryan; Jacobs, John T.

    2011-09-01

    A land-based early forest fire detection scheme which exploits the infrared (IR) temporal signature of a fire plume is described. Unlike common land-based and/or satellite-based techniques, which rely on measurement and discrimination of the fire plume directly from its infrared and/or visible reflectance imagery, this scheme is based on exploitation of the fire plume's temporal signature, i.e., temperature fluctuations over the observation period. The method is simple and relatively inexpensive to implement. The false alarm rate is expected to be lower than that of existing methods. Land-based infrared (IR) cameras are installed in a step-stare-mode configuration in potential fire-prone areas. The sequence of IR video frames from each camera is digitally processed to determine if there is a fire within the camera's field of view (FOV). The process involves applying a principal component transformation (PCT) to each nonoverlapping sequence of video frames from the camera to produce a corresponding sequence of temporally-uncorrelated principal component (PC) images. Since pixels that form a fire plume exhibit statistically similar temporal variation (i.e., have a unique temporal signature), PCT conveniently renders the footprint/trace of the fire plume in low-order PC images. The PC image which best reveals the trace of the fire plume is then selected and spatially filtered via simple threshold and median filter operations to remove background clutter, such as traces of moving tree branches due to wind.
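    The PCT step described above can be sketched as an SVD of the mean-removed frame stack, where a flickering "plume" pixel dominates the first principal component image. This is a toy illustration on synthetic data, not the paper's processing chain:

```python
import numpy as np

def principal_component_images(frames):
    """frames: (T, H, W) IR sequence. Returns component images ordered
    by decreasing explained variance, plus the singular values, via an
    SVD of the mean-removed frames-by-pixels matrix."""
    T, H, W = frames.shape
    X = frames.reshape(T, -1)
    X = X - X.mean(axis=0)                    # remove the static scene
    _, S, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt.reshape(-1, H, W), S

rng = np.random.default_rng(2)
T, H, W = 8, 4, 4
frames = np.full((T, H, W), 20.0)             # static background scene
flicker = rng.normal(0.0, 1.0, T)             # fire-like temperature swings
frames[:, 1, 2] += 5.0 * flicker              # one flickering "plume" pixel
pcs, s = principal_component_images(frames)
hot = np.unravel_index(np.argmax(np.abs(pcs[0])), (H, W))   # (1, 2)
```

    Because only one pixel varies in time, the mean-removed stack is rank one and the first PC image concentrates all its energy on the plume pixel; real scenes spread the clutter over higher-order components, which is what the subsequent threshold/median filtering removes.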

  9. Eye pupil detection system using an ensemble of regression forest and fast radial symmetry transform with a near infrared camera

    NASA Astrophysics Data System (ADS)

    Jeong, Mira; Nam, Jae-Yeal; Ko, Byoung Chul

    2017-09-01

    In this paper, we focus on pupil center detection in various video sequences that include head poses and changes in illumination. To detect the pupil center, we first find four eye landmarks in each eye by using cascade local regression based on a regression forest. Based on the rough location of the pupil, a fast radial symmetry transform is applied using the previously found pupil location to refine the pupil center. As the final step, the pupil displacement between the previous frame and the current frame is estimated to maintain accuracy against a false localization result occurring in a particular frame. We generated a new face dataset, called Keimyung University pupil detection (KMUPD), with an infrared camera. The proposed method was successfully applied to the KMUPD dataset, and the results indicate that its pupil center detection capability is better than that of other methods, with a shorter processing time.

  10. Three plot correlation-based small infrared target detection in dense sun-glint environment for infrared search and track

    NASA Astrophysics Data System (ADS)

    Kim, Sungho; Choi, Byungin; Kim, Jieun; Kwon, Soon; Kim, Kyung-Tae

    2012-05-01

    This paper presents a separate spatio-temporal filter based small infrared target detection method to address the sea-based infrared search and track (IRST) problem in dense sun-glint environment. It is critical to detect small infrared targets such as sea-skimming missiles or asymmetric small ships for national defense. On the sea surface, sun-glint clutters degrade the detection performance. Furthermore, if we have to detect true targets using only three images with a low frame rate camera, then the problem is more difficult. We propose a novel three plot correlation filter and statistics based clutter reduction method to achieve robust small target detection rate in dense sun-glint environment. We validate the robust detection performance of the proposed method via real infrared test sequences including synthetic targets.

  11. Adaptive DOF for plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Oberdörster, Alexander; Lensch, Hendrik P. A.

    2013-03-01

    Plenoptic cameras promise to provide arbitrary re-focusing through a scene after the capture. In practice, however, the refocusing range is limited by the depth of field (DOF) of the plenoptic camera. For the focused plenoptic camera, this range is given by the range of object distances for which the microimages are in focus. We propose a technique of recording light fields with an adaptive depth of focus. Between multiple exposures (or multiple recordings of the light field) the distance between the microlens array (MLA) and the image sensor is adjusted. The depth and quality of focus is chosen by changing the number of exposures and the spacing of the MLA movements. In contrast to traditional cameras, extending the DOF does not necessarily lead to an all-in-focus image. Instead, the refocus range is extended. There is full creative control over the focus depth; images with shallow or selective focus can be generated.

  12. Optical and Near-infrared Spectra of σ Orionis Isolated Planetary-mass Objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zapatero Osorio, M. R.; Béjar, V. J. S.; Ramírez, K. Peña, E-mail: mosorio@cab.inta-csic.es, E-mail: vbejar@iac.es, E-mail: karla.pena@uantof.cl

    We have obtained low-resolution optical (0.7–0.98 μm) and near-infrared (1.11–1.34 μm and 0.8–2.5 μm) spectra of 12 isolated planetary-mass candidates (J = 18.2–19.9 mag) of the 3 Myr σ Orionis star cluster with the aim of determining the spectroscopic properties of very young, substellar dwarfs and assembling a complete cluster mass function. We have classified our targets by visual comparison with high- and low-gravity standards and by measuring newly defined spectroscopic indices. We derived L0–L4.5 and M9–L2.5 using high- and low-gravity standards, respectively. Our targets reveal clear signposts of youth, thus corroborating their cluster membership and planetary masses (6–13 M_Jup). These observations complete the σ Orionis mass function by spectroscopically confirming the planetary-mass domain to a confidence level of ∼75%. The comparison of our spectra with BT-Settl solar metallicity model atmospheres yields a temperature scale of 2350–1800 K and a low surface gravity of log g ≈ 4.0 [cm s^-2], as would be expected for young planetary-mass objects. We discuss the properties of the cluster's least-massive population as a function of spectral type. We have also obtained the first optical spectrum of S Ori 70, a T dwarf in the direction of σ Orionis. Our data provide reference optical and near-infrared spectra of very young L dwarfs and a mass function that may be used as templates for future studies of low-mass substellar objects and exoplanets. The extrapolation of the σ Orionis mass function to the solar neighborhood may indicate that isolated planetary-mass objects with temperatures of ∼200–300 K and masses in the interval 6–13 M_Jup may be as numerous as very low-mass stars.

  13. Nondestructive assessment of the severity of occlusal caries lesions with near-infrared imaging at 1310 nm.

    PubMed

    Lee, Chulsung; Lee, Dustin; Darling, Cynthia L; Fried, Daniel

    2010-01-01

    The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
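    The abstract does not state the contrast formula used; a common definition for transmission imaging, assumed here for illustration, is the normalized difference of region-of-interest means, (I_sound − I_lesion)/I_sound:

```python
import numpy as np

def lesion_contrast(sound, lesion):
    """Normalized contrast between the mean intensities of sound and
    carious regions of interest; under this (assumed) definition,
    higher values correspond to more severe lesions."""
    return (np.mean(sound) - np.mean(lesion)) / np.mean(sound)

sound_roi = np.array([200.0, 210.0, 190.0])    # illustrative pixel values
lesion_roi = np.array([120.0, 130.0, 110.0])
contrast = lesion_contrast(sound_roi, lesion_roi)   # 0.4
```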

  14. Nondestructive assessment of the severity of occlusal caries lesions with near-infrared imaging at 1310 nm

    PubMed Central

    Lee, Chulsung; Lee, Dustin; Darling, Cynthia L.; Fried, Daniel

    2010-01-01

    The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity. PMID:20799842

  15. Nondestructive assessment of the severity of occlusal caries lesions with near-infrared imaging at 1310 nm

    NASA Astrophysics Data System (ADS)

    Lee, Chulsung; Lee, Dustin; Darling, Cynthia L.; Fried, Daniel

    2010-07-01

    The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.

  16. Time-of-Flight Microwave Camera

    NASA Astrophysics Data System (ADS)

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
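    The quoted numbers can be checked directly: 200 ps of free-space travel corresponds to about 6 cm of optical path, and the classic FMCW relation c/(2B) gives the range resolution implied by the 4 GHz (8-12 GHz) sweep:

```python
c = 299_792_458.0                  # speed of light, m/s

path = c * 200e-12                 # distance light covers in 200 ps: ~6 cm
range_res = c / (2 * 4e9)          # FMCW range resolution, B = 4 GHz: ~3.75 cm
```

    The factor of 2 in the FMCW formula accounts for the round trip of the reflected wave.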

  17. Time-of-Flight Microwave Camera.

    PubMed

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-05

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.

  18. Line-Constrained Camera Location Estimation in Multi-Image Stereomatching.

    PubMed

    Donné, Simon; Goossens, Bart; Philips, Wilfried

    2017-08-23

    Stereomatching is an effective way of acquiring dense depth information from a scene when active measurements are not possible. So-called lightfield methods take a snapshot from many camera locations along a defined trajectory (usually uniformly linear or on a regular grid; we will assume a linear trajectory) and use this information to compute accurate depth estimates. However, they require the locations for each of the snapshots to be known: the disparity of an object between images is related to both the distance of the camera to the object and the distance between the camera positions for both images. Existing solutions use sparse feature matching for camera location estimation. In this paper, we propose a novel method that uses dense correspondences to do the same, leveraging an existing depth estimation framework to also yield the camera locations along the line. We illustrate the effectiveness of the proposed technique for camera location estimation both visually for the rectification of epipolar plane images and quantitatively with its effect on the resulting depth estimation. Our proposed approach yields a valid alternative for sparse techniques, while still being executed in a reasonable time on a graphics card due to its highly parallelizable nature.
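    The disparity relation the abstract appeals to, d = f·B/Z (focal length f in pixels, camera spacing B, object depth Z), can be illustrated directly; the numbers below are illustrative:

```python
def disparity_px(focal_px, baseline_m, depth_m):
    """Disparity (in pixels) of a point at depth Z seen from two camera
    positions separated by baseline B: d = f * B / Z."""
    return focal_px * baseline_m / depth_m

# Doubling the camera spacing doubles the disparity; doubling the depth
# halves it -- which is why both must be known (or jointly estimated).
d1 = disparity_px(800.0, 0.10, 4.0)    # ~20 px
d2 = disparity_px(800.0, 0.20, 4.0)    # ~40 px
d3 = disparity_px(800.0, 0.10, 8.0)    # ~10 px
```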

  19. Near-infrared Thermal Emission Detections of a Number of Hot Jupiters and the Systematics of Ground-based Near-infrared Photometry

    NASA Astrophysics Data System (ADS)

    Croll, Bryce; Albert, Loic; Jayawardhana, Ray; Cushing, Michael; Moutou, Claire; Lafreniere, David; Johnson, John Asher; Bonomo, Aldo S.; Deleuil, Magali; Fortney, Jonathan

    2015-03-01

    We present detections of the near-infrared thermal emission of three hot Jupiters and one brown dwarf using the Wide-field Infrared Camera (WIRCam) on the Canada-France-Hawaii Telescope (CFHT). These include Ks-band secondary eclipse detections of the hot Jupiters WASP-3b and Qatar-1b and the brown dwarf KELT-1b. We also report Y-band, K CONT-band, and two new and one reanalyzed Ks-band detections of the thermal emission of the hot Jupiter WASP-12b. We present a new reduction pipeline for CFHT/WIRCam data, which is optimized for high precision photometry. We also describe novel techniques for constraining systematic errors in ground-based near-infrared photometry, so as to return reliable secondary eclipse depths and uncertainties. We discuss the noise properties of our ground-based photometry for wavelengths spanning the near-infrared (the YJHK bands), for faint and bright stars, and for the same object on several occasions. For the hot Jupiters WASP-3b and WASP-12b we demonstrate the repeatability of our eclipse depth measurements in the Ks band; we therefore place stringent limits on the systematics of ground-based, near-infrared photometry, and also rule out violent weather changes in the deep, high pressure atmospheres of these two hot Jupiters at the epochs of our observations. Based on observations obtained with WIRCam, a joint project of Canada-France-Hawaii Telescope (CFHT), Taiwan, Korea, Canada, France, at the CFHT, which is operated by the National Research Council (NRC) of Canada, the Institute National des Sciences de l'Univers of the Centre National de la Recherche Scientifique of France, and the University of Hawaii.

  20. Visible and infrared reflectance imaging spectroscopy of paintings: pigment mapping and improved infrared reflectography

    NASA Astrophysics Data System (ADS)

    Delaney, John K.; Zeibel, Jason G.; Thoury, Mathieu; Littleton, Roy; Morales, Kathryn M.; Palmer, Michael; de la Rie, E. René

    2009-07-01

    Reflectance imaging spectroscopy, the collection of images in narrow spectral bands, has been developed for remote sensing of the Earth. In this paper we present findings on the use of imaging spectroscopy to identify and map artist pigments as well as to improve the visualization of preparatory sketches. Two novel hyperspectral cameras, one operating from the visible to near-infrared (VNIR) and the other in the shortwave infrared (SWIR), have been used to collect diffuse reflectance spectral image cubes on a variety of paintings. The resulting image cubes (VNIR 417 to 973 nm, 240 bands, and SWIR 970 to 1650 nm, 85 bands) were calibrated to reflectance and the resulting spectra compared with results from a fiber optics reflectance spectrometer (350 to 2500 nm). The results show good agreement between the spectra acquired with the hyperspectral cameras and those from the fiber reflectance spectrometer. For example, the primary blue pigments and their distribution in Picasso's Harlequin Musician (1924) are identified from the reflectance spectra and agree with results from X-ray fluorescence data and dispersed sample analysis. False color infrared reflectograms, obtained from the SWIR hyperspectral images, of extensively reworked paintings such as Picasso's The Tragedy (1903) are found to give improved visualization of changes made by the artist. These results show that including the NIR and SWIR spectral regions along with the visible provides for a more robust identification and mapping of artist pigments than using visible imaging spectroscopy alone.
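
    Calibrating raw image cubes "to reflectance" is typically done with dark and white-reference frames via the flat-field relation R = R_white * (raw - dark) / (white - dark). A minimal sketch with synthetic frames (the 0.99 reference reflectance and the array sizes are illustrative assumptions, not the authors' calibration code):

```python
import numpy as np

# Hypothetical raw frames (rows x cols x bands): dark current, a white
# diffuse reference of known reflectance 0.99, and the scene itself.
dark = np.full((4, 4, 5), 100.0)
white = np.full((4, 4, 5), 4100.0)
scene = dark + (0.5 / 0.99) * (white - dark)   # a 50%-reflectance target

# Flat-field calibration to reflectance, band by band:
#   R = R_white * (raw - dark) / (white - dark)
reflectance = 0.99 * (scene - dark) / (white - dark)
```

    The resulting per-pixel spectra are what get compared against fiber-optic reflectance measurements and pigment reference spectra.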

  1. Clinical usefulness of augmented reality using infrared camera based real-time feedback on gait function in cerebral palsy: a case study

    PubMed Central

    Lee, Byoung-Hee

    2016-01-01

    [Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials. PMID:27190489

  2. Clinical usefulness of augmented reality using infrared camera based real-time feedback on gait function in cerebral palsy: a case study.

    PubMed

    Lee, Byoung-Hee

    2016-04-01

    [Purpose] This study investigated the effects of real-time feedback using infrared camera recognition technology-based augmented reality in gait training for children with cerebral palsy. [Subjects] Two subjects with cerebral palsy were recruited. [Methods] In this study, augmented reality based real-time feedback training was conducted for the subjects in two 30-minute sessions per week for four weeks. Spatiotemporal gait parameters were used to measure the effect of augmented reality-based real-time feedback training. [Results] Velocity, cadence, bilateral step and stride length, and functional ambulation improved after the intervention in both cases. [Conclusion] Although additional follow-up studies of the augmented reality based real-time feedback training are required, the results of this study demonstrate that it improved the gait ability of two children with cerebral palsy. These findings suggest a variety of applications of conservative therapeutic methods which require future clinical trials.

  3. Multi-Touch Tabletop System Using Infrared Image Recognition for User Position Identification.

    PubMed

    Suto, Shota; Watanabe, Toshiya; Shibusawa, Susumu; Kamada, Masaru

    2018-05-14

    A tabletop system can facilitate multi-user collaboration in a variety of settings, including small meetings, group work, and education and training exercises. The ability to identify the users touching the table and their positions can promote collaborative work among participants, so methods have been studied that involve attaching sensors to the table, chairs, or to the users themselves. An effective method of recognizing user actions without placing a burden on the user would be some type of visual process, so the development of a method that processes multi-touch gestures by visual means is desired. This paper describes the development of a multi-touch tabletop system using infrared image recognition for user position identification and presents the results of touch-gesture recognition experiments and a system-usability evaluation. Using an inexpensive FTIR touch panel and infrared light, this system picks up the touch areas and the shadow area of the user's hand by an infrared camera to establish an association between the hand and table touch points and estimate the position of the user touching the table. The multi-touch gestures prepared for this system include an operation to change the direction of an object to face the user and a copy operation in which two users generate duplicates of an object. The system-usability evaluation revealed that prior learning was easy and that system operations could be easily performed.

  4. Plate refractive camera model and its applications

    NASA Astrophysics Data System (ADS)

    Huang, Longxiang; Zhao, Xu; Cai, Shen; Liu, Yuncai

    2017-03-01

    In real applications, a pinhole camera capturing objects through a planar parallel transparent plate is frequently employed. Due to the refractive effects of the plate, such an imaging system does not comply with the conventional pinhole camera model. Although the system is ubiquitous, it has not been thoroughly studied. This paper aims at presenting a simple virtual camera model, called a plate refractive camera model, which has a form similar to a pinhole camera model and can efficiently model refractions through a plate. The key idea is to employ a pixel-wise viewpoint concept to encode the refraction effects into a pixel-wise pinhole camera model. The proposed camera model realizes an efficient forward projection computation method and has some advantages in applications. First, the model can help to compute the caustic surface to represent the changes of the camera viewpoints. Second, the model has strengths in analyzing and rectifying the image caustic distortion caused by the plate refraction effects. Third, the model can be used to calibrate the camera's intrinsic parameters without removing the plate. Last but not least, the model contributes to putting forward the plate refractive triangulation methods in order to solve the plate refractive triangulation problem easily in multiviews. We verify our theory in both synthetic and real experiments.
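
    The refraction effect the plate model must absorb can be seen in the simplest case: a ray crossing a plane-parallel plate emerges parallel to itself but laterally displaced. A small sketch of that displacement from Snell's law (not the authors' model; the plate thickness and index are illustrative values):

```python
import math

def lateral_shift(theta_i_deg, t, n):
    """Lateral displacement of a ray crossing a plane-parallel plate of
    thickness t and refractive index n (surroundings assumed n = 1)."""
    ti = math.radians(theta_i_deg)
    tr = math.asin(math.sin(ti) / n)      # Snell's law at the first face
    return t * math.sin(ti - tr) / math.cos(tr)

# A 10 mm glass plate (n = 1.5) viewed 30 degrees off-normal:
d = lateral_shift(30.0, 10.0, 1.5)       # about 1.94 mm of displacement
```

    Because the shift depends on the incidence angle, each pixel effectively gets its own displaced viewpoint, which is exactly what the pixel-wise viewpoint concept in the plate refractive camera model encodes.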

  5. Low Noise Camera for Suborbital Science Applications

    NASA Technical Reports Server (NTRS)

    Hyde, David; Robertson, Bryan; Holloway, Todd

    2015-01-01

    Low-cost, commercial-off-the-shelf- (COTS-) based science cameras are intended for lab use only and are not suitable for flight deployment as they are difficult to ruggedize and repackage into instruments. Also, COTS implementation may not be suitable since mission science objectives are tied to specific measurement requirements, and often require performance beyond that required by the commercial market. Custom camera development for each application is cost prohibitive for the International Space Station (ISS) or midrange science payloads due to nonrecurring expenses ($2,000 K) for ground-up camera electronics design. While each new science mission has a different suite of requirements for camera performance (detector noise, speed of image acquisition, charge-coupled device (CCD) size, operation temperature, packaging, etc.), the analog-to-digital conversion, power supply, and communications can be standardized to accommodate many different applications. The low noise camera for suborbital applications is a rugged standard camera platform that can accommodate a range of detector types and science requirements for use in inexpensive to mid range payloads supporting Earth science, solar physics, robotic vision, or astronomy experiments. Cameras developed on this platform have demonstrated the performance found in custom flight cameras at a price per camera more than an order of magnitude lower.

  6. On the Integration of Medium Wave Infrared Cameras for Vision-Based Navigation

    DTIC Science & Technology

    2015-03-01

    SWIR: Short Wave Infrared; VisualSFM: Visual Structure from Motion; WPAFB: Wright Patterson Air Force Base. ... Visual Structure from Motion (VisualSFM) is an application that performs incremental SfM using images of a scene fed into it [20] ... too drastically in between frames. When this happens, VisualSFM will begin creating a new model with images that do not fit the old one. These new


  7. Comparison of parameters of modern cooled and uncooled thermal cameras

    NASA Astrophysics Data System (ADS)

    Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał

    2017-10-01

    When designing a system employing thermal cameras, one always faces the problem of choosing the camera type best suited for the task. In many cases the choice is far from optimal, and there are several reasons for that. System designers often favor tried and tested solutions they are used to. They do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice rather than facts. The paper presents the results of measurements of basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive for the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards. However, the camera settings were not optimized for specific test conditions or parameter measurements; instead, the settings used in normal camera operation were applied, to obtain realistic performance figures. For example, there were significant differences between measured noise parameters and the catalogue data provided by manufacturers, due to the application of edge-detection filters that increase detection and recognition ranges. The purpose of this paper is to help in choosing the optimal thermal camera for a particular application, answering the question of whether to opt for a cheaper microbolometer device or a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing thermal camera systems with cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best performing devices were selected.
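
    One of the basic parameters such laboratory tests report is NETD, commonly estimated as temporal noise divided by the signal transfer function (SiTF) between two blackbody temperatures. A toy sketch on synthetic frames (the counts, noise level, and 5 K temperature step are illustrative assumptions, not measured values from the paper):

```python
import numpy as np

# Hypothetical frame stacks of two blackbody targets 5 K apart, in raw
# camera counts, used to estimate SiTF and temporal noise.
rng = np.random.default_rng(2)
frames_t1 = 1000.0 + rng.normal(0.0, 3.0, size=(50, 16, 16))   # at T1
frames_t2 = 1250.0 + rng.normal(0.0, 3.0, size=(50, 16, 16))   # at T1 + 5 K

sitf = (frames_t2.mean() - frames_t1.mean()) / 5.0   # counts per kelvin
noise = frames_t1.std(axis=0).mean()                 # temporal noise, counts
netd_mk = 1000.0 * noise / sitf                      # NETD in millikelvin
```

    Edge-detection or other spatial filtering applied before this measurement changes the noise term, which is one way catalogue NETD figures and independently measured ones can diverge.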

  8. Can reliable sage-grouse lek counts be obtained using aerial infrared technology

    USGS Publications Warehouse

    Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.

    2013-01-01

    More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-winged aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative technique to conventional ground-based methods, but further research is needed. We discuss multiple advantages to aerial infrared surveys, including counting in remote areas, representing greater spatial variation, and increasing the number of counted leks per season. Aerial infrared lek counts may be a valuable wildlife management tool that releases time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.

  9. Estimation of wetland evapotranspiration in northern New York using infrared thermometry

    NASA Astrophysics Data System (ADS)

    Hwang, K.; Chandler, D. G.

    2016-12-01

    Evapotranspiration (ET) is an important component of the water budget and is often regarded as a major water loss. In freshwater wetlands, cumulative annual ET can equal precipitation under well-watered conditions; wetland ET is therefore an important control on contaminant and nutrient transport. Yet quantification of wetland ET is challenged by complex surface characteristics, diverse plant species and density, and variations in wetland shape and size. As handheld infrared (IR) cameras have become available, studies exploiting the new technology have increased, especially in agriculture and hydrology. The benefits of IR cameras include (1) high spatial resolution, (2) high sample rates, (3) real-time imaging, (4) a constant viewing geometry, and (5) no need for atmosphere and cloud corrections. Compared with traditional methods, infrared thermometry is capable of monitoring at the scale of a small pond or a localized plant community. This enables finer-scale surveys of heterogeneous land surfaces rather than strict dependence on atmospheric variables. Despite this potential, there have been only a limited number of studies of ET and drought stress with IR cameras. In this study, an infrared thermometry-based method was applied to estimate ET over wetland plant species in the St. Lawrence River Valley, NY. The results are evaluated against traditional methods to test applicability over multiple vegetation species in the same area.
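
    Infrared-thermometry ET estimates of this kind often treat latent heat as the residual of the surface energy balance, with sensible heat computed from the radiometric surface temperature. A hedged sketch of that bookkeeping (the resistance and flux values are illustrative, and the study's actual method may differ):

```python
# One-source energy-balance sketch: sensible heat from radiometric surface
# temperature, latent heat (and hence ET) as the residual of net radiation.
RHO_CP = 1200.0      # volumetric heat capacity of air, J m-3 K-1 (approx.)
LAMBDA = 2.45e6      # latent heat of vaporization, J kg-1

def et_residual(ts, ta, ra, rn, g):
    """ts, ta: surface and air temperature (same units); ra: aerodynamic
    resistance, s m-1; rn, g: net radiation and ground heat flux, W m-2.
    Returns (H, LE, ET in mm per hour)."""
    h = RHO_CP * (ts - ta) / ra       # sensible heat flux, W m-2
    le = rn - g - h                   # latent heat flux as the residual
    et_mm_h = le / LAMBDA * 3600.0    # kg m-2 s-1 equals mm s-1; to mm h-1
    return h, le, et_mm_h

h, le, et = et_residual(ts=302.0, ta=299.0, ra=60.0, rn=500.0, g=50.0)
```

    The IR camera supplies the per-pixel ts field, which is what lets this budget be evaluated plant community by plant community instead of only at a flux-tower footprint.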

  10. Computing camera heading: A study

    NASA Astrophysics Data System (ADS)

    Zhang, John Jiaxiang

    2000-08-01

    An accurate estimate of the motion of a camera is a crucial first step for the 3D reconstruction of sites, objects, and buildings from video. Solutions to the camera heading problem can be readily applied to many areas, such as robotic navigation, surgical operation, video special effects, multimedia, and lately even internet commerce. From image sequences of a real world scene, the problem is to calculate the directions of the camera translations. The presence of rotations makes this problem very hard. This is because rotations and translations can have similar effects on the images, and are thus hard to tell apart. However, the visual angles between the projection rays of point pairs are unaffected by rotations, and their changes over time contain sufficient information to determine the direction of camera translation. We developed a new formulation of the visual angle disparity approach, first introduced by Tomasi, to the camera heading problem. Our new derivation makes theoretical analysis possible. Most notably, a theorem is obtained that locates all possible singularities of the residual function for the underlying optimization problem. This allows all computational trouble spots to be identified beforehand, and reliable, accurate computational optimization methods to be designed. A bootstrap-jackknife resampling method simultaneously reduces complexity and tolerates outliers well. Experiments with image sequences show accurate results when compared with the true camera motion as measured with mechanical devices.
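
    The key observation, that visual angles between projection rays are unaffected by camera rotation, is easy to verify numerically: applying the same rotation matrix to both rays leaves their angle unchanged. A small sketch (the points and rotation are arbitrary illustrative values, not the paper's formulation):

```python
import numpy as np

def visual_angle(u, v):
    """Angle between two projection rays through the camera center."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    return np.arccos(np.clip(u @ v, -1.0, 1.0))

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

# Rays from the camera center to two scene points (arbitrary values).
p = np.array([1.0, 0.2, 4.0])
q = np.array([-0.5, 0.8, 3.0])

# An arbitrary camera rotation: rotating both rays preserves the angle,
# whereas translating the camera would change it.
R = rot_x(0.3) @ rot_z(1.1)
angle_before = visual_angle(p, q)
angle_after = visual_angle(R @ p, R @ q)
```

    This rotation invariance is what lets changes in visual angles over time be attributed purely to camera translation.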

  11. Superimpose methods for uncooled infrared camera applied to the micro-scale thermal characterization of composite materials

    NASA Astrophysics Data System (ADS)

    Morikawa, Junko

    2015-05-01

    A mobile apparatus for quantitative micro-scale thermography using a micro-bolometer was developed, based on our original techniques: an achromatic lens design to capture micro-scale images in the long-wave infrared, video-signal superimposing for real-time emissivity correction, and pseudo-acceleration of the timeframe. The instrument was designed to fit in a 17 cm x 28 cm x 26 cm carrying box. The video signal synthesizer made it possible to record a direct digital signal of the monitored temperature or positioning data. The encoded digital signal embedded in each image was decoded on read-out; the protocol to encode and decode the measured data was originally defined. The mixed signals of the IR camera and the imposed data were applied to pixel-by-pixel emissivity corrections and to pseudo-acceleration of periodic thermal phenomena. Because the emissivity of industrial materials and biological tissues is usually inhomogeneous, each pixel has a different temperature dependence. The time-scale resolution for periodic thermal events was improved with the pseudo-acceleration algorithm, which reduces noise by integrating multiple image frames while preserving time resolution. The anisotropic thermal properties of composite materials, such as thermally insulating cellular plastics and biometric composites, were analyzed using these techniques.
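
    Pixel-by-pixel emissivity correction of the kind described rests on the graybody relation L_meas = eps * L_obj + (1 - eps) * L_ambient, inverted per pixel with an emissivity map. A simplified radiometric sketch (the radiance units, emissivity map, and single reflected-ambient term are illustrative assumptions; the authors' superimposing scheme is more involved):

```python
import numpy as np

# Graybody model per pixel:
#   L_meas = eps * L_obj + (1 - eps) * L_ambient   =>   solve for L_obj.
eps = np.array([[0.95, 0.80], [0.60, 0.90]])   # per-pixel emissivity map
l_ambient = 30.0                               # reflected ambient radiance
l_obj_true = np.full((2, 2), 100.0)            # radiance leaving the object

l_meas = eps * l_obj_true + (1.0 - eps) * l_ambient   # what the camera sees
l_obj = (l_meas - (1.0 - eps) * l_ambient) / eps      # corrected, per pixel
```

    Without the per-pixel map, a sample with spatially varying emissivity would show apparent temperature structure that is not actually there.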

  12. Infrared Imaging for Inquiry-Based Learning

    ERIC Educational Resources Information Center

    Xie, Charles; Hazzard, Edmund

    2011-01-01

    Based on detecting long-wavelength infrared (IR) radiation emitted by the subject, IR imaging shows temperature distribution instantaneously and heat flow dynamically. As a picture is worth a thousand words, an IR camera has great potential in teaching heat transfer, which is otherwise invisible. The idea of using IR imaging in teaching was first…

  13. A study of thermographic diagnosis system and imaging algorithm by distributed thermal data using single infrared sensor.

    PubMed

    Yoon, Se Jin; Noh, Si Cheol; Choi, Heung Ho

    2007-01-01

    The infrared diagnosis device provides two-dimensional images and patient-oriented results that can be easily understood by the inspection target by using infrared cameras; however, it has disadvantages such as large size, high price, and inconvenient maintenance. In this regard, this study proposes a small-sized body-heat diagnosis device using a single infrared sensor, and implements an infrared detection system together with an algorithm that renders thermography from the acquired point-source temperature data. The developed system had a temperature resolution of 0.1 degree and a reproducibility of +/-0.1 degree. The accuracy was 90.39% at an error bound of +/-0 degree and 99.98% at +/-0.1 degree. To evaluate the proposed algorithm and system, the resulting images were compared with those of the infrared camera method. Thermal images with clinical meaning were obtained from a patient with a lesion to verify the system's clinical applicability.

  14. Infrared Camera Characterization of Bi-Propellant Reaction Control Engines during Auxiliary Propulsion Systems Tests at NASA's White Sands Test Facility in Las Cruces, New Mexico

    NASA Technical Reports Server (NTRS)

    Holleman, Elizabeth; Sharp, David; Sheller, Richard; Styron, Jason

    2007-01-01

    This paper describes the application of a FLIR Systems A40M infrared (IR) digital camera for thermal monitoring of a Liquid Oxygen (LOX) and Ethanol bi-propellant Reaction Control Engine (RCE) during Auxiliary Propulsion System (APS) testing at the National Aeronautics & Space Administration's (NASA) White Sands Test Facility (WSTF) near Las Cruces, New Mexico. Typically, NASA has relied mostly on the use of ThermoCouples (TC) for this type of thermal monitoring due to the variability of constraints required to accurately map rapidly changing temperatures from ambient to glowing hot chamber material. Obtaining accurate real-time temperatures in the IR spectrum is made even more elusive by the changing emissivity of the chamber material as it begins to glow. The parameters evaluated prior to APS testing included: (1) remote operation of the A40M camera using fiber optic Firewire signal sender and receiver units; (2) operation of the camera inside a Pelco explosion-proof enclosure with a germanium window; (3) remote analog signal display for real-time monitoring; (4) remote digital data acquisition of the A40M's sensor information using FLIR's ThermaCAM Researcher Pro 2.8 software; and (5) overall reliability of the system. An initial characterization report was prepared after the A40M characterization tests at Marshall Space Flight Center (MSFC) to document controlled heat source comparisons to calibrated TCs. Summary IR digital data recorded from WSTF's APS testing is included within this document along with findings, lessons learned, and recommendations for further usage as a monitoring tool for the development of rocket engines.

  15. New opportunities for quality enhancing of images captured by passive THz camera

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2014-10-01

    As is well known, a passive THz camera allows seeing concealed objects without contact with a person, and the camera is not dangerous to a person. Obviously, the efficiency of a passive THz camera depends on its temperature resolution. This characteristic determines the detection capability for concealed objects: the minimal size of the object, the maximal detection distance, and the image quality. Computer processing of THz images may improve image quality many times over without any additional engineering effort; developing modern computer codes for application to THz images is therefore an urgent problem. Using appropriate new methods, one may expect a temperature resolution that will allow a banknote in a person's pocket to be seen without any real contact. Modern algorithms for computer processing of THz images also allow objects inside the human body to be seen via a temperature trace on the human skin. This circumstance essentially enhances the opportunities for applying passive THz cameras to counterterrorism problems. We demonstrate the capabilities, achieved at the present time, for detecting both concealed objects and clothing components through computer processing of images captured by passive THz cameras manufactured by various companies. Another important result discussed in the paper is the observation of THz radiation emitted by an incandescent lamp and of an image reflected from a ceramic floor plate. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China). All algorithms for computer processing of the THz images considered in this paper were developed by the Russian part of the author list. Keywords: THz wave, passive imaging camera, computer processing, security screening, concealed and forbidden objects, reflected image, hand seeing, banknote seeing, ceramic floorplate, incandescent lamp.

  16. A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks †

    PubMed Central

    Shen, Ju; Xu, Wanxin; Luo, Ying

    2018-01-01

    From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds. PMID:29342968
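
    With a spherical calibration object, each camera observes the sphere's center, and the rigid transformation between two cameras can be recovered from corresponding center positions, e.g. with the standard Kabsch/Procrustes solution that typically initializes such pipelines. A sketch on synthetic correspondences (the point sets and ground-truth pose are fabricated for illustration; the paper additionally evaluates non-rigid transformation models):

```python
import numpy as np

def rigid_transform(a, b):
    """Least-squares rotation R and translation t with b = R @ a + t,
    for Nx3 point sets a, b (Kabsch / orthogonal Procrustes)."""
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    h = (a - ca).T @ (b - cb)                 # 3x3 cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))    # guard against reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = cb - r @ ca
    return r, t

# Sphere-center positions seen by camera A, and the same centers in
# camera B's frame under a known pose (ground truth for checking).
rng = np.random.default_rng(3)
pts_a = rng.uniform(-1.0, 1.0, size=(6, 3))
angle = 0.7
r_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
pts_b = pts_a @ r_true.T + t_true

r_est, t_est = rigid_transform(pts_a, pts_b)
```

    In a real network, these pairwise estimates would then be refined jointly, e.g. by a bundle-adjustment-style minimization of the global 3D reprojection error as the paper describes.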

  17. Hubble's Wide View of 'Mystic Mountain' in Infrared

    NASA Image and Video Library

    2010-04-23

    NASA image release April 22, 2010 This is a NASA Hubble Space Telescope near-infrared-light image of a three-light-year-tall pillar of gas and dust that is being eaten away by the brilliant light from nearby stars in the tempestuous stellar nursery called the Carina Nebula, located 7,500 light-years away in the southern constellation Carina. The image marks the 20th anniversary of Hubble's launch and deployment into an orbit around Earth. The image reveals a plethora of stars behind the gaseous veil of the nebula's wall of hydrogen, laced with dust. The foreground pillar becomes semi-transparent because infrared light from background stars penetrates through much of the dust. A few stars inside the pillar also become visible. The false colors are assigned to three different infrared wavelength ranges. Hubble's Wide Field Camera 3 observed the pillar in February and March 2010. Object Names: HH 901, HH 902 Image Type: Astronomical Credit: NASA, ESA, and M. Livio and the Hubble 20th Anniversary Team (STScI) To learn more about this image, go to: www.nasa.gov/mission_pages/hubble/science/hubble20th-img.... NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

  18. Near-Infrared Photon-Counting Camera for High-Sensitivity Observations

    NASA Technical Reports Server (NTRS)

    Jurkovic, Michael

    2012-01-01

    The dark current of a transferred-electron photocathode with an InGaAs absorber, responsive over the 0.9-to-1.7-micron range, must be reduced to an ultralow level suitable for low signal spectral astrophysical measurements by lowering the temperature of the sensor incorporating the cathode. However, photocathode quantum efficiency (QE) is known to reduce to zero at such low temperatures. Moreover, it has not been demonstrated that the target dark current can be reached at any temperature using existing photocathodes. Changes in the transferred-electron photocathode epistructure (with an InGaAs absorber lattice-matched to InP and exhibiting responsivity over the 0.9-to-1.7-micron range) and fabrication processes were developed and implemented that resulted in a demonstrated >13x reduction in dark current at -40 C while retaining >95% of the approximately equal to 25% saturated room-temperature QE. Further testing at lower temperature is needed to confirm a >25 C predicted reduction in cooling required to achieve an ultralow dark-current target suitable for faint spectral astronomical observations that are not otherwise possible. This reduction in dark current makes it possible to increase the integration time of the imaging sensor, thus enabling a much higher near-infrared (NIR) sensitivity than is possible with current technology. As a result, extremely faint phenomena and NIR signals emitted from distant celestial objects can be now observed and imaged (such as the dynamics of redshifting galaxies, and spectral measurements on extra-solar planets in search of water and bio-markers) that were not previously possible. In addition, the enhanced NIR sensitivity also directly benefits other NIR imaging applications, including drug and bomb detection, stand-off detection of improvised explosive devices (IED's), Raman spectroscopy and microscopy for life/physical science applications, and semiconductor product defect detection.

  19. Faint Object Camera imaging and spectroscopy of NGC 4151

    NASA Technical Reports Server (NTRS)

    Boksenberg, A.; Catchpole, R. M.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.

    1995-01-01

    We describe ultraviolet and optical imaging and spectroscopy within the central few arcseconds of the Seyfert galaxy NGC 4151, obtained with the Faint Object Camera on the Hubble Space Telescope. A narrowband image including (O III) lambda(5007) shows a bright nucleus centered on a complex biconical structure having apparent opening angle approximately 65 deg and axis at a position angle along 65 deg-245 deg; images in bands including Lyman-alpha and C IV lambda(1550) and in the optical continuum near 5500 A, show only the bright nucleus. In an off-nuclear optical long-slit spectrum we find a high and a low radial velocity component within the narrow emission lines. We identify the low-velocity component with the bright, extended, knotty structure within the cones, and the high-velocity component with more confined diffuse emission. Also present are strong continuum emission and broad Balmer emission line components, which we attribute to the extended point spread function arising from the intense nuclear emission. Adopting the geometry pointed out by Pedlar et al. (1993) to explain the observed misalignment of the radio jets and the main optical structure we model an ionizing radiation bicone, originating within a galactic disk, with apex at the active nucleus and axis centered on the extended radio jets. We confirm that through density bounding the gross spatial structure of the emission line region can be reproduced with a wide opening angle that includes the line of sight, consistent with the presence of a simple opaque torus allowing direct view of the nucleus. In particular, our modelling reproduces the observed decrease in position angle with distance from the nucleus, progressing initially from the direction of the extended radio jet, through our optical structure, and on to the extended narrow-line region. We explore the kinematics of the narrow-line low- and high-velocity components on the basis of our spectroscopy and adopted model structure.

  20. Statistical Analysis of an Infrared Thermography Inspection of Reinforced Carbon-Carbon

    NASA Technical Reports Server (NTRS)

    Comeaux, Kayla

    2011-01-01

    Each piece of flight hardware being used on the shuttle must be analyzed and pass NASA requirements before the shuttle is ready for launch. One tool used to detect cracks that lie within flight hardware is Infrared Flash Thermography. This is a non-destructive testing technique which uses an intense flash of light to heat up the surface of a material after which an Infrared camera is used to record the cooling of the material. Since cracks within the material obstruct the natural heat flow through the material, they are visible when viewing the data from the Infrared camera. We used Ecotherm, a software program, to collect data pertaining to the delaminations and analyzed the data using Ecotherm and University of Dayton Log Logistic Probability of Detection (POD) Software. The goal was to reproduce the statistical analysis produced by the University of Dayton software, by using scatter plots, log transforms, and residuals to test the assumption of normality for the residuals.
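    The log transforms and residual checks mentioned above follow the standard "a-hat versus a" POD workflow. A minimal sketch of that step, using hypothetical flaw sizes and responses rather than data from the actual inspection:

```python
import numpy as np

def pod_linear_fit(flaw_size, response):
    """Fit the linear model used in a-hat vs. a POD studies:
    log(response) = b0 + b1 * log(flaw_size) + error.
    The residuals are what would be checked for normality."""
    x = np.log(flaw_size)
    y = np.log(response)
    b1, b0 = np.polyfit(x, y, 1)       # slope, intercept
    residuals = y - (b0 + b1 * x)
    return b0, b1, residuals

# Hypothetical flaw sizes (mm) and thermographic response amplitudes.
sizes = np.array([1.0, 2.0, 4.0, 8.0])
resp = 3.0 * sizes ** 1.5              # noiseless power law for illustration
b0, b1, res = pod_linear_fit(sizes, resp)
# an exact power law gives slope 1.5, intercept log(3), residuals near zero
```

    On real inspection data, the residuals of this fit would be plotted (for example as a normal quantile plot) to test the normality assumption underlying the log-logistic POD model.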

  1. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

    The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised in the presence of curved CCD applications, in conjunction with large format cameras and curved mosaics, including three dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. 
Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than

  2. High speed Infrared imaging method for observation of the fast varying temperature phenomena

    NASA Astrophysics Data System (ADS)

    Moghadam, Reza; Alavi, Kambiz; Yuan, Baohong

    With recent improvements in high-end commercial R&D camera technologies, many challenges in high-speed IR camera imaging have been overcome. The core benefits of this technology are the ability to capture fast varying phenomena without image blur, to acquire enough data to properly characterize dynamic energy, and to increase the dynamic range without compromising the number of frames per second. This study presents a noninvasive method for determining the intensity field of a High Intensity Focused Ultrasound (HIFU) beam using infrared imaging. A high-speed infrared camera was placed above the tissue-mimicking material heated by the HIFU, with no other sensors present in the HIFU axial beam. A MATLAB simulation code was used to compute a finite-element solution of the pressure-wave propagation and heat equations within the phantom, from which the temperature rise in the phantom was predicted. Three different power levels of HIFU transducers were tested, and the predicted temperature increases were within about 25% of the IR measurements. The fundamental theory and methods developed in this research can be used to detect fast varying temperature phenomena in combination with infrared filters.
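    The record's MATLAB finite-element model is not reproduced here, but the heat-equation step it solves can be illustrated with a simple 1-D explicit finite-difference sketch. All material values and the source profile below are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def heat_rise_1d(source, alpha, dx, dt, steps):
    """Explicit finite-difference integration of
        dT/dt = alpha * d2T/dx2 + source(x),
    with insulated boundaries; `source` is the local heating rate in K/s.
    A 1-D stand-in for the finite-element model described in the record."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit exceeded"
    T = np.zeros_like(source)
    for _ in range(steps):
        lap = np.empty_like(T)
        lap[1:-1] = T[2:] - 2.0 * T[1:-1] + T[:-2]
        lap[0], lap[-1] = T[1] - T[0], T[-2] - T[-1]   # insulated ends
        T = T + r * lap + dt * source
    return T

# Focused heating at the centre of a thin phantom slice (hypothetical values):
src = np.zeros(21)
src[10] = 5.0                                   # K/s at the HIFU focus
T = heat_rise_1d(src, alpha=1.4e-7, dx=5e-5, dt=5e-3, steps=200)
# the peak temperature rise stays at the focus while conduction spreads it
```

    With insulated boundaries this scheme conserves the deposited heat, which is a quick sanity check on any such simulation.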

  3. Camera pose refinement by matching uncertain 3D building models with thermal infrared image sequences for high quality texture extraction

    NASA Astrophysics Data System (ADS)

    Iwaszczuk, Dorota; Stilla, Uwe

    2017-10-01

    Thermal infrared (TIR) images picture damaged and weak spots in the insulation of the building hull and are therefore widely used in thermal inspections of buildings. Such inspection of large-scale areas can be carried out by combining TIR imagery with 3D building models. This combination is achieved via texture mapping. Automating the texture mapping avoids time-consuming imaging and manual analysis of each face independently. It also provides a spatial reference for façade structures extracted from the thermal textures. In order to capture all faces, including roofs, façades, and façades in inner courtyards, an oblique-looking camera mounted on a flying platform is used. Direct geo-referencing is usually not sufficient for precise texture extraction. In addition, 3D building models themselves have uncertain geometry. In this paper, therefore, a methodology for co-registration of uncertain 3D building models with airborne oblique-view images is presented. For this purpose, a line-based model-to-image matching is developed in which the uncertainties of the 3D building model, as well as of the image features, are considered. Matched linear features are used for the refinement of the exterior orientation parameters of the camera in order to ensure optimal co-registration. Moreover, this study investigates whether line tracking through the image sequence supports the matching. The accuracy of the extraction and the quality of the textures are assessed using appropriate quality measures developed for this purpose. The tests showed good co-registration results, particularly in cases where tracking between neighboring frames had been applied.

  4. Geometric database maintenance using CCTV cameras and overlay graphics

    NASA Astrophysics Data System (ADS)

    Oxenberg, Sheldon C.; Landell, B. Patrick; Kan, Edwin

    1988-01-01

    An interactive graphics system using closed circuit television (CCTV) cameras for remote verification and maintenance of a geometric world model database has been demonstrated in GE's telerobotics testbed. The database provides geometric models and locations of objects viewed by CCTV cameras and manipulated by telerobots. To update the database, an operator uses the interactive graphics system to superimpose a wireframe line drawing of an object with known dimensions on a live video scene containing that object. The methodology used is multipoint positioning to easily superimpose a wireframe graphic on the CCTV image of an object in the work scene. An enhanced version of GE's interactive graphics system will provide the object designation function for the operator control station of the Jet Propulsion Laboratory's telerobot demonstration system.

  5. Infrared microscope inspection apparatus

    DOEpatents

    Forman, S.E.; Caunt, J.W.

    1985-02-26

    Apparatus and system for inspecting infrared transparents, such as an array of photovoltaic modules containing silicon solar cells, includes an infrared microscope, at least three sources of infrared light placed around and having their axes intersect the center of the object field and means for sending the reflected light through the microscope. The apparatus is adapted to be mounted on an X-Y translator positioned adjacent the object surface. 4 figs.

  6. Infrared microscope inspection apparatus

    DOEpatents

    Forman, Steven E.; Caunt, James W.

    1985-02-26

    Apparatus and system for inspecting infrared transparents, such as an array of photovoltaic modules containing silicon solar cells, includes an infrared microscope, at least three sources of infrared light placed around and having their axes intersect the center of the object field and means for sending the reflected light through the microscope. The apparatus is adapted to be mounted on an X-Y translator positioned adjacent the object surface.

  7. Venus Monitoring Camera (VMC/VEx) 1 micron emissivity and Magellan microwave properties of crater-related radar-dark parabolas and other terrains

    NASA Astrophysics Data System (ADS)

    Basilevsky, A. T.; Shalygina, O. S.; Bondarenko, N. V.; Shalygin, E. V.; Markiewicz, W. J.

    2017-09-01

    The aim of this work is a comparative study of several typical radar-dark parabolas, the neighboring plains and some other geologic units seen in the study areas, which include the craters Adivar, Bassi, Bathsheba, du Chatelet and Sitwell, at two depth scales: the upper several meters of the studied object, available through the Magellan-based microwave (12.6 cm wavelength) properties (microwave emissivity, Fresnel reflectivity, large-scale surface roughness, and radar cross-section), and the upper hundreds of microns of the object, characterized by the 1 micron emissivity resulting from analysis of the near-infrared (NIR) radiation of the night side of the Venusian surface measured by the Venus Monitoring Camera (VMC) on board Venus Express (VEx).

  8. Report Of The HST Strategy Panel: A Strategy For Recovery

    DTIC Science & Technology

    1991-01-01

    orbit change out: the Wide Field/Planetary Camera II (WFPC II), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Space ...are the Space Telescope Imaging Spectrograph (STIS), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS), and the second Wide Field and...expected to fail to lock due to duplicity was 20%; on-orbit data indicates that 10% may be a better estimate, but the guide stars were preselected

  9. Time-of-Flight Microwave Camera

    PubMed Central

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598
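    The quoted figures can be checked from first principles: FMCW down-range resolution is set by the swept bandwidth, and the 200 ps time resolution corresponds to the stated 6 cm free-space path. A small sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Down-range resolution of an FMCW radar: c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

def time_resolution_path(time_res_s):
    """Free-space optical path corresponding to a time resolution."""
    return C * time_res_s

# X-band sweep 8-12 GHz as in the record:
dr = range_resolution(4e9)             # about 3.7 cm depth resolution
path = time_resolution_path(200e-12)   # about 6 cm, matching the record
```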

  10. Intraoperative Near-Infrared Fluorescence Imaging using indocyanine green in colorectal carcinomatosis surgery: Proof of concept.

    PubMed

    Barabino, G; Klein, J P; Porcheron, J; Grichine, A; Coll, J-L; Cottier, M

    2016-12-01

    This study assesses the value of using Intraoperative Near Infrared Fluorescence Imaging and Indocyanine green to detect colorectal carcinomatosis during oncological surgery. In colorectal carcinomatosis cancer, two of the most important prognostic factors are completeness of staging and completeness of cytoreductive surgery. Presently, intraoperative assessment of tumoral margins relies on palpation and visual inspection. The recent introduction of Near Infrared fluorescence image guidance provides new opportunities for surgical roles, particularly in cancer surgery. The study was a non-randomized, monocentric, pilot "ex vivo" blinded clinical trial validated by the ethical committee of University Hospital of Saint Etienne. Ten patients with colorectal carcinomatosis cancer scheduled for cytoreductive surgery were included. Patients received 0.25 mg/kg of Indocyanine green intravenously 24 h before surgery. A Near Infrared camera was used to detect "ex-vivo" fluorescent lesions. There was no surgical mortality. Each analysis was done blindly. In a total of 88 lesions analyzed, 58 were classified by a pathologist as cancerous and 30 as non-cancerous. Among the 58 cancerous lesions, 42 were correctly classified by the Intraoperative Near-Infrared camera (sensitivity of 72.4%). Among the 30 non-cancerous lesions, 18 were correctly classified by the Intraoperative Near-Infrared camera (specificity of 60.0%). Near Infrared fluorescence imaging is a promising technique for intraoperative tumor identification. It could help the surgeon to determine resection margins and reduce the risk of locoregional recurrence. Copyright © 2016 Elsevier Ltd, BASO ~ the Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
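    The reported sensitivity and specificity follow directly from the lesion counts in the record:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts from the record: 42 of 58 cancerous and 18 of 30 non-cancerous
# lesions were correctly classified by the intraoperative NIR camera.
sens, spec = sensitivity_specificity(tp=42, fn=16, tn=18, fp=12)
# sens is about 0.724 (72.4 %); spec is 0.600 (60.0 %)
```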

  11. 3D Reconstruction of an Underwater Archaeological Site: Comparison Between Low Cost Cameras

    NASA Astrophysics Data System (ADS)

    Capra, A.; Dubbini, M.; Bertacchini, E.; Castagnetti, C.; Mancini, F.

    2015-04-01

    The 3D reconstruction with metric content of a submerged area, where objects and structures of archaeological interest are found, can play an important role in research and study activities and even in the digitization of cultural heritage. The 3D reconstruction of objects of interest to archaeologists constitutes a starting point for the classification and description of objects in digital format and for subsequent fruition by users after delivery through several media. The starting point is a metric evaluation of the site obtained with photogrammetric surveying and appropriate 3D restitution. The authors have been applying the underwater photogrammetric technique for several years using underwater digital cameras and, in this paper, digital low cost (off-the-shelf) cameras. Results of tests made on submerged objects with three cameras are presented: Canon PowerShot G12, Intova Sport HD and GoPro HERO 2. The experimentation had the goal of evaluating the precision of self-calibration procedures, essential for multimedia underwater photogrammetry, and of analyzing the quality of the 3D restitution. The precision obtained in the calibration and orientation procedures was assessed using the three cameras and a homogeneous set of control points. Data were processed with Agisoft PhotoScan. Subsequently, 3D models were created and the models derived from the different cameras were compared. The different potentialities of the cameras are reported in the discussion section. The 3D restitution of objects and structures was integrated with the sea bottom morphology in order to achieve a comprehensive description of the site. A possible methodology for the survey and representation of submerged objects is therefore illustrated, considering both an automatic and a semi-automatic approach.

  12. Lessons Learned from Crime Caught on Camera

    PubMed Central

    Bernasco, Wim

    2018-01-01

    Objectives: The widespread use of camera surveillance in public places offers criminologists the opportunity to systematically and unobtrusively observe crime, their main subject matter. The purpose of this essay is to inform the reader of current developments in research on crimes caught on camera. Methods: We address the importance of direct observation of behavior and review criminological studies that used observational methods, with and without cameras, including the ones published in this issue. We also discuss the uses of camera recordings in other social sciences and in biology. Results: We formulate six key insights that emerge from the literature and make recommendations for future research. Conclusions: Camera recordings of real-life crime are likely to become part of the criminological tool kit that will help us better understand the situational and interactional elements of crime. Like any source, it has limitations that are best addressed by triangulation with other sources. PMID:29472728

  13. Bispectral infrared forest fire detection and analysis using classification techniques

    NASA Astrophysics Data System (ADS)

    Aranda, Jose M.; Melendez, Juan; de Castro, Antonio J.; Lopez, Fernando

    2004-01-01

    Infrared cameras are well established as a useful tool for fire detection, but their use for quantitative forest fire measurements faces difficulties due to the complex spatial and spectral structure of fires. In this work it is shown that some of these difficulties can be overcome by applying classification techniques, a standard tool for the analysis of satellite multispectral images, to bi-spectral images of fires. Images were acquired by two cameras that operate in the medium infrared (MIR) and thermal infrared (TIR) bands. They provide simultaneous and co-registered images, calibrated in brightness temperatures. The MIR-TIR scatterplot of these images can be used to classify the scene into different fire regions (background, ashes, and several ember and flame regions). It is shown that classification makes it possible to obtain quantitative measurements of physical fire parameters such as rate of spread, ember temperature, and radiated power in the MIR and TIR bands. An estimation of total radiated power and heat release per unit area is also made and compared with values derived from heat of combustion and fuel consumption.
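    The record does not specify which classifier was applied to the MIR-TIR scatterplot; as one illustrative stand-in, a minimum-distance classifier in MIR-TIR brightness-temperature space (with entirely hypothetical class centres) could be sketched as:

```python
import numpy as np

# Hypothetical class centres in (MIR, TIR) brightness temperature space (K).
CENTRES = {
    "background": np.array([300.0, 295.0]),
    "ashes":      np.array([400.0, 380.0]),
    "embers":     np.array([800.0, 700.0]),
    "flames":     np.array([600.0, 450.0]),
}

def classify(pixels):
    """Assign each (MIR, TIR) pixel to the nearest class centre."""
    names = list(CENTRES)
    c = np.stack([CENTRES[n] for n in names])                 # (k, 2)
    d = np.linalg.norm(pixels[:, None, :] - c[None], axis=2)  # (n, k)
    return [names[i] for i in d.argmin(axis=1)]

labels = classify(np.array([[301.0, 296.0], [805.0, 690.0]]))
# the first pixel falls near the background centre, the second near embers
```

    Real fire-image work would train such centres (or a supervised classifier) from labelled regions of the co-registered image pair.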

  14. The MVACS Robotic Arm Camera

    NASA Astrophysics Data System (ADS)

    Keller, H. U.; Hartwig, H.; Kramm, R.; Koschny, D.; Markiewicz, W. J.; Thomas, N.; Fernades, M.; Smith, P. H.; Reynolds, R.; Lemmon, M. T.; Weinberg, J.; Marcialis, R.; Tanner, R.; Boss, B. J.; Oquest, C.; Paige, D. A.

    2001-08-01

    The Robotic Arm Camera (RAC) is one of the key instruments newly developed for the Mars Volatiles and Climate Surveyor payload of the Mars Polar Lander. This lightweight instrument employs a front lens with variable focus range and takes images at distances from 11 mm (image scale 1:1) to infinity. Color images with a resolution of better than 50 μm can be obtained to characterize the Martian soil. Spectral information of nearby objects is retrieved through illumination with blue, green, and red lamp sets. The design and performance of the camera are described in relation to the science objectives and operation. The RAC uses the same CCD detector array as the Surface Stereo Imager and shares the readout electronics with this camera. The RAC is mounted at the wrist of the Robotic Arm and can characterize the contents of the scoop, the samples of soil fed to the Thermal Evolved Gas Analyzer, the Martian surface in the vicinity of the lander, and the interior of trenches dug out by the Robotic Arm. It can also be used to take panoramic images and to retrieve stereo information with an effective baseline surpassing that of the Surface Stereo Imager by about a factor of 3.

  15. Mid-infrared interferometry towards the massive young stellar object CRL 2136: inside the dust rim

    NASA Astrophysics Data System (ADS)

    de Wit, W. J.; Hoare, M. G.; Oudmaijer, R. D.; Nürnberger, D. E. A.; Wheelwright, H. E.; Lumsden, S. L.

    2011-02-01

    Context. Establishing the importance of circumstellar disks and their properties is crucial to fully understand massive star formation. Aims: We aim to spatially resolve the various components that make up the accretion environment of a massive young stellar object (⪉100 AU), and to reproduce the emission from near-infrared to millimeter wavelengths using radiative transfer codes. Methods: We apply mid-infrared spectro-interferometry to the massive young stellar object CRL 2136. The observations were performed with the Very Large Telescope Interferometer and the MIDI instrument at a 42 m baseline probing angular scales of 50 milli-arcseconds. We model the observed visibilities in parallel with diffraction-limited images at both 24.5 μm and in the N-band (with resolutions of 0.6″ and 0.3″, respectively), as well as the spectral energy distribution. Results: The arcsec-scale spatial information reveals the well-resolved emission from the dusty envelope. By simultaneously modelling the spatial and spectral data, we find that the bulk of the dust emission occurs at several dust sublimation radii (approximately 170 AU). This reproduces the high mid-infrared fluxes and at the same time the low visibilities observed in the MIDI data for wavelengths longward of 8.5 μm. However, shortward of this wavelength the visibility data show a sharp up-turn indicative of compact emission. We discuss various potential sources of this emission. We exclude a dust disk being responsible for the observed spectral imprint on the visibilities. A cool supergiant star and an accretion disk are considered and both shown to be viable origins of the compact mid-infrared emission. Conclusions: We propose that CRL 2136 is embedded in a dusty envelope, which truncates at several times the dust sublimation radius. A dust torus is manifest in the equatorial region. We find that the spectro-interferometric N-band signal can be reproduced by either a gaseous disk or a bloated central star. If the

  16. Low Cost and Efficient 3d Indoor Mapping Using Multiple Consumer Rgb-D Cameras

    NASA Astrophysics Data System (ADS)

    Chen, C.; Yang, B. S.; Song, S.

    2016-06-01

    Driven by the miniaturization and lightweight design of positioning and remote sensing sensors, as well as the urgent need to fuse indoor and outdoor maps for next generation navigation, 3D indoor mapping from mobile scanning is a hot research and application topic. The point clouds with auxiliary data such as colour and infrared images derived from a 3D indoor mobile mapping suite can be used in a variety of novel applications, including indoor scene visualization, automated floorplan generation, gaming, reverse engineering, navigation, simulation, etc. State-of-the-art 3D indoor mapping systems equipped with multiple laser scanners produce accurate point clouds of building interiors containing billions of points. However, these laser scanner based systems are mostly expensive and not portable. Low cost consumer RGB-D cameras provide an alternative way to solve the core challenge of indoor mapping, that is, capturing the detailed underlying geometry of building interiors. Nevertheless, RGB-D cameras have a very limited field of view, resulting in low efficiency in the data collection stage and incomplete datasets missing major building structures (e.g. ceilings, walls). Attempting to collect a complete scene without data gaps using a single RGB-D camera is not technically sound because of the large amount of human labour involved and the number of position parameters to be solved. To find an efficient and low cost way to perform 3D indoor mapping, in this paper we present an indoor mapping suite prototype built upon a novel calibration method which calibrates the internal and external parameters of multiple RGB-D cameras. Three Kinect sensors are mounted on a rig with different view directions to form a large field of view. 
The calibration procedure is threefold: (1) the internal parameters of the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern; (2) the external parameters between the colour and infrared camera inside each
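    Step 2 of the calibration (external parameters between the colour and infrared camera) reduces, once each camera's pose with respect to the chessboard is known, to composing the two board-to-camera transforms. A sketch of that composition, not the authors' implementation:

```python
import numpy as np

def relative_pose(R_c, t_c, R_i, t_i):
    """Given the board-to-camera pose (R, t) of the colour camera and of
    the infrared camera, return the transform mapping colour-camera
    coordinates to infrared-camera coordinates: X_i = R_rel @ X_c + t_rel."""
    R_rel = R_i @ R_c.T
    t_rel = t_i - R_rel @ t_c
    return R_rel, t_rel

# Example: IR camera rotated 90 deg about z and shifted relative to the board.
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
R_c, t_c = np.eye(3), np.array([0.0, 0.0, 1.0])
R_i, t_i = Rz, np.array([0.1, 0.0, 1.0])
R_rel, t_rel = relative_pose(R_c, t_c, R_i, t_i)
```

    In practice the per-camera poses would come from a chessboard pose-estimation routine, and the relative transform would be averaged or refined over many board views.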

  17. Bridge deck surface temperature monitoring by infrared thermography and inner structure identification using PPT and PCT analysis methods

    NASA Astrophysics Data System (ADS)

    Dumoulin, Jean

    2013-04-01

    One of the objectives of the ISTIMES project was to evaluate the potential offered by the integration of different electromagnetic techniques able to perform non-invasive diagnostics for surveillance and monitoring of transport infrastructures. Among the EM methods investigated, we focused our research and development efforts on uncooled infrared camera techniques due to their promising level of dissemination linked to their relatively low cost on the market. In parallel, work was carried out to identify well-adapted implementation protocols and the key limits of the Pulse Phase Thermography (PPT) and Principal Component Thermography (PCT) processing methods used to analyse thermal image sequences and retrieve information about the inner structure. The first part of this research therefore addresses infrared thermography measurement used in quantitative mode (outside laboratory conditions) rather than in qualitative mode (vision applied to survey). In such a context, thermal radiative corrections must be applied in real time to the raw acquired data to take into account the influence of the natural environment evolving with time, thanks to additional measurements. The camera sensor therefore has to be smart enough to apply the calibration law and radiometric corrections in real time in a varying atmosphere. A complete measurement system was accordingly studied and developed [1] with low cost infrared cameras available on the market. In the system developed, the infrared camera is coupled with other sensors to feed simplified radiative models running, in real time, on the GPU of a small PC. The whole measurement system was implemented on the "Musmeci" bridge located in Potenza (Italy). No traffic interruption was required during the mounting of our measurement system. The infrared camera was fixed on top of a mast at 6 m elevation from the surface of the bridge deck. A small weather station was added on the same mast 1 m under the camera. 
A GPS antenna was also fixed at the
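    The real-time radiative corrections mentioned in the record are not specified in detail; a common single-band correction, shown here as a simplified sketch that neglects atmospheric transmission, removes the reflected ambient component given the surface emissivity:

```python
def object_radiance(measured, ambient, emissivity):
    """Single-band radiometric correction. The camera measures
        W = eps * W_obj + (1 - eps) * W_amb
    (object emission plus reflected surroundings), so the object-leaving
    radiance is recovered by inverting that mixture. Atmospheric
    transmission is neglected in this sketch."""
    return (measured - (1.0 - emissivity) * ambient) / emissivity

# A grey surface (eps = 0.9): 100 units measured, 20 units ambient radiance.
w_obj = object_radiance(measured=100.0, ambient=20.0, emissivity=0.9)
```

    A full pipeline would also divide out the atmospheric transmission and subtract the path radiance, both estimated from the weather-station data.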

  18. Daylight coloring for monochrome infrared imagery

    NASA Astrophysics Data System (ADS)

    Gabura, James

    2015-05-01

    The effectiveness of infrared imagery in poor visibility situations is well established, and the range of applications is expanding as we enter a new era of inexpensive thermal imagers for mobile phones. However, the counterintuitive reflectance characteristics of various common scene elements can cause slowed reaction times and impaired situational awareness, consequences that can be especially detrimental in emergency situations. While multiband infrared sensors can be used, they are inherently more costly. Here we propose a technique for adding a daylight color appearance to single band infrared images, using the normally overlooked property of local image texture. The simple method described here is illustrated with colorized images from the visible red and long wave infrared bands. Our colorizing process not only imparts a natural daylight appearance to infrared images but also enhances the contrast and visibility of otherwise obscure detail. We anticipate that this colorizing method will lead to a better user experience, faster reaction times and improved situational awareness for a growing community of infrared camera users. A natural extension of our process could expand upon its texture discerning feature by adding specialized filters for discriminating specific targets.

  19. Real-time implementation of camera positioning algorithm based on FPGA & SOPC

    NASA Astrophysics Data System (ADS)

    Yang, Mingcao; Qiu, Yuehong

    2014-09-01

    In recent years, with the development of positioning algorithms and FPGAs, real-time, rapid, and accurate camera positioning implemented on an FPGA has become possible. Through an in-depth study of embedded hardware and dual-camera positioning systems, this thesis sets up an infrared optical positioning system based on an FPGA and an SOPC system, which enables real-time positioning of marker points in space. The completed work includes: (1) a CMOS sensor is used to extract the pixels of three target points, implemented through an FPGA hardware driver; visible-light LEDs are used as the target points of the instrument. (2) Prior to extraction of the feature-point coordinates, the image is filtered (median filtering is used here) to avoid physical properties of the platform affecting the system. (3) Marker-point coordinates are extracted by the FPGA hardware circuit, with a new iterative threshold selection method used for image segmentation. The segmented binary image is then labelled, and the coordinates of the feature points are calculated by the centre-of-gravity method. (4) The direct linear transformation (DLT) and epipolar constraint methods are applied to three-dimensional reconstruction of space coordinates from the planar-array CMOS system. An SOPC system-on-a-chip is used, taking advantage of its dual-core computing system to run matching and coordinate operations separately, thus increasing processing speed.
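    The iterative threshold selection and centre-of-gravity steps described in this record can be sketched as follows (a generic ISODATA-style iteration, not necessarily the thesis's exact method):

```python
import numpy as np

def isodata_threshold(img, eps=0.5):
    """Iterative (ISODATA-style) threshold selection: start from the image
    mean, then repeatedly set the threshold to the midpoint of the two
    class means until it converges."""
    t = img.mean()
    while True:
        lo, hi = img[img <= t], img[img > t]
        t_new = 0.5 * (lo.mean() + hi.mean())
        if abs(t_new - t) < eps:
            return t_new
        t = t_new

def centroid(binary):
    """Centre of gravity of the foreground pixels, as (row, col)."""
    rows, cols = np.nonzero(binary)
    return rows.mean(), cols.mean()

img = np.zeros((9, 9))
img[3:6, 4:7] = 200.0                 # one bright LED marker blob
t = isodata_threshold(img)
r, c = centroid(img > t)
# blob spans rows 3-5 and cols 4-6, so the centroid is (4.0, 5.0)
```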

  20. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry

    NASA Astrophysics Data System (ADS)

    Zeller, Niclas; Quint, Franz; Stilla, Uwe

    2016-08-01

    This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo problem. To reduce the complexity, we divide this into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes: the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model based methods, which define the relation of the virtual depth, which has been estimated based on the light-field image, and the metric object distance. These two methods are compared to a well known curve fitting approach. Both model based methods show significant advantages compared to the curve fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. 
These images can be synthesized totally focused and thus finding stereo correspondences is enhanced
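    The Kalman-like update of the probabilistic depth map described above amounts to inverse-variance weighting of independent estimates; a minimal sketch:

```python
def fuse_depth(z1, var1, z2, var2):
    """Kalman-like fusion of two independent depth estimates:
    inverse-variance weighting, as used to merge virtual depths
    obtained from different micro-images."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return z, var

z, var = fuse_depth(2.0, 0.04, 2.2, 0.04)
# equal variances: the fused depth is the simple average 2.1,
# and the variance is halved to 0.02
```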

  1. Strategies for Characterizing the Sensory Environment: Objective and Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8205 ● NOV 2017. US Army Research Laboratory. Strategies for Characterizing the Sensory Environment: Objective and Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera. By Joseph McArdle, Ashley Foots, Chris Stachowiak, and ...

  2. A direct-view customer-oriented digital holographic camera

    NASA Astrophysics Data System (ADS)

    Besaga, Vira R.; Gerhardt, Nils C.; Maksimyak, Peter P.; Hofmann, Martin R.

    2018-01-01

    In this paper, we propose a direct-view digital holographic camera system consisting mostly of customer-oriented components. The camera system is based on standard photographic units such as camera sensor and objective and is adapted to operate under off-axis external white-light illumination. The common-path geometry of the holographic module of the system ensures direct-view operation. The system can operate in both self-reference and self-interference modes. As a proof of system operability, we present reconstructed amplitude and phase information of a test sample.

  3. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first shows the intra-camera geometry estimation that leads to an estimate of the tilt angle, focal length and camera height, which is important for the conversion from pixels to meters and vice versa. The second component shows the inter-camera topology inference that leads to an estimate of the distance between cameras, which is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
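
    The pixel-to-meter conversion enabled by the estimated tilt angle, focal length, and camera height can be sketched under a flat-ground assumption (names and values are illustrative, not from the paper):

    ```python
    import math

    def pixel_to_ground_distance(v_px, f_px, tilt_rad, cam_height_m):
        """Horizontal ground distance to the point imaged v_px below the
        principal point, for a camera tilted tilt_rad below the horizontal
        and mounted cam_height_m above a flat ground plane."""
        ray_angle = tilt_rad + math.atan2(v_px, f_px)  # angle below horizontal
        if ray_angle <= 0:
            return math.inf  # ray points at or above the horizon
        return cam_height_m / math.tan(ray_angle)

    # illustrative values: 20 deg tilt, 1000 px focal length, 4 m mounting height
    d = pixel_to_ground_distance(200, 1000, math.radians(20), 4.0)
    ```

    Pixels lower in the image (larger v_px) map to shorter ground distances, which is what makes the pixels-to-meters conversion and its inverse possible once tilt, focal length, and height are calibrated.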

  4. A Comparison of Techniques for Camera Selection and Hand-Off in a Video Network

    NASA Astrophysics Data System (ADS)

    Li, Yiming; Bhanu, Bir

    Video networks are becoming increasingly important for solving many real-world problems. Multiple video sensors require collaboration when performing various tasks. One of the most basic tasks is the tracking of objects, which requires mechanisms to select a camera for a certain object and hand-off this object from one camera to another so as to accomplish seamless tracking. In this chapter, we provide a comprehensive comparison of current and emerging camera selection and hand-off techniques. We consider geometry-, statistics-, and game theory-based approaches and provide both theoretical and experimental comparison using centralized and distributed computational models. We provide simulation and experimental results using real data for various scenarios of a large number of cameras and objects for in-depth understanding of strengths and weaknesses of these techniques.

  5. Analysis of Infrared Signature Variation and Robust Filter-Based Supersonic Target Detection

    PubMed Central

    Sun, Sun-Gu; Kim, Kyung-Tae

    2014-01-01

    The difficulty of small infrared target detection originates from the variations of infrared signatures. This paper presents the fundamental physics of infrared target variations and reports the results of variation analysis of infrared images acquired using a long wave infrared camera over a 24-hour period for different types of backgrounds. The detection parameters, such as signal-to-clutter ratio were compared according to the recording time, temperature and humidity. Through variation analysis, robust target detection methodologies are derived by controlling thresholds and designing a temporal contrast filter to achieve high detection rate and low false alarm rate. Experimental results validate the robustness of the proposed scheme by applying it to the synthetic and real infrared sequences. PMID:24672290
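
    The signal-to-clutter ratio used as a detection parameter is conventionally the target/background mean difference over the background standard deviation; a minimal sketch on a synthetic patch (values hypothetical):

    ```python
    import numpy as np

    def signal_to_clutter_ratio(patch, target_mask):
        """SCR = |mean(target) - mean(background)| / std(background)."""
        target = patch[target_mask]
        background = patch[~target_mask]
        return abs(target.mean() - background.mean()) / background.std()

    # synthetic 3x3 patch: bright point target on a textured background
    patch = np.array([[10.0, 12.0, 10.0],
                      [12.0, 30.0, 12.0],
                      [10.0, 12.0, 10.0]])
    mask = np.zeros((3, 3), dtype=bool)
    mask[1, 1] = True
    scr = signal_to_clutter_ratio(patch, mask)  # -> 19.0
    ```

    Tracking how this ratio changes with recording time, temperature, and humidity is what motivates the adaptive thresholds described in the abstract.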

  6. A dual-band adaptor for infrared imaging.

    PubMed

    McLean, A G; Ahn, J-W; Maingi, R; Gray, T K; Roquemore, A L

    2012-05-01

    A novel imaging adaptor providing the capability to extend a standard single-band infrared (IR) camera into a two-color or dual-band device has been developed for application to high-speed IR thermography on the National Spherical Tokamak Experiment (NSTX). Temperature measurement with two-band infrared imaging has the advantage of being mostly independent of surface emissivity, which may vary significantly in the liquid lithium divertor installed on NSTX as compared to that of an all-carbon first wall. In order to take advantage of the high-speed capability of the existing IR camera at NSTX (1.6-6.2 kHz frame rate), a commercial visible-range optical splitter was extensively modified to operate in the medium wavelength and long wavelength IR. This two-band IR adapter utilizes a dichroic beamsplitter, which reflects 4-6 μm wavelengths and transmits 7-10 μm wavelength radiation, each with >95% efficiency and projects each IR channel image side-by-side on the camera's detector. Cutoff filters are used in each IR channel, and ZnSe imaging optics and mirrors optimized for broadband IR use are incorporated into the design. In-situ and ex-situ temperature calibration and preliminary data of the NSTX divertor during plasma discharges are presented, with contrasting results for dual-band vs. single-band IR operation.
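
    The emissivity independence of two-band thermography can be illustrated with the Wien approximation to Planck's law: a gray-body emissivity cancels in the band ratio. A sketch under that assumption (representative wavelengths, not the NSTX calibration):

    ```python
    import math

    C2 = 1.4388e-2  # second radiation constant, m*K

    def wien_radiance(lam, T, emissivity=1.0):
        """Spectral radiance in the Wien approximation (relative units)."""
        return emissivity * lam**-5 * math.exp(-C2 / (lam * T))

    def ratio_temperature(L1, L2, lam1, lam2):
        """Invert the two-band radiance ratio for temperature, assuming the
        (gray-body) emissivity is equal in both bands and cancels."""
        return C2 * (1/lam1 - 1/lam2) / (5*math.log(lam2/lam1) - math.log(L1/L2))

    # round trip: the emissivity of 0.3 drops out of the recovered temperature
    lam1, lam2 = 5e-6, 9e-6       # representative MWIR / LWIR wavelengths
    L1 = wien_radiance(lam1, 600.0, emissivity=0.3)
    L2 = wien_radiance(lam2, 600.0, emissivity=0.3)
    T = ratio_temperature(L1, L2, lam1, lam2)
    ```

    This is why the two-band measurement is "mostly independent of surface emissivity": the gray-body assumption holds only approximately for real surfaces such as liquid lithium, hence the in-situ calibration reported in the paper.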

  7. FIRST RESULTS FROM THE RAPID-RESPONSE SPECTROPHOTOMETRIC CHARACTERIZATION OF NEAR-EARTH OBJECTS USING UKIRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mommert, M.; Trilling, D. E.; Petersen, E.

    2016-04-15

    Using the Wide Field Camera for the United Kingdom Infrared Telescope (UKIRT), we measure the near-infrared colors of near-Earth objects (NEOs) in order to put constraints on their taxonomic classifications. The rapid-response character of our observations allows us to observe NEOs when they are close to the Earth and bright. Here we present near-infrared color measurements of 86 NEOs, most of which were observed within a few days of their discovery, allowing us to characterize NEOs with diameters of only a few meters. Using machine-learning methods, we compare our measurements to existing asteroid spectral data and provide probabilistic taxonomic classifications for our targets. Our observations allow us to distinguish between S-complex, C/X-complex, D-type, and V-type asteroids. Our results suggest that the fraction of S-complex asteroids in the whole NEO population is lower than the fraction of ordinary chondrites in the meteorite fall statistics. Future data obtained with UKIRT will be used to investigate the significance of this discrepancy.

  8. Omega Centauri Looks Radiant in Infrared

    NASA Technical Reports Server (NTRS)

    2008-01-01


    A cluster brimming with millions of stars glistens like an iridescent opal in this image from NASA's Spitzer Space Telescope. Called Omega Centauri, the sparkling orb of stars is like a miniature galaxy. It is the biggest and brightest of the 150 or so similar objects, called globular clusters, that orbit around the outside of our Milky Way galaxy. Stargazers at southern latitudes can spot the stellar gem with the naked eye in the constellation Centaurus.

    Globular clusters are some of the oldest objects in our universe. Their stars are over 12 billion years old, and, in most cases, formed all at once when the universe was just a toddler. Omega Centauri is unusual in that its stars are of different ages and possess varying levels of metals, or elements heavier than boron. Astronomers say this points to a different origin for Omega Centauri than other globular clusters: they think it might be the core of a dwarf galaxy that was ripped apart and absorbed by our Milky Way long ago.

    In this new view of Omega Centauri, Spitzer's infrared observations have been combined with visible-light data from the National Science Foundation's Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory in Chile. Visible-light data with a wavelength of 0.55 microns is colored blue, 3.6-micron infrared light captured by Spitzer's infrared array camera is colored green and 24-micron infrared light taken by Spitzer's multiband imaging photometer is colored red.

    Where green and red overlap, the color yellow appears. Thus, the yellow and red dots are stars revealed by Spitzer. These stars, called red giants, are more evolved, larger and dustier. The stars that appear blue were spotted in both visible and 3.6-micron-, or near-, infrared light. They are less evolved, like our own sun. Some of the red spots in the picture are distant galaxies beyond our own.

    Spitzer found very little dust

  9. Thermal signature analysis of human face during jogging activity using infrared thermography technique

    NASA Astrophysics Data System (ADS)

    Budiarti, Putria W.; Kusumawardhani, Apriani; Setijono, Heru

    2016-11-01

    Thermal imaging has been widely used for many applications. A thermal camera measures an object's surface temperature from the infrared radiation the object emits; any object above absolute zero radiates in the infrared. A thermal image is a false-color map representing temperature. The human body is one such emitter, and its infrared radiation varies with the activity being performed. Since jogging is a common physical activity, this experiment investigates the thermal signature profile of the human body during jogging, especially in the face. The results show a significant temperature increase of 7.5% in the periorbital area, near the eyes and the forehead. The graphical temperature distributions show that across all regions (eyes, nose, cheeks, and chin) the pixel area at 28.5 - 30.2°C remains roughly constant, since this is the surrounding temperature. The pixel area at 30.2 - 34.7°C tends to increase, while the pixel area at 34.7 - 37.1°C tends to decrease, because pixels at 34.7 - 37.1°C shift into the 30.2 - 34.7°C range after jogging, increasing that band's pixel area. The trendline over the 10-minute jogging period likewise shows increasing temperature. Results vary from person to person owing to individual physiology, such as sweat production during physical activity.
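
    The pixel-area-per-temperature-band bookkeeping described above can be sketched as a simple mask count (the bands are those reported in the abstract; the tiny image is hypothetical):

    ```python
    import numpy as np

    # temperature bands (deg C) reported in the abstract
    BANDS = [(28.5, 30.2), (30.2, 34.7), (34.7, 37.1)]

    def pixel_area_per_band(temp_map, bands=BANDS):
        """Count the pixels of a thermal image falling in each band."""
        return [int(np.count_nonzero((temp_map >= lo) & (temp_map < hi)))
                for lo, hi in bands]

    temp_map = np.array([[29.0, 31.0, 36.0],
                         [33.0, 35.0, 29.5]])
    counts = pixel_area_per_band(temp_map)  # -> [2, 2, 2]
    ```

    Comparing these counts before and after jogging reproduces the shifts between bands that the abstract describes.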

  10. SPITZER OBSERVATIONS OF LONG-TERM INFRARED VARIABILITY AMONG YOUNG STELLAR OBJECTS IN CHAMAELEON I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaherty, Kevin M.; Herbst, William; DeMarchi, Lindsay

    Infrared variability is common among young stellar objects, with surveys finding daily to weekly fluctuations of a few tenths of a magnitude. Space-based observations can produce highly sampled infrared light curves, but are often limited to total baselines of about 1 month due to the orientation of the spacecraft. Here we present observations of the Chamaeleon I cluster, whose low declination makes it observable by the Spitzer Space Telescope over a 200-day period. We observe 30 young stellar objects with a daily cadence to better sample variability on timescales of months. We find that such variability is common, occurring in ∼80% of the detected cluster members. The change in [3.6]–[4.5] color over 200 days for many of the sources falls between that expected for extinction and fluctuations in disk emission. With our high cadence and long baseline we can derive power spectral density curves covering two orders of magnitude in frequency and find significant power at low frequencies, up to the boundaries of our 200-day survey. Such long timescales are difficult to explain with variations driven by the interaction between the disk and stellar magnetic field, which has a dynamical timescale of days to weeks. The most likely explanation is either structural or temperature fluctuations spread throughout the inner ∼0.5 au of the disk, suggesting that the intrinsic dust structure is highly dynamic.

  11. The Orbiter camera payload system's large-format camera and attitude reference system

    NASA Technical Reports Server (NTRS)

    Schardt, B. B.; Mollberg, B. H.

    1985-01-01

    The Orbiter camera payload system (OCPS) is an integrated photographic system carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a large-format camera (LFC), a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. A secondary and supporting system to the LFC is the attitude reference system (ARS), a dual-lens stellar camera array (SCA) and camera support structure. The SCA is a 70 mm film system that is rigidly mounted to the LFC lens support structure and, through the simultaneous acquisition of two star fields with each earth viewing LFC frame, makes it possible to precisely determine the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high-precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment. The full OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on Oct. 5, 1984, as a major payload aboard the STS-41G mission.

  12. Built-in hyperspectral camera for smartphone in visible, near-infrared and middle-infrared lights region (third report): spectroscopic imaging for broad-area and real-time componential analysis system against local unexpected terrorism and disasters

    NASA Astrophysics Data System (ADS)

    Hosono, Satsuki; Kawashima, Natsumi; Wollherr, Dirk; Ishimaru, Ichiro

    2016-05-01

    Distributed networks that collect chemical-component information with high-mobility platforms, such as drones or smartphones, can work effectively for investigating, clarifying and predicting unexpected local terrorism and disasters such as localized torrential downpours. We previously proposed and reported a spectroscopic line-imager for smartphones at this conference. In this paper, we describe wide-area spectroscopic-image construction by estimating 6 DOF (Degrees Of Freedom: translations x, y, z and rotations θx, θy, θz) from line data, in order to observe and analyze the surrounding chemical environment. Recently, smartphone videos taken by people who happened to be at the scene have proved effective for analyzing what kind of phenomenon occurred there. But when a gas tank suddenly blew up, visible-light RGB cameras could not reveal which chemical gas components were polluting the surrounding atmosphere. Fourier spectroscopy is well established for chemical-component analysis in laboratory use, but volatile gases should be analyzed promptly at the accident site. Because humidity absorption in the near- and mid-infrared is highly sensitive, we expect to detect humidity in the sky from wide-field spectroscopic images. Also, 6-DOF sensors are now readily available for estimating the position and attitude of a UAV (Unmanned Aerial Vehicle) or smartphone. For observing long-distance views, however, the accuracy of these angle measurements is insufficient for merging line data, because angular errors are amplified over distance (the leverage effect). Thus, by searching for corresponding pixels between line spectroscopic images, we aim to estimate the 6 DOF with high accuracy.

  13. Infrared

    NASA Astrophysics Data System (ADS)

    Vollmer, M.

    2013-11-01

    underlying physics. There are now at least six different disciplines that deal with infrared radiation in one form or another, and in one or several different spectral portions of the whole IR range. These are spectroscopy, astronomy, thermal imaging, detector and source development and metrology, as well as the field of optical data transmission. Scientists working in these fields range from chemists and astronomers through to physicists and even photographers. This issue presents examples from some of these fields. All the papers, though some of them deal with fundamental or applied research, include interesting elements that make them directly applicable to university-level teaching at the graduate or postgraduate level. Source (e.g. quantum cascade lasers) and detector development (e.g. multispectral sensors), as well as metrology issues and optical data transmission, are omitted since they belong to fundamental research journals. Using a more-or-less arbitrary order according to wavelength range, the issue starts with a paper on the physics of near-infrared photography using consumer product cameras in the spectral range from 800 nm to 1.1 µm [1]. It is followed by a series of three papers dealing with IR imaging in spectral ranges from 3 to 14 µm [2-4]. One of them deals with laboratory courses that may help to characterize the IR camera response [2], the second discusses potential applications for nondestructive testing techniques [3] and the third gives an example of how IR thermal imaging may be used to understand cloud cover of the Earth [4], which is the prerequisite for successful climate modelling. The next two papers cover the vast field of IR spectroscopy [5, 6]. The first of these deals with Fourier transform infrared spectroscopy in the spectral range from 2.5 to 25 µm, studying e.g. ro-vibrational excitations in gases or optical phonon interactions within solids [5]. The second deals mostly with the spectroscopy of liquids such as biofuels and special

  14. Sky camera geometric calibration using solar observations

    DOE PAGES

    Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan

    2016-09-05

    A camera model and an associated automated calibration procedure for stationary daytime sky-imaging cameras are presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production, where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. In conclusion, calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
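
    The equisolid-angle fisheye projection targeted by the calibration maps a ray at angle θ from the optical axis to a radial image distance r = 2f·sin(θ/2). A minimal sketch of the forward and inverse mapping (the focal length value is illustrative):

    ```python
    import math

    def equisolid_project(theta_rad, f_px):
        """Equisolid-angle fisheye: radial image distance r = 2 f sin(theta/2)."""
        return 2.0 * f_px * math.sin(theta_rad / 2.0)

    def equisolid_unproject(r_px, f_px):
        """Inverse mapping: ray angle theta = 2 asin(r / (2 f))."""
        return 2.0 * math.asin(r_px / (2.0 * f_px))

    f_px = 600.0                                   # illustrative focal length
    r = equisolid_project(math.radians(90), f_px)  # ray at the horizon
    theta = equisolid_unproject(r, f_px)           # round-trips to pi/2
    ```

    Calibration then amounts to fitting f_px (plus the lens orientation and distortion terms of the full model) so that the projected sun positions match the detected ones.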

  15. Multi-Touch Tabletop System Using Infrared Image Recognition for User Position Identification

    PubMed Central

    Suto, Shota; Watanabe, Toshiya; Shibusawa, Susumu; Kamada, Masaru

    2018-01-01

    A tabletop system can facilitate multi-user collaboration in a variety of settings, including small meetings, group work, and education and training exercises. The ability to identify the users touching the table and their positions can promote collaborative work among participants, so methods have been studied that involve attaching sensors to the table, chairs, or to the users themselves. An effective method of recognizing user actions without placing a burden on the user would be some type of visual process, so the development of a method that processes multi-touch gestures by visual means is desired. This paper describes the development of a multi-touch tabletop system using infrared image recognition for user position identification and presents the results of touch-gesture recognition experiments and a system-usability evaluation. Using an inexpensive FTIR touch panel and infrared light, this system picks up the touch areas and the shadow area of the user’s hand by an infrared camera to establish an association between the hand and table touch points and estimate the position of the user touching the table. The multi-touch gestures prepared for this system include an operation to change the direction of an object to face the user and a copy operation in which two users generate duplicates of an object. The system-usability evaluation revealed that prior learning was easy and that system operations could be easily performed. PMID:29758006
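
    The association step, linking a fingertip touch point to the hand shadow and hence to the table side where the user stands, can be sketched as a nearest-region assignment (a simplified illustration with hypothetical coordinates, not the authors' algorithm):

    ```python
    import math

    # hypothetical shadow regions: centroid (x, y) and the table side they enter from
    SHADOWS = [((0.20, 0.05), "south"), ((0.80, 0.95), "north")]

    def user_side_for_touch(touch_xy, shadows=SHADOWS):
        """Assign a touch point to the nearest hand-shadow centroid and
        return the table side (user position) of that shadow."""
        return min(shadows, key=lambda s: math.dist(touch_xy, s[0]))[1]

    side = user_side_for_touch((0.25, 0.10))  # -> "south"
    ```

    Gestures such as "rotate the object to face the user" then only need the side label returned by this association.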

  16. First results from the TOPSAT camera

    NASA Astrophysics Data System (ADS)

    Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve

    2017-11-01

    The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on the 27th October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate provision of rapid response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera" presented by RAL at the 5th ICSO in 2004 described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.

  17. Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt

    NASA Technical Reports Server (NTRS)

    Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.

    1977-01-01

    Airborne remote sensing surveys of the Suez Canal Zone were conducted, as part of the zone's rehabilitation plan, using an I2S multispectral camera and a Bendix LN-3 passive infrared scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near-infrared bands. The scanner was operated in the thermal infrared band of 8 to 14 microns, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were utilized in the construction of new geological, structural lineation and drainage maps for the Suez Canal Zone on a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.

  18. Localization and Mapping Using a Non-Central Catadioptric Camera System

    NASA Astrophysics Data System (ADS)

    Khurana, M.; Armenakis, C.

    2018-05-01

    This work details the development of an indoor navigation and mapping system using a non-central catadioptric omnidirectional camera and its implementation for mobile applications. Omnidirectional catadioptric cameras find their use in navigation and mapping of robotic platforms, owing to their wide field of view. Having a wider, potentially 360°, field of view allows the system to "see and move" more freely in the navigation space. A catadioptric camera is a low-cost system consisting of a mirror and a camera; any perspective camera can be used. A platform was constructed to combine the mirror and camera into a catadioptric system, and a calibration method was developed to obtain the relative position and orientation between the two components so that they can be treated as one monolithic system. The mathematical model for localizing the system was derived from the reflective properties of the mirror. The obtained platform positions were then used to map the environment using epipolar geometry. Experiments were performed to test the mathematical models and the location and mapping accuracies achieved by the system. An iterative process of positioning and mapping was applied to determine object coordinates of an indoor environment while navigating the mobile platform. Camera localization and the 3D coordinates of object points achieved decimetre-level accuracies.

  19. A new spherical scanning system for infrared reflectography of paintings

    NASA Astrophysics Data System (ADS)

    Gargano, M.; Cavaliere, F.; Viganò, D.; Galli, A.; Ludwig, N.

    2017-03-01

    Infrared reflectography is an imaging technique used to visualize the underdrawings of ancient paintings; it relies on the fact that most pigment layers are quite transparent to infrared radiation in the spectral band between 0.8 μm and 2.5 μm. InGaAs cameras are nowadays the devices most used to visualize underdrawings, but due to the small size of their detectors these cameras are usually mounted on scanning systems to record high-resolution reflectograms. This work describes a portable scanning-system prototype based on a spherical scanning geometry, built around a lightweight, low-cost motorized head. The motorized head allows the refocusing adjustment needed to compensate for the variable camera-painting distance during the rotation of the camera. The prototype was tested first in the laboratory and then in situ on the Giotto panel "God the Father with Angels", at a resolution of 256 pixels per inch. The system's performance is comparable with that of other reflectographic devices, with the advantage of extending the scanned area up to 1 m × 1 m with a 40 min scanning time. The present configuration can easily be modified to increase the resolution up to 560 pixels per inch or to extend the scanned area up to 2 m × 2 m.

  20. Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera

    NASA Technical Reports Server (NTRS)

    Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.

    1988-01-01

    The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 A wavelength range.