360 degree vision system: opportunities in transportation
NASA Astrophysics Data System (ADS)
Thibault, Simon
2007-09-01
Panoramic technologies are experiencing new and exciting opportunities in the transportation industries. The advantages of panoramic imagers are numerous: increased area coverage with fewer cameras, simultaneous imaging of multiple targets, instantaneous full-horizon detection, easier integration of various applications on the same imager, and others. This paper reports our work on panomorph optics and their potential use in transportation applications. The novel panomorph lens is a new type of high-resolution panoramic imager well suited to the transportation industries. The panomorph lens uses optimization techniques to improve the performance of a customized optical system for specific applications. By adding a custom angle-to-pixel relation at the optical design stage, the optical system provides ideal image coverage designed to reduce and optimize processing. The optics can be customized for the visible, near-infrared (NIR) or infrared (IR) wavebands. The panomorph lens is designed to optimize the cost per pixel, which is particularly important in the IR. We discuss the use of a 360° vision system to enhance on-board collision avoidance, intelligent cruise control and parking assistance. 360° panoramic vision systems might enable safer highways and a significant reduction in casualties.
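The custom angle-to-pixel relation at the heart of the panomorph approach can be sketched as a field-angle-dependent resolution profile: the lens spends more image (and hence sensor pixel) radius on an angular zone of interest. The zone boundaries and boost factor below are illustrative assumptions, not values from the paper.

```python
def panomorph_image_height(theta_deg, zone=(20.0, 60.0), boost=3.0, f=1.0):
    """Normalized image height r(theta) for a panomorph-style projection.

    A plain f-theta fisheye has constant dr/dtheta; here dr/dtheta is
    boosted inside an angular zone of interest, so that zone receives
    proportionally more sensor pixels. `zone` and `boost` are illustrative
    parameters, not values from the paper.
    """
    a, b = zone

    def raw(t):
        # integral of the piecewise dr/dtheta profile from 0 to t degrees
        r = min(t, a)
        if t > a:
            r += boost * (min(t, b) - a)
        if t > b:
            r += t - b
        return r

    return f * raw(theta_deg) / raw(90.0)
```

With a boost of 3 over the 20-60° zone, that 40° span occupies 120/170 ≈ 71% of the image radius instead of the ~44% a linear f-theta fisheye would give it, which is the sense in which processing can be concentrated where the application needs it.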
Preface: The Chang'e-3 lander and rover mission to the Moon
NASA Astrophysics Data System (ADS)
Ip, Wing-Huen; Yan, Jun; Li, Chun-Lai; Ouyang, Zi-Yuan
2014-12-01
The Chang'e-3 (CE-3) lander and rover mission to the Moon was an intermediate step in China's lunar exploration program, which will be followed by a sample-return mission. The lander was equipped with a number of remote-sensing instruments, including a pair of cameras (the Landing Camera and the Terrain Camera) for recording the landing process and surveying terrain, an extreme-ultraviolet camera for monitoring activities in the Earth's plasmasphere, and the first-ever Moon-based ultraviolet telescope for astronomical observations. The Yutu rover successfully carried out close-up observations with the Panoramic Camera, mineralogical investigations with the VIS-NIR Imaging Spectrometer, studies of elemental abundances with the Active Particle-induced X-ray Spectrometer, and pioneering measurements of the lunar subsurface with the Lunar Penetrating Radar. This special issue provides a collection of key information on the instrumental designs, calibration methods and data-processing procedures used by these experiments, with a view to facilitating further analyses of CE-3 scientific data in preparation for future missions.
Line-Based Registration of Panoramic Images and LiDAR Point Clouds for Mobile Mapping.
Cui, Tingting; Ji, Shunping; Shan, Jie; Gong, Jianya; Liu, Kejian
2016-12-31
For multi-sensor integrated systems, such as the mobile mapping system (MMS), data fusion at the sensor level, i.e., the 2D-3D registration between an optical camera and LiDAR, is a prerequisite for higher-level fusion and further applications. This paper proposes a line-based registration method for panoramic images and a LiDAR point cloud collected by an MMS. We first introduce the system configuration and specification, including the coordinate systems of the MMS, the 3D LiDAR scanners, and the two panoramic camera models. We then establish the line-based transformation model for the panoramic camera. Finally, the proposed registration method is evaluated for the two camera models by visual inspection and quantitative comparison. The results demonstrate that the line-based registration method can significantly improve the alignment of the panoramic image and the LiDAR datasets under either the ideal spherical or the rigorous panoramic camera model, with the latter being more reliable.
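The geometric idea behind a line-based model can be sketched simply: under the ideal spherical camera model, a 3D line projects to a great circle, because the line and the projection centre span an interpretation plane and every ray to a point on the line must lie in that plane. A minimal sketch of that coplanarity constraint (not the paper's rigorous multi-lens model):

```python
def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def line_normal(p, q):
    """Normal of the interpretation plane of a 3D line through points p and q
    (camera frame, projection centre at the origin). Under the ideal spherical
    panoramic model the line images onto the great circle cut by this plane."""
    return cross(p, q)
```

Any other point r on the same line satisfies dot(line_normal(p, q), r) == 0; residuals of exactly this form are what a line-based registration minimizes when matching image lines to LiDAR lines.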
Near-infrared high-resolution real-time omnidirectional imaging platform for drone detection
NASA Astrophysics Data System (ADS)
Popovic, Vladan; Ott, Beat; Wellig, Peter; Leblebici, Yusuf
2016-10-01
Recent technological advancements in hardware have enabled higher-quality cameras. State-of-the-art panoramic systems use them to produce videos with a resolution of 9000 x 2400 pixels at a rate of 30 frames per second (fps). Many modern applications use object tracking to determine the speed and the path taken by each object moving through a scene. Detection requires detailed pixel analysis between two frames. In fields like surveillance systems or crowd analysis, this must be achieved in real time. In this paper, we focus on the system-level design of a multi-camera sensor acquiring the near-infrared (NIR) spectrum and its ability to detect mini-UAVs in a representative rural Swiss environment. The presented results show UAV detection during a field trial conducted in August 2015.
NASA Astrophysics Data System (ADS)
Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.
2017-09-01
Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated at lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data must be generated without Global Navigation Satellite System (GNSS) positioning data. We therefore focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, an elevator hall, a room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS positioning data.
You are here: Earth as seen from Mars
2004-03-11
This is the first image ever taken of Earth from the surface of a planet beyond the Moon. It was taken by the Mars Exploration Rover Spirit one hour before sunrise on the 63rd martian day, or sol, of its mission. The image is a mosaic of images taken by the rover's navigation camera showing a broad view of the sky, and an image taken by the rover's panoramic camera of Earth. The contrast in the panoramic camera image was increased two times to make Earth easier to see. The inset shows a combination of four panoramic camera images zoomed in on Earth. The arrow points to Earth. Earth was too faint to be detected in images taken with the panoramic camera's color filters. http://photojournal.jpl.nasa.gov/catalog/PIA05547
Design, demonstration and testing of low F-number LWIR panoramic imaging relay optics
NASA Astrophysics Data System (ADS)
Furxhi, Orges; Frascati, Joe; Driggers, Ronald
2018-04-01
Panoramic imaging is inherently wide field of view. High-sensitivity uncooled long-wave infrared (LWIR) imaging requires low F-number optics. These two requirements result in short back-working-distance designs that, in addition to being costly, are challenging to integrate with commercially available uncooled LWIR cameras and cores. Common challenges include the relocation of the shutter flag, custom calibration of the camera dynamic range and NUC tables, focusing, and athermalization. Solutions to these challenges add to the system cost and make panoramic uncooled LWIR cameras commercially unattractive. In this paper, we present the design of Panoramic Imaging Relay Optics (PIRO) and show imagery and test results from one of the first prototypes. PIRO designs use several reflective surfaces (generally two) to relay a panoramic scene onto a real, donut-shaped image. The PIRO donut is imaged on the focal plane of the camera using a commercial off-the-shelf (COTS) low F-number lens. This approach results in low component cost and effortless integration with pre-calibrated commercially available cameras and lenses.
The Panoramic Camera (PanCam) Instrument for the ESA ExoMars Rover
NASA Astrophysics Data System (ADS)
Griffiths, A.; Coates, A.; Jaumann, R.; Michaelis, H.; Paar, G.; Barnes, D.; Josset, J.
The recently approved ExoMars rover is the first element of the ESA Aurora programme and is slated to deliver the Pasteur exobiology payload to Mars by 2013. The 0.7 kg Panoramic Camera will provide multispectral stereo images with a 65° field-of-view (1.1 mrad/pixel) and high-resolution (85 µrad/pixel) monoscopic "zoom" images with a 5° field-of-view. The stereo Wide Angle Cameras (WAC) are based on Beagle 2 Stereo Camera System heritage. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission, as well as to provide multispectral geological imaging, colour and stereo panoramic images, and solar images for water vapour abundance and dust optical depth measurements, and to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. Additionally, the High Resolution Camera (HRC) can be used for high-resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls.
NASA Technical Reports Server (NTRS)
Nabors, Sammy
2015-01-01
NASA offers companies an optical system that provides a unique panoramic perspective with a single camera. NASA's Marshall Space Flight Center has developed a technology that combines a panoramic refracting optic (PRO) lens with a unique detection system to acquire a true 360-degree field of view. Although current imaging systems can acquire panoramic images, they must use up to five cameras to obtain the full field of view. MSFC's technology obtains its panoramic images from one vantage point.
Static omnidirectional stereoscopic display system
NASA Astrophysics Data System (ADS)
Barton, George G.; Feldman, Sidney; Beckstead, Jeffrey A.
1999-11-01
We describe a unique three-camera stereoscopic omnidirectional viewing system based on the periscopic panoramic camera described in the 11/98 SPIE proceedings (AM13). The three panoramic cameras are combined equilaterally so that each leg of the triangle approximates the human inter-ocular spacing, allowing each panoramic camera to view 240 degrees of the panoramic scene; the most counter-clockwise 120 degrees form the left-eye field and the other 120-degree segment the right-eye field. Fields may be defined by green/red filtration or by time discrimination of the video signal. In the first instance, two-color spectacles are used to view the display; in the second, LCD goggles are used to differentiate the right/left fields. Radially scanned vidicons or re-mapped CCDs may be used. The display consists of three vertically stacked 120-degree segments of the panoramic field of view with two fields per frame, field A being the left-eye display and field B the right-eye display.
The Panoramic Camera (Pancam) Investigation on the NASA 2003 Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.
2003-01-01
The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5-meter-high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover.
NASA Astrophysics Data System (ADS)
Haase, I.; Oberst, J.; Scholten, F.; Wählisch, M.; Gläser, P.; Karachevtseva, I.; Robinson, M. S.
2012-05-01
Newly acquired high resolution Lunar Reconnaissance Orbiter Camera (LROC) images allow accurate determination of the coordinates of Apollo hardware, sampling stations, and photographic viewpoints. In particular, the positions from where the Apollo 17 astronauts recorded panoramic image series, at the so-called “traverse stations”, were precisely determined for traverse path reconstruction. We analyzed observations made in Apollo surface photography as well as orthorectified orbital images (0.5 m/pixel) and Digital Terrain Models (DTMs) (1.5 m/pixel and 100 m/pixel) derived from LROC Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images. Key features captured in the Apollo panoramic sequences were identified in LROC NAC orthoimages. Angular directions of these features were measured in the panoramic images and fitted to the NAC orthoimage by applying least squares techniques. As a result, we obtained the surface panoramic camera positions to within 50 cm. At the same time, the camera orientations, North azimuth angles and distances to nearby features of interest were also determined. Here, initial results are shown for traverse station 1 (northwest of Steno Crater) as well as the Apollo Lunar Surface Experiment Package (ALSEP) area.
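The least-squares fitting of angular directions can be illustrated by a planar resection: each azimuth measured in a panorama toward a feature with known map coordinates confines the camera station to one bearing line, which is linear in the unknown position. The coordinates, angles, and solver below are a hypothetical sketch, not the Apollo 17 processing chain:

```python
import math

def locate_station(features, azimuths):
    """Least-squares resection of a camera station (x, y) from azimuths
    (radians, clockwise from north) to features with known map coordinates.

    Each bearing gives one linear constraint -- the station lies on the line
    (fx - x)*cos(az) - (fy - y)*sin(az) = 0 -- so we solve the 2x2 normal
    equations directly. Values used here are hypothetical, not Apollo data.
    """
    A, b = [], []
    for (fx, fy), az in zip(features, azimuths):
        c, s = math.cos(az), math.sin(az)
        A.append((c, -s))
        b.append(fx * c - fy * s)
    # normal equations: (A^T A) p = A^T b
    a00 = sum(r[0] * r[0] for r in A)
    a01 = sum(r[0] * r[1] for r in A)
    a11 = sum(r[1] * r[1] for r in A)
    b0 = sum(r[0] * v for r, v in zip(A, b))
    b1 = sum(r[1] * v for r, v in zip(A, b))
    det = a00 * a11 - a01 * a01
    x = (a11 * b0 - a01 * b1) / det
    y = (a00 * b1 - a01 * b0) / det
    return x, y
```

With three or more well-distributed features the system is overdetermined, and the residuals give a direct handle on the sub-metre position uncertainty the paper reports.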
Panoramic 3D Reconstruction by Fusing Color Intensity and Laser Range Data
NASA Astrophysics Data System (ADS)
Jiang, Wei; Lu, Jian
Technologies for capturing panoramic (360-degree) three-dimensional information in a real environment have many applications in fields such as virtual reality, security, and robot navigation. In this study, we examine an acquisition device constructed of a regular CCD camera and a 2D laser range scanner, along with a technique for panoramic 3D reconstruction using a data fusion algorithm based on an energy minimization framework. The acquisition device can capture two types of data of a panoramic scene without occlusion between the two sensors: a dense spatio-temporal volume from the camera and distance information from the laser scanner. We resample the dense spatio-temporal volume to generate a dense multi-perspective panorama with spatial resolution equal to that of the original images acquired by the regular camera, and also estimate a dense panoramic depth map corresponding to the generated reference panorama by extracting trajectories from the dense spatio-temporal volume. Moreover, to determine distance information robustly, we propose a data fusion algorithm embedded in an energy minimization framework that incorporates active depth measurements from the 2D laser range scanner and passive geometry reconstruction from the image sequence obtained with the CCD camera. Thereby, measurement precision and robustness can be improved beyond those available from conventional methods using either passive geometry reconstruction (stereo vision) or a laser range scanner alone. Experimental results using both synthetic and actual images show that our approach can produce high-quality panoramas and perform accurate 3D reconstruction in a panoramic environment.
Photogrammetry of Apollo 15 photography, part C
NASA Technical Reports Server (NTRS)
Wu, S. S. C.; Schafer, F. J.; Jordan, R.; Nakata, G. M.; Derick, J. L.
1972-01-01
In the Apollo 15 mission, a mapping camera system, a 61 cm optical-bar high-resolution panoramic camera, and a laser altimeter were used. The panoramic camera is described; it has several distortion sources, such as the cylindrical shape of the negative film surface, the scanning action of the lens, the image motion compensator, and the spacecraft motion. Film products were processed on a specifically designed analytical plotter.
NASA Astrophysics Data System (ADS)
Li, Jianping; Yang, Bisheng; Chen, Chi; Huang, Ronggang; Dong, Zhen; Xiao, Wen
2018-02-01
Inaccurate exterior orientation parameters (EoPs) between sensors obtained by pre-calibration lead to failure of registration between a panoramic image sequence and mobile laser scanning data. To address this challenge, this paper proposes an automatic registration method based on semantic features extracted from panoramic images and point clouds. Firstly, accurate rotation parameters between the panoramic camera and the laser scanner are estimated using GPS- and IMU-aided structure from motion (SfM); the initial EoPs of the panoramic images are obtained at the same time. Secondly, vehicles in the panoramic images are extracted by Faster R-CNN as candidate primitives to be matched with potential corresponding primitives in the point clouds according to the initial EoPs. Finally, the translation between the panoramic camera and the laser scanner is refined by maximizing the overlapping area of corresponding primitive pairs using Particle Swarm Optimization (PSO), resulting in a finer registration between the panoramic image sequence and the point clouds. Experiments on two challenging urban scenes assessed the proposed method; the final registration errors of both scenes were less than three pixels, which demonstrates a high level of automation, robustness and accuracy.
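The PSO refinement step can be sketched generically: particles explore the translation space and are scored by an objective that, in the paper, is the overlapping area of matched vehicle primitives. The swarm parameters and the stand-in objective below are assumptions, not the authors' settings:

```python
import random

def pso_refine(objective, bounds, n_particles=20, iters=50,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization maximizing `objective` over a box.

    A stand-in for refining the camera-to-scanner translation: in the paper
    the objective would score the overlap of matched vehicle primitives; here
    it is any callable on a parameter vector. All parameters are illustrative.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pbest_f = [objective(x) for x in xs]
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + pull toward personal best + pull toward global best
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], bounds[d][0]), bounds[d][1])
            f = objective(xs[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = list(xs[i]), f
                if f > gbest_f:
                    gbest, gbest_f = list(xs[i]), f
    return gbest, gbest_f
```

PSO suits this refinement because the overlap objective is non-differentiable (it counts pixels), so gradient-based solvers do not apply directly.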
Thermal infrared panoramic imaging sensor
NASA Astrophysics Data System (ADS)
Gutin, Mikhail; Tsui, Eddy K.; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey
2006-05-01
Panoramic cameras offer true real-time, 360-degree coverage of the surrounding area, valuable for a variety of defense and security applications, including force protection, asset protection, asset control, port security, perimeter security, video surveillance, border control, airport security, coastguard operations, search and rescue, intrusion detection, and many others. Automatic detection, location, and tracking of targets outside a protected area ensures maximum protection and at the same time reduces the workload on personnel, increases the reliability and confidence of target detection, and enables both man-in-the-loop and fully automated system operation. Thermal imaging provides the benefits of all-weather, 24-hour day/night operation with no downtime. In addition, thermal signatures of different target types facilitate better classification, beyond the limits set by the camera's spatial resolution. The useful range of catadioptric panoramic cameras is affected by their limited resolution. In many existing systems the resolution is optics-limited. Reflectors customarily used in catadioptric imagers introduce aberrations that may become significant at large camera apertures, such as those required in low-light and thermal imaging. Advantages of panoramic imagers with high image resolution include increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC), combines the strengths of improved, high-resolution panoramic optics with thermal imaging in the 8-14 micron spectral range, leveraged by intelligent video processing for automated detection, location, and tracking of moving targets.
The work in progress supports the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to serve a wide range of homeland security applications, as well as to serve the Army in tasks of improved situational awareness (SA) in defensive and offensive operations, and as a sensor node in tactical Intelligence, Surveillance and Reconnaissance (ISR). The novel ViperView™ high-resolution panoramic thermal imager is the heart of the APTIS system. It features an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640x480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS system include network communications, advanced power management, and wakeup capability. Recent developments include image processing, optical design expanded into the visible spectral range, and wireless communications design. This paper describes the development status of the APTIS system.
NASA Astrophysics Data System (ADS)
Georgiou, Giota; Verdaasdonk, Rudolf M.; van der Veen, Albert; Klaessens, John H.
2017-02-01
In the development of new near-infrared (NIR) fluorescence dyes for image-guided surgery, there is a need for NIR-sensitive camera systems that can easily be adjusted to specific wavelength ranges, in contrast to present clinical systems, which are optimized only for ICG. To test alternative camera systems, a setup was developed that mimics the fluorescence light in a tissue phantom so that sensitivity and resolution can be measured. Selected narrow-band NIR LEDs were used to illuminate a 6 mm diameter circular diffuse plate, creating a uniform, intensity-controllable light spot (μW-mW) as a target/source for NIR cameras. Layers of (artificial) tissue of controlled thickness could be placed on the spot to mimic a fluorescent 'cancer' embedded in tissue. This setup was used to compare a range of NIR-sensitive consumer cameras for potential use in image-guided surgery. The image of the spot obtained with each camera was captured and analyzed using ImageJ software. Enhanced-CCD night-vision cameras were the most sensitive, capable of showing intensities < 1 μW through 5 mm of tissue. However, there was no control over the automatic gain and hence the noise level. NIR-sensitive DSLR cameras proved relatively less sensitive but could be fully manually controlled as to gain (ISO 25600) and exposure time, and are therefore preferred for a clinical setting in combination with Wi-Fi remote control. The NIR fluorescence testing setup proved useful for camera testing and can be used for development and quality control of new NIR fluorescence-guided surgery equipment.
Registration of Vehicle-Borne Point Clouds and Panoramic Images Based on Sensor Constellations.
Yao, Lianbi; Wu, Hangbin; Li, Yayun; Meng, Bin; Qian, Jinfei; Liu, Chun; Fan, Hongchao
2017-04-11
A mobile mapping system (MMS) is usually utilized to collect environmental data on and around urban roads. Laser scanners and panoramic cameras are the main sensors of an MMS. This paper presents a new method for the registration of point clouds and panoramic images based on the sensor constellation. After the sensor constellation was analyzed, a feature point, the intersection with a horizontal plane of the connecting line between the global positioning system (GPS) antenna and the panoramic camera, was utilized to separate the point clouds into blocks. The blocks for the central and sideward laser scanners were extracted with the segmentation feature points, and the point clouds located in the blocks were separated from the original point clouds. Each point in the blocks was then mapped to its corresponding pixel in the relevant panoramic image via a collinear function and the position and orientation relationships amongst the different sensors. A search strategy is proposed for the correspondence between laser scanners and the lenses of the panoramic cameras to reduce calculation complexity and improve efficiency. Four cases with different urban road types were selected to verify the efficiency and accuracy of the proposed method. Results indicate that most of the point clouds (on average 99.7%) were successfully registered with the panoramic images with great efficiency. Geometric evaluation indicates that horizontal accuracy was approximately 0.10-0.20 m and vertical accuracy approximately 0.01-0.02 m for all cases. Finally, the main factors that affect registration accuracy, including time synchronization amongst the different sensors, system positioning and vehicle speed, are discussed.
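The collinear mapping from a LiDAR point to a panoramic pixel can be sketched as a rigid transform into the camera frame followed by a spherical projection onto an equirectangular image. The rotation, translation, and image size below are placeholders, not the paper's calibrated sensor-constellation values:

```python
import math

def point_to_panorama(p_world, R, t, width=8192, height=4096):
    """Map a 3D point into equirectangular panoramic pixel coordinates.

    R (3x3, row lists) and t (3-tuple) stand in for the calibrated pose of
    the panoramic camera within the sensor constellation; width/height are a
    hypothetical panorama size.
    """
    # camera-frame coordinates: p_cam = R * (p_world - t)
    d = [p_world[i] - t[i] for i in range(3)]
    x = sum(R[0][j] * d[j] for j in range(3))
    y = sum(R[1][j] * d[j] for j in range(3))
    z = sum(R[2][j] * d[j] for j in range(3))
    rng = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(y, x)            # azimuth in (-pi, pi]
    lat = math.asin(z / rng)          # elevation
    u = (lon / (2 * math.pi) + 0.5) * width    # column from azimuth
    v = (0.5 - lat / math.pi) * height         # row from elevation
    return u, v
```

Applying this to every point in a block produces the point-to-pixel correspondences that the registration is then evaluated against.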
Automatic panoramic thermal integrated sensor
NASA Astrophysics Data System (ADS)
Gutin, Mikhail A.; Tsui, Eddy K.; Gutin, Olga N.
2005-05-01
Historically, the US Army has recognized the advantages of panoramic imagers with high image resolution: increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The novel ViperView™ high-resolution panoramic thermal imager is the heart of the Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC) in support of the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to improve situational awareness (SA) in many defensive and offensive operations, as well as to serve as a sensor node in tactical Intelligence, Surveillance and Reconnaissance (ISR). The ViperView is an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640x480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS sensor suite include ancillary sensors, advanced power management, and wakeup capability. This paper describes the development status of the APTIS system.
Automatic visibility retrieval from thermal camera images
NASA Astrophysics Data System (ADS)
Dizerens, Céline; Ott, Beat; Wellig, Peter; Wunderle, Stefan
2017-10-01
This study presents automatic visibility retrieval from a FLIR A320 stationary thermal imager installed on a measurement tower on the mountain Lagern in the Swiss Jura Mountains. Our visibility retrieval makes use of edges that are automatically detected in the thermal camera images. Predefined target regions, such as mountain silhouettes or buildings with high thermal differences from their surroundings, are used to derive the maximum visibility distance detectable in the image. To allow stable, automatic processing, our procedure additionally removes noise in the image and includes automatic image alignment to correct small shifts of the camera. We present a detailed analysis of visibility derived from more than 24,000 thermal images from the years 2015 and 2016 by comparing them to (1) visibility derived from a panoramic camera image (VISrange), (2) measurements of a forward-scatter visibility meter (a Vaisala FD12 working in the NIR spectrum), and (3) modeled visibility values using the Thermal Range Model TRM4. Atmospheric conditions, mainly water vapor from the European Centre for Medium-Range Weather Forecasts (ECMWF), were considered to calculate the extinction coefficients using MODTRAN. The automatic visibility retrieval based on FLIR A320 images is often in good agreement with the retrievals from the systems working in different spectral ranges. However, some significant differences were detected as well, depending on weather conditions, the thermal differences of the monitored landscape, and the defined target size.
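The retrieval logic reduces to: visibility is the distance of the farthest predefined target region in which edges are still detected. A minimal sketch with hypothetical target names, distances, and edge-detector responses (the paper's actual pipeline adds denoising and image alignment before this step):

```python
def visibility_from_targets(targets, edge_strength, threshold=0.2):
    """Estimate visibility as the distance of the farthest predefined target
    whose edges are still detectable in the thermal image.

    `targets` maps target name -> distance (km); `edge_strength` maps target
    name -> a normalized edge-detector response measured in that target's
    image region. All names, distances, and the threshold are hypothetical.
    """
    visible = [targets[name] for name, s in edge_strength.items()
               if name in targets and s >= threshold]
    return max(visible) if visible else 0.0
```

Running this per image over a long archive yields the visibility time series that is then compared against VISrange, the FD12, and TRM4.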
Visual Tour Based on Panoramic Images for Indoor Places in Campus
NASA Astrophysics Data System (ADS)
Bakirman, T.
2012-07-01
In this paper, we aim to create a visual tour based on panoramic images for the Civil Engineering Faculty of Yildiz Technical University. For this purpose, panoramic images had to be obtained: photos were taken with a tripod so that every photo shared the same point of view, and panoramic images were created by stitching the photos. Two cameras with different focal lengths were used. From the panoramic images, a visual tour with navigation tools was created.
2004-03-13
This is the first image ever taken of Earth from the surface of a planet beyond the Moon. It was taken by the Mars Exploration Rover Spirit one hour before sunrise on the 63rd martian day, or sol, of its mission. Earth is the tiny white dot in the center. The image is a mosaic of images taken by the rover's navigation camera showing a broad view of the sky, and an image taken by the rover's panoramic camera of Earth. The contrast in the panoramic camera image was increased two times to make Earth easier to see. http://photojournal.jpl.nasa.gov/catalog/PIA05560
NASA Technical Reports Server (NTRS)
2004-01-01
This panoramic camera image shows the hole drilled by the Mars Exploration Rover Opportunity's rock abrasion tool into the rock dubbed 'Bounce' on Sol 65 of the rover's journey. The tool drilled about 7 millimeters (0.3 inches) into the rock and generated small piles of 'tailings' or rock dust around the central hole, which is about 4.5 centimeters (1.7 inches) across. The image from sol 66 of the mission was acquired using the panoramic camera's 430 nanometer filter.
2017-11-01
ARL-TR-8205 ● NOV 2017 ● US Army Research Laboratory. Strategies for Characterizing the Sensory Environment: Objective and Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera. By Joseph McArdle, Ashley Foots, Chris Stachowiak, and …
Panoramic Epipolar Image Generation for Mobile Mapping System
NASA Astrophysics Data System (ADS)
Chen, T.; Yamamoto, K.; Chhatkuli, S.; Shimamura, H.
2012-07-01
Over the last 20 years, notable improvements in performance and the low cost of digital cameras and GPS/IMU devices have gradually made MMSs (Mobile Mapping Systems) one of the most important tools for mapping highway and railway networks, generating and updating road navigation data, and constructing urban 3D models. Moreover, demand for large-scale street-level image databases from internet giants such as Google and Microsoft has further accelerated the development of this technology. As one of the most important sensors, omni-directional cameras are commonly used on MMSs to collect panoramic images for 3D close-range photogrammetry and for fusion with 3D laser point clouds, since such cameras record the visual information of the real environment in a single image with a field of view of 360° in longitude and 180° in latitude. This paper addresses the problem of panoramic epipolar image generation for 3D modelling and mapping by stereoscopic viewing. The panoramic images were captured with Point Grey's Ladybug3 mounted on top of a Mitsubishi MMS-X 220 at 2 m intervals along streets in an urban environment. Onboard GPS/IMU, a speedometer and post-sequence image analysis techniques such as bundle adjustment provide high-accuracy position and attitude data for the panoramic images and laser data; this makes it possible to construct the epipolar geometric relationship between any two adjacent panoramic images and then generate panoramic epipolar images. Three kinds of projection planes are considered as the epipolar image planes: sphere, cylinder and flat plane. We finally select the flat plane and use its effective parts (the middle parts on the two sides of the base line) for epipolar image generation. The corresponding geometric relations and results are presented in this paper.
Matching Real and Synthetic Panoramic Images Using a Variant of Geometric Hashing
NASA Astrophysics Data System (ADS)
Li-Chee-Ming, J.; Armenakis, C.
2017-05-01
This work demonstrates an approach to automatically initialize a visual model-based tracker, and recover from lost tracking, without prior camera pose information. These approaches are commonly referred to as tracking-by-detection. Previous tracking-by-detection techniques used either fiducials (i.e. landmarks or markers) or the object's texture. The main contribution of this work is the development of a tracking-by-detection algorithm that is based solely on natural geometric features. A variant of geometric hashing, a model-to-image registration algorithm, is proposed that searches for a matching panoramic image from a database of synthetic panoramic images captured in a 3D virtual environment. The approach identifies corresponding features between the matched panoramic images. The corresponding features are to be used in a photogrammetric space resection to estimate the camera pose. The experiments apply this algorithm to initialize a model-based tracker in an indoor environment using the 3D CAD model of the building.
Camera Control and Geo-Registration for Video Sensor Networks
NASA Astrophysics Data System (ADS)
Davis, James W.
With the use of large video networks, there is a need to coordinate and interpret the video imagery for decision support systems with the goal of reducing the cognitive and perceptual overload of human operators. We present computer vision strategies that enable efficient control and management of cameras to effectively monitor wide-coverage areas, and examine the framework within an actual multi-camera outdoor urban video surveillance network. First, we construct a robust and precise camera control model for commercial pan-tilt-zoom (PTZ) video cameras. In addition to providing a complete functional control mapping for PTZ repositioning, the model can be used to generate wide-view spherical panoramic viewspaces for the cameras. Using the individual camera control models, we next individually map the spherical panoramic viewspace of each camera to a large aerial orthophotograph of the scene. The result provides a unified geo-referenced map representation to permit automatic (and manual) video control and exploitation of cameras in a coordinated manner. The combined framework provides new capabilities for video sensor networks that are of significance and benefit to the broad surveillance/security community.
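The mapping from PTZ pointing angles into a panoramic viewspace image can be illustrated with a deliberately idealized linear model (the paper derives a calibrated control model per camera; the function name, angle conventions and ranges below are assumptions):

```python
def ptz_to_viewspace(pan_deg, tilt_deg, pano_w, pano_h,
                     pan_range=(-180.0, 180.0), tilt_range=(-90.0, 90.0)):
    """Place a PTZ pointing direction into a spherical panoramic
    viewspace image of size pano_w x pano_h, assuming a linear
    angle-to-pixel relation (no lens distortion, no calibration)."""
    x = (pan_deg - pan_range[0]) / (pan_range[1] - pan_range[0]) * pano_w
    y = (tilt_range[1] - tilt_deg) / (tilt_range[1] - tilt_range[0]) * pano_h
    return x, y
```

Composing such a per-camera mapping with a viewspace-to-orthophoto registration is what yields the unified geo-referenced representation the abstract describes.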
Low-cost panoramic infrared surveillance system
NASA Astrophysics Data System (ADS)
Kecskes, Ian; Engel, Ezra; Wolfe, Christopher M.; Thomson, George
2017-05-01
A nighttime surveillance concept consisting of a single-surface omnidirectional mirror assembly and an uncooled vanadium oxide (VOx) longwave infrared (LWIR) camera has been developed. This configuration provides a continuous field of view spanning 360° in azimuth and more than 110° in elevation. Both the camera and the mirror are readily available, off-the-shelf, inexpensive products. The mirror assembly is marketed for use in the visible spectrum and requires only minor modifications to function in the LWIR spectrum. The compactness and portability of this optical package offer significant advantages over many existing infrared surveillance systems. The developed system was evaluated on its ability to detect moving, human-sized heat sources at ranges between 10 m and 70 m. Raw camera images captured by the system are converted from rectangular coordinates in the camera focal plane to polar coordinates and then unwrapped into the user's azimuth and elevation system. Digital background subtraction and color mapping are applied to the images to increase the user's ability to extract moving items from background clutter. A second optical system consisting of a commercially available 50 mm f/1.2 ATHERM lens and a second LWIR camera is used to examine the details of objects of interest identified using the panoramic imager. A description of the components of the proof of concept is given, followed by a presentation of raw images taken by the panoramic LWIR imager. A description of the method by which these images are analyzed is given, along with a presentation of these results side-by-side with the output of the 50 mm LWIR imager and a panoramic visible light imager. Finally, a discussion of the concept and its future development is given.
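The rectangular-to-polar unwrapping step described above can be sketched with a nearest-neighbour lookup (illustrative only; the sampling grid, function names and output resolution are assumptions, not the system's actual parameters):

```python
import numpy as np

def unwrap_panorama(img, center, r_min, r_max, out_w=720, out_h=110):
    """Unwrap the donut-shaped omnidirectional image into an
    azimuth-by-elevation rectangle: each output column is an azimuth,
    each output row a radius (elevation), sampled nearest-neighbour."""
    az = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    r = np.linspace(r_min, r_max, out_h)
    rr, aa = np.meshgrid(r, az, indexing="ij")
    xs = (center[0] + rr * np.cos(aa)).astype(int)
    ys = (center[1] + rr * np.sin(aa)).astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]
```

Background subtraction and color mapping would then operate on the unwrapped rectangle rather than the raw donut image.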
NASA Technical Reports Server (NTRS)
2004-01-01
The color image on the lower left from the panoramic camera on the Mars Exploration Rover Opportunity shows the 'Lily Pad' bounce-mark area at Meridiani Planum, Mars. This image was acquired on the 3rd sol, or martian day, of Opportunity's mission (Jan.26, 2004). The upper left image is a monochrome (single filter) image from the rover's panoramic camera, showing regions from which spectra were extracted from the 'Lily Pad' area. As noted by the line graph on the right, the green spectra is from the undisturbed surface and the red spectra is from the airbag bounce mark.
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
Photogrammetry using Apollo 16 orbital photography, part B
NASA Technical Reports Server (NTRS)
Wu, S. S. C.; Schafer, F. J.; Jordan, R.; Nakata, G. M.
1972-01-01
Discussion is made of the Apollo 15 and 16 metric and panoramic cameras which provided photographs for accurate topographic portrayal of the lunar surface using photogrammetric methods. Nine stereoscopic models of Apollo 16 metric photographs and three models of panoramic photographs were evaluated photogrammetrically in support of the Apollo 16 geologic investigations. Four of the models were used to collect profile data for crater morphology studies; three models were used to collect evaluation data for the frequency distributions of lunar slopes; one model was used to prepare a map of the Apollo 16 traverse area; and one model was used to determine elevations of the Cayley Formation. The remaining three models were used to test photogrammetric techniques using oblique metric and panoramic camera photographs. Two preliminary contour maps were compiled and a high-oblique metric photograph was rectified.
Endoscopic measurements using a panoramic annular lens
NASA Technical Reports Server (NTRS)
Gilbert, John A.; Matthys, Donald R.
1992-01-01
The objective of this project was to design, build, demonstrate, and deliver a prototype system for making measurements within cavities. The system was to utilize structured lighting as the means for making measurements and was to rely on a stationary probe, equipped with a unique panoramic annular lens, to capture a cylindrical view of the illuminated cavity. Panoramic images, acquired with a digitizing camera and stored in a desk top computer, were to be linearized and analyzed by mouse-driven interactive software.
NASA Technical Reports Server (NTRS)
2004-01-01
Two views of a sundial called the MarsDial can be seen in this image taken on Mars by the Mars Exploration Rover Spirit's panoramic camera. These calibration instruments, positioned on the solar panels of both Spirit and the Mars Exploration Rover Opportunity, are tools for both scientists and educators. Scientists use the sundial to adjust the rovers' panoramic cameras, while students participating in NASA's Red Rover Goes to Mars program will monitor the dial to track time on Mars. Students worldwide will also have the opportunity to build their own Earth sundial and compare it to that on Mars. The left image was captured near martian noon when the Sun was very high in the sky. The right image was acquired later in the afternoon when the Sun was lower in the sky, casting longer shadows. The colored blocks in the corners of the sundial are used to fine-tune the panoramic camera's sense of color. Shadows cast on the sundial help scientists adjust the brightness of images. The sundial is embellished with artwork from children, and displays the word Mars in 17 different languages.
NASA Technical Reports Server (NTRS)
2004-01-01
This false-color panoramic camera composite traverse map depicts the Mars Exploration Rover Spirit's journey since landing at Gusev Crater, Mars. It was generated from three of the camera's different wavelength filters (750 nanometers, 530 nanometers and 480 nanometers). This map was created on the 65th martian day, or sol, of Spirit's mission, after Spirit had traveled 328 meters (1076 feet) from its lander to the rim of the crater dubbed 'Bonneville.' From this high point, Spirit was able to capture with its panoramic camera the entire rover traverse. The map points out major stops that Spirit made along the way, including features nicknamed 'Adirondack;' 'Stone Council;' 'Laguna Hollow;' and 'Humphrey.' Also highlighted is the landscape feature informally named 'Grissom Hill' and Spirit's landing site, the Columbia Memorial Station.
Mobile Panoramic Video Applications for Learning
ERIC Educational Resources Information Center
Multisilta, Jari
2014-01-01
The use of videos on the internet has grown significantly in the last few years. For example, Khan Academy has a large collection of educational videos, especially on STEM subjects, available for free on the internet. Professional panoramic video cameras are expensive and usually not easy to carry because of the large size of the equipment.…
Fisheye camera around view monitoring system
NASA Astrophysics Data System (ADS)
Feng, Cong; Ma, Xinjun; Li, Yuanyuan; Wu, Chenchen
2018-04-01
A 360-degree around view monitoring system is a key technology of advanced driver assistance systems; it helps the driver cover blind areas and has high application value. In this paper, we study the transformation relationships among multiple coordinate systems in order to generate a panoramic image in a unified car coordinate system. First, the panoramic image is divided into four regions. Using the parameters obtained by calibration, the pixels of the four fisheye images corresponding to the four sub-regions are mapped into the constructed panoramic image. On the basis of the 2D around view monitoring system, a 3D version is realized by reconstructing the projection surface. We then compare the 2D and 3D around view schemes in the unified coordinate system; the 3D scheme overcomes the shortcomings of the traditional 2D scheme, such as a small visual field and prominent deformation of ground objects. Finally, the images collected by the fisheye cameras installed around the car body can be stitched into a 360-degree panoramic image, giving the approach high application value.
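One piece of the pixel-mapping step, relating a bird's-eye (car-frame ground plane) pixel to a source camera image, can be sketched with a 3x3 homography (a simplification: the real chain also undoes the fisheye distortion first, and `H` here stands in for a hypothetical calibration result):

```python
import numpy as np

def warp_ground_point(H, u, v):
    """Map a bird's-eye ground-plane pixel (u, v) into a source
    camera image via homography H, using homogeneous coordinates."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In practice such a mapping is precomputed once per output pixel into a lookup table, so stitching the four sub-regions at runtime is just table-driven sampling.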
Trench Reveals Two Faces of Soils
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true-color image mosaic from the panoramic camera on the Mars Exploration Rover Opportunity shows a trench dug by the rover in the vicinity of the 'Anatolia' region. Two imprints from the rover's Mossbauer spectrometer instrument were left in the exposed soils. Detailed comparisons between soils exposed at the surface and those found at depth reveal that surface soils have higher levels of hematite while subsurface soils show fine particles derived from basalt. The trench is approximately 11 centimeters deep. This image was taken on sol 81 with the panoramic camera's 430-, 530- and 750-nanometer filters.
Similar on the Inside (pre-grinding)
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true-color image taken by the panoramic camera on the Mars Exploration Rover Opportunity shows the rock called 'Pilbara' located in the small crater dubbed 'Fram.' The rock appears to be dotted with the same 'blueberries,' or spherules, found at 'Eagle Crater.' Opportunity drilled into this rock with its rock abrasion tool. After analyzing the hole with the rover's scientific instruments, scientists concluded that Pilbara has a similar chemical make-up, and thus watery past, to rocks studied at Eagle Crater. This image was taken with the panoramic camera's 480-, 530- and 600-nanometer filters.
Similar on the Inside (post-grinding)
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true-color image taken by the panoramic camera on the Mars Exploration Rover Opportunity shows the hole drilled into the rock called 'Pilbara,' which is located in the small crater dubbed 'Fram.' Opportunity drilled into this rock with its rock abrasion tool. The rock appears to be dotted with the same 'blueberries,' or spherules, found at 'Eagle Crater.' After analyzing the hole with the rover's scientific instruments, scientists concluded that Pilbara has a similar chemical make-up, and thus watery past, to rocks studied at Eagle Crater. This image was taken with the panoramic camera's 480-, 530- and 600-nanometer filters.
'El Capitan's' Scientific Gems
NASA Technical Reports Server (NTRS)
2004-01-01
This mosaic of images taken by the panoramic camera onboard the Mars Exploration Rover Opportunity shows the rock region dubbed 'El Capitan,' which lies within the larger outcrop near the rover's landing site. 'El Capitan' is being studied in great detail using the scientific instruments on the rover's arm; images from the panoramic camera help scientists choose the locations for this compositional work. The millimeter-scale detail of the lamination covering these rocks can be seen. The face of the rock to the right of the mosaic may be a future target for grinding with the rover's rock abrasion tool.
NASA Technical Reports Server (NTRS)
Ward, J. F.
1981-01-01
Procedures were developed and tested for using KA-80A optical bar camera panoramic photography for timber typing forest land and classifying nonforest land. The study area was the south half of the Lake Tahoe Basin Management Unit. Final products from this study include four timber type map overlays on 1:24,000 orthophoto maps. The following conclusions can be drawn from this study: (1) established conventional timber typing procedures can be used on panoramic photography if the necessary equipment is available; (2) the classification and consistency results warrant further study in using panoramic photography for timber typing; and (3) timber type mapping can be done as fast or faster with panoramic photography than with resource photography while maintaining comparable accuracy.
In vitro near-infrared imaging of occlusal dental caries using a germanium-enhanced CMOS camera
NASA Astrophysics Data System (ADS)
Lee, Chulsung; Darling, Cynthia L.; Fried, Daniel
2010-02-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
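The lesion contrast compared across cameras above is commonly defined as a normalized intensity difference; the sketch below uses one such definition for transillumination images, where lesions appear dark on bright sound enamel (this is a generic formula, not a quotation of the paper's exact one):

```python
def nir_lesion_contrast(sound_intensity, lesion_intensity):
    """Contrast between sound and carious enamel in a NIR
    transillumination image: (I_sound - I_lesion) / I_sound,
    ranging from 0 (no lesion visible) to 1 (fully dark lesion)."""
    return (sound_intensity - lesion_intensity) / sound_intensity
```

With mean region intensities measured from the image, higher values of this quantity track more severe lesions, which is the correlation the study reports.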
In vitro near-infrared imaging of occlusal dental caries using germanium enhanced CMOS camera.
Lee, Chulsung; Darling, Cynthia L; Fried, Daniel
2010-03-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
A method and results of color calibration for the Chang'e-3 terrain camera and panoramic camera
NASA Astrophysics Data System (ADS)
Ren, Xin; Li, Chun-Lai; Liu, Jian-Jun; Wang, Fen-Fei; Yang, Jian-Feng; Liu, En-Hai; Xue, Bin; Zhao, Ru-Jin
2014-12-01
The terrain camera (TCAM) and panoramic camera (PCAM) are two of the major scientific payloads installed on the lander and rover of the Chang'e 3 mission respectively. Both use a CMOS sensor covered by a Bayer color filter array to capture color images of the Moon's surface. The RGB values of the original images are device-dependent for these two kinds of cameras, and there is an obvious color difference compared with human visual perception. This paper follows standards published by the International Commission on Illumination to establish a color correction model, designs the ground calibration experiment and obtains the color correction coefficients. The image quality has been significantly improved and there is no obvious color difference in the corrected images. Ground experimental results show that: (1) compared with uncorrected images, the average color difference of TCAM is 4.30, a reduction of 62.1%; (2) the average color differences of the left and right cameras in PCAM are 4.14 and 4.16, reductions of 68.3% and 67.6% respectively.
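A linear color correction of the kind described can be fitted by least squares against reference patches, with the CIE76 formula quantifying the residual color difference; this is a generic sketch under those assumptions, not the CE-3 team's actual model or coefficients:

```python
import numpy as np

def fit_color_correction(measured_rgb, reference_rgb):
    """Least-squares 3x3 matrix M such that measured_rgb @ M
    approximates reference_rgb (rows are color patches)."""
    M, _, _, _ = np.linalg.lstsq(measured_rgb, reference_rgb, rcond=None)
    return M

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance between CIELAB triples."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))
```

In a real calibration the ΔE would be computed in CIELAB after converting both corrected and reference colors, then averaged over the patch set, which is how figures like 4.30 arise.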
Lee, Chulsung; Lee, Dustin; Darling, Cynthia L; Fried, Daniel
2010-01-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
Lee, Chulsung; Lee, Dustin; Darling, Cynthia L.; Fried, Daniel
2010-01-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity. PMID:20799842
NASA Astrophysics Data System (ADS)
Lee, Chulsung; Lee, Dustin; Darling, Cynthia L.; Fried, Daniel
2010-07-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
NASA Astrophysics Data System (ADS)
Gaddam, Vamsidhar Reddy; Griwodz, Carsten; Halvorsen, Pål
2014-02-01
One of the most common ways of capturing wide field-of-view scenes is by recording panoramic videos. Using an array of cameras with limited overlap in the corresponding images, one can generate good panorama images. Using the panorama, several immersive display options can be explored. There is a twofold synchronization problem associated with such a system. One is temporal synchronization, but this challenge can easily be handled by using a common triggering solution to control the shutters of the cameras. The other is automatic exposure synchronization, which does not have a straightforward solution, especially in a wide-area scenario where the light conditions are uncontrolled, as in the case of an open, outdoor football stadium. In this paper, we present the challenges and approaches for creating a completely automatic real-time panoramic capture system with a particular focus on the camera settings. One of the main challenges in building such a system is that there is no single area of the pitch visible to all the cameras that could be used for metering the light in order to find appropriate camera parameters. One approach we tested is to use the green color of the field grass. Such an approach provided us with acceptable results only in limited light conditions. A second approach was devised where the overlapping areas between adjacent cameras are exploited, thus creating pairs of perfectly matched video streams; however, there still existed some disparity between different pairs. We finally developed an approach where the time between two temporal frames is exploited to communicate the exposures among the cameras, with which we achieve a perfectly synchronized array. An analysis of the system and some experimental results are presented in this paper. In summary, a pilot-camera approach running in auto-exposure mode and then distributing the used exposure values to the other cameras seems to give the best visual results.
NASA Astrophysics Data System (ADS)
Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele; Pernechele, Claudio; Dionisio, Cesare
2017-11-01
This paper presents an innovative algorithm developed for attitude determination of a space platform. The algorithm exploits images taken from a multi-purpose panoramic camera equipped with hyper-hemispheric lens and used as star tracker. The sensor architecture is also original since state-of-the-art star trackers accurately image as many stars as possible within a narrow- or medium-size field-of-view, while the considered sensor observes an extremely large portion of the celestial sphere but its observation capabilities are limited by the features of the optical system. The proposed original approach combines algorithmic concepts, like template matching and point cloud registration, inherited from the computer vision and robotic research fields, to carry out star identification. The final aim is to provide a robust and reliable initial attitude solution (lost-in-space mode), with a satisfactory accuracy level in view of the multi-purpose functionality of the sensor and considering its limitations in terms of resolution and sensitivity. Performance evaluation is carried out within a simulation environment in which the panoramic camera operation is realistically reproduced, including perturbations in the imaged star pattern. Results show that the presented algorithm is able to estimate attitude with accuracy better than 1° with a success rate around 98% evaluated by densely covering the entire space of the parameters representing the camera pointing in the inertial space.
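The rotation-invariant feature underlying such lost-in-space star identification is the angular separation between star pairs, which is the same in the sensor frame and the catalog frame; a minimal matching step (not the paper's full template-matching and point-cloud-registration pipeline, and with hypothetical names) might look like:

```python
import math

def inter_star_angle(v1, v2):
    """Angle between two unit direction vectors toward stars;
    invariant under the unknown attitude rotation."""
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.acos(max(-1.0, min(1.0, dot)))

def match_pair(observed_angle, catalog_pairs, tol=1e-3):
    """Return catalog star-pair IDs whose angular separation matches
    the observed one within tol radians (a crude lookup; real systems
    index the catalog for speed and verify with more stars)."""
    return [ids for ids, ang in catalog_pairs if abs(ang - observed_angle) < tol]
```

Once enough pairs are consistently identified, the attitude follows from registering the observed directions against the matched catalog directions.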
'Illinois' and 'New York' Wiped Clean
NASA Technical Reports Server (NTRS)
2004-01-01
This panoramic camera image was taken by NASA's Mars Exploration Rover Spirit on sol 79 after completing a two-location brushing on the rock dubbed 'Mazatzal.' A coating of fine, dust-like material was successfully removed from targets named 'Illinois' (right) and 'New York' (left), revealing the weathered rock underneath. In this image, Spirit's panoramic camera mast assembly, or camera head, can be seen shadowing Mazatzal's surface. This approximate true color image was taken with the 601, 535 and 482 nanometer filters.
The centers of the two brushed spots are approximately 10 centimeters (3.9 inches) apart and will be aggressively analyzed by the instruments on the robotic arm on sol 80. Plans for sol 81 are to grind into the New York target to get past any weathered rock and expose the original, internal rock underneath.
The Chang'e 3 Mission Overview
NASA Astrophysics Data System (ADS)
Li, Chunlai; Liu, Jianjun; Ren, Xin; Zuo, Wei; Tan, Xu; Wen, Weibin; Li, Han; Mu, Lingli; Su, Yan; Zhang, Hongbo; Yan, Jun; Ouyang, Ziyuan
2015-07-01
The Chang'e 3 (CE-3) mission was implemented as the first lander/rover mission of the Chinese Lunar Exploration Program (CLEP). After its successful launch at 01:30 local time on December 2, 2013, CE-3 was inserted into an eccentric polar lunar orbit on December 6, and landed to the east of a 430 m crater in northwestern Mare Imbrium (19.51°W, 44.12°N) at 21:11 on December 14, 2013. The Yutu rover separated from the lander at 04:35, December 15, and traversed for a total of 0.114 km. Acquisition of science data began during the descent of the lander and will continue for 12 months during the nominal mission. The CE-3 lander and rover each carry four science instruments. Instruments on the lander are: Landing Camera (LCAM), Terrain Camera (TCAM), Extreme Ultraviolet Camera (EUVC), and Moon-based Ultraviolet Telescope (MUVT). The four instruments on the rover are: Panoramic Camera (PCAM), VIS-NIR Imaging Spectrometer (VNIS), Active Particle induced X-ray Spectrometer (APXS), and Lunar Penetrating Radar (LPR). The science objectives of the CE-3 mission include: (1) investigation of the morphological features and geological structures of and near the landing area; (2) integrated in-situ analysis of mineral and chemical composition of and near the landing area; and (3) exploration of the terrestrial-lunar space environment and lunar-based astronomical observations. This paper describes the CE-3 objectives and measurements that address the science objectives outlined by the Comprehensive Demonstration Report of Phase II of CLEP. The CE-3 team has archived the initial science data, and we describe data accessibility by the science community.
NASA Astrophysics Data System (ADS)
de Villiers, Jason P.; Bachoo, Asheer K.; Nicolls, Fred C.; le Roux, Francois P. J.
2011-05-01
Tracking targets in a panoramic image is in many senses the inverse of tracking targets with a narrow field of view camera on a pan-tilt pedestal. For a narrow field of view camera tracking a moving target, the object is constant and the background is changing. A panoramic camera is able to model the entire scene, or background, and those areas it cannot model well are the potential targets, which typically subtend far fewer pixels in the panoramic view than in the narrow field of view. The outputs of an outward-staring array of calibrated machine vision cameras are stitched into a single omnidirectional panorama and used to observe False Bay near Simon's Town, South Africa. A ground-truth data-set was created by geo-aligning the camera array and placing a differential global positioning system receiver on a small target boat, thus allowing its position in the array's field of view to be determined. Common tracking techniques including level-sets, Kalman filters and particle filters were implemented to run on the central processing unit of the tracking computer. Image enhancement techniques including multi-scale tone mapping, interpolated local histogram equalisation and several sharpening techniques were implemented on the graphics processing unit. An objective measurement of each tracking algorithm's robustness in the presence of sea-glint, low contrast visibility and sea clutter (such as whitecaps) is performed on the raw recorded video data. These results are then compared to those obtained with the enhanced video data.
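One of the trackers named in this abstract, the Kalman filter, can be sketched minimally for a single target coordinate. This is an illustrative constant-velocity filter with hypothetical noise settings, not the implementation evaluated in the paper:

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """Constant-velocity Kalman filter over 1-D position measurements.

    q and r are assumed process/measurement noise levels (hypothetical)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition: [pos, vel]
    H = np.array([[1.0, 0.0]])                 # we observe position only
    Q = q * np.eye(2)                          # process noise covariance
    R = np.array([[r]])                        # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])   # initial state
    P = np.eye(2)                              # initial state covariance
    estimates = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

Fed a target moving at constant speed, the filter's position estimate converges to the true track after a few updates.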
Designing the optimal semi-warm NIR spectrograph for SALT via detailed thermal analysis
NASA Astrophysics Data System (ADS)
Wolf, Marsha J.; Sheinis, Andrew I.; Mulligan, Mark P.; Wong, Jeffrey P.; Rogers, Allen
2008-07-01
The near infrared (NIR) upgrade to the Robert Stobie Spectrograph (RSS) on the Southern African Large Telescope (SALT), RSS/NIR, extends the spectral coverage of all modes of the optical spectrograph. The RSS/NIR is a low to medium resolution spectrograph with broadband, spectropolarimetric, and Fabry-Perot imaging capabilities. The optical and NIR arms can be used simultaneously to extend spectral coverage from 3200 Å to approximately 1.6 μm. Both arms utilize high efficiency volume phase holographic gratings via articulating gratings and cameras. The NIR camera incorporates a HAWAII-2RG detector with an Epps optical design consisting of 6 spherical elements and providing subpixel rms image sizes of 7.5 +/- 1.0 μm over all wavelengths and field angles. The NIR spectrograph is semi-warm, sharing a common slit plane and partial collimator with the optical arm. A pre-dewar, cooled to below ambient temperature, houses the final NIR collimator optic, the grating/Fabry-Perot etalon, the polarizing beam splitter, and the first three camera optics. The last three camera elements, blocking filters, and detector are housed in a cryogenically cooled dewar. The semi-warm design concept has long been proposed as an economical way to extend optical instruments into the NIR; however, success has been very limited. A major portion of our design effort entails a detailed thermal analysis using non-sequential ray tracing to interactively guide the mechanical design and determine a truly realizable long wavelength cutoff over which astronomical observations will be sky-limited. In this paper we describe our thermal analysis, design concepts for the staged cooling scheme, and results to be incorporated into the overall mechanical design and baffling.
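The thermal constraint driving the semi-warm concept can be illustrated with the Planck law: the spectral radiance of room-temperature surroundings falls steeply toward shorter NIR wavelengths, which is what makes a long-wavelength cutoff near 1.6 μm plausible without full cryogenic cooling. A minimal sketch with illustrative temperatures (not the paper's analysis):

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)
```

For a ~280 K enclosure, the radiance at 2.5 μm exceeds that at the 1.6 μm cutoff by several orders of magnitude, so warm optics contribute far less background at the shorter cutoff.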
Layered Outcrops in Gusev Crater (False Color)
NASA Technical Reports Server (NTRS)
2004-01-01
One of the ways scientists collect mineralogical data about rocks on Mars is to view them through filters that allow only specific wavelengths of light to pass through the lens of the panoramic camera. NASA's Mars Exploration Rover Spirit took this false-color image of the rock nicknamed 'Tetl' at 1:05 p.m. martian time on its 270th martian day, or sol (Oct. 5, 2004) using the panoramic camera's 750-, 530-, and 430-nanometer filters. Darker red hues in the image correspond to greater concentrations of oxidized soil and dust. Bluer hues correspond to portions of rock that are not as heavily coated with soils or are not as highly oxidized.
Pancam: A Multispectral Imaging Investigation on the NASA 2003 Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.
2003-01-01
One of the six science payload elements carried on each of the NASA Mars Exploration Rovers (MER; Figure 1) is the Panoramic Camera System, or Pancam. Pancam consists of three major components: a pair of digital CCD cameras, the Pancam Mast Assembly (PMA), and a radiometric calibration target. The PMA provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. The calibration target provides a set of reference color and grayscale standards for calibration validation, and a shadow post for quantification of the direct vs. diffuse illumination of the scene. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover in up to 12 unique wavelengths. The major characteristics of Pancam are summarized.
A panoramic coded aperture gamma camera for radioactive hotspots localization
NASA Astrophysics Data System (ADS)
Paradiso, V.; Amgarou, K.; Blanc De Lanaute, N.; Schoepff, V.; Amoyal, G.; Mahe, C.; Beltramello, O.; Liénard, E.
2017-11-01
A known disadvantage of the coded aperture imaging approach is its limited field-of-view (FOV), which often proves insufficient when analysing complex dismantling scenes such as post-accidental scenarios, where multiple measurements are needed to fully characterize the scene. In order to overcome this limitation, a panoramic coded aperture γ-camera prototype has been developed. The system is based on a 1 mm thick CdTe detector directly bump-bonded to a Timepix readout chip, developed by the Medipix2 collaboration (256 × 256 pixels, 55 μm pitch, 14.08 × 14.08 mm2 sensitive area). A MURA pattern coded aperture is used, allowing for background subtraction without the use of heavy shielding. This system is then combined with a USB color camera. The output of each measurement is a semi-spherical image covering a FOV of 360 degrees horizontally and 80 degrees vertically, rendered in spherical coordinates (θ, φ). The geometrical shapes of the radiation-emitting objects are preserved by first registering and stitching the optical images captured by the prototype, and then applying the same transformations to their corresponding radiation images. Panoramic gamma images generated using the technique proposed in this paper are described and discussed, along with the main experimental results obtained in laboratory campaigns.
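The MURA mask mentioned in this abstract is built from the quadratic residues of a prime order. A minimal sketch of the standard construction (the prototype's actual mask order is not stated here, so the orders below are illustrative):

```python
def mura(p):
    """Binary MURA mask of prime order p (1 = open element, 0 = opaque).

    Standard construction: c(i) = +1 if i is a quadratic residue mod p,
    else -1; row 0 is opaque, column 0 (i != 0) is open, and interior
    elements are open where c(i) * c(j) = +1."""
    qr = {(k * k) % p for k in range(1, p)}   # quadratic residues mod p
    c = [1 if i in qr else -1 for i in range(p)]
    mask = [[0] * p for _ in range(p)]
    for i in range(p):
        for j in range(p):
            if i == 0:
                mask[i][j] = 0
            elif j == 0:
                mask[i][j] = 1
            elif c[i] * c[j] == 1:
                mask[i][j] = 1
    return mask
```

A MURA of order p has exactly (p² − 1)/2 open elements, i.e. an open fraction of roughly one half, which is what enables the balanced decoding used for background subtraction.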
Cobbles in Troughs Between Meridiani Ripples
NASA Technical Reports Server (NTRS)
2006-01-01
As NASA's Mars Exploration Rover Opportunity continues to traverse from 'Erebus Crater' toward 'Victoria Crater,' the rover navigates along exposures of bedrock between large, wind-blown ripples. Along the way, scientists have been studying fields of cobbles that sometimes appear on trough floors between ripples. They have also been studying the banding patterns seen in large ripples. This view, obtained by Opportunity's panoramic camera on the rover's 802nd Martian day (sol) of exploration (April 27, 2006), is a mosaic spanning about 30 degrees. It shows a field of cobbles nestled among wind-driven ripples that are about 20 centimeters (8 inches) high. The origin of cobble fields like this one is unknown. The cobbles may be a lag of coarser material left behind from one or more soil deposits whose finer particles have blown away. The cobbles may be eroded fragments of meteoritic material, secondary ejecta of Mars rock thrown here from craters elsewhere on the surface, weathering remnants of locally-derived bedrock, or a mixture of these. Scientists will use the panoramic camera's multiple filters to study the rock types, variability and origins of the cobbles. This is an approximately true-color rendering that combines separate images taken through the panoramic camera's 753-nanometer, 535-nanometer and 432-nanometer filters.
System-level analysis and design for RGB-NIR CMOS camera
NASA Astrophysics Data System (ADS)
Geelen, Bert; Spooren, Nick; Tack, Klaas; Lambrechts, Andy; Jayapala, Murali
2017-02-01
This paper presents system-level analysis of a sensor capable of simultaneously acquiring both standard absorption based RGB color channels (400-700nm, 75nm FWHM), as well as an additional NIR channel (central wavelength: 808 nm, FWHM: 30nm collimated light). Parallel acquisition of RGB and NIR info on the same CMOS image sensor is enabled by monolithic pixel-level integration of both a NIR pass thin film filter and NIR blocking filters for the RGB channels. This overcomes the need for a standard camera-level NIR blocking filter to remove the NIR leakage present in standard RGB absorption filters from 700-1000nm. Such a camera-level NIR blocking filter would inhibit the acquisition of the NIR channel on the same sensor. Thin film filters do not operate in isolation. Rather, their performance is influenced by the system context in which they operate. The spectral distribution of light arriving at the photo diode is shaped by, among other factors, the illumination spectral profile, optical component transmission characteristics and sensor quantum efficiency. For example, knowledge of a low quantum efficiency (QE) of the CMOS image sensor above 800nm may reduce the filter's blocking requirements and simplify the filter structure. Similarly, knowledge of the incoming light angularity as set by the objective lens' F/# and exit pupil location may be taken into account during the thin film's optimization. This paper demonstrates how knowledge of the application context can facilitate filter design and relax design trade-offs, and presents experimental results.
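The system-level reasoning above — filter transmission weighted by illumination and quantum efficiency — can be sketched numerically. The spectral curves below are hypothetical stand-ins, not measured data; the point is only that a falling QE above 800 nm shrinks the detected NIR leakage and hence relaxes the filter's blocking requirement:

```python
import numpy as np

def effective_signal(wavelengths_nm, illumination, filter_t, qe):
    """Relative detected signal: trapezoidal integral of E * T * QE
    over wavelength."""
    w = np.asarray(wavelengths_nm, dtype=float)
    resp = (np.asarray(illumination, dtype=float)
            * np.asarray(filter_t, dtype=float)
            * np.asarray(qe, dtype=float))
    return float(0.5 * np.sum((resp[1:] + resp[:-1]) * np.diff(w)))

# Hypothetical curves on a 400-1000 nm grid: a red absorption filter that
# leaks in the NIR, sampled with a flat QE and with a QE that falls above 800 nm.
w = np.arange(400.0, 1001.0, 10.0)
illum = np.ones_like(w)                         # flat illuminant
red_leaky = np.where(w >= 600.0, 0.9, 0.05)     # passes red, leaks 700-1000 nm
qe_flat = np.full_like(w, 0.6)
qe_falling = np.where(w > 800.0, 0.1, 0.6)      # low sensor QE above 800 nm
```

Comparing the two QE curves shows the integrated signal (and thus the leakage to be blocked) is smaller when the sensor itself is insensitive above 800 nm.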
Measurement methods and accuracy analysis of Chang'E-5 Panoramic Camera installation parameters
NASA Astrophysics Data System (ADS)
Yan, Wei; Ren, Xin; Liu, Jianjun; Tan, Xu; Wang, Wenrui; Chen, Wangli; Zhang, Xiaoxia; Li, Chunlai
2016-04-01
Chang'E-5 (CE-5) is a lunar probe for the third phase of the China Lunar Exploration Project (CLEP), whose main scientific objectives are to carry out lunar surface sampling and to return the samples to Earth. To achieve these goals, investigation of the lunar surface topography and geological structure within the sampling area is extremely important. The Panoramic Camera (PCAM) is one of the payloads mounted on the CE-5 lander. It consists of two optical systems installed on a rotating camera platform. Optical images of the sampling area can be obtained by PCAM as two-dimensional images, and a stereo image pair can be formed from the left and right PCAM images; the lunar terrain can then be reconstructed by photogrammetry. The installation parameters of PCAM with respect to the CE-5 lander are critical for calculating the exterior orientation elements (EO) of PCAM images, which are used for lunar terrain reconstruction. In this paper, the types of PCAM installation parameters and the coordinate systems involved are defined. Measurement methods combining camera images and optical coordinate observations are studied. The observation program and the methods used to solve for the installation parameters are then introduced. The accuracy of the parametric solution is analyzed using observations obtained during the PCAM scientific validation experiment, which is used to verify the PCAM detection process, ground data processing methods, product quality and so on. Analysis results show that the accuracy of the installation parameters keeps the positional accuracy of corresponding image points of PCAM stereo images within 1 pixel. The measurement methods and parameter accuracy studied in this paper therefore meet the needs of engineering and scientific applications. Keywords: Chang'E-5 Mission; Panoramic Camera; Installation Parameters; Total Station; Coordinate Conversion
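The role of the installation parameters can be sketched as a rigid-body composition: the camera's exterior orientation in the world frame is the lander's pose composed with the camera-to-lander installation rotation and offset. A minimal illustration with hypothetical frames and values (not the paper's measured parameters):

```python
import numpy as np

def compose_pose(r_lander, t_lander, r_inst, t_inst):
    """World pose of the camera: lander pose (r_lander, t_lander) composed
    with the installation parameters (r_inst, t_inst) expressed in the
    lander frame. Rotations are 3x3 matrices, translations 3-vectors."""
    r_cam = r_lander @ r_inst            # camera orientation in the world
    t_cam = r_lander @ t_inst + t_lander # camera position in the world
    return r_cam, t_cam
```

Any error in (r_inst, t_inst) propagates directly into the exterior orientation, which is why the paper bounds its effect on stereo image points to within 1 pixel.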
NASA Astrophysics Data System (ADS)
Kittle, David S.; Patil, Chirag G.; Mamelak, Adam; Hansen, Stacey; Perry, Jeff; Ishak, Laura; Black, Keith L.; Butte, Pramod V.
2016-03-01
Current surgical microscopes are limited in sensitivity for NIR fluorescence. Recent developments in tumor markers attached to NIR dyes require newer, more sensitive imaging systems with high resolution to guide surgical resection. We report on a small, single-camera solution enabling advanced image processing opportunities previously unavailable for ultra-high-sensitivity imaging of these agents. The system captures both visible reflectance and NIR fluorescence at 300 fps while displaying full HD resolution video at 60 fps. The camera head has been designed to mount easily onto the Zeiss Pentero microscope head for seamless integration into surgical procedures.
Saying Goodbye to 'Bonneville' Crater
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site] Annotated Image NASA's Mars Exploration Rover Spirit took this panoramic camera image on sol 86 (March 31, 2004) before driving 36 meters (118 feet) on sol 87 toward its future destination, the Columbia Hills. This is probably the last panoramic camera image that Spirit will take from the high rim of 'Bonneville' crater, and it provides an excellent view of the ejecta-covered path the rover has journeyed thus far. The lander can be seen toward the upper right of the frame, approximately 321 meters (1,060 feet) from Spirit's current location. The large hill on the horizon is Grissom Hill. The Columbia Hills, located to the left, are not visible in this image.
NASA Technical Reports Server (NTRS)
2006-01-01
At least three different kinds of rocks await scientific analysis at the place where NASA's Mars Exploration Rover Spirit will likely spend several months of Martian winter. They are visible in this picture, which the panoramic camera on Spirit acquired during the rover's 809th sol, or Martian day, of exploring Mars (April 12, 2006). Paper-thin layers of light-toned, jagged-edged rocks protrude horizontally from beneath small sand drifts; a light gray rock with smooth, rounded edges sits atop the sand drifts; and several dark gray to black, angular rocks with vesicles (small holes) typical of hardened lava lie scattered across the sand. This view is an approximately true-color rendering that combines images taken through the panoramic camera's 753-nanometer, 535-nanometer, and 432-nanometer filters.
Lunar orbital photographic planning charts for candidate Apollo J-missions
NASA Technical Reports Server (NTRS)
Hickson, P. J.; Piotrowski, W. L.
1971-01-01
A technique is presented for minimizing Mapping Camera film usage by reducing redundant coverage while meeting the desired sidelap of greater than or equal to 55%. The technique uses the normal groundtrack separation determined as a function of the number of revolutions between the respective tracks, of the initial and final nodal azimuths (or orbital inclination), and of the lunar latitude. The technique is also applicable for planning Panoramic Camera photography such that photographic contiguity is attained but redundant coverage is minimized. Graphs are included for planning mapping camera (MC) and panoramic camera (PC) photographic passes for a specific mission (i.e., specific groundtracks) to Descartes (Apollo 16), for specific missions to potential Apollo 17 sites such as Alphonsus, Proclus, Gassendi, Davy, and Tycho, and for a potential Apollo orbit-only mission with a nodal azimuth of 85 deg. Graphs are also included for determining the maximum number of revolutions which can elapse between successive MC and PC passes, for greater than or equal to 55% sidelap and rectified contiguity respectively, for nodal azimuths between 5 deg and 85 deg.
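The geometry behind these planning graphs can be sketched in simplified form: successive groundtracks are displaced as the Moon rotates under the orbit, the displacement shrinks with the cosine of latitude, and sidelap follows from comparing that separation with the camera swath width. The constants and the swath value below are illustrative assumptions, not figures from the report:

```python
import math

LUNAR_RADIUS_KM = 1737.4
SIDEREAL_MONTH_H = 655.7   # lunar rotation period in hours

def sidelap_fraction(swath_km, orbit_period_h, revs_between, latitude_deg):
    """Sidelap between photographic passes separated by `revs_between`
    revolutions, assuming groundtracks drift as the Moon rotates under a
    near-polar orbit (a simplification of the report's geometry)."""
    shift_deg = 360.0 * orbit_period_h / SIDEREAL_MONTH_H   # drift per rev
    sep_km = (revs_between * math.radians(shift_deg)
              * LUNAR_RADIUS_KM * math.cos(math.radians(latitude_deg)))
    return 1.0 - sep_km / swath_km
```

The sketch reproduces the qualitative behavior the charts encode: more elapsed revolutions reduce sidelap, while higher latitude (tracks converging) increases it.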
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site] This color mosaic, taken on May 21, 25 and 26, 2004, by the panoramic camera on NASA's Mars Exploration Rover Spirit, was acquired from a position roughly three-fourths of the way between 'Bonneville Crater' and the base of the 'Columbia Hills.' The area is within a low thermal inertia unit (an area that heats up and cools off quickly) identified from orbit by the Mars Odyssey thermal emission imaging system instrument. The rover was roughly 600 meters (1,968 feet) from the base of the hills. This mosaic, referred to as the 'Santa Anita Panorama,' is comprised of 64 pointings acquired with six of the panoramic camera's color filters, including one designed specifically to allow comparisons between orbital and surface brightness data. A total of 384 images were acquired as part of this panorama. The mosaic is an approximate true-color rendering constructed from images using the camera's 750-, 530- and 480-nanometer filters, and is presented at the full resolution of the camera.
Navigating surgical fluorescence cameras using near-infrared optical tracking.
van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs
2018-05-01
Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera in reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced interference during NIR fluorescence imaging. In an attempt to support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one light-emitting diode setup simulating the OTS. Following the outcome of these measurements, OTS settings were optimized. Measurements indicated the OTS interference was caused by: (1) spectral overlap between the OTS light and camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of the OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
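The interplay of OTS duty cycle, OTS frequency and camera exposure (points 2 to 5 above) can be sketched by counting how much of an exposure window is lit by a pulsed illuminator. The timing values below are hypothetical, not the measured settings from the paper:

```python
def contamination_fraction(exposure_s, ots_freq_hz, duty_cycle, n_steps=100000):
    """Fraction of a camera exposure that coincides with the 'on' phase of a
    pulsed OTS illuminator, sampled on a fine time grid."""
    period = 1.0 / ots_freq_hz
    on_time = duty_cycle * period
    lit = 0
    for k in range(n_steps):
        t = exposure_s * (k + 0.5) / n_steps   # midpoint of each time slice
        if (t % period) < on_time:
            lit += 1
    return lit / n_steps
```

For exposures much longer than the pulse period, the contaminated fraction approaches the duty cycle itself, which is why reducing the duty cycle (point 3) directly reduces the interference.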
NASA Technical Reports Server (NTRS)
Rice, M. S.; Cloutis, E. A.; Bell, J. F., III; Bish, D. L.; Horgan, B. H.; Mertzman, S. A.; Craig, M. A.; Renault, R. W.; Gautason, B.; Mountain, B.
2013-01-01
Hydrated silica-rich materials have recently been discovered on the surface of Mars by the Mars Exploration Rover (MER) Spirit, the Mars Reconnaissance Orbiter (MRO) Compact Reconnaissance Imaging Spectrometer for Mars (CRISM), and the Mars Express Observatoire pour la Minéralogie, l'Eau, les Glaces et l'Activité (OMEGA) in several locations. Having been interpreted as hydrothermal deposits and aqueous alteration products, these materials have important implications for the history of water on the martian surface. Spectral detections of these materials in visible to near infrared (Vis-NIR) wavelengths have been based on a H2O absorption feature in the 934-1009 nm region seen with Spirit's Pancam instrument, and on SiOH absorption features in the 2.21-2.26 micron range seen with CRISM. Our work aims to determine how the spectral reflectance properties of silica-rich materials in Vis-NIR wavelengths vary as a function of environmental conditions and formation. Here we present laboratory reflectance spectra of a diverse suite of silica-rich materials (chert, opal, quartz, natural sinters and synthetic silica) under a range of grain sizes and temperature, pressure, and humidity conditions. We find that the H2O content and form of H2O/OH present in silica-rich materials can have significant effects on their Vis-NIR spectra. Our main findings are that the position of the approximately 1.4 μm OH feature and the symmetry of the approximately 1.9 μm feature can be used to discern between various forms of silica-rich materials, and that the ratio of the approximately 2.2 μm (SiOH) and approximately 1.9 μm (H2O) band depths can aid in distinguishing between silica phases (opal-A vs. opal-CT) and formation conditions (low vs. high temperature).
In a case study of hydrated silica outcrops in Valles Marineris, we show that careful application of a modified version of these spectral parameters to orbital near-infrared spectra (e.g., from CRISM and OMEGA) can aid in characterizing the compositional diversity of silica-bearing deposits on Mars. We also discuss how these results can aid in the interpretation of silica detections on Mars made by the MER Panoramic Camera (Pancam) and Mars Science Laboratory (MSL) Mast-mounted Camera (Mastcam) instruments.
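The band-depth parameters used above (e.g. the ~2.2 μm to ~1.9 μm band-depth ratio) follow the usual continuum-removal recipe: draw a straight-line continuum between two shoulder wavelengths and measure the fractional dip at the band center. A sketch with synthetic numbers; the shoulder positions are chosen for illustration only:

```python
def band_depth(wavelengths, reflectance, left, center, right):
    """Continuum-removed band depth at `center`, with the continuum drawn as
    a straight line between the `left` and `right` shoulder wavelengths."""
    def at(w):
        # reflectance at the sample nearest to wavelength w
        i = min(range(len(wavelengths)), key=lambda k: abs(wavelengths[k] - w))
        return reflectance[i]
    r_l, r_c, r_r = at(left), at(center), at(right)
    frac = (center - left) / (right - left)
    continuum = r_l + frac * (r_r - r_l)
    return 1.0 - r_c / continuum
```

Ratios of two such depths (SiOH vs. H2O bands) then give a single dimensionless discriminator between silica phases.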
Robust Behavior Recognition in Intelligent Surveillance Environments.
Batchuluun, Ganbayar; Kim, Yeong Gon; Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung
2016-06-30
Intelligent surveillance systems have been studied by many researchers. These systems should be operated in both daytime and nighttime, but objects are invisible in images captured by a visible light camera during the night. Therefore, near infrared (NIR) cameras and thermal cameras (based on medium-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) light) have been considered as alternatives for nighttime use. Because the system must operate during both daytime and nighttime, and because NIR cameras require an additional NIR illuminator (which would have to illuminate a wide area over a great distance) at night, a dual system of visible light and thermal cameras is used in our research, and we propose a new behavior recognition method for intelligent surveillance environments. Twelve datasets were compiled by collecting data in various environments, and they were used to obtain experimental results. The recognition accuracy of our method was found to be 97.6%, thereby confirming the ability of our method to outperform previous methods.
Cobbles in Troughs Between Meridiani Ripples (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
As NASA's Mars Exploration Rover Opportunity continues to traverse from 'Erebus Crater' toward 'Victoria Crater,' the rover navigates along exposures of bedrock between large, wind-blown ripples. Along the way, scientists have been studying fields of cobbles that sometimes appear on trough floors between ripples. They have also been studying the banding patterns seen in large ripples. This view, obtained by Opportunity's panoramic camera on the rover's 802nd Martian day (sol) of exploration (April 27, 2006), is a mosaic spanning about 30 degrees. It shows a field of cobbles nestled among wind-driven ripples that are about 20 centimeters (8 inches) high. The origin of cobble fields like this one is unknown. The cobbles may be a lag of coarser material left behind from one or more soil deposits whose finer particles have blown away. The cobbles may be eroded fragments of meteoritic material, secondary ejecta of Mars rock thrown here from craters elsewhere on the surface, weathering remnants of locally-derived bedrock, or a mixture of these. Scientists will use the panoramic camera's multiple filters to study the rock types, variability and origins of the cobbles. This is a false-color rendering that combines separate images taken through the panoramic camera's 753-nanometer, 535-nanometer and 432-nanometer filters. The false color is used to enhance differences between types of materials in the rocks and soil.
Range and Panoramic Image Fusion Into a Textured Range Image for Culture Heritage Documentation
NASA Astrophysics Data System (ADS)
Bila, Z.; Reznicek, J.; Pavelka, K.
2013-07-01
This paper deals with the fusion of range and panoramic images, where the range image is acquired by a 3D laser scanner and the panoramic image is acquired with a digital still camera mounted on a panoramic head and tripod. The fused dataset, called a "textured range image", provides conservators and historians with more reliable information about the investigated object than using both datasets separately. A simple example of the fusion of range and panoramic images, both obtained in St. Francis Xavier Church in the town of Opařany, is given here. Firstly, we describe the process of data acquisition, then the processing of both datasets into a proper format for the subsequent fusion, and finally the process of fusion itself. The process of fusion can be divided into two main parts: transformation and remapping. In the transformation part, the two images are related by matching similar features detected on both images with a proper detector, which results in a transformation matrix enabling transformation of the range image onto the panoramic image. Then, the range data are remapped from the range image space into the panoramic image space and stored as an additional "range" channel. The process of image fusion is validated by comparing similar features extracted from both datasets.
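The transformation step described above can be sketched with a direct linear transform: matched feature pairs determine a projective matrix, which is then used to remap range pixels into panorama space. A minimal numpy illustration with four hypothetical matches (not the paper's detector or data):

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 projective transform from matched 2-D feature pairs
    via the direct linear transform (needs >= 4 non-degenerate matches)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # the solution is the right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def apply_homography(h, pt):
    """Map one point from range-image space into panorama space."""
    x, y, w = h @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

In practice the estimated matrix would be applied to every range pixel, with the range value stored as the extra channel of the panorama.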
Spirit Scans Winter Haven (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
At least three different kinds of rocks await scientific analysis at the place where NASA's Mars Exploration Rover Spirit will likely spend several months of Martian winter. They are visible in this picture, which the panoramic camera on Spirit acquired during the rover's 809th sol, or Martian day, of exploring Mars (April 12, 2006). Paper-thin layers of light-toned, jagged-edged rocks protrude horizontally from beneath small sand drifts; a light gray rock with smooth, rounded edges sits atop the sand drifts; and several dark gray to black, angular rocks with vesicles (small holes) typical of hardened lava lie scattered across the sand. This view is a false-color rendering that combines images taken through the panoramic camera's 753-nanometer, 535-nanometer, and 432-nanometer filters.
Optical designs for the Mars '03 rover cameras
NASA Astrophysics Data System (ADS)
Smith, Gregory H.; Hagerott, Edward C.; Scherr, Lawrence M.; Herkenhoff, Kenneth E.; Bell, James F.
2001-12-01
In 2003, NASA is planning to send two robotic rover vehicles to explore the surface of Mars. The spacecraft will land on airbags in different, carefully chosen locations. The search for evidence indicating conditions favorable for past or present life will be a high priority. Each rover will carry a total of ten cameras of five different types: a stereo pair of color panoramic cameras, a stereo pair of wide-field navigation cameras, one close-up camera on a movable arm, two stereo pairs of fisheye cameras for hazard avoidance, and one Sun sensor camera. This paper discusses the lenses for these cameras. Included are the specifications, design approaches, expected optical performances, prescriptions, and tolerances.
NASA Technical Reports Server (NTRS)
Farrand, W. H.; Bell, J. F., III; Johnson, J. R.; Squyres, S. W.; Soderblom, J.; Ming, D. W.
2006-01-01
Visible and Near Infrared (VNIR) multispectral observations of rocks made by the Mars Exploration Rover Spirit's Panoramic Camera (Pancam) have been analysed using a spectral mixture analysis (SMA) methodology. Scenes have been examined from the Gusev crater plains into the Columbia Hills. Most scenes on the plains and in the Columbia Hills could be modeled as three-endmember mixtures of a bright material, rock, and shade. Scenes of rocks disturbed by the rover's Rock Abrasion Tool (RAT) required additional endmembers. In the Columbia Hills there were a number of scenes in which additional rock endmembers were required. The SMA methodology identified relatively dust-free areas on undisturbed rock surfaces, as well as spectrally unique areas on RAT-abraded rocks. Spectral parameters from these areas were examined and six spectral classes were identified. These classes are named after a type rock or area and are: Adirondack, Lower West Spur, Clovis, Wishstone, Peace, and Watchtower. These classes are discriminable based primarily on near-infrared (NIR) spectral parameters. Clovis and Watchtower class rocks appear more oxidized than Wishstone class rocks and Adirondack basalts based on their having higher 535 nm band depths. Comparison of the spectral parameters of these Gusev crater rocks to parameters of glass-dominated basaltic tuffs indicates correspondence between measurements of the Clovis and Watchtower classes, but divergence for the Wishstone class rocks, which appear to have a higher fraction of crystalline ferrous iron bearing phases. Despite a high sulfur content, the rock Peace has NIR properties resembling plains basalts.
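The spectral mixture analysis used above models each scene spectrum as a linear combination of endmember spectra (bright material, rock, shade) and solves for the abundances. A minimal unconstrained least-squares sketch with toy endmembers (not Pancam data):

```python
import numpy as np

def unmix(spectrum, endmembers):
    """Least-squares abundances of endmember spectra in a mixed spectrum.

    `endmembers` is a list of per-band spectra; returns one abundance
    per endmember (unconstrained: no sum-to-one or positivity enforced)."""
    e = np.asarray(endmembers, dtype=float).T    # shape: bands x endmembers
    a, *_ = np.linalg.lstsq(e, np.asarray(spectrum, dtype=float), rcond=None)
    return a
```

Real SMA pipelines typically add sum-to-one and non-negativity constraints and examine the residual to decide, as in this abstract, when extra rock endmembers are needed.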
Estimation of Anthocyanin Content of Berries by NIR Method
NASA Astrophysics Data System (ADS)
Zsivanovits, G.; Ludneva, D.; Iliev, A.
2010-01-01
Anthocyanin contents of fruits were estimated by a VIS spectrophotometer and compared with spectra measured by a NIR spectrophotometer (600-1100 nm, step 10 nm). The aim was to find a relationship between the NIR method and the traditional spectrophotometric method. The testing protocol using NIR is easier, faster and non-destructive. NIR spectra were prepared in pairs, reflectance and transmittance. A modular spectrocomputer, realized on the basis of a monochromator and peripherals from Bentham Instruments Ltd (GB), and a photometric camera created at the Canning Research Institute were used. An important feature of this camera is the possibility it offers for simultaneous measurement of both transmittance and reflectance with geometry patterns T0/180 and R0/45. The collected spectra were analyzed by CAMO Unscrambler 9.1 software, with PCA, PLS and PCR methods. Based on the analyzed spectra, qualitatively and quantitatively sensitive calibrations were prepared. The results showed that the NIR method allows measuring of the total anthocyanin content in fresh berry fruits or processed products without destroying them.
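The chemometric step named in this abstract (PCA/PCR-style calibration of content against NIR spectra) can be sketched with a small principal-component regression; synthetic data stand in for the measured spectra, and the component count is illustrative:

```python
import numpy as np

def pcr_fit(spectra, target, n_components=2):
    """Principal-component regression: project mean-centred spectra onto the
    leading principal components, then regress the target on the scores."""
    x = np.asarray(spectra, dtype=float)
    y = np.asarray(target, dtype=float)
    x_mean, y_mean = x.mean(axis=0), y.mean()
    xc = x - x_mean
    _, _, vt = np.linalg.svd(xc, full_matrices=False)
    pcs = vt[:n_components].T            # loading vectors (bands x components)
    scores = xc @ pcs                    # sample scores
    coef, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    return x_mean, y_mean, pcs, coef

def pcr_predict(model, spectra):
    """Predict the target (e.g. anthocyanin content) for new spectra."""
    x_mean, y_mean, pcs, coef = model
    return (np.asarray(spectra, dtype=float) - x_mean) @ pcs @ coef + y_mean
```

With enough components this reduces to ordinary least squares; in practice the component count is chosen by cross-validation, which is the role software like Unscrambler plays.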
Cameras on the moon with Apollos 15 and 16.
NASA Technical Reports Server (NTRS)
Page, T.
1972-01-01
Description of the cameras used for photography and television on the Apollo 15 and 16 missions, covering a hand-held Hasselblad camera for black-and-white panoramic views at locations visited by the astronauts, a special stereoscopic camera designed by astronomer Tom Gold, a 16-mm movie camera used on the Apollo 15 and 16 Rovers, and several TV cameras. Details are given on the far-UV camera/spectrograph of the Apollo 16 mission. An electronographic camera converts UV light to electrons, which are ejected by a KBr layer at the focus of an f/1 Schmidt camera and darken photographic film much more efficiently than far-UV light would. The astronomical activity of the Apollo 16 astronauts on the Moon, using this equipment, is discussed.
Creating 3D models of historical buildings using geospatial data
NASA Astrophysics Data System (ADS)
Alionescu, Adrian; Bǎlǎ, Alina Corina; Brebu, Floarea Maria; Moscovici, Anca-Maria
2017-07-01
Recently, much interest has been shown in understanding real-world objects by acquiring 3D images of them using laser scanning technology and panoramic images. A realistic impression of the geometric 3D data can be generated by draping real colour textures simultaneously captured by a colour camera. In this context, a new concept of geospatial data acquisition, based on panoramic images, has rapidly revolutionized the method of determining the spatial position of objects. This article describes an approach that combines terrestrial laser scanning with panoramic images captured with Trimble V10 Imaging Rover technology to enhance the detail and realism of the geospatial data set, in order to obtain 3D urban plans and virtual reality applications.
Immersive Virtual Moon Scene System Based on Panoramic Camera Data of Chang'E-3
NASA Astrophysics Data System (ADS)
Gao, X.; Liu, J.; Mu, L.; Yan, W.; Zeng, X.; Zhang, X.; Li, C.
2014-12-01
The "Immersive Virtual Moon Scene" system presents a virtual environment of the lunar surface in an immersive setting. Utilizing stereo 360-degree imagery from the panoramic camera of the Yutu rover, the system enables the operator to visualize the terrain and the celestial background from the rover's point of view in 3D. To avoid image distortion, a stereo 360-degree panorama stitched from 112 images is projected onto the inside surface of a sphere, according to panorama orientation coordinates and camera parameters, to build the virtual scene. Stars can be seen from the Moon at any time, so the Sun, planets and stars are rendered on the sphere as background according to the time and the rover's location, based on the Hipparcos catalogue. Immersed in the stereo virtual environment created by this image-based rendering technique, the operator can zoom and pan to interact with the virtual Moon scene and mark interesting objects. The hardware of the immersive virtual Moon system comprises four high-lumen projectors and a large curved screen 31 meters long and 5.5 meters high. This system, which takes all available panoramic camera data and uses it to create an immersive environment in which the operator can interact with the scene and mark interesting objects, contributed heavily to the establishment of science mission goals in the Chang'E-3 mission. After the Chang'E-3 mission, the lab housing this system will be open to the public. Besides this application, stereo animations of lunar terrain based on Chang'E-1 and Chang'E-2 data will be shown to the public on the large screen in the lab. Based on lunar exploration data, we will make more immersive virtual Moon scenes and animations to help the public learn more about the Moon in the future.
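The core projection step described above, mapping panorama pixels onto the inside of a viewing sphere, can be sketched for the common equirectangular case. The image dimensions and unit radius below are illustrative assumptions, not actual CE-3 panoramic camera parameters.

```python
# Sketch: map an equirectangular panorama pixel (u, v) onto a sphere.
# Assumed convention: u spans 360 deg of azimuth, v spans 180 deg of
# elevation with the top row at the zenith. Not the CE-3 camera model.
import math

def pixel_to_sphere(u, v, width, height, radius=1.0):
    """Return the XYZ point on the sphere's inner surface for pixel (u, v)."""
    azimuth = 2.0 * math.pi * u / width                  # 0 .. 2*pi
    elevation = math.pi / 2.0 - math.pi * v / height     # +pi/2 .. -pi/2
    x = radius * math.cos(elevation) * math.cos(azimuth)
    y = radius * math.cos(elevation) * math.sin(azimuth)
    z = radius * math.sin(elevation)
    return x, y, z

# The middle row of a 4096x2048 panorama lands on the horizon
x, y, z = pixel_to_sphere(0, 1024, 4096, 2048)
print(round(x, 6), round(y, 6), round(z, 6))  # → 1.0 0.0 0.0
```

A renderer would evaluate this mapping per vertex of a sphere mesh and sample the panorama as a texture; stereo display repeats the process for the left- and right-eye panoramas.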
Development of the SEASIS instrument for SEDSAT
NASA Technical Reports Server (NTRS)
Maier, Mark W.
1996-01-01
Two SEASIS experiment objectives are key: take images that allow three-axis attitude determination, and take multi-spectral images of the Earth. During the tether mission it is also desirable to capture images of the recoiling tether from the endmass perspective (which has never been observed). SEASIS must store all imagery taken during the tether mission until the Earth downlink can be established. SEASIS determines attitude with a panoramic camera and performs Earth observation with a telephoto-lens camera. Camera video is digitized, compressed, and stored in solid-state memory. These objectives are addressed through the following architectural choices: (1) A camera system using a Panoramic Annular Lens (PAL). This lens has a 360-deg azimuthal field of view by a +45 degree vertical field measured from a plane normal to the lens boresight axis. It has been shown in Mr. Mark Steadham's UAH M.S. thesis that this camera can determine three-axis attitude any time the Earth and one other recognizable celestial object (for example, the Sun) are in the field of view. This will be essentially all the time during tether deployment. (2) A second camera system using a telephoto lens and filter wheel. The camera is a standard black-and-white video camera. The filters are chosen to cover the visible spectral bands of remote-sensing interest. (3) A processor and mass memory arrangement linked to the cameras. Video signals from the cameras are digitized, compressed in the processor, and stored in a large static RAM bank. The processor is a multi-chip module consisting of a T800 Transputer and three Zoran floating-point Digital Signal Processors. This processor module was supplied under ARPA contract by the Space Computer Corporation to demonstrate its use in space.
NASA Technical Reports Server (NTRS)
2004-01-01
This segment of the first color image from the panoramic camera on the Mars Exploration Rover Spirit shows the rover's airbag trails. These depressions in the soil were made when the airbags were deflated and retracted after landing.
Endeavour on the Horizon (False Color)
2010-04-30
NASA's Mars Exploration Rover Opportunity used its panoramic camera (Pancam) to capture this false-color view of the rim of Endeavour crater, the rover's destination in a multi-year traverse across the sandy Martian landscape.
2010-04-30
NASA's Mars Exploration Rover Opportunity used its panoramic camera (Pancam) to capture this approximately true-color view of the rim of Endeavour crater, the rover's destination in a multi-year traverse across the sandy Martian landscape.
Adirondack Under the Microscope-2
NASA Technical Reports Server (NTRS)
2004-01-01
This overhead look at the martian rock dubbed Adirondack was captured by the Mars Exploration Rover Spirit's panoramic camera. It shows the approximate region where the rover's microscopic imager began its first close-up inspection.
Layers in Burns Cliff Examined by Opportunity
2011-11-21
NASA's Mars Exploration Rover Opportunity studied layers in the Burns Cliff slope of Endurance Crater in 2004. The layers show different types of deposition of sulfate-rich sediments. Opportunity's panoramic camera recorded this image.
Martian Eclipses: Deimos and Phobos
2004-03-08
The panoramic camera on NASA's Opportunity rover combines the first photographs of solar eclipses by Mars' two moons, Deimos and Phobos. Deimos appears as a speck in front of the Sun, and Phobos grazes its edge.
True 3-D View of 'Columbia Hills' from an Angle
NASA Technical Reports Server (NTRS)
2004-01-01
This mosaic of images from NASA's Mars Exploration Rover Spirit shows a panorama of the 'Columbia Hills' without any adjustment for rover tilt. When viewed through 3-D glasses, depth is much more dramatic and easier to see, compared with a tilt-adjusted version. This is because stereo views are created by producing two images, one corresponding to the view from the panoramic camera's left-eye camera, the other corresponding to the view from the panoramic camera's right-eye camera. The brain processes the visual input more accurately when the two images do not have any vertical offset. In this view, the vertical alignment is nearly perfect, but the horizon appears to curve because of the rover's tilt (because the rover was parked on a steep slope, it was tilted approximately 22 degrees to the west-northwest). Spirit took the images for this 360-degree panorama while en route to higher ground in the 'Columbia Hills.' The highest point visible in the hills is 'Husband Hill,' named for space shuttle Columbia Commander Rick Husband. To the right are the rover's tracks through the soil, where it stopped to perform maintenance on its right front wheel in July. In the distance, below the hills, is the floor of Gusev Crater, where Spirit landed Jan. 3, 2004, before traveling more than 3 kilometers (1.8 miles) to reach this point. This vista comprises 188 images taken by Spirit's panoramic camera from its 213th day, or sol, on Mars to its 223rd sol (Aug. 9 to 19, 2004). Team members at NASA's Jet Propulsion Laboratory and Cornell University spent several weeks processing images and producing geometric maps to stitch all the images together in this mosaic. The 360-degree view is presented in a cylindrical-perspective map projection with geometric seam correction.
NASA Technical Reports Server (NTRS)
2004-01-01
This 3-D perspective image taken by the panoramic camera onboard the Mars Exploration Rover Spirit shows 'Adirondack,' the rover's first target rock. Spirit traversed the sandy martian terrain at Gusev Crater to arrive in front of the football-sized rock on Sunday, Jan. 18, 2004, just three days after it successfully rolled off the lander. The rock was selected as Spirit's first target because it has a flat surface and is relatively free of dust - ideal conditions for grinding into the rock to expose fresh rock underneath. Clean surfaces also are better for examining a rock's top coating. Scientists named the angular rock after the Adirondack mountain range in New York. The word Adirondack is Native American and means 'They of the great rocks.' Data from the panoramic camera's red, green and blue filters were combined to create this approximate true color image.
NASA Technical Reports Server (NTRS)
2007-01-01
A promontory nicknamed 'Cape Verde' can be seen jutting out from the walls of Victoria Crater in this false-color picture taken by the panoramic camera on NASA's Mars Exploration Rover Opportunity. The rover took this picture on martian day, or sol, 1329 (Oct. 20, 2007), more than a month after it began descending down the crater walls -- and just 9 sols shy of its second Martian birthday on sol 1338 (Oct. 29, 2007). Opportunity landed on the Red Planet on Jan. 25, 2004. That's nearly four years ago on Earth, but only two on Mars because Mars takes longer to travel around the sun than Earth. One Martian year equals 687 Earth days. This view was taken using three panoramic-camera filters, admitting light with wavelengths centered at 750 nanometers (near infrared), 530 nanometers (green) and 430 nanometers (violet).
Mars Cameras Make Panoramic Photography a Snap
NASA Technical Reports Server (NTRS)
2008-01-01
If you wish to explore a Martian landscape without leaving your armchair, a few simple clicks around the NASA Web site will lead you to panoramic photographs taken from the Mars Exploration Rovers, Spirit and Opportunity. Many of the technologies that enable this spectacular Mars photography have also inspired advancements in photography here on Earth, including the panoramic camera (Pancam) and its housing assembly, designed by the Jet Propulsion Laboratory and Cornell University for the Mars missions. Mounted atop each rover, the Pancam mast assembly (PMA) can tilt a full 180 degrees and swivel 360 degrees, allowing for a complete, highly detailed view of the Martian landscape. The rover Pancams take small, 1 megapixel (1 million pixel) digital photographs, which are stitched together into large panoramas that sometimes measure 4 by 24 megapixels. The Pancam software performs some image correction and stitching after the photographs are transmitted back to Earth. Different lens filters and a spectrometer also assist scientists in their analyses of infrared radiation from the objects in the photographs. These photographs from Mars spurred developers to begin thinking in terms of larger and higher quality images: super-sized digital pictures, or gigapixels, which are images composed of 1 billion or more pixels. Gigapixel images are more than 200 times the size captured by today's standard 4 megapixel digital camera. Although originally created for the Mars missions, the detail provided by these large photographs allows for many purposes, not all of which are limited to extraterrestrial photography.
Optics to rectify CORONA panoramic photographs for map making
NASA Astrophysics Data System (ADS)
Hilbert, Robert S.
2006-08-01
In the 1960s, accurate maps of the United States were available to all from the U.S. Government, but maps of the Soviet Union were not, and in fact were classified. Maps of the Soviet Union were needed by the U.S. Government, including for targeting of Soviet ICBM sites and for negotiating the SALT ICBM disarmament treaty. Although mapping cameras were historically frame cameras with low distortion, the CORONA panoramic film coverage was used to identify ICBM sites. If distortion-free photographs could be produced from this inherently distorted panoramic material, valuable, accurate maps could be produced. Use of the stereo photographs from CORONA for developing accurate topographical maps was the mission of Itek's Gamma Rectifier. Bob Shannon's department at Itek was responsible for designing the optics for the Gamma Rectifier, and he assigned the design to the author. The optical requirements of this system are described along with the optical design solution, which allowed the inherent panoramic distortion of the original photographs to be "rectified" to a very high level of accuracy in enlarged photographs. These rectifiers were used three shifts a day for over a decade and produced the most accurate maps of the Earth's surface that existed at that time. The results facilitated the success of the Strategic Arms Limitation Talks (SALT) Treaty signed by the US and the Soviet Union in 1972, which was verified by "national means of verification" (i.e., space reconnaissance).
NASA Technical Reports Server (NTRS)
2004-01-01
The smooth surfaces of angular and rounded rocks seen in this image of the martian terrain may be the result of wind-polishing debris. The picture was taken by the panoramic camera on the Mars Exploration Rover Spirit.
2006-01-03
This is the Opportunity panoramic camera's 'Erebus Rim' panorama, acquired on sols 652 to 663 (Nov. 23 to Dec. 5, 2005), as NASA's Mars Exploration Rover Opportunity was exploring sand dunes and outcrop rocks in Meridiani Planum.
2011-12-07
This false-color view of a mineral vein called 'Homestake' comes from the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity. The vein is about the width of a thumb and about 18 inches (45 centimeters) long.
Experiments in interactive panoramic cinema
NASA Astrophysics Data System (ADS)
Fisher, Scott S.; Anderson, Steve; Ruiz, Susana; Naimark, Michael; Hoberman, Perry; Bolas, Mark; Weinberg, Richard
2005-03-01
For most of the past 100 years, cinema has been the premier medium for defining and expressing relations to the visible world. However, cinematic spectacles delivered in darkened theaters are predicated on a denial of both the body and the physical surroundings of the spectators watching them. To overcome these deficiencies, filmmakers have historically turned to narrative, seducing audiences with compelling stories and providing realistic characters with whom to identify. This paper describes several research projects in interactive panoramic cinema that attempt to sidestep the narrative preoccupations of conventional cinema and instead are based on notions of space, movement and embodied spectatorship rather than traditional storytelling. Example projects include interactive works developed with the use of a unique 360 degree camera and editing system, and also development of panoramic imagery for a large projection environment with 14 screens on 3 adjacent walls in a 5-4-5 configuration, with observations and findings from an experiment projecting panoramic video on 12 of the 14 screens, in a 4-4-4, 270 degree configuration.
View of 'Cape Verde' from 'Cape St. Mary' in Mid-Afternoon
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic. The exposures were taken during mid-afternoon lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
View of 'Cape Verde' from 'Cape St. Mary' in Late Morning
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic. The exposures were taken during late-morning lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
Opportunity's Second Martian Birthday at Cape Verde
NASA Technical Reports Server (NTRS)
2007-01-01
A promontory nicknamed 'Cape Verde' can be seen jutting out from the walls of Victoria Crater in this approximate true-color picture taken by the panoramic camera on NASA's Mars Exploration Rover Opportunity. The rover took this picture on martian day, or sol, 1329 (Oct. 20, 2007), more than a month after it began descending down the crater walls -- and just 9 sols shy of its second Martian birthday on sol 1338 (Oct. 29, 2007). Opportunity landed on the Red Planet on Jan. 25, 2004. That's nearly four years ago on Earth, but only two on Mars because Mars takes longer to travel around the sun than Earth. One Martian year equals 687 Earth days. The overall soft quality of the image, and the 'haze' seen in the lower right portion, are the result of scattered light from dust on the front sapphire window of the rover's camera. This view was taken using three panoramic-camera filters, admitting light with wavelengths centered at 750 nanometers (near infrared), 530 nanometers (green) and 430 nanometers (violet).
2005-09-11
Taking advantage of extra solar energy collected during the day, NASA's Mars Exploration Rover Spirit settled in for an evening of stargazing, photographing the two moons of Mars as they crossed the night sky. The first two images in this sequence show gradual enhancements in the surface detail of Mars' largest moon, Phobos, made possible through a combination technique known as "stacking." In "stacking," scientists use a mathematical process known as Laplacian sharpening to reinforce features that appear consistently in repetitive images and minimize features that show up only intermittently. In this view of Phobos, the large crater named Stickney is just out of sight on the moon's upper right limb. Spirit acquired the first two images with the panoramic camera on the night of sol 585 (Aug. 26, 2005). The far right image of Phobos, for comparison, was taken by the High Resolution Stereo Camera on Mars Express, a European Space Agency orbiter. The third image in this sequence was derived from the far right image by making it blurrier for comparison with the panoramic camera images to the left: http://photojournal.jpl.nasa.gov/catalog/PIA06335
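The stacking idea described above can be sketched numerically: averaging repeated frames reinforces features that appear consistently while random noise cancels, and a Laplacian sharpening step then boosts the recovered detail. The scene, noise level, and sharpening weight below are invented for illustration; this is not the MER team's actual processing pipeline.

```python
# Sketch of frame stacking plus Laplacian sharpening on synthetic data.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "moon": a bright disk, imaged 20 times with heavy sensor noise
yy, xx = np.mgrid[0:64, 0:64]
truth = ((xx - 32) ** 2 + (yy - 32) ** 2 < 15 ** 2).astype(float)
frames = [truth + 0.5 * rng.standard_normal(truth.shape) for _ in range(20)]

stacked = np.mean(frames, axis=0)     # noise std drops by about sqrt(20)

# 4-neighbour Laplacian (periodic edges, fine for a sketch); subtracting
# a fraction of it emphasizes edges, as in unsharp masking
lap = (np.roll(stacked, 1, 0) + np.roll(stacked, -1, 0)
       + np.roll(stacked, 1, 1) + np.roll(stacked, -1, 1) - 4.0 * stacked)
sharpened = stacked - 0.5 * lap

print(f"noise vs truth, single frame: {np.std(frames[0] - truth):.3f}")
print(f"noise vs truth, 20-frame stack: {np.std(stacked - truth):.3f}")
```

The averaging step assumes the frames are already registered; real planetary stacks first align each frame to sub-pixel accuracy.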
Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery
NASA Astrophysics Data System (ADS)
Kwoh, L. K.; Huang, X.; Tan, W. J.
2012-07-01
XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52~0.60 μm for Green, 0.63~0.69 μm for Red and 0.76~0.89 μm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, Red and Green). These CCDs were physically separated in the focal plane, and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the three bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separations agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs. Red and Green vs. Red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixel for the NIR vs. Red and Green vs. Red CCDs, respectively. The focal length was found to be shorter by about 0.8%, which was attributed to the lower temperature at which XSAT currently operates. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
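The attitude-compensation step above, absorbing slow platform motion with low-order polynomials in roll, pitch and yaw, can be sketched for a single angle. The drift profile, noise level, and normalized time axis are invented for illustration; only the 3rd-order polynomial choice comes from the abstract.

```python
# Sketch: fit a 3rd-order polynomial to noisy roll-angle samples over the
# scan, as in the XSAT camera-model compensation. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 200)               # normalized scan time

# Hypothetical slow roll drift (degrees) plus attitude-sensor noise
true_roll = 0.02 + 0.05 * t - 0.08 * t**2 + 0.03 * t**3
measured = true_roll + 0.005 * rng.standard_normal(t.size)

coeffs = np.polyfit(t, measured, deg=3)      # 3rd-order fit
model = np.polyval(coeffs, t)

rms_residual = np.sqrt(np.mean((model - true_roll) ** 2))
print(f"RMS error of fitted roll model: {rms_residual:.5f} deg")
```

Because the fit pools all samples along the scan, the smoothed model tracks the underlying drift well below the per-sample noise level, which is why a handful of polynomial coefficients can stand in for a full attitude history.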
Rock with Odd Coating Beside a Young Martian Crater
2010-03-24
This image from the panoramic camera on NASA's Mars Exploration Rover Opportunity shows a rock called Chocolate Hills, which the rover found and examined at the edge of a young crater called Concepción.
NASA Technical Reports Server (NTRS)
2004-01-01
This segment of the first color image from the panoramic camera on the Mars Exploration Rover Spirit shows the rover's airbag trails (upper left). These depressions in the soil were made when the airbags were deflated and retracted after landing.
2004-06-17
This 3-D image, taken by the left and right eyes of the panoramic camera on NASA's Mars Exploration Rover Spirit, shows the odd rock formation dubbed 'Cobra Hoods' (center). 3-D glasses are necessary to view this image.
'Algonquin' Outcrop on Spirit's Sol 680
NASA Technical Reports Server (NTRS)
2005-01-01
This view combines four frames from Spirit's panoramic camera, looking in the drive direction on the rover's 680th Martian day, or sol (Dec. 1, 2005). The outcrop of apparently layered bedrock has the informal name 'Algonquin.'
2010-02-16
This false-color image, taken by the panoramic camera on NASA's rover Opportunity, shows the rock Chocolate Hills perched on the rim of the 10-meter (33-foot) wide Concepción crater. This rock has a thick, dark-colored coating resembling chocolate.
NASA Technical Reports Server (NTRS)
2004-01-01
The rust color of the Martian landscape is apparent in this low-resolution thumbnail image taken by the panoramic camera on the Mars Exploration Rover Spirit. This image is part of a larger image currently stored onboard the rover in its memory.
Churned-Up Rocky Debris and Dust (True Color)
NASA Technical Reports Server (NTRS)
2005-01-01
NASA's Mars Exploration Rover Spirit has been analyzing sulfur-rich rocks and surface materials in the 'Columbia Hills' in Gusev Crater on Mars. This image shows rocky debris and dust, which planetary scientists call 'regolith' or 'soil,' that has been churned up by the rover wheels. This 40-centimeter-wide (16-inch-wide) patch of churned-up dirt, nicknamed 'Paso Robles,' contains brighter patches measured to be high in sulfur by Spirit's Alpha Particle X-ray Spectrometer. Spirit's panoramic camera took this image on martian day, or sol, 400 (Feb. 16, 2005). The image represents the panoramic camera team's best current attempt at generating a true color view of what this scene would look like if viewed by a human on Mars. The image was generated from a combination of six calibrated, left-eye images acquired through filters ranging from 430-nanometer to 750-nanometer wavelengths.
NASA Astrophysics Data System (ADS)
García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier
2016-04-01
Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor to the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. The sensitivity of each variable involved in the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and by the Fourier amplitude sensitivity test (FAST) method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
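The variance-decomposition idea behind the Sobol method above can be sketched with a pick-freeze Monte Carlo estimator on a toy two-parameter model. The model, sample sizes, and random seed are all invented; the real study applied Sobol and FAST analyses to the full camera-LiDAR calibration model.

```python
# Pick-freeze Monte Carlo estimate of first-order Sobol indices for a
# toy model f(x1, x2) = x1 + 0.2*x2 with uniform inputs on [0, 1].
# Analytically, S1 = 1/1.04 ≈ 0.96 and S2 = 0.04/1.04 ≈ 0.04.
import numpy as np

def model(x):
    # Hypothetical model: output dominated by the first parameter
    return x[:, 0] + 0.2 * x[:, 1]

rng = np.random.default_rng(3)
n, d = 100_000, 2
A = rng.random((n, d))
B = rng.random((n, d))

fA, fB = model(A), model(B)
var_total = np.var(fA)

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # vary only the i-th input between runs
    S.append(np.mean(fB * (model(ABi) - fA)) / var_total)

print("first-order Sobol indices:", [round(s, 2) for s in S])
```

Latin hypercube sampling would replace the plain `rng.random` draws with stratified ones to reduce estimator variance at the same sample count.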
Frost on Mars Rover Opportunity
NASA Technical Reports Server (NTRS)
2004-01-01
Frost can form on surfaces if enough water is present and the temperature is sufficiently low. On each of NASA's Mars Exploration Rovers, the calibration target for the panoramic camera provides a good place to look for such events. A thin frost was observed by Opportunity's panoramic camera on the rover's 257th sol (Oct. 13, 2004) 11 minutes after sunrise (left image). The presence of the frost is most clearly seen on the post in the center of the target, particularly when compared with the unsegmented outer ring of the target, which is white. The post is normally black. For comparison, note the difference in appearance in the image on the right, taken about three hours later, after the frost had dissipated. Frost has not been observed at Spirit, where the amount of atmospheric water vapor is observed to be appreciably lower. Both images were taken through a filter centered at a wavelength of 440 nanometers (blue).
Achieving real-time capsule endoscopy (CE) video visualization through panoramic imaging
NASA Astrophysics Data System (ADS)
Yi, Steven; Xie, Jean; Mui, Peter; Leighton, Jonathan A.
2013-02-01
In this paper, we present a novel real-time capsule endoscopy (CE) video visualization concept based on panoramic imaging. Typical CE videos run about 8 hours and are manually reviewed by physicians to locate diseases such as bleeding and polyps. To date, there is no commercially available tool capable of providing stabilized and processed CE video that is easy to analyze in real time, so the burden on physicians' disease-finding efforts is significant. In fact, since the CE camera sensor has a limited forward-looking view and a low image frame rate (typically 2 frames per second), and captures very close-range imagery of the GI tract surface, it is no surprise that traditional visualization methods based on tracking and registration often fail. This paper presents a novel concept for real-time CE video stabilization and display. Instead of working directly on traditional forward-looking FOV (field of view) images, we work on panoramic images to bypass many problems facing traditional imaging modalities. Methods for panoramic image generation based on optical lens principles, leading to real-time data visualization, are presented. In addition, non-rigid panoramic image registration methods are discussed.
NASA Astrophysics Data System (ADS)
Kinch, K. M.; Bell, J. F.; Madsen, M. B.
2012-12-01
The Panoramic Cameras (Pancams) [1] on NASA's Mars Exploration Rovers have each returned in excess of 17000 images of their external calibration targets (caltargets), a set of optically well-characterized patches of materials with differing reflectance properties. During the mission dust deposition on the caltargets changed their optical reflectance properties [2]. The thickness of dust on the caltargets can be derived with high confidence from the contrast between brighter and darker colored patches. The dustier the caltarget the less contrast. We present a new history of dust deposition and removal at the two MER landing sites. Our data reveals two quite distinct dust environments. At the Spirit landing site half the Martian year is dominated by dust deposition, the other half by dust removal that usually happens during brief sharp wind events. At the Opportunity landing site the Martian year has a four-season cycle of deposition-removal-deposition-removal with dust removal happening gradually throughout the two removal seasons. Comparison to atmospheric optical depth measurements [3] shows that dust removals happen during dusty high-wind periods and that dust deposition rates are roughly proportional to the atmospheric dust load. We compare with dust deposition studies from other Mars landers and also present some early results from observation of dust on a similar camera calibration target on the Mars Science Laboratory mission. References: 1. Bell, J.F., III, et al., Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation. J. Geophys. Res., 2003. 108(E12): p. 8063. 2. Kinch, K.M., et al., Dust Deposition on the Mars Exploration Rover Panoramic Camera (Pancam) Calibration Targets. J. Geophys. Res., 2007. 112(E06S03): p. doi:10.1029/2006JE002807. 3. Lemmon, M., et al., Atmospheric Imaging Results from the Mars Exploration Rovers: Spirit and Opportunity. Science, 2004. 306: p. 1753-1756. 
Deposited dust optical depth on the Pancam caltargets as a function of time. The lower x-axes show sol number; the upper x-axes show the areocentric longitude of the Sun, Ls. Data shown are from caltarget observations with solar incidence angle i < 45°. Left column is Spirit; right column is Opportunity. Top row shows our derived deposited optical depth in the L5 (535 nm) filter. Bottom row shows the atmospheric optical depth in the L8 (440 nm) filter as reported by the MER atmospheric team [3].
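The contrast-based retrieval described above ("the dustier the caltarget, the less contrast") can be illustrated with a deliberately simplified model: if light crosses the dust layer once on the way in and once on the way out, the clean-target contrast between a bright and a dark patch is attenuated by exp(-2τ). All reflectance values below are invented, and the actual Pancam analysis [2] uses a far more complete radiative-transfer treatment of the dust layer.

```python
# Heavily simplified sketch: infer dust optical depth tau from the loss of
# contrast between bright and dark calibration-target patches, assuming a
# two-pass attenuation model: observed contrast = clean contrast * exp(-2*tau).
import math

def dust_optical_depth(bright_obs, dark_obs, bright_clean, dark_clean):
    """Invert the assumed two-pass attenuation model for tau."""
    contrast_ratio = (bright_obs - dark_obs) / (bright_clean - dark_clean)
    return -0.5 * math.log(contrast_ratio)

# Hypothetical clean-target patch reflectances and one dusty observation
tau = dust_optical_depth(bright_obs=0.45, dark_obs=0.25,
                         bright_clean=0.60, dark_clean=0.10)
print(f"inferred dust optical depth: {tau:.3f}")
```

In this toy case the observed contrast is 0.4 of the clean value, giving τ ≈ 0.46; a real retrieval must also account for light scattered by the dust itself, which adds brightness rather than only removing it.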
Near-Infrared Imaging for Detecting Caries and Structural Deformities in Teeth
Angelino, Keith; Edlund, David A.; Shah, Pratik
2017-01-01
2-D radiographs, while commonly used for evaluating sub-surface hard structures of teeth, have low sensitivity for early caries lesions, particularly those on tooth occlusal surfaces. Radiographs are also frequently refused by patients over safety concerns. Translucency of teeth in the near-infrared (NIR) range offers a non-ionizing and safe approach to detect dental caries. We report the construction of an NIR (850 nm) LED imaging system, comprising an NIR source and an intraoral camera for rapid dental evaluations. The NIR system was used to image teeth of ten consenting human subjects and successfully detected secondary, amalgam-occluded and early caries lesions without supplementary image processing. The camera-wand system was also capable of revealing demineralized areas, deep and superficial cracks, and other clinical features of teeth usually visualized by X-rays. The NIR system's clinical utility, simple design, low cost, and user friendliness make it an effective dental caries screening technology in conjunction with or in place of radiographs. PMID:28507826
Raspberry Pi camera with intervalometer used as crescograph
NASA Astrophysics Data System (ADS)
Albert, Stefan; Surducan, Vasile
2017-12-01
An intervalometer is an attachment or built-in facility of a photo camera that triggers the shutter at set intervals over a period. Professional cameras with built-in intervalometers are expensive and quite difficult to find. The open-source CHDK firmware allows an intervalometer to be implemented on Canon cameras only, and finding a Canon camera with a near-infrared (NIR) photographic lens at an affordable price is practically impossible. In experiments requiring several cameras (used as crescographs to measure growth in plants, but also for coarse evaluation of the water content of leaves), the cost of the equipment is often over budget. Using two Raspberry Pi modules, each equipped with a low-cost NIR camera and a Wi-Fi adapter (for downloading pictures stored on the SD card), and some freely available software, we implemented two low-budget intervalometer cameras. The shooting interval, the number of pictures to be taken, the image resolution and some other parameters can be fully programmed. The cameras were in continuous use for three months (July-October 2017) in a relevant environment (outdoors), proving the functionality of the concept.
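The scheduling logic of such an intervalometer is simple enough to sketch. The following minimal Python sketch (function names hypothetical, not from the paper) generates timestamped capture filenames at a fixed interval; the camera backend is injected as a callable so the logic can run without Pi hardware:

```python
from datetime import datetime, timedelta

def intervalometer(capture, interval_s=3600, n_frames=24, start=None):
    """Call `capture(filename)` n_frames times, one shot per interval_s.

    `capture` abstracts the camera backend (e.g. a picamera or raspistill
    call on the Raspberry Pi); it is injected here so the scheduling
    logic itself is testable. In a real deployment you would also
    time.sleep(interval_s) between shots.
    """
    start = start or datetime(2017, 7, 1, 6, 0, 0)
    names = []
    for i in range(n_frames):
        t = start + timedelta(seconds=i * interval_s)
        name = t.strftime("nir_%Y%m%d_%H%M%S.jpg")  # timestamped filename
        capture(name)
        names.append(name)
    return names
```

A usage sketch: `intervalometer(camera.capture, interval_s=3600, n_frames=24)` would take one frame per hour for a day, with the frame count, interval and naming all programmable, as described above.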
Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor.
Kim, Dong Seop; Arsalan, Muhammad; Park, Kang Ryoung
2018-03-23
Recent developments in intelligent surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras are used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and brightness of background make detection a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works. PMID:29570690
Astronaut Ronald Evans photographed during transearth coast EVA
NASA Technical Reports Server (NTRS)
1972-01-01
Astronaut Ronald E. Evans is photographed performing extravehicular activity (EVA) during the Apollo 17 spacecraft's transearth coast. During his EVA, Command Module pilot Evans retrieved film cassettes from the Lunar Sounder, Mapping Camera, and Panoramic Camera. The cylindrical object at Evans' left side is the mapping camera cassette. The total time for the transearth EVA was one hour seven minutes 19 seconds, starting at ground elapsed time of 257:25 (2:28 p.m.) and ending at ground elapsed time of 258:42 (3:35 p.m.) on Sunday, December 17, 1972.
Quantifying seasonal variation of leaf area index using near-infrared digital camera in a rice paddy
NASA Astrophysics Data System (ADS)
Hwang, Y.; Ryu, Y.; Kim, J.
2017-12-01
Digital cameras have been widely used to quantify leaf area index (LAI). Numerous simple and automatic methods have been proposed to improve digital camera based LAI estimates. However, most studies in rice paddies have relied on arbitrary thresholds or complex radiative transfer models to make binary images. Moreover, only a few studies have reported continuous, automatic observation of LAI over the season in a rice paddy. The objective of this study is to quantify seasonal variations of LAI using raw near-infrared (NIR) images coupled with a histogram shape-based algorithm in a rice paddy. As vegetation strongly reflects NIR light, we installed a NIR digital camera 1.8 m above the ground surface and acquired unsaturated raw-format images at one-hour intervals at solar zenith angles between 15° and 80° over the entire growing season in 2016 (May to September). We applied a sub-pixel classification combined with a light-scattering correction method. Finally, to confirm the accuracy of the quantified LAI, we also conducted direct (destructive sampling) and indirect (LAI-2200) manual observations of LAI once per ten days on average. Preliminary results show that NIR-derived LAI agreed well with in-situ observations, but divergence tended to appear once the rice canopy was fully developed. Continuous monitoring of LAI in rice paddies will help us to better understand carbon and water fluxes and to evaluate satellite-based LAI products.
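The study's sub-pixel classification is more involved, but the basic chain from a classified NIR image to LAI can be sketched as a gap-fraction inversion of the Beer-Lambert law. In this minimal sketch (function names hypothetical; the extinction coefficient k = 0.5 is a nominal value for a spherical leaf-angle distribution, not a value from the study), a binary vegetation mask yields a gap fraction, which is inverted to LAI:

```python
import math

def gap_fraction_from_mask(binary_mask):
    """Fraction of non-vegetation pixels in a binary mask (1 = vegetation)."""
    flat = [px for row in binary_mask for px in row]
    return 1.0 - sum(flat) / len(flat)

def lai_from_gap_fraction(gap_fraction, k=0.5):
    """Invert the Beer-Lambert law: LAI = -ln(P0) / k.

    P0 is the canopy gap fraction; k is the canopy extinction
    coefficient (nominal 0.5 assumed here).
    """
    if not 0.0 < gap_fraction <= 1.0:
        raise ValueError("gap fraction must be in (0, 1]")
    return -math.log(gap_fraction) / k
```

Saturation of the gap fraction near zero in a closed canopy is one plausible reason such estimates diverge from destructive sampling once the rice canopy is fully developed.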
It's a Bird, It's a Plane, It's a... Spacecraft?
NASA Technical Reports Server (NTRS)
2004-01-01
Observing the sky with the green filter of its panoramic camera, the Mars Exploration Rover Spirit came across a surprise: a streak across the sky. The streak, seen in the middle of this mosaic of images taken by the navigation and panoramic cameras, was probably the brightest object in the sky at the time. Scientists theorize that the mystery line could be either a meteorite or one of seven out-of-commission spacecraft still orbiting Mars. Because the object appeared to move 4 degrees of arc in 15 seconds, it is probably not the Russian probes Mars 2, Mars 3, Mars 5, or Phobos 2; or the American probes Mariner 9 or Viking 1. That leaves Viking 2, which has a polar orbit that would fit with the north-south orientation of the streak. In addition, only Viking 1 and 2 were left in orbits that could produce motion as fast as that seen by Spirit. Said Mark Lemmon, a rover team member from Texas A&M University, Texas, 'Is this the first image of a meteor on Mars, or an image of a spacecraft sent from another world during the dawn of our robotic space exploration program? We may never know, but we are still looking for clues'.
The inset shows only the panoramic image of the streak.
Farrand, W. H.; Bell, J.F.; Johnson, J. R.; Squyres, S. W.; Soderblom, J.; Ming, D. W.
2006-01-01
Visible and near-infrared (VNIR) multispectral observations of rocks made by the Mars Exploration Rover Spirit's Panoramic Camera (Pancam) have been analyzed using a spectral mixture analysis (SMA) methodology. Scenes have been examined from the Gusev crater plains into the Columbia Hills. Most scenes on the plains and in the Columbia Hills could be modeled as three end-member mixtures of a bright material, rock, and shade. Scenes of rocks disturbed by the rover's Rock Abrasion Tool (RAT) required additional end-members. In the Columbia Hills, there were a number of scenes in which additional rock end-members were required. The SMA methodology identified relatively dust-free areas on undisturbed rock surfaces as well as spectrally unique areas on RAT-abraded rocks. Spectral parameters from these areas were examined, and six spectral classes were identified. These classes are named after a type rock or area and are Adirondack, Lower West Spur, Clovis, Wishstone, Peace, and Watchtower. These classes are discriminable primarily on the basis of near-infrared (NIR) spectral parameters. Clovis and Watchtower class rocks appear more oxidized than Wishstone class rocks and Adirondack basalts based on their having higher 535 nm band depths. Comparison of the spectral parameters of these Gusev crater rocks to parameters of glass-dominated basaltic tuffs indicates correspondence between measurements of Clovis and Watchtower classes but divergence for the Wishstone class rocks, which appear to have a higher fraction of crystalline ferrous iron-bearing phases. Despite a high sulfur content, the rock Peace has NIR properties resembling plains basalts. Copyright 2006 by the American Geophysical Union.
Image quality prediction - An aid to the Viking lander imaging investigation on Mars
NASA Technical Reports Server (NTRS)
Huck, F. O.; Wall, S. D.
1976-01-01
Image quality criteria and image quality predictions are formulated for the multispectral panoramic cameras carried by the Viking Mars landers. Image quality predictions are based on expected camera performance, Mars surface radiance, and lighting and viewing geometry (fields of view, Mars lander shadows, solar day-night alternation), and are needed to diagnose camera performance, to arrive at a preflight imaging strategy, and to revise that strategy should the need arise. Landing considerations, camera control instructions, camera control logic, aspects of the imaging process (spectral response, spatial response, sensitivity), and likely problems are discussed. Major concerns include: degradation of camera response by isotope radiation, uncertainties in lighting and viewing geometry and in landing site local topography, contamination of the camera window by dust abrasion, and initial errors in assigning camera dynamic ranges (gains and offsets).
Rock with Odd Coating Beside a Young Martian Crater, False Color
2010-03-24
This false color image from the panoramic camera on NASA's Mars Exploration Rover Opportunity shows a rock called Chocolate Hills, which the rover found and examined at the edge of a young crater called Concepción.
Martian Sunsets More Than Just Pretty
2004-01-10
This image shows the Sun as it appears on Mars throughout the day. Scientists monitor the dimming of the setting Sun to assess how much dust is in the martian atmosphere. The pictures were taken by the Mars Exploration Rover Spirit panoramic camera.
The Utility of Using a Near-Infrared (NIR) Camera to Measure Beach Surface Moisture
NASA Astrophysics Data System (ADS)
Nelson, S.; Schmutz, P. P.
2017-12-01
Surface moisture content is an important factor that must be considered when studying aeolian sediment transport in a beach environment. A few different instruments and procedures are available for measuring surface moisture content (i.e. moisture probes, LiDAR, and gravimetric moisture data from surface scrapings); however, these methods can be inaccurate, costly, or impractical, particularly in the field. Near-infrared (NIR) spectral band imagery is another technique used to obtain moisture data. NIR imagery has been used predominantly in remote sensing and has yet to be used for ground-based measurements. Dry sand reflects infrared radiation from the sun, whereas wet sand absorbs it. This study therefore assesses the utility of measuring the surface moisture content of beach sand with a modified NIR camera. A traditional point-and-shoot digital camera was internally modified with the placement of a visible light-blocking filter. Images were taken of three different types of beach sand at controlled moisture content values, with sunlight as the source of infrared radiation. A technique was established through trial and error by comparing resultant histogram values in Adobe Photoshop across the various moisture conditions. The resultant IR absorption histogram values were calibrated to actual gravimetric moisture content from surface scrapings of the samples. Overall, the results illustrate that the NIR-modified camera does not adequately measure beach surface moisture content. However, there were noted differences in IR absorption histogram values among the different sediment types. Sediment with darker quartz mineralogy provided larger variations in histogram values, but the technique is not sensitive enough to accurately represent low moisture percentages, which are of most importance when studying aeolian sediment transport.
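The calibration step described above — relating IR-absorption histogram values to gravimetric moisture — amounts to a simple regression. A minimal ordinary least-squares sketch (function name hypothetical; the sample data in the test are invented for illustration, not the study's measurements):

```python
def fit_linear_calibration(histogram_means, moisture_pct):
    """Ordinary least-squares fit: moisture ≈ a * histogram_mean + b.

    histogram_means: mean IR-absorption histogram value per sample image.
    moisture_pct: gravimetric moisture content of the matching scraping.
    Returns the slope a and intercept b of the calibration line.
    """
    n = len(histogram_means)
    mx = sum(histogram_means) / n
    my = sum(moisture_pct) / n
    sxx = sum((x - mx) ** 2 for x in histogram_means)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(histogram_means, moisture_pct))
    a = sxy / sxx          # slope: moisture change per histogram unit
    b = my - a * mx        # intercept
    return a, b
```

A small slope with large residual scatter at low moisture values would reflect the study's finding that the technique lacks sensitivity in exactly the range that matters for aeolian transport.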
NASA Astrophysics Data System (ADS)
Griffiths, Andrew; Coates, Andrew; Muller, Jan-Peter; Jaumann, Ralf; Josset, Jean-Luc; Paar, Gerhard; Barnes, David
2010-05-01
The ExoMars mission has evolved into a joint European-US mission to deliver a trace gas orbiter and a pair of rovers to Mars in 2016 and 2018, respectively. The European rover will carry the Pasteur exobiology payload, including the 1.56 kg Panoramic Camera (PanCam). PanCam will provide multispectral stereo images from Wide-Angle Cameras (WACs) with a 34 deg horizontal field of view (580 microrad/pixel) and colour monoscopic "zoom" images from a High Resolution Camera (HRC) with a 5 deg horizontal field of view (83 microrad/pixel). The stereo WACs are based on Beagle 2 Stereo Camera System heritage [1]. Integrated with the WACs and HRC into the PanCam optical bench (which helps the instrument meet its planetary protection requirements) is the PanCam Interface Unit (PIU), which provides image storage, a SpaceWire interface to the rover and DC-DC power conversion. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission [2] as well as providing multispectral geological imaging, colour and stereo panoramic images, and solar images for water vapour abundance and dust optical depth measurements. The HRC can be used for high-resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls. Additionally, the HRC will be used to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. In short, PanCam provides the overview and context for the ExoMars experiment locations, required to enable the exobiology aims of the mission. In addition to these baseline capabilities, further enhancements to PanCam are possible to increase its effectiveness for astrobiology and planetary exploration: 1. Rover Inspection Mirror (RIM) 2. Organics Detection by Fluorescence Excitation (ODFE) LEDs [3-6] 3. 
UVIS broadband UV Flux and Opacity Determination (UVFOD) photodiode This paper will discuss the scientific objectives and resource impacts of these enhancements. References: 1. Griffiths, A.D., Coates, A.J., Josset, J.-L., Paar, G., Hofmann, B., Pullan, D., Ruffer, P., Sims, M.R., Pillinger, C.T., The Beagle 2 stereo camera system, Planet. Space Sci. 53, 1466-1488, 2005. 2. Paar, G., Oberst, J., Barnes, D.P., Griffiths, A.D., Jaumann, R., Coates, A.J., Muller, J.P., Gao, Y., Li, R., 2007, Requirements and Solutions for ExoMars Rover Panoramic Camera 3d Vision Processing, abstract submitted to EGU meeting, Vienna, 2007. 3. Storrie-Lombardi, M.C., Hug, W.F., McDonald, G.D., Tsapin, A.I., and Nealson, K.H. 2001. Hollow cathode ion lasers for deep ultraviolet Raman spectroscopy and fluorescence imaging. Rev. Sci. Ins., 72 (12), 4452-4459. 4. Nealson, K.H., Tsapin, A., and Storrie-Lombardi, M. 2002. Searching for life in the universe: unconventional methods for an unconventional problem. International Microbiology, 5, 223-230. 5. Mormile, M.R. and Storrie-Lombardi, M.C. 2005. The use of ultraviolet excitation of native fluorescence for identifying biomarkers in halite crystals. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 246-253. 6. Storrie-Lombardi, M.C. 2005. Post-Bayesian strategies to optimize astrobiology instrument suites: lessons from Antarctica and the Pilbara. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 288-301.
Stargazing at 'Husband Hill Observatory' on Mars
NASA Technical Reports Server (NTRS)
2005-01-01
NASA's Mars Exploration Rover Spirit continues to take advantage of extra solar energy by occasionally turning its cameras upward for night sky observations. Most recently, Spirit made a series of observations of bright star fields from the summit of 'Husband Hill' in Gusev Crater on Mars. Scientists use the images to assess the cameras' sensitivity and to search for evidence of nighttime clouds or haze. The image on the left is a computer simulation of the stars in the constellation Orion. The next three images are actual views of Orion captured with Spirit's panoramic camera during exposures of 10, 30, and 60 seconds. Because Spirit is in the southern hemisphere of Mars, Orion appears upside down compared to how it would appear to viewers in the Northern Hemisphere of Earth. 'Star trails' in the longer exposures are a result of the planet's rotation. The faintest stars visible in the 60-second exposure are about as bright as the faintest stars visible with the naked eye from Earth (about magnitude 6 in astronomical terms). The Orion Nebula, famous as a nursery of newly forming stars, is also visible in these images. Bright streaks in some parts of the images aren't stars or meteors or unidentified flying objects, but are caused by solar and galactic cosmic rays striking the camera's detector. Spirit acquired these images with the panoramic camera on Martian day, or sol, 632 (Oct. 13, 2005) at around 45 minutes past midnight local time, using the camera's broadband filter (wavelengths of 739 nanometers plus or minus 338 nanometers).
NASA Astrophysics Data System (ADS)
Wierzbicki, Damian; Fryskowska, Anna; Kedzierski, Michal; Wojtkowska, Michalina; Delis, Paulina
2018-01-01
Unmanned aerial vehicles are suited to various photogrammetry and remote sensing missions. Such platforms are equipped with various optoelectronic sensors imaging in the visible and infrared spectral ranges, as well as thermal sensors. Nowadays, near-infrared (NIR) images acquired from low altitudes are often used for producing orthophoto maps, for precision agriculture among other applications. One major problem is the use of low-cost, compact, custom NIR cameras with wide-angle lenses that introduce vignetting. In numerous cases, such cameras acquire images of low radiometric quality, depending on the lighting conditions. This paper presents a method for assessing the radiometric quality of low-altitude NIR imagery data from a custom sensor. The method relies on statistical analysis of the NIR images. The data used for the analyses were acquired from various altitudes under various weather and lighting conditions. An objective NIR imagery quality index was determined as a result of the research. The results obtained using this index enabled the classification of images into three categories: good, medium, and low radiometric quality. This classification makes it possible to determine the a priori error of the acquired images and to assess whether a rerun of the photogrammetric flight is necessary.
Near-infrared imaging of water in human hair.
Egawa, Mariko; Hagihara, Motofumi; Yanai, Motohiro
2013-02-01
The water content of hair can be evaluated by weighing, the Karl Fischer method, and from electrical properties. However, these methods cannot be used to study the distribution of water in the hair. Imaging techniques are required for this purpose. In this study, a highly sensitive near-infrared (NIR) imaging system was developed for evaluating water in human hair. The results obtained from NIR imaging and conventional methods were compared. An extended indium-gallium-arsenide NIR camera (detection range: 1100-2200 nm) and diffuse illumination unit developed in our laboratory were used to obtain a NIR image of hair. A water image was obtained using a 1950-nm interference filter and polarization filter. Changes in the hair water content with relative humidity (20-95% RH) and after immersion in a 7% (w/w) sorbitol solution were measured using the NIR camera and an insulation resistance tester. The changes in the water content after treatment with two types of commercially available shampoo were also measured using the NIR camera. As the water content increased with changes in the relative humidity, the brightness of the water image decreased and the insulation resistance decreased. The brightness in the NIR image of hair treated with sorbitol solution was lower than that in the image of hair treated with water. This shows the sorbitol-treated hair contains more water than water-treated hair. The sorbitol-treated hair had a lower resistance after treatment than before, which also shows that sorbitol treatment increases the water content. With this system, we could detect a difference in the moisturizing effect between two commercially available shampoos. The highly sensitive imaging system could be used to study water in human hair. Changes in the water content of hair depended on the relative humidity and treatment with moisturizer. The results obtained using the NIR imaging system were similar to those obtained using a conventional method. 
Our system could detect differences in the moisturizing effects of two commercially available shampoos. © 2012 John Wiley & Sons A/S.
Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation
Bell, J.F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.N.; Arneson, H.M.; Brown, D.; Collins, S.A.; Dingizian, A.; Elliot, S.T.; Hagerott, E.C.; Hayes, A.G.; Johnson, M.J.; Johnson, J. R.; Joseph, J.; Kinch, K.; Lemmon, M.T.; Morris, R.V.; Scherr, L.; Schwochert, M.; Shepard, M.K.; Smith, G.H.; Sohl-Dickstein, J. N.; Sullivan, R.J.; Sullivan, W.T.; Wadsworth, M.
2003-01-01
The Panoramic Camera (Pancam) investigation is part of the Athena science payload launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The scientific goals of the Pancam investigation are to assess the high-resolution morphology, topography, and geologic context of each MER landing site, to obtain color images to constrain the mineralogic, photometric, and physical properties of surface materials, and to determine dust and aerosol opacity and physical properties from direct imaging of the Sun and sky. Pancam also provides mission support measurements for the rovers, including Sun-finding for rover navigation, hazard identification and digital terrain modeling to help guide long-term rover traverse decisions, high-resolution imaging to help guide the selection of in situ sampling targets, and acquisition of education and public outreach products. The Pancam optical, mechanical, and electronics designs were optimized to achieve these science and mission support goals. Pancam is a multispectral, stereoscopic, panoramic imaging system consisting of two digital cameras mounted on a mast 1.5 m above the Martian surface. The mast allows Pancam to image the full 360° in azimuth and ±90° in elevation. Each Pancam camera utilizes a 1024 × 1024 active imaging area frame transfer CCD detector array. The Pancam optics have an effective focal length of 43 mm and a focal ratio of f/20, yielding an instantaneous field of view of 0.27 mrad/pixel and a field of view of 16° × 16°. Each rover's two Pancam "eyes" are separated by 30 cm and have a 1° toe-in to provide adequate stereo parallax. Each eye also includes a small eight-position filter wheel to allow surface mineralogic studies, multispectral sky imaging, and direct Sun imaging in the 400-1100 nm wavelength region. Pancam was designed and calibrated to operate within specifications on Mars at temperatures from −55° to +5°C. 
An onboard calibration target and fiducial marks provide the capability to validate the radiometric and geometric calibration on Mars. Copyright 2003 by the American Geophysical Union.
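The quoted optical parameters are mutually consistent, as a quick small-angle check shows. In this sketch (function names hypothetical; the ~12 µm pixel pitch is inferred for illustration from the quoted IFOV and focal length, not stated in the abstract):

```python
import math

def ifov_mrad(pixel_pitch_um, focal_length_mm):
    """Small-angle instantaneous field of view: pixel pitch over focal
    length. Dividing micrometres by millimetres yields milliradians."""
    return pixel_pitch_um / focal_length_mm

def fov_deg(ifov_mrad_per_px, n_pixels):
    """Full field of view in degrees across an n-pixel detector line."""
    return math.degrees(ifov_mrad_per_px * 1e-3 * n_pixels)
```

With the 43 mm focal length, a pitch near 12 µm reproduces the stated 0.27 mrad/pixel, and 1024 such pixels span about 15.8°, matching the quoted 16° × 16° field of view.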
NASA Technical Reports Server (NTRS)
2004-01-01
This high-resolution image from the panoramic camera on the Mars Exploration Rover Spirit shows the region containing the patch of soil scientists examined at Gusev Crater just after Spirit rolled off the Columbia Memorial Station. Scientists examined this patch on the 13th and 15th martian days, or sols, of Spirit's journey. Using nearly all the science instruments located on the rover's instrument deployment device or 'arm,' the science team obtained some puzzling results, including the detection of a mineral called olivine and the appearance that the soil is stronger and more cohesive than they expected. Like detectives searching for clues, the science team will continue to peruse the landscape for explanations of their findings. Data taken from the camera's red, green and blue filters were combined to create this approximate true color picture, acquired on the 12th martian day, or sol, of Spirit's journey. The yellow box (see inset above) in this high-resolution image from the panoramic camera on the Mars Exploration Rover Spirit outlines the patch of soil scientists examined at Gusev Crater just after Spirit rolled off the Columbia Memorial Station.
Oh, Hyun Jun; Yang, Il-Hyung
2016-01-01
Objectives: To propose a novel method for determining the three-dimensional (3D) root apex position of maxillary teeth using a two-dimensional (2D) panoramic radiograph image and a 3D virtual maxillary cast model. Methods: The subjects were 10 adult orthodontic patients treated with non-extraction. The multiple camera matrices were used to define transformative relationships between tooth images of the 2D panoramic radiographs and the 3D virtual maxillary cast models. After construction of the root apex-specific projective (RASP) models, overdetermined equations were used to calculate the 3D root apex position with a direct linear transformation algorithm and the known 2D co-ordinates of the root apex in the panoramic radiograph. For verification of the estimated 3D root apex position, the RASP and 3D-CT models were superimposed using a best-fit method. Then, the values of estimation error (EE; mean, standard deviation, minimum error and maximum error) between the two models were calculated. Results: The intraclass correlation coefficient values exhibited good reliability for the landmark identification. The mean EE of all root apices of maxillary teeth was 1.88 mm. The EE values, in descending order, were as follows: canine, 2.30 mm; first premolar, 1.93 mm; second premolar, 1.91 mm; first molar, 1.83 mm; second molar, 1.82 mm; lateral incisor, 1.80 mm; and central incisor, 1.53 mm. Conclusions: Camera calibration technology allows reliable determination of the 3D root apex position of maxillary teeth without the need for 3D-CT scan or tooth templates. PMID:26317151
Kinch, Kjartan M; Bell, James F; Goetz, Walter; Johnson, Jeffrey R; Joseph, Jonathan; Madsen, Morten Bo; Sohl-Dickstein, Jascha
2015-05-01
The Panoramic Cameras on NASA's Mars Exploration Rovers have each returned more than 17,000 images of their calibration targets. In order to make optimal use of this data set for reflectance calibration, a correction must be made for the presence of air fall dust. Here we present an improved dust correction procedure based on a two-layer scattering model, and we present a dust reflectance spectrum derived from long-term trends in the data set. The dust on the calibration targets appears brighter than dusty areas of the Martian surface. We derive detailed histories of dust deposition and removal revealing two distinct environments: At the Spirit landing site, half the year is dominated by dust deposition, the other half by dust removal, usually in brief, sharp events. At the Opportunity landing site the Martian year has a semiannual dust cycle with dust removal happening gradually throughout two removal seasons each year. The highest observed optical depth of settled dust on the calibration target is 1.5 on Spirit and 1.1 on Opportunity (at 601 nm). We derive a general prediction for dust deposition rates of 0.004 ± 0.001 in units of surface optical depth deposited per sol (Martian solar day) per unit atmospheric optical depth. We expect this procedure to lead to improved reflectance-calibration of the Panoramic Camera data set. In addition, it is easily adapted to similar data sets from other missions in order to deliver improved reflectance calibration as well as data on dust reflectance properties and deposition and removal history.
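The reported deposition rule — roughly 0.004 units of settled optical depth per sol per unit atmospheric optical depth — can be turned into a toy forward model. This sketch (function name hypothetical; removal events are deliberately ignored, so it applies only within a deposition-dominated season) accumulates settled dust given a per-sol atmospheric optical depth series:

```python
def settled_dust_optical_depth(tau_atm_per_sol, deposition_coeff=0.004):
    """Accumulate settled-dust optical depth on a calibration target.

    Each sol deposits deposition_coeff * (atmospheric optical depth that
    sol), following the reported proportionality; returns the running
    settled optical depth after each sol.
    """
    tau_dust = 0.0
    history = []
    for tau_atm in tau_atm_per_sol:
        tau_dust += deposition_coeff * tau_atm
        history.append(tau_dust)
    return history
```

At a typical atmospheric optical depth near 1, this rule would take on the order of a few hundred sols of uninterrupted deposition to reach the maximum settled optical depths reported above (1.5 on Spirit, 1.1 on Opportunity), which is consistent with seasonal-scale accumulation punctuated by removal.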
Astronaut Ronald Evans photographed during transearth coast EVA
1972-12-17
AS17-152-23391 (17 Dec. 1972) --- Astronaut Ronald E. Evans is photographed performing extravehicular activity during the Apollo 17 spacecraft's trans-Earth coast. During his EVA, Evans, command module pilot, retrieved film cassettes from the lunar sounder, mapping camera and panoramic camera. The cylindrical object at Evans' left side is the mapping camera cassette. The total time for the trans-Earth EVA was one hour, seven minutes, 18 seconds, starting at ground elapsed time of 257:25 (2:28 p.m.) and ending at G.E.T. of 258:42 (3:35 p.m.) on Sunday, Dec. 17, 1972.
Astronaut Ronald Evans photographed during transearth coast EVA
1972-12-17
AS17-152-23393 (17 Dec. 1972) --- Astronaut Ronald E. Evans is photographed performing extravehicular activity during the Apollo 17 spacecraft's trans-Earth coast. During his EVA, command module pilot Evans retrieved film cassettes from the Lunar Sounder, Mapping Camera, and Panoramic Camera. The cylindrical object at Evans' left side is the Mapping Camera cassette. The total time for the trans-Earth EVA was one hour, seven minutes, 18 seconds, starting at ground elapsed time of 257:25 (2:28 p.m.) and ending at ground elapsed time of 258:42 (3:35 p.m.) on Sunday, Dec. 17, 1972.
Surveying the Newly Digitized Apollo Metric Images for Highland Fault Scarps on the Moon
NASA Astrophysics Data System (ADS)
Williams, N. R.; Pritchard, M. E.; Bell, J. F.; Watters, T. R.; Robinson, M. S.; Lawrence, S.
2009-12-01
The presence and distribution of thrust faults on the Moon have major implications for lunar formation and thermal evolution. For example, thermal history models for the Moon imply that most of the lunar interior was initially hot. As the Moon cooled over time, some models predict global-scale thrust faults should form as stress builds from global thermal contraction. Large-scale thrust fault scarps with lengths of hundreds of kilometers and maximum relief of up to a kilometer or more, like those on Mercury, are not found on the Moon; however, relatively small-scale linear and curvilinear lobate scarps with maximum lengths typically around 10 km have been observed in the highlands [Binder and Gunga, Icarus, v63, 1985]. These small-scale scarps are interpreted to be thrust faults formed by contractional stresses with relatively small maximum (tens of meters) displacements on the faults. These narrow, low relief landforms could only be identified in the highest resolution Lunar Orbiter and Apollo Panoramic Camera images and under the most favorable lighting conditions. To date, the global distribution and other properties of lunar lobate faults are not well understood. The recent micron-resolution scanning and digitization of the Apollo Mapping Camera (Metric) photographic negatives [Lawrence et al., NLSI Conf. #1415, 2008; http://wms.lroc.asu.edu/apollo] provides a new dataset to search for potential scarps. We examined more than 100 digitized Metric Camera image scans, and from these identified 81 images with favorable lighting (incidence angles between about 55 and 80 deg.) to manually search for features that could be potential tectonic scarps. 
Previous surveys based on Panoramic Camera and Lunar Orbiter images found fewer than 100 lobate scarps in the highlands; in our Apollo Metric Camera image survey, we have found additional regions with one or more previously unidentified linear and curvilinear features on the lunar surface that may represent lobate thrust fault scarps. In this presentation we review the geologic characteristics and context of these newly-identified, potentially tectonic landforms. The lengths and relief of some of these linear and curvilinear features are consistent with previously identified lobate scarps. Most of these features are in the highlands, though a few occur along the edges of mare and/or crater ejecta deposits. In many cases the resolution of the Metric Camera frames (~10 m/pix) is not adequate to unequivocally determine the origin of these features. Thus, to assess if the newly identified features have tectonic or other origins, we are examining them in higher-resolution Panoramic Camera (currently being scanned) and Lunar Reconnaissance Orbiter Camera Narrow Angle Camera images [Watters et al., this meeting, 2009].
2006-06-01
conventional camera vs. thermal imager vs. night vision; camera field of view (narrow, wide, panoramic); keyboard + mouse vs. joystick control vs. ... motorised platform which could scan the immediate area, producing a 360° panorama of “stitched-together” digital pictures. The picture file, together with ... VBS was used to automate the process of creating a QuickTime panorama (.mov or .qt), which includes the initial retrieval of the images, the
NASA Technical Reports Server (NTRS)
2004-01-01
This image taken by the Mars Exploration Rover Opportunity's panoramic camera shows where the rover's airbag seams left impressions in the martian soil. The drag marks were made after the rover successfully landed at Meridiani Planum and its airbags were retracted. The rover can be seen in the foreground.
NASA Technical Reports Server (NTRS)
2004-01-01
This image taken at Meridiani Planum, Mars by the panoramic camera on the Mars Exploration Rover Opportunity shows the rover's microscopic imager (circular device in center), located on its instrument deployment device, or 'arm.' The image was acquired on the ninth martian day or sol of the rover's mission.
NASA Technical Reports Server (NTRS)
2004-01-01
This image taken by the Mars Exploration Rover Opportunity's panoramic camera shows where the rover's airbags left impressions in the martian soil. The drag marks were made after the rover successfully landed at Meridiani Planum and its airbags were retracted. The rover can be seen in the foreground.
Sulfur-Rich Rocks and Dirt (True Color)
NASA Technical Reports Server (NTRS)
2005-01-01
NASA's Mars Rover Spirit has been analyzing sulfur-rich rocks and surface materials in the 'Columbia Hills' in Gusev Crater on Mars. This image of a very soft, nodular, layered rock nicknamed 'Peace' in honor of Martin Luther King Jr. shows a 4.5-centimeter-wide (1.8-inch-wide) hole Spirit ground into the surface with the rover's rock abrasion tool. The high sulfur content of the rock measured by Spirit's alpha particle X-ray spectrometer and its softness measured by the abrasion tool are probably evidence of past alteration by water. Spirit's panoramic camera took this image on martian day, or sol, 381 (Jan. 27, 2005). The image represents the panoramic camera team's best current attempt at generating a true color view of what this scene would look like if viewed by a human on Mars. The image was generated from a combination of six calibrated, left-eye Pancam images acquired through filters ranging from 430-nanometer to 750-nanometer wavelengths.
A low-cost dual-camera imaging system for aerial applicators
USDA-ARS?s Scientific Manuscript database
Agricultural aircraft provide a readily available remote sensing platform as low-cost and easy-to-use consumer-grade cameras are being increasingly used for aerial imaging. In this article, we report on a dual-camera imaging system we recently assembled that can capture RGB and near-infrared (NIR) i...
NASA Astrophysics Data System (ADS)
Bauer, Jacob R.; van Beekum, Karlijn; Klaessens, John; Noordmans, Herke Jan; Boer, Christa; Hardeberg, Jon Y.; Verdaasdonk, Rudolf M.
2018-02-01
Non-contact, spatially resolved oxygenation measurements remain an open challenge in the biomedical field and in non-contact patient monitoring. Although point measurements are still the clinical standard, mapping regional differences in oxygenation would improve the quality and safety of care. Recent developments in spectral imaging have produced spectral filter array (SFA) cameras, which can acquire spatial spectral video in real time and thus allow a spatial approach to spectroscopy. In this study, the performance of a 25-channel near-infrared SFA camera was evaluated by obtaining spatial oxygenation maps of the hands during occlusion of the left upper arm in 7 healthy volunteers. A clinical oxygenation monitoring system, INVOS, was used as a reference. For the NIR SFA camera, oxygenation curves were derived from 2-3 wavelength bands with custom fast-analysis software using a basic algorithm. Dynamic oxygenation changes determined with the NIR SFA camera and the INVOS system at different regional locations of the occluded versus non-occluded hands were in good agreement. To increase the signal-to-noise ratio, the algorithm and image acquisition were optimised. The measurements were robust to different illumination conditions with NIR light sources. This study shows that imaging of relative oxygenation changes over larger body areas is potentially possible in real time.
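The abstract does not specify the "basic algorithm"; a common two-wavelength approach (a minimal sketch, not the authors' implementation) inverts the modified Beer-Lambert law to get relative Hb and HbO2 concentration changes per pixel. The extinction coefficients below are placeholders; real values depend on the camera's exact filter bands.

```python
import numpy as np

# Hypothetical molar extinction coefficients (1/(mM*cm)) at two NIR bands;
# columns are [Hb, HbO2]. These numbers are illustrative assumptions.
EXT = np.array([[0.30, 0.78],   # band 1 (e.g. ~760 nm)
                [0.65, 0.36]])  # band 2 (e.g. ~850 nm)

def oxygenation_change(i0, i, pathlength_cm=1.0):
    """Relative [Hb], [HbO2] change maps from two-band intensities.

    i0, i: baseline and current intensity images, shape (2, H, W),
    one image per wavelength band. Modified Beer-Lambert law:
        delta_A = log10(i0 / i) = (EXT * L) @ delta_C
    solved per pixel by inverting the 2x2 extinction matrix.
    """
    delta_a = np.log10(i0 / i)                   # absorbance change per band
    inv = np.linalg.inv(EXT * pathlength_cm)     # 2x2 linear inversion
    flat = delta_a.reshape(2, -1)
    delta_c = inv @ flat                         # rows: [dHb, dHbO2]
    return delta_c.reshape(2, *delta_a.shape[1:])
```

Relative oxygenation curves like those in the study would then follow from tracking dHbO2 (or dHbO2 minus dHb) over the video frames.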
Road sign recognition using Viapix module and correlation
NASA Astrophysics Data System (ADS)
Ouerhani, Y.; Desthieux, M.; Alfalou, A.
2015-03-01
In this paper, we propose and validate a new system for surveying road assets, focusing on vertical road signs. The approach combines road sign detection, recognition and identification using data provided by sensors. It uses panoramic views provided by an innovative device, VIAPIX®, developed by our company ACTRIS, together with an optimized correlation technique for road sign recognition and identification in pictures. The results obtained show the benefit of using panoramic views compared with results obtained from images provided by a single camera.
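The correlation technique is not detailed in the abstract; a plain normalized cross-correlation template match (a generic sketch, not the VIAPIX algorithm) illustrates the underlying idea of locating a known sign template in an image:

```python
import numpy as np

def ncc_match(image, template):
    """Slide a template over a grayscale image and return the peak
    normalized cross-correlation score and its top-left position."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat patch, correlation undefined
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_score, best_pos
```

A template cut from the image itself matches at its true location with a score of 1.0; in practice one would match against a bank of sign templates and threshold the peak score.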
Rover imaging system for the Mars rover/sample return mission
NASA Technical Reports Server (NTRS)
1993-01-01
In the past year, the conceptual design of a panoramic imager for the Mars Environmental Survey (MESUR) Pathfinder was finished. A prototype camera was built and its performance in the laboratory was tested. The performance of this camera was excellent. Based on this work, we have recently proposed a small, lightweight, rugged, and highly capable Mars Surface Imager (MSI) instrument for the MESUR Pathfinder mission. A key aspect of our approach to optimization of the MSI design is that we treat image gathering, coding, and restoration as a whole, rather than as separate and independent tasks. Our approach leads to higher image quality, especially in the representation of fine detail with good contrast and clarity, without increasing either the complexity of the camera or the amount of data transmission. We have made significant progress over the past year in both the overall MSI system design and in the detailed design of the MSI optics. We have taken a simple panoramic camera and have upgraded it substantially to become a prototype of the MSI flight instrument. The most recent version of the camera utilizes miniature wide-angle optics that image directly onto a 3-color, 2096-element CCD line array. There are several data-taking modes, providing resolution as high as 0.3 mrad/pixel. Analysis tasks that were performed or that are underway with the test data from the prototype camera include the following: construction of 3-D models of imaged scenes from stereo data, first for controlled scenes and later for field scenes; and checks on geometric fidelity, including alignment errors, mast vibration, and oscillation in the drive system. We have outlined a number of tasks planned for Fiscal Year '93 in order to prepare us for submission of a flight instrument proposal for MESUR Pathfinder.
Huber, V; Huber, A; Kinna, D; Balboa, I; Collins, S; Conway, N; Drewelow, P; Maggi, C F; Matthews, G F; Meigs, A G; Mertens, Ph; Price, M; Sergienko, G; Silburn, S; Wynn, A; Zastrow, K-D
2016-11-01
The in situ absolute calibration of the JET real-time protection imaging system has been performed for the first time by means of a radiometric light source placed inside the JET vessel and operated by remote handling. The high accuracy of the calibration is confirmed by cross-validation of the near infrared (NIR) cameras against each other, with thermal IR cameras, and with the beryllium evaporator, which led to successful protection of the JET first wall during the last campaign. The operating temperature ranges of the NIR protection cameras for the materials used on JET are Be 650-1600 °C, W coating 600-1320 °C, and W 650-1500 °C.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huber, V., E-mail: V.Huber@fz-juelich.de; Huber, A.; Mertens, Ph.
NASA Technical Reports Server (NTRS)
2004-01-01
This 'postcard' from the panoramic camera on the Mars Exploration Rover Opportunity shows the view of the martian landscape southwest of the rover. The image was taken in the late martian afternoon at Meridiani Planum on Mars, where Opportunity landed at approximately 9:05 p.m. PST on Saturday, Jan. 24.
Stereoscopic wide field of view imaging system
NASA Technical Reports Server (NTRS)
Prechtl, Eric F. (Inventor); Sedwick, Raymond J. (Inventor); Jonas, Eric M. (Inventor)
2011-01-01
A stereoscopic imaging system incorporates a plurality of imaging devices or cameras to generate a high resolution, wide field of view image database from which images can be combined in real time to provide wide field of view or panoramic or omni-directional still or video images.
Astronaut Ronald Evans photographed during transearth coast EVA
NASA Technical Reports Server (NTRS)
1972-01-01
Astronaut Ronald E. Evans is photographed performing extravehicular activity (EVA) during the Apollo 17 spacecraft's transearth coast. During his EVA, Command Module pilot Evans retrieved film cassettes from the Lunar Sounder, Mapping Camera, and Panoramic Camera. The total time for the transearth EVA was one hour, seven minutes, 19 seconds, starting at ground elapsed time of 257:25 (2:28 p.m.) and ending at ground elapsed time of 258:42 (3:35 p.m.) on Sunday, December 17, 1972.
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true-color image of the rock called 'Lion Stone' was acquired by the Mars Exploration Rover Opportunity's panoramic camera on sol 104 (May 9, 2004). The rock stands about 10 centimeters tall (about 4 inches) and is about 30 centimeters long (12 inches). Plans for the coming sols include investigating the rock with the spectrometers on the rover's instrument arm. This image was generated using the camera's L2 (750-nanometer), L5 (530-nanometer) and L6 (480-nanometer) filters.
Improving NIR snow pit stratigraphy observations by introducing a controlled NIR light source
NASA Astrophysics Data System (ADS)
Dean, J.; Marshall, H.; Rutter, N.; Karlson, A.
2013-12-01
Near-infrared (NIR) photography in a prepared snow pit measures mm-/grain-scale variations in snow structure, as reflectivity is strongly dependent on microstructure and grain size at NIR wavelengths. We explore using a controlled NIR light source to maximize the signal-to-noise ratio and provide uniform incident, diffuse light on the snow pit wall. NIR light fired from the flash is diffused across and reflected by an umbrella onto the snow pit; the lens filter transmits NIR light onto the spectrum-modified sensor of the DSLR camera. Lenses are designed to refract visible light properly, not NIR light, so a correction must be applied for the resulting NIR bright spot. To avoid the interpolation and debayering algorithms automatically performed on the images by programs like Adobe's Photoshop, the raw data are analyzed directly in MATLAB. NIR image data show a doubling of the amount of light collected in the same time for flash over ambient lighting. Transitions across layer boundaries in the flash-lit image are detailed by higher camera intensity values than in ambient-lit images. Curves plotted using the median intensity at each depth, normalized to the average profile intensity, show a separation between flash- and ambient-lit images in the upper 10-15 cm; the ambient-lit image curve asymptotically approaches the level of the flash-lit image curve below 15 cm. We hypothesize that the difference is caused by additional ambient light penetrating the upper 10-15 cm of the snowpack from above and transmitting through the wall of the snow pit. This indicates that combining NIR ambient and flash photography could be a powerful technique for studying penetration depth of radiation as a function of microstructure and grain size. The NIR flash images do not increase the relative contrast at layer boundaries; however, the flash more than doubles the amount of recorded light and controls layer noise as well as layer boundary transition noise.
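The depth profiles described above (median intensity per depth, normalized to the average profile intensity) can be sketched as follows; this is a generic illustration, not the authors' MATLAB code:

```python
import numpy as np

def depth_profile(nir_image):
    """Median intensity per depth row, normalized to the mean of the
    whole profile, for comparing flash-lit vs. ambient-lit pit images.

    nir_image: 2-D array of raw sensor values, rows ordered from the
    snow surface (top) downward.
    """
    medians = np.median(nir_image, axis=1)  # one robust value per depth
    return medians / medians.mean()         # normalized profile
```

Plotting the profiles from a flash-lit and an ambient-lit image of the same pit wall would reproduce the near-surface separation the abstract describes.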
PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory
NASA Astrophysics Data System (ADS)
Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.
2018-02-01
PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto, a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks and can also be equipped with narrowband filters. The instrument was delivered to the observatory in October 2014 and was commissioned at both telescopes between November 2014 and June 2015. Science verification at the 2.2 m telescope was carried out during the second semester of 2015 and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests, and the PANIC instrument performance. We also present first-light data obtained during commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was carried out at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements. The commissioning phase on sky has been accomplished.
Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung
2017-05-08
Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. Existing research using visible light cameras has mainly focused on human detection during daytime hours when there is outside light; human detection during nighttime hours, when there is no outside light, is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras, or thermal cameras, have been used. However, NIR illuminators are limited in illumination angle and distance, and the illuminator power must be adaptively adjusted depending on whether the object is close or far away. Thermal cameras remain costly, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but it has focused on objects at short distance in indoor environments or on video-based methods that capture and process multiple images, which increases the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night by a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results on a self-constructed Dongguk night-time human detection database (DNHD-DB1) and two open databases (the Korea Advanced Institute of Science and Technology (KAIST) and Computer Vision Center (CVC) databases) show high-accuracy human detection in a variety of environments and excellent performance compared to existing methods.
Panoramic 3d Vision on the ExoMars Rover
NASA Astrophysics Data System (ADS)
Paar, G.; Griffiths, A. D.; Barnes, D. P.; Coates, A. J.; Jaumann, R.; Oberst, J.; Gao, Y.; Ellery, A.; Li, R.
The Pasteur payload on the ESA ExoMars Rover 2011/2013 is designed to search for evidence of extant or extinct life either on or up to ~2 m below the surface of Mars. The rover will be equipped with a panoramic imaging system to be developed by a UK, German, Austrian, Swiss, Italian and French team for visual characterization of the rover's surroundings and (in conjunction with an infrared imaging spectrometer) remote detection of potential sample sites. The Panoramic Camera system consists of a wide-angle multispectral stereo pair with 65° field of view (WAC; 1.1 mrad/pixel) and a high-resolution monoscopic camera (HRC; current design having 59.7 µrad/pixel with 3.5° field of view). Its scientific goals and operational requirements can be summarized as follows:
• Determination of objects to be investigated in situ by other instruments for operations planning
• Backup and support for the rover visual navigation system (path planning, determination of subsequent rover positions and orientation/tilt within the 3d environment), and localization of the landing site (by stellar navigation or by combination of orbiter and ground panoramic images)
• Geological characterization (using narrow-band geology filters) and cartography of the local environments (local Digital Terrain Model or DTM)
• Study of atmospheric properties and variable phenomena near the Martian surface (e.g. aerosol opacity, water vapour column density, clouds, dust devils, meteors, surface frosts)
• Geodetic studies (observations of Sun, bright stars, Phobos/Deimos).
The performance of 3d data processing is a key element of mission planning and scientific data analysis. The 3d Vision Team within the Panoramic Camera development consortium reports on the current status of development, consisting of the following items:
• Hardware Layout & Engineering: The geometric setup of the system (location on the mast & viewing angles, mutual mounting between WAC and HRC) needs to be optimized w.r.t. fields of view, ranging capability (distance measurement capability), data rate, necessity of calibration targets, hardware & data interfaces to other subsystems (e.g. navigation), as well as accuracy impacts of sensor design and compression ratio.
• Geometric Calibration: The geometric properties of the individual cameras including various spectral filters, their mutual relations, and the dynamic geometrical relation between rover frame and cameras - with the mast in between - are precisely described by a calibration process. During surface operations these relations will be continuously checked and updated by photogrammetric means; environmental influences such as temperature, pressure and the Mars gravity will be taken into account.
• Surface Mapping: Stereo imaging using the WAC stereo pair is used for the 3d reconstruction of the rover vicinity to identify, locate and characterize potentially interesting spots (3-10 for an experimental cycle to be performed within approx. 10-30 sols). The HRC is used for high-resolution imagery of these regions of interest to be overlaid on the 3d reconstruction and potentially refined by shape-from-shading techniques. A quick processing result is crucial for time-critical operations planning; therefore emphasis is laid on automatic behaviour and intrinsic error detection mechanisms. The mapping results will be continuously fused, updated and synchronized with the map used by the navigation system. The surface representation needs to take into account the different resolutions of HRC and WAC as well as uncommon or even unexpected image acquisition modes such as long-range, wide-baseline stereo from different rover positions or escape strategies in the case of loss of one of the stereo camera heads.
• Panorama Mosaicking: The production of a high-resolution stereoscopic panorama is nowadays state of the art in computer vision. However, certain challenges such as the need for access to accurate spherical coordinates, maintenance of radiometric & spectral response in various spectral bands, fusion between HRC and WAC, super-resolution, and again the requirement of quick yet robust processing will add some complexity to the ground processing system.
• Visualization for Operations Planning: Efficient operations planning is directly related to an ergonomic and well-performing visualization. It is intended to adapt existing tools to an integrated visualization solution for the purpose of scientific site characterization, view planning and reachability mapping/instrument placement of pointing sensors (including the panoramic imaging system itself), and selection of regions of interest.
The main interfaces between the individual components as well as the first version of a user requirement document are currently under definition. Besides the support for sensor layout and calibration, the 3d vision system will consist of 2-3 main modules to be used during ground processing & utilization of the ExoMars Rover panoramic imaging system.
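The WAC ranging capability mentioned above can be illustrated with a back-of-the-envelope small-angle model (a sketch only; the stereo baseline value below is an assumption, not taken from the abstract): for an angular pixel size of 1.1 mrad and baseline B, a disparity of d pixels corresponds to a range of roughly R = B / (d × 1.1 mrad).

```python
WAC_IFOV_RAD = 1.1e-3  # WAC angular pixel size from the abstract (1.1 mrad)

def stereo_range(disparity_px, baseline_m=0.5):
    """Approximate range from stereo disparity (small-angle model).

    baseline_m is an assumed value for illustration; the actual
    ExoMars mast baseline is not given in the abstract.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m / (disparity_px * WAC_IFOV_RAD)

def range_error(disparity_px, baseline_m=0.5, match_error_px=0.5):
    """First-order range uncertainty from a stereo matching error."""
    r = stereo_range(disparity_px, baseline_m)
    return r * match_error_px / disparity_px
```

With these assumed numbers, a 10-pixel disparity puts a target at about 45 m, with a half-pixel matching error translating to roughly 2.3 m of range uncertainty; this is the kind of trade-off the "ranging capability vs. baseline" optimization above has to resolve.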
NASA Astrophysics Data System (ADS)
Mahmood, Usama; Dehdari, Reza; Cerussi, Albert; Nguyen, Quoc; Kelley, Timothy; Tromberg, Bruce J.; Wong, Brian J.
2005-04-01
Though sinusitis is a significant health problem, it remains a challenging diagnosis for many physicians mainly because of its vague, non-specific symptomology. As such, physicians must often rely on x-rays and CT, which are not only costly but also expose the patient to ionizing radiation. As an alternative to these methods of diagnosis, our laboratory constructed a near infrared (NIR) transillumination system to image the paranasal maxillary sinuses. In contrast to the more conventional form of transillumination, which uses visible light, NIR transillumination uses light with a longer wavelength which is less attenuated by soft tissues, allowing increased signal intensity and tissue penetration. Our NIR transillumination system is low-cost, consisting of a light source containing two series of light emitting diodes, which give off light at wavelengths of 810 nm and 850 nm, and a charge coupled device (CCD) camera sensitive to NIR light. The light source is simply placed in the patient's mouth and the resultant image created by the transmittance of NIR light is captured with the CCD camera via notebook PC. Using this NIR transillumination system, we imaged the paranasal maxillary sinuses of both healthy patients (n=5) and patients with sinus disease (n=12) and compared the resultant findings with conventional CT scans. We found that air and fluid/tissue-filled spaces can be reasonably distinguished by their differing NIR opacities. Based on these findings, we believe NIR transillumination of the paranasal sinuses may provide a simple, safe, and cost effective modality in the diagnosis and management of sinus disease.
View of 'Cape St. Mary' from 'Cape Verde'
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape St. Mary' from the vantage point of 'Cape Verde,' the next promontory counterclockwise around the crater's deeply scalloped rim. This view of Cape St. Mary combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. Near the base of the Cape St. Mary cliff are layers with a pattern called 'crossbedding,' intersecting with each other at angles, rather than parallel to each other. Large-scale crossbedding can result from material being deposited as wind-blown dunes. The images combined into this mosaic were taken during the 970th Martian day, or sol, of Opportunity's Mars-surface mission (Oct. 16, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
View of 'Cape Verde' from 'Cape St. Mary' in Mid-Afternoon (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately false-color mosaic. The exposures were taken during mid-afternoon lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters. The false color enhances subtle color differences among materials in the rocks and soils of the scene.
View of 'Cape Verde' from 'Cape St. Mary' in Late Morning (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into a false-color mosaic. The exposures were taken during late-morning lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters. The false color enhances subtle color differences among materials in the rocks and soils of the scene.
NASA Astrophysics Data System (ADS)
Blaser, S.; Nebiker, S.; Cavegn, S.
2017-05-01
Image-based mobile mapping systems enable the efficient acquisition of georeferenced image sequences, which can later be exploited in cloud-based 3D geoinformation services. In order to provide 360° coverage with accurate 3D measuring capabilities, we present a novel 360° stereo panoramic camera configuration. By using two 360° panorama cameras tilted forward and backward in combination with conventional forward- and backward-looking stereo camera systems, we achieve full 360° multi-stereo coverage. We furthermore developed a fully operational new mobile mapping system based on our proposed approach, which fulfils our high accuracy requirements. We successfully implemented a rigorous sensor and system calibration procedure, which calibrates all stereo systems with superior accuracy compared to previous work. Our study delivered absolute 3D point accuracies in the range of 4 to 6 cm and relative accuracies of 3D distances in the range of 1 to 3 cm. These results were achieved in a challenging urban area. Furthermore, we automatically reconstructed a 3D city model of our study area by employing all captured and georeferenced mobile mapping imagery. The result is a highly detailed and almost complete 3D city model of the street environment.
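The centimeter-level accuracies reported above are consistent with first-order stereo error propagation. As an illustration only (the rig parameters below are assumptions for the sketch, not the system's actual calibration values):

```python
# Sketch: first-order depth uncertainty of a calibrated stereo pair,
# sigma_Z = Z^2 / (f * B) * sigma_d, with illustrative (assumed) parameters.

def depth_error(z_m, focal_px, baseline_m, disparity_sigma_px):
    """Expected depth error (m) at range z_m for focal length f (px),
    baseline B (m), and disparity matching precision sigma_d (px)."""
    return (z_m ** 2) / (focal_px * baseline_m) * disparity_sigma_px

# Example: 2000 px focal length, 0.9 m baseline, 0.3 px matching precision
for z in (5.0, 10.0, 20.0):
    print(f"range {z:4.1f} m -> sigma_Z = {depth_error(z, 2000, 0.9, 0.3) * 100:.1f} cm")
```

The quadratic growth with range is why close-range urban scenes can reach the few-centimeter regime quoted in the abstract.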
NASA Astrophysics Data System (ADS)
Schonlau, William J.
2006-05-01
An immersive viewing engine providing basic telepresence functionality for a variety of application types is presented. Augmented reality, teleoperation and virtual reality applications all benefit from the use of head-mounted display devices that present imagery appropriate to the user's head orientation at full frame rates. Our primary application is the viewing of remote environments, as with a camera-equipped teleoperated vehicle. The conventional approach, where imagery from a narrow-field camera onboard the vehicle is presented to the user on a small rectangular screen, is contrasted with an immersive viewing system where a cylindrical or spherical format image is received from a panoramic camera on the vehicle, resampled in response to sensed user head orientation and presented via a wide-field eyewear display approaching 180 degrees of horizontal field. Of primary interest is the user's enhanced ability to perceive and understand image content, even when image resolution parameters are poor, due to the innate visual integration and 3-D model generation capabilities of the human visual system. A mathematical model for tracking user head position and resampling the panoramic image to attain distortion-free viewing of the region appropriate to the user's current head pose is presented, and consideration is given to providing the user with stereo viewing generated from depth map information derived using stereo-from-motion algorithms.
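The core of such head-driven resampling is mapping a viewing direction into panorama pixel coordinates. A minimal sketch, assuming an equirectangular panorama layout (an assumption for illustration; the paper's own resampling model may differ):

```python
import math

# Sketch: map a view ray, given by the user's head yaw and pitch (radians),
# to pixel coordinates in an equirectangular panorama (assumed layout).
def pano_pixel(yaw, pitch, width, height):
    """Yaw in [-pi, pi), pitch in [-pi/2, pi/2] -> (column, row)."""
    col = (yaw + math.pi) / (2 * math.pi) * width   # longitude -> column
    row = (math.pi / 2 - pitch) / math.pi * height  # latitude  -> row
    return col, row

print(pano_pixel(0.0, 0.0, 4096, 2048))  # looking straight ahead -> image center
```

A full viewer would evaluate this mapping per output pixel of the eyewear display and interpolate, which is what removes the panorama's native distortion for the viewed region.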
NASA Astrophysics Data System (ADS)
Patonay, Gabor; Strekowski, Lucjan; Salon, Jozef; Medou-Ovono, Martial; Krutak, James J.; Leggitt, Jeffrey; Seubert, Heather; Craig, Rhonda
2004-12-01
New chemistry for leuco fluorescin and leuco rhodamine for latent bloodstain and fingerprint detection has been developed in our laboratories. The use of these leuco dyes results in excellent contrast for several hours. The FBI's Evidence Response Team and DNA I unit collaborated with Georgia State University to validate the new fluorescin chemistry for use in the field. In addition, several new NIR dyes have been developed in our laboratories that can be used to detect different chemical residues, e.g., pepper spray, latent fingerprint, latent blood, metal ions, or other trace evidence during crime scene investigations. Proof of principle experiments showed that NIR dyes reacting with such residues can be activated with appropriately filtered semiconductor lasers and LEDs to emit NIR fluorescence that can be observed using optimally filtered night vision intensifiers or pocket scopes, digital cameras, CCD and CMOS cameras, or other NIR detection systems. The main advantage of NIR detection is that the color of the background has very little influence on detection and that there are very few materials that would interfere by exhibiting NIR fluorescence. The use of pocket scopes permits sensitive and convenient detection. Once the residues are located, digital images of the fluorescence can be recorded and samples obtained for further analyses. NIR dyes do not interfere with subsequent follow-up or confirmation methods such as DNA or LC/MS analysis. Near-infrared absorbing dyes will be summarized along with detection mechanisms.
Imaging using a supercontinuum laser to assess tumors in patients with breast carcinoma
NASA Astrophysics Data System (ADS)
Sordillo, Laura A.; Sordillo, Peter P.; Alfano, R. R.
2016-03-01
The supercontinuum laser light source has many advantages over other light sources, including broad spectral range. Transmission images of paired normal and malignant breast tissue samples from two patients were obtained using a Leukos supercontinuum (SC) laser light source with wavelengths in the second and third NIR optical windows and an IR-CCD InGaAs camera detector (Goodrich Sensors Inc. high response camera SU320KTSW-1.7RT with spectral response between 900 nm and 1,700 nm). Optical attenuation measurements at the four NIR optical windows were obtained from the samples.
NASA Astrophysics Data System (ADS)
Swain, Pradyumna; Mark, David
2004-09-01
The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised in the presence of curved CCD applications, in conjunction with large format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions.
Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than their flat CCD counterparts, lightening the cameras and enabling faster scanning and 3D integration of objects moving within a planetary terrain environment. Preliminary experiments conducted at the Sarnoff Corporation indicate the feasibility of curved CCD imagers with acceptable electro-optic integrity. Currently, we are in the process of evaluating the electro-optic performance of a curved wafer-scale CCD imager. Detailed ray trace modeling and experimental electro-optical data performance obtained from the curved imager will be presented at the conference.
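The curvature a detector must take on can be framed as the sag of the focal surface over the detector's extent. A small sketch, assuming a spherical focal surface; the radius and half-aperture values are illustrative, not from the Sarnoff work:

```python
import math

# Sketch: sag (depth of curvature) a detector must adopt to match a spherical
# focal surface of radius R over half-aperture r; values are illustrative.
def sag_mm(radius_mm, half_aperture_mm):
    """Sagitta of a spherical cap: R - sqrt(R^2 - r^2)."""
    return radius_mm - math.sqrt(radius_mm ** 2 - half_aperture_mm ** 2)

print(f"{sag_mm(500.0, 30.0):.2f} mm")  # gentle curvature over a 60 mm detector
```

Even sub-millimeter sags, as in this example, are enough to relax the field-flattening optics that flat detectors otherwise require.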
Field Test of the ExoMars Panoramic Camera in the High Arctic - First Results and Lessons Learned
NASA Astrophysics Data System (ADS)
Schmitz, N.; Barnes, D.; Coates, A.; Griffiths, A.; Hauber, E.; Jaumann, R.; Michaelis, H.; Mosebach, H.; Paar, G.; Reissaus, P.; Trauthan, F.
2009-04-01
The ExoMars mission, the first element of the ESA Aurora program, is scheduled to be launched to Mars in 2016. Part of the Pasteur Exobiology Payload onboard the ExoMars rover is a Panoramic Camera System ('PanCam') being designed to obtain high-resolution color and wide-angle multi-spectral stereoscopic panoramic images from the mast of the ExoMars rover. The PanCam instrument consists of two wide-angle cameras (WACs), which will provide multispectral stereo images with a 34° field-of-view (FOV), and a High-Resolution RGB Channel (HRC) to provide close-up images with a 5° field-of-view. For field testing of the PanCam breadboard in a representative environment, the ExoMars PanCam team joined the 6th Arctic Mars Analogue Svalbard Expedition (AMASE) 2008. The expedition took place from 4-17 August 2008 in the Svalbard archipelago, Norway, which is considered an excellent analogue site for ancient Mars. 31 scientists and engineers involved in Mars exploration (among them the ExoMars WISDOM, MIMA and Raman-LIBS teams as well as several NASA MSL teams) combined their knowledge, instruments and techniques to study the geology, geophysics, biosignatures, and life forms that can be found in volcanic complexes, warm springs, subsurface ice, and sedimentary deposits. This work was carried out using instruments, a rover (NASA's CliffBot), and techniques that will or may be used in future planetary missions, thereby providing the capability to simulate a full mission environment in a Mars analogue terrain. Besides demonstrating PanCam's general functionality in a field environment, a main objective was to test and verify the interpretability of PanCam data for in-situ geological context determination and scientific target selection. To process the collected data, a first version of the preliminary PanCam 3D reconstruction processing & visualization chain was used.
Other objectives included testing and refining the operational scenario (based on the ExoMars Rover Reference Surface Mission), investigating data commonalities and data fusion potential with respect to other instruments, and collecting representative image data to evaluate various influences, such as viewing distance, surface structure, and availability of structures at "infinity" (e.g. resolution, focus quality and associated accuracy of the 3D reconstruction). Airborne images with the HRSC-AX camera (an airborne camera with heritage from the Mars Express High Resolution Stereo Camera, HRSC), collected during a flight campaign over Svalbard in June 2008, provided large-scale geological context information for all field sites.
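One of the influences listed above, viewing distance, maps directly onto resolvable surface detail. A minimal sketch assuming uniform angular sampling; the 34° WAC field of view is from the abstract, while the 1024-pixel detector width is purely an illustrative assumption:

```python
import math

# Sketch: surface scale resolved at a given viewing distance, from a camera's
# per-pixel angular resolution (ground sample = distance * IFOV). The 1024-px
# detector width is an assumption for illustration, not a PanCam specification.
def ground_sample_mm(distance_m, fov_deg, pixels):
    ifov = math.radians(fov_deg) / pixels  # instantaneous FOV per pixel (rad)
    return distance_m * ifov * 1000.0

for d in (2.0, 10.0, 100.0):
    print(f"{d:6.1f} m -> {ground_sample_mm(d, 34.0, 1024):.1f} mm/pixel")
```

This kind of scaling is what makes viewing distance a dominant factor in whether rock texture remains interpretable in the field data.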
NASA Technical Reports Server (NTRS)
2004-01-01
This image from the Mars Exploration Rover Opportunity's panoramic camera shows one octant of a larger panoramic image which has not yet been fully processed. The full panorama, dubbed 'Lion King' was obtained on sols 58 and 60 of the mission as the rover was perched at the lip of Eagle Crater, majestically looking down into its former home. It is the largest panorama yet obtained by either rover. The octant, which faces directly into the crater, shows features as small as a few millimeters across in the field near the rover arm, to features a few meters across or larger on the horizon.
The full panoramic image was taken in eight segments using six filters per segment, for a total of 558 images and more than 75 megabytes of data. This enhanced color composite was assembled from the infrared (750 nanometer), green (530 nanometer), and violet (430 nanometer) filters. Additional lower elevation tiers were added relative to other panoramas to ensure that the entire crater was covered in the mosaic.
NASA Technical Reports Server (NTRS)
2004-01-01
This image taken by the Mars Exploration Rover Opportunity shows the dunes that line the floor of 'Endurance Crater.' Small-scale ripples on top of the larger dune waves suggest that these dunes may have been active in geologically recent times. The image was taken by the rover's panoramic camera on sol 198 (August 14, 2004).
'Endurance': A Daunting Challenge
NASA Technical Reports Server (NTRS)
2004-01-01
This image shows the approximate size of the Mars Exploration Rover Opportunity in comparison to the impressive impact crater dubbed 'Endurance,' which is roughly 130 meters (430 feet) across. A model of Opportunity has been superimposed on top of an approximate true-color image taken by the rover's panoramic camera. Scientists are eager to explore Endurance for clues to the red planet's history. The crater's exposed walls provide a window to what lies beneath the surface of Mars and thus what geologic processes occurred there in the past. While recent studies of the smaller crater nicknamed 'Eagle' revealed an evaporating body of salty water, that crater was not deep enough to indicate what came before the water. Endurance may be able to help answer this question, but the challenge is getting to the scientific targets: most of the crater's rocks are embedded in vertical cliffs. Rover planners are developing strategies to overcome this obstacle. This image is a portion of a larger mosaic taken with the panoramic camera's 480-, 530- and 750-nanometer filters on sols 97 and 98.
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true-color image, acquired by the Mars Exploration Rover Opportunity's panoramic camera, features the hole ground by the rover's rock abrasion tool into 'Bounce' rock. The hole measures approximately 35 centimeters (14 inches) long and 10 centimeters (4 inches) high. The depression measures 6.44 millimeters (0.25 inch) deep and about 4.5 centimeters (1.7 inches) across. The grinding procedure took place on the rover's 66th sol on Mars and lasted 2 hours and 15 minutes. A combination of limited solar power, added safety measures and the rock's jagged texture led the rock abrasion tool team to set more aggressive grinding parameters to ensure that the end result was a full circle, suitable for a thorough read from the rover's spectrometers. Bounce's outer ring consists of the cuttings from the rock, pushed out by the brushes on the grinding instrument. The small impressions filled with red dust on the outer ring were caused by the instrument's contact mechanism, which serves to stabilize it while grinding. This image was created using the panoramic camera's blue, green and red filters.
ARTIST CONCEPT - ASTRONAUT WORDEN'S EXTRAVEHICULAR ACTIVITY (EVA) (APOLLO XV)
1971-07-09
S71-39614 (July 1971) --- An artist's concept of the Apollo 15 Command and Service Modules (CSM), showing two crewmembers performing a new-to-Apollo extravehicular activity (EVA). The figure at left represents astronaut Alfred M. Worden, command module pilot, connected by an umbilical tether to the CM, at right, where a figure representing astronaut James B. Irwin, lunar module pilot, stands at the open CM hatch. Worden is working with the panoramic camera in the Scientific Instrument Module (SIM). Behind Irwin is the 16mm data acquisition camera. Artwork by North American Rockwell.
Ishikawa, Daitaro; Nishii, Takashi; Mizuno, Fumiaki; Sato, Harumi; Kazarian, Sergei G; Ozaki, Yukihiro
2013-12-01
This study was carried out to evaluate a new high-speed hyperspectral near-infrared (NIR) camera named Compovision. Quantitative analyses of the crystallinity and crystal evolution of a biodegradable polymer, polylactic acid (PLA), and its concentration in PLA/poly-(R)-3-hydroxybutyrate (PHB) blends were investigated using NIR imaging. This NIR camera can measure two-dimensional NIR spectral data in the 1000-2350 nm region, obtaining images with a wide field of view of 150 × 250 mm^2 (approximately 100 000 pixels) at high speeds (in less than 5 s). PLA samples with differing crystallinities between 0 and 50%, blended with PHB in ratios of 80/20, 60/40, 40/60, 20/80, and pure films of 100% PLA and PHB were prepared. Compovision was used to collect the respective NIR spectra in the 1000-2350 nm region and investigate the crystallinity of PLA and its concentration in the blends. Partial least squares (PLS) regression models for the crystallinity of PLA were developed using absorbance, second derivative, and standard normal variate (SNV) spectra from the most informative region of the spectra, between 1600 and 2000 nm. The predicted results of PLS models achieved using the absorbance and second derivative spectra were fairly good, with a root mean square error (RMSE) of less than 6.1% and a coefficient of determination (R^2) of more than 0.88 for PLS factor 1. The results obtained using the SNV spectra yielded the best prediction, with the smallest RMSE of 2.93% and the highest R^2 of 0.976. Moreover, PLS models developed for estimating the concentration of PLA in the blend polymers using SNV spectra gave good predicted results, with an RMSE of 4.94% and an R^2 of 0.98. The SNV-based models provided the best-predicted results, since SNV can reduce the effects of the spectral changes induced by the inhomogeneity and the thickness of the samples.
Wide-area crystal evolution of PLA on a plate with a temperature gradient of 70-105 °C was also monitored using NIR imaging. An SNV-based image gave an obvious contrast in crystallinity around the crystal growth area corresponding to slight temperature changes. Moreover, it clarified the inhomogeneity of crystal evolution over this significantly wide area. These results prove that the newly developed hyperspectral NIR camera, Compovision, can be successfully used to study polymers in industrial processes, such as monitoring the crystallinity of PLA and the different compositions of PLA/PHB blends.
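The SNV transform that gave the study's best PLS predictions is simple to state: each spectrum is centered by its own mean and scaled by its own standard deviation. A minimal sketch with synthetic spectra (the PLS modeling itself is omitted):

```python
import numpy as np

# Sketch: standard normal variate (SNV) preprocessing. Spectra are synthetic
# stand-ins: one base spectrum under varying multiplicative/additive effects.
def snv(spectra):
    """Row-wise SNV: subtract each spectrum's mean, divide by its std dev."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

rng = np.random.default_rng(0)
base = rng.random((1, 100))
gains = np.linspace(1.0, 2.0, 20)[:, None]
offsets = np.linspace(0.0, 0.5, 20)[:, None]
spectra = gains * base + offsets        # 20 distorted copies of one spectrum
corrected = snv(spectra)
# Multiplicative and additive differences between the copies are removed:
print(np.allclose(corrected, corrected[0]))  # -> True
```

This is why SNV suppresses thickness and inhomogeneity effects: both act largely as per-spectrum gain and offset, which the transform cancels before the PLS fit.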
Carr, Jessica A; Franke, Daniel; Caram, Justin R; Perkinson, Collin F; Saif, Mari; Askoxylakis, Vasileios; Datta, Meenal; Fukumura, Dai; Jain, Rakesh K; Bawendi, Moungi G; Bruns, Oliver T
2018-04-24
Fluorescence imaging is a method of real-time molecular tracking in vivo that has enabled many clinical technologies. Imaging in the shortwave IR (SWIR; 1,000-2,000 nm) promises higher contrast, sensitivity, and penetration depths compared with conventional visible and near-IR (NIR) fluorescence imaging. However, adoption of SWIR imaging in clinical settings has been limited, partially due to the absence of US Food and Drug Administration (FDA)-approved fluorophores with peak emission in the SWIR. Here, we show that commercially available NIR dyes, including the FDA-approved contrast agent indocyanine green (ICG), exhibit optical properties suitable for in vivo SWIR fluorescence imaging. Even though their emission spectra peak in the NIR, these dyes outperform commercial SWIR fluorophores and can be imaged in the SWIR, even beyond 1,500 nm. We show real-time fluorescence imaging using ICG at clinically relevant doses, including intravital microscopy, noninvasive imaging in blood and lymph vessels, and imaging of hepatobiliary clearance, and show increased contrast compared with NIR fluorescence imaging. Furthermore, we show tumor-targeted SWIR imaging with IRDye 800CW-labeled trastuzumab, an NIR dye being tested in multiple clinical trials. Our findings suggest that high-contrast SWIR fluorescence imaging can be implemented alongside existing imaging modalities by switching the detection of conventional NIR fluorescence systems from silicon-based NIR cameras to emerging indium gallium arsenide-based SWIR cameras. Using ICG in particular opens the possibility of translating SWIR fluorescence imaging to human clinical applications. Indeed, our findings suggest that emerging SWIR-fluorescent in vivo contrast agents should be benchmarked against the SWIR emission of ICG in blood.
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-08-31
Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing gaze tracking systems. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
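The DOF side of the trade-off follows from ordinary lens geometry. As a sketch only (thin-lens DOF with illustrative parameters, not the measured values from the study):

```python
# Sketch: thin-lens depth of field from focal length f, f-number N, circle of
# confusion c, and subject distance s (millimeters; illustrative values only).
def depth_of_field(f_mm, n, c_mm, s_mm):
    h = f_mm ** 2 / (n * c_mm) + f_mm  # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# Example: 12 mm lens at f/2, 5 um circle of confusion, user at 700 mm
near, far = depth_of_field(f_mm=12.0, n=2.0, c_mm=0.005, s_mm=700.0)
print(f"in focus roughly from {near:.0f} mm to {far:.0f} mm")
```

A longer focal length narrows the viewing angle and shrinks this in-focus band, which is precisely the trade-off the head-movement measurements are meant to resolve.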
Near-infrared fluorescence imaging with a mobile phone (Conference Presentation)
NASA Astrophysics Data System (ADS)
Ghassemi, Pejhman; Wang, Bohan; Wang, Jianting; Wang, Quanzeng; Chen, Yu; Pfefer, T. Joshua
2017-03-01
Mobile phone cameras employ sensors with near-infrared (NIR) sensitivity, yet this capability has not been exploited for biomedical purposes. Removing the IR-blocking filter from a phone-based camera opens the door to a wide range of techniques and applications for inexpensive, point-of-care biophotonic imaging and sensing. This study provides proof of principle for one of these modalities - phone-based NIR fluorescence imaging. An imaging system was assembled using a 780 nm light source along with excitation and emission filters with 800 nm and 825 nm cut-off wavelengths, respectively. Indocyanine green (ICG) was used as an NIR fluorescence contrast agent in an ex vivo rodent model, a resolution test target and a 3D-printed, tissue-simulating vascular phantom. Raw and processed images for red, green and blue pixel channels were analyzed for quantitative evaluation of fundamental performance characteristics including spectral sensitivity, detection linearity and spatial resolution. Mobile phone results were compared with a scientific CCD. The spatial resolution of the CCD system was consistently superior to the phone, and green phone camera pixels showed better resolution than the blue or red channels. The CCD exhibited sensitivity similar to the processed red and blue pixel channels, yet a greater degree of detection linearity. Raw phone pixel data showed lower sensitivity but greater linearity than processed data. Overall, both qualitative and quantitative results provided strong evidence of the potential of phone-based NIR imaging, which may lead to a wide range of applications from cancer detection to glucose sensing.
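Detection linearity, one of the performance characteristics evaluated above, can be quantified with a straight-line fit and its R^2. A sketch with synthetic data standing in for the measured channel signals:

```python
import numpy as np

# Sketch: quantify a camera channel's detection linearity by fitting signal
# vs. fluorophore concentration and reporting R^2. Data here are synthetic
# stand-ins for the red/green/blue pixel channel measurements in the study.
def linearity_r2(concentration, signal):
    slope, intercept = np.polyfit(concentration, signal, 1)
    fit = slope * concentration + intercept
    ss_res = np.sum((signal - fit) ** 2)
    ss_tot = np.sum((signal - signal.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

conc = np.linspace(0.1, 1.0, 10)
signal = 850.0 * conc + 12.0                 # perfectly linear synthetic response
print(round(linearity_r2(conc, signal), 3))  # -> 1.0
```

Gamma correction and other in-phone processing bend this relationship, which is one plausible reason the raw pixel data showed greater linearity than the processed images.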
View of 'Cape St. Mary' from 'Cape Verde' (Altered Contrast)
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape St. Mary' from the vantage point of 'Cape Verde,' the next promontory counterclockwise around the crater's deeply scalloped rim. This view of Cape St. Mary combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic with contrast adjusted to improve the visibility of details in shaded areas. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. Near the base of the Cape St. Mary cliff are layers with a pattern called 'crossbedding,' intersecting with each other at angles, rather than parallel to each other. Large-scale crossbedding can result from material being deposited as wind-blown dunes. The images combined into this mosaic were taken during the 970th Martian day, or sol, of Opportunity's Mars-surface mission (Oct. 16, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
The Beagle 2 Stereo Camera System: Scientific Objectives and Design Characteristics
NASA Astrophysics Data System (ADS)
Griffiths, A.; Coates, A.; Josset, J.; Paar, G.; Sims, M.
2003-04-01
The Stereo Camera System (SCS) will provide wide-angle (48 degree) multi-spectral stereo imaging of the Beagle 2 landing site in Isidis Planitia with an angular resolution of 0.75 milliradians. Based on the SpaceX Modular Micro-Imager, the SCS is composed of twin cameras (each with a 1024 by 1024 pixel frame transfer CCD) and twin filter wheel units (with a combined total of 24 filters). The primary mission objective is to construct a digital elevation model of the area in reach of the lander's robot arm. The SCS specifications and the following baseline studies are described: panoramic RGB colour imaging of the landing site, panoramic multi-spectral imaging at 12 distinct wavelengths to study the mineralogy of the landing site, and solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged are multi-spectral observations of Phobos & Deimos (observations of the moons relative to background stars will be used to determine the lander's location and orientation relative to the Martian surface), monitoring of the landing site to detect temporal changes, observation of the actions and effects of the other PAW experiments (including rock texture studies with a close-up lens) and collaborative observations with the Mars Express orbiter instrument teams. Due to be launched in May of this year, the total system mass is 360 g, the required volume envelope is 747 cm^3 and the average power consumption is 1.8 W. A 10 Mbit/s RS422 bus connects each camera to the lander common electronics.
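The quoted angular resolution can be sanity-checked from the field of view and detector format. A crude uniform-sampling estimate (ignoring the optical distortion that presumably accounts for the small difference from the quoted 0.75 mrad):

```python
import math

# Sketch: per-pixel angular resolution from field of view and pixel count,
# applied to the SCS numbers quoted above (48 degrees over 1024 pixels).
def angular_resolution_mrad(fov_deg, pixels):
    return math.radians(fov_deg) / pixels * 1000.0

print(f"{angular_resolution_mrad(48.0, 1024):.2f} mrad/pixel")  # -> 0.82 mrad/pixel
```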
NASA Technical Reports Server (NTRS)
2004-01-01
In these line graphs of laboratory spectra, it is evident that different minerals have different spectra. The graph on the left shows the typical minerals found in igneous rocks, which are rocks related to magma or volcanic activity. The graph on the right shows iron-bearing candidates for further study and comparison to spectra from the Mars Exploration Rover panoramic cameras on Mars.
Things Aren't Always What They Seem
NASA Technical Reports Server (NTRS)
2004-01-01
This mosaic was assembled from images taken by the panoramic camera on the Mars Exploration Rover Spirit at a region dubbed 'site 31.' Spirit is looking at 'Missoula Crater.' From orbit, the features within the crater appeared to be ejecta from the younger 'Bonneville Crater,' but Spirit's closer look revealed wind-blown drift deposits, not ejecta, within Missoula Crater.
Samara Probe For Remote Imaging
NASA Technical Reports Server (NTRS)
Burke, James D.
1989-01-01
Imaging probe descends through atmosphere of planet, obtaining images of ground surface as it travels. Released from aircraft over Earth or from spacecraft over another planet. Body and single wing shaped like samara - winged seed like those of maple trees. Rotates as descends, providing panoramic view of terrain below. Radio image obtained by video camera to aircraft or spacecraft overhead.
Photometric Observations of Soils and Rocks at the Mars Exploration Rover Landing Sites
NASA Technical Reports Server (NTRS)
Johnson, J. R.; Arvidson, R. A.; Bell, J. F., III; Farrand, W.; Guinness, E.; Johnson, M.; Herkenhoff, K. E.; Lemmon, M.; Morris, R. V.; Seelos, F., IV
2005-01-01
The Panoramic Cameras (Pancam) on the Spirit and Opportunity Mars Exploration Rovers have acquired multispectral reflectance observations of rocks and soils at different incidence, emission, and phase angles that will be used for photometric modeling of surface materials. Phase angle coverage at both sites extends from approx. 0 deg. to approx. 155 deg.
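Incidence, emission, and phase angles are linked through spherical trigonometry. A small sketch of the standard relation (not specific to Pancam processing), with illustrative angles:

```python
import math

# Sketch: phase angle g from incidence i, emission e, and the azimuth between
# the illumination and viewing directions (spherical law of cosines).
def phase_angle_deg(i_deg, e_deg, azimuth_deg):
    i, e, az = map(math.radians, (i_deg, e_deg, azimuth_deg))
    cos_g = math.cos(i) * math.cos(e) + math.sin(i) * math.sin(e) * math.cos(az)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_g))))

print(round(phase_angle_deg(30.0, 45.0, 180.0), 1))  # -> 75.0
```

Sweeping the viewing azimuth at fixed sun position is how a rover builds up the wide 0-155 degree phase coverage described above.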
Kang, Han Gyu; Lee, Ho-Young; Kim, Kyeong Min; Song, Seong-Hyun; Hong, Gun Chul; Hong, Seong Jong
2017-01-01
The aim of this study is to integrate NIR, gamma, and visible imaging tools into a single endoscopic system to overcome the limitation of NIR using gamma imaging and to demonstrate the feasibility of endoscopic NIR/gamma/visible fusion imaging for sentinel lymph node (SLN) mapping with a small animal. The endoscopic NIR/gamma/visible imaging system consists of a tungsten pinhole collimator, a plastic focusing lens, a BGO crystal (11 × 11 × 2 mm^3), a fiber-optic taper (front = 11 × 11 mm^2, end = 4 × 4 mm^2), a 122-cm long endoscopic fiber bundle, an NIR emission filter, a relay lens, and a CCD camera. A custom-made Derenzo-like phantom filled with a mixture of 99mTc and indocyanine green (ICG) was used to assess the spatial resolution of the NIR and gamma images. The ICG fluorophore was excited using a light-emitting diode (LED) with an excitation filter (723-758 nm), and the emitted fluorescence photons were detected with an emission filter (780-820 nm) for a duration of 100 ms. Subsequently, the 99mTc distribution in the phantom was imaged for 3 min. The feasibility of in vivo SLN mapping with a mouse was investigated by injecting a mixture of 99mTc-antimony sulfur colloid (12 MBq) and ICG (0.1 mL) into the right paw of the mouse (C57/B6) subcutaneously. After one hour, NIR, gamma, and visible images were acquired sequentially. Subsequently, the dissected SLN was imaged in the same way as the in vivo SLN mapping. The NIR, gamma, and visible images of the Derenzo-like phantom can be obtained with the proposed endoscopic imaging system. The NIR/gamma/visible fusion image of the SLN showed a good correlation among the NIR, gamma, and visible images both for the in vivo and ex vivo imaging. We demonstrated the feasibility of the integrated NIR/gamma/visible imaging system using a single endoscopic fiber bundle.
In future, we plan to investigate miniaturization of the endoscope head and simultaneous NIR/gamma/visible imaging with dichroic mirrors and three CCD cameras. © 2016 American Association of Physicists in Medicine.
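One plausible way to present such co-registered channels (a generic additive overlay, not necessarily the authors' fusion method) is to blend the NIR and gamma signals into color channels of the visible frame:

```python
import numpy as np

# Sketch: simple alpha-blend fusion of co-registered NIR, gamma, and visible
# frames. Arrays are synthetic stand-ins, already normalized to [0, 1]; the
# channel assignments and alpha weights are illustrative choices.
def fuse(visible_rgb, nir, gamma, nir_alpha=0.4, gamma_alpha=0.4):
    fused = visible_rgb.copy()
    fused[..., 1] = np.clip(fused[..., 1] + nir_alpha * nir, 0, 1)      # NIR -> green
    fused[..., 0] = np.clip(fused[..., 0] + gamma_alpha * gamma, 0, 1)  # gamma -> red
    return fused

vis = np.zeros((32, 32, 3))
nir = np.ones((32, 32)) * 0.5
gam = np.zeros((32, 32))
out = fuse(vis, nir, gam)
print(out.shape, float(out[..., 1].max()))  # -> (32, 32, 3) 0.2
```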
Newer views of the Moon: Comparing spectra from Clementine and the Moon Mineralogy Mapper
Kramer, G.Y.; Besse, S.; Nettles, J.; Combe, J.-P.; Clark, R.N.; Pieters, C.M.; Staid, M.; Malaret, E.; Boardman, J.; Green, R.O.; Head, J.W.; McCord, T.B.
2011-01-01
The Moon Mineralogy Mapper (M3) provided the first global hyperspectral data of the lunar surface in 85 bands from 460 to 2980 nm. The Clementine mission provided the first global multispectral maps of the lunar surface in 11 spectral bands across the ultraviolet-visible (UV-VIS) and near-infrared (NIR). In an effort to understand how M3 improves our ability to analyze and interpret lunar data, we compare M3 spectra with those from Clementine's UV-VIS and NIR cameras. We have found that M3 reflectance values are lower across all wavelengths compared with albedos from both of Clementine's UV-VIS and NIR cameras. M3 spectra show the Moon to be redder, that is, to have a steeper continuum slope, than indicated by Clementine. The 1 μm absorption band depths may be comparable between the instruments, but Clementine data consistently exhibit shallower 2 μm band depths than M3. Absorption band minima are difficult to compare due to the significantly different spectral resolutions. Copyright 2011 by the American Geophysical Union.
Lee, John Y-K.; Thawani, Jayesh P.; Pierce, John; Zeh, Ryan; Martinez-Lage, Maria; Chanin, Michelle; Venegas, Ollin; Nims, Sarah; Learned, Kim; Keating, Jane; Singhal, Sunil
2016-01-01
Background Although real-time localization of gliomas has improved with intraoperative image guidance systems, these tools are limited by brain shift, surgical cavity deformation, and expense. Objective To propose a novel method to perform near-infrared (NIR) imaging during glioma resections based on preclinical and clinical investigations, in order to localize tumors and to potentially identify residual disease. Methods Fifteen patients were identified and administered an FDA-approved, NIR contrast agent (Second Window indocyanine green [ICG], 5 mg/kg) prior to surgical resection. An NIR camera was utilized to localize the tumor prior to resection and to visualize surgical margins following resection. Neuropathology and MR imaging data were used to assess the accuracy and precision of NIR-fluorescence in identifying tumor tissue. Results NIR visualization of 15 gliomas (10 glioblastoma multiforme, 1 anaplastic astrocytoma, 2 low grade astrocytoma, 1 juvenile pilocytic astrocytoma, and 1 ganglioglioma) was performed 22.7 hours (mean) after intravenous injection of ICG. During surgery, 12/15 tumors were visualized with the NIR camera. The mean signal-to-background ratio was 9.5 ± 0.8 and fluorescence was noted through the dura to a maximum parenchymal depth of 13 mm. The best predictor of positive fluorescence was enhancement on T1-weighted imaging; this correlated with SBR (P = .03). Non-enhancing tumors did not demonstrate NIR fluorescence. Using pathology as the gold standard, the technique demonstrated a sensitivity of 98% and specificity of 45% to identify tumor in gadolinium-enhancing specimens (n = 71). Conclusion Using Second Window ICG, gadolinium-enhancing tumors can be localized through brain parenchyma intraoperatively. Its utility for margin detection is promising but limited by lower specificity. PMID:27741220
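The reported 98% sensitivity and 45% specificity follow from the standard confusion-matrix definitions against the pathology gold standard. A small sketch is below; the counts used in the example are illustrative, not the study's actual per-specimen tallies.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).

    tp: tumor specimens that fluoresced; fn: tumor specimens that did not;
    tn: non-tumor specimens without fluorescence; fp: non-tumor specimens
    that nonetheless fluoresced.
    """
    return tp / (tp + fn), tn / (tn + fp)
```

For instance, 49 of 50 fluorescent tumor specimens and 9 of 20 correctly dark non-tumor specimens would reproduce the paper's 0.98 / 0.45 figures.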
Discovery of the near-infrared counterpart to the luminous neutron-star low-mass X-ray binary GX 3+1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van den Berg, Maureen; Fridriksson, Joel K.; Homan, Jeroen
2014-10-01
Using the High Resolution Camera on board the Chandra X-ray Observatory, we have measured an accurate position for the bright persistent neutron star X-ray binary and atoll source GX 3+1. At a location that is consistent with this new position, we have discovered the near-infrared (NIR) counterpart to GX 3+1 in images taken with the PANIC and FourStar cameras on the Magellan Baade Telescope. The identification of this Ks = 15.8 ± 0.1 mag star as the counterpart is based on the presence of a Brγ emission line in an NIR spectrum taken with the Folded-port InfraRed Echelette spectrograph on the Baade Telescope. The absolute magnitude derived from the best available distance estimate to GX 3+1 indicates that the mass donor in the system is not a late-type giant. We find that the NIR light in GX 3+1 is likely dominated by the contribution from a heated outer accretion disk. This is similar to what has been found for the NIR flux from the brighter class of Z sources, but unlike the behavior of atolls fainter (LX ≈ 10^36-10^37 erg s^-1) than GX 3+1, where optically thin synchrotron emission from a jet probably dominates the NIR flux.
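The giant-versus-dwarf argument rests on the distance modulus. A sketch, neglecting interstellar extinction (the paper's adopted distance is not quoted in this abstract, so any distance plugged in here is a placeholder):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Distance modulus: M = m - 5 * log10(d / 10 pc).

    Extinction is neglected; for a Galactic bulge source like GX 3+1
    the true NIR correction would be non-negligible.
    """
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)
```

With the abstract's Ks = 15.8 and an assumed distance of order 6 kpc, this gives M_Ks ≈ +1.9, orders of magnitude fainter than a late-type giant, consistent with the authors' conclusion about the donor.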
Dworak, Volker; Selbeck, Joern; Dammer, Karl-Heinz; Hoffmann, Matthias; Zarezadeh, Ali Akbar; Bobda, Christophe
2013-01-24
The application of (smart) cameras for process control, mapping, and advanced imaging in agriculture has become an element of precision farming that facilitates the conservation of fertilizer, pesticides, and machine time. This technique additionally reduces the amount of energy required in terms of fuel. Although research activities have increased in this field, high camera prices reflect low adaptation to applications in all fields of agriculture. Smart, low-cost cameras adapted for agricultural applications can overcome this drawback. The normalized difference vegetation index (NDVI) for each image pixel is an applicable algorithm to discriminate plant information from the soil background enabled by a large difference in the reflectance between the near infrared (NIR) and the red channel optical frequency band. Two aligned charge coupled device (CCD) chips for the red and NIR channel are typically used, but they are expensive because of the precise optical alignment required. Therefore, much attention has been given to the development of alternative camera designs. In this study, the advantage of a smart one-chip camera design with NDVI image performance is demonstrated in terms of low cost and simplified design. The required assembly and pixel modifications are described, and new algorithms for establishing an enhanced NDVI image quality for data processing are discussed.
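The per-pixel NDVI referred to above is the usual normalized band ratio. A minimal sketch follows; the small epsilon guarding against division by zero on dark pixels is an implementation detail assumed here, not taken from the paper.

```python
def ndvi(nir, red, eps=1e-12):
    """Normalized difference vegetation index for one pixel pair."""
    return (nir - red) / (nir + red + eps)

def ndvi_image(nir_band, red_band):
    """Apply NDVI to two co-registered 2-D lists of reflectances."""
    return [[ndvi(n, r) for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]
```

Vegetation reflects strongly in the NIR and weakly in the red, so plant pixels push toward +1 while bare soil stays near zero, which is exactly the plant/soil discrimination the abstract describes.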
NASA Technical Reports Server (NTRS)
Moore, H. J.; Wu, S. C.
1973-01-01
We examined the effect of reading error on two hypothetical slope frequency distributions and two slope frequency distributions from actual lunar data in order to ensure that these errors do not cause excessive overestimates of algebraic standard deviations for the slope frequency distributions. The errors introduced are insignificant when the reading error is small and the slope length is large. A method for correcting the errors in slope frequency distributions is presented and applied to 11 distributions obtained from Apollo 15, 16, and 17 panoramic camera photographs and Apollo 16 metric camera photographs.
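If the reading error is modeled as independent additive noise, the overestimated standard deviation can be corrected by subtracting variances in quadrature. The sketch below illustrates that idea under that assumption; it is not necessarily the exact correction Moore and Wu apply.

```python
import math

def corrected_sd(observed_sd, reading_error_sd):
    """Remove independent additive reading error in quadrature:
    sigma_true = sqrt(sigma_obs^2 - sigma_err^2)."""
    var = observed_sd ** 2 - reading_error_sd ** 2
    if var < 0:
        raise ValueError("reading error exceeds observed scatter")
    return math.sqrt(var)
```

Note how the correction vanishes when the reading error is small relative to the observed scatter, matching the abstract's remark that the errors are insignificant for small reading error and long slope lengths.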
'Berries' Here, There, Everywhere
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true-color image suggests that the plains beyond the small crater where the Mars Exploration Rover Opportunity now sits are littered with the same dark grey material found inside the crater in the form of spherules or 'blueberries.' Because Mars orbiters have observed the iron-bearing mineral hematite across these plains, scientists hypothesize that the blueberries are also made up of this mineral. This image was taken by the rover's panoramic camera on the 17th martian day, or sol, of its mission. Data from the camera's red, green and blue filters were combined to create this image.
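Combining filter images into an approximate true-color product, as described above, amounts to stacking three co-registered grayscale frames into one RGB image. A minimal sketch follows; real Pancam pipelines also radiometrically calibrate and color-balance the bands, which is omitted here.

```python
def compose_rgb(red_img, green_img, blue_img):
    """Stack three co-registered grayscale frames (2-D lists of
    0-255 ints) into a 2-D list of (r, g, b) pixel tuples."""
    return [[(r, g, b) for r, g, b in zip(rrow, grow, brow)]
            for rrow, grow, brow in zip(red_img, green_img, blue_img)]
```

The same mechanism produces the false-color views elsewhere in this collection: the only change is which filter image is assigned to which display channel.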
NASA Technical Reports Server (NTRS)
2004-01-01
The microscopic imager (circular device in center) is in clear view above the surface at Meridiani Planum, Mars, in this approximate true-color image taken by the panoramic camera on the Mars Exploration Rover Opportunity. The image was taken on the 9th sol of the rover's journey. The microscopic imager is located on the rover's instrument deployment device, or arm. The arrow is pointing to the lens of the instrument. Note the dust cover, which flips out to the left of the lens, is open. This approximated color image was created using the camera's violet and infrared filters as blue and red.
Multi-scale auroral observations in Apatity: winter 2010-2011
NASA Astrophysics Data System (ADS)
Kozelov, B. V.; Pilgaev, S. V.; Borovkov, L. P.; Yurov, V. E.
2012-03-01
Routine observations of the aurora are conducted in Apatity by a set of five cameras: (i) all-sky TV camera Watec WAT-902K (1/2"CCD) with Fujinon lens YV2.2 × 1.4A-SA2; (ii) two monochromatic cameras Guppy F-044B NIR (1/2"CCD) with Fujinon HF25HA-1B (1:1.4/25 mm) lens for 18° field of view and glass filter 558 nm; (iii) two color cameras Guppy F-044C NIR (1/2"CCD) with Fujinon DF6HA-1B (1:1.2/6 mm) lens for 67° field of view. The observational complex is aimed at investigating spatial structure of the aurora, its scaling properties, and vertical distribution in the rayed forms. The cameras were installed on the main building of the Apatity division of the Polar Geophysical Institute and at the Apatity stratospheric range. The distance between these sites is nearly 4 km, so the identical monochromatic cameras can be used as a stereoscopic system. All cameras are accessible and operated remotely via Internet. For 2010-2011 winter season the equipment was upgraded by special blocks of GPS-time triggering, temperature control and motorized pan-tilt rotation mounts. This paper presents the equipment, samples of observed events and the web-site with access to available data previews.
Using Google Streetview Panoramic Imagery for Geoscience Education
NASA Astrophysics Data System (ADS)
De Paor, D. G.; Dordevic, M. M.
2014-12-01
Google Streetview is a feature of Google Maps and Google Earth that allows viewers to switch from map or satellite view to 360° panoramic imagery recorded close to the ground. Most panoramas are recorded by Google engineers using special cameras mounted on the roofs of cars. Bicycles, snowmobiles, and boats have also been used and sometimes the camera has been mounted on a backpack for off-road use by hikers and skiers or attached to scuba-diving gear for "Underwater Streetview (sic)." Streetview panoramas are linked together so that the viewer can change viewpoint by clicking forward and reverse buttons. They therefore create a 4-D touring effect. As part of the GEODE project ("Google Earth for Onsite and Distance Education"), we are experimenting with the use of Streetview imagery for geoscience education. Our web-based test application allows instructors to select locations for students to study. Students are presented with a set of questions or tasks that they must address by studying the panoramic imagery. Questions include identification of rock types, structures such as faults, and general geological setting. The student view is locked into Streetview mode until they submit their answers, whereupon the map and satellite views become available, allowing students to zoom out and verify their location on Earth. Student learning is scaffolded by automatic computerized feedback. Many existing Streetview panoramas have rich geological content. Additionally, instructors and members of the general public can create panoramas, including 360° Photo Spheres, by stitching images taken with their mobile devices and submitting them to Google for evaluation and hosting. A multi-thousand-dollar, multi-directional camera and mount can be purchased from DIY-streetview.com. This allows power users to generate their own high-resolution panoramas. A cheaper, 360° video camera is soon to be released according to geonaute.com.
Thus there are opportunities for geoscience educators both to use existing Streetview imagery and to generate new imagery for specific locations of geological interest. The GEODE team includes the authors and: H. Almquist, C. Bentley, S. Burgin, C. Cervato, G. Cooper, P. Karabinos, T. Pavlis, J. Piatek, B. Richards, J. Ryan, R. Schott, K. St. John, B. Tewksbury, and S. Whitmeyer.
NASA Technical Reports Server (NTRS)
2004-01-01
This close-up image of the Mars Exploration Rover Spirit's instrument deployment device, or 'arm,' shows the donut-shaped plate on the Moessbauer spectrometer. This image makes it easy to recognize the imprint left by the instrument in the martian soil at a location called 'Peak' on sol 43 (February 16, 2004). This image was taken by the rover's panoramic camera on sol 39 (February 11, 2004).
NASA Technical Reports Server (NTRS)
2004-01-01
This false-color image taken by the panoramic camera on the Mars Exploration Rover Spirit shows the rock dubbed 'Pot of Gold' (upper left), located near the base of the 'Columbia Hills' in Gusev Crater. The rock's nodules and layered appearance have inspired rover team members to investigate the rock's detailed chemistry in coming sols. This picture was taken on sol 158 (June 13, 2004).
HUBBLE'S PANORAMIC PICTURE OF COMET SHOEMAKER-LEVY 9
NASA Technical Reports Server (NTRS)
2002-01-01
Infrared image shows bright spot, aftermath of the impact of the first fragment of Comet Shoemaker-Levy 9 on the planet Jupiter. The image was made using an infrared camera built by Ohio State University and the 4-meter telescope at the Cerro Tololo Interamerican Observatory (CTIO) at La Serena, Chile. Credit: John Spencer (Lowell Observatory), Darren Depoy (Ohio State University), CTIO.
Color View of a 'Rat' Hole Trail Inside 'Endurance'
NASA Technical Reports Server (NTRS)
2004-01-01
This view from the Mars Exploration Rover Opportunity's panoramic camera is an approximately true color rendering of the first seven holes that the rover's rock abrasion tool dug on the inner slope of 'Endurance Crater.' The rover was about 12 meters (about 39 feet) down into the crater when it acquired the images combined into this mosaic. The view is looking back toward the rim of the crater, with the rover's tracks visible. The tailings around the holes drilled by the rock abrasion tool, or 'Rat,' show evidence for fine-grained red hematite similar to what was observed months earlier in 'Eagle Crater' outcrop holes. Starting from the uppermost pictured (closest to the crater rim) to the lowest, the rock abrasion tool hole targets are called 'Tennessee,' 'Cobblehill,' 'Virginia,' 'London,' 'Grindstone,' 'Kettlestone,' and 'Drammensfjorden.' Opportunity drilled these holes on sols 138 (June 13, 2004), 143 (June 18), 145 (June 20), 148 (June 23), 151 (June 26), 153 (June 28) and 161 (July 7), respectively. Each hole is 4.5 centimeters (1.8 inches) in diameter. This image was generated using the panoramic camera's 750-, 530-, and 430-nanometer filters. It was taken on sol 173 (July 19).
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site] Figure 1 In the quest to determine if a pebble was jamming the rock abrasion tool on NASA's Mars Exploration Rover Opportunity, scientists and engineers examined this up-close, approximate true-color image of the tool. The picture was taken by the rover's panoramic camera, using filters centered at 601, 535, and 482 nanometers, at 12:47 local solar time on sol 200 (August 16, 2004).
Colored spots have been drawn on this image corresponding to regions where panoramic camera reflectance spectra were acquired (see chart in Figure 1). Those regions are: the grinding wheel heads (yellow); the rock abrasion tool magnets (green); the supposed pebble (red); a sunlit portion of the aluminum rock abrasion tool housing (purple); and a shadowed portion of the rock abrasion tool housing (brown). These spectra demonstrated that the composition of the supposed pebble was clearly different from that of the sunlit and shadowed portions of the rock abrasion tool, while similar to that of the dust-coated rock abrasion tool magnets and grinding heads. This led the team to conclude that the object disabling the rock abrasion tool was indeed a martian pebble.
The Two Moons of Mars as Seen from Mars
NASA Technical Reports Server (NTRS)
2005-01-01
Taking advantage of extra solar energy collected during the day, NASA's Mars Exploration Rover Spirit settled in for an evening of stargazing, photographing the two moons of Mars as they crossed the night sky. 'It is incredibly cool to be running an observatory on another planet,' said planetary scientist Jim Bell of Cornell University, Ithaca, N.Y., lead scientist for the panoramic cameras on Spirit and Opportunity. This time-lapse composite, acquired the evening of Spirit's martian sol 585 (Aug. 26, 2005) from a perch atop 'Husband Hill' in Gusev Crater, shows Phobos, the brighter moon, on the right, and Deimos, the dimmer moon, on the left. Tiny streaks mark the trails of background stars moving across the sky or the impact of cosmic rays lighting up random groups of pixels in the image. Scientists will use images of the two moons to better map their orbital positions, learn more about their composition, and monitor the presence of nighttime clouds or haze. Spirit took the five images that make up this composite using the panoramic camera's broadband filter, which was designed specifically for acquiring images under low-light conditions.
NASA Astrophysics Data System (ADS)
Piermattei, Livia; Bozzi, Carlo Alberto; Mancini, Adriano; Tassetti, Anna Nora; Karel, Wilfried; Pfeifer, Norbert
2017-04-01
Unmanned aerial vehicles (UAVs) in combination with consumer grade cameras have become standard tools for photogrammetric applications and surveying. The recent generation of multispectral, cost-efficient and lightweight cameras has fostered a breakthrough in the practical application of UAVs for precision agriculture. For this application, multispectral cameras typically use Green, Red, Red-Edge (RE) and Near Infrared (NIR) wavebands to capture both visible and invisible images of crops and vegetation. These bands are very effective for deriving characteristics like soil productivity, plant health and overall growth. However, the quality of results is affected by the sensor architecture, the spatial and spectral resolutions, the pattern of image collection, and the processing of the multispectral images. In particular, collecting data with multiple sensors requires an accurate spatial co-registration of the various UAV image datasets. Multispectral processed data in precision agriculture are mainly presented as orthorectified mosaics used to export information maps and vegetation indices. This work aims to investigate the acquisition parameters and processing approaches of this new type of image data in order to generate orthoimages using different sensors and UAV platforms. Within our experimental area we placed a grid of artificial targets, whose position was determined with differential global positioning system (dGPS) measurements. Targets were used as ground control points to georeference the images and as checkpoints to verify the accuracy of the georeferenced mosaics. The primary aim is to present a method for the spatial co-registration of visible, Red-Edge, and NIR image sets. To demonstrate the applicability and accuracy of our methodology, multi-sensor datasets were collected over the same area and approximately at the same time using the fixed-wing UAV senseFly "eBee". 
The images were acquired with the camera Canon S110 RGB, the multispectral cameras Canon S110 NIR and S110 RE and with the multi-camera system Parrot Sequoia, which is composed of single-band cameras (Green, Red, Red Edge, NIR and RGB). Imagery from each sensor was georeferenced and mosaicked with the commercial software Agisoft PhotoScan Pro and different approaches for image orientation were compared. To assess the overall spatial accuracy of each dataset the root mean square error was computed between check point coordinates measured with dGPS and coordinates retrieved from georeferenced image mosaics. Additionally, image datasets from different UAV platforms (i.e. DJI Phantom 4Pro, DJI Phantom 3 professional, and DJI Inspire 1 Pro) were acquired over the same area and the spatial accuracy of the orthoimages was evaluated.
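The checkpoint accuracy measure described above is the root mean square error over the residuals between mosaic-derived and dGPS-surveyed target positions. A planimetric sketch:

```python
import math

def checkpoint_rmse(measured, reference):
    """RMSE of 2-D checkpoint residuals.

    measured, reference: equal-length lists of (x, y) coordinates,
    e.g. positions read from the georeferenced mosaic vs the
    dGPS-surveyed ground truth.
    """
    sq = [(mx - rx) ** 2 + (my - ry) ** 2
          for (mx, my), (rx, ry) in zip(measured, reference)]
    return math.sqrt(sum(sq) / len(sq))
```

The same computation extends to 3-D by adding a z-residual term; the abstract does not specify whether height residuals were included, so the 2-D form is shown.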
Sávio, Luís Felipe; Panizzutti Barboza, Marcelo; Alameddine, Mahmoud; Ahdoot, Michael; Alonzo, David; Ritch, Chad R
2018-03-01
To describe our novel technique for performing a combined partial penectomy and bilateral robotic inguinal lymphadenectomy using intraoperative near-infrared (NIR) fluorescence guidance with indocyanine green (ICG) and the DaVinci Firefly camera system. A 58-year-old man presented status post recent excisional biopsy of a 2-cm lesion on the left coronal aspect of the glans penis. Pathology revealed "invasive squamous cell carcinoma of the penis with multifocal positive margins." His examination was suspicious for cT2 primary and his inguinal nodes were cN0. He was counseled to undergo partial penectomy with possible combined vs staged bilateral robotic inguinal lymphadenectomy. Preoperative computed tomography scan was negative for pathologic lymphadenopathy. Before incision, 5 mL of ICG was injected subcutaneously beneath the tumor. Bilateral thigh pockets were then developed simultaneously and a right, then left robotic modified inguinal lymphadenectomy was performed using NIR fluorescence guidance via the DaVinci Firefly camera. A partial penectomy was then performed in the standard fashion. The combined procedure was performed successfully without complication. Total operative time was 379 minutes and total robotic console time was 95 minutes for the right and 58 minutes for the left. Estimated blood loss on the right and left were 15 and 25 mL, respectively. A total of 24 lymph nodes were retrieved. This video demonstrates a safe and feasible approach for combined partial penectomy and bilateral inguinal lymphadenectomy with NIR guidance using ICG and the DaVinci Firefly camera system. The combined robotic approach has minimal morbidity and avoids the need for a staged procedure. Furthermore, use of NIR guidance with ICG during robotic inguinal lymphadenectomy is feasible and may help identify sentinel lymph nodes and improve the quality of dissection. Further studies are needed to confirm the utility of NIR guidance for robotic sentinel lymph node dissection.
Copyright © 2017 Elsevier Inc. All rights reserved.
View of 'Cape St. Mary' from 'Cape Verde' (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape St. Mary' from the vantage point of 'Cape Verde,' the next promontory counterclockwise around the crater's deeply scalloped rim. This view of Cape St. Mary combines several exposures taken by the rover's panoramic camera into a false-color mosaic. Contrast has been adjusted to improve the visibility of details in shaded areas. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. Near the base of the Cape St. Mary cliff are layers with a pattern called 'crossbedding,' intersecting with each other at angles, rather than parallel to each other. Large-scale crossbedding can result from material being deposited as wind-blown dunes. The images combined into this mosaic were taken during the 970th Martian day, or sol, of Opportunity's Mars-surface mission (Oct. 16, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters. The false color enhances subtle color differences among materials in the rocks and soils of the scene.
Design and Implementation of a Novel Portable 360° Stereo Camera System with Low-Cost Action Cameras
NASA Astrophysics Data System (ADS)
Holdener, D.; Nebiker, S.; Blaser, S.
2017-11-01
The demand for capturing indoor spaces is rising with the digitalization trend in the construction industry. An efficient solution for measuring challenging indoor environments is mobile mapping. Image-based systems with 360° panoramic coverage allow a rapid data acquisition and can be processed to georeferenced 3D images hosted in cloud-based 3D geoinformation services. For the multiview stereo camera system presented in this paper, a 360° coverage is achieved with a layout consisting of five horizontal stereo image pairs in a circular arrangement. The design is implemented as a low-cost solution based on a 3D printed camera rig and action cameras with fisheye lenses. The fisheye stereo system is successfully calibrated with accuracies sufficient for the applied measurement task. A comparison of 3D distances with reference data delivers maximal deviations of 3 cm on typical distances in indoor space of 2-8 m. Also the automatic computation of coloured point clouds from the stereo pairs is demonstrated.
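For a rectified stereo pair like those in the five-pair rig above, range follows from the standard disparity relation Z = f·B/d. A sketch with hypothetical calibration values (the paper's actual focal length and baseline are not quoted in this abstract):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole rectified-stereo range: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: stereo baseline in
    meters; disparity_px: horizontal pixel offset of a matched point.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, an assumed 1000 px focal length and 0.2 m baseline give a 4 m range at 50 px disparity, squarely within the 2-8 m indoor distances the authors evaluate; fisheye imagery would first be undistorted or handled with a fisheye-specific stereo model.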
Extra-luminal detection of assumed colonic tumor site by near-infrared laparoscopy.
Zako, Tamotsu; Ito, Masaaki; Hyodo, Hiroshi; Yoshimoto, Miya; Watanabe, Masayuki; Takemura, Hiroshi; Kishimoto, Hidehiro; Kaneko, Kazuhiro; Soga, Kohei; Maeda, Mizuo
2016-09-01
Localization of colorectal tumors during laparoscopic surgery is generally performed by tattooing into the submucosal layer of the colon. However, faint and diffuse tattoos may lead to difficulties in recognizing cancer sites, resulting in inappropriate resection of the colon. We previously demonstrated that yttrium oxide nanoparticles doped with the rare earth ions (ytterbium and erbium) (YNP) showed strong near-infrared (NIR) emission under NIR excitation (1550 nm emission with 980 nm excitation). NIR light can penetrate deep tissues. In this study, we developed an NIR laparoscopy imaging system and demonstrated its use for accurate resection of the colon in swine. The NIR laparoscopy system consisted of an NIR laparoscope, NIR excitation laser diode, and an NIR camera. Endo-clips coated with YNP (NIR clip), silicon rubber including YNP (NIR silicon mass), and YNP solution (NIR ink) were prepared as test NIR markers. We used a swine model to detect an assumed colon cancer site using NIR laparoscopy, followed by laparoscopic resection. The NIR markers were fixed at an assumed cancer site within the colon by endoscopy. An NIR laparoscope was then introduced into the abdominal cavity through a laparoscopy port. NIR emission from the markers in the swine colon was successfully recognized using the NIR laparoscopy imaging system. The position of the markers in the colon could be identified. Accurate resection of the colon was performed successfully by laparoscopic surgery under NIR fluorescence guidance. The presence of the NIR markers within the extirpated colon was confirmed, indicating resection of the appropriate site. NIR laparoscopic surgery is useful for colorectal cancer site recognition and accurate resection using laparoscopic surgery.
Note: Retrofitting an analog spectrometer for high resolving power in NUV-NIR
NASA Astrophysics Data System (ADS)
Taylor, Andrew S.; Batishchev, Oleg V.
2017-11-01
We demonstrate how an older spectrometer designed for photographic films can be efficiently retrofitted with a narrow laser-cut slit and a modern μm-pixel-size imaging CMOS camera, yielding sub-pm resolution in the broad near ultraviolet to near infrared (NUV-NIR) spectral range. Resolving power approaching 10^6 is achieved. Such digital retrofitting of an analog instrument is practical for research and teaching laboratories.
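Spectral resolving power is defined as R = λ/Δλ, which shows why sub-pm resolution in the visible corresponds to R near 10^6:

```python
def resolving_power(wavelength_nm, delta_wavelength_nm):
    """Spectral resolving power R = lambda / delta-lambda
    (both arguments in the same units, nm here)."""
    return wavelength_nm / delta_wavelength_nm
```

At 500 nm, resolving a 0.5 pm (0.0005 nm) feature gives R = 10^6, consistent with the figure quoted in the abstract.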
NASA Technical Reports Server (NTRS)
2004-01-01
This 3-D image taken by the left and right eyes of the panoramic camera on the Mars Exploration Rover Spirit shows the odd rock formation dubbed 'Cobra Hoods' (center). Rover scientists say this resistant rock is unlike anything they've seen on Mars so far. Spirit will investigate the rock in coming sols. The stereo pictures making up this image were captured on sol 156 (June 11, 2004).
Novel fast catadioptric objective with wide field of view
NASA Astrophysics Data System (ADS)
Muñoz, Fernando; Infante Herrero, José M.; Benítez, Pablo; Miñano, Juan C.; Lin, Wang; Vilaplana, Juan; Biot, Guillermo; de la Fuente, Marta
2010-08-01
Using the Simultaneous Multiple Surface method in 2D (SMS2D), we present a fast catadioptric objective with a wide field of view (125°×96°) designed for a microbolometer detector with 640×480 pixels and a 25 micron pixel pitch. Keywords: infrared lens design, thermal imaging, Schwarzschild configuration, SMS2D, wide field of view, driving cameras, panoramic systems
Optical changes of dentin in the near-IR as a function of mineral content
NASA Astrophysics Data System (ADS)
Berg, Rhett A.; Simon, Jacob C.; Fried, Daniel; Darling, Cynthia L.
2017-02-01
The optical properties of human dentin can change markedly due to aging, friction from opposing teeth, and acute trauma, resulting in the formation of transparent or sclerotic dentin with increased mineral density. The objective of this study was to determine the optical attenuation coefficient of human dentin tissues with different mineral densities in the near-infrared (NIR) spectral regions from 1300-2200 nm using NIR transillumination and optical coherence tomography (OCT). N=50 dentin samples of varying opacities were obtained by sectioning whole extracted teeth into 150 μm transverse sections at the cemento-enamel junction or the apical root. Transillumination images were acquired with a NIR camera and attenuation measurements were acquired at various NIR wavelengths using a NIR sensitive photodiode. Samples were imaged with transverse microradiography (gold standard) in order to determine the mineral density of each sample.
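Attenuation coefficients are commonly extracted from transillumination data via the Beer-Lambert relation; the sketch below is a generic illustration under that assumption, not the paper's exact analysis.

```python
import math

def attenuation_coefficient(transmitted, incident, thickness_cm):
    """Beer-Lambert estimate: mu = -ln(I/I0) / d, in units of 1/cm."""
    return -math.log(transmitted / incident) / thickness_cm

# A 150-um (0.015 cm) dentin section transmitting 80% of the incident light:
mu = attenuation_coefficient(0.8, 1.0, 0.015)
print(round(mu, 1))  # 14.9
```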
Portable wide-field hand-held NIR scanner
NASA Astrophysics Data System (ADS)
Jung, Young-Jin; Roman, Manuela; Carrasquilla, Jennifer; Erickson, Sarah J.; Godavarty, Anuradha
2013-03-01
Near-infrared (NIR) optical imaging modality is one of the widely used medical imaging techniques for breast cancer imaging, functional brain mapping, and many other applications. However, conventional NIR imaging systems are bulky and expensive, thereby limiting their accelerated clinical translation. Herein a new compact (6 × 7 × 12 cm3), cost-effective, and wide-field NIR scanner has been developed towards contact as well as no-contact based real-time imaging in both reflectance and transmission mode. The scanner mainly consists of an NIR source light (between 700- 900 nm), an NIR sensitive CCD camera, and a custom-developed image acquisition and processing software to image an area of 12 cm2. Phantom experiments have been conducted to estimate the feasibility of diffuse optical imaging by using Indian-Ink as absorption-based contrast agents. As a result, the developed NIR system measured the light intensity change in absorption-contrasted target up to 4 cm depth under transillumination mode. Preliminary in-vivo studies demonstrated the feasibility of real-time monitoring of blood flow changes. Currently, extensive in-vivo studies are carried out using the ultra-portable NIR scanner in order to assess the potential of the imager towards breast imaging..
Constructing spherical panoramas of a bladder phantom from endoscopic video using bundle adjustment
NASA Astrophysics Data System (ADS)
Soper, Timothy D.; Chandler, John E.; Porter, Michael P.; Seibel, Eric J.
2011-03-01
The high recurrence rate of bladder cancer requires patients to undergo frequent surveillance screenings over their lifetime following initial diagnosis and resection. Our laboratory is developing panoramic stitching software that compiles several minutes of cystoscopic video into a single panoramic image, covering the entire bladder, for review by a urologist at a later time or remote location. Global alignment of video frames is achieved with a bundle adjuster that simultaneously recovers both the 3D structure of the bladder and the scope motion using only the video frames as input. The result of the algorithm is a complete 360° spherical panorama of the outer surface. The details of the software algorithms are presented here along with results from both a virtual cystoscopy as well as from real endoscopic imaging of a bladder phantom. The software successfully stitched several hundred video frames into a single panorama with subpixel accuracy and with no knowledge of the intrinsic camera properties, such as focal length and radial distortion. In the discussion, we outline future work on the software and identify factors pertinent to clinical translation of this technology.
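The quantity a bundle adjuster minimizes is the reprojection error: the pixel distance between observed features and the projections of the recovered 3D points through the recovered camera poses. A minimal pinhole-camera sketch (the intrinsics and points here are illustrative, not the cystoscope's):

```python
import numpy as np

def project(K, R, t, X):
    """Project 3D points X (N,3) through a pinhole camera with intrinsics K,
    rotation R, translation t; returns (N,2) pixel coordinates."""
    Xc = X @ R.T + t                      # world frame -> camera frame
    Xh = Xc @ K.T                         # homogeneous pixel coordinates
    return Xh[:, :2] / Xh[:, 2:3]         # perspective divide

def reprojection_error(K, R, t, X, observed):
    """Sum of squared pixel residuals: the objective bundle adjustment
    minimizes jointly over camera poses and 3D structure."""
    return np.sum((project(K, R, t, X) - observed) ** 2)

# Toy check: the error is zero when observations match projections exactly.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
X = np.array([[0.1, -0.2, 2.0], [0.3, 0.1, 3.0]])
obs = project(K, R, t, X)
print(reprojection_error(K, R, t, X, obs))  # 0.0
```

In a full adjuster this residual would be fed to a nonlinear least-squares solver over all frames and points at once.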
Near-infrared observations of the variable Crab Nebula
NASA Astrophysics Data System (ADS)
Yamamoto, M.; Mori, K.; Shibata, S.; Tsujimoto, M.; Misawa, T.; Burrows, D.; Kawai, N.
We present three near-infrared (NIR) observations of the Crab Nebula obtained with CISCO on the Subaru Telescope and the Quick Infrared Camera on the University of Hawaii 88-inch Telescope. The observations were performed in 2004 September, 2005 February, and 2005 October, and were coordinated to within 10 days of X-ray observations obtained with the Chandra X-ray Observatory. As in previous optical and X-ray monitoring observations, outward-moving wisps and variable knots are also detected in our NIR observations. The NIR variations are closely correlated with the X-ray variations, indicating that both are driven by the same physical process. We discuss the origin of the NIR-emitting particles based on the temporal variations as well as the spectral energy distributions of each variable component.
NASA Astrophysics Data System (ADS)
Seong, Myeongsu; Phillips, Zephaniah; Mai, Phuong M.; Yeo, Chaebeom; Song, Cheol; Lee, Kijoon; Kim, Jae G.
2015-07-01
Appropriate oxygen supply and blood flow are important for coordinating body functions and maintaining life. To measure both oxygen supply and blood flow simultaneously, we developed a system that combines near-infrared spectroscopy (NIRS) and diffuse speckle contrast analysis (DSCA). Our system is more cost-effective and compact than combined systems such as diffuse correlation spectroscopy (DCS)-NIRS or the DCS flow oximeter, while offering the same quantitative information. In this article, we present the configuration of DSCA-NIRS and preliminary data from an arm cuff occlusion and a repeated gripping exercise. With further investigation, we believe that DSCA-NIRS can be a useful tool for neuroscience, muscle physiology, and metabolic diseases such as diabetes.
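The core statistic in DSCA is the speckle contrast K = σ/⟨I⟩ of a camera image of the speckle pattern: faster flow blurs the speckles within an exposure and lowers K. A minimal global-contrast sketch (real implementations compute K over small sliding windows):

```python
import numpy as np

def speckle_contrast(intensity):
    """Speckle contrast K = std/mean of the recorded intensity.
    Lower K implies more blurring within the exposure, i.e. faster flow."""
    I = np.asarray(intensity, dtype=float)
    return I.std() / I.mean()

# A uniform (fully blurred) image has K = 0; a static, fully developed
# speckle pattern has K near 1.
print(speckle_contrast(np.full((8, 8), 5.0)))  # 0.0
```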
NASA Technical Reports Server (NTRS)
2004-01-01
This false-color image taken by the panoramic camera on the Mars Exploration Rover Spirit shows a close-up of the rock dubbed 'Pot of Gold' (left), which is located near the base of the 'Columbia Hills' in Gusev Crater. Scientists are intrigued by this unusual-looking, nodule-covered rock and plan to investigate its detailed chemistry in coming sols. This picture was taken on sol 159 (June 14, 2004).
45. This 360-degree panorama was taken from the balcony using ...
45. This 360-degree panorama was taken from the balcony using a Hulcherama panoramic camera with a 35mm Mamiya Sekor lens. Image size on 120 roll film (Tri-X) for 360-deg. view is 6 x 22.5 cm. Because of overlap in view, actual image size is 6 x 24 cm (2.25 in. x 9.5 in.). - Fox Theater, Seventh Avenue & Olive Way, Seattle, King County, WA
A Unified Framework for Street-View Panorama Stitching
Li, Li; Yao, Jian; Xie, Renping; Xia, Menghan; Zhang, Wei
2016-01-01
In this paper, we propose a unified framework to generate a pleasant and high-quality street-view panorama by stitching multiple panoramic images captured from cameras mounted on a mobile platform. Our proposed framework comprises four major steps: image warping, color correction, optimal seam line detection and image blending. Because the input images are captured without a precisely common projection center, from scenes whose depths differ with respect to the cameras to varying extents, they cannot be precisely aligned geometrically. Therefore, an efficient image warping method based on the dense optical flow field is first proposed to greatly suppress the influence of large geometric misalignment. Then, to lessen the influence of photometric inconsistencies caused by illumination variations and different exposure settings, we propose an efficient color correction algorithm that matches extreme points of histograms to greatly decrease color differences between warped images. After that, the optimal seam lines between adjacent input images are detected via the graph cut energy minimization framework. Finally, the Laplacian pyramid blending algorithm is applied to further eliminate stitching artifacts along the optimal seam lines. Experimental results on a large set of challenging street-view panoramic images captured from the real world illustrate that the proposed system is capable of creating high-quality panoramas. PMID:28025481
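The color-correction idea can be illustrated with a per-channel linear remap that aligns low/high histogram "extreme points" between a source image and a reference. This is only a sketch of the concept; the paper's actual algorithm differs in detail.

```python
import numpy as np

def match_histogram_extremes(src, ref, lo=1, hi=99):
    """Per-channel linear remap so that src's low/high percentile 'extreme
    points' line up with ref's, reducing exposure/color differences."""
    src = src.astype(np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_lo, s_hi = np.percentile(src[..., c], [lo, hi])
        r_lo, r_hi = np.percentile(ref[..., c], [lo, hi])
        scale = (r_hi - r_lo) / max(s_hi - s_lo, 1e-9)  # guard flat channels
        out[..., c] = (src[..., c] - s_lo) * scale + r_lo
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because the mapping is monotonic and linear, the remapped image's chosen percentiles coincide with the reference's by construction (up to 8-bit rounding).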
NASA Technical Reports Server (NTRS)
2004-01-01
This panoramic camera image from the Mars Exploration Rover Opportunity features the 6.44 millimeter (0.25 inch) deep hole ground into the rock dubbed 'Bounce' by the rover's rock abrasion tool. The tool took 2 hours and 15 minutes to grind the hole on sol 66 of the rover's journey. A combination of limited solar power and the rock's jagged texture led the rock abrasion tool team to set very aggressive grinding parameters to ensure that the end result was a full circle, suitable for a thorough read from the rover's spectrometers.
Bounce's markedly different appearance (when compared to the rocks that were previously examined in the Eagle Crater outcrop) made it a natural target for rover research. In order to achieve an ideal position from which to grind into the rock, Opportunity moved in very close with its right wheel next to Bounce. In this image, the panoramic camera on the rover's mast is looking down, catching the tip of the solar panel which partially blocks the full circle ground by the rock abrasion tool. The outer ring consists of the cuttings from the rock, pushed out by the brushes on the grinding instrument. The dark impression at the top of the outer circle was caused by the instrument's contact mechanism which serves to stabilize it while grinding.
The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation
NASA Astrophysics Data System (ADS)
Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team
2002-12-01
The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024×2048 Mitel frame transfer CCD detector arrays, each having a 1024×1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel or a rectangular FOV of 16° × 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with a graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from −55°C to +5°C.
An onboard calibration target and fiducial marks provide the ability to validate the radiometric and geometric calibration on Mars. Pancam relies heavily on use of the JPL ICER wavelet compression algorithm to maximize data return within stringent mission downlink limits. The scientific goals of the Pancam investigation are to: (a) obtain monoscopic and stereoscopic image mosaics to assess the morphology, topography, and geologic context of each MER landing site; (b) obtain multispectral visible to short-wave near-IR images of selected regions to determine surface color and mineralogic properties; (c) obtain multispectral images over a range of viewing geometries to constrain surface photometric and physical properties; and (d) obtain images of the Martian sky, including direct images of the Sun, to determine dust and aerosol opacity and physical properties. In addition, Pancam also serves a variety of operational functions on the MER mission, including (e) serving as the primary Sun-finding camera for rover navigation; (f) resolving objects on the scale of the rover wheels to distances of ~100 m to help guide navigation decisions; (g) providing stereo coverage adequate for the generation of digital terrain models to help guide and refine rover traverse decisions; (h) providing high resolution images and other context information to guide the selection of the most interesting in situ sampling targets; and (i) supporting acquisition and release of exciting E/PO products.
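The quoted Pancam optics figures are mutually consistent; as a worked check (our arithmetic, not text from the abstract), IFOV = pixel pitch / focal length, so 0.28 mrad at f = 42 mm implies a pitch of about 11.8 μm, and 1024 pixels × 0.28 mrad ≈ 16.4°, matching the stated 16° × 16° field per eye.

```python
import math

focal_length_mm = 42.0
ifov_mrad = 0.28
active_pixels = 1024

# Pixel pitch implied by IFOV = pitch / focal length:
pitch_um = ifov_mrad * 1e-3 * focal_length_mm * 1e3   # ~11.8 um
# Field of view spanned by the 1024-pixel active area:
fov_deg = math.degrees(active_pixels * ifov_mrad * 1e-3)
print(round(pitch_um, 2), round(fov_deg, 1))  # 11.76 16.4
```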
View of Scientific Instrument Module to be flown on Apollo 15
1971-06-27
S71-2250X (June 1971) --- A close-up view of the Scientific Instrument Module (SIM) to be flown for the first time on the Apollo 15 lunar landing mission. Mounted in a previously vacant sector of the Apollo Service Module (SM), the SIM carries specialized cameras and instrumentation for gathering lunar orbit scientific data. SIM equipment includes a laser altimeter for accurate measurement of height above the lunar surface; a large-format panoramic camera for mapping, correlated with a metric camera and the laser altimeter for surface mapping; a gamma ray spectrometer on a 25-foot extendible boom; a mass spectrometer on a 21-foot extendible boom; X-ray and alpha particle spectrometers; and a subsatellite, to be injected into lunar orbit, carrying particle detectors, a magnetometer, and an S-band transponder.
Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.
We present the Final Optical Design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well defined internal pupil, allowing the thermal background to be reduced with a cryogenic pupil stop. A mosaic of four Teledyne HAWAII-2RG 2k × 2k detectors will give a field of view of 31.9 arcmin × 31.9 arcmin.
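A quick consistency check on the quoted plate scale (our arithmetic, not from the abstract): with 206265 arcsec per radian, 0.45″ per 18 μm pixel implies an effective focal length of about 8.25 m, and one 4096-pixel side of the 2 × 2 mosaic subtends roughly 30.7′, close to the quoted 31.9′ once inter-detector gaps are included.

```python
ARCSEC_PER_RAD = 206265.0
pixel_m, scale_arcsec = 18e-6, 0.45

# Effective focal length implied by plate scale = 206265 * pitch / f:
f_m = ARCSEC_PER_RAD * pixel_m / scale_arcsec
# FOV of one 4096-pixel mosaic side, ignoring gaps between detectors:
fov_arcmin = 4096 * scale_arcsec / 60.0
print(round(f_m, 2), round(fov_arcmin, 2))  # 8.25 30.72
```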
False-Color View of a 'Rat' Hole Trail
NASA Technical Reports Server (NTRS)
2004-01-01
This view from the Mars Exploration Rover Opportunity's panoramic camera is a false-color composite rendering of the first seven holes that the rover's rock abrasion tool dug on the inner slope of 'Endurance Crater.' The rover was about 12 meters (about 39 feet) down into the crater when it acquired the images combined into this mosaic. The view is looking back toward the rim of the crater, with the rover's tracks visible. The tailings around the holes drilled by the rock abrasion tool, or 'Rat,' show evidence for fine-grained red hematite similar to what was observed months earlier in 'Eagle Crater' outcrop holes. Last week, viewers were asked to try seeing as many holes as they could from a black-and-white, navigation-camera image (PIA06716). Most viewers will find it far easier to see the seven holes in this exaggerated color image; the same is true for scientists who are studying the holes from millions of miles away. Starting from the uppermost pictured (closest to the crater rim) to the lowest, the rock abrasion tool hole targets are called 'Tennessee,' 'Cobblehill,' 'Virginia,' 'London,' 'Grindstone,' 'Kettlestone,' and 'Drammensfjorden.' Opportunity drilled these holes on sols 138 (June 13, 2004), 143 (June 18), 145 (June 20), 148 (June 23), 151 (June 26), 153 (June 28) and 161 (July 7), respectively. Each hole is 4.5 centimeters (1.8 inches) in diameter. This image was generated using the panoramic camera's 750-, 530-, and 430-nanometer filters. It was taken on sol 173 (July 19).
Jolliff, B.; Knoll, A.; Morris, R.V.; Moersch, J.; McSween, H.; Gilmore, M.; Arvidson, R.; Greeley, R.; Herkenhoff, K.; Squyres, S.
2002-01-01
Blind field tests of the Field Integration Design and Operations (FIDO) prototype Mars rover were carried out 7-16 May 2000. A Core Operations Team (COT), sequestered at the Jet Propulsion Laboratory without knowledge of test site location, prepared command sequences and interpreted data acquired by the rover. Instrument sensors included a stereo panoramic camera, navigational and hazard-avoidance cameras, a color microscopic imager, an infrared point spectrometer, and a rock coring drill. The COT designed command sequences, which were relayed by satellite uplink to the rover, and evaluated instrument data. Using aerial photos and Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data, and information from the rover sensors, the COT inferred the geology of the landing site during the 18 sol mission, including lithologic diversity, stratigraphic relationships, environments of deposition, and weathering characteristics. Prominent lithologic units were interpreted to be dolomite-bearing rocks, kaolinite-bearing altered felsic volcanic materials, and basalt. The color panoramic camera revealed sedimentary layering and rock textures, and geologic relationships seen in rock exposures. The infrared point spectrometer permitted identification of prominent carbonate and kaolinite spectral features and permitted correlations to outcrops that could not be reached by the rover. The color microscopic imager revealed fine-scale rock textures, soil components, and results of coring experiments. Test results show that close-up interrogation of rocks is essential to investigations of geologic environments and that observations must include scales ranging from individual boulders and outcrops (microscopic, macroscopic) to orbital remote sensing, with sufficient intermediate steps (descent images) to connect in situ and remote observations.
'Everest' Panorama; 20-20 Vision
NASA Technical Reports Server (NTRS)
2005-01-01
[figure removed for brevity, see original site] 'Everest' Panorama 20-20 Vision (QTVR) [figure removed for brevity, see original site] 'Everest' Panorama Animation If a human with perfect vision donned a spacesuit and stepped onto the martian surface, the view would be as clear as this sweeping panorama taken by NASA's Mars Exploration Rover Spirit. That's because the rover's panoramic camera has the equivalent of 20-20 vision. Earthlings can take a virtual tour of the scenery by zooming in on their computer screens many times to get a closer look at, say, a rock outcrop or a sand drift, without losing any detail. This level of clarity is unequaled in the history of Mars exploration. It took Spirit three days, sols 620 to 622 (Oct. 1 to Oct. 3, 2005), to acquire all the images combined into this mosaic, called the 'Everest Panorama,' looking outward in every direction from the true summit of 'Husband Hill.' During that period, the sky changed in color and brightness due to atmospheric dust variations, as shown in contrasting sections of this mosaic. Haze occasionally obscured the view of the hills on the distant rim of Gusev Crater 80 kilometers (50 miles) away. As dust devils swooped across the horizon in the upper right portion of the panorama, the robotic explorer changed the filters on the camera from red to green to blue, making the dust devils appear red, green, and blue. In reality, the dust devils are similar in color to the reddish-brown soils of Mars. No attempt was made to 'smooth' the sky in this mosaic, as has been done in other panoramic-camera mosaics to simulate the view one would get by taking in the landscape all at once. The result is a sweeping vista that allows viewers to observe weather changes on Mars. The summit of Husband Hill is a broad plateau of rock outcrops and windblown drifts about 100 meters (300 feet) higher than the surrounding plains of Gusev Crater. 
In the distance, near the center of the mosaic, is the 'South Basin,' the destination for the downhill travel Spirit began after exploring the summit region. This panorama spans 360 degrees and consists of images obtained during 81 individual pointings of the panoramic camera. Four filters were used at each pointing. Images through three of the filters, for wavelengths of 750 nanometers, 530 nanometers and 430 nanometers, were combined for this approximately true-color rendering.
Layers of 'Cabo Frio' in 'Victoria Crater'
NASA Technical Reports Server (NTRS)
2006-01-01
This view of 'Victoria crater' is looking southeast from 'Duck Bay' towards the dramatic promontory called 'Cabo Frio.' The small crater in the right foreground, informally known as 'Sputnik,' is about 20 meters (about 65 feet) away from the rover, the tip of the spectacular, layered, Cabo Frio promontory itself is about 200 meters (about 650 feet) away from the rover, and the exposed rock layers are about 15 meters (about 50 feet) tall. This is an approximately true color rendering of images taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity during the rover's 952nd sol, or Martian day, (Sept. 28, 2006) using the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
Layers of 'Cabo Frio' in 'Victoria Crater' (Stereo)
NASA Technical Reports Server (NTRS)
2006-01-01
This view of 'Victoria crater' is looking southeast from 'Duck Bay' towards the dramatic promontory called 'Cabo Frio.' The small crater in the right foreground, informally known as 'Sputnik,' is about 20 meters (about 65 feet) away from the rover, the tip of the spectacular, layered, Cabo Frio promontory itself is about 200 meters (about 650 feet) away from the rover, and the exposed rock layers are about 15 meters (about 50 feet) tall. This is a red-blue stereo anaglyph generated from images taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity during the rover's 952nd sol, or Martian day, (Sept. 28, 2006) using the camera's 430-nanometer filters.
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site] Click on the image for 'Fram' in Color (QTVR) This view in approximately true color reveals details in an impact crater informally named 'Fram' in the Meridiani Planum region of Mars. The picture is a mosaic of frames taken by the panoramic camera on NASA's Mars Exploration Rover Opportunity during the rover's 88th martian day on Mars, on April 23, 2004. The crater spans about 8 meters (26 feet) in diameter. Opportunity paused beside it while traveling from the rover's landing site toward a larger crater farther east. This view combines images taken using three of the camera's filters for different wavelengths of light: 750 nanometers, 530 nanometers and 430 nanometers.
Layers of 'Cape Verde' in 'Victoria Crater'
NASA Technical Reports Server (NTRS)
2006-01-01
This view of Victoria crater is looking north from 'Duck Bay' towards the dramatic promontory called 'Cape Verde.' The dramatic cliff of layered rocks is about 50 meters (about 165 feet) away from the rover and is about 6 meters (about 20 feet) tall. The taller promontory beyond that is about 100 meters (about 325 feet) away, and the vista beyond that extends away for more than 400 meters (about 1300 feet) into the distance. This is an approximately true color rendering of images taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity during the rover's 952nd sol, or Martian day, (Sept. 28, 2006) using the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
Layers of 'Cape Verde' in 'Victoria Crater' (Stereo)
NASA Technical Reports Server (NTRS)
2006-01-01
This view of Victoria crater is looking north from 'Duck Bay' towards the dramatic promontory called 'Cape Verde.' The dramatic cliff of layered rocks is about 50 meters (about 165 feet) away from the rover and is about 6 meters (about 20 feet) tall. The taller promontory beyond that is about 100 meters (about 325 feet) away, and the vista beyond that extends away for more than 400 meters (about 1300 feet) into the distance. This is a red-blue stereo anaglyph generated from images taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity during the rover's 952nd sol, or Martian day, (Sept. 28, 2006) using the camera's 430-nanometer filters.
NASA Technical Reports Server (NTRS)
2006-01-01
As NASA's Mars Exploration Rover Opportunity continues a southward trek from 'Erebus Crater' toward 'Victoria Crater,' the terrain consists of large sand ripples and patches of flat-lying rock outcrops, as shown in this image. Whenever possible, rover planners keep Opportunity on the 'pavement' for best mobility. This false-color image mosaic was assembled using images acquired by the panoramic camera on Opportunity's 784th sol (April 8, 2006) at about 11:45 a.m. local solar time. The camera used its 753-nanometer, 535-nanometer and 432-nanometer filters. This view shows a portion of the outcrop named 'Bosque,' including rover wheel tracks, fractured and finely-layered outcrop rocks and smaller, dark cobbles littered across the surface.
NASA Astrophysics Data System (ADS)
Trokielewicz, Mateusz; Bartuzi, Ewelina; Michowska, Katarzyna; Andrzejewska, Antonina; Selegrat, Monika
2015-09-01
In the age of a modern, hyperconnected society that increasingly relies on mobile devices and solutions, implementing a reliable and accurate biometric system employing iris recognition presents new challenges. Typical biometric systems employing iris analysis require expensive and complicated hardware. We therefore explore an alternative approach using visible-spectrum iris imaging. This paper aims at answering several questions related to applying iris biometrics to images obtained in the visible spectrum using a smartphone camera. Can irides be successfully and effortlessly imaged using a smartphone's built-in camera? Can existing iris recognition methods perform well when presented with such images? The main advantage of using near-infrared (NIR) illumination in dedicated iris recognition cameras is good performance almost independent of iris color and pigmentation. Are the images obtained from a smartphone's camera of sufficient quality even for dark irides? We present experiments incorporating simple image preprocessing to find the best visibility of iris texture, followed by a performance study to assess whether iris recognition methods originally aimed at NIR iris images perform well with visible-light images. To the best of our knowledge, this is the first comprehensive analysis of iris recognition performance using a database of high-quality images collected in visible light using a smartphone's flashlight, together with the application of commercial off-the-shelf (COTS) iris recognition methods.
Near-infrared hyperspectral imaging of atherosclerotic tissue phantom
NASA Astrophysics Data System (ADS)
Ishii, K.; Nagao, R.; Kitayabu, A.; Awazu, K.
2013-06-01
A method to identify vulnerable plaques that are likely to cause acute coronary events has long been sought. The objective of this study is to identify vulnerable plaques by hyperspectral imaging in the near-infrared range (NIR-HSI) for an angioscopic application. In this study, NIR-HSI of atherosclerotic tissue phantoms was demonstrated under simulated angioscopic conditions. The NIR-HSI system was constructed from an NIR supercontinuum light source and a mercury-cadmium-telluride camera. Spectral absorbance values were obtained in the wavelength range from 1150 to 2400 nm at 10 nm intervals. The hyperspectral images were constructed with the spectral angle mapper algorithm. As a result, detection of the lipid area in the atherosclerotic tissue phantom under angioscopic observation conditions was achieved, especially around 1200 nm, which corresponds to the second overtone of the CH stretching vibration mode.
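The spectral angle mapper used to build the hyperspectral maps classifies each pixel by the angle between its spectrum and a reference spectrum, which makes the comparison largely insensitive to overall illumination level. A minimal sketch:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between a pixel spectrum and a
    reference spectrum. Small angles indicate a good match; uniform scaling
    of either spectrum leaves the angle unchanged."""
    cos = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards rounding error
```

A pixel would be assigned to the reference (e.g. a lipid spectrum) whose angle falls below a chosen threshold.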
A New Platform for Investigating In-Situ NIR Reflectance in Snow
NASA Astrophysics Data System (ADS)
Johnson, M.; Taubenheim, J. R. L.; Stevenson, R.; Eldred, D.
2017-12-01
In-situ near infrared (NIR) reflectance measurements of the snowpack have been shown to correlate with valuable snowpack properties. To date, many studies take these measurements by digging a pit and setting up a NIR camera to image the pit wall. This setup is cumbersome, making it challenging to investigate things like spatial variability. Over the course of three winters, a new device has been developed that mitigates some of the downsides of NIR open-pit photography. This new instrument is a NIR profiler capable of taking NIR reflectance measurements without digging a pit, with most measurements taking less than 30 seconds. The latest prototype is built into a ski pole and automatically transfers data wirelessly to the user's smartphone. During the 2016-2017 winter, the device was used by 37 different users, resulting in over 4000 measurements in the western United States and demonstrating a dramatic reduction in time-to-data compared with other methods. Presented here are some initial findings from a full winter of using the ski-pole version of this device.
Near-infrared hyperspectral imaging of atherosclerotic plaque in WHHLMI rabbit artery
NASA Astrophysics Data System (ADS)
Ishii, Katsunori; Kitayabu, Akiko; Omiya, Kota; Honda, Norihiro; Awazu, Kunio
2013-03-01
Hyperspectral imaging (HSI) of rabbit atherosclerotic plaque in the near-infrared (NIR) range from 1150 to 2400 nm was demonstrated. A method to identify vulnerable plaques that are likely to cause acute coronary events has long been sought. The objective of this study is to identify vulnerable plaques by NIR-HSI for an angioscopic application. In this study, we acquired hyperspectral images of the atherosclerotic plaque in WHHLMI rabbit (atherosclerotic rabbit) artery under simulated angioscopic conditions by NIR-HSI. The NIR-HSI system was constructed from an NIR supercontinuum light source and a mercury-cadmium-telluride camera. Spectral absorbance values (log(1/R) data) were obtained in the wavelength range from 1150 to 2400 nm at 10 nm intervals. The hyperspectral images were constructed with the spectral angle mapper algorithm. As a result, detection of atherosclerotic plaque under angioscopic observation conditions was achieved, especially around 1200 nm, which corresponds to the second overtone of the CH stretching vibration mode. NIR-HSI could thus serve as an angioscopic diagnostic technique to identify vulnerable plaques without clamping and saline injection.
Performance of PHOTONIS' low light level CMOS imaging sensor for long range observation
NASA Astrophysics Data System (ADS)
Bourree, Loig E.
2014-05-01
Identification of potential threats in low-light conditions through imaging is commonly achieved with closed-circuit television (CCTV) and surveillance cameras by combining the extended near-infrared (NIR) response (800-1000 nm wavelengths) of the imaging sensor with NIR LED or laser illuminators. Consequently, camera systems used for long-range observation often require high-power lasers to deliver sufficient photons on target to acquire detailed images at night. While these systems may adequately identify targets at long range, the NIR illumination needed to achieve such functionality is easily detected and therefore may not be suitable for covert applications. To reduce dependency on supplemental illumination in low-light conditions, the frame rate of the imaging sensor may be lowered to increase the photon integration time and thus improve the signal-to-noise ratio of the image. However, this may hinder the camera's ability to image moving objects with high fidelity. To address these drawbacks, PHOTONIS has developed a CMOS imaging sensor (CIS) with a pixel architecture and geometry designed specifically for low-light-level imaging. By combining this CIS with field-programmable gate array (FPGA)-based image processing electronics, PHOTONIS has achieved low-read-noise imaging with enhanced signal-to-noise ratio at quarter-moon illumination, all at standard video frame rates. The performance of this CIS is discussed herein and compared to other commercially available CMOS and CCD sensors for long-range observation applications.
Caries detection and diagnostics with near-infrared light transillumination: clinical experiences.
Söchtig, Friederike; Hickel, Reinhard; Kühnisch, Jan
2014-06-01
The aim of this paper was to present the function and potential of diagnosing caries lesions using a recently introduced near-infrared (NIR) transillumination technique (DIAGNOcam, KaVo). The study included 130 adolescents and adults with complete permanent dentition (age > 12). All patients underwent visual examination and, if necessary, bitewing radiography. Proximal and occlusal surfaces that had not yet been restored were photographed by a NIR transillumination camera system using light with a wavelength of 780 nm rather than ionizing radiation. Of the study patients, 85 showed 127 proximal dentin caries lesions that were treated operatively. A cross table shows the correlation between radiography and NIR transillumination. Based on our practical clinical experience to date, a possible diagnostic classification is introduced. The main result of our study was that NIR light was able to visualize caries lesions on proximal and occlusal surfaces. The study suggests that NIR transillumination is a method that may help to avoid bitewing radiographs for the diagnosis of caries in everyday clinical practice.
Standoff reconnaissance imagery - Applications and interpreter training
NASA Astrophysics Data System (ADS)
Gustafson, G. C.
1980-01-01
The capabilities, advantages and applications of Long Range Oblique Photography (LOROP) standoff air reconnaissance cameras are reviewed, with emphasis on the problems likely to be encountered in photo interpreter training. Results of student exercises in descriptive image analysis and mensuration are presented and discussed, and current work on the computer programming of oblique and panoramic mensuration tasks is summarized. Numerous examples of this class of photographs and their interpretation at various magnifications are also presented.
Enhanced Virtual Presence for Immersive Visualization of Complex Situations for Mission Rehearsal
1997-06-01
taken. We propose to join both these technologies together in a registration device. The registration device would be small and portable and easily...registering the panning of the camera (or other sensing device) and also stitch together the shots to automatically generate panoramic files necessary to...database and as the base information changes each of the linked drawings is automatically updated. Filename Format: A specific naming convention should be
TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope
NASA Astrophysics Data System (ADS)
Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.
Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512×512-pixel InSb Aladdin III Quadrant focal plane array (FPA) sensitive to photons in the 1-5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on the calibration observations carried out during 2017 May 11-14 and 2017 October 7-31. After the preliminary characterization, the camera was released to the Indian and Belgian astronomical community for science observations in 2017 May. The camera offers a field-of-view (FoV) of ~86.5″×86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands comparable to other 4 m class telescopes available worldwide. Another highlight of this camera is the observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λcen ~ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.
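As a quick consistency check, the quoted field of view follows directly from the array format and pixel scale given above (this back-of-the-envelope arithmetic is ours, not the paper's):

```python
# TIRCAM2 on the 3.6 m DOT: 512x512-pixel array at 0.169 arcsec/pixel
pixels = 512
pixel_scale = 0.169                  # arcsec per pixel
fov_arcsec = pixels * pixel_scale    # field of view on a side
print(f"{fov_arcsec:.1f} arcsec")    # ~86.5 arcsec, matching the quoted FoV
```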
ATTICA family of thermal cameras in submarine applications
NASA Astrophysics Data System (ADS)
Kuerbitz, Gunther; Fritze, Joerg; Hoefft, Jens-Rainer; Ruf, Berthold
2001-10-01
Optronics Mast Systems (US: Photonics Mast Systems) are electro-optical devices that enable a submarine crew to observe the scenery above water while the boat is submerged. Unlike classical submarine periscopes they are non-hull-penetrating and therefore have no direct-viewing capability. Typically they carry electro-optical cameras for both the visual and an IR spectral band, with panoramic view and a stabilized line of sight. They can optionally be equipped with laser rangefinders, antennas, etc. The brand name ATTICA (Advanced Two-dimensional Thermal Imager with CMOS-Array) denotes a family of thermal cameras using focal-plane-array (FPA) detectors that can be tailored to a variety of requirements. The modular design of the ATTICA components allows the use of various detectors (InSb, CMT 3-5 μm, CMT 7-11 μm) for specific applications. By means of a microscanner, ATTICA cameras achieve full standard TV resolution using detectors with only 288×384 (US: 240×320) detector elements. A typical requirement for Optronics Mast Systems is a quick-look-around capability. For FPA cameras this implies the need for a 'descan' module, which can be incorporated in the ATTICA cameras without complications.
Panoramic Views of the Landing site from Sagan Memorial Station
NASA Technical Reports Server (NTRS)
1997-01-01
Each of these panoramic views is a controlled mosaic of approximately 300 IMP images covering 360 degrees of azimuth and elevations from approximately 4 degrees above the horizon to 45 degrees below it. Simultaneous adjustment of orientations of all images has been performed to minimize discontinuities between images. Mosaics have been highpass-filtered and contrast-enhanced to improve discrimination of details without distorting relative colors overall.
TOP IMAGE: Enhanced true-color image created from the 'Gallery Pan' sequence, acquired on sols 8-10 so that local solar time increases nearly continuously from about 10:00 at the right edge to about 12:00 at the left. Mosaics of images obtained by the right camera through 670 nm, 530 nm, and 440 nm filters were used as red, green and blue channels. Grid ticks indicate azimuth clockwise from north in 30 degree increments and elevation in 15 degree increments.
BOTTOM IMAGE: Anaglyphic stereo image created from the 'monster pan' sequence, acquired in four sections between about 8:30 and 15:00 local solar time on sol 3. Mosaics of images obtained through the 670 nm filter (left camera) and 530 and 440 nm filters (right camera) were used where available. At the top and bottom, left- and right-camera 670 nm images were used. Part of the northern horizon was not imaged because of the tilt of the lander. This image may be viewed stereoscopically through glasses with a red filter for the left eye and a cyan filter for the right eye.
NOTE: original caption as published in Science Magazine.
Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech).
VizieR Online Data Catalog: Multiwavelength photometry of Sh 2-138 YSOs (Baug+, 2015)
NASA Astrophysics Data System (ADS)
Baug, T.; Ojha, D. K.; Dewangan, L. K.; Ninan, J. P.; Bhatt, B. C.; Ghosh, S. K.; Mallick, K. K.
2016-07-01
Optical BVRI imaging observations of the Sh2-138 region were carried out on 2005 September 8 using the Himalaya Faint Object Spectrograph and Camera (HFOSC) mounted on the 2 m Himalayan Chandra Telescope (HCT). In order to identify strong Hα emission sources in the Sh2-138 region, slitless Hα spectra were obtained using the HFOSC on 2007 November 16. Optical spectroscopic observations of the central brightest source were performed using the HFOSC on 2014 November 18. The newly installed TIFR Near Infrared Spectrometer and Imager Camera (TIRSPEC) on the HCT was used for NIR observations on 2014 November 18 under photometric conditions with an average seeing of 1.4 arcsec. We obtained NIR spectra of the central brightest source on 2014 May 29, using the TIRSPEC, in the NIR Y (1.02-1.20 μm), J (1.21-1.48 μm), H (1.49-1.78 μm), and K (2.04-2.35 μm) bands. We also conducted optical narrow-band imaging observations of the region in the Hα filter (λ~6563 Å, Δλ~100 Å) with exposure times of 600 s, 250 s, and 50 s on 2005 September 8 using the HFOSC. (1 data file).
2004-11-11
NASA's Mars Exploration Rover Opportunity captured this view from the base of "Burns Cliff" during the rover's 280th martian day (Nov. 6, 2004). This cliff in the inner wall of "Endurance Crater" displays multiple layers of bedrock for the rover to examine with its panoramic camera and miniature thermal emission spectrometer. The rover team has decided that the farthest Opportunity can safely advance along the base of the cliff is close to the squarish white rock near the center of this image. After examining the site for a few days from that position, the rover will turn around and head out of the crater. The view is a mosaic of frames taken by Opportunity's navigation camera. The rover was on ground with a slope of about 30 degrees when the pictures were taken, and the view is presented here in a way that corrects for that tilt of the camera. http://photojournal.jpl.nasa.gov/catalog/PIA07039
NASA Astrophysics Data System (ADS)
Rice, M. S.; Bell, J. F.
2011-12-01
We have developed a "hydration signature" for mapping H2O- and/or OH-bearing materials at Mars landing sites using multispectral visible to near-infrared (Vis-NIR) observations from the Mars Exploration Rover (MER) Panoramic Camera (Pancam). Pancam's 13 narrowband geology filters cover 11 unique wavelengths in the visible and near infrared (434 to 1009 nm). The hydration signature is based on a strongly negative slope from 934 to 1009 nm that characterizes the spectra of hydrated silica-rich rocks and soils observed by MER Spirit; this feature is likely due to the 2ν1 + ν3 H2O combination band and/or the 3νOH overtone centered near ~1000 nm, whose positions vary slightly depending on bonding to nearest-neighbor atoms. Here we present the ways we have used this hydration signature, in combination with observations of morphology and texture, to remotely identify candidate hydrated materials in Pancam observations. At Gusev Crater, we find that the hydration signature is widespread along Spirit's traverse in the Columbia Hills, which adds to the growing body of evidence that aqueous alteration has played a significant role in the complex geologic history of this site. At Meridiani Planum, the hydration signature is associated with a specific stratigraphic layer ("Smith") exposed within the walls of Victoria Crater. We also discuss limitations to the use of the hydration signature, which can give false detections under specific viewing geometries. This hydration signature can similarly be used to map hydrated materials at the Mars Science Laboratory (MSL) landing site, Gale Crater. The MSL Mast Camera (Mastcam) is a two-instrument suite of fixed-focal-length (FFL) cameras, one with a 15-degree field of view (FOV) and the other with a 5.1-degree FOV. Mastcam's narrowband filters cover 9 unique wavelengths in the visible and near-infrared (band centers near 440, 525, 675, 750, 800, 865, 905, 935, and 1035 nm), and are distributed between the two FFL cameras.
Full-filter multispectral observations of the region of overlap between the two cameras can be used for our hydration signature mapping. Mastcam's longest-wavelength filter should be able to detect hydrated and/or hydroxylated minerals with strong absorptions between ~990 and ~1080 nm; because of the width of this IR filter, Mastcam will be sensitive to more H2O and/or OH-bearing species than Pancam. Here we summarize the minerals Mastcam should be able to detect if present at Gale Crater, and our plans for hydration signature mapping with MSL.
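In its simplest form, the hydration signature described above reduces to the sign and steepness of the reflectance slope between the 934 and 1009 nm bands. A minimal sketch of that criterion (the reflectance values and flag threshold here are illustrative assumptions; only the 934 and 1009 nm band centers are taken from the text):

```python
def hydration_slope(reflectance_934, reflectance_1009):
    """Spectral slope (per nm) between the 934 and 1009 nm bands.
    A strongly negative slope flags candidate H2O/OH-bearing material."""
    return (reflectance_1009 - reflectance_934) / (1009.0 - 934.0)

def is_hydrated(r934, r1009, threshold=-4e-4):
    """Illustrative threshold on the 934 -> 1009 nm downturn."""
    return hydration_slope(r934, r1009) < threshold

# A hydrated-silica-like spectrum turns down toward 1009 nm; a dry one stays flat
hydrated = is_hydrated(0.30, 0.24)   # slope = -0.0008 per nm -> flagged
dry = is_hydrated(0.31, 0.31)        # slope = 0 -> not flagged
```

In practice (as the abstract notes) such a slope test is combined with morphology and texture, and viewing geometry must be checked to rule out false detections.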
Consumer electronic optics: how small can a lens be: the case of panomorph lenses
NASA Astrophysics Data System (ADS)
Thibault, Simon; Parent, Jocelyn; Zhang, Hu; Du, Xiaojun; Roulet, Patrice
2014-09-01
In 2014, miniature camera modules are found in a variety of applications such as webcams, mobile phones, automotive systems, endoscopes, tablets, portable computers and many other products. Mobile phone cameras are probably among the most challenging, due to the need for ever-smaller total track length (TTL) and optimized embedded image processing algorithms. As the technology develops toward higher resolution and higher image quality, new capabilities are required to fulfil market needs. Consequently, the lens system becomes more complex and requires more optical elements and/or new optical elements. What is the limit? How small can an injection-molded lens be? We discuss these questions by comparing two wide-angle lenses for the consumer electronics market. The first is a 6.56 mm (TTL) panoramic (180° FOV) lens built in 2012. The second is a more recent (2014) panoramic lens (180° FOV) with a TTL of 3.80 mm for mobile phone cameras. Both optics are panomorph lenses used with megapixel sensors. Between 2012 and 2014, developments in design and plastic injection molding allowed a reduction of the TTL by more than 40%. This TTL reduction was achieved by pushing the lens design to the extreme (edge/central air and material thicknesses as well as lens shape). It was also made possible by better control of the injection molding process and material (low birefringence, haze and thermal stability). These aspects are presented and discussed. We cannot predict which new materials or processes will emerge over the next few years, but innovative people and industries will still be needed to push the limits further.
Details of Layers in Victoria Crater's Cape St. Vincent
NASA Technical Reports Server (NTRS)
2007-01-01
NASA's Mars Exploration Rover Opportunity spent about 300 sols (Martian days) during 2006 and 2007 traversing the rim of Victoria Crater. Besides looking for a good place to enter the crater, the rover obtained images of rock outcrops exposed at several cliffs along the way. The cliff in this image from Opportunity's panoramic camera (Pancam) is informally named Cape St. Vincent. It is a promontory approximately 12 meters (39 feet) tall on the northern rim of Victoria crater, near the farthest point along the rover's traverse around the rim. Layers seen in Cape St. Vincent have proven to be among the best examples of meter-scale cross-bedding observed on Mars to date. Cross-bedding is a geologic term for rock layers which are inclined relative to the horizontal and which are indicative of ancient sand dune deposits. In order to get a better look at these outcrops, Pancam 'super-resolution' imaging techniques were utilized. Super-resolution is a type of imaging mode which acquires many pictures of the same target to reconstruct a digital image at a higher resolution than is native to the camera. These super-resolution images have allowed scientists to discern that the rocks at Victoria Crater once represented a large dune field, not unlike the Sahara desert on Earth, and that this dune field migrated with an ancient wind flowing from the north to the south across the region. Other rover chemical and mineral measurements have shown that many of the ancient sand dunes studied in Meridiani Planum were modified by surface and subsurface liquid water long ago. This is a Mars Exploration Rover Opportunity Panoramic Camera image acquired on sol 1167 (May 7, 2007), and was constructed from a mathematical combination of 16 different blue filter (480 nm) images.
NASA Technical Reports Server (NTRS)
2004-01-01
This high-resolution image captured by the Mars Exploration Rover Opportunity's panoramic camera shows in superb detail a portion of the puzzling rock outcropping that scientists are eagerly planning to investigate. Presently, Opportunity is on its lander facing northeast; the outcropping lies to the northwest. These layered rocks measure only 10 centimeters (4 inches) tall and are thought to be either volcanic ash deposits or sediments carried by water or wind. The small rock in the center is about the size of a golf ball.
Partial 'Seminole' Panorama (False Color)
NASA Technical Reports Server (NTRS)
2005-01-01
This view from Spirit's panoramic camera is assembled from frames acquired on Martian days, or sols, 672 and 673 (Nov. 23 and 24, 2005) from the rover's position near an outcrop called 'Seminole.' The view is a southward-looking portion of a larger panorama still being completed. This is a false-color version to emphasize geological differences. It is a composite of images shot through three different filters, admitting light of wavelengths 750 nanometers, 530 nanometers and 430 nanometers.
Earth Observation taken by the Expedition 11 crew
2005-07-07
ISS011-E-10221 (7 July 2005) --- At the time of this Expedition 11 digital still camera image, Hurricane Dennis was churning northwestward through the Caribbean Sea between Jamaica and eastern Cuba, packing winds of up to 115 miles per hour. Even though the hurricane had just attained Category 3 intensity, the eye had not yet cleared. This high-oblique, panoramic view, taken through a 28 mm lens at 21:14:00 GMT, is looking southwest.
2010-12-01
including thermal optics. Much more precise target engagement and stabilization method. Drawbacks: mechanical malfunctions more common; gunner has...complete panorama view that extends from 0-180 degrees off-center, from our camera system. Figure 20: 360° view dome projection. Figure 21 shows the...method can incorporate various types of synthetic vision aids, such as thermal or electro-optical sensors, to give the user the capability to see in
Airbag Trail Dubbed 'Magic Carpet'
NASA Technical Reports Server (NTRS)
2004-01-01
This section of the first color image from the Mars Exploration Rover Spirit has been further processed to produce a sharper look at a trail left by one of the rover's airbags. The drag mark was made after the rover landed and its airbags were deflated and retracted. Scientists have dubbed the region the 'Magic Carpet' after a crumpled portion of the soil that appears to have been peeled away (lower left side of the drag mark). Rocks were also dragged by the airbags, leaving impressions and 'bow waves' in the soil. The mission team plans to drive the rover over to this site to look for additional clues about the composition of the martian soil. This image was taken by Spirit's panoramic camera. This extreme close-up image (see insets above) highlights the martian feature that scientists have named 'Magic Carpet' because of its resemblance to a crumpled carpet fold. Scientists think the soil here may have detached from its underlying layer, possibly due to interaction with the Mars Exploration Rover Spirit's airbag after landing. This image was taken on Mars by the rover's panoramic camera.
View Northward from Spirit's Winter Roost
NASA Technical Reports Server (NTRS)
2006-01-01
One part of the research program that NASA's Mars Exploration Rover Spirit is conducting while sitting at a favorable location for wintertime solar energy is the most detailed panorama yet taken on the surface of Mars. This view is a partial preliminary product from the continuing work on the full image, which will be called the 'McMurdo Panorama.' Spirit's panoramic camera (Pancam) began taking exposures for the McMurdo Panorama on the rover's 814th Martian day (April 18, 2006). The rover has accumulated more than 900 exposures for this panorama so far, through all of the Pancam mineralogy filters and using little or no image compression. Even with a tilt toward the winter sun, the amount of energy available daily is small, so the job will still take one to two more months to complete. This portion of the work in progress looks toward the north. 'Husband Hill,' which Spirit was climbing a year ago, is on the horizon near the center. 'Home Plate' is between that hill and the rover's current position. Wheel tracks imprinted when Spirit drove south from Home Plate can be seen crossing the middle distance of the image from the center to the right. This is an approximate true-color rendering combining exposures taken through three of the panoramic camera's filters. The filters used are centered on wavelengths of 750 nanometers, 530 nanometers and 430 nanometers.
A CMOS camera-based system for clinical photoplethysmographic applications
NASA Astrophysics Data System (ADS)
Humphreys, Kenneth; Markham, Charles; Ward, Tomas E.
2005-06-01
In this work an image-based photoplethysmography (PPG) system is developed and tested against a conventional finger-based system as commonly used in clinical practice. A PPG device is essentially an optical instrument consisting of a near-infrared (NIR) source and detector that is capable of tracking blood flow changes in body tissue. When used with a number of wavelengths in the NIR band, blood oxygenation changes as well as other blood chemical signatures can be ascertained, yielding a very useful device in the clinical realm. Conventionally such a device requires direct contact with the tissue under investigation, which rules out applications like wound management, where a tissue oxygenation measurement could be extremely useful. To circumvent this shortcoming we have developed a CMOS camera-based system, which can successfully extract the PPG signal without contact with the tissue under investigation. A comparison with conventional techniques has yielded excellent results.
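At its core, a contactless camera-based PPG of this kind averages the intensity of a skin region of interest in each frame and tracks that mean over time; the small pulsatile (AC) component riding on the slow baseline is the PPG waveform. A minimal sketch on synthetic frames (the ROI, frame rate and modulation values are made-up illustrations, not the authors' system):

```python
import numpy as np

def extract_ppg(frames, roi):
    """Mean ROI intensity per frame from a (frames, rows, cols) stack;
    this time series is the raw camera-based PPG signal."""
    r0, r1, c0, c1 = roi
    return frames[:, r0:r1, c0:c1].mean(axis=(1, 2))

# Synthetic 30 fps NIR sequence: constant tissue background plus a 1.2 Hz
# (72 beats/min) cardiac modulation of amplitude 2 intensity units
fps, n = 30, 90
t = np.arange(n) / fps
frames = np.full((n, 64, 64), 100.0)
frames += 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
signal = extract_ppg(frames, (16, 48, 16, 48))   # recovers the 1.2 Hz pulse
```

Spatial averaging over the ROI is what gives the camera system usable signal-to-noise despite the weak per-pixel modulation.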
NASA Astrophysics Data System (ADS)
Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy; Martin, Richard E.
2013-05-01
Additive manufacturing is a rapidly growing field in which three-dimensional parts are produced layer by layer. NASA's electron beam freeform fabrication (EBF3) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF3 technology are weight savings to support space missions, rapid prototyping in a zero-gravity environment, and improved vehicle readiness. The EBF3 system is composed of three main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam melts the wire, and the multi-axis positioning system builds the part layer by layer. To ensure a quality deposit, a near-infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for deposit assessment metrics.
Layers of 'Cabo Frio' in 'Victoria Crater' (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
This view of 'Victoria crater' is looking southeast from 'Duck Bay' towards the dramatic promontory called 'Cabo Frio.' The small crater in the right foreground, informally known as 'Sputnik,' is about 20 meters (about 65 feet) away from the rover, the tip of the spectacular, layered, Cabo Frio promontory itself is about 200 meters (about 650 feet) away from the rover, and the exposed rock layers are about 15 meters (about 50 feet) tall. This is an enhanced false color rendering of images taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity during the rover's 952nd sol, or Martian day (Sept. 28, 2006), using the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
Layers of 'Cape Verde' in 'Victoria Crater' (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
This view of Victoria crater is looking north from 'Duck Bay' towards the dramatic promontory called 'Cape Verde.' The dramatic cliff of layered rocks is about 50 meters (about 165 feet) away from the rover and is about 6 meters (about 20 feet) tall. The taller promontory beyond that is about 100 meters (about 325 feet) away, and the vista beyond that extends away for more than 400 meters (about 1300 feet) into the distance. This is an enhanced false color rendering of images taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity during the rover's 952nd sol, or Martian day (Sept. 28, 2006), using the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
NASA Astrophysics Data System (ADS)
Jiang, Zhen; Holyoak, G. Reed; Bartels, Kenneth E.; Ritchey, Jerry W.; Xu, Guan; Bunting, Charles F.; Slobodov, Gennady; Krasinski, Jerzy S.; Piao, Daqing
2009-02-01
In vivo trans-rectal near-infrared (NIR) optical tomography is conducted on a tumor-bearing canine prostate with the assistance of trans-rectal ultrasound (TRUS). The canine prostate tumor model is made possible by a unique round cell neoplasm of dogs, transmissible venereal tumor (TVT) that can be transferred from dog to dog regardless of histocompatibility. A characterized TVT cell line was homogenized and passed twice in subcutaneous tissue of NOD/SCID mice. Following the second passage, the tumor was recovered, homogenized and then inoculated by ultrasound guidance into the prostate gland of a healthy dog. The dog was then imaged with a combined trans-rectal NIR and TRUS imager using an integrated trans-rectal NIR/US applicator. The image was taken by NIR and US modalities concurrently, both in sagittal view. The trans-rectal NIR imager is a continuous-wave system that illuminates 7 source channels sequentially by a fiber switch to deliver sufficient light power to the relatively more absorbing prostate tissue and samples 7 detection channels simultaneously by a gated intensified high-resolution CCD camera. This work tests the feasibility of detecting prostate tumor by trans-rectal NIR optical tomography and the benefit of augmenting TRUS with trans-rectal NIR imaging.
The Mars NetLander panoramic camera
NASA Astrophysics Data System (ADS)
Jaumann, Ralf; Langevin, Yves; Hauber, Ernst; Oberst, Jürgen; Grothues, Hans-Georg; Hoffmann, Harald; Soufflot, Alain; Bertaux, Jean-Loup; Dimarellis, Emmanuel; Mottola, Stefano; Bibring, Jean-Pierre; Neukum, Gerhard; Albertz, Jörg; Masson, Philippe; Pinet, Patrick; Lamy, Philippe; Formisano, Vittorio
2000-10-01
The panoramic camera (PanCam) imaging experiment is designed to obtain high-resolution multispectral stereoscopic panoramic images from each of the four Mars NetLander 2005 sites. The main scientific objectives to be addressed by the PanCam experiment are (1) to locate the landing sites and support the NetLander network sciences, (2) to geologically investigate and map the landing sites, and (3) to study the properties of the atmosphere and of variable phenomena. To place in situ measurements at a landing site into a proper regional context, it is necessary to determine the lander orientation on the ground and to locate the position of the landing site exactly with respect to the available cartographic database. This is not possible by tracking alone, due to the lack of on-ground orientation and the so-called map-tie problem. Images provided by the PanCam make it possible to determine accurate tilt and north directions for each lander and to identify the lander locations based on landmarks that can also be recognized in appropriate orbiter imagery. With this information, it will further be possible to improve the Mars-wide geodetic control point network and the resulting geometric precision of global map products. The major geoscientific objectives of the PanCam lander images are the recognition of surface features like ripples, ridges and troughs, and the identification and characterization of different rock and surface units based on their morphology, distribution, spectral characteristics, and physical properties. The analysis of the PanCam imagery will finally result in the generation of precise map products for each of the landing sites. So far, comparative geologic studies of the Martian surface are restricted to the temporally separated Mars Pathfinder and the two Viking lander missions. Further lander missions are in preparation (Beagle-2, Mars Surveyor 03).
NetLander provides the unique opportunity to nearly double the number of accessible landing-site data sets by providing simultaneous and long-term observations at four different surface locations, which is especially important for studies of variable surface features as well as properties and phenomena of the atmosphere. Major changes on the surface that can be detected by PanCam are caused by eolian activity and condensation processes, which directly reflect variations in the prevailing near-surface wind regime and the diurnal and seasonal volatile and dust cycles. Atmospheric studies will concentrate on the detection of clouds, measurements of the aerosol content and the water vapor absorption at 936 nm. In order to meet these objectives, the proposed PanCam instrument is a highly miniaturized, dedicated stereo and multispectral imaging device. The camera consists of two identical camera cubes arranged in a common housing at a fixed stereo base length of 11 cm. Each camera cube is equipped with a CCD frame-transfer detector with 1024×1024 active pixels and optics with a focal length of 13 mm, yielding a field of view of 53°×53° and an instantaneous field of view of 1.1 mrad. A filter swivel with six positions provides different color band passes in the wavelength range of 400-950 nm. The camera head is mounted on top of a deployable scissors boom and can be rotated by 360° to obtain a full panorama, which is covered by just eight images. The boom raises the camera head to a final height of 90 cm above the surface. Most camera activities will take place within the first week and the first month of the mission. During the remainder of the mission, the camera will operate with a reduced data rate to monitor time-dependent variations on a daily basis.
PanCam is a joint German/French project with contributions from DLR, Institute of Space Sensor Technology and Planetary Exploration, Berlin, Institut d'Astrophysique Spatiale, CNRS, Orsay, and Service d'Aéronomie, CNRS, Verrières-le-Buisson.
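The instrument geometry quoted above (13 mm focal length, 1024×1024 pixels, 53°×53° field of view, ~1.1 mrad instantaneous field of view, full panorama in eight frames) can be cross-checked with the standard pinhole relations. A minimal sketch, assuming a hypothetical 13 µm pixel pitch (the pitch is not stated in the abstract) and an assumed ~15% frame overlap for the panorama:

```python
import math

def fov_deg(n_pixels: int, pixel_pitch_mm: float, focal_mm: float) -> float:
    """Full field of view (degrees) of a pinhole camera along one axis."""
    half = math.atan((n_pixels * pixel_pitch_mm / 2.0) / focal_mm)
    return math.degrees(2.0 * half)

def ifov_mrad(pixel_pitch_mm: float, focal_mm: float) -> float:
    """Instantaneous field of view of a single pixel, in milliradians."""
    return pixel_pitch_mm / focal_mm * 1000.0

def images_for_panorama(fov: float, overlap_frac: float) -> int:
    """Frames needed to cover 360 deg with a given fractional overlap."""
    return math.ceil(360.0 / (fov * (1.0 - overlap_frac)))

print(round(fov_deg(1024, 0.013, 13.0), 1))   # ~54 deg, close to the quoted 53 deg
print(round(ifov_mrad(0.013, 13.0), 2))       # 1.0 mrad, close to the quoted 1.1 mrad
print(images_for_panorama(53.0, 0.15))        # 8 frames
```

The eight-frame panorama follows directly from dividing 360° by the horizontal field of view with a modest overlap; the small residuals against the quoted numbers are consistent with distortion and a slightly different true pixel pitch.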
NASA Astrophysics Data System (ADS)
Venugopal, Vivek; Park, Minho; Ashitate, Yoshitomo; Neacsu, Florin; Kettenring, Frank; Frangioni, John V.; Gangadharan, Sidhu P.; Gioux, Sylvain
2013-12-01
We report the design, characterization, and validation of an optimized simultaneous color and near-infrared (NIR) fluorescence rigid endoscopic imaging system for minimally invasive surgery. This system is optimized for illumination and collection of NIR wavelengths, allowing the simultaneous acquisition of both color and NIR fluorescence at frame rates higher than 6.8 fps with high sensitivity. The system employs a custom 10-mm diameter rigid endoscope optimized for NIR transmission. A dual-channel light source compatible with the constraints of an endoscope was built and includes a plasma source for white light illumination and NIR laser diodes for fluorescence excitation. A prism-based 2-CCD camera was customized for simultaneous color and NIR detection with a highly efficient filtration scheme for fluorescence imaging of both 700- and 800-nm emission dyes. The performance characterization studies indicate that the endoscope can efficiently detect fluorescence signal from both indocyanine green and methylene blue in dimethyl sulfoxide at concentrations of 100 to 185 nM, depending on the background optical properties. Finally, we validated this imaging system in vivo during a minimally invasive procedure for thoracic sentinel lymph node mapping in a porcine model.
Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.
Bulczak, David; Lambers, Martin; Kolb, Andreas
2017-12-22
In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.
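The AMCW principle that such simulators reproduce can be illustrated with the standard four-bucket phase-retrieval scheme: distance follows from the phase shift of the returned modulation envelope, sampled at four equally spaced phase offsets. A minimal single-pixel sketch, assuming an ideal sinusoidal correlation function and no multipath (the multipath and charge-level modeling of the paper are well beyond this):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def amcw_depth(a0, a1, a2, a3, f_mod):
    """Recover distance from four equally spaced samples of the correlation
    function (the standard 4-bucket AMCW scheme)."""
    phase = math.atan2(a3 - a1, a0 - a2)  # wrapped to (-pi, pi]
    if phase < 0:
        phase += 2 * math.pi
    return C * phase / (4 * math.pi * f_mod)

# A target at 2.5 m observed with a 20 MHz modulation frequency:
f = 20e6
true_d = 2.5
phi = 4 * math.pi * f * true_d / C
samples = [math.cos(phi + k * math.pi / 2) for k in range(4)]
print(round(amcw_depth(*samples, f), 3))  # 2.5
```

The unambiguous range at 20 MHz is C/(2f) ≈ 7.5 m; beyond that the wrapped phase aliases, which is one of the systematic effects a quantified simulator must capture.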
Payload topography camera of Chang'e-3
NASA Astrophysics Data System (ADS)
Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie
2015-11-01
Chang'e-3 was China's first soft-landing lunar probe and achieved a successful roving exploration of the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and completed imaging of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.
Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes
Murray, Trevor; Zeil, Jochen
2017-01-01
Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations, because image differences increase smoothly with distance from a reference location, and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its 'catchment area') has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the 'catchment volumes' within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large, extending for metres, depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above-horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.
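The compass information described above is typically extracted with a rotational image difference function: the reference snapshot is compared against all cyclic rotations of the current view, and the best-matching rotation estimates the heading offset. A toy sketch on a 1-D panorama (real implementations operate on full 2-D panoramic images):

```python
import math

def rms_diff(a, b):
    """Root-mean-square pixel difference between two equal-length views."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def rotational_idf(reference, current):
    """Image difference for every cyclic rotation of a 1-D panoramic view.
    The rotation minimising the difference estimates the heading offset."""
    n = len(current)
    return [rms_diff(reference, current[s:] + current[:s]) for s in range(n)]

# Toy 8-pixel panorama rotated by 3 positions:
ref = [0, 1, 2, 5, 9, 4, 2, 1]
cur = ref[-3:] + ref[:-3]        # same scene, seen after a 3-pixel rotation
diffs = rotational_idf(ref, cur)
print(diffs.index(min(diffs)))   # prints 3, recovering the rotation
```

The smooth increase of this difference with both rotation and translation is exactly what defines the catchment areas and volumes measured in the study.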
NASA Technical Reports Server (NTRS)
2004-01-01
This false-color image from NASA's Mars Exploration Rover Opportunity panoramic camera shows a downward view from the rover as it sits at the edge of 'Endurance' crater. The gradual, 'blueberry'-strewn slope before the rover contains an exposed dark layer of rock that wraps around the upper section of the crater. Scientists suspect that this rock layer will provide clues about Mars' distant past. This mosaic image comprises images taken from 10 rover positions using 750, 530 and 430 nanometer filters, acquired on sol 131 (June 6, 2004).
NASA Technical Reports Server (NTRS)
2004-01-01
This image taken by the Mars Exploration Rover Opportunity shows a bizarre, lumpy rock dubbed 'Wopmay' on the inner slopes of 'Endurance Crater.' Scientists say the rock's unusual texture is unlike any others observed so far at Meridiani Planum. Wopmay measures approximately 1 meter (3.3 feet) across. The image was taken by the rover's panoramic camera on sol 195 (Aug. 11, 2004). Opportunity will likely travel to this or a similar rock in coming sols for a closer look at the alien surface.
NASA Technical Reports Server (NTRS)
2004-01-01
A three-dimensional color model created using data from the Mars Exploration Rover's panoramic camera shows images of airbag drag marks on the martian surface. The triangular rock in the upper left corner is approximately 20 centimeters (8 inches) tall. The meatball-shaped rock in the upper right corner is approximately 10 centimeters (4 inches) tall. The dark portion of the surface, or 'trough' is approximately 1 centimeter (0.4 inches) deep at its deepest point. This model is displayed using software developed by NASA's Ames Research Center.
NASA Technical Reports Server (NTRS)
2004-01-01
[figure removed for brevity, see original site] Figure 1 (close-up) This panoramic camera image of the soil target whimsically called 'Neopolitan' from the Mars Exploration Rover Opportunity's 'Eagle Crater' soil survey highlights the border between two different soil types - a lighter, finer-grained unit to the left and a darker, coarser-grained to the right. Scientists are pondering the unusually distinct border between these different soil types. To the lower left and partially hidden by the shadow of the mast is an airbag bounce mark.
UAV-based NDVI calculation over grassland: An alternative approach
NASA Astrophysics Data System (ADS)
Mejia-Aguilar, Abraham; Tomelleri, Enrico; Asam, Sarah; Zebisch, Marc
2016-04-01
The Normalised Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring and assessing vegetation in remote sensing. The index relies on the reflectance difference between near infrared (NIR) and red light and is thus able to track variations of structural, phenological, and biophysical parameters for seasonal and long-term monitoring. Conventionally, NDVI is inferred from space-borne spectroradiometers, such as MODIS, with a moderate ground resolution of up to 250 m. In recent years, a new generation of miniaturized radiometers and integrated hyperspectral sensors with high resolution became available. Such small and light instruments are particularly well suited to be mounted on airborne unmanned aerial vehicles (UAV) used for monitoring services, reaching ground sampling resolutions on the order of centimetres. Nevertheless, such miniaturized radiometers and hyperspectral sensors are still very expensive and require high upfront capital costs. Therefore, we propose an alternative, substantially cheaper method to calculate NDVI using a camera constellation consisting of two conventional consumer-grade cameras: (i) a modified Ricoh GR camera that acquires the NIR spectrum by removing the internal infrared filter; a mounted optical filter additionally blocks all wavelengths below 700 nm. (ii) A Ricoh GR in RGB configuration using two optical filters to block wavelengths below 600 nm as well as NIR and ultraviolet (UV) light. To assess the merit of the proposed method, we carry out two comparisons: first, reflectance maps generated by the consumer-grade camera constellation are compared to reflectance maps produced with a hyperspectral camera (Rikola). All imaging data and reflectance maps are processed using the PIX4D software.
In the second test, the NDVI at specific points of interest (POI) generated by the consumer-grade camera constellation is compared to NDVI values obtained from ground spectral measurements using a portable spectroradiometer (Spectravista SVC HR-1024i). All data were collected on a dry alpine mountain grassland site in the Matsch valley, Italy, during the vegetation period of 2015. Data acquisition for the first comparison followed a pre-programmed flight plan in which the hyperspectral camera and the alternative dual-camera constellation were mounted separately on an octocopter UAV during two consecutive flight campaigns. Collection of ground spectral measurements took place on the same site and on the same dates (three in total) as the flight campaigns. The proposed technique achieves promising results and thus constitutes a cheap and simple way of collecting spatially explicit information on vegetated areas, even in challenging terrain.
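The index computed from the dual-camera constellation is the standard normalised difference of co-registered NIR and red reflectance. A minimal per-pixel sketch:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index from co-registered
    NIR and red reflectance values."""
    if nir + red == 0:
        return 0.0  # guard against division by zero on dark pixels
    return (nir - red) / (nir + red)

def ndvi_map(nir_band, red_band):
    """Per-pixel NDVI for two aligned single-band images (nested lists)."""
    return [[ndvi(n, r) for n, r in zip(nr, rr)]
            for nr, rr in zip(nir_band, red_band)]

# Healthy vegetation reflects strongly in NIR and absorbs red:
print(round(ndvi(0.50, 0.08), 2))  # 0.72
```

The practical difficulty with a two-camera setup is not this arithmetic but the radiometric calibration and pixel-accurate co-registration of the two bands, which is what the reflectance-map and POI comparisons in the study assess.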
Video Completion in Digital Stabilization Task Using Pseudo-Panoramic Technique
NASA Astrophysics Data System (ADS)
Favorskaya, M. N.; Buryachenko, V. V.; Zotin, A. G.; Pakhirka, A. I.
2017-05-01
Video completion is a necessary stage after stabilization of a non-stationary video sequence if the resolution of the stabilized frames is to equal the resolution of the original frames. The cropped stabilized frames usually lose 10-20% of their area, which worsens the visibility of the reconstructed scenes. The extension of the field of view may appear due to unwanted pan-tilt-zoom camera movement. Our approach prepares a pseudo-panoramic key frame during the stabilization stage as a pre-processing step for the subsequent inpainting. It is based on a multi-layered representation of each frame, including the background and objects moving differently. The proposed algorithm involves four steps: background completion, local motion inpainting, local warping, and seamless blending. Our experiments show that seamless stitching is needed more often than a local warping step. Therefore, seamless blending was investigated in detail, covering four main categories: feathering-based, pyramid-based, gradient-based, and optimal seam-based blending.
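Of the blending categories listed, feathering is the simplest: pixel values in the overlap region are cross-faded with linearly ramped weights. A 1-D sketch (real blending operates on 2-D overlap regions, often along a chosen seam):

```python
def feather_blend(left, right, overlap):
    """Blend two 1-D image rows whose last/first `overlap` pixels cover the
    same scene content, ramping the weights linearly across the overlap."""
    out = left[:-overlap] if overlap else list(left)  # guard: left[:-0] is []
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # weight of the right image, 0 -> 1
        out.append((1 - w) * left[len(left) - overlap + i] + w * right[i])
    out.extend(right[overlap:])
    return out

row_a = [10, 10, 10, 10]
row_b = [20, 20, 20, 20]
print(feather_blend(row_a, row_b, 2))  # ramps smoothly from 10 to 20
```

Pyramid- and gradient-based methods replace the linear ramp with multi-band or gradient-domain mixing, which hides exposure differences that simple feathering leaves visible.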
AUGUSTO'S Sundial: Image-Based Modeling for Reverse Engineering Purposes
NASA Astrophysics Data System (ADS)
Baiocchi, V.; Barbarella, M.; Del Pizzo, S.; Giannone, F.; Troisi, S.; Piccaro, C.; Marcantonio, D.
2017-02-01
A photogrammetric survey of a unique archaeological site is reported in this paper. The survey was performed both with a panoramic image-based solution and by a classical procedure. The panoramic image-based solution was carried out with a commercial system, the Trimble V10 Imaging Rover (IR). This instrument is an integrated camera system that captures 360-degree digital panoramas, composed of 12 images, with a single push. The point clouds obtained with the traditional photogrammetric procedure and with the V10 stations, using the same GCP coordinates, were compared directly in CloudCompare, an open-source software package that compares two point clouds and reports the main statistical data. The site is a portion of the dial plate of the "Horologium Augusti", inaugurated in 9 B.C.E. in the area of Campo Marzio and still intact in its original position, in a cellar of a building in Rome about 7 meters below the present ground level.
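The point-cloud comparison performed here is, at its core, a cloud-to-cloud nearest-neighbour distance: for each point of one cloud, the distance to its closest point in the other. A brute-force sketch (production tools such as CloudCompare accelerate the search with octrees and also offer cloud-to-mesh metrics):

```python
import math

def cloud_to_cloud(cloud_a, cloud_b):
    """For each point in cloud_a, the distance to its nearest neighbour
    in cloud_b. Brute force: O(len(a) * len(b))."""
    return [min(math.dist(p, q) for q in cloud_b) for p in cloud_a]

a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.0)]
d = cloud_to_cloud(a, b)
print([round(x, 3) for x in d])  # [0.1, 0.0]
```

Summary statistics of these per-point distances (mean, standard deviation, histogram) are the "main statistical data" used to judge agreement between the two survey methods.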
NASA Astrophysics Data System (ADS)
Khalifa, Aly A.; Aly, Hussein A.; El-Sherif, Ashraf F.
2016-02-01
Near infrared (NIR) dynamic scene projection systems are used to perform hardware in-the-loop (HWIL) testing of a unit under test operating in the NIR band. The common and complex requirement of a class of these units is a dynamic scene that is spatio-temporally variant. In this paper we apply and investigate active external modulation of NIR laser light in different ranges of temporal frequencies. We use digital micromirror devices (DMDs) integrated as the core of a NIR projection system to generate these dynamic scenes. We deploy the spatial pattern to the DMD controller to simultaneously yield the required amplitude by pulse width modulation (PWM) of the mirror elements as well as the spatio-temporal pattern. Desired modulation and coding of highly stable, high-power visible (red laser at 640 nm) and NIR (diode laser at 976 nm) sources using combinations of different DMD-based optical masks were achieved. These versatile active spatial coding strategies, for both low and high frequencies in the kHz range, for irradiance of different targets were generated by our system and recorded using VIS-NIR fast cameras. The temporally-modulated laser pulse traces were measured using an array of fast-response photodetectors. Finally, using a high resolution spectrometer, we evaluated the NIR dynamic scene projection system response in terms of preserving the wavelength and band spread of the NIR source after projection.
Using Vertical Panoramic Images to Record a Historic Cemetery
NASA Astrophysics Data System (ADS)
Tommaselli, A. M. G.; Polidori, L.; Hasegawa, J. K.; Camargo, P. O.; Hirao, H.; Moraes, M. V. A.; Rissate, E. A., Jr.; Henrique, G. R.; Abreu, P. A. G.; Berveglieri, A.; Marcato, J., Jr.
2013-07-01
In 1919, during colonization of the West Region of São Paulo State, Brazil, the Ogassawara family built a cemetery and a school with donations received from the newspaper Osaka Mainichi Shimbun, in Osaka, Japan. The cemetery was closed by President Getúlio Vargas in 1942, during the Second World War. The architecture of the Japanese cemetery is a unique feature in Latin America. Even considering its historical and cultural relevance, there is a lack of geometric documentation about the location and features of the tombs and other buildings within the cemetery. As an alternative to provide detailed and fast georeferenced information about the area, it is proposed to use near-vertical panoramic images taken with a digital camera with fisheye lens as the primary data, followed by bundle adjustment and photogrammetric restitution. The aim of this paper is to present a feasibility study on the proposed technique, with an assessment of the results from a strip of five panoramic images taken over some graves in the Japanese cemetery. The results showed that a plan at a scale of 1:200 can be produced with photogrammetric restitution at a very low cost when compared to topographic surveying or laser scanning. The paper will address the main advantages of this technique as well as its drawbacks, with quantitative analysis of the results achieved in this experiment.
NASA Technical Reports Server (NTRS)
2006-01-01
As NASA's Mars Exploration Rover Spirit began collecting images for a 360-degree panorama of new terrain, the rover captured this view of a dark boulder with an interesting surface texture. The boulder sits about 40 centimeters (16 inches) tall on Martian sand about 5 meters (16 feet) away from Spirit. It is one of many dark, volcanic rock fragments -- many pocked with rounded holes called vesicles -- littering the slope of 'Low Ridge.' The rock surface facing the rover is similar in appearance to the surface texture on the outside of lava flows on Earth. Spirit took this approximately true-color image with the panoramic camera on the rover's 810th sol, or Martian day, of exploring Mars (April 13, 2006), using the camera's 753-nanometer, 535-nanometer, and 432-nanometer filters.
Detection of unmanned aerial vehicles using a visible camera system.
Hu, Shuowen; Goldman, Geoffrey H; Borel-Donohue, Christoph C
2017-01-20
Unmanned aerial vehicles (UAVs) flown by adversaries are an emerging asymmetric threat to homeland security and the military. To help address this threat, we developed and tested a computationally efficient UAV detection algorithm consisting of horizon finding, motion feature extraction, blob analysis, and coherence analysis. We compare the performance of this algorithm against two variants, one using the difference image intensity as the motion features and another using higher-order moments. The proposed algorithm and its variants are tested using field test data of a group 3 UAV acquired with a panoramic video camera in the visible spectrum. The performance of the algorithms was evaluated using receiver operating characteristic curves. The results show that the proposed approach had the best performance compared to the two algorithmic variants.
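One algorithmic variant above uses the difference-image intensity as its motion feature; the core idea reduces to thresholded absolute frame differencing before blob and coherence analysis. A toy sketch (the threshold value is illustrative):

```python
def difference_image(prev, curr, threshold):
    """Binary motion mask from absolute frame differencing: a pixel is
    flagged as moving when its intensity change exceeds `threshold`."""
    return [[1 if abs(c - p) > threshold else 0 for p, c in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

frame1 = [[10, 10, 10],
          [10, 10, 10]]
frame2 = [[10, 90, 10],
          [10, 85, 10]]   # a small bright target has moved into view
mask = difference_image(frame1, frame2, threshold=30)
print(mask)  # [[0, 1, 0], [0, 1, 0]]
```

In the full pipeline the mask would be restricted to the region above the detected horizon, grouped into blobs, and tracked for temporal coherence to suppress clutter such as birds and foliage.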
Bird's-Eye View of Opportunity at 'Erebus' (Polar)
NASA Technical Reports Server (NTRS)
2006-01-01
This view combines frames taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity on the rover's 652nd through 663rd Martian days, or sols (Nov. 23 to Dec. 5, 2005), at the edge of 'Erebus Crater.' The mosaic is presented as a polar projection. This type of projection provides a kind of overhead view of all of the surrounding terrain. Opportunity examined targets on the outcrop called 'Rimrock' in front of the rover, testing the mobility and operation of Opportunity's robotic arm. The view shows examples of the dunes and ripples that Opportunity has been crossing as the rover drives on the Meridiani plains. This view is an approximate true color rendering composed of images taken through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
Layers of 'Cape Verde' in 'Victoria Crater' (Enhanced)
NASA Technical Reports Server (NTRS)
2006-01-01
This view of Victoria crater is looking north from 'Duck Bay' towards the dramatic promontory called 'Cape Verde.' The dramatic cliff of layered rocks is about 50 meters (about 165 feet) away from the rover and is about 6 meters (about 20 feet) tall. The taller promontory beyond that is about 100 meters (about 325 feet) away, and the vista beyond that extends away for more than 400 meters (about 1300 feet) into the distance. This is a false color rendering (enhanced to bring out details from within the shadowed regions of the scene) of images taken by the panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity during the rover's 952nd sol, or Martian day, (Sept. 28, 2006) using the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
VizieR Online Data Catalog: JHK and IRAC photometry of Sh2-90 YSOs (Samal+, 2014)
NASA Astrophysics Data System (ADS)
Samal, M. R.; Zavagno, A.; Deharveng, L.; Molinari, S.; Ojha, D. K.; Paradis, D.; Tige, J.; Pandey, A. K.; Russeil, D.
2014-03-01
To identify YSOs, we observed the Sh2-90 complex at NIR bands with the WIRCAM instrument at the 3.6m CFHT telescope, and supplemented these observations with the GLIMPSE point source catalog from Benjamin et al. (2003PASP..115..953B, Cat. II/293). The complex was observed at NIR bands on 2006 July 8 using the WIRCAM camera on the CFHT 3.6m telescope. This table includes photometry of the identified YSOs at NIR and Spitzer-IRAC bands. In the table, columns one and two give the coordinates of the YSOs. The following six columns provide the JHK magnitudes and associated errors obtained in our observations, while the next eight columns list the Spitzer-IRAC magnitudes and associated errors. The last column provides the sequence number of the table; sequence numbers 1 to 21, 22 to 55, and 56 to 129 correspond to Class I, Class II, and NIR-excess YSOs, respectively. (1 data file).
VizieR Online Data Catalog: Young stellar objects in NGC 6823 (Riaz+, 2012)
NASA Astrophysics Data System (ADS)
Riaz, B.; Martin, E. L.; Tata, R.; Monin, J.-L.; Phan-Bao, N.; Bouy, H.
2016-10-01
The optical V-, R- and I-band images were obtained using the Prime Focus camera [William Herschel Telescope (WHT)/Wide Field Camera (WFC) detector] mounted on the 4-m WHT in La Palma, Canary Islands, Spain. Observations were performed in 2005 May. The NIR J-, H-, Ks-band images were obtained using the Infrared Side Port Imager (ISPI) mounted on the Cerro Tololo Inter-American Observatory (CTIO) 4-m Blanco Telescope in Cerro Tololo, Chile. Observations were performed in 2007 March. (3 data files).
View Northward from Spirit's Winter Roost (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
One part of the research program that NASA's Mars Exploration Rover Spirit is conducting while sitting at a favorable location for wintertime solar energy is the most detailed panorama yet taken on the surface of Mars. This view is a partial preliminary product from the continuing work on the full image, which will be called the 'McMurdo Panorama.' Spirit's panoramic camera (Pancam) began taking exposures for the McMurdo Panorama on the rover's 814th Martian day (April 18, 2006). The rover has accumulated more than 900 exposures for this panorama so far, through all of the Pancam mineralogy filters and using little or no image compression. Even with a tilt toward the winter sun, the amount of energy available daily is small, so the job will still take one to two more months to complete. This portion of the work in progress looks toward the north. 'Husband Hill,' which Spirit was climbing a year ago, is on the horizon near the center. 'Home Plate' is between that hill and the rover's current position. Wheel tracks imprinted when Spirit drove south from Home Plate can be seen crossing the middle distance of the image from the center to the right. This view is presented in false color to emphasize differences among rock and soil materials. It combines exposures taken through three of the panoramic camera's filters, centered on wavelengths of 750 nanometers, 530 nanometers and 430 nanometers.
Dust deposition on the Mars Exploration Rover Panoramic Camera (Pancam) calibration targets
Kinch, K.M.; Sohl-Dickstein, J.; Bell, J.F.; Johnson, J. R.; Goetz, W.; Landis, G.A.
2007-01-01
The Panoramic Camera (Pancam) on the Mars Exploration Rover mission has acquired in excess of 20,000 images of the Pancam calibration targets on the rovers. Analysis of this data set allows estimates of the rate of deposition and removal of aeolian dust on both rovers. During the first 150-170 sols there was gradual dust accumulation on the rovers but no evidence for dust removal. After that time there is ample evidence for both dust removal and dust deposition on both rover decks. We analyze data from early in both rover missions using a diffusive reflectance mixing model. Assuming a dust settling rate proportional to the atmospheric optical depth, we derive spectra of optically thick layers of airfall dust that are consistent with spectra from dusty regions on the Martian surface. Airfall dust reflectance at the Opportunity site appears greater than at the Spirit site, consistent with other observations. We estimate the optical depth of dust deposited on the Spirit calibration target by sol 150 to be 0.44 ± 0.13. For Opportunity the value was 0.39 ± 0.12. Assuming 80% pore space, we estimate that the dust layer grew at a rate of one grain diameter per ~100 sols on the Spirit calibration target. On Opportunity the rate was one grain diameter per ~125 sols. These numbers are consistent with dust deposition rates observed by Mars Pathfinder taking into account the lower atmospheric dust optical depth during the Mars Pathfinder mission. Copyright 2007 by the American Geophysical Union.
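The deposition model described (a settling rate proportional to the atmospheric optical depth) can be sketched as a per-sol accumulation of deposited optical depth. The atmospheric tau of 0.9 and the rate constant below are hypothetical, chosen only so the toy run reproduces the quoted Spirit estimate of 0.44 by sol 150; the paper fits the real, time-varying atmospheric record:

```python
def deposited_optical_depth(atm_tau_per_sol, rate_constant):
    """Optical depth of settled dust after a sequence of sols, assuming the
    per-sol settling rate is proportional to atmospheric optical depth."""
    total = 0.0
    for tau in atm_tau_per_sol:
        total += rate_constant * tau
    return total

# Hypothetical constant atmosphere; k tuned to reproduce 0.44 at sol 150:
k = 0.44 / (150 * 0.9)
print(round(deposited_optical_depth([0.9] * 150, k), 2))  # 0.44
```

Written this way, periods of dust removal (dust-devil cleaning events) would appear as sols where the increment is negative, which is what the post-sol-170 data show.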
Sundaramoorthy, Sriramkumar; Badaracco, Adrian Garcia; Hirsch, Sophia M.; Park, Jun Hong; Davies, Tim; Dumont, Julien; Shirasu-Hiza, Mimi; Kummel, Andrew C.; Canman, Julie C.
2017-01-01
The combination of near infrared (NIR) and visible wavelengths in light microscopy for biological studies is increasingly common. For example, many fields of biology are developing the use of NIR for optogenetics, in which an NIR laser induces a change in gene expression and/or protein function. One major technical barrier in working with both NIR and visible light on an optical microscope is obtaining their precise coalignment at the imaging plane position. Photon upconverting particles (UCPs) can bridge this gap as they are excited by NIR light but emit in the visible range via an anti-Stokes luminescence mechanism. Here, two different UCPs have been identified, high-efficiency micro540-UCPs and lower efficiency nano545-UCPs, that respond to NIR light and emit visible light with high photostability even at very high NIR power densities (>25,000 Suns). Both of these UCPs can be rapidly and reversibly excited by visible and NIR light and emit light at visible wavelengths detectable with standard emission settings used for Green Fluorescent Protein (GFP), a commonly used genetically-encoded fluorophore. However, the high efficiency micro540-UCPs were suboptimal for NIR and visible light coalignment, due to their larger size and spatial broadening from particle-to-particle energy transfer consistent with a long lived excited state and saturated power dependence. In contrast, the lower efficiency nano-UCPs were superior for precise coalignment of the NIR beam with the visible light path (~2 µm versus ~8 µm beam broadening respectively) consistent with limited particle-to-particle energy transfer, superlinear power dependence for emission, and much smaller particle size. Furthermore, the nano-UCPs were superior to a traditional two-camera method for NIR and visible light path alignment in an in vivo Infrared-Laser-Evoked Gene Operator (IR-LEGO) optogenetics assay in the budding yeast S. cerevisiae. 
In summary, nano-UCPs are powerful new tools for coaligning NIR and visible light paths on a light microscope. PMID:28221018
Fisheye Multi-Camera System Calibration for Surveying Narrow and Complex Architectures
NASA Astrophysics Data System (ADS)
Perfetti, L.; Polari, C.; Fassi, F.
2018-05-01
Narrow spaces and passages are not a rare encounter in cultural heritage, and the shape and extension of these areas pose a serious challenge to any technique chosen to survey their 3D geometry, especially techniques that rely on stationary instrumentation such as terrestrial laser scanning. The ratio between the extension and the cross-section width of many corridors and staircases can easily lead to distortion/drift of the 3D reconstruction because of the propagation of uncertainty. This paper investigates the use of fisheye photogrammetry to produce the 3D reconstruction of such spaces and presents some tests to constrain the degrees of freedom of the photogrammetric network, thereby containing the drift of long data sets as well. The idea is to employ a multi-camera system composed of several fisheye cameras and to implement distance and relative-orientation constraints, as well as the pre-calibration of the internal parameters of each camera, within the bundle adjustment. As a starting point for this investigation, we used the NCTech iSTAR panoramic camera as a rigid multi-camera system. The case study of the Amedeo Spire of the Milan Cathedral, which encloses a spiral staircase, is the stage for all the tests. Comparisons have been made between the results obtained with the multi-camera configuration, the auto-stitched equirectangular images, and a data set obtained with a monocular fisheye configuration using a full-frame DSLR. Results show improved accuracy, down to millimetres, using a rigidly constrained multi-camera system.
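The appeal of fisheye lenses in narrow spaces comes from their projection geometry: under the common equidistant model, the image radius grows linearly with the off-axis angle (r = f·θ) rather than with its tangent, so fields of view approaching 180° remain finite on the sensor. A sketch comparing the two models (the 8 mm focal length is illustrative, not a property of the iSTAR):

```python
import math

def equidistant_fisheye(theta_rad: float, focal_mm: float) -> float:
    """Image radius (mm) of a ray at angle theta from the optical axis
    under the equidistant fisheye model r = f * theta."""
    return focal_mm * theta_rad

def pinhole(theta_rad: float, focal_mm: float) -> float:
    """Perspective (pinhole) projection r = f * tan(theta), for comparison."""
    return focal_mm * math.tan(theta_rad)

# At 80 deg off-axis the perspective radius has exploded; the fisheye has not:
theta = math.radians(80)
print(round(equidistant_fisheye(theta, 8.0), 2))  # ~11.17 mm
print(round(pinhole(theta, 8.0), 2))              # ~45.37 mm
```

This non-perspective model is also why fisheye images need dedicated camera models (and careful pre-calibration of internal parameters) inside the bundle adjustment, as the paper describes.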
Near-infrared face recognition utilizing open CV software
NASA Astrophysics Data System (ADS)
Sellami, Louiza; Ngo, Hau; Fowler, Chris J.; Kearney, Liam M.
2014-06-01
Commercially available hardware, freely available algorithms, and software developed by the authors are synergized successfully to detect and recognize subjects in an environment without visible light. This project integrates three major components: an illumination device operating in the near infrared (NIR) spectrum, an NIR-capable camera, and a software algorithm capable of performing image manipulation, facial detection and recognition. Focusing our efforts in the near infrared spectrum allows the low-budget system to operate covertly while still allowing for accurate face recognition. The result is a valuable capability with potential benefits for future civilian and military security and surveillance operations.
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy R.; Martin, Richard E.
2013-01-01
Additive manufacturing is a rapidly growing field where 3-dimensional parts can be produced layer by layer. NASA's electron beam freeform fabrication (EBF³) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF³ technology are weight savings to support space missions, rapid prototyping in a zero-gravity environment, and improved vehicle readiness. The EBF³ system is composed of 3 main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam is used to melt the wire, and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality weld, a near infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of an NIR camera for temperature measurement. In addition, image processing techniques are presented for weld assessment metrics.
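Single-band radiometric calibration of this kind is often modeled with a Planck-form detector response; the sketch below uses an assumed model with hypothetical constants, not the paper's actual calibration procedure, to show how a fitted gain lets pixel signal be inverted to temperature:

```python
# Assumed single-band model: signal S = K / (exp(c2 / (lambda * T)) - 1),
# where K is a gain fitted from blackbody calibration points.
# Inverting the model recovers temperature T from a measured signal S.
import numpy as np

C2 = 1.4388e-2   # second radiation constant, m*K
LAM = 0.9e-6     # hypothetical effective NIR wavelength, m

def signal_from_temperature(T, K):
    return K / np.expm1(C2 / (LAM * T))

def temperature_from_signal(S, K):
    return C2 / (LAM * np.log1p(K / S))
```

In practice K (and often additional terms) would be fitted against blackbody sources spanning the melt pool temperature range before the inverse map is applied pixel by pixel.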
Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker
Naqvi, Rizwan Ali; Arsalan, Muhammad; Park, Kang Ryoung
2017-01-01
Gaze-based interaction (GBI) techniques have been a popular subject of research in the last few decades. Among other applications, GBI can be used by persons with disabilities to perform everyday tasks, can serve as a game interface, and can play a pivotal role in the human-computer interface (HCI) field. While gaze tracking systems have shown high accuracy in GBI, detecting a user's gaze for target selection is a challenging problem that needs to be considered while using a gaze detection system. Past research has used the blinking of the eyes for this purpose as well as dwell-time-based methods, but these techniques are either inconvenient for the user or require a long time for target selection. Therefore, in this paper, we propose a method for fuzzy system-based target selection for near-infrared (NIR) camera-based gaze trackers. The results of experiments performed, together with tests of usability and on-screen keyboard use, show that the proposed method is better than previous methods. PMID:28420114
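As an illustration of the fuzzy-system idea, a Mamdani-style pipeline fuzzifies gaze features, fires rules, and aggregates them into a selection score. The membership functions, features, and rule base below are hypothetical, not those of the paper:

```python
# Hypothetical sketch: fuzzify two gaze features with triangular membership
# functions, fire two simple rules, and max-aggregate into a selection score.
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def selection_score(dwell_s, eye_closure):
    long_dwell = tri(dwell_s, 0.3, 1.0, 1.7)      # rule input: "dwell is long"
    eyes_shut = tri(eye_closure, 0.4, 0.8, 1.2)   # rule input: "closure is high"
    return max(long_dwell, eyes_shut)             # max-aggregation of rules
```

A target would then be selected when the score crosses a chosen threshold, avoiding both fixed dwell timeouts and explicit blink commands.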
3D terrain reconstruction using Chang’E-3 PCAM images
NASA Astrophysics Data System (ADS)
Chen, Wangli; Zeng, Xingguo; Zhang, Hongbo
2017-10-01
In order to improve understanding of the topography of the Chang'E-3 landing site, 3D terrain models are reconstructed using PCAM images. PCAM (the panoramic camera) is a stereo camera system with a 27 cm baseline on board the Yutu rover. It obtained panoramic images at four detection sites and can achieve a resolution of 1.48 mm/pixel at 10 m, so the PCAM images reveal fine details of the detection region. In the method, SIFT is employed for feature description and feature matching. In addition to collinearity equations, the measured baseline of the stereo system is also used in the bundle adjustment to solve the orientation parameters of all images. Then, pair-wise depth map computation is applied for dense surface reconstruction. Finally, a DTM of the detection region is generated. The DTM covers an area with a radius of about 20 m, centered at the location of the camera. Owing to the wheel design, each individual wheel of the Yutu rover leaves three tracks on the lunar surface, and the width between the first and third track is 15 cm; these tracks are clear and distinguishable in the images. We therefore chose the second detection site, where the wheel tracks are most clearly recognizable, to evaluate the accuracy of the DTM. We measured the width of the wheel tracks every 1.5 m from the center of the detection region and obtained 13 measurements, avoiding areas where the wheel tracks are ambiguous. Results show that the mean wheel track width is 0.155 m with a standard deviation of 0.007 m. Generally, the closer to the center, the more accurate the measurement of wheel width. This is because image deformation increases with distance from the camera, which degrades DTM quality in distant areas. In our work, images of the four detection sites are adjusted independently, meaning there are no tie points between different sites, so deviations may exist between the locations of the same object measured from DTMs of adjacent detection sites.
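From the figures quoted above (27 cm baseline, 1.48 mm/pixel at 10 m), the range-dependent depth error of such a stereo system can be sketched with the standard rectified-pair relations. This is a back-of-envelope illustration under a pinhole assumption, not the authors' reconstruction pipeline:

```python
# Back-of-envelope stereo geometry from the stated PCAM figures.
B = 0.27               # stereo baseline, m
GSD_10M = 0.00148      # ground sample distance at 10 m, m/pixel
F_PX = 10.0 / GSD_10M  # implied focal length in pixels (~6757)

def depth_from_disparity(d_px):
    """Rectified-pair depth: Z = f * B / d."""
    return F_PX * B / d_px

def depth_error(Z, match_err_px=0.5):
    """First-order uncertainty dZ = Z**2 * dd / (f * B): it grows
    quadratically with range, consistent with the DTM quality decline
    reported far from the camera."""
    return Z * Z * match_err_px / (F_PX * B)
```

At 10 m with half-pixel matching error this gives roughly 3 cm of depth uncertainty, while at 20 m (the DTM edge) it is about four times larger.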
In vivo non-invasive optical imaging of temperature-sensitive co-polymeric nanohydrogel
NASA Astrophysics Data System (ADS)
Chen, Haiyan; Zhang, Jian; Qian, Zhiyu; Liu, Fei; Chen, Xinyang; Hu, Yuzhu; Gu, Yueqing
2008-05-01
Assessment of hyperthermia in pathological tissue is a promising strategy for earlier diagnosis of malignant tumors. In this study, the temperature-sensitive co-polymeric nanohydrogel poly(N-isopropylacrylamide-co-acrylic acid) (PNIPA-co-AA) was successfully synthesized by the precipitation polymerization method. The diameters of the nanohydrogels were controlled to be less than 100 nm. The lower critical solution temperature (LCST, 40 °C) was also kept above physiological temperature after integration of a near-infrared (NIR) organic dye (heptamethine cyanine dye, HMCD) within the interior cores. NIR laser light (765 nm), together with a sensitive charge-coupled device (CCD) camera, was used to construct an NIR imaging system. The dynamic behaviors of the PNIPA-co-AA-HMCD composites in nude mice with or without local hyperthermia treatment were monitored in real time by the NIR imager. The results showed that the PNIPA-co-AA-HMCD composites accumulated in the leg treated with local heating and diffused much more slowly than in the other, unheated leg. The results demonstrated that the temperature-responsive PNIPA-co-AA-HMCD composites combined with an NIR imaging system could be an effective temperature mapping technique, which provides a promising prospect for earlier tumor diagnosis and thermally related therapeutic assessment.
Near-infrared imaging of developmental defects in dental enamel.
Hirasuna, Krista; Fried, Daniel; Darling, Cynthia L
2008-01-01
Polarization-sensitive optical coherence tomography (PS-OCT) and near-infrared (NIR) imaging are promising new technologies under development for monitoring early carious lesions. Fluorosis is a growing problem in the United States, and the more prevalent mild fluorosis can be visually mistaken for early enamel demineralization. Unfortunately, there is little quantitative information available regarding the differences in optical properties of sound enamel, enamel developmental defects, and caries. Thirty extracted human teeth with various degrees of suspected fluorosis were imaged using PS-OCT and NIR. An InGaAs camera and a NIR diode laser were used to measure the optical attenuation through transverse tooth sections (approximately 200 µm). A digital microradiography system was used to quantify the enamel defect severity by measurement of the relative mineral loss for comparison with optical scattering measurements. Developmental defects were clearly visible in the polarization-resolved OCT images, demonstrating that PS-OCT can be used to nondestructively measure the depth and possible severity of the defects. Enamel defects on whole teeth that could be imaged with high contrast with visible light were transparent in the NIR. This study suggests that PS-OCT and NIR methods may potentially be used as tools to assess the severity and extent of enamel defects.
Transillumination and reflectance probes for in vivo near-IR imaging of dental caries
NASA Astrophysics Data System (ADS)
Simon, Jacob C.; Lucas, Seth A.; Staninec, Michal; Tom, Henry; Chan, Kenneth H.; Darling, Cynthia L.; Fried, Daniel
2014-02-01
Previous studies have demonstrated the utility of near infrared (NIR) imaging for caries detection employing transillumination and reflectance imaging geometries. Three intra-oral NIR imaging probes were fabricated for the acquisition of in vivo, real time videos using a high definition InGaAs SWIR camera and near-IR broadband light sources. Two transillumination probes provide occlusal and interproximal images using 1300-nm light where water absorption is low and enamel manifests the highest transparency. A third reflectance probe utilizes cross polarization and operates at >1500-nm, where water absorption is higher which reduces the reflectivity of sound tissues, significantly increasing lesion contrast. These probes are being used in an ongoing clinical study to assess the diagnostic performance of NIR imaging for the detection of caries lesions in teeth scheduled for extraction for orthodontic reasons.
KAPAO Prime: Design and Simulation
NASA Astrophysics Data System (ADS)
McGonigle, Lorcan; Choi, P. I.; Severson, S. A.; Spjut, E.
2013-01-01
KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration over UV-NIR wavelengths from Pomona College's telescope atop Table Mountain. We present here the final optical system, KAPAO Prime, designed in the Zemax optical design software, which uses custom off-axis paraboloid mirrors (OAPs) to direct light appropriately to a Shack-Hartmann wavefront sensor, deformable mirror, and science cameras. KAPAO Prime is characterized by diffraction-limited imaging over the full 81" field of view of our optical camera at f/33 as well as over the smaller field of view of our NIR camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of our optical camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity (632.8 nm). Modeling of the thermal deformation of the breadboard in SolidWorks revealed 70 micron contractions along the edges of the board for a temperature decrease of 75°F; when applied to the OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nm of wavefront error. Flexure modeling of the breadboard due to gravity is ongoing. We hope to begin alignment and testing of KAPAO Prime in Q1 2013.
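Assuming the three quoted contributions are independent and therefore add in quadrature, the usual convention for wavefront error budgets (this is our assumption, not a statement from the abstract), the combined budget works out to roughly 45 nm:

```python
# Root-sum-square of the three quoted wavefront error contributions (nm):
# 4 nm (1% OAP tolerances), 40 nm (surface irregularity), 20 nm (thermal).
import math

def rss(*terms_nm):
    """Combine independent error terms in quadrature."""
    return math.sqrt(sum(t * t for t in terms_nm))

total_wfe_nm = rss(4.0, 40.0, 20.0)  # ~45 nm combined
```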
A goggle navigation system for cancer resection surgery
NASA Astrophysics Data System (ADS)
Xu, Junbin; Shao, Pengfei; Yue, Ting; Zhang, Shiwu; Ding, Houzhu; Wang, Jinkun; Xu, Ronald
2014-02-01
We describe a portable fluorescence goggle navigation system for cancer margin assessment during oncologic surgeries. The system consists of a computer, a head-mounted display (HMD) device, a near infrared (NIR) CCD camera, a miniature CMOS camera, and a 780 nm laser diode excitation light source. The fluorescence and background images of the surgical scene are acquired by the CCD camera and the CMOS camera, respectively, co-registered, and displayed on the HMD device in real time. The spatial resolution and the co-registration deviation of the goggle navigation system are evaluated quantitatively. The technical feasibility of the proposed goggle system is tested in an ex vivo tumor model. Our experiments demonstrate the feasibility of using a goggle navigation system for intraoperative margin detection and surgical guidance.
IR sensitivity enhancement of CMOS Image Sensor with diffractive light trapping pixels.
Yokogawa, Sozo; Oshiyama, Itaru; Ikeda, Harumi; Ebiko, Yoshiki; Hirano, Tomoyuki; Saito, Suguru; Oinoue, Takashi; Hagimoto, Yoshiya; Iwamoto, Hayato
2017-06-19
We report on the IR sensitivity enhancement of a back-illuminated CMOS image sensor (BI-CIS) with a 2-dimensional diffractive inverted pyramid array structure (IPA) on crystalline silicon (c-Si) and deep trench isolation (DTI). FDTD simulations of semi-infinite thick c-Si having 2D IPAs on its surface with pitches over 400 nm show more than 30% improvement of light absorption at λ = 850 nm, and a maximum enhancement of 43% at that wavelength is confirmed with the 540 nm pitch. A prototype BI-CIS sample with a pixel size of 1.2 μm square containing 400 nm pitch IPAs shows 80% sensitivity enhancement at λ = 850 nm compared to a reference sample with a flat surface. This is due to diffraction by the IPA and total reflection at the pixel boundary. NIR images taken by a demo camera equipped with a C-mount lens show 75% sensitivity enhancement in the λ = 700-1200 nm wavelength range with negligible spatial resolution degradation. Light-trapping CIS pixel technology promises to improve NIR sensitivity and appears to be applicable to many different image sensor applications including security cameras, personal authentication, and range-finding Time-of-Flight cameras with IR illumination.
Noisy Ocular Recognition Based on Three Convolutional Neural Networks.
Lee, Min Beom; Hong, Hyung Gil; Park, Kang Ryoung
2017-12-17
In recent years, the iris recognition system has been gaining increasing acceptance for applications such as access control and smartphone security. When images of the iris are obtained under unconstrained conditions, image quality is degraded by optical and motion blur, off-angle view (the user's eyes looking somewhere other than directly into the camera), specular reflection (SR) and other factors. Such noisy iris images increase intra-individual variations and, as a result, reduce the accuracy of iris recognition. A typical iris recognition system requires a near-infrared (NIR) illuminator along with an NIR camera, which are larger and more expensive than fingerprint recognition equipment. Hence, many studies have proposed methods of using iris images captured by a visible light camera without the need for an additional illuminator. In this research, we propose a new recognition method for noisy iris and ocular images by using one iris and two periocular regions, based on three convolutional neural networks (CNNs). Experiments were conducted by using the noisy iris challenge evaluation-part II (NICE.II) training dataset (selected from the University of Beira iris (UBIRIS).v2 database), the mobile iris challenge evaluation (MICHE) database, and the Institute of Automation of the Chinese Academy of Sciences (CASIA)-Iris-Distance database. As a result, the method proposed by this study outperformed previous methods.
NASA Technical Reports Server (NTRS)
2004-01-01
This animation shows the transit of Mars' moon Phobos across the Sun. It is made up of images taken by the Mars Exploration Rover Opportunity on the morning of the 45th martian day, or sol, of its mission. This observation will help refine our knowledge of the orbit and position of Phobos. Other spacecraft may be able to take better images of Phobos using this new information. This event is similar to solar eclipses seen on Earth in which our Moon passes in front of the Sun. The images were taken by the rover's panoramic camera.
NASA Technical Reports Server (NTRS)
2004-01-01
The circular shapes seen on the martian surface in these images are 'footprints' left by the Mars Exploration Rover Opportunity's airbags during landing as the spacecraft gently rolled to a stop. Opportunity landed at approximately 9:05 p.m. PST on Saturday, Jan. 24, 2004, Earth-received time. The circular region of the flower-like feature on the right is about the size of a basketball. Scientists are studying the prints for more clues about the makeup of martian soil. The images were taken at Meridiani Planum, Mars, by the panoramic camera on the Mars Exploration Rover Opportunity.
The Athena Pancam and Color Microscopic Imager (CMI)
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Herkenhoff, K. E.; Schwochert, M.; Morris, R. V.; Sullivan, R.
2000-01-01
The Athena Mars rover payload includes two primary science-grade imagers: Pancam, a multispectral, stereo, panoramic camera system, and the Color Microscopic Imager (CMI), a multispectral and variable depth-of-field microscope. Both of these instruments will help to achieve the primary Athena science goals by providing information on the geology, mineralogy, and climate history of the landing site. In addition, Pancam provides important support for rover navigation and target selection for Athena in situ investigations. Here we describe the science goals, instrument designs, and instrument performance of the Pancam and CMI investigations.
Spacecraft technology. [development of satellites and remote sensors
NASA Technical Reports Server (NTRS)
1975-01-01
Developments in spacecraft technology are discussed with emphasis on the Explorer satellite program. The subjects considered include the following: (1) nutational behavior of the Explorer-45 satellite, (2) panoramic sensor development, (3) onboard camera signal processor for Explorer satellites, and (4) microcircuit development. Information on the zero gravity testing of heat pipes is included. Procedures for cleaning heat treated aluminum heat pipes are explained. The development of a five-year magnetic tape, an accurate incremental angular encoder, and a blood freezing apparatus for leukemia research are also discussed.
Coarse Layering at 'Home Plate'
NASA Technical Reports Server (NTRS)
2006-01-01
This image shows coarse-grained layers from around the edge of a low plateau called 'Home Plate' inside Mars' Gusev Crater. One possible origin is material falling to the ground after being thrown aloft by an explosion such as a volcanic eruption or meteorite impact. The panoramic camera (Pancam) on NASA's Mars Exploration Rover Spirit acquired the exposures for this image on Spirit's 749th Martian day (Feb. 10, 2006). This view is an approximately true-color rendering mathematically generated from separate images taken through all of the left Pancam's 432-nanometer to 753-nanometer filters.
Earth Observation taken by the Expedition 11 crew
2005-07-16
ISS011-E-10509 (16 July 2005) --- This high-oblique panoramic view, recorded by a digital still camera using a 400mm lens, shows the eye of Hurricane Emily. The image was captured by the crew of the International Space Station while the complex was over the southern Gulf of Mexico, looking eastward toward the rising moon. At the time, Emily was a strengthening Category 4 hurricane with winds of nearly 155 miles per hour, moving west-northwestward over the northwest Caribbean Sea about 135 miles southwest of Kingston, Jamaica.
Near-field observation platform
NASA Astrophysics Data System (ADS)
Schlemmer, Harry; Baeurle, Constantin; Vogel, Holger
2008-04-01
A miniaturized near-field observation platform is presented, comprising a sensitive daylight camera and an uncooled micro-bolometer thermal imager, each equipped with a wide-angle lens. Both cameras are optimised for a range between a few meters and 200 m. The platform features a stabilised line of sight and can therefore also be used on a vehicle while it is in motion. The line of sight can either be directed manually or the platform can be used in a panoramic mode. The video output is connected to a control panel where algorithms for moving target indication or tracking can be applied in order to support the observer. The near-field platform can also be networked with the vehicle system, and its signals can be utilised, e.g. to designate a new target to the main periscope or the weapon sight.
Spirit Beholds Bumpy Boulder (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
As NASA's Mars Exploration Rover Spirit began collecting images for a 360-degree panorama of new terrain, the rover captured this view of a dark boulder with an interesting surface texture. The boulder sits about 40 centimeters (16 inches) tall on Martian sand about 5 meters (16 feet) away from Spirit. It is one of many dark, volcanic rock fragments -- many pocked with rounded holes called vesicles -- littering the slope of 'Low Ridge.' The rock surface facing the rover is similar in appearance to the surface texture on the outside of lava flows on Earth. Spirit took this false-color image with the panoramic camera on the rover's 810th sol, or Martian day, of exploring Mars (April 13, 2006). This image is a false-color rendering using the camera's 753-nanometer, 535-nanometer, and 432-nanometer filters.
NASA Astrophysics Data System (ADS)
Keller, H. U.; Hartwig, H.; Kramm, R.; Koschny, D.; Markiewicz, W. J.; Thomas, N.; Fernades, M.; Smith, P. H.; Reynolds, R.; Lemmon, M. T.; Weinberg, J.; Marcialis, R.; Tanner, R.; Boss, B. J.; Oquest, C.; Paige, D. A.
2001-08-01
The Robotic Arm Camera (RAC) is one of the key instruments newly developed for the Mars Volatiles and Climate Surveyor payload of the Mars Polar Lander. This lightweight instrument employs a front lens with variable focus range and takes images at distances from 11 mm (image scale 1:1) to infinity. Color images with a resolution of better than 50 μm can be obtained to characterize the Martian soil. Spectral information of nearby objects is retrieved through illumination with blue, green, and red lamp sets. The design and performance of the camera are described in relation to the science objectives and operation. The RAC uses the same CCD detector array as the Surface Stereo Imager and shares the readout electronics with this camera. The RAC is mounted at the wrist of the Robotic Arm and can characterize the contents of the scoop, the samples of soil fed to the Thermal Evolved Gas Analyzer, the Martian surface in the vicinity of the lander, and the interior of trenches dug out by the Robotic Arm. It can also be used to take panoramic images and to retrieve stereo information with an effective baseline surpassing that of the Surface Stereo Imager by about a factor of 3.
Near-infrared dental imaging using scanning fiber endoscope
NASA Astrophysics Data System (ADS)
Zhou, Yaxuan; Lee, Robert; Sadr, Alireza; Seibel, Eric J.
2018-02-01
The near-infrared (NIR) wavelength range of 1300-1500 nm has the potential to outperform or augment other dental imaging modalities such as fluorescence imaging, owing to its lower scattering coefficient in enamel and transparency to stains and non-cariogenic plaque. However, cameras in this wavelength range are bulky and expensive, which leads to difficulties for in-vivo use and commercialization. Thus, we have proposed a new imaging device combining scanning fiber endoscopy (SFE) and NIR imaging technology. The NIR SFE system has the advantages of miniature size (1.6 mm), a flexible shaft, video frame rate (7 Hz) and an expandable wide field of view (60 degrees). Eleven extracted human teeth with or without occlusal caries were scanned by micro-computed X-ray tomography (micro-CT) to obtain 3D micro-CT images, which serve as the standard for comparison. NIR images in reflection mode were then taken on all the occlusal surfaces, using a 1310 nm superluminescent diode and a 1460 nm laser diode, respectively. Qualitative comparison was performed between the near-infrared images and the micro-CT images. Enamel demineralization in NIR appeared as areas of increased reflectivity, distinguishable from non-carious staining at the base of occlusal fissures or developmental defects on cusps. This preliminary work presents proof of the practicability of combining NIR imaging technology with SFE for reliable and noninvasive dental imaging with miniaturization and low cost.
Jablonski-Momeni, Anahita; Jablonski, Boris; Lippe, Nikola
2017-01-01
Objectives/Aims: Apart from the visual detection of caries, X-rays can be taken for detection of approximal lesions. The Proxi head of VistaCam iX intraoral camera system uses near-infrared light (NIR) to enable caries detection in approximal surfaces. The aim of this study was to evaluate the performance of the NIR for the detection of approximal enamel lesions by comparison with radiographic findings. Materials and methods: One hundred ninety-three approximal surfaces from 18 patients were examined visually and using digital radiographs for presence or absence of enamel lesions. Then digital images of each surface were produced using the near-infrared light. Correlation between methods was assessed using Spearman’s rank correlation coefficient (rs). Agreement between radiographic and NIR findings was calculated using the kappa coefficient. McNemar’s test was used to analyse differences between the radiographic and NIR findings (α=0.05). Results: Moderate correlation was found between all detection methods (rs=0.33–0.50, P<0.0001). Agreement between the radiographic and NIR findings was moderate (κ=0.50, 95% CI=0.37–0.62) for the distinction between sound surfaces and enamel caries. No significant differences were found between the findings (P=0.07). Conclusion: Radiographs and NIR were found to be comparable for the detection of enamel lesions in permanent teeth. PMID:29607082
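The two agreement statistics reported above, Cohen's kappa and Spearman's rank correlation, can be reproduced on toy data as follows. The per-surface ratings here are invented for illustration (0 = sound, 1 = enamel lesion), not the study's data:

```python
# Cohen's kappa (chance-corrected agreement) and Spearman's rank correlation,
# computed on hypothetical paired ratings for eight approximal surfaces.
import numpy as np
from scipy.stats import spearmanr

def cohens_kappa(a, b):
    a, b = np.asarray(a), np.asarray(b)
    p_obs = np.mean(a == b)                  # observed agreement
    labels = np.union1d(a, b)
    p_exp = sum(np.mean(a == l) * np.mean(b == l) for l in labels)  # by chance
    return (p_obs - p_exp) / (1.0 - p_exp)

xray = [0, 0, 1, 1, 0, 1, 0, 0]   # hypothetical radiographic findings
nir = [0, 0, 1, 0, 0, 1, 0, 1]    # hypothetical NIR findings

kappa = cohens_kappa(xray, nir)
rho, p_value = spearmanr(xray, nir)
```

Kappa near 0.5, as in the study, is conventionally read as moderate agreement beyond chance.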
Cheetah: A high frame rate, high resolution SWIR image camera
NASA Astrophysics Data System (ADS)
Neys, Joel; Bentell, Jonas; O'Grady, Matt; Vermeiren, Jan; Colin, Thierry; Hooylaerts, Peter; Grietens, Bob
2008-10-01
A high-resolution, high frame rate InGaAs-based image sensor and associated camera have been developed. The sensor and the camera are capable of recording and delivering more than 1700 full 640×512-pixel frames per second. The FPA utilizes a low-lag CTIA current integrator in each pixel, enabling integration times shorter than one microsecond. On-chip logic allows four different sub-windows to be read out simultaneously at even higher rates. The spectral sensitivity of the FPA is situated in the SWIR range [0.9-1.7 μm] and can be further extended into the visible and NIR range. The Cheetah camera has up to 16 GB of on-board memory to store the acquired images and transfer the data over a Gigabit Ethernet connection to the PC. The camera is also equipped with a full Camera Link™ interface to directly stream the data to a frame grabber or dedicated image processing unit. The Cheetah camera is completely under software control.
NASA Astrophysics Data System (ADS)
Zelazny, Amy; Benson, Robert; Deegan, John; Walsh, Ken; Schmidt, W. David; Howe, Russell
2013-06-01
We describe the benefits to camera system SWaP-C associated with the use of aspheric molded glasses and optical polymers in the design and manufacture of optical components and elements. Both camera objectives and display eyepieces, typical for night vision man-portable EO/IR systems, are explored. We discuss optical trade-offs, system performance, and cost reductions associated with this approach in both visible and non-visible wavebands, specifically NIR and LWIR. Example optical models are presented, studied, and traded using this approach.
NASA Astrophysics Data System (ADS)
Šedina, Jaroslav; Pavelka, Karel; Raeva, Paulina
2017-04-01
For monitoring ecologically valuable areas, precision agriculture and forestry, thematic maps or small GIS are needed. Remotely Piloted Aircraft Systems (RPAS) data can be obtained on demand in a short time with cm resolution. Data collection is environmentally friendly and low-cost from an economical point of view. This contribution is focused on using the eBee drone for mapping or monitoring a national nature reserve which is not open to the public and partly inaccessible because of its moorland nature. Based on new equipment (a thermal imager, multispectral imager, and NIR, NIR red-edge and VIS cameras), we have started new projects in precision agriculture and forestry.
Spirit Mini-TES Observations: From Bonneville Crater to the Columbia Hills.
NASA Astrophysics Data System (ADS)
Blaney, D. L.; Athena Science
2004-11-01
During the Mars Exploration Rover extended mission, the Spirit rover traveled from the rim of the crater informally known as "Bonneville Crater" into the hills informally known as the "Columbia Hills" in Gusev Crater. During this >3 km drive, Mini-TES (Miniature Thermal Emission Spectrometer) collected systematic observations to characterize spectral diversity, as well as targeted observations of rocks, soils, rover tracks, and trenches. Surface temperatures steadily decreased during the drive and arrival into the Columbia Hills with the approach of winter. Mini-TES covers the 5-29 micron spectral region with a 20 mrad aperture that is co-registered with the panoramic and navigation cameras. As at the landing site (Christensen et al., Science, 2004), many dark rocks in the plains between "Bonneville Crater" and the "Columbia Hills" show long-wavelength (15-25 μm) absorptions due to olivine, consistent with the detection of olivine-bearing basalt at this site from orbital TES infrared spectroscopy. Rocks with the spectral signature of olivine are rarer in the Columbia Hills. Measurements of outcrops of presumably intact bedrock lack any olivine signature and are consistent with other results indicating that these rocks are highly altered. Rock coatings and fine dust on rocks are common. Soils have thin dust coatings, and disturbed soil (e.g., rover tracks and trenches) is consistent with basalt. Mini-TES observations were coordinated with Panoramic Camera (Pancam) observations to allow us to search for correlations of visible spectral properties with the infrared. This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract to NASA.
Surface albedo observations at Gusev Crater and Meridiani Planum, Mars
NASA Astrophysics Data System (ADS)
Bell, J. F.; Rice, M. S.; Johnson, J. R.; Hare, T. M.
2008-05-01
During the Mars Exploration Rover mission, the Pancam instrument has periodically acquired large-scale panoramic images with its broadband (739 +/- 338 nm) filter in order to estimate the Lambert bolometric albedo of the surface along each rover's traverse. In this work we present the full suite of such estimated albedo values measured to date by the Spirit and Opportunity rovers along their traverses in Gusev Crater and Meridiani Planum, respectively. We include estimated bolometric albedo values of individual surface features (e.g., outcrops, dusty plains, aeolian bed forms, wheel tracks, light-toned soils, and crater walls) as well as overall surface averages of the 43 total panoramic albedo data sets acquired to date. We also present comparisons to estimated Lambert albedo values taken from the Mars Global Surveyor Mars Orbiter Camera (MOC) along the rovers' traverses, and to the large-scale bolometric albedos of the sites from the Viking Orbiter Infrared Thermal Mapper (IRTM) and Mars Global Surveyor/Thermal Emission Spectrometer (TES). The ranges of Pancam-derived albedos at Gusev Crater (0.14 to 0.25) and in Meridiani Planum (0.10 to 0.18) are in good agreement with IRTM, TES, and MOC orbital measurements. These data sets will be a useful tool and benchmark for future investigations of albedo variations with time, including measurements from orbital instruments like the Context Camera and High Resolution Imaging Science Experiment on Mars Reconnaissance Orbiter. Long-term, accurate albedo measurements could also be important for future efforts in climate modeling as well as for studies of active surface processes.
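For context, a Lambert albedo estimate of this kind reduces to a simple radiometric relation once the scene radiance, the band solar irradiance, and the solar incidence angle are known. The sketch below illustrates only that relation; the function name, input values, and single-band treatment are assumptions for illustration, not the actual Pancam calibration pipeline:

```python
import math

def lambert_albedo(radiance, solar_irradiance, incidence_deg):
    """Estimate Lambert albedo from band-averaged scene radiance
    (W m^-2 sr^-1 nm^-1), band solar irradiance (W m^-2 nm^-1), and
    solar incidence angle in degrees.
    For a Lambert surface: A = pi * L / (F * cos(i))."""
    return math.pi * radiance / (
        solar_irradiance * math.cos(math.radians(incidence_deg)))

# Illustrative values only (not actual Pancam data):
a = lambert_albedo(radiance=5.0, solar_irradiance=120.0, incidence_deg=30.0)
print(round(a, 3))  # prints 0.151
```

In practice the rover measurements also rely on onboard calibration targets and atmospheric corrections, which this sketch omits.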
Comparison of Near-Infrared Imaging Camera Systems for Intracranial Tumor Detection.
Cho, Steve S; Zeh, Ryan; Pierce, John T; Salinas, Ryan; Singhal, Sunil; Lee, John Y K
2018-04-01
Distinguishing neoplasm from normal brain parenchyma intraoperatively is critical for the neurosurgeon. 5-Aminolevulinic acid (5-ALA) has been shown to improve gross total resection and progression-free survival but has limited availability in the USA. Near-infrared (NIR) fluorescence has advantages over visible light fluorescence with greater tissue penetration and reduced background fluorescence. In order to prepare for the increasing number of NIR fluorophores that may be used in molecular imaging trials, we chose to compare a state-of-the-art neurosurgical microscope (System 1) to one of the commercially available NIR visualization platforms (System 2). Serial dilutions of indocyanine green (ICG) were imaged with both systems in the same environment. Each system's sensitivity and dynamic range for NIR fluorescence were documented and analyzed. In addition, brain tumors from six patients were imaged with both systems and analyzed. In vitro, System 2 demonstrated greater ICG sensitivity and detection range (System 1: 1.5-251 μg/l versus System 2: 0.99-503 μg/l). Similarly, in vivo, System 2 demonstrated a signal-to-background ratio (SBR) of 2.6 ± 0.63 before dura opening, 5.0 ± 1.7 after dura opening, and 6.1 ± 1.9 after tumor exposure. In contrast, System 1 could not easily detect ICG fluorescence prior to dura opening, with an SBR of 1.2 ± 0.15. After the dura was reflected, SBR increased to 1.4 ± 0.19, and upon exposure of the tumor SBR increased to 1.8 ± 0.26. Dedicated NIR imaging platforms can outperform conventional microscopes in intraoperative NIR detection. Future microscopes with improved NIR detection capabilities could enhance the use of NIR fluorescence to detect neoplasm and improve patient outcomes.
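The SBR figures quoted above are ratios of mean fluorescence intensity in a tumor region of interest to the mean intensity of adjacent normal tissue. A minimal sketch of that computation follows; the array shapes and values are invented for illustration and this is not the study's analysis code:

```python
import numpy as np

def signal_to_background_ratio(image, signal_mask, background_mask):
    """Mean fluorescence inside the tumor ROI divided by the mean
    fluorescence of the surrounding normal-tissue ROI (the SBR metric)."""
    return image[signal_mask].mean() / image[background_mask].mean()

# Toy 4x4 NIR frame: a bright 2x2 "tumor" patch on a dim background.
frame = np.full((4, 4), 10.0)
frame[1:3, 1:3] = 26.0
sig = np.zeros((4, 4), dtype=bool)
sig[1:3, 1:3] = True
sbr = signal_to_background_ratio(frame, sig, ~sig)
print(sbr)  # prints 2.6
```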
On the Origin of the Near-infrared Emission from the Neutron-star Low-mass X-Ray Binary GX 9+1
NASA Astrophysics Data System (ADS)
van den Berg, Maureen; Homan, Jeroen
2017-01-01
We have determined an improved position for the luminous persistent neutron-star low-mass X-ray binary and atoll source GX 9+1 from archival Chandra X-ray Observatory data. The new position significantly differs from a previously published Chandra position for this source. Based on the revised X-ray position we have identified a new near-infrared (NIR) counterpart to GX 9+1 in Ks-band images obtained with the PANIC and FourStar cameras on the Magellan Baade Telescope. NIR spectra of this Ks = 16.5 ± 0.1 mag star, taken with the FIRE spectrograph on the Baade Telescope, show a strong Brγ emission line, which is a clear signature that we discovered the true NIR counterpart to GX 9+1. The mass donor in GX 9+1 cannot be a late-type giant, as such a star would be brighter than the estimated absolute Ks magnitude of the NIR counterpart. The slope of the dereddened NIR spectrum is poorly constrained due to uncertainties in the column density NH and NIR extinction. Considering the source's distance and X-ray luminosity, we argue that NH likely lies near the high end of the previously suggested range. If this is indeed the case, the NIR spectrum is consistent with thermal emission from a heated accretion disk, possibly with a contribution from the secondary. In this respect, GX 9+1 is similar to other bright atolls and the Z sources, whose NIR spectra do not show the slope that is expected for a dominant contribution from optically thin synchrotron emission from the inner regions of a jet. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.
NASA Astrophysics Data System (ADS)
Hervey, Nathan; Khan, Bilal; Shagman, Laura; Tian, Fenghua; Delgado, Mauricio R.; Tulchin-Francis, Kirsten; Shierk, Angela; Smith, Linsley; Reid, Dahlia; Clegg, Nancy J.; Liu, Hanli; MacFarlane, Duncan; Alexandrakis, George
2013-03-01
Functional neurological imaging has been shown to be valuable in evaluating brain plasticity in children with cerebral palsy (CP). In recent studies it has been demonstrated that functional near-infrared spectroscopy (fNIRS) is a viable and sensitive method for imaging motor cortex activities in children with CP. However, during unilateral finger tapping tasks children with CP often exhibit mirror motions (unintended motions in the non-tapping hand), and current fNIRS image formation techniques do not account for this. Therefore, the resulting fNIRS images contain activation from intended and unintended motions. In this study, cortical activity was mapped with fNIRS on four children with CP and five controls during a finger tapping task. Finger motion and arm muscle activation were concurrently measured using motion tracking cameras and electromyography (EMG). Subject-specific regressors were created from motion capture and EMG data and used in a general linear model (GLM) analysis in an attempt to create fNIRS images representative of different motions. The analysis provided an fNIRS image representing activation due to motion and muscle activity for each hand. This method could prove to be valuable in monitoring brain plasticity in children with CP by providing more consistent images between measurements. Additionally, muscle effort versus cortical effort was compared between control and CP subjects. More cortical effort was required to produce similar muscle effort in children with CP. It is possible this metric could be a valuable diagnostic tool in determining response to treatment.
Water Ice Clouds as Seen from the Mars Exploration Rovers
NASA Astrophysics Data System (ADS)
Wolff, M. J.; Clancy, R. T.; Banfield, D.; Cuozzo, K.
2005-12-01
Water ice clouds that bear a striking resemblance to terrestrial cirrus (e.g., "Mare's tails") have been observed by the Panoramic Camera (Pancam), the Navigation Camera (Navcam), the Hazard Camera (Hazcam), and the Miniature Thermal Emission Spectrometer (Mini-TES) on board the Mars Exploration Rovers (MER). Such phenomena represent an opportunity to characterize local- and regional-scale meteorology and to improve our understanding of the processes involved. However, a necessary first step is to adequately describe some basic properties of the detected clouds: 1) when are the clouds present (i.e., local time, season, etc.)? 2) where are the clouds present? That is to say, what is the relative frequency between the two rover sites, and what is the connection to detections from orbiting spacecraft? 3) what are the observed morphologies? 4) what are the projected velocities (i.e., wind speeds and directions) associated with the clouds? 5) what is the abundance of water ice nuclei (i.e., optical depth)? Our talk will summarize our progress in answering the above questions, as well as provide initial results in connecting the observations to more global behavior in the Martian climate.
NASA Technical Reports Server (NTRS)
2005-01-01
Taking advantage of extra solar energy collected during the day, NASA's Mars Exploration Rover Spirit settled in for an evening of stargazing, photographing the two moons of Mars as they crossed the night sky. This time-lapse composite, acquired the evening of Spirit's martian sol 590 (Aug. 30, 2005) from a perch atop 'Husband Hill' in Gusev Crater, shows Phobos, the brighter moon, on the left, and Deimos, the dimmer moon, on the right. In this sequence of images obtained every 170 seconds, both moons move from top to bottom. The bright star Aldebaran forms a trail on the right, along with some other stars in the constellation Taurus. Most of the other streaks in the image mark the collision of cosmic rays with pixels in the camera. Scientists will use images of the two moons to better map their orbital positions, learn more about their composition, and monitor the presence of nighttime clouds or haze. Spirit took the six images that make up this composite using its panoramic camera with the camera's broadband filter, which was designed specifically for acquiring images under low-light conditions.
NASA Astrophysics Data System (ADS)
Ghionis, George; Trygonis, Vassilis; Karydis, Antonis; Vousdoukas, Michalis; Alexandrakis, George; Drakopoulos, Panos; Amdreadis, Olympos; Psarros, Fotis; Velegrakis, Antonis; Poulos, Serafim
2016-04-01
Effective beach management requires environmental assessments that are based on sound science, are cost-effective and are available to beach users and managers in an accessible, timely and transparent manner. The most common problems are: 1) The available field data are scarce and of sub-optimal spatio-temporal resolution and coverage, 2) our understanding of local beach processes needs to be improved in order to accurately model/forecast beach dynamics under a changing climate, and 3) the information provided by coastal scientists/engineers in the form of data, models and scientific interpretation is often too complicated to be of direct use by coastal managers/decision makers. A multispectral video system has been developed, consisting of one or more video cameras operating in the visible part of the spectrum, a passive near-infrared (NIR) camera, an active NIR camera system, a thermal infrared camera and a spherical video camera, coupled with innovative image processing algorithms and a telemetric system for the monitoring of coastal environmental parameters. The complete system has the capability to record, process and communicate (in quasi-real time) high frequency information on shoreline position, wave breaking zones, wave run-up, erosion hot spots along the shoreline, nearshore wave height, turbidity, underwater visibility, wind speed and direction, air and sea temperature, solar radiation, UV radiation, relative humidity, barometric pressure and rainfall. An innovative, remotely-controlled interactive visual monitoring system, based on the spherical video camera (with a 360° field of view), combines the video streams from all cameras and can be used by beach managers to monitor (in real time) beach user numbers, flow activities and safety at beaches of high touristic value.
The high resolution near infrared cameras permit 24-hour monitoring of beach processes, while the thermal camera provides information on beach sediment temperature and moisture, can detect upwelling in the nearshore zone, and enhances the safety of beach users. All data can be presented in real- or quasi-real time and are stored for future analysis and training/validation of coastal processes models. Acknowledgements: This work was supported by the project BEACHTOUR (11SYN-8-1466) of the Operational Program "Cooperation 2011, Competitiveness and Entrepreneurship", co-funded by the European Regional Development Fund and the Greek Ministry of Education and Religious Affairs.
'Berries' and Rock Share Common Origins
NASA Technical Reports Server (NTRS)
2004-01-01
This false-color composite image, taken at a region of the rock outcrop dubbed 'Shoemaker's Patio' near the Mars Exploration Rover Opportunity's landing site, shows finely layered sediments, which have been accentuated by erosion. The sphere-like grains or 'blueberries' distributed throughout the outcrop can be seen lining up with individual layers. This observation indicates that the spherules are geologic features called concretions, which form in pre-existing wet sediments. Other sphere-like grains, such as impact spherules or volcanic lapilli (fragments of material between 2 and 64 millimeters, or 0.08 and 2.5 inches, in maximum dimension that are ejected from a volcano) are thought to be deposited with sediments and thus would form layers distinct from those of the rocks. This image was captured by the rover's panoramic camera on the 50th martian day, or sol, of the mission. Data from the camera's infrared, green and violet filters were used to create this false-color picture.
Apollo 17 Command/Service modules photographed from lunar module in orbit
1972-12-14
AS17-145-22254 (14 Dec. 1972) --- An excellent view of the Apollo 17 Command and Service Modules (CSM) photographed from the Lunar Module (LM) "Challenger" during rendezvous and docking maneuvers in lunar orbit. The LM ascent stage, with astronauts Eugene A. Cernan and Harrison H. Schmitt aboard, had just returned from the Taurus-Littrow landing site on the lunar surface. Astronaut Ronald E. Evans remained with the CSM in lunar orbit. Note the exposed Scientific Instrument Module (SIM) Bay in Sector 1 of the Service Module (SM). Three experiments are carried in the SIM bay: S-209 lunar sounder, S-171 infrared scanning spectrometer, and the S-169 far-ultraviolet spectrometer. Also mounted in the SIM bay are the panoramic camera, mapping camera and laser altimeter used in service module photographic tasks. A portion of the LM is on the right.
Design of a Day/Night Lunar Rover
NASA Astrophysics Data System (ADS)
Berkelman, Peter; Easudes, Jesse; Martin, Martin C.; Rollins, Eric; Silberman, Jack; Chen, Mei; Hancock, John; Mor, Andrew B.; Sharf, Alex; Warren, Tom; Bapna, Deepak
1995-06-01
The pair of lunar rovers discussed in this report will return video and state data to various ventures, including theme park and marketing concerns, science agencies, and educational institutions. The greatest challenge accepted by the design team was to enable operations throughout the extremely cold and dark lunar night, an unprecedented goal in planetary exploration. This is achieved through the use of the emerging technology of Alkali Metal Thermal to Electric Converters (AMTEC), provided with heat from an innovative beta-decay heat source, Krypton-85 gas. Although previous space missions have returned still images, our design will convey panoramic video from a ring of cameras around the rover. A six-wheel rocker-bogie mechanism is implemented to propel the rover. The rovers will also safeguard their own operation, allowing untrained members of the general public to drive the vehicle. Additionally, scientific exploration and educational outreach will be supported with a user-operable, steerable and zoomable camera.
NASA Technical Reports Server (NTRS)
2004-01-01
The Mars Exploration Rover Opportunity finished observations of the prominent rock outcrop it has been studying during its 51 martian days, or sols, on Mars, and is currently on the hunt for new discoveries. This image from the rover's navigation camera atop its mast features Opportunity's lander--its temporary home for the six-month cruise to Mars. The rover's soil survey traverse plan involves arcing around its landing site, called the Challenger Memorial Station, and over the trench it made on sol 23. In this image, Opportunity is situated about 6.2 meters (about 20.3 feet) from the lander. Rover tracks zig-zag along the surface. Bounce marks and airbag retraction marks are visible around the lander. The calibration target or sundial, which both rover panoramic cameras use to verify the true colors and brightness of the red planet, is visible on the back end of the rover.
Still Giving Thanks for Good Health
NASA Technical Reports Server (NTRS)
2005-01-01
NASA's Mars Exploration Rover Spirit took this full-circle panorama of the region near 'Husband Hill' (the peak just to the left of center) over the Thanksgiving holiday, before ascending farther. Both the Spirit and Opportunity rovers are still going strong, more than a year after landing on Mars. This 360-degree view combines 243 images taken by Spirit's panoramic camera over several martian days, or sols, from sol 318 (Nov. 24, 2004) to sol 325 (Dec. 2, 2004). It is an approximately true-color rendering generated from images taken through the camera's 750-, 530-, and 480-nanometer filters. The view is presented here in a cylindrical projection with geometric seam correction. Spirit is now driving up the slope of Husband Hill along a path about one-quarter of the way from the left side of this mosaic.
Autonomous Exploration for Gathering Increased Science
NASA Technical Reports Server (NTRS)
Bornstein, Benjamin J.; Castano, Rebecca; Estlin, Tara A.; Gaines, Daniel M.; Anderson, Robert C.; Thompson, David R.; DeGranville, Charles K.; Chien, Steve A.; Tang, Benyang; Burl, Michael C.;
2010-01-01
The Autonomous Exploration for Gathering Increased Science System (AEGIS) provides automated targeting for remote sensing instruments on the Mars Exploration Rover (MER) mission, which at the time of this reporting has had two rovers exploring the surface of Mars (see figure). Currently, targets for rover remote-sensing instruments must be selected manually based on imagery already on the ground with the operations team. AEGIS enables the rover flight software to analyze imagery onboard in order to autonomously select and sequence targeted remote-sensing observations in an opportunistic fashion. In particular, this technology will be used to automatically acquire sub-framed, high-resolution, targeted images taken with the MER panoramic cameras. This software provides: 1) Automatic detection of terrain features in rover camera images, 2) Feature extraction for detected terrain targets, 3) Prioritization of terrain targets based on a scientist target feature set, and 4) Automated re-targeting of rover remote-sensing instruments at the highest priority target.
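The four capabilities listed above can be illustrated with a toy prioritization pass: detected terrain targets carry extracted feature values, and a scientist-specified weight set ranks them for re-targeting. All names, features, and weights here are hypothetical sketches, not the actual AEGIS flight software:

```python
def prioritize_targets(targets, weights):
    """Rank detected terrain targets by a weighted score over their
    extracted features (capability 3 above). `targets` maps a target id
    to its feature dict; `weights` is the scientist-specified feature
    preference set. Returns target ids, highest priority first."""
    def score(feats):
        return sum(weights.get(name, 0.0) * value
                   for name, value in feats.items())
    return sorted(targets, key=lambda t: score(targets[t]), reverse=True)

# Hypothetical detections with normalized feature values:
detected = {
    "rock_a": {"size": 0.8, "contrast": 0.2},
    "rock_b": {"size": 0.3, "contrast": 0.9},
}
ranked = prioritize_targets(detected, {"size": 1.0, "contrast": 2.0})
print(ranked[0])  # prints rock_b
```

The top-ranked target would then be handed to the sequencing logic that points the remote-sensing instrument (capability 4).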
Noisy Ocular Recognition Based on Three Convolutional Neural Networks
Lee, Min Beom; Hong, Hyung Gil; Park, Kang Ryoung
2017-01-01
In recent years, the iris recognition system has been gaining increasing acceptance for applications such as access control and smartphone security. When images of the iris are obtained under unconstrained conditions, an issue of undermined quality is caused by optical and motion blur, off-angle view (the user's eyes looking somewhere else, not into the front of the camera), specular reflection (SR) and other factors. Such noisy iris images increase intra-individual variations and, as a result, reduce the accuracy of iris recognition. A typical iris recognition system requires a near-infrared (NIR) illuminator along with an NIR camera, which are larger and more expensive than fingerprint recognition equipment. Hence, many studies have proposed methods of using iris images captured by a visible light camera without the need for an additional illuminator. In this research, we propose a new recognition method for noisy iris and ocular images by using one iris and two periocular regions, based on three convolutional neural networks (CNNs). Experiments were conducted using the Noisy Iris Challenge Evaluation-Part II (NICE.II) training dataset (selected from the University of Beira Iris (UBIRIS).v2 database), the Mobile Iris Challenge Evaluation (MICHE) database, and the Institute of Automation of the Chinese Academy of Sciences (CASIA)-Iris-Distance database. As a result, the method proposed by this study outperformed previous methods. PMID:29258217
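One common way to combine the outputs of several recognition networks, such as one iris CNN and two periocular CNNs, is weighted score-level fusion. The snippet below sketches that general idea only; the weights and the linear fusion rule are assumptions for illustration, not the fusion method of this paper:

```python
def fuse_scores(iris_score, peri1_score, peri2_score,
                weights=(0.5, 0.25, 0.25)):
    """Weighted score-level fusion of matching scores from three
    networks. Weights are illustrative: here the iris network counts
    twice as much as each periocular network."""
    scores = (iris_score, peri1_score, peri2_score)
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical per-network matching scores for one probe/gallery pair:
print(round(fuse_scores(0.2, 0.4, 0.6), 2))  # prints 0.35
```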
Earth Observations taken by Expedition 34 crewmember
2013-01-04
ISS034-E-016601 (4 Jan. 2013) --- On Jan. 4 a large presence of stratocumulus clouds was the central focus of camera lenses which remained aimed at the clouds as the Expedition 34 crew members aboard the International Space Station flew above the northwestern Pacific Ocean about 460 miles east of northern Honshu, Japan. This is a descending pass with a panoramic view looking southeast in late afternoon light with the terminator (upper left). The cloud pattern is typical for this part of the world. The low clouds carry cold air over a warmer sea with no discernable storm pattern.
Immersive Photography Renders 360 degree Views
NASA Technical Reports Server (NTRS)
2008-01-01
An SBIR contract through Langley Research Center helped Interactive Pictures Corporation, of Knoxville, Tennessee, create an innovative imaging technology. This technology is a video imaging process that allows real-time control of live video data and can provide users with interactive, panoramic 360 views. The camera system can see in multiple directions, provide up to four simultaneous views, each with its own tilt, rotation, and magnification, yet it has no moving parts, is noiseless, and can respond faster than the human eye. In addition, it eliminates the distortion caused by a fisheye lens, and provides a clear, flat view of each perspective.
NASA Technical Reports Server (NTRS)
2004-01-01
This image of the martian sundial onboard the Mars Exploration Rover Spirit was processed by students in the Red Rover Goes to Mars program to impose hour markings on the face of the dial. The position of the shadow of the sundial's post within the markings indicates the time of day and the season, which in this image is 12:17 p.m. local solar time, late summer. A team of 16 students from 12 countries was selected by the Planetary Society to participate in this program. This image was taken on Mars by the rover's panoramic camera.
2004-02-13
This color image taken by the Mars Exploration Rover Spirit's panoramic camera on Sol 40 is centered on an unusually flaky rock called Mimi. Mimi is only one of many features in the area known as "Stone Council," but looks very different from any rock that scientists have seen at the Gusev crater site so far. Mimi's flaky appearance leads scientists to a number of hypotheses. Mimi could have been subjected to pressure either through burial or impact, or may have once been a dune that was cemented into flaky layers, a process that sometimes involves the action of water. http://photojournal.jpl.nasa.gov/catalog/PIA05283
As Far as Opportunity's Eye Can See
NASA Technical Reports Server (NTRS)
2004-01-01
This expansive view of the martian real estate surrounding the Mars Exploration Rover Opportunity is the first 360 degree, high-resolution color image taken by the rover's panoramic camera. The airbag marks, or footprints, seen in the soil trace the route by which Opportunity rolled to its final resting spot inside a small crater at Meridiani Planum, Mars. The exposed rock outcropping is a future target for further examination. This image mosaic consists of 225 individual frames.
Pancam Imaging of the Mars Exploration Rover Landing Sites in Gusev Crater and Meridiani Planum
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Squyres, S. W.; Arvidson, R. E.; Arneson, H. M.; Bass, D.; Cabrol, N.; Calvin, W.; Farmer, J.; Farrand, W. H.
2004-01-01
The Mars Exploration Rovers carry four Panoramic Camera (Pancam) instruments (two per rover) that have obtained high resolution multispectral and stereoscopic images for studies of the geology, mineralogy, and surface and atmospheric physical properties at both rover landing sites. The Pancams are also providing significant mission support measurements for the rovers, including Sun-finding for rover navigation, hazard identification and digital terrain modeling to help guide long-term rover traverse decisions, high resolution imaging to help guide the selection of in situ sampling targets, and acquisition of education and public outreach imaging products.
Earth Observations taken by the Expedition 39 Crew
2014-04-22
ISS039-E-014807 (22 April 2014) --- As the International Space Station passed over the Bering Sea on Earth Day, one of the Expedition 39 crew members aboard the orbital outpost shot this panoramic scene looking toward Russia. The Kamchatka Peninsula can be seen in the foreground. Sunglint is visible on the left side of the frame. Only two points of view from Earth orbit were better for taking in this scene than that of the crew member with the camera inside, and those belonged to the two spacewalking astronauts -- Flight Engineers Rick Mastracchio and Steve Swanson of NASA.
SFR test fixture for hemispherical and hyperhemispherical camera systems
NASA Astrophysics Data System (ADS)
Tamkin, John M.
2017-08-01
Optical testing of camera systems in volume production environments can often require expensive tooling and test fixturing. Wide field (fish-eye, hemispheric and hyperhemispheric) optical systems create unique challenges because of the inherent distortion, and difficulty in controlling reflections from front-lit high resolution test targets over the hemisphere. We present a unique design for a test fixture that uses low-cost manufacturing methods and equipment such as 3D printing and an Arduino processor to control back-lit multi-color (VIS/NIR) targets and sources. Special care with LED drive electronics is required to accommodate both global and rolling shutter sensors.
Gyrocopter-Based Remote Sensing Platform
NASA Astrophysics Data System (ADS)
Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.
2015-04-01
In this paper the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications utilizing a gyrocopter as a carrier platform is described. The current sensor configuration consists of a high resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover a custom-developed thermal imaging system composed of a VIS-PAN camera and a LWIR-camera is used for aerial recordings in the thermal infrared range. Furthermore another custom-developed highly flexible imaging system for high resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities and the precalculated demands with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched to mosaics.
Geometric Calibration of Full Spherical Panoramic Ricoh-Theta Camera
NASA Astrophysics Data System (ADS)
Aghayari, S.; Saadatseresht, M.; Omidalizarandi, M.; Neumann, I.
2017-05-01
A novel calibration process for the RICOH-THETA, a full-view fisheye camera, is proposed; the camera has numerous applications as a low-cost sensor in different disciplines such as photogrammetry, robotics, machine vision and so on. Ricoh developed this camera in 2014; it consists of two lenses and is able to capture the whole surrounding environment in one shot. In this research, each lens is calibrated separately and the interior/relative orientation parameters (IOPs and ROPs) of the camera are determined on the basis of a designed calibration network imaged on the central and side images captured by the aforementioned lenses. Accordingly, the designed calibration network is considered as a free distortion grid and applied to the measured control points in image space as correction terms by means of bilinear interpolation. After applying the corresponding corrections, image coordinates are transformed to the unit sphere, an intermediate space between object space and image space, in the form of spherical coordinates. Afterwards, the IOPs and EOPs of each lens are determined separately through a statistical bundle adjustment procedure based on collinearity condition equations. Subsequently, the ROPs of the two lenses are computed from both sets of EOPs. Our experiments show that by applying a 3×3 free distortion grid, image measurement residuals diminish from 1.5 to 0.25 degrees on the aforementioned unit sphere.
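The mapping from a corrected image point to spherical coordinates on the unit sphere can be sketched with a generic equidistant fisheye model, in which the radial image distance is proportional to the polar angle (r = f·θ). This model is an illustrative assumption, not the actual RICOH-THETA projection or the paper's calibration code:

```python
import math

def fisheye_to_sphere(x, y, cx, cy, f):
    """Map an (already distortion-corrected) image point (x, y) to
    spherical coordinates, assuming an equidistant fisheye model with
    principal point (cx, cy) and focal length f (pixels): r = f * theta.
    Returns (theta, phi): polar angle from the optical axis and azimuth,
    both in radians, defining a direction on the unit sphere."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    theta = r / f             # polar angle
    phi = math.atan2(dy, dx)  # azimuth
    return theta, phi

# Illustrative point 600 px right of an assumed principal point, f = 600 px:
theta, phi = fisheye_to_sphere(1100.0, 500.0, 500.0, 500.0, 600.0)
print(round(theta, 3), round(phi, 3))  # prints 1.0 0.0
```

In the paper's workflow, such spherical coordinates would then feed the collinearity-based bundle adjustment.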
Non-contact finger vein acquisition system using NIR laser
NASA Astrophysics Data System (ADS)
Kim, Jiman; Kong, Hyoun-Joong; Park, Sangyun; Noh, SeungWoo; Lee, Seung-Rae; Kim, Taejeong; Kim, Hee Chan
2009-02-01
Authentication using finger vein patterns has substantial advantages over other biometrics. Because human vein patterns are hidden inside the skin and tissue, it is hard to forge a vein structure. But the conventional system using an NIR LED array has two drawbacks. First, direct contact with the LED array raises sanitary problems. Second, because of the discreteness of the LEDs, non-uniform illumination exists. We propose a non-contact finger vein acquisition system using an NIR laser and a laser line generator lens. The laser line generator lens makes an evenly distributed line laser from focused laser light. The line laser is aimed at the finger longitudinally. An NIR camera was used for image acquisition. 200 index finger vein images from 20 candidates were collected. The same finger vein pattern extraction algorithm was used to evaluate the two sets of images. Images acquired from the proposed non-contact system do not show any non-uniform illumination, in contrast with the conventional system. Matching results are also comparable to the conventional system. We developed a non-contact finger vein acquisition system that can prevent potential cross-contamination of skin diseases and can produce uniformly illuminated images, unlike the conventional system. With the benefit of non-contact operation, the proposed system shows almost equivalent performance compared with the conventional system.
Docosahexaenoic acid conjugated near-infrared fluorescence probe for in vivo early tumor diagnosis
NASA Astrophysics Data System (ADS)
Li, Siwen; Cao, Jie; Qin, Jingyi; Zhang, Xin; Achilefu, Samuel; Qian, Zhiyu; Gu, Yueqing
2013-02-01
Docosahexaenoic acid (DHA) is an omega-3 C22 natural fatty acid with six cis double bonds; as a constituent of membranes, it is used as a precursor for metabolic and biochemical pathways. In this manuscript, we describe the synthesis of the near-infrared (NIR) fluorescent ICG-Der-01-labeled DHA for in vitro and in vivo tumor targeting. The structure of the probe was intensively characterized by UV and MS. The in vitro and in vivo tumor-targeting abilities of the DHA-based NIR probes were investigated in MCF-7 cells and an MCF-7 xenograft mouse model by confocal microscopy and a CCD camera, respectively. Cytotoxicity was tested in MCF-7 tumor cells. The results show that the DHA-based NIR probes have high affinity for the tumor both in vitro and in vivo. In addition, we found that the DHA-based NIR probes have apparent cytotoxicity on MCF-7 cells, which suggests that DHA conjugated with other antitumor drugs could increase antitumor efficacy. Thus, DHA-ICG-Der-01 is a promising optical agent for the diagnosis of tumors, especially in their early stage.
Intraoperative near-infrared autofluorescence imaging of parathyroid glands.
Ladurner, Roland; Sommerey, Sandra; Arabi, Nora Al; Hallfeldt, Klaus K J; Stepp, Herbert; Gallwas, Julia K S
2017-08-01
To identify parathyroid glands intraoperatively by exposing their autofluorescence using near-infrared light. Fluorescence imaging was carried out during minimally invasive and open parathyroid and thyroid surgery. After identification, the parathyroid glands as well as the surrounding tissue were exposed to near-infrared (NIR) light with a wavelength of 690-770 nm using a modified Karl Storz near-infrared/indocyanine green (NIR/ICG) endoscopic system. Parathyroid tissue was expected to show near-infrared autofluorescence, captured in the blue channel of the camera. Whenever possible the visual identification of parathyroid tissue was confirmed histologically. In preliminary investigations, using the original NIR/ICG endoscopic system we noticed considerable interference of light in the blue channel overlying the autofluorescence. Therefore, we modified the light source by interposing additional filters. In a second series, we investigated 35 parathyroid glands from 25 patients. Twenty-seven glands were identified correctly based on NIR autofluorescence. Regarding the extent of autofluorescence, there were no noticeable differences between parathyroid adenomas, hyperplasia and normal parathyroid glands. In contrast, thyroid tissue, lymph nodes and adipose tissue revealed no substantial autofluorescence. Parathyroid tissue is characterized by showing autofluorescence in the near-infrared spectrum. This effect can be used to distinguish parathyroid glands from other cervical tissue entities.
Yang, Hualei; Yang, Xi; Heskel, Mary; Sun, Shucun; Tang, Jianwu
2017-04-28
Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the infrared camera based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). We found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
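The camera-NDVI described above combines visible and NIR reflectance into a single index. The abstract does not give the instrument's exact band arithmetic; the sketch below assumes the standard NDVI definition, (NIR - VIS)/(NIR + VIS), applied per pixel to co-registered band images, with hypothetical reflectance values:

```python
import numpy as np

def camera_ndvi(nir, vis):
    """Per-pixel NDVI = (NIR - VIS) / (NIR + VIS) from co-registered band images."""
    nir = np.asarray(nir, dtype=float)
    vis = np.asarray(vis, dtype=float)
    denom = nir + vis
    denom[denom == 0] = np.nan  # NDVI undefined where both bands are zero
    return (nir - vis) / denom

# Hypothetical reflectances: a leafy canopy pixel (high NIR, low visible)
# versus a bare-soil pixel (the two bands nearly equal)
canopy = camera_ndvi(np.array([0.50]), np.array([0.05]))  # high NDVI
soil = camera_ndvi(np.array([0.30]), np.array([0.25]))    # NDVI near zero
```

Canopy green-up raises NIR reflectance relative to the visible band, so camera-NDVI rises with leaf expansion, which is the basis of the phenology tracking described above.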
Vermeulen, Ph; Fernández Pierna, J A; van Egmond, H P; Zegers, J; Dardenne, P; Baeten, V
2013-09-01
In recent years, near-infrared (NIR) hyperspectral imaging has proved its suitability for quality and safety control in the cereal sector by allowing spectroscopic images to be collected at single-kernel level, which is of great interest to cereal control laboratories. Contaminants in cereals include, inter alia, impurities such as straw, grains from other crops, and insects, as well as undesirable substances such as ergot (sclerotium of Claviceps purpurea). For the cereal sector, the presence of ergot creates a high toxicity risk for animals and humans because of its alkaloid content. A study was undertaken, in which a complete procedure for detecting ergot bodies in cereals was developed, based on their NIR spectral characteristics. These were used to build relevant decision rules based on chemometric tools and on the morphological information obtained from the NIR images. The study sought to transfer this procedure from a pilot online NIR hyperspectral imaging system at laboratory level to a NIR hyperspectral imaging system at industrial level and to validate the latter. All the analyses performed showed that the results obtained using both NIR hyperspectral imaging cameras were quite stable and repeatable. In addition, a correlation higher than 0.94 was obtained between the predicted values obtained by NIR hyperspectral imaging and those supplied by the stereo-microscopic method which is the reference method. The validation of the transferred protocol on blind samples showed that the method could identify and quantify ergot contamination, demonstrating the transferability of the method. These results were obtained on samples with an ergot concentration of 0.02% which is less than the EC limit for cereals (intervention grains) destined for humans fixed at 0.05%.
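The decision rules built from the NIR spectral characteristics are chemometric models whose details are not given in the abstract. As a minimal illustration of the kind of per-kernel spectral classification involved, here is a nearest-centroid sketch over hypothetical three-band spectra (the band values and class labels are invented for the example):

```python
import numpy as np

def train_centroids(spectra, labels):
    """Mean NIR spectrum per class from labelled training kernels."""
    return {c: spectra[np.array(labels) == c].mean(axis=0) for c in sorted(set(labels))}

def classify(spectrum, centroids):
    """Assign a spectrum to the class with the nearest (Euclidean) mean spectrum."""
    return min(centroids, key=lambda c: np.linalg.norm(spectrum - centroids[c]))

# Invented 3-band spectra: ergot sclerotia are darker across the NIR bands than wheat
spectra = np.array([[0.80, 0.70, 0.75], [0.82, 0.72, 0.74],   # wheat kernels
                    [0.30, 0.25, 0.28], [0.32, 0.27, 0.26]])  # ergot bodies
labels = ["wheat", "wheat", "ergot", "ergot"]
centroids = train_centroids(spectra, labels)
print(classify(np.array([0.31, 0.26, 0.27]), centroids))  # → ergot
```

In the actual procedure such a spectral rule is combined with morphological information from the NIR images before a kernel is flagged as ergot.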
Integrated ExoMars PanCam, Raman, and close-up imaging field tests on AMASE 2009
NASA Astrophysics Data System (ADS)
Foss Amundsen, Hans Erik; Westall, Frances; Steele, Andrew; Vago, Jorge; Schmitz, Nicole; Bauer, Arnold; Cousins, Claire; Rull, Fernando; Sansano, Antonio; Midtkandal, Ivar
2010-05-01
Arctic Mars Analog Svalbard Expedition (AMASE) uses Mars analog field sites on the Arctic islands of Svalbard (Norway) for research within astrobiology and for testing of payload instruments for the Mars missions Mars Science Laboratory (MSL), ExoMars and Mars Sample Return. AMASE 2009 marked the seventh consecutive year of field testing. Instrument shakedowns were arranged to mimic rover operations on Mars and included the panoramic camera (PanCam), mineral and organic chemistry sensors (Raman-LIBS) and ground-penetrating radar (WISDOM) onboard ExoMars, together with the CheMin and SAM instruments onboard MSL, and testing of sampling and caching protocols using JPL's FIDO rover. Test sites included volcanic rocks within the Bockfjord Volcanic Complex (BVC) with carbonate deposits identical to those in ALH84001, and Carboniferous sandstones and paleosols at Ismåsestranda. In view of the 2018 ExoMars mission, field models of the PanCam and Raman instruments, as well as an Olympus E410 camera having similar technical specifications to the ExoMars Close-Up Imager (CLUPI), were used in an integrated exercise to characterise the geology and habitability of the different field sites. The BVC locality consisted of volcaniclastic sediments deposited on the flanks of the 1 Ma old Sverrefjell volcano. This volcano is constructed of primitive alkaline basalt with abundant mantle xenoliths. The sediments were a mixture of hyaloclastite, ash, volcanic bombs, lava detritus, and xenoliths (peridotites, granulites) deposited in a roughly laminated fashion on the slopes of the volcano. Late-stage carbonate deposits were also present. The Ismåsestranda locality consisted of fine-grained sandstone deposited in a littoral environment. The sandstones were characterised by a variety of sedimentary structures reflecting a marginal marine depositional environment. They were highly variegated in colour due to diagenetic remobilisation of trace elements.
PanCam made general context observations using the stereo Wide Angle Camera, taking images at 12 VIS-NIR wavelengths. More detailed images were made with the narrow-angle colour High Resolution Channel of PanCam (PanCam HRC). These images were complemented by colour images made at 50-7 cm distance from the rock targets by the CLUPI-simulator camera. Compositional information was provided by the Raman spectrometer. The images and analyses obtained from the instruments permitted preliminary characterisation of the geological context at the two test sites. However, full characterisation of the rocks using more than one site is necessary to correctly interpret the nature of the rocks and their environment of formation, especially in the case of the Ismåsestranda sediments. Joint testing of ExoMars, MSL and MSR instruments on AMASE provides a unique opportunity to high-grade instrument selection for future Mars missions and to foster collaboration between ESA and NASA teams towards the tandem launch of ExoMars and MAX-C in 2018.
Non-destructive clinical assessment of occlusal caries lesions using near-IR imaging methods.
Staninec, Michal; Douglas, Shane M; Darling, Cynthia L; Chan, Kenneth; Kang, Hobin; Lee, Robert C; Fried, Daniel
2011-12-01
Enamel is highly transparent in the near-IR (NIR) at wavelengths near 1,300 nm, and stains are not visible. The purpose of this study was to use NIR transillumination and optical coherence tomography (OCT) to estimate the severity of caries lesions on occlusal surfaces both in vivo and on extracted teeth. Extracted molars with suspected occlusal lesions were examined with OCT and polarization sensitive OCT (PS-OCT), and subsequently sectioned and examined with polarized light microscopy (PLM) and transverse microradiography (TMR). Teeth in test subjects with occlusal caries lesions that were not cavitated or visible on radiographs were examined using NIR transillumination at 1,310 nm using a custom built probe attached to an indium gallium arsenide (InGaAs) camera and a linear OCT scanner. After imaging, cavities were prepared using dye staining to guide caries removal and physical impressions of the cavities were taken. The lesion severity determined from OCT and PS-OCT scans in vitro correlated with the depth determined using PLM and TMR. Occlusal caries lesions appeared in NIR images with high contrast in vivo. OCT scans showed that most of the lesions penetrated to dentin and spread laterally below the sound enamel. This study demonstrates that both NIR transillumination and OCT are promising new methods for the clinical diagnosis of occlusal caries. Copyright © 2011 Wiley Periodicals, Inc.
Adaptive optics imaging of geographic atrophy.
Gocho, Kiyoko; Sarda, Valérie; Falah, Sabrina; Sahel, José-Alain; Sennlaub, Florian; Benchaboune, Mustapha; Ullern, Martine; Paques, Michel
2013-05-01
To report the findings of en face adaptive optics (AO) near-infrared (NIR) reflectance fundus flood imaging in eyes with geographic atrophy (GA). An observational clinical study of AO NIR fundus imaging was performed in 12 eyes of nine patients with GA, and in seven controls, using a flood illumination camera operating at 840 nm, in addition to routine clinical examination. To document short-term and midterm changes, AO imaging sessions were repeated in four patients (mean interval between sessions 21 days; median follow-up 6 months). As compared with scanning laser ophthalmoscope imaging, AO NIR imaging improved the resolution of the changes affecting the RPE. Multiple hyporeflective clumps were seen within and around GA areas. Time-lapse imaging revealed micrometric-scale details of the emergence and progression of areas of atrophy as well as the complex kinetics of some hyporeflective clumps. Such dynamic changes were observed within as well as outside atrophic areas. In eyes affected by GA, AO NIR imaging allows high-resolution documentation of the extent of RPE damage. It also revealed that a complex, dynamic process of redistribution of hyporeflective clumps throughout the posterior pole precedes and accompanies the emergence and progression of atrophy; these clumps are therefore probably also a biomarker of RPE damage. AO NIR imaging may, therefore, be of interest to detect the earliest stages, to document the retinal pathology and to monitor the progression of GA. (ClinicalTrials.gov number, NCT01546181.)
Rind-Like Features at a Meridiani Outcrop
NASA Technical Reports Server (NTRS)
2005-01-01
After months spent roving across a sea of rippled sands, Opportunity reached an outcrop in August 2005 and began investigating exposures of sedimentary rocks, intriguing rind-like features that appear to cap the rocks, and cobbles that dot the martian surface locally. Opportunity spent several sols analyzing a feature called 'Lemon Rind,' a thin surface layer covering portions of outcrop rocks poking through the sand north of 'Erebus Crater.' In images from the panoramic camera, Lemon Rind appears slightly different in color than surrounding rocks. It also appears to be slightly more resistant to wind erosion than the outcrop's interior. To obtain information on how this surface layer (or weathering rind) may have formed and how it compares to previously analyzed outcrops, Opportunity is using the microscopic imager, alpha particle X-ray spectrometer and Moessbauer spectrometer to analyze surfaces that have been brushed and ground with the rock abrasion tool. Scientists will compare these measurements with similar measurements made on the underlying rock material. This is a false-color composite generated by draping enhanced red-green-blue color from the panoramic camera's 753-nanometer, 535-nanometer and 482-nanometer filters over a high-fidelity violet, 432-nanometer-filter image. The image was acquired on martian day, or sol 552 (Aug. 13, 2005) around 11:55 a.m. local true solar time. In this representation, bright sulfur-bearing sedimentary rocks appear light tan to brown, depending on their degree of dust contamination, and small dark 'blueberries' and other much less dusty rock fragments appear as different shades of blue. Draping the color derived from the blue to near-infrared filters over the violet filter image results in a false color view with the sharpest color and morphology contrasts.
2003-07-07
KENNEDY SPACE CENTER, FLA. - On Launch Complex 17-B, Cape Canaveral Air Force Station, the Delta II Heavy launch vehicle carrying the second Mars Exploration Rover, Opportunity, is poised for launch after rollback of the Mobile Service Tower. Opportunity will reach Mars on Jan. 25, 2004. Together the two MER rovers, Spirit (launched June 10) and Opportunity, seek to determine the history of climate and water at two sites on Mars where conditions may once have been favorable to life. The rovers are identical. They will navigate themselves around obstacles as they drive across the Martian surface, traveling up to about 130 feet each Martian day. Each rover carries five scientific instruments including a panoramic camera and microscope, plus a rock abrasion tool that will grind away the outer surfaces of rocks to expose their interiors for examination. Each rover’s prime mission is planned to last three months on Mars.
2003-07-07
KENNEDY SPACE CENTER, FLA. - On Launch Complex 17-B, Cape Canaveral Air Force Station, the Delta II Heavy launch vehicle carrying the rover "Opportunity" for the second Mars Exploration Rover mission launches at 11:18:15 p.m. EDT. Opportunity will reach Mars on Jan. 25, 2004. Together the two MER rovers, Spirit (launched June 10) and Opportunity, seek to determine the history of climate and water at two sites on Mars where conditions may once have been favorable to life. The rovers are identical. They will navigate themselves around obstacles as they drive across the Martian surface, traveling up to about 130 feet each Martian day. Each rover carries five scientific instruments including a panoramic camera and microscope, plus a rock abrasion tool that will grind away the outer surfaces of rocks to expose their interiors for examination. Each rover’s prime mission is planned to last three months on Mars.
NASA Technical Reports Server (NTRS)
2006-01-01
While driving eastward toward the northwestern flank of 'McCool Hill,' the wheels of NASA's Mars Exploration Rover Spirit churned up the largest amount of bright soil discovered so far in the mission. This image from Spirit's panoramic camera (Pancam), taken on the rover's 788th Martian day, or sol, of exploration (March 22, 2006), shows the strikingly bright tone and large extent of the materials uncovered. Several days earlier, Spirit's wheels unearthed a small patch of light-toned material informally named 'Tyrone.' In images from Spirit's panoramic camera, 'Tyrone' strongly resembled both 'Arad' and 'Paso Robles,' two patches of light-toned soils discovered earlier in the mission. Spirit found 'Paso Robles' in 2005 while climbing 'Cumberland Ridge' on the western slope of 'Husband Hill.' In early January 2006, the rover discovered 'Arad' on the basin floor just south of 'Husband Hill.' Spirit's instruments confirmed that those soils had a salty chemistry dominated by iron-bearing sulfates. Spirit's Pancam and miniature thermal emission spectrometer examined this most recent discovery, and researchers will compare its properties with the properties of those other deposits. These discoveries indicate that salty, light-toned soil deposits might be widely distributed on the flanks and valley floors of the 'Columbia Hills' region in Gusev Crater on Mars. The salts, which are easily mobilized and concentrated in liquid solution, may record the past presence of water. So far, these enigmatic materials have generated more questions than answers, however, and as Spirit continues to drive across this region in search of a safe winter haven, the team continues to formulate and test hypotheses to explain the rover's most fascinating recent discovery. This view is an approximately true-color rendering that combines separate images taken through the Pancam's 753-nanometer, 535-nanometer, and 432-nanometer filters.
Bright Soil Near 'McCool' (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
While driving eastward toward the northwestern flank of 'McCool Hill,' the wheels of NASA's Mars Exploration Rover Spirit churned up the largest amount of bright soil discovered so far in the mission. This image from Spirit's panoramic camera (Pancam), taken on the rover's 788th Martian day, or sol, of exploration (March 22, 2006), shows the strikingly bright tone and large extent of the materials uncovered. Several days earlier, Spirit's wheels unearthed a small patch of light-toned material informally named 'Tyrone.' In images from Spirit's panoramic camera, 'Tyrone' strongly resembled both 'Arad' and 'Paso Robles,' two patches of light-toned soils discovered earlier in the mission. Spirit found 'Paso Robles' in 2005 while climbing 'Cumberland Ridge' on the western slope of 'Husband Hill.' In early January 2006, the rover discovered 'Arad' on the basin floor just south of 'Husband Hill.' Spirit's instruments confirmed that those soils had a salty chemistry dominated by iron-bearing sulfates. Spirit's Pancam and miniature thermal emission spectrometer examined this most recent discovery, and researchers will compare its properties with the properties of those other deposits. These discoveries indicate that salty, light-toned soil deposits might be widely distributed on the flanks and valley floors of the 'Columbia Hills' region in Gusev Crater on Mars. The salts, which are easily mobilized and concentrated in liquid solution, may record the past presence of water. So far, these enigmatic materials have generated more questions than answers, however, and as Spirit continues to drive across this region in search of a safe winter haven, the team continues to formulate and test hypotheses to explain the rover's most fascinating recent discovery. This image is a false-color rendering using Pancam's 753-nanometer, 535-nanometer, and 432-nanometer filters.
Opportunity Landing Spot Panorama (3-D Model)
NASA Technical Reports Server (NTRS)
2004-01-01
The rocky outcrop traversed by the Mars Exploration Rover Opportunity is visible in this three-dimensional model of the rover's landing site. Opportunity has acquired close-up images along the way, and scientists are using the rover's instruments to closely examine portions of interest. The white fragments that look crumpled near the center of the image are portions of the airbags. Distant scenery is displayed on a spherical backdrop or 'billboard' for context. Artifacts near the top rim of the crater are a result of the transition between the three-dimensional model and the billboard. Portions of the terrain model lacking sufficient data appear as blank spaces or gaps, colored reddish-brown for better viewing. This image was generated using special software from NASA's Ames Research Center and a mosaic of images taken by the rover's panoramic camera.
The rocky outcrop traversed by the Mars Exploration Rover Opportunity is also visible in a zoomed-in portion of this three-dimensional model of the rover's landing site.
Anderson, Adam L; Lin, Bingxiong; Sun, Yu
2013-12-01
This work first overviews a novel design, and prototype implementation, of a virtually transparent epidermal imagery (VTEI) system for laparo-endoscopic single-site (LESS) surgery. The system uses a network of multiple, micro-cameras and multiview mosaicking to obtain a panoramic view of the surgery area. The prototype VTEI system also projects the generated panoramic view on the abdomen area to create a transparent display effect that mimics equivalent, but higher risk, open-cavity surgeries. The specific research focus of this paper is on two important aspects of a VTEI system: 1) in vivo wireless high-definition (HD) video transmission and 2) multi-image processing-both of which play key roles in next-generation systems. For transmission and reception, this paper proposes a theoretical wireless communication scheme for high-definition video in situations that require extremely small-footprint image sensors and in zero-latency applications. In such situations the typical optimized metrics in communication schemes, such as power and data rate, are far less important than latency and hardware footprint that absolutely preclude their use if not satisfied. This work proposes the use of a novel Frequency-Modulated Voltage-Division Multiplexing (FM-VDM) scheme where sensor data is kept analog and transmitted via "voltage-multiplexed" signals that are also frequency-modulated. Once images are received, a novel Homographic Image Mosaicking and Morphing (HIMM) algorithm is proposed to stitch images from respective cameras, that also compensates for irregular surfaces in real-time, into a single cohesive view of the surgical area. In VTEI, this view is then visible to the surgeon directly on the patient to give an "open cavity" feel to laparoscopic procedures.
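The HIMM stitching algorithm itself is not detailed in the abstract, but any homographic mosaicking rests on mapping pixels of one camera's image into a common mosaic frame through a 3x3 projective transform. A minimal sketch of that mapping step (the homography matrix here is a hypothetical pure translation):

```python
import numpy as np

def apply_homography(H, pts):
    """Map (N, 2) image points through a 3x3 homography into the mosaic frame."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coordinates
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # divide out the projective scale

# Hypothetical homography: the second camera's image sits 100 px to the
# right of the first in the mosaic frame, i.e. a pure translation
H = np.array([[1.0, 0.0, 100.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
corners = apply_homography(H, [[0, 0], [50, 20]])  # maps (0,0)->(100,0), (50,20)->(150,20)
```

In a real mosaic, H would be estimated from point correspondences between overlapping views; warping every pixel of each camera through its homography and blending the results yields the single cohesive view of the surgical area that the abstract describes.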
FluoSTIC: miniaturized fluorescence image-guided surgery system
NASA Astrophysics Data System (ADS)
Gioux, Sylvain; Coutard, Jean-Guillaume; Berger, Michel; Grateau, Henri; Josserand, Véronique; Keramidas, Michelle; Righini, Christian; Coll, Jean-Luc; Dinten, Jean-Marc
2012-10-01
Over the last few years, near-infrared (NIR) fluorescence imaging has witnessed rapid growth and is already used in clinical trials for various procedures. However, most clinically compatible imaging systems are optimized for large, open-surgery procedures. Such systems cannot be employed during head and neck oncologic surgeries because the system is not able to image inside deep cavities or allow the surgeon access to certain tumors due to the large footprint of the system. We describe a miniaturized, low-cost, NIR fluorescence system optimized for clinical use during oral oncologic surgeries. The system, termed FluoSTIC, employs a miniature, high-quality, consumer-grade lipstick camera for collecting fluorescence light and a novel custom circular optical fiber array for illumination that combines both white light and NIR excitation. FluoSTIC maintains fluorescence imaging quality similar to that of current large-size imaging systems and is 22 mm in diameter and 200 mm in height and weighs less than 200 g.
Real-time interactive virtual tour on the World Wide Web (WWW)
NASA Astrophysics Data System (ADS)
Yoon, Sanghyuk; Chen, Hai-jung; Hsu, Tom; Yoon, Ilmi
2003-12-01
The Web-based virtual tour has become a desirable and in-demand application, yet it is challenging due to the nature of a web application's running environment, such as limited bandwidth and no guarantee of high computation power on the client side. The image-based rendering approach has attractive advantages over the traditional 3D rendering approach in such web applications. The traditional approach, such as VRML, requires a labor-intensive 3D modeling process and high bandwidth and computation power, especially for photo-realistic virtual scenes. QuickTime VR and IPIX, as examples of the image-based approach, use panoramic photos, and the virtual scenes can be generated directly from photos, skipping the modeling process. However, these image-based approaches may require special cameras or effort to take panoramic views, and they provide only fixed-point look-around and zooming in and out rather than 'walk-around', which is a very important feature for providing an immersive experience to virtual tourists. The Web-based virtual tour using Tour into the Picture employs pseudo-3D geometry with an image-based rendering approach to provide viewers with the immersive experience of walking around the virtual space using several snapshots of conventional photos.
Arsalan, Muhammad; Naqvi, Rizwan Ali; Kim, Dong Seop; Nguyen, Phong Ha; Owais, Muhammad; Park, Kang Ryoung
2018-01-01
The recent advancements in computer vision have opened new horizons for deploying biometric recognition algorithms in mobile and handheld devices. Similarly, iris recognition is now much needed in unconstraint scenarios with accuracy. These environments make the acquired iris image exhibit occlusion, low resolution, blur, unusual glint, ghost effect, and off-angles. The prevailing segmentation algorithms cannot cope with these constraints. In addition, owing to the unavailability of near-infrared (NIR) light, iris recognition in visible light environment makes the iris segmentation challenging with the noise of visible light. Deep learning with convolutional neural networks (CNN) has brought a considerable breakthrough in various applications. To address the iris segmentation issues in challenging situations by visible light and near-infrared light camera sensors, this paper proposes a densely connected fully convolutional network (IrisDenseNet), which can determine the true iris boundary even with inferior-quality images by using better information gradient flow between the dense blocks. In the experiments conducted, five datasets of visible light and NIR environments were used. For visible light environment, noisy iris challenge evaluation part-II (NICE-II selected from UBIRIS.v2 database) and mobile iris challenge evaluation (MICHE-I) datasets were used. For NIR environment, the institute of automation, Chinese academy of sciences (CASIA) v4.0 interval, CASIA v4.0 distance, and IIT Delhi v1.0 iris datasets were used. Experimental results showed the optimal segmentation of the proposed IrisDenseNet and its excellent performance over existing algorithms for all five datasets. PMID:29748495
CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models
NASA Astrophysics Data System (ADS)
Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli
2011-02-01
Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebrovasculature and the third leading cause of death in the United States. More than 80 percent of stroke cases are ischemic, caused by blockage of an artery in the brain by thrombosis or arterial embolism. Hence, development of an imaging technique to monitor cerebral ischemia and the effect of anti-stroke therapy is much needed. Near-infrared (NIR) optical tomography has great potential as a non-invasive imaging tool (due to its low cost and portability) to image embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebrovascular hemodynamics and brain injury. As compared to a fiber-based diffuse optical tomographic system, a CCD-camera-based system is more suitable for preclinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions based on tissue-phantom experimental data. We obtained good reconstructed images with two recently developed algorithms: (1) a depth compensation algorithm (DCA) and (2) a globally convergent method (GCM). We will demonstrate the volumetric tomographic reconstructions taken from tissue phantoms; the latter approach has great potential to determine and monitor the effect of anti-stroke therapies.
In-Process Thermal Imaging of the Electron Beam Freeform Fabrication Process
NASA Technical Reports Server (NTRS)
Taminger, Karen M.; Domack, Christopher S.; Zalameda, Joseph N.; Taminger, Brian L.; Hafley, Robert A.; Burke, Eric R.
2016-01-01
Researchers at NASA Langley Research Center have been developing the Electron Beam Freeform Fabrication (EBF3) metal additive manufacturing process for the past 15 years. In this process, an electron beam is used as a heat source to create a small molten pool on a substrate into which wire is fed. The electron beam and wire feed assembly are translated with respect to the substrate to follow a predetermined tool path. This process is repeated in a layer-wise fashion to fabricate metal structural components. In-process imaging has been integrated into the EBF3 system using a near-infrared (NIR) camera. The images are processed to provide thermal and spatial measurements that have been incorporated into a closed-loop control system to maintain consistent thermal conditions throughout the build. Other information in the thermal images is being used to assess quality in real time by detecting flaws in prior layers of the deposit. NIR camera incorporation into the system has improved the consistency of the deposited material and provides the potential for real-time flaw detection which, ultimately, could lead to the manufacture of better, more reliable components using this additive manufacturing process.
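The abstract describes a closed-loop control system driven by thermal image measurements but does not give the control law. As a purely illustrative sketch, a generic proportional-integral loop on a melt-pool width signal might look like the following (the `pi_controller` name, gains, and units are hypothetical, not NASA's actual implementation):

```python
# Illustrative only: the actual EBF3 control law and parameters are not
# published here; this is a generic PI loop on a melt-pool width signal.
def pi_controller(setpoint, measurements, kp=0.8, ki=0.1, u0=1.0):
    """Return beam-power commands (arbitrary units) for a stream of
    melt-pool width measurements derived from NIR thermal images."""
    integral, commands = 0.0, []
    for m in measurements:
        error = setpoint - m          # pool too narrow -> positive error
        integral += error             # accumulated error drives the I term
        commands.append(u0 + kp * error + ki * integral)
    return commands

# Toy run: pool starts narrow, so the controller raises power, then backs off.
cmds = pi_controller(setpoint=2.0, measurements=[1.5, 1.7, 1.9, 2.0])
print(cmds)
```

The point of the sketch is the feedback structure: spatial/thermal measurements extracted from each NIR frame close the loop on the heat input, which is what maintains consistent deposition conditions layer to layer.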
Shaul, Oren; Fanrazi-Kahana, Michal; Meitav, Omri; Pinhasi, Gad A; Abookasis, David
2017-11-10
Heat stress (HS) is a medical emergency defined by abnormally elevated body temperature that causes biochemical, physiological, and hematological changes. The goal of the present research was to detect variations in optical properties (absorption, reduced scattering, and refractive index coefficients) of mouse brain tissue during HS by using near-infrared (NIR) spatial light modulation. NIR spatial patterns with different spatial phases were used to differentiate the effects of tissue scattering from those of absorption. Decoupling optical scattering from absorption enabled the quantification of a tissue's chemical constituents (related to light absorption) and structural properties (related to light scattering). Technically, structured light patterns at low and high spatial frequencies of six wavelengths ranging between 690 and 970 nm were projected onto the mouse scalp surface while diffuse reflected light was recorded by a CCD camera positioned perpendicular to the mouse scalp. Concurrently to pattern projection, brain temperature was measured with a thermal camera positioned slightly off angle from the mouse head while core body temperature was monitored by thermocouple probe. Data analysis demonstrated variations from baseline measurements in a battery of intrinsic brain properties following HS.
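Separating the effects of scattering from absorption with patterns at several spatial phases is commonly done via three-phase demodulation in spatial frequency domain imaging; that this exact estimator was used in the study is our assumption. A NumPy sketch with a synthetic pattern of known modulation:

```python
import numpy as np

def demodulate_ac(i1, i2, i3):
    """Standard three-phase demodulation: recover the AC amplitude of a
    sinusoidal illumination pattern from images acquired at phase
    offsets of 0, 2*pi/3, and 4*pi/3."""
    return (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)

# Synthetic check: a pattern with DC level 1.0 and AC amplitude 0.4.
x = np.linspace(0, 1, 100)
f = 3.0                                    # spatial frequency (cycles/unit)
phases = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
imgs = [1.0 + 0.4 * np.cos(2 * np.pi * f * x + p) for p in phases]
ac = demodulate_ac(*imgs)
print(np.allclose(ac, 0.4))  # recovers the 0.4 modulation amplitude
```

The AC amplitude at low and high spatial frequencies is what lets absorption (dominant at low frequency) be decoupled from scattering (dominant at high frequency).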
Interdisciplinary scientist participation in the Phobos mission
NASA Technical Reports Server (NTRS)
1992-01-01
Data was acquired from VSK (2 wide-angle visible-NIR TV cameras at 0.4 to 0.6 micrometers and 0.8 to 1.1 micrometers, and a narrow-angle TV camera), KRFM (10-band UV-visible spectrometer at 0.3 to 0.6 micrometers and a 6-band radiometer at 5-50 micrometers), and ISM (a 128-channel NIR imaging spectrometer at 0.8 to 3 micrometers). These data provided improved mapping coverage of Phobos; improved mass, shape, and volume determinations, with the density shown to be lower than that of all known meteorites, suggesting a porous interior; evidence for a physically, spectrally and possibly compositionally heterogeneous surface; and proof that the spectral properties do not closely resemble those of unaltered carbonaceous chondrites, but show more resemblance to the spectra of altered mafic material. For Mars, the data show that the underlying rock type can be distinguished through the global dust cover; that the spectral properties and possibly composition vary laterally between and within the geologic provinces; that the surface physical properties vary laterally, and in many cases, the boundaries coincide with those of the geologic units; and the acquired data also demonstrate the value of reflectance spectroscopy and radiometry to the study of Martian geology.
Presentation Attack Detection for Iris Recognition System Using NIR Camera Sensor
Nguyen, Dat Tien; Baek, Na Rae; Pham, Tuyen Danh; Park, Kang Ryoung
2018-01-01
Among biometric recognition systems such as fingerprint, finger-vein, or face, the iris recognition system has proven to be effective for achieving a high recognition accuracy and security level. However, several recent studies have indicated that an iris recognition system can be fooled by using presentation attack images that are recaptured using high-quality printed images or by contact lenses with printed iris patterns. As a result, this potential threat can reduce the security level of an iris recognition system. In this study, we propose a new presentation attack detection (PAD) method for an iris recognition system (iPAD) using a near infrared light (NIR) camera image. To detect presentation attack images, we first localized the iris region of the input iris image using circular edge detection (CED). Based on the result of iris localization, we extracted the image features using deep learning-based and handcrafted-based methods. The input iris images were then classified into real and presentation attack categories using support vector machines (SVM). Through extensive experiments with two public datasets, we show that our proposed method effectively solves the iris recognition presentation attack detection problem and produces detection accuracy superior to previous studies. PMID:29695113
The Two Moons of Mars As Seen from 'Husband Hill'
NASA Technical Reports Server (NTRS)
2005-01-01
Taking advantage of extra solar energy collected during the day, NASA's Mars Exploration Rover Spirit settled in for an evening of stargazing, photographing the two moons of Mars as they crossed the night sky. Spirit took this succession of images at 150-second intervals from a perch atop 'Husband Hill' in Gusev Crater on martian day, or sol, 594 (Sept. 4, 2005), as the faster-moving martian moon Phobos was passing Deimos in the night sky. Phobos is the brighter object on the left and Deimos is the dimmer object on the right. The bright star Aldebaran and some other stars in the constellation Taurus are visible as star trails. Most of the other streaks in the image are the result of cosmic rays lighting up random groups of pixels in the camera. Scientists will use images of the two moons to better map their orbital positions, learn more about their composition, and monitor the presence of nighttime clouds or haze. Spirit took the five images that make up this composite with its panoramic camera using the camera's broadband filter, which was designed specifically for acquiring images under low-light conditions.
NASA Technical Reports Server (NTRS)
2007-01-01
Among the best examples of spectacular cross-bedding in Victoria Crater are the outcrops at Cape St. Mary, an approximately 15 m (45 foot) high promontory located along the western rim of the crater near the beginning of the rover's traverse around the rim. Like the Cape St. Vincent images, these Pancam super-resolution images have allowed scientists to discern that the rocks at Victoria Crater once represented a large dune field that migrated across this region. This is a Mars Exploration Rover Opportunity Panoramic Camera image mosaic acquired on sol 1213 (June 23, 2007), constructed from a mathematical combination of 32 different blue filter (480 nm) images.
Surface Stereo Imager on Mars, Side View
NASA Technical Reports Server (NTRS)
2008-01-01
This image is a view of NASA's Phoenix Mars Lander's Surface Stereo Imager (SSI) as seen by the lander's Robotic Arm Camera. This image was taken on the afternoon of the 116th Martian day, or sol, of the mission (September 22, 2008). The mast-mounted SSI, which provided the images used in the 360 degree panoramic view of Phoenix's landing site, is about 4 inches tall and 8 inches long. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
NASA Technical Reports Server (NTRS)
2004-01-01
This latest color 'postcard from Mars,' taken on Sol 5 by the panoramic camera on the Mars Exploration Rover Spirit, looks to the north. The apparent slope of the horizon is due to the several-degree tilt of the lander deck. On the left, the circular topographic feature dubbed Sleepy Hollow can be seen along with dark markings that may be surface disturbances caused by the airbag-encased lander as it bounced and rolled to rest. A dust-coated airbag is prominent in the foreground, and a dune-like object that has piqued the interest of the science team with its dark, possibly armored top coating, can be seen on the right.
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Arneson, H. M.; Farrand, W. H.; Goetz, W.; Hayes, A. G.; Herkenhoff, K.; Johnson, M. J.; Johnson, J. R.; Joseph, J.; Kinch, K.
2005-01-01
Introduction. The panoramic camera (Pancam) multispectral, stereoscopic imaging systems on the Mars Exploration Rovers Spirit and Opportunity [1] have acquired and downlinked more than 45,000 images (35 Gbits of data) over more than 700 combined sols of operation on Mars as of early January 2005. A large subset of these images were acquired as part of 26 large multispectral and/or broadband "albedo" panoramas (15 on Spirit, 11 on Opportunity) covering large ranges of azimuth (12 spanning 360°) and designed to characterize major regional color and albedo characteristics of the landing sites and various points along both rover traverses.
Stack of Layers at 'Payson' in Meridiani Planum
NASA Technical Reports Server (NTRS)
2006-01-01
The stack of fine layers exposed at a ledge called 'Payson' on the western edge of 'Erebus Crater' in Mars' Meridiani Planum shows a diverse range of primary and secondary sedimentary textures formed billions of years ago. These structures likely result from an interplay between windblown and water-involved processes. The panoramic camera (Pancam) on NASA's Mars Exploration Rover Opportunity acquired the exposures for this image on the rover's 749th Martian day (March 3, 2006). This view is an approximately true-color rendering mathematically generated from separate images taken through all of the left Pancam's 432-nanometer to 753-nanometer filters.
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true color image taken by the panoramic camera onboard the Mars Exploration Rover Spirit shows 'Adirondack,' the rover's first target rock. Spirit traversed the sandy martian terrain at Gusev Crater to arrive in front of the football-sized rock on Sunday, Jan. 18, 2004, just three days after it successfully rolled off the lander. The rock was selected as Spirit's first target because its dust-free, flat surface is ideally suited for grinding. Clean surfaces also are better for examining a rock's top coating. Scientists named the angular rock after the Adirondack mountain range in New York. The word Adirondack is Native American and means 'They of the great rocks.'
Marquette Island: A Distinct Mafic Lithology Discovered by Opportunity
NASA Technical Reports Server (NTRS)
Mittlefehldt, David W.; Gellert, R.; Herkenhoff, K. E.; Clark, B. C.; Cohen, B. A.; Fleischer, I.; Jolliff, B. L.; Klingelhoefer, G.; Ming, D. W.; Yingst, R. A.
2010-01-01
While rolling over the Meridiani Planum sedimentary terrane, the rover Opportunity has occasionally discovered large, >10 cm erratics. Most of these have proven to be meteorites [1], but one - Bounce Rock - is a martian basaltic rock similar in composition to the meteorite EETA79001 lithology B [2]. Presently, Opportunity is intensively investigating a ~30 cm tall rock named Marquette Island that may be a distinct type of martian mafic lithology. We report the results of its continuing investigation using the Microscopic Imager (MI), Mössbauer Spectrometer (MB), and Alpha Particle X-ray Spectrometer (APXS). A companion abstract discusses the results of Panoramic Camera (Pancam) imaging of the rock [3].
Stars in Orion as Seen from Mars
2004-03-11
Stars in the upper portion of the constellation Orion the Hunter, including the bright shoulder star Betelgeuse and Orion's three-star belt, appear in this image taken from the surface of Mars by the panoramic camera on NASA's rover Spirit. Spirit imaged stars on March 11, 2004, after it awoke during the martian night for a communication session with NASA's Mars Global Surveyor orbiter. This image is an eight-second exposure. Longer exposures were also taken. The images tested the capabilities of the rover for night-sky observations. Scientists will use the results to aid planning for possible future astronomical observations from Mars. http://photojournal.jpl.nasa.gov/catalog/PIA05546
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. Inside the Space Station Processing Facility, a technician begins checking the Cupola after its delivery and uncrating. It was shipped from Alenia Spazio in Turin, Italy, for the European Space Agency. A dome-shaped module with seven windows, the Cupola will give astronauts a panoramic view for observing many operations on the outside of the orbiting complex. The view out of the Cupola windows will enhance an arm operator's situational awareness, supplementing television camera views and graphics. It will provide external observation capabilities during spacewalks, docking operations and hardware surveys and for Earth and celestial studies. The Cupola is the final element of the Space Station core.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. Inside the Space Station Processing Facility, technicians begin checking the Cupola after its delivery and uncrating. It was shipped from Alenia Spazio in Turin, Italy, for the European Space Agency. A dome-shaped module with seven windows, the Cupola will give astronauts a panoramic view for observing many operations on the outside of the orbiting complex. The view out of the Cupola windows will enhance an arm operator's situational awareness, supplementing television camera views and graphics. It will provide external observation capabilities during spacewalks, docking operations and hardware surveys, and for Earth and celestial studies. The Cupola is the final element of the Space Station core.
Mapping the Apollo 17 Astronauts' Positions Based on LROC Data and Apollo Surface Photography
NASA Astrophysics Data System (ADS)
Haase, I.; Oberst, J.; Scholten, F.; Gläser, P.; Wählisch, M.; Robinson, M. S.
2011-10-01
The positions from where the Apollo 17 astronauts recorded panoramic image series, e.g. at the so-called "traverse stations", were precisely determined using ortho-images (0.5 m/pxl) as well as Digital Terrain Models (DTM) (1.5 m/pxl and 100 m/pxl) derived from Lunar Reconnaissance Orbiter Camera (LROC) data. Features imaged in the Apollo panoramas were identified in LROC ortho-images. Least-squares techniques were applied to angles measured in the panoramas to determine the astronaut's position to within the ortho-image pixel. The result of our investigation of Traverse Station 1 in the north-west of Steno Crater is presented.
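The paper's adjustment uses angles measured in the Apollo panoramas together with LROC ortho-images and DTMs; those details are not reproduced here. As a simplified planar sketch, least-squares resection from bearings to known landmarks reduces to a small linear system, since each bearing constrains the observer to a line through the landmark (all names and numbers below are illustrative):

```python
import numpy as np

def resect(landmarks, bearings):
    """Planar least-squares resection: each bearing (radians, clockwise
    from +y/"north") to a known landmark constrains the observer to a
    line; intersect all lines in the least-squares sense."""
    L = np.asarray(landmarks, float)
    d = np.column_stack([np.sin(bearings), np.cos(bearings)])  # unit dirs
    # Parallelism constraint cross(d, L - P) = 0 is linear in P:
    #   d_y * P_x - d_x * P_y = d_y * L_x - d_x * L_y
    A = np.column_stack([d[:, 1], -d[:, 0]])
    b = d[:, 1] * L[:, 0] - d[:, 0] * L[:, 1]
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Synthetic check: true station at (10, 20), three mapped features.
true_p = np.array([10.0, 20.0])
lms = np.array([[30.0, 40.0], [-5.0, 35.0], [25.0, -10.0]])
brg = [np.arctan2(lx - true_p[0], ly - true_p[1]) for lx, ly in lms]
print(resect(lms, brg))  # recovers approximately [10, 20]
```

With noisy angle measurements the same system is solved in the least-squares sense, which is how redundant panorama features pin the station position to within an ortho-image pixel.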
NASA Technical Reports Server (NTRS)
Ridd, M. K.
1984-01-01
Twenty-three missions were flown using the EPA's panoramic camera to obtain color and color infrared photographs of landslide and flood damage in Utah. From the state's point of view, there were many successes. The biggest single obstacle to smooth and continued performance was unavailable aircraft. The Memorandum of Understanding between the State of Utah, the Environmental Protection Agency, and the Center for Remote Sensing and Cartography is included along with forms for planning enviropod missions, for requesting flights, and for obtaining feedback from participating agencies.
NASA Technical Reports Server (NTRS)
2004-01-01
This image shows the patch of soil at the bottom of the shallow depression dubbed 'Laguna Hollow' where the Mars Exploration Rover Spirit will soon begin trenching. Scientists are intrigued by the clustering of small pebbles and the crack-like fine lines, which indicate a coherent surface that expands and contracts. A number of processes can cause materials to expand and contract, including cycles of heating and cooling; freezing and thawing; and rising and falling of salty liquids within a substance. This false-color image was created using the blue, green and infrared filters of the rover's panoramic camera. Scientists chose this particular combination of filters to enhance the heterogeneity of the martian soil.
Robust Feature Matching in Terrestrial Image Sequences
NASA Astrophysics Data System (ADS)
Abbas, A.; Ghuffar, S.
2018-04-01
Over the last decade, feature detection, description, and matching techniques have been widely exploited in various photogrammetric and computer vision applications, including 3D scene reconstruction, image stitching for panorama creation, image classification, and object recognition. However, terrestrial imagery of urban scenes contains various issues, including duplicate and identical structures (e.g., repeated windows and doors) that cause problems in the feature-matching phase and ultimately lead to failures, especially in camera pose and scene structure estimation. In this paper, we address the issue of ambiguous feature matching in urban environments due to repeating patterns.
Hanna, Matthew G; Monaco, Sara E; Cuda, Jacqueline; Xing, Juan; Ahmed, Ishtiaque; Pantanowitz, Liron
2017-09-01
Whole-slide imaging in cytology is limited when glass slides are digitized without z-stacks for focusing. Different vendors have started to provide z-stacking solutions to overcome this limitation. The Panoptiq imaging system allows users to create digital files combining low-magnification panoramic images with regions of interest (ROIs) that are imaged with high-magnification z-stacks. The aim of this study was to compare such panoramic images with conventional whole-slide images and glass slides for the tasks of screening and interpretation in cytopathology. Thirty glass slides, including 10 ThinPrep Papanicolaou tests and 20 nongynecologic cytology cases, were digitized with an Olympus BX45 integrated microscope with an attached Prosilica GT camera. ViewsIQ software was used for image acquisition and viewing. These glass slides were also scanned on an Aperio ScanScope XT at ×40 (0.25 μm/pixel) with 1 z-plane and were viewed with ImageScope software. Digital and glass slides were screened and dotted/annotated by a cytotechnologist and were subsequently reviewed by 3 cytopathologists. For panoramic images, the cytotechnologist manually created digital maps and selected representative ROIs to generate z-stacks at a higher magnification. After 3-week washout periods, panoramic images were compared with Aperio digital slides and glass slides. The Panoptiq system permitted fine focusing of thick smears and cell clusters. In comparison with glass slides, the average screening times were 5.5 and 1.8 times longer with Panoptiq and Aperio images, respectively, but this improved with user experience. There was no statistical difference in diagnostic concordance between all 3 modalities. Users' diagnostic confidence was also similar for all modalities. The Aperio whole-slide scanner with 1 z-plane scanning and the Panoptiq imaging system with z-stacking are both suitable for cytopathology screening and interpretation. 
However, ROI z-stacks do offer a superior mechanism for overcoming focusing problems commonly encountered with digital cytology slides. Unlike whole-slide imaging, the acquisition of representative z-stack images with the Panoptiq system requires a trained cytologist to create digital files. Cancer Cytopathol 2017;125:701-9. © 2017 American Cancer Society.
AO WFS detector developments at ESO to prepare for the E-ELT
NASA Astrophysics Data System (ADS)
Downing, Mark; Casali, Mark; Finger, Gert; Lewis, Steffan; Marchetti, Enrico; Mehrgan, Leander; Ramsay, Suzanne; Reyes, Javier
2016-07-01
ESO has a very active, ongoing AO WFS detector development program that not only meets the needs of the current crop of VLT instruments, but has also been actively involved in gathering requirements, planning, and developing detectors and controllers/cameras for the instruments in design and being proposed for the E-ELT. This paper provides an overall summary of the AO WFS detector requirements of the E-ELT instruments currently in design and the telescope focal units. This is followed by a description of the many interesting detector, controller, and camera developments underway at ESO to meet these needs: (a) the rationale behind, and plan to upgrade, the 240x240 pixel, 2000 fps, "zero noise", L3Vision CCD220 sensor based AONGC camera; (b) the status of the LGSD/NGSD high-QE, 3e- RoN, fast 700 fps, 1760x1680 pixel, visible CMOS imager and camera development; (c) the status of and development plans for the Selex SAPHIRA NIR eAPD and controller. Most of the instruments and detector/camera developments are described in more detail in other papers at this conference.
360° Film Brings Bombed Church to Life
NASA Astrophysics Data System (ADS)
Kwiatek, K.
2011-09-01
This paper explores how a computer-generated reconstruction of a church can be adapted to create a panoramic film that is presented in a panoramic viewer and also on a wrap-around projection system. It focuses on the fundamental principles of creating 360° films, not only in 3D modelling software, but also presents how to record 360° video using panoramic cameras inside the heritage site. These issues are explored in a case study of Charles Church in Plymouth, UK, which was bombed in 1941 and has never been rebuilt. The generation of a 3D model of the bombed church started from the creation of five spherical panoramas and the use of Autodesk ImageModeler software. The processed files were imported and merged together in Autodesk 3ds Max, where a visualisation of the ruin was produced. A number of historical images were found, and this collection enabled the process of a virtual reconstruction of the site. The aspect of merging two still or two video panoramas (one from 3D modelling software, the other recorded on site) from the same locations or with the same trajectories is also discussed. The prototype 360° non-linear film tells the narrative of a wartime wedding that occurred in this church. The film was presented on two 360° screens where members of the audience could make decisions on whether to continue the ceremony or whether to run away when the bombing of the church starts. 3D modelling software made it possible to render a number of different alternatives (360° images and 360° video). Immersive environments empower the visitor to imagine the building before it was destroyed.
Yang, Hualei; Yang, Xi; Heskel, Mary; ...
2017-04-28
Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the infrared camera based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). Here we found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
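The camera-NDVI itself follows the standard NDVI definition, a normalized difference of NIR and visible reflectance. A minimal sketch (the toy reflectance values are illustrative, not data from the study):

```python
import numpy as np

def camera_ndvi(nir, vis):
    """NDVI from co-registered NIR and visible-band camera images."""
    nir = np.asarray(nir, float)
    vis = np.asarray(vis, float)
    return (nir - vis) / (nir + vis + 1e-12)   # epsilon guards against 0/0

# Toy pixels: dense canopy reflects NIR strongly; senescent leaves or
# bare soil much less, so NDVI drops.
nir = np.array([0.50, 0.30])
vis = np.array([0.05, 0.25])
print(np.round(camera_ndvi(nir, vis), 2))
```

Tracking this per-pixel index day by day is what lets the camera resolve leaf expansion and relate greenness to chlorophyll and nitrogen content.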
Singh, M Suheshkumar; Yalavarthy, Phaneendra K; Vasu, R M; Rajan, K
2010-07-01
To assess the effect of ultrasound modulation of near infrared (NIR) light on the quantification of scattering coefficient in tissue-mimicking biological phantoms. A unique method to estimate the phase of the modulated NIR light making use of only time averaged intensity measurements using a charge coupled device camera is used in this investigation. These experimental measurements from tissue-mimicking biological phantoms are used to estimate the differential pathlength, in turn leading to estimation of optical scattering coefficient. A Monte-Carlo model based numerical estimation of phase in lieu of ultrasound modulation is performed to verify the experimental results. The results indicate that the ultrasound modulation of NIR light enhances the effective scattering coefficient. The observed effective scattering coefficient enhancement in tissue-mimicking viscoelastic phantoms increases with increasing ultrasound drive voltage. The same trend is noticed as the ultrasound modulation frequency approaches the natural vibration frequency of the phantom material. The contrast enhancement is less for the stiffer (larger storage modulus) tissue, mimicking tumor necrotic core, compared to the normal tissue. The ultrasound modulation of the insonified region leads to an increase in the effective number of scattering events experienced by NIR light, increasing the measured phase, causing the enhancement in the effective scattering coefficient. The ultrasound modulation of NIR light could provide better estimation of scattering coefficient. The observed local enhancement of the effective scattering coefficient, in the ultrasound focal region, is validated using both experimental measurements and Monte-Carlo simulations.
Real-time endoscopic guidance using near-infrared fluorescent light for thoracic surgery
NASA Astrophysics Data System (ADS)
Venugopal, Vivek; Stockdale, Alan; Neacsu, Florin; Kettenring, Frank; Frangioni, John V.; Gangadharan, Sidharta P.; Gioux, Sylvain
2013-03-01
Lung cancer is the leading cause of cancer death in the United States, accounting for 28% of all cancer deaths. Standard of care for potentially curable lung cancer involves preoperative radiographic or invasive staging, followed by surgical resection. With recent adjuvant chemotherapy and radiation studies showing a survival advantage in node-positive patients, it is crucial to accurately stage these patients surgically in order to identify those who may benefit. However, lymphadenectomy in lung cancer is currently performed without guidance, mainly due to the lack of tools permitting real-time, intraoperative identification of lymph nodes. In this study we report the design and validation of a novel, clinically compatible near-infrared (NIR) fluorescence thoracoscope for real-time intraoperative guidance during lymphadenectomy. A novel, NIR-compatible, clinical rigid endoscope has been designed and fabricated, and coupled to a custom source and a dual channel camera to provide simultaneous color and NIR fluorescence information to the surgeon. The device has been successfully used in conjunction with a safe, FDA-approved fluorescent tracer to detect and resect mediastinal lymph nodes during thoracic surgery on Yorkshire pigs. Taken together, this study lays the foundation for the clinical translation of endoscopic NIR fluorescence intraoperative guidance and has the potential to profoundly impact the management of lung cancer patients.
NASA Astrophysics Data System (ADS)
Pu, Yang; Alfano, Robert R.
2015-03-01
Near-infrared (NIR) dyes, which absorb and emit light in the 700 to 900 nm range, have several benefits in biological studies for one- and/or two-photon excitation with deeper tissue penetration. These molecules undergo vibrational and rotational motion during relaxation of the excited electronic states. Due to the less-than-ideal anisotropy behavior of NIR dyes, stemming from the fluorophores' elongated structures and short fluorescence lifetimes in the picosecond range, few efforts have been made to establish the theory of these dyes' time-resolved polarization dynamics. In this study, the depolarization of fluorescence due to emission from rotational deactivation in solution is measured with excitation by a linearly polarized femtosecond laser pulse and detection by a streak camera. The theory, experiment, and application of ultrafast fluorescence polarization dynamics and anisotropy are illustrated with examples of two of the most important medically relevant dyes. One is the NIR dye Indocyanine Green (ICG), which is compared with Fluorescein, a visible-range dye with a much longer lifetime. A set of first-order linear differential equations was developed to model the fluorescence polarization dynamics of the NIR dye in the picosecond range. Using this model, the important parameters of ultrafast polarization spectroscopy were identified: risetime, initial time, fluorescence lifetime, and rotation times.
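The anisotropy such a polarized streak-camera measurement yields follows the standard relations r(t) = (I_par - I_perp)/(I_par + 2 I_perp) and, for a single rotational mode, r(t) = r0 exp(-t/theta_rot). A NumPy sketch with illustrative parameter values (not ICG-specific data):

```python
import numpy as np

def anisotropy(i_par, i_perp):
    """Time-resolved fluorescence anisotropy r(t) from the parallel-
    and perpendicular-polarized decay components."""
    return (i_par - i_perp) / (i_par + 2.0 * i_perp)

# Single-exponential model: r(t) = r0 * exp(-t/theta_rot), total decay
# I(t) = exp(-t/tau_fl); r0 = 0.4 is the one-photon theoretical maximum.
t = np.linspace(0.0, 0.5, 50)            # ns (illustrative time axis)
r0, theta_rot, tau_fl = 0.4, 0.15, 0.6   # ns (illustrative values)
total = np.exp(-t / tau_fl)
r = r0 * np.exp(-t / theta_rot)
i_par = total * (1 + 2 * r) / 3          # standard polarized components
i_perp = total * (1 - r) / 3
print(np.allclose(anisotropy(i_par, i_perp), r))
```

The sketch shows why short lifetimes make this hard experimentally: when tau_fl and theta_rot are both in the picosecond range, the anisotropy decays before much signal is collected, which is the regime the streak camera addresses.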
NASA Astrophysics Data System (ADS)
Wong, Erwin
2000-03-01
Traditional methods of linear-based imaging limit the viewer to a single fixed-point perspective. By means of a single-lens, multiple-perspective mirror system, a 360-degree representation of the area around the camera is reconstructed. This reconstruction is used to overcome the limitations of a traditional camera by providing the viewer with many different perspectives. By constructing the mirror as a hemispherical surface with multiple focal lengths at various diameters on the mirror, and by placing a parabolic mirror overhead, a stereoscopic image can be extracted from the image captured by a high-resolution camera placed beneath the mirror. Image extraction and correction are performed by computer processing of the image obtained by the camera; the image presents up to five distinguishable viewpoints from which a computer can extrapolate pseudo-perspective data. Geometric and depth-of-field information can be extrapolated via comparison and isolation of objects within a virtual scene post-processed by the computer. Combining the data with scene-rendering software provides the viewer with the ability to choose a desired viewing position, multiple dynamic perspectives, and virtually constructed perspectives based on minimal existing data. An examination of the workings of the mirror relay system is provided, including possible image extrapolation and correction methods. Generation of virtual interpolated and constructed data is also discussed.
A 3D camera for improved facial recognition
NASA Astrophysics Data System (ADS)
Lewin, Andrew; Orchard, David A.; Scott, Andrew M.; Walton, Nicholas A.; Austin, Jim
2004-12-01
We describe a camera capable of recording 3D images of objects. It does this by projecting thousands of spots onto an object and then measuring the range to each spot by determining the parallax from a single frame. A second frame can be captured to record a conventional image, which can then be projected onto the surface mesh to form a rendered skin. The camera is able to locate the images of the spots to a precision of better than one tenth of a pixel, and from this it can determine range to an accuracy of less than 1 mm at 1 meter. The data can be recorded as a set of two images, and is reconstructed by forming a 'wire mesh' of range points and morphing the 2D image over this structure. The camera can be used to record the images of faces and reconstruct the shape of the face, which allows viewing of the face from various angles. This allows images to be more critically inspected for the purpose of identifying individuals. Multiple images can be stitched together to create full panoramic images of head-sized objects that can be viewed from any direction. The system is being tested with a graph matching system capable of fast and accurate shape comparisons for facial recognition. It can also be used with "models" of heads and faces to provide a means of obtaining biometric data.
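The quoted numbers are mutually consistent under a simple triangulation model: with spot centroids located to 0.1 pixel, range error grows quadratically with distance. A minimal sketch, with hypothetical focal-length and baseline values (not the instrument's actual parameters):

```python
def range_from_parallax(f_px, baseline_m, disparity_px):
    """Triangulated range to a projected spot: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

def range_error(f_px, baseline_m, disparity_px, sigma_d_px=0.1):
    """First-order range uncertainty from centroid precision:
    dZ = Z**2 * sigma_d / (f * B)."""
    z = range_from_parallax(f_px, baseline_m, disparity_px)
    return z * z * sigma_d_px / (f_px * baseline_m)

# Hypothetical system: 1000 px focal length, 10 cm projector-camera baseline.
z = range_from_parallax(1000.0, 0.1, 100.0)   # 1.0 m range
dz = range_error(1000.0, 0.1, 100.0)          # 0.001 m, i.e. 1 mm at 1 m
```

With these illustrative values, a 0.1-pixel centroid precision yields exactly the millimeter-at-a-meter accuracy the abstract reports.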
NASA Astrophysics Data System (ADS)
Abbas, O.; Fernández Pierna, J. A.; Dardenne, P.; Baeten, V.
2010-04-01
Since the BSE crisis, research has mainly concerned the detection, identification, and quantification of meat and bone meal, with an important focus on the development of new analytical methods. Microscopy-based spectroscopic methods (NIR microscopy, NIRM, and/or NIR hyperspectral imaging) have been proposed as complementary methods to the official method, optical microscopy. NIR spectroscopy offers the advantage of being rapid, accurate and independent of human analyst skills. The combination of an NIR detector and a microscope or a camera allows the collection of high-quality spectra for small feed particles larger than 50 μm. Several studies have demonstrated the clear potential of NIR microscopic methods for the detection of animal particles in both raw and sediment fractions. Samples are sieved and only the gross fraction (larger than 250 μm) is investigated. The proposed methodologies have been developed to assure, with an acceptable level of confidence (95%), the detection of at least one animal particle when a feed sample is adulterated at a level of 0.1%. NIRM and NIR hyperspectral imaging have been running under ISO 17025 accreditation at CRA-W since 2005. A quantitative NIRM approach has been developed in order to fulfill the new requirements of European Commission policies. The capabilities of the NIRM method have been improved: only the raw fraction is analyzed, both the gross and the fine fractions of the samples are considered, and the acquisition parameters are optimized (the aperture, the gap, and the composition of the animal feed). A mapping method for faster collection of spectra has also been developed. The aim of this work is to show the new advances in the analytical methods developed in the frame of the feed ban applied in Europe.
Hyperspectral imaging with near-infrared-enabled mobile phones for tissue oximetry
NASA Astrophysics Data System (ADS)
Lin, Jonathan L.; Ghassemi, Pejhman; Chen, Yu; Pfefer, Joshua
2018-02-01
Hyperspectral reflectance imaging (HRI) is an emerging clinical tool for characterizing spatial and temporal variations in blood perfusion and oxygenation for applications such as burn assessment, wound healing, retinal exams and intraoperative tissue viability assessment. Since clinical HRI-based oximeters often use near-infrared (NIR) light, NIR-enabled mobile phones may provide a useful platform for future point-of-care devices. Furthermore, quantitative NIR imaging on mobile phones may dramatically increase the availability and accessibility of medical diagnostics in low-resource settings. We have evaluated the potential for phone-based NIR oximetry imaging and elucidated factors affecting performance using devices from two different manufacturers, as well as a scientific CCD. A broadband light source and liquid crystal tunable filter were used for imaging at 10 nm bands from 650 to 1000 nm. Spectral sensitivity measurements indicated that mobile phones with standard NIR blocking filters had minimal response beyond 700 nm, whereas one modified phone showed sensitivity to 800 nm and another to 1000 nm. Red pixel channels showed the greatest sensitivity up to 800 nm, whereas all channels provided essentially equivalent sensitivity at longer wavelengths. Referencing of blood oxygenation levels was performed with a CO-oximeter. HRI measurements were performed using cuvettes filled with hemoglobin solutions of different oxygen saturation levels. Good agreement was seen between absorbance spectra measured with the mobile phones and a CCD camera for wavelengths below 900 nm. Saturation estimates showed root-mean-squared errors of 5.2% and 4.5% for the CCD and phone, respectively. Overall, this work provides strong evidence of the potential for mobile phones to provide quantitative spectral imaging in the NIR for applications such as oximetry, and generates practical insights into factors that impact performance as well as test methods for performance assessment.
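Oxygen saturation is commonly estimated from multi-band absorbance by linear unmixing against the extinction spectra of oxy- and deoxyhemoglobin. A minimal sketch of that approach (not necessarily the exact algorithm used in this study); the extinction values below are synthetic, not real hemoglobin coefficients:

```python
import numpy as np

def estimate_so2(absorbance, eps_hbo2, eps_hb):
    """Least-squares unmixing: A(lambda) ~ c1*eps_HbO2(lambda) + c2*eps_Hb(lambda),
    then SO2 = c1 / (c1 + c2)."""
    E = np.column_stack([eps_hbo2, eps_hb])
    c, *_ = np.linalg.lstsq(E, absorbance, rcond=None)
    return c[0] / (c[0] + c[1])

# Synthetic round trip with made-up (illustrative) extinction spectra
# sampled at four wavelength bands:
eps_hbo2 = np.array([0.3, 0.5, 0.9, 1.2])
eps_hb   = np.array([1.0, 0.8, 0.6, 0.4])
true_so2 = 0.7
a = true_so2 * eps_hbo2 + (1 - true_so2) * eps_hb
print(round(estimate_so2(a, eps_hbo2, eps_hb), 3))  # → 0.7
```

In a real oximeter the columns of `E` would be tabulated hemoglobin extinction coefficients at the imaged bands, and the per-pixel fit yields a saturation map.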
Binocular Multispectral Adaptive Imaging System (BMAIS)
2010-07-26
system for pilots that adaptively integrates shortwave infrared (SWIR), visible, near-IR (NIR), off-head thermal, and computer symbology/imagery into...respective areas. BMAIS is a binocular helmet-mounted imaging system that features dual shortwave infrared (SWIR) cameras, embedded image processors and...algorithms and fusion of other sensor sites such as forward looking infrared (FLIR) and other aircraft subsystems. BMAIS is attached to the helmet
Science Experiments of a Jupiter Trojan asteroid in the Solar Power Sail Mission
NASA Astrophysics Data System (ADS)
Okada, T.; Kebukawa, Y.; Aoki, J.; Kawai, Y.; Ito, M.; Yano, H.; Okamoto, C.; Matsumoto, J.; Bibring, J. P.; Ulamec, S.; Jaumann, R.; Iwata, T.; Mori, O.; Kawaguchi, J.
2017-12-01
A Jupiter Trojan asteroid mission using a large-area solar power sail (SPS) is under study at JAXA in collaboration with DLR and CNES. The asteroid will be investigated through remote sensing, followed by in situ, in-depth observations on the asteroid with a lander. A sample return is also being studied as an option. LUCY has been selected as NASA's future Discovery-class mission, which aims at understanding the diversity of Jupiter Trojans through multiple flybys, complementary to the SPS mission. The SPS is a candidate for Japan's next medium-class space science mission. The 1.4-ton spacecraft will carry a 100-kg-class lander and 20 kg of mission payloads. Its launch is expected in the mid 2020s, and it will take at least 11 years to visit a Jupiter Trojan asteroid. During the cruise phase, science experiments will be performed, such as infrared astronomy, very long baseline gamma-ray interferometry, and dust and magnetic field measurements. A classical static model of the solar system suggests that the Jupiter Trojans were formed around the Jupiter region, while a dynamical model such as the Nice model indicates that they formed at the far end of the solar system and were then scattered inward due to a dynamical migration of the giant planets. The physical, mineralogical, organic and isotopic distributions with heliocentric distance could constrain their origin and the evolution of the solar system. Global mapping of the asteroid from the mothership will be conducted, including high-resolution imaging, NIR and TIR imaging spectrometry, and radar sounding. The lander will characterize the asteroid with geological, mineralogical, and geophysical observations using a panoramic camera, an infrared hyperspectral imager, a magnetometer, and a thermal radiometer. Samples collected by the lander will be measured by a high-resolution mass spectrometer (HRMS) to investigate isotopic ratios of hydrogen, nitrogen, and oxygen, as well as organic species.
An in vitro comparison of subjective image quality of panoramic views acquired via 2D or 3D imaging.
Pittayapat, P; Galiti, D; Huang, Y; Dreesen, K; Schreurs, M; Souza, P Couto; Rubira-Bullen, I R F; Westphalen, F H; Pauwels, R; Kalema, G; Willems, G; Jacobs, R
2013-01-01
The objective of this study is to compare subjective image quality and diagnostic validity of cone-beam CT (CBCT) panoramic reformatting with digital panoramic radiographs. Four dry human skulls and two formalin-fixed human heads were scanned using nine different CBCTs, one multi-slice CT (MSCT) and one standard digital panoramic device. Panoramic views were generated from the CBCTs at four slice thicknesses. Seven observers scored image quality and visibility of 14 anatomical structures. Four observers repeated the observation after 4 weeks. Digital panoramic radiographs showed significantly better visualization of anatomical structures except for the condyle. Statistical analysis of image quality showed that the 3D imaging modalities (CBCTs and MSCT) were 7.3 times more likely to receive poor scores than the 2D modality. Yet, image quality from the NewTom VGi® and 3D Accuitomo 170® was almost equivalent to that of digital panoramic radiographs, with respective odds ratio estimates of 1.2 and 1.6 at 95% Wald confidence limits. A substantial overall agreement amongst observers was found. Intra-observer agreement was moderate to substantial. While 2D panoramic images are significantly better for subjective diagnosis, two-thirds of the 3D-reformatted panoramic images are moderate or good for diagnostic purposes. Panoramic reformattings from particular CBCTs are comparable to digital panoramic images concerning overall image quality and visualization of anatomical structures. This clinically implies that a 3D-derived panoramic view can be generated for diagnosis, with a recommended 20-mm slice thickness, if CBCT data are already available for other purposes.
NASA Astrophysics Data System (ADS)
Zhang, Zhenhai; Li, Kejie; Wu, Xiaobing; Zhang, Shujiang
2008-03-01
An unwrapping and correction algorithm based on the Coordinate Rotation Digital Computer (CORDIC) method and bilinear interpolation is presented in this paper, with the purpose of processing dynamic panoramic annular images. An original annular panoramic image captured by a panoramic annular lens (PAL) can be unwrapped and corrected to a conventional rectangular image without distortion, which is much more consistent with human vision. The algorithm for panoramic image processing is modeled in VHDL and implemented in an FPGA. The experimental results show that the proposed unwrapping and distortion-correction algorithm has lower computational complexity, and that the architecture for dynamic panoramic image processing has lower hardware cost and power consumption. The proposed algorithm is thus shown to be valid.
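The unwrapping itself is a polar-to-rectangular resampling: each output column samples an azimuth, each row a radius between the annulus bounds, with bilinear interpolation at the back-projected source coordinates. A plain Python sketch of the mapping (using floating-point cos/sin where a hardware design would use CORDIC iterations):

```python
import numpy as np

def unwrap_annular(img, cx, cy, r_in, r_out, out_w, out_h):
    """Map an annular panoramic image to a rectangular panorama.
    Columns sample azimuth [0, 2*pi); rows sample radius [r_in, r_out].
    Pixel values are fetched with bilinear interpolation."""
    out = np.zeros((out_h, out_w), dtype=float)
    for v in range(out_h):
        r = r_in + (r_out - r_in) * v / (out_h - 1)
        for u in range(out_w):
            theta = 2.0 * np.pi * u / out_w
            # Back-project the output pixel into the annular source image.
            x = cx + r * np.cos(theta)
            y = cy + r * np.sin(theta)
            x0, y0 = int(np.floor(x)), int(np.floor(y))
            if 0 <= x0 < img.shape[1] - 1 and 0 <= y0 < img.shape[0] - 1:
                fx, fy = x - x0, y - y0
                out[v, u] = ((1 - fx) * (1 - fy) * img[y0, x0]
                             + fx * (1 - fy) * img[y0, x0 + 1]
                             + (1 - fx) * fy * img[y0 + 1, x0]
                             + fx * fy * img[y0 + 1, x0 + 1])
    return out
```

In the FPGA architecture described, the cos/sin rotation would be realized by iterative CORDIC shifts and adds, which avoids multipliers and keeps hardware cost low.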
Panoramic Images Mapping Tools Integrated Within the ESRI ArcGIS Software
NASA Astrophysics Data System (ADS)
Guo, Jiao; Zhong, Ruofei; Zeng, Fanyang
2014-03-01
Panoramic images have been widely studied since the appearance of Google Street View. Beyond 360-degree viewing of streets, many more applications can be realized with panoramic images. This paper develops a toolkit plugged into ArcGIS, which can view panoramic photographs at street level directly from ArcMap and measure and capture all visible elements such as frontages, trees and bridges. We use a series of panoramic images georeferenced with absolute coordinates through GPS and IMU. Two methods are presented in this paper to measure objects from these panoramic images: one is to intersect the object position through a stereo pair; the other is multichip matching, involving more than three images that all cover the object. When a user wants to measure an object, any two panoramic images that both contain the object can be chosen and displayed in ArcMap. We then calculate the correlation coefficient of the two chosen panoramic images so as to compute the coordinates of the object. Our study tests different patterns of panoramic pairs and compares the measurement results against the true values of the objects so as to recommend the best pairing. The article mainly elaborates the principles of calculating the correlation coefficient and of multichip matching.
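The correlation step can be sketched as a normalized cross-correlation between candidate patches from the two panoramas: a score near +1 marks the matching image location, which is then used for the spatial intersection. A minimal sketch (assuming equally sized grayscale patches, not the toolkit's actual implementation):

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation coefficient of two equally sized patches.
    Returns a value in [-1, 1]; +1 means a perfect (affine-brightness) match."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0
```

In practice the patch from the first panorama is slid along the candidate positions in the second, and the position maximizing `ncc` gives the conjugate point for triangulating the object's coordinates.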
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraser, Wesley C.; Brown, Michael E.; Glass, Florian, E-mail: wesley.fraser@nrc.ca
2015-05-01
Here, we present additional photometry of targets observed as part of the Hubble Wide Field Camera 3 (WFC3) Test of Surfaces in the Outer Solar System. Twelve targets were re-observed with the WFC3 in optical and NIR wavebands designed to complement those used during the first visit. Additionally, all of the observations originally presented by Fraser and Brown were reanalyzed through the same updated photometry pipeline. A re-analysis of the optical and NIR color distribution reveals a bifurcated optical color distribution and only two identifiable spectral classes, each of which occupies a broad range of colors and has correlated optical and NIR colors, in agreement with our previous findings. We report the detection of significant spectral variations on five targets which cannot be attributed to photometry errors, cosmic rays, point-spread function or sensitivity variations, or other image artifacts capable of explaining the magnitude of the variation. The spectrally variable objects are found to have a broad range of dynamical classes and absolute magnitudes, exhibit a broad range of apparent magnitude variations, and are found in both compositional classes. The spectrally variable objects with sufficiently accurate colors for spectral classification maintain their membership, belonging to the same class at both epochs. 2005 TV189 exhibits a difference in color between the two epochs broad enough to span the full range of colors of the neutral class. This strongly argues that the neutral class is one single class with a broad range of colors, rather than the combination of multiple overlapping classes.
NASA Technical Reports Server (NTRS)
2004-01-01
This Long Term Planning graphic was created from a mosaic of navigation camera images overlain by a polar coordinate grid with the center point as Opportunity's original landing site. The blue dots represent the rover position at various locations.
The red dots represent the center points of the target areas for the instruments on the rover mast (the panoramic camera and miniature thermal emission spectrometer). Opportunity visited Stone Mountain on Feb. 5. Stone Mountain was named after the southernmost point of the Appalachian Mountains outside of Atlanta, Ga. On Earth, Stone Mountain is the last big mountain before the Piedmont flatlands, and on Mars, Stone Mountain is at one end of Opportunity Ledge. El Capitan is a target of interest on Mars named after the second highest peak in Texas in Guadalupe National Park, which is one of the most visited outcrops in the United States by geologists. It has been a training ground for students and professional geologists to understand what the layering means in relation to the formation of Earth, and scientists will study this prominent point of Opportunity Ledge to understand what the layering means on Mars. The yellow lines show the midpoint where the panoramic camera has swept and will sweep a 120-degree area from the three waypoints on the tour of the outcrop. Imagine a fan-shaped wedge from left to right of the yellow line. The white contour lines are one meter apart, and each drive has been roughly 2-3 meters in length over the last few sols. The large white blocks are dropouts in the navigation camera data. Opportunity is driving along and taking a photographic panorama of the entire outcrop. Scientists will stitch together these images and use the new mosaic as a 'base map' to decide on geology targets of interest for a more detailed study of the outcrop using the instruments on the robotic arm. Once scientists choose their targets of interest, they plan to study the outcrop for roughly five to fifteen sols.
This will include El Capitan and probably one to two other areas. Blue Dot Dates: Sol 7 / Jan 31 = egress and first soil data collected by instruments on the arm; Sol 9 / Feb 2 = second soil target; Sol 12 / Feb 5 = first rock target; Sol 16 / Feb 9 = Alpha Waypoint; Sol 17 / Feb 10 = Bravo Waypoint; Sol 19 or 20 / Feb 12 or 13 = Charlie Waypoint.
NASA Astrophysics Data System (ADS)
Chen, Su-Chin; Hsiao, Yu-Shen; Chung, Ta-Hsien
2015-04-01
This study is aimed at determining landslide and driftwood potentials in the Shenmu area of Taiwan using an Unmanned Aerial Vehicle (UAV). High-resolution orthomosaics and digital surface models (DSMs) are obtained from several practical UAV surveys using a red-green-blue (RGB) camera and a near-infrared (NIR) camera, respectively. Several artificial aerial survey targets are used for ground control in photogrammetry. The algorithm in this study is based on logistic regression. Eight main factors (elevation, terrain slope, terrain aspect, terrain relief, terrain roughness, distance to roads, distance to rivers, and land use) are taken into consideration in our logistic regression model. The results from the UAV are compared with those from traditional photogrammetry. Overall, the study focuses on monitoring the distribution of areas with high landslide and driftwood potentials in the Shenmu area using fixed-wing UAV-borne RGB and NIR images. We further analyze the relationship between forests, landslides, disaster potentials and upper river areas.
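The logistic regression step maps each terrain cell's factor vector to a susceptibility probability. A minimal sketch with plain gradient descent; the toy factor matrix below is illustrative, not the study's data:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Plain gradient-descent logistic regression:
    susceptibility = sigmoid(X @ w + b), fit by minimizing log-loss."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_iter):
        p = sigmoid(X @ w + b)
        w -= lr * (X.T @ (p - y)) / len(y)   # gradient of mean log-loss in w
        b -= lr * (p - y).mean()             # gradient in the intercept
    return w, b

# Toy cells: columns could stand for slope, relief, distance-to-river, ...
X = np.array([[0.9, 0.8, 0.1], [0.8, 0.9, 0.2],
              [0.1, 0.2, 0.9], [0.2, 0.1, 0.8]])
y = np.array([1, 1, 0, 0])                   # 1 = landslide occurred
w, b = fit_logistic(X, y)
risk = sigmoid(X @ w + b)                    # per-cell landslide potential
```

Applied over a raster of the eight factors, the fitted model yields a susceptibility map from which high-potential zones can be delineated.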
Constraining Aerosol Properties with the Spectrally-Resolved Phase Function of Pluto's Hazes
NASA Astrophysics Data System (ADS)
Parker, A. H.; Howett, C.; Olkin, C.; Protopapa, S.; Grundy, W. M.; Gladstone, R.; Young, L. A.; Horst, S. M.; Weaver, H. A., Jr.; Moore, J. M.; Ennico Smith, K.; Stern, A.
2017-12-01
The Multi-spectral Visible Imaging Camera (MVIC) and Lisa Hardaway Infrared Mapping Spectrometer (LEISA) aboard New Horizons imaged Pluto at high phase throughout departure from the system in July of 2015. The repeated MVIC color scans captured the phase behavior of Pluto's atmospheric hazes through phase angles of 165.0 to 169.5 degrees in four bandpasses in the visible and NIR. A spatially-resolved departure LEISA scan delivered moderate SNR NIR spectra of the hazes over wavelengths from 1.25 - 2.5 microns. Here we present our analysis of the departure MVIC and LEISA data, extracting high precision color phase curves of the hazes using the most up-to-date radiometric calibration and NIR gain drift corrections. We interpret these phase curves and spectra using Mie theory to constrain the size and composition of haze particles, with results indicating broad similarity to Titan aerosol analogues ("tholins"). Finally, we will explore the implications of the nature of these haze particles for the evolution of Pluto's surface as they settle out onto it over time.
VizieR Online Data Catalog: NGC 1893 optical and NIR photometry (Prisinzano+, 2011)
NASA Astrophysics Data System (ADS)
Prisinzano, L.; Sanz-Forcada, J.; Micela, G.; Caramazza, M.; Guarcello, M. G.; Sciortino, S.; Testi, L.
2010-10-01
We present new optical and NIR photometric data in the VRIJHK and H-α bands for the cluster NGC 1893. The optical photometry was obtained using images acquired in service mode with two different telescopes: the Device Optimized for the LOw RESolution (DOLORES), mounted on the Telescopio Nazionale Galileo (TNG), used in service mode during three nights in 2007, and the Calar Alto Faint Object Spectrograph (CAFOS), mounted on the 2.2m telescope at the Calar Alto German-Spanish Observatory (Spain), during three nights in 2007 and 2008. NIR observations were acquired in service mode at the TNG, using the large-field Near Infrared Camera Spectrometer (NICS) with the Js(1.25um), H(1.63um) and K'(2.12um) filters during eight nights in 2007 and 2008. We observed a field around NGC 1893 with a raster of 4x4 pointings; at each pointing we obtained a series of NINT dithered exposures. Each exposure is a repetition of a DIT (Detector Integration Time) times NDIT (number of DITs), to avoid saturation of the background. (4 data files).
NASA Astrophysics Data System (ADS)
Pérez Ramos, A.; Robleda Prieto, G.
2016-06-01
An indoor Gothic apse provides a complex environment for virtualization using imaging techniques due to its light conditions and architecture. Light entering through large windows, in combination with the apse shape, makes it difficult to find proper conditions for photo capture for reconstruction purposes. Thus, documentation techniques based on images are usually replaced by scanning techniques inside churches. Nevertheless, the need to use Terrestrial Laser Scanning (TLS) for indoor virtualization means a significant increase in the final surveying cost. So, in most cases, scanning techniques are used to generate dense point clouds. However, many Terrestrial Laser Scanner (TLS) internal cameras are not able to provide colour images or cannot reach the image quality that can be obtained using an external camera. Therefore, external quality images are often used to build high-resolution textures for these models. This paper aims to solve the problem posed by virtualizing indoor Gothic churches, making that task more affordable using exclusively image-based techniques. It reviews a previously proposed methodology using a DSLR camera with an 18-135 mm lens commonly used for close-range photogrammetry, and adds another one using an HDR 360° camera with four lenses that makes the task easier and faster in comparison with the previous one. Fieldwork and office work are simplified. The proposed methodology provides photographs in good enough conditions for building point clouds and textured meshes. Furthermore, the same imaging resources can be used to generate more deliverables without extra time consumed in the field, for instance, immersive virtual tours. In order to verify the usefulness of the method, it was applied to the apse, since the apse is considered one of the most complex elements of Gothic churches, and the method could be extended to the whole building.
NASA Astrophysics Data System (ADS)
Romanishkin, Igor D.; Grachev, Pavel V.; Pominova, Daria V.; Burmistrov, Ivan A.; Sildos, Ilmo; Vanetsev, Alexander S.; Orlovskaya, Elena O.; Orlovskii, Yuri V.; Loschenov, Victor B.; Ryabova, Anastasia V.
2018-04-01
In this work we investigated the use of composite crystalline core/shell nanoparticles LaF3:Nd3+(1%)@DyPO4 for fluorescence-based contactless thermometry, as well as the laser-induced hyperthermia effect, in an optical model of biological tissue with a modeled neoplasm. In preparation for this, a thermal calibration of the nanoparticle luminescence spectra was carried out. The results of the spectroscopic temperature measurement were compared with infrared thermal camera measurements. The comparison showed a significant difference between the temperature recorded with the IR camera and the actual temperature of the nanoparticles at depth in the tissue model: the temperature calculated using the spectral method was up to 10 °C higher.
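A common way to calibrate such luminescence-based thermometry with lanthanide-doped particles is a Boltzmann-type intensity-ratio law, which can then be inverted to read temperature from a measured band ratio. This is a generic sketch of that approach, not necessarily the calibration used here; the constants are illustrative:

```python
import math

KB = 8.617333262e-5  # Boltzmann constant, eV/K

def temperature_from_ratio(ratio, a, delta_e_ev):
    """Invert a Boltzmann-type calibration R(T) = exp(a - dE/(kB*T)),
    where R is the intensity ratio of two thermally coupled emission bands."""
    return delta_e_ev / (KB * (a - math.log(ratio)))

# Round trip with illustrative calibration constants:
a, de = 2.0, 0.05            # dimensionless offset, energy gap in eV
T_true = 310.0               # K (body temperature)
R = math.exp(a - de / (KB * T_true))
T_est = temperature_from_ratio(R, a, de)
```

Once `a` and `delta_e_ev` are fixed by calibration against a reference thermometer, each measured spectrum yields a contactless temperature readout at the nanoparticle location, independent of surface IR emission.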
Spherical Images for Cultural Heritage: Survey and Documentation with the Nikon KM360
NASA Astrophysics Data System (ADS)
Gottardi, C.; Guerra, F.
2018-05-01
The work presented here focuses on the analysis of the potential of spherical images acquired with specific cameras for documentation and three-dimensional reconstruction of Cultural Heritage. Nowadays, thanks to the introduction of cameras able to generate panoramic images automatically, without requiring stitching software to join different photos, spherical images allow the documentation of spaces in an extremely fast and efficient way. In this particular case, the Nikon KeyMission 360 spherical camera was tested on the Tolentini cloister, which used to be part of the convent of the adjacent church and is now the location of the Iuav University of Venice. The aim of the research is to test the acquisition of spherical images with the KM360 and to compare the obtained photogrammetric models with data acquired from a laser scanning survey, in order to assess the metric accuracy and the level of detail achievable with this particular camera. This work is part of a wider research project that the Photogrammetry Laboratory of the Iuav University of Venice has been dealing with in the last few months; the final aim of this research project will be not only the comparison between 3D models obtained from spherical images and laser scanning survey techniques, but also the examination of their reliability and accuracy with respect to previous methods of generating spherical panoramas. At the end of the research work, we would like to obtain an operational procedure for spherical cameras applied to the metric survey and documentation of Cultural Heritage.
DEEP NEAR-IR OBSERVATIONS OF THE GLOBULAR CLUSTER M4: HUNTING FOR BROWN DWARFS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dieball, A.; Bedin, L. R.; Knigge, C.
2016-01-20
We present an analysis of deep Hubble Space Telescope (HST)/Wide Field Camera 3 near-IR (NIR) imaging data of the globular cluster (GC) M4. The best-photometry NIR color–magnitude diagram (CMD) clearly shows the main sequence extending toward the expected end of the hydrogen-burning limit and going beyond this point toward fainter sources. The white dwarf (WD) sequence can be identified. As such, this is the deepest NIR CMD of a GC to date. Archival HST optical data were used for proper-motion cleaning of the CMD and for distinguishing the WDs from brown dwarf (BD) candidates. Detection limits in the NIR are around F110W ≈ 26.5 mag and F160W ≈ 27 mag, and in the optical around F775W ≈ 28 mag. Comparing our observed CMDs with theoretical models, we conclude that we have reached beyond the H-burning limit in our NIR CMD and are probably just above or around this limit in our optical–NIR CMDs. Thus, any faint NIR sources that have no optical counterpart are potential BD candidates, since the optical data are not deep enough to detect them. We visually inspected the positions of NIR sources that are fainter than the H-burning limit in F110W and for which the optical photometry did not return a counterpart. We found in total five sources for which we did not get an optical measurement. For four of these five sources, a faint optical counterpart could be visually identified, and an upper optical magnitude was estimated. Based on these upper optical magnitude limits, we conclude that one source is likely a WD, one source could be either a WD or BD candidate, and the remaining two sources agree with being BD candidates. No optical counterpart could be detected for just one source, which makes this source a good BD candidate. We conclude that we found in total four good BD candidates.
Opportunity Examines Cracks and Coatings on Mars Rocks
NASA Technical Reports Server (NTRS)
2005-01-01
This false-color panoramic image, taken on martian day, or sol, 561 (Aug. 22, 2005) by NASA's Opportunity rover, shows the nature of the outcrop rocks that the rover is encountering on its southward journey across the martian plains to 'Erebus Crater.' The rocks, similar in make-up to those encountered earlier in the mission, display a clear pattern of cracks as well as rind-like features (identifiable as a light shade of blue to olive in the image) coating the outcrop surface. Prominent in the image are two holes (one on the rock, one on the rind) drilled with the rover's rock abrasion tool to facilitate chemical analysis of the underlying material. The reddish color around the holes is from iron-rich dust produced during the grinding operation. The rind, nicknamed 'Lemon Rind,' and the underlying rock, nicknamed 'Strawberry,' have turned out to be similar in overall chemistry and texture. Science team members are working to understand the nature of the relationship between these kinds of rocks and rinds on the Meridiani plains. This false-color composite was generated from a combination of 750-, 530-, and 430-nanometer filter images taken by the Opportunity panoramic camera, an instrument that has acquired more than 36,000 color filter images to date of martian terrain at Meridiani Planum.
VizieR Online Data Catalog: SN 2007on and SN 2011iv light curves (Gall+, 2018)
NASA Astrophysics Data System (ADS)
Gall, C.; Stritzinger, M. D.; Ashall, C.; Baron, E.; Burns, C. R.; Hoeflich, P.; Hsiao, E. Y.; Mazzali, P. A.; Phillips, M. M.; Filippenko, A. V.; Anderson, J. P.; Benetti, S.; Brown, P. J.; Campillay, A.; Challis, P.; Contreras, C.; Elias de La Rosa, N.; Folatelli, G.; Foley, R. J.; Fraser, M.; Holmbo, S.; Marion, G. H.; Morrell, N.; Pan, Y.-C.; Pignata, G.; Suntzeff, N. B.; Taddia, F.; Torres Robledo, S.; Valenti, S.
2017-11-01
Detailed optical and NIR light curves of SN 2007on obtained by the first phase of the Carnegie Supernova Project (CSP-I, 2004-2009; Hamuy et al., 2006PASP..118....2H) were published by Stritzinger et al. (2011, Cat. J/AJ/142/156). UV uvw2-, uvm2-, and uvw1-band imaging of both SN 2007on and SN 2011iv was obtained with Swift (+ UVOT). Photometry of SN 2007on and SN 2011iv was computed following the method described in detail by Brown et al. (2014Ap&SS.354...89B), who use the calibration published by Breeveld et al. (2011, AIPCS, 1358, 373). The Swift UVOT images and photometry are also available as part of the Swift Optical Ultraviolet Supernova Archive (SOUSA; Brown et al. 2014Ap&SS.354...89B). Optical ugriBV-band imaging of SN 2007on and SN 2011iv was obtained with the Henrietta Swope 1.0m telescope (+ SITe3 direct CCD camera) located at the Las Campanas Observatory (LCO). The NIR YJH-band imaging of SN 2007on was obtained with the Swope (+ RetroCam) and the Irenee du Pont 2.5m (+ WIRC: Wide Field Infrared Camera) telescopes (Stritzinger et al., Cat. J/AJ/142/156), while in the case of SN 2011iv all NIR YJH-band imaging was taken with RetroCam attached to the Irenee du Pont telescope. The optical local sequence is calibrated relative to Landolt (1992AJ....104..372L) (BV) and Smith et al. (2002AJ....123.2121S) (ugri) standard-star fields observed over multiple photometric nights. The NIR J-band and H-band local sequences were calibrated relative to the Persson et al. (1998AJ....116.2475P) standard stars, while the Y-band local sequence was calibrated relative to standard Y-band magnitudes computed using a combination of stellar atmosphere models (Castelli & Kurucz, 2003, IAUSymp, 210, A20) with the J-Ks colours of the Persson et al. standard-star catalogue (Hamuy et al., 2006PASP..118....2H). (5 data files).
Three Fresh Exposures, Enhanced Color
NASA Technical Reports Server (NTRS)
2004-01-01
This enhanced-color panoramic camera image from the Mars Exploration Rover Opportunity features three holes created by the rock abrasion tool between sols 143 and 148 (June 18 and June 23, 2004) inside 'Endurance Crater.' The enhanced image makes the red colors a little redder and blue colors a little bluer, allowing viewers to see differences too subtle to be seen without the exaggeration. When compared with an approximately true color image, the tailings from the rock abrasion tool and the interior of the abraded holes are more prominent in this view. Being able to discriminate color variations helps scientists determine rocks' compositional differences and texture variations. This image was created using the 753-, 535- and 432-nanometer filters.
Opportunity on 'Cabo Frio' (Simulated)
NASA Technical Reports Server (NTRS)
2006-01-01
This image superimposes an artist's concept of the Mars Exploration Rover Opportunity atop the 'Cabo Frio' promontory on the rim of 'Victoria Crater' in the Meridiani Planum region of Mars. It is done to give a sense of scale. The underlying image was taken by Opportunity's panoramic camera during the rover's 952nd Martian day, or sol (Sept. 28, 2006). This synthetic image of NASA's Opportunity Mars Exploration Rover at Victoria Crater was produced using 'Virtual Presence in Space' technology. Developed at NASA's Jet Propulsion Laboratory, Pasadena, Calif., this technology combines visualization and image processing tools with Hollywood-style special effects. The image was created using a photorealistic model of the rover and an approximately full-color mosaic.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. At the Space Station Processing Facility, a trailer delivers the Cupola, an element scheduled to be installed on the International Space Station in early 2009. It was shipped from Alenia Spazio in Turin, Italy, for the European Space Agency. A dome-shaped module with seven windows, the Cupola will give astronauts a panoramic view for observing many operations on the outside of the orbiting complex. The view out of the Cupola windows will enhance an arm operator's situational awareness, supplementing television camera views and graphics. It will provide external observation capabilities during spacewalks, docking operations and hardware surveys and for Earth and celestial studies. The Cupola is the final element of the Space Station core.
Spirit's First Grinding of a Rock on Mars
NASA Technical Reports Server (NTRS)
2004-01-01
The round, shallow depression in this image resulted from history's first grinding of a rock on Mars. The rock abrasion tool on NASA's Spirit rover ground off the surface of a patch 45.5 millimeters (1.8 inches) in diameter on a rock called Adirondack. The hole is 2.65 millimeters (0.1 inch) deep, exposing fresh interior material of the rock for close inspection with the rover's microscopic imager and two spectrometers on the robotic arm. This image was taken by Spirit's panoramic camera, providing a quick visual check of the success of the grinding. The rock abrasion tools on both Mars Exploration Rovers were supplied by Honeybee Robotics, New York, N.Y.
NASA Technical Reports Server (NTRS)
2004-01-01
This image mosaic illustrates how scientists use the color calibration targets (upper left) located on both Mars Exploration Rovers to fine-tune the rovers' sense of color. In the center, spectra, or light signatures, acquired in the laboratory of the colored chips on the targets are shown as lines. Actual data from Mars Exploration Rover Spirit's panoramic camera is mapped on top of these lines as dots. The plot demonstrates that the observed colors of Mars match the colors of the chips, and thus approximate the red planet's true colors. This finding is further corroborated by the picture taken on Mars of the calibration target, which shows the colored chips as they would appear on Earth.
Pancam multispectral imaging results from the Spirit Rover at Gusev crater
Bell, J.F.; Squyres, S. W.; Arvidson, R. E.; Arneson, H.M.; Bass, D.; Blaney, D.; Cabrol, N.; Calvin, W.; Farmer, J.; Farrand, W. H.; Goetz, W.; Golombek, M.; Grant, J. A.; Greeley, R.; Guinness, E.; Hayes, A.G.; Hubbard, M.Y.H.; Herkenhoff, K. E.; Johnson, M.J.; Johnson, J. R.; Joseph, J.; Kinch, K.M.; Lemmon, M.T.; Li, R.; Madsen, M.B.; Maki, J.N.; Malin, M.; McCartney, E.; McLennan, S.; McSween, H.Y.; Ming, D. W.; Moersch, J.E.; Morris, R.V.; Dobrea, E.Z.N.; Parker, T.J.; Proton, J.; Rice, J. W.; Seelos, F.; Soderblom, J.; Soderblom, L.A.; Sohl-Dickstein, J. N.; Sullivan, R.J.; Wolff, M.J.; Wang, A.
2004-01-01
Panoramic Camera images at Gusev crater reveal a rock-strewn surface interspersed with high- to moderate-albedo fine-grained deposits occurring in part as drifts or in small circular swales or hollows. Optically thick coatings of fine-grained ferric iron-rich dust dominate most bright soil and rock surfaces. Spectra of some darker rock surfaces and rock regions exposed by brushing or grinding show near-infrared spectral signatures consistent with the presence of mafic silicates such as pyroxene or olivine. Atmospheric observations show a steady decline in dust opacity during the mission, and astronomical observations captured solar transits by the martian moons, Phobos and Deimos, as well as a view of Earth from the martian surface.
View from Spirit's Overwintering Position (False Color)
NASA Technical Reports Server (NTRS)
2008-01-01
NASA's Mars Exploration Rover Spirit has this view northward from the position at the north edge of the 'Home Plate' plateau where the rover will spend its third Martian winter. Husband Hill is on the horizon. The dark area in the middle distance is 'El Dorado' sand dune field. Spirit used its panoramic camera (Pancam) to capture this image during the rover's 1,448th Martian day, or sol (Jan. 29, 2008). This view combines separate images taken through the Pancam filters centered on wavelengths of 753 nanometers, 535 nanometers and 432 nanometers. It is presented in a false-color stretch to bring out subtle color differences in the scene.
NASA Technical Reports Server (NTRS)
2004-01-01
This approximate true color image taken by the panoramic camera onboard the Mars Exploration Rover Spirit shows 'Adirondack,' the rover's first target rock. Spirit traversed the sandy martian terrain at Gusev Crater to arrive in front of the football-sized rock on Sunday, Jan. 18, 2004, just three days after it successfully rolled off the lander. The rock was selected as Spirit's first target because its dust-free, flat surface is ideally suited for grinding. Clean surfaces also are better for examining a rock's top coating. Scientists named the angular rock after the Adirondack mountain range in New York. The word Adirondack is Native American and is interpreted by some to mean 'They of the great rocks.'
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. Inside the Space Station Processing Facility, the Cupola is uncrated. It was shipped from Alenia Spazio in Turin, Italy, for the European Space Agency. The Cupola is an element scheduled to be installed on the International Space Station in early 2009. A dome-shaped module with seven windows, the Cupola will give astronauts a panoramic view for observing many operations on the outside of the orbiting complex. The view out of the Cupola windows will enhance an arm operator's situational awareness, supplementing television camera views and graphics. It will provide external observation capabilities during spacewalks, docking operations and hardware surveys and for Earth and celestial studies. The Cupola is the final element of the Space Station core.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. The Cupola, an element scheduled to be installed on the International Space Station in early 2009, arrives at KSC on the flatbed of a trailer. It was shipped from Alenia Spazio in Turin, Italy, for the European Space Agency. A dome-shaped module with seven windows, the Cupola will give astronauts a panoramic view for observing many operations on the outside of the orbiting complex. The view out of the Cupola windows will enhance an arm operator's situational awareness, supplementing television camera views and graphics. It will provide external observation capabilities during spacewalks, docking operations and hardware surveys and for Earth and celestial studies. The Cupola is the final element of the Space Station core.
NASA Technical Reports Server (NTRS)
2004-01-01
This false-color image shows the area within 'Endurance Crater,' currently being investigated by the Mars Exploration Rover Opportunity. The rover is inspecting a hole it drilled into a flat rock (center) dubbed 'Tennessee,' which scientists believe may be made up of the same evaporite-rich materials as those found in 'Eagle Crater.' The overall geography inside Endurance is more complex than scientists anticipated, with at least three distinct bands of rock visible in front of the rover. Scientists hope to investigate the second and third layers of rock for more clues to Mars' history. This image was taken on sol 133 (June 8, 2004) with the rover's panoramic camera, using the 750-, 530- and 430-nanometer filters.
VizieR Online Data Catalog: 76 T dwarfs from the UKIDSS LAS (Burningham+, 2013)
NASA Astrophysics Data System (ADS)
Burningham, B.; Cardoso, C. V.; Smith, L.; Leggett, S. K.; Smart, R. L.; Mann, A. W.; Dhital, S.; Lucas, P. W.; Tinney, C. G.; Pinfield, D. J.; Zhang, Z.; Morley, C.; Saumon, D.; Aller, K.; Littlefair, S. P.; Homeier, D.; Lodieu, N.; Deacon, N.; Marley, M. S.; van Spaandonk, L.; Baker, D.; Allard, F.; Andrei, A. H.; Canty, J.; Clarke, J.; Day-Jones, A. C.; Dupuy, T.; Fortney, J. J.; Gomes, J.; Ishii, M.; Jones, H. R. A.; Liu, M.; Magazzu, A.; Marocco, F.; Murray, D. N.; Rojas-Ayala, B.; Tamura, M.
2014-07-01
Our broad-band NIR photometry was obtained using the UKIRT Fast Track Imager (UFTI) and WFCAM, both mounted on UKIRT across a number of observing runs spanning 2009 to the end of 2010. Differential methane photometry was obtained using the Near Infrared Camera Spectrometer (NICS) mounted on the TNG under programme AOT22 TAC 96 spanning from 2010 to 2012. (5 data files).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohsawa, R.; Sakon, I.; Onaka, T.
2010-08-01
We present the results of near-infrared (NIR) multi-epoch observations of the optical transient in the nearby galaxy NGC 300 (NGC 300-OT) at 398 and 582 days after the discovery with the Infrared Camera (IRC) on board AKARI. NIR spectra (2-5 μm) of NGC 300-OT were obtained for the first time. They show no prominent emission nor absorption features, but are dominated by continuum thermal emission from the dust around NGC 300-OT. NIR images were taken in the 2.4, 3.2, and 4.1 μm bands. The spectral energy distributions (SEDs) of NGC 300-OT indicate a dust temperature of 810 ± 14 K at 398 days and 670 ± 12 K at 582 days. We attribute the observed NIR emission to the thermal emission from dust grains formed in the ejecta of NGC 300-OT. The multi-epoch observations enable us to estimate the dust optical depth as ≳12 at 398 days and ≳6 at 582 days at 2.4 μm by assuming an isothermal dust cloud. The observed NIR emission must be optically thick, unless the amount of dust grains increases with time. Little extinction at visible wavelengths reported in earlier observations suggests that the dust cloud around NGC 300-OT should be distributed inhomogeneously so as to not screen the radiation from the ejecta gas and the central star. The present results suggest the dust grains are not formed in a spherically symmetric geometry, but rather in a torus, a bipolar outflow, or clumpy cloudlets.
Brown dwarf distances and atmospheres: Spitzer Parallaxes and the Keck/NIRSPEC upgrade
NASA Astrophysics Data System (ADS)
Martin, Emily C.
2018-01-01
Advances in infrared technology have been essential towards improving our understanding of the solar neighborhood, revealing a large population of brown dwarfs, which span the mass regime between planets and stars. My thesis combines near-infrared (NIR) spectroscopic and astrometric analysis of nearby low-mass stars and brown dwarfs with instrumentation work to upgrade the NIRSPEC instrument for the Keck II Telescope. I will present results from a program using Spitzer/IRAC data to measure precise locations and distances to 22 of the coldest and closest brown dwarfs. These distances allow us to constrain absolute physical properties, such as mass, radius, and age, of free-floating planetary-mass objects through comparison to atmospheric and evolutionary models. NIR spectroscopy combined with the Spitzer photometry reveals a detailed look into the atmospheres of brown dwarfs and gaseous extrasolar planets. Additionally, I will discuss the improvements we are making to the NIRSPEC instrument at Keck. NIRSPEC is a NIR echelle spectrograph, capable of R~2000 and R~25,000 observations in the 1-5 μm range. As part of the upgrade, I performed detector characterization, optical design of a new slit-viewing camera, mechanical testing, and electronics design. NIRSPEC’s increased efficiency will allow us to obtain moderate- and high-resolution NIR spectra of objects up to a magnitude fainter than the current NIRSPEC design. Finally, I will demonstrate the utility of a NIR laser frequency comb as a high-resolution calibrator. This new technology will revolutionize precision radial velocity measurements in the coming decade.
Parallel-Processing Software for Creating Mosaic Images
NASA Technical Reports Server (NTRS)
Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric
2008-01-01
A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
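The slice-based parallelism described above can be sketched as follows. This is an illustrative toy, not the NTRS program: `warp_pixel` is a hypothetical stand-in for the camera-model lookup that maps each mosaic pixel back into the original images, and `multiprocessing.Pool` plays the role of the multiple CPUs to which slices are assigned before the results are gathered into the final mosaic.

```python
# Toy sketch of slice-based parallel mosaic assembly (hypothetical
# warp function; the real software uses full camera models and
# per-pixel 3-D correlation).
from multiprocessing import Pool
import math

WIDTH, HEIGHT = 64, 32  # mosaic dimensions for the sketch


def warp_pixel(x, y):
    """Stand-in for the camera-model lookup: return a brightness in
    [0, 1] for mosaic pixel (x, y). A real implementation would sample
    one or more source images here."""
    return (math.sin(x / 7.0) * math.cos(y / 5.0) + 1.0) / 2.0


def render_slice(rows):
    """Render one horizontal slice of the mosaic (rows r0..r1-1);
    each CPU runs this on its own slice."""
    r0, r1 = rows
    return [[warp_pixel(x, y) for x in range(WIDTH)] for y in range(r0, r1)]


def build_mosaic(n_workers=4):
    """Split the mosaic into slices, render them in parallel, and
    gather the results in order into the final mosaic."""
    step = HEIGHT // n_workers
    slices = [(i * step, (i + 1) * step) for i in range(n_workers)]
    with Pool(n_workers) as pool:
        parts = pool.map(render_slice, slices)  # ordered gather
    return [row for part in parts for row in part]


if __name__ == "__main__":
    mosaic = build_mosaic()
    print(len(mosaic), "rows x", len(mosaic[0]), "columns")
```

As the abstract notes, the wall-clock time of such a scheme scales with the number and speed of the CPUs and with how the slice data are staged to them.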
NV-CMOS HD camera for day/night imaging
NASA Astrophysics Data System (ADS)
Vogelsong, T.; Tower, J.; Sudol, Thomas; Senko, T.; Chodelka, D.
2014-06-01
SRI International (SRI) has developed a new multi-purpose day/night video camera with low-light imaging performance comparable to an image intensifier, while offering the size, weight, ruggedness, and cost advantages enabled by the use of SRI's NV-CMOS HD digital image sensor chip. The digital video output is ideal for image enhancement, sharing with others through networking, video capture for data analysis, or fusion with thermal cameras. The camera provides Camera Link output with HD/WUXGA resolution of 1920 x 1200 pixels operating at 60 Hz. Windowing to smaller sizes enables operation at higher frame rates. High sensitivity is achieved through use of backside illumination, providing high Quantum Efficiency (QE) across the visible and near infrared (NIR) bands (peak QE >90%), as well as projected low-noise (<2 e-) readout. Power consumption is minimized in the camera, which operates from a single 5V supply. The NV-CMOS HD camera provides a substantial reduction in size, weight, and power (SWaP), ideal for SWaP-constrained day/night imaging platforms such as UAVs, ground vehicles, and fixed-mount surveillance, and may be reconfigured for mobile soldier operations such as night vision goggles and weapon sights. In addition, the camera with the NV-CMOS HD imager is suitable for high-performance digital cinematography/broadcast systems, biofluorescence/microscopy imaging, day/night security and surveillance, and other high-end applications which require HD video imaging with high sensitivity and wide dynamic range. The camera comes with an array of lens mounts including C-mount and F-mount. The latest test data from the NV-CMOS HD camera will be presented.
A new bite block for panoramic radiographs of anterior edentulous patients: A technical report.
Park, Jong-Woong; Symkhampha, Khanthaly; Huh, Kyung-Hoe; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul
2015-06-01
Panoramic radiographs taken using conventional chin-support devices have often presented problems with positioning accuracy and reproducibility. The aim of this report was to propose a new bite block for panoramic radiographs of anterior edentulous patients that better addresses these two issues. A new panoramic radiography bite block similar to the bite block for dentulous patients was developed to enable proper positioning stability for edentulous patients. The new bite block was designed and implemented in light of previous studies. The height of the new bite block was 18 mm and to compensate for the horizontal edentulous space, its horizontal width was 7 mm. The panoramic radiographs using the new bite block were compared with those using the conventional chin-support device. Panoramic radiographs taken with the new bite block showed better stability and bilateral symmetry than those taken with the conventional chin-support device. Patients also showed less movement and more stable positioning during panoramic radiography with the new bite block. Conventional errors in panoramic radiographs of edentulous patients could be caused by unreliability of the chin-support device. The newly proposed bite block for panoramic radiographs of edentulous patients showed better reliability. Further study is required to evaluate the image quality and reproducibility of images with the new bite block.
Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.
Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno
2008-11-17
The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
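The two corrections the study found most consequential can be sketched as below. This is an illustrative assumption-laden toy, not the authors' pipeline: the quadratic radial falloff model and its coefficient `k` are hypothetical stand-ins for a vignetting model fitted from flat-field images, and the normalized index is the generic NDVI form computed from the modified camera's NIR and red bands.

```python
# Illustrative sketch: radial vignetting correction followed by a
# normalized difference vegetation index (NDVI). Images are 2-D lists
# of band values; the falloff model and k=0.3 are made-up assumptions.
import math


def correct_vignetting(img, k=0.3):
    """Divide out a simple radial brightness falloff: pixels far from
    the optical center are boosted more than pixels near it."""
    h, w = len(img), len(img[0])
    cy, cx = (h - 1) / 2, (w - 1) / 2
    rmax = math.hypot(cy, cx)  # distance from center to a corner
    out = []
    for y, row in enumerate(img):
        out.append([v / (1.0 - k * (math.hypot(y - cy, x - cx) / rmax) ** 2)
                    for x, v in enumerate(row)])
    return out


def ndvi(nir, red):
    """Pixel-wise normalized difference index (NIR - red)/(NIR + red);
    the ratio form cancels much of the scene-illumination variation."""
    return [[(n - r) / (n + r) if (n + r) else 0.0
             for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir, red)]
```

The normalization built into the index is why, as the abstract reports, vignetting-corrected normalized indices were sufficient to compensate for varying illumination between acquisition dates.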
[Absorbed dose and the effective dose of panoramic temporomandibular joint radiography].
Matsuo, Ayae; Okano, Tsuneichi; Gotoh, Kenichi; Yokoi, Midori; Hirukawa, Akiko; Okumura, Shinji; Koyama, Syuji
2011-01-01
This study measured the radiation doses absorbed by the patient during panoramic temporomandibular joint radiography (Panoramic TMJ), Schüller's method and the Orbitoramus projection. The dose of the frontal view in Panoramic TMJ was compared to that with the Orbitoramus projection, and the lateral view in Panoramic TMJ was compared to that with Schüller's method. We measured the doses received by various organs and calculated the effective doses using the guidelines of the International Commission on Radiological Protection in Publication 103. Organ absorbed doses were measured using an anthropomorphic phantom, loaded with thermoluminescent dosimeters (TLD), located at 160 sensitive sites. The dose shows the sum value of irradiation on both the right and left sides. In addition, we set a few different exposure field sizes. The effective dose for a frontal view in Panoramic TMJ was 11 µSv, and that for the lateral view was 14 µSv. The dose to the lens of the eye with the Orbitoramus projection was 40 times higher than with the frontal view in Panoramic TMJ. Although the effective dose of the lateral view in Panoramic TMJ was 3 times higher than that of the small exposure field (10×10 cm on film) in Schüller's method, it was the same as that of a mid-sized exposure field. When the inferior third of the exposure field was excluded during Panoramic TMJ, the effective dose could be decreased. Therefore we recommend that the size of the exposure field in Panoramic TMJ be decreased.
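The effective-dose calculation the study performs is the ICRP Publication 103 weighted sum E = Σ w_T·H_T over organ equivalent doses. A minimal sketch, with the caveat that the organ doses below are made-up illustrative numbers, not the paper's TLD measurements (only the tissue weighting factors shown are the actual ICRP 103 values for those tissues):

```python
# ICRP 103 tissue weighting factors for a few head-and-neck tissues
# relevant to dental radiography (actual ICRP 103 values).
W_T = {"thyroid": 0.04, "salivary_glands": 0.01, "brain": 0.01,
       "bone_surface": 0.01, "skin": 0.01, "remainder": 0.12}


def effective_dose(organ_doses_uSv):
    """E = sum over tissues of (weighting factor x equivalent dose).
    Doses in microsieverts; only tissues listed in W_T contribute."""
    return sum(W_T[t] * h for t, h in organ_doses_uSv.items())


# Hypothetical organ equivalent doses (uSv) for one exposure; these
# are invented for illustration, not measured values from the study.
doses = {"thyroid": 80.0, "salivary_glands": 300.0, "brain": 20.0,
         "bone_surface": 50.0, "skin": 40.0, "remainder": 25.0}
E = effective_dose(doses)  # -> a single effective dose in uSv
```

Because each organ contributes w_T·H_T, shrinking the exposure field so that a high-dose organ (here the salivary glands) receives less radiation lowers E directly, which is the rationale behind the field-size reduction the authors recommend.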
A Comparison of the AVS-9 and the Panoramic Night Vision Goggles During Rotorcraft Hover and Landing
NASA Technical Reports Server (NTRS)
Szoboszlay, Zoltan; Haworth, Loran; Simpson, Carol
2000-01-01
A flight test was conducted to assess any differences in pilot-vehicle performance and pilot opinion between the use of a current generation night vision goggle (the AVS-9) and one variant of the prototype panoramic night vision goggle (the PNVGII). The panoramic goggle has more than double the horizontal field-of-view of the AVS-9, but reduced image quality. Overall the panoramic goggles compared well to the AVS-9 goggles. However, pilot comment and data are consistent with the assertion that some of the benefits of additional field-of-view with the panoramic goggles were negated by the reduced image quality of the particular variant of the panoramic goggles tested.
NASA Astrophysics Data System (ADS)
Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi
2013-06-01
Due to increasing security concerns, a complete security system should consist of two major components: a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high-resolution imagery for real-time behavior understanding, research on automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the intrinsic parameters of the PTZ cameras and their relative positions. Experimental results demonstrate that our proposed algorithm presents substantially reduced computational complexity and improved flexibility at the cost of slightly decreased pixel accuracy as compared to Chen and Wang's method [18].
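The core idea of such a polynomial mapping can be sketched as below: learn a polynomial that maps one PTZ camera's angle readings to another's from matched observations of common targets, with no intrinsic parameters involved. The degree-2 model, the synthetic matched pan angles, and the single-axis treatment are illustrative assumptions, not the paper's exact unified polynomial model.

```python
# Sketch: fit a polynomial mapping between two PTZ cameras' pan angles
# from matched observations, without camera intrinsics. The synthetic
# data below are invented for illustration.
import numpy as np


def fit_ptz_mapping(src, dst, degree=2):
    """Least-squares fit of dst = P(src) for matched angle readings."""
    return np.polyfit(src, dst, degree)


def map_angle(coeffs, angle):
    """Predict the second camera's angle for a given first-camera angle."""
    return np.polyval(coeffs, angle)


# Matched pan angles: camera B reports a smoothly distorted version of
# camera A's pan angle for the same targets (hypothetical relation).
pan_a = np.linspace(-60.0, 60.0, 25)
pan_b = 0.9 * pan_a + 0.002 * pan_a ** 2 + 5.0
coeffs = fit_ptz_mapping(pan_a, pan_b)
```

Once fitted, directing camera B at a target seen by camera A is a single polynomial evaluation, which is why this style of mapping trades a little pixel accuracy for much lower computational cost than geometric methods that require calibrated intrinsics.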
NASA Technical Reports Server (NTRS)
2005-01-01
More than 1.5 years into their exploration of Mars, both of NASA's Mars Exploration Rovers continue to send a cornucopia of images to Earth. The results are so spectacular that Deputy Project Manager John Callas recently described them as 'an embarrassment of riches.' Spirit produced this image mosaic, nicknamed the 'Whale Panorama,' two-thirds of the way to the summit of 'Husband Hill,' where the rover investigated martian rocks. On the right side of the panorama is a tilted layer of rocks dubbed 'Larry's Outcrop,' one of several tilted outcrops that scientists examined in April 2005. They used spatial information to create geologic maps showing the compass orientation and degree of tilting of rock formations in the vicinity. Such information is key to geologic fieldwork because it helps establish if rock layers have been warped since they formed. In this case, scientists have also been studying the mineral and chemical differences, which show that some rocks have been more highly altered than others. In the foreground, in the middle of the image mosaic, Spirit is shown with the scientific instruments at the end of its robotic arm positioned on a rock target known as 'Ahab.' The rover was busy collecting elemental chemistry and mineralogy data on the rock at the same time that it was taking 50 individual snapshots with its five panoramic camera filters to create this stunning view of the martian scenery. The twin tracks of the rover's all-terrain wheels are clearly visible on the left. This mosaic of images spans about 220 degrees from left to right and is an approximate true-color rendering of the Mars terrain acquired through the panoramic camera's 750-, 530-, and 430-nanometer filters. Spirit collected these images from its 497th martian day, or sol, through its 500th sol (May 27 through May 30, 2005).
Bright Soil Near 'McCool' (3-D)
NASA Technical Reports Server (NTRS)
2006-01-01
While driving eastward toward the northwestern flank of 'McCool Hill,' the wheels of NASA's Mars Exploration Rover Spirit churned up the largest amount of bright soil discovered so far in the mission. This image from Spirit's panoramic camera (Pancam), taken on the rover's 788th Martian day, or sol, of exploration (March 22, 2006), shows the strikingly bright tone and large extent of the materials uncovered. Several days earlier, Spirit's wheels unearthed a small patch of light-toned material informally named 'Tyrone.' In images from Spirit's panoramic camera, 'Tyrone' strongly resembled both 'Arad' and 'Paso Robles,' two patches of light-toned soils discovered earlier in the mission. Spirit found 'Paso Robles' in 2005 while climbing 'Cumberland Ridge' on the western slope of 'Husband Hill.' In early January 2006, the rover discovered 'Arad' on the basin floor just south of 'Husband Hill.' Spirit's instruments confirmed that those soils had a salty chemistry dominated by iron-bearing sulfates. Spirit's Pancam and miniature thermal emission spectrometer examined this most recent discovery, and researchers will compare its properties with the properties of those other deposits. These discoveries indicate that salty, light-toned soil deposits might be widely distributed on the flanks and valley floors of the 'Columbia Hills' region in Gusev Crater on Mars. The salts, which are easily mobilized and concentrated in liquid solution, may record the past presence of water. So far, these enigmatic materials have generated more questions than answers, however, and as Spirit continues to drive across this region in search of a safe winter haven, the team continues to formulate and test hypotheses to explain the rover's most fascinating recent discovery. This stereo view combines images from the two blue (430-nanometer) filters in the Pancam's left and right 'eyes.' 
The image should be viewed using red-and-blue stereo glasses, with the red lens over your left eye.
Innovative uses of GigaPan Technology for Onsite and Distance Education
NASA Astrophysics Data System (ADS)
Bentley, C.; Schott, R. C.; Piatek, J. L.; Richards, B.
2013-12-01
GigaPans are gigapixel panoramic images that can be viewed at a wide range of magnifications, allowing users to explore them in various degrees of detail from the smallest scale to the full image extent. In addition to panoramic images captured with the GigaPan camera mount ('Dry Falls' - http://www.gigapan.com/gigapans/89093), users can also upload annotated images (For example, 'Massanutten sandstone slab with trace fossils (annotated)', http://www.gigapan.com/gigapans/124295) and satellite images (For example, 'Geology vs. Topography - State of Connecticut', http://www.gigapan.com/gigapans/111265). Panoramas with similar topics have been gathered together on the site in galleries, both user-generated and site-curated (For example, http://www.gigapan.com/galleries?categories=geology&page=1). Further innovations in display technology have also led to the development of improved viewers (for example, the annotations in the image linked above can be explored via paired viewers at http://coursecontent.nic.edu/bdrichards/gigapixelimages/callanview) GigaPan panoramas can be created through use of the GigaPan robotic camera mount and a digital camera (different models of the camera mount are available and work with a wide range of cameras). The camera mount can be used to create high-resolution pans ranging in scale from hand sample to outcrop up to landscape via the stitching software included with the robotic mount. The software can also be used to generate GigaPan images from other sources, such as thin section or satellite images, so these images can also be viewed with the online viewer. GigaPan images are typically viewed via a web-based interface that allows the user to interact with the image from the limits of the image detail up to the full panorama. After uploading, information can be added to panoramas with both text captions and geo-referencing (geo-located panoramas can then be viewed in Google Earth). 
Users can record specific locations and zoom levels in these images via "snapshots": these snapshots can direct others to the same location in the image as well as generate conversations with attached text comments. Users can also group related GigaPans by creating "galleries" of thematically related images (similar to photo albums). Gigapixel images can also be formatted for processing and viewing in an increasing number of platforms/modes as software vendors and internet browsers begin to provide 'add-in' support. This opens up opportunities for innovative adaptations for geoscience education (for example, http://coursecontent.nic.edu/bdrichards/gigapixelimages/dryfalls). Specific applications of these images for geoscience education include classroom activities and independent exercises that encourage students to take an active, inquiry-based approach to understanding geoscience concepts at multiple skill levels. GigaPans in field research serve as both records of field locations and additional datasets for detailed analyses, such as observing color changes or variations in grain size. Related GigaPans can also be presented together when embedded in webpages, useful for generating exercises for education purposes or for analyses of outcrops from the macro (landscape, outcrop) down to the micro scale (hand sample, thin section).
NASA Astrophysics Data System (ADS)
Linkin, V.; Harri, A.-M.; Lipatov, A.; Belostotskaja, K.; Derbunovich, B.; Ekonomov, A.; Khloustova, L.; Kremnev, R.; Makarov, V.; Martinov, B.; Nenarokov, D.; Prostov, M.; Pustovalov, A.; Shustko, G.; Järvinen, I.; Kivilinna, H.; Korpela, S.; Kumpulainen, K.; Lehto, A.; Pellinen, R.; Pirjola, R.; Riihelä, P.; Salminen, A.; Schmidt, W.; Siili, T.; Blamont, J.; Carpentier, T.; Debus, A.; Hua, C. T.; Karczewski, J.-F.; Laplace, H.; Levacher, P.; Lognonné, Ph.; Malique, C.; Menvielle, M.; Mouli, G.; Pommereau, J.-P.; Quotb, K.; Runavot, J.; Vienne, D.; Grunthaner, F.; Kuhnke, F.; Musmann, G.; Rieder, R.; Wänke, H.; Economou, T.; Herring, M.; Lane, A.; McKay, C. P.
1998-02-01
A mission to Mars including two Small Stations, two Penetrators and an Orbiter was launched at Baikonur, Kazakhstan, on 16 November 1996. This was called the Mars-96 mission. The Small Stations were expected to land in September 1997 (Ls approximately 178°), nominally in the Amazonis-Arcadia region at locations (33 N, 169.4 W) and (37.6 N, 161.9 W). The fourth stage of the Mars-96 launcher malfunctioned and hence the mission was lost. However, the state-of-the-art concept of the Small Station can be applied to future Martian lander missions. Also, from the manufacturing and performance point of view, the Mars-96 Small Station could be built as such at low cost, and be fairly easily accommodated on almost any forthcoming Martian mission. This is primarily due to the very simple interface between the Small Station and the spacecraft. The Small Station is a sophisticated piece of equipment. With a total available power of approximately 400 mW, the Station successfully supports an ambitious scientific program. The Station accommodates a panoramic camera, an alpha-proton-X-ray spectrometer, a seismometer, a magnetometer, an oxidant instrument, equipment for meteorological observations, and sensors for atmospheric measurement during the descent phase, including images taken by a descent phase camera. The total mass of the Small Station with payload on the Martian surface, including the airbags, is only 32 kg. Lander observations on the surface of Mars combined with data from Orbiter instruments will shed light on contemporary Mars and its evolution. As in the Mars-96 mission, specific science goals could be exploration of the interior and surface of Mars, investigation of the structure and dynamics of the atmosphere, the role of water and other materials containing volatiles, and in situ studies of atmospheric boundary layer processes. To achieve the scientific goals of the mission the lander should carry a versatile set of instruments.
The Small Station accommodates devices for atmospheric measurements, geophysical and geochemical studies of the Martian surface and interior, and cameras for descent phase and panoramic views. These instruments would be able to contribute remarkably to the process of solving some of the scientific puzzles of Mars.
'Non-standard' panoramic programmes and the unusual artefacts they produce.
Harvey, S; Ball, F; Brown, J; Thomas, B
2017-08-25
Dental panoramic radiographs (DPTs) are commonly taken in dental practice in the UK, with the number estimated to be 2.7 million per annum. They are used to diagnose caries, periodontal disease, trauma, pathology in the jaws, supernumerary teeth, and for orthodontic assessment. Panoramic radiographs are not simple projections but involve a moving X-ray source and detector plate. Ideally, only the objects in the focal trough are displayed. This is achieved with a tomographic movement and one or more centre(s) of rotation. One advantage of digital radiography is that hardware and software changes can optimise the image. This has led to increasingly complex, manufacturer-specific digital panoramic programmes. Panoramic radiographs suffer from ghost artefacts, which can limit their effectiveness and make interpretation difficult. Conversely, 'conventional dental imaging' such as intraoral bitewings does not suffer the same problems. There are also now several 'non-standard' panoramic programmes which aim to optimise the image for different clinical scenarios. These include 'improved interproximality', 'improved orthogonality' and 'panoramic bitewing mode'. This technical report shows that these 'non-standard' panoramic programmes can produce potentially confusing ghost artefacts of which the practitioner may not be aware.
RESOURCESAT-2: a mission for Earth resources management
NASA Astrophysics Data System (ADS)
Venkata Rao, M.; Gupta, J. P.; Rattan, Ram; Thyagarajan, K.
2006-12-01
The Indian Space Research Organisation (ISRO) established an operational remote sensing satellite system by launching its first satellite, IRS-1A, in 1988, followed by a series of IRS spacecraft. The IRS-1C/1D satellites, with their unique combination of payloads, have taken a lead position in the global remote sensing scenario. Recognising the growing user demand for a "multi-level" approach in terms of spatial, spectral, temporal and radiometric resolutions, ISRO identified Resourcesat as a continuation as well as an improved remote sensing satellite. Resourcesat-1 (IRS-P6) was launched in October 2003 using the PSLV launch vehicle and is in operational service. Resourcesat-2 is its follow-on mission, scheduled for launch in 2008. Each Resourcesat satellite carries three electro-optical cameras as its payload: LISS-3, LISS-4 and AWiFS. All three are multi-spectral push-broom scanners with linear-array CCDs as detectors. LISS-3 and AWiFS operate in four identical spectral bands in the VIS-NIR-SWIR range, while LISS-4 is a high-resolution camera with three spectral bands in the VIS-NIR range. In order to meet the stringent requirements of band-to-band registration and platform stability, several improvements have been incorporated in the mainframe bus configuration, such as wide-field star trackers, precision gyroscopes and an on-board GPS receiver. Resourcesat data find application in several areas, such as agricultural crop discrimination and monitoring, crop acreage/yield estimation, precision farming, water resources, forest mapping, rural infrastructure development and disaster management, to name a few. A brief description of the payload cameras, spacecraft bus elements, operational modes and a few applications is presented.
A custom hardware classifier for bruised apple detection in hyperspectral images
NASA Astrophysics Data System (ADS)
Cárdenas, Javier; Figueroa, Miguel; Pezoa, Jorge E.
2015-09-01
We present a custom digital architecture for bruised apple classification using hyperspectral images in the near-infrared (NIR) spectrum. The algorithm classifies each pixel in an image into one of three classes: bruised, non-bruised, and background. We extract two 5-element feature vectors for each pixel using only 10 out of the 236 spectral bands provided by the hyperspectral camera, thereby greatly reducing both the requirements of the imager and the computational complexity of the algorithm. We then use two linear-kernel support vector machines (SVMs) to classify each pixel. Each SVM was trained with 504 windows of 17×17 pixels per class, taken from 14 hyperspectral images of 320×320 pixels each. The architecture then computes the percentage of bruised pixels in each apple in order to adequately classify the fruit. We implemented the architecture on a Xilinx Zynq Z-7010 field-programmable gate array (FPGA) and tested it under laboratory conditions on images from a NIR N17E push-broom camera with a frame rate of 25 fps, a band-pixel rate of 1.888 MHz, and 236 spectral bands between 900 and 1700 nanometers. Using 28-bit fixed-point arithmetic, the circuit accurately discriminates 95.2% of the pixels corresponding to an apple, 81% of the pixels corresponding to a bruised apple, and 96.4% of the background. With the default threshold settings, the highest false positive (FP) rate for a bruised apple is 18.7%. The circuit operates at the native frame rate of the camera, consumes 67 mW of dynamic power, and uses less than 10% of the logic resources on the FPGA.
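As a rough illustration of the two-stage pixel classification described above, the sketch below trains two linear SVMs on synthetic 5-element feature vectors (scikit-learn's SVC standing in for the paper's fixed-point hardware; the class statistics, the 10% bruised-pixel threshold, and the helper name `classify_apple` are all assumptions for illustration):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-ins for the 5-element spectral feature vectors (per pixel,
# drawn from 10 of the camera's 236 NIR bands in the real system).
X_bruised = rng.normal(0.4, 0.05, (200, 5))
X_sound = rng.normal(0.6, 0.05, (200, 5))
X_background = rng.normal(0.1, 0.05, (200, 5))

# Stage 1: fruit vs. background.  Stage 2: bruised vs. sound fruit.
svm_fruit = SVC(kernel="linear").fit(
    np.vstack([X_bruised, X_sound, X_background]),
    np.array([1] * 400 + [0] * 200),  # 1 = fruit, 0 = background
)
svm_bruise = SVC(kernel="linear").fit(
    np.vstack([X_bruised, X_sound]),
    np.array([1] * 200 + [0] * 200),  # 1 = bruised, 0 = sound
)

def classify_apple(pixels, bruise_threshold=0.10):
    """Label a set of pixels as 'background', 'sound', or 'bruised',
    calling the apple bruised when the fraction of bruised fruit pixels
    exceeds the (assumed) threshold."""
    fruit_mask = svm_fruit.predict(pixels).astype(bool)
    if not fruit_mask.any():
        return "background"
    bruised_fraction = svm_bruise.predict(pixels[fruit_mask]).mean()
    return "bruised" if bruised_fraction > bruise_threshold else "sound"
```

In the paper's system the feature extraction and both classifiers run in 28-bit fixed-point logic on the FPGA; this sketch only mirrors the decision flow in floating point.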
Ripples in Rocks Point to Water
NASA Technical Reports Server (NTRS)
2004-01-01
This image taken by the Mars Exploration Rover Opportunity's panoramic camera shows the rock nicknamed 'Last Chance,' which lies within the outcrop near the rover's landing site at Meridiani Planum, Mars. The image provides evidence for a geologic feature known as ripple cross-stratification. At the base of the rock, layers can be seen dipping downward to the right. The bedding that contains these dipping layers is only one to two centimeters (0.4 to 0.8 inches) thick. In the upper right corner of the rock, layers also dip to the right, but exhibit a weak 'concave-up' geometry. These two features -- the thin, cross-stratified bedding combined with the possible concave geometry -- suggest small ripples with sinuous crest lines. Although wind can produce ripples, they rarely have sinuous crest lines and never form steep, dipping layers at this small scale. The most probable explanation for these ripples is that they were formed in the presence of moving water.
Crossbedding Evidence for Underwater Origin: Interpretations of cross-lamination patterns, presented as clues to this martian rock's origin under flowing water, are marked on images taken by the panoramic camera and microscopic imager on NASA's Opportunity. [Figures 1 and 2 removed for brevity; see original site.] The red arrows (Figure 1) point to features suggesting cross-lamination within the rock called 'Last Chance', imaged at a distance of 4.5 meters (15 feet) during Opportunity's 17th sol (February 10, 2004). The inferred sets of fine layers at angles to each other (cross-laminae) are up to 1.4 centimeters (half an inch) thick. For scale, the distance between two vertical cracks in the rock is about 7 centimeters (2.8 inches). The feature indicated by the middle red arrow suggests a pattern called trough cross-lamination, likely produced when flowing water shaped sinuous ripples in underwater sediment and pushed the ripples to migrate in one direction. The direction of the ancient flow would have been either toward or away from the line of sight from this perspective. The lower and upper red arrows point to cross-lamina sets that are consistent with underwater ripples in the sediment having moved in water that was flowing left to right from this perspective. The yellow arrows (Figure 2) indicate places in the panoramic camera view that correlate with places in the microscope's view of the same rock. [Figure 3 removed for brevity; see original site.] The microscopic view (Figure 3) is a mosaic of some of the 152 microscopic imager frames of 'Last Chance' that Opportunity took on sols 39 and 40 (March 3 and 4, 2004). [Figure 4 removed for brevity; see original site.] Figure 4 shows cross-lamination expressed by lines that trend downward from left to right, traced with black lines in the interpretive overlay.
These cross-lamination lines are consistent with dipping planes that would have formed surfaces on the down-current side of migrating ripples. Interpretive blue lines indicate boundaries between possible sets of cross-laminae.
Performance analysis of panoramic infrared systems
NASA Astrophysics Data System (ADS)
Furxhi, Orges; Driggers, Ronald G.; Holst, Gerald; Krapels, Keith
2014-05-01
Panoramic imagers are becoming more commonplace in the visible part of the spectrum. These imagers are often used in the real estate market, extreme sports, teleconferencing, and security applications. Infrared panoramic imagers, on the other hand, are not as common and only a few have been demonstrated. A panoramic image can be formed in several ways, using pan and stitch, distributed aperture, or omnidirectional optics. When omnidirectional optics are used, the detected image is a warped view of the world that is mapped on the focal plane array in a donut shape. The final image on the display is the mapping of the omnidirectional donut shape image back to the panoramic world view. In this paper we analyze the performance of uncooled thermal panoramic imagers that use omnidirectional optics, focusing on range performance.
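The donut-to-panorama mapping described above can be sketched as a polar-to-cartesian resampling; the function below is a minimal nearest-neighbour version (the function name, output dimensions, and row ordering are illustrative choices, not taken from the paper):

```python
import numpy as np

def unwarp_donut(img, cx, cy, r_in, r_out, out_w=720, out_h=90):
    """Map an omnidirectional 'donut' image to a panoramic strip by
    nearest-neighbour polar-to-cartesian resampling.
    (cx, cy): donut centre; r_in / r_out: inner and outer radii in pixels."""
    pano = np.zeros((out_h, out_w), dtype=img.dtype)
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radii = np.linspace(r_out, r_in, out_h)  # top row = outer ring
    for row, r in enumerate(radii):
        xs = (cx + r * np.cos(theta)).round().astype(int)
        ys = (cy + r * np.sin(theta)).round().astype(int)
        pano[row] = img[ys.clip(0, img.shape[0] - 1),
                        xs.clip(0, img.shape[1] - 1)]
    return pano
```

A production imager would use the lens's calibrated distortion profile and interpolated sampling rather than an ideal-circle, nearest-neighbour lookup, but the geometry of the display mapping is the same.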
Bird's-Eye View of Opportunity at 'Erebus' (Vertical)
NASA Technical Reports Server (NTRS)
2006-01-01
This view combines frames taken by the panoramic camera on NASA's Mars Exploration Rover Opportunity on the rover's 652nd through 663rd Martian days, or sols (Nov. 23 to Dec. 5, 2005), at the edge of 'Erebus Crater.' The mosaic is presented as a vertical projection. This type of projection provides a true-to-scale overhead view of the rover deck and nearby surrounding terrain. The view here shows outcrop rocks, sand dunes, and other features out to a distance of about 25 meters (82 feet) from the rover. Opportunity examined targets on the outcrop called 'Rimrock' in front of the rover, testing the mobility and operation of Opportunity's robotic arm. The view shows examples of the dunes and ripples that Opportunity has been crossing as the rover drives on the Meridiani plains. This view is a false-color composite of images taken through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters. This kind of false-color scheme emphasizes differences in composition among the different kinds of materials that the rover is exploring.
NASA Technical Reports Server (NTRS)
2002-01-01
In 1999, Genex submitted a proposal to Stennis Space Center for a volumetric 3-D display technique that would provide multiple users with a 360-degree perspective to simultaneously view and analyze 3-D data. The futuristic capabilities of the VolumeViewer(R) have offered tremendous benefits to commercial users in the fields of medicine and surgery, air traffic control, pilot training and education, computer-aided design/computer-aided manufacturing, and military/battlefield management. The technology has also helped NASA to better analyze and assess the various data collected by its satellite and spacecraft sensors. Genex capitalized on its success with Stennis by introducing two separate products to the commercial market that incorporate key elements of the 3-D display technology designed under an SBIR contract. The company's Rainbow 3D(R) imaging camera is a novel, three-dimensional surface profile measurement system that can obtain a full-frame 3-D image in less than 1 second. The second product is the 360-degree OmniEye(R) video system. Ideal for intrusion detection, surveillance, and situation management, this unique camera system offers a continuous, panoramic view of a scene in real time.
Rock Abrasion Tool Exhibits the Deep Red Pigment of Mars
NASA Technical Reports Server (NTRS)
2006-01-01
During recent soil-brushing experiments, the rock abrasion tool on NASA's Mars Exploration Rover Spirit became covered with dust, as shown here. An abundance of iron oxide minerals in the dust gave the device a reddish-brown veneer. Investigators were using the rock abrasion tool to uncover successive layers of soil in an attempt to reveal near-surface stratigraphy. Afterward, remnant dirt clods were visible on both the bit and the brush of the tool. Designers of the rock abrasion tool at Honeybee Robotics and engineers at the Jet Propulsion Laboratory developed a plan to run the brush on the rock abrasion tool in reverse to dislodge the dirt and return the tool to normal operation. Subsequent communications with the rover revealed that the procedure is working and the rock abrasion tool remains healthy. Spirit acquired this approximately true-color image with the panoramic camera on the rover's 893rd sol, or Martian day (July 8, 2006). The image combines exposures taken through three of the camera's filters, centered on wavelengths of 750 nanometers, 530 nanometers, and 430 nanometers.
NASA Astrophysics Data System (ADS)
Shinoj, V. K.; Murukeshan, V. M.; Hong, Jesmond; Baskaran, M.; Aung, Tin
2015-07-01
Noninvasive medical imaging techniques have generated great interest and show high potential in the research and development of ocular imaging and follow-up procedures. It is well known that angle-closure glaucoma is one of the major ocular diseases/conditions that cause blindness. The identification and treatment of this disease are related primarily to angle assessment techniques. In this paper, we illustrate a probe-based imaging approach to obtain images of the angle region of the eye. The proposed probe consists of a micro CCD camera and LED/NIR laser light sources, configured at the distal end to enable imaging of the iridocorneal region inside the eye. With this proposed dual-modal probe, imaging is performed in light (white visible LED on) and dark (NIR laser light source alone) conditions, and the angle region is discernible in both cases. Imaging using NIR sources has major significance in anterior chamber imaging since it avoids pupil constriction due to bright light and thereby the artificial altering of the anterior chamber angle. The proposed methodology and developed scheme are expected to find potential application in glaucoma detection and diagnosis.
The Large Ultraviolet/Optical/Infrared Surveyor (LUVOIR)
NASA Astrophysics Data System (ADS)
Peterson, Bradley M.; Fischer, Debra; LUVOIR Science and Technology Definition Team
2017-01-01
LUVOIR is one of four potential large mission concepts for which the NASA Astrophysics Division has commissioned studies by Science and Technology Definition Teams (STDTs) drawn from the astronomical community. LUVOIR will have an 8- to 16-m segmented primary mirror and operate at the Sun-Earth L2 point. It will be designed to support a broad range of astrophysics and exoplanet studies. The notional initial complement of instruments will include 1) a high-performance optical/NIR coronagraph with imaging and spectroscopic capability, 2) a UV imager and spectrograph with high spectral resolution and multi-object capability, 3) a high-definition wide-field optical/NIR camera, and 4) a multi-resolution optical/NIR spectrograph. LUVOIR will be designed for extreme stability to support unprecedented spatial resolution and coronagraphy. It is intended to be a long-lifetime facility that is both serviceable and upgradable. This is the first report by the LUVOIR STDT to the community on the top-level architectures we are studying, including preliminary capabilities of a mission with those parameters. The STDT seeks feedback from the astronomical community for key science investigations that can be undertaken with the notional instrument suite and to identify desirable capabilities that will enable additional key science.
Monitoring of antisolvent crystallization of sodium scutellarein by combined FBRM-PVM-NIR.
Liu, Xuesong; Sun, Di; Wang, Feng; Wu, Yongjiang; Chen, Yong; Wang, Longhu
2011-06-01
Antisolvent crystallization can be used as an alternative to cooling or evaporation for the separation and purification of solid products in the pharmaceutical industry. To improve process understanding of antisolvent crystallization, the use of in-line tools is vital. In this study, process analytical technology (PAT) tools including focused beam reflectance measurement (FBRM), particle video microscopy (PVM), and near-infrared spectroscopy (NIRS) were utilized to monitor the antisolvent crystallization of sodium scutellarein. FBRM was used to monitor the chord count and chord length distribution of sodium scutellarein particles in the crystallizer, and PVM, as an in-line video camera, provided pictures imaging particle shape and dimension. In addition, a quantitative partial least squares (PLS) model was established by in-line NIRS to detect the concentration of sodium scutellarein in the solvent, and good calibration statistics were obtained (r(2) = 0.976) with a residual predictive deviation of 11.3. The discussion of the sensitivities, strengths, and weaknesses of these PAT tools may be helpful in selecting suitable PAT techniques. These in-line techniques eliminate the need for sample preparation and offer a time-saving approach to understanding and monitoring the antisolvent crystallization process. Copyright © 2011 Wiley-Liss, Inc.
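The residual predictive deviation (RPD) quoted above is the ratio of the spread of the reference values to the model's prediction error; a minimal sketch (using RMSEP as the error term, with made-up concentrations rather than the paper's data):

```python
import numpy as np

def rpd(y_reference, y_predicted):
    """Residual predictive deviation: standard deviation of the reference
    values divided by the prediction error (RMSEP used here)."""
    y_ref = np.asarray(y_reference, dtype=float)
    y_pred = np.asarray(y_predicted, dtype=float)
    rmsep = np.sqrt(np.mean((y_ref - y_pred) ** 2))
    return np.std(y_ref, ddof=1) / rmsep

# Illustrative concentrations and predictions with a constant 0.1 error:
reference = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
predicted = reference + 0.1
print(round(rpd(reference, predicted), 2))  # 15.81
```

A higher RPD means the calibration's errors are small relative to the natural variation of the property being predicted, which is why a value of 11.3 indicates a model usable for quantitative monitoring.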
Zhou, Quan; Wood, Ronald; Schwarz, Edward M.; Wang, Yong-Jun; Xing, Lianping
2010-01-01
Objective: Development of an in vivo imaging method to assess lymphatic draining function in the K/B×N mouse model of inflammatory arthritis. Methods: Indocyanine green (ICG), a near-infrared (NIR) fluorescent dye, was injected intradermally into the footpad of wild-type mice, the limb was illuminated with an 806 nm NIR laser, and the movement of ICG from the injection site to the draining popliteal lymph node (PLN) was recorded with a CCD camera. ICG-NIR images were analyzed to obtain 5 measures of lymphatic function across time. K/B×N arthritic mice and control non-arthritic littermates were imaged at one month of age, when acute joint inflammation commenced, and again at 3 months, when joint inflammation became chronic. Lymphangiogenesis in PLNs was assessed by immunochemistry. Results: ICG and its transport within lymphatic vessels were readily visualized and quantitative measures derived. During the acute phase of arthritis, the lymphatic vessels were dilated with increased ICG signal intensity and lymphatic pulses, and PLNs became fluorescent quickly. During the chronic phase, new lymphatic vessels were present near the foot. However, ICG appearance in lymphatic vessels was delayed. The size and area of PLN lymphatic sinuses progressively increased in the K/B×N mice. Conclusion: ICG-NIR lymphatic imaging is a valuable method to assess lymphatic draining function in mice with inflammatory arthritis. ICG-NIR imaging of K/B×N mice identified two distinct lymphatic phenotypes during the acute and chronic phases of inflammation. This technique can be used to assess new therapies for lymphatic disorders. PMID:20309866
2004-03-06
The red marks in this image, taken by the Mars Exploration Rover Opportunity's panoramic camera, indicate holes made by the rover's rock abrasion tool, located on its instrument deployment device, or "arm." The lower hole, located on a target called "McKittrick," was made on the 30th martian day, or sol, of Opportunity's journey. The upper hole, located on a target called "Guadalupe" was made on sol 34 of the rover's mission. The mosaic image was taken using a blue filter at the "El Capitan" region of the Meridiani Planum, Mars, rock outcrop. The image, shown in a vertical-perspective map projection, consists of images acquired on sols 27, 29 and 30 of the rover's mission. http://photojournal.jpl.nasa.gov/catalog/PIA05513
First Grinding of a Rock on Mars
NASA Technical Reports Server (NTRS)
2004-01-01
The round, shallow depression in this image resulted from history's first grinding of a rock on Mars. The rock abrasion tool on NASA's Spirit rover ground off the surface of a patch 45.5 millimeters (1.8 inches) in diameter on a rock called Adirondack during Spirit's 34th sol on Mars, Feb. 6, 2004. The hole is 2.65 millimeters (0.1 inch) deep, exposing fresh interior material of the rock for close inspection with the rover's microscopic imager and two spectrometers on the robotic arm. This image was taken by Spirit's panoramic camera, providing a quick visual check of the success of the grinding. The rock abrasion tools on both Mars Exploration Rovers were supplied by Honeybee Robotics, New York, N.Y.
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Rodríguez Gómez, J.
2011-11-01
PANIC, the PAnoramic Near Infrared Camera, is a new wide-field infrared imager for the Calar Alto Observatory (CAHA) 2.2 m and 3.5 m telescopes. The optics is a folded single optical train of pure lens optics, with a pixel scale of 0.45 arcsec/pixel (18-micron pixels) at the 2.2 m telescope and 0.23 arcsec/pixel at the 3.5 m. A mosaic of four Hawaii-2RG detectors provides a field of view (FOV) of 0.5x0.5 degrees and 0.25x0.25 degrees, respectively. It will cover the photometric bands from Z to K_s (0.8 to 2.5 microns) with a low thermal background due to cold stops. Here we present the current status of the project.
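The quoted pixel scales follow from the standard plate-scale relation, scale["/px] = 206265 × pixel pitch / focal length; the quick check below confirms the numbers are mutually consistent (the implied effective focal lengths and the 2×2 mosaic geometry are inferences, not figures stated in the abstract):

```python
# Consistency check of PANIC's quoted pixel scales and field of view.
ARCSEC_PER_RADIAN = 206265.0
PIXEL_PITCH_M = 18e-6  # 18-micron Hawaii-2RG pixels

# Effective focal lengths implied by the quoted scales:
f_22m = ARCSEC_PER_RADIAN * PIXEL_PITCH_M / 0.45  # ~8.25 m at the 2.2 m
f_35m = ARCSEC_PER_RADIAN * PIXEL_PITCH_M / 0.23  # ~16.1 m at the 3.5 m

# A 2x2 mosaic of 2048x2048-pixel detectors spans 4096 pixels per side,
# so at 0.45 arcsec/pixel the field is:
fov_22m_deg = 4096 * 0.45 / 3600.0  # ~0.51 deg, matching the quoted 0.5x0.5
```

The same arithmetic at 0.23 arcsec/pixel gives roughly 0.26 degrees per side at the 3.5 m telescope, consistent with the quoted 0.25x0.25-degree field.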
NASA Technical Reports Server (NTRS)
2004-01-01
The pointy features in this image may only be a few centimeters high and less than 1 centimeter (0.4 inches) wide, but they generate major scientific interest. Dubbed 'Razorback,' this chunk of rock sticks up at the edge of flat rocks in 'Endurance Crater.' Based on their understanding of processes on Earth, scientists believe these features may have formed when fluids migrated through fractures, depositing minerals. Fracture-filling minerals would have formed veins composed of a harder material that eroded more slowly than the rock slabs. Possible examination of these features using the instruments on NASA's Mars Exploration Rover Opportunity may further explain what these features have to do with the history of water on Mars. This false-color image was taken by the rover's panoramic camera.
First Panoramic View From The Surface Of Mars
1996-12-30
First panoramic view by NASA's Viking 1 from the surface of Mars. The out-of-focus spacecraft component toward left center is the housing for the Viking sample arm, which is not yet deployed. Parallel lines in the sky are an artifact and are not real features. However, the change of brightness from horizon towards zenith and towards the right (west) is accurately reflected in this picture, taken in late Martian afternoon. At the horizon to the left is a plateau-like prominence much brighter than the foreground material between the rocks. The horizon features are approximately three kilometers (1.8 miles) away. At left is a collection of fine-grained material reminiscent of sand dunes. The dark sinuous markings in the left foreground are of unknown origin. Some unidentified shapes can be perceived on the hilly eminence at the horizon towards the right. A horizontal cloud stratum can be made out halfway from the horizon to the top of the picture. At left is seen the low-gain antenna for receipt of commands from Earth. The projections on or near the horizon may represent the rims of distant impact craters. In the right foreground are color charts for Lander camera calibration, a mirror for the Viking magnetic properties experiment and part of a grid on the top of the Lander body. At upper right is the high-gain dish antenna for direct communication between the landed spacecraft and Earth. Toward the right edge is an array of smooth fine-grained material which shows some hint of ripple structure and may be the beginning of a large dune field off to the right of the picture, which joins with dunes seen at the top left in this 300-degree panoramic view. Some of the rocks appear to be undercut on one side and partially buried by drifting sand on the other. http://photojournal.jpl.nasa.gov/catalog/PIA00383
Lee, Peter; Yan, Ping; Ewart, Paul; Kohl, Peter
2012-01-01
Whole-heart multi-parametric optical mapping has provided valuable insight into the interplay of electro-physiological parameters, and this technology will continue to thrive as dyes are improved and technical solutions for imaging become simpler and cheaper. Here, we show the advantage of using improved 2nd-generation voltage dyes, provide a simple solution to panoramic multi-parametric mapping, and illustrate the application of flash photolysis of caged compounds for studies in the whole heart. For proof of principle, we used the isolated rat whole-heart model. After characterising the blue and green isosbestic points of di-4-ANBDQBS and di-4-ANBDQPQ, respectively, two voltage and calcium mapping systems are described. With two newly custom-made multi-band optical filters, (1) di-4-ANBDQBS and fluo-4 and (2) di-4-ANBDQPQ and rhod-2 mapping are demonstrated. Furthermore, we demonstrate three-parameter mapping using di-4-ANBDQPQ, rhod-2 and NADH. Using off-the-shelf optics and the di-4-ANBDQPQ and rhod-2 combination, we demonstrate panoramic multi-parametric mapping, affording a 360° spatiotemporal record of activity. Finally, local optical perturbation of calcium dynamics in the whole heart is demonstrated using the caged compound, o-nitrophenyl ethylene glycol tetraacetic acid (NP-EGTA), with an ultraviolet light-emitting diode (LED). Calcium maps (heart loaded with di-4-ANBDQPQ and rhod-2) demonstrate successful NP-EGTA loading and local flash photolysis. All imaging systems were built using only a single camera. In conclusion, using novel 2nd-generation voltage dyes, we developed scalable techniques for multi-parametric optical mapping of the whole heart from one point of view and panoramically. In addition to these parameter imaging approaches, we show that it is possible to use caged compounds and ultraviolet LEDs to locally perturb electrophysiological parameters in the whole heart. PMID:22886365
Compressed single pixel imaging in the spatial frequency domain
Torabzadeh, Mohammad; Park, Il-Yong; Bartels, Randy A.; Durkin, Anthony J.; Tromberg, Bruce J.
2017-01-01
Abstract. We have developed compressed sensing single pixel spatial frequency domain imaging (cs-SFDI) to characterize tissue optical properties over a wide field of view (35 mm×35 mm) using multiple near-infrared (NIR) wavelengths simultaneously. Our approach takes advantage of the relatively sparse spatial content required for mapping tissue optical properties at length scales comparable to the transport scattering length in tissue (ltr∼1 mm) and the high bandwidth available for spectral encoding using a single-element detector. cs-SFDI recovered absorption (μa) and reduced scattering (μs′) coefficients of a tissue phantom at three NIR wavelengths (660, 850, and 940 nm) within 7.6% and 4.3% of absolute values determined using camera-based SFDI, respectively. These results suggest that cs-SFDI can be developed as a multi- and hyperspectral imaging modality for quantitative, dynamic imaging of tissue optical and physiological properties. PMID:28300272
VizieR Online Data Catalog: LMC NIR Synoptic Survey. II. Wesenheit relations (Bhardwaj+, 2016)
NASA Astrophysics Data System (ADS)
Bhardwaj, A.; Kanbur, S. M.; Macri, L. M.; Singh, H. P.; Ngeow, C.-C.; Wagner-Kaiser, R.; Sarajedini, A.
2018-03-01
We make use of NIR mean magnitudes for 775 fundamental-mode and 474 first-overtone Cepheids in the LMC from Macri et al. 2015, J/AJ/149/117 (Paper I). These magnitudes are based on observations from a synoptic survey (average of 16 epochs) of the central region of the LMC using the CPAPIR camera at the Cerro Tololo Interamerican Observatory 1.5-m telescope between 2006 and 2007. Most of these Cepheid variables were previously studied in the optical V and I bands by the third phase of the Optical Gravitational Lensing Experiment (OGLE-III) survey (Soszynski et al. 2008, J/AcA/58/163; Ulaczyk et al. 2013, J/AcA/63/159). The V and I band mean magnitudes are also compiled in Paper I. The calibration into the 2MASS photometric system, extinction corrections, and the adopted reddening law are discussed in detail in Paper I. (4 data files).
Monitoring Telluric Water Absorption with CAMAL
NASA Astrophysics Data System (ADS)
Baker, Ashley; Blake, Cullen; Sliski, David
2017-01-01
Ground-based observations are severely limited by telluric water vapor absorption features, which are highly variable in time and significantly complicate both spectroscopy and photometry in the near-infrared (NIR). To achieve the stability required to study Earth-sized exoplanets, monitoring the precipitable water vapor (PWV) becomes necessary to mitigate the impact of telluric lines on radial velocity measurements and transit light curves. To address this issue, we present the Camera for the Automatic Monitoring of Atmospheric Lines (CAMAL), a stand-alone, inexpensive 6-inch aperture telescope dedicated to measuring PWV at the Whipple Observatory. CAMAL utilizes three NIR narrowband filters to trace the amount of atmospheric water vapor affecting simultaneous observations with the MINiature Exoplanet Radial Velocity Array (MINERVA) and MINERVA-Red telescopes. We present the current design of CAMAL, discuss our calibration methods, and show PWV measurements taken with CAMAL compared to those of a nearby GPS water vapor monitor.
A novel image-based BRDF measurement system and its application to human skin
NASA Astrophysics Data System (ADS)
Bintz, Jeffrey R.; Mendenhall, Michael J.; Marciniak, Michael A.; Butler, Samuel D.; Lloyd, James Tommy
2016-09-01
Human skin detection is an important first step in search and rescue (SAR) scenarios. Previous research performed human skin detection through an application-specific camera system that exploits the spectral properties of human skin at two visible and two near-infrared (NIR) wavelengths. The current theory assumes human skin is diffuse; however, it is observed that human skin exhibits both specular and diffuse reflectance properties. This paper presents a novel image-based bidirectional reflectance distribution function (BRDF) measurement system, and applies it to the collection of human skin BRDF. The system uses a grid-projecting laser and a novel signal processing chain to extract the surface normal from each grid location. Human skin BRDF measurements are shown for a variety of melanin content and hair coverage at the four spectral channels needed for human skin detection. The NIR results represent a novel contribution to the existing body of human skin BRDF measurements.
CMOS Imaging Sensor Technology for Aerial Mapping Cameras
NASA Astrophysics Data System (ADS)
Neumann, Klaus; Welzenbach, Martin; Timm, Martin
2016-06-01
In June 2015 Leica Geosystems launched the first large format aerial mapping camera using CMOS sensor technology, the Leica DMC III. This paper describes the motivation to change from CCD sensor technology to CMOS for the development of this new aerial mapping camera. In 2002 the DMC first generation was developed by Z/I Imaging. It was the first large format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, which was the first digital aerial mapping camera using a single ultra-large CCD sensor to avoid stitching of smaller CCDs. The DMC III is now the third generation of large format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II using the same system design with one large monolithic PAN sensor and four multispectral camera heads for R, G, B and NIR. For the first time a 391 Megapixel CMOS sensor has been used as the panchromatic sensor, which is an industry record. Along with CMOS technology goes a range of technical benefits. The dynamic range of the CMOS sensor is approximately twice the range of a comparable CCD sensor, and the signal to noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights will be presented and compared with other CCD based aerial sensors.
Multisensory System for the Detection and Localization of Peripheral Subcutaneous Veins
Fernández, Roemi; Armada, Manuel
2017-01-01
This paper proposes a multisensory system for the detection and localization of peripheral subcutaneous veins, as a first step for achieving automatic robotic insertion of catheters in the near future. The multisensory system is based on the combination of a SWIR (Short-Wave Infrared) camera, a TOF (Time-Of-Flight) camera and a NIR (Near Infrared) lighting source. The associated algorithm consists of two main parts: one devoted to the features extraction from the SWIR image, and another envisaged for the registration of the range data provided by the TOF camera, with the SWIR image and the results of the peripheral veins detection. In this way, the detected subcutaneous veins are mapped onto the 3D reconstructed surface, providing a full representation of the region of interest for the automatic catheter insertion. Several experimental tests were carried out in order to evaluate the capabilities of the presented approach. Preliminary results demonstrate the feasibility of the proposed design and highlight the potential benefits of the solution. PMID:28422075
Panoramic dental x-ray, also called panoramic radiography, uses a very small dose of ionizing radiation and is a two-dimensional (2-D) dental x-ray examination that captures the entire mouth in a single image.
Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test
Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno
2008-01-01
The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces. PMID:27873930
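The vignetting correction and vegetation-index normalization described above can be sketched as follows. This is a minimal illustration, not the paper's fitted correction: the quadratic radial falloff model and the coefficient `k` are hypothetical stand-ins for coefficients that would be fitted from reference images, and the index shown is the standard NDVI formed from the NIR and red bands.

```python
import numpy as np

def correct_vignetting(band, k):
    """Undo a simple radial brightness falloff toward the image corners.
    The quadratic model and coefficient k are illustrative only; a real
    correction is fitted from uniformly lit reference images."""
    h, w = band.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # normalized squared distance from the optical center (1.0 at corners)
    r2 = ((xx - cx) ** 2 + (yy - cy) ** 2) / (cx ** 2 + cy ** 2)
    return band * (1.0 + k * r2)      # boost the darkened periphery

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed pixel-wise."""
    return (nir - red) / (nir + red + 1e-9)

# Dense green canopy reflects strongly in NIR and weakly in red:
nir = correct_vignetting(np.full((64, 64), 0.50), k=0.25)
red = correct_vignetting(np.full((64, 64), 0.05), k=0.25)
print(float(ndvi(nir, red).mean()))   # high NDVI, ~0.82
```

Because NDVI is a ratio, a multiplicative gain applied identically to both bands cancels out, which is consistent with the paper's finding that normalized indices on vignetting-corrected images are robust to scene illumination.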
RATIR Follow-up of LIGO/Virgo Gravitational Wave Events
NASA Astrophysics Data System (ADS)
Golkhou, V. Zach; Butler, Nathaniel R.; Strausbaugh, Robert; Troja, Eleonora; Kutyrev, Alexander; Lee, William H.; Román-Zúñiga, Carlos G.; Watson, Alan M.
2018-04-01
We have recently witnessed the first multi-messenger detection of colliding neutron stars through gravitational waves (GWs) and electromagnetic (EM) waves (GW170817), thanks to the joint efforts of LIGO/Virgo and space- and ground-based telescopes. In this paper, we report on the RATIR follow-up observation strategies and show the results for the trigger G194575. This trigger is not of astrophysical interest; however, it is of great interest to the robust design of a follow-up engine to explore large sky-error regions. We discuss the development of an image-subtraction pipeline for the six-color, optical/NIR imaging camera RATIR. Considering a two-band (i and r) campaign in the fall of 2015, we find that the requirement of simultaneous detection in both bands leads to a factor ∼10 reduction in false alarm rate, which can be further reduced using additional bands. We also show that the performance of our proposed algorithm is robust to fluctuating observing conditions, maintaining a low false alarm rate with a modest decrease in system efficiency that can be overcome utilizing repeat visits. Expanding our pipeline to search for either optical or NIR detections (three or more bands), considering separately the optical riZ and NIR YJH bands, should result in a false alarm rate ≈1% and an efficiency ≈90%. RATIR’s simultaneous optical/NIR observations are expected to yield about one candidate transient in the vast 100 deg2 LIGO error region for prioritized follow-up with larger aperture telescopes.
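The false-alarm reduction from requiring simultaneous two-band detection can be illustrated with a toy Monte Carlo. The numbers below are illustrative, not from the paper: if per-band artifact detections were statistically independent, the coincidence rate would fall roughly as the product of the single-band rates (real gains are smaller when artifacts correlate between bands, consistent with the paper's measured factor of ∼10).

```python
import random

random.seed(0)
n = 200_000          # hypothetical number of candidate positions screened
p_art = 0.01         # per-band artifact probability (illustrative value)

# Candidates flagged in one band vs. candidates required to fire in BOTH
# bands, assuming independent artifact populations per band.
single = sum(random.random() < p_art for _ in range(n))
joint = sum((random.random() < p_art) and (random.random() < p_art)
            for _ in range(n))

# joint/n clusters near p_art**2, i.e. a ~100x reduction in this idealized
# independent-noise case.
print(single / n, joint / n)
```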
Airborne laser systems for atmospheric sounding in the near infrared
NASA Astrophysics Data System (ADS)
Sabatini, Roberto; Richardson, Mark A.; Jia, Huamin; Zammit-Mangion, David
2012-06-01
This paper presents new techniques for atmospheric sounding using Near Infrared (NIR) laser sources, direct detection electro-optics and passive infrared imaging systems. These techniques allow a direct determination of atmospheric extinction and, through the adoption of suitable inversion algorithms, the indirect measurement of some important natural and man-made atmospheric constituents, including Carbon Dioxide (CO2). The proposed techniques are suitable for remote sensing missions performed by using aircraft, satellites, Unmanned Aerial Vehicles (UAV), parachute/gliding vehicles, Roving Surface Vehicles (RSV), or Permanent Surface Installations (PSI). The various techniques proposed offer relative advantages in different scenarios. All are based on measurements of the laser energy/power incident on target surfaces of known geometric and reflective characteristics, by means of infrared detectors and/or infrared cameras calibrated for radiance. Experimental results are presented from ground and flight trials performed with laser systems operating in the near infrared (NIR) at λ = 1064 nm and λ = 1550 nm. This includes ground tests performed with 10 Hz and 20 kHz PRF NIR laser systems in a variety of atmospheric conditions, and flight trials performed with a 10 Hz airborne NIR laser system installed on a TORNADO aircraft, flying at altitudes up to 22,000 ft above ground level. Future activities are planned to validate the atmospheric retrieval algorithms developed for CO2 column density measurements, with emphasis on aircraft-related emissions at airports and other high air-traffic density environments.
NASA Astrophysics Data System (ADS)
Shaul, Oren; Fanrazi-Kahana, Michal; Meitav, Omri; Pinhasi, Gad A.; Abookasis, David
2018-03-01
Optical properties of biological tissues are valuable diagnostic parameters which can provide necessary information regarding tissue state during disease pathogenesis and therapy. However, different sources of interference, such as temperature changes, may modify these properties, introducing confounding factors and artifacts to data, consequently skewing their interpretation and misinforming clinical decision-making. In the current study, we apply spatial light modulation, a type of diffuse reflectance hyperspectral imaging technique, to monitor the variation in optical properties of highly scattering turbid media in the presence of varying levels of the following sources of interference: scattering concentration, temperature, and pressure. Spatial near-infrared (NIR) light modulation is a wide-field, non-contact emerging optical imaging platform capable of separating the effects of tissue scattering from those of absorption, thereby accurately estimating both parameters. With this technique, periodic NIR illumination patterns at alternately low and high spatial frequencies, at six discrete wavelengths between 690 and 970 nm, were sequentially projected upon the medium while a CCD camera collected the diffusely reflected light. Model-based data analysis is then performed off-line to recover the medium's optical properties. We conducted a series of experiments demonstrating the changes in absorption and reduced scattering coefficients of commercially available fresh milk and chicken breast tissue under different interference conditions. In addition, the refractive index was studied under increased pressure. This work demonstrates the utility of NIR spatial light modulation to detect varying sources of interference upon the optical properties of biological samples.
Development of an oxygen saturation measuring system by using near-infrared spectroscopy
NASA Astrophysics Data System (ADS)
Kono, K.; Nakamachi, E.; Morita, Y.
2017-08-01
Recently, hypoxia imaging has been recognized as an advanced technique to detect cancers because of its strong relationship with the biological characterization of cancer. In previous studies, hypoxia imaging systems for endoscopic diagnosis have been developed. However, these imaging technologies using visible light can observe only blood vessels in the gastric mucous membrane. Therefore, they could not detect scirrhous gastric cancer, which accounts for 10% of all gastric cancers and spreads rapidly into the submucous membrane. To overcome this problem, we developed a measuring system for blood oxygen saturation in the submucous membrane using near-infrared (NIR) spectroscopy. NIR light, which has high permeability through bio-tissues and high absorbency by hemoglobin, can image blood vessels in the submucous membrane. An NIR system with LED light sources and a CCD camera module was developed to image blood vessels. We measured blood oxygen saturation using the optical density ratio (ODR) of two wavelengths, based on the Lambert-Beer law. To image blood vessels clearly and measure blood oxygen saturation accurately, we searched for two optimum wavelengths using a multilayer human gastric-like phantom with the same optical properties as human gastric tissue. Using Monte Carlo simulation of light propagation, we derived the relationship between the ODR and blood oxygen saturation and elucidated the influence of blood vessel depth on the measurement. The oxygen saturation measuring methodology was validated in experiments using our NIR system. Finally, it was confirmed that our system can accurately detect oxygen saturation in blood vessels at various depths.
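The two-wavelength Lambert-Beer inversion behind such oximetry can be sketched numerically. This is a generic illustration, not the paper's calibrated ODR method: the extinction-coefficient values below are hypothetical placeholders (real tabulated values for HbO2 and Hb at the chosen wavelengths must be substituted), and scattering, which the paper handles via phantom calibration and Monte Carlo simulation, is ignored.

```python
import numpy as np

# Illustrative molar extinction coefficients [cm^-1 / M] at two NIR
# wavelengths; columns are [HbO2, Hb]. Placeholder values, not real tables.
eps = np.array([[ 586.0, 1548.0],   # wavelength 1
                [ 816.0,  762.0]])  # wavelength 2

def so2_from_absorbance(a1, a2, path_cm=1.0):
    """Invert Lambert-Beer A = eps @ c * path at two wavelengths for the
    concentrations [HbO2, Hb], then return oxygen saturation
    SO2 = C_HbO2 / (C_HbO2 + C_Hb)."""
    conc = np.linalg.solve(eps * path_cm, np.array([a1, a2]))
    return conc[0] / conc.sum()

# Forward-simulate a 70%-saturated sample, then recover its saturation:
c_true = np.array([0.7, 0.3]) * 1e-4       # mol/L of HbO2 and Hb
a1, a2 = eps @ c_true                      # absorbances at the two bands
print(round(so2_from_absorbance(a1, a2), 3))   # -> 0.7
```

Choosing one wavelength near an isosbestic point (where the HbO2 and Hb coefficients are equal) makes the second measurement a concentration reference, which is why the paper's wavelength search matters.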
NASA Astrophysics Data System (ADS)
Rustan, Pedro L.
1995-01-01
The U.S. Department of Defense (DoD) and the National Aeronautics and Space Administration (NASA) started a cooperative program in 1992 to flight-qualify recently developed lightweight technologies in a radiation-stressed environment. The spacecraft, referred to as Clementine, was designed, built, and launched in less than two years. The spacecraft was launched into a high-inclination orbit from Vandenberg Air Force Base in California on a Titan IIG launch vehicle in January 1994. The spacecraft was injected into a 420 by 3000 km orbit around the Moon and remained there for over two months. Unfortunately, after successfully completing the Lunar phase of the mission, a software malfunction prevented the accomplishment of the near-Earth asteroid (NEA) phase. Some of the technologies incorporated in the Clementine spacecraft include: a 370 gram, 7 watt star tracker camera; a 500 gram, 6 watt UV/Vis camera; a 1600 gram, 30 watt Indium Antimonide focal plane array NIR camera; a 1650 gram, 30 watt Mercury Cadmium Telluride LWIR camera; and a LIDAR camera which consists of a diode-pumped Nd:YAG laser for ranging and an intensified photocathode charge-coupled detector for imaging. The scientific results of the mission will be first analyzed by a NASA-selected team, and then will be available to the entire community.
Location Distribution Optimization of Photographing Sites for Indoor Panorama Modeling
NASA Astrophysics Data System (ADS)
Zhang, S.; Wu, J.; Zhang, Y.; Zhang, X.; Xin, Z.; Liu, J.
2017-09-01
Generally, panoramic image modeling is costly and time-consuming because photographs must be captured continuously along the survey routes, especially in complicated indoor environments. This hinders wider commercial application of panoramic image modeling. A feasible arrangement of panorama site locations is indispensable, because the locations influence the clarity, coverage and number of panoramic images obtainable with a given device. This paper aims to propose a standard procedure to generate the specific locations and total number of panorama sites for indoor panorama modeling. First, we establish the functional relationship between one panorama site and its objectives, and then apply that relationship to the panorama site network. We propose the Distance Clarity functions (FC and Fe), which express the mathematical relationship between clarity and the panorama-to-objective distance or obstacle distance, and the Distance Buffer function (FB), modified from the traditional buffer method, to generate the coverage of a panorama site. Second, we traverse every point in the feasible area as a candidate panorama site and calculate its clarity and coverage synthetically. Finally, we select as few points as possible, satisfying the clarity requirement first and then the coverage requirement. In the experiments, detailed parameters of the camera lens are given; still, further experimental parameters need to be tried out, given that the relationship between clarity and distance is device dependent. In short, through the functions FC, Fe and FB, the locations of panorama sites can be generated automatically and accurately.
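The select-as-few-sites-as-possible step has the flavor of a set-cover problem, for which a greedy heuristic is the standard sketch. The code below is an illustrative stand-in, not the paper's FC/Fe/FB formulation: it assumes each candidate site's coverage set (the targets it sees with acceptable clarity) has already been computed, and the site names and target IDs are invented for the example.

```python
def select_sites(candidates, targets):
    """Greedy minimal-site selection: repeatedly pick the candidate covering
    the most still-uncovered targets. A simplified stand-in for the paper's
    clarity-then-coverage ordering; inputs are hypothetical."""
    uncovered = set(targets)
    chosen = []
    while uncovered:
        # site whose coverage set removes the most remaining targets
        best = max(candidates, key=lambda s: len(candidates[s] & uncovered))
        if not candidates[best] & uncovered:
            break                      # remaining targets unreachable
        chosen.append(best)
        uncovered -= candidates[best]
    return chosen

# Toy floor plan: sites A..C each see a subset of 5 wall targets.
cov = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5}}
print(select_sites(cov, [1, 2, 3, 4, 5]))   # -> ['A', 'C']
```

Greedy set cover is not guaranteed optimal, but it is the usual practical compromise when exhaustively testing every site combination is infeasible.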
NASA Astrophysics Data System (ADS)
Wu, Xiaojun; Wu, Yumei; Wen, Peizhi
2018-03-01
To obtain information on the outer surface of a cylindrical object, we propose a catadioptric panoramic imaging system based on the principle of uniform spatial resolution for vertical scenes. First, the influence of the projection-equation coefficients on the spatial resolution and astigmatism of the panoramic system is discussed. Through parameter optimization, we obtain appropriate coefficients for the projection equation, so that the imaging quality of the entire system reaches an optimum. Finally, the system projection equation is calibrated, and an undistorted rectangular panoramic image is obtained using the cylindrical-surface projection expansion method. The proposed 360-deg panoramic-imaging device overcomes the shortcomings of existing surface panoramic-imaging methods, and it has the advantages of low cost, simple structure, high imaging quality, and small distortion. The experimental results show the effectiveness of the proposed method.
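The cylindrical-surface projection expansion, i.e. unwrapping the annular catadioptric image into a rectangular panorama, can be sketched with a simple polar-to-rectangular resampling. This is a minimal nearest-neighbour illustration under assumed inner/outer radii, not the paper's calibrated projection equation, which maps radius to height non-linearly to achieve uniform vertical resolution.

```python
import numpy as np

def unwrap_ring(ring, r_in, r_out, width):
    """Unwrap an annular (catadioptric) image into a rectangular panorama by
    sampling along radial lines from the image center. Nearest-neighbour
    sampling and a linear radius-to-row mapping are simplifying assumptions."""
    h, w = ring.shape
    cy, cx = h / 2.0, w / 2.0
    height = r_out - r_in
    pano = np.zeros((height, width), dtype=ring.dtype)
    for u in range(width):                     # azimuth -> panorama column
        theta = 2 * np.pi * u / width
        for v in range(height):                # radius  -> panorama row
            r = r_out - v                      # top row samples the outer rim
            y = int(round(cy + r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            if 0 <= y < h and 0 <= x < w:
                pano[v, u] = ring[y, x]
    return pano

# A uniform 100x100 ring image unwraps to a uniform 30x90 panorama.
pano = unwrap_ring(np.full((100, 100), 5.0), r_in=10, r_out=40, width=90)
```

A calibrated system would replace the linear `r = r_out - v` mapping with the optimized projection equation, which is exactly where the uniform-resolution design of the paper enters.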
NASA Astrophysics Data System (ADS)
Basilevsky, A. T.; Shalygina, O. S.; Bondarenko, N. V.; Shalygin, E. V.; Markiewicz, W. J.
2017-09-01
The aim of this work is a comparative study of several typical radar-dark parabolas, the neighboring plains and some other geologic units seen in the study areas, which include the craters Adivar, Bassi, Bathsheba, du Chatelet and Sitwell, at two depth scales: the upper several meters of the surface, probed through the Magellan-based microwave (12.6 cm wavelength) properties (microwave emissivity, Fresnel reflectivity, large-scale surface roughness, and radar cross-section), and the upper hundreds of microns, characterized by the 1 micron emissivity derived from analysis of the near-infrared (NIR) emission from the night side of the Venusian surface measured by the Venus Monitoring Camera (VMC) on board Venus Express (VEx).
Preliminary optical design of PANIC, a wide-field infrared camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Rodríguez Gómez, J.; Lenzen, R.; Sánchez-Blanco, E.
2008-07-01
In this paper, we present the preliminary optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Calar Alto 2.2 m telescope. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. A mosaic of four Hawaii 2RG 2k x 2k detectors made by Teledyne is used and will give a field of view of 31.9 arcmin x 31.9 arcmin. This cryogenic instrument has been optimized for the Y, J, H and K bands. Special care has been taken in the selection of the standard IR materials used for the optics in order to maximize the instrument throughput and to include the z band. The main challenges of this design are: to produce a well defined internal pupil which allows reducing the thermal background by a cryogenic pupil stop; the correction of off-axis aberrations due to the large field available; the correction of chromatic aberration because of the wide spectral coverage; and the capability of introducing narrow band filters (~1%) in the system while minimizing the degradation in the filter passband without a collimated stage in the camera. We show the optomechanical error budget and compensation strategy that allows our as-built design to meet the optical performance requirements. Finally, we demonstrate the flexibility of the design by showing the performance of PANIC at the CAHA 3.5 m telescope.
Three-dimensional images contribute to the diagnosis of mucous retention cyst in maxillary sinus
Donizeth-Rodrigues, Cleomar; Fonseca-Da Silveira, Márcia; Gonçalves-De Alencar, Ana H.; Garcia-Santos-Silva, Maria A.; Francisco-De-Mendonça, Elismauro
2013-01-01
Objective: To evaluate the detection of mucous retention cyst of maxillary sinus (MRCMS) using panoramic radiography and cone beam computed tomography (CBCT). Study Design: A digital database with 6,000 panoramic radiographs was reviewed for MRCMS. Suggestive images of MRCMS were detected on 185 radiographs, and patients were located and invited to return for follow-up. Thirty patients returned, and control panoramic radiographs were obtained 6 to 46 months after the initial radiograph. When MRCMS was found on control radiographs, CBCT scans were obtained. Cysts were measured and compared on radiographs and scans. The Wilcoxon, Spearman and Kolmogorov-Smirnov tests were used for statistical analysis. The level of significance was set at 5%. Results: There were statistically significant differences between the two methods (p<0.05): 23 MRCMS detected on panoramic radiographs were confirmed by CBCT, but 5 MRCMS detected on CBCT images had not been identified by panoramic radiography. Eight MRCMS detected on control radiographs were not confirmed by CBCT. MRCMS size differences from initial to control panoramic radiographs and CBCT scans were not statistically significant (p=0.617 and p=0.626). The correlation between time and MRCMS size differences was not significant (r=-0.16, p=0.381). Conclusion: CBCT scanning detects MRCMS more accurately than panoramic radiography. Key words: Mucous cyst, maxillary sinus, panoramic radiograph, cone beam computed tomography. PMID:23229251
Dagassan-Berndt, Dorothea C; Zitzmann, Nicola U; Walter, Clemens; Schulze, Ralf K W
2016-08-01
To evaluate the impact of cone beam computed tomography (CBCT) imaging on treatment planning regarding augmentation procedures for implant placement. Panoramic radiographs and CBCT images of 40 patients requesting single-tooth implants in 59 sites were retrospectively analyzed by six specialists in implantology, and treatment planning was performed. Therapeutic recommendations were compared with the surgical protocol performed initially. Bone height estimation from panoramic radiographs yielded higher measurements and greater variability compared to CBCT. The suggested treatment plan for lateral and vertical augmentation procedures based on CBCT or panoramic radiographs coincided for 55-72% of the cases. A trend to a more invasive augmentation procedure was seen when planning was based on CBCT. Panoramic radiography revealed 57-63% (lateral) vs. 67% (vertical augmentation) congruent plans in agreement with surgery. Among the dissenting sites, there was a trend toward less invasive planning for lateral augmentation with panoramic radiographs, while vertical augmentation requirements were more frequently more invasive when based on CBCT. Vertical augmentation requirements can be adequately determined from panoramic radiographs. In difficult cases with a deficient lateral alveolar bone, the augmentation schedule may better be evaluated from CBCT to avoid underestimation, which occurs more frequently when based on panoramic radiographs only. However, overall, radiographic interpretation and diagnostic thinking accuracy seem to depend mainly on the opinion of the observers. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
VizieR Online Data Catalog: The hot Jupiter Kepler-13Ab planet's occultation (Shporer+, 2014)
NASA Astrophysics Data System (ADS)
Shporer, A.; O'Rourke, J. G.; Knutson, H. A.; Szabo, G. M.; Zhao, M.; Burrows, A.; Fortney, J.; Agol, E.; Cowan, N. B.; Desert, J.-M.; Howard, A. W.; Isaacson, H.; Lewis, N. K.; Showman, A. P.; Todorov, K. O.
2017-07-01
Here we carry out an atmospheric characterization of Kepler-13Ab by measuring its occultation in four different wavelength bands, from the infrared (IR; Spitzer/Infrared array camera (IRAC) 4.5 um and 3.6 um), through the near-IR (NIR; Ks band), to the optical (Kepler). We also analyze the Kepler phase curve and obtain Keck/high-resolution echelle spectrometer (HIRES) spectra that result in revised parameters for the objects in the system. (4 data files).
An extensive coronagraphic simulation applied to LBT
NASA Astrophysics Data System (ADS)
Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.
2016-08-01
In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK-NIR optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can finally be computed, exploring a wide range of Strehl values and observing conditions.
Rushton, Michael N; Rushton, Vivian E
2012-08-01
To measure the added value of panoramic radiography in new dentate patients attending for routine treatment. Thirty-seven general dental practitioners who used panoramic radiographs routinely were recruited. Twenty dentate patients were identified prospectively by each participating dentist if they were new to the practice, attending for an examination and requesting any treatment deemed necessary. A panoramic radiograph was taken along with appropriate intraoral radiographs in line with national guidelines. Each dentist completed a radiological report for the panoramic radiograph only, and these 20 reports were forwarded to the researchers along with the 20 panoramic radiographs, their accompanying bitewing and periapical radiographs and twenty completed clinical assessment sheets. 740 panoramic, 1418 bitewing and 325 periapical radiographs were assessed by the researchers. Where guidelines had been observed, only 32 panoramic films provided any additional diagnostic value over the intraoral films, and this resulted from the poor technical and processing quality of the accompanying intraoral films. Assessment of the number of caries and periapical lesions and the degree of periodontal bone loss from the intraoral films provided a greater diagnostic yield, significant at the p<0.001 level. The research found that, compared to the researchers, dentists underestimated the number of caries lesions present and the level of periodontal bone loss but overestimated the presence of periapical pathology (p<0.001). The study found no support for the use of panoramic radiographs in routine screening, as there was no net diagnostic benefit to the patient. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Du, Jian; Sheng, Wanxing; Lin, Tao; Lv, Guangxian
2018-05-01
Nowadays, the smart distribution network has made tremendous progress, and business visualization has become ever more significant and indispensable. Based on a summary of traditional visualization technologies and the demands of the smart distribution network, a panoramic visualization application is proposed in this paper. The overall architecture, integrated architecture and service architecture of the panoramic visualization application are first presented. Then, the architecture design and main functions of the panoramic visualization system are elaborated in depth. In addition, the key technologies related to the application are discussed briefly. Finally, two typical visualization scenarios in the smart distribution network, risk warning and fault self-healing, demonstrate that the panoramic visualization application is valuable for the operation and maintenance of the distribution network.
Panoramic Scanning: Essential Element of Higher-Order Thought.
ERIC Educational Resources Information Center
Ambrose, Don
1996-01-01
Panoramic scanning is the capacity to perceive, interpret, and appreciate complex problems from a big-picture vantage point. Barriers to panoramic scanning (sensory bombardment, superficial polarized thought, and tunnel vision) and facilitators (broad interests and knowledge, pattern finding, and connection-making skills) are identified. Educators…
Registration of Panoramic/Fish-Eye Image Sequence and LiDAR Points Using Skyline Features
Zhu, Ningning; Jia, Yonghong; Ji, Shunping
2018-01-01
We propose utilizing a rigorous registration model and a skyline-based method for automatic registration of LiDAR points and a sequence of panoramic/fish-eye images in a mobile mapping system (MMS). This method can automatically optimize original registration parameters and avoid the manual interventions required by control point-based registration methods. First, the rigorous registration model between the LiDAR points and the panoramic/fish-eye image was built. Second, skyline pixels from panoramic/fish-eye images and skyline points from the MMS’s LiDAR points were extracted, relying on the difference in the pixel values and the registration model, respectively. Third, a brute force optimization method was used to search for optimal matching parameters between skyline pixels and skyline points. In the experiments, the original registration method and the control point registration method were used to compare the accuracy of our method on a sequence of panoramic/fish-eye images. The results showed: (1) the panoramic/fish-eye image registration model is effective and can achieve high-precision registration of the image and the MMS’s LiDAR points; (2) the skyline-based registration method can automatically optimize the initial attitude parameters, realizing high-precision registration of a panoramic/fish-eye image and the MMS’s LiDAR points; and (3) the attitude correction values of the sequences of panoramic/fish-eye images differ, so the values must be solved one by one. PMID:29883431
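The brute-force matching in the third step can be illustrated with a toy one-dimensional sketch (Python/NumPy). The skyline curves, the single angular offset, the search range and the squared-error cost used here are illustrative assumptions, not the paper's actual three-degree-of-freedom attitude search:

```python
import numpy as np

def skyline_cost(offset, image_skyline, lidar_skyline):
    """Sum of squared elevation differences after shifting the
    LiDAR-derived skyline by a candidate angular offset (degrees)."""
    return np.sum((image_skyline - (lidar_skyline + offset)) ** 2)

def brute_force_offset(image_skyline, lidar_skyline, search_range=2.0, step=0.01):
    """Exhaustively search a 1-D angular offset that best aligns the two
    skylines -- a toy stand-in for the attitude search in the paper."""
    candidates = np.arange(-search_range, search_range + step, step)
    costs = [skyline_cost(c, image_skyline, lidar_skyline) for c in candidates]
    return candidates[int(np.argmin(costs))]

# Synthetic example: the LiDAR skyline sits 0.5 degrees below the
# image skyline, so the recovered offset should be close to 0.5.
az = np.linspace(0.0, 360.0, 361)
image_sky = 10.0 + 5.0 * np.sin(np.radians(az))   # elevation vs. azimuth
lidar_sky = image_sky - 0.5
best = brute_force_offset(image_sky, lidar_sky)
```

A real implementation would search roll, pitch and yaw jointly and compare rasterized skyline masks rather than idealized curves, but the structure of the search is the same.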
NASA Technical Reports Server (NTRS)
2005-01-01
On its 449th martian day, or sol (April 29, 2005), NASA's Mars rover Opportunity woke up approximately an hour after sunset and took this picture of the fading twilight as the stars began to come out. Set against the fading red glow of the sky, the pale dot near the center of the picture is not a star, but a planet -- Earth. Earth appears elongated because it moved slightly during the 15-second exposures. The faintly blue light from the Earth combines with the reddish sky glow to give the pale white appearance. The images were taken with Opportunity's panoramic camera, using 440-nanometer, 530-nanometer, and 750-nanometer color filters. In processing on the ground, the images were shifted slightly to compensate for Earth's motion between one image and the next.
Pancam multispectral imaging results from the Opportunity Rover at Meridiani Planum
Bell, J.F.; Squyres, S. W.; Arvidson, R. E.; Arneson, H.M.; Bass, D.; Calvin, W.; Farrand, W. H.; Goetz, W.; Golombek, M.; Greeley, R.; Grotzinger, J.; Guinness, E.; Hayes, A.G.; Hubbard, M.Y.H.; Herkenhoff, K. E.; Johnson, M.J.; Johnson, J. R.; Joseph, J.; Kinch, K.M.; Lemmon, M.T.; Li, R.; Madsen, M.B.; Maki, J.N.; Malin, M.; McCartney, E.; McLennan, S.; McSween, H.Y.; Ming, D. W.; Morris, R.V.; Noe Dobrea, E.Z.; Parker, T.J.; Proton, J.; Rice, J. W.; Seelos, F.; Soderblom, J.M.; Soderblom, L.A.; Sohl-Dickstein, J. N.; Sullivan, R.J.; Weitz, C.M.; Wolff, M.J.
2004-01-01
Panoramic Camera (Pancam) images from Meridiani Planum reveal a low-albedo, generally flat, and relatively rock-free surface. Within and around impact craters and fractures, laminated outcrop rocks with higher albedo are observed. Fine-grained materials include dark sand, bright ferric iron-rich dust, angular rock clasts, and millimeter-size spheroidal granules that are eroding out of the laminated rocks. Spectra of sand, clasts, and one dark plains rock are consistent with mafic silicates such as pyroxene and olivine. Spectra of both the spherules and the laminated outcrop materials indicate the presence of crystalline ferric oxides or oxyhydroxides. Atmospheric observations show a steady decline in dust opacity during the mission. Astronomical observations captured solar transits by Phobos and Deimos and time-lapse observations of sunsets.
Opto-mechanical design of PANIC
NASA Astrophysics Data System (ADS)
Fried, Josef W.; Baumeister, Harald; Huber, Armin; Laun, Werner; Rohloff, Ralf-Rainer; Concepción Cárdenas, M.
2010-07-01
PANIC, the Panoramic Near-Infrared Camera, is a new instrument for the Calar Alto Observatory. A 4x4 k detector yields a field of view of 0.5x0.5 degrees at a pixel scale of 0.45 arc sec/pixel at the 2.2m telescope. PANIC can be used also at the 3.5m telescope with half the pixel scale. The optics consists of 9 lenses and 3 folding mirrors. Mechanical tolerances are as small as 50 microns for some elements. PANIC will have a low thermal background due to cold stops. Read-out is done with MPIA's own new electronics which allows read-out of 132 channels in parallel. Weight and size limits lead to interesting design features. Here we describe the opto-mechanical design.
'Pot of Gold' and 'Rotten Rocks'
NASA Technical Reports Server (NTRS)
2004-01-01
This false-color image taken by the panoramic camera on the Mars Exploration Rover Spirit shows the rock dubbed 'Pot of Gold' (upper left), located near the base of the 'Columbia Hills' in Gusev Crater. Scientists are intrigued by this unusual-looking, nodule-covered rock and plan to investigate its detailed chemistry in coming sols. This picture was taken on sol 159 (June 14, 2004). To the right is a set of rocks referred to as 'Rotten Rocks' for their resemblance to rotting loaves of bread. The insides of these rocks appear to have been eroded, while their outer rinds remain more intact. These outer rinds are reminiscent of those found on rocks at Meridiani Planum's 'Eagle Crater.' This image was captured on sol 158 (June 13, 2004).
Surface Stereo Imager on Mars, Face-On
NASA Technical Reports Server (NTRS)
2008-01-01
This image is a view of NASA's Phoenix Mars Lander's Surface Stereo Imager (SSI) as seen by the lander's Robotic Arm Camera. This image was taken on the afternoon of the 116th Martian day, or sol, of the mission (September 22, 2008). The mast-mounted SSI, which provided the images used in the 360 degree panoramic view of Phoenix's landing site, is about 4 inches tall and 8 inches long. The two 'eyes' of the SSI seen in this image can take photos to create three-dimensional views of the landing site. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
Pancam multispectral imaging results from the Opportunity Rover at Meridiani Planum.
Bell, J F; Squyres, S W; Arvidson, R E; Arneson, H M; Bass, D; Calvin, W; Farrand, W H; Goetz, W; Golombek, M; Greeley, R; Grotzinger, J; Guinness, E; Hayes, A G; Hubbard, M Y H; Herkenhoff, K E; Johnson, M J; Johnson, J R; Joseph, J; Kinch, K M; Lemmon, M T; Li, R; Madsen, M B; Maki, J N; Malin, M; McCartney, E; McLennan, S; McSween, H Y; Ming, D W; Morris, R V; Dobrea, E Z Noe; Parker, T J; Proton, J; Rice, J W; Seelos, F; Soderblom, J M; Soderblom, L A; Sohl-Dickstein, J N; Sullivan, R J; Weitz, C M; Wolff, M J
2004-12-03
Panoramic Camera (Pancam) images from Meridiani Planum reveal a low-albedo, generally flat, and relatively rock-free surface. Within and around impact craters and fractures, laminated outcrop rocks with higher albedo are observed. Fine-grained materials include dark sand, bright ferric iron-rich dust, angular rock clasts, and millimeter-size spheroidal granules that are eroding out of the laminated rocks. Spectra of sand, clasts, and one dark plains rock are consistent with mafic silicates such as pyroxene and olivine. Spectra of both the spherules and the laminated outcrop materials indicate the presence of crystalline ferric oxides or oxyhydroxides. Atmospheric observations show a steady decline in dust opacity during the mission. Astronomical observations captured solar transits by Phobos and Deimos and time-lapse observations of sunsets.
Stars and Cosmic Rays Observed from Mars
2004-03-12
In this five-minute exposure taken from the surface of Mars by NASA's Spirit rover, stars appear as streaks due to the rotation of the planet, and instantaneous cosmic-ray hits appear as points of light. Spirit took the image with its panoramic camera on March 11, 2004, after waking up during the martian night for a communication session with NASA's Mars Global Surveyor orbiter. Other exposures were also taken. The images tested the capabilities of the rover for night-sky observations. Scientists will use the results to aid planning for possible future astronomical observations from Mars. The difference in Mars' rotation, compared to Earth's, gives the star trails in this image a different orientation than they would have in a comparable exposure taken from Earth. http://photojournal.jpl.nasa.gov/catalog/PIA05551
Using VIS/NIR and IR spectral cameras for detecting and separating crime scene details
NASA Astrophysics Data System (ADS)
Kuula, Jaana; Pölönen, Ilkka; Puupponen, Hannu-Heikki; Selander, Tuomas; Reinikainen, Tapani; Kalenius, Tapani; Saari, Heikki
2012-06-01
Detecting invisible details and separating mixed evidence is critical for forensic inspection. If this can be done reliably and quickly at the crime scene, irrelevant objects do not require further examination at the laboratory. This speeds up the inspection process and releases resources for other critical tasks. This article reports on tests carried out at the University of Jyväskylä in Finland, together with the Central Finland Police Department and the National Bureau of Investigation, on detecting and separating forensic details with hyperspectral technology. In the tests, evidence was sought at a simulated violent burglary scene using VTT's 500-900 nm wavelength VNIR camera, Specim's 400-1000 nm VNIR camera, and Specim's 1000-2500 nm SWIR camera. The tested details were dried blood on a ceramic plate, a stain of four types of mixed and absorbed blood, and blood which had been washed off a table. Other examined details included untreated latent fingerprints, gunshot residue, primer residue, and layered paint on small pieces of wood. All cameras could detect visible details and separate mixed paint. The SWIR camera could also separate four types of human and animal blood which were mixed in the same stain and absorbed into a fabric. None of the cameras could, however, detect primer residue, untreated latent fingerprints, or blood that had been washed off. The results are encouraging and indicate the need for further studies. They also emphasize the importance of creating optimal imaging conditions at the crime scene for each kind of subject and background.
Three-dimensional images contribute to the diagnosis of mucous retention cyst in maxillary sinus.
Donizeth-Rodrigues, Cleomar; Fonseca-Da Silveira, Márcia; Gonçalves-De Alencar, Ana-Helena; Garcia-Santos-Silva, Maria-Alves; Francisco-De-Mendonça, Elismauro; Estrela, Carlos
2013-01-01
To evaluate the detection of mucous retention cyst of the maxillary sinus (MRCMS) using panoramic radiography and cone beam computed tomography (CBCT). A digital database with 6,000 panoramic radiographs was reviewed for MRCMS. Images suggestive of MRCMS were detected on 185 radiographs, and patients were located and invited to return for follow-up. Thirty patients returned, and control panoramic radiographs were obtained 6 to 46 months after the initial radiograph. When MRCMS was found on control radiographs, CBCT scans were obtained. Cysts were measured and compared on radiographs and scans. The Wilcoxon, Spearman and Kolmogorov-Smirnov tests were used for statistical analysis. The level of significance was set at 5%. There were statistically significant differences between the two methods (p<0.05): 23 MRCMS detected on panoramic radiographs were confirmed by CBCT, but 5 MRCMS detected on CBCT images had not been identified by panoramic radiography. Eight MRCMS detected on control radiographs were not confirmed by CBCT. MRCMS size differences from initial to control panoramic radiographs and CBCT scans were not statistically significant (p=0.617 and p=0.626). The correlation between time and MRCMS size differences was not significant (r=-0.16, p=0.381). CBCT scanning detects MRCMS more accurately than panoramic radiography.
An automatic panoramic image reconstruction scheme from dental computed tomography images
Papakosta, Thekla K; Savva, Antonis D; Economopoulos, Theodore L; Gröhndal, H G
2017-01-01
Objectives: Panoramic images of the jaws are extensively used for dental examinations and/or surgical planning because they provide a general overview of the patient's maxillary and mandibular regions. Panoramic images are two-dimensional projections of three-dimensional (3D) objects. Therefore, it should be possible to reconstruct them from 3D radiographic representations of the jaws produced by CBCT scanning, obviating the need for additional exposure to X-rays should panoramic views be needed. The aim of this article is to present an automated method for reconstructing panoramic dental images from CBCT data. Methods: The proposed methodology consists of a series of sequential processing stages for detecting a fitting dental arch, which is used for projecting the 3D information of the CBCT data onto the two-dimensional plane of the panoramic image. The detection is based on a template polynomial which is constructed from a training data set. Results: A total of 42 CBCT data sets of real clinical pre-operative and post-operative representations from 21 patients were used. Eight data sets were used for training the system and the rest for testing. Conclusions: The proposed methodology was successfully applied to the CBCT data sets, producing corresponding panoramic images suitable for examining the patients' maxillary and mandibular regions pre-operatively and post-operatively. PMID:28112548
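The arch-fitting and projection stages can be sketched as follows (Python/NumPy). The arch input points, polynomial degree, slab thickness and volume layout are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fit_arch(points, degree=2):
    """Fit a polynomial dental-arch curve y = p(x) to (x, y) points
    picked on an axial CBCT slice (the detection step is assumed)."""
    return np.poly1d(np.polyfit(points[:, 0], points[:, 1], degree))

def panoramic_projection(volume, arch, n_samples=64, thickness=5):
    """Project a 3-D volume (z, y, x) onto a 2-D panoramic image by
    averaging a slab of voxels along the arch normal at each sample."""
    zdim, ydim, xdim = volume.shape
    pano = np.zeros((zdim, n_samples))
    for j, x in enumerate(np.linspace(0, xdim - 1, n_samples)):
        y = arch(x)
        dy = arch.deriv()(x)                               # curve slope at x
        nx, ny = -dy / np.hypot(dy, 1), 1 / np.hypot(dy, 1)  # unit normal
        acc = np.zeros(zdim)
        for t in range(-(thickness // 2), thickness // 2 + 1):
            xi = int(np.clip(np.rint(x + t * nx), 0, xdim - 1))
            yi = int(np.clip(np.rint(y + t * ny), 0, ydim - 1))
            acc += volume[:, yi, xi]                       # voxel column
        pano[:, j] = acc / thickness
    return pano

# Toy volume and a parabolic arch through it.
vol = np.random.default_rng(0).random((16, 64, 64))
xs = np.arange(0, 64, 4, dtype=float)
pts = np.stack([xs, 30 + 0.01 * (xs - 32) ** 2], axis=1)
pano = panoramic_projection(vol, fit_arch(pts), n_samples=32)
```

Averaging across the slab approximates the focal trough of a conventional panoramic machine; a thicker slab blurs structures lying off the arch, just as it does radiographically.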
Observations of the Perseids 2012 using SPOSH cameras
NASA Astrophysics Data System (ADS)
Margonis, A.; Flohrer, J.; Christou, A.; Elgner, S.; Oberst, J.
2012-09-01
The Perseids are one of the most prominent annual meteor showers, occurring every summer when the stream of dust particles originating from Halley-type comet 109P/Swift-Tuttle intersects the orbital path of the Earth. The dense core of this stream passes Earth's orbit on the 12th of August, producing the maximum number of meteors. The Technical University of Berlin (TUB) and the German Aerospace Center (DLR) organize observing campaigns every summer monitoring Perseid activity. The observations are carried out using the Smart Panoramic Optical Sensor Head (SPOSH) camera system [0]. The SPOSH camera has been developed by DLR and Jena-Optronik GmbH under an ESA/ESTEC contract, and it is designed to image faint, short-lived phenomena on dark planetary hemispheres. The camera features a highly sensitive back-illuminated 1024x1024 CCD chip and a high dynamic range of 14 bits. The custom-made fish-eye lens offers a 120°x120° field of view (168° over the diagonal). Figure 1: A meteor captured by the SPOSH cameras simultaneously during the 2011 observing campaign in Greece. The horizon, including surrounding mountains, can be seen in the image corners as a result of the large FOV of the camera. The observations will be made on the Greek Peloponnese peninsula, monitoring the post-peak activity of the Perseids during a one-week period around the August New Moon (14th to 21st). Two SPOSH cameras will be deployed at two remote sites at high altitudes for the triangulation of meteor trajectories captured at both stations simultaneously. The observations during this time interval will give us the possibility to study the poorly-observed post-maximum branch of the Perseid stream and to compare the results with datasets from previous campaigns which covered different periods of this long-lived meteor shower. The acquired data will be processed using dedicated software for meteor data reduction developed at TUB and DLR.
Assuming a successful campaign, statistics, trajectories and photometric properties of the processed double-station meteors will be presented at the conference. Furthermore, a first order statistical analysis of the meteors processed during the 2011 and the new 2012 campaigns will be presented [0].
Food quality assessment by NIR hyperspectral imaging
NASA Astrophysics Data System (ADS)
Whitworth, Martin B.; Millar, Samuel J.; Chau, Astor
2010-04-01
Near infrared reflectance (NIR) spectroscopy is well established in the food industry for rapid compositional analysis of bulk samples. NIR hyperspectral imaging provides new opportunities to measure the spatial distribution of components such as moisture and fat, and to identify and measure specific regions of composite samples. An NIR hyperspectral imaging system has been constructed for food research applications, incorporating a SWIR camera with a cooled 14 bit HgCdTe detector and N25E spectrograph (Specim Ltd, Finland). Samples are scanned in a pushbroom mode using a motorised stage. The system has a spectral resolution of 256 pixels covering a range of 970-2500 nm and a spatial resolution of 320 pixels covering a swathe adjustable from 8 to 300 mm. Images are acquired at a rate of up to 100 lines per second, enabling samples to be scanned within a few seconds. Data are captured using SpectralCube software (Specim) and analysed using ENVI and IDL (ITT Visual Information Solutions). Several food applications are presented. The strength of individual absorbance bands enables the distribution of particular components to be assessed. Examples are shown for detection of added gluten in wheat flour and to study the effect of processing conditions on fat distribution in chips/French fries. More detailed quantitative calibrations have been developed to study evolution of the moisture distribution in baguettes during storage at different humidities, to assess freshness of fish using measurements of whole cod and fillets, and for prediction of beef quality by identification and separate measurement of lean and fat regions.
NASA Astrophysics Data System (ADS)
Usenik, Peter; Bürmen, Miran; Vrtovec, Tomaž; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan
2011-03-01
Despite major improvements in dental healthcare and technology, dental caries remains one of the most prevalent chronic diseases of modern society. The initial stages of dental caries are characterized by demineralization of enamel crystals, commonly known as white spots which are difficult to diagnose. If detected early enough, such demineralization can be arrested and reversed by non-surgical means through well established dental treatments (fluoride therapy, anti-bacterial therapy, low intensity laser irradiation). Near-infrared (NIR) hyper-spectral imaging is a new promising technique for early detection of demineralization based on distinct spectral features of healthy and pathological dental tissues. In this study, we apply NIR hyper-spectral imaging to classify and visualize healthy and pathological dental tissues including enamel, dentin, calculus, dentin caries, enamel caries and demineralized areas. For this purpose, a standardized teeth database was constructed consisting of 12 extracted human teeth with different degrees of natural dental lesions imaged by NIR hyper-spectral system, X-ray and digital color camera. The color and X-ray images of teeth were presented to a clinical expert for localization and classification of the dental tissues, thereby obtaining the gold standard. Principal component analysis was used for multivariate local modeling of healthy and pathological dental tissues. Finally, the dental tissues were classified by employing multiple discriminant analysis. High agreement was observed between the resulting classification and the gold standard with the classification sensitivity and specificity exceeding 85 % and 97 %, respectively. This study demonstrates that NIR hyper-spectral imaging has considerable diagnostic potential for imaging hard dental tissues.
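The modeling and classification pipeline can be sketched with synthetic spectra (NumPy only). The band count, the class offsets, and the nearest-class-mean rule are illustrative assumptions standing in for the study's multiple discriminant analysis on real tooth spectra:

```python
import numpy as np

def pca_fit(X, n_components):
    """Principal component analysis via SVD of the centered data;
    returns the mean spectrum and the leading components."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]

def classify_nearest_mean(scores, class_means):
    """Assign each projected spectrum to the nearest class mean in PC
    space -- a simple stand-in for the discriminant-analysis step."""
    d = np.linalg.norm(scores[:, None, :] - class_means[None, :, :], axis=2)
    return np.argmin(d, axis=1)

# Synthetic NIR spectra (100 bands) for two hypothetical tissue classes:
# healthy enamel, and enamel with a demineralization signature.
rng = np.random.default_rng(0)
healthy = rng.normal(0.40, 0.02, (50, 100))
lesion = rng.normal(0.40, 0.02, (50, 100))
lesion[:, 40:60] += 0.15          # hypothetical reflectance change
X = np.vstack([healthy, lesion])
y = np.array([0] * 50 + [1] * 50)

mu, comps = pca_fit(X, n_components=5)
scores = (X - mu) @ comps.T
means = np.stack([scores[y == c].mean(axis=0) for c in (0, 1)])
pred = classify_nearest_mean(scores, means)
accuracy = (pred == y).mean()
```

Sensitivity and specificity would be computed per tissue class against the expert gold standard in the same way accuracy is computed here.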
Augmented microscopy with near-infrared fluorescence detection
NASA Astrophysics Data System (ADS)
Watson, Jeffrey R.; Martirosyan, Nikolay; Skoch, Jesse; Lemole, G. Michael; Anton, Rein; Romanowski, Marek
2015-03-01
Near-infrared (NIR) fluorescence has become a frequently used intraoperative technique for image-guided surgical interventions. In procedures such as cerebral angiography, surgeons use the optical surgical microscope for the color view of the surgical field, and then switch to an electronic display for the NIR fluorescence images. However, the lack of stereoscopic, real-time, and on-site coregistration adds time and uncertainty to image-guided surgical procedures. To address these limitations, we developed the augmented microscope, whereby the electronically processed NIR fluorescence image is overlaid with the anatomical optical image in real-time within the optical path of the microscope. In vitro, the augmented microscope can detect and display indocyanine green (ICG) concentrations down to 94.5 nM, overlaid with the anatomical color image. We prepared polyacrylamide tissue phantoms with embedded polystyrene beads, yielding scattering properties similar to brain matter. In this model, 194 μM solution of ICG was detectable up to depths of 5 mm. ICG angiography was then performed in anesthetized rats. A dynamic process of ICG distribution in the vascular system overlaid with anatomical color images was observed and recorded. In summary, the augmented microscope demonstrates NIR fluorescence detection with superior real-time coregistration displayed within the ocular of the stereomicroscope. In comparison to other techniques, the augmented microscope retains full stereoscopic vision and optical controls including magnification and focus, camera capture, and multiuser access. Augmented microscopy may find application in surgeries where the use of traditional microscopes can be enhanced by contrast agents and image guided delivery of therapeutics, including oncology, neurosurgery, and ophthalmology.
NASA Astrophysics Data System (ADS)
Naqvi, Rizwan Ali; Park, Kang Ryoung
2016-06-01
Gaze tracking systems are widely used in human-computer interfaces, interfaces for the disabled, game interfaces, and for controlling home appliances. Most studies on gaze detection have focused on enhancing its accuracy, whereas few have considered the discrimination of intentional gaze fixation (looking at a target to activate or select it) from unintentional fixation while using gaze detection systems. Previous research methods based on the use of a keyboard or mouse button, eye blinking, and the dwell time of gaze position have various limitations. Therefore, we propose a method for discriminating between intentional and unintentional gaze fixation using a multimodal fuzzy logic algorithm applied to a gaze tracking system with a near-infrared camera sensor. Experimental results show that the proposed method outperforms the conventional method for determining gaze fixation.
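A minimal fuzzy-logic sketch of the idea (pure Python). The input variables, triangular membership ranges, and the two-rule base are hypothetical simplifications of the paper's multimodal algorithm:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def intentional_fixation_score(dwell_ms, scatter_deg):
    """Fuzzy score in [0, 1] that a fixation is intentional, combining
    dwell time and gaze-point scatter with min (AND).
    Membership ranges are illustrative assumptions, not the paper's."""
    long_dwell = tri(dwell_ms, 300, 800, 2000)
    short_dwell = tri(dwell_ms, 0, 100, 400)
    low_scatter = tri(scatter_deg, -0.5, 0.0, 1.5)
    # Rule 1: long dwell AND low scatter  -> intentional fixation.
    # Rule 2: short dwell                 -> unintentional fixation.
    intentional = min(long_dwell, low_scatter)
    unintentional = short_dwell
    total = intentional + unintentional
    return intentional / total if total else 0.0
```

A long, tightly clustered fixation scores near 1 (select the target), while a brief or scattered glance scores near 0 (ignore it); the threshold between the two would be tuned on labeled gaze data.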
Opportunity's 'Rub al Khali' Panorama
NASA Technical Reports Server (NTRS)
2005-01-01
[figure removed for brevity, see original site] Click on the image for Opportunity's 'Rub al Khali' Panorama (QTVR) This panoramic image, dubbed 'Rub al Khali,' was acquired by NASA's Mars Exploration Rover Opportunity on the plains of Meridiani during the period from the rover's 456th to 464th sols on Mars (May 6 to May 14, 2005). Opportunity was about 2 kilometers (1.2 miles) south of 'Endurance Crater' at a place known informally as 'Purgatory Dune.' The rover was stuck in the dune's deep fine sand for more than a month. 'Rub al Khali' (Arabic translation: 'the empty quarter') was chosen as the name for this panorama because it is the name of a similarly barren, desolate part of the Saudi Arabian desert on Earth. The view spans 360 degrees. It consists of images obtained in 97 individual pointings of the panoramic camera. The camera took images with five camera filters at each pointing. This 22,780-by-6,000-pixel mosaic is an approximately true-color rendering generated using the images acquired through filters admitting light wavelengths of 750, 530, and 480 nanometers. Lighting varied during the nine sols it took to acquire this panorama, resulting in some small image seams within the mosaic. These seams have been smoothed in sky parts of the mosaic to better simulate the vista that a person would see if able to view it all at the same time on Mars. Opportunity's tracks leading back to the north (center of the panorama) are a reminder of the rover's long trek from Endurance Crater. The deep ruts dug by Opportunity's wheels as it became stuck in the sand appear in the foreground. The crest and trough of the last ripple the rover crossed before getting stuck are visible in the center. These wind-formed sand features are only about 10 to 15 centimeters (4 to 6 inches) tall. The crest of the actual ripple where the rover got stuck can be seen just to the right of center.
The tracks and a few other places on and near ripple crests can be seen in this color image to be dustier than the undisturbed or 'normal' plains soils in Meridiani. Since the time these ruts were made, some of the dust there has been blown away by the wind, reaffirming the dynamic nature of the martian environment, even in this barren, ocean-like desert of sand.
Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor.
Naqvi, Rizwan Ali; Arsalan, Muhammad; Batchuluun, Ganbayar; Yoon, Hyo Sik; Park, Kang Ryoung
2018-02-03
A paradigm shift is required to prevent the increasing automobile accident deaths that are mostly due to the inattentive behavior of drivers. Knowledge of gaze region can provide valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents some challenges: dizziness due to long drives, extreme lighting variations, glasses reflections, and occlusions. Past studies on gaze detection in cars have been chiefly based on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, the error of accurately detecting the pupil center and corneal reflection center is increased in a car environment due to various environment light changes, reflections on glasses surface, and motion and optical blurring of captured eye image. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address this issue, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor considering driver head and eye movement that does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on open Columbia gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than the previous gaze classification methods.
a Low-Cost Panoramic Camera for the 3d Documentation of Contaminated Crime Scenes
NASA Astrophysics Data System (ADS)
Abate, D.; Toschi, I.; Sturdy-Colls, C.; Remondino, F.
2017-11-01
Crime scene documentation is a fundamental task which has to be undertaken in a fast, accurate and reliable way, highlighting evidence which can be further used for ensuring justice for victims and for guaranteeing the successful prosecution of perpetrators. The main focus of this paper is on the documentation of a typical crime scene and on the rapid recording of any possible contamination that could have influenced its original appearance. A 3D reconstruction of the environment is first generated by processing panoramas acquired with the low-cost Ricoh Theta 360 camera, and further analysed to highlight the potential and limits of this emerging consumer-grade technology. Then, a methodology is proposed for the rapid recording of changes occurring between the original and the contaminated crime scene. The approach is based on an automatic 3D feature-based data registration, followed by a cloud-to-cloud distance computation, taking as input the 3D point clouds generated before and after, for example, the misplacement of evidence. All the algorithms adopted for panorama pre-processing, photogrammetric 3D reconstruction, and 3D geometry registration and analysis are presented and are currently available in open-source or low-cost software solutions.
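The cloud-to-cloud comparison step can be sketched as follows (NumPy brute force). The grid scene, the 0.3 m offset and the 0.05 m threshold are synthetic stand-ins; real scans would use a KD-tree and the registered photogrammetric point clouds:

```python
import numpy as np

def cloud_to_cloud_distance(reference, scene):
    """Nearest-neighbour distance from each point of `scene` to the
    `reference` cloud (brute force; adequate for small toy clouds)."""
    d = np.linalg.norm(scene[:, None, :] - reference[None, :, :], axis=2)
    return d.min(axis=1)

# Reference scene: a regular grid of points with 0.2 m spacing.
g = np.linspace(0.0, 1.0, 6)
X, Y, Z = np.meshgrid(g, g, g)
reference = np.stack([X.ravel(), Y.ravel(), Z.ravel()], axis=1)  # 216 points

# Contaminated scene: the same points, plus 20 of them shifted 0.3 m
# in x to simulate a misplaced piece of evidence.
moved = reference[:20] + np.array([0.3, 0.0, 0.0])
scene = np.vstack([reference, moved])

dist = cloud_to_cloud_distance(reference, scene)
flagged = dist > 0.05        # highlights only the displaced points
```

Unchanged points have zero distance to the reference cloud, so thresholding the distance map isolates exactly the regions where contamination occurred.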
Legacy Panorama on Spirit's Way to 'Bonneville'
NASA Technical Reports Server (NTRS)
2005-01-01
[figure removed for brevity, see original site] Click on the image for Legacy Panorama on Spirit's Way to 'Bonneville' (QTVR) This view captured by the panoramic camera on NASA's Mars Exploration Rover Spirit nearly a year ago is called Spirit's 'Legacy' panorama. It combines many frames acquired during Spirit's 59th through 61st martian days, or sols (March 3 to 5, 2004), from a position about halfway between the landing site and the rim of 'Bonneville Crater.' The location is within the transition from the relatively smooth plains to the more rocky and rugged blanket of material ejected from Bonneville by the force of the impact that dug the crater. The panorama spans 360 degrees and consists of images obtained in 78 individual pointings. The camera took images through five different filters at each pointing. This mosaic is an approximately true-color rendering generated using the images acquired through filters centered at wavelengths of 750, 530, and 480 nanometers. The Columbia Memorial Station lander can be seen about 200 meters (about 650 feet) in the distance by following the rover tracks back toward the right of center in the mosaic and zooming in.
2D and 3D visualization methods of endoscopic panoramic bladder images
NASA Astrophysics Data System (ADS)
Behrens, Alexander; Heisterklaus, Iris; Müller, Yannick; Stehle, Thomas; Gross, Sebastian; Aach, Til
2011-03-01
While several mosaicking algorithms have been developed to compose endoscopic images of the internal urinary bladder wall into panoramic images, the quantitative evaluation of these output images in terms of geometrical distortions has often not been discussed. However, the visualization of the distortion level is highly desired for an objective image-based medical diagnosis. Thus, we present in this paper a method to create quality maps from the characteristics of the transformation parameters applied to the endoscopic images during the registration process of the mosaicking algorithm. For a global first-view impression, the quality maps are laid over the panoramic image and highlight image regions in pseudo-colors according to their local distortions. This illustration then helps surgeons identify geometrically distorted structures easily in the panoramic image, allowing more objective medical interpretation of tumor tissue in shape and size. Aside from introducing quality maps in 2-D, we also discuss a visualization method to map panoramic images onto a 3-D spherical bladder model. Reference points are manually selected by the surgeon in the panoramic image and the 3-D model. Then the panoramic image is mapped by the Hammer-Aitoff equal-area projection onto the 3-D surface using texture mapping. Finally, the textured bladder model can be freely moved in a virtual environment for inspection. Using a two-hemisphere bladder representation, references between panoramic image regions and their corresponding space coordinates within the bladder model are reconstructed. This additional spatial 3-D information thus assists the surgeon in navigation, documentation, as well as surgical planning.
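The Hammer-Aitoff mapping mentioned above is a standard equal-area projection; a minimal sketch of the forward formulas is given below (textbook equations, not the authors' implementation — any scaling into texture coordinates would be an additional step).

```python
import numpy as np

def hammer_aitoff(lon, lat):
    """Hammer-Aitoff equal-area projection.

    lon in [-pi, pi], lat in [-pi/2, pi/2]; returns x in [-2*sqrt(2), 2*sqrt(2)]
    and y in [-sqrt(2), sqrt(2)].  Being equal-area, the projection preserves
    relative surface areas, which matters when judging lesion size on a map
    of the bladder wall.
    """
    z = np.sqrt(1.0 + np.cos(lat) * np.cos(lon / 2.0))
    x = 2.0 * np.sqrt(2.0) * np.cos(lat) * np.sin(lon / 2.0) / z
    y = np.sqrt(2.0) * np.sin(lat) / z
    return x, y

# The map centre projects to the origin; the equator's edge to x = 2*sqrt(2).
x0, y0 = hammer_aitoff(0.0, 0.0)
xe, ye = hammer_aitoff(np.pi, 0.0)
```

Texture mapping then amounts to sampling the panoramic image at (x, y) for each vertex of the spherical (or two-hemisphere) bladder mesh.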
AN ANALYSIS OF THE SHAPES OF INTERSTELLAR EXTINCTION CURVES. VI. THE NEAR-IR EXTINCTION LAW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzpatrick, E. L.; Massa, D.
We combine new observations from the Hubble Space Telescope's Advanced Camera for Surveys with existing data to investigate the wavelength dependence of near-IR (NIR) extinction. Previous studies suggest a power-law form for NIR extinction, with a 'universal' value of the exponent, although some recent observations indicate that significant sight line-to-sight line variability may exist. We show that a power-law model for NIR extinction provides an excellent fit to most extinction curves, but that the value of the power, β, varies significantly from sight line to sight line. Therefore, it seems that a 'universal NIR extinction law' is not possible. Instead, we find that as β decreases, R(V) ≡ A(V)/E(B - V) tends to increase, suggesting that NIR extinction curves which have been considered 'peculiar' may, in fact, be typical for different R(V) values. We show that the power-law parameters can depend on the wavelength interval used to derive them, with β increasing as longer wavelengths are included. This result implies that extrapolating power-law fits to determine R(V) is unreliable. To avoid this problem, we adopt a different functional form for NIR extinction. This new form mimics a power law whose exponent increases with wavelength, has only two free parameters, can fit all of our curves over a longer wavelength baseline and to higher precision, and produces R(V) values which are consistent with independent estimates and commonly used methods for estimating R(V). Furthermore, unlike the power-law model, it gives R(V) values that are independent of the wavelength interval used to derive them. It also suggests that the relation R(V) = -1.36 E(K - V)/E(B - V) - 0.79 can estimate R(V) to ±0.12.
Finally, we use model extinction curves to show that our extinction curves are in accord with theoretical expectations, and demonstrate how large samples of observational quantities can provide useful constraints on the grain properties.
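The quoted relation can be applied directly; the helper below (hypothetical naming, just the arithmetic from the abstract) illustrates it with an example ratio.

```python
def estimate_rv(e_kv: float, e_bv: float) -> float:
    """Estimate R(V) via the abstract's relation R(V) = -1.36 E(K-V)/E(B-V) - 0.79.

    e_kv: colour excess E(K - V); negative for reddened sight lines, since
          extinction is much smaller at K than at V.
    e_bv: colour excess E(B - V).
    Quoted scatter is about +/-0.12 in R(V).
    """
    return -1.36 * (e_kv / e_bv) - 0.79

# For example, a colour-excess ratio E(K-V)/E(B-V) = -3.0 yields
# R(V) = -1.36 * (-3.0) - 0.79 = 3.29.
rv = estimate_rv(-3.0, 1.0)
```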
Ezoddini Ardakani, Fatemeh; Zangoie Booshehri, Maryam; Banadaki, Seyed Hossein Saeed; Nafisi-Moghadam, Reza
2012-01-01
Background Scaphoid fractures are the most common type of carpal fractures. Objectives The aim of the study was to compare the diagnostic value of panoramic and conventional radiographs of the wrist in scaphoid fractures. Patients and Methods The panoramic and conventional radiographs of 122 patients with acute and chronic wrist trauma were studied. The radiographs were analyzed and examined by two independent radiologist observers: one physician radiologist and one maxillofacial radiologist. The final diagnosis was made by an orthopedic specialist. The kappa test was used for statistical calculations of inter- and intra-observer agreement and of the correlation between the two techniques. Results Wrist panoramic radiography was more accurate than conventional radiography for ruling out scaphoid fractures. There was agreement in 85% or more of the cases. Agreement values were higher, with better inter- and intra-observer agreement, for panoramic examinations than for conventional radiographic examinations. Conclusion The panoramic examination of the wrist is a useful technique for the diagnosis and follow-up of scaphoid fractures. Its use is recommended as a complement to conventional radiography in cases with inconclusive findings. PMID:23599708
NASA Astrophysics Data System (ADS)
Kedzierski, M.; Walczykowski, P.; Wojtkowska, M.; Fryskowska, A.
2017-08-01
Terrestrial Laser Scanning is currently one of the most common techniques for modelling and documenting structures of cultural heritage. However, geometric information on its own, without the addition of imagery data, is insufficient when formulating a precise statement about the status of the studied structure, for feature extraction, or for indicating the sites to be restored. Therefore, the Authors propose the integration of spatial data from terrestrial laser scanning with imaging data from low-cost cameras. The use of images from low-cost cameras makes it possible to limit the costs needed to complete such a study, thus increasing the possibility of intensifying the frequency of photographing and monitoring of the given structure. As a result, the analysed cultural heritage structures can be monitored more closely and in more detail, meaning that the technical documentation concerning the structure is also more precise. To supplement the laser scanning information, the Authors propose using images taken both in the near-infrared range and in the visible range. This choice is motivated by the fact that not all important features of historical structures are visible in RGB imagery, but they can be identified in NIR imagery, which, when merged with a three-dimensional point cloud, gives full spatial information about the cultural heritage structure in question. The Authors propose an algorithm that automates the process of integrating NIR images with a point cloud using parameters which had been calculated during the transformation of RGB images.
A number of conditions affecting the accuracy of the texturing were studied: in particular, the impact of the geometry of the distribution of adjustment points and their number on the accuracy of the integration process, the correlation between the intensity value and the error at specific points using images in different ranges of the electromagnetic spectrum, and the selection of the optimal method of transforming the acquired imagery. As a result of the research, an innovative solution was achieved, giving high-accuracy results and taking into account a number of factors important in the creation of the documentation of historical structures. In addition, thanks to the designed algorithm, the final result can be obtained in a very short time and at a high level of automation relative to similar types of studies, meaning that it would be possible to obtain a significant data set for further analyses and more detailed monitoring of the state of the historical structures.
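The core of such an integration step — assigning each laser-scanning point the value of the image pixel it projects to — can be sketched generically as below. This is a plain pinhole-camera sketch under assumed, simplified intrinsics and extrinsics, not the Authors' algorithm; in their workflow the transformation parameters would come from the RGB orientation and be reused for the NIR images.

```python
import numpy as np

def texture_points(points, image, K, R, t):
    """Assign each 3-D point the intensity of the pixel it projects to.

    points: (N, 3) point cloud in world coordinates.
    image:  (H, W) single-band image (e.g. a NIR channel).
    K:      (3, 3) camera intrinsics; R, t: extrinsics (world -> camera).
    Points projecting outside the image, or behind the camera, get NaN.
    """
    cam = R @ points.T + t.reshape(3, 1)   # world -> camera frame
    uvw = K @ cam                          # camera -> homogeneous pixel coords
    u = uvw[0] / uvw[2]
    v = uvw[1] / uvw[2]
    h, w = image.shape
    vals = np.full(len(points), np.nan)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h) & (uvw[2] > 0)
    vals[inside] = image[v[inside].astype(int), u[inside].astype(int)]
    return vals

# Toy setup: camera at the origin looking down +Z, 100x100 image, f = 100 px.
K = np.array([[100.0, 0.0, 50.0], [0.0, 100.0, 50.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
image = np.zeros((100, 100))
image[50, 50] = 255.0                                  # one bright NIR pixel
points = np.array([[0.0, 0.0, 2.0],                    # on the optical axis
                   [5.0, 0.0, 2.0]])                   # projects outside the frame
vals = texture_points(points, image, K, R, t)
```

The on-axis point picks up the bright pixel's value; the off-axis point falls outside the image and stays untextured.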
Spirit Greets New Terrain, New Season on Mars
NASA Technical Reports Server (NTRS)
2006-01-01
In time to survive the Martian winter, NASA's Mars Exploration Rover Spirit has driven to and parked on a north-facing slope in the 'Columbia Hills.' This vantage point will optimize solar power during the upcoming winter season and maximize the vehicle's ability to communicate with the NASA Odyssey orbiter. Top science priorities for the coming months are a detailed, 360-degree panorama using all 13 filters of the panoramic camera, a study of surface and subsurface soil properties, and monitoring of the atmosphere and its changes. The planned subsurface soil experiments will be a first for the Mars Exploration Rover mission. To conduct the study, Spirit will use the brush on the rock abrasion tool to carefully sweep away soil, much the way an archaeologist uses a brush to uncover artifacts. At each level, Spirit will measure the mineral and chemical properties and assess the physical nature (such as grain size, texture, hardness) of the material, using the Athena science instruments on the robotic arm. Of particular interest are vertical variations in soil characteristics that may indicate water-related deposition of sulfates and other minerals. Panoramic images will provide important information about the nature and origin of surrounding rocks and soils. Spirit will also study the mineralogy of the surrounding terrain using the thermal emission spectrometer and search for surface changes caused by high winds. After the winter solstice in August, depending on energy levels, scientists may direct the rover to pivot around the disabled, right front wheel to get different targets within reach of the arm. When the winter season is over and solar energy levels rise again, scientists will direct Spirit to leave its winter campaign site and continue examining the 'Columbia Hills.' Spirit acquired the images in this mosaic with the navigation camera on the rover's 807th Martian day, or sol, of exploring Gusev Crater on Mars (April 11, 2006). 
Approaching from the east are the rover's tracks, including a shallow trench created by the dragging front wheel. On the horizon, in the center of the panorama, is 'McCool Hill.' This view is presented in a cylindrical projection with geometric seam correction.