Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System
NASA Astrophysics Data System (ADS)
Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki
In this paper, we present an automatic vision-based traffic sign recognition system that can detect and classify traffic signs at long distance under different lighting conditions. To this end, the traffic sign recognition is implemented in an originally proposed dual-focal active camera system, in which a telephoto camera serves as an assistant to a wide-angle camera. The telephoto camera can capture a high-resolution image of an object of interest in the field of view of the wide-angle camera, providing enough information for recognition when the sign appears at too low a resolution in the wide-angle image. In the proposed system, traffic sign detection and classification are processed separately on the images from the wide-angle camera and the telephoto camera. In addition, to detect traffic signs against complex backgrounds under different lighting conditions, we propose a color transformation that is invariant to lighting changes. This transformation highlights the patterns of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution cascade detector is trained and used to locate traffic signs at low resolution in the wide-angle image. After detection, the system actively captures a high-resolution image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on information from the wide-angle camera. In classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-resolution telephoto image. Finally, a set of experiments on traffic sign recognition with the proposed system is presented. The experimental results demonstrate that the proposed system can effectively recognize low-resolution traffic signs under different lighting conditions.
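The abstract does not specify the exact color transformation. One common lighting-invariant choice is chromaticity normalization, sketched below purely as an illustration; the authors' actual transform may differ:

```python
import numpy as np

def chromaticity_transform(rgb):
    """Map an RGB image to illumination-invariant chromaticity.

    Dividing each channel by the channel sum cancels a global
    brightness scale, so the output is unchanged when the same scene
    is uniformly brighter or darker.
    """
    rgb = rgb.astype(np.float64)
    s = rgb.sum(axis=-1, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on black pixels
    return rgb / s

# A red traffic-sign pixel under bright and dim illumination:
bright = np.array([[[200.0, 40.0, 40.0]]])
dim = 0.3 * bright  # same scene at 30% of the light
print(np.allclose(chromaticity_transform(bright),
                  chromaticity_transform(dim)))  # True
```

Because the chromaticity of a red sign border stays fixed under brightness changes, a detector trained on such a representation sees a much more stable input than raw RGB.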
NASA Technical Reports Server (NTRS)
Ivanov, Anton B.
2003-01-01
The Mars Orbiter Camera (MOC) has been operating on board the Mars Global Surveyor (MGS) spacecraft since 1998. It consists of three cameras: Red and Blue Wide Angle cameras (FOV = 140 deg.) and a Narrow Angle camera (FOV = 0.44 deg.). The Wide Angle cameras allow surface resolution down to 230 m/pixel and the Narrow Angle camera down to 1.5 m/pixel. This work is a continuation of the project we have reported previously. Since then we have refined and improved our stereo correlation algorithm and have processed many more stereo pairs. We will discuss results of our analysis of stereo pairs located at the Mars Exploration Rover (MER) landing sites and address the feasibility of recovering topography from stereo pairs (especially in the polar regions) taken during the MGS 'Relay-16' mode.
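The stereo correlation step can be illustrated with a minimal normalized cross-correlation disparity search. This is only a sketch under simplifying assumptions (1-D scanlines, arbitrary window size and search range); actual MOC stereo processing is far more involved:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-length patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def best_disparity(left, right, x, win, max_d):
    """Shift (in pixels) that best aligns a patch of `left` at column x
    with the corresponding region of `right`."""
    patch = left[x:x + win]
    scores = [ncc(patch, right[x + d:x + d + win]) for d in range(max_d)]
    return int(np.argmax(scores))

# Synthetic 1-D scanlines: `right` is `left` shifted 3 pixels to the right.
rng = np.random.default_rng(0)
left = rng.random(100)
right = np.roll(left, 3)
print(best_disparity(left, right, 10, 15, 8))  # 3
```

Once disparities are known, topography follows from the imaging geometry of the two viewing angles; that conversion is omitted here.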
A wide-angle camera module for disposable endoscopy
NASA Astrophysics Data System (ADS)
Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee
2016-08-01
A wide-angle miniaturized camera module for disposable endoscopes is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and an LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype disposable endoscope is implemented to perform pre-clinical animal testing, in which the esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.
The Wide Angle Camera of the ROSETTA Mission
NASA Astrophysics Data System (ADS)
Barbieri, C.; Fornasier, S.; Verani, S.; Bertini, I.; Lazzarin, M.; Rampazzi, F.; Cremonese, G.; Ragazzoni, R.; Marzari, F.; Angrilli, F.; Bianchini, G. A.; Debei, S.; Dececco, M.; Guizzo, G.; Parzianello, G.; Ramous, P.; Saggin, B.; Zaccariotto, M.; Da Deppo, V.; Naletto, G.; Nicolosi, G.; Pelizzo, M. G.; Tondello, G.; Brunello, P.; Peron, F.
This paper aims to give a brief description of the Wide Angle Camera (WAC), built by the Centro Servizi e Attività Spaziali (CISAS) of the University of Padova for the ESA ROSETTA mission to comet 46P/Wirtanen and asteroids 4979 Otawara and 140 Siwa. The WAC is part of the OSIRIS imaging system, which also comprises a Narrow Angle Camera (NAC) built by the Laboratoire d'Astrophysique Spatiale (LAS) of Marseille. CISAS also had the responsibility to build the shutter and the front cover mechanism for the NAC. The flight model of the WAC was delivered in December 2001 and has already been integrated on ROSETTA.
1990-02-14
Range: 4 billion miles from Earth, at 32 degrees above the ecliptic. P-36057C This color image of the Sun, Earth, and Venus is one of the first, and maybe only, images that show our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time, with narrow-angle frames centered on each planet. The wide-angle image was taken with the camera's darkest filter, a methane absorption band, and the shortest possible exposure, one two-hundredth of a second, to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system. Yet it is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of this brightness is a bright, burned-out image with multiple reflections from the optics of the camera. The rays around the sun are a diffraction pattern of the calibration lamp, which is mounted in front of the wide-angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaicked into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce the color image. The violet, green, and blue filters were used, with exposure times of 0.72, 0.48, and 0.72 seconds for Earth, and 0.36, 0.24, and 0.36 seconds for Venus. The images also show long linear streaks resulting from scattering of sunlight off parts of the camera and its sun shade.
Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K
2014-07-01
We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space. No author has a financial or proprietary interest in any material or method mentioned.
Jung, Kyunghwa; Choi, Hyunseok; Hong, Hanpyo; Adikrishna, Arnold; Jeon, In-Ho; Hong, Jaesung
2017-02-01
A hands-free region-of-interest (ROI) selection interface is proposed for solo surgery using a wide-angle endoscope. A wide-angle endoscope provides images with a larger field of view than a conventional endoscope. With an appropriate interface for selecting a ROI, surgeons can also obtain a detailed local view, as if they had moved a conventional endoscope to a specific position and direction. To manipulate the endoscope without releasing the surgical instrument in hand, a mini-camera is attached to the instrument, and the images taken by the attached camera are analyzed. When a surgeon moves the instrument, the instrument orientation is calculated by image processing. Surgeons can select the ROI with this instrument movement after switching from 'task mode' to 'selection mode.' The accelerated KAZE (AKAZE) algorithm is used to track the features of the camera images once the instrument is moved. Both the wide-angle and detailed local views are displayed simultaneously, and a surgeon can move the local view area by moving the mini-camera attached to the surgical instrument. Local view selection for a solo surgery was performed without releasing the instrument. The accuracy of camera pose estimation was not significantly different between camera resolutions, but it was significantly different between background camera images with different numbers of features (P < 0.01). The success rate of ROI selection diminished as the number of separated regions increased; however, up to 12 separated regions with a region size of 160 × 160 pixels were selected with no failure. Surgical tasks on a phantom model and a cadaver were attempted to verify feasibility in a clinical environment. Hands-free endoscope manipulation without releasing the instruments in hand was achieved. The proposed method requires only a small, low-cost camera and image processing. The technique enables surgeons to perform solo surgeries without a camera assistant.
Lunar Reconnaissance Orbiter Camera (LROC) instrument overview
Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.
2010-01-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel in the visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.
Voyager spacecraft images of Jupiter and Saturn
NASA Technical Reports Server (NTRS)
Birnbaum, M. M.
1982-01-01
The Voyager imaging system is described, noting that it is made up of a narrow-angle and a wide-angle TV camera, each in turn consisting of optics, a filter wheel and shutter assembly, a vidicon tube, and an electronics subsystem. The narrow-angle camera has a focal length of 1500 mm; its field of view is 0.42 deg and its focal ratio is f/8.5. For the wide-angle camera, the focal length is 200 mm, the field of view 3.2 deg, and the focal ratio f/3.5. Images are exposed by each camera through one of eight filters in the filter wheel onto the photoconductive surface of a magnetically focused and deflected vidicon having a diameter of 25 mm. The vidicon storage surface (target) is a selenium-sulfur film with an active area of 11.14 × 11.14 mm; it holds a frame consisting of 800 lines with 800 picture elements per line. Pictures of Jupiter, Saturn, and their moons are presented, with short descriptions of the areas being viewed.
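The quoted fields of view follow directly from the focal lengths and the 11.14 mm target size via the standard pinhole relation FOV = 2·atan(w / 2f), as this small check shows:

```python
import math

def fov_deg(target_mm, focal_mm):
    """Full angular field of view for a detector of the given width
    behind a lens of the given focal length (pinhole geometry)."""
    return math.degrees(2 * math.atan(target_mm / (2 * focal_mm)))

# 11.14 mm vidicon target behind the two Voyager lenses:
print(round(fov_deg(11.14, 1500), 2))  # 0.43 (narrow angle, quoted as 0.42 deg)
print(round(fov_deg(11.14, 200), 2))   # 3.19 (wide angle, quoted as 3.2 deg)
```

The small discrepancy on the narrow-angle value is rounding in the quoted specification.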
Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera
NASA Astrophysics Data System (ADS)
Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and the non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets, are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources, and provide estimates of the sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.
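The calibration components mentioned (dark level, relative pixel response, absolute radiometry) compose into a generic reduction of the form below. This is a schematic pipeline with synthetic coefficients, not the actual LROC WAC calibration:

```python
import numpy as np

def calibrate_frame(dn, dark, flat, exposure_s, abs_coeff):
    """Convert raw detector counts to radiance: subtract the dark level,
    divide out the pixel-to-pixel flat-field response, normalize by
    exposure time, and apply an absolute radiometric coefficient."""
    return abs_coeff * (dn - dark) / (flat * exposure_s)

# Round-trip check on a synthetic frame: forward-model raw counts from a
# known radiance field, then invert them with the same coefficients.
rng = np.random.default_rng(1)
radiance = rng.uniform(10, 50, size=(4, 4))
dark = 60.0
flat = rng.uniform(0.9, 1.1, size=(4, 4))
t, k = 0.02, 2.5
dn = dark + radiance * flat * t / k
recovered = calibrate_frame(dn, dark, flat, t, k)
print(np.allclose(recovered, radiance))  # True
```

Inflight work like that described here is essentially about re-estimating `dark`, `flat`, and `abs_coeff` (plus stray-light terms) from on-orbit observations rather than ground data.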
Instrumentation for Infrared Airglow Clutter.
1987-03-10
gain, and filter position to the Camera Head, and monitors these parameters as well as preamp video. GAZER is equipped with a Lenzar wide-angle, low... Specifications/Parameters: VIDEO SENSOR: Camera: LENZAR Intensicon-8 LLLTV using 2nd-gen micro-channel intensifier and proprietary camera tube
Toslak, Devrim; Liu, Changgeng; Alam, Minhaj Nur; Yao, Xincheng
2018-06-01
A portable fundus imager is essential for emerging telemedicine screening and point-of-care examination of eye diseases. However, existing portable fundus cameras have limited field of view (FOV) and frequently require pupillary dilation. We report here a miniaturized indirect ophthalmoscopy-based nonmydriatic fundus camera with a snapshot FOV up to 67° external angle, which corresponds to a 101° eye angle. The wide-field fundus camera consists of a near-infrared light source (LS) for retinal guidance and a white LS for color retinal imaging. By incorporating digital image registration and glare elimination methods, a dual-image acquisition approach was used to achieve reflection artifact-free fundus photography.
Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera
NASA Astrophysics Data System (ADS)
Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu
2016-09-01
We perform an experiment on achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under visible and infrared light. As a result, the removal of chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration setup assuming the use of a night-vision camera in an automobile and showed the effect of the WFC system.
1999-08-24
One wide-angle and eight narrow-angle camera images of Miranda, taken by NASA Voyager 2, were combined in this view. The controlled mosaic was transformed to an orthographic view centered on the south pole.
Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array
NASA Astrophysics Data System (ADS)
Houben, Sebastian
2015-03-01
The variety of vehicle-mounted sensors required to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping fields of view of a multi-camera fisheye surround-view system, as used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. spatially varying resolution) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for this purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss reasons for and avoidance of the shown caveats, and present first results on a prototype topview setup.
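The varying-resolution issue stems from the fisheye projection itself. Assuming an equidistant fisheye model (r = f·θ) for illustration — the paper's cameras may follow a different model — the sketch below contrasts it with the pinhole model (r = f·tan θ) and shows the mapping a rectification step must apply before classical block matching:

```python
import math

def pinhole_radius(f_mm, theta_rad):
    """Pinhole projection: image radius r = f * tan(theta).
    Diverges toward 90 deg off-axis, so it cannot represent a fisheye FOV."""
    return f_mm * math.tan(theta_rad)

def equidistant_radius(f_mm, theta_rad):
    """Equidistant fisheye: r = f * theta. Radius grows linearly with angle,
    so angular resolution per pixel varies across the reprojected image."""
    return f_mm * theta_rad

def fisheye_radius_for_pinhole(f_mm, r_pinhole):
    """Rectification lookup: for a target pinhole pixel radius, find the
    fisheye image radius that samples the same ray direction."""
    theta = math.atan(r_pinhole / f_mm)
    return equidistant_radius(f_mm, theta)

# A point 45 deg off-axis (pinhole radius = f) maps to fisheye radius f*pi/4:
print(round(fisheye_radius_for_pinhole(2.0, 2.0), 3))  # 1.571
```

A full rectification applies this radial lookup per pixel (plus the extrinsic rotation between the two cameras) to produce epipolar-aligned images for correspondence search.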
Solar System Portrait - 60 Frame Mosaic
1996-09-13
The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever portrait of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. 
Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. http://photojournal.jpl.nasa.gov/catalog/PIA00451
Solar System Portrait - 60 Frame Mosaic
NASA Technical Reports Server (NTRS)
1990-01-01
The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever 'portrait' of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. 
Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun.
Fabrication of multi-focal microlens array on curved surface for wide-angle camera module
NASA Astrophysics Data System (ADS)
Pan, Jun-Gu; Su, Guo-Dung J.
2017-08-01
In this paper, we present a wide-angle and compact camera module that consists of a microlens array with different focal lengths on a curved surface. The design integrates the principles of an insect's compound eye and the human eye: it contains a curved hexagonal microlens array and a spherical lens. Normal mobile phone cameras usually need no fewer than four lenses, but our proposed system uses only one. Furthermore, the thickness of our proposed system is only 2.08 mm and the diagonal full field of view is about 100 degrees. To make the critical microlens array, we used inkjet printing to control the surface shape of each microlens to achieve different focal lengths, and a replication method to form the curved hexagonal microlens array.
NASA Astrophysics Data System (ADS)
Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.
1990-10-01
Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and a wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are pushbroom instruments, each containing a single optical system and a focal plane with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.
First Results from the Wide Angle Camera of the ROSETTA Mission .
NASA Astrophysics Data System (ADS)
Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; De Cecco, M.; Parzianello, G.; Zaccariotto, M.; Da Deppo, V.; Naletto, G.
This paper gives a brief description of the Wide Angle Camera (WAC), built by the Center of Studies and Activities for Space (CISAS) of the University of Padova for the ESA ROSETTA mission, of the data we have obtained about the new mission targets, and of the first results achieved after the launch in March 2004. The WAC is part of the OSIRIS imaging system, built under the PI-ship of Dr. U. Keller (Max-Planck-Institute for Solar System Studies), which also comprises a Narrow Angle Camera (NAC) built by the Laboratoire d'Astrophysique Spatiale (LAS) of Marseille. CISAS also had the responsibility to build the shutter and the front door mechanism for the NAC. The images show the excellent optical quality of the WAC, exceeding the specifications in terms of encircled energy (80% within one pixel over a FoV of 12 × 12 square degrees), limiting magnitude (fainter than 13th magnitude in a 30 s exposure through a wideband red filter), and amount of distortion.
Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish
2018-01-01
Purpose: The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Methods: Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Results: Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos, except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Conclusion: Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching. PMID:29283133
Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish
2018-01-01
The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos, except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching.
Omnidirectional Underwater Camera Design and Calibration
Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David
2015-01-01
This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS, in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
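The explicit per-ray modeling can be sketched with vector-form Snell refraction at a flat port. This is a generic illustration of why the pinhole model breaks underwater, not the paper's actual simulator:

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n
    (pointing toward the incoming ray), via Snell's law in vector form.
    Returns None on total internal reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    r = n1 / n2
    cos_i = -np.dot(n, d)
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return r * d + (r * cos_i - cos_t) * n

# A ray leaving the housing glass (n = 1.5) into water (n = 1.33) at
# 30 deg off-axis bends away from the normal; per-pixel angle changes
# like this are exactly what the pinhole model cannot represent.
d = np.array([np.sin(np.radians(30)), 0.0, np.cos(np.radians(30))])
n = np.array([0.0, 0.0, -1.0])  # flat-port normal, toward the camera
t = refract(d, n, 1.5, 1.33)
print(round(float(np.degrees(np.arcsin(t[0]))), 1))  # 34.3
```

Sweeping such rays over the sensor, through each housing surface in turn, yields the effective underwater FOV for a candidate housing geometry.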
Solar System Portrait - View of the Sun, Earth and Venus
1996-09-13
This color image of the sun, Earth and Venus was taken by the Voyager 1 spacecraft Feb. 14, 1990, when it was approximately 32 degrees above the plane of the ecliptic and at a slant-range distance of approximately 4 billion miles. It is the first -- and may be the only -- time that we will ever see our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time with two narrow-angle pictures centered on each planet. The wide-angle was taken with the camera's darkest filter (a methane absorption band), and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system but is still eight million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of the brightness is a bright burned out image with multiple reflections from the optics in the camera. The "rays" around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaiced into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce a color image. The violet, green and blue filters were used; exposure times were, for the Earth image, 0.72, 0.48 and 0.72 seconds, and for the Venus frame, 0.36, 0.24 and 0.36, respectively. 
Although the planetary pictures were taken with the narrow-angle camera (1500 mm focal length) and were not pointed directly at the sun, they show the effects of the glare from the nearby sun, in the form of long linear streaks resulting from the scattering of sunlight off parts of the camera and its sun shade. From Voyager's great distance both Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. Detailed analysis also suggests that Voyager detected the moon as well, but it is too faint to be seen without special processing. Venus was only 0.11 pixel in diameter. The faint colored structure in both planetary frames results from sunlight scattered in the optics. http://photojournal.jpl.nasa.gov/catalog/PIA00450
Solar System Portrait - View of the Sun, Earth and Venus
NASA Technical Reports Server (NTRS)
1990-01-01
This color image of the sun, Earth and Venus was taken by the Voyager 1 spacecraft Feb. 14, 1990, when it was approximately 32 degrees above the plane of the ecliptic and at a slant-range distance of approximately 4 billion miles. It is the first -- and may be the only -- time that we will ever see our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time with two narrow-angle pictures centered on each planet. The wide-angle was taken with the camera's darkest filter (a methane absorption band), and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system but is still eight million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of the brightness is a bright burned out image with multiple reflections from the optics in the camera. The 'rays' around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaiced into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce a color image. The violet, green and blue filters were used; exposure times were, for the Earth image, 0.72, 0.48 and 0.72 seconds, and for the Venus frame, 0.36, 0.24 and 0.36, respectively. 
Although the planetary pictures were taken with the narrow-angle camera (1500 mm focal length) and were not pointed directly at the sun, they show the effects of the glare from the nearby sun, in the form of long linear streaks resulting from the scattering of sunlight off parts of the camera and its sun shade. From Voyager's great distance both Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. Detailed analysis also suggests that Voyager detected the moon as well, but it is too faint to be seen without special processing. Venus was only 0.11 pixel in diameter. The faint colored structure in both planetary frames results from sunlight scattered in the optics.
http://photojournal.jpl.nasa.gov/catalog/PIA00450
Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.
Song, Kai-Tai; Tai, Jen-Chao
2006-10-01
Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Experimental results are presented to validate the robustness and accuracy of the proposed method.
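As an illustration of one step in such a method, the common vanishing point of detected parallel lane markings can be estimated by a least-squares intersection of the fitted line segments. This is a generic sketch, not the authors' implementation; the function name and segment format are hypothetical.

```python
import numpy as np

def vanishing_point(segments):
    """Least-squares intersection of image line segments, each given as a
    pair of endpoints ((x1, y1), (x2, y2)). For parallel lane markings the
    intersection of their image lines is the vanishing point."""
    A, b = [], []
    for (x1, y1), (x2, y2) in segments:
        nx, ny = y2 - y1, x1 - x2          # normal of the segment's line
        A.append([nx, ny])
        b.append(nx * x1 + ny * y1)        # line equation: nx*x + ny*y = c
    sol, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return sol
```

With noisy detections the least-squares solve averages out per-segment errors; with exact lines it returns their common intersection.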
2011-07-01
cameras were installed around the test pan and an underwater GoPro ® video camera recorded the fire from below the layer of fuel. 3.2.2. Camera Images...Distribution A: Approved for public release; distribution unlimited. 3.2.3. Video Images A GoPro video camera with a wide angle lens recorded the tests...camera and the GoPro ® video camera were not used for fire suppression experiments. 3.3.2. Test Pans Two ¼-in thick stainless steel test pans were
Have a Nice Spring! MOC Revisits "Happy Face" Crater
2005-05-16
Smile! Spring has sprung in the martian southern hemisphere. With it comes the annual retreat of the winter polar frost cap. This view of "Happy Face Crater"--officially named "Galle Crater"--shows patches of white water ice frost in and around the crater's south-facing slopes. Slopes that face south will retain frost longer than north-facing slopes because they do not receive as much sunlight in early spring. This picture is a composite of images taken by the Mars Global Surveyor Mars Orbiter Camera (MOC) red and blue wide angle cameras. The wide angle cameras were designed to monitor the changing weather, frost, and wind patterns on Mars. Galle Crater is located on the east rim of the Argyre Basin and is about 215 kilometers (134 miles) across. In this picture, illumination is from the upper left and north is up. http://photojournal.jpl.nasa.gov/catalog/PIA02325
Harry E. Brown
1962-01-01
The canopy camera is a device of new design that takes wide-angle, overhead photographs of vegetation canopies, cloud cover, topographic horizons, and similar subjects. Since the entire hemisphere is photographed in a single exposure, the resulting photograph is circular, with the horizon forming the perimeter and the zenith the center. Photographs of this type provide...
Prediction of Viking lander camera image quality
NASA Technical Reports Server (NTRS)
Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.
1976-01-01
Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.
NASA Astrophysics Data System (ADS)
Haase, I.; Oberst, J.; Scholten, F.; Wählisch, M.; Gläser, P.; Karachevtseva, I.; Robinson, M. S.
2012-05-01
Newly acquired high resolution Lunar Reconnaissance Orbiter Camera (LROC) images allow accurate determination of the coordinates of Apollo hardware, sampling stations, and photographic viewpoints. In particular, the positions from where the Apollo 17 astronauts recorded panoramic image series, at the so-called “traverse stations”, were precisely determined for traverse path reconstruction. We analyzed observations made in Apollo surface photography as well as orthorectified orbital images (0.5 m/pixel) and Digital Terrain Models (DTMs) (1.5 m/pixel and 100 m/pixel) derived from LROC Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images. Key features captured in the Apollo panoramic sequences were identified in LROC NAC orthoimages. Angular directions of these features were measured in the panoramic images and fitted to the NAC orthoimage by applying least squares techniques. As a result, we obtained the surface panoramic camera positions to within 50 cm. At the same time, the camera orientations, North azimuth angles and distances to nearby features of interest were also determined. Here, initial results are shown for traverse station 1 (northwest of Steno Crater) as well as the Apollo Lunar Surface Experiment Package (ALSEP) area.
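The least-squares fitting of angular directions described above can be illustrated with a simplified planar resection: given map coordinates of identified features and the azimuths measured toward them in a panorama, the camera position follows from a linear least-squares solve. This is a minimal sketch of the general technique, not the authors' code; the names and coordinate convention are assumptions.

```python
import numpy as np

def locate_camera(features, azimuths_deg):
    """Least-squares camera position (east, north) from azimuths, measured
    clockwise from north, toward features with known map coordinates."""
    A, b = [], []
    for (e, n), az in zip(features, np.radians(azimuths_deg)):
        # The camera lies on the line through (e, n) with bearing az + 180:
        #   cos(az) * x - sin(az) * y = cos(az) * e - sin(az) * n
        A.append([np.cos(az), -np.sin(az)])
        b.append(np.cos(az) * e - np.sin(az) * n)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol
```

Two bearings already determine a position; additional features over-determine the system and the residuals indicate the fit quality.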
NASA Astrophysics Data System (ADS)
Da Deppo, V.; Naletto, G.; Nicolosi, P.; Zambolin, P.; De Cecco, M.; Debei, S.; Parzianello, G.; Ramous, P.; Zaccariotto, M.; Fornasier, S.; Verani, S.; Thomas, N.; Barthol, P.; Hviid, S. F.; Sebastian, I.; Meller, R.; Sierks, H.; Keller, H. U.; Barbieri, C.; Angrilli, F.; Lamy, P.; Rodrigo, R.; Rickman, H.; Wenzel, K. P.
2017-11-01
Rosetta is one of the cornerstone missions of the European Space Agency, intended to rendezvous with comet 67P/Churyumov-Gerasimenko in 2014. The imaging instrument on board the satellite is OSIRIS (Optical, Spectroscopic and Infrared Remote Imaging System), a cooperation among several European institutes, which consists of two cameras: a Narrow Angle Camera (NAC) and a Wide Angle Camera (WAC). The WAC optical design is innovative: it adopts an all-reflecting, unvignetted and unobstructed two-mirror configuration which covers a 12° × 12° field of view with an F/5.6 aperture and gives a nominal contrast ratio of about 10^-4. The flight model of this camera has been successfully integrated and tested in our laboratories, and has finally been integrated on the satellite, which is now waiting to be launched in February 2004. In this paper we describe the optical characteristics of the camera and summarize the results obtained so far with the preliminary calibration data. The analysis of the optical performance of this model shows good agreement between theoretical performance and experimental results.
Automatic helmet-wearing detection for law enforcement using CCTV cameras
NASA Astrophysics Data System (ADS)
Wonghabut, P.; Kumphong, J.; Satiennam, T.; Ung-arunyawee, R.; Leelapatra, W.
2018-04-01
The objective of this research is to develop an application for enforcing helmet wearing using CCTV cameras. The developed application aims to help law enforcement by police, eventually resulting in changed risk behaviours and consequently a reduced number of accidents and reduced severity. Conceptually, the application software, implemented in C++ with the OpenCV library, uses two CCTV cameras with different angles of view. Video frames recorded by the wide-angle CCTV camera are used to detect motorcyclists. If any motorcyclist without a helmet is found, the zoomed (narrow-angle) CCTV camera is activated to capture an image of the violating motorcyclist and the motorcycle license plate in real time. Captured images are managed in a MySQL database for ticket issuing. The results show that the developed program is able to detect 81% of motorcyclists on various motorcycle types during daytime and night-time. The validation results reveal that the program achieves 74% accuracy in detecting motorcyclists without helmets.
Rosetta/OSIRIS - Nucleus morphology and activity of comet 67P/Churyumov-Gerasimenko
NASA Astrophysics Data System (ADS)
Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rickman, Hans; Rodrigo, Rafael; Koschny, Detlef
2015-04-01
ESA's Rosetta mission arrived on August 6, 2014, at target comet 67P/Churyumov-Gerasimenko after 10 years of cruise. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. It comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for the wide field coma investigations. OSIRIS imaged the nucleus and coma of the comet from the arrival throughout the mapping phase, PHILAE landing, early escort phase and close fly-by. The overview paper will discuss the surface morphology and activity of the nucleus as seen in gas, dust, and local jets as well as small scale structures in the local topography.
The Activity of Comet 67P/Churyumov-Gerasimenko as Seen by Rosetta/OSIRIS
NASA Astrophysics Data System (ADS)
Sierks, H.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Rickman, H.; Koschny, D.
2015-12-01
The Rosetta mission of the European Space Agency arrived on August 6, 2014, at the target comet 67P/Churyumov-Gerasimenko. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. OSIRIS consists of a Narrow Angle Camera (NAC) for the nucleus surface and dust studies and a Wide Angle Camera (WAC) for the wide field gas and dust coma investigations. OSIRIS observed the coma and the nucleus of comet 67P/C-G during approach, arrival, and landing of PHILAE. OSIRIS continued comet monitoring and mapping of surface and activity in 2015 with close fly-bys with high resolution and remote, wide angle observations. The scientific results reveal a nucleus with two lobes and varied morphology. Active regions are located at steep cliffs and collapsed pits which form collimated gas jets. Dust is accelerated by the gas, forming bright jet filaments and the large scale, diffuse coma of the comet. We will present activity and surface changes observed in the Northern and Southern hemisphere and around perihelion passage.
Evaluation of modified portable digital camera for screening of diabetic retinopathy.
Chalam, Kakarla V; Brar, Vikram S; Keshavamurthy, Ravi
2009-01-01
To describe a portable wide-field noncontact digital camera for posterior segment photography. The digital camera has a compound lens consisting of two optical elements (a 90-dpt and a 20-dpt lens) attached to a 7.2-megapixel camera. White-light-emitting diodes are used to illuminate the fundus and reduce source reflection. The camera is set to candlelight mode, the optical zoom is standardized to x2.4, and the focus is manually set to 3.0 m. The new technique provides quality wide-angle digital images of the retina (60 degrees) in patients with dilated pupils, at a fraction of the cost of established digital fundus photography. The modified digital camera is a useful alternative technique to acquire fundus images and provides a tool for screening posterior segment conditions, including diabetic retinopathy, in a variety of clinical settings.
NASA Technical Reports Server (NTRS)
Park, Jung-Ho; Kim, Jongmin; Zukic, Muamer; Torr, Douglas G.
1994-01-01
We report the design of multilayer reflective filters for the self-filtering cameras of the NUVIEWS project. Wide angle self-filtering cameras were designed to image the C IV (154.9 nm) line emission, and H2 Lyman band fluorescence (centered at 161 nm) over a 20 deg x 30 deg field of view. A key element of the filter design includes the development of pi-multilayers optimized to provide maximum reflectance at 154.9 nm and 161 nm for the respective cameras without significant spectral sensitivity to the large cone angle of the incident radiation. We applied self-filtering concepts to design NUVIEWS telescope filters that are composed of three reflective mirrors and one folding mirror. The filters, with narrowband widths of 6 and 8 nm at 154.9 and 161 nm, respectively, have net throughputs of more than 50% with average blocking of out-of-band wavelengths better than 3 x 10^-4 %.
Afocal viewport optics for underwater imaging
NASA Astrophysics Data System (ADS)
Slater, Dan
2014-09-01
A conventional camera can be adapted for underwater use by enclosing it in a sealed waterproof pressure housing with a viewport. The viewport, as an optical interface between water and air, needs to consider both the camera and water optical characteristics while also providing a high-pressure water seal. Limited hydrospace visibility drives a need for wide angle viewports. Practical optical interfaces between seawater and air vary from simple flat plate windows to complex water contact lenses. This paper first provides a brief overview of the physical and optical properties of the ocean environment along with suitable optical materials. This is followed by a discussion of the characteristics of various afocal underwater viewport types including flat windows, domes and the Ivanoff corrector lens, a derivative of a Galilean wide angle camera adapter. Several new and interesting optical designs derived from the Ivanoff corrector lens are presented, including a pair of very compact afocal viewport lenses that are compatible with both in-water and in-air environments and an afocal underwater hyper-hemispherical fisheye lens.
Colors of active regions on comet 67P
NASA Astrophysics Data System (ADS)
Oklay, N.; Vincent, J.-B.; Sierks, H.; Besse, S.; Fornasier, S.; Barucci, M. A.; Lara, L.; Scholten, F.; Preusker, F.; Lazzarin, M.; Pajola, M.; La Forgia, F.
2015-10-01
The OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) scientific imager (Keller et al. 2007) has been successfully delivering images of comet 67P/Churyumov-Gerasimenko from both its wide angle camera (WAC) and narrow angle camera (NAC) since the arrival of ESA's Rosetta spacecraft at the comet. Both cameras are equipped with filters covering the wavelength range of about 200 nm to 1000 nm. The comet nucleus is mapped with different combinations of the filters at resolutions up to 15 cm/px. Besides the determination of the surface morphology in great detail (Thomas et al. 2015), such high resolution images provided us a means to unambiguously link some activity in the coma to a series of pits on the nucleus surface (Vincent et al. 2015).
The Europa Imaging System (EIS): Investigating Europa's geology, ice shell, and current activity
NASA Astrophysics Data System (ADS)
Turtle, Elizabeth; Thomas, Nicolas; Fletcher, Leigh; Hayes, Alexander; Ernst, Carolyn; Collins, Geoffrey; Hansen, Candice; Kirk, Randolph L.; Nimmo, Francis; McEwen, Alfred; Hurford, Terry; Barr Mlinar, Amy; Quick, Lynnae; Patterson, Wes; Soderblom, Jason
2016-07-01
NASA's Europa Mission, planned for launch in 2022, will perform more than 40 flybys of Europa with altitudes at closest approach as low as 25 km. The instrument payload includes the Europa Imaging System (EIS), a camera suite designed to transform our understanding of Europa through global decameter-scale coverage, topographic and color mapping, and unprecedented sub-meter-scale imaging. EIS combines narrow-angle and wide-angle cameras to address these science goals: • Constrain the formation processes of surface features by characterizing endogenic geologic structures, surface units, global cross-cutting relationships, and relationships to Europa's subsurface structure and potential near-surface water. • Search for evidence of recent or current activity, including potential plumes. • Characterize the ice shell by constraining its thickness and correlating surface features with subsurface structures detected by ice penetrating radar. • Characterize scientifically compelling landing sites and hazards by determining the nature of the surface at scales relevant to a potential lander. EIS Narrow-angle Camera (NAC): The NAC, with a 2.3° x 1.2° field of view (FOV) and a 10-μrad instantaneous FOV (IFOV), achieves 0.5-m pixel scale over a 2-km-wide swath from 50-km altitude. A 2-axis gimbal enables independent targeting, allowing very high-resolution stereo imaging to generate digital topographic models (DTMs) with 4-m spatial scale and 0.5-m vertical precision over the 2-km swath from 50-km altitude. The gimbal also makes near-global (>95%) mapping of Europa possible at ≤50-m pixel scale, as well as regional stereo imaging. The NAC will also perform high-phase-angle observations to search for potential plumes. EIS Wide-angle Camera (WAC): The WAC has a 48° x 24° FOV, with a 218-μrad IFOV, and is designed to acquire pushbroom stereo swaths along flyby ground-tracks.
From an altitude of 50 km, the WAC achieves 11-m pixel scale over a 44-km-wide swath, generating DTMs with 32-m spatial scale and 4-m vertical precision. These data also support characterization of surface clutter for interpretation of radar deep and shallow sounding modes. Detectors: The cameras have identical rapid-readout, radiation-hard 4k x 2k CMOS detectors and can image in both pushbroom and framing modes. Color observations are acquired by pushbroom imaging using six broadband filters (~300-1050 nm), allowing mapping of surface units for correlation with geologic structures, topography, and compositional units from other instruments.
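The NAC and WAC figures quoted above follow from the small-angle relation between IFOV, altitude, and detector width, which can be checked in a few lines (a hedged sketch; the helper name is ours, and the 4k-pixel line width is taken from the detector description above):

```python
def ground_sample(ifov_urad, altitude_km, pixels=4096):
    """Ground sample distance (m/pixel) and swath width (km) for a detector
    line of `pixels` elements, using the small-angle approximation:
    GSD = IFOV * altitude; swath = GSD * pixels."""
    gsd_m = ifov_urad * 1e-6 * altitude_km * 1e3
    swath_km = gsd_m * pixels / 1e3
    return gsd_m, swath_km
```

`ground_sample(10, 50)` reproduces the NAC's 0.5-m pixel scale and ~2-km swath; `ground_sample(218, 50)` gives the WAC's ~11-m pixel scale and ~44-km swath.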
NASA Technical Reports Server (NTRS)
1998-01-01
Color composite of condensate clouds over Tharsis made from red and blue images with a synthesized green channel. Mars Orbiter Camera wide angle frames from Orbit 48.
Figure caption from Science Magazine
Geomorphologic mapping of the lunar crater Tycho and its impact melt deposits
NASA Astrophysics Data System (ADS)
Krüger, T.; van der Bogert, C. H.; Hiesinger, H.
2016-07-01
Using SELENE/Kaguya Terrain Camera and Lunar Reconnaissance Orbiter Camera (LROC) data, we produced a new, high-resolution (10 m/pixel), geomorphological and impact melt distribution map for the lunar crater Tycho. The distal ejecta blanket and crater rays were investigated using LROC wide-angle camera (WAC) data (100 m/pixel), while the fine-scale morphologies of individual units were documented using high resolution (∼0.5 m/pixel) LROC narrow-angle camera (NAC) frames. In particular, Tycho shows a large coherent melt sheet on the crater floor, melt pools and flows along the terraced walls, and melt pools on the continuous ejecta blanket. The crater floor of Tycho exhibits three distinct units, distinguishable by their elevation and hummocky surface morphology. The distribution of impact melt pools and ejecta, as well as topographic asymmetries, support the formation of Tycho as an oblique impact from the W-SW. The asymmetric ejecta blanket, significantly reduced melt emplacement uprange, and the depressed uprange crater rim at Tycho suggest an impact angle of ∼25-45°.
NASA Astrophysics Data System (ADS)
Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.
2003-07-01
We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, as well as in other applications, a small, lightweight digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems. The electronics are truly portable, fitting in a standard boardcase. The camera is designed to be offered at a compatible price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project screening for ROP. Telemedical screening is a perfect application for this camera system, using both of its advantages: portability as well as digital imaging.
Wide-Angle Polarimetric Camera for Korea Pathfinder Lunar Orbiter
NASA Astrophysics Data System (ADS)
Choi, Y. J.; Kim, S.; Kang, K. I.
2016-12-01
Polarimetry data contain valuable information about the lunar surface, such as the grain size and porosity of the regolith. However, polarimetry of the Moon from lunar orbit has not yet been performed. We plan to perform such polarimetry from lunar orbit with the Korea Pathfinder Lunar Orbiter (KPLO), which will be launched around 2018/2019 as the first Korean lunar mission. The Wide-Angle Polarimetric Camera (PolCam) has been selected as one of the onboard instruments for KPLO. The science objectives are: (1) to obtain the polarization data of the whole lunar surface at wavelengths of 430 nm and 650 nm for the phase angle range from 0° to 120° with a spatial resolution of 80 m; (2) to obtain the reflectance ratios at 320 nm and 430 nm for the whole lunar surface with a spatial resolution of 80 m. We will summarize recent results on the lunar surface from ground-based polarimetric observations and will briefly introduce the science rationale and operation concept of PolCam.
1998-03-13
Color composite of condensate clouds over Tharsis made from red and blue images with a synthesized green channel. Mars Orbiter Camera wide angle frames from Orbit 48. http://photojournal.jpl.nasa.gov/catalog/PIA00812
2017-11-27
These two images illustrate just how far Cassini traveled to get to Saturn. On the left is one of the earliest images Cassini took of the ringed planet, captured during the long voyage from the inner solar system. On the right is one of Cassini's final images of Saturn, showing the site where the spacecraft would enter the atmosphere on the following day. In the left image, taken in 2001, about six months after the spacecraft passed Jupiter for a gravity assist flyby, the best view of Saturn using the spacecraft's high-resolution (narrow-angle) camera was on the order of what could be seen using the Earth-orbiting Hubble Space Telescope. At the end of the mission (at right), from close to Saturn, even the lower resolution (wide-angle) camera could capture just a tiny part of the planet. The left image looks toward Saturn from 20 degrees below the ring plane and was taken on July 13, 2001 in wavelengths of infrared light centered at 727 nanometers using the Cassini spacecraft narrow-angle camera. The view at right is centered on a point 6 degrees north of the equator and was taken in visible light using the wide-angle camera on Sept. 14, 2017. The view on the left was acquired at a distance of approximately 317 million miles (510 million kilometers) from Saturn. Image scale is about 1,900 miles (3,100 kilometers) per pixel. The view at right was acquired at a distance of approximately 360,000 miles (579,000 kilometers) from Saturn. Image scale is 22 miles (35 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21353
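The two image scales quoted in this caption are consistent with the roughly 10× difference in angular resolution between Cassini's narrow- and wide-angle cameras, which can be checked by backing out the per-pixel angle from scale and range (a small-angle sketch; the function name is ours):

```python
def implied_ifov_urad(scale_km_per_px, range_km):
    """Per-pixel angular scale (microradians) implied by a caption's image
    scale and the spacecraft's range, small-angle approximation."""
    return scale_km_per_px / range_km * 1e6
```

Plugging in the caption's numbers, the narrow-angle view (3,100 km/pixel at 510 million km) implies about 6 μrad per pixel, and the wide-angle view (35 km/pixel at 579,000 km) about 60 μrad, a factor of ~10 apart.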
Mars Daily Global Image from April 1999
2000-09-08
Twelve orbits a day provide the NASA Mars Global Surveyor MOC wide angle cameras a global snapshot of weather patterns across the planet. Here, bluish-white water ice clouds hang above the Tharsis volcanoes.
[Reliability of retinal imaging screening in retinopathy of prematurity].
Navarro-Blanco, C; Peralta-Calvo, J; Pastora-Salvador, N; Alvarez-Rementería, L; Chamorro, E; Sánchez-Ramos, C
2014-09-01
The retinopathy of prematurity (ROP) is a potentially avoidable cause of blindness in children. Advances in neonatal care make possible the survival of extremely premature infants, who show a greater incidence of the disease. The aim of the study is to evaluate the reliability of ROP screening using retinography imaging with the RetCam 3 wide-angle camera, and also to study the variability of ROP diagnosis depending on the evaluator. The indirect ophthalmoscopy exam was performed by a pediatric ROP-expert ophthalmologist. The same ophthalmologist and a technician specialized in digital image capture took retinal images using the RetCam 3 wide-angle camera. A total of 30 image sets were analyzed by 3 masked groups: group A (8 ophthalmologists), group B (5 experts in vision), and group C (2 ROP-expert ophthalmologists). Compared with the diagnosis by indirect ophthalmoscopy, the sensitivity (26-93), kappa (0.24-0.80), and the percent agreement were statistically significant in group C for the diagnosis of ROP Type 1. In the diagnosis of ROP Type 1 + Type 2, kappa (0.17-0.33) and the percent agreement (58-90) were statistically significant, with higher values in group C. Diagnosis by ROP-expert ophthalmologists using the RetCam 3 wide-angle camera has proved to be a reliable method.
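The kappa statistic used above to quantify inter-observer agreement can be computed from a confusion matrix of two raters; a minimal sketch (the function name is ours, not from the study):

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix of two raters:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    m = np.asarray(confusion, dtype=float)
    n = m.sum()
    p_observed = np.trace(m) / n                          # diagonal = agreement
    p_expected = (m.sum(axis=0) * m.sum(axis=1)).sum() / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)
```

Kappa is 1 for perfect agreement and 0 for agreement no better than chance, which is why values such as 0.17-0.33 indicate only slight-to-fair agreement.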
Performance Characteristics For The Orbiter Camera Payload System's Large Format Camera (LFC)
NASA Astrophysics Data System (ADS)
Mollberg, Bernard H.
1981-11-01
The Orbiter Camera Payload System, the OCPS, is an integrated photographic system which is carried into Earth orbit as a payload in the Shuttle Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC) which is a precision wide-angle cartographic instrument that is capable of producing high resolution stereophotography of great geometric fidelity in multiple base to height ratios. The primary design objective for the LFC was to maximize all system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment.
NASA Astrophysics Data System (ADS)
Gicquel, Adeline; Vincent, Jean-Baptiste; Sierks, Holger; Rose, Martin; Agarwal, Jessica; Deller, Jakob; Guettler, Carsten; Hoefner, Sebastian; Hofmann, Marc; Hu, Xuanyu; Kovacs, Gabor; Oklay Vincent, Nilda; Shi, Xian; Tubiana, Cecilia; Barbieri, Cesare; Lamy, Philippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team
2016-10-01
Images of the nucleus and the coma (gas and dust) of comet 67P/Churyumov-Gerasimenko have been acquired by the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) camera system since March 2014 using both the wide angle camera (WAC) and the narrow angle camera (NAC). We are using the NAC camera to study the bright outburst observed on July 29th, 2015 in the southern hemisphere. The NAC camera's wavelength range spans 250-1000 nm with a combination of 12 filters. The high spatial resolution is needed to localize the source point of the outburst on the surface of the nucleus. At the time of the observations, the heliocentric distance was 1.25 AU and the distance between the spacecraft and the comet was 126 km. We aim to understand the physics leading to such outgassing: Is the jet associated with the outburst controlled by the micro-topography? Or by ice suddenly exposed? We are using the Direct Simulation Monte Carlo (DSMC) method to study the gas flow close to the nucleus. The goal of the DSMC code is to reproduce the opening angle of the jet, and to constrain the outgassing ratio between the outburst source and the local region. The results of this model will be compared to the images obtained with the NAC camera.
NASA Technical Reports Server (NTRS)
1999-01-01
This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.
Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona. Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. -- A remote wide-angle camera captures liftoff of the Delta II rocket carrying the Gravity Probe B spacecraft from Space Launch Complex 2 on Vandenberg AFB, Calif., at 9:57:24 a.m. PDT.
Evaluation of the Quality of Action Cameras with Wide-Angle Lenses in Uav Photogrammetry
NASA Astrophysics Data System (ADS)
Hastedt, H.; Ekkel, T.; Luhmann, T.
2016-06-01
The application of light-weight cameras in UAV photogrammetry is required due to restrictions in payload. In general, consumer cameras with a normal lens type are applied to a UAV system. The availability of action cameras, like the GoPro Hero4 Black, including a wide-angle (fish-eye) lens offers new perspectives in UAV projects. With these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. Here the GoPro Hero4 is evaluated using different acquisition modes. It is investigated to what extent the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to the evaluation processes in UAV photogrammetry. Therefore different calibration setups and processing procedures are assessed and discussed. Additionally, a pre-correction of the initial distortion by GoPro Studio and its application to photogrammetric purposes is evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan. It is then analysed to what extent a pre-calibration and pre-correction of a GoPro Hero4 will reinforce the reliability and accuracy of a flight scenario.
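A key difference between such fish-eye lenses and normal lenses is the projection model: fish-eye designs are commonly close to the equidistant mapping r = f·θ rather than the rectilinear r = f·tan θ, which is why dedicated calibration models are needed. A hedged sketch (the focal length value is an assumption for illustration, not a GoPro specification):

```python
import math

def image_radius_mm(theta_deg, f_mm=2.77, model="equidistant"):
    """Image radius of a ray at angle theta from the optical axis, for an
    equidistant (fish-eye) vs. rectilinear projection model."""
    theta = math.radians(theta_deg)
    if model == "equidistant":
        return f_mm * theta            # r = f * theta
    return f_mm * math.tan(theta)      # r = f * tan(theta)
```

At wide field angles the rectilinear radius grows much faster than the equidistant one (and diverges toward 90°), which is what makes >90° fields of view practical only with fish-eye-type mappings.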
NASA Astrophysics Data System (ADS)
Ehrhart, Matthias; Lienhart, Werner
2017-09-01
The importance of automated prism tracking is increasing with the rising automation of total station measurements in machine control, monitoring and one-person operation. In this article we summarize and explain the different techniques that are used to coarsely search for a prism, to precisely aim at a prism, and to identify whether the correct prism is tracked. Along with the state-of-the-art review, we discuss and experimentally evaluate possible improvements based on the image data of an additional wide-angle camera which is available for many total stations today. In cases in which the total station's fine aiming module loses the prism, the tracked object may still be visible to the wide-angle camera because of its larger field of view. The theodolite angles towards the target can then be derived from its image coordinates, which facilitates a fast reacquisition of the prism. In experimental measurements we demonstrate that our image-based approach for the coarse target search is 4 to 10 times faster than conventional approaches.
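The derivation of theodolite angles from image coordinates mentioned above can be sketched with a simple pinhole model (an assumed convention, not the instrument's actual calibration model: focal length expressed in pixels, principal point at (cx, cy)):

```python
import math

def angular_offsets(px, py, cx, cy, f_px):
    """Horizontal and vertical angular offsets (rad) of an image point from
    the camera axis, simple pinhole model with focal length in pixels."""
    return math.atan2(px - cx, f_px), math.atan2(py - cy, f_px)
```

Adding these offsets to the camera axis direction gives approximate theodolite angles toward the target, good enough to re-point the fine aiming module.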
Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images
NASA Astrophysics Data System (ADS)
Awumah, Anna; Mahanti, Prasun; Robinson, Mark
2016-10-01
Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion for planetary images are rare, although image fusion is well-known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm yields a product of high spatial quality, while the Wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-Wavelet image fusion algorithm applied to LROC MS images. The hybrid method provides the best HRMS product, both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and John L. Van Genderen. "Review article: Multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Yun. "Understanding image fusion." Photogrammetric Engineering & Remote Sensing 70.6 (2004): 657-661. [3] Mahanti, Prasun, et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." XXIII ISPRS Congress Archives (2016).
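The IHS component of such a fusion scheme can be sketched in a few lines: the intensity component of the upsampled MS image is replaced by the Pan image, which is equivalent to adding the Pan-minus-intensity detail to every band. This is a generic illustration of fast IHS injection, not the authors' exact hybrid IHS-Wavelet pipeline:

```python
import numpy as np

def ihs_fuse(ms, pan):
    """Fast IHS pan-sharpening: inject Pan spatial detail into each MS band.

    ms  : (H, W, B) float array, multispectral image upsampled to Pan size
    pan : (H, W) float array, co-registered panchromatic image
    """
    intensity = ms.mean(axis=2)       # intensity component of the MS image
    detail = pan - intensity          # spatial detail absent from the MS bands
    return ms + detail[:, :, None]    # add the same detail to every band
```

By construction the per-pixel mean of the fused bands equals the Pan image, which is the source of IHS fusion's high spatial quality and also of its spectral distortion; the wavelet stage of a hybrid method counteracts the latter.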
2018-03-05
In this image, NASA's Cassini sees Saturn and its rings through a haze of Sun glare on the camera lens. If you could travel to Saturn in person and look out the window of your spacecraft when the Sun was at a certain angle, you might see a view very similar to this one. Images taken using red, green and blue spectral filters were combined to show the scene in natural color. The images were taken with Cassini's wide-angle camera on June 23, 2013, at a distance of approximately 491,200 miles (790,500 kilometers) from Saturn. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA17185
Surface compositional variation on the comet 67P/Churyumov-Gerasimenko by OSIRIS data
NASA Astrophysics Data System (ADS)
Barucci, M. A.; Fornasier, S.; Feller, C.; Perna, D.; Hasselmann, H.; Deshapriya, J. D. P.; Fulchignoni, M.; Besse, S.; Sierks, H.; Forgia, F.; Lazzarin, M.; Pommerol, A.; Oklay, N.; Lara, L.; Scholten, F.; Preusker, F.; Leyrat, C.; Pajola, M.; Osiris-Rosetta Team
2015-10-01
Since the Rosetta mission arrived at comet 67P/Churyumov-Gerasimenko (67P/C-G) in July 2014, the comet nucleus has been mapped by both the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System, [1]) NAC (Narrow Angle Camera) and WAC (Wide Angle Camera), acquiring a huge quantity of surface images in different wavelength bands, under variable illumination conditions and spatial resolutions, and producing the most detailed maps of a comet nucleus surface at the highest spatial resolution. 67P/C-G's nucleus shows an irregular bi-lobed shape of complex morphology, with terrains showing intricate features [2, 3] and a surface that is heterogeneous at different scales.
Photogrammetry System and Method for Determining Relative Motion Between Two Bodies
NASA Technical Reports Server (NTRS)
Miller, Samuel A. (Inventor); Severance, Kurt (Inventor)
2014-01-01
A photogrammetry system and method provide for determining the relative position between two objects. The system utilizes one or more imaging devices, such as high speed cameras, that are mounted on a first body, and three or more photogrammetry targets of a known location on a second body. The system and method can be utilized with cameras having fish-eye, hyperbolic, omnidirectional, or other lenses. The system and method do not require overlapping fields-of-view if two or more cameras are utilized. The system and method derive relative orientation by equally weighting information from an arbitrary number of heterogeneous cameras, all with non-overlapping fields-of-view. Furthermore, the system can make the measurements with arbitrary wide-angle lenses on the cameras.
Analysis of the effect on optical equipment caused by solar position in target flight measure
NASA Astrophysics Data System (ADS)
Zhu, Shun-hua; Hu, Hai-bin
2012-11-01
Optical equipment is widely used to measure flight parameters in target flight performance tests, but the equipment is sensitive to the sun's rays. To prevent the sun's rays from shining directly into the camera lens of the optical equipment while target flight parameters are being measured, the angle between the observation direction and the line connecting the camera lens and the sun should be kept large. This article introduces a method for calculating the solar azimuth and altitude relative to the optical equipment at any time and place on Earth, a model of the equipment's observation direction, and a model for calculating the angle between the observation direction and the line connecting the camera lens and the sun. A simulation of the effect of solar position on the optical equipment at different times, dates, months and target flight directions is also given.
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei
2016-01-01
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454
The Uses of a Polarimetric Camera
2008-09-01
For the images displayed in this thesis, the author used two different lenses. One is an ARSAT H 20 mm with an f-number of 2.8, which was used for all the wide-angle images collected. For the telephoto images collected, the author used a NIKKOR 200 mm lens, which has an f-number of 4.0.
Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera
NASA Astrophysics Data System (ADS)
Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90° field of view in monochrome mode and 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
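A radial distortion model of the kind added for the multispectral WAC scales an image point outward or inward by a polynomial in the squared radius; making the coefficients depend on the band gives the wavelength dependence. A minimal sketch with hypothetical coefficients, not the calibrated LROC values:

```python
# Hypothetical per-band k1 coefficients keyed by center wavelength in nm
# (illustrative values only; not the calibrated LROC WAC coefficients).
BAND_K1 = {415: 1.2e-2, 566: 1.0e-2, 604: 0.9e-2}

def radial_distort(x, y, k1, k2=0.0):
    """Apply a two-coefficient radial distortion to normalized coordinates."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

In a calibration pipeline the coefficients for each band are estimated by minimizing reprojection error against known ground or stellar targets; the function above only models the forward mapping.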
NASA Technical Reports Server (NTRS)
1997-01-01
Sections of MOC images P024_01 and P024_02, shown here in color composite form, were acquired with the low resolution red and blue wide angle cameras over a 5 minute period starting when Mars Global Surveyor was at its closest point to the planet at the beginning of its 24th orbit (around 4:00 AM PDT on October 20, 1997). To make this image, a third component (green) was synthesized from the red and blue images. During the imaging period, the camera was pointed straight down towards the martian surface, 176 km (109 miles) below the spacecraft. During the time it took to acquire the image, the spacecraft rose to an altitude of 310 km (193 miles). Owing to camera scanning rate and data volume constraints, the image was acquired at a resolution of roughly 1 km (0.6 mile) per pixel. The image shown here covers an area from 12° to 26° N latitude and 126° to 138° W longitude. The image is oriented with north to the top.
As has been noted in other MOC releases, Olympus Mons is the largest of the major Tharsis volcanoes, rising 25 km (15.5 miles) and stretching over nearly 550 km (340 miles) east-west. The summit caldera, a composite of as many as seven roughly circular collapse depressions, is 66 by 83 km (41 by 52 miles) across. Also seen in this image are water-ice clouds that accumulate around and above the volcano during the late afternoon (at the time the image was acquired, the summit was at 5:30 PM local solar time). To understand the value of orbital observations, compare this image with the two taken during approach (PIA00929 and PIA00936), which are representative of the best resolution from Earth. Through Monday, October 28, the MOC had acquired a total of 132 images, most of which were at low sun elevation angles. Of these images, 74 were taken with the high resolution narrow angle camera and 58 with the low resolution wide angle cameras. Twenty-eight narrow angle and 24 wide angle images were taken after the suspension of aerobraking. These images, including the one shown above, are among the best returned so far. Launched on November 7, 1996, Mars Global Surveyor entered Mars orbit on Thursday, September 11, 1997. The original mission plan called for using friction with the planet's atmosphere to reduce the orbital energy, leading to a two-year mapping mission from a close, circular orbit (beginning in March 1998). Owing to difficulties with one of the two solar panels, aerobraking was suspended in mid-October and is scheduled to resume in mid-November. Many of the original objectives of the mission, and in particular those of the camera, are likely to be accomplished as the mission progresses. Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA.
The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
Multi-Angle View of the Canary Islands
NASA Technical Reports Server (NTRS)
2000-01-01
A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles)wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
2015-08-20
This view from NASA's Cassini spacecraft looks toward Saturn's icy moon Dione, with giant Saturn and its rings in the background, just prior to the mission's final close approach to the moon on August 17, 2015. At lower right is the large, multi-ringed impact basin named Evander, which is about 220 miles (350 kilometers) wide. The canyons of Padua Chasma, features that form part of Dione's bright, wispy terrain, reach into the darkness at left. Imaging scientists combined nine visible light (clear spectral filter) images to create this mosaic view: eight from the narrow-angle camera and one from the wide-angle camera, which fills in an area at lower left. The scene is an orthographic projection centered on terrain at 0.2 degrees north latitude, 179 degrees west longitude on Dione. An orthographic view is most like the view seen by a distant observer looking through a telescope. North on Dione is up. The view was acquired at distances ranging from approximately 106,000 miles (170,000 kilometers) to 39,000 miles (63,000 kilometers) from Dione and at a sun-Dione-spacecraft, or phase, angle of 35 degrees. Image scale is about 1,500 feet (450 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA19650
Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.
2012-01-01
We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.
Non-contact measurement of rotation angle with solo camera
NASA Astrophysics Data System (ADS)
Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun
2015-02-01
To measure the rotation angle of an object around its axis, a non-contact rotation angle measurement method based on a single camera is proposed. The intrinsic parameters of the camera were calibrated using a chessboard according to planar calibration theory. The translation matrix and rotation matrix between the object coordinate system and the camera coordinate system were calculated from the relationship between the positions of the corners on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed +/- 0.01 degree.
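Once the rotation matrix between the object and camera frames has been recovered, the net rotation angle follows directly from its trace. A minimal sketch of that final step (the pose estimation itself, via chessboard calibration, is omitted):

```python
import numpy as np

def rotation_angle_deg(R):
    """Net rotation angle (degrees) encoded by a 3x3 rotation matrix."""
    cos_theta = (np.trace(R) - 1.0) / 2.0
    # Clip guards against values slightly outside [-1, 1] from numerical noise.
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```

Comparing this angle between two poses of the object gives the rotation around its axis without any contact with the object.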
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
NASA Astrophysics Data System (ADS)
Beyer, Ross A.; Archinal, B.; Li, R.; Mattson, S.; Moratto, Z.; McEwen, A.; Oberst, J.; Robinson, M.
2009-09-01
The Lunar Reconnaissance Orbiter Camera (LROC) will obtain two types of multiple overlapping coverage to derive terrain models of the lunar surface. LROC has two Narrow Angle Cameras (NACs), working jointly to provide a wider (in the cross-track direction) field of view, as well as a Wide Angle Camera (WAC). LRO's orbit precesses, and the same target can be viewed at different solar azimuth and incidence angles, providing the opportunity to acquire 'photometric stereo' in addition to traditional 'geometric stereo' data. Geometric stereo refers to images acquired by LROC with two observations at different times. They must have different emission angles to provide a stereo convergence angle such that the resultant images have enough parallax for a reasonable stereo solution. The lighting at the target must not be radically different: if shadows move substantially between observations, it is very difficult to correlate the images. The majority of NAC geometric stereo will be acquired with one nadir and one off-pointed image (20 degree roll). Alternatively, pairs can be obtained with two spacecraft rolls (one to the left and one to the right), providing a stereo convergence angle up to 40 degrees. Overlapping WAC images from adjacent orbits can be used to generate topography of near-global coverage at kilometer-scale effective spatial resolution. Photometric stereo refers to multiple-look observations of the same target under different lighting conditions. LROC will acquire at least three (ideally five) observations of a target. These observations should have near identical emission angles, but with varying solar azimuth and incidence angles. These types of images can be processed via various methods to derive single pixel resolution topography and surface albedo. The LROC team will produce some topographic models, but stereo data collection is focused on acquiring the highest quality data so that such models can be generated later.
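The parallax produced by a geometric stereo pair translates into relative height through the tangents of the two emission angles. A minimal sketch, assuming opposite-side looks so the tangents add; the numbers in the usage are illustrative, not mission values:

```python
import math

def height_from_parallax(parallax_px, gsd_m, e1_deg, e2_deg):
    """Relative height from stereo parallax, assuming opposite-side looks.

    parallax_px   : measured parallax in pixels between the two images
    gsd_m         : ground sampling distance in meters per pixel
    e1_deg, e2_deg: emission (look) angles of the two observations
    """
    p_m = parallax_px * gsd_m  # parallax expressed on the ground, in meters
    return p_m / (math.tan(math.radians(e1_deg)) + math.tan(math.radians(e2_deg)))
```

With one nadir image (0 degrees) and one 20 degree roll, as in the nominal NAC case described above, only the off-nadir tangent contributes to the denominator.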
Comparison and evaluation of datasets for off-angle iris recognition
NASA Astrophysics Data System (ADS)
Kurtuncu, Osman M.; Cerme, Gamze N.; Karakaya, Mahmut
2016-05-01
In this paper, we investigated the publicly available iris recognition datasets and their data capture procedures in order to determine whether they are suitable for stand-off iris recognition research. The majority of iris recognition datasets include only frontal iris images. Even when a dataset includes off-angle iris images, the frontal and off-angle iris images are not captured at the same time. Comparison of frontal and off-angle iris images shows not only differences in gaze angle but also changes in pupil dilation and accommodation. In order to isolate the effect of the gaze angle from other challenging issues, including dilation and accommodation, the frontal and off-angle iris images should be captured at the same time by two different cameras. We therefore developed an iris image acquisition platform using two cameras, where one camera captures a frontal iris image and the other captures the iris from off-angle. Based on the comparison of Hamming distances between frontal and off-angle iris images captured with the two-camera setup and the one-camera setup, we observed that the Hamming distance in the two-camera setup is smaller than in the one-camera setup, by amounts ranging from 0.001 to 0.05. These results show that, to obtain accurate results in off-angle iris recognition research, a two-camera setup is necessary to distinguish the challenging issues from each other.
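The fractional Hamming distance used for such iris comparisons counts disagreeing code bits over the bits that are valid in both occlusion masks. A minimal sketch; real iris matchers additionally search over relative rotations, which is omitted here:

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance between two binary iris codes."""
    valid = mask_a & mask_b            # bits usable in both images
    diff = (code_a ^ code_b) & valid   # disagreeing, usable bits
    return diff.sum() / valid.sum()
```

Lower values mean more similar iris codes, so a gap of 0.001 to 0.05 between the two setups, as reported above, is a meaningful difference on this scale.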
Csutak, A; Lengyel, I; Jonasson, F; Leung, I; Geirsdottir, A; Xing, W; Peto, T
2010-10-01
To establish the agreement between image grading of conventional (45°) and ultra wide-angle (200°) digital images in the macula. In 2008, the 12-year follow-up was conducted on 573 participants of the Reykjavik Eye Study. This study included the use of the Optos P200C AF ultra wide-angle laser scanning ophthalmoscope alongside the Zeiss FF 450 conventional digital fundus camera on 121 eyes with or without age-related macular degeneration, using the International Classification System. Of these eyes, detailed grading was carried out on five cases each with hard drusen, geographic atrophy and chorioretinal neovascularisation, and six cases of soft drusen. Exact agreement and κ-statistics were calculated. Comparison of the conventional and ultra wide-angle images in the macula showed an overall 96.43% agreement (κ=0.93), with no disagreement at end-stage disease, although in one eye chorioretinal neovascularisation was graded as drusenoid pigment epithelial detachment. For patients with drusen only, the exact agreement was 96.1%. The detailed grading showed no clinically significant disagreement between the conventional 45° and 200° images. On the basis of our results, there is good agreement between grading conventional and ultra wide-angle images in the macula.
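The κ-statistic reported for grader agreement can be computed from two label sequences as observed agreement corrected for chance agreement. A minimal sketch of Cohen's kappa (undefined when chance agreement is exactly 1):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    p_o = sum(x == y for x, y in zip(rater_a, rater_b)) / n   # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)            # chance agreement
    return (p_o - p_e) / (1.0 - p_e)
```

A κ of 0.93, as found above, is conventionally read as almost perfect agreement, since κ discounts the agreement two graders would reach by chance alone.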
Miniature Wide-Angle Lens for Small-Pixel Electronic Camera
NASA Technical Reports Server (NTRS)
Mouroulis, Pantazis; Blazejewski, Edward
2009-01-01
A proposed wide-angle lens is shown that would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The lens example shown has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also shown.
1996-01-29
In this false color image of Neptune, objects that are deep in the atmosphere are blue, while those at higher altitudes are white. The image was taken by Voyager 2 wide-angle camera through an orange filter and two different methane filters. http://photojournal.jpl.nasa.gov/catalog/PIA00051
Combined position and diameter measures for lunar craters
Arthur, D.W.G.
1977-01-01
The note addresses the problem of simultaneously measuring positions and diameters of circular impact craters on wide-angle photographs of approximately spherical planets such as the Moon and Mercury. The method allows for situations in which the camera is not aligned on the planet's center. © 1977.
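Measuring a crater's position and diameter simultaneously amounts to fitting a circle to rim points. An algebraic least-squares (Kasa-style) fit is one common sketch of that step, though not necessarily the method of this note, which additionally corrects for off-center wide-angle geometry:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares (Kasa) circle fit to rim points.

    Solves x^2 + y^2 = c*x + d*y + e linearly, where c = 2*cx,
    d = 2*cy, and e = r^2 - cx^2 - cy^2.
    """
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x * x + y * y
    c, d, e = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = c / 2.0, d / 2.0
    r = np.sqrt(e + cx * cx + cy * cy)
    return cx, cy, r
```

The fit returns the crater center (position) and radius (half the diameter) in one solve, which is the "combined" measurement the note's title refers to.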
NASA Astrophysics Data System (ADS)
Li, J.; Wu, Z.; Wei, X.; Zhang, Y.; Feng, F.; Guo, F.
2018-04-01
Cross-calibration has the advantages of high precision, low resource requirements and simple implementation, and it has been widely used in recent years. The four wide-field-of-view (WFV) cameras on board the Gaofen-1 satellite provide high spatial resolution and wide combined coverage (4 × 200 km) without onboard calibration. In this paper, the four-band radiometric cross-calibration coefficients of the WFV1 camera were obtained based on radiometric and geometric matching, taking the Landsat 8 OLI (Operational Land Imager) sensor as reference. The Scale Invariant Feature Transform (SIFT) feature detection method and a distance and included-angle weighting method were introduced to correct misregistration of WFV-OLI image pairs. A radiative transfer model was used to eliminate differences between the OLI sensor and the WFV1 camera through a spectral match factor (SMF). The near-infrared band of the WFV1 camera encompasses water vapor absorption bands, so a look-up table (LUT) of SMF as a function of water vapor amount was established to estimate the water vapor effects. A surface synchronization experiment was designed to verify the reliability of the cross-calibration coefficients, which appear to perform better than the official coefficients published by the China Centre for Resources Satellite Data and Application (CCRSDA).
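After spectral and geometric matching, a radiometric cross-calibration coefficient reduces to a linear fit of reference radiance against target digital numbers. A minimal sketch of that final regression step, with synthetic numbers rather than Gaofen-1 values:

```python
import numpy as np

def cross_cal_gain(target_dn, ref_radiance):
    """Fit radiance = gain * DN + offset against a reference sensor's values."""
    gain, offset = np.polyfit(target_dn, ref_radiance, 1)
    return gain, offset
```

In practice the reference radiances would first be adjusted by the spectral match factor so that both sensors describe the same band-effective quantity before the regression is performed.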
Solutions on a high-speed wide-angle zoom lens with aspheric surfaces
NASA Astrophysics Data System (ADS)
Yamanashi, Takanori
2012-10-01
Recent development in CMOS and digital camera technology has accelerated the business and market share of digital cinematography. In terms of optical design, this technology has increased the need to carefully consider pixel pitch and characteristics of the imager. When the field angle at the wide end, zoom ratio, and F-number are specified, choosing an appropriate zoom lens type is crucial. In addition, appropriate power distributions and lens configurations are required. At points near the wide end of a zoom lens, it is known that an aspheric surface is an effective means to correct off-axis aberrations. On the other hand, optical designers have to focus on manufacturability of aspheric surfaces and perform required analysis with respect to the surface shape. Centration errors aside, it is also important to know the sensitivity to aspheric shape errors and their effect on image quality. In this paper, wide angle cine zoom lens design examples are introduced and their main characteristics are described. Moreover, technical challenges are pointed out and solutions are proposed.
NASA Astrophysics Data System (ADS)
Nikolashkin, S. V.; Reshetnikov, A. A.
2017-11-01
The system of video surveillance during active rocket experiments at the Polar Geophysical Observatory "Tixie" and studies of the effects of "Soyuz" vehicle launches from the "Vostochny" cosmodrome over the territory of the Republic of Sakha (Yakutia) is presented. The created system consists of three AHD video cameras with different angles of view mounted on a common platform on a tripod, with the possibility of manual guiding. The main camera, with a high-sensitivity black-and-white CCD matrix (SONY EXview HAD II), is equipped, depending on the task, with an "MTO-1000" (F = 1000 mm) or "Jupiter-21M" (F = 300 mm) lens and is designed for more detailed imaging of luminous formations. The second camera is of the same type but has a 30 degree angle of view; it is intended for imaging the general scene and large objects, and for referencing object coordinates against the stars. The third, color wide-angle camera (120 degrees) is designed for referencing against landmarks in the daytime; the optical axis of this channel is directed 60 degrees downward. The data are recorded on the hard disk of a four-channel digital video recorder. Tests of the original two-channel version of the system were conducted during the launch of a geophysical rocket in Tixie in September 2015 and showed its effectiveness.
Visual imaging control systems of the Mariner to Jupiter and Saturn spacecraft
NASA Technical Reports Server (NTRS)
Larks, L.
1979-01-01
The design and fabrication of optical systems for the Mariner Jupiter/Saturn (Voyager) mission is described. Because of the great distances of these planets from the sun, the spacecraft was designed without solar panels; electricity is generated on board by radioisotope thermoelectric generators (RTGs). The presence of the RTGs and the Jovian radiation environment required that the optical systems be fabricated from radiation-stabilized materials. A narrow angle and a wide angle camera are located on the spacecraft scan platform, with the narrow angle lens a modification of the Mariner 10 lens. The optical system is described, noting that the lens was modified by moving the aperture correctors forward and placing a spider-mounted secondary mirror in the original back surface of the second aperture corrector. The wide angle lens was made of cerium-doped, radiation-stabilized optical glass with the greatest blue transmittance, which would be resistant to RTG and Jovian radiation.
Research on Geometric Calibration of Spaceborne Linear Array Whiskbroom Camera
Sheng, Qinghong; Wang, Qi; Xiao, Hui; Wang, Qing
2018-01-01
The geometric calibration of a spaceborne thermal-infrared camera with high spatial resolution and wide coverage can set benchmarks for providing accurate geographical coordinates for the retrieval of land surface temperature. Using linear array whiskbroom Charge-Coupled Device (CCD) arrays to image the Earth makes it possible to obtain wide-swath thermal-infrared images with high spatial resolution. Focusing on the whiskbroom characteristics of equal time intervals and unequal angles, the present study proposes a spaceborne linear-array-scanning imaging geometric model and calibrates the temporal system parameters and whiskbroom angle parameters. With the help of YG-14, China's first satellite equipped with thermal-infrared cameras of high spatial resolution, images of Anyang and Taiyuan are used to conduct a geometric calibration experiment and a verification test, respectively. Results have shown that the plane positioning accuracy without ground control points (GCPs) is better than 30 pixels and the plane positioning accuracy with GCPs is better than 1 pixel. PMID:29337885
2016-09-15
NASA's Cassini spacecraft stared at Saturn for nearly 44 hours on April 25 to 27, 2016, to obtain this movie showing just over four Saturn days. With Cassini's orbit being moved closer to the planet in preparation for the mission's 2017 finale, scientists took this final opportunity to capture a long movie in which the planet's full disk fit into a single wide-angle camera frame. Visible at top is the giant hexagon-shaped jet stream that surrounds the planet's north pole. Each side of this huge shape is slightly wider than Earth. The resolution of the 250 natural color wide-angle camera frames comprising this movie is 512x512 pixels, rather than the camera's full resolution of 1024x1024 pixels. Cassini's imaging cameras have the ability to take reduced-size images like these in order to decrease the amount of data storage space required for an observation. The spacecraft began acquiring this sequence of images just after it obtained the images to make a three-panel color mosaic. When it began taking images for this movie sequence, Cassini was 1,847,000 miles (2,973,000 kilometers) from Saturn, with an image scale of 355 kilometers per pixel. When it finished gathering the images, the spacecraft had moved 171,000 miles (275,000 kilometers) closer to the planet, with an image scale of 200 miles (322 kilometers) per pixel. A movie is available at http://photojournal.jpl.nasa.gov/catalog/PIA21047
Space infrared telescope facility wide field and diffraction limited array camera (IRAC)
NASA Technical Reports Server (NTRS)
Fazio, Giovanni G.
1988-01-01
The wide-field and diffraction limited array camera (IRAC) is capable of two-dimensional photometry in either a wide-field or diffraction-limited mode over the wavelength range from 2 to 30 microns with a possible extension to 120 microns. A low-doped indium antimonide detector was developed for 1.8 to 5.0 microns, detectors were tested and optimized for the entire 1.8 to 30 micron range, beamsplitters were developed and tested for the 1.8 to 30 micron range, and tradeoff studies of the camera's optical system performed. Data are presented on the performance of InSb, Si:In, Si:Ga, and Si:Sb array detectors bumpbonded to a multiplexed CMOS readout chip of the source-follower type at SIRTF operating backgrounds (equal to or less than 1 x 10 to the 8th ph/sq cm/sec) and temperature (4 to 12 K). Some results at higher temperatures are also presented for comparison to SIRTF temperature results. Data are also presented on the performance of IRAC beamsplitters at room temperature at both 0 and 45 deg angle of incidence and on the performance of the all-reflecting optical system baselined for the camera.
Clementine Observes the Moon, Solar Corona, and Venus
NASA Technical Reports Server (NTRS)
1997-01-01
In 1994, during its flight, the Clementine spacecraft returned images of the Moon. In addition to the geologic mapping cameras, the Clementine spacecraft also carried two Star Tracker cameras for navigation. These lightweight (0.3 kg) cameras kept the spacecraft on track by constantly observing the positions of stars, reminiscent of the age-old seafaring tradition of sextant/star navigation. These navigation cameras were also used to take some spectacular wide angle images of the Moon.
In this picture the Moon is seen illuminated solely by light reflected from the Earth--Earthshine! The bright glow on the lunar horizon is caused by light from the solar corona; the sun is just behind the lunar limb. Caught in this image is the planet Venus at the top of the frame.
2015-10-15
NASA's Cassini spacecraft zoomed by Saturn's icy moon Enceladus on Oct. 14, 2015, capturing this stunning image of the moon's north pole. A companion view from the wide-angle camera (PIA20010) shows a zoomed out view of the same region for context. Scientists expected the north polar region of Enceladus to be heavily cratered, based on low-resolution images from the Voyager mission, but high-resolution Cassini images show a landscape of stark contrasts. Thin cracks cross over the pole -- the northernmost extent of a global system of such fractures. Before this Cassini flyby, scientists did not know if the fractures extended so far north on Enceladus. North on Enceladus is up. The image was taken in visible green light with the Cassini spacecraft narrow-angle camera. The view was acquired at a distance of approximately 4,000 miles (6,000 kilometers) from Enceladus and at a Sun-Enceladus-spacecraft, or phase, angle of 9 degrees. Image scale is 115 feet (35 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA19660
NASA Astrophysics Data System (ADS)
Karachevtseva, I. P.; Kozlova, N. A.; Kokhanov, A. A.; Zubarev, A. E.; Nadezhdina, I. E.; Patratiy, V. D.; Konopikhin, A. A.; Basilevsky, A. T.; Abdrakhimov, A. M.; Oberst, J.; Haase, I.; Jolliff, B. L.; Plescia, J. B.; Robinson, M. S.
2017-02-01
The Lunar Reconnaissance Orbiter Camera (LROC) system consists of a Wide Angle Camera (WAC) and Narrow Angle Camera (NAC). NAC images (∼0.5 to 1.7 m/pixel) reveal details of the Luna-21 landing site and Lunokhod-2 traverse area. We derived a Digital Elevation Model (DEM) and an orthomosaic for the study region using photogrammetric stereo processing techniques with NAC images. The DEM and mosaic allowed us to analyze the topography and morphology of the landing site area and to map the Lunokhod-2 rover route. The total range of topographic elevation along the traverse was found to be less than 144 m, and the rover encountered slopes of up to 20°. With the orthomosaic tied to the lunar reference frame, we derived coordinates of the Lunokhod-2 landing module and overnight stop points. We identified the exact rover route by following its tracks and determined its total length as 39.16 km, more than the 37 km estimated during the mission; this stood as the distance record for planetary robotic rovers for more than 40 years, until recently.
Towards fish-eye camera based in-home activity assessment.
Bas, Erhan; Erdogmus, Deniz; Ozertem, Umut; Pavel, Misha
2008-01-01
Indoors localization, activity classification, and behavioral modeling are increasingly important for surveillance applications including independent living and remote health monitoring. In this paper, we study the suitability of fish-eye cameras (high-resolution CCD sensors with very-wide-angle lenses) for the purpose of monitoring people in indoors environments. The results indicate that these sensors are very useful for automatic activity monitoring and people tracking. We identify practical and mathematical problems related to information extraction from these video sequences and identify future directions to solve these issues.
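The non-uniform resolution of very-wide-angle images mentioned above follows from the projection model. A minimal sketch contrasting an ideal perspective (pinhole) projection with a common equidistant fisheye model; the focal length here is an arbitrary illustrative value, not a parameter from the paper:

```python
import math

def perspective_radius(f, theta):
    """Pinhole/perspective projection: image radius r = f * tan(theta)."""
    return f * math.tan(theta)

def equidistant_radius(f, theta):
    """Common fisheye model: r = f * theta (theta in radians)."""
    return f * theta

f_mm = 1.8  # assumed focal length in mm, for illustration only
for deg in (10, 45, 80):
    th = math.radians(deg)
    print(f"{deg:2d} deg: perspective r = {perspective_radius(f_mm, th):6.2f} mm, "
          f"equidistant r = {equidistant_radius(f_mm, th):5.2f} mm")
```

The perspective radius diverges toward 90°, which is why pinhole optics cannot reach the near-hemispheric coverage of a fisheye lens, while the equidistant model compresses the periphery, producing the distortion and non-uniform angular resolution that complicate tracking.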
13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Upper wide-angle viewing system for ITER.
Lasnier, C J; McLean, A G; Gattuso, A; O'Neill, R; Smiley, M; Vasquez, J; Feder, R; Smith, M; Stratton, B; Johnson, D; Verlaan, A L; Heijmans, J A C
2016-11-01
The Upper Wide Angle Viewing System (UWAVS) will be installed on five upper ports of ITER. This paper shows major requirements, gives an overview of the preliminary design with reasons for some design choices, examines self-emitted IR light from UWAVS optics and its effect on accuracy, and shows calculations of signal-to-noise ratios for the two-color temperature output as a function of integration time and divertor temperature. Accurate temperature output requires correction for vacuum window absorption vs. wavelength and for self-emitted IR, which requires good measurement of the temperature of the optical components. The anticipated signal-to-noise ratio using presently available IR cameras is adequate for the required 500 Hz frame rate.
A Warping Framework for Wide-Angle Imaging and Perspective Manipulation
ERIC Educational Resources Information Center
Carroll, Robert E.
2013-01-01
Nearly all photographs are created with lenses that approximate an ideal pinhole camera--that is, a perspective projection. This projection has proven useful not only for creating realistic depictions, but also for its expressive flexibility. Beginning in the Renaissance, the notion of perspective gave artists a systematic way to represent…
Schiaparelli Crater Rim and Interior Deposits
NASA Technical Reports Server (NTRS)
1998-01-01
A portion of the rim and interior of the large impact crater Schiaparelli is seen at different resolutions in images acquired October 18, 1997 by the Mars Global Surveyor Orbiter Camera (MOC) and by the Viking Orbiter 1 twenty years earlier. The left image is a MOC wide angle camera 'context' image showing much of the eastern portion of the crater at roughly 1 km (0.6 mi) per picture element. The image is about 390 by 730 km (240 X 450 miles). Shown within the wide angle image is the outline of a portion of the best Viking image (center, 371S53), acquired at a resolution of about 240 m/pixel (790 feet). The area covered is 144 X 144 km (89 X 89 miles). The right image is the high resolution narrow angle camera view. The area covered is very small--3.9 X 10.2 km (2.4 X 6.33 mi)--but is seen at 63 times higher resolution than the Viking image. The subdued relief and bright surface are attributed to blanketing by dust; many small craters have been completely filled in, and only the most recent (and very small) craters appear sharp and bowl-shaped. Some of the small craters are only 10-12 m (30-35 feet) across. Occasional dark streaks on steeper slopes are small debris slides that have probably occurred in the past few decades. The two prominent, narrow ridges in the center of the image may be related to the adjustment of the crater floor to age or the weight of the material filling the basin.
Malin Space Science Systems (MSSS) and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
The Day the Earth Smiled: Sneak Preview
2013-07-22
In this rare image taken on July 19, 2013, the wide-angle camera on NASA's Cassini spacecraft has captured Saturn's rings and our planet Earth and its moon in the same frame. It is only one footprint in a mosaic of 33 footprints covering the entire Saturn ring system (including Saturn itself). At each footprint, images were taken in different spectral filters for a total of 323 images: some were taken for scientific purposes and some to produce a natural color mosaic. This is the only wide-angle footprint that has the Earth-moon system in it. The dark side of Saturn, its bright limb, the main rings, the F ring, and the G and E rings are clearly seen; the limb of Saturn and the F ring are overexposed. The "breaks" in the brightness of Saturn's limb are due to the shadows of the rings on the globe of Saturn, preventing sunlight from shining through the atmosphere in those regions. The E and G rings have been brightened for better visibility. Earth, which is 898 million miles (1.44 billion kilometers) away in this image, appears as a blue dot at center right; the moon can be seen as a fainter protrusion off its right side. An arrow indicates their location in the annotated version. (The two are clearly seen as separate objects in the accompanying composite image PIA14949.) The other bright dots nearby are stars. This is only the third time ever that Earth has been imaged from the outer solar system. The acquisition of this image, along with the accompanying composite narrow- and wide-angle image of Earth and the moon and the full mosaic from which both are taken, marked the first time that inhabitants of Earth knew in advance that their planet was being imaged. That opportunity allowed people around the world to join together in social events to celebrate the occasion. This view looks toward the unilluminated side of the rings from about 20 degrees below the ring plane. 
Images taken using red, green and blue spectral filters were combined to create this natural color view. The images were obtained with the Cassini spacecraft wide-angle camera on July 19, 2013 at a distance of approximately 753,000 miles (1.212 million kilometers) from Saturn, and approximately 898.414 million miles (1.445858 billion kilometers) from Earth. Image scale on Saturn is 43 miles (69 kilometers) per pixel; image scale on the Earth is 53,820 miles (86,620 kilometers) per pixel. The illuminated areas of neither Earth nor the Moon are resolved here. Consequently, the size of each "dot" is the same size that a point of light of comparable brightness would have in the wide-angle camera. http://photojournal.jpl.nasa.gov/catalog/PIA17171
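The per-pixel scales quoted above follow from range multiplied by the camera's instantaneous field of view (IFOV), in the small-angle approximation. The ~60 µrad/pixel IFOV used below for the wide-angle camera is an assumed round number, not a value from the caption:

```python
# Sketch: image scale from range and per-pixel angular resolution (IFOV).
IFOV_RAD = 60e-6  # assumed wide-angle camera IFOV, radians per pixel

def image_scale_km_per_px(range_km, ifov_rad=IFOV_RAD):
    """Small-angle approximation: scale = range * IFOV."""
    return range_km * ifov_rad

# Range to Saturn quoted above: ~1.212 million km
print(image_scale_km_per_px(1.212e6))  # ~73 km/pixel, same order as the quoted 69
```

The small discrepancy against the quoted 69 km/pixel is expected, since the true IFOV and the exact reference surface for the range differ from these assumptions.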
NASA Technical Reports Server (NTRS)
2002-01-01
One of the benefits of the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) Extended Mission is the opportunity to observe how the planet's weather changes during a second full martian year. This picture of Arsia Mons was taken June 19, 2001; southern spring equinox occurred the same day. Arsia Mons is a volcano nearly large enough to cover the state of New Mexico. On this particular day (the first day of spring), the MOC wide angle cameras documented an unusual spiral-shaped cloud within the 110 km (68 mi) diameter caldera--the summit crater--of the giant volcano. Because the cloud is bright both in the red and blue images acquired by the wide angle cameras, it probably consisted mostly of fine dust grains. The cloud's spin may have been induced by winds off the inner slopes of the volcano's caldera walls resulting from the temperature differences between the walls and the caldera floor, or by a vortex as winds blew up and over the caldera. Similar spiral clouds were seen inside the caldera for several days; we don't know if this was a single cloud that persisted throughout that time or one that regenerated each afternoon. Sunlight illuminates this scene from the left/upper left.
Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
The Panoramic Camera (PanCam) Instrument for the ESA ExoMars Rover
NASA Astrophysics Data System (ADS)
Griffiths, A.; Coates, A.; Jaumann, R.; Michaelis, H.; Paar, G.; Barnes, D.; Josset, J.
The recently approved ExoMars rover is the first element of the ESA Aurora programme and is slated to deliver the Pasteur exobiology payload to Mars by 2013. The 0.7 kg Panoramic Camera will provide multispectral stereo images with 65° field-of-view (1.1 mrad/pixel) and high resolution (85 µrad/pixel) monoscopic "zoom" images with 5° field-of-view. The stereo Wide Angle Cameras (WAC) are based on Beagle 2 Stereo Camera System heritage. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission as well as providing multispectral geological imaging, colour and stereo panoramic images, solar images for water vapour abundance and dust optical depth measurements and to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. Additionally the High Resolution Camera (HRC) can be used for high resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls.
Morphology and Dynamics of Jets of Comet 67P Churyumov-Gerasimenko: Early Phase Development
NASA Astrophysics Data System (ADS)
Lin, Zhong-Yi; Ip, Wing-Huen; Lai, Ian-Lin; Lee, Jui-Chi; Pajola, Maurizio; Lara, Luisa; Gutierrez, Pedro; Rodrigo, Rafael; Bodewits, Dennis; A'Hearn, Mike; Vincent, Jean-Baptiste; Agarwal, Jessica; Keller, Uwe; Mottola, Stefano; Bertini, Ivano; Lowry, Stephen; Rozek, Agata; Liao, Ying; Rosetta Osiris Coi Team
2015-04-01
The scientific camera, OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System), onboard the Rosetta spacecraft comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for the wide field of dust and gas coma investigations. The dynamical behavior of jets in the dust coma continuously monitored by using dust filters from the arrival at the comet (August 2014) throughout the mapping phase (Oct. 2014) is described here. The analysis will cover the study of the time variability of jets, the source regions of these jets, the excess brightness of jets relative to the averaged coma brightness, and the brightness distribution of dust jets along the projected distance. The jets detected between August and September originated mostly from the neck region (Hapi). Morphological changes appeared over a time scale of several days in September. The brightness slope of the dust jets is much steeper than the background coma. This might be related to the sublimation or fragmentation of the emitted dust grains. Inter-comparison with results from other experiments will be necessary to understand the difference between the dust emitted from Hapi and those from the head and the body of the nucleus surface. The physical properties of the Hapi jets will be compared to dust jets (and their source regions) to emerge as comet 67P moves around the perihelion.
The Effect of Camera Angle and Image Size on Source Credibility and Interpersonal Attraction.
ERIC Educational Resources Information Center
McCain, Thomas A.; Wakshlag, Jacob J.
The purpose of this study was to examine the effects of two nonverbal visual variables (camera angle and image size) on variables developed in a nonmediated context (source credibility and interpersonal attraction). Camera angle and image size were manipulated in eight video taped television newscasts which were subsequently presented to eight…
Clementine Observes the Moon, Solar Corona, and Venus
1999-06-12
In 1994, during its flight, NASA's Clementine spacecraft returned images of the Moon. In addition to the geologic mapping cameras, the Clementine spacecraft also carried two Star Tracker cameras for navigation. These lightweight (0.3 kg) cameras kept the spacecraft on track by constantly observing the positions of stars, reminiscent of the age-old seafaring tradition of sextant/star navigation. These navigation cameras were also used to take some spectacular wide angle images of the Moon. In this picture the Moon is seen illuminated solely by light reflected from the Earth--Earthshine! The bright glow on the lunar horizon is caused by light from the solar corona; the sun is just behind the lunar limb. Caught in this image is the planet Venus at the top of the frame. http://photojournal.jpl.nasa.gov/catalog/PIA00434
Quality Assessment of 3d Reconstruction Using Fisheye and Perspective Sensors
NASA Astrophysics Data System (ADS)
Strecha, C.; Zoller, R.; Rutishauser, S.; Brot, B.; Schneider-Zapp, K.; Chovancova, V.; Krull, M.; Glassey, L.
2015-03-01
Recent mathematical advances, growing alongside the use of unmanned aerial vehicles, have not only overcome the restriction of roll and pitch angles during flight but also enabled us to apply non-metric cameras in photogrammetric method, providing more flexibility for sensor selection. Fisheye cameras, for example, advantageously provide images with wide coverage; however, these images are extremely distorted and their non-uniform resolutions make them more difficult to use for mapping or terrestrial 3D modelling. In this paper, we compare the usability of different camera-lens combinations, using the complete workflow implemented in Pix4Dmapper to achieve the final terrestrial reconstruction result of a well-known historical site in Switzerland: the Chillon Castle. We assess the accuracy of the outcome acquired by consumer cameras with perspective and fisheye lenses, comparing the results to a laser scanner point cloud.
A Wide Field of View Plasma Spectrometer
Skoug, Ruth M.; Funsten, Herbert O.; Moebius, Eberhard; ...
2016-07-01
Here we present a fundamentally new type of space plasma spectrometer, the wide field of view plasma spectrometer, whose field of view is >1.25π ster using fewer resources than traditional methods. The enabling component is analogous to a pinhole camera with an electrostatic energy-angle filter at the image plane. Particle energy-per-charge is selected with a tunable bias voltage applied to the filter plate relative to the pinhole aperture plate. For a given bias voltage, charged particles from different directions are focused by different angles to different locations. Particles with appropriate locations and angles can transit the filter plate and are measured using a microchannel plate detector with a position-sensitive anode. Full energy and angle coverage are obtained using a single high-voltage power supply, resulting in considerable resource savings and allowing measurements at fast timescales. Lastly, we present laboratory prototype measurements and simulations demonstrating the instrument concept and discuss optimizations of the instrument design for application to space measurements.
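The energy-per-charge selection described above can be caricatured as a band-pass test centered on a value proportional to the bias voltage. A toy model; the analyzer constant and 5% passband are illustrative assumptions, not instrument parameters:

```python
# Toy model of electrostatic energy-per-charge filtering: for bias voltage V,
# only particles whose E/q lies in a narrow band around k*V reach the detector.
ANALYZER_CONSTANT = 1.0   # assumed E/q (eV per charge) accepted per volt of bias
PASSBAND = 0.05           # assumed fractional width of the accepted band

def transmits(energy_per_charge_eV, bias_V,
              k=ANALYZER_CONSTANT, band=PASSBAND):
    """True if the particle's E/q falls within the filter's passband."""
    center = k * bias_V
    return abs(energy_per_charge_eV - center) <= band * center

print(transmits(1000.0, 1000.0))  # True: on the passband center
print(transmits(1200.0, 1000.0))  # False: outside the 5% band
```

Sweeping the single bias voltage then steps the accepted E/q band through the full energy range, which is the resource saving the abstract highlights.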
Challenges and solutions for high performance SWIR lens design
NASA Astrophysics Data System (ADS)
Gardner, M. C.; Rogers, P. J.; Wilde, M. F.; Cook, T.; Shipton, A.
2016-10-01
Shortwave infrared (SWIR) cameras are becoming increasingly attractive due to the improving size and resolution, and decreasing prices, of InGaAs focal plane arrays (FPAs). The rapid development of competitively priced HD-performance SWIR cameras has not been matched in SWIR imaging lenses, with the result that the lens is now more likely than the FPA to be the limiting factor in imaging quality. Adapting existing visible-band lens designs by re-coating for SWIR will improve total transmission, but diminished image-quality metrics such as MTF, and in particular degraded large-field-angle performance (vignetting, field curvature, and distortion), are serious consequences. To meet this challenge, original SWIR solutions are presented, including a wide-field-of-view fixed-focal-length lens for commercial machine vision (CMV) and a wide-angle, small, lightweight defence lens, and their relevant design considerations are discussed. Issues restricting suitable glass types are examined. The index and dispersion properties at SWIR wavelengths can differ significantly from their visible values, resulting in unusual glass combinations when matching doublet elements. The chosen materials simultaneously allow athermalization of the design and provide matched CTEs in the elements of doublets. Recently, thinned backside-illuminated InGaAs devices have made Vis-SWIR cameras viable. The SWIR band is sufficiently close to the visible that the same constituent materials can be used for AR coatings covering both bands. Keeping the lens short and the mass low can easily result in high incidence angles, which in turn complicates coating design, especially when extended beyond SWIR into the visible band. This paper also explores the potential performance of wideband Vis-SWIR AR coatings.
Wide-field fundus imaging with trans-palpebral illumination.
Toslak, Devrim; Thapa, Damber; Chen, Yanjun; Erol, Muhammet Kazim; Paul Chan, R V; Yao, Xincheng
2017-01-28
In conventional fundus imaging devices, transpupillary illumination is used for illuminating the inside of the eye. In this method, the illumination light is directed into the posterior segment of the eye through the cornea and passes the pupillary area. As a result of sharing the pupillary area between the illumination beam and the observation path, pupil dilation is typically necessary for wide-angle fundus examination, and the field of view is inherently limited. An alternative approach is to deliver light through the sclera. It is possible to image a wider retinal area with trans-scleral illumination; however, the requirement of physical contact between the illumination probe and the sclera is a drawback of this method. We report here trans-palpebral illumination as a new method to deliver the light through the upper eyelid (palpebra). For this study, we used a 1.5 mm diameter fiber with a warm white LED light source. To illuminate the inside of the eye, the fiber illuminator was placed at the location corresponding to the pars plana region. A custom-designed optical system was attached to a digital camera for retinal imaging. The optical system contained a 90 diopter ophthalmic lens and a 25 diopter relay lens. The ophthalmic lens collected light coming from the posterior of the eye and formed an aerial image between the ophthalmic and relay lenses. The aerial image was captured by the camera through the relay lens. An adequate illumination level was obtained to capture wide-angle fundus images within ocular safety limits, defined by the ISO 15004-2: 2007 standard. This novel trans-palpebral illumination approach enables wide-angle fundus photography without eyeball contact and pupil dilation.
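Since the optical train above is specified in diopters, converting power to focal length (f = 1/P) makes the geometry easier to picture:

```python
# Thin-lens conversion from lens power (diopters) to focal length (mm).
def focal_length_mm(power_diopters):
    """f = 1/P, with P in diopters (1/m) and f returned in millimeters."""
    return 1000.0 / power_diopters

print(focal_length_mm(90))  # 90 D ophthalmic lens -> ~11.1 mm focal length
print(focal_length_mm(25))  # 25 D relay lens      -> 40.0 mm focal length
```

The short-focal-length ophthalmic lens forms the aerial image close to the eye, and the longer-focal-length relay then re-images it onto the camera sensor, consistent with the arrangement described in the abstract.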
Surveillance Using Multiple Unmanned Aerial Vehicles
2009-03-01
The BATCAM wingspan was 21” vs. Jodeh’s 9.1 ft, the BATCAM’s propulsion was electric vs. Jodeh’s gas engine, and cameras were body fixed vs. gimballed.

Table 3.1: BATCAM Camera FOV Angles
Angle             Front Camera   Side Camera
Depression Angle  49°            39°
Horizontal FOV    48°            48°
Vertical FOV      40°            40°

... by a quiet electric motor. The batteries can be recharged with a car cigarette lighter in less than an hour. Assembly of the wing airframe takes less than a minute, and
Reconditioning of Cassini Narrow-Angle Camera
2002-07-23
These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera optics, followed by the successful removal of the haze by warming treatments.
Multi-Angle Snowflake Camera Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuefer, Martin; Bailey, J.
2016-07-01
The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36°. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' fields of view within a 10° angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
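Fall speed from successive triggers, as described above, is simply beam separation over transit time. A minimal sketch; the 32 mm emitter spacing is an assumed value for illustration, not a number from the handbook:

```python
# Fall speed from two successive IR trigger crossings a known distance apart.
TRIGGER_SPACING_M = 0.032  # assumed vertical separation of the emitter pairs (m)

def fall_speed_m_s(dt_seconds, spacing_m=TRIGGER_SPACING_M):
    """Speed = spacing / time between the upper and lower trigger events."""
    if dt_seconds <= 0:
        raise ValueError("trigger interval must be positive")
    return spacing_m / dt_seconds

print(fall_speed_m_s(0.016))  # 2.0 m/s for a 16 ms trigger interval
```

The same trigger event also fires the three cameras, so each photographed hydrometeor carries its own speed measurement.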
Upper wide-angle viewing system for ITER
Lasnier, C. J.; McLean, A. G.; Gattuso, A.; ...
2016-08-15
The Upper Wide Angle Viewing System (UWAVS) will be installed on five upper ports of ITER. Here, this paper shows major requirements, gives an overview of the preliminary design with reasons for some design choices, examines self-emitted IR light from UWAVS optics and its effect on accuracy, and shows calculations of signal-to-noise ratios for the two-color temperature output as a function of integration time and divertor temperature. Accurate temperature output requires correction for vacuum window absorption vs. wavelength and for self-emitted IR, which requires good measurement of the temperature of the optical components. The anticipated signal-to-noise ratio using presently available IR cameras is adequate for the required 500 Hz frame rate.
Angle of sky light polarization derived from digital images of the sky under various conditions.
Zhang, Wenjing; Cao, Yu; Zhang, Xuanzhe; Yang, Yi; Ning, Yu
2017-01-20
Skylight polarization is used for navigation by some birds and insects. Skylight polarization also has potential for human navigation applications. Its advantages include relative immunity from interference and the absence of error accumulation over time. However, there are presently few examples of practical applications for polarization navigation technology. The main reason is its weak robustness during cloudy weather conditions. In this paper, the real-time measurement of the sky light polarization pattern across the sky has been achieved with a wide field of view camera. The images were processed under a new reference coordinate system to clearly display the symmetrical distribution of angle of polarization with respect to the solar meridian. A new algorithm for the extraction of the image axis of symmetry is proposed, in which the real-time azimuth angle between the camera and the solar meridian is accurately calculated. Our experimental results under different weather conditions show that polarization navigation has high accuracy, is strongly robust, and performs well during fog and haze, clouds, and strong sunlight.
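The per-pixel angle-of-polarization computation such a system performs can be sketched with the standard Stokes-parameter reduction from intensity images taken through polarizers at 0°, 45°, and 90°. This is a textbook formula, not the paper's specific symmetry-axis algorithm:

```python
import math

# Angle of polarization (AoP) from three polarizer-filtered intensities:
#   Q = I0 - I90,  U = 2*I45 - I0 - I90,  AoP = 0.5 * atan2(U, Q)
def angle_of_polarization(i0, i45, i90):
    """Return AoP in radians, measured from the 0-degree polarizer axis."""
    q = i0 - i90
    u = 2.0 * i45 - i0 - i90
    return 0.5 * math.atan2(u, q)

# Fully polarized light at 45 degrees: i0 == i90, i45 maximal
aop = angle_of_polarization(0.5, 1.0, 0.5)
print(math.degrees(aop))  # 45.0
```

Mapping this angle over the whole sky and finding its axis of symmetry (the solar meridian) is what yields the heading estimate described in the abstract.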
Evaluation of a novel collimator for molecular breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilland, David R.; Welch, Benjamin L.; Lee, Seungjoon
Here, this study investigated a novel gamma camera for molecular breast tomosynthesis (MBT), which is a nuclear breast imaging method that uses limited angle tomography. The camera is equipped with a variable angle, slant-hole (VASH) collimator that allows the camera to remain close to the breast throughout the acquisition. The goal of this study was to evaluate the spatial resolution and count sensitivity of this camera and to compare contrast and contrast-to-noise ratio (CNR) with conventional planar imaging using an experimental breast phantom. Methods The VASH collimator mounts to a commercial gamma camera for breast imaging that uses a pixelated (3.2 mm), 15 × 20 cm NaI crystal. Spatial resolution was measured in planar images over a range of distances from the collimator (30-100 mm) and a range of slant angles (–25° to 25°) using 99mTc line sources. Spatial resolution was also measured in reconstructed MBT images including in the depth dimension. The images were reconstructed from data acquired over the -25° to 25° angular range using an iterative algorithm adapted to the slant-hole geometry. Sensitivity was measured over the range of slant angles using a disk source. Measured spatial resolution and sensitivity were compared to theoretical values. Contrast and CNR were measured using a breast phantom containing spherical lesions (6.2 mm and 7.8 mm diameter) and positioned over a range of depths in the phantom. The MBT and planar methods had equal scan time, and the count density in the breast phantom data was similar to that in clinical nuclear breast imaging. The MBT method used an iterative reconstruction algorithm combined with a postreconstruction Metz filter. Results The measured spatial resolution in planar images agreed well with theoretical calculations over the range of distances and slant angles. The measured FWHM was 9.7 mm at 50 mm distance. 
In reconstructed MBT images, the spatial resolution in the depth dimension was approximately 2.2 mm greater than in the other two dimensions due to the limited angle data. The measured count sensitivity agreed closely with theory over all slant angles when using a wide energy window. At 0° slant angle, measured sensitivity was 19.7 counts s⁻¹ μCi⁻¹ with the open energy window and 11.2 counts s⁻¹ μCi⁻¹ with a 20% wide photopeak window (126 to 154 keV). The measured CNR in the MBT images was significantly greater than in the planar images for all but the lowest CNR cases, where lesion detectability was extremely low for both MBT and planar imaging. The 7.8 mm lesion at 37 mm depth was marginally detectable in the planar image but easily visible in the MBT image. The improved CNR with MBT was due to a large improvement in contrast, which outweighed the increase in image noise. Conclusion: The spatial resolution and count sensitivity measurements with the prototype MBT system matched theoretical calculations, and the measured CNR in breast phantom images was generally greater with the MBT system than with conventional planar imaging. These results demonstrate the potential of the proposed MBT system to improve lesion detection in nuclear breast imaging.
Evaluation of a novel collimator for molecular breast tomosynthesis.
Gilland, David R; Welch, Benjamin L; Lee, Seungjoon; Kross, Brian; Weisenberger, Andrew G
2017-11-01
This study investigated a novel gamma camera for molecular breast tomosynthesis (MBT), which is a nuclear breast imaging method that uses limited angle tomography. The camera is equipped with a variable angle, slant-hole (VASH) collimator that allows the camera to remain close to the breast throughout the acquisition. The goal of this study was to evaluate the spatial resolution and count sensitivity of this camera and to compare contrast and contrast-to-noise ratio (CNR) with conventional planar imaging using an experimental breast phantom. The VASH collimator mounts to a commercial gamma camera for breast imaging that uses a pixelated (3.2 mm), 15 × 20 cm NaI crystal. Spatial resolution was measured in planar images over a range of distances from the collimator (30-100 mm) and a range of slant angles (-25° to 25°) using 99mTc line sources. Spatial resolution was also measured in reconstructed MBT images including in the depth dimension. The images were reconstructed from data acquired over the -25° to 25° angular range using an iterative algorithm adapted to the slant-hole geometry. Sensitivity was measured over the range of slant angles using a disk source. Measured spatial resolution and sensitivity were compared to theoretical values. Contrast and CNR were measured using a breast phantom containing spherical lesions (6.2 mm and 7.8 mm diameter) positioned over a range of depths in the phantom. The MBT and planar methods had equal scan time, and the count density in the breast phantom data was similar to that in clinical nuclear breast imaging. The MBT method used an iterative reconstruction algorithm combined with a postreconstruction Metz filter. The measured spatial resolution in planar images agreed well with theoretical calculations over the range of distances and slant angles. The measured FWHM was 9.7 mm at 50 mm distance.
In reconstructed MBT images, the spatial resolution in the depth dimension was approximately 2.2 mm greater than in the other two dimensions due to the limited angle data. The measured count sensitivity agreed closely with theory over all slant angles when using a wide energy window. At 0° slant angle, measured sensitivity was 19.7 counts s⁻¹ μCi⁻¹ with the open energy window and 11.2 counts s⁻¹ μCi⁻¹ with a 20% wide photopeak window (126 to 154 keV). The measured CNR in the MBT images was significantly greater than in the planar images for all but the lowest CNR cases, where lesion detectability was extremely low for both MBT and planar imaging. The 7.8 mm lesion at 37 mm depth was marginally detectable in the planar image but easily visible in the MBT image. The improved CNR with MBT was due to a large improvement in contrast, which outweighed the increase in image noise. The spatial resolution and count sensitivity measurements with the prototype MBT system matched theoretical calculations, and the measured CNR in breast phantom images was generally greater with the MBT system than with conventional planar imaging. These results demonstrate the potential of the proposed MBT system to improve lesion detection in nuclear breast imaging. © 2017 American Association of Physicists in Medicine.
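The contrast and CNR figures of merit discussed in the abstract above can be illustrated with a short sketch. This uses one common convention (relative contrast over background, divided by the background standard deviation for CNR); the paper may use a slightly different normalization, and the synthetic Poisson counts here are purely illustrative, not the phantom data.

```python
import numpy as np

def contrast_and_cnr(lesion_roi, background_roi):
    # Contrast: relative signal excess of the lesion over background.
    # CNR: the same excess divided by the background noise.
    ml = lesion_roi.mean()
    mb = background_roi.mean()
    contrast = (ml - mb) / mb
    cnr = (ml - mb) / background_roi.std()
    return contrast, cnr

# Synthetic Poisson count data standing in for phantom ROIs.
rng = np.random.default_rng(0)
bkg = rng.poisson(100, size=1000).astype(float)
lesion = rng.poisson(150, size=100).astype(float)
c, cnr = contrast_and_cnr(lesion, bkg)  # roughly 0.5 contrast, CNR near 5
```

The sketch makes the abstract's point concrete: a reconstruction method that raises lesion contrast can improve CNR even if it also raises image noise somewhat.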
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Kohn, Silvia
1993-01-01
The pilot's ability to derive control-oriented visual field information from teleoperated helmet-mounted displays in nap-of-the-earth flight is investigated. The visual field with these types of displays, commonly used in Apache and Cobra helicopter night operations, originates from a relatively narrow field-of-view forward-looking infrared (FLIR) camera, gimbal-mounted at the nose of the aircraft and slaved to the pilot's line of sight in order to obtain a wide-angle field of regard. Pilots have encountered considerable difficulties in controlling the aircraft with these devices. Experimental simulator results presented here indicate that part of these difficulties can be attributed to head/camera slaving system phase lags and errors. In the presence of voluntary head rotation, these slaving system imperfections are shown to impair the control-oriented visual field information vital in vehicular control, such as the perception of the anticipated flight path or the vehicle yaw rate. Since, in the presence of slaving system imperfections, the pilot will tend to minimize head rotation, the full wide-angle field of regard of the line-of-sight-slaved helmet-mounted display is not always fully utilized.
NASA Astrophysics Data System (ADS)
Ott, T.; Drolshagen, E.; Koschny, D.; Güttler, C.; Tubiana, C.; Frattin, E.; Agarwal, J.; Sierks, H.; Bertini, I.; Barbieri, C.; Lamy, P. I.; Rodrigo, R.; Rickman, H.; A'Hearn, M. F.; Barucci, M. A.; Bertaux, J.-L.; Boudreault, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; De Cecco, M.; Deller, J.; Feller, C.; Fornasier, S.; Fulle, M.; Geiger, B.; Gicquel, A.; Groussin, O.; Gutiérrez, P. J.; Hofmann, M.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Knollenberg, J.; Kovacs, G.; Kramm, J. R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Lin, Z.-Y.; López-Moreno, J. J.; Marzari, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Shi, X.; Thomas, N.; Vincent, J.-B.; Poppe, B.
2017-07-01
The OSIRIS (optical, spectroscopic and infrared remote imaging system) instrument on board the ESA Rosetta spacecraft collected data of 67P/Churyumov-Gerasimenko for over 2 yr. OSIRIS consists of two cameras, a Narrow Angle Camera and a Wide Angle Camera. For specific imaging sequences related to the observation of dust aggregates in 67P's coma, the two cameras were operated simultaneously. The two cameras are mounted 0.7 m apart from each other; this baseline yields a parallax shift of the apparent particle trails on the analysed images that is inversely proportional to their distance. Thanks to such shifts, the distance between observed dust aggregates and the spacecraft was determined. This method works for particles closer than 6000 m to the spacecraft and requires very few assumptions. We found over 250 particles in a suitable distance range, with sizes of some centimetres, masses in the range of 10⁻⁶ to 10² kg and a mean velocity of about 2.4 m s⁻¹ relative to the nucleus. Furthermore, the spectral slope was analysed, showing a decrease in the median spectral slope of the particles with time. The farther a particle is from the spacecraft, the fainter its signal; this bias was counterbalanced by a debiasing step. Moreover, the dust mass-loss rate of the nucleus could be computed, as well as the Afρ of the comet around perihelion. The summed-up dust mass-loss rate for the mass bins 10⁻⁴ to 10² kg is almost 8300 kg s⁻¹.
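The parallax-ranging idea in the abstract above reduces to small-angle geometry: the two cameras, 0.7 m apart, see a nearby particle trail shifted by an angle equal to baseline over distance. A minimal sketch, where the 10-pixel shift and the 20 µrad per pixel plate scale are hypothetical illustration values, not figures from the paper:

```python
import math

def particle_distance(baseline_m, parallax_rad):
    # Small-angle stereo parallax: a particle at distance d shifts by
    # roughly baseline / d radians between the two camera views.
    return baseline_m / math.tan(parallax_rad)

# Hypothetical numbers: a trail shifted by 10 pixels at an assumed
# plate scale of 20 microradians per pixel, with the 0.7 m baseline.
shift_px = 10
plate_scale_rad_per_px = 20e-6
d = particle_distance(0.7, shift_px * plate_scale_rad_per_px)  # ≈ 3500 m
```

With these numbers the particle lands at about 3.5 km, comfortably inside the under-6000 m range the method is stated to cover; beyond that, the shift shrinks below a resolvable fraction of a pixel.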
A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles
NASA Technical Reports Server (NTRS)
Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.
2009-01-01
The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black and white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured to mount multiple vehicles, and act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black and white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black and white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.
Modeling of the ITER-like wide-angle infrared thermography view of JET.
Aumeunier, M-H; Firdaouss, M; Travère, J-M; Loarer, T; Gauthier, E; Martin, V; Chabaud, D; Humbert, E
2012-10-01
Infrared (IR) thermography systems are mandatory to ensure safe plasma operation in fusion devices. However, IR measurements are much more complicated in a metallic environment because of the spurious contributions of reflected fluxes. This paper presents a full predictive photonic simulation able to accurately assess the surface temperature measurement with classical IR thermography from a given plasma scenario, taking into account the optical properties of plasma-facing component (PFC) materials. This simulation has been applied to the ITER-like wide-angle infrared camera view of JET and compared with experimental data. The consequences and effects of the low emissivity and of the bidirectional reflectivity distribution function used in the model for the metallic PFCs on the contribution of the reflected flux in the analysis are discussed.
2013-12-23
The globe of Saturn, seen here in natural color, is reminiscent of a holiday ornament in this wide-angle view from NASA's Cassini spacecraft. The characteristic hexagonal shape of Saturn's northern jet stream, somewhat yellow here, is visible. At the pole lies a Saturnian version of a high-speed hurricane, eye and all. This view is centered on terrain at 75 degrees north latitude, 120 degrees west longitude. Images taken using red, green and blue spectral filters were combined to create this natural-color view. The images were taken with the Cassini spacecraft wide-angle camera on July 22, 2013. This view was acquired at a distance of approximately 611,000 miles (984,000 kilometers) from Saturn. Image scale is 51 miles (82 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA17175
Addressing challenges of modulation transfer function measurement with fisheye lens cameras
NASA Astrophysics Data System (ADS)
Deegan, Brian M.; Denny, Patrick E.; Zlokolica, Vladimir; Dever, Barry; Russell, Laura
2015-03-01
Modulation transfer function (MTF) is a well-defined and accepted method of measuring image sharpness. The slanted-edge test, as defined in ISO 12233, is a standard method of calculating MTF and is widely used for lens alignment and auto-focus algorithm verification. However, there are a number of challenges to consider when measuring MTF in cameras with fisheye lenses. Due to trade-offs related to Petzval curvature, planarity of the image plane is difficult to achieve in fisheye lenses. It is therefore critical to be able to accurately measure sharpness throughout the entire image, particularly for lens alignment. One challenge for fisheye lenses is that, because of the radial distortion, the slanted edges will have different angles depending on the location within the image and on the distortion profile of the lens. Previous work in the literature indicates that MTF measurements are robust for angles between 2 and 10 degrees; outside of this range, MTF measurements become unreliable. Also, the slanted edge itself will be curved by the lens distortion, causing further measurement problems. This study summarises the difficulties in the use of MTF for sharpness measurement in fisheye lens cameras, and proposes mitigations and alternative methods.
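The core of the edge-based MTF measurement discussed above can be sketched in one dimension: differentiate the edge-spread function (ESF) to get the line-spread function (LSF), then take the magnitude of its Fourier transform. This is a deliberately simplified version of the ISO 12233 procedure, omitting the slanted-edge projection and 4x oversampling steps that the standard uses to beat pixel sampling; the Gaussian-blurred step edge is synthetic.

```python
import numpy as np
from math import erf, sqrt

def mtf_from_esf(esf):
    # Differentiate the edge-spread function to get the line-spread
    # function, then take the normalized magnitude of its spectrum.
    lsf = np.diff(esf)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]  # 1.0 at zero spatial frequency

# Synthetic edge: an ideal step blurred by a Gaussian PSF (sigma = 2 px).
x = np.arange(-64, 64)
sigma = 2.0
esf = np.array([0.5 * (1 + erf(xi / (sigma * sqrt(2)))) for xi in x])
mtf = mtf_from_esf(esf)  # falls off monotonically with frequency
```

For a Gaussian blur the resulting curve is itself Gaussian in frequency, so a wider PSF (softer lens) pulls the MTF down faster; the fisheye complications in the abstract arise because the edge angle and curvature vary across the distorted field, breaking the straight-edge assumption this sketch relies on.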
The MESSENGER Earth Flyby: Results from the Mercury Dual Imaging System
NASA Astrophysics Data System (ADS)
Prockter, L. M.; Murchie, S. L.; Hawkins, S. E.; Robinson, M. S.; Shelton, R. G.; Vaughan, R. M.; Solomon, S. C.
2005-12-01
The MESSENGER (MErcury Surface, Space ENvironment, Geochemistry, and Ranging) spacecraft was launched from Cape Canaveral Air Force Station, Fla., on 3 August 2004. It returned to Earth for a gravity assist on 2 August 2005, providing an exceptional opportunity for the Science Team to perform instrument calibrations and to test some of the data acquisition sequences that will be used to meet Mercury science goals. The Mercury Dual Imaging System (MDIS), one of seven science instruments on MESSENGER, consists of a wide-angle and a narrow-angle imager that together can map landforms, track variations in surface color, and carry out stereogrammetry. The two imagers are mounted on a pivot platform that enables the instrument to point in a different direction from the spacecraft boresight, allowing great flexibility and increased imaging coverage. During the week prior to the closest approach to Earth, MDIS acquired a number of images of the Moon for radiometric calibration and to test optical navigation sequences that will be used to target planetary flybys. Twenty-four hours before closest approach, images of the Earth were acquired with 11 filters of the wide-angle camera. After MDIS flew over the nightside of the Earth, additional color images centered on South America were obtained at sufficiently high resolution to discriminate small-scale features such as the Amazon River and Lake Titicaca. During its departure from Earth, MDIS acquired a sequence of images taken in three filters every 4 minutes over a period of 24 hours. These images have been assembled into a movie of a crescent Earth that begins as South America slides across the terminator into darkness and continues for one full Earth rotation. This movie and the other images have provided a successful test of the sequences that will be used during the MESSENGER Mercury flybys in 2008 and 2009 and have demonstrated the high quality of the MDIS wide-angle camera.
Wide-Field-of-View, High-Resolution, Stereoscopic Imager
NASA Technical Reports Server (NTRS)
Prechtl, Eric F.; Sedwick, Raymond J.
2010-01-01
A device combines video feeds from multiple cameras to provide wide-field-of-view, high-resolution, stereoscopic video to the user. The prototype under development consists of two camera assemblies, one for each eye. One of these assemblies incorporates a mounting structure with multiple cameras attached at offset angles. The video signals from the cameras are fed to a central processing platform where each frame is color processed and mapped into a single contiguous wide-field-of-view image. Because the resolution of most display devices is typically smaller than the processed map, a cropped portion of the video feed is output to the display device. The positioning of the cropped window will likely be controlled through the use of a head tracking device, allowing the user to turn his or her head side-to-side or up and down to view different portions of the captured image. There are multiple options for the display of the stereoscopic image. The use of head mounted displays is one likely implementation; however, the use of 3D projection technologies is another potential technology under consideration. The technology can be adapted in a multitude of ways. The computing platform is scalable, such that the number, resolution, and sensitivity of the cameras can be leveraged to improve image resolution and field of view. Miniaturization efforts can be pursued to shrink the package down for better mobility. Power savings studies can be performed to enable unattended, remote sensing packages. Image compression and transmission technologies can be incorporated to enable an improved telepresence experience.
1986-01-24
Range: 236,000 km (147,000 mi). Resolution: 33 km (20 mi). P-29525 B/W. This Voyager 2 image reveals a continuous distribution of small particles throughout the Uranus ring system. This unique geometry, the highest phase angle at which Voyager imaged the rings, allows us to see lanes of fine dust particles not visible from other viewing angles. All the previously known rings are visible; however, some of the brightest features in the image are bright dust lanes not previously seen. The combination of this unique geometry and a long, 96-second exposure allowed this spectacular observation, acquired through the clear filter of Voyager 2's wide angle camera. The long exposure produced a noticeable, non-uniform smear, as well as streaks due to trailed stars.
Wide-Field Optic for Autonomous Acquisition of Laser Link
NASA Technical Reports Server (NTRS)
Page, Norman A.; Charles, Jeffrey R.; Biswas, Abhijit
2011-01-01
An innovation reported in Two-Camera Acquisition and Tracking of a Flying Target, NASA Tech Briefs, Vol. 32, No. 8 (August 2008), p. 20, used a commercial fish-eye lens and an electronic imaging camera for initially locating objects, with subsequent handover to an actuated narrow-field camera. But this operated against a dark-sky background. An improved solution involves an optical design based on custom optical components for the wide-field optical system that directly addresses the key limitations in acquiring a laser signal from a moving source such as an aircraft or a spacecraft. The first challenge was to increase the light-collection entrance aperture diameter, which was approximately 1 mm in the first prototype. The new design presented here increases this entrance aperture diameter to 4.2 mm, equivalent to a more than 16 times larger collection area. One of the trades made in realizing this improvement was to restrict the field of view to 80° in elevation and 360° in azimuth. This trade stems from practical considerations: laser beam propagation through the excessively high air mass in the line of sight (LOS) at low elevation angles results in vulnerability to severe atmospheric turbulence and attenuation. An additional benefit of the new design is that the large entrance aperture is maintained even at large off-axis angles when the optic is pointed at zenith. The second critical limitation, implementing spectral filtering in the design, was tackled by collimating the light prior to focusing it onto the focal plane. This allows the placement of the narrow spectral filter in the collimated portion of the beam. For the narrow-band spectral filter to function properly, it is necessary to adequately control the range of incident angles at which received light intercepts the filter. When this angle is restricted via collimation, narrower spectral filtering can be implemented.
The collimated beam (and the filter) must be relatively large to reduce the incident angle down to only a few degrees. In the presented embodiment, the filter diameter is more than ten times larger than the entrance aperture; specifically, the filter has a clear aperture of about 51 mm. The optical design is refractive and comprises nine custom refractive elements and an interference filter. The restricted maximum angle through the narrow-band filter ensures the efficient use of a 2-nm noise-equivalent-bandwidth optical filter at low elevation angles (where the range is longest), at the expense of less efficiency at high elevations, which can be tolerated because the range at high elevation angles is shorter. The image circle is 12 mm in diameter, mapped to 80° × 360° of sky, centered on the zenith.
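The aperture-growth claim above is a one-line geometry check: light collection scales with aperture area, i.e. with the square of the diameter, so going from roughly 1 mm to 4.2 mm gives a bit more than a 16-fold gain, matching the abstract.

```python
def collection_area_gain(d_new_mm, d_old_mm):
    # Light-collection area scales with the square of the aperture diameter.
    return (d_new_mm / d_old_mm) ** 2

gain = collection_area_gain(4.2, 1.0)  # 17.64, i.e. "more than 16 times"
```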
A Wide-Angle Camera for the Mobile Asteroid Surface Scout (MASCOT) on Hayabusa-2
NASA Astrophysics Data System (ADS)
Schmitz, N.; Koncz, A.; Jaumann, R.; Hoffmann, H.; Jobs, D.; Kachlicki, J.; Michaelis, H.; Mottola, S.; Pforte, B.; Schroeder, S.; Terzer, R.; Trauthan, F.; Tschentscher, M.; Weisse, S.; Ho, T.-M.; Biele, J.; Ulamec, S.; Broll, B.; Kruselburger, A.; Perez-Prieto, L.
2014-04-01
JAXA's Hayabusa-2 mission, an asteroid sample return mission, is scheduled for launch in December 2014, for a rendezvous with the C-type asteroid 1999 JU3 in 2018. MASCOT, the Mobile Asteroid Surface Scout [1], is a small lander designed to deliver ground truth for the orbiter's remote measurements, support the selection of sampling sites, and provide context for the returned samples. MASCOT's main objective is to investigate the landing site's geomorphology, the internal structure, texture and composition of the regolith (dust, soil and rocks), and the thermal, mechanical, and magnetic properties of the surface. MASCOT comprises a payload of four scientific instruments: a camera, a radiometer, a magnetometer and a hyperspectral microscope. The camera (MASCOT CAM) was designed and built by DLR's Institute of Planetary Research, together with Airbus DS Germany.
7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...
7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Sky brightness and color measurements during the 21 August 2017 total solar eclipse.
Bruns, Donald G; Bruns, Ronald D
2018-06-01
The sky brightness was measured during the partial phases and during totality of the 21 August 2017 total solar eclipse. A tracking CCD camera with color filters and a wide-angle lens allowed measurements across a wide field of view, recording images every 10 s. The partially and totally eclipsed Sun was kept behind an occulting disk attached to the camera, allowing direct brightness measurements from 1.5° to 38° from the Sun. During the partial phases, the sky brightness as a function of time closely followed the integrated intensity of the unobscured fraction of the solar disk. A redder sky was measured close to the Sun just before totality, caused by the redder color of the exposed solar limb. During totality, a bluer sky was measured, dimmer than the normal sky by a factor of 10,000. Suggestions for enhanced measurements at future eclipses are offered.
Towards a Single Sensor Passive Solution for Automated Fall Detection
Belshaw, Michael; Taati, Babak; Snoek, Jasper; Mihailidis, Alex
2012-01-01
Falling in the home is one of the major challenges to independent living among older adults. The associated costs, coupled with a rapidly growing elderly population, are placing a burden on healthcare systems worldwide that will swiftly become unbearable. To facilitate expeditious emergency care, we have developed an artificially intelligent camera-based system that automatically detects if a person within the field-of-view has fallen. The system addresses concerns raised in earlier work and the requirements of a widely deployable in-home solution. The presented prototype utilizes a consumer-grade camera modified with a wide-angle lens. Machine learning techniques applied to carefully engineered features allow the system to classify falls at high accuracy while maintaining invariance to lighting, environment and the presence of multiple moving objects. This paper describes the system, outlines the algorithms used and presents empirical validation of its effectiveness. PMID:22254671
Mars Global Surveyor: 7 Years in Orbit!
NASA Technical Reports Server (NTRS)
2004-01-01
12 September 2004. Today, 12 September 2004, the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) team celebrates 7 Earth years orbiting Mars. MGS first reached the red planet and performed its critical orbit insertion burn on 12 September 1997. Over the past 7 years, MOC has returned over 170,000 images; its narrow angle camera has covered about 4.5% of the surface, and its wide angle cameras have viewed 100% of the planet nearly every day. At this time, MOC is not acquiring data because Mars is on the other side of the Sun relative to Earth. This period, known as Solar Conjunction, occurs about once every 26 months. During Solar Conjunction, no radio communications from spacecraft that are orbiting or have landed on Mars can be received. MOC was turned off on 7 September and is expected to resume operations on 25 September 2004, when Mars re-emerges from behind the Sun. The rotating color image of Mars shown here was compiled from MOC red and blue wide angle daily global images acquired exactly 1 Mars year ago, on 26 October 2002 (Ls 86.4°). In other words, Mars today (12 September 2004) should look about the same as the view provided here. Presently, Mars is in very late northern spring, and the north polar cap has retreated almost to its summer configuration. Water ice clouds form each afternoon at this time of year over the large volcanoes in the Tharsis and Elysium regions. A discontinuous belt of clouds forms over the martian equator; it is most prominent north of the Valles Marineris trough system. In the southern hemisphere, it is late autumn and the giant Hellas Basin floor is nearly white with seasonal frost cover. The south polar cap is not visible; it is enveloped in seasonal darkness. The northern summer and southern winter seasons will begin on 20 September 2004.
6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...
6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Objective for monitoring the corona discharge
NASA Astrophysics Data System (ADS)
Obrezkov, Andrey; Rodionov, Andrey Yu.; Pisarev, Viktor N.; Chivanov, Alexsey N.; Baranov, Yuri P.; Korotaev, Valery V.
2016-04-01
Remote optoelectronic probing is one of the most relevant aspects of overhead power line maintenance. By installing such systems on a helicopter, for example, it becomes possible to monitor overhead transmission line status and to search for damaged parts of the lines. Thermal and UV cameras are used for more effective diagnostics. UV systems are fitted with filters that attenuate the visible spectrum, which is an undesired signal; these systems also have a wide viewing angle for better coverage and proper diagnostics. For even greater effectiveness, it is better to use several spectral channels, such as UV and IR, since such spectral selection provides good noise reduction. Experimental results on the spectral parameters of a wide-viewing-angle multispectral objective for such systems are provided in this report, along with data on the point spread function, UV and IR scattering indices, and technical requirements for the detectors.
In-Flight performance of MESSENGER's Mercury dual imaging system
Hawkins, S.E.; Murchie, S.L.; Becker, K.J.; Selby, C.M.; Turner, F.S.; Noble, M.W.; Chabot, N.L.; Choo, T.H.; Darlington, E.H.; Denevi, B.W.; Domingue, D.L.; Ernst, C.M.; Holsclaw, G.M.; Laslo, N.R.; Mcclintock, W.E.; Prockter, L.M.; Robinson, M.S.; Solomon, S.C.; Sterner, R.E.
2009-01-01
The Mercury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft, launched in August 2004 and planned for insertion into orbit around Mercury in 2011, has already completed two flybys of the innermost planet. The Mercury Dual Imaging System (MDIS) acquired nearly 2500 images from the first two flybys and viewed portions of Mercury's surface not viewed by Mariner 10 in 1974-1975. Mercury's proximity to the Sun and its slow rotation present challenges to the thermal design for a camera on an orbital mission around Mercury. In addition, strict limitations on spacecraft pointing and the highly elliptical orbit create challenges in attaining coverage at desired geometries and relatively uniform spatial resolution. The instrument designed to meet these challenges consists of dual imagers, a monochrome narrow-angle camera (NAC) with a 1.5° field of view (FOV) and a multispectral wide-angle camera (WAC) with a 10.5° FOV, co-aligned on a pivoting platform. The focal-plane electronics of each camera are identical and use a 1024 × 1024 charge-coupled device detector. The cameras are passively cooled but use diode heat pipes and phase-change-material thermal reservoirs to maintain the thermal configuration during the hot portions of the orbit. Here we present an overview of the instrument design and how the design meets its technical challenges. We also review results from the first two flybys, discuss the quality of MDIS data from the initial periods of data acquisition and how that compares with requirements, and summarize how in-flight tests are being used to improve the quality of the instrument calibration. © 2009 SPIE.
NASA Technical Reports Server (NTRS)
1989-01-01
This pair of Voyager 2 images (FDS 11446.21 and 11448.10), two 591-s exposures obtained through the clear filter of the wide angle camera, show the full ring system with the highest sensitivity. Visible in this figure are the bright, narrow N53 and N63 rings, the diffuse N42 ring, and (faintly) the plateau outside of the N53 ring (with its slight brightening near 57,500 km).
Costless Platform for High Resolution Stereoscopic Images of a High Gothic Facade
NASA Astrophysics Data System (ADS)
Héno, R.; Chandelier, L.; Schelstraete, D.
2012-07-01
In October 2011, students of the PPMD specialized master's degree (Photogrammetry, Positioning and Deformation Measurement) at the French ENSG (IGN's School of Geomatics, the Ecole Nationale des Sciences Géographiques) were asked to survey the main facade of the cathedral of Amiens, which is very complex in both size and decoration. Although it was first planned to use a lift truck for the image survey, budget considerations and a taste for experimentation led the project in another direction: images shot from ground level with a long-focal-length camera will be combined with complementary images shot, using a wide-angle camera fixed on a horizontal 2.5-meter-long pole, from the higher galleries available on the main facade. This heterogeneous image survey is being processed by the PPMD master's degree students during the current academic year. Among other products, 3D point clouds will be computed for specific parts of the facade from both sources of images. If the proposed device and methodology for obtaining full image coverage of the main facade prove fruitful, the image acquisition phase will be completed later by another team. This article focuses on the production of 3D point clouds from wide-angle images of the rose window of the main facade.
Visual field information in Nap-of-the-Earth flight by teleoperated Helmet-Mounted displays
NASA Technical Reports Server (NTRS)
Grunwald, Arthur J.; Kohn, S.; Merhav, S. J.
1991-01-01
The human ability to derive Control-Oriented Visual Field Information from teleoperated Helmet-Mounted displays in Nap-of-the-Earth flight is investigated. The visual field with these types of displays originates from a Forward Looking Infrared Radiation Camera, gimbal-mounted at the front of the aircraft and slaved to the pilot's line-of-sight to obtain wide-angle visual coverage. Although these displays have proved effective in Apache and Cobra helicopter night operations, they demand very high pilot proficiency and impose a heavy workload. Experimental work presented in the paper has shown that part of the difficulties encountered in vehicular control by means of these displays can be attributed to the narrow viewing aperture and to phase lags in the head/camera slaving system. Both these shortcomings will impair visuo-vestibular coordination when voluntary head rotation is present. This might result in errors in estimating the Control-Oriented Visual Field Information vital to vehicular control, such as the vehicle yaw rate or the anticipated flight path, or might even lead to visuo-vestibular conflicts (motion sickness). Since, under these conditions, the pilot will tend to minimize head rotation, the full wide-angle coverage of the Helmet-Mounted Display, provided by the line-of-sight slaving system, is not always fully utilized.
Calibration of Action Cameras for Photogrammetric Purposes
Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo
2014-01-01
The use of action cameras for photogrammetry purposes is not widespread, because until recently the images provided by the sensors, in either still or video capture mode, were not large enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of an action camera, we must apply a careful and reliable self-calibration prior to any photogrammetric procedure; this is relatively difficult because of the camera's short focal length and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capture modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
A single camera photogrammetry system for multi-angle fast localization of EEG electrodes.
Qian, Shuo; Sheng, Yang
2011-11-01
Photogrammetry has become an effective method for determining electroencephalography (EEG) electrode positions in three dimensions (3D). Capturing multi-angle images of the electrodes on the head is a fundamental objective in the design of a photogrammetry system for EEG localization. Methods in previous studies have all been based on either a rotating camera or multiple cameras, which are time-consuming or not cost-effective. This study presents a novel photogrammetry system that can acquire multi-angle head images simultaneously from a single camera position. By aligning two planar mirrors at an angle of 51.4°, seven views of the head with 25 electrodes are captured simultaneously by a digital camera placed in front of them. A complete set of algorithms for electrode recognition, matching, and 3D reconstruction is developed. The elapsed time for the whole localization procedure is about 3 min, and camera calibration computation takes about 1 min after measurement of the calibration points. The positioning accuracy, with a maximum error of 1.19 mm, is acceptable. Experimental results demonstrate that the proposed system provides a fast and cost-effective method for EEG positioning.
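The 51.4° mirror angle is just 360°/7: two mirrors opened at that angle tile the full circle into seven sectors, so the camera sees the head directly plus six mirror views. A minimal sketch of the geometry (our own illustration, not the authors' code; the point coordinates are arbitrary) models each mirror as a plane through the origin and generates virtual viewpoints by reflection:

```python
import numpy as np

def reflect(point, normal):
    """Reflect a 3D point across a plane through the origin with the given normal."""
    n = normal / np.linalg.norm(normal)
    return point - 2.0 * np.dot(point, n) * n

# Two mirrors opened at 360/7 ~= 51.43 degrees divide the circle into
# seven sectors: one direct view of the head plus six reflected views.
mirror_angle = 360.0 / 7.0
half = np.radians(mirror_angle / 2.0)

# Unit normals of the two mirror planes (both contain the vertical z axis).
n1 = np.array([np.sin(half), -np.cos(half), 0.0])
n2 = np.array([np.sin(half), np.cos(half), 0.0])

head = np.array([2.0, 0.5, 0.0])   # an arbitrary point on the head
v1 = reflect(head, n1)             # first-order virtual view in mirror 1
v2 = reflect(head, n2)             # first-order virtual view in mirror 2

print(round(mirror_angle, 2))  # 51.43
```

Higher-order views come from alternating reflections in the two mirrors; each reflection preserves distances, which is what lets the seven views be treated as seven calibrated camera positions.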
7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...
7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...
2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Smartphone-Guided Needle Angle Selection During CT-Guided Procedures.
Xu, Sheng; Krishnasamy, Venkatesh; Levy, Elliot; Li, Ming; Tse, Zion Tsz Ho; Wood, Bradford John
2018-01-01
In CT-guided intervention, translation from a planned needle insertion angle to the actual insertion angle is estimated only with the physician's visuospatial abilities. An iPhone app was developed to reduce reliance on operator ability to estimate and reproduce angles. The iPhone app overlays the planned angle on the smartphone's camera display in real time based on the smartphone's orientation. The needle's angle is selected by visually comparing the actual needle with the guideline in the display. If the smartphone's screen is perpendicular to the planned path, the smartphone shows the Bull's-Eye View mode, in which the angle is selected after the needle's hub overlaps its tip in the camera view. In phantom studies, we evaluated the accuracies of the hardware, the Guideline mode, and the Bull's-Eye View mode and showed the app's clinical efficacy. A proof-of-concept clinical case was also performed. The hardware accuracy was 0.37° ± 0.27° (mean ± SD). The mean error and navigation time were 1.0° ± 0.9° and 8.7 ± 2.3 seconds for a senior radiologist with 25 years' experience and 1.5° ± 1.3° and 8.0 ± 1.6 seconds for a junior radiologist with 4 years' experience. The accuracy of the Bull's-Eye View mode was 2.9° ± 1.1°. Combined CT and smartphone guidance was significantly more accurate than CT-only guidance for the first needle pass (p = 0.046), which led to a smaller final targeting error (mean distance from needle tip to target, 2.5 vs 7.9 mm). Mobile devices can be useful for guiding needle-based interventions. The hardware is low cost and widely available. The method is accurate, effective, and easy to implement.
An ordinary camera in an extraordinary location: Outreach with the Mars Webcam
NASA Astrophysics Data System (ADS)
Ormston, T.; Denis, M.; Scuka, D.; Griebel, H.
2011-09-01
The European Space Agency's Mars Express mission was launched in 2003 and was Europe's first mission to Mars. On-board was a small camera designed to provide ‘visual telemetry’ of the separation of the Beagle-2 lander. After achieving its goal it was shut down while the primary science mission of Mars Express got underway. In 2007 this camera was reactivated by the flight control team of Mars Express for the purpose of providing public education and outreach, turning it into the ‘Mars Webcam’. The camera is a small, 640×480 pixel colour CMOS camera with a wide-angle 30°×40° field of view. This makes it very similar in almost every way to the average home PC webcam. The major difference is that this webcam is not in an average location but is instead in orbit around Mars. On a strict basis of non-interference with the primary science activities, the camera is turned on to provide unique wide-angle views of the planet below. A highly automated process ensures that the observations are scheduled on the spacecraft and then uploaded to the internet as rapidly as possible. There is no intermediate stage, so that visitors to the Mars Webcam blog serve as ‘citizen scientists’. Full raw datasets and processing instructions are provided along with a mechanism to allow visitors to comment on the blog. Members of the public are encouraged to use this in either a personal or an educational context and work with the images. We then take their excellent work and showcase it back on the blog. We even apply techniques developed by them to improve the data and webcam experience for others. The accessibility and simplicity of the images also makes the data ideal for educational use, especially as educational projects can then be showcased on the site as inspiration for others.
The oft-neglected target audience of space enthusiasts is also important as this allows them to participate as part of an interplanetary instrument team.This paper will cover the history of the project and the technical background behind using the camera and linking the results to an accessible blog format. It will also cover the outreach successes of the project, some of the contributions from the Mars Webcam community, opportunities to use and work with the Mars Webcam and plans for future uses of the camera.
An effective rectification method for lenselet-based plenoptic cameras
NASA Astrophysics Data System (ADS)
Jin, Jing; Cao, Yiwei; Cai, Weijia; Zheng, Wanlu; Zhou, Ping
2016-10-01
The lenselet-based plenoptic camera has recently drawn a lot of attention in the field of computational photography. The additional information inherent in the light field allows a wide range of applications, but some preliminary processing of the raw image is necessary before further operations. In this paper, an effective method is presented for rotation rectification of the raw image. The rotation is caused by imperfect positioning of the micro-lens array relative to the sensor plane in commercially available Lytro plenoptic cameras. The key to our method is locating the center of each micro-lens image, each of which is projected by one micro-lens. Because of vignetting, the pixel values at the centers of the micro-lens images are higher than those at the peripheries. A mask is applied to probe the micro-lens image and locate the center area by finding the local maximum response. The error of the center coordinate estimate is corrected, and the angle of rotation is computed via a subsequent line fitting. The algorithm is performed on two images captured by different Lytro cameras. The angles of rotation are -0.3600° and -0.0621°, respectively, and the rectified raw image is useful and reliable for further operations, such as extraction of the sub-aperture images. The experimental results demonstrate that our method is efficient and accurate.
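The final step, recovering the rotation angle from a line fit through the detected micro-lens centers, can be sketched as follows. This is our own illustration under assumed values (a 14-pixel lenselet pitch, synthetic noise-free centers), not the authors' implementation:

```python
import numpy as np

def rotation_angle_deg(centers_xy):
    """Fit a line through micro-lens image centers; return its angle in degrees."""
    x, y = centers_xy[:, 0], centers_xy[:, 1]
    slope, _intercept = np.polyfit(x, y, 1)   # least-squares line fit
    return float(np.degrees(np.arctan(slope)))

# Synthetic row of centers on an assumed 14-pixel lenselet pitch, rotated by
# -0.36 degrees (the magnitude reported for one of the Lytro cameras).
true_angle = -0.36
theta = np.radians(true_angle)
xs = np.arange(50) * 14.0
centers = np.column_stack([xs * np.cos(theta), xs * np.sin(theta)])

print(round(rotation_angle_deg(centers), 2))  # -0.36
```

With real center estimates the fit absorbs per-lenselet localization noise, which is why a line fit over a whole row is more robust than measuring the offset of any single micro-lens image.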
Marcot, Bruce G.; Jorgenson, M. Torre; DeGange, Anthony R.
2014-01-01
5. A Canon® Rebel 3Ti with a Sigma zoom lens (18–200 mm focal length). The Drift® HD-170 and GoPro® Hero3 cameras were secured to the struts and underwing for nadir (direct downward) imaging. The Panasonic® and Canon® cameras were each hand-held for oblique-angle landscape images, shooting through the airplanes’ windows, targeting both general landscape conditions as well as landscape features of special interest, such as tundra fire scars and landslips. The Drift® and GoPro® cameras each were set for time-lapse photography at 5-second intervals for overlapping coverage. Photographs from all cameras (100 percent .jpg format) were date- and time-synchronized to geographic positioning system waypoints taken during the flights, also at 5-second intervals, providing precise geotagging (latitude-longitude) of all files. All photographs were adjusted for color saturation and gamma, and nadir photographs from the Drift® and GoPro® cameras were corrected for their 170° wide-angle lens distortion. EXIF (exchangeable image file format) data on camera settings and geotagging were extracted into spreadsheet databases. An additional 1 hour, 20 minutes, and 43 seconds of high-resolution videos were recorded at 60 frames per second with the GoPro® camera along selected transect segments, and also were image-adjusted and corrected for lens distortion. Geotagged locations of 12,395 nadir photographs from the Drift® and GoPro® cameras were overlaid in a geographic information system (ArcMap 10.0) onto a map of 44 ecotypes (land- and water-cover types) of the Arctic Network study area. Presence and area of each ecotype occurring within a geographic information system window centered on the location of each photograph were recorded and included in the spreadsheet databases. All original and adjusted photographs, videos, geographic positioning system flight tracks, and photograph databases are available by contacting ascweb@usgs.gov.
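The timestamp-based geotagging step, matching each photo's EXIF time to the nearest GPS waypoint, can be sketched as below. The waypoint format, times, and coordinates are illustrative assumptions, not the study's data:

```python
from datetime import datetime, timedelta
import bisect

def nearest_waypoint(photo_time, waypoints):
    """waypoints: time-sorted list of (datetime, lat, lon); return index of nearest fix."""
    times = [w[0] for w in waypoints]
    i = bisect.bisect_left(times, photo_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(waypoints)]
    return min(candidates, key=lambda j: abs((times[j] - photo_time).total_seconds()))

# Hypothetical flight track logged at 5-second intervals, as in the survey.
t0 = datetime(2013, 7, 1, 12, 0, 0)
track = [(t0 + timedelta(seconds=5 * k), 68.0 + 0.001 * k, -155.0) for k in range(100)]

photo_time = t0 + timedelta(seconds=42)      # hypothetical EXIF timestamp
idx = nearest_waypoint(photo_time, track)
print(track[idx][0] - t0)                    # 0:00:40
```

Because both the cameras and the GPS logged at 5-second intervals, the worst-case mismatch between a photo and its assigned waypoint is 2.5 seconds of flight.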
3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...
3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Thermal Effects on Camera Focal Length in MESSENGER Star Calibration and Orbital Imaging
NASA Astrophysics Data System (ADS)
Burmeister, S.; Elgner, S.; Preusker, F.; Stark, A.; Oberst, J.
2018-04-01
We analyse images taken by the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft to characterize the camera's thermal response in the harsh thermal environment near Mercury. Specifically, we study thermally induced variations in the focal length of the Mercury Dual Imaging System (MDIS). Across several hundred images of star fields, the Wide Angle Camera (WAC) typically captures up to 250 stars in one frame of the panchromatic channel. We measure star positions and relate them to the known star coordinates taken from the Tycho-2 catalogue. We solve for camera pointing, the focal length parameter, and two non-symmetrical distortion parameters for each image. Using data from the temperature sensors on the camera focal plane, we model a linear focal length function of the form f(T) = A0 + A1 T. Next, we use images from MESSENGER's orbital mapping mission. We deal with large image blocks, typically used for the production of high-resolution digital terrain models (DTMs). We analyzed images from the combined quadrangles H03 and H07, a selected region covered by approx. 10,600 images, in which we identified about 83,900 tiepoints. Using bundle block adjustments, we solved for the unknown coordinates of the control points, the pointing of the camera, and the camera's focal length. We then fit the above linear function with respect to the focal-plane temperature. As a result, we find a complex response of the camera to the thermal conditions of the spacecraft. To first order, we see a linear increase of approx. 0.0107 mm per degree temperature for the Narrow-Angle Camera (NAC). This is in agreement with the observed thermal response seen in images of the panchromatic channel of the WAC. Unfortunately, further comparisons of results from the two methods, both of which use different portions of the available image data, are limited. If left uncorrected, these effects may pose significant difficulties for photogrammetric analysis; specifically, they may introduce erroneous long-wavelength trends in topographic models.
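The linear thermal model f(T) = A0 + A1·T is a plain least-squares fit. A minimal sketch, with A1 set to the reported ~0.0107 mm/deg NAC slope but the nominal focal length A0 and the temperature samples assumed for illustration:

```python
import numpy as np

# Illustrative fit of the linear thermal model f(T) = A0 + A1*T.
# A1 matches the reported ~0.0107 mm/deg NAC slope; A0 and the
# temperatures are assumed values for this sketch, not mission data.
temps = np.array([-10.0, 0.0, 10.0, 20.0, 30.0])   # focal-plane temperatures, deg C
A0_true, A1_true = 550.0, 0.0107                   # mm, mm/deg (A0 assumed)
focal = A0_true + A1_true * temps                  # noise-free "measurements"

A1, A0 = np.polyfit(temps, focal, 1)               # recover slope and intercept
print(round(A1, 4), round(A0, 1))  # 0.0107 550.0
```

In the paper the focal lengths entering such a fit come from star-field calibration in one case and from bundle block adjustment in the other, which is what allows the two independent slope estimates to be compared.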
A position and attitude vision measurement system for wind tunnel slender model
NASA Astrophysics Data System (ADS)
Cheng, Lei; Yang, Yinong; Xue, Bindang; Zhou, Fugen; Bai, Xiangzhi
2014-11-01
A position and attitude vision measurement system for a drop-test slender model in a wind tunnel is designed and developed. The system uses two high-speed cameras: one placed to the side of the model, and another placed where it can look up at the model. Simple symbols are set on the model. The main idea of the system is image matching between projection images of the 3D digital model and the images captured by the cameras. First, we estimate the pitch angles, the roll angles, and the position of the centroid of the model by recognizing the symbols in the images captured by the side camera. Then, based on the estimated attitude information, a series of projection images of the 3D digital model is generated for a series of candidate yaw angles. Finally, these projection images are matched against the image captured by the looking-up camera, and the yaw angle corresponding to the best-matching projection image is taken as the yaw angle of the model. Simulation experiments are conducted, and the results show that the maximum attitude measurement error is less than 0.05°, which meets the demands of wind tunnel testing.
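The render-and-match loop for the yaw angle can be sketched as below. This is a toy stand-in: the "projection" is a synthetic rotated bar rather than a real 3D-model rendering, and normalized cross-correlation is an assumed similarity measure (the abstract does not name its matching metric):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-shape images."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def render(yaw_deg, shape=(32, 32)):
    """Stand-in for the 3D-model projection: a soft bar rotated by yaw_deg."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    t = np.radians(yaw_deg)
    # signed distance from each pixel to a line through the image center
    d = (xx - w / 2) * np.sin(t) - (yy - h / 2) * np.cos(t)
    return np.exp(-(d / 2.0) ** 2)   # brightness falls off away from the bar

captured = render(12.0)                 # pretend this came from the look-up camera
candidates = np.arange(0.0, 30.0, 0.5)  # candidate yaw angles, degrees
scores = [ncc(render(y), captured) for y in candidates]
best_yaw = float(candidates[int(np.argmax(scores))])
print(best_yaw)  # 12.0
```

In the real system the pitch, roll, and centroid estimated from the side camera fix all other degrees of freedom, so the search over projections is one-dimensional in yaw, which keeps the matching tractable at high frame rates.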
2016-10-17
Pandora is seen here, in isolation beside Saturn's kinked and constantly changing F ring. Pandora (near upper right) is 50 miles (81 kilometers) wide. The moon has an elongated, potato-like shape (see PIA07632). Two faint ringlets are visible within the Encke Gap, near lower left. The gap is about 202 miles (325 kilometers) wide. The much narrower Keeler Gap, which lies outside the Encke Gap, is maintained by the diminutive moon Daphnis (not seen here). This view looks toward the sunlit side of the rings from about 23 degrees above the ring plane. The image was taken in visible light with the Cassini spacecraft narrow-angle camera on Aug. 12, 2016. The view was acquired at a distance of approximately 907,000 miles (1.46 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 113 degrees. Image scale is 6 miles (9 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20504
Auto-converging stereo cameras for 3D robotic tele-operation
NASA Astrophysics Data System (ADS)
Edmondson, Richard; Aycock, Todd; Chenault, David
2012-06-01
Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow operator control of the camera convergence angle. This adjustment allowed optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an automatic convergence algorithm in a field-programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than on adjustment of the vision system. The auto-convergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
NASA Technical Reports Server (NTRS)
Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.
1994-01-01
Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. This data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water that simulates zero gravity through neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR, and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
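The error metric described above can be sketched as follows. This is our own illustration: the field width, the barrel-distortion model, and the coefficient are assumed values chosen to show why the error concentrates at the edges of a wide-angle lens, not the study's calibration data:

```python
import numpy as np

def percent_errors(known_xy, measured_xy, field_width=640.0):
    """Distance from known grid points to their digitized positions,
    as a percentage of an assumed 640-pixel field width."""
    d = np.linalg.norm(measured_xy - known_xy, axis=1)
    return 100.0 * d / field_width

# Simple barrel-distortion model to mimic a wide-angle lens: points are pulled
# toward the image center by a factor that grows with squared radius.
known = np.array([[0.0, 0.0], [100.0, 0.0], [300.0, 0.0]])  # px from center
k = -5e-7                                                   # assumed coefficient
r2 = (known ** 2).sum(axis=1)
measured = known * (1.0 + k * r2)[:, None]

errs = percent_errors(known, measured)
# The error vanishes at the center and grows toward the lens edge, which is
# why avoiding the outermost regions of the image keeps the error smaller.
```

Under this model the displacement scales with the cube of the radius, so restricting measurements to the central region cuts the error far more than proportionally.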
Spectral methods to detect cometary minerals with OSIRIS on board Rosetta
NASA Astrophysics Data System (ADS)
Oklay, N.; Vincent, J.-B.; Sierks, H.
2013-09-01
Comet 67P/Churyumov-Gerasimenko is going to be observed by the OSIRIS scientific imager (Keller et al. 2007) on board ESA's spacecraft Rosetta, with a combination of 12 filters in the wavelength range of 250-1000 nm for the narrow angle camera (NAC) and 14 filters in the wavelength range of 240-720 nm for the wide angle camera (WAC). NAC filters are suited to surface composition studies, while WAC filters are designed for gas and radical emission studies. In order to investigate the composition of the comet surface from the observed images, we need to understand how to detect different minerals and which compositional information can be derived from the NAC filters. Therefore, the most common cometary silicates (e.g., enstatite and forsterite) are investigated, together with two hydrated silicates (serpentine and smectite), for the determination of the spectral methods. Laboratory data for the selected minerals are collected from the RELAB database (http://www.planetary.brown.edu/relabdocs/relab.htm), and absolute spectra of the minerals as observed through the OSIRIS NAC filters are calculated. Due to the limited spectral range of the laboratory data, the far-UV and neutral-density filters of the NAC are excluded from this analysis. The NAC filters considered in this study are presented in Table 1, and the number of collected laboratory spectra is presented in Table 2. Detection and separation of the minerals will allow us not only to study the surface composition but also to study observed composition changes due to cometary activity during the mission.
NASA Astrophysics Data System (ADS)
Castro Marín, J. M.; Brown, V. J. G.; López Jiménez, A. C.; Rodríguez Gómez, J.; Rodrigo, R.
2001-05-01
The optical, spectroscopic infrared remote imaging system (OSIRIS) is an instrument carried on board the European Space Agency spacecraft Rosetta, to be launched in January 2003 to study comet Wirtanen in situ. The electronic design of the mechanism controller board (MCB) system of the two OSIRIS optical cameras, the narrow angle camera and the wide angle camera, is described here. The system comprises two boards mounted on an aluminum frame as part of an electronics box that contains the power supply and the digital processor unit of the instrument. The mechanisms controlled by the MCB for each camera are the front door assembly and a filter wheel assembly. The front door assembly for each camera is driven by a four-phase, permanent-magnet stepper motor. Each filter wheel assembly consists of two eight-filter wheels, each driven by a four-phase, variable-reluctance stepper motor. Each motor, in all the assemblies, also contains a redundant set of four stator phase windings that can be energized separately or in parallel with the main windings. All stepper motors are driven in both directions using full-step unipolar operation. The MCB also performs general housekeeping data acquisition for the OSIRIS instrument, i.e., mechanism position encoders and temperature measurements. The electronic design is novel in its use of field-programmable gate array devices, which avoid the now-traditional approach of a system controlled by microcontrollers and software. Electrical tests of the engineering model have been performed successfully, and the system is ready for space qualification after environmental testing. This system may be of interest to institutions involved in future space experiments with similar mechanism-control needs.
Spatial Variations of Spectral Properties of (21) Lutetia as Observed by OSIRIS/Rosetta
NASA Astrophysics Data System (ADS)
Leyrat, Cedric; Sierks, H.; Barbieri, C.; Barucci, A.; Da Deppo, V.; De Leon, J.; Fulchignoni, M.; Fornasier, S.; Groussin, O.; Hviid, S. F.; Jorda, L.; Keller, H. U.; La Forgia, F.; Lara, L.; Lazzarin, M.; Magrin, S.; Marchi, S.; Thomas, N.; Schroder, S. E.; OSIRIS Team
2010-10-01
On July 10, 2010, the Rosetta ESA/NASA spacecraft successfully flew by the asteroid (21) Lutetia, which thereby became the largest asteroid observed by a space probe. The closest approach occurred at 15:45 UTC at a relative speed of 15 km/s and a distance of 3160 km. The Narrow Angle Camera (NAC) and the Wide Angle Camera (WAC) of the OSIRIS instrument onboard Rosetta acquired images at phase angles ranging from almost zero to more than 150 degrees. The best spatial resolution (60 m/pixel) revealed a very complex topography with numerous features and varying crater surface densities. Spectrophotometric analysis of the data suggests spatial variations of the albedo and spectral properties at the surface of the asteroid, at least in the northern hemisphere. Numerous sets of data have been obtained at different wavelengths from 270 nm to 980 nm. We will first present a color-color analysis of the data in order to locate regions where surface variegation is present. We will also present a more detailed study of spectral properties using the shape model and different statistical methods. Possible variations of the surface spectral properties with ground slope and gravity-field orientation will be discussed as well.
NASA Technical Reports Server (NTRS)
2005-01-01
During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles). This image is a narrow-angle, clear-filter image which was processed to enhance the contrast in brightness and sharpness of visible features. Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of this image. This image was obtained when the Cassini spacecraft was above 25 degrees south, 134 degrees west latitude and longitude. The Sun-Mimas-spacecraft angle was 45 degrees and north is at the top. The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo. For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .
Memoris, A Wide Angle Camera For Bepicolombo
NASA Astrophysics Data System (ADS)
Cremonese, G.; Memoris Team
In order to respond to ESA's Announcement of Opportunity for the BepiColombo payload, we are working on a wide angle camera concept named MEMORIS (MErcury MOderate Resolution Imaging System). MEMORIS will perform stereoscopic imaging of the whole Mercury surface using two different channels at +/- 20 degrees from the nadir point. It will achieve a spatial resolution of 50 m per pixel at 400 km from the surface (peri-Herm), corresponding to a vertical resolution of about 75 m with the stereo performance. The scientific objectives to be addressed by MEMORIS may be identified as follows: estimation of surface age based on crater counting; crater morphology and degradation; stratigraphic sequence of geological units; identification of volcanic features and related deposits; origin of plain units from morphological observations; distribution and type of tectonic structures; determination of relative ages among the structures based on cross-cutting relationships; 3D tectonics; global mineralogical mapping of the main geological units; and identification of weathering products. The last two items will draw on the multispectral capabilities of the camera, utilizing 8 to 12 (TBD) broad-band filters. MEMORIS will be equipped with a further channel devoted to observations of the tenuous exosphere. It will look at the limb on a given arc of the BepiColombo orbit; in so doing, it will observe the exosphere above a surface latitude range of 25-75 degrees in the northern hemisphere. The exosphere images will be obtained above the surface just observed by the other two channels, in an attempt to find possible relationships, as ground-based observations suggest. The exospheric channel will have four narrow-band filters centered on the sodium and potassium emissions and the adjacent continua.
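The quoted vertical resolution is consistent with the stereo geometry. As a back-of-the-envelope check (our own arithmetic, not a figure from the abstract): two channels at +/-20° off nadir give a base-to-height ratio of 2 tan 20°, and the height sensitivity for a one-pixel matching error is the ground pixel size divided by that ratio:

```python
import math

# Back-of-the-envelope stereo check (our own arithmetic, not from the abstract).
ground_pixel = 50.0                        # m per pixel at 400 km altitude
off_nadir = math.radians(20.0)             # each channel is 20 deg off nadir
base_to_height = 2.0 * math.tan(off_nadir)

# Height sensitivity for a one-pixel disparity (matching) error.
height_sensitivity = ground_pixel / base_to_height
print(round(height_sensitivity, 1))  # 68.7
```

The result, roughly 69 m per pixel of disparity error, is the same order as the ~75 m vertical resolution quoted for the instrument.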
The Fringe Reading Facility at the Max-Planck-Institut fuer Stroemungsforschung
NASA Astrophysics Data System (ADS)
Becker, F.; Meier, G. E. A.; Wegner, H.; Timm, R.; Wenskus, R.
1987-05-01
A Mach-Zehnder interferometer is used for optical flow measurements in a transonic wind tunnel. Holographic interferograms are reconstructed by illumination with a He-Ne-laser and viewed by a video camera through wide angle optics. This setup was used for investigating industrial double exposure holograms of truck tires in order to develop methods of automatic recognition of certain manufacturing faults. Automatic input is achieved by a transient recorder digitizing the output of a TV camera and transferring the digitized data to a PDP11-34. Interest centered around sequences of interferograms showing the interaction of vortices with a profile and subsequent emission of sound generated by this process. The objective is the extraction of quantitative data which relates to the emission of noise.
NASA Technical Reports Server (NTRS)
2003-01-01
MGS MOC Release No. MOC2-344, 28 April 2003
This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image mosaic was constructed from data acquired by the MOC red wide angle camera. The large, circular feature in the upper left is Aram Chaos, an ancient impact crater filled with layered sedimentary rock that was later disrupted and eroded to form a blocky, 'chaotic' appearance. To the southeast of Aram Chaos, in the lower right of this picture, is Iani Chaos. The light-toned patches amid the large blocks of Iani Chaos are known from higher-resolution MOC images to be layered, sedimentary rock outcrops. The picture center is near 0.5°N, 20°W. Sunlight illuminates the scene from the left/upper left.
Junocam: Juno's Outreach Camera
NASA Astrophysics Data System (ADS)
Hansen, C. J.; Caplinger, M. A.; Ingersoll, A.; Ravine, M. A.; Jensen, E.; Bolton, S.; Orton, G.
2017-11-01
Junocam is a wide-angle camera designed to capture the unique polar perspective of Jupiter offered by Juno's polar orbit. Junocam's four-color images include the best spatial resolution ever acquired of Jupiter's cloudtops. Junocam will look for convective clouds and lightning in thunderstorms and derive the heights of the clouds. Junocam will support Juno's radiometer experiment by identifying any unusual atmospheric conditions such as hotspots. Junocam is on the spacecraft explicitly to reach out to the public and share the excitement of space exploration. The public is an essential part of our virtual team: amateur astronomers will supply ground-based images for use in planning, the public will weigh in on which images to acquire, and the amateur image processing community will help process the data.
Virtual displays for 360-degree video
NASA Astrophysics Data System (ADS)
Gilbert, Stephen; Boonsuk, Wutthigrai; Kelly, Jonathan W.
2012-03-01
In this paper we describe a novel approach for comparing users' spatial cognition when using different depictions of 360-degree video on a traditional 2D display. By using virtual cameras within a game engine and texture mapping of these camera feeds to an arbitrary shape, we were able to offer users a 360-degree interface composed of four 90-degree views, two 180-degree views, or one 360-degree view of the same interactive environment. An example experiment is described using these interfaces. This technique for creating alternative displays of wide-angle video facilitates the exploration of how compressed or fish-eye distortions affect spatial perception of the environment and can benefit the creation of interfaces for surveillance and remote system teleoperation.
NASA Technical Reports Server (NTRS)
Jenniskens, Peter; Nugent, David; Murthy, Jayant; Tedesco, Ed; DeVincenzi, Donal L. (Technical Monitor)
2000-01-01
In November 1997, the Midcourse Space Experiment satellite (MSX) was deployed to observe the Leonid shower from space. The shower lived up to expectations, with abundant bright fireballs. Twenty-nine meteors were detected by a wide-angle, visible-wavelength camera near the limb of the Earth in a 48-minute interval, and three meteors by the narrow-field camera. This amounts to a meteoroid influx of (5.5 +/- 0.6) x 10^-5 per sq km per hour for masses greater than 0.3 gram. The limiting magnitude for limb observations of Leonid meteors was measured at M(sub v) = -1.5 magn. The Leonid shower magnitude population index was 1.6 +/- 0.2 down to M(sub v) = -7 magn., with no sign of an upper mass cut-off.
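The quoted influx can be inverted to estimate the effective atmospheric area the wide-angle camera monitored near the limb. A back-of-envelope sketch, assuming only the simple relation flux = count / (area x time):

```python
def implied_area_km2(n_meteors, interval_min, flux_per_km2_hr):
    """Effective collecting area implied by a meteor count and flux.

    Inverts flux = N / (A * t); purely illustrative, ignoring the
    detection-efficiency and limiting-magnitude corrections a real
    analysis would apply.
    """
    hours = interval_min / 60.0
    return n_meteors / (flux_per_km2_hr * hours)

# 29 meteors in 48 minutes at (5.5 +/- 0.6) x 10^-5 per sq km per hour
area = implied_area_km2(29, 48, 5.5e-5)
print(f"effective atmospheric area ~ {area:.2e} km^2")  # ~6.6e5 km^2
```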
Atmospheric aerosol profiling with a bistatic imaging lidar system.
Barnes, John E; Sharma, N C Parikh; Kaplan, Trevor B
2007-05-20
Atmospheric aerosols have been profiled using a simple, imaging, bistatic lidar system. A vertical laser beam is imaged onto a charge-coupled-device camera from the ground to the zenith with a wide-angle lens (CLidar). The altitudes are derived geometrically from the position of the camera and laser with submeter resolution near the ground. The system requires no overlap correction needed in monostatic lidar systems and needs a much smaller dynamic range. Nighttime measurements of both molecular and aerosol scattering were made at Mauna Loa Observatory. The CLidar aerosol total scatter compares very well with a nephelometer measuring at 10 m above the ground. The results build on earlier work that compared purely molecular scattered light to theory, and detail instrument improvements.
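The geometric altitude retrieval at the heart of the CLidar technique is simple triangulation: a camera a known horizontal baseline from the vertical beam sees each altitude at a distinct elevation angle. A minimal sketch (the baseline and angle values are illustrative, not the Mauna Loa configuration):

```python
import math

def beam_altitude_m(baseline_m, elevation_deg):
    """Altitude of a scattering point on a vertical laser beam,
    seen from a camera a horizontal baseline away (CLidar geometry):
    z = D * tan(elevation)."""
    return baseline_m * math.tan(math.radians(elevation_deg))

# Illustrative: camera 100 m from the beam, pixel looking 45 deg up
z = beam_altitude_m(100.0, 45.0)
print(f"scattering altitude ~ {z:.1f} m")  # 100.0 m
```

Because low altitudes map to well-separated elevation angles, resolution near the ground is sub-meter, degrading smoothly toward the zenith.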
Preliminary results on photometric properties of materials at the Sagan Memorial Station, Mars
Johnson, J. R.; Kirk, R.; Soderblom, L.A.; Gaddis, L.; Reid, R.J.; Britt, D.T.; Smith, P.; Lemmon, M.; Thomas, N.; Bell, J.F.; Bridges, N.T.; Anderson, R.; Herkenhoff, K. E.; Maki, J.; Murchie, S.; Dummel, A.; Jaumann, R.; Trauthan, F.; Arnold, G.
1999-01-01
Reflectance measurements of selected rocks and soils over a wide range of illumination geometries obtained by the Imager for Mars Pathfinder (IMP) camera provide constraints on interpretations of the physical and mineralogical nature of geologic materials at the landing site. The data sets consist of (1) three small "photometric spot" subframed scenes, covering phase angles from 20° to 150°; (2) two image strips composed of three subframed images each, located along the antisunrise and antisunset lines (photometric equator), covering phase angles from ~0° to 155°; and (3) full-image scenes of the rock "Yogi," covering phase angles from 48° to 100°. Phase functions extracted from calibrated data exhibit a dominantly backscattering photometric function, consistent with the results from the Viking lander cameras. However, forward scattering behavior does appear at phase angles >140°, particularly for the darker gray rock surfaces. Preliminary efforts using a Hapke scattering model are useful in comparing surface properties of different rock and soil types but are not well constrained, possibly due to the incomplete phase angle availability, uncertainties related to the photometric function of the calibration targets, and/or the competing effects of diffuse and direct lighting. Preliminary interpretations of the derived Hapke parameters suggest that (1) red rocks can be modeled as a mixture of gray rocks with a coating of bright and dark soil or dust, and (2) gray rocks have macroscopically smoother surfaces composed of microscopically homogeneous, clear materials with little internal scattering, which may imply a glass-like or varnished surface. Copyright 1999 by the American Geophysical Union.
8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
The Orbiter camera payload system's large-format camera and attitude reference system
NASA Technical Reports Server (NTRS)
Schardt, B. B.; Mollberg, B. H.
1985-01-01
The Orbiter camera payload system (OCPS) is an integrated photographic system carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a large-format camera (LFC), a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. A secondary and supporting system to the LFC is the attitude reference system (ARS), a dual-lens stellar camera array (SCA) and camera support structure. The SCA is a 70 mm film system that is rigidly mounted to the LFC lens support structure and, through the simultaneous acquisition of two star fields with each earth viewing LFC frame, makes it possible to precisely determine the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high-precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment. The full OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on Oct. 5, 1984, as a major payload aboard the STS-41G mission.
ERIC Educational Resources Information Center
Mackworth, Norman H.; And Others
1972-01-01
The Mackworth wide-angle reflection eye camera was used to record the position of the gaze on a display of 16 white symbols. One of these symbols changed to red after 30 seconds, remained red for a minute of testing, and then became white again. The subjects were 10 aphasic children (aged 5-9), who were compared with a group of 10 normal children,…
Eastern Space and Missile Center (ESMC) Capability.
1983-09-16
Sites Fig. 4 ETR Tracking Telescopes. A unique feature at the ETR is the ability to compute a ... The Contraves Model 151 includes a TV camera, a wideband ... main objective lens. The Contraves wideband transmitter sends video signals from either the main objective TV or the DAGE wide-angle TV system to the ... Modified main objective plus the time of day to 0.1 second. To use the ESMC precise 2400 b/s acquisition data system, the Contraves computer system ...
Aspects of Voyager photogrammetry
NASA Technical Reports Server (NTRS)
Wu, Sherman S. C.; Schafer, Francis J.; Jordan, Raymond; Howington, Annie-Elpis
1987-01-01
In January 1986, Voyager 2 took a series of pictures of Uranus and its satellites with the Imaging Science System (ISS) on board the spacecraft. Based on six stereo images from the ISS narrow-angle camera, a topographic map was compiled of the Southern Hemisphere of Miranda, one of Uranus' moons. Assuming a spherical figure, a 20-km surface relief is shown on the map. With three additional images from the ISS wide-angle camera, a control network of Miranda's Southern Hemisphere was established by analytical photogrammetry, producing 88 ground points for the control of multiple-model compilation on the AS-11AM analytical stereoplotter. Digital terrain data from the topographic map of Miranda have also been produced. By combining these data and the image data from the Voyager 2 mission, perspective views or even a movie of the mapped area can be made. The application of these newly developed techniques to Voyager 1 imagery, which includes a few overlapping pictures of Io and Ganymede, permits the compilation of contour maps or topographic profiles of these bodies on the analytical stereoplotters.
1996-01-29
In this image from NASA's Voyager wide-angle camera, taken on Aug. 23, 1989, the two main rings of Neptune can be clearly seen. In the lower part of the frame the originally announced ring arc, consisting of three distinct features, is visible. This feature covers about 35 degrees of longitude and has yet to be radially resolved in Voyager images. From higher-resolution images it is known that this region contains much more material than the diffuse belts seen elsewhere in its orbit, which seem to encircle the planet. This is consistent with the fact that ground-based observations of stellar occultations by the rings show them to be very broken and clumpy. The more sensitive wide-angle camera is revealing more widely distributed but fainter material. Each of these rings of material lies just outside the orbit of a newly discovered moon. One of these moons, 1989N2, may be seen in the upper right corner. The moon is streaked by its orbital motion, whereas the stars in the frame are less smeared. The dark areas around the bright moon and star are artifacts of the processing required to bring out the faint rings. This wide-angle image was taken from a range of 2 million kilometers (1.2 million miles), through the clear filter. http://photojournal.jpl.nasa.gov/catalog/PIA00053
Public-Requested Mars Image: Crater on Pavonis Mons
NASA Technical Reports Server (NTRS)
2003-01-01
MGS MOC Release No. MOC2-481, 12 September 2003
This image is in the first pair obtained in the Public Target Request program, which accepts suggestions for sites to photograph with the Mars Orbiter Camera on NASA's Mars Global Surveyor spacecraft. It is a narrow-angle (high-resolution) view of a portion of the lower wall and floor of the caldera at the top of a martian volcano named Pavonis Mons. A companion picture is a wide-angle context image, taken at the same time as the high-resolution view. The white box in the context frame shows the location of the high-resolution picture. [figure removed for brevity, see original site] Pavonis Mons is a broad shield volcano. Its summit region is about 14 kilometers (8.7 miles) above the martian datum (zero-elevation reference level). The caldera is about 4.6 kilometers (2.8 miles) deep. The caldera formed by collapse--long ago--as molten rock withdrew to greater depths within the volcano. The high-resolution picture shows that today the floor and walls of this caldera are covered by a thick, textured mantle of dust, perhaps more than 1 meter (1 yard) deep. Larger boulders and rock outcroppings poke out from within this dust mantle. They are seen as small, dark dots and mounds on the lower slopes of the wall in the high-resolution image. The narrow-angle Mars Orbiter Camera image has a resolution of 1.5 meters (about 5 feet) per pixel and covers an area 1.5 kilometers (0.9 mile) wide by 9 kilometers (5.6 miles) long. The context image, covering much of the summit region of Pavonis Mons, is about 115 kilometers (72 miles) wide. Sunlight illuminates both images from the lower left; north is toward the upper right; east to the right. The high-resolution view is located near 0.4 degrees north latitude, 112.8 degrees west longitude.
First NAC Image Obtained in Mercury Orbit
2017-12-08
NASA image acquired: March 29, 2011 This is the first image of Mercury taken from orbit with MESSENGER’s Narrow Angle Camera (NAC). MESSENGER’s camera system, the Mercury Dual Imaging System (MDIS), has two cameras: the Narrow Angle Camera and the Wide Angle Camera (WAC). Comparison of this image with MESSENGER’s first WAC image of the same region shows the substantial difference between the fields of view of the two cameras. At 1.5°, the field of view of the NAC is seven times smaller than the 10.5° field of view of the WAC. This image was taken using MDIS’s pivot. MDIS is mounted on a pivoting platform and is the only instrument in MESSENGER’s payload capable of movement independent of the spacecraft. The other instruments are fixed in place, and most point down the spacecraft’s boresight at all times, relying solely on the guidance and control system for pointing. The 90° range of motion of the pivot gives MDIS a much-needed extra degree of freedom, allowing MDIS to image the planet’s surface at times when spacecraft geometry would normally prevent it from doing so. The pivot also gives MDIS additional imaging opportunities by allowing it to view more of the surface than that at which the boresight-aligned instruments are pointed at any given time. On March 17, 2011 (March 18, 2011, UTC), MESSENGER became the first spacecraft ever to orbit the planet Mercury. The mission is currently in the commissioning phase, during which spacecraft and instrument performance are verified through a series of specially designed checkout activities. In the course of the one-year primary mission, the spacecraft's seven scientific instruments and radio science investigation will unravel the history and evolution of the Solar System's innermost planet. Visit the Why Mercury? section of this website to learn more about the science questions that the MESSENGER mission has set out to answer. 
Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington
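The "seven times smaller" field-of-view comparison between the two MDIS cameras follows directly from the quoted angles; a trivial check, with the areal coverage ratio added for illustration:

```python
# MDIS fields of view quoted in the caption
nac_fov_deg = 1.5    # Narrow Angle Camera
wac_fov_deg = 10.5   # Wide Angle Camera

linear_ratio = wac_fov_deg / nac_fov_deg   # 7.0: NAC FOV is 7x narrower
area_ratio = linear_ratio ** 2             # 49.0: WAC frames cover ~49x the sky area

print(linear_ratio, area_ratio)
```

The areal figure is why the NAC is used for targeted high-resolution shots while the WAC provides context coverage.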
Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft
NASA Astrophysics Data System (ADS)
Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.
2015-02-01
Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performances of the cameras is now mandatory to be able to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found a stability of the instrumental performances within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is found to be as expected in the visible range, but lower than expected in the UV and IR range. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.
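The stability check described above amounts to deriving a calibration factor from each standard-star observation and verifying that the factors scatter within a tolerance. A minimal sketch with hypothetical numbers (the flux and count-rate values are made up for illustration; the ±1.5% tolerance is the figure quoted in the abstract):

```python
def calibration_factor(catalog_flux, measured_dn_rate):
    """Absolute calibration factor: physical flux per detected DN/s."""
    return catalog_flux / measured_dn_rate

def is_stable(factors, tolerance=0.015):
    """True if all factors lie within `tolerance` (fractional) of their mean."""
    mean = sum(factors) / len(factors)
    return all(abs(f - mean) / mean <= tolerance for f in factors)

# Hypothetical star observations from three epochs (same star, same filter)
yearly = [calibration_factor(1.0e-12, rate) for rate in (4000.0, 4030.0, 3980.0)]
print(is_stable(yearly))  # True: scatter well inside +/-1.5%
```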
NASA Technical Reports Server (NTRS)
2003-01-01
MGS MOC Release No. MOC2-387, 10 June 2003
This is a Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide angle view of the Charitum Montes, south of Argyre Planitia, in early June 2003. The seasonal south polar frost cap, composed of carbon dioxide, has been retreating southward through this area since spring began a month ago. The bright features toward the bottom of this picture are surfaces covered by frost. The picture is located near 57°S, 43°W. North is at the top, south is at the bottom. Sunlight illuminates the scene from the upper left. The area shown is about 217 km (135 miles) wide.
Baffling system for the Wide Angle Camera (WAC) of ROSETTA mission
NASA Astrophysics Data System (ADS)
Brunello, Pierfrancesco; Peron, Fabio; Barbieri, Cesare; Fornasier, Sonia
2000-10-01
After the experience of the GIOTTO fly-by of comet Halley in 1986, the European Space Agency planned to improve the scientific knowledge of these astronomical objects by means of an even more ambitious rendezvous mission with another comet (P/Wirtanen). This mission, named ROSETTA, will run from 2003 to 2013, ending after the comet perihelion phase and also including fly-bys of two asteroids of the main belt (140 Siwa and 4979 Otawara). The scientific priority of the mission is the in situ investigation of the cometary nucleus, with the aim of better understanding the formation and composition of planetesimals and their evolution over the last 4.5 billion years. In this context, the authors were involved in the design of the baffling for the Wide Angle Camera (WAC) of the imaging system (OSIRIS) carried on board the spacecraft. Scientific requirements for the WAC are: a large field of view (FOV) of 12° x 12° with a resolution of 100 μrad per pixel, UV response, and a contrast ratio of 10^-4 in order to detect gaseous and dusty features close to the nucleus of the comet. To achieve these performances, a fairly novel class of optical solutions employing off-axis sections of concentric mirrors was explored. Regarding baffling, the peculiar demand was the rejection of stray light generated by the optics for sources within the FOV, since the optical entrance aperture is located at the level of the secondary mirror (instead of the primary, as usual). This paper describes the baffle design and analyzes its performance, calculated by numerical simulation with ray-tracing methods, at different angles of incidence of the light, for sources both outside and inside the field of view.
NASA Astrophysics Data System (ADS)
Gicquel, A.; Vincent, J.-B.; Agarwal, J.; A'Hearn, M. F.; Bertini, I.; Bodewits, D.; Sierks, H.; Lin, Z.-Y.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; Deller, J.; De Cecco, M.; Frattin, E.; El-Maarry, M. R.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Gutiérrez-Marquez, P.; Güttler, C.; Höfner, S.; Hofmann, M.; Hu, X.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Moreno, J. J. Lopez; Lowry, S.; Marzari, F.; Masoumzadeh, N.; Massironi, M.; Moreno, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Pommerol, A.; Preusker, F.; Scholten, F.; Shi, X.; Thomas, N.; Toth, I.; Tubiana, C.
2016-11-01
Beginning in 2014 March, the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras began capturing images of the nucleus and coma (gas and dust) of comet 67P/Churyumov-Gerasimenko using both the wide angle camera (WAC) and the narrow angle camera (NAC). The many observations taken since July of 2014 have been used to study the morphology, location, and temporal variation of the comet's dust jets. We analysed the dust monitoring observations shortly after the southern vernal equinox on 2015 May 30 and 31 with the WAC at the heliocentric distance Rh = 1.53 AU, where it is possible to observe that the jet rotates with the nucleus. We found that the decline of brightness as a function of the distance of the jet is much steeper than the background coma, which is a first indication of sublimation. We adapted a model of sublimation of icy aggregates and studied the effect as a function of the physical properties of the aggregates (composition and size). The major finding of this paper was that through the sublimation of the aggregates of dirty grains (radius a between 5 and 50 μm) we were able to completely reproduce the radial brightness profile of a jet beyond 4 km from the nucleus. To reproduce the data, we needed to inject a number of aggregates between 8.5 × 10^13 and 8.5 × 10^10 for a = 5 and 50 μm, respectively, or an initial mass of H2O ice around 22 kg.
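The diagnostic described above, a jet profile declining more steeply than the 1/ρ of a freely expanding background coma, can be sketched with a toy model. The exponential loss term and its scale length below are illustrative stand-ins for the paper's full icy-aggregate sublimation model, not its actual parameters:

```python
import math

def coma_profile(rho_km, b0=1.0):
    """Background coma: free radial outflow of stable dust gives B ~ 1/rho."""
    return b0 / rho_km

def jet_profile(rho_km, b0=1.0, scale_km=10.0):
    """Jet of sublimating icy aggregates: the 1/rho geometric decline is
    multiplied by an exponential depletion term as the aggregates sublimate
    (hypothetical scale length, for illustration only)."""
    return (b0 / rho_km) * math.exp(-rho_km / scale_km)

# Compare fractional fall-off from 4 km to 8 km from the nucleus
coma_drop = coma_profile(8.0) / coma_profile(4.0)   # 0.5
jet_drop = jet_profile(8.0) / jet_profile(4.0)      # steeper than 0.5
print(coma_drop, jet_drop)
```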
A telephoto camera system with shooting direction control by gaze detection
NASA Astrophysics Data System (ADS)
Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro
2015-05-01
For safe driving, it is important for the driver to check traffic conditions such as traffic lights or traffic signs as early as possible. If an on-vehicle camera captures images of the important objects from long distance and shows them to the driver, the driver can understand traffic conditions earlier. To image distant objects clearly, the focal length of the camera must be long; but with a long focal length, an on-vehicle camera does not have a wide enough field of view to check traffic conditions. Therefore, in order to obtain the necessary images from long distance, the camera must combine a long focal length with controllability of its shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took a telescopic image and displayed it to the driver. However, in that study the driver used a touch panel to indicate the shooting direction, which can disturb driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, to avoid disturbing driving. The proposed system is composed of a gaze detector and an active telephoto camera whose shooting direction is controlled. We adopt a non-wearable detection method to avoid hindering driving. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. Experiments confirmed that the proposed system captures images in the direction where the subject is gazing.
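The control step, mapping a measured gaze direction to galvanometer scanner commands, can be sketched as below. The 1:1 angle mapping and the mechanical limit are hypothetical; a real system would apply a calibrated transform between the gaze-detector frame and the camera frame:

```python
def gaze_to_galvo(gaze_yaw_deg, gaze_pitch_deg, limit_deg=20.0):
    """Map a measured gaze direction to galvanometer scanner angles.

    Hypothetical identity mapping clamped to the scanners' mechanical
    range (limit_deg is an assumed value, for illustration).
    """
    def clamp(angle):
        return max(-limit_deg, min(limit_deg, angle))
    return clamp(gaze_yaw_deg), clamp(gaze_pitch_deg)

# Gaze far to the right gets clamped; pitch passes through unchanged
print(gaze_to_galvo(30.0, -5.0))  # (20.0, -5.0)
```

Because galvanometer scanners settle within a few milliseconds, this loop can in principle retarget the telephoto view faster than the driver's own saccades complete.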
NASA Astrophysics Data System (ADS)
Moissl, Richard; Kueppers, Michael
2016-10-01
In this paper we present the results of an analysis of a large part of the existing image data from the OSIRIS camera system onboard the Rosetta spacecraft, in which stars of sufficient brightness (down to a limiting magnitude of 6) have been observed through the coma of comet 67P/Churyumov-Gerasimenko ("C-G"). Over the course of the Rosetta main mission the coma of the comet underwent large changes in density and structure, owing to the changing insolation along the orbit of C-G. We report on the changes of the stellar signals in the wavelength ranges covered by the filters of the OSIRIS Narrow-Angle (NAC) and Wide-Angle (WAC) cameras. Acknowledgements: OSIRIS was built by a consortium led by the Max-Planck-Institut für Sonnensystemforschung, Göttingen, Germany, in collaboration with CISAS, University of Padova, Italy, the Laboratoire d'Astrophysique de Marseille, France, the Instituto de Astrofísica de Andalucia, CSIC, Granada, Spain, the Scientific Support Office of the European Space Agency, Noordwijk, The Netherlands, the Instituto Nacional de Técnica Aeroespacial, Madrid, Spain, the Universidad Politéchnica de Madrid, Spain, the Department of Physics and Astronomy of Uppsala University, Sweden, and the Institut für Datentechnik und Kommunikationsnetze der Technischen Universität Braunschweig, Germany.
Southern Florida's River of Grass
NASA Technical Reports Server (NTRS)
2002-01-01
Florida's Everglades is a region of broad, slow-moving sheets of water flowing southward over low-lying areas from Lake Okeechobee to the Gulf of Mexico. In places this remarkable 'river of grass' is 80 kilometers wide. These images from the Multi-angle Imaging SpectroRadiometer show the Everglades region on January 16, 2002. Each image covers an area measuring 191 kilometers x 205 kilometers. The data were captured during Terra orbit 11072. On the left is a natural color view acquired by MISR's nadir camera. A portion of Lake Okeechobee is visible at the top, to the right of image center. South of the lake, whose name derives from the Seminole word for 'big water,' an extensive region of farmland known as the Everglades Agricultural Area is recognizable by its many clustered squares. Over half of the sugar produced in the United States is grown here. Urban areas along the east coast and in the northern part of the image extend to the boundaries of Big Cypress Swamp, situated north of Everglades National Park. The image on the right combines red-band data from the 46-degree backward, nadir and 46-degree forward-viewing camera angles to create a red, green, blue false-color composite. One of the interesting uses of the composite image is for detecting surface water. Wet surfaces appear blue in this rendition because sun glitter produces a greater signal at the forward camera's view angle. Wetlands visible in these images include a series of shallow impoundments called Water Conservation Areas, which were built to speed water flow through the Everglades in times of drought. In parts of the Everglades, these levees and extensive systems such as the Miami and Tamiami Canals have altered the natural cycles of water flow. For example, the water volume of the Shark River Slough, a natural wetland which feeds Everglades National Park, is influenced by the Tamiami Canal.
The unique and intrinsic value of the Everglades is now widely recognized, and efforts to restore the natural water cycles are underway.
MISR Images Forest Fires and Hurricane
NASA Technical Reports Server (NTRS)
2000-01-01
These images show forest fires raging in Montana and Hurricane Hector swirling in the Pacific. These two unrelated, large-scale examples of nature's fury were captured by the Multi-angle Imaging SpectroRadiometer (MISR) during a single orbit of NASA's Terra satellite on August 14, 2000.
In the left image, huge smoke plumes rise from devastating wildfires in the Bitterroot Mountain Range near the Montana-Idaho border. Flathead Lake is near the upper left, and the Great Salt Lake is at the bottom right. Smoke accumulating in the canyons and plains is also visible. This image was generated from the MISR camera that looks forward at a steep angle (60 degrees); the instrument has nine different cameras viewing Earth at different angles. The smoke is far more visible when seen at this highly oblique angle than it would be in a conventional, straight-downward (nadir) view. The wide extent of the smoke is evident from comparison with the image on the right, a view of Hurricane Hector acquired from MISR's nadir-viewing camera. Both images show an area of approximately 400 kilometers (250 miles) in width and about 850 kilometers (530 miles) in length. When this image of Hector was taken, the eastern Pacific tropical cyclone was located approximately 1,100 kilometers (680 miles) west of the southern tip of Baja California, Mexico. The eye is faintly visible and measures 25 kilometers (16 miles) in diameter. The storm was beginning to weaken, and 24 hours later the National Weather Service downgraded Hector from a hurricane to a tropical storm. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology. For more information: http://www-misr.jpl.nasa.gov
2015-04-29
This image from MESSENGER spacecraft covers a small area located about 115 km south of the center of Mansart crater. The smallest craters visible in the image are about the size of the 16-meter (52-feet) crater that will be made by the impact of the MESSENGER spacecraft. The impact will take place tomorrow, April 30, 2015. Just left of center is a crater that is about 80 meters in diameter. The bright area on its right wall may be an outcrop of hollows material. Date acquired: April 28, 2015 Image Mission Elapsed Time (MET): 72505530 Image ID: 8408666 Instrument: Narrow Angle Camera (NAC) of the Mercury Dual Imaging System (MDIS) Center Latitude: 69.8° N Center Longitude: 303.7° E Resolution: 2.0 meters/pixel Scale: The scene is about 1 km (0.6 miles) wide. This image has not been map projected. Incidence Angle: 79.0° Emission Angle: 11.0° Phase Angle: 90.0° http://photojournal.jpl.nasa.gov/catalog/PIA19442
Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System
Lu, Yu; Wang, Keyi; Fan, Gongshu
2016-01-01
A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over a 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensors during radiometric response calibration, to eliminate the influence of the focusing effect on the uniform light from an integrating sphere. The linearity range of the radiometric response, the non-linearity response characteristics, the sensitivity, and the dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images, so that the panoramas reflect the objective luminance of the scene rather than relying only on smoothing to make the stitched images look realistic. The dynamic range limitation of a single image sensor with a wide-angle lens can be resolved by instead using multiple cameras to cover the large field of view; the dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
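The luminance-retrieval step described above, correcting each pixel for the sensor's radiometric response and the lens vignetting before blending, can be sketched as follows. The power-law response and the cos⁴ vignetting model, as well as the function names, are illustrative assumptions, not the calibration actually measured in the paper.

```python
import math

def inverse_response(dn, gamma=2.2):
    """Invert a simple power-law camera response: DN (0-255) -> relative luminance."""
    return (dn / 255.0) ** gamma

def vignetting_factor(r, r_max, half_fov_deg=45.0):
    """Approximate radial fall-off with the classic cos^4 law (illustrative only)."""
    theta = math.radians(half_fov_deg) * (r / r_max)
    return math.cos(theta) ** 4

def corrected_luminance(dn, r, r_max):
    """Relative scene luminance for a pixel at radius r from the image center."""
    return inverse_response(dn) / vignetting_factor(r, r_max)
```

Blending corrected luminances instead of raw DNs is what lets the panorama keep a photometrically consistent appearance across the seven sensor modules.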
2017-06-26
NASA's Cassini spacecraft peers toward a sliver of Saturn's sunlit atmosphere while the icy rings stretch across the foreground as a dark band. This view looks toward the unilluminated side of the rings from about 7 degrees below the ring plane. The image was taken in green light with the Cassini spacecraft wide-angle camera on March 31, 2017. The view was obtained at a distance of approximately 620,000 miles (1 million kilometers) from Saturn. Image scale is 38 miles (61 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21334
1986-01-24
P-29516 BW Range: 125,000 kilometers (78,000 miles) Voyager 2's wide-angle camera captured this view of the outer part of the Uranian ring system just 11 minutes before passing through the ring plane. The resolution in this clear-filter view is slightly better than 9 km (6 mi). The brightest, outermost ring is known as epsilon. Interior to epsilon lie (from top) the newly discovered 10th ring of Uranus--designated 1986UR1 and barely visible here--and then the delta, gamma and eta rings.
NASA Technical Reports Server (NTRS)
2005-01-01
This somewhat oblique blue wide angle Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows the 174 km (108 mi) diameter crater, Terby, and its vicinity in December 2004. Located north of Hellas, this region can be covered with seasonal frost and ground-hugging fog, even in the afternoon, despite being north of 30°S. The subtle, wavy pattern is a manifestation of fog. Location near: 28°S, 286°W Illumination from: upper left Season: Southern Winter
A multi-modal stereo microscope based on a spatial light modulator.
Lee, M P; Gibson, G M; Bowman, R; Bernet, S; Ritsch-Marte, M; Phillips, D B; Padgett, M J
2013-07-15
Spatial Light Modulators (SLMs) can emulate the classic microscopy techniques, including differential interference contrast (DIC) and (spiral) phase contrast. Their programmability entails the benefit of flexibility or the option to multiplex images, for single-shot quantitative imaging or for simultaneous multi-plane imaging (depth-of-field multiplexing). We report the development of a microscope sharing many of the previously demonstrated capabilities, within a holographic implementation of a stereo microscope. Furthermore, we use the SLM to combine stereo microscopy with a refocusing filter and with a darkfield filter. The instrument is built around a custom inverted microscope and equipped with an SLM which gives various imaging modes laterally displaced on the same camera chip. In addition, there is a wide angle camera for visualisation of a larger region of the sample.
Zhang, Wenjing; Cao, Yu; Zhang, Xuanzhe; Liu, Zejin
2015-10-20
Stable information from the skylight polarization pattern can be used for navigation, with advantages such as better anti-interference performance and no cumulative error effect. But the existing methods of skylight polarization measurement either have weak real-time performance or require a complex system. Inspired by the navigational capability of Cataglyphis with its compound eyes, we introduce a new approach to acquire the all-sky image under different polarization directions with one camera and without a rotating polarizer, so as to detect the polarization pattern across the full sky in a single snapshot. Our system is based on a handheld light field camera with a wide-angle lens and a triplet linear polarizer placed over its aperture stop. Experimental results agree with the theoretical predictions. Both the real-time detection and the simple, low-cost architecture demonstrate the advantages of the approach proposed in this paper.
The 1997 Spring Regression of the Martian South Polar Cap: Mars Orbiter Camera Observations
James, P.B.; Cantor, B.A.; Malin, M.C.; Edgett, K.; Carr, M.H.; Danielson, G.E.; Ingersoll, A.P.; Davies, M.E.; Hartmann, W.K.; McEwen, A.S.; Soderblom, L.A.; Thomas, P.C.; Veverka, J.
2000-01-01
The Mars Orbiter Camera (MOC) on Mars Global Surveyor observed the south polar cap of Mars during its spring recession in 1997. The images acquired by the wide angle cameras reveal a pattern of recession that is qualitatively similar to that observed by Viking in 1977 but that does differ in at least two respects. The 1977 recession in the 0° to 120° longitude sector was accelerated relative to the 1997 observations after L_S = 240°; the Mountains of Mitchel also detached from the main cap earlier in 1997. Comparison of the MOC images with Mars Orbiter Laser Altimeter data shows that the Mountains of Mitchel feature is controlled by local topography. Relatively dark, low-albedo regions well within the boundaries of the seasonal cap were observed to have red-to-violet ratios that characterize them as frost units rather than unfrosted or partially frosted ground; this suggests the possibility of regions covered by CO2 frost having different grain sizes.
2007-01-16
Both luminous and translucent, the C ring sweeps out of the darkness of Saturn's shadow and obscures the planet at lower left. The ring is characterized by broad, isolated bright areas, or "plateaus," surrounded by fainter material. This view looks toward the unlit side of the rings from about 19 degrees above the ringplane. North on Saturn is up. The dark, inner B ring is seen at lower right. The image was taken in visible light with the Cassini spacecraft wide-angle camera on Dec. 15, 2006 at a distance of approximately 632,000 kilometers (393,000 miles) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 56 degrees. Image scale is 34 kilometers (21 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA08855
NASA Astrophysics Data System (ADS)
Bruegge, Carol J.; Val, Sebastian; Diner, David J.; Jovanovic, Veljko; Gray, Ellyn; Di Girolamo, Larry; Zhao, Guangyu
2014-09-01
The Multi-angle Imaging SpectroRadiometer (MISR) has successfully operated on the EOS/Terra spacecraft since 1999. It consists of nine cameras pointing from nadir to 70.5° view angle with four spectral channels per camera. Specifications call for a radiometric uncertainty of 3% absolute and 1% relative to the other cameras. To accomplish this, MISR utilizes an on-board calibrator (OBC) to measure changes in camera response. Once every two months the two Spectralon panels are deployed to direct sunlight into the cameras. Six photodiode sets measure the illumination levels, which are compared to MISR raw digital numbers, thus determining the radiometric gain coefficients used in Level 1 data processing. Although panel stability is not required, there has been little detectable change in panel reflectance, attributed to careful preflight handling techniques. The cameras themselves have degraded in radiometric response by 10% since launch, but calibration updates using the detector-based scheme have compensated for these drifts and allowed the radiance products to meet accuracy requirements. Validation using Sahara desert observations shows that there has been a drift of ~1% in the reported nadir-view radiance over a decade, common to all spectral bands.
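The detector-based gain update described above, comparing photodiode-measured panel radiance against the camera's raw digital numbers (DNs), reduces to fitting a slope. A linear, offset-free response model and the function names below are assumptions for illustration, not the actual MISR Level 1 algorithm.

```python
def fit_gain(radiances, dns):
    """Least-squares slope through the origin for DN = g * L: g = sum(L*DN)/sum(L^2)."""
    return sum(L * d for L, d in zip(radiances, dns)) / sum(L * L for L in radiances)

def dn_to_radiance(dn, gain):
    """Apply the current gain coefficient to convert a raw DN back to radiance."""
    return dn / gain
```

Refitting the gain at each bimonthly calibration is what compensates for the ~10% response degradation: the coefficient tracks the drift, so the reported radiances stay within specification.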
Applications of Action Cam Sensors in the Archaeological Yard
NASA Astrophysics Data System (ADS)
Pepe, M.; Ackermann, S.; Fregonese, L.; Fassi, F.; Adami, A.
2018-05-01
In recent years, special digital cameras called "action cameras" or "action cams" have become popular due to their low price, small size, light weight, robustness, and capacity to record videos and photos even in extreme environmental conditions. Indeed, these cameras have been designed mainly to capture sport actions and to work in the presence of dirt, bumps, or water and at different external temperatures. High-resolution digital single-lens reflex (DSLR) cameras are usually preferred in the photogrammetric field. Indeed, beyond sensor resolution, the combination of such cameras with fixed lenses of low distortion is preferred for accurate 3D measurements; by contrast, action cameras have small wide-angle lenses, with lower performance in terms of sensor resolution, lens quality, and distortion. However, considering the ability of action cameras to acquire images under conditions that may be difficult for standard DSLR cameras, and their lower price, they could be an interesting option for documenting the state of the places during archaeological excavation activities. In this paper, the influence of lens radial distortion and chromatic aberration on this type of camera in self-calibration mode, and an evaluation of their application in the field of Cultural Heritage, are investigated and discussed. Using a suitable technique, it has been possible to improve the accuracy of the 3D model obtained from action-cam images. Case studies show the quality and the utility of this type of sensor in the survey of archaeological artefacts.
Image-based path planning for automated virtual colonoscopy navigation
NASA Astrophysics Data System (ADS)
Hong, Wei
2008-03-01
Virtual colonoscopy (VC) is a noninvasive method for colonic polyp screening that reconstructs three-dimensional models of the colon using computerized tomography (CT). In virtual colonoscopy fly-through navigation, it is crucial to generate an optimal camera path for efficient clinical examination. In conventional methods, the centerline of the colon lumen is usually used as the camera path. In order to extract the colon centerline, some time-consuming pre-processing algorithms must be performed before the fly-through navigation, such as colon segmentation, distance transformation, or topological thinning. In this paper, we present an efficient image-based path planning algorithm for automated virtual colonoscopy fly-through navigation without the requirement of any pre-processing. Our algorithm only needs the physician to provide a seed point as the starting camera position using 2D axial CT images. A wide-angle fisheye camera model is used to generate a depth image from the current camera position. Two types of navigational landmarks, safe regions and target regions, are extracted from the depth images. The camera position and its corresponding view direction are then determined using these landmarks. The experimental results show that the generated paths are accurate and increase user comfort during the fly-through navigation. Moreover, because of the efficiency of our path planning and rendering algorithms, our VC fly-through navigation system can still guarantee 30 FPS.
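The landmark-based steering idea, choosing where to move next from a depth image rendered with the fisheye model, can be sketched as below. The grid representation, the threshold, and the function name are assumptions for illustration; the actual system extracts separate safe and target regions and also derives the view direction.

```python
def find_target(depth, safe_threshold):
    """Return (row, col) of the deepest pixel whose depth exceeds the threshold,
    i.e. the most distant free space to fly toward; None if no safe region exists."""
    best, best_depth = None, safe_threshold
    for r, row in enumerate(depth):
        for c, d in enumerate(row):
            if d > best_depth:  # deeper than anything seen so far and safe
                best, best_depth = (r, c), d
    return best
```

Because each decision uses only the current rendered depth image, no global centerline (and hence no segmentation or thinning pass) is needed before navigation starts.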
NASA Astrophysics Data System (ADS)
Shinoj, V. K.; Murukeshan, V. M.; Hong, Jesmond; Baskaran, M.; Aung, Tin
2015-07-01
Noninvasive medical imaging techniques have generated great interest and show high potential for the research and development of ocular imaging and follow-up procedures. It is well known that angle closure glaucoma is one of the major ocular diseases/conditions that cause blindness. The identification and treatment of this disease rely primarily on angle assessment techniques. In this paper, we illustrate a probe-based imaging approach to obtain images of the angle region of the eye. The proposed probe consists of a micro CCD camera and LED/NIR laser light sources, configured at the distal end to enable imaging of the iridocorneal region inside the eye. With this proposed dual-modal probe, imaging is performed in light (white visible LED on) and dark (NIR laser light source alone) conditions, and the angle region is discernible in both cases. Imaging using NIR sources has major significance for anterior chamber imaging since it avoids the pupil constriction caused by bright light, and thereby the artificial alteration of the anterior chamber angle. The proposed methodology and developed scheme are expected to find potential application in glaucoma detection and diagnosis.
Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D.
Lasnier, C J; Allen, S L; Ellis, R E; Fenstermacher, M E; McLean, A G; Meyer, W H; Morris, K; Seppala, L G; Crabtree, K; Van Zeeland, M A
2014-11-01
An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.
Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D
Lasnier, Charles J.; Allen, Steve L.; Ellis, Ronald E.; ...
2014-08-26
An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.
Field Test of the ExoMars Panoramic Camera in the High Arctic - First Results and Lessons Learned
NASA Astrophysics Data System (ADS)
Schmitz, N.; Barnes, D.; Coates, A.; Griffiths, A.; Hauber, E.; Jaumann, R.; Michaelis, H.; Mosebach, H.; Paar, G.; Reissaus, P.; Trauthan, F.
2009-04-01
The ExoMars mission, the first element of the ESA Aurora program, is scheduled to be launched to Mars in 2016. Part of the Pasteur Exobiology Payload onboard the ExoMars rover is a Panoramic Camera System ('PanCam') being designed to obtain high-resolution color and wide-angle multi-spectral stereoscopic panoramic images from the mast of the ExoMars rover. The PanCam instrument consists of two wide-angle cameras (WACs), which will provide multispectral stereo images with a 34° field-of-view (FOV), and a High-Resolution RGB Channel (HRC) to provide close-up images with a 5° field-of-view. For field testing of the PanCam breadboard in a representative environment, the ExoMars PanCam team joined the 6th Arctic Mars Analogue Svalbard Expedition (AMASE) 2008. The expedition took place from 4-17 August 2008 in the Svalbard archipelago, Norway, which is considered to be an excellent site, analogue to ancient Mars. 31 scientists and engineers involved in Mars Exploration (among them the ExoMars WISDOM, MIMA and Raman-LIBS teams as well as several NASA MSL teams) combined their knowledge, instruments and techniques to study the geology, geophysics, biosignatures, and life forms that can be found in volcanic complexes, warm springs, subsurface ice, and sedimentary deposits. This work has been carried out by using instruments, a rover (NASA's CliffBot), and techniques that will or may be used in future planetary missions, thereby providing the capability to simulate a full mission environment in a Mars analogue terrain. Besides demonstrating PanCam's general functionality in a field environment, a main objective was to test and verify the interpretability of PanCam data for in-situ geological context determination and scientific target selection. To process the collected data, a first version of the preliminary PanCam 3D reconstruction processing & visualization chain was used.
Other objectives included testing and refining the operational scenario (based on the ExoMars Rover Reference Surface Mission), investigating data commonalities and data-fusion potential with respect to other instruments, and collecting representative image data to evaluate various influences, such as viewing distance, surface structure, and availability of structures at "infinity" (e.g., resolution, focus quality, and the associated accuracy of the 3D reconstruction). Airborne images with the HRSC-AX camera (an airborne camera with heritage from the Mars Express High Resolution Stereo Camera HRSC), collected during a flight campaign over Svalbard in June 2008, provided large-scale geological context information for all field sites.
Reductions in injury crashes associated with red light camera enforcement in oxnard, california.
Retting, Richard A; Kyrychenko, Sergey Y
2002-11-01
This study estimated the impact of red light camera enforcement on motor vehicle crashes in one of the first US communities to employ such cameras: Oxnard, California. Crash data were analyzed for Oxnard and for three comparison cities. Changes in crash frequencies were compared for Oxnard and the control cities, and for signalized and nonsignalized intersections, by means of a generalized linear regression model. Overall, crashes at signalized intersections throughout Oxnard were reduced by 7% and injury crashes were reduced by 29%. Right-angle crashes, those most associated with red light violations, were reduced by 32%; right-angle crashes involving injuries were reduced by 68%. Because red light cameras can be a permanent component of the transportation infrastructure, crash reductions attributed to camera enforcement should be sustainable.
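The before/after logic behind such estimates can be illustrated with a simplified calculation that scales the city's expected "after" crash count by the trend observed in the control cities; the study itself fitted a generalized linear regression model, so this is only a sketch, and the function name and inputs are hypothetical.

```python
def percent_change(treated_before, treated_after, control_before, control_after):
    """Camera effect as % change vs. a trend-adjusted counterfactual."""
    trend = control_after / control_before        # background trend in control cities
    expected = treated_before * trend             # counterfactual count without cameras
    return 100.0 * (treated_after - expected) / expected
```

With a flat control trend, for example, 100 crashes before and 80 after corresponds to a 20% reduction attributable to the intervention under this simple model.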
NASA Astrophysics Data System (ADS)
Suzuki, H.; Yamada, M.; Kouyama, T.; Tatsumi, E.; Kameda, S.; Honda, R.; Sawada, H.; Ogawa, N.; Morota, T.; Honda, C.; Sakatani, N.; Hayakawa, M.; Yokota, Y.; Yamamoto, Y.; Sugita, S.
2018-01-01
Hayabusa2, the first sample return mission to a C-type asteroid, was launched by the Japan Aerospace Exploration Agency (JAXA) on December 3, 2014, and will arrive at the asteroid in the middle of 2018 to collect samples from its surface, which may contain both hydrated minerals and organics. The optical navigation camera (ONC) system on board Hayabusa2 consists of three individual framing CCD cameras, ONC-T for a telescopic nadir view, ONC-W1 for a wide-angle nadir view, and ONC-W2 for a wide-angle slant view, which will be used to observe the surface of Ryugu. The cameras will be used to measure the global asteroid shape, local morphologies, and visible spectroscopic properties. Thus, image data obtained by the ONC will provide essential information for selecting landing (sampling) sites on the asteroid. This study reports the results of initial inflight calibration based on observations of Earth, Mars, the Moon, and stars to verify and characterize the optical performance of the ONC, such as flat-field sensitivity, spectral sensitivity, point-spread function (PSF), distortion, and stray light for ONC-T, and distortion for ONC-W1 and W2. We found some potential problems that may influence our science observations. These include changes in the flat-field sensitivity for all bands from the values measured in the pre-flight calibration, and the existence of stray light that arises under certain conditions of spacecraft attitude with respect to the sun. Countermeasures for these problems were evaluated using data obtained during the initial in-flight calibration. The results of our inflight calibration indicate that the error of spectroscopic measurements around 0.7 μm using the 0.55, 0.70, and 0.86 μm bands of ONC-T can be lower than 0.7% after these countermeasures and pixel binning. This result suggests that ONC-T would be able to detect the typical strength (∼3%) of the serpentine absorption band often found on CM chondrites and low-albedo asteroids with ≥ 4σ confidence.
Development of a stiffness-angle law for simplifying the measurement of human hair stiffness.
Jung, I K; Park, S C; Lee, Y R; Bin, S A; Hong, Y D; Eun, D; Lee, J H; Roh, Y S; Kim, B M
2018-04-01
This research examines the effect of caffeine absorption on hair stiffness. To test hair stiffness, we have developed an evaluation method that is not only accurate but also inexpensive. Our evaluation method for measuring hair stiffness culminated in a model, called the Stiffness-Angle Law, which describes the elastic properties of hair and can be widely applied to the development of hair care products. Small molecules (≤500 g mol⁻¹) such as caffeine can be absorbed into hair. A common shampoo containing 4% caffeine was formulated and applied to hair 10 times, after which the hair stiffness was measured. The caffeine absorption of the treated hair was observed using Fourier-transform infrared spectroscopy (FTIR) with a focal plane array (FPA) detector. Our evaluation apparatus for measuring hair stiffness consists of a regular camera and a support for single strands of hair. After attaching the hair to the support, the bending angle of the hair was observed with the camera and measured. Then, the hair strand was weighed. The stiffness of the hair was calculated based on our proposed Stiffness-Angle Law using three variables: the bending angle, the weight of the hair, and the distance the hair was pulled across the support. The caffeine absorption was confirmed by FTIR analysis. The concentration of amide bonds in the hair clearly increased due to caffeine absorption. After caffeine was absorbed into the hair, the bending angle and weight of the hair changed. Applying these measured changes to the Stiffness-Angle Law, it was confirmed that the hair stiffness increased by 13.2% due to caffeine absorption. The theoretical results using the Stiffness-Angle Law agree with the visual examinations of hair exposed to caffeine and also with previously reported results on hair stiffness. Our evaluation method combined with the proposed Stiffness-Angle Law provides an accurate and inexpensive technique for measuring the bending stiffness of human hair.
© 2018 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
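The abstract names the three inputs of the Stiffness-Angle Law (bending angle, hair weight, pull distance) but does not reproduce the formula itself. As a purely hypothetical placeholder consistent with those inputs, one could compute a stiffness index as a gravitational-moment-like quantity per radian of deflection; the function name, formula, and units below are illustrative only and are not the published law.

```python
import math

def stiffness_index(angle_deg, weight_g, distance_mm):
    """Hypothetical stiffness index: moment-like quantity divided by bend angle.
    A stiffer strand bends less under its own weight, giving a smaller angle
    and hence a larger index."""
    return (weight_g * distance_mm) / math.radians(angle_deg)
```

Under any such monotone relation, the reported post-treatment decrease in bending angle (at a given weight and pull distance) maps directly to an increase in the stiffness index.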
Compact Autonomous Hemispheric Vision System
NASA Technical Reports Server (NTRS)
Pingree, Paula J.; Cunningham, Thomas J.; Werne, Thomas A.; Eastwood, Michael L.; Walch, Marc J.; Staehle, Robert L.
2012-01-01
Solar System Exploration camera implementations to date have involved either single cameras with wide field-of-view (FOV) and consequently coarser spatial resolution, cameras on a movable mast, or single cameras necessitating rotation of the host vehicle to afford visibility outside a relatively narrow FOV. These cameras require detailed commanding from the ground or separate onboard computers to operate properly, and are incapable of making decisions based on image content that control pointing and downlink strategy. For color, a filter wheel having selectable positions was often added, which added moving parts, size, mass, and power, and reduced reliability. A system was developed based on a general-purpose miniature visible-light camera using advanced CMOS (complementary metal oxide semiconductor) imager technology. The baseline camera has a 92° FOV, and six cameras are arranged in an angled-up carousel fashion, with FOV overlaps such that the system has a 360° FOV in azimuth. A seventh camera, also with a FOV of 92°, is installed normal to the plane of the other six cameras, giving the system a >90° FOV in elevation and completing the hemispheric vision system. A central unit houses the common electronics box (CEB) controlling the system (power conversion, data processing, memory, and control software). Stereo is achieved by adding a second system on a baseline, and color is achieved by stacking two more systems (for a total of three, each system equipped with its own filter). Two connectors on the bottom of the CEB provide a connection to a carrier (rover, spacecraft, balloon, etc.) for telemetry, commands, and power. This system has no moving parts. The system's onboard software (SW) supports autonomous operations such as pattern recognition and tracking.
NASA Astrophysics Data System (ADS)
Holland, S. Douglas
1992-09-01
A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
NASA Technical Reports Server (NTRS)
Holland, S. Douglas (Inventor)
1992-01-01
A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
WindCam and MSPI: two cloud and aerosol instrument concepts derived from Terra/MISR heritage
NASA Astrophysics Data System (ADS)
Diner, David J.; Mischna, Michael; Chipman, Russell A.; Davis, Ab; Cairns, Brian; Davies, Roger; Kahn, Ralph A.; Muller, Jan-Peter; Torres, Omar
2008-08-01
The Multi-angle Imaging SpectroRadiometer (MISR) has been acquiring global cloud and aerosol data from polar orbit since February 2000. MISR acquires moderately high-resolution imagery at nine view angles from nadir to 70.5°, in four visible/near-infrared spectral bands. Stereoscopic parallax, time lapse among the nine views, and the variation of radiance with angle and wavelength enable retrieval of geometric cloud and aerosol plume heights, height-resolved cloud-tracked winds, and aerosol optical depth and particle property information. Two instrument concepts based upon MISR heritage are in development. The Cloud Motion Vector Camera, or WindCam, is a simplified version consisting of a lightweight, compact, wide-angle camera to acquire multiangle stereo imagery at a single visible wavelength. A constellation of three WindCam instruments in polar Earth orbit would obtain height-resolved cloud-motion winds with daily global coverage, making it a low-cost complement to a spaceborne lidar wind measurement system. The Multiangle SpectroPolarimetric Imager (MSPI) is aimed at aerosol and cloud microphysical properties, and is a candidate for the National Research Council Decadal Survey's Aerosol-Cloud-Ecosystem (ACE) mission. MSPI combines the capabilities of MISR with those of other aerosol sensors, extending the spectral coverage to the ultraviolet and shortwave infrared and incorporating high-accuracy polarimetric imaging. Based on requirements for the nonimaging Aerosol Polarimeter Sensor on NASA's Glory mission, a degree of linear polarization uncertainty of 0.5% is specified within a subset of the MSPI bands. We are developing a polarization imaging approach using photoelastic modulators (PEMs) to accomplish this objective.
Video Mosaicking for Inspection of Gas Pipelines
NASA Technical Reports Server (NTRS)
Magruder, Darby; Chien, Chiun-Hong
2005-01-01
A vision system that includes a specially designed video camera and an image-data-processing computer is under development as a prototype of robotic systems for visual inspection of the interior surfaces of pipes and especially of gas pipelines. The system is capable of providing both forward views and mosaicked radial views that can be displayed in real time or after inspection. To avoid the complexities associated with moving parts and to provide simultaneous forward and radial views, the video camera is equipped with a wide-angle (>165°) fish-eye lens aimed along the axis of a pipe to be inspected. Nine white-light-emitting diodes (LEDs) placed just outside the field of view of the lens (see Figure 1) provide ample diffuse illumination for a high-contrast image of the interior pipe wall. The video camera contains a 2/3-in. (1.7-cm) charge-coupled-device (CCD) photodetector array and functions according to the National Television Standards Committee (NTSC) standard. The video output of the camera is sent to an off-the-shelf video capture board (frame grabber) by use of a peripheral component interconnect (PCI) interface in the computer, which is of the 400-MHz, Pentium II (or equivalent) class. Prior video-mosaicking techniques are applicable to narrow-field-of-view (low-distortion) images of evenly illuminated, relatively flat surfaces viewed along approximately perpendicular lines by cameras that do not rotate and that move approximately parallel to the viewed surfaces. One such technique for real-time creation of mosaic images of the ocean floor involves the use of visual correspondences based on area correlation, during both the acquisition of separate images of adjacent areas and the consolidation (equivalently, integration) of the separate images into a mosaic image, in order to ensure that there are no gaps in the mosaic image.
The data-processing technique used for mosaicking in the present system also involves area correlation, but with several notable differences: Because the wide-angle lens introduces considerable distortion, the image data must be processed to effectively unwarp the images (see Figure 2). The computer executes special software that includes an unwarping algorithm that takes explicit account of the cylindrical pipe geometry. To reduce the processing time needed for unwarping, parameters of the geometric mapping between the circular view of a fisheye lens and pipe wall are determined in advance from calibration images and compiled into an electronic lookup table. The software incorporates the assumption that the optical axis of the camera is parallel (rather than perpendicular) to the direction of motion of the camera. The software also compensates for the decrease in illumination with distance from the ring of LEDs.
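The lookup-table unwarping described above can be sketched as follows. This is a minimal illustration, not the actual calibrated mapping used by the NASA system: it assumes a simple radial model in which each ring of pixels around the fisheye image center maps to one row of the unwarped pipe-wall view, and the function names and nearest-neighbour remap are illustrative choices.

```python
import numpy as np

def build_unwarp_lut(out_h, out_w, cx, cy, r_min, r_max):
    """Precompute source pixel coordinates in the fisheye image for each
    pixel of the unwarped pipe-wall view. Output rows correspond to radii
    (distance along the pipe), columns to azimuth around the pipe axis."""
    phi = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)  # azimuth
    r = np.linspace(r_min, r_max, out_h)                        # radial band
    rr, pp = np.meshgrid(r, phi, indexing="ij")
    src_x = np.rint(cx + rr * np.cos(pp)).astype(int)
    src_y = np.rint(cy + rr * np.sin(pp)).astype(int)
    return src_x, src_y

def unwarp(fisheye_img, src_x, src_y):
    """Nearest-neighbour remap through the precomputed lookup table."""
    return fisheye_img[src_y, src_x]
```

In practice the table would be built once from calibration images (as the abstract describes) and reused for every frame, so the per-frame cost is a single array lookup.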
5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Interdisciplinary scientist participation in the Phobos mission
NASA Technical Reports Server (NTRS)
1992-01-01
Data was acquired from VSK (2 wide-angle visible-NIR TV cameras at 0.4 to 0.6 micrometers and 0.8 to 1.1 micrometers, and a narrow-angle TV camera), KRFM (10-band UV-visible spectrometer at 0.3 to 0.6 micrometers and a 6-band radiometer at 5-50 micrometers), and ISM (a 128-channel NIR imaging spectrometer at 0.8 to 3 micrometers). These data provided improved mapping coverage of Phobos; improved mass, shape, and volume determinations, with the density shown to be lower than that of all known meteorites, suggesting a porous interior; evidence for a physically, spectrally and possibly compositionally heterogeneous surface; and proof that the spectral properties do not closely resemble those of unaltered carbonaceous chondrites, but show more resemblance to the spectra of altered mafic material. For Mars, the data show that the underlying rock type can be distinguished through the global dust cover; that the spectral properties and possibly composition vary laterally between and within the geologic provinces; that the surface physical properties vary laterally, and in many cases, the boundaries coincide with those of the geologic units; and the acquired data also demonstrate the value of reflectance spectroscopy and radiometry to the study of Martian geology.
The emplacement of long lava flows in Mare Imbrium, the Moon
NASA Astrophysics Data System (ADS)
Garry, W. B.
2012-12-01
Lava flow margins are scarce on the lunar surface. The best-developed lava flows on the Moon occur in Mare Imbrium, where flow margins are traceable along nearly their entire flow length. The flow field originates in the southwest part of the basin from a fissure or series of fissures and cones located in the vicinity of Euler crater and erupted in three phases (Phases I, II, III) over a period of 0.5 billion years (3.0 - 2.5 Ga). The flow field was originally mapped with Apollo and Lunar Orbiter data by Schaber (1973), which shows that the flow field extends 200 to 1200 km from the presumed source area and covers an area of 2.0 x 10^5 km^2 with an estimated eruptive volume of 4 x 10^4 km^3. Phase I flows extend 1200 km and have the largest flow volume, but interestingly do not exhibit visible topography and are instead defined by a difference in color from the surrounding mare flows. Phases II and III flows have well-defined flow margins (10 - 65 m thick) and channels (0.4 - 2.0 km wide, 40 - 70 m deep), but shorter flow lengths, 600 km and 400 km respectively. Recent missions, including Lunar Reconnaissance Orbiter (LRO), Kaguya (Selene), and Clementine, provide high-resolution data sets of these lava flows. Using a combination of data sets including images from the LRO Wide-Angle Camera (WAC) (50-100 m/pixel) and Narrow-Angle Camera (NAC) (up to 0.5 m/pixel), Kaguya Terrain Camera (TC) (10 m/pixel), and topography from the LRO Lunar Orbiter Laser Altimeter (LOLA), the morphology has been remapped and topographic measurements of the flow features have been made in an effort to reevaluate the emplacement of the flow field. Morphologic mapping reveals a different flow path for Phase I compared to the original mapping completed by Schaber (1973). The boundaries of the Phase I flow field have been revised based on Moon Mineralogy Mapper color ratio images (Staid et al., 2011). This has implications for the area covered and volume erupted during this stage, as well as the age of Phase I.
Flow features and margins have been identified in the Phase I flow within the LROC WAC mosaic and in Narrow Angle Camera (NAC) images. These areas have a mottled appearance. LOLA profiles over the more prominent flow lobes in Phase I reveal these margins are less than 10 m thick. Phase II and III morphology maps are similar to previous flow maps. Phase III lobes near Euler are 10-12 km wide and 20-30 m thick based on measurements of the LOLA 1024ppd Elevation Digital Terrain Model (DTM) in JMoon. One of the longer Phase III lobes varies between 15 to 50 km wide and 25 to 60 m thick, with the thickest section at the distal end of the lobe. The Phase II lobe is 15 to 25 m thick and up to 35 km wide. The eruptive volume of the Mare Imbrium lava flows has been compared to terrestrial flood basalts. The morphology of the lobes in Phases II and III, which includes levees, thick flow fronts, and lobate margins, suggests these could be similar to terrestrial aa-style flows. The Phase I flows might be more representative of sheet flows, pahoehoe-style flows, or inflated flows. Morphologic comparisons will be made with terrestrial flows at Askja volcano in Iceland, a potential analog to compare different styles of emplacement for the flows in Mare Imbrium.
ERIC Educational Resources Information Center
Beverly, Robert E.; Young, Thomas J.
Two hundred forty college undergraduates participated in a study of the effect of camera angle on an audience's perceptual judgments of source credibility, dominance, attraction, and homophily. The subjects were divided into four groups and each group was shown a videotape presentation in which sources had been videotaped according to one of four…
3D bubble reconstruction using multiple cameras and space carving method
NASA Astrophysics Data System (ADS)
Fu, Yucheng; Liu, Yang
2018-07-01
An accurate measurement of bubble shape and size is of significant value in understanding the behavior of bubbles that exist in many engineering applications. Past studies usually use one or two cameras to estimate bubble volume and surface area, among other parameters. The 3D bubble shape and rotation angle are generally not available in these studies. To overcome this challenge and obtain more detailed information on individual bubbles, a 3D imaging system consisting of four high-speed cameras is developed in this paper, and the space carving method is used to reconstruct the 3D bubble shape based on the recorded high-speed images from different view angles. The proposed method can reconstruct the bubble surface with minimal assumptions. A benchmarking test is performed in a 3 cm × 1 cm rectangular channel with stagnant water. The results show that the newly proposed method can measure the bubble volume with an error of less than 2% compared with the syringe reading. The conventional two-camera system has an error of around 10%, and the one-camera system an error greater than 25%. The visualization of a 3D bubble rising demonstrates the wall influence on bubble rotation angle and aspect ratio. This also explains the large error that exists in the single-camera measurement.
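The core of space carving can be sketched as below. This is a toy orthographic version, not the paper's calibrated four-camera setup: a voxel survives only if its projection falls inside the silhouette seen by every camera, and all names and the two-view example are illustrative.

```python
import numpy as np

def space_carve(grid_shape, views):
    """Keep only voxels whose projection lands inside the silhouette of
    every view; `views` is a list of (silhouette, project) pairs, where
    project maps (N, 3) voxel indices to (N, 2) integer pixel coords."""
    occupied = np.ones(grid_shape, dtype=bool)
    idx = np.argwhere(occupied)                # all voxel indices
    for silhouette, project in views:
        uv = project(idx)                      # (N, 2) pixel coordinates
        inside = silhouette[uv[:, 1], uv[:, 0]]
        occupied[tuple(idx.T)] &= inside       # carve voxels outside view
    return occupied

# toy example: two orthographic views carve a 10x10x10 grid down to a box
sil_xy = np.zeros((10, 10), bool); sil_xy[2:8, 2:8] = True   # seen along z
sil_yz = np.zeros((10, 10), bool); sil_yz[3:7, 3:7] = True   # seen along x
carved = space_carve((10, 10, 10), [
    (sil_xy, lambda idx: idx[:, [0, 1]]),     # u = x, v = y
    (sil_yz, lambda idx: idx[:, [1, 2]]),     # u = y, v = z
])
```

With real cameras each `project` would apply the calibrated perspective projection, and volume follows from counting surviving voxels times the voxel size.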
MISR Global Images See the Light of Day
NASA Technical Reports Server (NTRS)
2002-01-01
As of July 31, 2002, global multi-angle, multi-spectral radiance products are available from the MISR instrument aboard the Terra satellite. Measuring the radiative properties of different types of surfaces, clouds and atmospheric particulates is an important step toward understanding the Earth's climate system. These images are among the first planet-wide summary views to be publicly released from the Multi-angle Imaging SpectroRadiometer experiment. Data for these images were collected during the month of March 2002, and each pixel represents monthly-averaged daylight radiances from an area measuring 1/2 degree in latitude by 1/2 degree in longitude. The top panel is from MISR's nadir (vertical-viewing) camera and combines data from the red, green and blue spectral bands to create a natural color image. The central view combines near-infrared, red, and green spectral data to create a false-color rendition that enhances highly vegetated terrain. It takes 9 days for MISR to view the entire globe, and only areas within 8 degrees of latitude of the north and south poles are not observed due to the Terra orbit inclination. Because a single pole-to-pole swath of MISR data is just 400 kilometers wide, multiple swaths must be mosaicked to create these global views. Discontinuities appear in some cloud patterns as a consequence of changes in cloud cover from one day to another. The lower panel is a composite in which red, green, and blue radiances from MISR's 70-degree forward-viewing camera are displayed in the northern hemisphere, and radiances from the 70-degree backward-viewing camera are displayed in the southern hemisphere. At the March equinox (spring in the northern hemisphere, autumn in the southern hemisphere), the Sun is near the equator. Therefore, both oblique angles are observing the Earth in 'forward scattering', particularly at high latitudes.
Forward scattering occurs when you (or MISR) observe an object with the Sun at a point in the sky that is in front of you. Relative to the nadir view, this geometry accentuates the appearance of polar clouds, and can even reveal clouds that are invisible in the nadir direction. In relatively clear ocean areas, the oblique-angle composite is generally brighter than its nadir counterpart due to enhanced reflection of light by atmospheric particulates. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Near-field observation platform
NASA Astrophysics Data System (ADS)
Schlemmer, Harry; Baeurle, Constantin; Vogel, Holger
2008-04-01
A miniaturized near-field observation platform is presented, comprising a sensitive daylight camera and an uncooled micro-bolometer thermal imager, each equipped with a wide-angle lens. Both cameras are optimised for a range between a few meters and 200 m. The platform features a stabilised line of sight and can therefore also be used on a vehicle while it is in motion. The line of sight can either be directed manually or the platform can be used in a panoramic mode. The video output is connected to a control panel where algorithms for moving target indication or tracking can be applied in order to support the observer. The near-field platform can also be netted with the vehicle system, and the signals can be utilised, e.g. to designate a new target to the main periscope or the weapon sight.
1986-01-22
Range: 2.7 million kilometers (1.7 million miles). P-29497C This Voyager 2 false-color composite of Uranus demonstrates the usefulness of special filters in the Voyager cameras for revealing the presence of high-altitude hazes in Uranus' atmosphere. The picture is a composite of images obtained through the single orange and two methane filters of Voyager's wide-angle camera. Orange, short-wavelength methane, and long-wavelength methane images are displayed, respectively, as blue, green, and orange. The pink area centered on the pole is due to the presence of hazes high in the atmosphere that reflect the light before it has traversed a long enough path through the atmosphere to suffer absorption by methane gas. The bluest regions at mid-latitudes represent the most haze-free regions on Uranus; thus, deeper cloud levels can be detected in these areas.
Modified plenoptic camera for phase and amplitude wavefront sensing
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Davis, Christopher C.
2013-09-01
Shack-Hartmann sensors have been widely applied in wavefront sensing. However, they are limited to measuring slightly distorted wavefronts whose local tilt does not surpass the numerical aperture of the micro-lens array, and cross talk of incident waves on the micro-lens array must be strictly avoided. In medium to strong turbulence cases of optical communication, where large jitter in angle of arrival and local interference caused by break-up of the beam are common phenomena, Shack-Hartmann sensors no longer serve as effective tools in revealing distortions in a signal wave. Our design of a modified plenoptic camera shows great potential in observing and extracting useful information from severely disturbed wavefronts. Furthermore, by separating complex interference patterns into several minor interference cases, it may also be capable of resolving regional phase differences of coherently illuminated objects.
Uav Borne Low Altitude Photogrammetry System
NASA Astrophysics Data System (ADS)
Lin, Z.; Su, G.; Xie, F.
2012-07-01
In this paper, the aforementioned three major aspects of an Unmanned Aerial Vehicle (UAV) system for low-altitude aerial photogrammetry, i.e., the flying platform, the imaging sensor system and the data processing software, are discussed. First of all, according to the technical requirements on the least cruising speed, the shortest taxiing distance, the level of flight control and the performance in turbulent flying, the performance and suitability of the available UAV platforms (e.g., fixed-wing UAVs, unmanned helicopters and unmanned airships) are compared and analyzed. Secondly, considering the restrictions on the load weight of a platform and the resolution pertaining to a sensor, together with the exposure equation and the theory of optical information, the principles of designing self-calibrating and self-stabilizing combined wide-angle digital cameras (e.g., the double-combined camera and the four-combined camera) are emphasized. Finally, a software package named MAP-AT, which takes account of the specialty of UAV platforms and sensors, is developed and introduced. Apart from the common functions of aerial image processing, MAP-AT puts particular effort into automatic extraction, automatic checking and manually assisted adding of tie points for images with big tilt angles. Based on the process for low-altitude photogrammetry with UAVs recommended in this paper, more than ten aerial photogrammetry missions have been accomplished, and the accuracies of the resulting aerial triangulation, digital orthophotos (DOM) and digital line graphs (DLG) meet the standard requirements of 1:2000, 1:1000 and 1:500 mapping.
Polarimetric Observations of the Lunar Surface
NASA Astrophysics Data System (ADS)
Kim, S.
2017-12-01
Polarimetric images contain valuable information on the lunar surface, such as the grain size and porosity of the regolith, from which one can estimate the space weathering environment on the lunar surface. Surprisingly, polarimetric observation has never before been conducted from lunar orbit. A Wide-Angle Polarimetric Camera (PolCam) has recently been selected as one of three Korean science instruments onboard the Korea Pathfinder Lunar Orbiter (KPLO), which is planned for launch in 2019/2020 as the first Korean lunar mission. PolCam will obtain 80 m-resolution polarimetric images of the whole lunar surface between -70° and +70° latitude at the 320, 430 and 750 nm bands for phase angles up to 115°. I will also discuss previous polarimetric studies of the lunar surface based on our ground-based observations.
2007-03-01
front of a large area blackbody as background. The viewing angle, defined as the angle between surface normal and camera line of sight, was varied by... and polarization angle were derived from the Stokes parameters. The dependence of these polarization characteristics on viewing angle was investigated
NASA Astrophysics Data System (ADS)
Schmitz, Nicole; Jaumann, Ralf; Coates, Andrew; Griffiths, Andrew; Hauber, Ernst; Trauthan, Frank; Paar, Gerhard; Barnes, Dave; Bauer, Arnold; Cousins, Claire
2010-05-01
Geologic context as a combination of orbital imaging and surface vision, including range, resolution, stereo, and multispectral imaging, is commonly regarded as a basic requirement for remote robotic geology and forms the first tier of any multi-instrument strategy for investigating and eventually understanding the geology of a region from a robotic platform. Missions with objectives beyond a pure geologic survey, e.g. exobiology objectives, require goal-oriented operational procedures, where the iterative process of scientific observation, hypothesis, testing, and synthesis, performed via a sol-by-sol data exchange with a remote robot, is supported by a powerful vision system. Beyond allowing a thorough geological mapping of the surface (soil, rocks and outcrops) in 3D, using wide-angle stereo imagery, such a system needs to be able to provide detailed visual information on targets of interest in high resolution, thereby enabling the selection of science targets and samples for further analysis with a specialized in-situ instrument suite. Surface vision for ESA's upcoming ExoMars rover will come from a dedicated Panoramic Camera System (PanCam). As an integral part of the Pasteur payload package, the PanCam is designed to support the search for evidence of biological processes by obtaining wide-angle multispectral stereoscopic panoramic images and high-resolution RGB images from the mast of the rover [1]. The camera system will consist of two identical wide-angle cameras (WACs), which are arranged on a common pan-tilt mechanism, with a fixed stereo base length of 50 cm. The WACs are complemented by a High Resolution Camera (HRC), mounted between the WACs, which allows a magnification of selected targets by a factor of ~8 with respect to the wide-angle optics.
The high-resolution images together with the multispectral and stereo capabilities of the camera will be of unprecedented quality for the identification of water-related surface features (such as sedimentary rocks) and form one key to a successful implementation of ESA's multi-level strategy for the ExoMars Reference Surface Mission. A dedicated PanCam Science Implementation Strategy is under development, which connects the PanCam science objectives and needs of the ExoMars Surface Mission with the required investigations, planned measurement approach and sequence, and connected mission requirements. The first step of this strategy is to obtain geological context to enable the decision where to send the rover. PanCam (in combination with Wisdom) will be used to obtain ground truth by a thorough geomorphologic mapping of the ExoMars rover's surroundings in near and far range in the form of (1) RGB or monochromatic full (i.e. 360°) or partial stereo panoramas for morphologic and textural information and stereo ranging, (2) mosaics or single images with partial or full multispectral coverage to assess the mineralogy of surface materials as well as their weathering state and possible past or present alteration processes and (3) small-scale high-resolution information on targets/features of interest, and distant or inaccessible sites. This general survey phase will lead to the identification of surface features like outcrops, ridges and troughs and the characterization of different rock and surface units based on their morphology, distribution, and spectral and physical properties. Evidence of water-bearing minerals, water-altered rocks or even water-lain sediments seen in the large-scale wide-angle images will then allow for preselecting those targets/features considered relevant for detailed analysis and definition of their geologic context.
Detailed characterization and, subsequently, selection of those preselected targets/features for further analysis will then be enabled by color high-resolution imagery, followed by the next tier of contact instruments to enable a decision on whether or not to acquire samples for further analysis. During the following drill/analysis phase, PanCam's High Resolution Camera will characterize the sample in the sample tray and observe the sample discharge into the Core Sample Transfer Mechanism. Key parts of this science strategy have been tested under laboratory conditions in two geology blind tests [2] and during two field test campaigns in Svalbard, using simulated mission conditions, an ExoMars representative Payload (ExoMars and MSL instrument breadboards), and Mars analog settings [3, 4]. The experiences gained are being translated into operational sequences, and, together with the science implementation strategy, form a first version of a PanCam Surface Operations plan. References: [1] Griffiths, A.D. et al. (2006) International Journal of Astrobiology 5 (3): 269-275, doi:10.1017/ S1473550406003387. [2] Pullan, D. et al. (2009) EPSC Abstracts, Vol. 4, EPSC2009-514. [3] Schmitz, N. et al. (2009) Geophysical Research Abstracts, Vol. 11, EGU2009-10621-2. [4] Cousins, C. et al. (2009) EPSC Abstracts, Vol. 4, EPSC2009-813.
Esthetic smile preferences and the orientation of the maxillary occlusal plane.
Kattadiyil, Mathew T; Goodacre, Charles J; Naylor, W Patrick; Maveli, Thomas C
2012-12-01
The anteroposterior orientation of the maxillary occlusal plane has an important role in the creation, assessment, and perception of an esthetic smile. However, the effect of the angle at which this plane is visualized (the viewing angle) in a broad smile has not been quantified. The purpose of this study was to assess the esthetic preferences of dental professionals and nondentists by using 3 viewing angles of the anteroposterior orientation of the maxillary occlusal plane. After Institutional Review Board approval, standardized digital photographic images of the smiles of 100 participants were recorded by simultaneously triggering 3 cameras set at different viewing angles. The top camera was positioned 10 degrees above the occlusal plane (camera #1, Top view); the center camera was positioned at the level of the occlusal plane (camera #2, Center view); and the bottom camera was located 10 degrees below the occlusal plane (camera #3, Bottom view). Forty-two dental professionals and 31 nondentists (persons from the general population) independently evaluated digital images of each participant's smile captured from the Top view, Center view, and Bottom view. The 73 evaluators were asked individually through a questionnaire to rank the 3 photographic images of each patient as 'most pleasing,' 'somewhat pleasing,' or 'least pleasing,' with most pleasing being the most esthetic view and the preferred orientation of the occlusal plane. The resulting esthetic preferences were statistically analyzed by using the Friedman test. In addition, the participants were asked to rank their own images from the 3 viewing angles as 'most pleasing,' 'somewhat pleasing,' and 'least pleasing.' The 73 evaluators found statistically significant differences in the esthetic preferences between the Top and Bottom views and between the Center and Bottom views (P<.001). No significant differences were found between the Top and Center views. 
The Top position was marginally preferred over the Center, and both were significantly preferred over the Bottom position. When the participants evaluated their own smiles, a significantly greater number (P<.001) preferred the Top view over the Center or the Bottom views. No significant differences were found in preferences based on the demographics of the evaluators when comparing age, education, gender, profession, and race. The esthetic preference for the maxillary occlusal plane was influenced by the viewing angle, with the higher (Top) and Center views preferred by both dental and nondental evaluators. The participants themselves preferred the higher view of their smile significantly more often than the center or lower angle views (P<.001). Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
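The Friedman test used above compares matched rankings of the same smiles across the three viewing angles. A minimal sketch with SciPy, using hypothetical rank data (the actual study data are not reproduced here):

```python
from scipy.stats import friedmanchisquare

# hypothetical ranks (1 = most pleasing, 3 = least pleasing) given by ten
# evaluators to the Top, Center, and Bottom views of the same smile
top    = [1, 1, 2, 1, 2, 1, 1, 2, 1, 1]
center = [2, 2, 1, 2, 1, 2, 2, 1, 2, 2]
bottom = [3, 3, 3, 3, 3, 3, 3, 3, 3, 3]

# each argument is one "treatment" (viewing angle), rows are matched evaluators
stat, p = friedmanchisquare(top, center, bottom)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
```

A small p would mirror the study's finding that the Bottom view differs significantly, while pairwise follow-up comparisons would be needed to show which views differ.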
NASA Astrophysics Data System (ADS)
Zhao, Guihua; Chen, Hong; Li, Xingquan; Zou, Xiaoliang
The paper presents the concepts of lever arm and boresight angle, the design requirements of calibration sites, and an integrated method for calibrating the boresight angles of a digital camera or laser scanner. Taking test data collected by Applanix's LandMark system as an example, the camera calibration method is introduced, based on piling three consecutive stereo images together with an OTF-calibration method using ground control points. For the laser boresight angles, a combined manual and automatic calibration method using ground control points is proposed. Integrated calibration between the digital camera and the laser scanner is introduced to improve the systemic precision of the two sensors. By analyzing the measured values between ground control points and their corresponding image points in sequence images, the relative errors of object positions between camera and images are within about 15 cm and the absolute errors within about 20 cm. By comparing the differences between ground control points and their corresponding laser point clouds, the errors are less than 20 cm. These experimental results show that the mobile mapping system is an efficient and reliable system for rapidly generating high-accuracy and high-density road spatial data.
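The calibrated quantities above enter the georeferencing chain in a simple way: a point measured in the sensor frame is rotated by the boresight angles and shifted by the lever arm into the vehicle body frame. A generic sketch (the names and roll-pitch-yaw convention are illustrative, not LandMark's actual formulation):

```python
import numpy as np

def rot(roll, pitch, yaw):
    """Rotation matrix from boresight misalignment angles (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def sensor_to_body(p_sensor, boresight, lever_arm):
    """Apply the calibrated boresight rotation, then the lever-arm offset,
    to express a sensor-frame point in the vehicle body frame."""
    return rot(*boresight) @ p_sensor + lever_arm
```

Calibration then amounts to estimating the three boresight angles (and checking the lever arm) so that points projected through this chain agree with surveyed ground control points.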
NASA Astrophysics Data System (ADS)
Carbajal Gomez, Leopoldo; Del-Castillo-Negrete, Diego
2017-10-01
Developing avoidance or mitigation strategies for runaway electrons (RE) is imperative for the safe operation of ITER. Synchrotron radiation (SR) of RE is routinely used in current tokamak experiments to diagnose RE. We present the results of a newly developed camera diagnostic of SR for full-orbit kinetic simulations of RE in DIII-D-like plasmas that simultaneously includes: full-orbit effects, information on the spectral and angular distribution of the SR of each electron, and basic geometric optics of a camera. We observe a strong dependence of the SR measured by the camera on the pitch angle distribution of the RE; namely, we find that crescent shapes of the SR in the camera pictures correspond to RE distributions with small pitch angles, while ellipse shapes correspond to distributions of RE with larger pitch angles. A weak dependence of the SR measured by the camera on the RE energy, the value of the q-profile at the edge, and the chosen range of wavelengths is found. Furthermore, we observe that oversimplifying the angular distribution of the SR changes the synchrotron spectra and overestimates their amplitude. Research sponsored by the LDRD Program of ORNL, managed by UT-Battelle, LLC, for the U.S. DoE.
Inside Victoria Crater for Extended Exploration
NASA Technical Reports Server (NTRS)
2007-01-01
After finishing an in-and-out maneuver to check wheel slippage near the rim of Victoria Crater, NASA's Mars Exploration Rover Opportunity re-entered the crater during the rover's 1,293rd Martian day, or sol (Sept. 13, 2007), to begin a weeks-long exploration of the inner slope. Opportunity's front hazard-identification camera recorded this wide-angle view looking down into and across the crater at the end of the day's drive. The rover's position was about six meters (20 feet) inside the rim, in the 'Duck Bay' alcove of the crater.
NASA Technical Reports Server (NTRS)
2004-01-01
13 August 2004 This red wide angle Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows a view of the retreating seasonal south polar cap in the most recent spring in late 2003. Bright areas are covered with frost; dark areas are those from which the solid carbon dioxide has sublimed away. The center of this image is located near 76.5°S, 28.2°W. The scene is large; it covers an area about 250 km (155 mi) across. The scene is illuminated by sunlight from the upper left.
2017-09-15
This view of Saturn's A ring features a lone "propeller" -- one of many such features created by small moonlets embedded in the rings as they attempt, unsuccessfully, to open gaps in the ring material. The image was taken by NASA's Cassini spacecraft on Sept. 13, 2017. It is among the last images Cassini sent back to Earth. The view was taken in visible light using the Cassini spacecraft wide-angle camera at a distance of 420,000 miles (676,000 kilometers) from Saturn. Image scale is 2.3 miles (3.7 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21894
NASA Technical Reports Server (NTRS)
2005-01-01
An up-close look at Saturn's atmosphere shows wavelike structures in the planet's constantly changing clouds. Feathery striations in the lower right appear to be small-scale waves propagating at a higher altitude than the other cloud features. The image was taken with the Cassini spacecraft wide-angle camera on April 14, 2005, through a filter sensitive to wavelengths of infrared light centered at 727 nanometers and at a distance of approximately 386,000 kilometers (240,000 miles) from Saturn. The image scale is 19 kilometers (12 miles) per pixel.
2007-03-08
This beautiful look at Saturn's south polar atmosphere shows the hurricane-like polar storm swirling there. Sunlight highlights its high cloud walls, especially around the 10 o'clock position. The image was taken with the Cassini spacecraft wide-angle camera using a spectral filter sensitive to wavelengths of infrared light centered at 939 nanometers. The image was taken on Jan. 30, 2007 at a distance of approximately 1.1 million kilometers (700,000 miles) from Saturn. Image scale is 61 kilometers (38 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA08892
2005-10-04
During its time in orbit, Cassini has spotted many beautiful cat's eye-shaped patterns like the ones visible here. These patterns occur in places where the winds and the atmospheric density at one latitude are different from those at another latitude. The opposing east-west flowing cloud bands are the dominant patterns seen here and elsewhere in Saturn's atmosphere. Contrast in the image was enhanced to aid the visibility of atmospheric features. The image was taken with the Cassini spacecraft wide-angle camera on Aug. 20, 2005. http://photojournal.jpl.nasa.gov/catalog/PIA07600
Scotti, Filippo; Roquemore, A L; Soukhanovskii, V A
2012-10-01
A pair of two-dimensional fast cameras with a wide-angle view (allowing full radial and toroidal coverage of the lower divertor) was installed in the National Spherical Torus Experiment in order to monitor non-axisymmetric effects. A custom polar remapping procedure and an absolute photometric calibration enabled easier visualization and quantitative analysis of non-axisymmetric plasma-material interaction (e.g., strike point splitting due to application of 3D fields and effects of toroidally asymmetric plasma facing components).
Note: Simple hysteresis parameter inspector for camera module with liquid lens
NASA Astrophysics Data System (ADS)
Chen, Po-Jui; Liao, Tai-Shan; Hwang, Chi-Hung
2010-05-01
A method to inspect a hysteresis parameter is presented in this article. The hysteresis of the whole camera module with a liquid lens can be measured, rather than that of a single lens only. Because variation in focal length influences image quality, we propose utilizing the sharpness of images captured from the camera module for hysteresis evaluation. Experiments reveal that the profile of the sharpness hysteresis corresponds to the contact-angle characteristic of the liquid lens. Therefore, it can be inferred that the hysteresis of the camera module is induced by the contact angle of the liquid lens. An inspection takes only 20 s to complete. Thus, compared with other instruments, this inspection method is more suitable for integration into mass production lines for online quality assurance.
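A sharpness-based hysteresis evaluation of this kind can be sketched as follows. The focus measure (variance of a discrete Laplacian) and the width metric are common illustrative choices, not necessarily the authors' exact metric:

```python
import numpy as np

def sharpness(img):
    """Variance-of-Laplacian focus measure, a common proxy for image
    sharpness; higher values indicate better focus."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    return lap.var()

def hysteresis_width(sharp_up, sharp_down):
    """Largest gap between the up-sweep and down-sweep sharpness curves,
    sampled at the same set of lens driving voltages."""
    return float(np.max(np.abs(np.asarray(sharp_up) - np.asarray(sharp_down))))
```

In use, one would sweep the liquid-lens driving voltage up and then down, compute `sharpness` on the image captured at each step, and report `hysteresis_width` of the two curves as the module's hysteresis parameter.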
Autonomous pedestrian localization technique using CMOS camera sensors
NASA Astrophysics Data System (ADS)
Chun, Chanwoo
2014-09-01
We present a pedestrian localization technique that does not need infrastructure. The proposed angle-only measurement method requires specially manufactured shoes. Each shoe has two CMOS cameras and two markers, such as LEDs, attached on the inward side. The line-of-sight (LOS) angles toward the two markers on the forward shoe are measured using the two cameras on the rear shoe. Our simulation results show that a pedestrian walking through a shopping mall wearing this device can be accurately guided to the front of a destination store located 100 m away, if the floor plan of the mall is available.
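The bearing-only geometry behind this can be sketched in 2-D: two rays, one from each rear-shoe camera at a known baseline, are intersected to locate a forward-shoe marker. The coordinates, baseline, and function name below are illustrative assumptions, not the paper's algorithm.

```python
import math

def triangulate(c1, c2, theta1, theta2):
    """Intersect two bearing rays (angles measured from the +x axis) cast
    from known camera positions c1 and c2; returns the marker (x, y)."""
    x1, y1 = c1
    x2, y2 = c2
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve c1 + t*d1 = c2 + s*d2 for t via 2-D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])
```

With a 10 cm camera baseline (plausible for a shoe) and exact bearings, the marker position is recovered exactly; in practice, angular noise would be propagated through this intersection.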
Easily Accessible Camera Mount
NASA Technical Reports Server (NTRS)
Chalson, H. E.
1986-01-01
Modified mount enables fast alignment of movie cameras in explosion-proof housings. Screws on side are readily reached through side door of housing. Mount includes right-angle drive mechanism containing two miter gears that turn a threaded shaft. Shaft drives movable dovetail clamping jaw that engages fixed dovetail plate on camera. Mechanism aligns camera in housing and secures it. Reduces installation time by 80 percent.
Multi-Angle Snowflake Camera Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shkurko, Konstantin; Garrett, T.; Gaustad, K
The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of depth of field for its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via a FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: images rely on the OpenCV image processing library and derived aggregated statistics rely on averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
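The fallspeed arithmetic described above is simple: the 32 mm array separation divided by the time between the upper- and lower-array crossings. A minimal sketch, with timestamp names that are our own assumptions:

```python
ARRAY_SEPARATION_M = 0.032  # vertical gap between the two IR trigger arrays

def fallspeed(t_upper, t_lower):
    """Fallspeed in m/s from the timestamps (seconds) at which a hydrometeor
    crosses the upper and then the lower emitter-detector array."""
    dt = t_lower - t_upper
    if dt <= 0:
        raise ValueError("lower array must trigger after the upper array")
    return ARRAY_SEPARATION_M / dt
```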
1989-08-23
P-34679 Range: 2 million km (1.2 million miles) In this Voyager 2 wide-angle image, the two main rings of Neptune can be clearly seen. In the lower part of the frame, the originally announced ring arc, consisting of three distinct features, is visible. This feature covers about 35 degrees of longitude and has yet to be radially resolved in Voyager images. From higher-resolution images it is known that this region contains much more material than the diffuse belts seen elsewhere in its orbit, which seem to encircle the planet. This is consistent with the fact that ground-based observations of stellar occultations by the rings show them to be very broken and clumpy. The more sensitive wide-angle camera is revealing more widely distributed but fainter material. Each of these rings of material lies just outside the orbit of a newly discovered moon. One of these moons, 1989N2, may be seen in the upper right corner. The moon is streaked by its orbital motion, whereas the stars in the frame are less smeared. The dark areas around the bright moon and star are artifacts of the processing required to bring out the faint rings.
Capturing method for integral three-dimensional imaging using multiviewpoint robotic cameras
NASA Astrophysics Data System (ADS)
Ikeya, Kensuke; Arai, Jun; Mishina, Tomoyuki; Yamaguchi, Masahiro
2018-03-01
Integral three-dimensional (3-D) technology for next-generation 3-D television must be able to capture dynamic moving subjects with pan, tilt, and zoom camerawork as good as in current TV program production. We propose a capturing method for integral 3-D imaging using multiviewpoint robotic cameras. The cameras are controlled through a cooperative synchronous system composed of a master camera controlled by a camera operator and other reference cameras that are utilized for 3-D reconstruction. When the operator captures a subject using the master camera, the region reproduced by the integral 3-D display is regulated in real space according to the subject's position and view angle of the master camera. Using the cooperative control function, the reference cameras can capture images at the narrowest view angle that does not lose any part of the object region, thereby maximizing the resolution of the image. 3-D models are reconstructed by estimating the depth from complementary multiviewpoint images captured by robotic cameras arranged in a two-dimensional array. The model is converted into elemental images to generate the integral 3-D images. In experiments, we reconstructed integral 3-D images of karate players and confirmed that the proposed method satisfied the above requirements.
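The cooperative control step above — choosing the narrowest view angle that still covers the whole object region — reduces to simple pinhole geometry. The sketch below is a minimal stand-in under that assumption (a centred subject of known width at a known distance); it is not the authors' controller.

```python
import math

def required_view_angle(subject_width_m, distance_m, margin_m=0.0):
    """Narrowest horizontal view angle (radians) that still covers a subject
    of the given width, centred at the given distance from the camera.
    An optional margin widens the covered extent on each side."""
    half_extent = subject_width_m / 2 + margin_m
    return 2 * math.atan(half_extent / distance_m)
```

A reference camera would zoom to this angle (or slightly wider) so no part of the subject region is lost while the image resolution on the subject is maximized.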
Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain
NASA Astrophysics Data System (ADS)
Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.
2018-04-01
High-resolution satellites with longer focal lengths and larger apertures have been widely used for georeferencing observed scenes in recent years. A consistent end-to-end model of the high-resolution remote sensing satellite geometric chain is presented, which consists of the scene, the three-line-array camera, the platform (including attitude and position information), the time system, and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for geolocation accuracy is put forward by introducing a new index: the angle between the camera and the star tracker. The model is rigorously validated by simulating geolocation accuracy according to the test method used for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results. The geolocation accuracy can be improved by about 7 m through the integrated design. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.
The PanCam Instrument for the ExoMars Rover
NASA Astrophysics Data System (ADS)
Coates, A. J.; Jaumann, R.; Griffiths, A. D.; Leff, C. E.; Schmitz, N.; Josset, J.-L.; Paar, G.; Gunn, M.; Hauber, E.; Cousins, C. R.; Cross, R. E.; Grindrod, P.; Bridges, J. C.; Balme, M.; Gupta, S.; Crawford, I. A.; Irwin, P.; Stabbins, R.; Tirsch, D.; Vago, J. L.; Theodorou, T.; Caballo-Perucha, M.; Osinski, G. R.; PanCam Team
2017-07-01
The scientific objectives of the ExoMars rover are designed to answer several key questions in the search for life on Mars. In particular, the unique subsurface drill will address some of these, such as the possible existence and stability of subsurface organics. PanCam will establish the surface geological and morphological context for the mission, working in collaboration with other context instruments. Here, we describe the PanCam scientific objectives in geology, atmospheric science, and 3-D vision. We discuss the design of PanCam, which includes a stereo pair of Wide Angle Cameras (WACs), each of which has an 11-position filter wheel and a High Resolution Camera (HRC) for high-resolution investigations of rock texture at a distance. The cameras and electronics are housed in an optical bench that provides the mechanical interface to the rover mast and a planetary protection barrier. The electronic interface is via the PanCam Interface Unit (PIU), and power conditioning is via a DC-DC converter. PanCam also includes a calibration target mounted on the rover deck for radiometric calibration, fiducial markers for geometric calibration, and a rover inspection mirror.
Camera Trajectory from Wide Baseline Images
NASA Astrophysics Data System (ADS)
Havlena, M.; Torii, A.; Pajdla, T.
2008-09-01
Camera trajectory estimation, which is closely related to structure-from-motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. There are essential issues for reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible), and camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. with a fish-eye lens convertor, are used. The hardware which we use in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction.
We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius r of an image point to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of image points to 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions, including MSER, Harris Affine, and Hessian Affine, in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAFs) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in its standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used, because the viewpoint can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real time.
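The two-parameter model above amounts to a pixel-to-ray back-projection. A minimal sketch, where the calibration constants are made-up illustrative values rather than the calibrated FC-E9 parameters:

```python
import math

def pixel_to_ray(u, v, a, b):
    """Back-project an image point (pixels, relative to the principal point)
    to a unit 3-D ray using the model theta = a*r / (1 + b*r^2), where r is
    the image radius and theta the angle w.r.t. the optical axis."""
    r = math.hypot(u, v)
    if r == 0:
        return (0.0, 0.0, 1.0)   # principal point maps to the optical axis
    theta = a * r / (1 + b * r * r)
    s = math.sin(theta)
    # The ray keeps the in-image direction (u/r, v/r), tilted by theta.
    return (s * u / r, s * v / r, math.cos(theta))
```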
A robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches that is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling, as suggested in the literature, to draw 5-tuples from the list of tentative matches ordered ascendingly by the distance of their descriptors, which may help to reduce the number of samples in RANSAC. From each 5-tuple, the relative orientation is computed by solving the 5-point minimal relative orientation problem for calibrated cameras. Often, there are several models which are supported by a large number of matches. Thus the chance that the correct model, even if it has the largest support, will be found by running a single RANSAC is small. Prior work suggested generating models by randomized sampling, as in RANSAC, but using soft (kernel) voting for a parameter instead of looking for the maximal support. The best model is then selected as the one with the parameter closest to the maximum in the accumulator space. In our case, we vote in a two-dimensional accumulator for the estimated camera motion direction. However, unlike that work, we do not cast votes directly from each sampled epipolar geometry but from the best epipolar geometries recovered by ordered sampling in RANSAC. With our technique, we could go up to 98.5% contamination by mismatches with effort comparable to what simple RANSAC requires for 84% contamination. The relative camera orientation with the motion direction closest to the maximum in the voting space is finally selected. As already mentioned in the first paragraph, the use of camera trajectory estimates is quite wide. In earlier work we introduced a technique for measuring the size of the camera translation relative to the observed scene, which uses the dominant apical angle computed at the reconstructed scene points and is robust against mismatches.
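The sample-and-soft-vote idea can be illustrated compactly. The sketch below swaps the 5-point relative-orientation problem for a trivial 2-point line fit so it stays self-contained, and uses plain randomized sampling (ordered sampling is omitted for brevity): each sampled minimal model casts a vote for its parameter (here the slope) in an accumulator, and the sampled model nearest the accumulator peak wins. This is a didactic stand-in, not the epipolar-geometry pipeline described above.

```python
import random

def ransac_soft_vote(points, n_samples=200, bins=50, slope_range=(-5.0, 5.0)):
    """Soft-voting model selection on a 2-D line-fitting stand-in.

    Every sampled minimal (2-point) model votes for its slope in a 1-D
    accumulator; the sampled model whose slope is closest to the accumulator
    peak is returned as (slope, intercept).
    """
    lo, hi = slope_range
    acc = [0] * bins
    models = []
    rng = random.Random(0)               # fixed seed: reproducible sketch
    for _ in range(n_samples):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                     # degenerate minimal sample
        m = (y2 - y1) / (x2 - x1)
        if not lo <= m < hi:
            continue                     # outside the accumulator range
        acc[int((m - lo) / (hi - lo) * bins)] += 1
        models.append((m, y1 - m * x1))
    peak = lo + (acc.index(max(acc)) + 0.5) / bins * (hi - lo)
    return min(models, key=lambda mc: abs(mc[0] - peak))
```

Even with heavy contamination, the peak of the vote histogram stays at the inlier parameter, which is the intuition behind tolerating far higher mismatch rates than plain maximal-support RANSAC.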
The experiments demonstrated that the measure can be used to improve the robustness of camera path computation and of object recognition for methods which use a geometric constraint, e.g. the ground plane, such as the one used for the detection of pedestrians. Using the camera trajectories, perspective cutouts with a stabilized horizon are constructed, and an arbitrary object recognition routine designed to work with images acquired by perspective cameras can be used without any further modifications.
2015-08-03
Thanks to the illumination angle, Mimas (right) and Dione (left) appear to be staring up at a giant Saturn looming in the background. Although certainly large enough to be noticeable, moons like Mimas (246 miles or 396 kilometers across) and Dione (698 miles or 1123 kilometers across) are tiny compared to Saturn (75,400 miles or 120,700 kilometers across). Even the enormous moon Titan (3,200 miles or 5,150 kilometers across) is dwarfed by the giant planet. This view looks toward the unilluminated side of the rings from about one degree off the ring plane. The image was taken with the Cassini spacecraft wide-angle camera on May 27, 2015 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 728 nanometers. The view was obtained at a distance of approximately 634,000 miles (one million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 85 degrees. Image scale is 38 miles (61 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18331
9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
2004-09-07
Lonely Mimas swings around Saturn, seeming to gaze down at the planet's splendid rings. The outermost, narrow F ring is visible here and exhibits some clumpy structure near the bottom of the frame. The shadow of Saturn's southern hemisphere stretches almost entirely across the rings. Mimas is 398 kilometers (247 miles) wide. The image was taken with the Cassini spacecraft narrow angle camera on August 15, 2004, at a distance of 8.8 million kilometers (5.5 million miles) from Saturn, through a filter sensitive to visible red light. The image scale is 53 kilometers (33 miles) per pixel. Contrast was slightly enhanced to aid visibility. http://photojournal.jpl.nasa.gov/catalog/PIA06471
Flight Calibration of the LROC Narrow Angle Camera
NASA Astrophysics Data System (ADS)
Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.
2016-04-01
Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ~1-meter-size objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000, but a signal-dependent additive correction is required and applied for DN<600. A predictive model of detector temperature and dark level was developed to command the dark level offset. This avoids images with a cutoff at DN=0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
Hawkins, Liam J; Storey, Kenneth B
2017-01-01
Common Western-blot imaging systems have previously been adapted to measure signals from luminescent microplate assays. This can be a cost-saving measure, as Western-blot imaging systems are common laboratory equipment and could substitute for a dedicated luminometer if one is not otherwise available. One previously unrecognized limitation is that the signals captured by the cameras in these systems are not equal for all wells. Signals are dependent on the angle of incidence to the camera, and thus the location of the well on the microplate. Here we show that: (1) the position of a well on a microplate significantly affects the signal captured by a common Western-blot imaging system from a luminescent assay; (2) the effect of well position can easily be corrected for; and (3) this method can be applied to commercially available luminescent assays, allowing for high-throughput quantification of a wide range of biological processes and biochemical reactions.
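The well-position correction described above can be sketched as a flat-field-style normalization: image a plate loaded with a uniform luminescent standard once, derive per-well factors, and rescale subsequent plates. The function names and the choice of the plate mean as reference are our assumptions, not the authors' exact procedure.

```python
def correction_factors(calibration_plate):
    """Per-well correction factors from a plate holding a uniform luminescent
    standard: each factor rescales a well's signal to the plate mean,
    compensating for its angle of incidence to the camera."""
    wells = [v for row in calibration_plate for v in row]
    mean = sum(wells) / len(wells)
    return [[mean / v for v in row] for row in calibration_plate]

def correct(plate, factors):
    """Apply the per-well factors to a measured plate (same layout)."""
    return [[v * f for v, f in zip(prow, frow)]
            for prow, frow in zip(plate, factors)]
```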
Impediment to Spirit Drive on Sol 1806
NASA Technical Reports Server (NTRS)
2009-01-01
The hazard avoidance camera on the front of NASA's Mars Exploration Rover Spirit took this image after a drive by Spirit on the 1,806th Martian day, or sol (January 31, 2009), of Spirit's mission on the surface of Mars. The wheel at the bottom right of the image is Spirit's right-front wheel. Because that wheel no longer turns, Spirit drives backwards dragging that wheel. The drive on Sol 1806 covered about 30 centimeters (1 foot). The rover team had planned a longer drive, but Spirit stopped short, apparently from the right front wheel encountering the partially buried rock visible next to that wheel. The hazard avoidance cameras on the front and back of the rover provide wide-angle views. The hill on the horizon in the right half of this image is Husband Hill. Spirit reached the summit of Husband Hill in 2005.
KINEMATICS OF THE ORION TRAPEZIUM BASED ON DIFFRACTO-ASTROMETRY AND HISTORICAL DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivares, J.; Sánchez, L. J.; Ruelas-Mayorga, A.
2013-11-01
Using the novel Diffracto-Astrometry technique, we analyze 44 Hubble Space Telescope Wide Field Planetary Camera 2 images of the Orion Trapezium (OT) taken over a span of 12 yr (1995-2007). We measure the relative positions of the six brighter OT components (A-F) and supplement these results with measurements of the relative separations and position angles taken from the literature, thus extending our analysis time base to ∼200 yr. For every pair of components we find the relative rate of separation as well as the temporal rate of change of their position angles, which enable us to determine the relative kinematics of the system. Component E shows a velocity larger than the OT's escape velocity, thus confirming that it is escaping from the gravitational pull of this system.
2015-06-15
The two large craters on Tethys, near the line where day fades to night, almost resemble two giant eyes observing Saturn. The location of these craters on Tethys' terminator throws their topography into sharp relief. Both are large craters, but the larger and southernmost of the two shows a more complex structure. The angle of the lighting highlights a central peak in this crater. Central peaks are the result of the surface reacting to the violent post-impact excavation of the crater. The northern crater does not show a similar feature. Possibly the impact was too small to form a central peak, or the composition of the material in the immediate vicinity couldn't support the formation of a central peak. In this image Tethys is significantly closer to the camera, while the planet is in the background. Yet the moon is still utterly dwarfed by the giant Saturn. This view looks toward the anti-Saturn side of Tethys. North on Tethys is up and rotated 42 degrees to the right. The image was taken in visible light with the Cassini spacecraft wide-angle camera on April 11, 2015. The view was obtained at a distance of approximately 75,000 miles (120,000 kilometers) from Tethys. Image scale at Tethys is 4 miles (7 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/pia18318
Geomorphological Mapping on the Southern Hemisphere of Comet 67P/Churyumov-Gerasimenko
NASA Astrophysics Data System (ADS)
Lee, Jui-Chi; Massironi, Matteo; Giacomini, Lorenza; Ip, Wing-Huen; El-Maarry, Mohamed R.
2016-04-01
Since its rendezvous with comet 67P/Churyumov-Gerasimenko on August 6, 2014, the Rosetta spacecraft has carried out close-up observations of the nucleus and coma of this Jupiter-family comet. OSIRIS, the Scientific Imaging Camera System onboard the Rosetta spacecraft, which consists of a narrow-angle and a wide-angle camera (NAC and WAC), has made detailed investigations of the physical properties and surface morphology of the comet. From May 2015, the southern hemisphere of the comet became visible and the available resolution was high enough for a detailed analysis of the surface. Previous work shows that fine-particle deposits are the most extensive geomorphological unit in the northern hemisphere. In contrast, the southern hemisphere is dominated by rocky stratified terrain. The southern hemisphere of the nucleus thus reveals quite different morphologies from the northern hemisphere. This could be linked to the different insolation conditions between the northern and southern hemispheres. As a result, surface geological processes could operate with diverse intensity on the different sides of the comet nucleus. In this work, we provide geomorphological maps of the southern hemisphere with linear features and geological units identified. The geomorphological maps described in this study allow us to understand the processes and the origin of the comet.
Nakajima, Hiroshi; Kotani, Atsuhiro; Harada, Ken; Mori, Shigeo
2018-04-09
We construct an electron optical system to investigate Bragg diffraction (the crystal lattice plane, 10^-2 to 10^-3 rad) with the objective lens turned off by adjusting the current in the intermediate lenses. A crossover was located on the selected-area aperture plane. Thus, the dark-field imaging can be performed by using a selected-area aperture to select Bragg diffraction spots. The camera length can be controlled in the range of 0.8-4 m without exciting the objective lens. Furthermore, we can observe the magnetic-field dependence of electron diffraction using the objective lens under weak excitation conditions. The diffraction mode for Bragg diffraction can be easily switched to a small-angle electron diffraction mode having a camera length of more than 100 m. We propose this experimental method to acquire electron diffraction patterns that depict an extensive angular range from 10^-2 to 10^-7 rad. This method is applied to analyze the magnetic microstructures in three distinct magnetic materials, i.e. a uniaxial magnetic structure of BaFe10.35Sc1.6Mg0.05O19, a martensite of a Ni-Mn-Ga alloy, and a helical magnetic structure of Ba0.5Sr1.5Zn2Fe12O22.
Investigation into the use of photoanthropometry in facial image comparison.
Moreton, Reuben; Morley, Johanna
2011-10-10
Photoanthropometry is a metric-based facial image comparison technique. Measurements of the face are taken from an image using predetermined facial landmarks. Measurements are then converted to proportionality indices (PIs) and compared to PIs from another facial image. Photoanthropometry has been presented as a facial image comparison technique in UK courts for over 15 years. It is generally accepted that extrinsic factors (e.g. orientation of the head, camera angle, and distance from the camera) can cause discrepancies in anthropometric measurements of the face from photographs. However, there has been limited empirical research into quantifying the influence of such variables. The aim of this study was to determine the reliability of photoanthropometric measurements between different images of the same individual taken with different angulations of the camera. The study examined the facial measurements of 25 individuals from high-resolution photographs, taken at different horizontal and vertical camera angles in a controlled environment. Results show that the degree of variability in facial measurements of the same individual due to variations in camera angle can be as great as the variability of facial measurements between different individuals. Results suggest that photoanthropometric facial comparison, as it is currently practiced, is unsuitable for elimination purposes. Preliminary investigations into the effects of distance from camera and image resolution in poor-quality images suggest that such images are not an accurate representation of an individual's face; however, further work is required. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
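The PI computation described above is a ratio of inter-landmark distances, which makes the indices invariant to overall image scale (though not, as the study shows, to camera angle). A minimal sketch with hypothetical landmark names:

```python
import math

def proportionality_indices(landmarks, pairs, reference_pair):
    """Convert inter-landmark distances into proportionality indices (PIs):
    each measured distance divided by a reference distance.

    landmarks: name -> (x, y) image coordinates; pairs: landmark-name pairs
    to measure; reference_pair: the pair whose distance normalizes the rest.
    """
    def dist(a, b):
        (xa, ya), (xb, yb) = landmarks[a], landmarks[b]
        return math.hypot(xb - xa, yb - ya)

    ref = dist(*reference_pair)
    return {pair: dist(*pair) / ref for pair in pairs}
```

Because every distance is divided by the same reference, uniformly rescaling the image leaves the PIs unchanged; out-of-plane head rotation, however, changes the distances non-uniformly, which is the variability the study measures.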
Opportunity at Work Inside Victoria Crater
NASA Technical Reports Server (NTRS)
2007-01-01
NASA Mars Exploration Rover Opportunity used its front hazard-identification camera to capture this wide-angle view of its robotic arm extended to a rock in a bright-toned layer inside Victoria Crater. The image was taken during the rover's 1,322nd Martian day, or sol (Oct. 13, 2007). Victoria Crater has a scalloped shape of alternating alcoves and promontories around the crater's circumference. Opportunity descended into the crater two weeks earlier, within an alcove called 'Duck Bay.' Counterclockwise around the rim, just to the right of the arm in this image, is a promontory called 'Cabo Frio.'
2004-09-23
Looking beyond Saturn's south pole, this was the Cassini spacecraft's view of the distant, icy moon Enceladus on July 28, 2004. The planet itself shows few obvious features at these ultraviolet wavelengths, due to scattering of light by molecules of the gases high in the atmosphere. Enceladus is 499 kilometers (310 miles) wide. The image was taken with the Cassini spacecraft narrow angle camera at a distance of 7.4 million kilometers (4.6 million miles) from Saturn through a filter sensitive to ultraviolet wavelengths of light. The image scale is 44 kilometers (27 miles) per pixel of Saturn. http://photojournal.jpl.nasa.gov/catalog/PIA06483
Hologram production and representation for corrected image
NASA Astrophysics Data System (ADS)
Jiao, Gui Chao; Zhang, Rui; Su, Xue Mei
2015-12-01
In this paper, a CCD sensor is used to record distorted homemade grid images taken by a wide-angle camera. The distorted images are corrected using methods of position calibration and gray-level correction, implemented with VC++ 6.0 and the OpenCV library. Holograms of the corrected pictures are then produced. Clearly reproduced images are obtained where the Fresnel algorithm is used in processing, with the object and reference light subtracted from the Fresnel diffraction to delete the zero-order part of the reproduced images. The investigation is useful in optical information processing and image encryption transmission.
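The position-calibration step can be illustrated with the one-parameter division model for radial distortion, a standard model for wide-angle lenses; the paper does not say which model was used, and the coefficient and centre below are made-up values.

```python
def undistort_point(xd, yd, k, cx, cy):
    """Correct a radially distorted image point with the division model:
    p_u = c + (p_d - c) / (1 + k * r^2), where r is the distance of the
    distorted point p_d from the distortion centre c = (cx, cy)."""
    dx, dy = xd - cx, yd - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 / (1.0 + k * r2)
    return (cx + dx * scale, cy + dy * scale)
```

In practice, k and the centre would be fitted so that the imaged grid lines become straight after correction, which is exactly what a calibration grid like the one in the paper is for.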
Calibration of the Lunar Reconnaissance Orbiter Camera
NASA Astrophysics Data System (ADS)
Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.
2008-12-01
The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible-wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to the multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped at a 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%.
However, as this feature is well understood it can be greatly reduced during ground processing. All three cameras were calibrated in the laboratory under ambient conditions. Future thermal vacuum tests will characterize critical behaviors across the full range of lunar operating temperatures. In-flight tests will check for changes in response after launch and provide key data for meeting the requirements of 1% relative and 10% absolute radiometric calibration.
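The read-noise and SNR figures quoted above can be related through the standard CCD noise model, in which shot noise and read noise add in quadrature. This is a generic sketch, not the LROC calibration code, and it ignores dark current unless supplied.

```python
import math

def snr(signal_e, read_noise_e, dark_e=0.0):
    """Signal-to-noise ratio for a CCD pixel, all quantities in electrons:
    SNR = S / sqrt(S + D + r^2), combining shot noise on the signal S,
    shot noise on the dark charge D, and read noise r in quadrature."""
    return signal_e / math.sqrt(signal_e + dark_e + read_noise_e ** 2)
```

At high signal levels the shot-noise term dominates and SNR grows roughly as the square root of the signal, which is why a high-sun NAC scene can exceed an SNR of 200 while a terminator scene sits near 46.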
1986-01-25
P-29506BW Range: 1.12 million kilometers (690,000 miles) This high-resolution image of the epsilon ring of Uranus is a clear-filter picture from Voyager's narrow-angle camera and has a resolution of about 10 km (6 mi). The epsilon ring, approx. 100 km (60 mi) wide at this location, clearly shows a structural variation. Visible here are a broad, bright outer component about 40 km (25 mi) wide; a darker, middle region of comparable width; and a narrow, bright inner strip about 15 km (9 mi) wide. The epsilon-ring structure seen by Voyager is similar to that observed from the ground with stellar-occultation techniques. This frame represents the first Voyager image that resolves these features within the epsilon ring. The occasional fuzzy splotches on the outer and inner parts of the ring are artifacts left by the removal of reseau marks (used for making measurements on the image).
Uncertainty of Videogrammetric Techniques used for Aerodynamic Testing
NASA Technical Reports Server (NTRS)
Burner, A. W.; Liu, Tianshu; DeLoach, Richard
2002-01-01
The uncertainty of videogrammetric techniques used for the measurement of static aeroelastic wind tunnel model deformation and wind tunnel model pitch angle is discussed. Sensitivity analyses and geometrical considerations of uncertainty are augmented by analyses of experimental data in which videogrammetric angle measurements were taken simultaneously with precision servo accelerometers corrected for dynamics. An analysis of variance (ANOVA) to examine error dependence on angle of attack, sensor used (inertial or optical), and on tunnel state variables such as Mach number is presented. Experimental comparisons with a high-accuracy indexing table are presented. Small roll angles are found to introduce a zero-shift in the measured angles. It is shown experimentally that, provided the proper constraints necessary for a solution are met, a single-camera solution can be comparable to a 2-camera intersection result. The relative immunity of optical techniques to dynamics is illustrated.
NASA Technical Reports Server (NTRS)
Mollberg, Bernard H.; Schardt, Bruton B.
1988-01-01
The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.
NASA Astrophysics Data System (ADS)
Griffiths, Andrew; Coates, Andrew; Muller, Jan-Peter; Jaumann, Ralf; Josset, Jean-Luc; Paar, Gerhard; Barnes, David
2010-05-01
The ExoMars mission has evolved into a joint European-US mission to deliver a trace gas orbiter and a pair of rovers to Mars in 2016 and 2018 respectively. The European rover will carry the Pasteur exobiology payload including the 1.56 kg Panoramic Camera. PanCam will provide multispectral stereo images from Wide-Angle Cameras (WACs) with a 34 deg horizontal field of view (580 microrad/pixel), and colour monoscopic "zoom" images from a High Resolution Camera (HRC) with a 5 deg horizontal field of view (83 microrad/pixel). The stereo Wide Angle Cameras (WAC) are based on Beagle 2 Stereo Camera System heritage [1]. Integrated with the WACs and HRC into the PanCam optical bench (which helps the instrument meet its planetary protection requirements) is the PanCam Interface Unit (PIU), which provides image storage, a Spacewire interface to the rover and DC-DC power conversion. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission [2] as well as providing multispectral geological imaging, colour and stereo panoramic images and solar images for water vapour abundance and dust optical depth measurements. The High Resolution Camera (HRC) can be used for high-resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls. Additionally HRC will be used to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. In short, PanCam provides the overview and context for the ExoMars experiment locations, required to enable the exobiology aims of the mission. In addition to these baseline capabilities, further enhancements are possible to increase PanCam's effectiveness for astrobiology and planetary exploration: 1. Rover Inspection Mirror (RIM) 2. Organics Detection by Fluorescence Excitation (ODFE) LEDs [3-6] 3. 
UVIS broadband UV Flux and Opacity Determination (UVFOD) photodiode This paper will discuss the scientific objectives and resource impacts of these enhancements. References: 1. Griffiths, A.D., Coates, A.J., Josset, J.-L., Paar, G., Hofmann, B., Pullan, D., Ruffer, P., Sims, M.R., Pillinger, C.T., The Beagle 2 stereo camera system, Planet. Space Sci. 53, 1466-1488, 2005. 2. Paar, G., Oberst, J., Barnes, D.P., Griffiths, A.D., Jaumann, R., Coates, A.J., Muller, J.P., Gao, Y., Li, R., 2007, Requirements and Solutions for ExoMars Rover Panoramic Camera 3d Vision Processing, abstract submitted to EGU meeting, Vienna, 2007. 3. Storrie-Lombardi, M.C., Hug, W.F., McDonald, G.D., Tsapin, A.I., and Nealson, K.H. 2001. Hollow cathode ion lasers for deep ultraviolet Raman spectroscopy and fluorescence imaging. Rev. Sci. Ins., 72 (12), 4452-4459. 4. Nealson, K.H., Tsapin, A., and Storrie-Lombardi, M. 2002. Searching for life in the universe: unconventional methods for an unconventional problem. International Microbiology, 5, 223-230. 5. Mormile, M.R. and Storrie-Lombardi, M.C. 2005. The use of ultraviolet excitation of native fluorescence for identifying biomarkers in halite crystals. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 246-253. 6. Storrie-Lombardi, M.C. 2005. Post-Bayesian strategies to optimize astrobiology instrument suites: lessons from Antarctica and the Pilbara. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 288-301.
Calibration, Projection, and Final Image Products of MESSENGER's Mercury Dual Imaging System
NASA Astrophysics Data System (ADS)
Denevi, Brett W.; Chabot, Nancy L.; Murchie, Scott L.; Becker, Kris J.; Blewett, David T.; Domingue, Deborah L.; Ernst, Carolyn M.; Hash, Christopher D.; Hawkins, S. Edward; Keller, Mary R.; Laslo, Nori R.; Nair, Hari; Robinson, Mark S.; Seelos, Frank P.; Stephens, Grant K.; Turner, F. Scott; Solomon, Sean C.
2018-02-01
We present an overview of the operations, calibration, geodetic control, photometric standardization, and processing of images from the Mercury Dual Imaging System (MDIS) acquired during the orbital phase of the MESSENGER spacecraft's mission at Mercury (18 March 2011-30 April 2015). We also provide a summary of all of the MDIS products that are available in NASA's Planetary Data System (PDS). Updates to the radiometric calibration included slight modification of the frame-transfer smear correction, updates to the flat fields of some wide-angle camera (WAC) filters, a new model for the temperature dependence of narrow-angle camera (NAC) and WAC sensitivity, and an empirical correction for temporal changes in WAC responsivity. Further, efforts to characterize scattered light in the WAC system are described, along with a mosaic-dependent correction for scattered light that was derived for two regional mosaics. Updates to the geometric calibration focused on the focal lengths and distortions of the NAC and all WAC filters, NAC-WAC alignment, and calibration of the MDIS pivot angle and base. Additionally, two control networks were derived so that the majority of MDIS images can be co-registered with sub-pixel accuracy; the larger of the two control networks was also used to create a global digital elevation model. Finally, we describe the image processing and photometric standardization parameters used in the creation of the MDIS advanced products in the PDS, which include seven large-scale mosaics, numerous targeted local mosaics, and a set of digital elevation models ranging in scale from local to global.
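The calibration steps described above (dark subtraction, flat fielding, temperature-dependent responsivity) follow the general pattern of CCD radiometric calibration. A minimal sketch of that pattern; the function name, linear responsivity model, and coefficient values are hypothetical illustrations, not MDIS's actual calibration equations:

```python
import numpy as np

def calibrate(dn, exposure_s, dark_dn, flat, t_ccd_c, a=0.0, b=-0.002):
    """Convert raw DN to relative radiance: subtract dark level,
    normalize by exposure time and flat field, then divide by a linear
    temperature-dependent responsivity r(T) = 1 + a + b*T
    (coefficients here are hypothetical)."""
    resp = 1.0 + a + b * t_ccd_c
    return (dn - dark_dn) / exposure_s / flat / resp

frame = np.full((4, 4), 1200.0)                       # raw DN
rad = calibrate(frame, 0.01, 200.0, np.ones((4, 4)), t_ccd_c=10.0)
```

A warmer detector with lower responsivity (b < 0) yields a larger corrected radiance for the same DN, which is the sense of the temporal/thermal responsivity corrections described in the abstract.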
NASA MISR Studies Smoke Plumes from California Sand Fire
2016-08-02
39,000 acres (60 square miles, or 160 square kilometers). Thousands of residents were evacuated, and the fire claimed the life of one person. The Multi-angle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite passed over the region on July 23 around 11:50 a.m. PDT. At left is an image acquired by MISR's 60-degree forward-viewing camera. The oblique view angle makes the smoke more apparent than it would be in a more conventional vertical view. This cropped image is about 185 miles (300 kilometers) wide. Smoke from the Sand Fire is visible on the right-hand side of the image. Stereoscopic analysis of MISR's multiple camera angles is used to compute the height of the smoke plume from the Sand Fire. In the right-hand image, these heights are superimposed on the underlying image. The color scale shows that the plume extends up to about 4 miles (6 kilometers) above its source in Santa Clarita, but rapidly diminishes in height as winds push it to the southwest. The data compare well with a pilot report issued at Los Angeles International Airport on the evening of July 22, which reported smoke at 15,000-18,000 feet altitude (4.5 to 5.5 kilometers). Air quality warnings were issued for the San Fernando Valley and the western portion of Los Angeles due to this low-hanging smoke. However, data from air quality monitoring instruments seem to indicate that the smoke did not actually reach the ground. These data were captured during Terra orbit 88284. http://photojournal.jpl.nasa.gov/catalog/PIA20724
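The stereoscopic plume-height analysis described above rests on angular parallax between MISR's camera views. A simplified flat-geometry sketch (it ignores wind advection, which MISR's operational retrieval solves for jointly with height):

```python
import math

def plume_height(parallax_km, view_angle1_deg, view_angle2_deg):
    """Feature height from its apparent along-track displacement
    (parallax) between two view angles: h = d / |tan a1 - tan a2|."""
    t1 = math.tan(math.radians(view_angle1_deg))
    t2 = math.tan(math.radians(view_angle2_deg))
    return parallax_km / abs(t1 - t2)

# a feature displaced ~10.4 km between the 60-degree forward and nadir views
print(round(plume_height(10.4, 60.0, 0.0), 1))  # → 6.0
```

With the hypothetical 10.4 km displacement above, the 60-degree forward camera implies a height of about 6 km, the same order as the plume top reported for the Sand Fire.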
Mars Color Imager (MARCI) on the Mars Climate Orbiter
Malin, M.C.; Bell, J.F.; Calvin, W.; Clancy, R.T.; Haberle, R.M.; James, P.B.; Lee, S.W.; Thomas, P.C.; Caplinger, M.A.
2001-01-01
The Mars Color Imager, or MARCI, experiment on the Mars Climate Orbiter (MCO) consists of two cameras with unique optics and identical focal plane assemblies (FPAs), Data Acquisition System (DAS) electronics, and power supplies. Each camera is characterized by small physical size and mass (~6 × 6 × 12 cm, including baffle; <500 g), low power requirements (<2.5 W, including power supply losses), and high science performance (1000 × 1000 pixel, low noise). The Wide Angle (WA) camera will have the capability to map Mars in five visible and two ultraviolet spectral bands at a resolution of better than 8 km/pixel under the worst case downlink data rate. Under better downlink conditions the WA will provide kilometer-scale global maps of atmospheric phenomena such as clouds, hazes, dust storms, and the polar hood. Limb observations will provide additional detail on atmospheric structure at 1/3 scale-height resolution. The Medium Angle (MA) camera is designed to study selected areas of Mars at regional scale. From 400 km altitude its 6° FOV, which covers ~40 km at 40 m/pixel, will permit all locations on the planet except the poles to be accessible for image acquisitions every two mapping cycles (roughly 52 sols). Eight spectral channels between 425 and 1000 nm provide the ability to discriminate both atmospheric and surface features on the basis of composition. The primary science objectives of MARCI are to (1) observe Martian atmospheric processes at synoptic scales and mesoscales, (2) study details of the interaction of the atmosphere with the surface at a variety of scales in both space and time, and (3) examine surface features characteristic of the evolution of the Martian climate over time. MARCI will directly address two of the three high-level goals of the Mars Surveyor Program: Climate and Resources. Life, the third goal, will be addressed indirectly through the environmental factors associated with the other two goals. 
Copyright 2001 by the American Geophysical Union.
Expansion of the visual angle of a car rear-view image via an image mosaic algorithm
NASA Astrophysics Data System (ADS)
Wu, Zhuangwen; Zhu, Liangrong; Sun, Xincheng
2015-05-01
The rear-view image system is one of the active safety devices in cars and is widely applied in all types of vehicles and traffic safety areas. However, previous studies by both domestic and foreign researchers were based on a single image-capture device used while reversing, so a blind area still remained for drivers. Even when multiple cameras were used to expand the visual angle of the car's rear-view image, the blind area remained because the different source images were not mosaicked together. To acquire an expanded visual angle of a car rear-view image, two charge-coupled device cameras with optical axes angled at 30 deg were mounted below the left and right fenders of a car, and rear-view heterologous images were captured in three lighting conditions: sunny outdoors, cloudy outdoors, and an underground garage. These rear-view heterologous images were then rapidly registered using the scale-invariant feature transform (SIFT) algorithm. Combined with the random sample consensus (RANSAC) algorithm, the two heterologous images were finally mosaicked using the linear weighted gradated in-and-out fusion algorithm, and a seamless, visual-angle-expanded rear-view image was acquired. The four-index test results showed that the algorithms can mosaic rear-view images well in the underground garage condition, where the average rate of correct matching was the lowest among the three conditions. The rear-view image mosaic algorithm presented had the best information preservation, the shortest computation time and the most complete preservation of image detail features compared to the mean value method (MVM) and segmental fusion method (SFM); it also performed better in real time and provided more comprehensive image details than MVM and SFM. In addition, it had the most complete image preservation from the source images among the three algorithms. 
The method introduced by this paper provided the basis for researching the expansion of the visual angle of a car rear-view image in all-weather conditions.
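The "linear weighted gradated in-and-out" fusion step can be sketched as a weighted average whose weights ramp linearly across the overlap between the two registered images. A generic sketch of that blending idea (registration via SIFT/RANSAC is assumed already done; the function and its fixed-overlap geometry are illustrative, not the paper's implementation):

```python
import numpy as np

def gradated_blend(left, right, overlap):
    """Mosaic two aligned grayscale strips whose last/first `overlap`
    columns coincide, ramping the blend weight linearly from all-left
    to all-right across the overlap (gradated in-and-out fusion)."""
    h, wl = left.shape
    wr = right.shape[1]
    out = np.zeros((h, wl + wr - overlap), dtype=float)
    out[:, :wl - overlap] = left[:, :wl - overlap]
    out[:, wl:] = right[:, overlap:]
    alpha = np.linspace(0.0, 1.0, overlap)   # 0 → all left, 1 → all right
    out[:, wl - overlap:wl] = (1 - alpha) * left[:, wl - overlap:] \
        + alpha * right[:, :overlap]
    return out

a = np.full((2, 5), 100.0)
b = np.full((2, 5), 200.0)
m = gradated_blend(a, b, overlap=3)
```

The midpoint of the overlap averages the two sources equally, which is what suppresses a visible seam between heterologous images.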
2014-07-07
NASA's Cassini spacecraft captures three magnificent sights at once: Saturn's north polar vortex and hexagon along with its expansive rings. The hexagon, which is wider than two Earths, owes its appearance to the jet stream that forms its perimeter. The jet stream forms a six-lobed, stationary wave which wraps around the north polar regions at a latitude of roughly 77 degrees North. This view looks toward the sunlit side of the rings from about 37 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on April 2, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 1.4 million miles (2.2 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 43 degrees. Image scale is 81 miles (131 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18274
1989-08-27
P-34715 Range: 900,000 kilometers (560,000 miles) This post-encounter view of the south pole of Neptune was obtained after Voyager 2 passed the planet and sped away on a southward-trending trajectory. Voyager's wide-angle camera saw features as small as 120 km (75 mi) in diameter. The angle between the Sun, the center of the planet, and the spacecraft is 137°, so the entire south polar region is illuminated. Near the bright limb, clouds located at 71 and 42 degrees south latitude rotate eastward onto Neptune's night side. A bright cloud (bottom center) lies within 1.5° of Neptune's south pole, which has been determined from the orbits of the planet's rings and satellites. The feature is believed to be created by an organized circulation around the pole that forms a clear 'eye' at the center of the system.
2005-08-05
During its close flyby of Saturn's moon Mimas on Aug. 2, 2005, Cassini caught a glimpse of Mimas against the broad expanse of Saturn's rings. The Keeler Gap in the outer A ring, in which Cassini spied a never-before-seen small moon (see PIA06237), is at the upper right. The ancient, almost asteroid-like surface of Mimas is evident in its crater-upon-crater appearance. Even the material which has slumped down into the bottom of some of its craters bears the marks of later impacts. This image was taken through the clear filter of the Cassini spacecraft narrow-angle camera at a distance of 68,000 kilometers (42,500 miles) from Mimas and very near closest approach. The smallest features seen on the moon are about 400 meters wide (440 yards); the Sun-Mimas-Cassini angle is 44 degrees. http://photojournal.jpl.nasa.gov/catalog/PIA06412
What convention is used for the illumination and view angles?
Atmospheric Science Data Center
2014-12-08
... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...
NASA Astrophysics Data System (ADS)
Li, Ke; Chen, Jianping; Sofia, Giulia; Tarolli, Paolo
2014-05-01
Moon surface features have great significance in understanding and reconstructing the lunar geological evolution. Linear structures like rilles and ridges are closely related to internally forced tectonic movement. The craters widely distributed on the Moon are also key research targets for externally forced geological evolution. The extreme rarity of samples and the difficulty of field work make remote sensing the most important approach for planetary studies. New and advanced lunar probes launched by China, the U.S., Japan and India nowadays provide a lot of high-quality data, especially in the form of high-resolution Digital Terrain Models (DTMs), bringing new opportunities and challenges for feature extraction on the Moon. The aim of this study is to recognize and extract lunar features using geomorphometric analysis based on multi-scale parameters and multi-resolution DTMs. The considered digital datasets include CE1-LAM (Chang'E One, Laser AltiMeter) data with a resolution of 500 m/pix, LRO-WAC (Lunar Reconnaissance Orbiter, Wide Angle Camera) data with a resolution of 100 m/pix, LRO-LOLA (Lunar Reconnaissance Orbiter, Lunar Orbiter Laser Altimeter) data with a resolution of 60 m/pix, and LRO-NAC (Lunar Reconnaissance Orbiter, Narrow Angle Camera) data with a resolution of 2-5 m/pix. We considered surface derivatives to recognize the linear structures, including rilles and ridges. Different window scales and thresholds are considered for feature extraction. We also calculated a roughness index to identify erosion/deposit areas within craters. The results underline the suitability of the adopted methods for feature recognition on the Moon's surface. The roughness index is found to be a useful tool to distinguish new craters, with higher roughness, from old craters, which present a smoother, less rough surface.
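A common definition of a roughness index in geomorphometry is the local standard deviation of elevation in a moving window; the paper's exact parameterization may differ, so the sketch below is a generic illustration on a toy DTM:

```python
import numpy as np

def roughness(dtm, win=3):
    """Local elevation standard deviation in a win x win moving window,
    a simple DTM roughness index (output is edge-trimmed)."""
    h, w = dtm.shape
    r = np.zeros((h - win + 1, w - win + 1))
    for i in range(r.shape[0]):
        for j in range(r.shape[1]):
            r[i, j] = dtm[i:i + win, j:j + win].std()
    return r

smooth = np.zeros((5, 5))                      # old, degraded surface
fresh = np.zeros((5, 5)); fresh[2, 2] = 10.0   # sharp relief, e.g. a young crater rim
print(roughness(smooth).max(), roughness(fresh).max() > 0)
```

A degraded (old) crater yields low local variance everywhere, while fresh relief produces high roughness values near its rim, which is the discrimination the abstract reports.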
The PanCam Instrument for the ExoMars Rover
Coates, A.J.; Jaumann, R.; Griffiths, A.D.; Leff, C.E.; Schmitz, N.; Josset, J.-L.; Paar, G.; Gunn, M.; Hauber, E.; Cousins, C.R.; Cross, R.E.; Grindrod, P.; Bridges, J.C.; Balme, M.; Gupta, S.; Crawford, I.A.; Irwin, P.; Stabbins, R.; Tirsch, D.; Vago, J.L.; Theodorou, T.; Caballo-Perucha, M.; Osinski, G.R.
2017-01-01
Abstract The scientific objectives of the ExoMars rover are designed to answer several key questions in the search for life on Mars. In particular, the unique subsurface drill will address some of these, such as the possible existence and stability of subsurface organics. PanCam will establish the surface geological and morphological context for the mission, working in collaboration with other context instruments. Here, we describe the PanCam scientific objectives in geology, atmospheric science, and 3-D vision. We discuss the design of PanCam, which includes a stereo pair of Wide Angle Cameras (WACs), each of which has an 11-position filter wheel and a High Resolution Camera (HRC) for high-resolution investigations of rock texture at a distance. The cameras and electronics are housed in an optical bench that provides the mechanical interface to the rover mast and a planetary protection barrier. The electronic interface is via the PanCam Interface Unit (PIU), and power conditioning is via a DC-DC converter. PanCam also includes a calibration target mounted on the rover deck for radiometric calibration, fiducial markers for geometric calibration, and a rover inspection mirror. Key Words: Mars—ExoMars—Instrumentation—Geology—Atmosphere—Exobiology—Context. Astrobiology 17, 511–541.
3-D Flow Visualization with a Light-field Camera
NASA Astrophysics Data System (ADS)
Thurow, B.
2012-12-01
Light-field cameras have received attention recently due to their ability to acquire photographs that can be computationally refocused after they have been acquired. In this work, we describe the development of a light-field camera system for 3D visualization of turbulent flows. The camera developed in our lab, also known as a plenoptic camera, uses an array of microlenses mounted next to an image sensor to resolve both the position and angle of light rays incident upon the camera. For flow visualization, the flow field is seeded with small particles that follow the fluid's motion and are imaged using the camera and a pulsed light source. The tomographic MART algorithm is then applied to the light-field data in order to reconstruct a 3D volume of the instantaneous particle field. 3D, 3C velocity vectors are then determined from a pair of 3D particle fields using conventional cross-correlation algorithms. As an illustration of the concept, 3D/3C velocity measurements of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. Future experiments are planned to use the camera to study the influence of wall permeability on the 3-D structure of the turbulent boundary layer. [Figure captions: a schematic illustrating the concept of a plenoptic camera, in which each pixel records both the position and angle of light rays entering the camera, information that can be used to computationally refocus an image after acquisition; an instantaneous 3D velocity field of a turbulent boundary layer determined from light-field data captured by a plenoptic camera.]
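The MART (multiplicative algebraic reconstruction technique) update multiplies each voxel by the ratio of the measured to the projected signal, raised to a weighting-dependent power. A toy sketch on a small dense linear system, assuming a simple relaxation scheme; the actual tomographic-PIV implementation operates on much larger, sparse camera-to-voxel geometries:

```python
import numpy as np

def mart(A, y, iters=50, mu=1.0):
    """Multiplicative ART: for each measurement row i, update
    x_j <- x_j * (y_i / (A_i . x))**(mu * A_ij); x stays positive."""
    x = np.ones(A.shape[1])
    for _ in range(iters):
        for i in range(A.shape[0]):
            proj = A[i] @ x
            if proj > 0:
                x *= (y[i] / proj) ** (mu * A[i])
    return x

# toy "projection" geometry with a unique positive solution
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
x_hat = mart(A, A @ x_true)
```

Because the update is multiplicative, voxels initialized positive remain positive, which suits particle intensity fields and is one reason MART is favored for tomographic PIV.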
Augmented reality glass-free three-dimensional display with the stereo camera
NASA Astrophysics Data System (ADS)
Pang, Bo; Sang, Xinzhu; Chen, Duo; Xing, Shujun; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu
2017-10-01
An improved method for augmented reality (AR) glass-free three-dimensional (3D) display, based on a stereo camera and a lenticular lens array for presenting parallax content from different angles, is proposed. Compared with previous implementations of AR techniques based on a two-dimensional (2D) panel display with only one viewpoint, the proposed method can realize glass-free 3D display of virtual objects and the real scene with 32 virtual viewpoints. Accordingly, viewers can get abundant 3D stereo information from different viewing angles based on binocular parallax. Experimental results show that this improved method based on a stereo camera can realize AR glass-free 3D display, and both the virtual objects and the real scene exhibit realistic and pronounced stereo performance.
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-08-31
Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground-truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground-truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers should be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
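The lens trade-off the abstract describes, between viewing angle and depth-of-field, can be made concrete with the standard thin-lens DOF formulas. This is generic optics, not the paper's specific lens; the example focal length, f-number, circle of confusion, and subject distance are hypothetical:

```python
def dof_limits(f_mm, n, coc_mm, subject_mm):
    """Near and far limits of acceptable focus from the standard
    hyperfocal-distance formulas: H = f**2 / (N * c) + f."""
    h = f_mm**2 / (n * coc_mm) + f_mm
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    far = subject_mm * (h - f_mm) / (h - subject_mm) if subject_mm < h else float('inf')
    return near, far

# e.g. a 50 mm lens at f/2, 0.03 mm circle of confusion, user at 700 mm
near, far = dof_limits(50.0, 2.0, 0.03, 700.0)
```

At these hypothetical settings the sharp zone is only a couple of centimeters deep, which illustrates why head movement forces a careful choice of lens and working distance in a gaze tracking camera.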
Power estimation of martial arts movement using 3D motion capture camera
NASA Astrophysics Data System (ADS)
Azraai, Nur Zaidi; Awang Soh, Ahmad Afiq Sabqi; Mat Jafri, Mohd Zubir
2017-06-01
Motion capture (MOCAP) cameras have been widely used in areas such as biomechanics, physiology, animation, and the arts. This project approaches the problem through classical mechanics and extends MOCAP applications to sports. Most researchers use a force plate, but a force plate can only measure the force of impact; here we are keen to observe the kinematics of the movement. Martial arts is a sport that uses more than one part of the human body. For this project, the martial art 'Silat' was chosen because of its wide practice in Malaysia. Two performers were selected, one experienced in 'Silat' practice and one with no experience at all, so that the energy and force generated by the performers could be compared. Each performer threw punches with the same posture; two types of punching moves were selected for this project. Before measurement started, a calibration was performed using a T-stick fitted with markers, so that the software knew the area covered by the cameras and measurement error was reduced. A punching bag with a mass of 60 kg was hung on an iron bar as a target and used to determine the impact force of a performer's punch. The punching bag was also fitted with an optical marker so that its movement after impact could be observed. Eight cameras were used, placed two on each wall at different angles, in a rectangular room of 270 ft2, with the cameras covering approximately 50 ft2. Only a small area was covered so that less noise would be detected, making the measurement more accurate. Markers were pasted along the limb and over the entire hand that we wanted to observe and measure. The passive markers used in this project reflect the infrared light generated by the cameras back to the camera sensors, so that the marker positions can be detected and shown in the software. 
Multiple cameras are used to increase the precision and improve the accuracy of marker localization. The performers' movements were recorded and analyzed using the Cortex motion-analysis software, from which the velocity and acceleration of each movement were measured. With a classical mechanics approach we estimated the power and force of impact, and the results show that the experienced performer produces higher power and a higher impact force than the inexperienced performer.
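The classical-mechanics estimates described above can be sketched with the impulse-momentum and kinetic-energy relations. The effective arm mass, fist speed, and contact time below are hypothetical illustrations, not the paper's measured values:

```python
def impact_force(mass_kg, v_before, v_after, contact_s):
    """Average impact force from the impulse-momentum theorem:
    F = m * |dv| / dt."""
    return mass_kg * abs(v_before - v_after) / contact_s

def average_power(mass_kg, v_peak, strike_s):
    """Average power while accelerating the striking limb from rest
    to v_peak: P = (0.5 * m * v**2) / t."""
    return 0.5 * mass_kg * v_peak**2 / strike_s

# hypothetical numbers: 3.5 kg effective arm mass, 7 m/s peak fist speed
f = impact_force(3.5, 7.0, 1.0, 0.02)   # N
p = average_power(3.5, 7.0, 0.15)       # W
```

Marker trajectories supply the velocities (by differentiating position), so higher peak fist speed from the experienced performer translates directly into higher estimated power and impact force.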
A state observer for using a slow camera as a sensor for fast control applications
NASA Astrophysics Data System (ADS)
Gahleitner, Reinhard; Schagerl, Martin
2013-03-01
This contribution concerns a problem that often arises in vision-based control, when a camera is used as a sensor for fast control applications, or more precisely, when the sample rate of the control loop is higher than the frame rate of the camera. In control applications for mechanical axes, e.g. in robotics or automated production, a camera and some image processing can be used as a sensor to detect positions or angles. The sample time in these applications is typically in the range of a few milliseconds or less, which demands the use of a camera with a high frame rate of up to 1000 fps. The presented solution is a special state observer that can work with a slower, and therefore cheaper, camera to estimate the state variables at the higher sample rate of the control loop. To simplify the image processing for the determination of positions or angles and to make it more robust, some LED markers are applied to the plant. Simulation and experimental results show that the concept can be used even if the plant is unstable, like the inverted pendulum.
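The multirate observer idea can be sketched as predict-then-correct: the model propagates the state estimate at the fast control rate, and a correction is applied only when a camera frame arrives. A minimal Luenberger-style sketch for a constant-velocity target (a generic illustration, not the paper's pendulum observer; the gains are hypothetical):

```python
import numpy as np

def run_observer(positions_true, dt=0.001, frame_every=10, l_gain=(0.4, 8.0)):
    """Estimate [position, velocity] every dt from position-only camera
    measurements that arrive only every `frame_every` control steps."""
    x = np.zeros(2)                                  # [pos, vel] estimate
    est = []
    for k, p in enumerate(positions_true):
        x = np.array([x[0] + dt * x[1], x[1]])       # model prediction (fast rate)
        if k % frame_every == 0:                     # slow camera frame available
            err = p - x[0]
            x += np.array([l_gain[0] * err, l_gain[1] * err])
        est.append(x[0])
    return np.array(est)

t = np.arange(0, 2.0, 0.001)          # 1 kHz control loop, 100 Hz "camera"
true_pos = 0.5 * t                    # target moving at 0.5 m/s
est = run_observer(true_pos)
```

Between frames the velocity state carries the estimate forward, which is exactly what lets a 100 Hz camera serve a 1 kHz loop once the observer has converged.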
Asteroid (21) Lutetia: Disk-resolved photometric analysis of Baetica region
NASA Astrophysics Data System (ADS)
Hasselmann, P. H.; Barucci, M. A.; Fornasier, S.; Leyrat, C.; Carvano, J. M.; Lazzaro, D.; Sierks, H.
2016-03-01
(21) Lutetia was visited by the Rosetta mission in July 2010 and observed at phase angles ranging from 0.15° to 156.8°. The Baetica region, located at the north pole, was extensively observed by the OSIRIS camera system. Baetica encompasses the North Pole Crater Cluster (NPCC), a cluster of superposed craters that shows signs of variegation in the small-phase-angle images. To study the region, we used 187 images distributed over 14 filters recorded by the NAC (Narrow Angle Camera) and WAC (Wide Angle Camera) of the OSIRIS system on-board Rosetta during the fly-by. We then photometrically modeled the region using the Minnaert disk function and the Akimov phase function to obtain resolved spectral slope maps at phase angles of 5° and 20°. We observed a dichotomy between Gallicum and Danuvius-Sarnus Labes in the NPCC, but no significant phase reddening (-0.04 ± 0.045% μm-1 deg-1). In the next step, we applied the Hapke (Hapke, B. [2008]. Icarus 195, 918-926; Hapke, B. [2012]. Theory of Reflectance and Emittance Spectroscopy, second ed. Cambridge University Press) model to the NAC F82+F22 (649.2 nm), WAC F13 (375 nm) and WAC F17 (631.6 nm) filters, and we obtained normal albedo maps, as well as Hapke parameter maps for NAC F82+F22. On Baetica, at 649.2 nm, the geometric albedo is 0.205 ± 0.005, the average single-scattering albedo is 0.181 ± 0.005, the average asymmetry factor is -0.342 ± 0.003, the average shadow-hiding opposition effect amplitude and width are 0.824 ± 0.002 and 0.040 ± 0.0007, the average roughness slope is 11.45° ± 3°, and the average porosity is 0.85 ± 0.002. We are unable to confirm the presence of a coherent-backscattering mechanism. In the NPCC, the normal albedo variegation among the crater walls reaches 8% brighter for Gallicum Labes and 2% fainter for Danuvius Labes.
The Hapke parameter maps also show a dichotomy in the opposition effect coefficients, single-scattering albedo, and asymmetry factor that may be attributed to the maturation degree of the regolith or to compositional variation. In addition, we compared the Hapke (Hapke, B. [2008]. Icarus 195, 918-926; Hapke, B. [2012]. Theory of Reflectance and Emittance Spectroscopy, second ed. Cambridge University Press) and Hapke (Hapke, B. [1993]. Theory of Reflectance and Emittance Spectroscopy) parameters with laboratory samples and with other small Solar System bodies visited by space missions.
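The Minnaert disk function used in the modeling above has a standard closed form: reflectance r = A · μ0^k · μ^(k-1), where μ0 and μ are the cosines of the incidence and emission angles, A is the Minnaert albedo, and k is the limb-darkening exponent (k = 1 recovers a Lambert-like surface). The function and parameter names below are this textbook form, not the authors' specific implementation.

```python
import numpy as np

def minnaert(mu0, mu, albedo, k):
    """Minnaert disk function r = A * mu0**k * mu**(k - 1),
    with mu0, mu the cosines of incidence and emission angles."""
    return albedo * mu0**k * mu**(k - 1.0)
```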
NASA Technical Reports Server (NTRS)
Glaeser, P.; Haase, I.; Oberst, J.; Neumann, G. A.
2013-01-01
We have derived algorithms and techniques to precisely co-register laser altimeter profiles with gridded Digital Terrain Models (DTMs), typically derived from stereo images. The algorithm consists of an initial grid search followed by a least-squares matching and yields the translation parameters at sub-pixel level needed to align the DTM and the laser profiles in 3D space. This software tool was primarily developed and tested for co-registration of laser profiles from the Lunar Orbiter Laser Altimeter (LOLA) with DTMs derived from the Lunar Reconnaissance Orbiter (LRO) Narrow Angle Camera (NAC) stereo images. Data sets can be co-registered with positional accuracy between 0.13 m and several meters depending on the pixel resolution and the number of laser shots, where rough surfaces typically result in more accurate co-registrations. Residual heights of the data sets are as small as 0.18 m. The software can be used to identify instrument misalignment, orbit errors, pointing jitter, or problems associated with the reference frames being used. Also, assessments of DTM effective resolutions can be obtained. From the correct position between the two data sets, comparisons of surface morphology and roughness can be made at laser footprint- or DTM pixel-level. The precise co-registration allows us to carry out joint analysis of the data sets and ultimately to derive merged high-quality data products. Examples of matching other planetary data sets, like LOLA with LRO Wide Angle Camera (WAC) DTMs or Mars Orbiter Laser Altimeter (MOLA) with stereo models from the High Resolution Stereo Camera (HRSC), as well as Mercury Laser Altimeter (MLA) with Mercury Dual Imaging System (MDIS), are shown to demonstrate the broad science applications of the software tool.
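A minimal sketch of the initial grid-search stage (before the least-squares refinement mentioned above) might look like the following. It assumes a 1-pixel DTM grid spacing, nearest-neighbour sampling instead of interpolation, and that the vertical offset dz is absorbed as the mean residual; none of these specifics come from the abstract, and the real tool works at sub-pixel level.

```python
import numpy as np

def coregister(dtm, points, shifts):
    """Grid-search the (dx, dy) translation that best aligns laser shots
    (x, y, h) with a gridded DTM, minimizing the RMS height residual.
    dtm[i, j] is the height at row i (y) and column j (x)."""
    best = (None, np.inf)
    for dx in shifts:
        for dy in shifts:
            xs = points[:, 0] + dx
            ys = points[:, 1] + dy
            # nearest-neighbour sampling for brevity (bilinear in practice)
            j = np.clip(np.round(xs).astype(int), 0, dtm.shape[1] - 1)
            i = np.clip(np.round(ys).astype(int), 0, dtm.shape[0] - 1)
            res = points[:, 2] - dtm[i, j]
            dz = res.mean()                     # vertical offset absorbed
            rms = np.sqrt(np.mean((res - dz) ** 2))
            if rms < best[1]:
                best = ((dx, dy, dz), rms)
    return best
```

The winning (dx, dy, dz) from the coarse search would then seed the least-squares matching; note that on a perfectly planar surface the horizontal shift is unobservable, which is consistent with the abstract's remark that rough surfaces co-register more accurately.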
24/7 security system: 60-FPS color EMCCD camera with integral human recognition
NASA Astrophysics Data System (ADS)
Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.
2007-04-01
An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron-multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and in monochrome from quarter-moonlight down to overcast starlight illumination. Sixty-frame-per-second operation and progressive scanning minimize motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms to detect, localize, and track targets and to reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars, and trucks. Detection and tracking of targets too small for template-based detection is also achieved. For face and vehicle targets the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.
Quantifying seasonal variation of leaf area index using near-infrared digital camera in a rice paddy
NASA Astrophysics Data System (ADS)
Hwang, Y.; Ryu, Y.; Kim, J.
2017-12-01
Digital cameras have been widely used to quantify leaf area index (LAI). Numerous simple and automatic methods have been proposed to improve digital-camera-based LAI estimates. However, most studies in rice paddies have relied on arbitrary thresholds or complex radiative transfer models to produce binary images. Moreover, only a few studies have reported continuous, automatic observation of LAI over the season in a rice paddy. The objective of this study is to quantify seasonal variations of LAI using raw near-infrared (NIR) images coupled with a histogram shape-based algorithm in a rice paddy. As vegetation strongly reflects NIR light, we installed a NIR digital camera 1.8 m above the ground surface and acquired unsaturated raw-format images at one-hour intervals at solar zenith angles between 15° and 80° over the entire growing season in 2016 (from May to September). We applied a sub-pixel classification combined with a light-scattering correction method. Finally, to confirm the accuracy of the quantified LAI, we also conducted direct (destructive sampling) and indirect (LAI-2200) manual observations of LAI once every ten days on average. Preliminary results show that the NIR-derived LAI agreed well with in-situ observations, but divergence tended to appear once the rice canopy was fully developed. Continuous monitoring of LAI in rice paddies will help us better understand carbon and water fluxes and evaluate satellite-based LAI products.
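As a rough illustration of a histogram shape-based binarization followed by LAI retrieval, the sketch below uses Otsu's between-class-variance threshold and a Beer-Lambert gap-fraction inversion with an assumed extinction coefficient k. The authors' actual sub-pixel classification and light-scattering correction are more elaborate; this only shows the general shape of such a pipeline.

```python
import numpy as np

def otsu_threshold(img):
    """Histogram shape-based (Otsu) threshold: pick the gray level that
    maximizes the between-class variance of the two resulting classes."""
    hist, edges = np.histogram(img.ravel(), bins=256)
    p = hist / hist.sum()
    omega = np.cumsum(p)                         # class-0 probability
    mids = 0.5 * (edges[:-1] + edges[1:])
    mu = np.cumsum(p * mids)                     # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return mids[np.nanargmax(sigma_b)]

def lai_from_gap_fraction(binary_veg, k=0.5):
    """Invert Beer-Lambert, LAI = -ln(gap fraction) / k, with an assumed
    extinction coefficient k (canopy- and geometry-dependent)."""
    gap = 1.0 - binary_veg.mean()
    return -np.log(gap) / k
```

Thresholding the NIR image yields a vegetation/background mask; the fraction of background pixels is the gap fraction that the Beer-Lambert inversion turns into an LAI value.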
NASA Astrophysics Data System (ADS)
Zambon, F.; De Sanctis, M. C.; Capaccioni, F.; Filacchione, G.; Carli, C.; Ammanito, E.; Friggeri, A.
2011-10-01
During the first two MESSENGER flybys (14 January 2008 and 6 October 2008) the Mercury Dual Imaging System (MDIS) extended the coverage of the Mercury surface obtained by Mariner 10, and we now have images of about 90% of the surface [1]. MDIS is equipped with a Narrow Angle Camera (NAC) and a Wide Angle Camera (WAC). The NAC uses an off-axis reflective design with a 1.5° field of view (FOV) centered at 747 nm. The WAC has a refractive design with a 10.5° FOV and a 12-position filter wheel covering the 395-1040 nm spectral range [2]. The color images can be used to infer information on the surface composition, and classification methods are an interesting technique for multispectral image analysis that can be applied to the study of planetary surfaces. Classification methods are based on clustering algorithms and can be divided into two categories: unsupervised and supervised. Unsupervised classifiers do not require analyst feedback; the algorithm automatically organizes pixel values into classes. In the supervised method, instead, the analyst must choose "training areas" that define the pixel values of a given class [3]. Here we describe the classification into different compositional units of the region near the Rudaki Crater on Mercury.
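An unsupervised classifier of the kind described can be sketched with a small k-means loop over multispectral pixel vectors: pixels are repeatedly assigned to the nearest cluster center, and each center is moved to the mean of its members. This is a generic illustration, not the clustering algorithm used in the study; the deterministic evenly-spaced initialization is a simplification (k-means++ would be more robust).

```python
import numpy as np

def kmeans_classify(pixels, k, n_iter=20):
    """Unsupervised classification of an N x B array of multispectral
    pixel vectors (N pixels, B bands) into k spectral units."""
    # deterministic init: evenly spaced samples (a simplification)
    idx = np.linspace(0, len(pixels) - 1, k).astype(int)
    centers = pixels[idx].astype(float)
    for _ in range(n_iter):
        # distance of every pixel to every center, shape (N, k)
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```

A supervised variant would instead fix the class centers (or train a classifier) from analyst-chosen training areas and only perform the assignment step.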
67P/Churyumov-Gerasimenko: Activity between March and June 2014 as observed from Rosetta/OSIRIS
NASA Astrophysics Data System (ADS)
Tubiana, C.; Snodgrass, C.; Bertini, I.; Mottola, S.; Vincent, J.-B.; Lara, L.; Fornasier, S.; Knollenberg, J.; Thomas, N.; Fulle, M.; Agarwal, J.; Bodewits, D.; Ferri, F.; Güttler, C.; Gutierrez, P. J.; La Forgia, F.; Lowry, S.; Magrin, S.; Oklay, N.; Pajola, M.; Rodrigo, R.; Sierks, H.; A'Hearn, M. F.; Angrilli, F.; Barbieri, C.; Barucci, M. A.; Bertaux, J.-L.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; De Cecco, M.; Debei, S.; Groussin, O.; Hviid, S. F.; Ip, W.; Jorda, L.; Keller, H. U.; Koschny, D.; Kramm, R.; Kührt, E.; Küppers, M.; Lazzarin, M.; Lamy, P. L.; Lopez Moreno, J. J.; Marzari, F.; Michalik, H.; Naletto, G.; Rickman, H.; Sabau, L.; Wenzel, K.-P.
2015-01-01
Aims: 67P/Churyumov-Gerasimenko is the target comet of ESA's Rosetta mission. After commissioning at the end of March 2014, the Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS) onboard Rosetta started imaging the comet and its dust environment to investigate how they change and evolve while approaching the Sun. Methods: We focused our work on Narrow Angle Camera (NAC) orange images and Wide Angle Camera (WAC) red and visible-610 images acquired between 2014 March 23 and June 24, when the nucleus of 67P was unresolved and moving from approximately 4.3 AU to 3.8 AU inbound. During this period the 67P-Rosetta distance decreased from 5 million to 120 thousand km. Results: Through aperture photometry, we investigated how the comet brightness varies with heliocentric distance. 67P was likely already weakly active at the end of March 2014, with excess flux above that expected for the nucleus alone. The comet's brightness was mostly constant during the three months of approach observations, apart from one outburst that occurred around April 30 and a second increase in flux after June 20. The coma was resolved in the profiles from mid-April onward. Analysis of the coma morphology suggests that most of the activity comes from a source towards the celestial north pole of the comet, but the outburst that occurred on April 30 released material in a different direction.
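The aperture photometry used here follows a standard pattern: sum the counts in a circular aperture around the unresolved comet and subtract a sky level estimated in a surrounding annulus. A minimal sketch, with the aperture and annulus radii left to the caller (the specific radii used by the authors are not given in the abstract):

```python
import numpy as np

def aperture_flux(img, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted flux: sum counts within radius r_ap of
    (cx, cy) and subtract the median sky level measured in the
    annulus r_in <= r <= r_out."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - cx, yy - cy)
    ap = r <= r_ap
    sky = img[(r >= r_in) & (r <= r_out)]
    return img[ap].sum() - np.median(sky) * ap.sum()
```

Tracking this flux against heliocentric distance, and comparing it with the flux expected from the bare nucleus, is how excess (coma) brightness and outbursts are identified.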
1982-04-02
General S130 Eclipse computer. 2.2.3 Photographic Coverage: Each crash test was recorded on 16 mm color film by four cameras. The event was filmed at ... rotate further nose-up until impact. Unfortunately, all cameras had either run out of film or had been turned off prior to impact, so there is no ... record of the impact angle or crash events. From visual observations at the time, the impact angle appeared to be nearly 90° nose-up. What film exists
Student Measurements of the Double Star Eta Cassiopeiae
NASA Astrophysics Data System (ADS)
Brewer, Mark; Cacace, Gabriel; Do, Vivian; Griffith, Nicholas; Malan, Alexandria; Paredes, Hanna; Peticolas, Brian; Stasiak, Kathryne
2016-10-01
The double star Eta Cassiopeiae was measured at Vanguard Preparatory School. Digital measurements were made with a 14-inch telescope equipped with a CCD camera. The plate scale was determined to be 0.50 arcseconds per pixel. The separation and position angle were determined, using astronomy software, to be 13.3 arcseconds and 340.4 degrees. Previous observations reported in the Washington Double Star Catalog were used as a comparison. The camera angle was found to be the principal cause of the skew in the data gathered for the double star.
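The reduction from pixel measurements to separation and position angle is a direct application of the plate scale: separation is the plate scale times the pixel distance between the components, and the position angle comes from the arctangent of the pixel offset plus the camera rotation. The sketch below assumes +y toward north before the camera-angle correction; the exact sign conventions depend on the camera orientation, which is precisely the quantity the students found to be the source of their skewed data.

```python
import math

def separation_and_pa(dx_pix, dy_pix, plate_scale, camera_angle=0.0):
    """Separation (arcsec) and position angle (deg, measured north
    through east) from the pixel offset of the secondary star relative
    to the primary; plate_scale is in arcsec/pixel."""
    sep = plate_scale * math.hypot(dx_pix, dy_pix)
    pa = (math.degrees(math.atan2(dx_pix, dy_pix)) + camera_angle) % 360.0
    return sep, pa
```

With the reported 0.50 arcsec/pixel plate scale, the measured 13.3 arcsec separation corresponds to a pixel distance of 26.6 pixels; an error in `camera_angle` shifts every position angle by the same offset, which is how a miscalibrated camera angle skews the results.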
Volcanoes Ceraunius Tholus and Uranius Tholus
NASA Technical Reports Server (NTRS)
2002-01-01
Acquired in March 2002, this Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide angle view shows the martian volcanoes Ceraunius Tholus (lower) and Uranius Tholus (upper). The presence of impact craters on these volcanoes, particularly on Uranius Tholus, indicates that they are quite ancient and are not active today. The light-toned area on the southeastern face (toward lower right) of Ceraunius Tholus is a remnant of a once more extensive deposit of dust from the global dust storm events that occurred in 2001. The crater at the summit of Ceraunius Tholus is about 25 km (15.5 mi) across. Sunlight illuminates the scene from the lower left.
2007-07-26
A surge in brightness appears on the rings directly opposite the Sun from the Cassini spacecraft. This "opposition surge" travels across the rings as the spacecraft watches. This view looks toward the sunlit side of the rings from about 9 degrees below the ringplane. The image was taken in visible light with the Cassini spacecraft wide-angle camera on June 12, 2007 using a spectral filter sensitive to wavelengths of infrared light centered at 853 nanometers. The view was acquired at a distance of approximately 524,374 kilometers (325,830 miles) from Saturn. Image scale is 31 kilometers (19 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA08992
The space shuttle payload planning working groups. Volume 1: Astronomy
NASA Technical Reports Server (NTRS)
1973-01-01
The space astronomy missions to be accomplished by the space shuttle are discussed. The principal instrument is the Large Space Telescope optimized for the ultraviolet and visible regions of the spectrum, but usable also in the infrared. Two infrared telescopes are also proposed and their characteristics are described. Other instruments considered for the astronomical observations are: (1) a very wide angle ultraviolet camera, (2) a grazing incidence telescope, (3) Explorer-class free flyers to measure the cosmic microwave background, and (4) rocket-class instruments which can fly frequently on a variety of missions. The stability requirements of the space shuttle for accomplishing the astronomy mission are defined.
Earth observations taken from shuttle orbiter Discovery during STS-82 mission
1997-02-12
STS082-723-071 (11-21 Feb. 1997) --- The island of Hispaniola appears left center in this wide-angle view, photographed with a 70mm handheld camera from the Earth-orbiting Space Shuttle Discovery. The prominent cape is Cap-a-Foux, the northwest point of Haiti. The cloud is broken by the mountainous spine of the island. Smoke from bush fires appears in the valleys between the ridges. The coppery tinge of light reflected off the sea surface indicates pollution in the air -- probably industrial pollutants from North America which are typically fed around from the Atlantic seaboard into the Caribbean from the east.
NASA Technical Reports Server (NTRS)
Rust, D. M.; Appourchaux, T.
1988-01-01
Progress in the development of an instrument with very high (1:10 billion) wavelength stability designed to measure solar surface velocities and magnetic fields is reported. The instrument determines Doppler and Zeeman shifts in solar spectral lines by a 6-point weighted average. It is built around an electrically tunable solid lithium-niobate Fabry-Perot etalon that is stabilized against a diode laser which itself is locked to a resonance line of cesium 133. Key features are the etalon, which acts as a wide-angle 0.017-nm solar filter, the camera with a specially stabilized shutter, and the instrument control and data collection system. Use of the instrument in helioseismological research is emphasized.
Rover imaging system for the Mars rover/sample return mission
NASA Technical Reports Server (NTRS)
1993-01-01
In the past year, the conceptual design of a panoramic imager for the Mars Environmental Survey (MESUR) Pathfinder was finished. A prototype camera was built and its performance in the laboratory was tested. The performance of this camera was excellent. Based on this work, we have recently proposed a small, lightweight, rugged, and highly capable Mars Surface Imager (MSI) instrument for the MESUR Pathfinder mission. A key aspect of our approach to optimization of the MSI design is that we treat image gathering, coding, and restoration as a whole, rather than as separate and independent tasks. Our approach leads to higher image quality, especially in the representation of fine detail with good contrast and clarity, without increasing either the complexity of the camera or the amount of data transmission. We have made significant progress over the past year in both the overall MSI system design and in the detailed design of the MSI optics. We have taken a simple panoramic camera and have upgraded it substantially to become a prototype of the MSI flight instrument. The most recent version of the camera utilizes miniature wide-angle optics that image directly onto a 3-color, 2096-element CCD line array. There are several data-taking modes, providing resolution as high as 0.3 mrad/pixel. Analysis tasks that were performed or that are underway with the test data from the prototype camera include the following: construction of 3-D models of imaged scenes from stereo data, first for controlled scenes and later for field scenes; and checks on geometric fidelity, including alignment errors, mast vibration, and oscillation in the drive system. We have outlined a number of tasks planned for Fiscal Year '93 in order to prepare us for submission of a flight instrument proposal for MESUR Pathfinder.
Winter precipitation particle size distribution measurement by Multi-Angle Snowflake Camera
NASA Astrophysics Data System (ADS)
Huang, Gwo-Jong; Kleinkort, Cameron; Bringi, V. N.; Notaroš, Branislav M.
2017-12-01
From the radar meteorology viewpoint, the most important properties for quantitative precipitation estimation of winter events are the 3D shape, size, and mass of precipitation particles, as well as the particle size distribution (PSD). In order to measure these properties precisely, optical instruments may be the best choice. The Multi-Angle Snowflake Camera (MASC) is a relatively new instrument equipped with three high-resolution cameras that capture winter precipitation particle images from three non-parallel angles, in addition to measuring the particle fall speed using two pairs of infrared motion sensors. However, results from the MASC have so far usually been presented monthly or seasonally, with particle sizes given as histograms; no previous studies have used the MASC for a single-storm study, and none have used it to measure the PSD. We propose a methodology for obtaining the winter precipitation PSD measured by the MASC, and present and discuss the development, implementation, and application of the new technique for PSD computation based on MASC images. Overall, this is the first study of a MASC-based PSD. We present PSD experiments and results for segments of two snow events to demonstrate the performance of our PSD algorithm. The results show that the self-consistency of the MASC-measured single-camera PSDs is good. To cross-validate the PSD measurements, we compare the MASC mean PSD (averaged over the three cameras) with a collocated 2D Video Disdrometer and observe good agreement between the two sets of results.
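A disdrometer-style PSD can be sketched by weighting each detected particle by the inverse of its effective sampling volume — sensor area × fall speed × collection time — within its size bin, so that slower particles contribute more to the concentration. The variable names and the simple volume model below are assumptions for illustration, not the authors' exact MASC algorithm.

```python
import numpy as np

def compute_psd(diameters_mm, fall_speeds, area_m2, dt_s, bins_mm):
    """PSD N(D) in m^-3 mm^-1: each particle contributes
    1 / (area * fall_speed * collection_time * bin_width)
    to its diameter bin."""
    widths = np.diff(bins_mm)
    n_of_d = np.zeros(len(widths))
    for d, v in zip(diameters_mm, fall_speeds):
        k = np.searchsorted(bins_mm, d) - 1   # bin index for this diameter
        if 0 <= k < len(widths):
            n_of_d[k] += 1.0 / (area_m2 * v * dt_s * widths[k])
    return n_of_d
```

Computing this separately from each of the three MASC cameras and comparing the results is one way to check the single-camera self-consistency the abstract reports.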
The History of the CONCAM Project: All Sky Monitors in the Digital Age
NASA Astrophysics Data System (ADS)
Nemiroff, Robert; Shamir, Lior; Pereira, Wellesley
2018-01-01
The CONtinuous CAMera (CONCAM) project, which ran from 2000 to (about) 2008, consisted of real-time, Internet-connected, fisheye cameras located at major astronomical observatories. At its peak, eleven CONCAMs around the globe monitored most of the night sky, most of the time. Initially designed to search for transients and stellar variability, CONCAMs gained initial notoriety as cloud monitors. As such, CONCAMs made -- and their successors continue to make -- ground-based astronomy more efficient. The original, compact, fisheye-observatory-in-a-suitcase design underwent several iterations, starting with CONCAM0 and ending with the last version, dubbed CONCAM3. Although the CONCAM project itself concluded after centralized funding diminished, more locally-operated, commercially-designed, CONCAM-like devices operate today than ever before. It has even been shown that modern smartphones can operate in a CONCAM-like mode. It is speculated that re-instating better global coordination of current wide-angle sky monitors could lead to better variability monitoring of the brightest stars and transients.
Hiby, Lex; Lovell, Phil; Patil, Narendra; Kumar, N Samba; Gopalaswamy, Arjun M; Karanth, K Ullas
2009-06-23
The tiger is one of many species in which individuals can be identified by surface patterns. Camera traps can be used to record individual tigers moving over an array of locations and provide data for monitoring and studying populations and devising conservation strategies. We suggest using a combination of algorithms to calculate similarity scores between pattern samples scanned from the images to automate the search for a match to a new image. We show how using a three-dimensional surface model of a tiger to scan the pattern samples allows comparison of images that differ widely in camera angles and body posture. The software, which is free to download, considerably reduces the effort required to maintain an image catalogue and we suggest it could be used to trace the origin of a tiger skin by searching a central database of living tigers' images for matches to an image of the skin.
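The matching pipeline described here can be sketched in miniature: compute one or more similarity measures between a query pattern sample and every catalogue sample, combine the scores, and rank the catalogue so that likely matches are reviewed first. This is a generic illustration, not the authors' algorithm (which scans samples via a 3-D surface model); the normalized cross-correlation measure and the `rank_matches` helper are assumptions for the sketch.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size pattern samples
    (assumes neither sample is constant, i.e. nonzero std)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

def rank_matches(query, catalogue, measures):
    """Average several similarity measures per catalogue entry and
    return catalogue indices sorted best-match first."""
    scores = [np.mean([m(query, c) for m in measures]) for c in catalogue]
    return np.argsort(scores)[::-1]
```

In practice the ranking only shortlists candidates; a human still confirms the match, which is why the authors frame the software as reducing catalogue-maintenance effort rather than automating identification outright.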
Lunar Satellite Snaps Image of Earth
2014-05-07
This image, captured Feb. 1, 2014, shows a colorized view of Earth from the moon-based perspective of NASA's Lunar Reconnaissance Orbiter. Credit: NASA/Goddard/Arizona State University -- NASA's Lunar Reconnaissance Orbiter (LRO) experiences 12 "earthrises" every day; however, LROC (short for LRO Camera) is almost always busy imaging the lunar surface, so only rarely does an opportunity arise for LROC to capture a view of Earth. On Feb. 1, 2014, LRO pitched forward while approaching the moon's north pole, allowing the LROC Wide Angle Camera to capture Earth rising above Rozhdestvenskiy crater (112 miles, or 180 km, in diameter). Read more: go.nasa.gov/1oqMlgu
Lensless imaging for wide field of view
NASA Astrophysics Data System (ADS)
Nagahara, Hajime; Yagi, Yasushi
2015-02-01
It is desirable to engineer a small camera with a wide field of view (FOV), given current developments in wearable cameras and computing products such as action cameras and Google Glass. However, typical approaches for achieving a wide FOV, such as attaching a fisheye lens or convex mirrors, require a trade-off between optics size and FOV. We propose camera optics that achieve a wide FOV while being small and lightweight. The proposed optics are a completely lensless, catoptric design. They contain four mirrors: two for wide viewing and two for focusing the image on the camera sensor. The optics are simple and can easily be miniaturized, since they use only mirrors and are therefore not susceptible to chromatic aberration. We have implemented prototype optics of our lensless concept, attached them to commercial charge-coupled device/complementary metal oxide semiconductor cameras, and conducted experiments to evaluate the feasibility of our proposed optics.
Photographic assessment of retroreflective film properties
NASA Astrophysics Data System (ADS)
Burgess, G.; Shortis, M. R.; Scott, P.
2011-09-01
Retroreflective film is used widely for target manufacture in close-range photogrammetry, especially where high precision is required for applications in industrial or engineering metrology. 3M Scotchlite 7610 high gain reflective sheeting is the gold standard for retroreflective targets because of the high level of response for incidence angles up to 60°. Retroreflective film is now widely used in the transport industry for signage and many other types of film have become available. This study reports on the performance of six types of retroreflective sheeting, including 7610, based on published metrics for reflectance. Measurements were made using a camera and flash, so as to be directly applicable to photogrammetry. Analysis of the results from this project and the assessment of previous research indicates that the use of standards is essential to enable a valid comparison of retroreflective performance.
Atmospheric Science Data Center
2013-04-16
... faint greenish hue in the multi-angle composite. This subtle effect suggests that the nadir camera is observing more of the brighter ... energy and water at the Earth's surface, and for preserving biodiversity. The Multi-angle Imaging SpectroRadiometer observes the daylit ...
NASA Astrophysics Data System (ADS)
Hwang, Kangseok; Yoon, Eun-A.; Kang, Sukyung; Cha, Hyungkee; Lee, Kyounghoon
2017-12-01
The present study focuses on the influence of swimming angle on the target strength (TS) of the hairtail (Trichiurus lepturus). We measured in-situ TS at 38 and 120 kHz with luring lamps at a fishing ground for jigging boats in the coastal waters of Jeju-do, Korea. The swimming angle and size of hairtails were measured using an acoustic camera. Results showed a mean preanal length of 13.5 cm (SD = 2.7 cm) and a mean swimming tilt angle of 43.9° (SD = 17.6°). The mean TS values were -35.7 and -41.2 dB at 38 and 120 kHz, respectively. These results will assist in understanding the influence of swimming angle on the TS of hairtails and thus improve the accuracy of biomass estimates.
Faint Object Camera imaging and spectroscopy of NGC 4151
NASA Technical Reports Server (NTRS)
Boksenberg, A.; Catchpole, R. M.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.
1995-01-01
We describe ultraviolet and optical imaging and spectroscopy within the central few arcseconds of the Seyfert galaxy NGC 4151, obtained with the Faint Object Camera on the Hubble Space Telescope. A narrowband image including (O III) lambda(5007) shows a bright nucleus centered on a complex biconical structure having apparent opening angle approximately 65 deg and axis at a position angle along 65 deg-245 deg; images in bands including Lyman-alpha and C IV lambda(1550) and in the optical continuum near 5500 A, show only the bright nucleus. In an off-nuclear optical long-slit spectrum we find a high and a low radial velocity component within the narrow emission lines. We identify the low-velocity component with the bright, extended, knotty structure within the cones, and the high-velocity component with more confined diffuse emission. Also present are strong continuum emission and broad Balmer emission line components, which we attribute to the extended point spread function arising from the intense nuclear emission. Adopting the geometry pointed out by Pedlar et al. (1993) to explain the observed misalignment of the radio jets and the main optical structure we model an ionizing radiation bicone, originating within a galactic disk, with apex at the active nucleus and axis centered on the extended radio jets. We confirm that through density bounding the gross spatial structure of the emission line region can be reproduced with a wide opening angle that includes the line of sight, consistent with the presence of a simple opaque torus allowing direct view of the nucleus. In particular, our modelling reproduces the observed decrease in position angle with distance from the nucleus, progressing initially from the direction of the extended radio jet, through our optical structure, and on to the extended narrow-line region. We explore the kinematics of the narrow-line low- and high-velocity components on the basis of our spectroscopy and adopted model structure.
Photometric analysis of Asteroid (21) Lutetia from Rosetta-OSIRIS images
NASA Astrophysics Data System (ADS)
Masoumzadeh, N.; Boehnhardt, H.; Li, Jian-Yang; Vincent, J.-B.
2015-09-01
We analyzed the photometric properties of Asteroid (21) Lutetia based on images captured by Rosetta during its flyby. We utilized the images recorded in the F17 filter (λ = 631.6 nm) of the Wide Angle Camera (WAC) and in the F82 & F22 filters (λ = 649.2 nm) of the Narrow Angle Camera (NAC) of the OSIRIS imaging system onboard the spacecraft. We present the results of Hapke and Minnaert modeling using disk-integrated and disk-resolved data derived from the surface of the asteroid. At 631.6 nm and 649.2 nm, the geometric albedo of Lutetia is 0.194 ± 0.002. The Bond albedo is 0.076 ± 0.002 at 649.2 nm and 0.079 ± 0.002 at 631.6 nm. The roughness parameter is 28° ± 1°, the opposition surge parameters B0 and h are 1.79 ± 0.08 and 0.041 ± 0.003, respectively, and the asymmetry factor of the phase function is -0.28 ± 0.01. The single-scattering albedo is 0.226 ± 0.002 at 631.6 and 649.2 nm. The modeled Hapke parameters of Asteroid Lutetia are close to those of typical S-type asteroids. The Minnaert k parameter of Lutetia at opposition (0.526 ± 0.002) is comparable with those of other asteroids and comets. Albedo ratio images indicate no significant variation across the surface of Lutetia, apart from the so-called NPCC region, where a pronounced variation is seen at large phase angle. The small width of the albedo distribution of the surface (∼7% at half maximum) and the similarity between phase-ratio maps derived from the measurements and from the modeling suggest that the light-scattering properties over the whole visible and illuminated surface of the asteroid are widely uniform. The comparison between the reflectance measurements of Lutetia and the available laboratory samples suggests a regolith with a possible grain-size distribution of 150 μm or larger.
Portable retinal imaging for eye disease screening using a consumer-grade digital camera
NASA Astrophysics Data System (ADS)
Barriga, Simon; Larichev, Andrey; Zamora, Gilberto; Soliz, Peter
2012-03-01
The development of affordable means to image the retina is an important step toward the implementation of eye disease screening programs. In this paper we present the i-RxCam, a low-cost, hand-held retinal camera for widespread applications such as tele-retinal screening for eye diseases like diabetic retinopathy (DR), glaucoma, and age-related ocular diseases. Existing portable retinal imagers do not meet the requirements of a low-cost camera with sufficient technical capabilities (field of view, image quality, portability, battery power, and ease-of-use) to be distributed widely to low-volume clinics, such as the offices of single primary care physicians serving rural communities. The i-RxCam uses a Nikon D3100 digital camera body. The camera has a CMOS sensor with 14.8 million pixels. We use a 50 mm focal-length lens that gives a retinal field of view of 45 degrees. The internal autofocus can compensate for about 2 D (diopters) of focusing error. The light source is an LED produced by Philips with a linear emitting area that is transformed using a light pipe to the optimal shape at the eye pupil, an annulus. To eliminate the corneal reflex we use a polarization technique in which the light passes through a nano-wire polarizer plate. This is a novel type of polarizer featuring high polarization separation (contrast ratio of more than 1000) and a very large acceptance angle (>45 degrees). The i-RxCam approach will yield a significantly more economical retinal imaging device that would allow mass screening of the at-risk population.
Detection of pointing errors with CMOS-based camera in intersatellite optical communications
NASA Astrophysics Data System (ADS)
Yu, Si-yuan; Ma, Jing; Tan, Li-ying
2005-01-01
For very high data rates, intersatellite optical communications hold a potential performance edge over microwave communications. The acquisition and tracking problem is critical because of the narrow transmit beam. In some systems a single array detector performs both the spatial acquisition and tracking functions to detect pointing errors, so both a wide field of view and a high update rate are required. Past systems tended to employ CCD-based cameras with complex readout arrangements, but the additional complexity reduces the applicability of the array-based tracking concept. With the development of CMOS arrays, CMOS-based cameras can implement the single-array-detector concept. The area-of-interest feature of a CMOS-based camera allows a PAT system to read out only a specified portion of the array, and the maximum allowed frame rate increases as the size of the area of interest decreases under certain conditions. A commercially available CMOS camera with 105 fps @ 640×480 is employed in our PAT simulation system, in which only a subset of the pixels is actually used. Beam angles varying within the field of view can be detected after passing through a Cassegrain telescope and focusing optics. Spot pixel values (8 bits per pixel) read out from the CMOS sensor are transmitted to a DSP subsystem via an IEEE 1394 bus, and pointing errors are computed with the centroid equation. Tests showed that (1) 500 fps @ 100×100 is achievable in acquisition when the field of view is 1 mrad, and (2) 3k fps @ 10×10 is achievable in tracking when the field of view is 0.1 mrad.
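The centroid equation referred to above reduces a detector window to a sub-pixel spot position, from which pointing errors follow. A minimal sketch of the intensity-weighted centroid (the window contents here are invented for illustration):

```python
def centroid(frame):
    """Intensity-weighted centroid of a 2D pixel window (e.g. 8-bit values),
    the standard way to turn a focal-plane spot into a sub-pixel position."""
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    if total == 0:
        raise ValueError("no signal in window")
    return sx / total, sy / total

# A 3x3 window whose energy is biased to the right of the window origin:
spot = [
    [0, 10, 20],
    [0, 20, 40],
    [0, 10, 20],
]
cx, cy = centroid(spot)   # spot position in pixels; pointing error is the
                          # offset of (cx, cy) from the boresight pixel
```

The pointing error then follows by subtracting the boresight pixel coordinates and scaling by the per-pixel angular resolution of the optics.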
VizieR Online Data Catalog: Solar neighborhood. XXXVII. RVs for M dwarfs (Benedict+, 2016)
NASA Astrophysics Data System (ADS)
Benedict, G. F.; Henry, T. J.; Franz, O. G.; McArthur, B. E.; Wasserman, L. H.; Jao, W.-C.; Cargile, P. A.; Dieterich, S. B.; Bradley, A. J.; Nelan, E. P.; Whipple, A. L.
2017-05-01
During this project we observed with two Fine Guidance Sensor (FGS) units: FGS 3 from 1992 to 2000, and FGS 1r from 2000 to 2009. FGS 1r replaced the original FGS 1 during Hubble Space Telescope (HST) Servicing Mission 3A in late 1999. We included visual, photographic, and CCD observations of separations and position angles from Geyer et al. 1988AJ.....95.1841G for our analysis of GJ 65 AB. We include a single observation of G 193-027 AB from Beuzit et al. 2004A&A...425..997B, who used the Adaptive Optics Bonnette system on the Canada-France-Hawaii Telescope (CFHT). For GJ 65 AB we include five Very Large Telescope/NAos-COnica (VLT/NACO) measures of position angle and separation (Kervella et al. 2016A&A...593A.127K). For our analysis of GJ 623 AB, we included astrometric observations (Martinache et al. 2007ApJ...661..496M) performed with the Palomar High Angular Resolution Observer (PHARO) instrument on the Palomar 200in (5m) telescope and with the Near InfraRed Camera 2 (NIRC2) instrument on the Keck II telescope. Separations have typical errors of 2mas. Position angle errors average 0.5°. Measurements are included for GJ 22 AC from McCarthy et al. 1991AJ....101..214M and for GJ 473 AB from Henry et al. 1992AJ....103.1369H and Torres et al. 1999AJ....117..562T, who used a two-dimensional infrared speckle camera containing a 58*62 pixel InSb array on the Steward Observatory 90in telescope. We also include infrared speckle observations by Woitas et al. 2003A&A...406..293W, who obtained fourteen separation and position angle measurements for GJ 22 AC with the near-infrared cameras MAGIC and OMEGA Cass at the 3.5m telescope on Calar Alto. We also include a few speckle observations at optical wavelengths from the Special Astrophysical Observatory 6m Bolshoi Azimuth Telescope (BTA) and 1m Zeiss (Balega et al. 1994, Cat. J/A+AS/105/503), from the CFHT (Blazit et al. 
1987) and from the Differential Speckle Survey Instrument (DSSI) on the Wisconsin, Indiana, Yale, National Optical Astronomy Observatory (WIYN) 3.5m (Horch et al. 2012, Cat. J/AJ/143/10). Where available, we use astrometric observations from HST instruments other than the FGSs, including the Faint Object Camera (FOC; Barbieri et al. 1996A&A...315..418B), the Faint Object Spectrograph (FOS; Schultz et al. 1998PASP..110...31S), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS; Golimowski et al. 2004AJ....128.1733G), and the Wide-Field Planetary Camera 2 (WFPC2; Schroeder et al. 2000AJ....119..906S; Dieterich et al. 2012, Cat. J/AJ/144/64). Our radial velocity measurements, listed in table 3, are from two sources. We obtained most radial velocity data with the McDonald 2.1m Struve telescope and the Sandiford Cassegrain Echelle spectrograph, hereafter CE. The CE delivers a dispersion equivalent to 2.5 km/s/pixel (R=λ/Δλ=60000) with a wavelength range of 5500 ≤ λ ≤ 6700 Å spread across 26 orders (apertures). The McDonald data were collected during 33 observing runs from 1995 to 2009. Some GJ 623 AB velocities came from the Hobby-Eberly Telescope (HET) using the Tull Spectrograph. (3 data files).
Preplanning and Evaluating Video Documentaries and Features.
ERIC Educational Resources Information Center
Maynard, Riley
1997-01-01
This article presents a ten-part pre-production outline and post-production evaluation that helps communications students more effectively improve video skills. Examines camera movement and motion, camera angle and perspective, lighting, audio, graphics, backgrounds and color, special effects, editing, transitions, and music. Provides a glossary…
Single-Camera Stereoscopy Setup to Visualize 3D Dusty Plasma Flows
NASA Astrophysics Data System (ADS)
Romero-Talamas, C. A.; Lemma, T.; Bates, E. M.; Birmingham, W. J.; Rivera, W. F.
2016-10-01
A setup to visualize and track individual particles in multi-layered dusty plasma flows is presented. The setup consists of a single camera with variable frame rate, and a pair of adjustable mirrors that project the same field of view from two different angles to the camera, allowing for three-dimensional tracking of particles. Flows are generated by inclining the plane in which the dust is levitated using a specially designed setup that allows for external motion control without compromising vacuum. Dust illumination is achieved with an optics arrangement that includes a Powell lens that creates a laser fan with adjustable thickness and with approximately constant intensity everywhere. Both the illumination and the stereoscopy setup allow for the camera to be placed at right angles with respect to the levitation plane, in preparation for magnetized dusty plasma experiments in which there will be no direct optical access to the levitation plane. Image data and analysis of unmagnetized dusty plasma flows acquired with this setup are presented.
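The stereoscopic principle behind the mirror pair can be sketched in the plane: each mirror view acts as a virtual camera, and a particle sits where the two sight lines intersect. A minimal sketch of that ray intersection; the virtual camera positions and directions below are invented purely for illustration.

```python
def intersect_rays(p1, d1, p2, d2):
    """Intersection of two 2D rays p + t*d - the planar core of locating a
    dust grain from the two mirror views of a single camera. p1, p2 are the
    (hypothetical) virtual camera centers; d1, d2 the sight directions."""
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 by Cramer's rule on the
    # 2x2 system [d1, -d2] [t1, t2]^T = p2 - p1.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if det == 0:
        raise ValueError("sight lines are parallel")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two virtual cameras at (+/-1, 0) both sighting the same point:
pt = intersect_rays((-1.0, 0.0), (1.0, 2.0), (1.0, 0.0), (-1.0, 2.0))
print(pt)   # -> (0.0, 2.0)
```

The real setup works in 3D with calibrated mirror geometry, but the same intersection idea (extended to skew lines via a closest-point solve) underlies the particle triangulation.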
Computing camera heading: A study
NASA Astrophysics Data System (ADS)
Zhang, John Jiaxiang
2000-08-01
An accurate estimate of the motion of a camera is a crucial first step for the 3D reconstruction of sites, objects, and buildings from video. Solutions to the camera heading problem can be readily applied to many areas, such as robotic navigation, surgical operation, video special effects, multimedia, and lately even in internet commerce. From image sequences of a real world scene, the problem is to calculate the directions of the camera translations. The presence of rotations makes this problem very hard. This is because rotations and translations can have similar effects on the images, and are thus hard to tell apart. However, the visual angles between the projection rays of point pairs are unaffected by rotations, and their changes over time contain sufficient information to determine the direction of camera translation. We developed a new formulation of the visual angle disparity approach, first introduced by Tomasi, to the camera heading problem. Our new derivation makes theoretical analysis possible. Most notably, a theorem is obtained that locates all possible singularities of the residual function for the underlying optimization problem. This allows identifying all computation trouble spots beforehand, and to design reliable and accurate computational optimization methods. A bootstrap-jackknife resampling method simultaneously reduces complexity and tolerates outliers well. Experiments with image sequences show accurate results when compared with the true camera motion as measured with mechanical devices.
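The key observation above, that visual angles between projection rays are unaffected by camera rotation, is easy to check numerically. A minimal sketch with arbitrary rays and an arbitrary rotation (the specific vectors are invented for illustration):

```python
import math

def rotate_z(v, ang):
    """Rotate a 3-vector about the z axis; any rigid rotation works equally."""
    c, s = math.cos(ang), math.sin(ang)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

def visual_angle(u, v):
    """Angle between two projection rays through the camera center."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.acos(dot / (nu * nv))

ray1, ray2 = (1.0, 0.2, 3.0), (-0.5, 0.4, 2.0)
before = visual_angle(ray1, ray2)
after = visual_angle(rotate_z(ray1, 0.7), rotate_z(ray2, 0.7))
# before == after (up to rounding): rotations preserve visual angles,
# so their change over time isolates the translational component.
```

This is exactly why the disparity of visual angles over time carries information about translation only, making the heading problem separable from rotation.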
NASA Astrophysics Data System (ADS)
Ormö, J.; Wünnemann, K.; Collins, G.; Melero Asensio, I.
2012-04-01
The Experimental Projectile Impact Chamber at Centro de Astrobiología, Spain, consists of a 7 m wide, funnel-shaped test bed and a 20.5 mm caliber compressed N2 gas gun. The test bed can be filled with any type of target material, but is especially designed for wet-target experiments; its shape and size aim to decrease disturbance from reflected surface waves in such experiments. Experiments are done at 1 atm ambient pressure. The gas gun can launch projectiles of any material and dimensions <20 mm (smaller diameters using sabots), and at any angle from vertical to near horizontal. The projectile velocities are on the order of a few hundred meters per second, depending mainly on the gas pressure as well as on projectile diameter and density. When using a dry sand target, a transient crater about 30 cm wide is produced. Wet-target experiments have not yet been performed in this newly installed test chamber, but transient cavities in water are expected to be on the order of 50-70 cm wide. The large scale allows for detailed study of the dynamics of cratering motions during the stages of crater growth and subsequent collapse, especially in wet targets. These observations provide valuable benchmark data for numerical simulations and for comparison with field studies. Here we describe the results of ten impact experiments using three different gas pressures (100 bar, 180 bar, 200 bar), two projectile compositions (20 mm, 5.7 g Delrin; 20 mm, 16.3 g Al2O3), and two different impact angles (90° and 53° above the horizontal plane). Nine of the experiments were done in a quarter-space geometry using a specially designed camera tank with a 45 mm thick glass window. One experiment was done in half-space geometry as reference. The experiments were recorded with a high-speed digital video camera, and the resulting craters were documented with a digital still-frame camera.
Projectile velocities are estimated with a combination of tracking software and a Shooting Chrony Alpha M-1 chronograph to be about 330 m/s for Delrin (100 bar), 220 m/s for Al2O3 (100 bar), 400 m/s for Delrin (200 bar), and 275 m/s for Al2O3 (200 bar). The velocities for the lighter Delrin projectile and at the higher pressure are above the speed of sound in dry silica sand (243 m/s; Sandia report SAND2007-3524). The experimental setup (i.e., target material, projectile density and velocity, impact angle), as well as the dimensions of the resulting craters, are used as inputs to numerical simulations with the iSALE computational code. Results from these simulations will be presented and compared with the experiments.
NASA Astrophysics Data System (ADS)
Wang, Zhi-shan; Zhao, Yue-jin; Li, Zhuo; Dong, Liquan; Chu, Xuhong; Li, Ping
2010-11-01
The comparison goniometer is widely used to measure and inspect small angles, angle differences, and the parallelism of two surfaces. However, the common way to read a comparison goniometer is for the operator to inspect the ocular of the goniometer with one eye, and reading an old goniometer equipped with only one adjustable ocular is difficult. In the fabrication of an IR reflecting-mirror assembly, a common comparison goniometer is used to measure the angle errors between two neighboring assembled mirrors. In this paper, an image-based quick-reading technique for the comparison goniometer used to inspect the parallelism of mirrors in a mirror assembly is proposed. A digital camera, a comparison goniometer, and a computer are used to construct a reading system; the image of the field of view in the comparison goniometer is extracted and recognized to obtain the angular positions of the reflection surfaces to be measured. In order to obtain the interval distance between the scale lines, a particular technique, the left peak first method, based on the local peak values of intensity in the true-color image, is proposed. A program written in VC++6.0 has been developed to perform the color digital image processing.
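The abstract does not spell out the left peak first method, but its basis, local peak values along an intensity profile, can be sketched as a left-to-right scan for local maxima: the leftmost peak anchors the scale and the peak spacing gives the interval between scale lines. The profile and threshold below are hypothetical, for illustration only.

```python
def local_peaks(profile, threshold):
    """Indices of strict local maxima above a threshold, scanned left to
    right - one plausible reading of a 'left peak first' scan, where the
    leftmost peak is taken as the reference scale line."""
    peaks = []
    for i in range(1, len(profile) - 1):
        if profile[i] > threshold and profile[i - 1] < profile[i] >= profile[i + 1]:
            peaks.append(i)
    return peaks

# Synthetic intensity profile across three scale lines:
profile = [0, 1, 8, 1, 0, 2, 9, 2, 0, 1, 7, 1, 0]
peaks = local_peaks(profile, 5)                      # -> [2, 6, 10]
spacing = [b - a for a, b in zip(peaks, peaks[1:])]  # -> [4, 4] pixels
```

In the real system the spacing in pixels would be calibrated against the goniometer's known scale-line interval to convert pixel positions into angle readings.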
EUROPA2: Plan Database Services for Planning and Scheduling Applications
NASA Technical Reports Server (NTRS)
Bedrax-Weiss, Tania; Frank, Jeremy; Jonsson, Ari; McGann, Conor
2004-01-01
NASA missions require solving a wide variety of planning and scheduling problems with temporal constraints; simple resources such as robotic arms, communications antennae and cameras; complex replenishable resources such as memory, power and fuel; and complex constraints on geometry, heat and lighting angles. Planners and schedulers that solve these problems are used in ground tools as well as onboard systems. The diversity of planning problems and applications of planners and schedulers precludes a one-size-fits-all solution. However, many of the underlying technologies are common across planning domains and applications. We describe CAPR, a formalism for planning that is general enough to cover a wide variety of planning and scheduling domains of interest to NASA. We then describe EUROPA2, a software framework implementing CAPR. EUROPA2 provides efficient, customizable Plan Database Services that enable the integration of CAPR into a wide variety of applications. We describe the design of EUROPA2 from the perspectives of modeling, customization, and application integration for different classes of NASA missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huber, Franz J. T.; Will, Stefan, E-mail: stefan.will@fau.de; Erlangen Graduate School in Advanced Optical Technologies
A mobile demonstrator for the comprehensive online-characterization of gas-borne nanoparticle aggregates is presented. Two optical measurement techniques are combined, both utilizing a pulsed Nd:YAG laser as light source. Aggregate size and fractal dimension are measured by Wide-Angle Light Scattering (WALS). An ellipsoidal mirror images elastically scattered light from scattering angles between 10° and 165° onto a CCD-camera chip resulting in an almost complete scattering diagram with high angular resolution. Primary particle size and volume fraction are measured by time-resolved Laser-Induced Incandescence (TiRe-LII). Here, particles are heated up to about 3000 K by the short laser pulse, the enhanced thermal radiation signal is detected with gated photomultiplier tubes. Analysis of the signal decay time and maximum LII-signal allows for the determination of primary particle diameter and volume fraction. The performance of the system is demonstrated by combined measurements on soot nanoparticle aggregates from a soot aerosol generator. Particle and aggregate sizes are varied by using different equivalence ratios of the combustion in the generator. Soot volume fraction can be adjusted by different levels of dilution with air. Online-measurements were carried out demonstrating the favorable performance of the system and the potential for industrial applications such as process control and product development. The particle properties obtained are confirmed through transmission electron microscopy analysis on representative samples.
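The decay-time analysis in TiRe-LII can be illustrated in miniature: for a single-exponential incandescence signal, a log-linear least-squares fit recovers the decay time, which (through a particle heat-transfer model not shown here) relates to primary particle size, since larger particles cool more slowly. The trace below is synthetic and the function name is ours.

```python
import math

def decay_time(times, signal):
    """Least-squares fit of ln(S) = ln(S0) - t/tau; returns tau.
    Assumes a single-exponential decay, a simplification of real
    LII signal analysis."""
    n = len(times)
    ys = [math.log(s) for s in signal]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = sum((t - tbar) * (y - ybar) for t, y in zip(times, ys)) / \
        sum((t - tbar) ** 2 for t in times)
    return -1.0 / slope

# Synthetic noise-free LII trace with tau = 50 ns (times in ns):
ts = [0.0, 20.0, 40.0, 60.0, 80.0]
sig = [math.exp(-t / 50.0) for t in ts]
tau = decay_time(ts, sig)   # recovers 50 ns for noise-free data
```

With real, noisy photomultiplier traces the same fit is applied over a gated window, and the maximum LII signal separately constrains the volume fraction.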
Grooves and Kinks in the Rings
2017-06-19
Many of the features seen in Saturn's rings are shaped by the planet's moons. This view from NASA's Cassini spacecraft shows two different effects of moons that cause waves in the A ring and kinks in a faint ringlet. The view captures the outer edge of the 200-mile-wide (320-kilometer-wide) Encke Gap, in the outer portion of Saturn's A ring. This is the same region that features the large propeller called Earhart. Also visible here is one of several kinked and clumpy ringlets found within the gap. Kinks and clumps in the Encke ringlet move about, and even appear and disappear, in part due to the gravitational effects of Pan -- which orbits in the gap and whose gravitational influence holds the gap open. The A ring, which takes up most of the left side of the image, displays wave features caused by Pan, as well as by the moons Pandora and Prometheus, which orbit a bit farther from Saturn on both sides of the planet's F ring. This view was taken in visible light with the Cassini spacecraft narrow-angle camera on March 22, 2017, and looks toward the sunlit side of the rings from about 22 degrees above the ring plane. The view was acquired at a distance of approximately 63,000 miles (101,000 kilometers) from Saturn and at a phase angle (the angle between the sun, the rings and the spacecraft) of 59 degrees. Image scale is 1,979 feet (603 meters) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21333
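Image scales like the one quoted in the caption follow from the small-angle relation scale = distance × IFOV (the per-pixel instantaneous field of view). Inverting the caption's own numbers backs out the camera's angular resolution; the helper name is ours and the sketch assumes the small-angle approximation.

```python
def pixel_scale_km(distance_km, ifov_rad):
    """Ground footprint of one pixel under the small-angle approximation."""
    return distance_km * ifov_rad

# Inverting the caption's numbers (603 m/pixel at 101,000 km) gives the
# per-pixel angular resolution of the narrow-angle camera, about 6 microradians:
ifov = 0.603 / 101_000.0
scale = pixel_scale_km(101_000.0, ifov)   # recovers ~0.603 km/pixel
```

The same relation explains why wide-angle-camera captions elsewhere in this collection quote much coarser per-pixel scales at comparable ranges: a larger IFOV trades resolution for field of view.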
Impact Site: Cassini's Final Image
2017-09-15
This monochrome view is the last image taken by the imaging cameras on NASA's Cassini spacecraft. It looks toward the planet's night side, lit by reflected light from the rings, and shows the location at which the spacecraft would enter the planet's atmosphere hours later. A natural color view, created using images taken with red, green and blue spectral filters, is also provided (Figure 1). The imaging cameras obtained this view at approximately the same time that Cassini's visual and infrared mapping spectrometer made its own observations of the impact area in the thermal infrared. This location -- the site of Cassini's atmospheric entry -- was at this time on the night side of the planet, but would rotate into daylight by the time Cassini made its final dive into Saturn's upper atmosphere, ending its remarkable 13-year exploration of Saturn. The view was acquired on Sept. 14, 2017 at 19:59 UTC (spacecraft event time). The view was taken in visible light using the Cassini spacecraft wide-angle camera at a distance of 394,000 miles (634,000 kilometers) from Saturn. Image scale is about 11 miles (17 kilometers) per pixel. The original image has a size of 512x512 pixels. A movie is available at https://photojournal.jpl.nasa.gov/catalog/PIA21895
Multi-target detection and positioning in crowds using multiple camera surveillance
NASA Astrophysics Data System (ADS)
Huang, Jiahu; Zhu, Qiuyu; Xing, Yufeng
2018-04-01
In this study, we propose a pixel correspondence algorithm for positioning in crowds based on constraints on the distance between lines of sight, grayscale differences, and height in a world coordinates system. First, a Gaussian mixture model is used to obtain the background and foreground from multi-camera videos. Second, the hair and skin regions are extracted as regions of interest. Finally, the correspondences between each pixel in the region of interest are found under multiple constraints and the targets are positioned by pixel clustering. The algorithm can provide appropriate redundancy information for each target, which decreases the risk of losing targets due to a large viewing angle and wide baseline. To address the correspondence problem for multiple pixels, we construct a pixel-based correspondence model based on a similar permutation matrix, which converts the correspondence problem into a linear programming problem where a similar permutation matrix is found by minimizing an objective function. The correct pixel correspondences can be obtained by determining the optimal solution of this linear programming problem and the three-dimensional position of the targets can also be obtained by pixel clustering. Finally, we verified the algorithm with multiple cameras in experiments, which showed that the algorithm has high accuracy and robustness.
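The correspondence step above can be miniaturized: the paper searches for a (similar) permutation matrix minimizing an objective via linear programming, and for a handful of targets the same optimum can be found by brute force over permutations. The cost matrix below is invented, standing in for the paper's combined sight-line-distance, grayscale-difference, and height costs.

```python
from itertools import permutations

def best_assignment(cost):
    """Brute-force search over permutation matrices for the matching that
    minimizes total cost - a toy stand-in for the paper's linear-programming
    formulation, where cost[i][j] would combine the distance between lines
    of sight, grayscale difference, and height constraints."""
    n = len(cost)
    best, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best:
            best, best_perm = total, perm
    return best_perm, best

# Invented 3x3 cost matrix; small diagonal entries make the identity
# matching the optimum:
cost = [
    [0.1, 2.0, 3.0],
    [2.5, 0.2, 2.0],
    [3.0, 2.0, 0.3],
]
perm, total = best_assignment(cost)   # identity matching wins, total ~0.6
```

Brute force is factorial in the number of items, which is exactly why the paper relaxes the problem to linear programming for pixel-level correspondence.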
A Spectralon BRF Data Base for MISR Calibration Application
NASA Technical Reports Server (NTRS)
Bruegge, C.; Chrien, N.; Haner, D.
1999-01-01
The Multi-angle Imaging SpectroRadiometer (MISR) is an Earth observing sensor which will provide global retrievals of aerosols, clouds, and land surface parameters. Instrument specifications require high accuracy absolute calibration, as well as accurate camera-to-camera, band-to-band and pixel-to-pixel relative response determinations.
4. VAL PARTIAL ELEVATION SHOWING LAUNCHER BRIDGE ON SUPPORTS, LAUNCHER SLAB, SUPPORT CARRIAGE, CONCRETE 'A' FRAME STRUCTURE AND CAMERA TOWER LOOKING SOUTHEAST. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Using Lunar Module Shadows To Scale the Effects of Rocket Exhaust Plumes
NASA Technical Reports Server (NTRS)
2008-01-01
Excavating granular materials beneath a vertical jet of gas involves several physical mechanisms. These occur, for example, beneath the exhaust plume of a rocket landing on the soil of the Moon or Mars. We performed a series of experiments and simulations (Figure 1) to provide a detailed view of the complex gas-soil interactions. Measurements taken from the Apollo lunar landing videos (Figure 2) and from photographs of the resulting terrain helped demonstrate how the interactions extrapolate into the lunar environment. It is important to understand these processes at a fundamental level to support the ongoing design of higher fidelity numerical simulations and larger-scale experiments. These are needed to enable future lunar exploration wherein multiple hardware assets will be placed on the Moon within short distances of one another. The high-velocity spray of soil from the landing spacecraft must be accurately predicted and controlled or it could erode the surfaces of nearby hardware. This analysis indicated that the lunar dust is ejected at an angle of less than 3 degrees above the surface, the effects of which can be mitigated by a modest berm of lunar soil. These results assume that future lunar landers will use a single engine. The analysis would need to be adjusted for a multiengine lander. Figure 3 is a detailed schematic of the Lunar Module camera calibration math model. In this chart, formulas relating the known quantities, such as sun angle and Lunar Module dimensions, to the unknown quantities are depicted. The camera angle ψ is determined by measurement of the imaged aspect ratio of a crater, where the crater is assumed to be circular. The final solution is the determination of the camera calibration factor, α. Figure 4 is a detailed schematic of the dust angle math model, which again relates known to unknown parameters. The known parameters now include the camera calibration factor and Lunar Module dimensions.
The final computation is the ejected dust angle, as a function of Lunar Module altitude.
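The crater-aspect-ratio step can be illustrated under the standard foreshortening assumption (the abstract does not reproduce its math model): a circular rim viewed at tilt ψ from the vertical images as an ellipse whose axis ratio satisfies cos ψ = minor/major. The function name below is ours.

```python
import math

def camera_tilt_deg(minor_axis, major_axis):
    """Tilt of the line of sight from vertical, assuming the imaged crater
    rim is a circle foreshortened into an ellipse: cos(psi) = b/a.
    A sketch of the aspect-ratio idea, not the NTRS model itself."""
    if not 0 < minor_axis <= major_axis:
        raise ValueError("expect 0 < minor <= major")
    return math.degrees(math.acos(minor_axis / major_axis))

# A crater imaged with a 2:1 aspect ratio implies a 60-degree tilt:
tilt = camera_tilt_deg(1.0, 2.0)   # 60 degrees (within floating-point error)
```

In the calibration chain described above, this recovered camera angle feeds, together with the Lunar Module dimensions and sun angle, into the calibration factor and ultimately the ejected dust angle.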
2014-12-01
Enceladus (visible in the lower-left corner of the image) is but a speck before enormous Saturn, but even a small moon can generate big waves of excitement throughout the scientific community. Enceladus, only 313 miles (504 kilometers) across, spurts vapor jets from its south pole. The presence of these jets from Enceladus has been the subject of intense study since they were discovered by Cassini. Their presence may point to a sub-surface water reservoir. This view looks toward the unilluminated side of the rings from about 2 degrees below the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on Oct. 20, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 589,000 miles (948,000 kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 26 degrees. Image scale is 35 miles (57 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18296
NASA Technical Reports Server (NTRS)
1999-01-01
This narrow angle image taken by Cassini's camera system of the Moon is one of the best of a sequence of narrow angle frames taken as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. The 80 millisecond exposure was taken through a spectral filter centered at 0.33 microns; the filter bandpass was 85 Angstroms wide. The spatial scale of the image is about 1.4 miles per pixel (about 2.3 kilometers). The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ. Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.
Active retroreflector to measure the rotational orientation in conjunction with a laser tracker
NASA Astrophysics Data System (ADS)
Hofherr, O.; Wachten, C.; Müller, C.; Reinecke, H.
2012-10-01
High precision optical non-contact position measurement is a key technology in modern engineering. Laser trackers (LT) can determine accurately x-y-z coordinates of passive retroreflectors. Next-generation systems answer the additional need to measure an object's rotational orientation (pitch, yaw, roll). These devices are based on photogrammetry or on enhanced retroreflectors. However, photogrammetry relies on camera systems and time-consuming image processing. Enhanced retroreflectors analyze the LT's beam but are restricted in roll angle measurements. Here we present an integrated laser based method to evaluate all six degrees of freedom. An active retroreflector directly analyzes its orientation to the LT's beam path by outcoupling laser light on detectors. A proof of concept prototype has been designed with a specified measuring range of 360° for roll angle measurements and +/-15° for pitch and yaw angle respectively. The prototype's optical design is inspired by a cat's eye retroreflector. First results are promising and further improvements are under development. We anticipate our method to facilitate simple and cost-effective six degrees of freedom measurements. Furthermore, for industrial applications wide customizations are possible, e.g. adaptation of measuring range, optimization of accuracy, and further system miniaturization.
NASA Astrophysics Data System (ADS)
Jolliff, B. L.
2017-12-01
Exploring the South Pole-Aitken basin (SPA), one of the key unsampled geologic terranes on the Moon, is a high priority for Solar System science. As the largest and oldest recognizable impact basin on the Moon, it anchors the heavy bombardment chronology. It is thus a key target for sample return to better understand the impact flux in the Solar System between formation of the Moon and 3.9 Ga when Imbrium, one of the last of the great lunar impact basins, formed. Exploration of SPA has implications for understanding early habitable environments on the terrestrial planets. Global mineralogical and compositional data exist from the Clementine UV-VIS camera, the Lunar Prospector Gamma Ray Spectrometer, the Moon Mineralogy Mapper (M3) on Chandrayaan-1, the Chang'E-1 Imaging Interferometer, the spectral suite on SELENE, and the Lunar Reconnaissance Orbiter Cameras (LROC) Wide Angle Camera (WAC) and Diviner thermal radiometer. Integration of data sets enables synergistic assessment of geology and distribution of units across multiple spatial scales. Mineralogical assessment using hyperspectral data indicates spatial relationships with mineralogical signatures, e.g., central peaks of complex craters, consistent with inferred SPA basin structure and melt differentiation (Moriarty & Pieters, 2015, JGR-P 118). Delineation of mare, cryptomare, and nonmare surfaces is key to interpreting compositional mixing in the formation of SPA regolith to interpret remotely sensed data, and for scientific assessment of landing sites. LROC Narrow Angle Camera (NAC) images show the location and distribution of >0.5 m boulders and fresh craters that constitute the main threats to automated landers and thus provide critical information for landing site assessment and planning. 
NAC images suitable for geometric stereo derivation, the digital terrain models so derived (controlled with Lunar Orbiter Laser Altimeter (LOLA) data), and oblique NAC images made with large slews of the spacecraft are crucial to both scientific and landing-site assessments. These images, however, require favorable illumination and significant spacecraft resources. Thus they make up only a small percentage of all of the images taken. It is essential for future exploration to support continued LRO operation for these critical datasets.
2018-01-15
In this view, individual layers of haze can be distinguished in the upper atmosphere of Titan, Saturn's largest moon. Titan's atmosphere features a rich and complex chemistry originating from methane and nitrogen and evolving into complex molecules, eventually forming the smog that surrounds the moon. This natural color image was taken in visible light with the Cassini spacecraft wide-angle camera on March 31, 2005, at a distance of approximately 20,556 miles (33,083 kilometers) from Titan. The view looks toward the north polar region on the moon's night side. Part of Titan's sunlit crescent is visible at right. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21902
NASA Technical Reports Server (NTRS)
2001-01-01
These images, taken through the wide angle camera near closest approach in the deep near-infrared methane band, combined with filters which sense electromagnetic radiation of orthogonal polarization, show that the light from the poles is polarized. That is, the poles appear bright in one image, and dark in the other. Polarized light is most readily scattered by aerosols. These images indicate that the aerosol particles at Jupiter's poles are small and likely consist of aggregates of even smaller particles, whereas the particles at the equator and covering the Great Red Spot are larger. Images like these will allow scientists to ascertain the distribution, size and shape of aerosols, and consequently, the distribution of heat, in Jupiter's atmosphere.
2005-01-20
Atmospheric features in Saturn's north polar region are revealed in spectacular detail in this Cassini image, taken in the near infrared spectral region, where methane gas is not very absorbing. The dark shadows of Saturn's rings drape across the planet, creating the illusion of atmospheric bands. Dots of bright clouds give the appearance that this is an active place. The image was taken with the Cassini spacecraft wide angle camera on Dec. 14, 2004, at a distance of 717,800 kilometers (446,100 miles) from Saturn through a filter sensitive to wavelengths of infrared light centered at 939 nanometers. The image scale is about 43 kilometers (27 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA06567
Two Perspectives on Forest Fire
NASA Technical Reports Server (NTRS)
2002-01-01
Multi-angle Imaging Spectroradiometer (MISR) images of smoke plumes from wildfires in western Montana acquired on August 14, 2000. A portion of Flathead Lake is visible at the top, and the Bitterroot Range traverses the images. The left view is from MISR's vertical-viewing (nadir) camera. The right view is from the camera that looks forward at a steep angle (60 degrees). The smoke location and extent are far more visible when seen at this highly oblique angle. However, vegetation is much darker in the forward view. A brown burn scar is located nearly in the exact center of the nadir image, while in the high-angle view it is shrouded in smoke. Also visible in the center and upper right of the images, and more obvious in the clearer nadir view, are checkerboard patterns on the surface associated with land ownership boundaries and logging. Compare these images with the high resolution infrared imagery captured nearby by Landsat 7 half an hour earlier. Images by NASA/GSFC/JPL, MISR Science Team.
Optical Transient Monitor (OTM) for BOOTES Project
NASA Astrophysics Data System (ADS)
Páta, P.; Bernas, M.; Castro-Tirado, A. J.; Hudec, R.
2003-04-01
The Optical Transient Monitor (OTM) is software for controlling the three wide- and ultra-wide-field cameras of the BOOTES (Burst Observer and Optical Transient Exploring System) station. The OTM is PC-based and is a powerful tool for taking images from two SBIG CCD cameras at the same time or from one camera only. The control program for the BOOTES cameras runs under Windows 98 or MS-DOS; a version for Windows 2000 is now in preparation. There are five main supported modes of operation. The OTM program can control the cameras and evaluate image data without human interaction.
A Regional View of the Libya Montes
NASA Technical Reports Server (NTRS)
2000-01-01
The Libya Montes are a ring of mountains uplifted by the giant impact that created the Isidis basin to the north. During 1999, this region became one of the top two that were being considered for the now-canceled Mars Surveyor 2001 Lander. The Isidis basin is very, very ancient. Thus, the mountains that form its rims would contain some of the oldest rocks available at the Martian surface, and a landing in this region might potentially provide information about conditions on early Mars. In May 1999, the wide angle cameras of the Mars Global Surveyor Mars Orbiter Camera system were used in what was called the 'Geodesy Campaign' to obtain nearly global maps of the planet in color and in stereo at resolutions of 240 m/pixel (787 ft/pixel) for the red camera and 480 m/pixel (1575 ft/pixel) for the blue. Shown here are color and stereo views constructed from mosaics of the Geodesy Campaign images for the Libya Montes region of Mars. After they formed by giant impact, the Libya Mountains and valleys were subsequently modified and eroded by other processes, including wind, impact cratering, and flow of liquid water to make the many small valleys that can be seen running northward in the scene. The pictures shown here cover nearly 122,000 square kilometers (47,000 square miles) between latitudes 0.1°N and 4.0°N, longitudes 271.5°W and 279.9°W. The mosaics are about 518 km (322 mi) wide by 235 km (146 mi) high. Red-blue '3-D' glasses are needed to view the stereo image.
NASA Astrophysics Data System (ADS)
Schuster, Norbert; Franks, John
2011-06-01
In the 8-12 micron waveband, Focal Plane Arrays (FPA) are available with a 17 micron pixel pitch in different array sizes (e.g. 512 x 480 pixels and 320 x 240 pixels) and with excellent electrical properties. Many applications become possible using this new type of IR detector, which will become the future standard in uncooled technology. Lenses with an f-number faster than f/1.5 minimize the diffraction impact on the spatial resolution and guarantee a high thermal resolution for uncooled cameras. Both effects will be quantified. The distinction between the Traditional f-number (TF) and the Radiometric f-number (RF) is discussed. Lenses with different focal lengths are required for applications in a variety of markets. They are classified by their horizontal field of view (HFOV). Respecting the requirements of high-volume markets, several two-lens solutions will be discussed. A commonly accepted parameter of spatial resolution is the Modulation Transfer Function (MTF) value at the Nyquist frequency of the detector (here 30 cy/mm). This parameter of resolution will be presented versus field of view. Wide Angle and Super Wide Angle lenses are susceptible to low relative illumination in the corners of the detector. Measures to reduce this drop to an acceptable value are presented.
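The figures quoted above can be cross-checked with two standard formulas: the detector Nyquist frequency is 1/(2p) for pixel pitch p, and the diffraction cutoff frequency of an aberration-free lens is roughly 1/(λN) for wavelength λ and f-number N. A back-of-envelope sketch (not the authors' design code; the chosen wavelength is an assumption within the stated 8-12 micron band):

```python
# Nyquist frequency for a 17 micron pixel pitch, quoted above as ~30 cy/mm.
pitch_mm = 0.017                           # 17 micron pixel pitch
nyquist = 1.0 / (2.0 * pitch_mm)           # ~29.4 cy/mm

# Diffraction-limited cutoff for an f/1.5 lens at an assumed 10 micron
# mid-band wavelength; comfortably above Nyquist, as the abstract implies.
wavelength_mm = 0.010
f_number = 1.5
cutoff = 1.0 / (wavelength_mm * f_number)  # ~66.7 cy/mm
```

Since the cutoff sits well above the 30 cy/mm Nyquist frequency, an f/1.5 lens leaves useful MTF at the detector's limiting spatial frequency, which is the point the abstract makes about fast f-numbers.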
2016-11-28
Saturn's icy moon Mimas is dwarfed by the planet's enormous rings. Because Mimas (near lower left) appears tiny by comparison, it might seem that the rings would be far more massive, but this is not the case. Scientists think the rings are no more than a few times as massive as Mimas, or perhaps just a fraction of Mimas' mass. Cassini is expected to determine the mass of Saturn's rings to within just a few hundredths of Mimas' mass as the mission winds down by tracking radio signals from the spacecraft as it flies close to the rings. The rings, which are made of small, icy particles spread over a vast area, are extremely thin -- generally no thicker than the height of a house. Thus, despite their giant proportions, the rings contain a surprisingly small amount of material. Mimas is 246 miles (396 kilometers) wide. This view looks toward the sunlit side of the rings from about 6 degrees above the ring plane. The image was taken in red light with the Cassini spacecraft wide-angle camera on July 21, 2016. The view was obtained at a distance of approximately 564,000 miles (907,000 kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 31 degrees. Image scale is 34 miles (54 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20509
Synchronizing Photography For High-Speed-Engine Research
NASA Technical Reports Server (NTRS)
Chun, K. S.
1989-01-01
Light flashes when shaft reaches predetermined angle. Synchronization system facilitates visualization of flow in high-speed internal-combustion engines. Designed for cinematography and holographic interferometry, system synchronizes camera and light source with predetermined rotational angle of engine shaft. 10-bit resolution of absolute optical shaft encoder adapted, and 2-to-the-tenth-power (1,024) combinations of 10-bit binary data computed to corresponding angle values. Precomputed angle values programmed into EPROMs (erasable programmable read-only memories) for use as angle lookup table. Resolves shaft angle to within 0.35 degree at rotational speeds up to 73,240 revolutions per minute.
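The lookup-table idea above is simple to sketch: a 10-bit absolute encoder yields 2^10 = 1,024 distinct codes, and each code maps to a precomputed angle, giving the stated ~0.35-degree resolution (360/1024 ≈ 0.352). A minimal illustration (names and structure are mine, not from the NTRS brief):

```python
# Precomputed angle lookup table, analogous to the values burned into EPROMs.
ENCODER_BITS = 10
CODES = 2 ** ENCODER_BITS              # 1,024 distinct shaft positions

# One full revolution mapped uniformly across all encoder codes.
angle_table = [code * 360.0 / CODES for code in range(CODES)]

def shaft_angle(code: int) -> float:
    """Return the shaft angle in degrees for a raw 10-bit encoder code."""
    return angle_table[code & (CODES - 1)]   # mask keeps the index in range

resolution = 360.0 / CODES                   # 0.3516 deg, quoted as ~0.35 deg
```

Precomputing the table trades memory for speed, which is the point of the EPROM approach: at 73,240 rpm there is no time to compute angles on the fly with 1980s hardware.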
Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott
2015-01-01
The purpose of this study was to establish intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and a single video camera, between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants aged 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single-camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose-fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited.
Two-Camera Acquisition and Tracking of a Flying Target
NASA Technical Reports Server (NTRS)
Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter
2008-01-01
A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
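The coarse hand-off step, converting a pixel location in the stationary camera into an approximate azimuth/elevation for aiming the gimbal, can be sketched with a simple pinhole model. This is an illustration only: the NTRS report describes an empirical pixel-to-az/el calibration, and the function name, parameters, and pinhole assumption here are mine.

```python
import math

def pixel_to_az_el(px, py, cx, cy, focal_px, az0_deg, el0_deg):
    """Approximate azimuth/elevation (degrees) of a target at pixel (px, py),
    given the optical center (cx, cy), the focal length in pixels, and the
    boresight pointing (az0_deg, el0_deg). Assumes a distortion-free pinhole
    camera; a real all-sky lens would need the empirical calibration map."""
    az = az0_deg + math.degrees(math.atan((px - cx) / focal_px))
    el = el0_deg + math.degrees(math.atan((cy - py) / focal_px))
    return az, el
```

The returned az/el only needs to be accurate to within the narrow camera's few-degree field of view; the gimballed camera then closes the loop on the target image itself.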
Robust sky light polarization detection with an S-wave plate in a light field camera.
Zhang, Wenjing; Zhang, Xuanzhe; Cao, Yu; Liu, Haibo; Liu, Zejin
2016-05-01
The sky light polarization navigator has many advantages, such as low cost, no decrease in accuracy with continuous operation, etc. However, current celestial polarization measurement methods often suffer from low performance when the sky is covered by clouds, which reduce the accuracy of navigation. In this paper we introduce a new method and structure based on a handheld light field camera and a radial polarizer, composed of an S-wave plate and a linear polarizer, to detect the sky light polarization pattern across a wide field of view in a single snapshot. Each micro-subimage has a special intensity distribution. After extracting the texture feature of these subimages, stable distribution information of the angle of polarization under a cloudy sky can be obtained. Our experimental results match well with the predicted properties of the theory. Because the polarization pattern is obtained through image processing, rather than traditional methods based on mathematical computation, this method is less sensitive to errors of pixel gray value and thus has better anti-interference performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, Andrew P.; Hale, Layton; Kim, Peter
Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning the multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
New gonioscopy system using only infrared light.
Sugimoto, Kota; Ito, Kunio; Matsunaga, Koichi; Miura, Katsuya; Esaki, Koji; Uji, Yukitaka
2005-08-01
To describe an infrared gonioscopy system designed to observe the anterior chamber angle under natural mydriasis in a completely darkened room. An infrared light filter was used to modify the light source of the slit-lamp microscope. A television monitor connected to a CCD monochrome camera was used to indirectly observe the angle. Use of the infrared system enabled observation of the angle under natural mydriasis in a completely darkened room. Infrared gonioscopy is a useful procedure for the observation of the angle under natural mydriasis.
STS-31 crew activity on the middeck of the Earth-orbiting Discovery, OV-103
1990-04-29
STS031-05-002 (24-29 April 1990) --- A 35mm camera with a "fish eye" lens captured this high angle image on Discovery's middeck. Astronaut Kathryn D. Sullivan works with the IMAX camera in foreground, while Astronaut Steven A. Hawley consults a checklist in corner. An Arriflex motion picture camera records student ion arc experiment in apparatus mounted on stowage locker. The experiment was the project of Gregory S. Peterson, currently a student at Utah State University.
Video sensor with range measurement capability
NASA Technical Reports Server (NTRS)
Howard, Richard T. (Inventor); Briscoe, Jeri M. (Inventor); Corder, Eric L. (Inventor); Broderick, David J. (Inventor)
2008-01-01
A video sensor device is provided which incorporates a rangefinder function. The device includes a single video camera and a fixed laser spaced a predetermined distance from the camera for, when activated, producing a laser beam. A diffractive optic element divides the beam so that multiple light spots are produced on a target object. A processor calculates the range to the object based on the known camera-laser spacing and the angles determined from the light spots in the video images produced by the camera.
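The underlying principle is classic laser triangulation: with the laser offset from the camera by a known baseline and projected parallel to the camera boresight, the angle at which a spot appears off-axis in the image fixes the range. A minimal sketch of that geometry (illustrative only; the symbol names are assumptions, not taken from the patent):

```python
import math

def range_from_spot(baseline_m, spot_angle_deg):
    """Range to the target, given the camera-laser baseline (meters) and the
    angle (degrees) between the camera boresight and the observed spot
    direction. Assumes the laser beam is parallel to the boresight, so
    tan(angle) = baseline / range."""
    return baseline_m / math.tan(math.radians(spot_angle_deg))
```

Note the sensitivity implied by the formula: as range grows, the spot angle shrinks toward zero, so a longer baseline (or multiple spots, as the diffractive element provides) improves the range estimate.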
NASA Astrophysics Data System (ADS)
Fernandez-Borda, R.; Waluschka, E.; Pellicori, S.; Martins, J. V.; Ramos-Izquierdo, L.; Cieslak, J. D.; Thompson, P.
2009-08-01
The design and construction of wide-FOV imaging polarimeters for use in atmospheric remote sensing requires significant attention to the prevention of artificial polarization induced by the optical elements. Surfaces, coatings, and angles of incidence throughout the system must be carefully designed in order to minimize these artifacts, because the remaining instrumental bias polarization is the main factor which drives the final polarimetric accuracy of the system. In this work, we present a detailed evaluation and analysis to explore the possibility of retrieving the initial polarization state of the light traveling through a generic system that has inherent instrumental polarization. Our case is a wide-FOV lens and a splitter device. In particular, we chose as the splitter device a Philips-type prism, because it is able to divide the signal into 3 independent channels that can be simultaneously analyzed to retrieve the first three elements of the Stokes vector (in atmospheric applications the elliptical polarization can be neglected [1]). The Philips-type configuration is a versatile, compact and robust prism device that is typically used in three-color camera systems. It has been used in some commercial polarimetric cameras which do not claim high-accuracy polarization measurements [2]. With this work, we address the accuracy of our polarization inversion and measurements made with the Philips-type beam divider.
NASA Astrophysics Data System (ADS)
Buczkowski, S.; Martins, J.; Fernandez-Borda, R.; Cieslak, D.; Hall, J.
2013-12-01
The UMBC Rainbow Polarimetric Imager is a small form factor VIS imaging polarimeter suitable for use on a number of platforms. An optical system based on a Philips prism with three Bayer filter color detectors, each detecting a separate polarization state, allows simultaneous detection of polarization and spectral information. A Mueller matrix-like calibration scheme corrects for polarization artifacts in the optical train and allows retrieval of the polarization state of incoming light to better than 0.5%. Coupled with wide field of view optics (~90°), RPI can capture images of cloudbows over a wide range of aircraft headings and solar zenith angles for retrieval of cloud droplet size distribution (DSD) parameters. In May-June 2012, RPI was flown in a nadir port on the NASA DC-8 during the DC3 field campaign. We will show examples of cloudbow DSD parameter retrievals from the campaign to demonstrate the efficacy of such a system for terrestrial atmospheric remote sensing. RPI image from DC3 06/15/2012 flight. Left panel is the raw image from the RPI 90° camera. Middle panel is the Stokes 'q' parameter retrieved from the full three-camera dataset. Right panel is a horizontal cut in 'q' through the glory. Both middle and right panels clearly show cloudbow features which can be fit to infer cloud DSD parameters.
22. VAL, VIEW OF PROJECTILE LOADING DECK LOOKING NORTHEAST TOWARD TOP OF CONCRETE 'A' FRAME STRUCTURE SHOWING DRIVE CABLES, DRIVE GEAR, BOTTOM OF CAMERA TOWER AND 'CROWS NEST' CONTROL ROOM. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
2004-06-17
This image shows the comet Wild 2, which NASA's Stardust spacecraft flew by on Jan. 2, 2004. This image is the closest short exposure of the comet, taken at an 11.4-degree phase angle, the angle between the camera, comet and the Sun. http://photojournal.jpl.nasa.gov/catalog/PIA06285
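The phase angle quoted in captions like this one is the Sun-target-observer angle, i.e. the angle measured at the comet between the direction to the Sun and the direction to the camera. A minimal sketch computing it from position vectors (the coordinates are hypothetical, purely for illustration):

```python
import math

def phase_angle_deg(target, sun, observer):
    """Angle at the target body between the directions to the Sun and to the
    observer (the standard definition of phase angle), in degrees."""
    to_sun = [s - t for s, t in zip(sun, target)]
    to_obs = [o - t for o, t in zip(observer, target)]
    dot = sum(a * b for a, b in zip(to_sun, to_obs))
    norm = math.dist(sun, target) * math.dist(observer, target)
    return math.degrees(math.acos(dot / norm))
```

A small phase angle like the 11.4 degrees here means the Sun was nearly behind the spacecraft, so the comet was viewed close to fully illuminated.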
Maximizing the Performance of Automated Low Cost All-sky Cameras
NASA Technical Reports Server (NTRS)
Bettonvil, F.
2011-01-01
Thanks to the wide spread of digital camera technology in the consumer market, a steady increase in the number of active All-sky cameras has been noticed Europe-wide. In this paper I look into the details of such All-sky systems and try to optimize their performance in terms of astrometric accuracy, velocity determination, and photometry. With autonomous operation in mind, suggestions are made for an optimal low-cost All-sky camera.
LROC Targeted Observations for the Next Generation of Scientific Exploration
NASA Astrophysics Data System (ADS)
Jolliff, B. L.
2015-12-01
Imaging of the Moon at high spatial resolution (0.5 to 2 mpp) by the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NAC) plus topographic data derived from LROC NAC and WAC (Wide Angle Camera) and LOLA (Lunar Orbiting Laser Altimeter), coupled with recently obtained hyperspectral NIR and thermal data, permit studies of composition, mineralogy, and geologic context at essentially an outcrop scale. Such studies pave the way for future landed and sample return missions for high science priority targets. Among such targets are (1) the youngest volcanic rocks on the Moon, including mare basalts formed as recently as ~1 Ga, and irregular mare patches (IMPs) that appear to be even younger [1]; (2) volcanic rocks and complexes with compositions more silica-rich than mare basalts [2-4]; (3) differentiated impact-melt deposits [5,6], ancient volcanics, and compositional anomalies within the South Pole-Aitken basin; (4) exposures of recently discovered key crustal rock types in uplifted structures such as essentially pure anorthosite [7] and spinel-rich rocks [8]; and (5) frozen volatile-element-rich deposits in polar areas [9]. Important data sets include feature sequences of paired NAC images obtained under similar illumination conditions, NAC geometric stereo, from which high-resolution DTMs can be made, and photometric sequences useful for assessing composition in areas of mature cover soils. Examples of each of these target types will be discussed in context of potential future missions. References: [1] Braden et al. (2014) Nat. Geo. 7, 787-791. [2] Glotch et al. (2010) Science, 329, 1510-1513. [3] Greenhagen et al. (2010) Science, 329, 1507-1509. [4] Jolliff et al. (2011) Nat. Geo. 4, 566-571. [5] Vaughan et al (2013) PSS 91, 101-106. [6] Hurwitz and Kring (2014) J. Geophys. Res. 119, 1110-1133 [7] Ohtake et al. (2009) Nature, 461, 236-241 [8] Pieters et al. (2014) Am. Min. 99, 1893-1910. [9] Colaprete et al. (2010) Science 330, 463-468.
Morphologic Analysis of Lunar Craters in the Simple-to-Complex Transition
NASA Astrophysics Data System (ADS)
Chandnani, M.; Herrick, R. R.; Kramer, G. Y.
2015-12-01
The diameter range of 15 km to 20 km on the Moon is within the transition from simple to complex impact craters. We examined 207 well-preserved craters in this diameter range, distributed across the Moon, using high-resolution Lunar Reconnaissance Orbiter Camera Wide Angle Camera (WAC) mosaic and Narrow Angle Camera (NAC) data. A map of the distribution of the 207 craters on the Moon, made using the global LROC WAC mosaic, is attached to the abstract. By examining craters of similar diameter, impact energy is held nearly constant, so differences in shape and morphology must be due to either target (e.g., porosity, density, coherence, layering) or impactor (e.g., velocity, density) properties. On the basis of crater morphology, topographic profiles, and depth-diameter ratio, the craters were classified into simple craters, craters with slumped walls, craters with both slumping and terracing, those containing a central uplift only, those with a central uplift and slumping, and craters with a central uplift accompanied by both slumping and terracing, as shown in the image. It was observed that simple craters and craters with slumped walls occur predominantly on the lunar highlands. The majority of the craters with terraced walls and all classes of central uplifts were observed predominantly on the mare. In short, in this size range craters in the highlands were generally simple craters, occasionally with some slumped material in the center, and the more developed features (terracing, central peaks) were associated with mare craters. This is somewhat counterintuitive, as we expect the highlands to be generally weaker and less consolidated than the mare. We hypothesize that the presence of rheologic layering in the mare may be the cause of the more complex features that we observe.
Apollo 8 Mission image, Farside of Moon
1968-12-21
Apollo 8, Farside of Moon. Image taken on Revolution 4. Camera Tilt Mode: Vertical Stereo. Sun Angle: 13. Original Film Magazine was labeled D. Camera Data: 70mm Hasselblad. Lens: 80mm; F-Stop: F/2.8; Shutter Speed: 1/250 second. Film Type: Kodak SO-3400 Black and White, ASA 40. Flight Date: December 21-27, 1968.
The Effect of Selected Cinemagraphic Elements on Audience Perception of Mediated Concepts.
ERIC Educational Resources Information Center
Orr, Quinn
This study explores cinemagraphic and visual elements and their inter-relations through the reinterpretation of previous research and literature. The cinemagraphic elements of visual images (camera angle, camera motion, subject motion, color, and lighting) work as a language requiring a proper grammar for the messages to be conveyed in their…
Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera
NASA Technical Reports Server (NTRS)
Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.
1988-01-01
The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 A wavelength range.
NASA Astrophysics Data System (ADS)
Azhar, N.; Saad, W. H. M.; Manap, N. A.; Saad, N. M.; Syafeeza, A. R.
2017-06-01
This study presents an approach to 3D image reconstruction using an autonomous robotic arm for the image acquisition process. A low-cost automated imaging platform is created using a pair of G15 servo motors connected in series to an Arduino UNO as the main microcontroller. Two sets of sequential images were obtained using different projection angles of the camera. A silhouette-based approach is used in this study for 3D reconstruction from the sequential images captured from several different angles of the object. In addition, an analysis of the effect of the number of sequential images on the accuracy of the 3D model reconstruction was carried out with a fixed projection angle of the camera. The elements affecting the 3D reconstruction are discussed, and the overall results of the analysis are summarized for the imaging platform prototype.
Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles
Cortés, Camilo; Unzueta, Luis; de los Reyes-Guzmán, Ana; Ruiz, Oscar E.; Flórez, Julián
2016-01-01
In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address these limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and the exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method's accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, the method's accuracy is adequate for RAR. PMID:27403044
Functional range of movement of the hand: declination angles to reachable space.
Pham, Hai Trieu; Pathirana, Pubudu N; Caelli, Terry
2014-01-01
The measurement of the range of hand joint movement is an essential part of clinical practice and rehabilitation. Current methods use three finger joint declination angles of the metacarpophalangeal, proximal interphalangeal and distal interphalangeal joints. In this paper we propose an alternate form of measurement for finger movement. Using the notion of reachable space instead of declination angles has significant advantages. Firstly, it provides a visual and quantifiable method that therapists, insurance companies and patients can easily use to understand the functional capabilities of the hand. Secondly, it eliminates the redundant declination angle constraints. Finally, reachable space, defined by a set of reachable fingertip positions, can be measured and constructed by using a modern camera such as the Creative Senz3D or built-in hand gesture sensors such as the Leap Motion Controller. Use of cameras or optical-type sensors for this purpose has considerable benefits, such as eliminating or minimizing therapist error and enabling non-contact measurement, in addition to valuable time savings for the clinician. A comparison between using declination angles and reachable space was made based on Hume's experiment on functional range of movement to demonstrate the efficiency of this new approach.
Electronic Still Camera view of Aft end of Wide Field/Planetary Camera in HST
1993-12-06
S61-E-015 (6 Dec 1993) --- A close-up view of the aft part of the new Wide Field/Planetary Camera (WFPC-II) installed on the Hubble Space Telescope (HST). WFPC-II was photographed with the Electronic Still Camera (ESC) from inside Endeavour's cabin as astronauts F. Story Musgrave and Jeffrey A. Hoffman moved it from its stowage position onto the giant telescope. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.
Mapping and correcting the influence of gaze position on pupil size measurements
Petrov, Alexander A.
2015-01-01
Pupil size is correlated with a wide variety of important cognitive variables and is increasingly being used by cognitive scientists. Pupil data can be recorded inexpensively and non-invasively by many commonly used video-based eye-tracking cameras. Despite the relative ease of data collection and increasing prevalence of pupil data in the cognitive literature, researchers often underestimate the methodological challenges associated with controlling for confounds that can result in misinterpretation of their data. One serious confound that is often not properly controlled is pupil foreshortening error (PFE)—the foreshortening of the pupil image as the eye rotates away from the camera. Here we systematically map PFE using an artificial eye model and then apply a geometric model correction. Three artificial eyes with different fixed pupil sizes were used to systematically measure changes in pupil size as a function of gaze position with a desktop EyeLink 1000 tracker. A grid-based map of pupil measurements was recorded with each artificial eye across three experimental layouts of the eye-tracking camera and display. Large, systematic deviations in pupil size were observed across all nine maps. The measured PFE was corrected by a geometric model that expressed the foreshortening of the pupil area as a function of the cosine of the angle between the eye-to-camera axis and the eye-to-stimulus axis. The model reduced the root mean squared error of pupil measurements by 82.5 % when the model parameters were pre-set to the physical layout dimensions, and by 97.5 % when they were optimized to fit the empirical error surface. PMID:25953668
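The correction described in this abstract reduces to a single trigonometric relation: the measured pupil area shrinks with the cosine of the angle between the eye-to-camera axis and the eye-to-stimulus axis. A minimal sketch of that relation (illustrative coordinates and function names, not the authors' code):

```python
import math

def angle_between(u, v):
    """Angle in radians between two 3-D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def corrected_pupil_area(measured_area, eye_pos, camera_pos, stimulus_pos):
    """Undo pupil foreshortening, modeled as measured = true * cos(theta)."""
    to_camera = [c - e for c, e in zip(camera_pos, eye_pos)]
    to_stimulus = [s - e for s, e in zip(stimulus_pos, eye_pos)]
    theta = angle_between(to_camera, to_stimulus)
    return measured_area / math.cos(theta)
```

With the two axes aligned the correction is the identity; at 45 degrees off-axis the measured area is scaled back up by a factor of the square root of two.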
The Beagle 2 Stereo Camera System: Scientific Objectives and Design Characteristics
NASA Astrophysics Data System (ADS)
Griffiths, A.; Coates, A.; Josset, J.; Paar, G.; Sims, M.
2003-04-01
The Stereo Camera System (SCS) will provide wide-angle (48 degree) multi-spectral stereo imaging of the Beagle 2 landing site in Isidis Planitia with an angular resolution of 0.75 milliradians. Based on the SpaceX Modular Micro-Imager, the SCS is composed of twin cameras (each with a 1024 by 1024 pixel frame transfer CCD) and twin filter wheel units (with a combined total of 24 filters). The primary mission objective is to construct a digital elevation model of the area in reach of the lander’s robot arm. The SCS specifications and the following baseline studies are described: panoramic RGB colour imaging of the landing site, and panoramic multi-spectral imaging at 12 distinct wavelengths to study the mineralogy of the landing site; solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged are multi-spectral observations of Phobos & Deimos (observations of the moons relative to background stars will be used to determine the lander’s location and orientation relative to the Martian surface), monitoring of the landing site to detect temporal changes, observation of the actions and effects of the other PAW experiments (including rock texture studies with a close-up lens) and collaborative observations with the Mars Express orbiter instrument teams. Due to be launched in May of this year, the total system mass is 360 g, the required volume envelope is 747 cm^3 and the average power consumption is 1.8 W. A 10 Mbit/s RS422 bus connects each camera to the lander common electronics.
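A stereo pair such as the SCS twin cameras recovers depth, and hence a digital elevation model, from disparity: for rectified views with baseline B and focal length f (in pixels), the pinhole relation gives Z = fB/d. The sketch below uses illustrative numbers, not the flight geometry:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo depth Z = f * B / d for a rectified image pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. an assumed 1000 px focal length, 0.2 m baseline, 50 px disparity
z = depth_from_disparity(1000.0, 0.2, 50.0)  # 4.0 m
```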
NASA Astrophysics Data System (ADS)
Lu, Qun; Yu, Li; Zhang, Dan; Zhang, Xuebo
2018-01-01
This paper presents a global adaptive controller that simultaneously solves tracking and regulation for wheeled mobile robots with unknown depth and uncalibrated camera-to-robot extrinsic parameters. The rotational angle and the scaled translation between the current camera frame and the reference camera frame, as well as those between the desired camera frame and the reference camera frame, can be calculated in real time by using pose estimation techniques. A transformed system is first obtained, for which an adaptive controller is then designed to accomplish both tracking and regulation tasks; the controller synthesis is based on Lyapunov's direct method. Finally, the effectiveness of the proposed method is illustrated by a simulation study.
NASA Astrophysics Data System (ADS)
Borisov, A. P.
2018-01-01
The article is devoted to the development of a software and hardware complex for investigating the grinding process on a pendulum deformer. The hardware part of this complex is the Raspberry Pi model 2B platform, to which are connected a contactless angle sensor, which provides data on the deviation angle of the pendulum surface, USB cameras, which capture grain images before and after grinding, and stepper motors, which lift the pendulum surface and adjust the clearance between the pendulum and the supporting surfaces. The software part of the complex is written in C# and allows receiving data from the sensor and USB cameras, processing the received data, and controlling the stepper motors in manual and automatic mode. The conducted studies show that the rational mode is a deviation of the pendulum surface by an angle of 40°, with the grain located in the central zone of the support surface, regardless of the orientation of the grain in space. The contactless angle sensor also makes it possible to calculate the speed and acceleration of the pendulum surface, the vitreousness of the grain, and the energy consumption for grinding. From the photographs obtained with the USB cameras, the grain area before and after grinding is calculated and the work of the pendulum deformer is determined on the basis of the Rebinder formula.
Study on the measurement system of the target polarization characteristics and test
NASA Astrophysics Data System (ADS)
Fu, Qiang; Zhu, Yong; Zhang, Su; Duan, Jin; Yang, Di; Zhan, Juntong; Wang, Xiaoman; Jiang, Hui-Lin
2015-10-01
Polarization imaging detection adds polarization information to intensity imaging and has extensive applications in military, civil and other fields, which makes research on the polarization characteristics of targets particularly important. This paper introduces research on a polarization reflection model that describes the distribution of scattered light energy over the reflecting hemisphere, and proposes a measurement system for target polarization characteristics consisting of an irradiation light source, a measuring turntable and a camera. The illumination is a direct light source, with both a laser and a xenon lamp available so that the source can be changed according to the test needs. A hemispherical structure is used for the measurements, with the material sample placed near its base; azimuth and pitch rotation mechanisms allow the azimuth and elevation angles of observation to be adjusted manually. The measuring camera performs the polarization tests with a motor-controlled rotating polarizer, ensuring measurement accuracy and imaging resolution. A test platform was set up from existing laboratory equipment, using a 532 nm laser, a camera with a linear polarizer, and matching transmitting and receiving optical systems. For different materials such as wood, metal and plastic, and for different observation azimuth and zenith angles, the polarization scattering properties of the targets were measured under different exposure conditions, realizing the measurement of the pBRDF over the hemispherical space.
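A rotating-polarizer measurement of the kind described here is typically reduced to linear Stokes parameters from intensities recorded at four polarizer orientations. A sketch of that standard reduction (not this system's specific processing chain):

```python
import math

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from intensities behind a polarizer at 0/45/90/135 deg."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                       # horizontal vs vertical preference
    s2 = i45 - i135                     # +45 vs -45 preference
    dolp = math.hypot(s1, s2) / s0      # degree of linear polarization
    aop = 0.5 * math.atan2(s2, s1)      # angle of polarization, radians
    return s0, s1, s2, dolp, aop
```

For fully horizontally polarized light of unit intensity, Malus's law gives intensities (1, 0.5, 0, 0.5) at the four orientations, yielding a degree of linear polarization of 1 and a polarization angle of 0.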
Housing as a Determinant of Tongan Children’s Health: Innovative Methodology Using Wearable Cameras
Robinson, Andrew; Puloka, Viliami; Smith, Moira; Stanley, James; Signal, Louise
2017-01-01
Housing is a significant determinant of health, particularly in developing countries such as Tonga. Currently, very little is known about the quality of housing in Tonga, as is the case in many developing countries, nor about the interaction between children and the home environment. This study aimed to identify the nature and extent of health risk factors and behaviours in Tongan houses from a child’s perspective. An innovative methodology was used, Kids’Cam Tonga. Seventy-two Class 6 children (10 to 13-year-olds) were randomly selected from 12 randomly selected schools in Tongatapu, the main island. Each participating child wore a wearable camera on a lanyard around their neck. The device automatically took wide-angle, 136° images from the child’s perspective every seven seconds. The children were instructed to wear the camera all day from Friday morning to Sunday evening, inclusive. The analysis showed that the majority of Tongan children in the study live in houses that have structural deficiencies and hazards, including water damage (42%), mould (36%), and electrical (89%) and burn risk factors (28%). The findings suggest that improvements to the housing stock may reduce the associated health burden and increase buildings’ resilience to natural hazards. A collaborative approach between communities, community leaders, government and non-governmental organisations (NGOs) is urgently needed. This research methodology may be of value to other developing countries. PMID:28976919
Three dimensional modelling for the target asteroid of HAYABUSA
NASA Astrophysics Data System (ADS)
Demura, H.; Kobayashi, S.; Asada, N.; Hashimoto, T.; Saito, J.
The Hayabusa program is the first sample return mission of Japan. It was launched on May 9, 2003, and will arrive at the target asteroid 25143 Itokawa in June 2005. The spacecraft has three optical navigation cameras: two wide angle cameras and a telescopic one. The telescope with a filter wheel is named AMICA (Asteroid Multiband Imaging CAmera). We are going to model the shape of the target asteroid with this telescope (expected resolution: 1 m/pixel at 10 km distance; field of view: 5.7 square degrees; MPP-type CCD with 1024 x 1000 pixels). Because the size of Hayabusa is about 1 x 1 x 1 m, our goal is shape modeling with a precision of about 1 m, on the basis of a camera system that scans the asteroid as it rotates. This image-based modeling requires sequential images from AMICA and a history of the distance between the asteroid and Hayabusa provided by a Laser Range Finder. We established a system of hierarchically recursive search with sub-pixel matching of Ground Control Points, which are picked up with the SUSAN operator. The matched dataset is restored under an epipolar geometry constraint, and the obtained set of three-dimensional points is converted to a polygon model with Delaunay triangulation. The current status of our development of the shape modeling is presented.
NASA Astrophysics Data System (ADS)
Ugolnikov, Oleg S.; Maslov, Igor A.
2018-03-01
Polarization measurements of the twilight background with Wide-Angle Polarization Camera (WAPC) are used to detect the depolarization effect caused by stratospheric aerosol near the altitude of 20 km. Based on a number of observations in central Russia in spring and summer 2016, we found the parameters of lognormal size distribution of aerosol particles. This confirmed the previously published results of the colorimetric method as applied to the same twilights. The mean particle radius (about 0.1 micrometers) and size distribution are also in agreement with the recent data of in situ and space-based remote sensing of stratospheric aerosol. Methods considered here provide two independent techniques of the stratospheric aerosol study based on the twilight sky analysis.
2017-07-31
Saturn's northern hemisphere reached its summer solstice in mid-2017, bringing continuous sunshine to the planet's far north. The solstice took place on May 24, 2017. The Cassini mission is using the unparalleled opportunity to observe changes that occur on the planet as the Saturnian seasons turn. This view looks toward the sunlit side of the rings from about 17 degrees above the ring plane. The image was taken with the Cassini spacecraft wide-angle camera on April 17, 2017 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 939 nanometers. The view was acquired at a distance of approximately 733,000 miles (1.2 million kilometers) from Saturn. Image scale is 44 miles (70 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21337
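Image scales like the one quoted in this caption follow from range times the camera's per-pixel angular resolution (small-angle approximation). Assuming an IFOV of roughly 60 microradians for the wide-angle camera (an assumed figure, not stated in the caption), the ~1.2 million km range reproduces a scale near the quoted 70 km per pixel:

```python
def pixel_scale_km(range_km, ifov_rad):
    """Ground-sample distance of one pixel at the given range (small-angle approximation)."""
    return range_km * ifov_rad

# ~1.2 million km range at an assumed ~60 microradian IFOV
scale = pixel_scale_km(1.2e6, 60e-6)  # ~72 km/pixel
```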
STS-99 Flight Day Highlights and Crew Activities Report
NASA Technical Reports Server (NTRS)
2000-01-01
Live footage shows the Blue Team (second of the dual shift crew), Dominic L. Pudwill Gorie, Janice E. Voss and Mamoru Mohri, beginning the first mapping swath covering a 140-mile-wide path. While Mohri conducts mapping operations, Voss and Gorie are seen participating in a news conference with correspondents from NBC and CNN. The Red Team (first of the dual shift crew), Kevin R. Kregel, Janet L. Kavandi and Gerhard P.J. Thiele, relieves the Blue Team and is seen continuing the mapping operations for this around-the-clock Shuttle Radar Topography Mission (SRTM). Commander Kregel is shown performing boom (mast) durability tests, calibrating the EarthCam payload, and speaking with the Launch Control Center (LCC) about troubleshooting a bracket for a better camera angle.
2017-01-16
No Earth-based telescope could ever capture a view quite like this. Earth-based views can only show Saturn's daylit side, from within about 25 degrees of Saturn's equatorial plane. A spacecraft in orbit, like Cassini, can capture stunning scenes that would be impossible from our home planet. This view looks toward the sunlit side of the rings from about 25 degrees above the ring plane. The image was taken in violet light with the Cassini spacecraft wide-angle camera on Oct. 28, 2016. The view was obtained at a distance of approximately 810,000 miles (1.3 million kilometers) from Saturn. Image scale is 50 miles (80 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20517
NASA Astrophysics Data System (ADS)
Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu
2004-05-01
Technology that enables users to experience a remote site virtually is called telepresence. A telepresence system using real environment images is expected to be used in the fields of entertainment, medicine, education and so on. This paper describes a novel telepresence system which enables users to walk through a photorealistic virtualized environment by actual walking. To realize such a system, a wide-angle high-resolution movie is projected on an immersive multi-screen display to present the virtualized environment to users, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking in a remote site.
2008-05-27
Bright puffs and ribbons of cloud drift lazily through Saturn's murky skies. In contrast to the bold red, orange and white clouds of Jupiter, Saturn's clouds are overlain by a thick layer of haze. The visible cloud tops on Saturn are deeper in its atmosphere due to the planet's cooler temperatures. This view looks toward the unilluminated side of the rings from about 18 degrees above the ringplane. Images taken using red, green and blue spectral filters were combined to create this natural color view. The images were acquired with the Cassini spacecraft wide-angle camera on April 15, 2008 at a distance of approximately 1.5 million kilometers (906,000 miles) from Saturn. Image scale is 84 kilometers (52 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA09910
Bombara, Courtenay B; Dürr, Salome; Machovsky-Capuska, Gabriel E; Jones, Peter W; Ward, Michael P
2017-01-01
Information on contacts between individuals within a population is crucial to inform disease control strategies, via parameterisation of disease spread models. In this study we investigated the use of dog-borne video cameras, in conjunction with global positioning system (GPS) loggers, to both characterise dog-to-dog contacts and estimate contact rates. We customized miniaturised video cameras, enclosed within 3D-printed plastic cases, and attached these to nylon dog collars. Using two 3400 mAh NCR lithium-ion batteries, cameras could record a maximum of 22 hr of continuous video footage. Together with a GPS logger, collars were attached to six free-roaming domestic dogs (FRDDs) in two remote Indigenous communities in northern Australia. We recorded a total of 97 hr of video footage, ranging from 4.5 to 22 hr (mean 19.1) per dog, and observed a wide range of social behaviours. The majority (69%) of all observed interactions between community dogs involved direct physical contact. Direct contact behaviours included sniffing, licking, mouthing and play fighting. No contacts appeared to be aggressive; however, multiple teeth-baring incidents were observed during play fights. We identified a total of 153 contacts (equating to 8 to 147 contacts per dog per 24 hr) from the videos of the five dogs with camera data that could be analysed. These contacts were attributed to 42 unique dogs (range 1 to 19 per video) which could be identified (based on colour patterns and markings). Most dog activity was observed in urban (houses and roads) environments, but contacts were more common in bushland and beach environments. A variety of foraging behaviours were observed, including scavenging through rubbish and rolling on dead animal carcasses. Identified food consumed included chicken, raw bones, animal carcasses, rubbish, grass and cheese.
For characterising contacts between FRDDs, several benefits of analysing videos compared to GPS fixes alone were identified in this study, including visualisation of the nature of the contact between two dogs, and inclusion of a greater number of dogs in the study (which do not need to be wearing video or GPS collars). Some limitations identified included visualisation of contacts only during daylight hours; the camera lens being obscured on occasion by the dog's mandible or the dog resting on the camera; an insufficiently wide viewing angle (36°); battery life and robustness of the deployments; high costs of the deployment; and the analysis of large volumes of often unsteady video footage. This study demonstrates that dog-borne video cameras are a feasible technology for estimating and characterising contacts between FRDDs. Modifying camera specifications and developing new analytical methods will improve the applicability of this technology for monitoring FRDD populations, providing insights into dog-to-dog contacts and therefore how disease might spread within these populations.
Wide Field and Planetary Camera for Space Telescope
NASA Technical Reports Server (NTRS)
Lockhart, R. F.
1982-01-01
The Space Telescope's Wide Field and Planetary Camera instrument, presently under construction, will be used to map the observable universe and to study the outer planets. It will be able to see 1000 times farther than any previously employed instrument. The Wide Field system will be located in a radial bay, receiving its signals via a pick-off mirror centered on the optical axis of the telescope assembly. The external thermal radiator employed by the instrument for cooling will be part of the exterior surface of the Space Telescope. In addition to having a larger (1200-12,000 A) wavelength range than any of the other Space Telescope instruments, its data rate, at 1 Mb/sec, exceeds that of the other instruments. Attention is given to the operating modes and projected performance levels of the Wide Field Camera and Planetary Camera.
75. FIRST TEST SHOT OF THE VAL AT THE DEDICATION ...
75. FIRST TEST SHOT OF THE VAL AT THE DEDICATION CEREMONIES AS SEEN FROM A FIXED CAMERA STATION, May 7, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Equipment Development for Automatic Anthropometric Measurements
NASA Technical Reports Server (NTRS)
Cater, J. P.; Oakey, W. E.
1978-01-01
An automated procedure for measuring and recording the anthropometric active angles is presented. The small portable system consists of a microprocessor-controlled video data acquisition system which measures single-plane active angles using television video techniques and provides the measured data on sponsor-specified preformatted data sheets. This system, using only a single video camera, observes the end limits of the movement of a pair of separated lamps and calculates the vector angle between the extreme positions.
Stray light lessons learned from the Mars reconnaissance orbiter's optical navigation camera
NASA Astrophysics Data System (ADS)
Lowman, Andrew E.; Stauder, John L.
2004-10-01
The Optical Navigation Camera (ONC) is a technical demonstration slated to fly on NASA's Mars Reconnaissance Orbiter in 2005. Conventional navigation methods have reduced accuracy in the days immediately preceding Mars orbit insertion. The resulting uncertainty in spacecraft location limits rover landing sites to relatively safe areas, away from interesting features that may harbor clues to past life on the planet. The ONC will provide accurate navigation on approach for future missions by measuring the locations of the satellites of Mars relative to background stars. Because Mars will be a bright extended object just outside the camera's field of view, stray light control at small angles is essential. The ONC optomechanical design was analyzed by stray light experts and appropriate baffles were implemented. However, stray light testing revealed significantly higher levels of light than expected at the most critical angles. The primary error source proved to be the interface between ground glass surfaces (and the paint that had been applied to them) and the polished surfaces of the lenses. This paper will describe troubleshooting and correction of the problem, as well as other lessons learned that affected stray light performance.
Human tracking over camera networks: a review
NASA Astrophysics Data System (ADS)
Hou, Li; Wan, Wanggen; Hwang, Jenq-Neng; Muhammad, Rizwan; Yang, Mingyang; Han, Kang
2017-12-01
In recent years, automated human tracking over camera networks is getting essential for video surveillance. The tasks of tracking human over camera networks are not only inherently challenging due to changing human appearance, but also have enormous potentials for a wide range of practical applications, ranging from security surveillance to retail and health care. This review paper surveys the most widely used techniques and recent advances for human tracking over camera networks. Two important functional modules for the human tracking over camera networks are addressed, including human tracking within a camera and human tracking across non-overlapping cameras. The core techniques of human tracking within a camera are discussed based on two aspects, i.e., generative trackers and discriminative trackers. The core techniques of human tracking across non-overlapping cameras are then discussed based on the aspects of human re-identification, camera-link model-based tracking and graph model-based tracking. Our survey aims to address existing problems, challenges, and future research directions based on the analyses of the current progress made toward human tracking techniques over camera networks.
Real-time machine vision system using FPGA and soft-core processor
NASA Astrophysics Data System (ADS)
Malik, Abdul Waheed; Thörnberg, Benny; Meng, Xiaozhou; Imran, Muhammad
2012-06-01
This paper presents a machine vision system for real-time computation of the distance and angle of a camera from reference points in the environment. Image pre-processing, component labeling and feature extraction modules were modeled at Register Transfer (RT) level and synthesized for implementation on field programmable gate arrays (FPGAs). The extracted image component features were sent from the hardware modules to a soft-core processor, MicroBlaze, for computation of distance and angle. A CMOS imaging sensor operating at a clock frequency of 27 MHz was used in our experiments to produce a video stream at the rate of 75 frames per second. The image component labeling and feature extraction modules run in parallel with a total latency of 13 ms. The MicroBlaze was interfaced with the component labeling and feature extraction modules through a Fast Simplex Link (FSL). The latency for computing the distance and angle of the camera from the reference points was measured to be 2 ms on the MicroBlaze, running at a 100 MHz clock frequency. In this paper, we present the performance analysis, device utilization and power consumption for the designed system. The FPGA-based machine vision system that we propose has high frame speed, low latency and a power consumption that is much lower than commercially available smart camera solutions.
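Distance and bearing to a reference point of known physical size can be recovered from its image footprint with a pinhole model, which is the kind of computation the soft-core processor performs here. A sketch with assumed calibration values (not the paper's implementation):

```python
import math

def distance_and_angle(focal_px, ref_width_m, width_px, offset_px):
    """Pinhole estimates: range from apparent size, off-axis bearing from pixel offset."""
    distance = focal_px * ref_width_m / width_px   # Z = f * W / w
    angle = math.atan2(offset_px, focal_px)        # bearing from optical axis, radians
    return distance, angle

# a 0.1 m wide marker imaged 40 px wide, 800 px off-center, with an assumed 800 px focal length
d, a = distance_and_angle(800.0, 0.1, 40.0, 800.0)
```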
2017-11-21
After more than 13 years at Saturn, and with its fate sealed, NASA's Cassini spacecraft bid farewell to the Saturnian system by firing the shutters of its wide-angle camera and capturing this last, full mosaic of Saturn and its rings two days before the spacecraft's dramatic plunge into the planet's atmosphere. During the observation, a total of 80 wide-angle images were acquired in just over two hours. This view is constructed from 42 of those wide-angle shots, taken using the red, green and blue spectral filters, combined and mosaicked together to create a natural-color view. Six of Saturn's moons -- Enceladus, Epimetheus, Janus, Mimas, Pandora and Prometheus -- make a faint appearance in this image. (Numerous stars are also visible in the background.) A second version of the mosaic is provided in which the planet and its rings have been brightened, with the fainter regions brightened by a greater amount. (The moons and stars have also been brightened by a factor of 15 in this version.) The ice-covered moon Enceladus -- home to a global subsurface ocean that erupts into space -- can be seen at the 1 o'clock position. Directly below Enceladus, just outside the F ring (the thin, farthest ring from the planet seen in this image) lies the small moon Epimetheus. Following the F ring clock-wise from Epimetheus, the next moon seen is Janus. At about the 4:30 position and outward from the F ring is Mimas. Inward of Mimas and still at about the 4:30 position is the F-ring-disrupting moon, Pandora. Moving around to the 10 o'clock position, just inside of the F ring, is the moon Prometheus. This view looks toward the sunlit side of the rings from about 15 degrees above the ring plane. Cassini was approximately 698,000 miles (1.1 million kilometers) from Saturn, on its final approach to the planet, when the images in this mosaic were taken. Image scale on Saturn is about 42 miles (67 kilometers) per pixel. 
The image scale on the moons varies from 37 to 50 miles (59 to 80 kilometers) per pixel. The phase angle (the Sun-planet-spacecraft angle) is 138 degrees. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA17218
Single exposure three-dimensional imaging of dusty plasma clusters.
Hartmann, Peter; Donkó, István; Donkó, Zoltán
2013-02-01
We have worked out the details of a single camera, single exposure method to perform three-dimensional imaging of a finite particle cluster. The procedure is based on the plenoptic imaging principle and utilizes a commercial Lytro light field still camera. We demonstrate the capabilities of our technique on a single layer particle cluster in a dusty plasma, where the camera is aligned and inclined at a small angle to the particle layer. The reconstruction of the third coordinate (depth) is found to be accurate and even shadowing particles can be identified.
Photometric Characteristics of Lunar Terrains
NASA Astrophysics Data System (ADS)
Sato, Hiroyuki; Hapke, Bruce W.; Denevi, Brett W.; Robinson, Mark
2016-10-01
The photometric properties of the lunar surface depend on albedo, surface roughness, porosity, and the internal/external structure of particles. Hapke parameter maps derived using a bidirectional reflectance model [Hapke, 2012] from Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) images demonstrated the spatial and spectral variation of the photometric properties of the Moon [Sato et al., 2014]. Using the same methodology, here we present the photometric characteristics of typical lunar terrains, which were not systematically analyzed in the previous study. We selected five representative terrain types: mare, highland, swirls, and two Copernican (fresh) crater ejecta (one mare and one highlands example). As for the datasets, we used ~39 months of WAC repeated observations, and for each image pixel, we computed latitude, longitude, incidence, emission, and phase angles using the WAC GLD100 stereo DTM [Scholten et al., 2012]. To obtain similar phase and incidence angle ranges, all sampling sites are near the equator and in the vicinity of Reiner Gamma. Three free Hapke parameters (single scattering albedo: w, HG2 phase function parameter: c, and angular width of SHOE: hs) were then calculated for the seven bands (321-689 nm). The remaining parameters were fixed by simplifying the model [Sato et al., 2014]. The highlands, highland ejecta, and swirl (Reiner Gamma) showed clearly higher w than the mare and mare ejecta. The derived c values were lower (less backscattering) for the swirl and higher (more backscattering) for the highlands (and ejecta) relative to the other sites. Forward scattering materials such as unconsolidated transparent crystalline materials might be relatively enriched in the swirl. In the highlands, anorthositic agglutinates with dense internal scattering could be responsible for the strong backscattering. The mare and mare ejecta showed continuously decreasing c from UV to visible wavelengths.
This might be caused by the FeO-rich pyroxene and glass in the mare becoming more translucent at longer wavelengths.
Martian Mystery: Do Some Materials Flow Uphill?
NASA Technical Reports Server (NTRS)
1999-01-01
Some of the geological features of Mars defy conventional, or simple, explanations. A recent example is on the wall of a 72 kilometer-wide (45 mile-wide) impact crater in Promethei Terra. The crater (above left) is located at 39°S, 247°W. Its inner walls appear in low-resolution images to be deeply gullied. A high resolution Mars Orbiter Camera (MOC) image shows that each gully on the crater's inner wall contains a tongue of material that appears to have flowed (to best see this, click on the icon above right and examine the full image). Ridges and grooves that converge toward the center of each gully and show a pronounced curvature are oriented in a manner that seems to suggest that material has flowed from the top toward the bottom of the picture. This pattern is not unlike pouring pancake batter into a pan... the viscous fluid will form a steep, lobate margin and spread outward across the pan. The ridges and grooves seen in the image are also more reminiscent of the movement of material out and away from a place of confinement, as opposed to the types of features seen when materials flow into a more confined area. Mud and lava flows, and even some glaciers, for the most part behave in this manner. From these observations, and based solely on the appearance, one might conclude that the features formed by material moving from the top of the image towards the bottom. But this is not the case! The material cannot have flowed from the top towards the bottom of the area seen in the high resolution image (above, right), because the crater floor (which is the lowest area in the image) is at the top of the picture. The location and correct orientation of the high resolution image is shown by a white box in the context frame on the left. Since gravity pulls the material in the gullies downhill, not uphill, the pattern of ridges and grooves found on these gully-filling materials is puzzling.
An explanation may lie in the nature of the material (e.g., how viscous was the pancake batter-like material?) and how rapidly it moved, but for now this remains an unexplained martian phenomenon. The context image (above, left) was taken by the MOC red wide angle camera at the same time that the MOC narrow angle camera obtained the high resolution view (above, right). Context images such as this provide a simple way to determine the location of each new high resolution view of the planet. Both images are illuminated from the upper left. The high resolution image covers an area 3 km (1.9 mi) across. Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
NASA Astrophysics Data System (ADS)
Chaban, R.; Pace, D. C.; Marcy, G. R.; Taussig, D.
2016-10-01
Energetic ion losses must be minimized in burning plasmas to maintain fusion power, and existing tokamaks provide access to energetic ion parameter regimes that are relevant to burning-plasma machines. A new Fast Ion Loss Detector (FILD) probe on the DIII-D tokamak has been optimized to resolve beam ion losses across a range of 30-90 keV in energy and 40° to 80° in pitch angle, thereby providing valuable measurements during many different experiments. The FILD is a magnetic spectrometer; once the probe is inserted into the tokamak, the magnetic field allows energetic ions to pass through a collimating aperture and strike a scintillator plate that is imaged by a wide view camera and narrow view photomultiplier tubes (PMTs). The design involves calculating scintillator strike patterns while varying probe geometry. Calculated scintillator patterns are then used to design an optical system that allows adjustment of the focus regions for the PMTs, which are sampled at 1 MS/s. A synthetic diagnostic will be used to determine the energy and pitch angle resolution that can be attained in DIII-D experiments. Work supported in part by US DOE under the Science Undergraduate Laboratory Internship (SULI) program and under DE-FC02-04ER54698.
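The energy and pitch-angle sorting a FILD performs rests on the gyroradius the local magnetic field imposes on each ion. The following is a hedged sketch of that relation only; the 2 T field value, the deuterium beam species, and the convention of measuring pitch from the field line are illustrative assumptions, not values from the abstract:

```python
import math

# Constants (SI): deuteron mass and elementary charge.
M_D = 3.3436e-27   # kg
Q_E = 1.602e-19    # C

def gyroradius(energy_keV, pitch_deg, b_tesla):
    """Larmor radius of a beam ion. Pitch is measured from the field
    line here, so the perpendicular speed is v * sin(pitch)."""
    v = math.sqrt(2.0 * energy_keV * 1e3 * Q_E / M_D)
    v_perp = v * math.sin(math.radians(pitch_deg))
    return M_D * v_perp / (Q_E * b_tesla)
```

Under these assumptions an 80 keV deuteron at 2 T has a gyroradius of roughly 3 cm, which sets the centimeter scale of the strike-point separation on the scintillator plate.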
CCD Camera Lens Interface for Real-Time Theodolite Alignment
NASA Technical Reports Server (NTRS)
Wake, Shane; Scott, V. Stanley, III
2012-01-01
Theodolites are a common instrument in the testing, alignment, and building of various systems, ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles, and to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in the position of components, and a theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter that attaches a CCD camera with lens to the eyepiece of a Leica Wild T3000 Theodolite, enabling viewing on a connected monitor; it can thus be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.
The Absolute Reflectance and New Calibration Site of the Moon
NASA Astrophysics Data System (ADS)
Wu, Yunzhao; Wang, Zhenchao; Cai, Wei; Lu, Yu
2018-05-01
The brightness of the Moon is a simple but fundamental and important question. Although numerous efforts have been made to answer it, such as sophisticated electro-optical measurements and proposed calibration sites, the answer is still debated. An in situ measurement with a calibration panel on the surface of the Moon is crucial for obtaining the accurate absolute reflectance and resolving the debate. China’s Chang’E-3 (CE-3) “Yutu” rover accomplished this type of measurement using the Visible-Near Infrared Spectrometer (VNIS). The VNIS measurements, which were made at large emission and phase angles, complement existing measurements across the range of photometric geometry. The in situ reflectance shows that the CE-3 landing site is very dark, with an average reflectance of 3.86% in the visible bands. The results are compared with recent mission instruments: the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC), the Spectral Profiler (SP) on board SELENE, the Moon Mineralogy Mapper (M3) on board Chandrayaan-1, and the Chang’E-1 Interference Imaging Spectrometer (IIM). The differences among the measurements of these instruments are very large and indicate inherent differences in their absolute calibration. The M3 and IIM measurements are smaller than those of LROC WAC and SP, and the VNIS measurement falls between these two pairs. When using the Moon as a radiance source for the on-orbit calibration of spacecraft instruments, one should therefore treat the data with caution. We propose that the CE-3 landing site, a young and homogeneous surface, should serve as the new calibration site.
NASA Astrophysics Data System (ADS)
Carbajal, L.; del-Castillo-Negrete, D.
2017-12-01
Developing avoidance or mitigation strategies for runaway electrons (REs) in magnetic confinement fusion (MCF) plasmas is of crucial importance for the safe operation of ITER. In order to develop these strategies, an accurate diagnostic capability that allows good estimates of the RE distribution function in these plasmas is needed. Synchrotron radiation (SR) of REs in MCF plasmas, besides being one of the main damping mechanisms for REs in the high-energy relativistic regime, is routinely used in current MCF experiments to infer the parameters of RE energy and pitch angle distribution functions. In the present paper we address the long-standing question of how different RE distribution functions relate to their corresponding synchrotron emission, simultaneously including: full-orbit effects, the spectral and angular distribution of the SR of each electron, and basic geometric optics of a camera. We study the spatial distribution of the SR on the poloidal plane, and the statistical properties of the expected value of the synchrotron spectra of REs. We observe a strong dependence of the synchrotron emission measured by the camera on the pitch angle distribution of runaways: crescent shapes of the spatial distribution of the SR as measured by the camera relate to RE distributions with small pitch angles, while ellipse shapes relate to distributions of runaways with larger pitch angles. A weak dependence of the synchrotron emission measured by the camera on the RE energy, the value of the q-profile at the edge, and the chosen range of wavelengths is observed. Furthermore, we find that oversimplifying the angular dependence of the SR changes the shape of the synchrotron spectra, and overestimates its amplitude by approximately 20 times for avalanching runaways and by approximately 60 times for mono-energetic distributions of runaways.
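The wavelength band in which a runaway electron's synchrotron emission peaks is set by its Lorentz factor and curvature radius. The following is a minimal sketch of the classical critical-wavelength estimate only, not the paper's full-orbit treatment; the 30 MeV energy and 1.7 m curvature radius used below are illustrative assumptions:

```python
import math

MC2_MEV = 0.511  # electron rest energy in MeV

def critical_wavelength(energy_MeV, radius_m):
    """Classical synchrotron critical wavelength for an electron on a
    curved path: lambda_c = 4*pi*R / (3*gamma^3)."""
    gamma = 1.0 + energy_MeV / MC2_MEV
    return 4.0 * math.pi * radius_m / (3.0 * gamma**3)
```

With these numbers the estimate lands in the tens of microns (infrared), and it shifts rapidly toward shorter wavelengths as the energy grows, which is why the visible/IR emission is so sensitive to the high-energy tail of the distribution.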
2005-05-02
This recent image of Titan reveals more complex patterns of bright and dark regions on the surface, including a small, dark, circular feature completely surrounded by brighter material. During the two most recent flybys of Titan, on March 31 and April 16, 2005, Cassini captured a number of images of the hemisphere of Titan that faces Saturn. The image at the left is taken from a mosaic of images obtained in March 2005 (see PIA06222) and shows the location of the more recently acquired image at the right. The new image shows intriguing details in the bright and dark patterns near an 80-kilometer-wide (50-mile) crater seen first by Cassini's synthetic aperture radar experiment during a Titan flyby in February 2005 (see PIA07368) and subsequently seen by the imaging science subsystem cameras as a dark spot (center of the image at the left). Interestingly, a smaller, roughly 20-kilometer-wide (12-mile), dark and circular feature can be seen within an irregularly shaped, brighter ring, and is similar to the larger dark spot associated with the radar crater. However, the imaging cameras see only brightness variations, and without topographic information, the identity of this feature as an impact crater cannot be conclusively determined from this image. The visual and infrared mapping spectrometer -- which is sensitive to longer wavelengths, where Titan's atmospheric haze is less obscuring -- observed this area simultaneously with the imaging cameras, so those data, and perhaps future observations by Cassini's radar, may help to answer the question of this feature's origin. The new image at the right consists of five images that have been added together and enhanced to bring out surface detail and to reduce noise, although some camera artifacts remain.
These images were taken with the Cassini spacecraft narrow-angle camera using a filter sensitive to wavelengths of infrared light centered at 938 nanometers -- considered to be the imaging science subsystem's best spectral filter for observing the surface of Titan. This view was acquired from a distance of 33,000 kilometers (20,500 miles). The pixel scale of this image is 390 meters (0.2 miles) per pixel, although the actual resolution is likely to be several times larger. http://photojournal.jpl.nasa.gov/catalog/PIA06234
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.
2015-10-01
Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method based entirely on pedestrian detections in surveillance video from multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first is intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length and camera height, and is important for the conversion from pixels to meters and vice versa. The second is inter-camera topology inference, which leads to an estimate of the distance between cameras, and is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
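Once tilt angle, focal length (in pixels) and camera height are estimated, the pixel-to-meter conversion for points on the ground plane follows from simple pinhole geometry. This is a hedged sketch of that conversion under the usual simplifying assumptions (flat ground, zero camera roll); the function name and parameter values are illustrative, not from the paper:

```python
import math

def ground_distance(row_px, cy_px, f_px, tilt_deg, cam_height_m):
    """Ground distance (m) from the point below the camera to the point
    imaged at a given image row, for a pinhole camera tilted down by
    tilt_deg. Image rows increase downward, so rows below the principal
    point (cy_px) view the ground closer to the camera."""
    # Angle below the horizon of the ray through this pixel row.
    angle = math.radians(tilt_deg) + math.atan((row_px - cy_px) / f_px)
    if angle <= 0.0:
        raise ValueError("ray does not intersect the ground plane")
    return cam_height_m / math.tan(angle)
```

For example, a camera 4 m high tilted down 10 degrees sees, at its principal row, ground roughly 23 m away; rows further down the image map to progressively closer ground points.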
Adjustable-Viewing-Angle Endoscopic Tool for Skull Base and Brain Surgery
NASA Technical Reports Server (NTRS)
Bae, Youngsam; Liao, Anna; Manohara, Harish; Shahinian, Hrayr
2008-01-01
The term Multi-Angle and Rear Viewing Endoscopic tooL (MARVEL) denotes an auxiliary endoscope, now undergoing development, that a surgeon would use in conjunction with a conventional endoscope to obtain additional perspective. The role of the MARVEL in endoscopic brain surgery would be similar to the role of a mouth mirror in dentistry. Such a tool is also potentially useful for in-situ planetary geology applications, for the close-up imaging of unexposed rock surfaces in cracks or of surfaces not in the direct line of sight. A conventional endoscope provides mostly a frontal view -- that is, a view along its longitudinal axis and, hence, along a straight line extending from the opening through which it is inserted. The MARVEL could be inserted through the same opening as the conventional endoscope, but could be adjusted to provide a view from almost any desired angle. The MARVEL camera image would be displayed, on the same monitor as the conventional endoscopic image, as an inset within that image. For example, while viewing a tumor from the front in the conventional endoscopic image, the surgeon could simultaneously view the tumor from the side or the rear in the MARVEL image, and could thereby gain additional visual cues to aid in precise three-dimensional positioning of surgical tools to excise the tumor. Indeed, a side or rear view through the MARVEL could be essential in a case in which the object of surgical interest was not visible from the front. The conceptual design of the MARVEL exploits the surgeon's familiarity with endoscopic surgical tools. The MARVEL would include a miniature electronic camera and miniature radio transmitter mounted on the tip of a surgical tool derived from an endo-scissor (see figure). The inclusion of the radio transmitter would eliminate the need for wires, which could interfere with manipulation of this and other surgical tools.
The handgrip of the tool would be connected to a linkage similar to that of an endo-scissor, but the linkage would be configured to enable adjustment of the camera angle instead of actuation of a scissor blade. It is envisioned that thicknesses of the tool shaft and the camera would be less than 4 mm, so that the camera-tipped tool could be swiftly inserted and withdrawn through a dime-size opening. Electronic cameras having dimensions of the order of millimeters are already commercially available, but their designs are not optimized for use in endoscopic brain surgery. The variety of potential endoscopic, thoracoscopic, and laparoscopic applications can be expected to increase as further development of electronic cameras yields further miniaturization and improvements in imaging performance.
Geometry-based populated chessboard recognition
NASA Astrophysics Data System (ADS)
Xie, Youye; Tang, Gongguo; Hoff, William
2018-04-01
Chessboards are commonly used to calibrate cameras, and many robust methods have been developed to recognize unoccupied boards. However, when the chessboard is populated with chess pieces, such as during an actual game, the problem of recognizing the board is much harder. Challenges include occlusion caused by the chess pieces, the presence of outlier lines, and low viewing angles of the chessboard. In this paper, we present a novel approach to address the above challenges and recognize the chessboard. The Canny edge detector and Hough transform are used to capture all possible lines in the scene. K-means clustering and a k-nearest-neighbors-inspired algorithm are applied to cluster the lines and reject outliers based on their Euclidean distances to the nearest neighbors in a scaled Hough transform space. Finally, based on prior knowledge of the chessboard structure, a geometric constraint is used to find the correspondences between image lines and the lines on the chessboard through a homography transformation. The proposed algorithm works for a wide range of operating angles and achieves high accuracy in experiments.
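The outlier-rejection step can be pictured as a nearest-neighbor distance test in a scaled (rho, theta) Hough space: grid lines fall into tight clusters, while stray edges from pieces and background land far from any cluster. This is a minimal sketch of that idea; the scale factor, neighbor count, and threshold below are illustrative assumptions, not the paper's values:

```python
import numpy as np

def reject_outlier_lines(rho, theta, k=3, rho_scale=100.0, max_knn_dist=0.5):
    """Keep Hough lines (rho, theta) whose mean distance to their k
    nearest neighbours in a scaled (rho/rho_scale, theta) space is small;
    lines far from every cluster are treated as outliers."""
    pts = np.column_stack([np.asarray(rho, float) / rho_scale,
                           np.asarray(theta, float)])
    # Pairwise distances, ignoring each line's distance to itself.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    knn_mean = np.sort(d, axis=1)[:, :k].mean(axis=1)
    return knn_mean < max_knn_dist
```

Scaling rho before computing distances matters because rho is in pixels while theta is in radians; without it the angular term would be negligible.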
Geologic applications of Space Shuttle photography
NASA Technical Reports Server (NTRS)
Wood, Charles A.
1989-01-01
Space Shuttle astronauts have used handheld cameras to take about 30,000 photographs of the earth as seen from orbit. These pictures provide valuable, true-color depictions of many geologically significant areas. While the photographs have areal coverages and resolutions similar to the more familiar Landsat MSS and TM images, they differ from the latter in having a wide variety of solar illumination angles and look angles. Astronaut photographs can be used as very small scale aerial photographs for geologic mapping and planning logistical support for field work. Because of the intelligence and training of the on-orbit observer, astronaut photography offers unique opportunities for documenting dynamic geologic activity such as volcanic eruptions, dust storms, etc. Astronauts have photographed more than 3 dozen volcanic eruption plumes, some of which were not reported otherwise. The stereographic capability of astronaut photography also permits three-dimensional interpretation of geologic landforms, which is commonly useful in analysis of structural geology. Astronauts have also photographed about 20 known impact craters as part of a project to discover presently unknown examples in Africa, South America, and Australia.
2014-11-03
When Galileo first observed Venus displaying a crescent phase, he excitedly wrote to Kepler (in anagram) of Venus mimicking the moon-goddess. He would have been delirious with joy to see Saturn and Titan, seen in this image, doing the same thing. More than just pretty pictures, high-phase observations -- taken looking generally toward the Sun, as in this image -- are very powerful scientifically since the way atmospheres and rings transmit sunlight is often diagnostic of compositions and physical states. In this example, Titan's crescent nearly encircles its disk due to the small haze particles high in its atmosphere refracting the incoming light of the distant Sun. This view looks toward the sunlit side of the rings from about 3 degrees above the ringplane. The image was taken in violet light with the Cassini spacecraft wide-angle camera on Aug. 11, 2013. The view was obtained at a distance of approximately 1.1 million miles (1.7 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 154 degrees. Image scale is 64 miles (103 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18291
A Closer Look at Telesto False-Color
2006-02-08
These views show surface features and color variation on the Trojan moon Telesto. The smooth surface of this moon suggests that, like Pandora, it is covered with a mantle of fine, dust-sized icy material. The monochrome image was taken in visible light (see PIA07696). To create the false-color view, ultraviolet, green and infrared images were combined into a single black and white picture that isolates and maps regional color differences. This "color map" was then superposed over a clear-filter image. The origin of the color differences is not yet understood, but may be caused by subtle differences in the surface composition or the sizes of grains making up the icy soil. Tiny Telesto is a mere 24 kilometers (15 miles) wide. The image was acquired with the Cassini spacecraft narrow-angle camera on Dec. 25, 2005 at a distance of approximately 20,000 kilometers (12,000 miles) from Telesto and at a Sun-Telesto-spacecraft, or phase, angle of 58 degrees. Image scale is 118 meters (387 feet) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA07697
2017-01-09
Shadows cast across Mimas' defining feature, Herschel Crater, provide an indication of the size of the crater's towering walls and central peak. Named after the icy moon's discoverer, astronomer William Herschel, the crater stretches 86 miles (139 kilometers) wide -- almost one-third of the diameter of Mimas (246 miles or 396 kilometers) itself. Large impact craters often have peaks in their center -- see Tethys' large crater Odysseus in PIA08400. Herschel's peak stands nearly as tall as Mount Everest on Earth. This view looks toward the anti-Saturn hemisphere of Mimas. North on Mimas is up and rotated 21 degrees to the left. The image was taken with the Cassini spacecraft narrow-angle camera on Oct. 22, 2016 using a combination of spectral filters which preferentially admits wavelengths of ultraviolet light centered at 338 nanometers. The view was acquired at a distance of approximately 115,000 miles (185,000 kilometers) from Mimas and at a Sun-Mimas-spacecraft, or phase, angle of 20 degrees. Image scale is 3,300 feet (1 kilometer) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20515
2016-06-20
As Saturn's northern hemisphere summer approaches, the shadows of the rings creep ever southward across the planet. Here, the ring shadows appear to obscure almost the entire southern hemisphere, while the planet's north pole and its six-sided jet stream, known as "the hexagon," are fully illuminated by the sun. When NASA's Cassini spacecraft arrived at Saturn 12 years ago, the shadows of the rings lay far to the north on the planet (see PIA06077). As the mission progressed and seasons turned on the slow-orbiting giant, equinox arrived and the shadows of the rings became a thin line at the equator (see PIA11667). This view looks toward the sunlit side of the rings from about 16 degrees above the ring plane. The image was taken in red light with the Cassini spacecraft wide-angle camera on March 19, 2016. The view was obtained at a distance of approximately 1.7 million miles (2.7 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 92 degrees. Image scale is 100 miles (160 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20486
Photometric Observations of Soils and Rocks at the Mars Exploration Rover Landing Sites
NASA Technical Reports Server (NTRS)
Johnson, J. R.; Arvidson, R. A.; Bell, J. F., III; Farrand, W.; Guinness, E.; Johnson, M.; Herkenhoff, K. E.; Lemmon, M.; Morris, R. V.; Seelos, F., IV
2005-01-01
The Panoramic Cameras (Pancam) on the Spirit and Opportunity Mars Exploration Rovers have acquired multispectral reflectance observations of rocks and soils at different incidence, emission, and phase angles that will be used for photometric modeling of surface materials. Phase angle coverage at both sites extends from approx. 0 deg. to approx. 155 deg.
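The incidence, emission, and phase angles mentioned above are tied together by a standard spherical-trigonometry relation used throughout planetary photometry: cos g = cos i cos e + sin i sin e cos(azimuth), where the azimuth is the angle between the incidence and emission planes. A small sketch of that relation (the function name and sample angles are illustrative):

```python
import math

def phase_angle(inc_deg, emi_deg, azim_deg):
    """Phase angle g (deg) from incidence i, emission e, and the azimuth
    between the incidence and emission planes:
    cos g = cos i * cos e + sin i * sin e * cos(azimuth)."""
    i, e, a = (math.radians(x) for x in (inc_deg, emi_deg, azim_deg))
    cg = math.cos(i) * math.cos(e) + math.sin(i) * math.sin(e) * math.cos(a)
    return math.degrees(math.acos(max(-1.0, min(1.0, cg))))
```

When the Sun and camera share a direction (zero azimuth, equal angles) the phase angle collapses to zero, while opposing azimuths push it toward the large phase angles quoted in the abstract.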
78. PHOTO OF A PROJECTILE FIRING USING A SABOT TAKEN ...
78. PHOTO OF A PROJECTILE FIRING USING A SABOT TAKEN WITH A 70 MM MITCHEL MOTION PICTURE CAMERA, Date unknown, circa 1950. (Original photograph in possession of Dave Willis, San Diego, California.) Photograph represents central frame of negative. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
2016-08-08
The shadow of Saturn on the rings, which stretched across all of the rings earlier in Cassini's mission (see PIA08362), now barely makes it past the Cassini division. The changing length of the shadow marks the passing of the seasons on Saturn. As the planet nears its northern-hemisphere solstice in May 2017, the shadow will get even shorter. At solstice, the shadow's edge will be about 28,000 miles (45,000 kilometers) from the planet's surface, barely making it past the middle of the B ring. The moon Mimas is a few pixels wide, near the lower left in this image. This view looks toward the sunlit side of the rings from about 35 degrees above the ring plane. The image was taken in visible light with the Cassini spacecraft wide-angle camera on May 21, 2016. The view was obtained at a distance of approximately 2.0 million miles (3.2 million kilometers) from Saturn. Image scale is 120 miles (190 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20494
2013-12-23
Saturn's moon Enceladus, covered in snow and ice, resembles a perfectly packed snowball in this image from NASA's Cassini mission. Cassini has imaged Enceladus many times throughout its mission, discovering a fractured surface and the now-famous geysers that erupt icy particles and water vapor from fractures crossing the moon's 200-mile-wide (300-kilometer-wide) south polar terrain. The mountain ridge seen in the south in this image is part of the undulating mountain belt that circumscribes this region. This view looks toward the leading side of Enceladus (313 miles, 504 kilometers across). North on Enceladus is up and rotated 6 degrees to the left. The image was taken with the Cassini spacecraft narrow-angle camera on March 10, 2012, using filters sensitive to ultraviolet, visible and infrared light (spanning wavelengths from 338 to 750 nanometers). The view was acquired at a distance of approximately 106,000 miles (170,000 kilometers) from Enceladus. Image scale is 3,336 feet (1 kilometer) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA17182
NASA Technical Reports Server (NTRS)
Monford, Leo G. (Inventor)
1990-01-01
Improved techniques are provided for alignment of two objects. The present invention is particularly suited for three-dimensional translation and three-dimensional rotational alignment of objects in outer space. A camera 18 is fixedly mounted to one object, such as a remote manipulator arm 10 of the spacecraft, while the planar reflective surface 30 is fixed to the other object, such as a grapple fixture 20. A monitor 50 displays in real-time images from the camera, such that the monitor displays both the reflected image of the camera and visible markings on the planar reflective surface when the objects are in proper alignment. The monitor may thus be viewed by the operator and the arm 10 manipulated so that the reflective surface is perpendicular to the optical axis of the camera, the roll of the reflective surface is at a selected angle with respect to the camera, and the camera is spaced a pre-selected distance from the reflective surface.
Improved docking alignment system
NASA Technical Reports Server (NTRS)
Monford, Leo G. (Inventor)
1988-01-01
Improved techniques are provided for the alignment of two objects. The present invention is particularly suited for 3-D translation and 3-D rotational alignment of objects in outer space. A camera is affixed to one object, such as a remote manipulator arm of the spacecraft, while the planar reflective surface is affixed to the other object, such as a grapple fixture. A monitor displays in real-time images from the camera such that the monitor displays both the reflected image of the camera and visible marking on the planar reflective surface when the objects are in proper alignment. The monitor may thus be viewed by the operator and the arm manipulated so that the reflective surface is perpendicular to the optical axis of the camera, the roll of the reflective surface is at a selected angle with respect to the camera, and the camera is spaced a pre-selected distance from the reflective surface.
1968-12-21
Apollo 8, Moon, Latitude 15 degrees South, Longitude 170 degrees West. Camera Tilt Mode: High Oblique. Direction: Southeast. Sun Angle: 17 degrees. Original Film Magazine was labeled E. Camera Data: 70mm Hasselblad; F-Stop: F-5.6; Shutter Speed: 1/250 second. Film Type: Kodak SO-3400 Black and White, ASA 40. Other Photographic Coverage: Lunar Orbiter 1 (LO I) S-3. Flight Date: December 21-27, 1968.
Observation of Planetary Motion Using a Digital Camera
ERIC Educational Resources Information Center
Meyn, Jan-Peter
2008-01-01
A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8th apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…
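The angle of view follows directly from the focal length and the sensor dimensions via FOV = 2·atan(d / 2f). A quick sketch of that relation (the 36 mm × 24 mm full-frame sensor assumed below is illustrative; the article does not state the camera's sensor format):

```python
import math

def angle_of_view(focal_mm, sensor_mm):
    """Full angle of view (deg) across one sensor dimension:
    FOV = 2 * atan(d / (2 * f))."""
    return math.degrees(2.0 * math.atan(sensor_mm / (2.0 * focal_mm)))
```

For a 50 mm lens on a full-frame sensor this gives roughly 40° horizontally and 27° vertically, a field wide enough to match against a stellar map, as the abstract notes.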
MRO Mars Color Imager (MARCI) Investigation Primary Mission Results
NASA Astrophysics Data System (ADS)
Edgett, K. S.; Cantor, B. A.; Malin, M. C.; and the MARCI Science and Operations Teams
2008-12-01
The Mars Reconnaissance Orbiter (MRO) Mars Color Imager (MARCI) investigation was designed to recover the wide angle camera science objectives of the Mars Climate Orbiter MARCI, which was destroyed upon arrival at Mars in 1999, and to extend the daily meteorological coverage of the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide angle investigation that was systematically conducted from March 1999 to October 2006. MARCI consists of two wide angle cameras, each with a 180° field of view. The first acquires data in 5 visible wavelength channels (420, 550, 600, 650, 720 nm), the second in 2 UV channels (260, 320 nm). Data have been acquired daily, except during spacecraft upsets, since 24 September 2006. From the MRO 250 to 315 km altitude orbit, inclined 93 degrees, visible wavelength images usually have a pixel scale of about 1 km at nadir, and the UV data are at about 8 km per pixel. Data are obtained during every orbit on the day side of the planet from terminator to terminator. These provide a nearly continuous record of meteorological events and changes in surface frost and albedo patterns that spans more than 1 martian year and extends the daily global record of such events documented by the MGS MOC. For a few weeks in September and October 2006, both camera systems operated simultaneously, providing views of weather events at about 1400 local time (MOC) and an hour later at about 1500 (MARCI). The continuous meteorological record, now spanning more than 5 Mars years, shows very repeatable weather from year to year, with cloud and dust-raising events occurring in the same regions within about 2 weeks of their prior occurrence in previous years. This provides a measure of predictability ideal for assessing future landing sites, orbiter aerobraking plans, and conditions to be encountered by the current landed spacecraft on Mars. However, less predictable are planet-encircling dust events. MOC observed one in 2001; the next was observed by MARCI in 2007.
These occurred at different times of year. While popularly known as global dust storms, the nomenclature is misleading: in each case the storms did not raise dust or saltate sand on a global basis. Instead, multiple regional storms created a dust haze which obscured much of the martian surface from viewpoints above the lower atmosphere, but in each case the dust opacity was never so high that one could not determine where dust was being raised and where it was not. Within weeks of the end of the 2001 and 2007 global dust events, martian weather returned to its normal, repeatable pattern, with one exception: occasionally thereafter, dust storms were observed in regions where dust-raising had not been seen in the previous years. In these cases, winds capable of raising dust likely occurred at that location every year, but only became visible following a planet-encircling dust event and deposition of dust on a surface that previously did not have sufficient dust to raise. Other MARCI results center on seasonal monitoring of water vapor in the atmosphere, particularly by taking advantage of the anti-correlation between ozone (observable using the UV channels) and water vapor. Owing to their higher spatial resolution than the MOC daily global coverage, details of seasonal polar cap retreat became more apparent, as with these data it is now possible to separate surface frost from ground-hugging fog which forms along the retreating cap edge. MARCI images and meteorological observations are posted weekly on the Internet for public consumption, and the data are archived every 6 months with the NASA Planetary Data System.
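The quoted ~1 km nadir pixel scale is consistent with a simple small-angle relation between orbital altitude and the per-pixel instantaneous field of view (IFOV). A hedged sketch of that relation only; the IFOV value used in the example is back-computed from the abstract's altitude and pixel-scale figures, not taken from an instrument specification:

```python
def nadir_pixel_scale(alt_km, ifov_mrad):
    """Small-angle approximation for the ground sampling at nadir:
    scale (km) = altitude (km) * IFOV (rad)."""
    return alt_km * ifov_mrad * 1e-3
```

At a ~283 km mean altitude, an IFOV of about 3.5 mrad per visible pixel reproduces the ~1 km scale, and binning 8 such pixels matches the ~8 km UV sampling.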
Optimization design of periscope type 3X zoom lens design for a five megapixel cellphone camera
NASA Astrophysics Data System (ADS)
Sun, Wen-Shing; Tien, Chuen-Lin; Pan, Jui-Wen; Chao, Yu-Hao; Chu, Pu-Yi
2016-11-01
This paper presents a periscope-type 3X zoom lens design for a five-megapixel cellphone camera. The optical system places a right-angle prism in front of the zoom lens to fold the optical path by 90°, resulting in a zoom lens length of 6 mm, so the zoom lens can be embedded in a mobile phone with a thickness of 6 mm. The zoom lens has three groups with six elements. The half field of view varies from 30° to 10.89°, the effective focal length is adjusted from 3.142 mm to 9.426 mm, and the F-number changes from 2.8 to 5.13.
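The quoted zoom and field-of-view figures are mutually consistent under the paraxial relation h' = f·tan(half-FOV), which should stay fixed across the zoom range because the sensor's image height does not change. A quick check of the numbers given in the abstract:

```python
import math

def image_height(efl_mm, half_fov_deg):
    """Paraxial image height: h' = f * tan(half field of view)."""
    return efl_mm * math.tan(math.radians(half_fov_deg))
```

Both ends of the zoom (3.142 mm at 30° and 9.426 mm at 10.89°) give an image height of about 1.81 mm, and the focal lengths give exactly the stated 3X zoom ratio.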
2015-10-15
NASA's Cassini spacecraft spied this tight trio of craters as it approached Saturn's icy moon Enceladus for a close flyby on Oct. 14, 2015. The craters, located at high northern latitudes, are sliced through by thin fractures -- part of a network of similar cracks that wrap around the snow-white moon. The image was taken with the Cassini spacecraft narrow-angle camera on Oct. 14, 2015, at a distance of approximately 6,000 miles (10,000 kilometers) from Enceladus, using a spectral filter which preferentially admits wavelengths of ultraviolet light centered at 338 nanometers. Image scale is 197 feet (60 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20011
Sidelooking laser altimeter for a flight simulator
NASA Technical Reports Server (NTRS)
Webster, L. D. (Inventor)
1983-01-01
An improved laser altimeter for a flight simulator, which allows measurement of the height of the simulator probe above the terrain directly below the probe tip, is described. A laser beam is directed from the probe at an angle theta to the horizontal to produce a beam spot on the terrain. The angle theta that the laser beam makes with the horizontal is varied so as to bring the beam spot into coincidence with a plumb line coaxial with the longitudinal axis of the probe. A television altimeter camera observes the beam spot and has a raster line aligned with the plumb line. A spot-detector circuit coupled to the output of the TV camera monitors the position of the beam spot relative to the plumb line.
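The triangulation behind this scheme is simple enough to sketch. Assuming (not stated in the description) that the laser emitter sits at a fixed horizontal offset from the plumb axis at the height of the probe tip, the probe height follows directly from the beam angle once the spot is steered onto the plumb line:

```python
import math

def probe_height(offset_m: float, theta_deg: float) -> float:
    """Height of the probe above the terrain point on the plumb line.

    Assumes the laser emitter is at a horizontal offset `offset_m`
    from the plumb axis, at the same height as the probe tip, and that
    the beam spot has been steered onto the plumb line. The beam, the
    plumb line, and the horizontal offset then form a right triangle:
        h = offset * tan(theta)
    """
    return offset_m * math.tan(math.radians(theta_deg))

# Example: emitter 0.5 m from the plumb axis, beam depressed 45 deg
# below horizontal -> the height equals the offset (tan 45 = 1).
h = probe_height(0.5, 45.0)
```

A servo driving theta until the TV camera's spot detector reports coincidence turns this static relation into a continuous altitude readout.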
Phootprint - A Phobos sample return mission study
NASA Astrophysics Data System (ADS)
Koschny, Detlef; Svedhem, Håkan; Rebuffat, Denis
Introduction ESA is currently studying a mission to return a sample from Phobos, called Phootprint. This study is performed as part of ESA's Mars Robotic Exploration Programme. Part of the mission goal is to prepare technology needed for a sample return mission from Mars itself; the mission should also have a strong scientific justification, which is described here. 1. Science goal The main science goal of this mission will be to understand the formation of the Martian moon Phobos and put constraints on the evolution of the solar system. Currently, there are several possibilities for explaining the formation of the Martian moons: (a) co-formation with Mars, (b) capture of objects coming close to Mars, (c) impact of a large body onto Mars and formation from the impact ejecta. The main science goal of this mission is to find out which of the three scenarios is the most probable one. To do this, samples from Phobos would be returned to Earth and analyzed with extremely high precision in ground-based laboratories. An on-board payload is foreseen to provide information to put the sample into the necessary geological context. 2. Mission Spacecraft and payload will be based on experience gained from previous studies of missions to the Martian moons and asteroids. In particular the Marco Polo and MarcoPolo-R asteroid sample return mission studies performed at ESA were used as a starting point. Currently, industrial studies are ongoing. The initial starting assumption was to use a Soyuz launcher. Unlike the initial Marco Polo and MarcoPolo-R studies to an asteroid, a transfer stage will be needed. Another main difference to an asteroid mission is the fact that the spacecraft actually orbits Mars, not Phobos or Deimos. It is possible to select a spacecraft orbit which, in a Phobos- or Deimos-centred reference system, would give an ellipse around the moon.
The following model payload is currently foreseen: a Wide Angle Camera, a Narrow Angle Camera, a Close-Up Camera, a context camera for sampling context, a visible-IR spectrometer, a thermal IR spectrometer, and a Radio Science investigation. It is expected that with these instruments the necessary context for the sample can be provided. The paper will focus on the current status of the mission study.
Photometric normalization of LROC WAC images
NASA Astrophysics Data System (ADS)
Sato, H.; Denevi, B.; Robinson, M. S.; Hapke, B. W.; McEwen, A. S.; LROC Science Team
2010-12-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) acquires near-global coverage on a monthly basis. The WAC is a push-frame sensor with a 90° field of view (FOV) in BW mode and 60° FOV in 7-color mode (320 nm to 689 nm). WAC images are acquired during each orbit in 10° latitude segments with cross-track coverage of ~50 km. Before mosaicking, WAC images are radiometrically calibrated to remove instrumental artifacts and to convert at-sensor radiance to I/F. Images are also photometrically normalized to common viewing and illumination angles (30° phase), a challenge due to the wide-angle nature of the WAC, where large differences in phase angle are observed in a single image line (±30°). During a single month the equatorial incidence angle drifts about 28°, and over the course of ~1 year the lighting completes a 360° cycle. The light scattering properties of the lunar surface depend on incidence (i), emission (e), and phase (p) angles as well as soil properties such as single-scattering albedo and roughness that vary with terrain type and state of maturity [1]. We first tested a Lommel-Seeliger Correction (LSC) [cos(i)/(cos(i) + cos(e))] [2] with a phase function defined by an exponential decay plus a 4th-order polynomial term [3], which did not provide an adequate solution. Next we employed a LSC with an exponential 2nd-order decay phase correction that was an improvement, but still exhibited unacceptable frame-to-frame residuals. In both cases we fitted the LSC I/F vs. phase angle to derive the phase corrections. To date, the best results are with a lunar-Lambert function [4] with exponential 2nd-order decay phase correction (LLEXP2) [(A1exp(B1p)+A2exp(B2p)+A3) * cos(i)/(cos(e) + cos(i)) + B3cos(i)]. We derived the parameters for the LLEXP2 from repeat imaging of a small region and then corrected that region with excellent results.
When this correction was applied to the whole Moon the results were less than optimal - no surprise given the variability of the regolith from region to region. As the fitting area increases, the accuracy of curve fitting decreases due to the larger variety of albedo, topography, and composition. Thus we have adopted an albedo-dependent photometric normalization routine. Phase curves are derived for discrete bins of preliminary normalized reflectance calculated from the Clementine global mosaic in a fitting area that is composed of predominantly mare in Oceanus Procellarum. The global WAC mosaic was then corrected pixel-by-pixel according to its preliminary reflectance map with satisfactory results. We observed that the phase curves per normalized-reflectance bin become steeper as the reflectance value increases. Further filtering by using FeO, TiO2, or optical maturity [5] for parameter calculations may help elucidate the effects of surface composition and maturity on photometric properties of the surface. [1] Hapke, B.W. (1993) Theory of Reflectance and Emittance Spectroscopy, Cambridge Univ. Press. [2] Schoenberg (1925) Acta Soc. Sci. Fenn., vol. 50. [3] Hillier et al. (1999) Icarus 141, 205-225. [4] McEwen (1991) Icarus 92, 298-311. [5] Lucey et al. (2000) JGR, v105, no E8, p20377-20386.
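The LLEXP2 normalization described above can be sketched directly from the formula in the abstract. The coefficient values below are placeholders (the actual A1..A3, B1..B3 come from fitting repeat WAC coverage), and the choice of degrees for the phase argument of the exponentials is an assumption of this sketch:

```python
import numpy as np

# Placeholder coefficients: the real A1..A3, B1..B3 are fit to repeat
# WAC imaging of a calibration region; these values are illustrative.
A1, A2, A3 = 0.07, 0.30, 0.50
B1, B2, B3 = -0.05, -0.005, 0.05

def llexp2(i_deg, e_deg, p_deg):
    """Lunar-Lambert function with 2nd-order exponential phase term:
    (A1*exp(B1*p) + A2*exp(B2*p) + A3) * cos(i)/(cos(e)+cos(i)) + B3*cos(i)
    """
    i, e = np.radians(i_deg), np.radians(e_deg)
    phase = A1 * np.exp(B1 * p_deg) + A2 * np.exp(B2 * p_deg) + A3
    return phase * np.cos(i) / (np.cos(e) + np.cos(i)) + B3 * np.cos(i)

def normalize_iof(iof, i_deg, e_deg, p_deg):
    """Normalize observed I/F to the standard geometry (i = 30 deg,
    e = 0 deg, phase = 30 deg) by ratioing the model at the two
    geometries; works elementwise on NumPy arrays of pixels."""
    return iof * llexp2(30.0, 0.0, 30.0) / llexp2(i_deg, e_deg, p_deg)
```

By construction, a pixel already observed at the standard geometry is returned unchanged, and the per-reflectance-bin variant of the routine would simply swap in a different coefficient set per bin.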
Disentangling the outflow and protostars in HH 900 in the Carina Nebula
NASA Astrophysics Data System (ADS)
Reiter, Megan; Smith, Nathan; Kiminki, Megan M.; Bally, John; Anderson, Jay
2015-04-01
HH 900 is a peculiar protostellar outflow emerging from a small, tadpole-shaped globule in the Carina Nebula. Previous Hα imaging with the Hubble Space Telescope (HST)/Advanced Camera for Surveys showed an ionized outflow with a wide opening angle that is distinct from the highly collimated structures typically seen in protostellar jets. We present new narrowband near-IR [Fe II] images taken with the Wide Field Camera 3 on the HST that reveal a remarkably different structure from that seen in Hα. In contrast to the unusual broad Hα outflow, the [Fe II] emission traces a symmetric, collimated bipolar jet with morphology and kinematics more typical of protostellar jets. In addition, new Gemini adaptive optics images reveal near-IR H2 emission coincident with the Hα emission, but not the [Fe II]. Spectra of these three components trace three separate and distinct velocity components: (1) H2 from the slow, entrained molecular gas, (2) Hα from the ionized skin of the accelerating outflow sheath, and (3) [Fe II] from the fast, dense, and collimated protostellar jet itself. Together, these data require a driving source inside the dark globule that remains undetected behind a large column density of material. In contrast, Hα and H2 emission trace the broad outflow of material entrained by the jet, which is irradiated outside the globule. As it gets dissociated and ionized, it remains visible for only a short time after it is dragged into the H II region.
Performance Assessment and Geometric Calibration of RESOURCESAT-2
NASA Astrophysics Data System (ADS)
Radhadevi, P. V.; Solanki, S. S.; Akilan, A.; Jyothi, M. V.; Nagasubramanian, V.
2016-06-01
Resourcesat-2 (RS-2) has successfully completed five years of operations in its orbit. This satellite has multi-resolution and multi-spectral capabilities in a single platform. A continuous and autonomous co-registration, geo-location and radiometric calibration of image data from different sensors with widely varying view angles and resolution was one of the challenges of RS-2 data processing. On-orbit geometric performance of RS-2 sensors has been widely assessed and calibrated during the initial phase operations. Since then, as an ongoing activity, various geometric performance data are being generated periodically. This is performed with sites of dense ground control points (GCPs). These parameters are correlated to the direct geo-location accuracy of the RS-2 sensors and are monitored and validated to maintain the performance. This paper brings out the geometric accuracy assessment, calibration and validation done for about 500 datasets of RS-2. The objectives of this study are to ensure the best absolute and relative location accuracy of different cameras, location performance with payload steering and co-registration of multiple bands. This is done using a viewing geometry model, given ephemeris and attitude data, precise camera geometry and datum transformation. In the model, the forward and reverse transformations between the coordinate systems associated with the focal plane, payload, body, orbit and ground are rigorously and explicitly defined. System level tests using comparisons to ground check points have validated the operational geo-location accuracy performance and the stability of the calibration parameters.
Radiometric calibration of wide-field camera system with an application in astronomy
NASA Astrophysics Data System (ADS)
Vítek, Stanislav; Nasyrova, Maria; Stehlíková, Veronika
2017-09-01
Camera response function (CRF) is widely used to describe the relationship between scene radiance and image brightness. The most common application of the CRF is High Dynamic Range (HDR) reconstruction of the radiance maps of imaged scenes from a set of frames with different exposures. The main goal of this work is to provide an overview of CRF estimation algorithms and compare their outputs with results obtained under laboratory conditions. These algorithms, typically designed for multimedia content, are unfortunately of little use with astronomical image data, mostly due to its nature (blur, noise, and long exposures). Therefore, we propose an optimization of selected methods for use in an astronomical imaging application. Results are experimentally verified on a wide-field camera system using a Digital Single Lens Reflex (DSLR) camera.
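A minimal sketch of the radiance-map reconstruction step referred to above, assuming a simple power-law CRF in place of an estimated one; the gamma value and the hat-shaped weighting are illustrative assumptions, not the paper's method:

```python
import numpy as np

def merge_hdr(frames, exposures, gamma=2.2):
    """Merge differently exposed frames into a relative radiance map.

    Assumes a power-law CRF (pixel value = radiance**(1/gamma), scaled
    to [0, 1]) standing in for an estimated CRF. A hat-shaped weight
    de-emphasizes under- and over-exposed pixels, where the CRF
    inversion is least reliable.
    """
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros_like(frames[0], dtype=float)
    for img, t in zip(frames, exposures):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight, peaks at mid-gray
        radiance = img**gamma / t           # invert CRF, divide by exposure
        num += w * radiance
        den += w
    return num / np.maximum(den, 1e-12)
```

With a consistent CRF, frames taken at different exposure times agree on the recovered radiance, so the weighted average simply suppresses clipped pixels.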
Report Of The HST Strategy Panel: A Strategy For Recovery
1991-01-01
orbit change out: the Wide Field/Planetary Camera II (WFPC II), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) and the Space ...are the Space Telescope Imaging Spectrograph (STIS), the Near-Infrared Camera and Multi-Object Spectrometer (NICMOS), and the second Wide Field and...expected to fail to lock due to duplicity was 20%; on-orbit data indicates that 10% may be a better estimate, but the guide stars were preselected
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
NASA Astrophysics Data System (ADS)
Ogawa, Kazunori; Shirai, Kei; Sawada, Hirotaka; Arakawa, Masahiko; Honda, Rie; Wada, Koji; Ishibashi, Ko; Iijima, Yu-ichi; Sakatani, Naoya; Nakazawa, Satoru; Hayakawa, Hajime
2017-07-01
An artificial impact experiment is scheduled for 2018-2019 in which an impactor will collide with asteroid 162173 Ryugu (1999 JU3) during the asteroid rendezvous phase of the Hayabusa2 spacecraft. The small carry-on impactor (SCI) will shoot a 2-kg projectile at 2 km/s to create a crater 1-10 m in diameter with an expected subsequent ejecta curtain on a 100-m scale on an ideal sandy surface. A miniaturized deployable camera (DCAM3) unit will separate from the spacecraft at about 1 km from the impact site and simultaneously conduct optical observations of the experiment. We designed and developed a camera system (DCAM3-D) in the DCAM3, specialized for scientific observations of impact phenomena, in order to clarify the subsurface structure, construct theories of impact applicable in a microgravity environment, and identify the impact point on the asteroid. The DCAM3-D system consists of a miniaturized camera with wide-angle and high-focusing performance, high-speed radio communication devices, and control units with large data storage on both the DCAM3 unit and the spacecraft. These components were successfully developed under severe constraints of size, mass and power, and the whole DCAM3-D system has passed all tests verifying functions, performance, and environmental tolerance. Results indicated sufficient potential to conduct the scientific observations during the SCI impact experiment. An operation plan was carefully considered along with the configuration and time schedule of the impact experiment, and pre-programmed into the control unit before launch. In this paper, we describe details of the system design concept, specifications, and the operating plan of the DCAM3-D system, focusing on the feasibility of scientific observations.
Camera Control and Geo-Registration for Video Sensor Networks
NASA Astrophysics Data System (ADS)
Davis, James W.
With the use of large video networks, there is a need to coordinate and interpret the video imagery for decision support systems with the goal of reducing the cognitive and perceptual overload of human operators. We present computer vision strategies that enable efficient control and management of cameras to effectively monitor wide-coverage areas, and examine the framework within an actual multi-camera outdoor urban video surveillance network. First, we construct a robust and precise camera control model for commercial pan-tilt-zoom (PTZ) video cameras. In addition to providing a complete functional control mapping for PTZ repositioning, the model can be used to generate wide-view spherical panoramic viewspaces for the cameras. Using the individual camera control models, we next individually map the spherical panoramic viewspace of each camera to a large aerial orthophotograph of the scene. The result provides a unified geo-referenced map representation to permit automatic (and manual) video control and exploitation of cameras in a coordinated manner. The combined framework provides new capabilities for video sensor networks that are of significance and benefit to the broad surveillance/security community.
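The mapping from calibrated pan-tilt coordinates into a shared panoramic representation can be illustrated with an equirectangular projection. The layout and angle conventions here are assumptions for illustration, not the paper's calibrated control model:

```python
def ptz_to_panorama(pan_deg, tilt_deg, pano_w, pano_h):
    """Map a PTZ pointing direction onto equirectangular panorama pixels.

    Assumed conventions: pan in [-180, 180) with 0 at the panorama's
    horizontal center; tilt in [-90, 90] with 0 on the horizon row.
    """
    x = (pan_deg + 180.0) / 360.0 * pano_w
    y = (90.0 - tilt_deg) / 180.0 * pano_h
    return x, y

def panorama_to_ptz(x, y, pano_w, pano_h):
    """Inverse mapping: panorama pixel back to a pan-tilt command,
    e.g. to slew a camera toward a target marked on the panorama."""
    pan_deg = x / pano_w * 360.0 - 180.0
    tilt_deg = 90.0 - y / pano_h * 180.0
    return pan_deg, tilt_deg
```

Composing the second mapping with a panorama-to-orthophoto registration is what lets an operator click a spot on the aerial map and have any covering camera slew to it.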
Optical design of portable nonmydriatic fundus camera
NASA Astrophysics Data System (ADS)
Chen, Weilin; Chang, Jun; Lv, Fengxian; He, Yifan; Liu, Xin; Wang, Dajiang
2016-03-01
Fundus cameras are widely used in the screening and diagnosis of retinal disease, and are simple, widely deployed medical instruments. Early fundus cameras dilated the pupil with a mydriatic to increase the amount of incoming light, which left patients with vertigo and blurred vision. Nonmydriatic designs are therefore the trend in fundus cameras. A desktop fundus camera is not easy to carry and is suitable only for use in the hospital, whereas a portable nonmydriatic retinal camera is convenient for patient self-examination or for medical staff visiting a patient at home. This paper presents a portable nonmydriatic fundus camera with a field of view (FOV) of 40°. Two kinds of light source are used: 590 nm light for imaging, and 808 nm light for observing the fundus at high resolving power. Ring lights and a hollow mirror are employed to suppress stray light from the cornea center. The camera is focused by repositioning the CCD along the optical axis; the diopter range covered is -20 m^-1 to +20 m^-1.
2017-12-08
NASA image release September 7, 2011 The Earth's moon has been an endless source of fascination for humanity for thousands of years. When at last Apollo 11 landed on the moon's surface in 1969, the crew found a desolate, lifeless orb, but one which still fascinates scientist and non-scientist alike. This image of the moon's north polar region was taken by the Lunar Reconnaissance Orbiter Camera, or LROC. One of the primary scientific objectives of LROC is to identify regions of permanent shadow and near-permanent illumination. Since the start of the mission, LROC has acquired thousands of Wide Angle Camera images approaching the north pole. From these images, scientists produced this mosaic, which is composed of 983 images taken over a one month period during northern summer. This mosaic shows the pole when it is best illuminated; regions that are in shadow are candidates for permanent shadow. Image Credit: NASA/GSFC/Arizona State University
Bio-inspired hemispherical compound eye camera
NASA Astrophysics Data System (ADS)
Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.
2014-03-01
Compound eyes in arthropods demonstrate distinct imaging characteristics from human eyes, with wide angle field of view, low aberrations, high acuity to motion and infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in the planar geometries and then integrated and elastically transformed to hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).
2017-10-09
Saturn's cloud belts generally move around the planet in a circular path, but one feature is slightly different. The planet's wandering, hexagon-shaped polar jet stream breaks the mold -- a reminder that surprises lurk everywhere in the solar system. This atmospheric feature was first observed by the Voyager mission in the early 1980s, and was dubbed "the hexagon." Cassini's visual and infrared mapping spectrometer was first to spy the hexagon during the mission, since it could see the feature's outline while the pole was still immersed in wintry darkness. The hexagon became visible to Cassini's imaging cameras as sunlight returned to the northern hemisphere. This view looks toward the northern hemisphere of Saturn -- in summer when this view was acquired -- from above 65 degrees north latitude. The image was taken with the Cassini spacecraft wide-angle camera on June 28, 2017 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 536,000 miles (862,000 kilometers) from Saturn. Image scale is 32 miles (52 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21348
NASA Astrophysics Data System (ADS)
Wierzbicki, Damian; Fryskowska, Anna; Kedzierski, Michal; Wojtkowska, Michalina; Delis, Paulina
2018-01-01
Unmanned aerial vehicles are suited to various photogrammetry and remote sensing missions. Such platforms are equipped with various optoelectronic sensors imaging in the visible and infrared spectral ranges and also thermal sensors. Nowadays, near-infrared (NIR) images acquired from low altitudes are often used for producing orthophoto maps for precision agriculture among other things. One major problem results from the application of low-cost custom and compact NIR cameras with wide-angle lenses introducing vignetting. In numerous cases, such cameras acquire low radiometric quality images depending on the lighting conditions. The paper presents a method of radiometric quality assessment of low-altitude NIR imagery data from a custom sensor. The method utilizes statistical analysis of NIR images. The data used for the analyses were acquired from various altitudes in various weather and lighting conditions. An objective NIR imagery quality index was determined as a result of the research. The results obtained using this index enabled the classification of images into three categories: good, medium, and low radiometric quality. The classification makes it possible to determine the a priori error of the acquired images and assess whether a rerun of the photogrammetric flight is necessary.
Fast soft x-ray images of magnetohydrodynamic phenomena in NSTX.
Bush, C E; Stratton, B C; Robinson, J; Zakharov, L E; Fredrickson, E D; Stutman, D; Tritz, K
2008-10-01
A variety of magnetohydrodynamic (MHD) phenomena have been observed on NSTX. Many of these affect fast particle losses, which are of major concern for future burning plasma experiments. Usual diagnostics for studying these phenomena are arrays of Mirnov coils for magnetic oscillations and p-i-n diode arrays for soft x-ray emission from the plasma core. Data reported here are from a unique fast soft x-ray imaging camera (FSXIC) with a wide-angle (pinhole) tangential view of the entire plasma minor cross section. The camera provides a 64x64 pixel image, on a charge coupled device chip, of light resulting from conversion of soft x rays incident on a phosphor to the visible. We have acquired plasma images at frame rates of 1-500 kHz (300 frames/shot) and have observed a variety of MHD phenomena: disruptions, sawteeth, fishbones, tearing modes, and edge localized modes (ELMs). New data including modes with frequency >90 kHz are also presented. Data analysis and modeling techniques used to interpret the FSXIC data are described and compared, and FSXIC results are compared to Mirnov and p-i-n diode array results.
NASA Technical Reports Server (NTRS)
2005-01-01
Saturn poses with Tethys in this Cassini view. The C ring casts thin, string-like shadows on the northern hemisphere. Above that lurks the shadow of the much denser B ring. Cloud bands in the atmosphere are subtly visible in the south. Tethys is 1,071 kilometers (665 miles) across. Cassini will perform a close flyby of Tethys on September 24, 2005. The image was taken on June 10, 2005, in visible green light with the Cassini spacecraft wide-angle camera at a distance of approximately 1.4 million kilometers (900,000 miles) from Saturn. The image scale is 81 kilometers (50 miles) per pixel. The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo. For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . The Cassini imaging team homepage is at http://ciclops.org .
Developing a Low-Cost System for 3D Data Acquisition
NASA Astrophysics Data System (ADS)
Kossieris, S.; Kourounioti, O.; Agrafiotis, P.; Georgopoulos, A.
2017-11-01
In this paper, a developed low-cost system is described which aims to facilitate fast and reliable 3D documentation by acquiring the necessary data in an outdoor environment for the 3D documentation of façades, especially in the case of very narrow streets. In particular, it provides a viable solution for buildings up to 8-10 m high and streets as narrow as 2 m or even less. In cases like that, it is practically impossible or highly time-consuming to acquire images in a conventional way, and doing so would lead to a huge number of images and long processing times. The developed system was tested in the narrow streets of a medieval village on the Greek island of Chios. There, in order to by-pass the problem of short taking distances, high-definition action cameras were used together with a 360° camera; such cameras are usually fitted with very wide-angle lenses, acquire images of high definition, are rather cheap and, most importantly, are extremely light. Results suggest that the system can perform fast 3D data acquisition adequate for deliverables of high quality.
System of technical vision for autonomous unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Bondarchuk, A. S.
2018-05-01
This paper is devoted to the implementation of an image recognition algorithm using the LabVIEW software. The created virtual instrument is designed to detect objects in frames from the camera mounted on the UAV. The trained classifier is invariant to changes in rotation, as well as to small changes in the camera's viewing angle. Finding objects in the image using particle analysis allows regions of different sizes to be classified. This method allows the system of technical vision to more accurately determine the location of the objects of interest and their movement relative to the camera.
NASA Astrophysics Data System (ADS)
Liu, Yu-Che; Huang, Chung-Lin
2013-03-01
This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. Three main concerns of the algorithm are (1) capturing the human object's face for biometric purposes, (2) the optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions such as the camera-subject distance, pan-tilt angles at capture, face visibility and others. This objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
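An objective function over capture conditions of this kind can be sketched as follows. The abstract does not give the exact form, so the individual terms, their weights, and the two helper functions below are illustrative assumptions:

```python
import math

def capture_score(dist_m, pan_deg, tilt_deg, face_visible,
                  ideal_dist=8.0, max_angle=45.0):
    """Toy objective scoring one candidate PTZ capture of a subject.

    Each term lies in [0, 1]; their product rewards captures that are
    simultaneously near the ideal distance, frontal (small pan/tilt),
    and face-on. `ideal_dist` and `max_angle` are assumed parameters.
    """
    d_term = math.exp(-((dist_m - ideal_dist) / ideal_dist) ** 2)
    a_term = max(0.0, 1.0 - (abs(pan_deg) + abs(tilt_deg)) / (2 * max_angle))
    f_term = 1.0 if face_visible else 0.2
    return d_term * a_term * f_term

def best_camera(candidates):
    """Pick the camera with the highest score for a subject.
    `candidates` is a list of (camera_id, dist, pan, tilt, face) tuples."""
    return max(candidates, key=lambda c: capture_score(*c[1:]))[0]
```

Hand-off time could enter as an additional multiplicative penalty on the slew required to reach the candidate pose, which is how the scoring would trade capture quality against the third concern above.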
NASA Technical Reports Server (NTRS)
1986-01-01
This image of Miranda, obtained by Voyager 2 on approach, shows an unusual 'chevron' figure and regions of distinctly differing terrain on the Uranian moon. Voyager was 42,000 kilometers (26,000 miles) away when its narrow-angle camera acquired this clear-filter view. Grooved areas bearing light and dark bands, distinct from other areas of mottled terrain, are visible at this resolution of about 600 meters (2,000 feet). The bright V-shaped feature in the grooved areas is the 'chevron' observed in earlier, lower-resolution images. Cutting across the bands are sinuous scarps, probably faults. Superimposed on both types of terrain are many bowl-shaped impact craters less than 5 km (3 mi) wide. The entire picture spans an area about 220 km (140 mi) across. The Voyager project is managed for NASA by the Jet Propulsion Laboratory.
1989-08-21
This picture of Neptune was produced from images taken through the ultraviolet, violet and green filters of the Voyager 2 wide-angle camera. This 'false' color image has been made to show clearly details of the cloud structure and to paint clouds located at different altitudes with different colors. Dark, deep-lying clouds tend to be masked in the ultraviolet wavelength since overlying air molecules are particularly effective in scattering sunlight there, which brightens the sky above them. Such areas appear dark blue in this photo. The Great Dark Spot (GDS) and the high southern latitudes have a deep bluish cast in this image, an indication that they are regions where visible light (but not ultraviolet light) may penetrate to a deeper layer of dark cloud or haze in Neptune's atmosphere. Conversely, the pinkish clouds may be positioned at high altitudes.
2014-08-18
Saturn reigns supreme, encircled by its retinue of rings. Although all four giant planets have ring systems, Saturn's is by far the most massive and impressive. Scientists are trying to understand why by studying how the rings have formed and how they have evolved over time. Also seen in this image is Saturn's famous north polar vortex and hexagon. This view looks toward the sunlit side of the rings from about 37 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on May 4, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 2 million miles (3 million kilometers) from Saturn. Image scale is 110 miles (180 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18278
Instrument Overview of the JEM-EUSO Mission
NASA Technical Reports Server (NTRS)
Kajino, F.; Yamamoto, T.; Sakata, M.; Yamamoto, Y.; Sato, H.; Ebizuka, N.; Ebisuzaki, T.; Uehara, Y.; Ohmori, H.; Kawasaki, Y.;
2007-01-01
JEM-EUSO, with a large and wide-angle telescope mounted on the International Space Station (ISS), has been planned as a space mission to explore extremes of the universe through the investigation of extreme energy cosmic rays by detecting photons which accompany air showers developed in the earth's atmosphere. JEM-EUSO will be launched by the Japanese H-II Transfer Vehicle (HTV) and mounted at the Exposed Facility of the Japanese Experiment Module (JEM/EF) of the ISS in the second phase of the utilization plan. The telescope consists of high-transmittance optical Fresnel lenses with a diameter of 2.5 m, 200k channels of multi-anode photomultiplier tubes, and focal surface front-end, readout, trigger and system electronics. An infrared camera and a LIDAR system will also be used to monitor the earth's atmosphere.
NASA Technical Reports Server (NTRS)
2005-01-01
1 January 2004 This red wide angle Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows Tikhonravov Crater in central Arabia Terra. The crater is about 386 km (240 mi) in diameter and presents two impact craters at its center that have dark patches of sand in them, giving the impression of pupils in two eyes. North (above) each of these two craters lies a dark-toned patch of surface material, providing the impression of eyebrows. M. K. Tikhonravov was a leading Russian rocket engineer in the 20th Century. The crater named for him, despite its large size, is still partly buried, on its west side, beneath the heavily cratered terrain of Arabia Terra. The center of Tikhonravov is near 13.5°N, 324.2°W. Sunlight illuminates the scene from the upper left.
Contact Angle Measurements Using a Simplified Experimental Setup
ERIC Educational Resources Information Center
Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric
2010-01-01
A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…
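The tangent-measurement step described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' ImageJ workflow: it fits a straight line to the drop-edge points nearest the baseline and returns the tangent angle. The circular-cap test profile and every parameter name are assumptions made for the demonstration.

```python
import numpy as np

def contact_angle_deg(profile_xy, baseline_y=0.0, window=10):
    """Estimate a static contact angle from edge points of one side of a
    drop profile: fit a line to the `window` points nearest the baseline
    and return the tangent angle in degrees."""
    order = np.abs(profile_xy[:, 1] - baseline_y).argsort()[:window]
    pts = profile_xy[order]
    slope, _ = np.polyfit(pts[:, 0], pts[:, 1], 1)  # dy/dx near the contact line
    return np.degrees(np.arctan(np.abs(slope)))

# Synthetic check: a circular-cap profile built with a known 60-degree angle.
theta, R = np.radians(60.0), 100.0
alphas = (theta - np.pi / 2) + np.linspace(0.0, 0.3, 60)  # arc from the contact point
center = np.array([-R * np.sin(theta), R * np.cos(theta)])
profile = center + R * np.column_stack([np.cos(alphas), np.sin(alphas)])
est = contact_angle_deg(profile)  # close to 60 degrees
```

Because the fitted chord averages the curvature over the window, the estimate is slightly biased high; shrinking the window (or the image magnification trick the paper uses) reduces the bias.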
3. Elevation view of entire midsection using ultrawide angle lens. ...
3. Elevation view of entire midsection using ultrawide angle lens. Note opened south doors and closed north doors. The following photo WA-203-C-4 is similar except the camera position was moved right to include the slope of the south end. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA
Camera calibration for multidirectional flame chemiluminescence tomography
NASA Astrophysics Data System (ADS)
Wang, Jia; Zhang, Weiguang; Zhang, Yuhong; Yu, Xun
2017-04-01
Flame chemiluminescence tomography (FCT), which combines computerized tomography theory and multidirectional chemiluminescence emission measurements, can realize instantaneous three-dimensional (3-D) diagnostics for flames with high spatial and temporal resolutions. One critical step of FCT is to record the projections by multiple cameras from different view angles. For high accuracy reconstructions, it requires that extrinsic parameters (the positions and orientations) and intrinsic parameters (especially the image distances) of cameras be accurately calibrated first. Taking the focus effect of the camera into account, a modified camera calibration method was presented for FCT, and a 3-D calibration pattern was designed to solve the parameters. The precision of the method was evaluated by reprojections of feature points to cameras with the calibration results. The maximum root mean square error of the feature points' position is 1.42 pixels and 0.0064 mm for the image distance. An FCT system with 12 cameras was calibrated by the proposed method and the 3-D CH* intensity of a propane flame was measured. The results showed that the FCT system provides reasonable reconstruction accuracy using the camera's calibration results.
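The quoted pixel-level precision is a reprojection check: the calibrated parameters are used to project the known 3-D feature points back into each camera, and the residuals are summarized as an RMS error. A minimal sketch under a plain pinhole model (the paper's modified method additionally calibrates the image distance, which is not reproduced here; all names are illustrative):

```python
import numpy as np

def project(K, R, t, pts_world):
    """Pinhole projection u ~ K (R X + t), followed by the perspective divide."""
    cam = pts_world @ R.T + t          # world frame -> camera frame
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3]

def rms_reprojection_error(K, R, t, pts_world, pts_image):
    """RMS distance (pixels) between observed and reprojected feature points."""
    d = project(K, R, t, pts_world) - pts_image
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))
```

With perfect parameters the error is zero; a uniform one-pixel offset of the observations yields an RMS of exactly one pixel, which is the scale on which the 1.42-pixel figure should be read.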
Target Acquisition for Projectile Vision-Based Navigation
2014-03-01
Future Work 20 8. References 21 Appendix A. Simulation Results 23 Appendix B. Derivation of Ground Resolution for a Diffraction-Limited Pinhole Camera...results for visual acquisition (left) and target recognition (right). ..........19 Figure B-1. Differential object and image areas for pinhole camera...projectile and target (measured in terms of the angle ) will depend on target heading. In particular, because we have aligned the x axis along the
Evaluation of Eye Metrics as a Detector of Fatigue
2010-03-01
eyeglass frames. The cameras are angled upward toward the eyes and extract real-time pupil diameter, eye-lid movement, and eye-ball movement. The...because the cameras were mounted on eyeglass-like frames, the system was able to continuously monitor the eye throughout all sessions. Overall, the...of “fitness for duty” testing and “real-time monitoring” of operator performance has been slow (Institute of Medicine, 2004). Oculometric-based
MISR Scans the Texas-Oklahoma Border
NASA Technical Reports Server (NTRS)
2000-01-01
These MISR images of Oklahoma and north Texas were acquired on March 12, 2000 during Terra orbit 1243. The three images on the left, from top to bottom, are from the 70-degree forward viewing camera, the vertical-viewing (nadir) camera, and the 70-degree aftward viewing camera. The higher brightness, bluer tinge, and reduced contrast of the oblique views result primarily from scattering of sunlight in the Earth's atmosphere, though some color and brightness variations are also due to differences in surface reflection at the different angles. The longer slant path through the atmosphere at the oblique angles also accentuates the appearance of thin, high-altitude cirrus clouds. On the right, two areas from the nadir camera image are shown in more detail, along with notations highlighting major geographic features. The south bank of the Red River marks the boundary between Texas and Oklahoma. Traversing brush-covered and grassy plains, rolling hills, and prairies, the Red River and the Canadian River are important resources for farming, ranching, public drinking water, hydroelectric power, and recreation. Both originate in New Mexico and flow eastward, their waters eventually discharging into the Mississippi River. A smoke plume to the north of the Ouachita Mountains and east of Lake Eufaula is visible in the detailed nadir imagery. The plume is also very obvious at the 70-degree forward view angle, to the right of center and about one-fourth of the way down from the top of the image. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Use of MEMs and optical sensors for closed loop heliostat control
NASA Astrophysics Data System (ADS)
Harper, Paul Julian; Dreijer, Janto; Malan, Karel; Larmuth, James; Gauche, Paul
2016-05-01
The Helio 100 project at STERG (Stellenbosch Solar Thermal Research Group) aims to help reduce the cost of Concentrated Solar Thermal plants by deploying large numbers of small (1x2 m) low cost heliostats. One of the methods employed to reduce the cost of the heliostat field is to have a field that requires no site preparation (grading, leveling, vegetation clearance) and no expensive foundations or concrete pouring for each individual heliostat base. This implies that the heliostat pod frames and vertical mounts might be slightly out of vertical, and the normal method of dead reckoning using accurately surveyed and aligned heliostat bases cannot be used. This paper describes a combination of MEMs and optical sensors on the back of the heliostat, that together with a simple machine learning approach, give accurate and reproducible azimuth and elevation information for the heliostat plane. Initial experiments were done with an Android phone mounted on the back of a heliostat, as it was a readily available platform combining accelerometers and a camera into one programmable package. It was found quite easy to determine the pointing angle of the heliostat to within 1 milliradian using the rear facing camera and correlating known heliostat angles with target image features on the ground. We also tested the accuracy at various image resolutions by halving the image size successively until the feature detection failed. This showed that even a VGA (640x480) resolution image could give mean errors of 1.5 milliradians. The optical technique is exceedingly simple and does not use any camera calibration, angular reconstruction or knowledge of heliostat drive geometry. We also tested the ability of the 3D accelerometers to determine angle, but this was coarser than the camera and only accurate to around 10 milliradians.
An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring.
Zhao, Yifan; Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros
2017-11-22
Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may become a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to decide how well the driver will be able to take over control of the vehicle. One limitation of the commonly used face-based head tracking system, using cameras, is that sufficient features of the face must be visible, which limits the detectable angle of head movement and thereby measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation sensor based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement error in the shaking and nodding axes were less than 0.4°, while error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house tests and on-road tests, showed that the main advantage of the proposed system is the ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the measurement of the shaking and nodding angles, produced from the proposed system, can effectively characterise the drivers' behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone.
Calibration of RGBD camera and cone-beam CT for 3D intra-operative mixed reality visualization.
Lee, Sing Chun; Fuerst, Bernhard; Fotouhi, Javad; Fischer, Marius; Osgood, Greg; Navab, Nassir
2016-06-01
This work proposes a novel algorithm to register cone-beam computed tomography (CBCT) volumes and 3D optical (RGBD) camera views. The co-registered real-time RGBD camera and CBCT imaging enable a novel augmented reality solution for orthopedic surgeries, which allows arbitrary views using digitally reconstructed radiographs overlaid on the reconstructed patient's surface without the need to move the C-arm. An RGBD camera is rigidly mounted on the C-arm near the detector. We introduce a calibration method based on the simultaneous reconstruction of the surface and the CBCT scan of an object. The transformation between the two coordinate spaces is recovered using Fast Point Feature Histogram descriptors and the Iterative Closest Point algorithm. Several experiments are performed to assess the repeatability and the accuracy of this method. Target registration error is measured on multiple visual and radio-opaque landmarks to evaluate the accuracy of the registration. Mixed reality visualizations from arbitrary angles are also presented for simulated orthopedic surgeries. To the best of our knowledge, this is the first calibration method which uses only tomographic and RGBD reconstructions. This means that the method does not impose a particular shape of the phantom. We demonstrate a marker-less calibration of CBCT volumes and 3D depth cameras, achieving reasonable registration accuracy. This design requires a one-time factory calibration, is self-contained, and could be integrated into existing mobile C-arms to provide real-time augmented reality views from arbitrary angles.
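The rigid transformation recovered by the Iterative Closest Point algorithm is, at each iteration, a closed-form least-squares alignment of matched point sets. Below is a minimal sketch of that inner (Kabsch/SVD) step only, with all names chosen for illustration; the paper's pipeline additionally uses FPFH descriptors for the initial correspondence.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares R, t with dst ≈ src @ R.T + t -- the closed-form
    (Kabsch/SVD) alignment solved at every ICP iteration once the
    point correspondences are fixed."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs
```

ICP alternates this solve with re-matching each source point to its nearest destination point until the alignment stops improving.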
NASA Technical Reports Server (NTRS)
1978-01-01
The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.
Wide-field Fourier ptychographic microscopy using laser illumination source
Chung, Jaebum; Lu, Hangwen; Ou, Xiaoze; Zhou, Haojiang; Yang, Changhuei
2016-01-01
Fourier ptychographic (FP) microscopy is a coherent imaging method that can synthesize an image with a higher bandwidth using multiple low-bandwidth images captured at different spatial frequency regions. The method’s demand for multiple images drives the need for a brighter illumination scheme and a high-frame-rate camera for a faster acquisition. We report the use of a guided laser beam as an illumination source for an FP microscope. It uses a mirror array and a 2-dimensional scanning Galvo mirror system to provide a sample with plane-wave illuminations at diverse incidence angles. The use of a laser presents speckles in the image capturing process due to reflections between glass surfaces in the system. They appear as slowly varying background fluctuations in the final reconstructed image. We are able to mitigate these artifacts by including a phase image obtained by differential phase contrast (DPC) deconvolution in the FP algorithm. We use a 1-Watt laser configured to provide a collimated beam with 150 mW of power and beam diameter of 1 cm to allow for the total capturing time of 0.96 seconds for 96 raw FPM input images in our system, with the camera sensor’s frame rate being the bottleneck for speed. We demonstrate a factor of 4 resolution improvement using a 0.1 NA objective lens over the full camera field-of-view of 2.7 mm by 1.5 mm. PMID:27896016
Study on the initial value for the exterior orientation of the mobile version
NASA Astrophysics Data System (ADS)
Yu, Zhi-jing; Li, Shi-liang
2011-10-01
The single-camera mobile vision coordinate measurement system uses a single camera body and a notebook computer on the measurement site to obtain three-dimensional coordinates. Obtaining accurate approximate values of the exterior orientation for the subsequent calculation is very important in the measurement process. This is a typical space resection problem, which has been widely studied. Single-image space resection mainly follows two approaches: methods based on co-angular constraints, represented by the camera co-angular-constraint pose estimation algorithm and the cone angle method, and the direct linear transformation (DLT). A common drawback of both is that CCD lens distortion is not considered. When the initial value is calculated with the direct linear transformation method, relatively high demands are placed on the distribution and abundance of control points: the control points must not all lie in the same plane, and at least six non-coplanar control points are required. This limits its usefulness. The initial value directly influences the convergence and convergence speed of the calculation. This paper linearizes the nonlinear collinearity equations, including distortion terms, by Taylor series expansion, and uses them to calculate the initial value of the camera exterior orientation. Finally, experiments show that the resulting initial value is improved.
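The direct linear transformation mentioned above can be sketched compactly: each control point yields two homogeneous linear equations in the twelve entries of the projection matrix, and with at least six non-coplanar points the matrix is recovered (up to scale) from the SVD null vector. A minimal illustration, ignoring lens distortion exactly as the abstract criticizes; all names are chosen for the demonstration:

```python
import numpy as np

def dlt_projection_matrix(pts_world, pts_image):
    """Direct linear transformation: recover the 3x4 projection matrix P
    (up to scale) from at least six non-coplanar control points, using
    two homogeneous linear equations per point and the SVD null vector."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts_world, pts_image):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)   # right singular vector of the smallest singular value
```

Decomposing the recovered P then yields the approximate exterior orientation used to start the iterative adjustment.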
System Synchronizes Recordings from Separated Video Cameras
NASA Technical Reports Server (NTRS)
Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.
2009-01-01
A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
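The quoted repetition period is consistent with a seconds counter of 32-bit width. The abstract does not specify the encoding, so the arithmetic below is only a plausibility check of the "slightly more than 136 years" figure, not a description of the Geo-TimeCode format:

```python
# A 32-bit count of seconds rolls over after 2**32 seconds.
SECONDS_PER_JULIAN_YEAR = 365.25 * 24 * 3600        # 31,557,600 s
rollover_years = 2 ** 32 / SECONDS_PER_JULIAN_YEAR  # about 136.1 years
```

By contrast, the 24-hour wrap of conventional SMPTE-style time codes is what makes them unsuitable for the long-term multi-camera deployments described above.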
Snowstorm Along the China-Mongolia-Russia Borders
NASA Technical Reports Server (NTRS)
2004-01-01
Heavy snowfall on March 12, 2004, across north China's Inner Mongolia Autonomous Region, Mongolia and Russia, caused train and highway traffic to stop for several days along the Russia-China border. This pair of images from the Multi-angle Imaging SpectroRadiometer (MISR) highlights the snow and surface properties across the region on March 13. The left-hand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The right-hand image is a multi-angle false-color view made from the red band data of the 46-degree aftward camera, the nadir camera, and the 46-degree forward camera. About midway between the frozen expanse of China's Hulun Nur Lake (along the right-hand edge of the images) and Russia's Torey Lakes (above image center) is a dark linear feature that corresponds with the China-Mongolia border. In the upper portion of the images, many small plumes of black smoke rise from coal and wood fires and blow toward the southeast over the frozen lakes and snow-covered grasslands. Along the upper left-hand portion of the images, in Russia's Yablonovyy mountain range and the Onon River Valley, the terrain becomes more hilly and forested. In the nadir image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the multi-angle composite, open-canopy forested areas are indicated by green hues. Since this is a multi-angle composite, the green color arises not from the color of the leaves but from the architecture of the surface cover. The green areas appear brighter at the nadir angle than at the oblique angles because more of the snow-covered surface in the gaps between the trees is visible. Color variations in the multi-angle composite also indicate angular reflectance properties for areas covered by snow and ice. 
The light blue color of the frozen lakes is due to the increased forward scattering of smooth ice, and light orange colors indicate rougher ice or snow, which scatters more light in the backward direction. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire Earth between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 22525. The panels cover an area of about 355 kilometers x 380 kilometers, and utilize data from blocks 50 to 52 within World Reference System-2 path 126. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Efficient large-scale graph data optimization for intelligent video surveillance
NASA Astrophysics Data System (ADS)
Shang, Quanhong; Zhang, Shujun; Wang, Yanbo; Sun, Chen; Wang, Zepeng; Zhang, Luming
2017-08-01
Society is rapidly adopting cameras in a wide variety of locations and applications: traffic monitoring, parking lot surveillance, and smart spaces. These cameras provide data every day that must be analyzed effectively. Recent advances in sensor manufacturing, communications, and computing are stimulating the development of new applications that transform traditional vision systems into pervasive smart camera networks. The analysis of visual cues in multi-camera networks enables a wide range of applications, from smart home and office automation to large-area and traffic surveillance. While dense camera networks, in which most cameras have large overlapping fields of view, are well studied, we focus on sparse camera networks. A sparse camera network covers a large area with as few cameras as possible, so that most cameras do not overlap each other's field of view. This task is challenging because of the lack of knowledge of the network topology, the changes in target appearance and motion across different views, and the difficulty of understanding complex events in the network. In this paper, we present a comprehensive survey of recent research results addressing topology learning, object appearance modeling, and global activity understanding in sparse camera networks. In addition, some current open research issues are discussed.
NASA Astrophysics Data System (ADS)
Wolfe, C. A.; Lemmon, M. T.
2015-12-01
Dust in the Martian atmosphere influences energy deposition, dynamics, and the viability of solar powered exploration vehicles. The Viking, Pathfinder, Spirit, Opportunity, Phoenix, and Curiosity landers and rovers each included the ability to image the Sun with a science camera equipped with a neutral density filter. Direct images of the Sun not only provide the ability to measure extinction by dust and ice in the atmosphere, but also provide a variety of constraints on the Martian dust and water cycles. These observations have been used to characterize dust storms, to provide ground truth sites for orbiter-based global measurements of dust loading, and to help monitor solar panel performance. In the cost-constrained environment of Mars exploration, future missions may omit such cameras, as the solar-powered InSight mission has. We seek to provide a robust capability of determining atmospheric opacity from sky images taken with cameras that have not been designed for solar imaging, such as the engineering cameras onboard Opportunity and the Mars Hand Lens Imager (MAHLI) on Curiosity. Our investigation focuses primarily on the accuracy of a method that determines optical depth values using scattering models that implement the ratio of sky radiance measurements at different elevation angles, but at the same scattering angle. Operational use requires the ability to retrieve optical depth on a timescale useful to mission planning, and with an accuracy and precision sufficient to support both mission planning and validating orbital measurements. We will present a simulation-based assessment of imaging strategies and their error budgets, as well as a validation based on the comparison of direct extinction measurements from archival Navcam, Hazcam, and MAHLI camera data.
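The direct-extinction measurements used for validation follow Beer-Lambert attenuation along the slant path to the Sun. Below is a minimal sketch assuming a plane-parallel airmass; the sky-radiance-ratio retrieval itself requires a scattering model and is not reproduced here, and the function and parameter names are illustrative.

```python
import numpy as np

def optical_depth(I_measured, I_exo, elevation_deg):
    """Beer-Lambert retrieval from a direct solar image:
    I = I_exo * exp(-tau * m), with plane-parallel airmass
    m = 1/sin(elevation), so tau = -ln(I / I_exo) * sin(elevation)."""
    m = 1.0 / np.sin(np.radians(elevation_deg))
    return -np.log(I_measured / I_exo) / m

# Example: at 30 degrees elevation (airmass 2), an attenuation of
# exp(-1.6) corresponds to a column optical depth of 0.8.
tau = optical_depth(np.exp(-1.6), 1.0, 30.0)
```

The plane-parallel approximation breaks down near the horizon, which is one reason imaging strategies at different elevation angles carry different error budgets.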
Curiosity Rover View of Alluring Martian Geology Ahead
2015-08-05
A southward-looking panorama combining images from both cameras of the Mast Camera (Mastcam) instrument on NASA's Curiosity Mars Rover shows diverse geological textures on Mount Sharp. Three years after landing on Mars, the mission is investigating this layered mountain for evidence about changes in Martian environmental conditions, from an ancient time when conditions were favorable for microbial life to the much-drier present. Gravel and sand ripples fill the foreground, typical of terrains that Curiosity traversed to reach Mount Sharp from its landing site. Outcrops in the midfield are of two types: dust-covered, smooth bedrock that forms the base of the mountain, and sandstone ridges that shed boulders as they erode. Rounded buttes in the distance contain sulfate minerals, perhaps indicating a change in the availability of water when they formed. Some of the layering patterns on higher levels of Mount Sharp in the background are tilted at different angles than others, evidence of complicated relationships still to be deciphered. The scene spans from southeastward at left to southwestward at right. The component images were taken on April 10 and 11, 2015, the 952nd and 953rd Martian days (or sols) since the rover's landing on Mars on Aug. 6, 2012, UTC (Aug. 5, PDT). Images in the central part of the panorama are from Mastcam's right-eye camera, which is equipped with a 100-millimeter-focal-length telephoto lens. Images used in outer portions, including the most distant portions of the mountain in the scene, were taken with Mastcam's left-eye camera, using a wider-angle, 34-millimeter lens. http://photojournal.jpl.nasa.gov/catalog/PIA19803
Sub-picosecond streak camera measurements at LLNL: From IR to x-rays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuba, J; Shepherd, R; Booth, R
An ultrafast, sub-picosecond resolution streak camera has recently been developed at LLNL. The camera is a versatile instrument with a wide operating wavelength range. A temporal resolution of up to 300 fs can be achieved, with routine operation at 500 fs. The streak camera has been operated over a wide wavelength range, from IR to x-rays up to 2 keV. In this paper we briefly review the main design features that result in the unique properties of the streak camera and present several of its scientific applications: (1) streak camera characterization using a Michelson interferometer in the visible range, (2) a temporally resolved study of a transient x-ray laser at 14.7 nm, which enabled us to vary the x-ray laser pulse duration from ~2-6 ps by changing the pump laser parameters, and (3) an example of a time-resolved spectroscopy experiment with the streak camera.
Fast Orientation of Video Images of Buildings Acquired from a UAV without Stabilization.
Kedzierski, Michal; Delis, Paulina
2016-06-23
The aim of this research was to assess the possibility of conducting an absolute orientation procedure for video imagery, in which the external orientation for the first image was typical for aerial photogrammetry whereas the external orientation of the second was typical for terrestrial photogrammetry. Starting from the collinearity equations, assuming that the camera tilt angle is equal to 90°, a simplified mathematical model is proposed. The proposed method can be used to determine the X, Y, Z coordinates of points based on a set of collinearity equations of a pair of images. The use of simplified collinearity equations can considerably shorten the processing time of image data from Unmanned Aerial Vehicles (UAVs), especially in low cost systems. The conducted experiments have shown that it is possible to carry out a complete photogrammetric project of an architectural structure using a camera tilted 85°-90° (φ or ω) and simplified collinearity equations. It is also concluded that there is a correlation between the speed of the UAV and the discrepancy between the established and actual camera tilt angles.
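The intersection of a pair of collinearity equation sets can be illustrated with standard linear triangulation: each observed image point contributes two homogeneous equations in the object coordinates. This sketch uses generic projection matrices rather than the paper's simplified 90°-tilt model, and all names are chosen for the demonstration:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Intersect two image rays: each observed point (u, v) turns the
    collinearity condition into two homogeneous linear equations in the
    object point, and the SVD null vector of the stacked 4x4 system is
    the least-squares intersection (X, Y, Z, W)."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # dehomogenize
```

The speed gain reported in the abstract comes from the 90°-tilt assumption simplifying the rotation matrices inside P1 and P2, not from changing this intersection step.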
Test Rover at JPL During Preparation for Mars Rover Low-Angle Selfie
2015-08-19
This view of a test rover at NASA's Jet Propulsion Laboratory, Pasadena, California, results from advance testing of arm positions and camera pointings for taking a low-angle self-portrait of NASA's Curiosity Mars rover. This rehearsal in California led to a dramatic Aug. 5, 2015, selfie of Curiosity, online at PIA19807. Curiosity's arm-mounted Mars Hand Lens Imager (MAHLI) camera took the 92 component images that were assembled into that mosaic. The rover team positioned the camera lower in relation to the rover body than for any previous full self-portrait of Curiosity. This practice version was taken at JPL's Mars Yard in July 2013, using the Vehicle System Test Bed (VSTB) rover, which has a test copy of MAHLI on its robotic arm. MAHLI was built by Malin Space Science Systems, San Diego. JPL, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19810
SU-F-J-200: An Improved Method for Event Selection in Compton Camera Imaging for Particle Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackin, D; Beddar, S; Polf, J
2016-06-15
Purpose: The uncertainty in the beam range in particle therapy limits the conformality of the dose distributions. Compton scatter cameras (CC), which measure the prompt gamma rays produced by nuclear interactions in the patient tissue, can reduce this uncertainty by producing 3D images confirming the particle beam range and dose delivery. However, the high intensity and short time windows of the particle beams limit the number of gammas detected. We attempt to address this problem by developing a method for filtering gamma ray scattering events from the background by applying the known gamma ray spectrum. Methods: We used a 4-stage Compton camera to record in list mode the energy deposition and scatter positions of gammas from a Co-60 source. Each CC stage contained a 4×4 array of CdZnTe crystals. To produce images, we used a back-projection algorithm and one of four filtering methods: basic, energy windowing, delta energy (ΔE), or delta scattering angle (Δθ). Basic filtering requires events to be physically consistent. Energy windowing requires event energy to fall within a defined range. ΔE filtering selects events with the minimum difference between the measured and a known gamma energy (1.17 and 1.33 MeV for Co-60). Δθ filtering selects events with the minimum difference between the measured scattering angle and the angle corresponding to a known gamma energy. Results: Energy window filtering reduced the FWHM from 197.8 mm for basic filtering to 78.3 mm. ΔE and Δθ filtering achieved the best results, FWHMs of 64.3 and 55.6 mm, respectively. In general, Δθ filtering selected events with scattering angles < 40°, while ΔE filtering selected events with angles > 60°. Conclusion: Filtering CC events improved the quality and resolution of the corresponding images. ΔE and Δθ filtering produced similar results but each favored different events.
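The ΔE and Δθ criteria can be sketched as follows. The Co-60 lines come from the abstract; the event representation, tolerance, and function names are illustrative assumptions, and the Δθ part shows only the Compton-kinematics angle that the measured geometric angle would be compared against.

```python
import numpy as np

ME_C2 = 0.511  # electron rest energy in MeV

def kinematic_cos_theta(E0, E1):
    """Scattering angle implied by Compton kinematics for an incident
    gamma of energy E0 depositing E1 in its first scatter:
    cos(theta) = 1 - me*c^2 * (1/(E0 - E1) - 1/E0)."""
    return 1.0 - ME_C2 * (1.0 / (E0 - E1) - 1.0 / E0)

def delta_e_filter(event_energies, known_lines=(1.17, 1.33), tol=0.1):
    """Keep events whose summed deposited energy falls within tol (MeV)
    of the nearest known emission line (Co-60 lines by default)."""
    return [E for E in event_energies
            if min(abs(sum(E) - line) for line in known_lines) < tol]
```

A Δθ filter would keep events where the geometric angle between the first two hit positions is closest to the angle returned by `kinematic_cos_theta` for one of the known lines.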
Calibration Image of Earth by Mars Color Imager
NASA Technical Reports Server (NTRS)
2005-01-01
Three days after the Mars Reconnaissance Orbiter's Aug. 12, 2005, launch, the NASA spacecraft was pointed toward Earth and the Mars Color Imager camera was powered up to acquire a suite of color and ultraviolet images of Earth and the Moon. When it gets to Mars, the Mars Color Imager's main objective will be to obtain daily global color and ultraviolet images of the planet to observe martian meteorology by documenting the occurrence of dust storms, clouds, and ozone. This camera will also observe how the martian surface changes over time, including changes in frost patterns and surface brightness caused by dust storms and dust devils. The purpose of acquiring an image of Earth and the Moon just three days after launch was to help the Mars Color Imager science team obtain a measure, in space, of the instrument's sensitivity, as well as to check that no contamination occurred on the camera during launch. Prior to launch, the team determined that, three days out from Earth, the planet would only be about 4.77 pixels across, and the Moon would be less than one pixel in size, as seen from the Mars Color Imager's wide-angle perspective. If the team waited any longer than three days to test the camera's performance in space, Earth would be too small to obtain meaningful results. The images were acquired by turning Mars Reconnaissance Orbiter toward Earth, then slewing the spacecraft so that the Earth and Moon would pass before each of the five color and two ultraviolet filters of the Mars Color Imager. The distance to Earth was about 1,170,000 kilometers (about 727,000 miles). This image shows a color composite view of Mars Color Imager's image of Earth. As expected, it covers only five pixels. This color view has been enlarged five times. The Sun was illuminating our planet from the left, thus only one quarter of Earth is seen from this perspective. 
North America was in daylight and facing toward the camera at the time the picture was taken; the data from the camera were being transmitted in real time to the Deep Space Network antennas in Goldstone, California.
HF-induced airglow structure as a proxy for ionospheric irregularity detection
NASA Astrophysics Data System (ADS)
Kendall, E. A.
2013-12-01
The High Frequency Active Auroral Research Program (HAARP) heating facility allows scientists to test current theories of plasma physics to gain a better understanding of the underlying mechanisms at work in the lower ionosphere. One powerful technique for diagnosing radio frequency interactions in the ionosphere is to use ground-based optical instrumentation. High-frequency (HF), heater-induced artificial airglow observations can be used to diagnose electron energies and distributions in the heated region, illuminate natural and/or artificially induced ionospheric irregularities, determine ExB plasma drifts, and measure quenching rates by neutral species. Artificial airglow is caused by HF-accelerated electrons colliding with various atmospheric constituents, which in turn emit a photon. The most common emissions are 630.0 nm O(1D), 557.7 nm O(1S), and 427.8 nm N2+(1NG). Because more photons will be emitted in regions of higher electron energization, it may be possible to use airglow imaging to map artificial field-aligned irregularities at a particular altitude range in the ionosphere. Since fairly wide field-of-view imagers are typically deployed in airglow campaigns, it is not well-known what meter-scale features exist in the artificial airglow emissions. Rocket data show that heater-induced electron density variations, or irregularities, consist of bundles of ~10-m-wide magnetic field-aligned filaments with a mean depletion depth of 6% [Kelley et al., 1995]. These bundles themselves constitute small-scale structures with widths of 1.5 to 6 km. Telescopic imaging provides high resolution spatial coverage of ionospheric irregularities and goes hand in hand with other observing techniques such as GPS scintillation, radar, and ionosonde. 
Since airglow observations can presumably image ionospheric irregularities (electron density variations), they can be used to determine the spatial scale variation, the fill factor, and the lifetime characteristics of irregularities. Telescopic imaging of airglow is a technique capable of simultaneously determining the properties of ionospheric irregularities at decameter resolution over a range of several kilometers. The HAARP telescopic imager consists of two cameras, a set of optics for each camera, and a robotic mount that supports and orients the system. The camera and optics systems are identical except for the camera lenses: one has a wide-angle lens (~19 degrees) and the other has a telescopic lens (~3 degrees). The telescopic imager has a resolution of ~20 m in the F layer and ~10 m in the E layer, which allows the observation of decameter- and kilometer-scale features. Analysis of telescopic data from HAARP campaigns over the last five years will be presented.
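The quoted decameter resolutions follow from simple geometry: the ground-projected pixel size is the lens footprint at the emission altitude divided by the detector width in pixels. A back-of-envelope check, assuming a 512-pixel detector and nominal layer altitudes of 100 km (E) and 200 km (F), neither of which is stated in the abstract:

```python
import math

# Rough check of the quoted telescopic-imager resolutions: the projected
# pixel size at altitude h is 2 * h * tan(FOV/2) / N_pixels. The 512-pixel
# detector width and the layer altitudes are illustrative assumptions.

def pixel_resolution_m(altitude_m, fov_deg=3.0, n_pixels=512):
    footprint_m = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))
    return footprint_m / n_pixels

print(round(pixel_resolution_m(100e3)))  # E layer (~100 km): ~10 m
print(round(pixel_resolution_m(200e3)))  # F layer (~200 km): ~20 m
```

Both values reproduce the ~10 m (E layer) and ~20 m (F layer) figures quoted for the HAARP telescopic imager under these assumed altitudes.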
MOC View of Mars98 Landing Zone - 12/24/97
NASA Technical Reports Server (NTRS)
1998-01-01
On 12/24/1997, shortly after 08:17 UTC SCET, the Mars Global Surveyor Mars Orbiter Camera (MOC) took this high resolution image of a small portion of the potential Mars Surveyor '98 landing zone. For the purposes of planning MOC observations, this zone was defined as 75 +/- 2 degrees S latitude, 215 +/- 15 degrees W longitude. The image ran along the western perimeter of the Mars98 landing zone (e.g., near 245 degrees W longitude). At that longitude, the layered deposits are farther south than at the prime landing longitude. The images were shifted in latitude to fall onto the layered deposits. The location of the image was selected to try to cover a range of possible surface morphologies, reliefs, and albedos.
This image is approximately 81.5 km long by 31 km wide. It covers an area of about 2640 sq. km. The center of the image is at 80.46 degrees S, 243.12 degrees W. The viewing conditions are: emission angle 56.30 degrees, incidence angle 58.88 degrees, phase angle 30.31 degrees, and 15.15 meters/pixel resolution. North is to the top of the image. The effects of ground fog, which obscures the surface features (left), have been minimized by filtering (right). Malin Space Science Systems (MSSS) and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
MOC View of Mars98 Landing Zone - 12/24/97
NASA Technical Reports Server (NTRS)
1998-01-01
On 12/24/1997, shortly after 08:17 UTC SCET, the Mars Global Surveyor Mars Orbiter Camera (MOC) took this high resolution image of a small portion of the potential Mars Surveyor '98 landing zone. For the purposes of planning MOC observations, this zone was defined as 75 +/- 2 degrees S latitude, 215 +/- 15 degrees W longitude. The image ran along the western perimeter of the Mars98 landing zone (e.g., near 245 degrees W longitude). At that longitude, the layered deposits are farther south than at the prime landing longitude. The images were shifted in latitude to fall onto the layered deposits. The location of the image was selected to try to cover a range of possible surface morphologies, reliefs, and albedos.
This image is approximately 83.3 km long by 31.7 km wide. It covers an area of about 2750 sq. km. The center of the image is at 81.97 degrees S, 246.74 degrees W. The viewing conditions are: emission angle 58.23 degrees, incidence angle 60.23 degrees, phase angle 30.34 degrees, and 15.49 meters/pixel resolution. North is to the top of the image. The effects of ground fog, which obscures the surface features (left), have been minimized by filtering (right). Malin Space Science Systems (MSSS) and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.
Song, Shaozhen; Xu, Jingjiang; Wang, Ruikang K
2016-11-01
Current optical coherence tomography (OCT) imaging suffers from short ranging distance and narrow imaging field of view (FOV). There is growing interest in searching for solutions to these limitations in order to expand further in vivo OCT applications. This paper describes a solution where we utilize an akinetic swept source for OCT implementation to enable ~10 cm ranging distance, associated with the use of a wide-angle camera lens in the sample arm to provide a FOV of ~20 × 20 cm². The akinetic swept source operates at 1300 nm central wavelength with a bandwidth of 100 nm. We propose an adaptive calibration procedure to the programmable akinetic light source so that the sensitivity of the OCT system over ~10 cm ranging distance is substantially improved for imaging of large volume samples. We demonstrate the proposed swept source OCT system for in vivo imaging of entire human hands and faces with an unprecedented FOV (up to 400 cm²). The capability of large-volume OCT imaging with ultra-long ranging and ultra-wide FOV is expected to bring new opportunities for in vivo biomedical applications.
Rosetta/OSIRIS: Nucleus morphology and activity of comet 67P/Churyumov-Gerasimenko
NASA Astrophysics Data System (ADS)
Sierks, Holger
2015-08-01
Introduction: The Rosetta mission of the European Space Agency arrived at the target comet 67P/Churyumov-Gerasimenko on August 6, 2014, after 10 years of cruise. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. It comprises a Narrow Angle Camera (NAC) for broad-band nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field coma investigations. OSIRIS has imaged the nucleus and the coma of comet 67P/C-G from arrival through the early mapping phase, the PHILAE landing, and the escort phase, with a close fly-by at the beginning of 2015. This paper presents the surface morphology and activity of the nucleus, as seen in gas, dust, and local jets, together with the larger-scale coma studied by OSIRIS. Acknowledgements: OSIRIS was built by a consortium led by the Max-Planck-Institut für Sonnensystemforschung, Göttingen, Germany, in collaboration with CISAS, University of Padova, Italy, the Laboratoire d'Astrophysique de Marseille, France, the Instituto de Astrofísica de Andalucia, CSIC, Granada, Spain, the Scientific Support Office of the European Space Agency, Noordwijk, The Netherlands, the Instituto Nacional de Técnica Aeroespacial, Madrid, Spain, the Universidad Politéchnica de Madrid, Spain, the Department of Physics and Astronomy of Uppsala University, Sweden, and the Institut für Datentechnik und Kommunikationsnetze der Technischen Universität Braunschweig, Germany. Additional Information: The OSIRIS team is H. Sierks, C. Barbieri, P. Lamy, R. Rodrigo, D. Koschny, H. Rickman, J. Agarwal, M. A'Hearn, I. Bertini, F. Angrilli, M. A. Barucci, J. L. Bertaux, G. Cremonese, V. Da Deppo, B. Davidsson, S. Debei, M. De Cecco, S. Fornasier, M. Fulle, O. Groussin, C. Güttler, P. Gutierrez, S. Hviid, W. Ip, L. Jorda, H. U. Keller, J. Knollenberg, R. Kramm, E. Kührt, M. Küppers, L. Lara, M. Lazzarin, J. J. Lopez, S. Lowry, S. Marchi, F. Marzari, H. Michalik, S. Mottola, G. Naletto, N. Oklay, L. Sabau, N.
Thomas, C. Tubiana, J-B. Vincent, P. Wenzel, Associate Scientists & Assistants.
Storrie-Lombardi, Michael C; Muller, Jan-Peter; Fisk, Martin R; Cousins, Claire; Sattler, Birgit; Griffiths, Andrew D; Coates, Andrew J
2009-12-01
The European Space Agency will launch the ExoMars mission in 2016 with a primary goal of surveying the martian subsurface for evidence of organic material. We have recently investigated the utility of including either a 365 nm light-emitting diode or a 375 nm laser light source in the ExoMars rover panoramic camera (PanCam). Such a modification would make it feasible to monitor rover drill cuttings optically for the fluorescence signatures of aromatic organic molecules and map the distribution of polycyclic aromatic hydrocarbons (PAHs) as a function of depth to the 2 m limit of the ExoMars drill. The technique described requires no sample preparation, does not consume irreplaceable resources, and would allow mission control to prioritize deployment of organic detection experiments that require sample destruction, expenditure of non-replaceable consumables, or both. We report here for the first time laser-induced fluorescence emission (L.I.F.E.) imaging detection limits for anthracene, pyrene, and perylene targets doped onto a Mars analog granular peridotite with a 375 nm Nichia laser diode in optically uncorrected wide-angle mode. Data were collected via the Beagle 2 PanCam backup filter wheel fitted with original blue (440 nm), green (530 nm), and red (670 nm) filters. All three PAH species can be detected with the PanCam green (530 nm) filter. Detection limits in the green band for signal-to-noise ratios (S/N) > 10 are 49 parts per million (ppm) for anthracene, 145 ppm for pyrene, and 20 ppm for perylene. The anthracene detection limit improves to 7 ppm with use of the PanCam blue filter. We discuss soil-dependent detection limit constraints; use of UV excitation with other rover cameras, which provides higher spatial resolution; and the advantages of focused and wide-angle laser modes. Finally, we discuss application of L.I.F.E. 
techniques at multiple wavelengths for exploration of Mars analog extreme environments on Earth, including Icelandic hydrothermally altered basalts and the ice-covered lakes and glaciers of Dronning Maud Land, Antarctica.
NASA Astrophysics Data System (ADS)
González-Jorge, Higinio; Riveiro, Belén; Varela, María; Arias, Pedro
2012-07-01
A low-cost image orthorectification tool based on compact cameras and scale bars is developed to obtain the main geometric parameters of masonry bridges for inventory and routine inspection purposes. The technique is validated on three different bridges by comparison with laser scanning data. The surveying process is delicate and must balance working distance and angle. Three different cameras are used in the study to establish the relationship between the error and the camera model. Results show that the error does not depend on the length of the bridge element, the type of bridge, or the type of element. Error values for all the cameras are below 4 percent (for 95 percent of the data). The compact Canon camera, the model with the best technical specifications, shows an error level ranging from 0.5 to 1.5 percent.
NASA Astrophysics Data System (ADS)
Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Yasushi, Kondo
2008-11-01
Visualization of explosion phenomena is essential for evaluating the performance of explosives. These phenomena, however, generate blast waves and fragments from their cases, so the visualization equipment must be protected from any form of impact. In the tests described here, the front lens was separated from the camera head by a fiber-optic cable so that the camera, a Shimadzu Hypervision HPV-1, could be used in severe blast environments, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to images taken with the lens directly coupled to the camera head. This confirms that the system is very useful for visualizing dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.
Stereo Cameras for Clouds (STEREOCAM) Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romps, David; Oktem, Rusen
2017-10-31
The three pairs of stereo camera setups aim to provide synchronized, stereo-calibrated time series of images that can be used for 3D cloud mask reconstruction. Each camera pair is positioned at approximately 120 degrees from the other pairs, with a 17-19 degree pitch angle from the ground, and at 5-6 km distance from the U.S. Department of Energy (DOE) Central Facility at the Atmospheric Radiation Measurement (ARM) Climate Research Facility Southern Great Plains (SGP) observatory to cover the region from northeast, northwest, and southern views. Images from both cameras of the same stereo setup can be paired together to obtain 3D reconstruction by triangulation. 3D reconstructions from the ring of three stereo pairs can be combined to generate a 3D mask from surrounding views. This handbook delivers all stereo reconstruction parameters of the cameras necessary to make 3D reconstructions from the stereo camera images.
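The triangulation step can be illustrated with a toy example: each camera of a calibrated pair contributes a position and a viewing ray toward the same cloud feature, and the 3D point is taken as the midpoint of closest approach between the two rays. The geometry below is synthetic and not taken from the handbook:

```python
# Illustrative two-ray triangulation of the kind used to reconstruct a 3D
# cloud point from a calibrated stereo camera pair: return the midpoint of
# the closest approach between rays p1 + t*d1 and p2 + s*d2. All
# coordinates here are made up for demonstration.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(p1, d1, p2, d2):
    """Midpoint of closest approach between two camera viewing rays."""
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # near zero for (nearly) parallel rays
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p + t * u for p, u in zip(p1, d1)]
    q2 = [p + s * u for p, u in zip(p2, d2)]
    return [(x + y) / 2 for x, y in zip(q1, q2)]

# Two ground cameras 6 km apart, both sighting the same cloud feature.
cloud = triangulate((0, 0, 0), (3000, 4000, 2000),
                    (6000, 0, 0), (-3000, 4000, 2000))
print([round(v) for v in cloud])  # [3000, 4000, 2000]
```

With real imagery the viewing rays come from the calibrated camera model (intrinsics plus mount orientation), and the rays rarely intersect exactly, which is why the midpoint of closest approach is used.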
2017-05-10
This view from NASA's Cassini spacecraft is the sharpest ever taken of belts of the features called propellers in the middle part of Saturn's A ring. The propellers are the small, bright features that look like double dashes, visible on both sides of the wave pattern that crosses the image diagonally from top to bottom. The original discovery of propellers in this region in Saturn's rings was made using several images taken from very close to the rings during Cassini's 2004 arrival at Saturn. Those discovery images were of low resolution and were difficult to interpret, and there were few clues as to how the small propellers seen in those images were related to the larger propellers Cassini observed later in the mission. This image, for the first time, shows swarms of propellers of a wide range of sizes, putting the ones Cassini observed in its Saturn arrival images in context. Scientists will use this information to derive a "particle size distribution" for propeller moons, which is an important clue to their origins. The image was taken using the Cassini spacecraft's narrow-angle camera on April 19. The view has an image scale of 0.24 mile (385 meters) per pixel, and was taken at a sun-ring-spacecraft angle, or phase angle, of 108 degrees. The view looks toward a point approximately 80,000 miles (129,000 kilometers) from Saturn's center. https://photojournal.jpl.nasa.gov/catalog/PIA21448
A Non-Contact Measurement System for the Range of Motion of the Hand
Pham, Trieu; Pathirana, Pubudu N.; Trinh, Hieu; Fay, Pearse
2015-01-01
An accurate and standardised tool to measure the active range of motion (ROM) of the hand is essential to any progressive assessment scenario in hand therapy practice. Goniometers are widely used in clinical settings for measuring the ROM of the hand. However, such measurements have limitations with regard to inter-rater and intra-rater reliability and involve direct physical contact with the hand, possibly increasing the risk of transmitting infections. The system proposed in this paper is the first non-contact measurement system utilising Intel Perceptual Technology and a Senz3D Camera for measuring phalangeal joint angles. To enhance the accuracy of the system, we developed a new approach to obtain the total active movement without measuring the three joint angles individually. An equation relating the actual spatial position of the proximal inter-phalangeal joint to its measured value was established through the measurement values of the total active movement, so that its actual position can be inferred. Verified by computer simulations, experimental results demonstrated a significant improvement in the calculation of the total active movement and successfully recovered the actual position of the proximal inter-phalangeal joint angles. A trial conducted to examine the clinical applicability of the system, involving 40 healthy subjects, confirmed the practicability and consistency of the proposed system. The system's time efficiency makes a still stronger case for replacing the current practice of using goniometers. PMID:26225976
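The abstract does not give the paper's actual calibration equation, so the following is only a generic illustration of the underlying idea: since a finger's total active movement (TAM) is conventionally the sum of the MCP, PIP, and DIP joint angles, one joint angle can be inferred from a TAM measurement and the other two joints rather than measured directly.

```python
# Generic illustration only (not the paper's equation): total active
# movement (TAM) of a finger is the sum of the metacarpophalangeal (MCP),
# proximal interphalangeal (PIP), and distal interphalangeal (DIP) joint
# angles, so the PIP angle can be inferred from TAM and the other two.

def pip_from_tam(tam_deg, mcp_deg, dip_deg):
    """Infer the PIP joint angle (degrees) from TAM and the other joints."""
    return tam_deg - mcp_deg - dip_deg

print(pip_from_tam(260.0, 90.0, 70.0))  # 100.0
```

The paper's contribution is a measured relation between the camera's PIP reading and the joint's true position; the identity above only shows why a reliable TAM value lets the PIP angle be recovered indirectly.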
Nuclear medicine imaging system
Bennett, Gerald W.; Brill, A. Bertrand; Bizais, Yves J.; Rowe, R. Wanda; Zubal, I. George
1986-01-07
A nuclear medicine imaging system having two large field of view scintillation cameras mounted on a rotatable gantry and being movable diametrically toward or away from each other is disclosed. In addition, each camera may be rotated about an axis perpendicular to the diameter of the gantry. The movement of the cameras allows the system to be used for a variety of studies, including positron annihilation, and conventional single photon emission, as well as static orthogonal dual multi-pinhole tomography. In orthogonal dual multi-pinhole tomography, each camera is fitted with a seven pinhole collimator to provide seven views from slightly different perspectives. By using two cameras at an angle to each other, improved sensitivity and depth resolution is achieved. The computer system and interface acquires and stores a broad range of information in list mode, including patient physiological data, energy data over the full range detected by the cameras, and the camera position. The list mode acquisition permits the study of attenuation as a result of Compton scatter, as well as studies involving the isolation and correlation of energy with a range of physiological conditions.
SU-F-J-206: Systematic Evaluation of the Minimum Detectable Shift Using a Range- Finding Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Platt, M; Platt, M; Lamba, M
2016-06-15
Purpose: The robotic table used for patient alignment in proton therapy is calibrated only at commissioning under well-defined conditions and table shifts may vary over time and with differing conditions. The purpose of this study is to systematically investigate minimum detectable shifts using a time-of-flight (TOF) range-finding camera for table position feedback. Methods: A TOF camera was used to acquire one hundred 424 × 512 range images from a flat surface before and after known shifts. Range was assigned by averaging central regions of the image across multiple images. Depth resolution was determined by evaluating the difference between the actual shift of the surface and the measured shift. Depth resolution was evaluated for number of images averaged, area of sensor over which depth was averaged, distance from camera to surface, central versus peripheral image regions, and angle of surface relative to camera. Results: For one to one thousand images with a shift of one millimeter the range in error was 0.852 ± 0.27 mm to 0.004 ± 0.01 mm (95% C.I.). For varying regions of the camera sensor the range in error was 0.02 ± 0.05 mm to 0.47 ± 0.04 mm. The following results are for 10 image averages. For areas ranging from one pixel to 9 × 9 pixels the range in error was 0.15 ± 0.09 to 0.29 ± 0.15 mm (1σ). For distances ranging from two to four meters the range in error was 0.15 ± 0.09 to 0.28 ± 0.15 mm. For an angle of incidence between thirty degrees and ninety degrees the average range in error was 0.11 ± 0.08 to 0.17 ± 0.09 mm. Conclusion: It is feasible to use a TOF camera for measuring shifts in flat surfaces under clinically relevant conditions with submillimeter precision.
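The strong improvement with image averaging is what simple statistics predicts: if a single range image has depth noise sigma, the standard error of the mean of N images falls as sigma/sqrt(N). A quick check seeded with the reported ~0.85 mm single-image error (the observed 1000-image error of 0.004 mm is smaller still, so effects beyond pure shot noise evidently help):

```python
import math

# Back-of-envelope for the averaging result above: the standard error of
# the mean of N independent range images with per-image noise sigma is
# sigma / sqrt(N). The 0.85 mm figure is the reported single-image error;
# the 1/sqrt(N) scaling is the assumption being illustrated.

def standard_error_mm(sigma_mm, n_images):
    return sigma_mm / math.sqrt(n_images)

for n in (1, 10, 100, 1000):
    print(n, round(standard_error_mm(0.85, n), 3))
```

This predicts ~0.085 mm at 100 images and ~0.027 mm at 1000, already in the submillimeter regime the study reports.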
Reflectance characteristics of the Viking lander camera reference test charts
NASA Technical Reports Server (NTRS)
Wall, S. D.; Burcher, E. E.; Jabson, D. J.
1975-01-01
Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within ±7% for incidence angles between 20° and 60°, and that their spectral reflectance is constant with wavelength to within ±5% over the spectral range of the cameras. Estimated accuracy of the measurements is ±0.05 in relative spectral reflectance.
NASA Astrophysics Data System (ADS)
Li, Zhengyan; Zgadzaj, Rafal; Wang, Xiaoming; Reed, Stephen; Dong, Peng; Downer, Michael C.
2010-11-01
We demonstrate a prototype Frequency Domain Streak Camera (FDSC) that can capture the picosecond time evolution of a plasma accelerator structure in a single shot. In our prototype FDSC, a probe pulse propagates obliquely to a sub-picosecond pump pulse that creates an evolving nonlinear index "bubble" in fused silica glass, supplementing a conventional Frequency Domain Holographic (FDH) probe-reference pair that co-propagates with the "bubble". Frequency Domain Tomography (FDT) generalizes the FDSC by probing the "bubble" from multiple angles and reconstructing its morphology and evolution using algorithms similar to those used in medical CAT scans. Multiplexing methods (temporal and angular multiplexing) improve data storage and processing capability, enabling a compact FDT system with a single spectrometer.
Testing of the Apollo 15 Metric Camera System.
NASA Technical Reports Server (NTRS)
Helmering, R. J.; Alspaugh, D. H.
1972-01-01
Description of tests conducted (1) to assess the quality of Apollo 15 Metric Camera System data and (2) to develop production procedures for total block reduction. Three strips of metric photography over the Hadley Rille area were selected for the tests. These photographs were utilized in a series of evaluation tests culminating in an orbitally constrained block triangulation solution. Results show that film deformations up to 25 and 5 microns are present in the mapping and stellar materials, respectively. Stellar reductions can provide mapping camera orientations with an accuracy that is consistent with the accuracies of other parameters in the triangulation solutions. Pointing accuracies of 4 to 10 microns can be expected for the mapping camera materials, depending on variations in resolution caused by changing sun angle conditions.
NASA Astrophysics Data System (ADS)
Gonzaga, S.; et al.
2011-03-01
ACS was designed to provide a deep, wide-field survey capability from the visible to near-IR using the Wide Field Camera (WFC), high resolution imaging from the near-UV to near-IR with the now-defunct High Resolution Camera (HRC), and solar-blind far-UV imaging using the Solar Blind Camera (SBC). The discovery efficiency of ACS's Wide Field Channel (i.e., the product of WFC's field of view and throughput) is 10 times greater than that of WFPC2. The failure of ACS's CCD electronics in January 2007 brought a temporary halt to CCD imaging until Servicing Mission 4 in May 2009, when WFC functionality was restored. Unfortunately, the high-resolution optical imaging capability of HRC was not recovered.
Determination of the coma dust back-scattering of 67P for phase angles from 1.2° to 75°
NASA Astrophysics Data System (ADS)
Fink, Uwe; Doose, Lyn
2018-07-01
A phase curve is derived for the dust coma of comet 67P/Churyumov-Gerasimenko (67P) from 1.2° to 74° using images from the OSIRIS camera system on board the Rosetta mission during the period 2014 July 25 to 2015 February 23 as the spacecraft approached the comet. We analyzed 123 images of the continuum filter at 612.6 nm and 60 images of the 375 nm UV continuum filter of the Wide Angle Camera. Our method of extracting a phase curve, close to the nucleus, taking into account illumination conditions, activity of the comet, strong radial radiance intensity decrease and varying phase angles across the image, is described in detail. Our derived backscattering phase curve is considerably steeper than earlier published data. The radiance of the scattering dust in the 612.6 nm filter increases by about a factor of 12 going from a phase angle of 75° to a phase angle of 2.0°. The phase curve for the 375 nm filter is similar but there is reasonable evidence that the I/F color ratio between the two filters changes from a roughly neutral color ratio of 1.2 to a more typical red color of ∼ 2.0 as the activity of the comet increases. No substantial change in the shape of the phase curve could be discerned between 2014 August and 2015 February 19-23 when the comet increased considerably in activity. The phase curve behavior on the illuminated side of the comet and the dark side is in general similar. A comparison of our phase curve with a recent phase curve for 67P by Bertini et al. for the phase angle range ∼15°-80°, where our two reductions overlap, shows good agreement (as does our color ratio between the 612.6 nm and the 375 nm filters) despite the fact that the two phase curve determinations observed the comet at different dust activity levels, at different distances from the nucleus and used completely different observing and data reduction methodologies. 
Trial scattering calculations demonstrate that the observed strong backscattering most likely arises from particles in the size range 1-20 μm. Our observed backscattering phase curve gives no constraints on the real index of refraction, the particle size distribution or the minimum and maximum particle size cut-offs. However, an upper limit to the imaginary index of refraction of ∼0.01 was required, making these particles quite transparent. Simple spherical scattering calculations including particle size distributions can fit the general characteristics of the phase curve but cannot produce a satisfactory detailed fit.
Electro-optical detector for use in a wide mass range mass spectrometer
NASA Technical Reports Server (NTRS)
Giffin, Charles E. (Inventor)
1976-01-01
An electro-optical detector is disclosed for use in a wide mass range mass spectrometer (MS). In the latter, the focal plane is at or very near the exit end of the magnetic analyzer, so that a strong magnetic field on the order of 1000 G or more is present at the focal plane location. The novel detector includes a microchannel electron multiplier array (MCA) which is positioned at the focal plane to convert ion beams which are focused by the MS at the focal plane into corresponding electron beams which are then accelerated to form visual images on a conductive phosphored surface. These visual images are then converted into images on the target of a vidicon camera or the like for electronic processing. Due to the strong magnetic field at the focal plane, in one embodiment of the invention, the MCA with front and back parallel ends is placed so that its front end forms an angle of not less than several degrees, preferably on the order of 10°-20°, with respect to the focal plane, with the center line of the front end preferably located in the focal plane. In another embodiment the MCA is wedge-shaped, with its back end at an angle of about 10°-20° with respect to the front end. In this embodiment the MCA is placed so that its front end is located at the focal plane.
Sky camera geometric calibration using solar observations
Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan
2016-09-05
A camera model and associated automated calibration procedure for stationary daytime sky imaging cameras is presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production, where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. In conclusion, calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
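As a rough illustration of the equisolid-angle projection this abstract relies on, the sketch below maps a given sun direction onto the image plane of an ideal, upward-pointing fisheye camera. The function names and parameter values are hypothetical; a real calibration of this kind also involves a full solar position algorithm, lens distortion terms and an extrinsic camera rotation.

```python
import math

def equisolid_radius(zenith_angle_rad, focal_px):
    """Equisolid-angle fisheye projection: r = 2 * f * sin(theta / 2)."""
    return 2.0 * focal_px * math.sin(zenith_angle_rad / 2.0)

def sun_pixel(zenith_deg, azimuth_deg, focal_px, cx, cy):
    """Map a sun position (zenith, azimuth) to image-plane coordinates
    for an ideal fisheye camera with principal point (cx, cy)."""
    theta = math.radians(zenith_deg)
    r = equisolid_radius(theta, focal_px)
    phi = math.radians(azimuth_deg)
    return (cx + r * math.cos(phi), cy + r * math.sin(phi))

# A sun at the zenith lands exactly on the principal point.
print(sun_pixel(0.0, 0.0, focal_px=600.0, cx=960.0, cy=960.0))  # (960.0, 960.0)
```

Calibration then amounts to choosing the model parameters (focal length, principal point, orientation) that minimize the residual between detected and predicted sun pixels over many observations.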
NASA Technical Reports Server (NTRS)
Nelson, David L.; Diner, David J.; Thompson, Charles K.; Hall, Jeffrey R.; Rheingans, Brian E.; Garay, Michael J.; Mazzoni, Dominic
2010-01-01
MISR (Multi-angle Imaging SpectroRadiometer) INteractive eXplorer (MINX) is an interactive visualization program that allows a user to digitize smoke, dust, or volcanic plumes in MISR multiangle images, and automatically retrieve height and wind profiles associated with those plumes. This innovation can perform 9-camera animations of MISR level-1 radiance images to study the 3D relationships of clouds and plumes. MINX also enables archiving MISR aerosol properties and Moderate Resolution Imaging Spectroradiometer (MODIS) fire radiative power along with the heights and winds. It can correct geometric misregistration between cameras by correlating off-nadir camera scenes with corresponding nadir scenes and then warping the images to minimize the misregistration offsets. Plots of BRF (bidirectional reflectance factor) vs. camera angle for points clicked in an image can be displayed. Users get rapid access to map views of MISR path and orbit locations and overflight dates, and past or future orbits can be identified that pass over a specified location at a specified time. Single-camera, level-1 radiance data at 1,100- or 275- meter resolution can be quickly displayed in color using a browse option. This software determines the heights and motion vectors of features above the terrain with greater precision and coverage than previous methods, based on an algorithm that takes wind direction into consideration. Human interpreters can precisely identify plumes and their extent, and wind direction. Overposting of MODIS thermal anomaly data aids in the identification of smoke plumes. The software has been used to preserve graphical and textural versions of the digitized data in a Web-based database.
Wide Field Camera 3 Accommodations for HST Robotics Servicing Mission
NASA Technical Reports Server (NTRS)
Ginyard, Amani
2005-01-01
This slide presentation discusses the objectives of the Hubble Space Telescope (HST) Robotics Servicing and Deorbit Mission (HRSDM), reviews the Wide Field Camera 3 (WFC3), and reviews the contamination accommodations for the WFC3. The objectives of the HRSDM are (1) to provide a disposal capability at the end of HST's useful life; (2) to upgrade the hardware by installing two new scientific instruments, replacing the Corrective Optics Space Telescope Axial Replacement (COSTAR) with the Cosmic Origins Spectrograph (COS) and the Wide Field/Planetary Camera-2 (WFPC2) with Wide Field Camera-3; and (3) to extend the scientific life of HST for a minimum of 5 years after servicing. Included are slides showing the Hubble Robotic Vehicle (HRV) and slides describing what the HRV contains. There are also slides describing the WFC3; one function of the WFC3 installation is to serve partially as a carrier for replacement gyroscopes for HST. There are also slides that discuss the contamination requirements for the Rate Sensor Units (RSUs), which are part of the Rate Gyroscope Assembly carried with the WFC3.
Mitigation of Angle Tracking Errors Due to Color Dependent Centroid Shifts in SIM-Lite
NASA Technical Reports Server (NTRS)
Nemati, Bijan; An, Xin; Goullioud, Renaud; Shao, Michael; Shen, Tsae-Pyng; Wehmeier, Udo J.; Weilert, Mark A.; Wang, Xu; Werne, Thomas A.; Wu, Janet P.;
2010-01-01
The SIM-Lite astrometric interferometer will search for Earth-size planets in the habitable zones of nearby stars. In this search the interferometer will monitor the astrometric position of candidate stars relative to nearby reference stars over the course of a 5 year mission. The elemental measurement is the angle between a target star and a reference star. This is a two-step process, in which the interferometer must each time use its controllable optics to align the starlight in the two arms with each other and with the metrology beams. The sensor for this alignment is an angle-tracking CCD camera. Various constraints in the design of the camera subject it to systematic alignment errors when observing a star of one spectrum compared with a star of a different spectrum. This effect is called a Color Dependent Centroid Shift (CDCS) and has been studied extensively with SIM-Lite's SCDU testbed. Here we describe results from the simulation and testing of this error in the SCDU testbed, as well as effective ways that it can be reduced to acceptable levels.
Design and control of 2-axis tilting actuator for endoscope using ionic polymer metal composites
NASA Astrophysics Data System (ADS)
Kim, Sung-Joo; Kim, Chul-Jin; Park, No-Cheol; Yang, Hyun-Seok; Park, Young-Pil
2009-03-01
In the field of endoscopy, the capsule endoscope has been developed to overcome the limitations of conventional endoscopy and has recently entered clinical use. However, since a capsule endoscope moves passively through the GI tract by peristalsis, the direction of the head containing the camera cannot be controlled, and symptoms of disease may be missed. Therefore, in this thesis, a 2-axis tilting actuator for an endoscope, based on Ionic Polymer Metal Composites (IPMC), is presented. To be applicable to a capsule endoscope, the actuator material must satisfy constraints on size, energy consumption and working voltage. IPMC was selected as the actuator because it is an emerging material that exhibits large bending deflection at low voltage, consumes little energy, and can be fabricated in almost any size or shape. The system tilts the camera module of the endoscope to reduce the invisible area of the intestines, with a target tilting angle of 5 degrees for each axis. To control the tilting angle, an LQR controller and a full-order observer are designed.
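The LQR design step mentioned above can be sketched numerically. The snippet below is a minimal illustration, assuming a simplified double-integrator model for one tilt axis (the paper's actual IPMC plant model and weights are not reproduced here); it solves the continuous-time Riccati equation and checks closed-loop stability.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Simplified double-integrator model of one tilt axis: x = [angle, angular rate].
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # penalize angle error more heavily than rate
R = np.array([[1.0]])

# Solve the continuous-time algebraic Riccati equation, then K = R^-1 B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# The closed-loop poles of (A - B K) must all have negative real parts.
poles = np.linalg.eigvals(A - B @ K)
print(poles.real)
```

A full-order observer would be designed analogously by placing the poles of (A - L C) for an output matrix C, then feeding the estimated state into u = -K x̂.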
NASA Astrophysics Data System (ADS)
Wojciechowski, Adam M.; Karadas, Mürsel; Huck, Alexander; Osterkamp, Christian; Jankuhn, Steffen; Meijer, Jan; Jelezko, Fedor; Andersen, Ulrik L.
2018-03-01
Sensitive, real-time optical magnetometry with nitrogen-vacancy centers in diamond relies on accurate imaging of small (≪10⁻²), fractional fluorescence changes across the diamond sample. We discuss the limitations on magnetic field sensitivity resulting from the limited number of photoelectrons that a camera can record in a given time. Several types of camera sensors are analyzed, and the smallest measurable magnetic field change is estimated for each type. We show that most common sensors are of limited use in such applications, while certain highly specific cameras allow achieving nanotesla-level sensitivity in 1 s of a combined exposure. Finally, we demonstrate the results obtained with a lock-in camera that paves the way for real-time, wide-field magnetometry at the nanotesla level and with a micrometer resolution.
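The photoelectron-count limit discussed here can be made concrete with a shot-noise estimate. The sketch below uses a textbook ODMR sensitivity scaling, not the paper's own analysis, and all numerical values are illustrative assumptions.

```python
import math

def shot_noise_fractional_change(photoelectrons):
    """Smallest fractional fluorescence change resolvable at the shot-noise
    limit with N collected photoelectrons: delta(F)/F ~ 1/sqrt(N)."""
    return 1.0 / math.sqrt(photoelectrons)

def min_detectable_field(photoelectrons, contrast, linewidth_hz,
                         gyromag_hz_per_t=28e9):
    """Crude ODMR slope estimate: dB_min ~ linewidth / (gamma * C * sqrt(N)),
    with gamma = 28 GHz/T for the NV electron spin."""
    return linewidth_hz / (gyromag_hz_per_t * contrast
                           * math.sqrt(photoelectrons))

# Illustrative: 1e12 photoelectrons of combined exposure, 2% contrast,
# 1 MHz resonance linewidth -> a field change of order nanotesla.
print(min_detectable_field(1e12, contrast=0.02, linewidth_hz=1e6))
```

This shows why full-well capacity matters: collecting 10⁴ times fewer photoelectrons degrades the minimum detectable field by a factor of 100.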
Reconditioning of Cassini Narrow-Angle Camera
NASA Technical Reports Server (NTRS)
2002-01-01
These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera's optics, then successful removal of the haze by warming treatments.
The image on the left was taken on May 25, 2001, before the haze problem occurred. It shows a star named HD339457. The second image from left, taken May 30, 2001, shows the effect of haze that collected on the optics when the camera cooled back down after a routine-maintenance heating to 30 degrees Celsius (86 degrees Fahrenheit). The star is Maia, one of the Pleiades. The third image was taken on October 26, 2001, after a weeklong decontamination treatment at minus 7 C (19 F). The star is Spica. The fourth image was taken of Spica January 30, 2002, after a weeklong decontamination treatment at 4 C (39 F). The final image, also of Spica, was taken July 9, 2002, following three additional decontamination treatments at 4 C (39 F) for two months, one month, then another month. Cassini, on its way toward arrival at Saturn in 2004, is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini mission for NASA's Office of Space Science, Washington, D.C.
Satellite markers: a simple method for ground truth car pose on stereo video
NASA Astrophysics Data System (ADS)
Gil, Gustavo; Savino, Giovanni; Piantini, Simone; Pierini, Marco
2018-04-01
Predicting the future location of other cars is essential in the context of advanced safety systems, and the remote estimation of car pose, particularly the heading angle, is key to predicting that future location. Stereo vision systems provide the 3D information of a scene. Ground truth in this specific context is referential information about the depth, shape and orientation of the objects present in the traffic scene. Creating 3D ground truth is a measurement and data fusion task usually associated with the combination of different kinds of sensors. The novelty of this paper is a method to generate ground truth car pose from video data alone. When the method is applied to stereo video, it also provides the extrinsic camera parameters for each camera at frame level, which are key to quantifying the performance of a stereo vision system in motion, because the system is subjected to undesired vibrations and/or leaning. We developed a video post-processing technique which employs a common camera calibration tool for 3D ground truth generation. In our case study, we focus on accurate heading angle estimation of a moving car under realistic imagery. As outcomes, our satellite marker method provides accurate car pose at frame level, and the instantaneous spatial orientation of each camera at frame level.
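Once marker positions are reconstructed in 3D, a heading angle follows from simple plane geometry. The sketch below is a hypothetical reduction of that idea (the paper's actual pipeline uses a camera calibration tool and full pose estimation): it recovers the heading of a car from two markers mounted along its longitudinal axis.

```python
import math

def heading_angle_deg(marker_front, marker_rear):
    """Heading of a car on a flat ground plane from the 3D positions
    (x, y, z) of two markers mounted along its longitudinal axis."""
    dx = marker_front[0] - marker_rear[0]
    dy = marker_front[1] - marker_rear[1]
    return math.degrees(math.atan2(dy, dx))

# Markers 2 m apart on a car rotated 30 degrees in the ground plane.
front = (2.0 * math.cos(math.radians(30)),
         2.0 * math.sin(math.radians(30)), 1.2)
rear = (0.0, 0.0, 1.2)
print(round(heading_angle_deg(front, rear), 1))  # 30.0
```

The same two-point construction, applied per frame, yields the instantaneous heading track against which a stereo system's estimates can be scored.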
2008-01-30
After NASA MESSENGER spacecraft completed its successful flyby of Mercury, the Narrow Angle Camera NAC, part of the Mercury Dual Imaging System MDIS, took these images of the receding planet. This is a frame from an animation.
NASA Astrophysics Data System (ADS)
Calvel, Bertrand; Castel, Didier; Standarovski, Eric; Rousset, Gérard; Bougoin, Michel
2017-11-01
The international Rosetta mission, now planned by ESA to be launched in January 2003, will provide a unique opportunity to directly study the nucleus of comet 46P/Wirtanen and its activity in 2013. We describe here the design, development and performance of the Silicon Carbide telescope of the Narrow Angle Camera of the OSIRIS experiment, which will give high-resolution images of the cometary nucleus in the visible spectrum. The development of the mirrors is detailed in particular. The SiC parts were manufactured by BOOSTEC, polished by STIGMA OPTIQUE and ion-figured by IOM under the prime contractorship of ASTRIUM. ASTRIUM was also in charge of the alignment. The final optical quality of the aligned telescope is 30 nm rms wavefront error.
Structured light system calibration method with optimal fringe angle.
Li, Beiwen; Zhang, Song
2014-11-20
For structured light system calibration, one popular approach is to treat the projector as an inverse camera. This is usually performed by projecting horizontal and vertical sequences of patterns to establish one-to-one mapping between camera points and projector points. However, for a well-designed system, either horizontal or vertical fringe images are not sensitive to depth variation and thus yield inaccurate mapping. As a result, the calibration accuracy is jeopardized if a conventional calibration method is used. To address this limitation, this paper proposes a novel calibration method based on optimal fringe angle determination. Experiments demonstrate that our calibration approach can increase the measurement accuracy by up to 38% compared to the conventional calibration method with a calibration volume of 300(H) mm × 250(W) mm × 500(D) mm.
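One illustrative reading of the optimal-fringe-angle idea: if a depth change shifts a projector point by some vector on the projector plane, fringes are most depth-sensitive when their phase gradient is aligned with that shift. The sketch below is my own simplification under that assumption, not the paper's exact determination procedure.

```python
import math

def optimal_fringe_angle_deg(du, dv):
    """If a depth change moves a projector point by (du, dv) on the
    projector plane, the most depth-sensitive fringe direction has its
    phase gradient aligned with (du, dv), i.e. stripes perpendicular
    to the shift."""
    return math.degrees(math.atan2(dv, du))

# Depth shifts that move points mostly horizontally favor near-vertical
# fringes (phase gradient close to the horizontal axis).
print(round(optimal_fringe_angle_deg(1.0, 0.15), 1))  # 8.5
```

Purely horizontal or vertical fringes are then just the degenerate cases where the shift happens to be orthogonal to the phase gradient, which is exactly where the conventional mapping loses depth sensitivity.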
Neptune Great Dark Spot in High Resolution
1999-08-30
This photograph shows the last face-on view of the Great Dark Spot that Voyager will make with the narrow angle camera. The image was shuttered 45 hours before closest approach, at a distance of 2.8 million kilometers (1.7 million miles). The smallest structures that can be seen are on the order of 50 kilometers (31 miles). The image shows feathery white clouds that overlie the boundary of the dark and light blue regions. The pinwheel (spiral) structure of both the dark boundary and the white cirrus suggests a storm system rotating counterclockwise. Periodic small scale patterns in the white cloud, possibly waves, are short lived and do not persist from one Neptunian rotation to the next. This color composite was made from the clear and green filters of the narrow-angle camera. http://photojournal.jpl.nasa.gov/catalog/PIA00052
An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring
Görne, Lorenz; Yuen, Iek-Man; Cao, Dongpu; Sullman, Mark; Auger, Daniel; Lv, Chen; Wang, Huaji; Matthias, Rebecca; Skrypchuk, Lee; Mouzakitis, Alexandros
2017-01-01
Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may become a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial to decide how well the driver will be able to take over control of the vehicle. One limitation of the commonly used face-based head tracking system, using cameras, is that sufficient features of the face must be visible, which limits the detectable angle of head movement and thereby measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation sensor based head tracking system that includes twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement error in the shaking and nodding axes were less than 0.4°, while error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house tests and on-road tests, showed that the main advantage of the proposed system is the ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the measurement of the shaking and nodding angles, produced from the proposed system, can effectively characterise the drivers’ behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone. PMID:29165331
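The twin-sensor arrangement described above amounts to differencing two absolute orientation readings. As a minimal sketch (function names are my own; the real system works on full 3-axis orientations, not a single yaw value), the head-relative-to-vehicle shaking angle can be computed and wrapped into a signed range:

```python
def relative_yaw_deg(head_yaw_deg, vehicle_yaw_deg):
    """Head-relative-to-vehicle shaking angle from two absolute
    orientation sensors: subtract and wrap into (-180, 180]."""
    d = (head_yaw_deg - vehicle_yaw_deg) % 360.0
    return d - 360.0 if d > 180.0 else d

# Wrapping matters near the 0/360 boundary:
print(relative_yaw_deg(350.0, 10.0))   # -20.0
print(relative_yaw_deg(25.0, 355.0))   # 30.0
```

Subtracting the vehicle's own motion is what lets the system report head angles well beyond the roughly 20° limit of face-feature tracking, since no facial landmarks need to remain visible.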
2017-04-17
When imaged by NASA's Cassini spacecraft at infrared wavelengths that pierce the planet's upper haze layer, the high-speed winds of Saturn's atmosphere produce watercolor-like patterns. With no solid surface creating atmospheric drag, winds on Saturn can reach speeds of more than 1,100 miles per hour (1,800 kilometers per hour) -- some of the fastest in the solar system. This view was taken from a vantage point about 28 degrees above Saturn's equator. The image was taken with the Cassini spacecraft wide-angle camera on Dec. 2, 2016, with a combination of spectral filters which preferentially admits wavelengths of near-infrared light centered at 728 nanometers. The view was acquired at a distance of approximately 592,000 miles (953,000 kilometers) from Saturn. Image scale is 35 miles (57 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA20528
2014-09-29
Saturn's many cloud patterns, swept along by high-speed winds, look as if they were painted on by some eager alien artist in this image from NASA's Cassini spacecraft. With no real surface features to slow them down, wind speeds on Saturn can top 1,100 mph (1,800 kph), more than four times the top speeds on Earth. This view looks toward the sunlit side of the rings from about 29 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on April 4, 2014 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 1.1 million miles (1.8 million kilometers) from Saturn. Image scale is 68 miles (109 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18280
Research on surface free energy of electrowetting liquid zoom lens
NASA Astrophysics Data System (ADS)
Zhao, Cunhua; Lu, Gaoqi; Wei, Daling; Hong, Xinhua; Cui, Dongqing; Gao, Changliu
2011-08-01
Zoom imaging systems tend toward miniaturization or increased complexity, so traditional glass/plastic lenses cannot meet these needs. A new approach, the liquid lens, achieves zoom by changing the shape of a liquid surface. Liquid zoom lenses have many merits, such as smaller volume, lighter weight, controllable zoom, faster response, higher transmission and lower energy consumption, and have wide applications in mobile phones, digital cameras and other small imaging systems. The electrowetting phenomenon is reviewed first, and then the influence of the applied voltage on the contact angle in the electrowetting effect is analysed. Finally, the surface free energy of a cone-type double-liquid zoom lens is investigated via the energy minimization principle. The study of surface free energy offers an important theoretical basis for designing liquid zoom lenses.
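The voltage-to-contact-angle relationship analysed here is conventionally captured by the Young-Lippmann equation. The sketch below evaluates it with purely illustrative material parameters (dielectric thickness, permittivity and interfacial tension are assumptions, not values from this paper).

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def contact_angle_deg(theta0_deg, voltage, eps_r, d, gamma):
    """Young-Lippmann equation for electrowetting:
    cos(theta_V) = cos(theta_0) + eps_r * eps_0 * V^2 / (2 * d * gamma),
    with dielectric thickness d (m) and interfacial tension gamma (N/m)."""
    c = (math.cos(math.radians(theta0_deg))
         + eps_r * EPS0 * voltage ** 2 / (2.0 * d * gamma))
    c = min(c, 1.0)  # crude guard against contact-angle saturation
    return math.degrees(math.acos(c))

# Illustrative: 2 um dielectric, eps_r = 3, 40 mN/m interfacial tension.
print(contact_angle_deg(120.0, 0.0, 3.0, 2e-6, 0.040))   # ~120.0 at zero volts
print(contact_angle_deg(120.0, 60.0, 3.0, 2e-6, 0.040))  # smaller angle under voltage
```

Sweeping the voltage sweeps the contact angle and hence the curvature of the liquid-liquid interface, which is exactly the focal-length tuning mechanism of these lenses.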
Measurements of UGR of LED light by a DSLR colorimeter
NASA Astrophysics Data System (ADS)
Hsu, Shau-Wei; Chen, Cheng-Hsien; Jiaan, Yuh-Der
2012-10-01
We have developed an image-based method for measuring the UGR (unified glare rating) of an interior lighting environment. A calibrated DSLR (digital single-lens reflex camera) with an ultra-wide-angle lens was used to measure the luminance distribution, from which the corresponding parameters can be automatically calculated. An LED luminaire was placed in a room and measured at various positions and directions to study the properties of UGR. The test results are consistent with visual experience and UGR principles. To further examine the results, a spectroradiometer and an illuminance meter were respectively used to measure the luminance and illuminance at the same position and orientation as the DSLR. The UGR calculation by this image-based method may solve the problem of the non-uniform luminance distribution of LED lighting; segmentation of the luminance graph for the calculation was also studied.
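For reference, the underlying CIE UGR formula can be evaluated directly once the per-source quantities are extracted from the luminance image. The luminaire numbers below are hypothetical, chosen only to exercise the formula.

```python
import math

def ugr(background_luminance, sources):
    """CIE Unified Glare Rating:
    UGR = 8 * log10( (0.25 / Lb) * sum(L^2 * omega / p^2) )
    where, per glare source, L is luminance (cd/m^2), omega the solid
    angle subtended at the eye (sr), p the Guth position index, and
    Lb the background luminance (cd/m^2)."""
    s = sum(L ** 2 * omega / p ** 2 for (L, omega, p) in sources)
    return 8.0 * math.log10(0.25 / background_luminance * s)

# One luminaire: L = 20000 cd/m^2, omega = 0.001 sr, position index 1.5,
# against a 50 cd/m^2 background.
print(round(ugr(50.0, [(20000.0, 0.001, 1.5)]), 1))  # 23.6
```

The image-based approach in this abstract effectively computes L, omega and p per segmented luminaire region from the calibrated fisheye luminance map, then applies this same sum.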
The control net of Mars - May 1977. [from Viking lander spacecraft radio tracking data]
NASA Technical Reports Server (NTRS)
Davies, M. E.
1978-01-01
The development of planet-wide control nets of Mars is reviewed, and the May 1977 update is described. This updated control net was computed by means of a large single-block analytical triangulation incorporating the new direction of the spin axis and the new rotation rate of Mars, as determined from radio tracking data provided by the Viking lander spacecraft. The analytical triangulation adjusts for planimetric control only (areocentric latitude and longitude) and for the camera orientation angles. Most of the areocentric radii at the control points were interpolated from radio occultation measurements, but a few were determined photogrammetrically, and a substantial number were derived from elevation contours on the 1976 USGS topographic series of Mars maps. A value of V, measured from Mars' vernal equinox along the equator to the prime meridian (Airy-0), is presented.
Experimental investigation of atomization characteristics of swirling spray by ADN gelled propellant
NASA Astrophysics Data System (ADS)
Guan, Hao-Sen; Li, Guo-Xiu; Zhang, Nai-Yuan
2018-03-01
Due to the current global energy shortage and increasingly serious environmental issues, green propellants are attracting more attention. In particular, the ammonium dinitramide (ADN)-based monopropellant thruster is gaining world-wide attention as a green, non-polluting and high specific impulse propellant. Gel propellants combine the advantages of liquid and solid propellants, and are becoming popular in the field of spaceflight. In this paper, a swirling atomization experimental study was carried out using an ADN aqueous gel propellant under different injection pressures. A high-speed camera and a Malvern laser particle size analyzer were used to study the spray process. The flow coefficient, cone angle of swirl atomizing spray, breakup length of spray membrane, and droplet size distribution were analyzed. Furthermore, the effects of different injection pressures on the swirling atomization characteristics were studied.
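Among the quantities analyzed above, the flow (discharge) coefficient has a standard definition that is easy to sketch. The numbers below are illustrative assumptions (orifice size, gel density and pressure drop are not taken from this paper).

```python
import math

def discharge_coefficient(mass_flow, orifice_area, density, delta_p):
    """Injector flow (discharge) coefficient:
    Cd = mdot / (A * sqrt(2 * rho * delta_p)),
    i.e. actual mass flow over the ideal (lossless) mass flow."""
    return mass_flow / (orifice_area * math.sqrt(2.0 * density * delta_p))

# Illustrative: 20 g/s through a 0.5 mm diameter orifice at a
# 50 bar pressure drop, gel density ~1300 kg/m^3.
area = math.pi * (0.25e-3) ** 2
print(round(discharge_coefficient(0.020, area, 1300.0, 5e6), 2))  # 0.89
```

For a swirl injector, Cd measured this way is well below unity because the air core occupies part of the orifice, which is one reason it is tracked against injection pressure.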
2003-12-13
Mie Crater, a large basin formed by asteroid or comet impact in Utopia Planitia, lies at the center of this Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) red wide angle image. The crater is approximately 104 km (65 mi) across. To the east and southeast (toward the lower right) of Mie, in this 5 December 2003 view, are clouds of dust and water ice kicked up by local dust storm activity. It is mid-winter in the northern hemisphere of Mars, a time when passing storms are common on the northern plains of the red planet. Sunlight illuminates this image from the lower left; Mie Crater is located at 48.5°N, 220.3°W. Viking 2 landed west/southwest of Mie Crater, off the left edge of this image, in September 1976. http://photojournal.jpl.nasa.gov/catalog/PIA04930
Evidence for Recent Liquid Water on Mars: Gullies in Sirenum Fossae Trough
NASA Technical Reports Server (NTRS)
2000-01-01
This mosaic of two Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) images shows about 20 different gullies coming down the south-facing wall of a trough in the Sirenum Fossae/Gorgonum Chaos region of the martian southern hemisphere. Each channel and its associated fan--or apron--of debris appears to have started just below the same hard, resistant layer of bedrock located approximately 100 meters (about 325 feet) below the top of the trough wall. The layer beneath this hard, resistant bedrock is interpreted to be permeable, which allows ground water to percolate through it and--at the location of this trough--seep out onto the martian surface. The channels and aprons only occur on the south-facing slope of this valley created by faults on each side of the trough. The depression is approximately 1.4 km (0.9 mi) across. The mosaic was constructed from two pictures taken on September 16, 1999, and May 1, 2000. The black line is a gap between the two images that was not covered by MOC. The scene covers an area approximately 5.5 kilometers (3.4 miles) wide by 4.9 km (3.0 mi) high. Sunlight illuminates the area from the upper left. The image is located near 38.5°S, 171.3°W. MOC high resolution images are taken black-and-white (grayscale); the color seen here has been synthesized from the colors of Mars observed by the MOC wide angle cameras and by the Viking Orbiters in the late 1970s.
2017-10-30
Reflected sunlight is the source of the illumination for visible wavelength images such as the one above. However, at longer infrared wavelengths, direct thermal emission from objects dominates over reflected sunlight. This enabled instruments that can detect infrared radiation to observe the pole even in the dark days of winter when Cassini first arrived at Saturn and Saturn's northern hemisphere was shrouded in shadow. Now, 13 years later, the north pole basks in full sunlight. Close to the northern summer solstice, sunlight illuminates the previously dark region, permitting Cassini scientists to study this area with the spacecraft's full suite of imagers. This view looks toward the northern hemisphere from about 34 degrees above Saturn's ringplane. The image was taken with the Cassini spacecraft wide-angle camera on April 25, 2017 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was acquired at a distance of approximately 274,000 miles (441,000 kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 111 degrees. Image scale is 16 miles (26 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21351
A Forethought and an Afterthought
2014-10-27
Befitting moons named for brothers, the moons Prometheus and Epimetheus share a lot in common. Both are small, icy moons that orbit near the main rings of Saturn. But, like most brothers, they also assert their differences: while Epimetheus is relatively round for a small moon, Prometheus is elongated in shape, similar to a lemon. Prometheus (53 miles, or 86 kilometers across) orbits just outside the A ring - seen here upper-middle of the image - while Epimetheus (70 miles, 113 kilometers across) orbits farther out - seen in the upper-left, doing an orbital two-step with its partner, Janus. This view looks toward the sunlit side of the rings from about 28 degrees above the ringplane. The image was taken in visible light with the Cassini spacecraft wide-angle camera on July 9, 2013. The view was obtained at a distance of approximately 557,000 miles (897,000 kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 11 degrees. Image scale is 33 miles (54 kilometers) per pixel. Prometheus and Epimetheus have been brightened by a factor of 2 relative to the rest of the image to enhance their visibility. http://photojournal.jpl.nasa.gov/catalog/PIA18286
2016-09-12
Saturn's shadow stretched beyond the edge of its rings for many years after Cassini first arrived at Saturn, casting an ever-lengthening shadow that reached its maximum extent at the planet's 2009 equinox. This image captured the moment in 2015 when the shrinking shadow just barely reached across the entire main ring system. The shadow will continue to shrink until the planet's northern summer solstice, at which point it will once again start lengthening across the rings, reaching across them in 2019. Like Earth, Saturn is tilted on its axis. And, just as on Earth, as the sun climbs higher in the sky, shadows get shorter. The projection of the planet's shadow onto the rings shrinks and grows over the course of its 29-year-long orbit, as the angle of the sun changes with respect to Saturn's equator. This view looks toward the sunlit side of the rings from about 11 degrees above the ring plane. The image was taken in visible light with the Cassini spacecraft wide-angle camera on Jan. 16, 2015. The view was obtained at a distance of approximately 1.6 million miles (2.5 million kilometers) from Saturn. Image scale is about 90 miles (150 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20498
Multiple pulsed hypersonic liquid diesel fuel jetsdriven by projectile impact
NASA Astrophysics Data System (ADS)
Pianthong, K.; Takayama, K.; Milton, B. E.; Behnia, M.
2005-06-01
Further studies on high-speed liquid diesel fuel jets injected into ambient air have been carried out. Projectile impact has been used as the driving mechanism, with a vertical two-stage light gas gun as the launcher providing the high-speed impact. This paper describes the experimental technique and visualization methods that provided a rapid series of jet images in a single shot. A high-speed video camera (10⁶ fps) and a shadowgraph optical system were used for visualization. Very interesting and unique phenomena have been discovered and confirmed in this study: multiple high-frequency jet pulses are generated within the duration of a single shot impact, and the associated multiple jet shock waves have been clearly captured. This characteristic consistently occurs with the smaller-cone-angle, straight cone nozzles but not with those with a very wide cone angle or a curved nozzle profile. An instantaneous jet tip velocity of 2680 m/s (Mach number of 7.86) was the maximum obtained, with the 40° nozzle. However, this jet tip velocity can be sustained for only a few microseconds, as attenuation is very rapid.
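The quoted Mach number follows directly from the jet tip velocity and the ambient sound speed, as a quick check:

```python
def mach_number(jet_speed, sound_speed=341.0):
    """Jet-tip Mach number in ambient air; a ~ 341 m/s is an assumed
    room-temperature sound speed, consistent with the quoted figures."""
    return jet_speed / sound_speed

# 2680 m/s jet tip velocity reproduces the quoted Mach 7.86.
print(round(mach_number(2680.0), 2))  # 7.86
```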
NASA Astrophysics Data System (ADS)
Ormö, J.; Wünnemann, K.; Collins, G.; Melero Asensio, I.
2012-09-01
The Experimental Projectile Impact Chamber (EPIC) consists of a 20.5 mm caliber compressed gas gun and a 7 m wide test bed. It is possible to vary the projectile size and density, the velocity up to about 500 m/s, the impact angle, and the target composition. The EPIC is especially designed for the analysis of impacts into unconsolidated and liquid targets, i.e. allowing the use of gravity scaling. The general objective of the EPIC is to analyze the cratering and modification processes of wet-target (e.g. marine) impacts. We have carried out 14 shots into dry sand targets with two projectile compositions (light and weak; heavy and strong), at two impact angles, at three impact velocities, and in both quarter-space and half-space geometries. We recorded the impacts with a high-speed camera and compared the results with numerical simulations using iSALE. The evaluation demonstrated that there are noticeable differences between the results from the two projectile types, but that the crater dimensions are consistent with scaling laws based on other impact experiments [1]. This proves the usefulness of the EPIC in the analysis of natural impacts.
NASA Astrophysics Data System (ADS)
La Forgia, F.; Lazzarin, M.; Bodewits, D.; A'Hearn, M. F.; Bertini, I.; Penasa, L.; Naletto, G.; Cremonese, G.; Massironi, M.; Ferri, F.; Frattin, E.; Lucchetti, A.; Ferrari, S.; Barbieri, C.
2017-09-01
The gas filters of the OSIRIS Wide Angle Camera (WAC) on board the Rosetta spacecraft made it possible to study the gaseous emissions of the inner coma of comet 67P/Churyumov-Gerasimenko. The OH, NH, CN, NH2 and OI gas species were monitored between January and September 2015, i.e. from 2.47 AU pre-perihelion to 1.37 AU post-perihelion, allowing the study of seasonal variations. Each gas sequence covers slightly more than one comet rotation period, also allowing the study of diurnal changes. We measured the gas column density between 1 and 3 km from the nucleus limb in the sunward direction. Results will be presented on the gas diurnal light curves and on long-term variations, such as the dependence on and correlation with time, heliocentric distance, range, phase angle and sub-solar point. Gas ratios are studied in search of evidence of any compositional change with time and orbital evolution. We also searched for connections with particular "active zones" on the nucleus surface. This study will be helpful in connecting ground-based observations of 67P with Rosetta in situ observations.
2017-08-21
NASA's Cassini gazes across the icy rings of Saturn toward the icy moon Tethys, whose night side is illuminated by Saturnshine, or sunlight reflected by the planet. Tethys was on the far side of Saturn with respect to Cassini here; an observer looking upward from the moon's surface toward Cassini would see Saturn's illuminated disk filling the sky. Tethys was brightened by a factor of two in this image to increase its visibility. A sliver of the moon's sunlit northern hemisphere is seen at top. A bright wedge of Saturn's sunlit side is seen at lower left. This view looks toward the sunlit side of the rings from about 10 degrees above the ring plane. The image was taken in visible light with the Cassini spacecraft wide-angle camera on May 13, 2017. The view was acquired at a distance of approximately 750,000 miles (1.2 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 140 degrees. Image scale is 43 miles (70 kilometers) per pixel on Saturn. The distance to Tethys was about 930,000 miles (1.5 million kilometers). The image scale on Tethys is about 56 miles (90 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21342
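The per-pixel scales quoted in the caption follow directly from range times the camera's per-pixel angular resolution (IFOV). Assuming an IFOV of roughly 60 microradians per pixel for the Cassini wide-angle camera (our approximation, close to the published value), both numbers are reproduced:

```python
# Image scale = range * per-pixel angular resolution (IFOV).
# The Cassini WAC IFOV is taken here as ~60 microradians/pixel (approximate).
IFOV = 60e-6  # rad/pixel, assumed

def scale_km_per_px(range_km, ifov=IFOV):
    return range_km * ifov

print(scale_km_per_px(1.2e6))  # Saturn at 1.2 million km: ~70 km/pixel
print(scale_km_per_px(1.5e6))  # Tethys at 1.5 million km: ~90 km/pixel
```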
Partially-overlapped viewing zone based integral imaging system with super wide viewing angle.
Xiong, Zhao-Long; Wang, Qiong-Hua; Li, Shu-Li; Deng, Huan; Ji, Chao-Chao
2014-09-22
In this paper, we analyze the relationship between the viewer and the viewing zones of an integral imaging (II) system and present a partially-overlapped viewing zone (POVZ) based II system with a super wide viewing angle. In the proposed system, the viewing angle is wider than that of the conventional tracking-based II system, and the POVZ eliminates the flipping and the time delay of the 3D scene as well. The proposed II system has a super wide viewing angle of 120° without flipping effect, about twice as wide as that of the conventional system.
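For context, the viewing angle of a conventional (non-tracking) integral imaging display is commonly approximated as theta = 2*arctan(p/2g) for elemental lens pitch p and lens-to-display gap g. The parameters below are illustrative, not the paper's:

```python
# Viewing angle of a conventional integral imaging display:
# theta = 2 * arctan(p / (2g)), p = elemental lens pitch, g = lens-display gap.
# Illustrative numbers only -- not the paper's system parameters.
import math

def viewing_angle_deg(pitch_mm, gap_mm):
    return math.degrees(2.0 * math.atan(pitch_mm / (2.0 * gap_mm)))

theta = viewing_angle_deg(1.0, 3.0)   # a typical lens-array geometry
print(f"conventional viewing angle ~ {theta:.1f} deg")
```

The sketch makes the abstract's claim concrete: a single conventional viewing zone is narrow (here under 20°), which is why overlapping multiple zones, rather than enlarging one, is what reaches 120°.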
Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras.
Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki
2016-06-24
Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system.
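The foot-to-head direction supplied by the 3D human model is the standard cue for this kind of automatic calibration: the image verticals of upright people converge at the vertical vanishing point. A minimal sketch (not the paper's implementation) for two detected people:

```python
# Estimate the vertical vanishing point from foot/head image point pairs.
# Upright people's foot-to-head image segments all converge at this point,
# which is the usual input to human-based automatic camera calibration.
# Illustrative sketch only -- not the paper's actual calibration code.

def cross(a, b):
    """Cross product of 3-vectors (homogeneous image points and lines)."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def vanishing_point(pair1, pair2):
    """Each pair is ((fx, fy), (hx, hy)): foot and head image coordinates.
    The line through a pair is the cross product of its homogeneous points;
    the vanishing point is the intersection (cross product) of the lines."""
    lines = [cross((fx, fy, 1.0), (hx, hy, 1.0))
             for (fx, fy), (hx, hy) in (pair1, pair2)]
    v = cross(lines[0], lines[1])
    return (v[0] / v[2], v[1] / v[2])   # back to inhomogeneous coordinates

# Two upright people whose image verticals converge at (0, -1000):
vx, vy = vanishing_point(((100.0, 500.0), (90.0, 350.0)),
                         ((-200.0, 400.0), (-180.0, 260.0)))
print(f"vertical vanishing point ~ ({vx:.1f}, {vy:.1f})")
```

With more than two people the same lines would be stacked and solved in a least-squares sense, which is what makes the calibration robust to detection noise.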
View of chains of star sand dunes in eastern Algeria from Skylab
1973-12-31
SL4-138-3820 (31 Dec. 1973) --- A north-looking oblique view of chains of star sand dunes in eastern Algeria as seen from the Skylab space station in Earth orbit. This picture was taken by one of the Skylab 4 crewmen with a hand-held 70mm Hasselblad camera. The low sun angle of about 25 degrees above horizontal enhances the detail in this picture. The coordinates of the center of the photograph are approximately 29.5 degrees north latitude and 5.0 degrees east longitude in the Grand Erg Oriental. The field of view at the base of the photograph is approximately 200 kilometers (125 miles). The individual dunes are roughly star-shaped rather than the simple crescents which are common in dune fields. In this region the stars are aligned along ridges. The causes of these and a wide variety of other dune forms are little understood. Descriptions and photographs from Skylab 4 will be used by the U.S. Geological Survey in their world-wide study of dunes. Photo credit: NASA
MESSENGER Reveals Mercury in New Detail
2008-01-16
As NASA's MESSENGER spacecraft approached Mercury on January 14, 2008, the Narrow-Angle Camera of the Mercury Dual Imaging System (MDIS) instrument captured this view of the planet's rugged, cratered landscape illuminated obliquely by the Sun.
2000-11-21
This image is one of seven from the narrow-angle camera on NASA's Cassini spacecraft assembled into a brief movie of cloud movements on Jupiter. The smallest features visible are about 500 kilometers (300 miles) across.
A Precision Metrology System for the Hubble Space Telescope Wide Field Camera 3 Instrument
NASA Technical Reports Server (NTRS)
Toland, Ronald W.
2003-01-01
The Wide Field Camera 3 (WFC3) instrument for the Hubble Space Telescope (HST) will replace the current Wide Field and Planetary Camera 2 (WFPC2). By providing higher throughput and sensitivity than WFPC2, and operating from the near-IR to the near-UV, WFC3 will once again bring the performance of HST above that from ground-based observatories. Crucial to the integration of the WFC3 optical bench is a pair of 2-axis cathetometers used to view targets which cannot be seen by other means when the bench is loaded into its enclosure. The setup and calibration of these cathetometers is described, along with results from a comparison of the cathetometer system with other metrology techniques.
A simple three dimensional wide-angle beam propagation method
NASA Astrophysics Data System (ADS)
Ma, Changbao; van Keuren, Edward
2006-05-01
The development of three dimensional (3-D) waveguide structures for chip scale planar lightwave circuits (PLCs) is hampered by the lack of effective 3-D wide-angle (WA) beam propagation methods (BPMs). We present a simple 3-D wide-angle beam propagation method (WA-BPM) using Hoekstra’s scheme along with a new 3-D wave equation splitting method. The applicability, accuracy and effectiveness of our method are demonstrated by applying it to simulations of wide-angle beam propagation and comparing them with analytical solutions.
2013-09-01
Ground testing of prototype hardware and processing algorithms for a Wide Area Space Surveillance System (WASSS). Neil Goldstein, Rainer A... at Magdalena Ridge Observatory using the prototype Wide Area Space Surveillance System (WASSS) camera, which has a 4 x 60 field-of-view, < 0.05... objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and a Principal Component Analysis-based image
Design of motion adjusting system for space camera based on ultrasonic motor
NASA Astrophysics Data System (ADS)
Xu, Kai; Jin, Guang; Gu, Song; Yan, Yong; Sun, Zhiyuan
2011-08-01
The drift angle is the transverse intersection angle of the image motion vector of a space camera; adjusting this angle reduces its influence on image quality. The ultrasonic motor (USM) is a new type of actuator driven by ultrasonic waves excited in piezoelectric ceramics, with many advantages over conventional electromagnetic motors. In this paper, improvements were designed for the control system of the drift adjusting mechanism. The drift adjusting system was designed around the ultrasonic motor T-60 and is composed of the drift adjusting mechanical frame, the ultrasonic motor, the motor driver, the photoelectric encoder and the drift adjusting controller. A TMS320F28335 DSP was adopted as the calculation and control processor, the photoelectric encoder was used as the sensor of the position closed loop, and a voltage driving circuit was designed as the ultrasonic wave generator. A mathematical model of the drive circuit of the ultrasonic motor T-60 was built using Matlab modules. In order to verify the validity of the drift adjusting system, a disturbance source was introduced and a simulation analysis was performed. The motor drive control system for drift adjusting was designed with improved PID control. The drift angle adjusting system has such advantages as small size, simple configuration, high position control precision, fine repeatability, self-locking and low power consumption. The results show that the system can accomplish the drift angle adjusting task well.
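The closed-loop control described above can be illustrated with a generic discrete PID position loop. The gains and the toy integrator plant below are invented for illustration; this is not the paper's "improved PID" controller or a model of the T-60 motor:

```python
# Generic discrete PID position loop, illustrating closed-loop drift-angle
# control. Gains and the toy plant are invented for illustration only.
def make_pid(kp, ki, kd, dt):
    integral = 0.0
    prev_err = 0.0
    def update(setpoint, measured):
        nonlocal integral, prev_err
        err = setpoint - measured
        integral += err * dt               # integral term accumulates error
        deriv = (err - prev_err) / dt      # derivative term damps the response
        prev_err = err
        return kp * err + ki * integral + kd * deriv
    return update

dt = 0.01
pid = make_pid(kp=4.0, ki=2.0, kd=0.1, dt=dt)

# Toy plant: drift-angle rate proportional to the drive command.
angle = 0.0
for _ in range(2000):                      # 20 s of simulated time
    cmd = pid(1.0, angle)                  # drive the drift angle toward 1.0 deg
    angle += cmd * dt
print(f"final drift angle = {angle:.4f} deg")
```

In the real system the encoder would supply `measured` and the USM drive voltage would carry `cmd`; the integral term is what removes steady-state drift error.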