Micro-Imagers for Spaceborne Cell-Growth Experiments
NASA Technical Reports Server (NTRS)
Behar, Alberto; Matthews, Janet; SaintAnge, Beverly; Tanabe, Helen
2006-01-01
A document discusses selected aspects of a continuing effort to develop five micro-imagers for both still and video monitoring of cell cultures to be grown aboard the International Space Station. The approach taken in this effort is to modify and augment pre-existing electronic micro-cameras. Each such camera includes an image-detector integrated-circuit chip, signal-conditioning and image-compression circuitry, and connections for receiving power from, and exchanging data with, external electronic equipment. Four white and four multicolor light-emitting diodes are to be added to each camera for illuminating the specimens to be monitored. The lens used in the original version of each camera is to be replaced with a shorter-focal-length, more-compact singlet lens to make it possible to fit the camera into the limited space allocated to it. Initially, the lenses in the five cameras are to have different focal lengths: the focal lengths are to be 1, 1.5, 2, 2.5, and 3 cm. Once one of the focal lengths is determined to be the most nearly optimum, the remaining four cameras are to be fitted with lenses of that focal length.
Mitigation of Atmospheric Effects on Imaging Systems
2004-03-31
focal length. The imaging system had two cameras: an Electrim camera sensitive in the visible (0.6 µm) waveband and an Amber QWIP infrared camera...sensitive in the 9-micron region. The Amber QWIP infrared camera had 256×256 pixels, a pixel pitch of 38 µm, a focal length of 1.8 m, and a FOV of 5.4 × 5.4 mrad...each day. Unfortunately, signals from the different read ports of the Electrim camera picked up noise on their way to the digitizer, and this resulted
Thermal Effects on Camera Focal Length in Messenger Star Calibration and Orbital Imaging
NASA Astrophysics Data System (ADS)
Burmeister, S.; Elgner, S.; Preusker, F.; Stark, A.; Oberst, J.
2018-04-01
We analyse images taken by the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft for the camera's thermal response in the harsh thermal environment near Mercury. Specifically, we study thermally induced variations in focal length of the Mercury Dual Imaging System (MDIS). Within several hundred images of star fields, the Wide Angle Camera (WAC) typically captures up to 250 stars in one frame of the panchromatic channel. We measure star positions and relate these to the known star coordinates taken from the Tycho-2 catalogue. We solve for camera pointing, the focal length parameter and two non-symmetrical distortion parameters for each image. Using data from the temperature sensors on the camera focal plane we model a linear focal length function of the form f(T) = A0 + A1·T. Next, we use images from MESSENGER's orbital mapping mission. We deal with large image blocks, typically used for the production of high-resolution digital terrain models (DTMs). We analysed images from the combined quadrangles H03 and H07, a selected region covered by approx. 10,600 images, in which we identified about 83,900 tiepoints. Using bundle block adjustments, we solved for the unknown coordinates of the control points, the pointing of the camera, as well as the camera's focal length. We then fit the above linear function with respect to the focal plane temperature. As a result, we find a complex response of the camera to the thermal conditions of the spacecraft. To first order, we see a linear increase of approx. 0.0107 mm per degree temperature for the Narrow-Angle Camera (NAC). This is in agreement with the observed thermal response seen in images of the panchromatic channel of the WAC. Unfortunately, further comparisons of results from the two methods, both of which use different portions of the available image data, are limited. If left uncorrected, these effects may pose significant difficulties in photogrammetric analysis; specifically, they may be responsible for erroneous long-wavelength trends in topographic models.
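As a minimal illustration of the linear thermal model f(T) = A0 + A1·T described above, the sketch below fits A0 and A1 by least squares; the temperature and focal-length arrays are hypothetical placeholders, not MESSENGER data.

```python
# Hedged sketch: least-squares fit of the linear thermal focal-length model
# f(T) = A0 + A1*T. The arrays below are hypothetical, not MESSENGER data.
import numpy as np

T = np.array([-5.0, 0.0, 10.0, 20.0, 30.0])             # focal-plane temperatures (deg C), assumed
f = np.array([549.65, 549.70, 549.81, 549.92, 550.02])  # per-image focal lengths (mm), assumed

A1, A0 = np.polyfit(T, f, 1)   # slope A1 (mm per deg C) and intercept A0 (mm)
print(f"f(T) = {A0:.3f} mm + {A1:.4f} mm/degC * T")
```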
Comparison of photogrammetric and astrometric data reduction results for the wild BC-4 camera
NASA Technical Reports Server (NTRS)
Hornbarger, D. H.; Mueller, I. I.
1971-01-01
The results of astrometric and photogrammetric plate reduction techniques for a short focal length camera are compared. Several astrometric models are tested on entire and limited plate areas to analyze their ability to remove systematic errors from interpolated satellite directions using a rigorous photogrammetric reduction as a standard. Residual plots are employed to graphically illustrate the analysis. Conclusions are made as to what conditions will permit the astrometric reduction to achieve comparable accuracies to those of photogrammetric reduction when applied for short focal length ballistic cameras.
HandSight: Supporting Everyday Activities through Touch-Vision
2015-10-01
switches between IR and RGB o Large, low resolution, and fixed focal length > 1 ft • Raspberry Pi NoIR: https://www.raspberrypi.org/products/pi-noir...camera/ o Raspberry Pi NoIR camera with external visible light filters o Good image quality, manually adjustable focal length, small, programmable...
Fabrication of multi-focal microlens array on curved surface for wide-angle camera module
NASA Astrophysics Data System (ADS)
Pan, Jun-Gu; Su, Guo-Dung J.
2017-08-01
In this paper, we present a wide-angle and compact camera module that consists of a microlens array with different focal lengths on a curved surface. The design integrates the principles of an insect's compound eye and the human eye. It contains a curved hexagonal microlens array and a spherical lens. Compared with normal mobile phone cameras, which usually need no fewer than four lenses, our proposed system uses only one lens. Furthermore, the thickness of our proposed system is only 2.08 mm and the diagonal full field of view is about 100 degrees. In order to make the critical microlens array, we used inkjet printing to control the surface shape of each microlens to achieve different focal lengths, and a replication method to form the curved hexagonal microlens array.
Voyager spacecraft images of Jupiter and Saturn
NASA Technical Reports Server (NTRS)
Birnbaum, M. M.
1982-01-01
The Voyager imaging system is described, noting that it is made up of a narrow-angle and a wide-angle TV camera, each in turn consisting of optics, a filter wheel and shutter assembly, a vidicon tube, and an electronics subsystem. The narrow-angle camera has a focal length of 1500 mm; its field of view is 0.42 deg and its focal ratio is f/8.5. For the wide-angle camera, the focal length is 200 mm, the field of view 3.2 deg, and the focal ratio f/3.5. Images are exposed by each camera through one of eight filters in the filter wheel on the photoconductive surface of a magnetically focused and deflected vidicon having a diameter of 25 mm. The vidicon storage surface (target) is a selenium-sulfur film having an active area of 11.14 x 11.14 mm; it holds a frame consisting of 800 lines with 800 picture elements per line. Pictures of Jupiter, Saturn, and their moons are presented, with short descriptions given of the area being viewed.
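As a rough cross-check (not part of the original report), the quoted fields of view follow from the 11.14 mm vidicon target size and the two focal lengths via FOV = 2·arctan(d/2f); the results agree with the stated 0.42 deg and 3.2 deg to within rounding.

```python
# Rough cross-check: FOV = 2*atan(target / (2*focal_length)), using the 11.14 mm
# vidicon active-area size quoted above.
import math

def fov_deg(target_mm, focal_mm):
    return math.degrees(2.0 * math.atan(target_mm / (2.0 * focal_mm)))

print(f"Narrow-angle (f = 1500 mm): {fov_deg(11.14, 1500):.2f} deg")  # ~0.43 deg
print(f"Wide-angle   (f =  200 mm): {fov_deg(11.14, 200):.2f} deg")   # ~3.19 deg
```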
A telephoto camera system with shooting direction control by gaze detection
NASA Astrophysics Data System (ADS)
Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro
2015-05-01
For safe driving, it is important for the driver to check traffic conditions, such as traffic lights or traffic signs, as early as possible. If an on-vehicle camera takes images of objects important for understanding traffic conditions from a long distance and shows these to the driver, the driver can understand the traffic conditions earlier. To take clear images of distant objects, the focal length of the camera must be long. When the focal length is long, however, the on-vehicle camera does not have a large enough field of view to check traffic conditions. Therefore, in order to obtain the necessary images from a long distance, the camera must have a long focal length and a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took a telescopic image and displayed it to the driver. However, that study used a touch panel to indicate the shooting direction, which can disturb driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, to avoid disturbing driving. The proposed system is composed of a gaze detector and an active telephoto camera whose shooting direction is controlled. We adopt a non-wearable gaze detection method to avoid hindering driving. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. Experiments confirmed that the proposed system captures images in the direction of the subject's gaze.
Plenoptic camera based on a liquid crystal microlens array
NASA Astrophysics Data System (ADS)
Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Xie, Changsheng
2015-09-01
A type of liquid crystal microlens array (LCMLA), whose focal length is tuned by the voltage signals applied between its top and bottom electrodes, is fabricated, and its common optical focusing characteristics are tested. The relationship between the focal length and the applied voltage signals is given. The LCMLA is integrated with an image sensor and further coupled with a main lens so as to construct a plenoptic camera. Several raw images are acquired at different applied voltage signals and compared using the LCMLA-based plenoptic camera constructed by us. Our experiments demonstrate that, by utilizing an LCMLA in a plenoptic camera, the focused zone of the LCMLA-based plenoptic camera can be shifted effectively simply by changing the voltage signals applied between the electrodes of the LCMLA, which is equivalent to an extension of the depth of field.
An electrically tunable plenoptic camera using a liquid crystal microlens array.
Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng
2015-05-01
Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.
An electrically tunable plenoptic camera using a liquid crystal microlens array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lei, Yu; School of Automation, Huazhong University of Science and Technology, Wuhan 430074; Wuhan National Laboratory for Optoelectronics, Huazhong University of Science and Technology, Wuhan 430074
2015-05-15
Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.
An electrically tunable plenoptic camera using a liquid crystal microlens array
NASA Astrophysics Data System (ADS)
Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng
2015-05-01
Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.
Photographic zoom fisheye lens design for DSLR cameras
NASA Astrophysics Data System (ADS)
Yan, Yufeng; Sasian, Jose
2017-09-01
Photographic fisheye lenses with fixed focal length for cameras with different sensor formats have been well developed for decades. However, photographic fisheye lenses with variable focal length are rare on the market due in part to the greater design difficulty. This paper presents a large aperture zoom fisheye lens for DSLR cameras that produces both circular and diagonal fisheye imaging for 35-mm sensors and diagonal fisheye imaging for APS-C sensors. The history and optical characteristics of fisheye lenses are briefly reviewed. Then, a 9.2- to 16.1-mm F/2.8 to F/3.5 zoom fisheye lens design is presented, including the design approach and aberration control. Image quality and tolerance performance analysis for this lens are also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rilling, M; Centre de Recherche sur le Cancer, Hôtel-Dieu de Québec, Quebec City, QC; Département de radio-oncologie, CHU de Québec, Quebec City, QC
2015-06-15
Purpose: The purpose of this work is to simulate a multi-focus plenoptic camera used as the measuring device in a real-time three-dimensional scintillation dosimeter. Simulating and optimizing this realistic optical system will bridge the technological gap between concept validation and a clinically viable tool that can provide highly efficient, accurate and precise measurements for dynamic radiotherapy techniques. Methods: The experimental prototype, previously developed for proof of concept purposes, uses an off-the-shelf multi-focus plenoptic camera. With an array of interleaved microlenses of different focal lengths, this camera records spatial and angular information of light emitted by a plastic scintillator volume. The three distinct microlens focal lengths were determined experimentally for use as baseline parameters by measuring image-to-object magnification at different distances in object space. A simulated plenoptic system was implemented using the non-sequential ray tracing software Zemax: this tool allows complete simulation of multiple optical paths by modeling interactions at interfaces such as scatter, diffraction, reflection and refraction. The active sensor was modeled, based on the camera manufacturer specifications, as a 2048×2048, 5 µm-pixel-pitch sensor. Planar light sources, simulating the plastic scintillator volume, were employed for ray tracing simulations. Results: The microlens focal lengths were determined to be 384, 327 and 290 µm. A realistic multi-focus plenoptic system, with independently defined and optimizable specifications, was fully simulated. An f/2.9, 54 mm focal length Double Gauss objective was modeled as the system's main lens. A three-focal-length hexagonal microlens array of 250 µm thickness was designed, acting as an image-relay system between the main lens and sensor. Conclusion: Simulation of a fully modeled multi-focus plenoptic camera enables the decoupled optimization of the main lens and microlens specifications. This work leads the way to improving the 3D dosimeter's achievable resolution, efficiency and build for providing a quality assurance tool fully meeting clinical needs. M.R. is financially supported by a Master's Canada Graduate Scholarship from the NSERC. This research is also supported by the NSERC Industrial Research Chair in Optical Design.
NASA Astrophysics Data System (ADS)
Malin, Michal C.; Ravine, Michael A.; Caplinger, Michael A.; Tony Ghaemi, F.; Schaffner, Jacob A.; Maki, Justin N.; Bell, James F.; Cameron, James F.; Dietrich, William E.; Edgett, Kenneth S.; Edwards, Laurence J.; Garvin, James B.; Hallet, Bernard; Herkenhoff, Kenneth E.; Heydari, Ezat; Kah, Linda C.; Lemmon, Mark T.; Minitti, Michelle E.; Olson, Timothy S.; Parker, Timothy J.; Rowland, Scott K.; Schieber, Juergen; Sletten, Ron; Sullivan, Robert J.; Sumner, Dawn Y.; Aileen Yingst, R.; Duston, Brian M.; McNair, Sean; Jensen, Elsa H.
2017-08-01
The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam-34 has an f/8, 34 mm focal length lens, and the M-100 an f/10, 100 mm focal length lens. The M-34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M-100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M-34 can focus from 0.5 m to infinity, and the M-100 from 1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the 2 m tall Remote Sensing Mast, have a 360° azimuth and 180° elevation field of regard. Mars Descent Imager is fixed-mounted to the bottom left front side of the rover at 66 cm above the surface. Its fixed focus lens is in focus from 2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of 70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression.
Shared Focal Plane Investigation for Serial Frame Cameras.
1980-03-01
...capability will be restored. [Table 1-1, System Leading Particulars: lens focal length (inches) 12 / 18 / 24; range (ft); contrast; coverage] ...it can be expected that signature bands will be apparent in the imagery. Such bands are at best distracting and at worst hindrances to image interpretation
Remote Sensing Simulation Activities for Earthlings
ERIC Educational Resources Information Center
Krockover, Gerald H.; Odden, Thomas D.
1977-01-01
Suggested are activities using a Polaroid camera to illustrate the capabilities of remote sensing. Reading materials from the National Aeronautics and Space Administration (NASA) are suggested. Methods for (1) finding a camera's focal length, (2) calculating ground dimension photograph simulation, and (3) limiting size using film resolution are…
An all-silicone zoom lens in an optical imaging system
NASA Astrophysics Data System (ADS)
Zhao, Cun-Hua
2013-09-01
An all-silicone zoom lens is fabricated. A tunable metal ring is fastened around the side edge of the lens. A nylon rope linked to a motor is tied around a notch in the metal ring. As the motor operates, the rope can tighten or release to change the focal length of the lens. A calculation method is developed to obtain the focal length and the zoom ratio. Testing is then carried out, and the measured values agree well with the calculated ones. Finally, the imaging performance of the all-silicone lens is demonstrated. The all-silicone lens has potential uses in cellphone cameras, notebook cameras, micro monitor lenses, etc.
NASA Astrophysics Data System (ADS)
Bell, J. F.; Godber, A.; McNair, S.; Caplinger, M. A.; Maki, J. N.; Lemmon, M. T.; Van Beek, J.; Malin, M. C.; Wellington, D.; Kinch, K. M.; Madsen, M. B.; Hardgrove, C.; Ravine, M. A.; Jensen, E.; Harker, D.; Anderson, R. B.; Herkenhoff, K. E.; Morris, R. V.; Cisneros, E.; Deen, R. G.
2017-07-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted 2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) "true color" images, multispectral images in nine additional bands spanning 400-1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration.
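As a back-of-the-envelope check (not stated in the abstract), the quoted IFOVs, focal lengths and pixel counts are mutually consistent with a detector pixel pitch of roughly 7.4 µm, since pitch = IFOV × focal length and FOV ≈ IFOV × pixel count; the derived pitch is an inference, not a figure from the paper.

```python
# Consistency check using only values quoted above; the ~7.4 um pitch is derived.
import math

for name, f_mm, ifov_mrad in [("M-34", 34.0, 0.22), ("M-100", 100.0, 0.074)]:
    pitch_um = ifov_mrad * f_mm                      # mrad * mm = micrometres
    fov_w = math.degrees(ifov_mrad * 1e-3 * 1648)    # full CCD width
    fov_h = math.degrees(ifov_mrad * 1e-3 * 1200)    # full CCD height
    print(f"{name}: pixel pitch ~{pitch_um:.1f} um, FOV ~{fov_w:.1f} x {fov_h:.1f} deg")
```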
Adaptive metalenses with simultaneous electrical control of focal length, astigmatism, and shift.
She, Alan; Zhang, Shuyan; Shian, Samuel; Clarke, David R; Capasso, Federico
2018-02-01
Focal adjustment and zooming are universal features of cameras and advanced optical systems. Such tuning is usually performed longitudinally along the optical axis by mechanical or electrical control of focal length. However, the recent advent of ultrathin planar lenses based on metasurfaces (metalenses), which opens the door to future drastic miniaturization of mobile devices such as cell phones and wearable displays, mandates fundamentally different forms of tuning based on lateral motion rather than longitudinal motion. Theory shows that the strain field of a metalens substrate can be directly mapped into the outgoing optical wavefront to achieve large diffraction-limited focal length tuning and control of aberrations. We demonstrate electrically tunable large-area metalenses controlled by artificial muscles capable of simultaneously performing focal length tuning (>100%) as well as on-the-fly astigmatism and image shift corrections, which until now were only possible in electron optics. The device thickness is only 30 μm. Our results demonstrate the possibility of future optical microscopes that fully operate electronically, as well as compact optical systems that use the principles of adaptive optics to correct many orders of aberrations simultaneously.
Malin, Michal C; Ravine, Michael A; Caplinger, Michael A; Tony Ghaemi, F; Schaffner, Jacob A; Maki, Justin N; Bell, James F; Cameron, James F; Dietrich, William E; Edgett, Kenneth S; Edwards, Laurence J; Garvin, James B; Hallet, Bernard; Herkenhoff, Kenneth E; Heydari, Ezat; Kah, Linda C; Lemmon, Mark T; Minitti, Michelle E; Olson, Timothy S; Parker, Timothy J; Rowland, Scott K; Schieber, Juergen; Sletten, Ron; Sullivan, Robert J; Sumner, Dawn Y; Aileen Yingst, R; Duston, Brian M; McNair, Sean; Jensen, Elsa H
2017-08-01
The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam-34 has an f/8, 34 mm focal length lens, and the M-100 an f/10, 100 mm focal length lens. The M-34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M-100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M-34 can focus from 0.5 m to infinity, and the M-100 from ~1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the ~2 m tall Remote Sensing Mast, have a 360° azimuth and ~180° elevation field of regard. Mars Descent Imager is fixed-mounted to the bottom left front side of the rover at ~66 cm above the surface. Its fixed focus lens is in focus from ~2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of ~70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression.
Ravine, Michael A.; Caplinger, Michael A.; Tony Ghaemi, F.; Schaffner, Jacob A.; Maki, Justin N.; Bell, James F.; Cameron, James F.; Dietrich, William E.; Edgett, Kenneth S.; Edwards, Laurence J.; Garvin, James B.; Hallet, Bernard; Herkenhoff, Kenneth E.; Heydari, Ezat; Kah, Linda C.; Lemmon, Mark T.; Minitti, Michelle E.; Olson, Timothy S.; Parker, Timothy J.; Rowland, Scott K.; Schieber, Juergen; Sletten, Ron; Sullivan, Robert J.; Sumner, Dawn Y.; Aileen Yingst, R.; Duston, Brian M.; McNair, Sean; Jensen, Elsa H.
2017-01-01
Abstract The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam‐34 has an f/8, 34 mm focal length lens, and the M‐100 an f/10, 100 mm focal length lens. The M‐34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M‐100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M‐34 can focus from 0.5 m to infinity, and the M‐100 from ~1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the ~2 m tall Remote Sensing Mast, have a 360° azimuth and ~180° elevation field of regard. Mars Descent Imager is fixed‐mounted to the bottom left front side of the rover at ~66 cm above the surface. Its fixed focus lens is in focus from ~2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of ~70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression. PMID:29098171
NASA Astrophysics Data System (ADS)
de Villiers, Jason; Jermy, Robert; Nicolls, Fred
2014-06-01
This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb line method, allows many radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low order models despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted to undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion) allowing deterministic rates far exceeding real time. The focal length is determined to minimise the error in absolute photogrammetric positional measurement for both multi camera systems or monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/ LWIR camera array, and a simple laboratory optical helmet tracker.
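For readers unfamiliar with the distortion model referenced above, the sketch below applies a Brown-style model with 5 radial and 3 tangential coefficients about the principal point. The exact parameterization (pixel-coordinate polynomial, third tangential coefficient as a (1 + p3·r²) scale factor) is one common convention and an assumption here, not necessarily the paper's.

```python
# Hedged sketch of a Brown-style distortion model with 5 radial (k1..k5) and
# 3 tangential (p1..p3) coefficients; conventions vary between implementations.
def brown_distort(xu, yu, cx, cy, k, p):
    """Map undistorted pixel coords (xu, yu) to distorted ones about (cx, cy)."""
    x, y = xu - cx, yu - cy
    r2 = x * x + y * y
    radial = 1.0 + sum(k[i] * r2 ** (i + 1) for i in range(5))
    dx = (2.0 * p[0] * x * y + p[1] * (r2 + 2.0 * x * x)) * (1.0 + p[2] * r2)
    dy = (p[0] * (r2 + 2.0 * y * y) + 2.0 * p[1] * x * y) * (1.0 + p[2] * r2)
    return cx + x * radial + dx, cy + y * radial + dy

# Example with tiny, hypothetical coefficients (pixel units):
print(brown_distort(800.0, 600.0, 640.0, 512.0,
                    k=(1e-8, 0.0, 0.0, 0.0, 0.0), p=(1e-7, -1e-7, 0.0)))
```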
Exploring the imaging properties of thin lenses for cryogenic infrared cameras
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura
2016-05-01
Designing a cryogenic camera is a good strategy to miniaturize and simplify an infrared camera using a cooled detector. Indeed, integrating the optics inside the cold shield makes it simple to athermalize the design, guarantees a cold pupil, and relaxes the constraint of a long back focal length for short-focal-length systems. In this way, cameras made of a single lens or two lenses are viable systems with good optical features and good stability in image correction. However, it involves a relatively significant additional optical mass inside the dewar and thus increases the cool-down time of the camera. ONERA is currently exploring a minimalist strategy consisting of giving an imaging function to the thin optical plates that are found in conventional dewars. In this way, we could make a cryogenic camera that has the same cool-down time as a traditional dewar without an imaging function. Two examples will be presented: the first is a camera using a dual-band infrared detector, made of a lens outside the dewar and a lens inside the cold shield, the latter having the main optical power of the system. We were able to design a cold plano-convex lens with a thickness lower than 1 mm. The second example is an evolution of a former cryogenic camera called SOIE. We replaced the cold meniscus with a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performances of both cameras will be compared.
Miniaturization of dielectric liquid microlens in package
Yang, Chih-Cheng; Tsai, C. Gary; Yeh, J. Andrew
2010-01-01
This study presents packaged microscale liquid lenses actuated with liquid droplets of 300–700 μm in diameter using dielectric force manipulation. The liquid microlens demonstrated focal length tunability in a plastic package. The focal length of a liquid lens with a lens droplet of 500 μm in diameter is shortened from 4.4 to 2.2 mm when the applied voltage changes from 0 to 79 Vrms. Dynamic responses analyzed using 2000 frames/s high-speed motion cameras show that the advancing and receding times are 90 and 60 ms, respectively. The size effect of the dielectric liquid microlens is characterized for lens droplets of 300–700 μm in diameter in terms of focal length. PMID:21267438
Neil A. Clark; Sang-Mook Lee
2004-01-01
This paper demonstrates how a digital video camera with a long lens can be used with pulse laser ranging in order to collect very large-scale tree crown measurements. The long focal length of the camera lens provides the magnification required for precise viewing of distant points with the trade-off of spatial coverage. Multiple video frames are mosaicked into a single...
NASA Astrophysics Data System (ADS)
Fuh, Yiin-Kuen; Chen, Pin-Wen; Lai, Zheng-Hong
2016-07-01
Mechanically deformable lenses with dynamically tunable focal lengths have been developed in this work. The five types of aspheric polydimethylsiloxane (PDMS) lenses fabricated here have initial focal lengths of 7.0, 7.8, 9.0, 10.0 and 10.2 mm. Incorporating two modes of operation, in biconvex and concave-convex configurations, the focal lengths can be tuned dynamically over 5.2-10.2, 5.5-9.9, 6.6-11.9, 6.1-13.5 and 6.6-13.5 mm, respectively. Additive manufacturing was utilized to fabricate these five types of aspheric lenses (APLs) via sequential layering of PDMS materials. Complex structures with three-dimensional features and shorter focal lengths can be produced by repeatedly depositing, inverting and curing controlled PDMS volumes onto previously cured PDMS droplets. From our experiments, we empirically found a direct dependence of the focal length of the lenses on the amount (volume) of deposited PDMS droplets. This new mouldless, low-cost and flexible lens fabrication method is able to transform an ordinary commercial smartphone camera into a low-cost portable microscope. A few microscopic features can be readily visualized, such as the wrinkles of a ladybird pupa and a printed circuit board. The fabrication technique of successively applying hanging droplets, together with the facile mechanical focal-length-tuning set-up, can be easily adopted in the development of high-performance optical lenses.
Depth Perception In Remote Stereoscopic Viewing Systems
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Von Sydow, Marika
1989-01-01
Report describes theoretical and experimental studies of perception of depth by human operators through stereoscopic video systems. Purpose of such studies to optimize dual-camera configurations used to view workspaces of remote manipulators at distances of 1 to 3 m from cameras. According to analysis, static stereoscopic depth distortion decreased, without decreasing stereoscopic depth resolution, by increasing camera-to-object and intercamera distances and camera focal length. Further predicts dynamic stereoscopic depth distortion reduced by rotating cameras around center of circle passing through point of convergence of viewing axes and first nodal points of two camera lenses.
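The trade-off described above is consistent with the standard stereo range-resolution relation Δz ≈ z²·Δd/(f·b), which follows from the disparity relation d = f·b/z: increasing focal length f and intercamera baseline b improves depth resolution and can offset the loss from a larger camera-to-object distance z. The sketch below is an illustrative calculation with hypothetical numbers, not data from the report.

```python
# Hedged illustration: stereo depth resolution dz ~ z^2 * dd / (f * b),
# derived from disparity d = f*b/z. All numbers are hypothetical.
def depth_resolution_mm(z_mm, f_mm, baseline_mm, disparity_res_mm):
    return (z_mm ** 2) * disparity_res_mm / (f_mm * baseline_mm)

z = 2000.0    # camera-to-object distance, 2 m (assumed)
dd = 0.007    # disparity resolution ~ one 7 um pixel on the sensor (assumed)
for f, b in [(8.0, 100.0), (16.0, 100.0), (16.0, 200.0)]:
    print(f"f={f} mm, b={b} mm -> dz ~ {depth_resolution_mm(z, f, b, dd):.1f} mm")
```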
Bifocal Stereo for Multipath Person Re-Identification
NASA Astrophysics Data System (ADS)
Blott, G.; Heipke, C.
2017-11-01
This work presents an approach to the task of person re-identification by exploiting bifocal stereo cameras. Present monocular person re-identification approaches show a decreasing working distance when the image resolution is increased to obtain higher re-identification performance. We propose a novel 3D multipath bifocal approach, containing a rectilinear lens with a larger focal length for long-range distances and a fisheye lens with a smaller focal length for the near range. The person re-identification performance is at least on par with 2D re-identification approaches, but the working distance of the approach is increased, and on average 10% more re-identification performance can be achieved in the overlapping field of view compared to a single camera. In addition, the 3D information from the overlapping field of view is exploited to solve potential 2D ambiguities.
Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery
NASA Astrophysics Data System (ADS)
Kwoh, L. K.; Huang, X.; Tan, W. J.
2012-07-01
XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52–0.60 µm for green, 0.63–0.69 µm for red and 0.76–0.89 µm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, red and green). These CCDs were physically separated in the focal plane and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the three bands with a simple linear transformation. In the camera model developed, this platform instability was compensated for with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, as well as the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs red and the green vs red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixels for the NIR vs red and green vs red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
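As a back-of-the-envelope conversion (not from the paper), the quoted along-track CCD separations in degrees translate into ground offsets and pixel counts at the 822 km orbit and 12 m ground sample distance; nadir viewing is assumed.

```python
# Hedged conversion: along-track angular CCD separation -> ground offset and pixels,
# assuming nadir viewing from 822 km altitude and a 12 m ground sample distance.
import math

altitude_m, gsd_m = 822_000.0, 12.0
for pair, sep_deg in [("NIR vs red", 0.093), ("green vs red", 0.046)]:
    offset_m = altitude_m * math.tan(math.radians(sep_deg))
    print(f"{pair}: ~{offset_m:.0f} m on the ground, "
          f"~{offset_m / gsd_m:.0f} pixels along-track")
```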
Bell, James F.; Godber, A.; McNair, S.; Caplinger, M.A.; Maki, J.N.; Lemmon, M.T.; Van Beek, J.; Malin, M.C.; Wellington, D.; Kinch, K.M.; Madsen, M.B.; Hardgrove, C.; Ravine, M.A.; Jensen, E.; Harker, D.; Anderson, Ryan; Herkenhoff, Kenneth E.; Morris, R.V.; Cisneros, E.; Deen, R.G.
2017-01-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted ~2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) “true color” images, multispectral images in nine additional bands spanning ~400–1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration
Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera
NASA Technical Reports Server (NTRS)
Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.
1988-01-01
The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 A wavelength range.
First results on video meteors from Crete, Greece
NASA Astrophysics Data System (ADS)
Maravelias, G.
2012-01-01
This work presents the first systematic video meteor observations from a forthcoming permanent station in Crete, Greece, operating as the first official node within the International Meteor Organization's Video Network. It consists of a Watec 902 H2 Ultimate camera equipped with a Panasonic WV-LA1208 lens (focal length 12 mm, f/0.8) running MetRec. The system operated for 42 nights during 2011 (August 19-December 30, 2011), recording 1905 meteors. It is significantly more performant than a previous system used by the author during the Perseids 2010 (DMK 21AF04.AS camera by The Imaging Source, CCTV lens of focal length 2.8 mm, UFO Capture v2.22), which operated for 17 nights (August 4-22, 2010), recording 32 meteors. Differences - according to the author's experience - between the two software packages (MetRec, UFO Capture) are discussed, along with a small guide to video meteor hardware.
Observation of Planetary Motion Using a Digital Camera
ERIC Educational Resources Information Center
Meyn, Jan-Peter
2008-01-01
A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to apparent magnitude 8. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…
Image Processing for Cameras with Fiber Bundle Image Relay
length. Optical fiber bundles have been used to couple between this focal surface and planar image sensors. However, such fiber-coupled imaging systems...coupled to six discrete CMOS focal planes. We characterize the locally space-variant system impulse response at various stages: monocentric lens image...vignetting, and stitch together the image data from discrete sensors into a single panorama. We compare processed images from the prototype to those taken with
Graphic design of pinhole cameras
NASA Technical Reports Server (NTRS)
Edwards, H. B.; Chu, W. P.
1979-01-01
The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
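As background for the size/focal-length trade-off analysed in the paper (this is not the paper's graphic method), a commonly quoted rule of thumb balances geometric blur against diffraction; Rayleigh's estimate puts the optimum pinhole diameter at roughly d ≈ 1.9·sqrt(λ·f). The wavelength and focal lengths below are assumed example values.

```python
# Hedged rule of thumb (Rayleigh), not the paper's transfer-function method:
# optimum pinhole diameter d ~ 1.9 * sqrt(wavelength * focal_length).
import math

wavelength_m = 550e-9          # green light (assumed)
for f_mm in (25.0, 50.0, 100.0):
    d_mm = 1.9 * math.sqrt(wavelength_m * f_mm * 1e-3) * 1e3
    print(f"f = {f_mm:.0f} mm -> optimum pinhole diameter ~ {d_mm:.2f} mm")
```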
NASA Astrophysics Data System (ADS)
Daly, Michael J.; Muhanna, Nidal; Chan, Harley; Wilson, Brian C.; Irish, Jonathan C.; Jaffray, David A.
2014-02-01
A freehand, non-contact diffuse optical tomography (DOT) system has been developed for multimodal imaging with intraoperative cone-beam CT (CBCT) during minimally-invasive cancer surgery. The DOT system is configured for near-infrared fluorescence imaging with indocyanine green (ICG) using a collimated 780 nm laser diode and a nearinfrared CCD camera (PCO Pixelfly USB). Depending on the intended surgical application, the camera is coupled to either a rigid 10 mm diameter endoscope (Karl Storz) or a 25 mm focal length lens (Edmund Optics). A prototype flatpanel CBCT C-Arm (Siemens Healthcare) acquires low-dose 3D images with sub-mm spatial resolution. A 3D mesh is extracted from CBCT for finite-element DOT implementation in NIRFAST (Dartmouth College), with the capability for soft/hard imaging priors (e.g., segmented lymph nodes). A stereoscopic optical camera (NDI Polaris) provides real-time 6D localization of reflective spheres mounted to the laser and camera. Camera calibration combined with tracking data is used to estimate intrinsic (focal length, principal point, non-linear distortion) and extrinsic (translation, rotation) lens parameters. Source/detector boundary data is computed from the tracked laser/camera positions using radiometry models. Target registration errors (TRE) between real and projected boundary points are ~1-2 mm for typical acquisition geometries. Pre-clinical studies using tissue phantoms are presented to characterize 3D imaging performance. This translational research system is under investigation for clinical applications in head-and-neck surgery including oral cavity tumour resection, lymph node mapping, and free-flap perforator assessment.
Ultrathin zoom telescopic objective.
Li, Lei; Wang, Di; Liu, Chao; Wang, Qiong-Hua
2016-08-08
We report an ultrathin zoom telescopic objective that can achieve continuous zoom and has a compact volume. The objective consists of an annular folded lens and three electrowetting liquid lenses. The annular folded lens undertakes the main part of the focal power of the lens system. Due to a multiple-fold design, the optical path is folded in a lens with a thickness of ~1.98 mm. The electrowetting liquid lenses constitute the zoom part. Based on the proposed objective, an ultrathin zoom telescopic camera is demonstrated. We analyze the properties of the proposed objective. The aperture of the proposed objective is ~15 mm. The total length of the system is ~18 mm with a tunable focal length of ~48 mm to ~65 mm. Compared with a conventional zoom telescopic objective, the total length is largely reduced.
Phenomenology of a Water Venting in Low Earth Orbit
1992-01-01
of the transport of outgas, the interaction of the vehicle with the ionospheric plasma, the energy balance of cometary material, and the uses of... [Fig. 8(a): equi-photocurrent plot of the water trail from the aft...; axis: distance along profile (pixels)] ...distance of the onboard camera's short focal length lens [section: ONBOARD-CAMERA IMAGES]... their corresponding mean irradiance at the focal plane is...
Focus collimator press for a collimator for gamma ray cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
York, R.N.; York, D.L.
A focus collimator press for collimators for gamma ray cameras is described, comprising a pivot arm of fixed length mounted on a travelling pivot which is movable in the plane of a spaced-apart work table surface in a direction toward and away from the work table. A press plate is carried at the opposite end of the fixed-length pivot arm, and is maintained in registration with the same portion of the work table for pressing engagement with each undulating radiation-opaque strip as it is added to the top of a collimator stack in process by movement of the travelling pivot inward toward the work table. This enables the press plate to maintain its relative position above the collimator stack while the angle of the press plate changes, becoming less acute in relation to the work table as the travelling pivot moves inwardly toward the work table. The fixed length of the pivot arm is substantially equal to the focal point of the converging apertures formed by each pair of undulating strips stacked together. Thus, the focal point of each aperture row falls substantially on the axis of the travelling pivot, and since it moves in the plane of the work table surface, the focal point of each aperture row is directed to lie in the same common plane. When one of two collimator stacks made in this way is rotated 180 degrees and the two are bonded together along their respective first strips, all focal points of every aperture row lie on the central axis of the completed collimator.
SU-D-BRC-07: System Design for a 3D Volumetric Scintillation Detector Using SCMOS Cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darne, C; Robertson, D; Alsanea, F
2016-06-15
Purpose: The purpose of this project is to build a volumetric scintillation detector for quantitative imaging of 3D dose distributions of proton beams accurately in near real-time. Methods: The liquid scintillator (LS) detector consists of a transparent acrylic tank (20×20×20 cm³) filled with a liquid scintillator that, when irradiated with protons, generates scintillation light. To track rapid spatial and dose variations in spot-scanning proton beams we used three scientific complementary metal-oxide semiconductor (sCMOS) imagers (2560×2160 pixels). The cameras collect optical signal from three orthogonal projections. To reduce the system footprint, two mirrors oriented at 45° to the tank surfaces redirect scintillation light to the cameras for capturing the top and right views. Selection of fixed focal length objective lenses for these cameras was based on their ability to provide a large depth of field (DoF) and the required field of view (FoV). Multiple cross-hairs imprinted on the tank surfaces allow for image corrections arising from camera perspective and refraction. Results: We determined that by setting the sCMOS to 16-bit dynamic range, truncating its FoV (1100×1100 pixels) to image the entire volume of the LS detector, and using a 5.6 msec integration time, the imaging rate can be ramped up to 88 frames per second (fps). A 20 mm focal length lens provides a 20 cm imaging DoF and 0.24 mm/pixel resolution. A master-slave camera configuration enables the slaves to initiate image acquisition instantly (within 2 µsec) after receiving a trigger signal. A computer with 128 GB RAM was used for spooling images from the cameras and can sustain a maximum recording time of 2 min per camera at 75 fps. Conclusion: The three sCMOS cameras are capable of high-speed imaging. They can therefore be used for quick, high-resolution, and precise mapping of dose distributions from scanned spot proton beams in three dimensions.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11298 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of the commander's side or port side of Atlantis' cabin. Distance from the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-118 R-Bar Pitch Maneuver
2007-08-10
ISS015-E-21344 (10 Aug. 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Endeavour as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows the nose cone of Endeavour and surrounding area. Distance between the station and shuttle at this time was approximately 600 feet.
Periodicity analysis on cat-eye reflected beam profiles of optical detectors
NASA Astrophysics Data System (ADS)
Gong, Mali; He, Sifeng
2017-05-01
The cat-eye effect reflected beam profiles of most optical detectors have a certain periodicity, which is caused by the array arrangement of sensors at their optical focal planes. We find and prove for the first time that the reflected beam profile becomes several periodic spots at a reflected propagation distance corresponding to half the imaging distance of a CCD camera. Furthermore, the spatial cycle of these spots is approximately constant and independent of the CCD camera's imaging distance; it is related only to the focal length and pixel size of the CCD sensor. Thus, we can obtain the imaging distance and intrinsic parameters of the optical detector by analyzing its cat-eye reflected beam profiles. This conclusion can be applied in the field of non-cooperative cat-eye target recognition.
NASA Astrophysics Data System (ADS)
Lee, Kyuhang; Ko, Jinseok; Wi, Hanmin; Chung, Jinil; Seo, Hyeonjin; Jo, Jae Heung
2018-06-01
The visible TV system used in the Korea Superconducting Tokamak Advanced Research device has been equipped with a periscope to minimize the damage to its CCD pixels from neutron radiation. The periscope, more than 2.3 m in overall length, has been designed for a visible camera system with a semi-diagonal field of view as wide as 30° and an effective focal length as short as 5.57 mm. The design performance of the periscope includes a modulation transfer function greater than 0.25 at 68 cycles/mm with low distortion. The installed periscope system has confirmed image qualities as designed and comparable to those from its predecessor, but with a far lower probability of neutron damage to the camera.
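As a rough inference (not stated in the abstract), the 30° semi-diagonal field and the 5.57 mm effective focal length imply the semi-diagonal image height at the detector, and the 68 cycles/mm MTF specification corresponds to the Nyquist limit of roughly 7 µm pixels, if that frequency was indeed chosen as the detector Nyquist frequency; both a rectilinear (low-distortion) mapping and that Nyquist choice are assumptions here.

```python
# Hedged inference: image semi-diagonal h = f * tan(theta) for a rectilinear,
# low-distortion mapping, and the pixel pitch whose Nyquist limit is 68 cycles/mm.
import math

f_mm, semi_fov_deg, mtf_freq_cyc_per_mm = 5.57, 30.0, 68.0
semi_diag_mm = f_mm * math.tan(math.radians(semi_fov_deg))
pitch_um = 1.0 / (2.0 * mtf_freq_cyc_per_mm) * 1e3
print(f"image semi-diagonal ~ {semi_diag_mm:.2f} mm, "
      f"pixel pitch at Nyquist ~ {pitch_um:.1f} um")
```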
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele
2011-06-01
Recent advances in micro-optical element fabrication using gray scale technology have opened up the opportunity to create simultaneous multi-spectral imaging with fine-structure diffractive lenses. This paper will discuss an approach that uses diffractive optical lenses configured in an array (lenslet array) and placed in close proximity to the focal plane array, which enables a small, compact, simultaneous multispectral imaging camera [1]. The lenslet array is designed so that all lenslets have a common focal length, with each lenslet tuned for a different wavelength. The number of simultaneous spectral images is determined by the number of individually configured lenslets in the array. The number of spectral images can be increased by a factor of 2 when using it with a dual-band focal plane array (MWIR/LWIR) by exploiting multiple diffraction orders. In addition, modulation of the focal length of the lenslet array with piezoelectric actuation will enable spectral bin fill-in, allowing additional spectral coverage while giving up simultaneity. Different lenslet array spectral imaging concept designs are presented in this paper, along with a unique concept for prefiltering the radiation focused on the detector. This approach to spectral imaging has applications in the detection of chemical agents in both aerosolized form and as a liquid on a surface. It can also be applied to the detection of weaponized biological agents and IEDs in various forms, from manufacturing to deployment, and to post-detection forensic analysis.
Wide field NEO survey 1.0-m telescope with 10 2k×4k mosaic CCD camera
NASA Astrophysics Data System (ADS)
Isobe, Syuzo; Asami, Atsuo; Asher, David J.; Hashimoto, Toshiyasu; Nakano, Shi-ichi; Nishiyama, Kota; Ohshima, Yoshiaki; Terazono, Junya; Umehara, Hiroaki; Yoshikawa, Makoto
2002-12-01
We developed a new 1.0 m telescope with a 3-degree flat focal plane to which a mosaic CCD camera with ten 2k×4k chips is fixed. The system was set up in February 2002 and is now undergoing final fine adjustments. Since the telescope has a focal length of 3 m, a field of 7.5 square degrees is covered in one image. In good seeing conditions (1.5 arc seconds) at the site, located in Bisei town, Okayama prefecture, Japan, we can expect to detect stars down to 20th magnitude with an exposure time of 60 seconds. Considering the 46-second read-out time of the CCD camera, one image is taken every two minutes, and about 2,100 square degrees of field are expected to be covered in one clear night. This system is very effective for survey work, especially for Near-Earth-Asteroid detection.
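As a rough consistency check (not from the paper), the 60 s exposure plus 46 s readout gives roughly the quoted 2-minute cadence, and an assumed ~9.3 h of usable dark time then covers about the 2,100 square degrees per night quoted above; the dark-time figure is an assumption.

```python
# Hedged consistency check: cadence from exposure + readout, nightly sky coverage
# from an assumed ~9.3 h of usable dark time and the quoted 7.5 sq deg field.
cadence_min = (60.0 + 46.0) / 60.0      # ~1.8 min, quoted as "every two minutes"
dark_hours = 9.3                         # assumed usable dark time, not from the paper
frames = dark_hours * 60.0 / 2.0         # using the quoted 2-minute cadence
print(f"cadence ~{cadence_min:.1f} min; ~{frames:.0f} frames/night "
      f"-> ~{frames * 7.5:.0f} sq deg/night")
```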
Thermally tunable-focus lenticular lens using liquid crystal.
Heo, Kyong Chan; Yu, Seung Hun; Kwon, Jin Hyuk; Gwag, Jin Seog
2013-12-10
A thermally tunable focusing lenticular liquid crystal (LC) lens array was fabricated using a polymer LC component, including a polarizer that produces linearly polarized light. The focal length in the proposed structure could be tuned by a temperature-adjusting voltage applied to a transparent heater in the lenticular LC lens cell, because it alters the birefringence of the LC and varies the difference in refractive index between the LC and the polymer. The results showed that the focal length of the E7 LC used varied continuously from 5.6 to 8.7 mm as the temperature increased from 25°C to 54°C. The proposed lenticular LC lens has potential use in photonic devices such as biological imaging, phone cameras, and optical sensors.
Optics Near the Snell Angle in a Water-to-Air Change of Medium
2007-01-01
the seawater wedge at the focus of a notional 57.3-mm lens modeled in ZEMAX® [5]. The boxes are plotted in units of µm, and lens focal length is...lenses had insufficient focal-plane coverage. The ZEMAX spot diagram of this layout is depicted in Fig. 4. It is corrected for the horizon angle...the Fig. 9 ZEMAX layout. It is a two-prism design, but only one prism need be built and carried within the camera, with the forward prism being the
Spatial calibration of an optical see-through head mounted display
Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew
2010-01-01
We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125
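The photogrammetric machinery this method builds on can be exercised with standard camera-calibration routines. The snippet below is a generic, self-contained sketch using OpenCV on synthetic data (the grid, poses and focal length are invented for illustration); it is not the authors' HMD pipeline, only the underlying intrinsic-calibration step.

    # Generic camera calibration sketch: recover focal length and optic centre from
    # correspondences between known 3-D points and their image projections.
    import numpy as np
    import cv2

    # Synthetic planar target seen from several poses; intrinsics recovered afterwards.
    grid = np.mgrid[0:6, 0:9].T.reshape(-1, 2).astype(np.float32)
    obj = np.hstack([grid, np.zeros((grid.shape[0], 1), np.float32)])   # Z = 0 plane
    K_true = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    rng = np.random.default_rng(0)
    obj_pts, img_pts = [], []
    for i in range(6):
        rvec = rng.uniform(-0.3, 0.3, 3)                              # varied orientations
        tvec = np.array([-obj[:, 0].mean(), -obj[:, 1].mean(), 20.0 + i])
        proj, _ = cv2.projectPoints(obj, rvec, tvec, K_true, None)
        obj_pts.append(obj)
        img_pts.append(proj.astype(np.float32))
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, (640, 480), None, None)
    print(rms, K[0, 0], K[1, 1], K[0, 2], K[1, 2])    # re-projection RMS, focal lengths, optic centre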
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Rick Wetherington checks out one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen works on the recently acquired Contraves-Goerz Kineto Tracking Mount (KTM). Trailer-mounted with a center console/seat and electric drive tracking mount, the KTM includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff. There are 10 KTMs certified for use on the Eastern Range.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen checks out one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
NASA Technical Reports Server (NTRS)
Lane, Marc; Hsieh, Cheng; Adams, Lloyd
1989-01-01
In undertaking the design of a 2000-mm focal length camera for the Mariner Mark II series of spacecraft, JPL sought novel materials with the requisite dimensional and thermal stability, outgassing and corrosion resistance, low mass, high stiffness, and moderate cost. Metal-matrix composites and Al-Li alloys have, in addition to excellent mechanical properties and low density, a suitably low coefficient of thermal expansion, high specific stiffness, and good electrical conductivity. The greatest single obstacle to application of these materials to camera structure design is noted to have been the lack of information regarding long-term dimensional stability.
NASA Astrophysics Data System (ADS)
Fuh, Yiin-Kuen; Lai, Zheng-Hong
2017-02-01
A fast processing route for an aspheric polydimethylsiloxane (PDMS) lens array (APLA) is proposed, based on the combined effect of inverted gravitational and heat-assisted forces. The fabrication time can be dramatically reduced to 30 s, which compares favorably with the traditional duration of 2 hours of repeated addition-curing cycles. In this approach, a low-cost flexible lens is fabricated by repeatedly depositing, inverting and curing a hanging transparent PDMS elastomer droplet on a previously deposited curved structure. Complex structures with aspheric features and various focal lengths can be produced, and the four fabricated types of APLA have focal lengths of 7.03 mm, 6.00 mm, 5.33 mm, and 4.43 mm, respectively. Empirically, a direct relationship between the PDMS volume and the focal length of the lenses can be deduced. Using the fabricated APLA, an ordinary commercial smartphone camera can easily be transformed into a low-cost, portable digital microscope (50× magnification), so that point-of-care diagnostics can be implemented pervasively.
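As a rough plausibility check on the quoted ~50× figure, the conventional magnifying power of a simple lens relative to the 250 mm near point is M ~ 250 mm / f. This is a textbook approximation; the authors' magnification may be defined differently (e.g. including the phone's own digital zoom).

    # Approximate magnifying power M ~ 250 mm / f for the reported focal lengths.
    for f_mm in (7.03, 6.00, 5.33, 4.43):
        print(f_mm, round(250.0 / f_mm, 1))   # ~36x, 42x, 47x, 56x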
NASA Astrophysics Data System (ADS)
Lin, Han; Baoqi, Mao; Wen, Sun; Weimin, Shen
2016-10-01
Since Skybox's success, there has been a race to develop spaceborne high-resolution video cameras. For low manufacturing cost and adaptation to micro- and small satellites, there is a pressing need for compact, long-focal-length optical systems that combine small volume, light weight and easy implementation with a two-dimensional field of view. Our focus is on the Coaxial Three-Mirror Anastigmat (CTMA) with an intermediate real image, chosen because it needs no outer hood, is compact, and offers easy alignment, low-order aspheric surfaces and low cost. The means to deflect its image-space beam for accessibility of the focal plane array detector, and to eliminate the secondary obscuration caused by the primary mirror's central hole and the deflecting flat mirror, are discussed. The conditions that satisfy these requirements are presented through a derived relationship among the optical and structural parameters, based on Gaussian optics and geometry. One flat mirror near the exit pupil can be used to deflect the image plane away from the axis, and the total length can be further decreased with additional flat mirrors. A method for determining the initial structure with the derived formulae is described through a design example. Furthermore, an optimized CTMA without secondary obscuration and with an effective focal length (EFFL) of 10 m is reported. Its full field, F-number and total length are 1.1°×1°, F/14.3, and one eighth of its EFFL, respectively, and its imaging quality is near the diffraction limit.
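A quick back-of-the-envelope check of the reported example, using only the quantities quoted above (the resulting aperture and package length are implied values, not figures stated by the authors):

    # Implied aperture and package length from D = f / (F/#) and L = f / 8.
    f_effl = 10_000.0                  # effective focal length, mm
    f_number = 14.3
    aperture = f_effl / f_number       # ~699 mm entrance pupil diameter
    total_length = f_effl / 8.0        # ~1250 mm, "one eighth of its EFFL"
    print(round(aperture), round(total_length))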
NASA Technical Reports Server (NTRS)
1978-01-01
The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.
Note: Simple hysteresis parameter inspector for camera module with liquid lens
NASA Astrophysics Data System (ADS)
Chen, Po-Jui; Liao, Tai-Shan; Hwang, Chi-Hung
2010-05-01
A method to inspect a hysteresis parameter is presented in this article. The hysteresis of the whole camera module with a liquid lens can be measured, rather than that of a single lens only. Because the variation in focal length influences image quality, we propose using the sharpness of images captured from the camera module for hysteresis evaluation. Experiments reveal that the profile of the sharpness hysteresis corresponds to the contact-angle characteristic of the liquid lens. Therefore, it can be inferred that the hysteresis of the camera module is induced by the contact angle of the liquid lens. An inspection process takes only 20 s to complete. Thus, compared with other instruments, this inspection method is more suitable for integration into mass production lines for online quality assurance.
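A minimal sketch of such a sharpness-based hysteresis sweep is given below. The variance-of-Laplacian metric, the `camera` object (e.g. a cv2.VideoCapture of the module) and the `set_lens_voltage` driver are assumptions for illustration, not the paper's actual inspector.

    import cv2
    import numpy as np

    def sharpness(frame_gray):
        # A common focus metric: variance of the Laplacian response.
        return cv2.Laplacian(frame_gray, cv2.CV_64F).var()

    def sweep(camera, set_lens_voltage, voltages):
        curve = []
        for v in voltages:
            set_lens_voltage(v)                          # drive the liquid lens (placeholder)
            ok, frame = camera.read()                    # grab a frame from the module
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            curve.append((v, sharpness(gray)))
        return curve

    # Hysteresis = difference between the up-sweep and down-sweep sharpness curves:
    # up   = sweep(cam, set_v, np.linspace(v_min, v_max, 50))
    # down = sweep(cam, set_v, np.linspace(v_max, v_min, 50))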
An Unusual View: MISR sees the Moon
2017-08-17
The job of the Multiangle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite is to view Earth. For more than 17 years, its nine cameras have stared downward 24 hours a day, faithfully collecting images used to study Earth's surface and atmosphere. On August 5, however, MISR captured some very unusual data as the Terra satellite performed a backflip in space. This maneuver was performed to allow MISR and the other instruments on Terra to catch a glimpse of the Moon, something that has been done only once before, in 2003. Why task an elderly satellite with such a radical maneuver? Since we can be confident that the Moon's brightness has remained very constant over the mission, MISR's images of the Moon can be used as a check of the instrument's calibration, allowing an independent verification of the procedures used to correct the images for any changes the cameras have experienced over their many years in space. If changes in the cameras' responses to light aren't properly accounted for, the images captured by MISR would make it appear as if Earth were growing darker or lighter, which would throw off scientists' efforts to characterize air pollution, cloud cover and Earth's climate. Because of this, the MISR team uses several methods to calibrate the data, all of which involve imaging something with a known (or independently measured) brightness and correcting the images to match that brightness. Every month, MISR views two panels of a special material called Spectralon, which reflects sunlight in a very particular way, onboard the instrument. Periodically, this calibration is checked by a field team who measures the brightness of a flat, uniformly colored surface on Earth, usually a dry desert lakebed, as MISR flies overhead. The lunar maneuver offers a third opportunity to check the brightness calibration of MISR's images. While viewing Earth, MISR's cameras are fixed at nine different angles, with one (called An) pointed straight down, four canted forwards (Af, Bf, Cf, and Df) and four angled backwards (Aa, Ba, Ca, and Da). The A, B, C, and D cameras have different focal lengths, with the most oblique (D) cameras having the longest focal lengths in order to preserve spatial resolution on the ground. During the lunar maneuver, however, the spacecraft rotated so that each camera saw the almost-full Moon straight on. This means that the different focal lengths produce images with different resolutions. The D cameras produce the sharpest images. These grayscale images were made with raw data from the red spectral band of each camera. Because the spacecraft is constantly rotating while these images were taken, the images are "smeared" in the vertical direction, producing an oval-shaped Moon. These have been corrected to restore the Moon to its true circular shape. https://photojournal.jpl.nasa.gov/catalog/PIA21876
Concept of electro-optical sensor module for sniper detection system
NASA Astrophysics Data System (ADS)
Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz
2010-10-01
The paper presents an initial concept of the electro-optical sensor unit for sniper detection purposes. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is a multi-sensor sniper and shot detection system. As part of a larger system, it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data and sensor fusion techniques. Additionally, it is expected to provide some pre-shot detection capabilities. Generally, acoustic (or radar) systems used for shot detection offer only "after-the-shot" information and cannot prevent an enemy attack, which in the case of a skilled sniper usually means trouble. The passive imaging sensors presented in this paper, together with active systems detecting pointed optics, are capable of detecting specific shooter signatures or at least the presence of suspicious objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters, such as focal plane array size and type, focal length and aperture, were chosen on the basis of the assumed tactical characteristics of the system (mainly detection range) and the current technology level. In order to provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image acquisition rate. The daylight camera operates as a support, providing a corresponding visual image that is easier for a human operator to comprehend. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot recording sequences are presented.
Optimization design of periscope type 3X zoom lens design for a five megapixel cellphone camera
NASA Astrophysics Data System (ADS)
Sun, Wen-Shing; Tien, Chuen-Lin; Pan, Jui-Wen; Chao, Yu-Hao; Chu, Pu-Yi
2016-11-01
This paper presents a periscope-type 3X zoom lens design for a five-megapixel cellphone camera. The optical configuration uses a right-angle prism in front of the zoom lens to fold the optical path by 90°, resulting in a zoom lens depth of 6 mm, so the zoom lens can be embedded in a mobile phone with a thickness of 6 mm. The zoom lens has three groups with six elements. The half field of view varies from 30° to 10.89°, the effective focal length is adjusted from 3.142 mm to 9.426 mm, and the F-number changes from 2.8 to 5.13.
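The quoted numbers are mutually consistent under simple pinhole geometry (a check on the abstract's figures, not the authors' derivation): with a fixed sensor half-diagonal h, tan(half field of view) = h / f.

    # Zoom ratio and telephoto half field of view implied by the quoted focal lengths.
    import math
    f_wide, f_tele = 3.142, 9.426             # mm, from the abstract
    h = f_wide * math.tan(math.radians(30.0))  # implied sensor half-diagonal ~1.81 mm
    hfov_tele = math.degrees(math.atan(h / f_tele))
    print(f_tele / f_wide, hfov_tele)          # ~3.0x zoom, ~10.9 degrees (abstract: 10.89)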
Mapping Sequence performed during the STS-118 R-Bar Pitch Maneuver
2007-08-10
ISS015-E-21335 (10 Aug. 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Endeavour as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image is an almost nadir perspective over Endeavour's aft cabin and its docking system. Distance between the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11354 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows a view of the underside of the nose/nosecap and the forward landing gear doors. Distance between the station and shuttle at this time was approximately 600 feet.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen makes adjustments on one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with a center console/seat and electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operators Rick Worthington (left) and Kenny Allen work on one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with a center console/seat and electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen stands in the center console area of one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric-drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Rick Wetherington sits in the center console seat of one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operators Rick Wetherington (left) and Kenny Allen work on two of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with a center console/seat and electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.
Song, Kai-Tai; Tai, Jen-Chao
2006-10-01
Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range, which makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and of the vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Experimental results are presented to validate the robustness and accuracy of the proposed method.
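For readers unfamiliar with this class of method, the sketch below shows one common closed form for recovering focal length, tilt and pan from the vanishing points of the lane direction and of a perpendicular ground direction. It assumes the principal point is at the image centre, square pixels, zero roll and an image y-axis pointing down; it is a generic illustration, not necessarily the paper's exact derivation, which additionally uses the known lane width to fix the metric scale.

    import math

    def calibrate_from_vanishing_points(vp_road, vp_perp):
        """vp_road: vanishing point of the lane direction (pixel coords centred on the
           principal point); vp_perp: vanishing point of the perpendicular ground direction."""
        u1, v1 = vp_road
        u2, v2 = vp_perp
        f = math.sqrt(-(u1 * u2 + v1 * v2))        # orthogonality of the two ground directions
        tilt = math.atan2(-v1, f)                  # downward tilt of the optical axis
        pan = math.atan2(u1 * math.cos(tilt), f)   # azimuth of the road w.r.t. the optical axis
        return f, tilt, pan

    # Example with made-up vanishing points (centred pixel coordinates):
    print(calibrate_from_vanishing_points((120.0, -310.0), (-2400.0, -305.0)))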
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11351 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of Atlantis' underside thermal protection system and part of the port side cabin, including the hatch, as well as a section of the open payload bay cover. Distance between the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11320 (10 June 2007) --- This is one of a series of images, photographed with a digital still camera using an 800mm focal length, featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of Atlantis' cabin and its docking system, which a short time later was involved in linking up with the orbital outpost. Distance between the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-118 R-Bar Pitch Maneuver
2007-08-10
ISS015-E-21340 (10 Aug. 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Endeavour as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of the commander's side or port side of Endeavour's cabin, including the hatch, as well as a section of the open payload bay cover. Distance between the station and shuttle at this time was approximately 600 feet.
The development of large-aperture test system of infrared camera and visible CCD camera
NASA Astrophysics Data System (ADS)
Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying
2015-10-01
Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many kinds of equipment and applications. If such a system is tested with a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber and computer, which reduces the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position with changing environmental temperature, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, and it is expected to do well in the market.
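As an illustration of the multiple-frame averaging step (a generic sketch, not the system's actual code): for independent, zero-mean random noise, averaging N frames reduces the noise standard deviation by a factor of sqrt(N).

    import numpy as np

    def average_frames(frames):
        """frames: iterable of 2-D arrays from the frame grabber (same shape)."""
        stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
        return stack.mean(axis=0)

    # Quick check with synthetic noise: sigma drops from ~2.0 to ~0.5 for N = 16.
    rng = np.random.default_rng(1)
    frames = [100.0 + rng.normal(0.0, 2.0, (256, 256)) for _ in range(16)]
    print(np.std(frames[0]), np.std(average_frames(frames)))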
NASA Technical Reports Server (NTRS)
Graff, Paige Valderrama; Baker, Marshalyn (Editor); Graff, Trevor (Editor); Lindgren, Charlie (Editor); Mailhot, Michele (Editor); McCollum, Tim (Editor); Runco, Susan (Editor); Stefanov, William (Editor); Willis, Kim (Editor)
2010-01-01
Scientists from the Image Science and Analysis Laboratory (ISAL) at NASA's Johnson Space Center (JSC) work with astronauts onboard the International Space Station (ISS) who take images of Earth. Astronaut photographs, sometimes referred to as Crew Earth Observations, are taken using hand-held digital cameras onboard the ISS. These digital images allow scientists to study our Earth from the unique perspective of space. Astronauts have taken images of Earth since the 1960s. There is a database of over 900,000 astronaut photographs available at http://eol.jsc.nasa.gov. Images are requested by ISAL scientists at JSC, and astronauts in space personally frame and acquire them from the Destiny Laboratory or other windows in the ISS. By having astronauts take images, they can specifically frame them according to a given request and need. For example, they can choose to use different lenses to vary the amount of area (field of view) an image will cover. Images can be taken at different times of the day, which allows different lighting conditions to bring out or highlight certain features. The viewing angle at which an image is acquired can also be varied to show the same area from different perspectives. Pointing the camera straight down gives you a nadir shot. Pointing the camera at an angle to get a view across an area would be considered an oblique shot. Being able to change these variables makes astronaut photographs a unique and useful data set. Astronaut photographs are taken from the ISS from altitudes of 300-400 km (185 to 250 miles). One of the current cameras being used, the Nikon D3X digital camera, can take images using a 50, 100, 250, 400 or 800mm lens. These different lenses allow for a wider or narrower field of view. The higher the focal length (800mm, for example), the narrower the field of view (less area will be covered). Higher focal lengths also show greater detail of the area on the surface being imaged. There are four major systems or spheres of Earth: the atmosphere, biosphere, hydrosphere, and litho/geosphere.
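For a rough sense of scale, the ground footprint of a frame scales as altitude × sensor size / focal length. The sketch below assumes the D3X's roughly 36 × 24 mm full-frame sensor and a 400 km altitude, purely for illustration; these are not figures from the source.

    # Approximate ground footprint of one frame for an 800 mm lens from ISS altitude.
    altitude_km = 400.0
    focal_mm = 800.0
    sensor_mm = (36.0, 24.0)                                # assumed full-frame sensor
    footprint_km = tuple(altitude_km * s / focal_mm for s in sensor_mm)
    print(footprint_km)   # ~(18, 12) km, versus ~(288, 192) km with a 50 mm lens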
2015-01-16
beamsplitters. A longpass dichroic beamsplitter (Semrock, FF347-Di01-50.4x71.2) reflects the ultraviolet hydroxyl radical (OH) chemiluminescence, an...and an 80 nm FWHM bandpass filter centered at 300 nm (Semrock, FF01-300/80-25) fitted to a UV F/4.5 105 mm focal length Nikkor lens. The camera is
Tunable liquid microlens array driven by pyroelectric effect: full interferometric characterization
NASA Astrophysics Data System (ADS)
Miccio, Lisa; Grilli, Simonetta; Vespini, Veronica; Ferraro, Pietro
2008-09-01
Liquid lenses with adjustable focal length are of great interest in the field of microfluidic devices. They are usually realized by the electrowetting effect after electrode patterning on a hydrophobic substrate. Applications are possible in many fields, ranging from commercial products such as digital cameras to biological cell sorting. We realized an open array of liquid lenses with adjustable focal length without electrode patterning. We used a z-cut lithium niobate (LN) crystal as the substrate and a few microliters of an oily substance to obtain the droplet array. The spontaneous polarization of LN crystals is reversed by the electric-field poling process, thus enabling the realization of periodically poled LN (PPLN) crystals. The substrate consists of a two-dimensional square array of reversed domains with a period of around 200 μm; each domain presents a hexagonal geometry due to the crystal structure. The PPLN is first covered by a thin, homogeneous layer of the liquid, and its temperature is then changed by means of a digitally controlled hot plate. During the heating and cooling process the liquid layer rearranges until it reaches its final topography. Lens formation is due to the change in surface tension at the liquid-solid interface driven by the pyroelectric effect. This effect makes it possible to create a two-dimensional lens pattern of tunable focal length without electrodes. The temporal evolution of both the shape and the focal length of the lenses is quantitatively measured by digital holographic microscopy. Array imaging properties and a quantitative analysis of the lens features and aberrations are presented.
Stereo depth distortions in teleoperation
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Vonsydow, Marika
1988-01-01
In teleoperation, a typical application of stereo vision is to view a work space located short distances (1 to 3m) in front of the cameras. The work presented here treats converged camera placement and studies the effects of intercamera distance, camera-to-object viewing distance, and focal length of the camera lenses on both stereo depth resolution and stereo depth distortion. While viewing the fronto-parallel plane 1.4 m in front of the cameras, depth errors are measured on the order of 2cm. A geometric analysis was made of the distortion of the fronto-parallel plane of divergence for stereo TV viewing. The results of the analysis were then verified experimentally. The objective was to determine the optimal camera configuration which gave high stereo depth resolution while minimizing stereo depth distortion. It is found that for converged cameras at a fixed camera-to-object viewing distance, larger intercamera distances allow higher depth resolutions, but cause greater depth distortions. Thus with larger intercamera distances, operators will make greater depth errors (because of the greater distortions), but will be more certain that they are not errors (because of the higher resolution).
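A commonly used small-angle approximation relates these quantities: a disparity measurement error Δd maps to a depth error ΔZ ≈ Z²·Δd / (b·f), where b is the intercamera distance and f the focal length. This is a generic scaling relation, not the paper's exact converged-camera analysis, and the numbers below are illustrative rather than measured values from the experiment.

    def depth_resolution(Z, baseline, focal, disparity_err):
        # All quantities in metres; disparity_err is the error on the sensor plane.
        return Z**2 * disparity_err / (baseline * focal)

    # e.g. Z = 1.4 m, 200 mm intercamera distance, 25 mm lenses, 10 um disparity error
    print(depth_resolution(1.4, 0.200, 0.025, 10e-6))   # ~3.9 mm depth uncertainty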
Acquisition of 3d Information for Vanished Structure by Using Only AN Ancient Picture
NASA Astrophysics Data System (ADS)
Kunii, Y.; Sakamoto, R.
2016-06-01
In order to acquire 3D information for the reconstruction of a vanished historical structure, we attempted to recover the 3D shape of such a structure from an ancient picture. Generally, 3D information about a structure is acquired by photogrammetric theory, which requires two or more pictures. This paper shows that the geometrical information of the structure, and hence its 3D information, can be obtained from a single ancient picture. The method was applied to an ancient picture of the Old Imperial Theatre. The theatre in the picture is rendered in two-point perspective. Therefore, estimates of the camera focal length, the distance from the camera to the Old Imperial Theatre and other parameters were calculated by estimating the field angle, using human body height as a length reference together with some geometrical information. Consequently, the 3D coordinates of 120 measurement points on the surface of the Old Imperial Theatre were calculated, and 3D CG modeling of the Old Imperial Theatre was realized.
Smart lens: tunable liquid lens for laser tracking
NASA Astrophysics Data System (ADS)
Lin, Fan-Yi; Chu, Li-Yu; Juan, Yu-Shan; Pan, Sih-Ting; Fan, Shih-Kang
2007-05-01
A tracking system utilizing a tunable liquid lens is proposed and demonstrated. Adapting the concept of EWOD (electrowetting-on-dielectric), the curvature of a droplet on a dielectric film can be controlled by varying the applied voltage. When the droplet is used as an optical lens, the focal length of this adaptive liquid lens can be adjusted as desired, so the light passing through it can be focused to different positions in space. In this paper, the tuning range of the curvature and focal length of the tunable liquid lens is investigated. The droplet transformation is observed and analyzed with a CCD camera. A tracking system combining the tunable liquid lens with a laser detection system is also proposed. With a feedback circuit that maximizes the returned signal by controlling the tunable lens, the laser beam can be kept locked on a distant reflective target while it is moving.
Brown, David M; Juarez, Juan C; Brown, Andrea M
2013-12-01
A laser differential image-motion monitor (DIMM) system was designed and constructed as part of a turbulence characterization suite during the DARPA free-space optical experimental network experiment (FOENEX) program. The developed link measurement system measures the atmospheric coherence length (r0), atmospheric scintillation, and power in the bucket for the 1550 nm band. DIMM measurements are made with two separate apertures coupled to a single InGaAs camera. The angle of arrival (AoA) for the wavefront at each aperture can be calculated based on focal spot movements imaged by the camera. By utilizing a single camera for the simultaneous measurement of the focal spots, the correlation of the variance in the AoA allows a straightforward computation of r0 as in traditional DIMM systems. Standard measurements of scintillation and power in the bucket are made with the same apertures by redirecting a percentage of the incoming signals to InGaAs detectors integrated with logarithmic amplifiers for high sensitivity and high dynamic range. By leveraging two, small apertures, the instrument forms a small size and weight configuration for mounting to actively tracking laser communication terminals for characterizing link performance.
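The reduction from focal-spot motion to r0 typically follows the Sarazin-Roddier DIMM relations; the sketch below shows that standard form with illustrative numbers and is not necessarily the FOENEX instrument's exact pipeline. The angle of arrival per aperture is simply alpha = dx / f_eff, and r0 follows from the variance of the differential longitudinal motion of the two focal spots.

    def r0_longitudinal(var_alpha_l, wavelength, D, d):
        """var_alpha_l: variance of differential longitudinal AoA [rad^2];
           wavelength [m]; D: sub-aperture diameter [m]; d: aperture separation [m]."""
        k = 2.0 * wavelength**2 * (0.179 * D**(-1 / 3) - 0.0968 * d**(-1 / 3))
        return (k / var_alpha_l) ** (3.0 / 5.0)

    # e.g. 1550 nm, 5 cm apertures separated by 15 cm, 4e-11 rad^2 differential variance
    print(r0_longitudinal(4e-11, 1550e-9, 0.05, 0.15))   # ~0.14 m at 1550 nm (illustrative)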
NASA Astrophysics Data System (ADS)
Pomares, Jorge; Felicetti, Leonard; Pérez, Javier; Emami, M. Reza
2018-02-01
An image-based servo controller for the guidance of a spacecraft during non-cooperative rendezvous is presented in this paper. The controller directly utilizes the visual features from image frames of a target spacecraft for computing both attitude and orbital maneuvers concurrently. The utilization of adaptive optics, such as zooming cameras, is also addressed through developing an invariant-image servo controller. The controller allows for performing rendezvous maneuvers independently from the adjustments of the camera focal length, improving the performance and versatility of maneuvers. The stability of the proposed control scheme is proven analytically in the invariant space, and its viability is explored through numerical simulations.
Fixed-focus camera objective for small remote sensing satellites
NASA Astrophysics Data System (ADS)
Topaz, Jeremy M.; Braun, Ofer; Freiman, Dov
1993-09-01
An athermalized objective has been designed for a compact, lightweight push-broom camera which is under development at El-Op Ltd. for use in small remote-sensing satellites. The high performance objective has a fixed focus setting, but maintains focus passively over the full range of temperatures encountered in small satellites. The lens is an F/5.0, 320 mm focal length Tessar type, operating over the range 0.5-0.9 micrometers. It has a 16° field of view and accommodates various state-of-the-art silicon detector arrays. The design and performance of the objective is described in this paper.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11328 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of the commander's side or port side of Atlantis' cabin, including the hatch, as well as a section of the open payload bay cover and part of the docking system. Distance between the station and shuttle at this time was approximately 600 feet.
Optimal design of an earth observation optical system with dual spectral and high resolution
NASA Astrophysics Data System (ADS)
Yan, Pei-pei; Jiang, Kai; Liu, Kai; Duan, Jing; Shan, Qiusha
2017-02-01
With the increasing demand for high-resolution remote sensing images from both military and civilian users, countries around the world are pursuing ever higher resolution imagery, and designing an integrated visible/infrared optical system has important value for earth observation. Because a visible-band system cannot identify camouflage or operate at night, the visible camera should be paired with an infrared camera. An earth observation optical system with dual spectral bands and high resolution is designed. The paper mainly addresses the integrated design of the visible and infrared optical system, which makes the system lighter and smaller and gives one satellite two uses. The working wavebands of the system cover the visible and the middle infrared (3-5 um). Clear imaging in both wavebands is achieved with a dispersive RC system. The focal length of the visible system is 3056 mm with an F/# of 10.91, and the focal length of the middle infrared system is 1120 mm with an F/# of 4. In order to suppress middle-infrared thermal radiation and stray light, a re-imaging (second imaging) configuration is adopted and the narcissus phenomenon is analyzed. The system is structurally simple, and the requirements on modulation transfer function (MTF), spot size, energy concentration, distortion, etc. are all satisfied.
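A quick check of the quoted parameters shows that the two channels imply the same entrance aperture, which is consistent with an integrated (shared-aperture) design; the 280 mm value is inferred from the abstract's numbers, not stated by the authors.

    # Implied entrance pupil diameter D = f / (F/#) for each channel.
    vis_f, vis_fno = 3056.0, 10.91
    mwir_f, mwir_fno = 1120.0, 4.0
    print(vis_f / vis_fno, mwir_f / mwir_fno)   # ~280 mm in both cases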
NASA Astrophysics Data System (ADS)
Zhang, Rumin; Liu, Peng; Liu, Dijun; Su, Guobin
2015-12-01
In this paper, we establish a forward simulation model of a plenoptic camera, which is implemented by inserting a micro-lens array into a conventional camera. The simulation model is used to emulate how objects at different depths are imaged by the main lens, then remapped by the micro-lenses, and finally captured on the 2D sensor. We can easily modify the parameters of the simulation model, such as the focal lengths and diameters of the main lens and micro-lenses and the number of micro-lenses. Employing spatial integration, refocused images and all-in-focus images are rendered from the plenoptic images produced by the model. The forward simulation model can be used to determine the trade-offs between different configurations and to test new research related to plenoptic cameras without the need for a prototype.
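In the same first-order spirit, the forward mapping can be sketched with two applications of the thin-lens equation, once for the main lens and once for a micro-lens. All parameter values below are assumptions for illustration, not the paper's configuration, and the geometry is simplified to paraxial imaging distances only.

    def thin_lens_image_distance(f, s_obj):
        # Thin-lens equation 1/f = 1/s_obj + 1/s_img, distances positive.
        return 1.0 / (1.0 / f - 1.0 / s_obj)

    f_main, f_micro = 50.0, 0.5        # mm (assumed values)
    gap_main_to_mla = 55.0             # mm, main lens to micro-lens array (assumed)
    for z_obj in (1000.0, 2000.0, 5000.0):                    # object depths, mm
        s_img = thin_lens_image_distance(f_main, z_obj)        # intermediate image plane
        a = abs(gap_main_to_mla - s_img)                       # object distance for a micro-lens
        b = thin_lens_image_distance(f_micro, a)               # relay distance behind the array
        print(z_obj, round(s_img, 2), round(b, 3))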
NASA Astrophysics Data System (ADS)
Liu, L.; Huang, Zh.; Qiu, Zh.; Li, B.
2018-01-01
A handheld RGB camera was developed to monitor the in vivo distribution of porphyrin-based photosensitizer (PS) hematoporphyrin monomethyl ether (HMME) in blood vessels during photodynamic therapy (PDT). The focal length, f-number, International Standardization Organization (ISO) sensitivity, and shutter speed of the camera were optimized for the solution sample with various HMME concentrations. After the parameter optimization, it was found that the red intensity value of the fluorescence image was linearly related to the fluorescence intensity under investigated conditions. The RGB camera was then used to monitor the in vivo distribution of HMME in blood vessels in a skin-fold window chamber model. The red intensity value of the recorded RGB fluorescence image was found to be linearly correlated to HMME concentrations in the range 0-24 μM. Significant differences in the red to green intensity ratios were observed between the blood vessels and the surrounding tissue.
The multifocus plenoptic camera
NASA Astrophysics Data System (ADS)
Georgiev, Todor; Lumsdaine, Andrew
2012-01-01
The focused plenoptic camera is based on the Lippmann sensor: an array of microlenses focused on the pixels of a conventional image sensor. This device samples the radiance, or plenoptic function, as an array of cameras with large depth of field, focused at a certain plane in front of the microlenses. For the purpose of digital refocusing (which is one of the important applications), the depth of field needs to be large, but there are fundamental optical limitations to this. The solution to this problem is to use an array of interleaved microlenses of different focal lengths, focused at two or more different planes. In this way a focused image can be constructed at any depth of focus, and a very wide range of digital refocusing can be achieved. This paper presents our theory and the results of implementing such a camera. Real-world images demonstrate the extended capabilities, and limitations are discussed.
To catch a comet: Technical overview of CAN DO G-324
NASA Technical Reports Server (NTRS)
Obrien, T. J. (Editor)
1986-01-01
The primary objective of the C. E. Williams Middle School Get Away Special CAN DO is the photographing of Comet Halley. The project will involve middle school students, grades 6 through 8, in the study and interpretation of astronomical photographs and techniques. G-324 is contained in a 5 cubic foot GAS Canister with an opening door and pyrex window for photography. It will be pressurized with one atmosphere of dry nitrogen. Three 35mm still cameras with 250 exposure film backs and different focal length lenses will be fired by a combination of automatic timer and an active comet detector. A lightweight 35mm movie camera will shoot single exposures at about 1/2 minute intervals to give an overlapping skymap of the mission. The fifth camera is a solid state television camera specially constructed for detection of the comet by microprocessor.
Wrist Camera Orientation for Effective Telerobotic Orbital Replaceable Unit (ORU) Changeout
NASA Technical Reports Server (NTRS)
Jones, Sharon Monica; Aldridge, Hal A.; Vazquez, Sixto L.
1997-01-01
The Hydraulic Manipulator Testbed (HMTB) is the kinematic replica of the Flight Telerobotic Servicer (FTS). One use of the HMTB is to evaluate advanced control techniques for accomplishing robotic maintenance tasks on board the Space Station. Most maintenance tasks involve the direct manipulation of the robot by a human operator when high-quality visual feedback is important for precise control. An experiment was conducted in the Systems Integration Branch at the Langley Research Center to compare several configurations of the manipulator wrist camera for providing visual feedback during an Orbital Replaceable Unit changeout task. Several variables were considered such as wrist camera angle, camera focal length, target location, lighting. Each study participant performed the maintenance task by using eight combinations of the variables based on a Latin square design. The results of this experiment and conclusions based on data collected are presented.
Compact and high resolution virtual mouse using lens array and light sensor
NASA Astrophysics Data System (ADS)
Qin, Zong; Chang, Yu-Cheng; Su, Yu-Jie; Huang, Yi-Pai; Shieh, Han-Ping David
2016-06-01
A virtual mouse based on an IR source, a lens array and a light sensor was designed and implemented. The optical architecture, including the number of lenses, lens pitch, baseline length, sensor length, lens-sensor gap, focal length, etc., was carefully designed to achieve low detection error, high resolution and, simultaneously, compact system volume. The system volume is 3.1 mm (thickness) × 4.5 mm (length) × 2, which is much smaller than that of a camera-based device. A relative detection error of 0.41 mm and a minimum resolution of 26 ppi were verified in experiments, so that it can replace a conventional touchpad/touchscreen. If the system thickness is relaxed to 20 mm, a resolution higher than 200 ppi can be achieved, allowing it to replace a real mouse.
NASA Astrophysics Data System (ADS)
Thomas, N.
2016-12-01
The Sheath Transport Observer for the Redistribution of Mass (STORM) and the Cusp Plasma Imaging Detector (CuPID) instruments are soft X-ray cameras that utilize slumped micropore ('lobster-eye') optics. These lobster-eye optics, developed by the University of Leicester and the Photonis Corporation, provide wide field-of-view imaging of X-ray line emission produced via charge exchange between hydrogen in the Earth's exosphere and heavy ions in the solar wind. Both instruments have position-sensitive, chevron-configuration microchannel plate detectors in their respective focal planes. STORM possesses two 4 cm by 4 cm lobster-eye optics, each with a focal length of 37.5 cm. It flew as a piggyback payload on the Diffuse X-ray emission from the Local galaxy (DXL) sounding rocket mission, which was launched in December of 2012 from White Sands Missile Range, New Mexico. STORM operated successfully during this mission and represents the first use of lobster-eye optics in space. A future version of STORM, in high orbit, could image a significant portion of the magnetosheath to infer the locations of the magnetopause and the bow shock. CuPID is a 3U CubeSat variant of STORM that uses a single optic with a 27.5 cm focal length. A sounding-rocket-borne CuPID flew as a science payload with DXL from White Sands in December of 2015, with results forthcoming.
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.
2015-10-01
Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first shows the intra-camera geometry estimation that leads to an estimate of the tilt angle, focal length and camera height, which is important for the conversion from pixels to meters and vice versa. The second component shows the inter-camera topology inference that leads to an estimate of the distance between cameras, which is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
NASA Technical Reports Server (NTRS)
Diner, Daniel B. (Inventor)
1989-01-01
A method and apparatus is developed for obtaining a stereo image with reduced depth distortion and optimum depth resolution. Static and dynamic depth distortion and depth resolution tradeoff is provided. Cameras obtaining the images for a stereo view are converged at a convergence point behind the object to be presented in the image, and the collection-surface-to-object distance, the camera separation distance, and the focal lengths of zoom lenses for the cameras are all increased. Doubling the distances cuts the static depth distortion in half while maintaining image size and depth resolution. Dynamic depth distortion is minimized by panning a stereo view-collecting camera system about a circle which passes through the convergence point and the camera's first nodal points. Horizontal field shifting of the television fields on a television monitor brings both the monitor and the stereo views within the viewer's limit of binocular fusion.
NASA Astrophysics Data System (ADS)
Aguilar, J. A.; Basili, A.; Boccone, V.; Cadoux, F.; Christov, A.; della Volpe, D.; Montaruli, T.; Płatos, Ł.; Rameez, M.
2015-01-01
The focal-plane cameras of γ-ray telescopes frequently use light concentrators in front of the light sensors. The purpose of these concentrators is to increase the effective area of the camera as well as to reduce the stray light coming at large incident angles. These light concentrators are usually based on the Winston cone design. In this contribution we present the design of a hexagonal hollow light concentrator with a lateral profile optimized using a cubic Bézier function to achieve a higher collection efficiency in the angular region of interest. The design presented here is optimized for a Davies-Cotton telescope with a primary mirror of about 4 m in diameter and a focal length of 5.6 m. The described concentrators are part of an innovative camera made up of silicon-photomultiplier sensors, although a similar approach can be used for other sizes of single-mirror telescopes with different camera sensors, including photomultipliers. The challenge of our approach is to achieve a cost-effective design suitable for standard industrial production of both the plastic concentrator substrate and the reflective coating. At the same time we maximize the optical performance. In this paper we also describe the optical set-up to measure the absolute collection efficiency of the light concentrators and demonstrate our good understanding of the measured data using a professional ray-tracing simulation.
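For context on how much concentration such a design can aim for, non-imaging optics sets an upper bound of 1/sin²(θ_max) for an ideal 3-D concentrator with acceptance half-angle θ_max. The estimate below uses the edge-ray angle implied by the quoted 4 m aperture and 5.6 m focal length, ignoring obscuration and the exact pixel geometry; it is a generic étendue limit, not a figure from the paper.

    import math
    mirror_diameter, focal_length = 4.0, 5.6                        # m, from the abstract
    theta_max = math.atan(0.5 * mirror_diameter / focal_length)     # ~19.7 degree edge-ray angle
    print(math.degrees(theta_max), 1.0 / math.sin(theta_max) ** 2)  # concentration limit ~8.8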
Simon Newcomb, America’s First Great Astronomer
2009-02-01
1874 and 1882 transits of Venus across the Sun. A heliostat tracked the Sun and reflected its light through a fixed telescope, where the image was...a new and unique camera consisting of a heliostat, long-focal-length telescope, and photographic plate assembly [5] (see figures 2 and 3). While the...and relays or solenoids qualified as leading- [Figure labels: remote mirror, objective lenses, rotating mirror, fixed mirror, observer's eyepiece, adjustable slit, heliostat.]
Sky camera geometric calibration using solar observations
Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan
2016-09-05
A camera model and associated automated calibration procedure for stationary daytime sky imaging cameras is presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. In conclusion, calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
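For reference, the equisolid-angle projection mentioned above maps a ray at zenith angle θ from the optical axis to a radial image distance r = 2f·sin(θ/2). This is the generic textbook form; the paper's full camera model also handles the principal point, lens distortion and camera orientation.

    import math
    def equisolid_radius(theta_deg, focal):
        return 2.0 * focal * math.sin(math.radians(theta_deg) / 2.0)
    print(equisolid_radius(90.0, 1.0))   # a 180-degree fisheye fills a circle of radius ~1.41 f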
Geometric calibration of Colour and Stereo Surface Imaging System of ESA's Trace Gas Orbiter
NASA Astrophysics Data System (ADS)
Tulyakov, Stepan; Ivanov, Anton; Thomas, Nicolas; Roloff, Victoria; Pommerol, Antoine; Cremonese, Gabriele; Weigel, Thomas; Fleuret, Francois
2018-01-01
There are many geometric calibration methods for "standard" cameras. These methods, however, cannot be used for the calibration of telescopes with large focal lengths and complex off-axis optics. Moreover, specialized calibration methods for telescopes are scarce in the literature. We describe the calibration method that we developed for the Colour and Stereo Surface Imaging System (CaSSIS) telescope, on board the ExoMars Trace Gas Orbiter (TGO). Although our method is described in the context of CaSSIS, with camera-specific experiments, it is general and can be applied to other telescopes. We further encourage re-use of the proposed method by making our calibration code and data available on-line.
Warren, Sean C; Kim, Youngchan; Stone, James M; Mitchell, Claire; Knight, Jonathan C; Neil, Mark A A; Paterson, Carl; French, Paul M W; Dunsby, Chris
2016-09-19
This paper demonstrates multiphoton excited fluorescence imaging through a polarisation maintaining multicore fiber (PM-MCF) while the fiber is dynamically deformed using all-proximal detection. Single-shot proximal measurement of the relative optical path lengths of all the cores of the PM-MCF in double pass is achieved using a Mach-Zehnder interferometer read out by a scientific CMOS camera operating at 416 Hz. A non-linear least squares fitting procedure is then employed to determine the deformation-induced lateral shift of the excitation spot at the distal tip of the PM-MCF. An experimental validation of this approach is presented that compares the proximally measured deformation-induced lateral shift in focal spot position to an independent distally measured ground truth. The proximal measurement of deformation-induced shift in focal spot position is applied to correct for deformation-induced shifts in focal spot position during raster-scanning multiphoton excited fluorescence imaging.
Image quality testing of assembled IR camera modules
NASA Astrophysics Data System (ADS)
Winters, Daniel; Erichsen, Patrik
2013-10-01
Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are becoming more and more of a mass-market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirement for the imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods such as minimum resolvable temperature difference (MRTD), which gives only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters such as best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, the suitability for fully automated measurements in mass production.
The imaging system design of three-line LMCCD mapping camera
NASA Astrophysics Data System (ADS)
Zhou, Huai-de; Liu, Jin-Guo; Wu, Xing-Xing; Lv, Shi-Liang; Zhao, Ying; Yu, Da
2011-08-01
In this paper, the authors first introduce the theory of the LMCCD (line-matrix CCD) mapping camera and the composition of its imaging system. Several pivotal designs of the imaging system are then described, including the design of the focal plane module, the video signal processing, the controller of the imaging system, and the synchronous photography of the forward, nadir and backward cameras and of the line-matrix CCD of the nadir camera. Finally, test results of the LMCCD mapping camera imaging system are given. The results are as follows: the precision of synchronous photography between the forward, nadir and backward cameras is better than 4 ns, as is that of the line-matrix CCD of the nadir camera; the photography interval of the line-matrix CCD of the nadir camera satisfies the buffer requirements of the LMCCD focal plane module; the SNR tested in the laboratory is better than 95 for each CCD image under typical working conditions (solar incidence angle of 30°, earth-surface reflectivity of 0.3); and the temperature of the focal plane module is kept below 30 °C over a 15-minute working period. These results satisfy the requirements for synchronous photography, focal plane temperature control and SNR, guaranteeing the precision needed for satellite photogrammetry.
Visual Tour Based on Panoramic Images for Indoor Places in Campus
NASA Astrophysics Data System (ADS)
Bakirman, T.
2012-07-01
This paper aims to create a visual tour based on panoramic images for the Civil Engineering Faculty of Yildiz Technical University. For this purpose, panoramic images had to be obtained: photos were taken with a tripod so that every photo had the same angle of view, and the panoramic images were created by stitching the photos. Two cameras with different focal lengths were used. From the panoramic images, a visual tour with navigation tools was created.
Small form-factor VGA camera with variable focus by liquid lens
NASA Astrophysics Data System (ADS)
Oikarinen, Kari A.; Aikio, Mika
2010-05-01
We present the design of a 24 mm long variable focus lens for a 1/4" sensor. The chosen CMOS color sensor has VGA (640×480) resolution and 5.6 μm pixel size. The lens utilizes one Varioptic Arctic 320 liquid lens that has a voltage-controllable focal length due to the electrowetting effect. There are no mechanical moving parts. The principle of operation of the liquid lens is explained briefly. We discuss designing optical systems with this type of lens. This includes a modeling approach that allows entering a voltage value to modify the configuration of the liquid lens. The presented design consists only of spherical glass surfaces. Spherical surfaces were chosen to decrease manufacturing costs and to provide more predictable performance through a better-established fabrication method. Fabrication tolerances are compensated by the adjustability of the liquid lens, further increasing the feasibility of manufacturing. The lens was manufactured and assembled into a demonstrator camera. It has an f-number of 2.5 and a 40 degree full field of view. The effective focal length varies around 6 millimeters as the liquid lens is adjusted. In simulations we have achieved a focus distance controllable between 20 millimeters and infinity. The design differs from previous approaches by having the aperture stop in the middle of the system instead of in front.
Loehfelm, Thomas W; Prater, Adam B; Debebe, Tequam; Sekhar, Aarti K
2017-02-01
We digitized the radiography teaching file at Black Lion Hospital (Addis Ababa, Ethiopia) during a recent trip, using a standard digital camera and a fluorescent light box. Our goal was to photograph every radiograph in the existing library while optimizing the final image size to the maximum resolution of a high quality tablet computer, preserving the contrast resolution of the radiographs, and minimizing total library file size. A secondary important goal was to minimize the cost and time required to take and process the images. Three workers were able to efficiently remove the radiographs from their storage folders, hang them on the light box, operate the camera, catalog the images, and return the radiographs to their storage folders. Zoom, focal length, and film speed were fixed, while aperture and shutter speed were manually adjusted for each image, allowing for efficiency and flexibility in image acquisition. Keeping zoom and focal length fixed, which kept the view box at the same relative position in all of the images acquired during a single photography session, allowed unused space to be batch-cropped, saving considerable time in post-processing at the expense of final image resolution. We present an analysis of the trade-offs in workflow efficiency and final image quality, and demonstrate that a few people with minimal equipment can efficiently digitize a teaching file library.
Orion Optical Navigation Progress Toward Exploration Mission 1
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher N.; Saley, David
2018-01-01
Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is past its Critical Design Review, and is progressing through test and certification for human rating. The filter architecture uses a square-root-free UDU covariance factorization. Linear Covariance Analysis (LinCov) was used to analyze the measurement models and the measurement error models on a representative EM-1 trajectory. The Orion EM-1 flight camera was calibrated at the Johnson Space Center (JSC) electro-optics lab. To permanently stake the focal length of the camera a 500 mm focal length refractive collimator was used. Two Engineering Design Unit (EDU) cameras and an EDU star tracker were used for a live-sky test in Denver. In-space imagery with high-fidelity truth metadata is rare so these live-sky tests provide one of the closest real-world analogs to operational use. A hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. The software is verified with synthetic images. Several hundred off-nominal images are also used to analyze robustness and fault detection in the software. These include effects such as stray light, excess radiation damage, and specular reflections, and are used to help verify the tuning parameters chosen for the algorithms such as earth atmosphere bias, minimum pixel intensity, and star detection thresholds.
Astrometric Calibration and Performance of the Dark Energy Camera
Bernstein, G. M.; Armstrong, R.; Plazas, A. A.; ...
2017-05-30
We characterize the ability of the Dark Energy Camera (DECam) to perform relative astrometry across its 500 Mpix, 3 deg² science field of view, and across 4 years of operation. This is done using internal comparisons of ~4 x 10^7 measurements of high-S/N stellar images obtained in repeat visits to fields of moderate stellar density, with the telescope dithered to move the sources around the array. An empirical astrometric model includes terms for: optical distortions; stray electric fields in the CCD detectors; chromatic terms in the instrumental and atmospheric optics; shifts in CCD relative positions of up to ≈10 µm when the DECam temperature cycles; and low-order distortions to each exposure from changes in atmospheric refraction and telescope alignment. Errors in this astrometric model are dominated by stochastic variations with typical amplitudes of 10-30 mas (in a 30 s exposure) and 5-10 arcmin coherence length, plausibly attributed to Kolmogorov-spectrum atmospheric turbulence. The size of these atmospheric distortions is not closely related to the seeing. Given an astrometric reference catalog at density ≈0.7 arcmin^-2, e.g. from Gaia, the typical atmospheric distortions can be interpolated to ≈7 mas RMS accuracy (for 30 s exposures) with 1 arcmin coherence length for residual errors. Remaining detectable error contributors are 2-4 mas RMS from unmodelled stray electric fields in the devices, and another 2-4 mas RMS from focal plane shifts between camera thermal cycles. Thus the astrometric solution for a single DECam exposure is accurate to 3-6 mas (≈0.02 pixels, or ≈300 nm) on the focal plane, plus the stochastic atmospheric distortion.
Dense depth maps from correspondences derived from perceived motion
NASA Astrophysics Data System (ADS)
Kirby, Richard; Whitaker, Ross
2017-01-01
Many computer vision applications require finding corresponding points between images and using the corresponding points to estimate disparity. Today's correspondence finding algorithms primarily use image features or pixel intensities common between image pairs. Some 3-D computer vision applications, however, do not produce the desired results using correspondences derived from image features or pixel intensities. Two examples are the multimodal camera rig and the center region of a coaxial camera rig. We present an image correspondence finding technique that aligns pairs of image sequences using optical flow fields. The optical flow fields provide information about the structure and motion of the scene, which are not available in still images but can be used in image alignment. We apply the technique to a dual focal length stereo camera rig consisting of a visible light-infrared camera pair and to a coaxial camera rig. We test our method on real image sequences and compare our results with the state-of-the-art multimodal and structure from motion (SfM) algorithms. Our method produces more accurate depth and scene velocity reconstruction estimates than the state-of-the-art multimodal and SfM algorithms.
Camera sensor arrangement for crop/weed detection accuracy in agronomic images.
Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo
2013-04-02
In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning in the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination, existing in outdoor environments, is also an important factor affecting the image accuracy. This paper is exclusively focused on two main issues, always with the goal to achieve the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters and (b) design of strategies for controlling the adverse illumination effects.
Earth Observations taken by Expedition 41 crewmember
2014-09-17
ISS041-E-016740 (17 Sept. 2014) --- One of the Expedition 41 crew members aboard the Earth-orbiting International Space Station exposed this Sept. 17 nocturnal scene featuring most of the largest cities on the central eastern seaboard. Even at 221 nautical miles above Earth, the 28mm focal length on the still camera was able to pick up detail in the image, for example, Central Park on Manhattan at right frame. The nation's capital is very near frame center.
NASA Astrophysics Data System (ADS)
Soulié, G.
2007-09-01
This paper contains 403 measures of double stars. These measures were made with a 12", f/10 Meade LX200 telescope and a 2X Barlow lens giving an effective focal length of about 5.5 meters. The calibration is calculated from measures of standard pairs. Frames were obtained with an MX516 CCD camera.
Mechanical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordby, Martin; Bowden, Gordon; Foss, Mike
2008-06-13
The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It is comprised of three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.
Calibration and accuracy analysis of a focused plenoptic camera
NASA Astrophysics Data System (ADS)
Zeller, N.; Quint, F.; Stilla, U.
2014-08-01
In this article we introduce new methods for the calibration of depth images from focused plenoptic cameras and validate the results. We start with a brief description of the concept of a focused plenoptic camera and how a depth map can be estimated from the recorded raw image. For this camera, an analytical expression of the depth accuracy is derived for the first time. In the main part of the paper, methods to calibrate a focused plenoptic camera are developed and evaluated. The optical imaging process is calibrated using a method already known from the calibration of traditional cameras. For the calibration of the depth map, two new model-based methods, which make use of the projection concept of the camera, are developed. These new methods are compared to a common curve-fitting approach based on a Taylor-series approximation. Both model-based methods show significant advantages compared to the curve-fitting method: they need fewer reference points for calibration and, moreover, supply a function that remains valid beyond the calibrated range. In addition, the depth map accuracy of the plenoptic camera was experimentally investigated for different focal lengths of the main lens and compared to the analytical evaluation.
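The curve-fitting baseline mentioned in this abstract maps the camera's virtual depth estimates to metric distances with a low-order polynomial. The sketch below illustrates only that baseline idea, with made-up calibration values; it is not the authors' model-based method, whose advantage is precisely that it extrapolates beyond the calibrated range.

```python
import numpy as np

# Hypothetical calibration data: virtual depths v estimated by the plenoptic
# camera and the corresponding metric reference distances z (in metres).
v = np.array([2.1, 2.6, 3.2, 3.9, 4.7, 5.6])         # virtual depth (dimensionless)
z = np.array([0.45, 0.60, 0.80, 1.05, 1.40, 1.85])   # reference distance [m]

# Curve-fitting baseline: low-order polynomial (Taylor-series-like) mapping
# from virtual depth to metric depth, valid only inside the calibrated range.
poly = np.poly1d(np.polyfit(v, z, deg=3))

print("metric depth at v = 3.5:", poly(3.5))
```

A model-based calibration would instead fit the few physical parameters of the camera's projection model, which is why it needs fewer reference points and remains usable outside the calibrated interval.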
The Wide Field Imager instrument for Athena
NASA Astrophysics Data System (ADS)
Meidinger, Norbert; Barbera, Marco; Emberger, Valentin; Fürmetz, Maria; Manhart, Markus; Müller-Seidlitz, Johannes; Nandra, Kirpal; Plattner, Markus; Rau, Arne; Treberspurg, Wolfgang
2017-08-01
ESA's next large X-ray mission ATHENA is designed to address the Cosmic Vision science theme 'The Hot and Energetic Universe'. It will provide answers to two key astrophysical questions: how does ordinary matter assemble into the large-scale structures we see today, and how do black holes grow and shape the Universe? The ATHENA spacecraft will be equipped with two focal plane cameras, a Wide Field Imager (WFI) and an X-ray Integral Field Unit (X-IFU). The WFI instrument is optimized for state-of-the-art resolution spectroscopy over a large field of view of 40 amin x 40 amin and high count rates up to and beyond 1 Crab source intensity. The cryogenic X-IFU camera is designed for high-spectral-resolution imaging. Both cameras alternately share a mirror system based on silicon pore optics with a focal length of 12 m and a large effective area of about 2 m² at an energy of 1 keV. Although the mission is still in phase A, i.e. studying the feasibility and developing the necessary technology, the definition and development of the instrumentation have already made significant progress. The WFI focal plane camera described herein covers the energy band from 0.2 keV to 15 keV with 450 μm thick, fully depleted, back-illuminated silicon active pixel sensors of DEPFET type. The spatial resolution will be provided by one million pixels, each with a size of 130 μm x 130 μm. The time resolution requirement is 5 ms for the WFI large detector array and 80 μs for the WFI fast detector. The large effective area of the mirror system will be complemented by a high quantum efficiency above 90% for medium and higher energies. The status of the various WFI subsystems needed to achieve this performance is described and recent changes are explained.
Russo, Paolo; Mettivier, Giovanni
2011-04-01
The goal of this study is to evaluate a new method based on a coded aperture mask combined with a digital x-ray imaging detector for measurements of the focal spot sizes of diagnostic x-ray tubes. Common techniques for focal spot size measurements employ a pinhole camera, a slit camera, or a star resolution pattern. The coded aperture mask is a radiation collimator consisting of a large number of apertures disposed on a predetermined grid in an array, through which the radiation source is imaged onto a digital x-ray detector. The coded mask camera method allows one to obtain a one-shot, accurate and direct measurement of the two dimensions of the focal spot (like that for a pinhole camera) but at a low tube loading (like that for a slit camera). A large number of small apertures in the coded mask operate as a "multipinhole" with greater efficiency than a single pinhole, while keeping the resolution of a single pinhole. X-ray images result from the multiplexed output on the detector image plane of such a multiple aperture array, and the image of the source is digitally reconstructed with a deconvolution algorithm. Images of the focal spot of a laboratory x-ray tube (W anode: 35-80 kVp; focal spot size of 0.04 mm) were acquired at different geometrical magnifications with two different types of digital detector (a photon counting hybrid silicon pixel detector with 0.055 mm pitch and a flat panel CMOS digital detector with 0.05 mm pitch) using a high resolution coded mask (of the no-two-holes-touching modified uniformly redundant array type) with 480 apertures of 0.07 mm, designed for imaging at energies below 35 keV. Measurements with a slit camera were performed for comparison. A test with a pinhole camera and with the coded mask on a computed radiography mammography unit with a 0.3 mm focal spot was also carried out. The full width at half maximum focal spot sizes were obtained from the line profiles of the decoded images, showing a focal spot of 0.120 mm x 0.105 mm at 35 kVp and M = 6.1, with a detector entrance exposure as low as 1.82 mR (0.125 mA s tube load). The slit camera indicated a focal spot of 0.112 mm x 0.104 mm at 35 kVp and M = 3.15, with an exposure at the detector of 72 mR. Focal spot measurements with the coded mask could be performed up to 80 kVp. Tolerance to angular misalignment with the reference beam up to 7 degrees in in-plane rotations and 1 degree in out-of-plane rotations was observed. The axial distance of the focal spot from the coded mask could also be determined. It is possible to determine the beam intensity via measurement of the intensity of the decoded image of the focal spot and via a calibration procedure. Coded aperture masks coupled to a digital area detector produce precise determinations of the focal spot of an x-ray tube with reduced tube loading and measurement time, coupled to a large tolerance in the alignment of the mask.
Calibration of Action Cameras for Photogrammetric Purposes
Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo
2014-01-01
The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing mode of a novel action camera, the GoPro Hero 3 camera that can provide still images up to 12 Mp and video up 8 Mp resolution. PMID:25237898
A High Resolution TDI CCD Camera for Microsatellite (HRCM)
NASA Astrophysics Data System (ADS)
Hao, Yuncai; Zheng, You; Dong, Ying; Li, Tao; Yu, Shijie
In recent years, obtaining (1-5) m ground resolution from space with a microsatellite has become an important development direction in the commercial remote sensing field. Thanks to progress in new technologies, new materials and new detectors, it is now possible to develop a 1 m ground resolution space imaging system weighing less than 20 kg. Based on many years of work on optical system design, the authors propose a very high resolution TDI CCD camera for use in space. The performance parameters and optical layout of the HRCM are presented, together with a compact optical design and analysis of the system. A small fold mirror is used to obtain a line field of view suitable for the TDI CCD while keeping the outer dimensions short; the length along the largest dimension is about 1/4 of the focal length. Two 4096 x 96 (stages) line TDI CCDs will be used as the focal plane detector. Special optical parts are placed just before the final image to obtain a ground pixel resolution higher than the Nyquist resolution of the detector using the sub-pixel technique, which is explained in the paper. Optical-grade SiC will be used as the mirror material, and a C-C composite will be used for the mechanical structure framework. The circular frames of the primary and secondary mirrors will be turned in a single machine-tool setting to ensure the concentricity required for alignment of the system. Overall, the HRCM has a focal length of 2.5 m, a 2° field of view, a relative aperture of 1/11, a spectral range of 0.4-0.8 micrometre, a 10 micron TDI CCD pixel size, a weight of less than 20 kg, and 1 m ground pixel resolution from a 500 km orbit. The design and analysis presented in the paper indicate that the HRCM has many advantages for use in space. Keywords: high resolution TDI CCD, sub-pixel imaging, light-weighted optical system, SiC mirror.
2003-03-07
File name :DSC_0749.JPG File size :1.1MB(1174690Bytes) Date taken :2003/03/07 13:51:29 Image size :2000 x 1312 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D1H Quality mode :FINE Metering mode :Matrix Exposure mode :Shutter priority Speed light :No Focal length :20 mm Shutter speed :1/500second Aperture :F11.0 Exposure compensation :0 EV White Balance :Auto Lens :20 mm F 2.8 Flash sync mode :N/A Exposure difference :0.0 EV Flexible program :No Sensitivity :ISO200 Sharpening :Normal Image Type :Color Color Mode :Mode II(Adobe RGB) Hue adjustment :3 Saturation Control :N/A Tone compensation :Normal Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A
2002-02-19
File name :DSC_0028.JPG File size :2.8MB(2950833Bytes) Date taken :2002/02/19 09:49:01 Image size :3008 x 2000 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D100 Quality mode :N/A Metering mode :Matrix Exposure mode :Shutter priority Speed light :Yes Focal length :24 mm Shutter speed :1/60second Aperture :F3.5 Exposure compensation :0 EV White Balance :N/A Lens :N/A Flash sync mode :N/A Exposure difference :N/A Flexible program :N/A Sensitivity :N/A Sharpening :N/A Image Type :Color Color Mode :N/A Hue adjustment :N/A Saturation Control :N/A Tone compensation :N/A Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A
2002-02-24
File name :DSC_0047.JPG File size :2.8MB(2931574Bytes) Date taken :2002/02/24 10:06:57 Image size :3008 x 2000 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D100 Quality mode :N/A Metering mode :Matrix Exposure mode :Shutter priority Speed light :Yes Focal length :24 mm Shutter speed :1/180second Aperture :F20.0 Exposure compensation :+0.3 EV White Balance :N/A Lens :N/A Flash sync mode :N/A Exposure difference :N/A Flexible program :N/A Sensitivity :N/A Sharpening :N/A Image Type :Color Color Mode :N/A Hue adjustment :N/A Saturation Control :N/A Tone compensation :N/A Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A
Theoretical performance model for single image depth from defocus.
Trouvé-Peloux, Pauline; Champagnat, Frédéric; Le Besnerais, Guy; Idier, Jérôme
2014-12-01
In this paper we present a performance model for depth estimation using single image depth from defocus (SIDFD). Our model is based on an original expression of the Cramér-Rao bound (CRB) in this context. We show that this model is consistent with the expected behavior of SIDFD. We then study the influence on the performance of the optical parameters of a conventional camera such as the focal length, the aperture, and the position of the in-focus plane (IFP). We derive an approximate analytical expression of the CRB away from the IFP, and we propose an interpretation of the SIDFD performance in this domain. Finally, we illustrate the predictive capacity of our performance model on experimental data comparing several settings of a consumer camera.
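To make the defocus cue concrete, the following sketch evaluates the thin-lens blur-circle diameter as a function of object depth for an arbitrary focal length, f-number and in-focus plane; the values are illustrative, not the paper's. The flatness of this curve near the IFP is consistent with the degraded single-image depth-from-defocus performance the authors analyse in that region.

```python
def blur_diameter(z, f=0.025, N=2.8, z_ifp=2.0):
    """Thin-lens blur-circle diameter (m) on the sensor for an object at depth z (m),
    with focal length f (m), f-number N and in-focus plane at z_ifp (m)."""
    A = f / N                          # aperture diameter
    s = 1.0 / (1.0 / f - 1.0 / z_ifp)  # sensor distance for the in-focus plane
    s_obj = 1.0 / (1.0 / f - 1.0 / z)  # image distance for the object
    return A * abs(s_obj - s) / s_obj  # geometric blur circle (similar triangles)

for z in (1.0, 1.5, 2.0, 3.0, 5.0):
    print(f"z = {z:.1f} m -> blur = {blur_diameter(z) * 1e6:.1f} um")
```

The derivative of the blur diameter with respect to depth vanishes at the in-focus plane, which is one way to read the Cramér-Rao analysis: depth information per photon is lowest exactly where the image is sharpest.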
NASA Astrophysics Data System (ADS)
Fukuda, Takahito; Shinomura, Masato; Xia, Peng; Awatsuji, Yasuhiro; Nishio, Kenzo; Matoba, Osamu
2017-04-01
We constructed a parallel phase-shifting digital holographic microscopy (PPSDHM) system using an inverted magnification optical system, and succeeded in three-dimensional (3D) motion-picture imaging of the 3D displacement of a microscopic object. In the PPSDHM system, the inverted and afocal magnification optical system consisted of a microscope objective (16.56 mm focal length and 0.25 numerical aperture) and a convex lens (300 mm focal length and 82 mm aperture diameter). A polarization-imaging camera was used to record multiple phase-shifted holograms with a single-shot exposure. We recorded an alum crystal sinking in an aqueous solution of alum with the constructed PPSDHM system at 60 frames/s for about 20 s and reconstructed high-quality 3D motion-picture images of the crystal. We then calculated the displacement of the crystal from the in-focus-plane positions and the magnifications of the magnification optical system, and obtained the 3D trajectory of the crystal from those values.
NASA Astrophysics Data System (ADS)
Kim, Sang-Youn; Yeo, Myoung; Shin, Eun-Jae; Park, Won-Hyeong; Jang, Jong-Seok; Nam, Byeong-Uk; Bae, Jin Woo
2015-11-01
In this paper, we propose a variable focus microlens module based on a transparent, electroactive, and non-ionic PVC/DBA gel. A non-ionic PVC/DBA (nPVC) gel on an ITO glass was confined beneath a rigid annular electrode, and applied pressure squeezed a bulge of the nPVC gel into the annular electrode, resulting in a hemispherical plano-convex nPVC gel microlens. The proposed nPVC gel microlens was analyzed and optimized. When voltage is applied to the circular perimeter (the annular electrode) of the fabricated microlens, electrically induced creep deformation of the nPVC gel occurs, changing its optical focal length. The focal length increases markedly from 3.8 mm to 14.3 mm as the applied voltage increases from 300 V to 800 V. Due to its compact, transparent, and electroactive characteristics, the proposed nPVC gel microlens can be easily inserted into small consumer electronic devices, such as digital cameras, camcorders, cell phones, and other portable optical devices.
Coaxial fundus camera for ophthalmology
NASA Astrophysics Data System (ADS)
de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.
2015-09-01
A fundus camera for ophthalmology is a high definition device which must provide low-light illumination of the human retina, high resolution at the retina and a reflection-free image. Those constraints make its optical design very sophisticated, but the most difficult requirements to comply with are reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is performed by an LED ring, coaxial with the optical system and composed of IR and visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with built-in autofocus and zoom, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.
Thick lens chromatic effective focal length variation versus bending
NASA Astrophysics Data System (ADS)
Sparrold, Scott
2017-11-01
Longitudinal chromatic aberration (LCA) can limit the optical performance of refractive optical systems. Understanding a singlet's chromatic change of effective focal length leads to insights and methods to control LCA. Long-established first-order theory shows that the chromatic change in focal length of a zero-thickness lens is proportional to its focal length divided by the lens V number, or inverse dispersion. This work presents the derivation of an equation for a thick singlet's chromatic change in effective focal length as a function of center thickness t, dispersion V, index of refraction n, and the Coddington shape factor K. A plot of bending versus chromatic focal length variation is presented. Lens thickness does not influence the chromatic variation of effective focal length for a convex-plano or plano-convex lens. The influence of a lens's center thickness on chromatic focal length variation is more pronounced at lower indices of refraction.
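The abstract's closed-form result is not reproduced here, but the behaviour it describes can be checked numerically by evaluating the standard thick-lens focal length at two wavelengths for different shape factors. The sketch below assumes BK7-like indices and illustrative dimensions; note that for shape factors of ±1 (plano-convex or convex-plano) one curvature is zero, so the thickness term drops out, matching the statement above.

```python
def singlet_curvatures(f_design, q, n_design):
    """Front/back curvatures (1/R1, 1/R2) of a thin singlet with focal length
    f_design at index n_design, for Coddington shape factor q."""
    c1 = (q + 1.0) / (2.0 * f_design * (n_design - 1.0))
    c2 = (q - 1.0) / (2.0 * f_design * (n_design - 1.0))
    return c1, c2

def thick_lens_efl(c1, c2, t, n):
    """Effective focal length from the standard thick-lens power equation."""
    power = (n - 1.0) * (c1 - c2) + (n - 1.0) ** 2 * t * c1 * c2 / n
    return 1.0 / power

f_d, t = 100.0, 10.0                     # mm, illustrative values
n_d, n_F, n_C = 1.5168, 1.5224, 1.5143   # BK7-like indices at the d, F, C lines
for q in (-1.0, 0.0, 1.0):               # plano-convex, equiconvex, convex-plano
    c1, c2 = singlet_curvatures(f_d, q, n_d)
    d_efl = thick_lens_efl(c1, c2, t, n_F) - thick_lens_efl(c1, c2, t, n_C)
    print(f"shape factor q = {q:+.1f}:  EFL(F) - EFL(C) = {d_efl:.3f} mm")
```

The printed chromatic change is close to the thin-lens estimate of -f/V (about -1.6 mm for a 100 mm BK7-like singlet), with the thickness-dependent deviation appearing only for the bent shapes.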
Earth Observations taken by Expedition 41 crewmember
2014-09-27
ISS041-E-045469 (27 Sept. 2014) --- One of the Expedition 41 crew members aboard the International Space Station, flying at an altitude of 222 nautical miles above a point in the Atlantic Ocean several hundred miles off the coast of Africa near the Tropic of Cancer, photographed this eye-catching panorama of the night sky on Sept. 27. NASA astronaut Reid Wiseman, flight engineer, tweeted the image, which was taken with an electronic still camera, set with a 24mm focal length. In his accompanying comments, Wiseman stated, "Sahara sands make the Earth glow orange."
First results from the TOPSAT camera
NASA Astrophysics Data System (ADS)
Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve
2017-11-01
The TopSat camera, funded by the British National Space Centre's Mosaic programme, is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on 27 October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, a collaboration between four UK organisations: QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate the provision of rapid-response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera", presented by RAL at the 5th ICSO in 2004, described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.
Wavefront Sensing With Switched Lenses for Defocus Diversity
NASA Technical Reports Server (NTRS)
Dean, Bruce H.
2007-01-01
In an alternative hardware design for an apparatus used in image-based wavefront sensing, defocus diversity is introduced by means of fixed lenses that are mounted in a filter wheel (see figure) so that they can be alternately switched into a position in front of the focal plane of an electronic camera recording the image formed by the optical system under test. [The terms image-based, wavefront sensing, and defocus diversity are defined in the first of the three immediately preceding articles, Broadband Phase Retrieval for Image-Based Wavefront Sensing (GSC-14899-1).] Each lens in the filter wheel is designed so that the optical effect of placing it at the assigned position is equivalent to the optical effect of translating the camera a specified defocus distance along the optical axis. Heretofore, defocus diversity has been obtained by translating the imaging camera along the optical axis to various defocus positions. Because data must be taken at multiple, accurately measured defocus positions, it is necessary to mount the camera on a precise translation stage that must be calibrated for each defocus position and/or to use an optical encoder for measurement and feedback control of the defocus positions. Additional latency is introduced into the wavefront sensing process as the camera is translated to the various defocus positions. Moreover, if the optical system under test has a large focal length, the required defocus values are large, making it necessary to use a correspondingly bulky translation stage. By eliminating the need for translation of the camera, the alternative design simplifies and accelerates the wavefront-sensing process. This design is cost-effective in that the filterwheel/lens mechanism can be built from commercial catalog components. After initial calibration of the defocus value of each lens, a selected defocus value is introduced by simply rotating the filter wheel to place the corresponding lens in front of the camera. The rotation of the wheel can be automated by use of a motor drive, and further calibration is not necessary. Because a camera-translation stage is no longer needed, the size of the overall apparatus can be correspondingly reduced.
Sighting optics including an optical element having a first focal length and a second focal length
Crandall, David Lynn [Idaho Falls, ID
2011-08-01
One embodiment of sighting optics according to the teachings provided herein may include a front sight and a rear sight positioned in spaced-apart relation. The rear sight includes an optical element having a first focal length and a second focal length. The first focal length is selected so that it is about equal to a distance separating the optical element and the front sight and the second focal length is selected so that it is about equal to a target distance. The optical element thus brings into simultaneous focus, for a user, images of the front sight and the target.
Crandall, David Lynn
2011-08-16
Sighting optics include a front sight and a rear sight positioned in a spaced-apart relation. The rear sight includes an optical element having a first focal length and a second focal length. The first focal length is selected so that it is about equal to a distance separating the optical element and the front sight and the second focal length is selected so that it is about equal to a target distance. The optical element thus brings into simultaneous focus for a user images of the front sight and the target.
A digital gigapixel large-format tile-scan camera.
Ben-Ezra, M
2011-01-01
Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications for cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and thus can provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
Photodetectors for the Advanced Gamma-ray Imaging System (AGIS)
NASA Astrophysics Data System (ADS)
Wagner, Robert G.; Advanced Gamma-ray Imaging System AGIS Collaboration
2010-03-01
The Advanced Gamma-Ray Imaging System (AGIS) is a concept for the next generation very high energy gamma-ray observatory. Design goals include an order of magnitude better sensitivity, better angular resolution, and a lower energy threshold than existing Cherenkov telescopes. Each telescope is equipped with a camera that detects and records the Cherenkov-light flashes from air showers. The camera comprises a pixelated focal plane of blue-sensitive and fast (nanosecond) photon detectors that detect the photon signal and convert it into an electrical one. Given the scale of AGIS, the camera must be reliable and cost effective. The Schwarzschild-Couder optical design yields a smaller plate scale than present-day Cherenkov telescopes, enabling the use of more compact, multi-pixel devices, including multianode photomultipliers or Geiger avalanche photodiodes. We present the conceptual design of the focal plane for the camera and results from testing candidate focal plane sensors.
A Three-Line Stereo Camera Concept for Planetary Exploration
NASA Technical Reports Server (NTRS)
Sandau, Rainer; Hilbert, Stefan; Venus, Holger; Walter, Ingo; Fang, Wai-Chi; Alkalai, Leon
1997-01-01
This paper presents a low-weight stereo camera concept for planetary exploration. The camera uses three CCD lines within the image plane of one single objective. Some of the main features of the camera include: focal length 90 mm, FOV 18.5 deg, IFOV 78 µrad, convergence angles ±10 deg, radiometric dynamics 14 bit, weight 2 kg, and power consumption 12.5 W. From an orbit altitude of 250 km the ground pixel size is 20 m x 20 m and the swath width is 82 km. The CCD line data is buffered in the camera's internal mass memory of 1 Gbit. After performing radiometric correction and application-dependent preprocessing, the data is compressed and ready for downlink. Due to the aggressive application of advanced technologies in the area of microelectronics and innovative optics, the low mass and power budgets of 2 kg and 12.5 W are achieved while still maintaining high performance. The design of the proposed light-weight camera is also general purpose enough to be applicable to other planetary missions such as the exploration of Mars, Mercury, and the Moon. Moreover, it is an example of excellent international collaboration on advanced technology concepts developed at DLR, Germany, and NASA's Jet Propulsion Laboratory, USA.
Apollo 12 photography 70 mm, 16 mm, and 35 mm frame index
NASA Technical Reports Server (NTRS)
1970-01-01
For each 70-mm frame, the index presents information on: (1) the focal length of the camera, (2) the photo scale at the principal point of the frame, (3) the selenographic coordinates at the principal point of the frame, (4) the percentage of forward overlap of the frame, (5) the sun angle (medium, low, high), (6) the quality of the photography, (7) the approximate tilt (minimum and maximum) of the camera, and (8) the direction of tilt. A brief description of each frame is also included. The index to the 16-mm sequence photography includes information concerning the approximate surface coverage of the photographic sequence and a brief description of the principal features shown. A column of remarks is included to indicate: (1) if the sequence is plotted on the photographic index map and (2) the quality of the photography. The pictures taken using the lunar surface closeup stereoscopic camera (35 mm) are also described in this same index format.
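For a near-vertical frame, the photo scale listed at the principal point follows directly from the camera focal length and the altitude above the surface. A minimal sketch with purely illustrative numbers, not values taken from the index:

```python
def principal_point_scale(focal_length_mm, altitude_m):
    """Approximate photo scale denominator N (scale 1:N) at the principal point
    of a near-vertical frame: N = altitude / focal length."""
    return altitude_m / (focal_length_mm / 1000.0)

# Illustrative values only: an 80 mm lens at 110 km orbital altitude.
print(f"scale ~ 1:{principal_point_scale(80.0, 110_000.0):,.0f}")
```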
The Use of Video-Tacheometric Technology for Documenting and Analysing Geometric Features of Objects
NASA Astrophysics Data System (ADS)
Woźniak, Marek; Świerczyńska, Ewa; Jastrzębski, Sławomir
2015-12-01
This paper analyzes selected aspects of the use of video-tacheometric technology for inventorying and documenting geometric features of objects. Data was collected with the use of the video-tacheometer Topcon Image Station IS-3 and the professional camera Canon EOS 5D Mark II. During the field work and the development of data the following experiments have been performed: multiple determination of the camera interior orientation parameters and distortion parameters of five lenses with different focal lengths, reflectorless measurements of profiles for the elevation and inventory of decorative surface wall of the building of Warsaw Ballet School. During the research the process of acquiring and integrating video-tacheometric data was analysed as well as the process of combining "point cloud" acquired by using video-tacheometer in the scanning process with independent photographs taken by a digital camera. On the basis of tests performed, utility of the use of video-tacheometric technology in geodetic surveys of geometrical features of buildings has been established.
Radiometric calibration of an ultra-compact microbolometer thermal imaging module
NASA Astrophysics Data System (ADS)
Riesland, David W.; Nugent, Paul W.; Laurie, Seth; Shaw, Joseph A.
2017-05-01
As microbolometer focal plane array formats steadily decrease in size, new challenges arise in correcting for thermal drift in the calibration coefficients. As the thermal mass of the camera decreases, the focal plane becomes more sensitive to external thermal inputs. This paper shows results from a temperature compensation algorithm for characterizing and radiometrically calibrating a FLIR Lepton camera.
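The abstract does not spell out the specific compensation algorithm; a common approach in the microbolometer literature, shown here only as a hedged stand-in rather than the authors' exact formulation, is to make the radiometric gain and offset linear functions of the focal-plane temperature.

```python
import numpy as np

def radiance_from_counts(dn, t_fpa, g0, g1, o0, o1):
    """Convert raw counts to radiance with FPA-temperature-dependent gain/offset:
    L = (g0 + g1*T_fpa) * DN + (o0 + o1*T_fpa).  Coefficients would come from
    blackbody calibration at several camera temperatures (values are hypothetical)."""
    gain = g0 + g1 * t_fpa
    offset = o0 + o1 * t_fpa
    return gain * dn + offset

dn = np.array([[8120.0, 8250.0], [8310.0, 8195.0]])   # raw counts (example frame)
t_fpa = 32.5                                          # focal-plane temperature [deg C]
L = radiance_from_counts(dn, t_fpa, g0=2.1e-3, g1=-4.0e-6, o0=-15.2, o1=0.06)
print(L)
```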
Data filtering with support vector machines in geometric camera calibration.
Ergun, B; Kavzoglu, T; Colkesen, I; Sahin, C
2010-02-01
The use of non-metric digital cameras in close-range photogrammetric applications and machine vision has become a popular research agenda. Being an essential component of photogrammetric evaluation, camera calibration is a crucial stage for non-metric cameras. Therefore, accurate camera calibration and orientation procedures have become prerequisites for the extraction of precise and reliable 3D metric information from images. The lack of accurate inner orientation parameters can lead to unreliable results in the photogrammetric process. A camera can be well defined by its principal distance, principal point offset and lens distortion parameters. Different camera models have been formulated and used in close-range photogrammetry, but generally sensor orientation and calibration are performed with a perspective geometrical model by means of the bundle adjustment. In this study, support vector machines (SVMs) with a radial basis function kernel are employed to model the distortions measured for the Olympus E10 camera system with an aspherical zoom lens, which are later used in the geometric calibration process. The intention is to introduce an alternative approach for the on-the-job photogrammetric calibration stage. Experimental results for the DSLR camera with three focal length settings (9, 18 and 36 mm) were estimated using bundle adjustment with additional parameters, and analyses were conducted based on object point discrepancies and standard errors. Results show the robustness of the SVM approach for the correction of image coordinates by modelling total distortions in the on-the-job calibration process using a limited number of images.
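As an illustration of the idea, an RBF-kernel support vector regressor can be trained on measured distortion values and then used to correct image coordinates. The data, hyperparameters and correction convention below are placeholders, not those of the study.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Placeholder training data: image points (x, y) in mm from the principal point
# and the measured radial distortion dr (mm) at each point, e.g. from a test field.
xy = rng.uniform(-12.0, 12.0, size=(200, 2))
r = np.hypot(xy[:, 0], xy[:, 1])
dr = 2.0e-4 * r**3 - 1.5e-7 * r**5 + rng.normal(0.0, 0.002, r.shape)  # synthetic

# RBF-kernel SVR learns the distortion as a function of image position.
model = SVR(kernel="rbf", C=10.0, epsilon=0.001, gamma=0.05).fit(xy, dr)

# Correct an observed coordinate by pulling it back along the radial direction
# by the predicted distortion.
pt = np.array([[7.5, -3.0]])
dr_hat = model.predict(pt)[0]
r_pt = np.hypot(*pt[0])
corrected = pt[0] * (1.0 - dr_hat / r_pt)
print("predicted distortion:", dr_hat, "corrected point:", corrected)
```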
640x480 PtSi Stirling-cooled camera system
NASA Astrophysics Data System (ADS)
Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; Coyle, Peter J.; Feder, Howard L.; Gilmartin, Harvey R.; Levine, Peter A.; Sauer, Donald J.; Shallcross, Frank V.; Demers, P. L.; Smalser, P. J.; Tower, John R.
1992-09-01
A Stirling-cooled 3-5 µm camera system has been developed. The camera employs a monolithic 640 x 480 PtSi-MOS focal plane array. The camera system achieves an NEDT of 0.10 K at a 30 Hz frame rate with f/1.5 optics (300 K background). At a spatial frequency of 0.02 cycles/mrad the vertical and horizontal minimum resolvable temperature are in the range of MRT = 0.03 K (f/1.5 optics, 300 K background). The MOS focal plane array achieves a resolution of 480 TV lines per picture height independent of background level and position within the frame.
Motion camera based on a custom vision sensor and an FPGA architecture
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel
1998-09-01
A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing. The camera is used with a custom sensor for motion detection to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques. The sensor communicates motion events using the event-address protocol associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC computer which is used for high-level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing such as spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation and the FPGA architecture used in the motion camera system.
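Given time-stamped address events from the sensor, the time-of-travel computation reduces to dividing the pixel pitch by the interval between events at adjacent pixels. A software sketch of that step follows; the event format and pixel pitch are assumptions, not taken from the article.

```python
# Each event: (x, y, timestamp_us) emitted when a moving edge crosses a pixel.
events = [(40, 12, 1_000), (41, 12, 1_250), (42, 12, 1_490)]  # example data

PIXEL_PITCH_UM = 20.0  # assumed pixel pitch

# Time-of-travel between horizontally adjacent pixels -> local x velocity.
for (x0, y0, t0), (x1, y1, t1) in zip(events, events[1:]):
    if abs(x1 - x0) == 1 and y1 == y0:
        v = PIXEL_PITCH_UM / (t1 - t0)          # um per microsecond = m/s
        print(f"edge velocity near ({x1},{y1}): {v:.3f} m/s")
```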
NASA Technical Reports Server (NTRS)
1976-01-01
Trade studies were conducted to ensure the overall feasibility of the focal plane camera in a radial module. The primary variable in the trade studies was the location of the pickoff mirror, on-axis versus off-axis. The two alternatives were: (1) the standard (electromagnetic focus) SECO submodule, and (2) the MOD 15 permanent-magnet-focus SECO submodule. The technical areas of concern were the packaging-affected parameters of thermal dissipation, focal plane obscuration, and image quality.
Bennett, C.L.
1996-07-23
An imaging Fourier transform spectrometer is described having a Fourier transform infrared spectrometer providing a series of images to a focal plane array camera. The focal plane array camera is clocked to a multiple of zero crossing occurrences as caused by a moving mirror of the Fourier transform infrared spectrometer and as detected by a laser detector, such that the frame capture rate of the focal plane array camera corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer. The images are transmitted to a computer for processing such that representations of the images as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor or otherwise stored and manipulated by the computer. 2 figs.
Mechatronic design of a fully integrated camera for mini-invasive surgery.
Zazzarini, C C; Patete, P; Baroni, G; Cerveri, P
2013-06-01
This paper describes the design features of an innovative fully integrated camera candidate for mini-invasive abdominal surgery with single-port or transluminal access. The apparatus includes a CMOS imaging sensor, a light-emitting diode (LED)-based unit for scene illumination, a photodiode for luminance detection, an optical system designed according to the mechanical compensation paradigm, an actuation unit enabling autofocus and optical zoom, and control logic based on a microcontroller. The bulk of the apparatus is characterized by a tubular shape with a diameter of 10 mm and a length of 35 mm. The optical system, composed of four lens groups, of which two are mobile, has a total length of 13.46 mm and an effective focal length ranging from 1.61 to 4.44 mm with a zoom factor of 2.75×, with a corresponding angular field of view ranging from 16° to 40°. The mechatronics unit, devoted to moving the zoom and focus lens groups, is implemented using miniature piezoelectric motors. The control logic implements a closed-loop mechanism between the LEDs and the photodiode to attain automatic light control. Bottlenecks of the design and some potential issues of the realization are discussed. A potential clinical scenario is introduced.
NASA Astrophysics Data System (ADS)
Zhang, Y. A.; Lin, C. F.; Lin, J. P.; Zeng, X. Y.; Yan, Q.; Zhou, X. T.; Guo, T. L.
2018-04-01
Electric-field-driven liquid crystal (ELC) lenses with tunable focal length and depth of field have been extensively applied in 3D display and imaging systems. In this work, a dual-layer electrode-driven liquid crystal (DELC) lens with electrically tunable focal length and controllable focal plane is demonstrated. ITO-SiO2-AZO electrodes with a dual-layer staggered structure on the top substrate are used as driving electrodes within the LC cell, which provides an additional means of control. The focal length of the DELC lens can be adjusted from 1.41 cm to 0.29 cm as the operating voltage changes from 15 V to 40 V. Furthermore, the focal plane of the DELC lens can be selectively moved by changing the driving method of the voltage applied to the adjacent driven electrodes. This work demonstrates that the DELC lens has potential applications in imaging systems because of its electrically tunable focal length and controllable focal plane.
Sung, Yu-Lung; Jeang, Jenn; Lee, Chia-Hsiung; Shih, Wei-Chuan
2015-04-01
We present a highly repeatable, lithography-free and mold-free method for fabricating flexible optical lenses by in situ curing liquid polydimethylsiloxane droplets on a preheated smooth surface with an inkjet printing process. This method enables us to fabricate lenses with a focal length as short as 5.6 mm, which can be controlled by varying the droplet volume and the temperature of the preheated surface. Furthermore, the lens can be attached to a smartphone camera without any accessories and can produce high-resolution (1 μm) images for microscopy applications.
PNIC - A near infrared camera for testing focal plane arrays
NASA Astrophysics Data System (ADS)
Hereld, Mark; Harper, D. A.; Pernic, R. J.; Rauscher, Bernard J.
1990-07-01
This paper describes the design and the performance of the Astrophysical Research Consortium prototype near-infrared camera (pNIC) designed to test focal plane arrays both on and off the telescope. Special attention is given to the detector in pNIC, the mechanical and optical designs, the electronics, and the instrument interface. Experiments performed to illustrate the most salient aspects of pNIC are described.
SPARTAN Near-IR Camera — System Overview: The Spartan Infrared Camera is a high spatial resolution near-IR imager. Spartan has a focal plane consisting of four …
Imaging characteristics of photogrammetric camera systems
Welch, R.; Halliday, J.
1973-01-01
In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were also assessed, yielding procedures for analyzing image quality and for predicting and comparing performance capabilities.
Rapid-cadence optical monitoring for short-period variability of ɛ Aurigae
NASA Astrophysics Data System (ADS)
Billings, Gary
2013-07-01
ɛ Aurigae was observed with CCD cameras and 35 mm SLR camera lenses, at rapid cadence (>1/minute), for long runs (up to 11 hours), on multiple occasions during 2009 - 2011, to monitor for variability of the system at scales of minutes to hours. The lens and camera were changed during the period to improve results, finalizing on a 135 mm focal length Canon f/2 lens (at f/2.8), an ND8 neutral density filter, a Johnson V filter, and an SBIG ST-8XME camera (Kodak KAF-1603ME microlensed chip). Differential photometry was attempted, but because of the large separation between the variable and comparison star (η Aur), noise caused by transient extinction variations was not consistently eliminated. The lowest-noise time series for searching for short-period variability proved to be the extinction-corrected instrumental magnitude of ɛ Aur obtained on "photometric nights", with η Aur used to determine and monitor the extinction coefficient for the night. No flares or short-period variations of ɛ Aur were detected by visual inspection of the light curves from observing runs with noise levels as low as 0.008 magnitudes rms.
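The extinction correction described above amounts to subtracting the product of the nightly extinction coefficient and the airmass from each instrumental magnitude, with the coefficient obtained from the comparison star. A minimal sketch with illustrative numbers follows.

```python
import numpy as np

def airmass(altitude_deg):
    """Plane-parallel approximation, adequate well above the horizon."""
    return 1.0 / np.sin(np.radians(altitude_deg))

def extinction_corrected(m_inst, altitude_deg, k):
    """m_0 = m_inst - k * X, with k the nightly extinction coefficient (mag/airmass)."""
    return m_inst - k * airmass(altitude_deg)

# k estimated from the comparison star: slope of instrumental magnitude vs airmass.
X_comp = np.array([1.12, 1.30, 1.55, 1.92])          # illustrative airmasses
m_comp = np.array([4.861, 4.893, 4.938, 5.004])      # illustrative magnitudes
k = np.polyfit(X_comp, m_comp, 1)[0]

print("k =", round(k, 3), " corrected mag:", extinction_corrected(3.75, 55.0, k))
```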
NASA Technical Reports Server (NTRS)
1980-01-01
Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent, and exposure times from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns, and a complement of 45 Reseau marks and 12 fiducial marks. The weight of the OCPS as supplied, fully loaded, is 944 pounds, and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.
Space infrared telescope facility wide field and diffraction limited array camera (IRAC)
NASA Technical Reports Server (NTRS)
Fazio, G. G.
1986-01-01
IRAC focal plane detector technology was developed and studies of alternate focal plane configurations were supported. While any of the alternate focal planes under consideration would have a major impact on the Infrared Array Camera, it was possible to proceed with detector development and optical analysis research based on the proposed design since, to a large degree, the studies undertaken are generic to any SIRTF imaging instrument. Development of the proposed instrument was also important in a situation in which none of the alternate configurations has received the approval of the Science Working Group.
On the Quality of Point-Clouds Derived from Sfm-Photogrammetry Applied to UAS Imagery
NASA Astrophysics Data System (ADS)
Carbonneau, P.; James, T.
2014-12-01
Structure from Motion photogrammetry (SfM-photogrammetry) recently appeared in environmental sciences as an impressive tool allowing for the creation of topographic data from unstructured imagery. Several authors have tested the performance of SfM-photogrammetry against that of TLS or dGPS. Whilst the initial results were very promising, there is currently a growing awareness that systematic deformations occur in DEMs and point-clouds derived from SfM-photogrammetry. Notably, some authors have identified a systematic doming manifest as an increasing error with distance to the model centre. Simulation studies have confirmed that this error is due to errors in the calibration of camera distortions. This work aims to further investigate these effects in the presence of real data. We start with a dataset of 220 images acquired from a sUAS. After obtaining an initial self-calibration of the camera lens with Agisoft Photoscan, our method consists of applying systematic perturbations to two key lens parameters: the focal length and the k1 distortion parameter. For each perturbation, a point-cloud was produced and compared to LiDAR data. After deriving the mean and standard deviation of the error residuals (ɛ), a 2nd order polynomial surface was fitted to the errors point-cloud and the peak ɛ defined as the mathematical extremum of this surface. The results are presented in figure 1, which shows that lens perturbations can induce a range of errors with systematic behaviours. Peak ɛ is primarily controlled by k1, with a secondary control exerted by the focal length. These results allow us to state that, to limit the peak ɛ to 10 cm, the k1 parameter must be calibrated to within 0.00025 and the focal length to within 2.5 pixels (≈10 µm). This level of calibration accuracy can only be achieved with proper design of image acquisition and control network geometry. Our main point is therefore that SfM is not a bypass to a rigorous and well-informed photogrammetric approach. Users of SfM-photogrammetry will still require basic training and knowledge in the fundamentals of photogrammetry. This is especially true for applications where very small topographic changes need to be detected or where gradient-sensitive processes need to be modelled.
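A minimal numerical sketch of the surface-fitting step described above (hypothetical inputs x, y for point coordinates and eps for the SfM-minus-LiDAR residuals; not the authors' code): it fits the 2nd order polynomial surface by least squares and returns its extremum as the peak ɛ.

```python
import numpy as np

def doming_peak(x, y, eps):
    """Fit eps(x, y) = c0 + c1*x + c2*y + c3*x*y + c4*x**2 + c5*y**2 and
    return the fitted value and location of its stationary point (peak error)."""
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    c = np.linalg.lstsq(A, eps, rcond=None)[0]
    H = np.array([[2 * c[4], c[3]], [c[3], 2 * c[5]]])   # Hessian of the fitted surface
    x0, y0 = np.linalg.solve(H, [-c[1], -c[2]])          # stationary point
    peak = c @ np.array([1.0, x0, y0, x0 * y0, x0**2, y0**2])
    return peak, (x0, y0)

# Synthetic check: a 10 cm dome centred on the model centre plus 1 cm noise.
rng = np.random.default_rng(0)
x = rng.uniform(-100, 100, 2000)
y = rng.uniform(-100, 100, 2000)
eps = 0.10 - 1e-5 * (x**2 + y**2) + rng.normal(0, 0.01, x.size)
print(doming_peak(x, y, eps))   # peak close to 0.10 m near (0, 0)
```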
S-F graphic representation analysis of photoelectric facula focometer poroo-plate glass
NASA Astrophysics Data System (ADS)
Tong, Yilin; Han, Xuecai
2016-10-01
The focal length of an optical system is usually measured by the magnification method, in which a poroo-plate glass is used as the base element of the focometer. Based on an analysis of the accuracy of the magnification method for measuring the focal length of an optical lens, an expression relating the ruling span of the poroo-plate glass to the focal length of the measured optical system was derived, an efficient method for producing the S-F graph with AUTOCAD was developed, the principle for selecting the focometer parameters was analyzed, and applied examples for designing the poroo-plate glass from the S-F figure were given.
Moghaddam, Mohammadreza Salehi; Latifi, H; Shahraki, Hamidreza; Cheri, Mohammad Sadegh
2015-04-01
Microlenses with tunable focal length have wide applications in optofluidic devices. This work presents a numerical and experimental investigation of a tunable electrowetting-based concave lens. Optical properties such as the focal length of the lens and the visibility of images were investigated numerically and experimentally. A finite element analysis and a ZEMAX simulation were used to determine the surface profile and focal length of the lens. The results show that the theoretical surface profile and focal length of the lens are in good agreement with the experimental ones. The lens has a wide focal-length tuning range of 6.5 cm. Because the wedge-shaped polydimethylsiloxane (PDMS) layer serves as both the dielectric and the hydrophobic layer, a lower applied voltage is needed. A commercial program was used to find the focal length of the lens from the maximum visibility value by tuning the applied voltage.
Intraocular camera for retinal prostheses: Refractive and diffractive lens systems
NASA Astrophysics Data System (ADS)
Hauer, Michelle Christine
The focus of this thesis is on the design and analysis of refractive, diffractive, and hybrid refractive/diffractive lens systems for a miniaturized camera that can be surgically implanted in the crystalline lens sac and is designed to work in conjunction with current and future generation retinal prostheses. The development of such an intraocular camera (IOC) would eliminate the need for an external head-mounted or eyeglass-mounted camera. Placing the camera inside the eye would allow subjects to use their natural eye movements for foveation (attention) instead of more cumbersome head tracking, would notably aid in personal navigation and mobility, and would also be significantly more psychologically appealing from the standpoint of personal appearances. The capability for accommodation with no moving parts or feedback control is incorporated by employing camera designs that exhibit nearly infinite depth of field. Such an ultracompact optical imaging system requires a unique combination of refractive and diffractive optical elements and relaxed system constraints derived from human psychophysics. This configuration necessitates an extremely compact, short focal-length lens system with an f-number close to unity. Initially, these constraints appear highly aggressive from an optical design perspective. However, after careful analysis of the unique imaging requirements of a camera intended to work in conjunction with the relatively low pixellation levels of a retinal microstimulator array, it becomes clear that such a design is not only feasible, but could possibly be implemented with a single lens system.
The ideal subject distance for passport pictures.
Verhoff, Marcel A; Witzel, Carsten; Kreutz, Kerstin; Ramsthaler, Frank
2008-07-04
In an age of global combat against terrorism, the recognition and identification of people on document images is of increasing significance. Experiments and calculations have shown that the camera-to-subject distance - not the focal length of the lens - can have a significant effect on facial proportions. Modern passport pictures should be able to function as a reference image for automatic and manual picture comparisons. This requires a defined subject distance. It is completely unclear which subject distance, in the taking of passport photographs, is ideal for the recognition of the actual person. We show here that the camera-to-subject distance perceived as ideal depends on the face being photographed, even if the distance of 2 m was most frequently preferred. So far the problem of the ideal camera-to-subject distance for faces has only been approached through technical calculations. We have, for the first time, answered this question experimentally with a double-blind experiment. Even if there is apparently no ideal camera-to-subject distance valid for every face, 2 m can be proposed as ideal for the taking of passport pictures. The first step would be to specify a camera-to-subject distance for the taking of passport pictures within the standards. From an anthropological point of view it would be interesting to find out which facial features lead to the preference of a shorter camera-to-subject distance and which lead to the preference of a longer one.
Automatic Orientation of Large Blocks of Oblique Images
NASA Astrophysics Data System (ADS)
Rupnik, E.; Nex, F.; Remondino, F.
2013-05-01
Nowadays, multi-camera platforms combining nadir and oblique cameras are experiencing a revival. Due to their advantages such as ease of interpretation, completeness through mitigation of occluded areas, as well as system accessibility, they have found their place in numerous civil applications. However, automatic post-processing of such imagery still remains a topic of research. The configuration of the cameras poses a challenge to the traditional photogrammetric pipeline used in commercial software, and manual measurements are inevitable. For large image blocks this is certainly an impediment. In the theoretical part of the work we review three common least squares adjustment methods and recap possible ways to orient a multi-camera system. In the practical part we present an approach that successfully oriented a block of 550 images acquired with an imaging system composed of 5 cameras (Canon EOS 1D Mark III) with different focal lengths. The oblique cameras are rotated in the four looking directions (forward, backward, left and right) by 45° with respect to the nadir camera. The workflow relies only upon open-source software: a developed tool to analyse image connectivity and Apero to orient the image block. The benefits of the connectivity tool are twofold: reduced computational time and improved success of the bundle block adjustment. It exploits the georeferenced information provided by the Applanix system to constrain feature point extraction to relevant images only, and guides the concatenation of images during the relative orientation. Ultimately an absolute transformation is performed, resulting in mean re-projection residuals equal to 0.6 pix.
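The role of the connectivity tool can be illustrated with a simplified stand-in that uses only the georeferenced projection centres to discard image pairs that cannot overlap; the positions, the distance threshold and the function name are assumptions for illustration, not the tool described above.

```python
import numpy as np
from itertools import combinations

def candidate_pairs(cam_xyz, max_dist):
    """Keep only image pairs whose (georeferenced) projection centres are close
    enough to plausibly overlap; all other pairs are skipped entirely."""
    P = np.asarray(cam_xyz, dtype=float)
    return [(i, j) for i, j in combinations(range(len(P)), 2)
            if np.linalg.norm(P[i] - P[j]) <= max_dist]

# Example: three exposures along a strip, 40 m apart, with an 80 m overlap radius.
print(candidate_pairs([[0, 0, 500], [40, 0, 500], [120, 0, 500]], 80.0))
# -> [(0, 1), (1, 2)]
```

Restricting feature matching and relative orientation to such pairs is what reduces both the computational load and the risk of spurious ties in the bundle block adjustment.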
MTF measurement of IR optics in different temperature ranges
NASA Astrophysics Data System (ADS)
Bai, Alexander; Duncker, Hannes; Dumitrescu, Eugen
2017-10-01
Infrared (IR) optical systems are at the core of many military, civilian and manufacturing applications and perform mission critical functions. To reliably fulfill the demanding requirements imposed on today's high performance IR optics, highly accurate, reproducible and fast lens testing is of crucial importance. Testing the optical performance within different temperature ranges becomes key in many military applications. Due to highly complex IR applications in the aerospace, military and automotive industries, MTF measurement under realistic environmental conditions becomes more and more relevant. A Modulation Transfer Function (MTF) test bench with an integrated thermal chamber allows measuring several sample sizes in a temperature range from -40 °C to +120 °C. To reach reliable measurement results under these difficult conditions, a specially developed temperature-stable design including an insulating vacuum is used. The main function of this instrument is the measurement of the MTF both on- and off-axis at up to +/-70° field angle, as well as measurement of effective focal length, flange focal length and distortion. The vertical configuration of the system guarantees a small overall footprint. By integrating a high-resolution IR camera with a focal plane array (FPA) in the detection unit, time consuming measurement procedures such as scanning slit with liquid nitrogen cooled detectors can be avoided. The specified absolute accuracy of +/- 3% MTF is validated using internationally traceable reference optics. Together with a complete and intuitive software solution, this makes the instrument a turn-key device for today's state-of-the-art optical testing.
Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain
NASA Astrophysics Data System (ADS)
Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.
2018-04-01
High resolution satellites with longer focal lengths and larger apertures have been widely used in recent years for georeferencing observed scenes. A consistent end-to-end model of the high resolution remote sensing satellite geometric chain is presented, which consists of the scene, the three line array camera, the platform including attitude and position information, the time system and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for the geolocation accuracy is put forward by introducing a new index, the angle between the camera and the star tracker. The model is rigorously validated by simulating the geolocation accuracy according to the test method used for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results. The geolocation accuracy can be improved by about 7 m by the integrated design. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.
Determination of the paraxial focal length using Zernike polynomials over different apertures
NASA Astrophysics Data System (ADS)
Binkele, Tobias; Hilbig, David; Henning, Thomas; Fleischmann, Friedrich
2017-02-01
The paraxial focal length is still the most important parameter in the design of a lens. As presented at SPIE Optics + Photonics 2016, the measured focal length is a function of the aperture. The paraxial focal length can be found when the aperture approaches zero. In this work, we investigate the dependency of the Zernike polynomials on the aperture size with respect to 3D space. By this, conventional wavefront measurement systems that apply Zernike polynomial fitting (e.g. a Shack-Hartmann sensor) can be used to determine the paraxial focal length, too. Since the Zernike polynomials are orthogonal over a unit circle, the aperture used in the measurement has to be normalized. As the aperture is shrunk and the normalization is maintained, the Zernike coefficients change. The relation between these changes and the paraxial focal length is investigated. The dependency of the focal length on the aperture size is derived analytically and evaluated by simulation and measurement of a strong focusing lens. The measurements are performed using experimental ray tracing and a Shack-Hartmann sensor. Using experimental ray tracing for the measurements, the aperture can be chosen easily. Regarding the measurements with the Shack-Hartmann sensor, the aperture size is fixed. Thus, the Zernike polynomials have to be adapted to different aperture sizes by the proposed method. In this way, the paraxial focal length can be determined from the measurements in both cases.
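A small simulation of the idea, under two stated assumptions: the Noll-normalized defocus polynomial Z4 = √3(2ρ² − 1), and the paraxial relation f ≈ R²/(4√3·c4) between the defocus coefficient c4 (in length units) and the focal length for an aperture of radius R. The optical power estimated at shrinking apertures is extrapolated to zero aperture; the lens parameters are invented example values, not the authors' measurement.

```python
import numpy as np

f_true = 25.0          # mm, assumed paraxial focal length of a strong lens
a_sph = 4e-6           # mm^-3, assumed spherical-aberration term in the wavefront sag
N = 512
u = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(u, u)

powers, R2 = [], []
for R in (4.0, 3.0, 2.0, 1.0):                      # aperture radii in mm
    x, y = R * X, R * Y
    r2 = x**2 + y**2
    mask = r2 <= R**2
    W = r2 / (2 * f_true) + a_sph * r2**2           # defocus sag + spherical aberration
    rho2 = r2 / R**2                                # normalized radial coordinate squared
    Z4 = np.sqrt(3.0) * (2 * rho2 - 1)              # Noll-normalized defocus polynomial
    c4 = np.mean((W * Z4)[mask])                    # orthonormal projection over the disc
    f_est = R**2 / (4 * np.sqrt(3.0) * c4)          # aperture-dependent focal length
    powers.append(1.0 / f_est)
    R2.append(R**2)

# With this wavefront the estimated power is linear in R^2, so extrapolate to R -> 0.
slope, intercept = np.polyfit(R2, powers, 1)
print("extrapolated paraxial focal length ~", 1.0 / intercept, "mm (true:", f_true, ")")
```

The extrapolation of the aperture-dependent estimates back to zero aperture is what recovers the paraxial value in the presence of higher-order aberrations.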
Bennett, Charles L.
1996-01-01
An imaging Fourier transform spectrometer (10, 210) having a Fourier transform infrared spectrometer (12) providing a series of images (40) to a focal plane array camera (38). The focal plane array camera (38) is clocked to a multiple of zero crossing occurrences as caused by a moving mirror (18) of the Fourier transform infrared spectrometer (12) and as detected by a laser detector (50) such that the frame capture rate of the focal plane array camera (38) corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer (12). The images (40) are transmitted to a computer (45) for processing such that representations of the images (40) as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor (60) or otherwise stored and manipulated by the computer (45).
High-precision method of binocular camera calibration with a distortion model.
Li, Weimin; Shan, Siyu; Liu, Hui
2017-03-10
A high-precision camera calibration method for a binocular stereo vision system based on a multi-view template and alternative bundle adjustment is presented in this paper. The proposed method is carried out by taking several photos of a specially designed calibration template that has diverse encoded points in different orientations. In this paper, the method utilizes the existing algorithm for monocular camera calibration to obtain the initialization, which involves a camera model including radial and tangential lens distortion. We created a reference coordinate system based on the left camera coordinates to optimize the intrinsic parameters of the left camera through alternative bundle adjustment to obtain optimal values. Then, optimal intrinsic parameters of the right camera can be obtained through alternative bundle adjustment when we create a reference coordinate system based on the right camera coordinates. We also used all acquired intrinsic parameters to optimize the extrinsic parameters. Thus, the optimal lens distortion parameters and intrinsic and extrinsic parameters were obtained. Synthetic and real data were used to test the method. The simulation results demonstrate that the maximum mean absolute relative calibration errors are about 3.5e-6 and 1.2e-6 for the focal length and the principal point, respectively, under zero-mean Gaussian noise with 0.05 pixels standard deviation. The real result shows that the reprojection error of our model is about 0.045 pixels with a relative standard deviation of 1.0e-6 over the intrinsic parameters. The proposed method is convenient, cost-efficient, highly precise, and simple to carry out.
GEMINI-TITAN (GT)-10 - MISC. - INFLIGHT (MILKY WAY) - OUTER SPACE
1966-08-01
S66-45314 (19 July 1966) --- Ultraviolet spectra of stars in the region of the Southern Cross. These objective-grating spectra were obtained by astronauts John W. Young and Michael Collins during the Gemini-10 stand-up EVA on July 19, 1966, with a 70mm Maurer camera and its f/3.3 lens. The spectra extend from 2,200 angstroms to about 4,000 angstroms. The spacecraft was docked to the horizon-stabilized Agena-10, giving an apparent rotation of the field resulting from the four-degree-per-minute orbital motion during the 20-second exposure time. Photo credit: NASA
GEMINI-TITAN (GT)-10 - MISC. - INFLIGHT (MILKY WAY) - OUTER SPACE
1966-08-01
S66-45328 (19 July 1966) --- Ultraviolet spectra of stars in the Carina-Vela region of the southern Milky Way. These objective-grating spectra were obtained by astronauts John W. Young and Michael Collins during the Gemini-10 stand-up EVA on July 19, 1966, with a 70mm Maurer camera and its f/3.3 lens. The spectra extend from 2,200 angstroms to about 4,000 angstroms. The spacecraft was docked to the horizon-stabilized Agena-10, giving an apparent rotation of the field resulting from the four-degree-per-minute orbital motion during the 20-second exposure time. Photo credit: NASA
Focal Length Controllable Ultrasonic Transducer Using Bimorph-Type Bending Actuator
NASA Astrophysics Data System (ADS)
Chae, Min-Ku; Kim, Moo-Joon; Ha, Kang-Lyeol; Lee, Chai-Bong
2003-05-01
Using a bimorph-type bending actuator, we propose a new method for controlling the focal length of a transducer with a DC voltage. We designed two focal-length-controllable ultrasonic transducers with actuators, a line-focus and a point-focus transducer. A polyvinylidene fluoride (PVDF) piezoelectric polymer film is used for transmitting and receiving ultrasonic signals. Using the new method, investigation of the underwater acoustic field confirmed that the focal length can be controlled to within 10% of the radius of the transducer curvature.
Focal length hysteresis of a double-liquid lens based on electrowetting
NASA Astrophysics Data System (ADS)
Peng, Runling; Wang, Dazhen; Hu, Zhiwei; Chen, Jiabi; Zhuang, Songlin
2013-02-01
In this paper, an extended Young equation especially suited for an ideal cylindrical double-liquid variable-focus lens is derived by means of an energy minimization method. Based on the extended Young equation, a kind of focal length hysteresis effect is introduced into the double-liquid variable-focus lens. Such an effect can be explained theoretically by adding a force of friction to the tri-phase contact line. Theoretical analysis shows that the focal length at a particular voltage can be different depending on whether the applied voltage is increasing or decreasing, that is, there is a focal length hysteresis effect. Moreover, the focal length at a particular voltage must be larger when the voltage is rising than when it is dropping. These conclusions are also verified by experiments.
The Multi-site All-Sky CAmeRA (MASCARA). Finding transiting exoplanets around bright (mV < 8) stars
NASA Astrophysics Data System (ADS)
Talens, G. J. J.; Spronck, J. F. P.; Lesage, A.-L.; Otten, G. P. P. L.; Stuik, R.; Pollacco, D.; Snellen, I. A. G.
2017-05-01
This paper describes the design, operations, and performance of the Multi-site All-Sky CAmeRA (MASCARA). Its primary goal is to find new exoplanets transiting bright stars, 4 < mV < 8, by monitoring the full sky. MASCARA consists of one northern station on La Palma, Canary Islands (fully operational since February 2015), one southern station at La Silla Observatory, Chile (operational from early 2017), and a data centre at Leiden Observatory in the Netherlands. Both MASCARA stations are equipped with five interline CCD cameras using wide field lenses (24 mm focal length) with fixed pointings, which together provide coverage down to airmass 3 of the local sky. The interline CCD cameras allow for back-to-back exposures, taken at fixed sidereal times with exposure times of 6.4 sidereal seconds. The exposures are short enough that the motion of stars across the CCD does not exceed one pixel during an integration. Astrometry and photometry are performed on-site, after which the resulting light curves are transferred to Leiden for further analysis. The final MASCARA archive will contain light curves for 70 000 stars down to mV = 8.4, with a precision of 1.5% per 5 minutes at mV = 8.
Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation
Bell, J.F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.N.; Arneson, H.M.; Brown, D.; Collins, S.A.; Dingizian, A.; Elliot, S.T.; Hagerott, E.C.; Hayes, A.G.; Johnson, M.J.; Johnson, J. R.; Joseph, J.; Kinch, K.; Lemmon, M.T.; Morris, R.V.; Scherr, L.; Schwochert, M.; Shepard, M.K.; Smith, G.H.; Sohl-Dickstein, J. N.; Sullivan, R.J.; Sullivan, W.T.; Wadsworth, M.
2003-01-01
The Panoramic Camera (Pancam) investigation is part of the Athena science payload launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The scientific goals of the Pancam investigation are to assess the high-resolution morphology, topography, and geologic context of each MER landing site, to obtain color images to constrain the mineralogic, photometric, and physical properties of surface materials, and to determine dust and aerosol opacity and physical properties from direct imaging of the Sun and sky. Pancam also provides mission support measurements for the rovers, including Sun-finding for rover navigation, hazard identification and digital terrain modeling to help guide long-term rover traverse decisions, high-resolution imaging to help guide the selection of in situ sampling targets, and acquisition of education and public outreach products. The Pancam optical, mechanical, and electronics design was optimized to achieve these science and mission support goals. Pancam is a multispectral, stereoscopic, panoramic imaging system consisting of two digital cameras mounted on a mast 1.5 m above the Martian surface. The mast allows Pancam to image the full 360° in azimuth and ±90° in elevation. Each Pancam camera utilizes a 1024 × 1024 active imaging area frame transfer CCD detector array. The Pancam optics have an effective focal length of 43 mm and a focal ratio of f/20, yielding an instantaneous field of view of 0.27 mrad/pixel and a field of view of 16° × 16°. Each rover's two Pancam "eyes" are separated by 30 cm and have a 1° toe-in to provide adequate stereo parallax. Each eye also includes a small eight position filter wheel to allow surface mineralogic studies, multispectral sky imaging, and direct Sun imaging in the 400-1100 nm wavelength region. Pancam was designed and calibrated to operate within specifications on Mars at temperatures from -55°C to +5°C. An onboard calibration target and fiducial marks provide the capability to validate the radiometric and geometric calibration on Mars. Copyright 2003 by the American Geophysical Union.
Wick, David V.
2005-12-20
An active optical zoom system changes the magnification (or effective focal length) of an optical imaging system by utilizing two or more active optics in a conventional optical system. The system can create relatively large changes in system magnification with very small changes in the focal lengths of individual active elements by leveraging the optical power of the conventional optical elements (e.g., passive lenses and mirrors) surrounding the active optics. The active optics serve primarily as variable focal-length lenses or mirrors, although adding other aberrations enables increased utility. The active optics can either be liquid crystal spatial light modulators (LC SLMs), used in a transmissive optical zoom system, or deformable mirrors (DMs), used in a reflective optical zoom system. By appropriately designing the optical system, the variable focal-length lenses or mirrors can provide the flexibility necessary to change the overall system focal length (i.e., effective focal length), and therefore magnification, that is normally accomplished with mechanical motion in conventional zoom lenses. The active optics can provide additional flexibility by allowing magnification to occur anywhere within the FOV of the system, not just on-axis as in a conventional system.
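The leverage of a weak active element by surrounding passive optics can be illustrated with the thin-lens combination formula 1/f = 1/f1 + 1/f2 − d/(f1·f2); the focal lengths and spacing below are hypothetical example values, not taken from this system.

```python
def combined_efl(f1, f2, d):
    """Effective focal length of two thin lenses of focal lengths f1, f2 (mm)
    separated by a distance d (mm)."""
    return 1.0 / (1.0 / f1 + 1.0 / f2 - d / (f1 * f2))

f2, d = 50.0, 250.0                      # fixed (passive) lens and spacing, mm
for f_active in (1e9, 2000.0, -2000.0):  # near-flat, weakly positive, weakly negative
    print(f_active, "->", round(combined_efl(f_active, f2, d), 1), "mm system EFL")
```

With these numbers the factor (1 − d/f2) amplifies the half-diopter swing of the nearly afocal active element into roughly a ±10% change of the system focal length, without moving any element.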
Challenges and solutions for high performance SWIR lens design
NASA Astrophysics Data System (ADS)
Gardner, M. C.; Rogers, P. J.; Wilde, M. F.; Cook, T.; Shipton, A.
2016-10-01
Shortwave infrared (SWIR) cameras are becoming increasingly attractive due to the improving size and resolution and the decreasing prices of InGaAs focal plane arrays (FPAs). The rapid development of competitively priced HD performance SWIR cameras has not been matched in SWIR imaging lenses, with the result that the lens is now more likely to be the limiting factor in imaging quality than the FPA. Adapting existing lens designs from the visible region by re-coating for SWIR will improve total transmission, but diminished image quality metrics such as MTF, and in particular degraded large field angle performance such as vignetting, field curvature and distortion, are serious consequences. To meet this challenge, original SWIR solutions are presented, including a wide field of view fixed focal length lens for commercial machine vision (CMV) and a wide angle, small, lightweight defence lens, and their relevant design considerations are discussed. Issues restricting suitable glass types are examined. The index and dispersion properties at SWIR wavelengths can differ significantly from their visible values, resulting in unusual glass combinations when matching doublet elements. The chosen materials simultaneously allow athermalization of the design and provide matched CTEs within the elements of doublets. Recently, thinned backside-illuminated InGaAs devices have made Vis-SWIR cameras viable. The SWIR band is sufficiently close to the visible that the same constituent materials can be used for AR coatings covering both bands. Keeping the lens short and the mass low can easily result in high incidence angles, which in turn complicates coating design, especially when extended beyond SWIR into the visible band. This paper also explores the potential performance of wideband Vis-SWIR AR coatings.
Broadband Achromatic Telecentric Lens
NASA Technical Reports Server (NTRS)
Mouroulis, Pantazis
2007-01-01
A new type of lens design features broadband achromatic performance as well as telecentricity, using a minimum number of spherical elements. With appropriate modifications, the lens design form can be tailored to cover the range of response of the focal-plane array, from Si (400-1,000 nm) to InGaAs (400-1,700 or 2,100 nm) or InSb/HgCdTe reaching to 2,500 nm. For reference, lenses typically are achromatized over the visible wavelength range of 480-650 nm. In remote sensing applications, there is a need for broadband achromatic telescopes, normally satisfied with mirror-based systems. However, mirror systems are not always feasible due to size or geometry restrictions. They also require expensive aspheric surfaces. Non-obscured mirror systems can be difficult to align and have a limited (essentially one-dimensional) field of view. Centrally obscured types have a two-dimensional but very limited field in addition to the obscuration. Telecentricity is a highly desirable property for matching typical spectrometer types, as well as for reducing the variation of the angle of incidence and cross-talk on the detector for simple camera types. This rotationally symmetric telescope with no obscuration and using spherical surfaces and selected glass types fills a need in the range of short focal lengths. It can be used as a compact front unit for a matched spectrometer, as an ultra-broadband camera objective lens, or as the optics of an integrated camera/spectrometer in which the wavelength information is obtained by the use of strip or linear variable filters on the focal plane array. This kind of camera and spectrometer system can find applications in remote sensing, as well as in-situ applications for geological mapping and characterization of minerals, ecological studies, and target detection and identification through spectral signatures. Commercially, the lens can be used in quality-control applications via spectral analysis. The lens design is based on the rear landscape lens with the aperture stop in front of all elements. This allows sufficient room for telecentricity in addition to making the stop easily accessible. The crucial design features are the use of a doublet with an ultra-low dispersion glass (fluorite or S-FPL53), and the use of a strong negative element, which enables a flat field and telecentricity in conjunction with the last (field lens) element. The field lens also can be designed to be in contact with the array, a feature that is desirable in some applications. The lens has a 20° field of view, for a 50-mm focal length, and is corrected over the range of wavelengths of 450-2,300 nm. Transverse color, which is the most pernicious aberration for spectroscopic work, is controlled at the level of 1 µm or below at the 0.7 field position and 5 µm at full field. The maximum chief ray angle is less than 1.7°, providing good telecentricity. An additional feature of this lens is that it is made exclusively with glasses that provide good transmission up to 2,300 nm and even some transmission to 2,500 nm; thus, the lens can be used in applications that cover the entire solar-reflected spectrum. Alternative realizations are possible that provide enhanced resolution and even less transverse color over a narrower wavelength range.
The GCT camera for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Bose, R.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Buckley, J.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jankowsky, D.; Jegouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Laporte, P.; Leach, S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Minaya, I. A.; Mohrmann, L.; Molyneux, P.; Moore, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayede, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Varner, G.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium
2017-12-01
The Gamma Cherenkov Telescope (GCT) is one of the designs proposed for the Small Sized Telescope (SST) section of the Cherenkov Telescope Array (CTA). The GCT uses dual-mirror optics, resulting in a compact telescope with good image quality and a large field of view with a smaller, more economical, camera than is achievable with conventional single mirror solutions. The photon counting GCT camera is designed to record the flashes of atmospheric Cherenkov light from gamma and cosmic ray initiated cascades, which last only a few tens of nanoseconds. The GCT optics require that the camera detectors follow a convex surface with a radius of curvature of 1 m and a diameter of 35 cm, which is approximated by tiling the focal plane with 32 modules. The first camera prototype is equipped with multi-anode photomultipliers, each comprising an 8×8 array of 6×6 mm2 pixels to provide the required angular scale, adding up to 2048 pixels in total. Detector signals are shaped, amplified and digitised by electronics based on custom ASICs that provide digitisation at 1 GSample/s. The camera is self-triggering, retaining images where the focal plane light distribution matches predefined spatial and temporal criteria. The electronics are housed in the liquid-cooled, sealed camera enclosure. LED flashers at the corners of the focal plane provide a calibration source via reflection from the secondary mirror. The first GCT camera prototype underwent preliminary laboratory tests last year. In November 2015, the camera was installed on a prototype GCT telescope (SST-GATE) in Paris and was used to successfully record the first Cherenkov light of any CTA prototype, and the first Cherenkov light seen with such a dual-mirror optical system. A second full-camera prototype based on Silicon Photomultipliers is under construction. Up to 35 GCTs are envisaged for CTA.
Performance of the e2v 1.2 GPix cryogenic camera for the J-PAS 2.5m survey telescope
NASA Astrophysics Data System (ADS)
Robbins, M. S.; Bastable, M.; Bates, A.; Dryer, M.; Eames, S.; Fenemore-Jones, G.; Haddow, G.; Jorden, P. R.; Lane, B.; Marin-Franch, A.; Mortimer, J.; Palmer, I.; Puttay, N.; Renshaw, R.; Smith, M.; Taylor, K.; Tearle, J.; Weston, P.; Wheeler, P.; Worley, J.
2016-08-01
The J-PAS project will perform a five-year survey of the northern sky from a new 2.5 m telescope in Teruel, Spain. In this paper the build and factory testing of the commercially supplied cryogenic camera are described. The 1.2 Giga-pixel focal plane is contained within a novel liquid-nitrogen cooled vacuum cryostat, which maintains the flatness of the cooled, 0.45 m diameter focal plane to better than 27 μm peak to valley. The cooling system controls the focal plane to a temperature of -100 °C with a variation across the focal plane of better than 2.5 °C and a stability of better than +/- 0.5 °C over the long periods of operation required. The proximity drive electronics achieves a total system level noise performance better than 5 e- from the 224-channel CCD system.
Comparison Between RGB and RGB-D Cameras for Supporting Low-Cost GNSS Urban Navigation
NASA Astrophysics Data System (ADS)
Rossi, L.; De Gaetani, C. I.; Pagliari, D.; Realini, E.; Reguzzoni, M.; Pinto, L.
2018-05-01
A pure GNSS navigation is often unreliable in urban areas because of the presence of obstructions, which prevent correct reception of the satellite signal. The bridging of GNSS outages, as well as the reconstruction of the vehicle attitude, can be achieved by using complementary information, such as visual data acquired by RGB-D or RGB cameras. In this work, the possibility of integrating low-cost GNSS and visual data by means of an extended Kalman filter has been investigated. The focus is on the comparison between the use of RGB-D and RGB cameras. In particular, a Microsoft Kinect device (second generation) and a mirrorless Canon EOS M RGB camera have been compared. The former is an interesting RGB-D camera because of its low cost, ease of use and raw data accessibility. The latter has been selected for the high quality of the acquired images and for the possibility of mounting fixed focal length lenses with a lower weight and cost with respect to a reflex camera. The designed extended Kalman filter takes as input the GNSS-only trajectory and the relative orientation between subsequent pairs of images. Depending on the visual data acquisition system, the filter differs, because RGB-D cameras acquire both RGB and depth data, allowing the scale problem, which is typical of image-only solutions, to be solved. The two systems and filtering approaches were assessed by ad-hoc experimental tests, showing that the use of a Kinect device to support a u-blox low-cost receiver led to a trajectory with decimeter accuracy, about 15% better than the one obtained using the Canon EOS M camera.
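A minimal sketch of such a filter with a simplified planar state [x, y, heading]: the prediction uses the per-frame visual motion (step length and heading change, with scale assumed known), and the update uses GNSS position fixes; NaN epochs stand for outages. All variable names, noise levels and the synthetic example are assumptions, not the paper's implementation.

```python
import numpy as np

def ekf_gnss_vision(gnss_xy, steps, dthetas, gnss_sigma=0.5, step_sigma=0.1, dth_sigma=0.01):
    """2-D EKF with state [x, y, heading]: predict with per-frame visual motion
    (step length and heading change), update with GNSS position fixes; NaN rows
    in gnss_xy mark outages that are bridged by dead reckoning."""
    x = np.array([gnss_xy[0, 0], gnss_xy[0, 1], 0.0])
    P = np.diag([gnss_sigma**2, gnss_sigma**2, np.deg2rad(20.0)**2])
    H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    R = np.eye(2) * gnss_sigma**2
    Q = np.diag([step_sigma**2, step_sigma**2, dth_sigma**2])
    out = [x.copy()]
    for k in range(1, len(gnss_xy)):
        d, dth, th = steps[k], dthetas[k], x[2]
        # Prediction: dead-reckon with the camera-derived relative motion
        x = np.array([x[0] + d * np.cos(th), x[1] + d * np.sin(th), th + dth])
        F = np.array([[1.0, 0.0, -d * np.sin(th)],
                      [0.0, 1.0,  d * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        P = F @ P @ F.T + Q
        # Update: apply the GNSS fix only when the signal is available
        if not np.any(np.isnan(gnss_xy[k])):
            innov = gnss_xy[k] - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ innov
            P = (np.eye(3) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

# Straight 1 m steps heading east, with a three-epoch GNSS outage bridged by the camera.
gnss = np.array([[0, 0], [1, 0], [np.nan, np.nan], [np.nan, np.nan],
                 [np.nan, np.nan], [5, 0]], dtype=float)
traj = ekf_gnss_vision(gnss, steps=[0, 1, 1, 1, 1, 1], dthetas=[0, 0, 0, 0, 0, 0])
print(np.round(traj[:, :2], 2))
```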
Optimal focusing conditions of lenses using Gaussian beams
Franco, Juan Manuel; Cywiak, Moisés; Cywiak, David; ...
2016-04-02
By using the analytical equations of the propagation of Gaussian beams in which truncation exhibits negligible consequences, we describe a method that uses the value of the focal length of a focusing lens to classify its focusing performance. In this study, we show that for different distances between a laser and a focusing lens there are different planes where best focusing conditions can be obtained and we demonstrate how the value of the focal length impacts the lens focusing properties. To perform the classification we introduce the term delimiting focal length. As the value of the focal length used in wave propagation theory is nominal and difficult to measure accurately, we describe an experimental approach to calculate its value matching our analytical description. Finally, we describe possible applications of the results for characterizing Gaussian sources, for measuring focal lengths and/or alternatively for characterizing piston-like movements.
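For context, the standard thin-lens Gaussian-beam relations that locate the best-focus plane for a given laser-to-lens distance can be written in a few lines; the wavelength, waist and focal length below are assumed example values, not the paper's experimental parameters.

```python
import numpy as np

def gaussian_focus(w0, wavelength, s, f):
    """Given an input waist w0 located a distance s before a thin lens of focal
    length f (all in mm), return the distance from the lens to the new waist and
    the new waist radius (standard Gaussian-beam thin-lens relations)."""
    zR = np.pi * w0**2 / wavelength                  # Rayleigh range of the input beam
    denom = (s - f)**2 + zR**2
    s_out = f + (s - f) * f**2 / denom               # location of the output waist
    m = f / np.sqrt(denom)                           # transverse magnification
    return s_out, m * w0

wavelength = 633e-6                                  # mm (HeNe line)
w0, f = 0.4, 100.0                                   # mm
for s in (200.0, 500.0, 1000.0):                     # laser-to-lens distances, mm
    s_out, w_out = gaussian_focus(w0, wavelength, s, f)
    print(f"s = {s:6.0f} mm -> best focus {s_out:7.2f} mm after lens, waist {1e3*w_out:5.1f} um")
```

The output shows how the best-focus plane shifts with the laser-to-lens distance, which is the behaviour the classification above exploits.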
Studies on a silicon-photomultiplier-based camera for Imaging Atmospheric Cherenkov Telescopes
NASA Astrophysics Data System (ADS)
Arcaro, C.; Corti, D.; De Angelis, A.; Doro, M.; Manea, C.; Mariotti, M.; Rando, R.; Reichardt, I.; Tescaro, D.
2017-12-01
Imaging Atmospheric Cherenkov Telescopes (IACTs) represent a class of instruments which are dedicated to the ground-based observation of cosmic VHE gamma ray emission based on the detection of the Cherenkov radiation produced in the interaction of gamma rays with the Earth atmosphere. One of the key elements of such instruments is a pixelized focal-plane camera consisting of photodetectors. To date, photomultiplier tubes (PMTs) have been the common choice given their high photon detection efficiency (PDE) and fast time response. Recently, silicon photomultipliers (SiPMs) are emerging as an alternative. This rapidly evolving technology has strong potential to become superior to that based on PMTs in terms of PDE, which would further improve the sensitivity of IACTs, and see a price reduction per square millimeter of detector area. We are working to develop a SiPM-based module for the focal-plane cameras of the MAGIC telescopes to probe this technology for IACTs with large focal plane cameras of an area of few square meters. We will describe the solutions we are exploring in order to balance a competitive performance with a minimal impact on the overall MAGIC camera design using ray tracing simulations. We further present a comparative study of the overall light throughput based on Monte Carlo simulations and considering the properties of the major hardware elements of an IACT.
Extended depth of field system for long distance iris acquisition
NASA Astrophysics Data System (ADS)
Chen, Yuan-Lin; Hsieh, Sheng-Hsun; Hung, Kuo-En; Yang, Shi-Wen; Li, Yung-Hui; Tien, Chung-Hao
2012-10-01
Using biometric signatures for identity recognition has been practiced for centuries. Recently, iris recognition systems have attracted much attention due to their high accuracy and high stability. The texture feature of the iris provides a signature that is unique for each subject. Currently most commercial iris recognition systems acquire images at less than 50 cm, which is a serious constraint that needs to be broken if we want to use them for airport access or entrances that require a high turnover rate. In order to capture the iris patterns from a distance, in this study, we developed a telephoto imaging system with image processing techniques. By using a cubic phase mask positioned in front of the camera, the point spread function was kept constant over a wide range of defocus. With an adequate decoding filter, the blurred image was restored, and a working distance between the subject and the camera of over 3 m was achieved with a 500 mm focal length and an F/6.3 aperture. The simulation and experimental results validated the proposed scheme, where the depth of focus of the iris camera was extended to three times that of the traditional optics, while keeping sufficient recognition accuracy.
KAPAO Prime: Design and Simulation
NASA Astrophysics Data System (ADS)
McGonigle, Lorcan
2012-11-01
KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration from Pomona College's telescope atop Table Mountain. We present here the final optical system, referred to as Prime, designed in Zemax Optical Design Software. Prime is characterized by diffraction limited imaging over the full 73'' field of view of our Andor camera at f/33 as well as for our NIR Xenics camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of the Andor camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity. Modeling of the temperature deformation of the breadboard in SolidWorks revealed 70 micron contractions along the edges of the board for a decrease of 75 °F; when applied to OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nanometers of wavefront error. Flexure modeling of the breadboard due to gravity is on-going. We hope to begin alignment and testing of Prime in Q1 2013.
Meteor44 Video Meteor Photometry
NASA Technical Reports Server (NTRS)
Swift, Wesley R.; Suggs, Robert M.; Cooke, William J.
2004-01-01
Meteor44 is a software system developed at MSFC for the calibration and analysis of video meteor data. The dynamic range of the (8-bit) video data is extended by approximately 4 magnitudes for both meteors and stellar images using saturation compensation. Camera and lens specific saturation compensation coefficients are derived from artificial variable star laboratory measurements. Saturation compensation significantly increases the number of meteors with measured intensity and improves the estimation of the meteoroid mass distribution. Astrometry is automated to determine each image's plate coefficients using appropriate star catalogs. The images are simultaneously intensity calibrated from the contained stars to determine the photon sensitivity and the saturation level referenced above the atmosphere. The camera's spectral response is used to compensate for stellar color index and typical meteor spectra in order to report meteor light curves in traditional visual magnitude units. Recent efforts include improved camera calibration procedures, long focal length "streak" meteor photometry and two-station track determination. Meteor44 has been used to analyze data from the 2001, 2002 and 2003 MSFC Leonid observational campaigns as well as several lesser showers. The software is interactive and can be demonstrated using data from recent Leonid campaigns.
Front-end multiplexing—applied to SQUID multiplexing: Athena X-IFU and QUBIC experiments
NASA Astrophysics Data System (ADS)
Prele, D.
2015-08-01
As in the digital camera market, where sensor resolution has grown to megapixels, all scientific and high-tech imagers (whatever the wavelength, from the radio to the X-ray range) tend toward ever larger pixel counts. The constraints on front-end signal transmission therefore increase as well. An almost unavoidable solution to simplify the integration of large arrays of pixels is front-end multiplexing. Moreover, "simple" and "efficient" techniques allow the integration of read-out multiplexers in the focal plane itself. For instance, CCD (Charge Coupled Device) technology has boosted the number of pixels in digital cameras. Indeed, it is precisely a planar technology which integrates both the sensors and a front-end multiplexed readout. In this context, front-end multiplexing techniques will be discussed for a better understanding of their advantages and their limits. Finally, the cases of astronomical instruments in the millimeter and X-ray ranges using SQUIDs (Superconducting QUantum Interference Devices) will be described.
Three dimensional measurement with an electrically tunable focused plenoptic camera
NASA Astrophysics Data System (ADS)
Lei, Yu; Tong, Qing; Xin, Zhaowei; Wei, Dong; Zhang, Xinyu; Liao, Jing; Wang, Haiwei; Xie, Changsheng
2017-03-01
A liquid crystal microlens array (LCMLA) with an arrayed microhole pattern electrode based on nematic liquid crystal materials using a fabrication method including traditional UV-photolithography and wet etching is presented. Its focusing performance is measured under different voltage signals applied between the electrodes of the LCMLA. The experimental outcome shows that the focal length of the LCMLA can be tuned easily by only changing the root mean square value of the voltage signal applied. The developed LCMLA is further integrated with a main lens and an imaging sensor to construct a LCMLA-based focused plenoptic camera (LCFPC) prototype. The focused range of the LCFPC can be shifted electrically along the optical axis of the imaging system. The principles and methods for acquiring several key parameters such as three dimensional (3D) depth, positioning, and motion expression are given. The depth resolution is discussed in detail. Experiments are carried out to obtain the static and dynamic 3D information of objects chosen.
Average focal length and power of a section of any defined surface.
Kaye, Stephen B
2010-04-01
To provide a method to allow calculation of the average focal length and power of a lens through a specified meridian of any defined surface, not limited to the paraxial approximations. University of Liverpool, Liverpool, United Kingdom. Functions were derived to model back-vertex focal length and representative power through a meridian containing any defined surface. Average back-vertex focal length was based on the definition of the average of a function, using the angle of incidence as an independent variable. Univariate functions allowed determination of average focal length and power through a section of any defined or topographically measured surface of a known refractive index. These functions incorporated aberrations confined to the section. The proposed method closely approximates the average focal length, and by inference power, of a section (meridian) of a surface to a single or scalar value. It is not dependent on the paraxial and other nonconstant approximations and includes aberrations confined to that meridian. A generalization of this method to include all orthogonal and oblique meridians is needed before a comparison with measured wavefront values can be made. Copyright (c) 2010 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
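As a toy counterpart of that averaging definition, the sketch below traces rays exactly through a single spherical refracting surface and averages the back focal distance over the angle of incidence; the surface radius and refractive index are arbitrary example values, not the paper's back-vertex lens model.

```python
import numpy as np

def back_focal_distance(h, R, n):
    """Exact distance from the surface vertex at which a ray, parallel to the
    axis at height h, crosses the axis after refraction at a single spherical
    surface of radius R bounding a medium of index n (incident medium: air)."""
    i = np.arcsin(h / R)                     # angle of incidence at the surface
    r = np.arcsin(np.sin(i) / n)             # Snell's law
    sag = R - np.sqrt(R**2 - h**2)           # axial position of the hit point
    return sag + h / np.tan(i - r)           # axis crossing of the refracted ray

R, n = 50.0, 1.5                             # mm, refractive index (example values)
theta = np.linspace(1e-4, np.deg2rad(25.0), 400)     # angles of incidence sampled
f_theta = back_focal_distance(R * np.sin(theta), R, n)

# For a uniform sampling in theta, the mean of f(theta) approximates the integral
# of the function divided by the angular range, i.e. the average focal length.
f_avg = f_theta.mean()
print("paraxial value:", f_theta[0], " average over 0-25 deg:", f_avg)
```

The average is shorter than the paraxial value because spherical aberration pulls the focus in at larger angles of incidence, which is the kind of non-paraxial effect the scalar average is meant to capture.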
An astronomy camera for low background applications in the 1.0 to 2.5 µm spectral region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaki, S.A.; Bailey, G.C.; Hagood, R.W.
1989-02-01
A short wavelength (1.0-2.5 µm) 128 x 128 focal plane array forms the heart of this low background astronomy camera system. The camera is designed to accept either a 128 x 128 HgCdTe array for the 1-2.5 µm spectral region or an InSb array for the 3-5 µm spectral region. A cryogenic folded optical system is utilized to control excess stray light along with a cold eight-position filter wheel for spectral filtering. The camera head and electronics will also accept a 256 x 256 focal plane. Engineering evaluation of the complete system is complete along with two engineering runs at the JPL Table Mountain Observatory. System design, engineering performance, and sample imagery are presented in this paper.
Ultra-fast framing camera tube
Kalibjian, Ralph
1981-01-01
An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.
Jin, Xin; Liu, Li; Chen, Yanqin; Dai, Qionghai
2017-05-01
This paper derives a mathematical point spread function (PSF) and a depth-invariant focal sweep point spread function (FSPSF) for plenoptic camera 2.0. The derivation of the PSF is based on the Fresnel diffraction equation and an image formation analysis of a self-built imaging system, which is divided into two sub-systems to reflect the relay imaging properties of plenoptic camera 2.0. The variations in the PSF caused by changes in object depth and sensor position are analyzed. A mathematical model of the FSPSF is further derived, which is verified to be depth-invariant. Experiments on real imaging systems demonstrate the consistency between the proposed PSF and the actual imaging results.
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Park, Jin S.; Sarusi, Gabby; Lin, True-Lon; Liu, John K.; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Hoelter, Ted
1997-01-01
In this paper, we discuss the development of very sensitive, very long wavelength infrared GaAs/Al(x)Ga(1-x)As quantum well infrared photodetectors (QWIP's) based on bound-to-quasi-bound intersubband transition, fabrication of random reflectors for efficient light coupling, and the demonstration of a 15 µm cutoff 128 x 128 focal plane array imaging camera. Excellent imagery, with a noise equivalent differential temperature (NEΔT) of 30 mK, has been achieved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, Andrew P.; Hale, Layton; Kim, Peter
Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions are presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
15 CFR 742.4 - National security.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
15 CFR 742.4 - National security.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
15 CFR 742.4 - National security.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
15 CFR 742.4 - National security.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
NASA Astrophysics Data System (ADS)
Schuster, Norbert; Franks, John
2011-06-01
In the 8-12 micron waveband, focal plane arrays (FPAs) are available with a 17 micron pixel pitch in different array sizes (e.g. 512 x 480 pixels and 320 x 240 pixels) and with excellent electrical properties. Many applications become possible using this new type of IR detector, which will become the future standard in uncooled technology. Lenses with an f-number faster than f/1.5 minimize the diffraction impact on the spatial resolution and guarantee a high thermal resolution for uncooled cameras. Both effects will be quantified. The distinction between the Traditional f-number (TF) and the Radiometric f-number (RF) is discussed. Lenses with different focal lengths are required for applications in a variety of markets. They are classified by their horizontal field of view (HFOV). Respecting the requirements of high volume markets, several two-lens solutions will be discussed. A commonly accepted parameter of spatial resolution is the Modulation Transfer Function (MTF) value at the Nyquist frequency of the detector (here 30 cy/mm). This parameter of resolution will be presented versus field of view. Wide angle and super wide angle lenses are susceptible to low relative illumination in the corner of the detector. Measures to reduce this drop to an acceptable value are presented.
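The two effects to be quantified can be previewed with textbook formulas: the detector Nyquist frequency 1/(2·pitch) and the diffraction-limited MTF of a circular aperture with cutoff 1/(λ·N). The sketch below evaluates them for the 17 µm pitch at an assumed 10 µm wavelength and a few f-numbers; the specific numbers are illustrative.

```python
import numpy as np

pitch_mm = 0.017
nyquist = 1.0 / (2.0 * pitch_mm)                     # ~29.4 cy/mm for 17 um pixels
wavelength_mm = 0.010                                # 10 um, mid LWIR

def diffraction_mtf(nu, wavelength, fnum):
    """Incoherent diffraction-limited MTF of a circular aperture at spatial
    frequency nu (cy/mm); the cutoff frequency is 1/(wavelength * f-number)."""
    x = np.clip(nu * wavelength * fnum, 0.0, 1.0)
    return (2.0 / np.pi) * (np.arccos(x) - x * np.sqrt(1.0 - x**2))

for fnum in (1.0, 1.5, 2.0):
    cutoff = 1.0 / (wavelength_mm * fnum)
    print(f"f/{fnum}: cutoff {cutoff:5.1f} cy/mm, "
          f"diffraction MTF at Nyquist ({nyquist:.1f} cy/mm) = "
          f"{diffraction_mtf(nyquist, wavelength_mm, fnum):.2f}")
```

The drop in diffraction MTF at the detector Nyquist frequency between f/1.5 and f/2 illustrates why fast lenses are preferred for 17 µm pitch uncooled arrays.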
Di Domenico, Giovanni; Cardarelli, Paolo; Contillo, Adriano; Taibi, Angelo; Gambaccini, Mauro
2016-01-01
The quality of a radiography system is affected by several factors, a major one being the focal spot size of the x-ray tube. In fact, the measurement of such size is recognized to be of primary importance during acceptance tests and image quality evaluations of clinical radiography systems. The most common device providing an image of the focal spot emission distribution is a pin-hole camera, which requires a high tube loading in order to produce a measurable signal. This work introduces an alternative technique to obtain an image of the focal spot, through the processing of a single radiograph of a simple test object, acquired with a suitable magnification. The radiograph of a magnified sharp edge is a well-established method to evaluate the extension of the focal spot profile along the direction perpendicular to the edge. From a single radiograph of a circular x-ray absorber, it is possible to extract simultaneously the radial profiles of several sharp edges with different orientations. The authors propose a technique that makes it possible to obtain an image of the focal spot through the processing of these radial profiles by means of a pseudo-CT reconstruction technique. In order to validate this technique, the reconstruction has been applied to the simulated radiographs of an ideal disk-shaped absorber, generated by various simulated focal spot distributions. Furthermore, the method has been applied to the focal spot of a commercially available mammography unit. In the case of simulated radiographs, the results of the reconstructions have been compared to the original distributions, showing an excellent agreement as regards both the overall distribution and the full width at half maximum measurements. In the case of the experimental test, the method made it possible to obtain images of the focal spot that have been compared with the results obtained through standard techniques, namely, pin-hole camera and slit camera. The method was proven to be effective for simulated images and the results of the experimental test suggest that it could be considered as an alternative technique for focal spot distribution evaluation. The method offers the possibility to measure the actual focal spot size and emission distribution under the same exposure conditions as clinical routine, avoiding high tube loading as in the case of the pin-hole imaging technique.
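The pseudo-CT idea can be illustrated with a toy model: each differentiated edge profile behaves like a one-dimensional projection (Radon transform) of the two-dimensional focal spot, so stacking profiles from many edge orientations and applying filtered back-projection recovers the spot. The sketch below uses scikit-image on a synthetic Gaussian spot; it is only a schematic of the principle (magnification scaling and noise are ignored), not the authors' processing chain.

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic 2D "focal spot": an elongated Gaussian on a 128x128 grid (assumption)
n = 128
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
spot = np.exp(-(x**2 / (2 * 6.0**2) + y**2 / (2 * 3.0**2)))

# Each radial edge profile, once differentiated, acts as a 1D projection of the spot.
# Simulate those projections with the Radon transform over many edge orientations.
angles = np.linspace(0.0, 180.0, 90, endpoint=False)
profiles = radon(spot, theta=angles)          # sinogram: one column per orientation

# Pseudo-CT step: filtered back-projection of the stacked profiles
reconstruction = iradon(profiles, theta=angles, filter_name='ramp')

def fwhm(profile):
    """Full width at half maximum of a 1D profile, in pixels."""
    half = profile.max() / 2.0
    idx = np.where(profile >= half)[0]
    return idx[-1] - idx[0] + 1

print("FWHM along x (original)     :", fwhm(spot[n // 2, :]))
print("FWHM along x (reconstructed):", fwhm(reconstruction[n // 2, :]))
```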
Measuring the Flatness of Focal Plane for Very Large Mosaic CCD Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, Jiangang; Estrada, Juan; Cease, Herman
2010-06-08
Large mosaic multi-CCD cameras are the key instruments for modern digital sky surveys. DECam is an extremely red-sensitive 520 Megapixel camera designed for the upcoming Dark Energy Survey (DES). It consists of sixty-two 4k x 2k and twelve 2k x 2k 250-micron thick fully-depleted CCDs, with a focal plane 44 cm in diameter and a field of view of 2.2 square degrees. It will be attached to the Blanco 4-meter telescope at CTIO. The DES will cover 5000 square degrees of the southern galactic cap in 5 color bands (g, r, i, z, Y) over 5 years starting from 2011. To achieve the science goal of constraining the Dark Energy evolution, stringent requirements are laid down for the design of DECam. Among them, the flatness of the focal plane needs to be controlled within a 60-micron envelope in order to achieve the specified PSF variation limit. It is very challenging to measure the flatness of the focal plane to such precision when it is placed in a high-vacuum Dewar at 173 K. We developed two image-based techniques to measure the flatness of the focal plane. By imaging a regular grid of dots on the focal plane, the CCD offset along the optical axis is converted to a variation in the grid spacings at different positions on the focal plane. After extracting the patterns and comparing the change in spacings, we can measure the flatness to high precision. In method 1, the regular grid of dots is held to sub-micron precision and covers the whole focal plane. In method 2, no high precision for the grid is required. Instead, a precise XY stage moves the pattern across the whole focal plane and we compare the variations in the spacing when it is imaged by different CCDs. Simulation and real measurements show that the two methods work very well for our purpose, and are in good agreement with the direct optical measurements.
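The core geometric idea of the dot-grid methods is that a CCD displaced along the optical axis sees the projected grid at a slightly different magnification, so a fractional change in dot spacing maps directly to a height offset. The sketch below is a minimal illustration of that conversion under a simple pinhole-like projection with an assumed projection distance; the actual DECam pattern extraction and calibration are not shown, and the numbers are illustrative, not calibration values.

```python
# Convert a measured change in projected grid spacing into a CCD height offset.
# Simple pinhole-projection model: spacing scales with distance from the projection
# point, so ds / s = dz / L, where L is an assumed projection distance.

def height_offset(s_ref_mm, s_meas_mm, projection_distance_mm):
    """Axial offset (mm) of a CCD inferred from the change in grid spacing."""
    fractional_change = (s_meas_mm - s_ref_mm) / s_ref_mm
    return fractional_change * projection_distance_mm

# Illustrative numbers (assumptions)
s_ref = 1.0000     # dot spacing imaged on a reference CCD, mm
s_meas = 1.0001    # dot spacing imaged on the CCD under test, mm
L = 600.0          # assumed distance from projection point to focal plane, mm

dz_mm = height_offset(s_ref, s_meas, L)
print(f"Inferred axial offset: {dz_mm * 1000:.1f} microns")  # 0.01% over 600 mm -> 60 um
```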
An automated calibration method for non-see-through head mounted displays.
Gilson, Stuart J; Fitzgibbon, Andrew W; Glennerster, Andrew
2011-08-15
Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet, existing calibration methods are time consuming and depend on human judgements, making them error prone, and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements.
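Once the marker centroids have been re-expressed in the coordinate frame of the HMD grid, the final step reduces to standard camera calibration from 3D-2D correspondences. The sketch below shows how that step might look with OpenCV's calibrateCamera on synthetic data; it is a generic illustration (planar synthetic target, assumed intrinsics and resolution), not the authors' code.

```python
import numpy as np
import cv2

# Synthetic ground truth (assumptions): a pinhole camera and a planar grid of
# marker centroids expressed in the HMD-grid coordinate frame.
K_true = np.array([[900.0, 0.0, 640.0],
                   [0.0, 900.0, 512.0],
                   [0.0, 0.0, 1.0]])
dist_true = np.zeros(5)
image_size = (1280, 1024)                       # assumed display resolution (w, h)
gx, gy = np.meshgrid(np.linspace(-0.15, 0.15, 5), np.linspace(-0.15, 0.15, 4))
markers = np.column_stack([gx.ravel(), gy.ravel(),
                           np.zeros(gx.size)]).astype(np.float32)  # metres

rng = np.random.default_rng(0)
object_points, image_points = [], []
for _ in range(10):                             # ten poses of the tracked object
    rvec = rng.uniform(-0.5, 0.5, 3)
    tvec = np.array([0.0, 0.0, 1.5]) + rng.uniform(-0.1, 0.1, 3)
    proj, _ = cv2.projectPoints(markers, rvec, tvec, K_true, dist_true)
    object_points.append(markers)
    image_points.append(proj.reshape(-1, 2).astype(np.float32))

# Recover intrinsics (focal length, principal point) and per-view extrinsics.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("Recovered focal lengths (fx, fy):", K[0, 0], K[1, 1])
```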
NASA Astrophysics Data System (ADS)
Wong, Erwin
2000-03-01
Traditional methods of linear-based imaging limit the viewer to a single fixed-point perspective. By means of a single-lens, multiple-perspective mirror system, a 360-degree representation of the area around the camera is reconstructed. This reconstruction is used to overcome the limitations of a traditional camera by providing the viewer with many different perspectives. By constructing the mirror as a hemispherical surface with multiple focal lengths at various diameters on the mirror, and by placing a parabolic mirror overhead, a stereoscopic image can be extracted from the image captured by a high-resolution camera placed beneath the mirror. Image extraction and correction are performed by computer processing of the image obtained by the camera; the image presents up to five distinguishable viewpoints from which a computer can extrapolate pseudo-perspective data. Geometric and depth-of-field information can be extrapolated via comparison and isolation of objects within a virtual scene post-processed by the computer. Combining the data with scene-rendering software provides the viewer with the ability to choose a desired viewing position, multiple dynamic perspectives, and virtually constructed perspectives based on minimal existing data. An examination of the workings of the mirror relay system is provided, including possible image extrapolation and correction methods. Generation of data and virtually interpolated and constructed data is also discussed.
VizieR Online Data Catalog: TU UMa light curves and maxima, CL Aur minima (Liska+, 2016)
NASA Astrophysics Data System (ADS)
Liska, J.; Skarka, M.; Mikulasek, Z.; Zejda, M.; Chrastina, M.
2016-02-01
Differential photometry for the RR Lyrae star TU UMa is given in the 1st and 2nd files. The measurements were obtained using 24-inch and 1-inch telescopes, respectively. The observations were performed at the Masaryk University Observatory in Brno (3 nights, 24-inch), and at the private observatory in Brno (16 nights, 1-inch) in the Czech Republic from December 2013 to June 2014. Observing equipment consisted of a 24-inch Newtonian telescope (600/2780mm, diameter/focal length) and a Moravian Instruments CCD camera G2-4000 with Stromgren photometric filters vby, and of a 1-inch refractor (a photographic lens Sonnar 4/135mm, lens focal ratio/focal length) and an ATIK 16IC CCD camera with a green photometric filter with similar throughput to the Johnson V filter. Exposures were v - 60s, b - 30s, y - 30s, green - 30s. For the small-aperture telescope, five frames were combined into a single image to achieve a better signal-to-noise ratio. The time resolution of such a combined frame is about 170 s. CCD images were calibrated in a standard way (dark frame and flat field corrections). The C-Munipack software (Motl 2009) was used for this processing as well as for differential photometry. The comparison star BD+30 2165 was the same for both instruments, but the control stars were BD+30 2164 (for the 24-inch telescope) and HD 99593 (for the 1-inch telescope). The 3rd file contains maxima timings of TU UMa adopted from the GEOS RR Lyr database, from the latest publications, together with maxima timings determined in our study. Times of maxima were calculated from our observations, sky-survey data (Hipparcos, NSVS, Pi of the Sky, SuperWASP), photographic measurements (project DASCH), and from several published datasets, in which the maxima were omitted or badly determined - Boenigk (1958AcA.....8...13B), Liakos, Niarchos (2011IBVS.6099....1L, 2011IBVS.5990....1L), Liu, Janes (1989ApJS...69..593L), Preston et al. (1961ApJ...133..484P). The 4th file contains minima timings of the eclipsing binary CL Aur adopted from the O-C Gateway database. (5 data files).
A math model for high velocity sensoring with a focal plane shuttered camera.
NASA Technical Reports Server (NTRS)
Morgan, P.
1971-01-01
A new mathematical model is presented which describes the image produced by a focal plane shutter-equipped camera. The model is based upon the well-known collinearity condition equations and incorporates both the translational and rotational motion of the camera during the exposure interval. The first differentials of the model with respect to exposure interval, delta t, yield the general matrix expressions for image velocities which may be simplified to known cases. The exposure interval, delta t, may be replaced under certain circumstances with a function incorporating blind velocity and image position if desired. The model is tested using simulated Lunar Orbiter data and found to be computationally stable as well as providing excellent results, provided that some external information is available on the velocity parameters.
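For reference, the model builds on the standard collinearity condition; the sketch below shows the general form involved, with the exterior orientation written as a function of time to capture camera motion during the exposure. The notation is assumed here for illustration and is not quoted from the original report.

```latex
% Collinearity condition with time-dependent exterior orientation (sketch; notation assumed).
% X_0(t), Y_0(t), Z_0(t): projection centre; r_{ij}(t): rotation-matrix elements;
% f: focal length; (x, y): image coordinates of the ground point (X, Y, Z).
\[
\begin{aligned}
x &= -f\,\frac{r_{11}(t)(X - X_0(t)) + r_{21}(t)(Y - Y_0(t)) + r_{31}(t)(Z - Z_0(t))}
              {r_{13}(t)(X - X_0(t)) + r_{23}(t)(Y - Y_0(t)) + r_{33}(t)(Z - Z_0(t))}, \\[4pt]
y &= -f\,\frac{r_{12}(t)(X - X_0(t)) + r_{22}(t)(Y - Y_0(t)) + r_{32}(t)(Z - Z_0(t))}
              {r_{13}(t)(X - X_0(t)) + r_{23}(t)(Y - Y_0(t)) + r_{33}(t)(Z - Z_0(t))}
\end{aligned}
\]
```

Differentiating these expressions with respect to the exposure interval then yields the image-velocity terms; for a focal-plane shutter, the effective exposure time of a point depends on its position in the image, which is what couples blind velocity and image position in the model.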
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flaugher, B.; Diehl, H. T.; Alvarez, O.
2015-11-15
The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
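As a quick consistency check, the quoted plate scale and pixel size together imply the effective focal length of the telescope-plus-corrector optics. The short calculation below uses only the numbers given in the abstract; it is an illustration, not an official DECam specification.

```python
import math

pixel_size_mm = 0.015          # 15 um pixels
plate_scale_arcsec = 0.263     # arcsec per pixel
ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0   # ~206265

# Small-angle pinhole relation: plate_scale [rad/pixel] = pixel_size / focal_length
focal_length_mm = pixel_size_mm * ARCSEC_PER_RAD / plate_scale_arcsec
print(f"Implied effective focal length: {focal_length_mm / 1000:.2f} m")   # ~11.8 m

# Implied focal ratio for the 4 m Blanco aperture
print(f"Implied focal ratio: f/{focal_length_mm / 4000:.1f}")              # ~f/2.9
```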
Flaugher, B.
2015-04-11
The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
Characterization and performance of PAUCam filters
NASA Astrophysics Data System (ADS)
Casas, R.; Cardiel-Sas, L.; Castander, F. J.; Díaz, C.; Gaweda, J.; Jiménez Rojas, J.; Jiménez, S.; Lamensans, M.; Padilla, C.; Rodriguez, F. J.; Sanchez, E.; Sevilla Noarbe, I.
2016-08-01
PAUCam is a large field of view camera designed to exploit the field delivered by the prime focus corrector of the William Herschel Telescope, at the Observatorio del Roque de los Muchachos. One of the new features of this camera is its filter system, placed within a few millimeters of the focal plane using eleven trays containing 40 narrow band and 6 broad band filters, working in vacuum at an operational temperature of 250 K and in a focused beam. In this contribution, we describe the performance of these filters in the characterization tests at the laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Domenico, Giovanni, E-mail: didomenico@fe.infn.it; Cardarelli, Paolo; Taibi, Angelo
Purpose: The quality of a radiography system is affected by several factors, a major one being the focal spot size of the x-ray tube. In fact, the measurement of such size is recognized to be of primary importance during acceptance tests and image quality evaluations of clinical radiography systems. The most common device providing an image of the focal spot emission distribution is a pin-hole camera, which requires a high tube loading in order to produce a measurable signal. This work introduces an alternative technique to obtain an image of the focal spot, through the processing of a single radiograph of a simple test object, acquired with a suitable magnification. Methods: The radiograph of a magnified sharp edge is a well-established method to evaluate the extension of the focal spot profile along the direction perpendicular to the edge. From a single radiograph of a circular x-ray absorber, it is possible to extract simultaneously the radial profiles of several sharp edges with different orientations. The authors propose a technique that makes it possible to obtain an image of the focal spot through the processing of these radial profiles by means of a pseudo-CT reconstruction technique. In order to validate this technique, the reconstruction has been applied to the simulated radiographs of an ideal disk-shaped absorber, generated by various simulated focal spot distributions. Furthermore, the method has been applied to the focal spot of a commercially available mammography unit. Results: In the case of simulated radiographs, the results of the reconstructions have been compared to the original distributions, showing an excellent agreement as regards both the overall distribution and the full width at half maximum measurements. In the case of the experimental test, the method made it possible to obtain images of the focal spot that have been compared with the results obtained through standard techniques, namely, pin-hole camera and slit camera. Conclusions: The method was proven to be effective for simulated images and the results of the experimental test suggest that it could be considered as an alternative technique for focal spot distribution evaluation. The method offers the possibility to measure the actual focal spot size and emission distribution under the same exposure conditions as clinical routine, avoiding high tube loading as in the case of the pin-hole imaging technique.
Focal plane wavefront sensor achromatization: The multireference self-coherent camera
NASA Astrophysics Data System (ADS)
Delorme, J. R.; Galicher, R.; Baudoz, P.; Rousset, G.; Mazoyer, J.; Dupuis, O.
2016-04-01
Context. High contrast imaging and spectroscopy provide unique constraints for exoplanet formation models as well as for planetary atmosphere models. But this can be challenging because of the small planet-to-star angular separation (<1 arcsec) and high flux ratio (>10^5). Recently, optimized instruments like VLT/SPHERE and Gemini/GPI were installed on 8m-class telescopes. These will probe young gaseous exoplanets at large separations (≳1 au) but, because of uncalibrated phase and amplitude aberrations that induce speckles in the coronagraphic images, they are not able to detect older and fainter planets. Aims: There are always aberrations that are slowly evolving in time. They create quasi-static speckles that cannot be calibrated a posteriori with sufficient accuracy. An active correction of these speckles is thus needed to reach very high contrast levels (>10^6-10^7). This requires a focal plane wavefront sensor. Our team proposed a self-coherent camera, the performance of which was demonstrated in the laboratory. As for all focal plane wavefront sensors, these are sensitive to chromatism and we propose an upgrade that mitigates the chromatism effects. Methods: First, we recall the principle of the self-coherent camera and we explain its limitations in polychromatic light. Then, we present and numerically study two upgrades to mitigate chromatism effects: the optical path difference method and the multireference self-coherent camera. Finally, we present laboratory tests of the latter solution. Results: We demonstrate in the laboratory that the multireference self-coherent camera can be used as a focal plane wavefront sensor in polychromatic light using an 80 nm bandwidth at 640 nm (bandwidth of 12.5%). We reach a performance that is close to the chromatic limitations of our bench: 1σ contrast of 4.5 × 10^-8 between 5 and 17 λ0/D. Conclusions: The performance of the MRSCC is promising for future high-contrast imaging instruments that aim to actively minimize the speckle intensity so as to detect and spectrally characterize faint old or light gaseous planets.
Advanced imaging research and development at DARPA
NASA Astrophysics Data System (ADS)
Dhar, Nibir K.; Dat, Ravi
2012-06-01
Advances in imaging technology have a huge impact on our daily lives. Innovations in optics, focal plane arrays (FPA), microelectronics and computation have revolutionized camera design. As a result, new approaches to camera design and low cost manufacturing are now possible. These advances are clearly evident in the visible wavelength band due to pixel scaling, improvements in silicon material and CMOS technology. CMOS cameras are available in cell phones and many other consumer products. Advances in infrared imaging technology have been slow due to market volume and many technological barriers in detector materials, optics and fundamental limits imposed by the scaling laws of optics. There is of course much room for improvements in both visible and infrared imaging technology. This paper highlights various technology development projects at DARPA to advance the imaging technology for both visible and infrared. Challenges and potential solutions are highlighted in areas related to wide field-of-view camera design, small-pitch pixels, broadband and multiband detectors and focal plane arrays.
Gas Analysis Using Auroral Spectroscopy.
NASA Astrophysics Data System (ADS)
Alozie, M.; Thomas, G.; Medillin, M.
2017-12-01
As part of the Undergraduate Student Instrumentation Project at the University of Houston, an auroral spectroscope was designed and built. This visible light spectroscope was constructed out of carbon fiber, aluminum, and 3D printed parts. The spectroscope was designed to calculate the wavelengths of the spectral lines and analyze the emitted light spectrum of the gases. The spectroscope contains a primary parabolic 6" mirror and a smaller secondary 2.46" mirror. The light captured through these mirrors will be guided to an optical train that consists of five lenses (1" in diameter and focal length), a slit, and a visible transmission grating. The light will then be led to a Sony Alpha A6000 camera to take images of the spectral lines.
Application of phase matching autofocus in airborne long-range oblique photography camera
NASA Astrophysics Data System (ADS)
Petrushevsky, Vladimir; Guberman, Asaf
2014-06-01
The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, by day and night. The front Ritchey-Chretien optics are made of highly stable materials. However, the camera temperature varies considerably in flight conditions. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease in the structure length. The temperature and humidity effects change the distance between the mirrors by just a few microns. The distance change is small but nevertheless it alters the camera's infinity focus setpoint significantly, especially in the EO band. To realize the optics' resolution potential, the optimal focus shall be constantly maintained. In-flight best focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To get even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of an existing beamsharer prism FPA arrangement where aperture partition exists inherently in an area of overlap between the adjacent detectors. The defocus is proportional to an image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content. Closed-loop control converges robustly to the precise focus position. The algorithm uses the temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly in various conditions of scene illumination and contrast.
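The defocus estimate rests on measuring the relative image shift between the two aperture halves in the detector overlap region; one common way to measure such a shift is Fourier-domain phase correlation. The snippet below is a generic illustration of that measurement step on synthetic patches, with an assumed proportionality constant between shift and defocus; it is not the Condor2 flight algorithm.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Return the integer-pixel shift (dy, dx) such that b ~= a translated by (dy, dx)."""
    R = np.fft.fft2(b) * np.conj(np.fft.fft2(a))
    R /= np.abs(R) + 1e-12                        # cross-power spectrum, unit magnitude
    corr = np.abs(np.fft.ifft2(R))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices so shifts are reported in the range [-N/2, N/2)
    return tuple(p if p < s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic overlap-region patches: the second is the first shifted by 3 pixels in x
rng = np.random.default_rng(1)
patch_a = rng.normal(size=(64, 64))
patch_b = np.roll(patch_a, 3, axis=1)

dy, dx = phase_correlation_shift(patch_a, patch_b)
print("Measured phase shift (pixels):", (dy, dx))

# Defocus is taken to be proportional to the measured shift; the gain k is an
# assumed calibration constant relating pixels of shift to millimetres of defocus.
k_mm_per_pixel = 0.02
print("Defocus estimate (mm):", dx * k_mm_per_pixel)
```

In practice the raw estimate would be low-pass filtered across frames, as the abstract describes, before being fed to the closed-loop focus control.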
Curiosity Rover View of Alluring Martian Geology Ahead
2015-08-05
A southward-looking panorama combining images from both cameras of the Mast Camera (Mastcam) instrument on NASA's Curiosity Mars Rover shows diverse geological textures on Mount Sharp. Three years after landing on Mars, the mission is investigating this layered mountain for evidence about changes in Martian environmental conditions, from an ancient time when conditions were favorable for microbial life to the much-drier present. Gravel and sand ripples fill the foreground, typical of terrains that Curiosity traversed to reach Mount Sharp from its landing site. Outcrops in the midfield are of two types: dust-covered, smooth bedrock that forms the base of the mountain, and sandstone ridges that shed boulders as they erode. Rounded buttes in the distance contain sulfate minerals, perhaps indicating a change in the availability of water when they formed. Some of the layering patterns on higher levels of Mount Sharp in the background are tilted at different angles than others, evidence of complicated relationships still to be deciphered. The scene spans from southeastward at left to southwestward at right. The component images were taken on April 10 and 11, 2015, the 952nd and 953rd Martian days (or sols) since the rover's landing on Mars on Aug. 6, 2012, UTC (Aug. 5, PDT). Images in the central part of the panorama are from Mastcam's right-eye camera, which is equipped with a 100-millimeter-focal-length telephoto lens. Images used in outer portions, including the most distant portions of the mountain in the scene, were taken with Mastcam's left-eye camera, using a wider-angle, 34-millimeter lens. http://photojournal.jpl.nasa.gov/catalog/PIA19803
The optical design of a visible adaptive optics system for the Magellan Telescope
NASA Astrophysics Data System (ADS)
Kopon, Derek
The Magellan Adaptive Optics system will achieve first light in November of 2012. This AO system contains several subsystems including the 585-actuator concave adaptive secondary mirror, the Calibration Return Optic (CRO) alignment and calibration system, the CLIO 1-5 μm IR science camera, the movable guider camera and active optics assembly, and the W-Unit, which contains both the Pyramid Wavefront Sensor (PWFS) and the VisAO visible science camera. In this dissertation, we present details of the design, fabrication, assembly, alignment, and laboratory performance of the VisAO camera and its optical components. Many of these components required a custom design, such as the Spectral Differential Imaging Wollaston prisms and filters and the coronagraphic spots. One component, the Atmospheric Dispersion Corrector (ADC), required a unique triplet design that had until now never been fabricated and tested on sky. We present the design, laboratory, and on-sky results for our triplet ADC. We also present details of the CRO test setup and alignment. Because Magellan is a Gregorian telescope, the ASM is a concave ellipsoidal mirror. By simulating a star with a white light point source at the far conjugate, we can create a double-pass test of the whole system without the need for a real on-sky star. This allows us to test the AO system closed loop in the Arcetri test tower at its nominal design focal length and optical conjugates. The CRO test will also allow us to calibrate and verify the system off-sky at the Magellan telescope during commissioning and periodically thereafter. We present a design for a possible future upgrade path for a new visible Integral Field Spectrograph. By integrating a fiber array bundle at the VisAO focal plane, we can send light to a pre-existing facility spectrograph, such as LDSS3, which will allow 20 mas spatial sampling and R~1,800 spectra over the band 0.6-1.05 μm. This would be the highest spatial resolution IFU to date, either from the ground or in space.
Use of Vertical Aerial Images for Semi-Oblique Mapping
NASA Astrophysics Data System (ADS)
Poli, D.; Moe, K.; Legat, K.; Toschi, I.; Lago, F.; Remondino, F.
2017-05-01
The paper proposes a methodology for the use of the oblique sections of images from large-format photogrammetric cameras, by exploiting the effect of the central perspective geometry in the lateral parts of the nadir images ("semi-oblique" images). The point of origin of the investigation was the execution of a photogrammetric flight over Norcia (Italy), which was seriously damaged after the earthquake of 30/10/2016. Contrary to the original plan of oblique acquisitions, the flight was executed on 15/11/2017 using an UltraCam Eagle camera with focal length 80 mm, and combining two flight plans, rotated by 90º ("crisscross" flight). The images (GSD 5 cm) were used to extract a 2.5D DSM cloud, sampled to an XY-grid size of 2 GSD, a 3D point cloud with a mean spatial resolution of 1 GSD and a 3D mesh model at a resolution of 10 cm of the historic centre of Norcia for a quantitative assessment of the damages. From the acquired nadir images the "semi-oblique" images (forward, backward, left and right views) could be extracted and processed in a modified version of GEOBLY software for measurements and restitution purposes. The potential of such semi-oblique image acquisitions from nadir-view cameras is shown and commented on below.
Gamma Ray Burst Optical Counterpart Search Experiment (GROCSE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, H.S.; Ables, E.; Bionta, R.M.
GROCSE (Gamma-Ray Optical Counterpart Search Experiments) is a system of automated telescopes that search for simultaneous optical activity associated with gamma ray bursts in response to real-time burst notifications provided by the BATSE/BACODINE network. The first generation system, GROCSE 1, is sensitive down to Mv ~ 8.5 and requires an average of 12 seconds to obtain the first images of the gamma ray burst error box defined by the BACODINE trigger. The collaboration is now constructing a second generation system which has a 4 second slewing time and can reach Mv ~ 14 with a 5 second exposure. GROCSE 2 consists of 4 cameras on a single mount. Each camera views the night sky through a commercial Canon lens (f/1.8, focal length 200 mm) and utilizes a 2K x 2K Loral CCD. Light weight and low noise custom readout electronics were designed and fabricated for these CCDs. The total field of view of the 4 cameras is 17.6° x 17.6°. GROCSE 2 will be operated by the end of 1995. In this paper, the authors present an overview of the GROCSE system and the results of measurements with a GROCSE 2 prototype unit.
Automatic Camera Calibration for Cultural Heritage Applications Using Unstructured Planar Objects
NASA Astrophysics Data System (ADS)
Adam, K.; Kalisperakis, I.; Grammatikopoulos, L.; Karras, G.; Petsa, E.
2013-07-01
As a rule, image-based documentation of cultural heritage relies today on ordinary digital cameras and commercial software. As such projects often involve researchers not familiar with photogrammetry, the question of camera calibration is important. Freely available open-source user-friendly software for automatic camera calibration, often based on simple 2D chess-board patterns, is an answer to the demand for simplicity and automation. However, such tools cannot respond to all requirements met in cultural heritage conservation regarding possible imaging distances and focal lengths. Here we investigate the practical possibility of camera calibration from unknown planar objects, i.e. any planar surface with adequate texture; we have focused on the example of urban walls covered with graffiti. Images are connected pair-wise with inter-image homographies, which are estimated automatically through a RANSAC-based approach after extracting and matching interest points with the SIFT operator. All valid points are identified on all images on which they appear. Provided that the image set includes a "fronto-parallel" view, inter-image homographies with this image are regarded as emulations of image-to-world homographies and allow computing initial estimates for the interior and exterior orientation elements. Following this initialization step, the estimates are introduced into a final self-calibrating bundle adjustment. Measures are taken to discard unsuitable images and verify object planarity. Results from practical experimentation indicate that this method may produce satisfactory results. The authors intend to incorporate the described approach into their freely available user-friendly software tool, which relies on chess-boards, to assist non-experts in their projects with image-based approaches.
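The pair-wise image connection step described above can be sketched with standard tools: SIFT keypoints matched between two views of the textured plane and a RANSAC-estimated homography. The code below is a generic illustration of that single step (file names are placeholders), not the authors' tool; the subsequent orientation initialization and self-calibrating bundle adjustment are not shown.

```python
import cv2
import numpy as np

# Placeholder file names for two overlapping views of a textured planar surface.
img1 = cv2.imread("graffiti_view_1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("graffiti_view_2.jpg", cv2.IMREAD_GRAYSCALE)

# Extract and match SIFT interest points.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)

# Lowe ratio test to keep distinctive matches only.
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# RANSAC-based inter-image homography; the inlier mask identifies valid points.
H, inliers = cv2.findHomography(pts1, pts2, cv2.RANSAC, ransacReprojThreshold=3.0)
print("Inter-image homography:\n", H)
print("Inliers:", int(inliers.sum()), "of", len(good))
```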
BOREAS Level-0 C-130 Aerial Photography
NASA Technical Reports Server (NTRS)
Newcomer, Jeffrey A.; Dominguez, Roseanne; Hall, Forrest G. (Editor)
2000-01-01
For the BOReal Ecosystem-Atmosphere Study (BOREAS), C-130 and other aerial photography was collected to provide finely detailed and spatially extensive documentation of the condition of the primary study sites. The NASA C-130 Earth Resources aircraft can accommodate two mapping cameras during flight, each of which can be fitted with 6- or 12-inch focal-length lenses and black-and-white, natural-color, or color-IR film, depending upon requirements. Both cameras were often in operation simultaneously, although sometimes only the lower resolution camera was deployed. When both cameras were in operation, the higher resolution camera was often used in a more limited fashion. The acquired photography covers the period of April to September 1994. The aerial photography was delivered as rolls of large format (9 x 9 inch) color transparency prints, with imagery from multiple missions (hundreds of prints) often contained within a single roll. A total of 1533 frames were collected from the C-130 platform for BOREAS in 1994. Note that the level-0 C-130 transparencies are not contained on the BOREAS CD-ROM set. An inventory file is supplied on the CD-ROM to inform users of all the data that were collected. Some photographic prints were made from the transparencies. In addition, BORIS staff digitized a subset of the transparencies and stored the images in JPEG format. The CD-ROM set contains a small subset of the collected aerial photography that was digitally scanned and stored as JPEG files for most tower and auxiliary sites in the NSA and SSA. See Section 15 for information about how to acquire additional imagery.
Temporal Coding of Volumetric Imagery
NASA Astrophysics Data System (ADS)
Llull, Patrick Ryan
'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration of other information within that video; namely, focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions. Geometrical optics-related tradeoffs, such as the classic challenges of wide-field-of-view and high resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke. Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position. We demonstrate this technique's capability to generate arbitrary point spread functions.
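The coded-exposure measurement described above can be summarized by the usual compressive forward model; the expression below is a schematic of that idea using assumed notation, not quoted from the dissertation.

```latex
% Coded-exposure measurement model (schematic, assumed notation): a single detector
% frame g integrates N_t temporal slices f_t of the video volume, each modulated by
% the translated aperture code T_t; reconstruction solves a regularized inverse problem.
\[
g(x, y) = \sum_{t=1}^{N_t} T_t(x, y)\, f_t(x, y) + n(x, y),
\qquad
\hat{f} = \arg\min_{f} \tfrac{1}{2} \Big\| g - \sum_{t} T_t f_t \Big\|_2^2 + \lambda\, \Phi(f).
\]
```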
Application of imaging to the atmospheric Cherenkov technique
NASA Technical Reports Server (NTRS)
Cawley, M. F.; Fegan, D. J.; Gibbs, K.; Gorham, P. W.; Hillas, A. M.; Lamb, R. C.; Liebing, D. F.; Mackeown, P. K.; Porter, N. A.; Stenger, V. J.
1985-01-01
Turver and Weekes proposed using a system of phototubes in the focal plane of a large reflector to give an air Cherenkov camera for gamma ray astronomy. Preliminary results with a 19 element camera have been reported previously. In 1983 the camera was increased to 37 pixels; it has now been routinely operated for two years. A brief physical description of the camera, its mode of operation, and the data reduction procedures are presented. The Monte Carlo simulations on which these are based are also reviewed.
Lytro camera technology: theory, algorithms, performance analysis
NASA Astrophysics Data System (ADS)
Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio
2013-03-01
The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization aided by the increase in computational power characterizing mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, considering the Lytro camera as a black box, and uses our interpretation of Lytro image data saved by the camera. We present our findings based on our interpretation of the Lytro camera file structure, image calibration and image rendering; in this context, artifacts and final image resolution are discussed.
Portraiture lens concept in a mobile phone camera
NASA Astrophysics Data System (ADS)
Sheil, Conor J.; Goncharov, Alexander V.
2017-11-01
A small form-factor lens was designed for the purpose of portraiture photography, the size of which allows use within smartphone casing. The current general requirement of mobile cameras having good all-round performance results in a typical, familiar, many-element design. Such designs have little room for improvement, in terms of the available degrees of freedom and highly-demanding target metrics such as low f-number and wide field of view. However, the specific application of the current portraiture lens relaxed the requirement of an all-round high-performing lens, allowing improvement of certain aspects at the expense of others. With a main emphasis on reducing depth of field (DoF), the current design takes advantage of the simple geometrical relationship between DoF and pupil diameter. The system has a large aperture, while a reasonable f-number gives a relatively large focal length, requiring a catadioptric lens design with double ray path; hence, field of view is reduced. Compared to typical mobile lenses, the large diameter reduces depth of field by a factor of four.
Geometric Calibration and Validation of Kompsat-3A AEISS-A Camera
Seo, Doocheon; Oh, Jaehong; Lee, Changno; Lee, Donghan; Choi, Haejin
2016-01-01
Kompsat-3A, which was launched on 25 March 2015, is a sister spacecraft of the Kompsat-3 developed by the Korea Aerospace Research Institute (KARI). Kompsat-3A’s AEISS-A (Advanced Electronic Image Scanning System-A) camera is similar to Kompsat-3’s AEISS but it was designed to provide PAN (Panchromatic) resolution of 0.55 m, MS (multispectral) resolution of 2.20 m, and TIR (thermal infrared) at 5.5 m resolution. In this paper we present the geometric calibration and validation work of Kompsat-3A that was completed last year. A set of images over the test sites was taken for two months and was utilized for the work. The workflow includes the boresight calibration, CCDs (charge-coupled devices) alignment and focal length determination, the merge of two CCD lines, and the band-to-band registration. Then, the positional accuracies without any GCPs (ground control points) were validated for hundreds of test sites across the world using various image acquisition modes. In addition, we checked the planimetric accuracy by bundle adjustments with GCPs. PMID:27783054
Study of the detail content of Apollo orbital photography
NASA Technical Reports Server (NTRS)
Kinzly, R. E.
1972-01-01
The results achieved during a study of the Detail Content of Apollo Orbital Photography are reported. The effects of residual motion smear and image reproduction processes upon the detail content of lunar surface imagery obtained from the orbiting command module are assessed. Data and conclusions obtained from the Apollo 8, 12, 14 and 15 missions are included. For the Apollo 8, 12 and 14 missions, the bracket-mounted Hasselblad camera had no mechanism internal to the camera for motion compensation. If the motion of the command module were left totally uncompensated, these photographs would exhibit a ground smear varying from 12 to 27 meters depending upon the focal length of the lens and the exposure time. During the photographic sequences motion compensation was attempted by firing the attitude control system of the spacecraft at a rate to compensate for the motion relative to the lunar surface. The residual smear occurring in selected frames of imagery was assessed using edge analysis methods to obtain an achieved modulation transfer function (MTF), which was compared to a baseline MTF.
NASA Astrophysics Data System (ADS)
Ozolinsh, Maris; Paulins, Paulis
2017-09-01
An experimental setup allowing the modeling of conditions in optical devices and in the eye at various degrees of scattering such as cataract pathology in human eyes is presented. The scattering in cells of polymer-dispersed liquid crystals (PDLCs) and ‘Smart Glass’ windows is used in the modeling experiments. Both applications are used as optical obstacles placed in different positions of the optical information flow pathway either directly on the stimuli demonstration computer screen or mounted directly after the image-formation lens of a digital camera. The degree of scattering is changed continuously by applying an AC voltage of up to 30-80 V to the PDLC cell. The setup uses a camera with 14 bit depth and a 24 mm focal length lens. Light-emitting diodes and diode-pumped solid-state lasers emitting radiation of different wavelengths are used as portable small-divergence light sources in the experiments. Image formation, optical system point spread function, modulation transfer functions, and system resolution limits are determined for such sample optical systems in student optics and optometry experimental exercises.
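One of the quantities determined in these exercises, the modulation transfer function, follows directly from a measured point (or line) spread function as the normalized magnitude of its Fourier transform. The snippet below is a minimal illustration of that relationship on a synthetic Gaussian blur with assumed pixel pitch; it is not the authors' laboratory processing.

```python
import numpy as np

# Synthetic 1D line spread function: Gaussian blur with a 2-pixel standard deviation
# (assumption standing in for a measured camera/PDLC profile).
pixel_pitch_mm = 0.005                      # assumed 5 um pixels
x = np.arange(-64, 64)
lsf = np.exp(-x**2 / (2 * 2.0**2))

# MTF = |FFT(LSF)| normalized to its zero-frequency value.
mtf = np.abs(np.fft.rfft(lsf / lsf.sum()))
freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)   # cycles per mm

# Report the spatial frequency at which the MTF drops to 0.1 (a common resolution limit)
limit = freqs[np.argmax(mtf < 0.1)]
print(f"MTF10 resolution limit: {limit:.1f} cy/mm")
```

Increasing the PDLC scattering broadens the measured spread function, which shows up directly as a lower MTF10 limit in a calculation like this one.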
Two cloud-based cues for estimating scene structure and camera calibration.
Jacobs, Nathan; Abrams, Austin; Pless, Robert
2013-10-01
We describe algorithms that use cloud shadows as a form of stochastically structured light to support 3D scene geometry estimation. Taking video captured from a static outdoor camera as input, we use the relationship of the time series of intensity values between pairs of pixels as the primary input to our algorithms. We describe two cues that relate the 3D distance between a pair of points to the pair of intensity time series. The first cue results from the fact that two pixels that are nearby in the world are more likely to be under a cloud at the same time than two distant points. We describe methods for using this cue to estimate focal length and scene structure. The second cue is based on the motion of cloud shadows across the scene; this cue results in a set of linear constraints on scene structure. These constraints have an inherent ambiguity, which we show how to overcome by combining the cloud motion cue with the spatial cue. We evaluate our method on several time lapses of real outdoor scenes.
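In its simplest form, the first cue reduces to measuring how similarly two pixels' intensity time series behave: nearby scene points are shadowed together and therefore correlate strongly. The sketch below illustrates that measurement on synthetic time series (the mixing weights are assumptions); the paper's actual estimators for focal length and scene structure are more involved.

```python
import numpy as np

rng = np.random.default_rng(2)
n_frames = 2000

# Synthetic cloud-shadow signals: a shared shading term plus per-pixel noise.
# Pixel B is "near" A (shares most of the shading); pixel C is "far"
# (mostly independent shading).  Weights below are illustrative assumptions.
shared = rng.normal(size=n_frames)
independent = rng.normal(size=n_frames)
pixel_a = shared + 0.1 * rng.normal(size=n_frames)
pixel_b = shared + 0.3 * rng.normal(size=n_frames)
pixel_c = 0.2 * shared + independent + 0.3 * rng.normal(size=n_frames)

def similarity(p, q):
    """Correlation of two pixel intensity time series (higher ~ closer in the world)."""
    return np.corrcoef(p, q)[0, 1]

print("corr(A, B) (nearby pair) :", round(similarity(pixel_a, pixel_b), 3))
print("corr(A, C) (distant pair):", round(similarity(pixel_a, pixel_c), 3))
```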
Cat-eye effect reflected beam profiles of an optical system with sensor array.
Gong, Mali; He, Sifeng; Guo, Rui; Wang, Wei
2016-06-01
In this paper, we propose an applicable propagation model for Gaussian beams passing through any cat-eye target instead of the traditional simplification consisting of only a mirror placed at the focal plane of a lens. According to the model, the cat-eye effect of CCD cameras affected by defocus is numerically simulated. Excellent agreement of the experimental results with the theoretical analysis is obtained. It is found that the reflectivity distribution at the focal plane of the cat-eye optical lens has a great influence on the results, while the cat-eye effect reflected beam profiles of CCD cameras show obvious periodicity.
Plenoptic background oriented schlieren imaging
NASA Astrophysics Data System (ADS)
Klemkowsky, Jenna N.; Fahringer, Timothy W.; Clifford, Christopher J.; Bathel, Brett F.; Thurow, Brian S.
2017-09-01
The combination of the background oriented schlieren (BOS) technique with the unique imaging capabilities of a plenoptic camera, termed plenoptic BOS, is introduced as a new addition to the family of schlieren techniques. Compared to conventional single camera BOS, plenoptic BOS is capable of sampling multiple lines-of-sight simultaneously. Displacements from each line-of-sight are collectively used to build a four-dimensional displacement field, which is a vector function structured similarly to the original light field captured in a raw plenoptic image. The displacement field is used to render focused BOS images, which qualitatively are narrow depth of field slices of the density gradient field. Unlike focused schlieren methods that require manually changing the focal plane during data collection, plenoptic BOS synthetically changes the focal plane position during post-processing, such that all focal planes are captured in a single snapshot. Through two different experiments, this work demonstrates that plenoptic BOS is capable of isolating narrow depth of field features, qualitatively inferring depth, and quantitatively estimating the location of disturbances in 3D space. Such results motivate future work to transition this single-camera technique towards quantitative reconstructions of 3D density fields.
Hyun, Jong Jin; Keum, Bora; Seo, Yeon Seok; Kim, Yong Sik; Jeen, Yoon Tae; Lee, Hong Sik; Um, Soon Ho; Kim, Chang Duck; Ryu, Ho Sang; Lim, Jong-Wook; Woo, Dong-Gi; Kim, Young-Joong; Lim, Myo-Taeg
2012-01-01
Background/Aims Three-dimensional (3D) imaging is gaining popularity and has been partly adopted in laparoscopic surgery or robotic surgery but has not been applied to gastrointestinal endoscopy. As a first step, we conducted an experiment to evaluate whether images obtained by conventional gastrointestinal endoscopy could be used to acquire quantitative 3D information. Methods Two endoscopes (GIF-H260) were used in a Borrmann type I tumor model made of clay. The endoscopes were calibrated by correcting the barrel distortion and perspective distortion. Obtained images were converted to gray-level images, and the characteristics of the images were obtained by edge detection. Finally, data on 3D parameters were measured by using epipolar geometry, two-view geometry, and the pinhole camera model. Results The focal length (f) of the endoscope at 30 mm was 258.49 pixels. The two endoscopes were fixed at a predetermined distance of 12 mm (d12). After matching and calculating disparity (v2-v1), which was 106 pixels, the calculated length between the camera and object (L) was 29.26 mm. The height of the object projected onto the image (h) was then applied to the pinhole camera model, and the resulting object height and width (H) were 38.21 mm and 41.72 mm, respectively. Measurements were conducted from 2 different locations. The measurement errors ranged from 2.98% to 7.00% with the current Borrmann type I tumor model. Conclusions It was feasible to obtain parameters necessary for 3D analysis and to apply the data to epipolar geometry with a conventional gastrointestinal endoscope to calculate the size of an object. PMID:22977798
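The distance and size estimates quoted above follow directly from the stereo baseline, focal length, and disparity via the pinhole model (L = f·b/disparity, H = h·L/f). The short calculation below reproduces those numbers; the projected sizes h are back-calculated for illustration, since the abstract reports only the final metric results.

```python
# Reproduce the depth and size estimates from the reported stereo quantities.
f_px = 258.49        # focal length in pixels (calibrated at 30 mm)
baseline_mm = 12.0   # distance between the two endoscope views (d12)
disparity_px = 106   # measured disparity (v2 - v1)

# Depth from two-view geometry: L = f * baseline / disparity
L_mm = f_px * baseline_mm / disparity_px
print(f"Camera-to-object distance L: {L_mm:.2f} mm")   # ~29.26 mm

# Pinhole camera model: real size H = projected size h * L / f.
# The h values below are back-calculated from the reported 38.21 mm x 41.72 mm result,
# purely to illustrate the relation (they are not given in the abstract).
for h_px, label in [(337.5, "height"), (368.5, "width")]:
    print(f"Estimated {label}: {h_px * L_mm / f_px:.2f} mm")
```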
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2010 CFR
2010-10-01
... place 3 autonomous digital video cameras overlooking chosen haul-out sites located varying distances from the missile launch site. Each video camera will be set to record a focal subgroup within the... presence and activity will be conducted and recorded in a field logbook or recorded on digital video for...
Geometric facial comparisons in speed-check photographs.
Buck, Ursula; Naether, Silvio; Kreutz, Kerstin; Thali, Michael
2011-11-01
In many cases, it is not possible to call the motorists to account for their considerable excess in speeding, because they deny being the driver on the speed-check photograph. An anthropological comparison of facial features using a photo-to-photo comparison can be very difficult depending on the quality of the photographs. One difficulty of that analysis method is that the comparison photographs of the presumed driver are taken with a different camera or camera lens and from a different angle than for the speed-check photo. To take a comparison photograph with exactly the same camera setup is almost impossible. Therefore, only an imprecise comparison of the individual facial features is possible. The geometry and position of each facial feature, for example the distances between the eyes or the positions of the ears, etc., cannot be taken into consideration. We applied a new method using 3D laser scanning, optical surface digitalization, and photogrammetric calculation of the speed-check photo, which enables a geometric comparison. Thus, the influence of the focal length and the distortion of the objective lens are eliminated and the precise position and the viewing direction of the speed-check camera are calculated. Even in cases of low-quality images or when the face of the driver is partly hidden, good results are delivered using this method. This new method, Geometric Comparison, is evaluated and validated in a prepared study which is described in this article.
KAPAO Prime: Design and Simulation
NASA Astrophysics Data System (ADS)
McGonigle, Lorcan; Choi, P. I.; Severson, S. A.; Spjut, E.
2013-01-01
KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration over UV-NIR wavelengths from Pomona College’s telescope atop Table Mountain. We present here the final optical system, KAPAO Prime, designed in the Zemax optical design software, which uses custom off-axis paraboloid mirrors (OAPs) to manipulate light appropriately for a Shack-Hartmann wavefront sensor, deformable mirror, and science cameras. KAPAO Prime is characterized by diffraction-limited imaging over the full 81" field of view of our optical camera at f/33 as well as over the smaller field of view of our NIR camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of our optical camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity (632.8 nm). Modeling of the temperature deformation of the breadboard in SolidWorks revealed 70 micron contractions along the edges of the board for a decrease of 75°F; when applied to OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nanometers of wavefront error. Flexure modeling of the breadboard due to gravity is on-going. We hope to begin alignment and testing of KAPAO Prime in Q1 2013.
Adaptation of the Camera Link Interface for Flight-Instrument Applications
NASA Technical Reports Server (NTRS)
Randall, David P.; Mahoney, John C.
2010-01-01
COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost, but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.
The NACA High-Speed Motion-Picture Camera Optical Compensation at 40,000 Photographs Per Second
NASA Technical Reports Server (NTRS)
Miller, Cearcy D
1946-01-01
The principle of operation of the NACA high-speed camera is completely explained. This camera, operating at the rate of 40,000 photographs per second, took the photographs presented in numerous NACA reports concerning combustion, preignition, and knock in the spark-ignition engine. Many design details are presented and discussed; details of an entirely conventional nature are omitted. The inherent aberrations of the camera are discussed and partly evaluated. The focal-plane-shutter effect of the camera is explained. Photographs of the camera are presented. Some high-speed motion pictures of familiar objects -- photoflash bulb, firecrackers, camera shutter -- are reproduced as an illustration of the quality of the photographs taken by the camera.
A novel SPECT camera for molecular imaging of the prostate
NASA Astrophysics Data System (ADS)
Cebula, Alan; Gilland, David; Su, Li-Ming; Wagenaar, Douglas; Bahadori, Amir
2011-10-01
The objective of this work is to develop an improved SPECT camera for dedicated prostate imaging. Complementing the recent advancements in agents for molecular prostate imaging, this device has the potential to assist in distinguishing benign from aggressive cancers, to improve site-specific localization of cancer, to improve accuracy of needle-guided prostate biopsy of cancer sites, and to aid in focal therapy procedures such as cryotherapy and radiation. Theoretical calculations show that the spatial resolution/detection sensitivity of the proposed SPECT camera can rival or exceed 3D PET and further signal-to-noise advantage is attained with the better energy resolution of the CZT modules. Based on photon transport simulation studies, the system has a reconstructed spatial resolution of 4.8 mm with a sensitivity of 0.0001. Reconstruction of a simulated prostate distribution demonstrates the focal imaging capability of the system.
Goyal, Anish; Myers, Travis; Wang, Christine A; Kelly, Michael; Tyrrell, Brian; Gokden, B; Sanchez, Antonio; Turner, George; Capasso, Federico
2014-06-16
We demonstrate active hyperspectral imaging using a quantum-cascade laser (QCL) array as the illumination source and a digital-pixel focal-plane-array (DFPA) camera as the receiver. The multi-wavelength QCL array used in this work comprises 15 individually addressable QCLs in which the beams from all lasers are spatially overlapped using wavelength beam combining (WBC). The DFPA camera was configured to integrate the laser light reflected from the sample and to perform on-chip subtraction of the passive thermal background. A 27-frame hyperspectral image was acquired of a liquid contaminant on a diffuse gold surface at a range of 5 meters. The measured spectral reflectance closely matches the calculated reflectance. Furthermore, the high-speed capabilities of the system were demonstrated by capturing differential reflectance images of sand and KClO3 particles that were moving at speeds of up to 10 m/s.
Crystal diffraction lens with variable focal length
Smither, R.K.
1991-04-02
A method and apparatus for altering the focal length of a focusing element to one of a plurality of pre-determined focal lengths by changing heat transfer within selected portions of the element by controlled quantities are disclosed. Control over heat transfer is accomplished by manipulating one or more of a number of variables, including: the amount of heat or cold applied to surfaces; type of fluids pumped through channels for heating and cooling; temperatures, directions of flow and rates of flow of fluids; and placement of channels. 19 figures.
Fabrication of micromachined focusing mirrors with seamless reflective surface
NASA Astrophysics Data System (ADS)
Hou, Max Ti-Kuang; Liao, Ke-Min; Yeh, Hong-Zhen; Cheng, Bo-Wen; Hong, Pei-Yuan; Chen, Rongshun
2003-01-01
A surface-micromachined focusing mirror with variable focal length, controlled by adjusting the mirror's curvature, is fabricated and characterized. The bowl-shaped micromirror, fabricated from a bilayer circular microplate, focuses a light beam through thermal actuation by an external heat source. Both the initial and operational curvatures are set by the residual stresses in the two layers of the mirror; improper stresses lead to failure of the bowl-shaped structure. We analyze and design the geometrical dimensions to simultaneously avoid structural failure and increase the tuning range of the focal length. An interferometer was used to measure the focal length and the focusing ability. Mirrors with nominal focal lengths of approximately 730 μm and tuning ranges of about 50 μm were demonstrated. Direct optical measurement of the focal length has also been attempted but requires further investigation, because the laser beam strongly affects the focusing behavior of the micromirror.
Continuously variable focal length lens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Bernhard W; Chollet, Matthieu C
2013-12-17
A material preferably in crystal form having a low atomic number such as beryllium (Z=4) provides for the focusing of x-rays in a continuously variable manner. The material is provided with plural spaced curvilinear, optically matched slots and/or recesses through which an x-ray beam is directed. The focal length of the material may be decreased or increased by increasing or decreasing, respectively, the number of slots (or recesses) through which the x-ray beam is directed, while fine tuning of the focal length is accomplished by rotation of the material so as to change the path length of the x-ray beam through the aligned cylindrical slots. X-ray analysis of a fixed point in a solid material may be performed by scanning the energy of the x-ray beam while rotating the material to maintain the beam's focal point at a fixed point in the specimen undergoing analysis.
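For orientation, the slot-counting behaviour described above can be sketched with the standard compound-refractive-lens scaling, under the assumption that the slotted crystal acts like a stack of N identical parabolic refracting elements of apex radius R and refractive decrement δ (an analogy for illustration, not the patent's own derivation):

% Hedged sketch: each added slot contributes one more refracting element,
% so the focal length falls roughly as 1/N; fine tuning then comes from
% changing the effective path length through the slots by rotation.
f \;\approx\; \frac{R}{2\,N\,\delta}, \qquad \delta \propto \frac{1}{E^{2}}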
Helmet-mounted uncooled FPA camera for use in firefighting applications
NASA Astrophysics Data System (ADS)
Wu, Cheng; Feng, Shengrong; Li, Kai; Pan, Shunchen; Su, Junhong; Jin, Weiqi
2000-05-01
Starting from the needs of firefighters for thermal imaging, we discuss how a helmet-mounted camera can be applied in the harsh environment of a conflagration, especially at high temperatures, and how the thermal imager can be better matched to the helmet in terms of weight, size, and related factors. Finally, a practical helmet-mounted IR camera based on an uncooled focal plane array detector for firefighting is presented.
Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System
NASA Astrophysics Data System (ADS)
Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki
In this paper, we present an automatic vision-based traffic sign recognition system, which can detect and classify traffic signs at long distance under different lighting conditions. To realize this purpose, the traffic sign recognition is developed in an originally proposed dual-focal active camera system. In this system, a telephoto camera is equipped as an assistant of a wide angle camera. The telephoto camera can capture a high-accuracy image for an object of interest in the view field of the wide angle camera. The image from the telephoto camera provides enough information for recognition when the traffic sign appears at too low a resolution in the wide angle camera. In the proposed system, the traffic sign detection and classification are processed separately on the images from the wide angle camera and the telephoto camera. In addition, in order to detect traffic signs against complex backgrounds under different lighting conditions, we propose a type of color transformation which is invariant to lighting changes. This color transformation is used to highlight the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide angle camera. After detection, the system actively captures a high-accuracy image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on the information from the wide angle camera. Moreover, in classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-accuracy image from the telephoto camera. Finally, based on the proposed system, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution under different lighting conditions.
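As an illustration of the kind of illumination-invariant color transformation discussed above, the sketch below uses the classic normalized rg-chromaticity, which discards overall brightness; it is a stand-in for, not a reproduction of, the transformation proposed in the paper.

import numpy as np

def normalized_rg(image_bgr):
    """Illumination-invariant color sketch (normalized rg-chromaticity).

    Not the paper's transform; shown only to illustrate how dividing out
    the total intensity reduces sensitivity to lighting changes before
    running a traffic-sign detector.
    """
    img = image_bgr.astype(np.float64)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    total = r + g + b + 1e-6                     # avoid division by zero
    return np.dstack((r / total, g / total))     # brightness-normalized channels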
Tight focusing of radially polarized circular Airy vortex beams
NASA Astrophysics Data System (ADS)
Chen, Musheng; Huang, Sujuan; Shao, Wei
2017-11-01
Tight focusing properties of radially polarized circular Airy vortex beams (CAVB) are studied numerically. The light field expressions for the focused fields are derived based on vectorial Debye theory. We also study the relationship between focal profiles, such as light intensity distribution, radius of focal spot and focal length, and the parameters of CAVB. Numerical results demonstrate that we can generate a radially polarized CAVB with super-long focal length, super-strong longitudinal intensity or subwavelength focused spot at the focal plane by properly choosing the parameters of incident light and high numerical aperture (NA) lens. These results have potential applications for optical trapping, optical storage and particle acceleration.
In-situ calibration of nonuniformity in infrared staring and modulated systems
NASA Astrophysics Data System (ADS)
Black, Wiley T.
Infrared cameras can directly measure the apparent temperature of objects, providing thermal imaging. However, the raw output from most infrared cameras suffers from a strong, often limiting noise source called nonuniformity. Manufacturing imperfections in infrared focal planes lead to high pixel-to-pixel sensitivity to electronic bias, focal plane temperature, and other effects. The resulting imagery can only provide useful thermal imaging after a nonuniformity calibration has been performed. Traditionally, these calibrations are performed by momentarily blocking the field of view with a flat temperature plate or blackbody cavity. However, because the pattern is a coupling of manufactured sensitivities with operational variations, periodic recalibration is required, sometimes on the order of tens of seconds. A class of computational methods called Scene-Based Nonuniformity Correction (SBNUC) has been researched for over 20 years, in which the nonuniformity calibration is estimated in digital processing by analyzing the video stream in the presence of camera motion. The most sophisticated SBNUC methods can completely and robustly eliminate the high-spatial-frequency component of nonuniformity with only an initial reference calibration, or potentially no physical calibration. I will demonstrate a novel algorithm that advances these SBNUC techniques to support correction of all spatial frequencies of nonuniformity. Long-wave infrared microgrid polarimeters are a class of camera that incorporate a microscale per-pixel wire-grid polarizer directly affixed to each pixel of the focal plane. These cameras have the capability of simultaneously measuring thermal imagery and polarization in a robust integrated package with no moving parts. I will describe the necessary adaptations of my SBNUC method to operate on this class of sensor as well as demonstrate SBNUC performance on LWIR polarimetry video collected on the UA mall.
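For context, the flat-plate reference calibration mentioned above usually reduces to a per-pixel gain/offset (two-point) correction; the sketch below shows that baseline step only, not the scene-based algorithm developed in this work.

import numpy as np

def two_point_nuc(raw, cold_frame, hot_frame, t_cold, t_hot):
    """Classic two-point nonuniformity correction (baseline, not SBNUC).

    cold_frame and hot_frame are per-pixel mean responses to uniform
    blackbody scenes at temperatures t_cold and t_hot; the derived gain
    and offset map each pixel's raw response onto a common scale.
    """
    gain = (t_hot - t_cold) / (hot_frame - cold_frame + 1e-12)
    offset = t_cold - gain * cold_frame
    return gain * raw + offset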
Flat liquid crystal diffractive lenses with variable focus and magnification
NASA Astrophysics Data System (ADS)
Valley, Pouria
Non-mechanical variable lenses are important for creating compact imaging devices. Various methods employing dielectrically actuated lenses, membrane lenses, and liquid crystal lenses were previously proposed [1-4]. In this dissertation the design, fabrication, and characterization of innovative flat tunable-focus liquid crystal diffractive lenses (LCDL) are presented. The LCDL employ binary Fresnel zone electrodes fabricated on indium tin oxide using conventional micro-photolithography. The light phase can be adjusted by varying the effective refractive index of a nematic liquid crystal sandwiched between the electrodes and a reference substrate. Using a proper voltage distribution across the various electrodes, the focal length can be switched between several discrete values. Electrodes are shunted such that the correct phase retardation step sequence is achieved. If the number of 2π zone boundaries is increased by a factor of m, the focal length is changed from f to f/m based on the digitized Fresnel zone equation f = r_m^2/(2mλ), where r_m is the m-th zone radius and λ is the wavelength. The chromatic aberration of the diffractive lens is addressed and corrected by adding a variable fluidic lens. These LCDL operate at very low voltage levels (±2.5 V AC input), exhibit fast switching times (20-150 ms), can have large apertures (>10 mm) and a small form factor, and are robust and insensitive to vibrations, gravity, and the capillary effects that limit membrane and dielectrically actuated lenses. Several tests were performed on the LCDL, including diffraction efficiency measurement, switching dynamics, and hybrid imaging with a refractive lens. Negative focal lengths are achieved by adjusting the voltages across the electrodes. Using these lenses in combination, the magnification can be changed and zoom lenses can be formed. These characteristics make LCDL a good candidate for a variety of applications, including auto-focus and zoom lenses in compact imaging devices such as camera phones. A business plan centered on this technology was developed as part of the requirements for the minor in entrepreneurship from the Eller College of Management. An industrial analysis is presented in this study that involves product development, marketing, and financial analyses (Appendix I).
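Spelled out, the zone-counting relation quoted above makes the discrete focal-length switching explicit (a restatement of the equation in the abstract, not an additional result):

% For a fixed outermost zone radius r_m, electrically addressing k times
% as many 2\pi zone boundaries divides the focal length by k.
f = \frac{r_m^{2}}{2\,m\,\lambda}
\qquad\Longrightarrow\qquad
m \to k\,m \;\;\text{(with } r_m \text{ fixed)} \;\;\Longrightarrow\;\; f \to f/k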
15 CFR 742.4 - National security.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or... Albania, Australia, Austria, Belgium, Bulgaria, Canada, Croatia, Cyprus, Czech Republic, Denmark, Estonia....b.4.b that have a focal plane array with 111,000 or fewer elements and a frame rate of 60 Hz or less...
Camera for detection of cosmic rays of energy more than 10 Eev on the ISS orbit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garipov, G. K.; Khrenov, B. A.; Panasyuk, M. I.
1998-06-15
The concept of EHE CR observation from the ISS orbit is discussed. A design of the camera at the Russian segment of the ISS, comprising a large-area (60 m²) parabolic mirror with a photomultiplier pixel retina in its focal plane, is described.
NASA Technical Reports Server (NTRS)
Gunapala, S.; Bandara, S. V.; Liu, J. K.; Hong, W.; Sundaram, M.; Maker, P. D.; Muller, R. E.
1997-01-01
In this paper, we discuss the development of a very sensitive long-wavelength infrared (LWIR) camera based on a GaAs/AlGaAs QWIP focal plane array (FPA) and its performance in terms of quantum efficiency, NEΔT, uniformity, and operability.
A new high-speed IR camera system
NASA Technical Reports Server (NTRS)
Travis, Jeffrey W.; Shu, Peter K.; Jhabvala, Murzy D.; Kasten, Michael S.; Moseley, Samuel H.; Casey, Sean C.; Mcgovern, Lawrence K.; Luers, Philip J.; Dabney, Philip W.; Kaipa, Ravi C.
1994-01-01
A multi-organizational team at the Goddard Space Flight Center is developing a new far infrared (FIR) camera system which furthers the state of the art for this type of instrument by incorporating recent advances in several technological disciplines. All aspects of the camera system are optimized for operation at the high data rates required for astronomical observations in the far infrared. The instrument is built around a Blocked Impurity Band (BIB) detector array which exhibits responsivity over a broad wavelength band and is capable of operating at 1000 frames/sec; the system consists of a focal plane dewar, a compact camera head electronics package, and a Digital Signal Processor (DSP)-based data system residing in a standard 486 personal computer. In this paper we discuss the overall system architecture, the focal plane dewar, and advanced features and design considerations for the electronics. This system, or one derived from it, may prove useful for many commercial and/or industrial infrared imaging or spectroscopic applications, including thermal machine vision for robotic manufacturing, photographic observation of short-duration thermal events such as combustion or chemical reactions, and high-resolution surveillance imaging.
The MVM imaging system and its spacecraft interactions. [Mariner Venus/Mercury TV system performance
NASA Technical Reports Server (NTRS)
Vescelus, F. E.
1975-01-01
The present work describes the main considerations and steps taken in determining the functional design of the imaging system of the Mariner Venus/Mercury (MVM) spacecraft and gives examples of some of the interactions between the spacecraft and the imaging instrument during the design and testing phases. Stringent cost and scheduling constraints dictated the use of the previous Mariner 9 dual-camera TV system. The TV parameters laid the groundwork for the imaging definition. Based on the flyby distances from Venus and Mercury and the goal of surface resolution better than 500 meters per sample pair, calculations were performed for focal length, format size, planetary coverage, and data rates. Some problems encountered in initial mechanical operation and as a result of spacecraft drift during the mission are also discussed.
Marcot, Bruce G.; Jorgenson, M. Torre; DeGange, Anthony R.
2014-01-01
5. A Canon® Rebel 3Ti with a Sigma zoom lens (18–200 mm focal length). The Drift® HD-170 and GoPro® Hero3 cameras were secured to the struts and underwing for nadir (direct downward) imaging. The Panasonic® and Canon® cameras were each hand-held for oblique-angle landscape images, shooting through the airplanes’ windows, targeting both general landscape conditions as well as landscape features of special interest, such as tundra fire scars and landslips. The Drift® and GoPro® cameras each were set for time-lapse photography at 5-second intervals for overlapping coverage. Photographs from all cameras (100 percent .jpg format) were date- and time-synchronized to global positioning system waypoints taken during the flights, also at 5-second intervals, providing precise geotagging (latitude-longitude) of all files. All photographs were adjusted for color saturation and gamma, and nadir photographs were corrected for the Drift® and GoPro® cameras’ 170° wide-angle lens distortion. EXIF (exchangeable image file format) data on camera settings and geotagging were extracted into spreadsheet databases. An additional 1 hour, 20 minutes, and 43 seconds of high-resolution videos were recorded at 60 frames per second with the GoPro® camera along selected transect segments, and also were image-adjusted and corrected for lens distortion. Geotagged locations of 12,395 nadir photographs from the Drift® and GoPro® cameras were overlaid in a geographic information system (ArcMap 10.0) onto a map of 44 ecotypes (land- and water-cover types) of the Arctic Network study area. Presence and area of each ecotype occurring within a geographic information system window centered on the location of each photograph were recorded and included in the spreadsheet databases. All original and adjusted photographs, videos, global positioning system flight tracks, and photograph databases are available by contacting ascweb@usgs.gov.
Toward real-time endoscopically-guided robotic navigation based on a 3D virtual surgical field model
NASA Astrophysics Data System (ADS)
Gong, Yuanzheng; Hu, Danying; Hannaford, Blake; Seibel, Eric J.
2015-03-01
The challenge is to accurately guide the surgical tool within the three-dimensional (3D) surgical field for robotically assisted operations such as tumor margin removal from a debulked brain tumor cavity. The proposed technique is 3D image-guided surgical navigation based on matching intraoperative video frames to a 3D virtual model of the surgical field. A small laser-scanning endoscopic camera was attached to a mock minimally invasive surgical tool that was manipulated toward a region of interest (residual tumor) within a phantom of a debulked brain tumor. Video frames from the endoscope provided features that were matched to the 3D virtual model, which was reconstructed earlier by raster scanning over the surgical field. Camera pose (position and orientation) is recovered by implementing a constrained bundle adjustment algorithm. Navigational error during the approach to the fluorescence target (residual tumor) is determined by comparing the calculated camera pose to the measured camera pose obtained using a micro-positioning stage. From these preliminary results, the computation efficiency of the algorithm in MATLAB code is near real time (2.5 s for each estimation of pose), which can be improved by implementation in C++. Error analysis produced 3 mm distance error and 2.5 degrees of orientation error on average. The sources of these errors are 1) inaccuracy of the 3D virtual model, generated on a calibrated RAVEN robotic platform with stereo tracking; 2) inaccuracy of endoscope intrinsic parameters, such as focal length; and 3) any endoscopic image distortion from scanning irregularities. This work demonstrates the feasibility of micro-camera 3D guidance of a robotic surgical tool.
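A minimal sketch of the camera-pose recovery step, using OpenCV's generic PnP solver as a stand-in for the constrained bundle adjustment described above; the correspondences and intrinsics below are placeholders, not values from the paper.

import numpy as np
import cv2

# Placeholder 2D-3D correspondences: model points from the 3D virtual
# surgical-field model and their matched pixel locations in one frame.
object_points = np.random.rand(20, 3).astype(np.float32)
image_points = (np.random.rand(20, 2) * 500).astype(np.float32)

# Placeholder endoscope intrinsics (focal length, principal point).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)   # assume image distortion already corrected

# Recover camera pose (rotation and translation) from the correspondences.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix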
A LiDAR data-based camera self-calibration method
NASA Astrophysics Data System (ADS)
Xu, Lijun; Feng, Jing; Li, Xiaolu; Chen, Jianjun
2018-07-01
To find the intrinsic parameters of a camera, a LiDAR data-based camera self-calibration method is presented here. Parameters have been estimated using particle swarm optimization (PSO), enhancing the optimal solution of a multivariate cost function. The main procedure of camera intrinsic parameter estimation has three parts: extraction and fine matching of interest points in the images, establishment of a cost function based on the Kruppa equations, and PSO optimization using LiDAR data as the initialization input. To improve the precision of matching pairs, a new method of maximal information coefficient (MIC) and maximum asymmetry score (MAS) was used to remove false matching pairs based on the RANSAC algorithm. Highly precise matching pairs were used to calculate the fundamental matrix so that the new cost function (deduced from the Kruppa equations in terms of the fundamental matrix) was more accurate. The cost function involving four intrinsic parameters was minimized by PSO for the optimal solution. To overcome the issue of the optimization being pushed to a local optimum, LiDAR data were used to determine the scope of the initialization, based on the solution to the P4P problem for the camera focal length. To verify the accuracy and robustness of the proposed method, simulations and experiments were carried out and compared with two typical methods. Simulation results indicated that the intrinsic parameters estimated by the proposed method had absolute errors less than 1.0 pixel and relative errors smaller than 0.01%. Based on ground truth obtained from a meter ruler, the distance inversion accuracy in the experiments was smaller than 1.0 cm. Experimental and simulated results demonstrated that the proposed method was highly accurate and robust.
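A minimal sketch of the PSO minimization stage, with a toy cost function standing in for the Kruppa-equation cost described above; in the paper, the bounds would come from the LiDAR-informed P4P focal-length estimate rather than the arbitrary values used here.

import numpy as np

def pso_minimize(cost, lower, upper, n_particles=40, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic particle swarm optimizer over box bounds (generic sketch)."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    x = rng.uniform(lower, upper, size=(n_particles, lower.size))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lower, upper)
        vals = np.array([cost(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Toy example: quadratic cost around hypothetical intrinsics (fx, fy, cx, cy).
target = np.array([1200.0, 1200.0, 640.0, 480.0])
best, best_val = pso_minimize(lambda p: np.sum((p - target) ** 2),
                              target - 200.0, target + 200.0)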
NASA Astrophysics Data System (ADS)
Yan, Yunxiang; Wang, Gang; Sun, Weimin; Luo, A.-Li; Ma, Zhenyu; Li, Jian; Wang, Shuqing
2017-04-01
Focal ratio degradation (FRD) is a major contributor to throughput and light loss in a fibre spectroscopic telescope system. We combine the guided mode theory in geometric optics and a well-known model, the power distribution model (PDM), to predict and explain the FRD dependence properties. We present a robust method by modifying the energy distribution method with f-intercept to control the input condition. This method provides a way to determine the proper position of the fibre end on the focal plane to improve energy utilization and FRD performance, which lifts the relative throughput up to 95 per cent with a variation of the output focal ratio of less than 2 per cent. This method can also help to optimize the arrangement of the position of the focal-plane plate to enhance the coupling efficiency in a telescope. To investigate the dependence on fibre length, we modified the PDM by introducing a new parameter, the focal distance f, into the original model to make it applicable to a multiposition measurement system. The results show that the modified model is robust and feasible for measuring the key parameter d0 to simulate the transmission characteristics. The output focal ratio in the experiment does not follow the predicted trend but shows an interesting phenomenon: the output focal ratio first increases to a peak, then decreases, and finally remains stable as the fibre length increases beyond 15 m. This provides a reference for choosing the appropriate length of fibre to improve the FRD performance in the design of the fibre system of a telescope.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riot, V J; Olivier, S; Bauman, B
2012-05-24
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele
2012-06-01
Using diffractive micro-lenses configured in an array and placed in close proximity to the focal plane array will enable a small, compact, simultaneous multispectral imaging camera. This approach can be applied to spectral regions from the ultraviolet (UV) to the long-wave infrared (LWIR). The number of simultaneously imaged spectral bands is determined by the number of individually configured diffractive optical micro-lenses (lenslets) in the array. Each lenslet images at a different wavelength determined by the blaze, set at the time of manufacturing based on the application. In addition, modulation of the focal length of the lenslet array with piezoelectric or electro-static actuation will enable spectral band fill-in, allowing hyperspectral imaging. Using the lenslet array with dual-band detectors will increase the number of simultaneous spectral images by a factor of two when utilizing multiple diffraction orders. Configurations and concept designs will be presented for detection applications for biological/chemical agents, buried IEDs and reconnaissance. The simultaneous detection of multiple spectral images in a single frame of data enhances the image processing capability by eliminating temporal differences between colors and enabling a handheld instrument that is insensitive to motion.
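One reason a fixed lenslet-to-detector spacing selects a different band for each lenslet is the strong chromatic dispersion of diffractive lenses, whose focal length scales inversely with wavelength; the relation below is standard background, not a result taken from this paper:

% A diffractive (Fresnel-zone) lens designed for wavelength \lambda_0 with
% focal length f_0 has, to first order,
f(\lambda) \;\approx\; f_0\,\frac{\lambda_0}{\lambda},
% so the blaze/zone layout of each lenslet fixes which wavelength comes
% to focus at the common focal-plane-array distance.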
Software and hardware complex for observation of star occultations by asteroids
NASA Astrophysics Data System (ADS)
Karbovsky, V.; Kleshchonok, V.; Buromsky, M.
2017-12-01
The preparation of the program for observation of star occultations by asteroids on the AZT-2 telescope was started in 2016. A new method for registration of occultations with a CCD camera in the synchronous transfer mode was proposed and developed. A special program was written to control the CCD camera and record images during such observations. The speed of image transfer can vary within wide limits, which makes it possible to carry out observations over a wide range of stellar magnitudes. The AZT-2 telescope is used, which has the largest mirror diameter in Kiev (D = 0.7 m, F = 10.5 m). A 3-fold optical reducer was produced, providing a field of view of 10 arcminutes with the Apogee Alta U47 CCD camera and an equivalent telescope focal length of 3.2 m. The results of test observations are presented. The program is implemented jointly by the Main Astronomical Observatory of the National Academy of Sciences of Ukraine and the Astronomical Observatory of the Taras Shevchenko National University of Kyiv. Regular observations of star occultations by asteroids are planned with the help of this complex.
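For reference, the pixel scale implied by the reduced focal length can be checked with the standard plate-scale relation; in the sketch below the CCD pixel pitch is a placeholder value, not a confirmed specification of the camera.

def plate_scale_arcsec_per_px(focal_length_m, pixel_pitch_um):
    # Plate scale in arcsec/pixel: 206265 * pixel size / focal length.
    return 206265.0 * (pixel_pitch_um * 1e-6) / focal_length_m

f_reduced_m = 3.2          # equivalent focal length quoted above
pitch_um = 13.0            # placeholder pixel pitch (assumption)
print(plate_scale_arcsec_per_px(f_reduced_m, pitch_um))   # ~0.84 arcsec/px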
The Art and Science of Photography in Hand Surgery
Wang, Keming; Kowalski, Evan J.; Chung, Kevin C.
2013-01-01
High-quality medical photography plays an important role in teaching and demonstrating the functional capacity of the hands, as well as in medicolegal documentation. Obtaining standardized, high-quality photographs is now an essential component of many surgery practices. The importance of standardized photography in facial and cosmetic surgery has been well documented in previous studies, but no studies have thoroughly addressed the details of photography for hand surgery. In this paper, we will provide a set of guidelines and basic camera concepts for different scenarios to help hand surgeons obtain appropriate and informative, high-quality photographs. A camera used for medical photography should come equipped with a large sensor and an optical zoom lens with a focal length ranging anywhere from 14 to 75 mm. In a clinic or office setting, we recommend six standardized views of the hand and four views for the wrist, and additional views should be taken for tendon ruptures, nerve injuries, or other deformities of the hand. For intra-operative pictures, the camera operator should understand the procedure and pertinent anatomy in order to properly obtain high-quality photographs. When digital radiographs are not available, and radiographic film must be photographed, it is recommended to reduce the exposure and change the color mode to black and white to obtain the best possible pictures. The goal of medical photography is to present the subject in an accurate and precise fashion. PMID:23755927
Overview of the Multi-Spectral Imager on the NEAR spacecraft
NASA Astrophysics Data System (ADS)
Hawkins, S. E., III
1996-07-01
The Multi-Spectral Imager on the Near Earth Asteroid Rendezvous (NEAR) spacecraft is a 1 Hz frame rate CCD camera sensitive in the visible and near infrared bands (~400-1100 nm). MSI is the primary instrument on the spacecraft to determine morphology and composition of the surface of asteroid 433 Eros. In addition, the camera will be used to assist in navigation to the asteroid. The instrument uses refractive optics and has an eight-position spectral filter wheel to select different wavelength bands. The MSI optical focal length of 168 mm gives a 2.9° × 2.25° field of view. The CCD is passively cooled and the 537×244 pixel array output is digitized to 12 bits. Electronic shuttering increases the effective dynamic range of the instrument by more than a factor of 100. A one-time deployable cover protects the instrument during ground testing operations and launch. A reduced aperture viewport permits full field of view imaging while the cover is in place. A Data Processing Unit (DPU) provides the digital interface between the spacecraft and the Camera Head and uses an RTX2010 processor. The DPU provides an eight-frame image buffer, lossy and lossless data compression routines, and automatic exposure control. An overview of the instrument is presented and design parameters and trade-offs are discussed.
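The quoted field of view follows directly from the focal length and the physical size of the detector; in the sketch below the detector dimensions are back-computed from the stated FOV for illustration and are not published sensor specifications.

import math

def full_fov_deg(detector_size_mm, focal_length_mm):
    # Full field of view in degrees for a detector of the given extent.
    return 2.0 * math.degrees(math.atan(detector_size_mm / (2.0 * focal_length_mm)))

f_mm = 168.0                        # MSI focal length quoted above
width_mm, height_mm = 8.5, 6.6      # assumed active-area dimensions (illustrative)
print(full_fov_deg(width_mm, f_mm), full_fov_deg(height_mm, f_mm))   # ~2.9 x 2.25 deg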
Imaging of Mercury and Venus from a flyby
Murray, B.C.; Belton, M.J.S.; Danielson, G. Edward; Davies, M.E.; Kuiper, G.P.; O'Leary, B. T.; Suomi, V.E.; Trask, N.J.
1971-01-01
This paper describes the results of a study of an imaging experiment planned for the 1973 Mariner Venus/Mercury flyby mission. Scientific objectives, mission constraints, analysis of alternative systems, and the rationale for the final choice are presented. Severe financial constraints ruled out the best technical alternative for flyby imaging, a film/readout system, or even significant re-design of previous Mariner vidicon camera/tape recorder systems. The final selection was a vidicon camera quite similar to that used for Mariner Mars 1971, but with the capability of real time transmission during the Venus and Mercury flybys. Real time data return became possible through a dramatic increase in the communications bandwidth at only modest sacrifice in the quality of the returned pictures. Two identical long focal length cameras (1500 mm) were selected, and it will be possible to return several thousand pictures from both planets at resolutions ranging from equivalent to Earthbased to tenths of a kilometer at encounter. Systematic high resolution ultraviolet photography of Venus is planned after encounter in an attempt to understand the nature of the mysterious ultraviolet markings and their apparent 4- to 5-day rotation period. Full disk coverage in mosaics will produce pictures of both planets similar in quality to Earthbased telescopic pictures of the Moon. The increase of resolution, more than three orders of magnitude, will yield an exciting first look at two planets whose closeup appearance is unknown. © 1971.
Multi-Angle Snowflake Camera Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shkurko, Konstantin; Garrett, T.; Gaustad, K
The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of depth of field for its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: images rely on the OpenCV image processing library and derived aggregated statistics rely on some clever averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
An optical wavefront sensor based on a double layer microlens array.
Lin, Vinna; Wei, Hsiang-Chun; Hsieh, Hsin-Ta; Su, Guo-Dung John
2011-01-01
In order to determine light aberrations, Shack-Hartmann optical wavefront sensors make use of microlens arrays (MLA) to divide the incident light into small parts and focus them onto image planes. In this paper, we present the design and fabrication of long-focal-length MLAs with various shapes and arrangements, based on a double-layer structure, for optical wavefront sensing applications. A longer focal length MLA provides higher sensitivity in determining the average slope across each microlens for a given wavefront, and the spatial resolution of a wavefront sensor is increased by increasing the number of microlenses across the detector. In order to extend the focal length, we used polydimethylsiloxane (PDMS) above the MLA on a glass substrate. Because of the small refractive index difference at the PDMS-MLA (UV-resin) interface, the incident light is refracted less and focused at a greater distance. Other specific focal lengths could also be realized by modifying the refractive index difference without changing the MLA size. Thus, the wavefront sensor could be improved with better sensitivity and higher spatial resolution.
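A minimal sketch of how a Shack-Hartmann sensor converts spot displacements into local wavefront slopes, which also shows why a longer lenslet focal length increases sensitivity; the numbers below are placeholders, not values from the paper.

import numpy as np

def wavefront_slopes(spot_xy, ref_xy, focal_length_um, pixel_pitch_um):
    # Local wavefront slopes (radians) from centroid shifts: for the same
    # slope, a longer focal length produces a larger spot shift, so the
    # measurable slope resolution improves with focal length.
    shift_um = (np.asarray(spot_xy) - np.asarray(ref_xy)) * pixel_pitch_um
    return shift_um / focal_length_um      # small-angle approximation

ref = np.zeros((4, 2))                                                # plane-wave reference spots
meas = np.array([[0.4, -0.1], [0.2, 0.3], [-0.5, 0.0], [0.1, 0.1]])   # measured centroids (px)
print(wavefront_slopes(meas, ref, focal_length_um=10_000.0, pixel_pitch_um=5.5))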
Optimization of confocal laser induced fluorescence for long focal length applications
NASA Astrophysics Data System (ADS)
Jemiolo, Andrew J.; Henriquez, Miguel F.; Thompson, Derek S.; Scime, Earl E.
2017-10-01
Laser induced fluorescence (LIF) is a non-perturbative diagnostic for measuring ion and neutral particle velocities and temperatures in a plasma. The conventional method for single-photon LIF requires intersecting optical paths for light injection and collection. The multiple vacuum windows needed for such measurements are unavailable in many plasma experiments. Confocal LIF eliminates the need for perpendicular intersecting optical paths by using concentric injection and collection paths through a single window. One of the main challenges with using confocal LIF is achieving high resolution measurements at the longer focal lengths needed for many plasma experiments. We present confocal LIF measurements in HELIX, a helicon plasma experiment at West Virginia University, demonstrating spatial resolution dependence on focal length and spatial filtering. By combining aberration mitigating optics with spatial filtering, our results show high resolution measurements at focal lengths of 0.5 m, long enough to access the interiors of many laboratory plasma experiments. This work was supported by U.S. National Science Foundation Grant No. PHY-1360278.
Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei
2012-01-01
We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profiles of patients. A pixel-coregistered, simultaneously reading mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements, especially for objects with unknown emissivity. Even the dual-band measurement can provide inaccurate results due to the fact that emissivity is a function of wavelength. Thus we have been developing a four-band QWIP camera for accurate temperature measurement of remote objects.
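To illustrate why a second band helps with unknown emissivity, the sketch below implements textbook two-band (ratio) thermometry under a gray-body assumption; the emissivity cancels only if it is equal in both bands, which is exactly the wavelength dependence identified above as the remaining error source. Band centers and values are illustrative, not instrument parameters.

import numpy as np
from scipy.optimize import brentq

C1, C2 = 1.191e-16, 1.4388e-2   # radiation constants (W m^2 sr^-1, m K)

def planck_radiance(wavelength_m, temp_k):
    # Planck spectral radiance; only the ratio between bands matters below.
    return C1 / wavelength_m**5 / (np.exp(C2 / (wavelength_m * temp_k)) - 1.0)

def two_band_temperature(L_mwir, L_lwir, lam_mwir=4.5e-6, lam_lwir=8.5e-6):
    # Solve for T whose Planck radiance ratio matches the measured ratio;
    # equal (gray-body) emissivity in both bands is assumed and cancels.
    target = L_mwir / L_lwir
    f = lambda T: planck_radiance(lam_mwir, T) / planck_radiance(lam_lwir, T) - target
    return brentq(f, 200.0, 2000.0)

# Synthetic check: a 310 K surface with emissivity 0.95 in both bands.
L1 = 0.95 * planck_radiance(4.5e-6, 310.0)
L2 = 0.95 * planck_radiance(8.5e-6, 310.0)
print(two_band_temperature(L1, L2))   # recovers ~310 K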
Light ray field capture using focal plane sweeping and its optical reconstruction using 3D displays.
Park, Jae-Hyeung; Lee, Sung-Keun; Jo, Na-Young; Kim, Hee-Jae; Kim, Yong-Soo; Lim, Hong-Gi
2014-10-20
We propose a method to capture the light ray field of a three-dimensional scene using focal plane sweeping. Multiple images are captured with a conventional camera at different focal distances spanning the three-dimensional scene. The captured images are then back-projected into four-dimensional spatio-angular space to obtain the light ray field. The obtained light ray field can be visualized either by digital processing or by optical reconstruction using various three-dimensional display techniques, including integral imaging, layered displays, and holography.
Liao, Jun; Wang, Zhe; Zhang, Zibang; Bian, Zichao; Guo, Kaikai; Nambiar, Aparna; Jiang, Yutong; Jiang, Shaowei; Zhong, Jingang; Choma, Michael; Zheng, Guoan
2018-02-01
We report the development of a multichannel microscopy for whole-slide multiplane, multispectral and phase imaging. We use trinocular heads to split the beam path into 6 independent channels and employ a camera array for parallel data acquisition, achieving a maximum data throughput of approximately 1 gigapixel per second. To perform single-frame rapid autofocusing, we place 2 near-infrared light-emitting diodes (LEDs) at the back focal plane of the condenser lens to illuminate the sample from 2 different incident angles. A hot mirror is used to direct the near-infrared light to an autofocusing camera. For multiplane whole-slide imaging (WSI), we acquire 6 different focal planes of a thick specimen simultaneously. For multispectral WSI, we relay the 6 independent image planes to the same focal position and simultaneously acquire information at 6 spectral bands. For whole-slide phase imaging, we acquire images at 3 focal positions simultaneously and use the transport-of-intensity equation to recover the phase information. We also provide an open-source design to further increase the number of channels from 6 to 15. The reported platform provides a simple solution for multiplexed fluorescence imaging and multimodal WSI. Acquiring an instant focal stack without z-scanning may also enable fast 3-dimensional dynamic tracking of various biological samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Close Range Calibration of Long Focal Length Lenses in a Changing Environment
NASA Astrophysics Data System (ADS)
Robson, Stuart; MacDonald, Lindsay; Kyle, Stephen; Shortis, Mark R.
2016-06-01
University College London is currently developing a large-scale multi-camera system for dimensional control tasks in manufacturing, including part machining, assembly and tracking, as part of the Light Controlled Factory project funded by the UK Engineering and Physical Science Research Council. In parallel, as part of the EU LUMINAR project funded by the European Association of National Metrology Institutes, refraction models of the atmosphere in factory environments are being developed with the intent of modelling and eliminating the effects of temperature and other variations. The accuracy requirements for both projects are extremely demanding, so accordingly improvements in the modelling of both camera imaging and the measurement environment are essential. At the junction of these two projects lies close range camera calibration. The accurate and reliable calibration of cameras across a realistic range of atmospheric conditions in the factory environment is vital in order to eliminate systematic errors. This paper demonstrates the challenge of experimentally isolating environmental effects at the level of a few tens of microns. Longer lines of sight promote the use and calibration of a near perfect perspective projection from a Kern 75 mm lens with maximum radial distortion of the order of 0.5 µm. Coordination of a reference target array, representing a manufactured part, is achieved to better than 0.1 mm at a standoff of 8 m. More widely, the results contribute to better sensor understanding, improved mathematical modelling of factory environments and more reliable coordination of targets to 0.1 mm and better over large volumes.
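A generic sketch of the odd-order radial distortion polynomial used in close-range photogrammetric camera models, included to indicate the scale of the correction being discussed; the coefficients are illustrative and are not the calibration values from this work.

import numpy as np

def apply_radial_distortion(xy_mm, k1, k2=0.0, k3=0.0):
    # Standard balanced radial model: dr = k1*r^3 + k2*r^5 + k3*r^7 (mm),
    # applied radially about the principal point; returns dr and the
    # distorted coordinates.
    xy = np.asarray(xy_mm, dtype=float)
    r = np.hypot(xy[:, 0], xy[:, 1])
    dr = k1 * r**3 + k2 * r**5 + k3 * r**7
    safe_r = np.where(r > 0, r, 1.0)
    return dr, xy * (1.0 + dr / safe_r)[:, None]

# Coefficients chosen so the peak displacement stays well below a micron
# over a ~35 mm semi-diagonal, in the spirit of the near-distortion-free lens above.
pts = np.array([[10.0, 0.0], [20.0, 15.0], [30.0, 20.0]])
dr, distorted = apply_radial_distortion(pts, k1=1e-9)
print(dr * 1000.0)   # radial displacement in micrometres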
NASA Astrophysics Data System (ADS)
Buisset, Christophe; Prasit, Apirat; Lépine, Thierry; Poshyachinda, Saran; Soonthornthum, Boonrucksar; Deboos, Alexis
2016-07-01
The National Astronomical Research Institute (NARIT) is currently developing an all-spherical, five-lens focal reducer to image a circular FOV of diameter Δθ = 14.6' on the 4K camera with a pixel scale equal to 0.42''/pixel. The spatial resolution will be better than 1.2'' over the full visible spectral domain [400 nm, 800 nm]. The relative irradiance between the ghost and the science images will be lower than 10^-4. The maximum distortion will be lower than 1% and the maximum angle of incidence on the filters will be equal to 8°. The focal reducer comprises a doublet L1 located at the fork entrance and a triplet L2 located in front of the camera. The doublet L1 will be mounted on a tip-tilt mount placed on a robotic sliding rail; L1 will thus be placed in the optical path during observations with the 4K camera and removed during observations with the other instruments. The triplet L2 will be installed on the instrument cube in front of the camera, which is equipped with the filter wheel. The glass will be manufactured by a specialized company, the mechanical parts will be manufactured using the NARIT Computer Numerical Control machine, and the lenses will be integrated at NARIT. In this paper, we describe the optical and mechanical designs and present the geometrical performance, the transmission budget and the results of the stray light analyses.
Coherent infrared imaging camera (CIRIC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.
1995-07-01
New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.
NASA Astrophysics Data System (ADS)
Deng, Shengfeng; Lyu, Jinke; Sun, Hongda; Cui, Xiaobin; Wang, Tun; Lu, Miao
2015-03-01
A chirped artificial compound eye on a curved surface was fabricated using an optical resin and then mounted on the end of an endoscopic imaging fiber bundle. The focal length of each lenslet on the curved surface was varied across the array to realize a flat focal plane that matched the planar end surface of the fiber bundle. The variation of the focal length was obtained by using a photoresist mold formed by dose-modulated laser lithography and subsequent thermal reflow. The imaging performance of the fiber bundle was characterized by coupling with a coaxial light microscope, and the result demonstrated a larger field of view and better imaging quality than that of an artificial compound eye with a uniform focal length. Accordingly, this technology has potential application in stereoscopic endoscopy.
NASA Astrophysics Data System (ADS)
Dai, Wanwan; Xie, Xingwang; Li, Dapeng; Han, Xinjie; Liu, Zhonglun; Wei, Dong; Xin, Zhaowei; Zhang, Xinyu; Wang, Haiwei; Xie, Changsheng
2018-02-01
Under conditions of intense turbulence, an object's wavefront may be severely distorted, so wavefront sensors based on a traditional microlens array (MLA) with a fixed focal length cannot measure the wavefront effectively. In order to obtain a larger measurement range and higher measurement accuracy, we propose a liquid-crystal microlens array (LCMLA) with the ability to swing the focus over the focal plane and further adjust the focal length, constructed with dual patterned ITO electrodes. The main structure of the LCMLA is divided into two layers, which are made of glass substrates with ITO transparent electrodes. The top layer of each liquid-crystal microlens consists of four rectangular electrodes, and the bottom layer is a circular electrode. In typical optical measurements, operations are carried out such as applying the same signal voltage to the four electrodes of each microlens to adjust the focal length of the lens cell, and applying signal voltages with different RMS amplitudes to adjust the focus position on the focal plane. Experiments show that the LCMLA we developed demonstrates the desired focal-length-adjustment function and dynamic swing ability, indicating that the method can be used not only to measure the wavefront but also to correct a wavefront with strong distortion.
CCD TV focal plane guider development and comparison to SIRTF applications
NASA Technical Reports Server (NTRS)
Rank, David M.
1989-01-01
It is expected that the SIRTF payload will use a CCD TV focal plane fine guidance sensor to provide acquisition of sources and tracking stability of the telescope. Work to develop CCD TV cameras and guiders has been carried out at Lick Observatory for several years and has produced state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory, designed to characterize present CCD autoguiding technology and relate it to SIRTF applications, are presented. Two different types of CCD cameras were constructed, using virtual-phase and buried-channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.
Evaluate depth of field limits of fixed focus lens arrangements in thermal infrared
NASA Astrophysics Data System (ADS)
Schuster, Norbert
2016-05-01
More and more modern thermal imaging systems use uncooled detectors. High-volume applications work with detectors that have a reduced pixel count (typically between 200x150 and 640x480). This reduces the usefulness of modern image treatment procedures such as wavefront coding. On the other hand, uncooled detectors demand lenses with fast f-numbers, near f/1.0, which reduces the expected Depth of Field (DoF). What are the limits on resolution if the target changes distance to the camera system? The desire to implement lens arrangements without a focusing mechanism demands a deeper quantification of the DoF problem. A new approach avoids the classic "accepted image blur circle" and quantifies the expected DoF by the through-focus MTF of the lens. This function is defined for a spatial frequency that provides a straightforward relation to the pixel pitch of the imaging device. A certain minimum MTF level is necessary so that the complete thermal imaging system can realize its basic functions, such as recognition or detection of specified targets. Very often, this technical tradeoff is approved with a certain lens. But what is the impact of changing the lens for one with a different focal length? Narrow-field lenses, which give more details of targets at longer distances, tighten the DoF problem. A first orientation is given by the hyperfocal distance, which depends quadratically on the focal length and linearly on the through-focus MTF of the lens. The analysis of these relations shows the contradicting requirements between higher thermal and spatial resolution, faster f-number and desired DoF. Furthermore, the hyperfocal distance defines the DoF borders, whose relation follows the first-order imaging formulas. A calculation methodology will be presented to transfer DoF results from an approved lens-camera combination to another lens used with the initial camera. The necessary input for this prediction is the accepted DoF of the initial combination and the through-focus MTFs of both lenses. The accepted DoF of the initial combination defines an application- and camera-related MTF level, which must also be provided by the new lens. Examples are provided. The formula for the Diffraction-Limited Through-Focus MTF (DLTF) quantifies the physical limit and works without any ray trace. This relation respects the pixel pitch, the waveband and the aperture-based f-number, but is independent of detector size. The DLTF has a steeper slope than the ray-traced through-focus MTF; its maximum is the diffraction limit. The DLTF predicts the DoF relations quite precisely. Differences from ray-trace results are discussed. Final calculations with modern detectors show that a statically chosen MTF level does not reflect the reality of the DoF problem. The MTF level to respect depends on the application, pixel pitch, IR camera and image treatment. A value of 0.250 at the detector Nyquist frequency seems to be a reasonable starting point for uncooled FPAs with 17 μm pixel pitch.
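For orientation, the classic blur-circle form of the hyperfocal distance and the first-order DoF borders referred to above are sketched below; the paper replaces the fixed blur circle with a through-focus-MTF criterion, so the blur-circle value used here (two pixel pitches) is only an assumption for illustration.

def hyperfocal_m(focal_length_mm, f_number, blur_circle_mm):
    # Classic hyperfocal distance H = f^2 / (N * c) + f, returned in metres.
    f = focal_length_mm
    return (f * f / (f_number * blur_circle_mm) + f) / 1000.0

def dof_borders_m(focus_dist_m, hyperfocal_dist_m, focal_length_mm):
    # First-order near/far DoF limits for a lens focused at focus_dist_m.
    f_m = focal_length_mm / 1000.0
    near = hyperfocal_dist_m * focus_dist_m / (hyperfocal_dist_m + (focus_dist_m - f_m))
    far = (hyperfocal_dist_m * focus_dist_m / (hyperfocal_dist_m - (focus_dist_m - f_m))
           if focus_dist_m < hyperfocal_dist_m else float("inf"))
    return near, far

# Illustrative uncooled-LWIR case: 35 mm f/1.0 lens, 17 um pixel pitch,
# blur circle assumed equal to two pixel pitches (0.034 mm).
H = hyperfocal_m(35.0, 1.0, 0.034)
print(H, dof_borders_m(20.0, H, 35.0))   # ~36 m hyperfocal; DoF roughly 12.9-44.8 m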
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rachel F. Brem; Jocelyn A. Rapelyea; , Gilat Zisman
2005-08-01
To prospectively evaluate a high-resolution breast-specific gamma camera for depicting occult breast cancer in women at high risk for breast cancer but with normal mammographic and physical examination findings. MATERIALS AND METHODS: Institutional Review Board approval and informed consent were obtained. The study was HIPAA compliant. Ninety-four high-risk women (age range, 36-78 years; mean, 55 years) with normal mammographic (Breast Imaging Reporting and Data System [BI-RADS] 1 or 2) and physical examination findings were evaluated with scintimammography. After injection with 25-30 mCi (925-1110 MBq) of technetium 99m sestamibi, patients were imaged with a high-resolution small-field-of-view breast-specific gamma camera in craniocaudal and mediolateral oblique projections. Scintimammograms were prospectively classified according to focal radiotracer uptake as normal (score of 1), with no focal or diffuse uptake; benign (score of 2), with minimal patchy uptake; probably benign (score of 3), with scattered patchy uptake; probably abnormal (score of 4), with mild focal radiotracer uptake; and abnormal (score of 5), with marked focal radiotracer uptake. Mammographic breast density was categorized according to BI-RADS criteria. Patients with normal scintimammograms (scores of 1, 2, or 3) were followed up for 1 year with an annual mammogram, physical examination, and repeat scintimammography. Patients with abnormal scintimammograms (scores of 4 or 5) underwent ultrasonography (US), and those with focal hypoechoic lesions underwent biopsy. If no lesion was found during US, patients were followed up with scintimammography. Specific pathologic findings were compared with scintimammographic findings. RESULTS: Of 94 women, 78 (83%) had normal scintimammograms (score of 1, 2, or 3) at initial examination and 16 (17%) had abnormal scintimammograms (score of 4 or 5). Fourteen (88%) of the 16 patients had either benign findings at biopsy or no focal abnormality at US; in two (12%) patients, invasive carcinoma was diagnosed at US-guided biopsy (9 mm each at pathologic examination). CONCLUSION: High-resolution breast-specific scintimammography can depict small (<1-cm), mammographically occult, nonpalpable lesions in women at increased risk for breast cancer not otherwise identified at mammography or physical examination.
NASA Astrophysics Data System (ADS)
Clapuyt, Francois; Vanacker, Veerle; Van Oost, Kristof
2016-05-01
The combination of UAV-based aerial pictures and the Structure-from-Motion (SfM) algorithm provides an efficient, low-cost and rapid framework for remote sensing and monitoring of dynamic natural environments. This methodology is particularly suitable for repeated topographic surveys in remote or poorly accessible areas. However, temporal analysis of landform topography requires high accuracy of measurements and reproducibility of the methodology, as differencing of digital surface models leads to error propagation. In order to assess the repeatability of the SfM technique, we surveyed a study area characterized by gentle topography with a UAV platform equipped with a standard reflex camera, and varied the focal length of the camera and the location of georeferencing targets between flights. Comparison of the different SfM-derived topography datasets shows that the precision of measurements is of the order of centimetres for identical replications, which highlights the excellent performance of the SfM workflow, all parameters being equal. The precision is one order of magnitude higher for 3D topographic reconstructions involving independent sets of ground control points, which results from the fact that the accuracy of the localisation of ground control points strongly propagates into the final results.
Exploring the Universe with the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
1990-01-01
A general overview is given of the operations, engineering challenges, and components of the Hubble Space Telescope. Deployment, checkout and servicing in space are discussed. The optical telescope assembly, focal plane scientific instruments, wide field/planetary camera, faint object spectrograph, faint object camera, Goddard high resolution spectrograph, high speed photometer, fine guidance sensors, second generation technology, and support systems and services are reviewed.
SHORT COMMUNICATION: An image processing approach to calibration of hydrometers
NASA Astrophysics Data System (ADS)
Lorefice, S.; Malengo, A.
2004-06-01
The usual method adopted for multipoint calibration of glass hydrometers is based on the measurement of the buoyancy by hydrostatic weighing when the hydrometer is plunged in a reference liquid up to the scale mark to be calibrated. An image processing approach is proposed by the authors to align the relevant scale mark with the reference liquid surface level. The method uses image analysis with a data processing technique and takes into account the perspective error. For this purpose a CCD camera with a pixel matrix of 604H × 576V and a lens of 16 mm focal length were used. High accuracy in the hydrometer reading was obtained as the resulting reading uncertainty was lower than 0.02 mm, about a fifth of the usual figure with the visual reading made by an operator.
A liquid crystal microlens array with aluminum and graphene electrodes for plenoptic imaging
NASA Astrophysics Data System (ADS)
Lei, Yu; Tong, Qing; Luo, Jun; Zhang, Xinyu; Sang, Hongshi; Xie, Changsheng
2015-12-01
Currently, several semiconducting oxide materials, most typically indium tin oxide, are widely used as transparent conducting electrodes (TCEs) in liquid crystal microlens arrays. In this paper, we fabricate a liquid crystal microlens array using graphene rather than a semiconducting oxide as the TCE. Optical experiments are carried out to acquire the focusing features of the electrically driven graphene-based liquid crystal microlens array (GLCMLA). The acquired optical fields show that the GLCMLA can converge incident collimated light efficiently. The relationship between the focal length and the applied voltage signal is presented. The GLCMLA is then deployed in a plenoptic camera prototype and raw images are acquired to verify its imaging capability. Our experiments demonstrate that graphene presents a broad application prospect in the area of adaptive optics.
Monocular Vision-Based Underwater Object Detection
Zhang, Zhen; Dai, Fengzhao; Bu, Yang; Wang, Huibin
2017-01-01
In this paper, we propose an underwater object detection method using monocular vision sensors. In addition to commonly used visual features such as color and intensity, we investigate the potential of underwater object detection using light transmission information. The global contrast of various features is used to initially identify the region of interest (ROI), which is then filtered by the image segmentation method, producing the final underwater object detection results. We test the performance of our method with diverse underwater datasets. Samples of the datasets are acquired by a monocular camera with different qualities (such as resolution and focal length) and setups (viewing distance, viewing angle, and optical environment). It is demonstrated that our ROI detection method is necessary and can largely remove the background noise and significantly increase the accuracy of our underwater object detection method. PMID:28771194
Chalcogenide glass microlenses by inkjet printing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, Eric A.; Waldmann, Maike; Arnold, Craig B.
We demonstrate micrometer scale mid-IR lenses for integrated optics, using solution-based inkjet printing techniques and subsequent processing. Arsenic sulfide spherical microlenses with diameters of 10-350 μm and focal lengths of 10-700 μm have been fabricated. The baking conditions can be used to tune the precise focal length.
Site Characterization and Analysis Penetrometer System (SCAPS) Heavy Metal Sensors
2003-04-01
gigawatts per square centimeter (GW/cm2). Within the small volume about the focal point, rapid heating, vaporization, and ionization of a small amount of the ... 2001). Briefly, the light emitted from the fiber in the probe is collimated, turned by a quartz prism, and focused by a short-focal-length quartz ... are decoupled using the configuration shown in Figure 1. The excitation beam is focused by a 25-cm focal length lens through a 2-inch aluminum mirror
Multi-image acquisition-based distance sensor using agile laser spot beam.
Riza, Nabeel A; Amin, M Junaid
2014-09-01
We present a novel laser-based distance measurement technique that uses multiple-image-based spatial processing to enable distance measurements. Compared with the first-generation distance sensor using spatial processing, the modified sensor is no longer hindered by the classic Rayleigh axial resolution limit for the propagating laser beam at its minimum beam waist location. The proposed high-resolution distance sensor design uses an electronically controlled variable focus lens (ECVFL) in combination with an optical imaging device, such as a charge-coupled device (CCD), to produce and capture laser spot images of different sizes on a target, with these beam spot sizes differing from the minimal spot size possible at this target distance. By exploiting the unique relationship of the target located spot sizes with the varying ECVFL focal length for each target distance, the proposed distance sensor can compute the target distance with a distance measurement resolution better than the axial resolution via the Rayleigh resolution criterion. Using a 30 mW 633 nm He-Ne laser coupled with an electromagnetically actuated liquid ECVFL, along with a 20 cm focal length bias lens, and using five spot images captured per target position by a CCD-based Nikon camera, a proof-of-concept proposed distance sensor is successfully implemented in the laboratory over target ranges from 10 to 100 cm with a demonstrated sub-cm axial resolution, which is better than the axial Rayleigh resolution limit at these target distances. Applications for the proposed potentially cost-effective distance sensor are diverse and include industrial inspection and measurement and 3D object shape mapping and imaging.
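The distance recovery hinges on the relationship between the ECVFL focal length and the spot size produced on the target. The sketch below is a minimal Gaussian-beam thin-lens model (not the authors' code; the input waist and focal-length values are assumptions) showing how each candidate target distance maps to a distinct spot-size-versus-focal-length curve.

```python
import numpy as np

# Illustrative sketch only: for a fixed target distance, the spot radius on the
# target traces a unique curve as the ECVFL focal length is swept; matching
# measured spot sizes to these curves recovers the distance.

WAVELENGTH = 633e-9      # He-Ne laser, m
W0 = 0.4e-3              # assumed collimated waist at the lens, m

def spot_radius(f_lens, d_target, w0=W0, wl=WAVELENGTH):
    """Gaussian-beam spot radius on a target at distance d_target (m) behind a
    thin lens of focal length f_lens (m), with the input waist at the lens."""
    zr = np.pi * w0**2 / wl                     # input Rayleigh range
    w0p = w0 * f_lens / np.hypot(f_lens, zr)    # waist after the lens
    zp = f_lens * zr**2 / (f_lens**2 + zr**2)   # waist location after the lens
    zrp = np.pi * w0p**2 / wl                   # new Rayleigh range
    return w0p * np.sqrt(1.0 + ((d_target - zp) / zrp)**2)

# Sweep the tunable focal length for a few candidate target distances.
for d in (0.10, 0.50, 1.00):                    # target distances, m
    radii = [spot_radius(f, d) for f in (0.15, 0.18, 0.20, 0.22, 0.25)]
    print(d, [f"{r*1e3:.2f} mm" for r in radii])
```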
Positive focal shift of gallium nitride high contrast grating focusing reflectors
NASA Astrophysics Data System (ADS)
He, Shumin; Wang, Zhenhai; Liu, Qifa
2016-09-01
We design a type of metasurface capable of serving as a visible-light focusing reflector based on gallium nitride (GaN) high contrast gratings (HCGs). The wavefront of the reflected light is precisely manipulated by spatial variation of the grating periods along the subwavelength ridge array to achieve light focusing. In contrast to the conventional negative focal shift effect, a positive focal shift is observed in such focusing reflectors. Detailed investigations of the influence of device size on the focusing performance, especially the focal length, are performed via the finite element method. The results show that all performance parameters are greatly affected by the reflector size. A more concentrated focal point, or a better focusing capability, can be achieved with a larger size. With increasing reflector size, the achieved focal length decreases and gradually approaches the design value, and thus the corresponding positive focal shift decreases. Our results are helpful for understanding the visible-light control of planar HCG-based focusing reflectors.
Variable focal length deformable mirror
Headley, Daniel [Albuquerque, NM; Ramsey, Marc [Albuquerque, NM; Schwarz, Jens [Albuquerque, NM
2007-06-12
A variable focal length deformable mirror has an inner ring and an outer ring that simply support and push axially on opposite sides of a mirror plate. The resulting variable clamping force deforms the mirror plate to provide a parabolic mirror shape. The rings are parallel planar sections of a single paraboloid and can provide an on-axis focus, if the rings are circular, or an off-axis focus, if the rings are elliptical. The focal length of the deformable mirror can be varied by changing the variable clamping force. The deformable mirror can generally be used in any application requiring the focusing or defocusing of light, including with both coherent and incoherent light sources.
Hyper Suprime-Cam: Camera dewar design
NASA Astrophysics Data System (ADS)
Komiyama, Yutaka; Obuchi, Yoshiyuki; Nakaya, Hidehiko; Kamata, Yukiko; Kawanomoto, Satoshi; Utsumi, Yousuke; Miyazaki, Satoshi; Uraguchi, Fumihiro; Furusawa, Hisanori; Morokuma, Tomoki; Uchida, Tomohisa; Miyatake, Hironao; Mineo, Sogo; Fujimori, Hiroki; Aihara, Hiroaki; Karoji, Hiroshi; Gunn, James E.; Wang, Shiang-Yu
2018-01-01
This paper describes the detailed design of the CCD dewar and the camera system which is a part of the wide-field imager Hyper Suprime-Cam (HSC) on the 8.2 m Subaru Telescope. On the 1.5° diameter focal plane (497 mm in physical size), 116 four-side buttable 2 k × 4 k fully depleted CCDs are tiled with 0.3 mm gaps between adjacent chips, which are cooled down to -100°C by two pulse tube coolers with a capability to exhaust 100 W of heat at -100°C. The design of the dewar is basically a natural extension of Suprime-Cam, incorporating some improvements such as (1) a detailed CCD positioning strategy to avoid any collision between CCDs while maximizing the filling factor of the focal plane, (2) a spherical-washer mechanism adopted at the interface points to prevent any deformation caused by tilt of the interface surface from being transferred to the focal plane, (3) the employment of a truncated-cone-shaped window, made of synthetic silica, to save the back focal space, and (4) a passive heat transfer mechanism to efficiently exhaust the heat generated from the CCD readout electronics which are accommodated inside the dewar. Extensive simulations using a finite-element analysis (FEA) method are carried out to verify that the design of the dewar is sufficient to satisfy the assigned errors. We also perform verification tests using the actually assembled CCD dewar to supplement the FEA and demonstrate that the design is adequate to ensure an excellent image quality which is key to the HSC. The details of the camera system, including the control computer system, are described as well as the assembling process of the dewar and the process of installation on the telescope.
An Optical Wavefront Sensor Based on a Double Layer Microlens Array
Lin, Vinna; Wei, Hsiang-Chun; Hsieh, Hsin-Ta; Su, Guo-Dung John
2011-01-01
In order to determine light aberrations, Shack-Hartmann optical wavefront sensors make use of microlens arrays (MLA) to divide the incident light into small parts and focus them onto image planes. In this paper, we present the design and fabrication of long focal length MLA with various shapes and arrangements based on a double layer structure for optical wavefront sensing applications. A longer focal length MLA could provide higher sensitivity in determining the average slope across each microlens under a given wavefront, and the spatial resolution of a wavefront sensor is increased by the number of microlenses across the detector. In order to extend the focal length, we used polydimethylsiloxane (PDMS) above the MLA on a glass substrate. Because of the small refractive index difference at the PDMS and MLA (UV-resin) interface, the incident light is less strongly refracted and is focused at a greater distance. Other specific focal lengths could also be realized by modifying the refractive index difference without changing the MLA size. Thus, the wavefront sensor could be improved with better sensitivity and higher spatial resolution. PMID:22346643
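The focal-length extension follows from simple paraxial scaling: for a plano-convex lenslet, f = R / (n_lens − n_medium), so embedding the UV-resin microlenses in PDMS shrinks the index step and lengthens f. A back-of-the-envelope sketch with assumed index and radius values (not the paper's design figures):

```python
# Paraxial plano-convex focal length: f = R / (n_lens - n_medium).
# All numbers below are illustrative assumptions.

def plano_convex_focal_length(radius_mm, n_lens, n_medium=1.0):
    """Paraxial focal length (mm) of a plano-convex microlens of curvature
    radius radius_mm embedded in a medium of refractive index n_medium."""
    return radius_mm / (n_lens - n_medium)

R = 0.15          # assumed radius of curvature, mm
N_RESIN = 1.50    # assumed UV-resin index
N_PDMS = 1.41     # typical PDMS index

print("in air :", plano_convex_focal_length(R, N_RESIN), "mm")
print("in PDMS:", plano_convex_focal_length(R, N_RESIN, N_PDMS), "mm")
```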
Design of a variable-focal-length optical system
NASA Technical Reports Server (NTRS)
Ricks, D.; Shannon, R. R.
1984-01-01
Requirements to place an entire optical system with a variable focal length ranging from 20 to 200 cm within an overall length somewhat less than 100 cm placed severe restrictions on the design of a zoom lens suitable for use on a comet explorer. The requirement of a wavelength range of 0.4 to 1.0 microns produced even greater limitations on the possibilities for a design that includes a catadioptric (mirrors and glass) front end followed by a zooming refractive portion. The capabilities of available commercial zoom lenses as well as patents of optical systems are reviewed. Preliminary designs of the refractive optics zoom lens and the catadioptric system are presented and evaluated. Of the two, the latter probably has the best chance of success, so long as the shortest focal lengths are not really needed.
Visualization of Subsurface Defects in Composites using a Focal Plane Array Infrared Camera
NASA Technical Reports Server (NTRS)
Plotnikov, Yuri A.; Winfree, William P.
1999-01-01
A technique for enhanced defect visualization in composites via transient thermography is presented in this paper. The effort targets automated defect map construction for multiple defects located in the observed area. Experimental data were collected on composite panels of different thickness with square inclusions and flat bottom holes of different depth and orientation. The time evolution of the thermal response and spatial thermal profiles are analyzed. The pattern generated by carbon fibers and the vignetting effect of the focal plane array camera make defect visualization difficult. An improvement in defect visibility is obtained with the pulse phase technique and spatial background treatment. The relationship between the size of a defect and its reconstructed image is analyzed as well. The image processing technique for noise reduction is discussed.
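The pulse phase step mentioned above amounts to a per-pixel Fourier transform of the cooling sequence, with the phase of a low frequency bin mapped as an image. A minimal sketch (synthetic data and assumed array shapes; not the authors' implementation) is:

```python
import numpy as np

# Pulse-phase thermography sketch: phase images are less sensitive to uneven
# heating and vignetting than raw temperature frames.

def pulse_phase_image(frames, bin_index=1):
    """frames: (n_frames, height, width) thermal sequence after the flash.
    Returns the phase image (radians) of the chosen frequency bin."""
    spectrum = np.fft.rfft(frames, axis=0)       # FFT along the time axis
    return np.angle(spectrum[bin_index])         # phase map, shape (h, w)

# Example with synthetic data standing in for a real IR sequence.
rng = np.random.default_rng(0)
seq = rng.normal(size=(128, 64, 64)).cumsum(axis=0)   # placeholder cooling curves
phase = pulse_phase_image(seq)
print(phase.shape, float(phase.min()), float(phase.max()))
```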
Long-Wavelength 640 x 486 GaAs/AlGaAs Quantum Well Infrared Photodetector Snap-Shot Camera
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Bandara, Sumith V.; Liu, John K.; Hong, Winn; Sundaram, Mani; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Carralejo, Ronald
1998-01-01
A 9-micrometer cutoff 640 x 486 snap-shot quantum well infrared photodetector (QWIP) camera has been demonstrated. The performance of this QWIP camera is reported including indoor and outdoor imaging. The noise equivalent differential temperature (NEΔT) of 36 mK has been achieved at 300 K background with f/2 optics. This is in good agreement with expected focal plane array sensitivity due to the practical limitations on charge handling capacity of the multiplexer, read noise, bias voltage, and operating temperature.
Manned observations technology development, FY 1992 report
NASA Technical Reports Server (NTRS)
Israel, Steven
1992-01-01
This project evaluated the suitability of the NASA/JSC developed electronic still camera (ESC) digital image data for Earth observations from the Space Shuttle, as a first step to aid planning for Space Station Freedom. Specifically, image resolution achieved from the Space Shuttle using the current ESC system, which is configured with a Loral 15 mm x 15 mm (1024 x 1024 pixel array) CCD chip on the focal plane of a Nikon F4 camera, was compared to that of current handheld 70 mm Hasselblad 500 EL/M film cameras.
Expected progress based on aluminium gallium nitride Focal Plane Array for near and deep Ultraviolet
NASA Astrophysics Data System (ADS)
Reverchon, J.-L.; Robin, K.; Bansropun, S.; Gourdel, Y.; Robo, J.-A.; Truffer, J.-P.; Costard, E.; Brault, J.; Frayssinet, E.; Duboz, J.-Y.
The fast development of nitrides has given the opportunity to investigate AlGaN as a material for ultraviolet detection. A camera based on such a material presents an extremely low dark current at room temperature. It can compete with technologies based on photocathodes, MCP intensifiers, back-thinned CCDs or hybrid CMOS focal plane arrays for low flux measurements. First, we will present results on a focal plane array of 320 × 256 pixels with a pitch of 30 μm. The peak responsivity is tuned from 260 nm to 360 nm in different cameras. All these results are obtained in a standard SWIR supply chain and with AlGaN Schottky diodes grown on sapphire. We will present here the first attempts to transfer the standard-design Schottky photodiodes from sapphire to silicon substrates. We will show the capability to remove the silicon substrate, to etch the window layer in order to extend the bandwidth to shorter wavelengths, and to maintain the AlGaN membrane integrity.
The art and science of photography in hand surgery.
Wang, Keming; Kowalski, Evan J; Chung, Kevin C
2014-03-01
High-quality medical photography plays an important role in teaching and demonstrating the functional capacity of the hands as well as in medicolegal documentation. Obtaining standardized, high-quality photographs is now an essential component of many surgery practices. The importance of standardized photography in facial and cosmetic surgery has been well documented in previous studies, but no studies have thoroughly addressed the details of photography for hand surgery. In this paper, we provide a set of guidelines and basic camera concepts for different scenarios to help hand surgeons obtain appropriate and informative high-quality photographs. A camera used for medical photography should come equipped with a large sensor size and an optical zoom lens with a focal length ranging anywhere from 14 to 75 mm. In a clinic or office setting, we recommend 6 standardized views of the hand and 4 views for the wrist; additional views should be taken for tendon ruptures, nerve injuries, or other deformities of the hand. For intraoperative pictures, the camera operator should understand the procedure and pertinent anatomy in order to properly obtain high-quality photographs. When digital radiographs are not available and radiographic film must be photographed, it is recommended to reduce the exposure and change the color mode to black and white to obtain the best possible pictures. The goal of medical photography is to present the subject in an accurate and precise fashion. Copyright © 2014 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Algolcam: Low Cost Sky Scanning with Modern Technology
NASA Astrophysics Data System (ADS)
Connors, Martin; Bolton, Dempsey; Doktor, Ian
2016-01-01
Low cost DSLR cameras running under computer control offer good sensitivity, high resolution, small size, and the convenience of digital image handling. Recent developments in small single board computers have pushed the ratio of performance to cost and size to unprecedented values, with the further advantage of very low power consumption. Yet a third technological development is motor control electronics which is easily integrated with the computer to make an automated mount, which in our case is custom built, but with similar mounts available commercially. Testing of such a system under a clear plastic dome at our auroral observatory was so successful that we have developed a weatherproof housing allowing use during the long, cold, and clear winter nights at northerly latitudes in Canada. The main advantage of this housing should be improved image quality as compared to operation through clear plastic. We have improved the driving software to include the ability to self-calibrate pointing through the web API of astrometry.net, and data can be reduced automatically through command line use of the Muniwin program. The mount offers slew in declination and RA, and tracking at sidereal or other rates in RA. Our previous tests with a Nikon D5100 with standard lenses in the focal length range 50-200 mm, operating at f/4 to f/5, allowed detection of 12th magnitude stars with a 30 second exposure under very dark skies. At 85 mm focal length, a field of 15° by 10° is imaged with 4928 by 3264 color pixels, and we have adopted an 85 mm fixed focal length f/1.4 lens (as used by Project Panoptes), which we expect will give a limiting magnitude approaching 15. With a large field of view, deep limiting magnitude, low cost, and ease of construction and use, we feel that the Algolcam offers great possibilities in monitoring and finding changes in the sky. We have already applied it to variable star light curves, and with a suitable pipeline for detection of moving or varying objects, it offers great potential for analysis and discovery. The use of low cost cutting edge technology makes Algolcam particularly interesting for enhancing the advanced undergraduate learning experience in astronomy.
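The quoted field of view can be cross-checked from the lens focal length and the sensor dimensions; the sketch below assumes the Nikon D5100's APS-C sensor size of roughly 23.6 × 15.6 mm, which is not stated in the abstract.

```python
import math

# FOV across one sensor dimension for a simple rectilinear lens.
def fov_deg(sensor_mm, focal_mm):
    return 2.0 * math.degrees(math.atan(sensor_mm / (2.0 * focal_mm)))

SENSOR_W, SENSOR_H = 23.6, 15.6   # assumed APS-C sensor, mm

for f in (50, 85, 200):
    print(f"{f} mm: {fov_deg(SENSOR_W, f):.1f} x {fov_deg(SENSOR_H, f):.1f} deg")
# 85 mm gives roughly 15.8 x 10.5 deg, consistent with the stated 15 x 10 deg field.
```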
Optomechanical System Development of the AWARE Gigapixel Scale Camera
NASA Astrophysics Data System (ADS)
Son, Hui S.
Electronic focal plane arrays (FPA) such as CMOS and CCD sensors have dramatically improved to the point that digital cameras have essentially phased out film (except in very niche applications such as hobby photography and cinema). However, the traditional method of mating a single lens assembly to a single detector plane, as required for film cameras, is still the dominant design used in cameras today. The use of electronic sensors and their ability to capture digital signals that can be processed and manipulated post acquisition offers much more freedom of design at system levels and opens up many interesting possibilities for the next generation of computational imaging systems. The AWARE gigapixel scale camera is one such computational imaging system. By utilizing a multiscale optical design, in which a large aperture objective lens is mated with an array of smaller, well corrected relay lenses, we are able to build an optically simple system that is capable of capturing gigapixel scale images via post acquisition stitching of the individual pictures from the array. Properly shaping the array of digital cameras allows us to form an effectively continuous focal surface using off the shelf (OTS) flat sensor technology. This dissertation details developments and physical implementations of the AWARE system architecture. It illustrates the optomechanical design principles and system integration strategies we have developed through the course of the project by summarizing the results of the two design phases for AWARE: AWARE-2 and AWARE-10. These systems represent significant advancements in the pursuit of scalable, commercially viable snapshot gigapixel imaging systems and should serve as a foundation for future development of such systems.
NASA Technical Reports Server (NTRS)
1997-01-01
Passive millimeter wave (PMMW) sensors have the ability to see through fog, clouds, dust and sandstorms and thus have the potential to support all-weather operations, both military and commercial. Many of the applications, such as military transport or commercial aircraft landing, are technologically stressing in that they require imaging of a scene with a large field of view in real time and with high spatial resolution. The development of a low cost PMMW focal plane array camera is essential to obtain real-time video images to fulfill the above needs. The overall objective of this multi-year project (Phase 1) was to develop and demonstrate the capabilities of a W-band PMMW camera with a microwave/millimeter wave monolithic integrated circuit (MMIC) focal plane array (FPA) that can be manufactured at low cost for both military and commercial applications. This overall objective was met in July 1997 when the first video images from the camera were generated of an outdoor scene. In addition, our consortium partner McDonnell Douglas was to develop a real-time passive millimeter wave flight simulator to permit pilot evaluation of a PMMW-equipped aircraft in a landing scenario. A working version of this simulator was completed. This work was carried out under the DARPA-funded PMMW Camera Technology Reinvestment Project (TRP), also known as the PMMW Camera DARPA Joint Dual-Use Project. In this final report for the Phase 1 activities, a year by year description of what the specific objectives were, the approaches taken, and the progress made is presented, followed by a description of the validation and imaging test results obtained in 1997.
The influence of focal spot blooming on high-contrast spatial resolution in CT imaging.
Grimes, Joshua; Duan, Xinhui; Yu, Lifeng; Halaweish, Ahmed F; Haag, Nicole; Leng, Shuai; McCollough, Cynthia
2015-10-01
The objective of this work was to investigate focal spot blooming effects on the spatial resolution of CT images and to evaluate an x-ray tube that uses dynamic focal spot control for minimizing focal spot blooming. The influence of increasing tube current at a fixed tube potential of 80 kV on high-contrast spatial resolution of seven different CT scanner models (scanners A-G), including one scanner that uses dynamic focal spot control to reduce focal spot blooming (scanner A), was evaluated. Spatial resolution was assessed using a wire phantom for the modulation transfer function (MTF) calculation and a copper disc phantom for measuring the slice sensitivity profile (SSP). The impact of varying the tube potential was investigated on two scanner models (scanners A and B) by measuring the MTF and SSP and also by using the resolution bar pattern module of the ACR CT phantom. The phantoms were scanned at 70-150 kV on scanner A and 80-140 kV on scanner B, with tube currents from 100 mA up to the maximum tube current available on each scanner. The images were reconstructed using a slice thickness of 0.6 mm with both smooth and sharp kernels. Additionally, focal spot size at varying tube potentials and currents was directly measured using pinhole and slit camera techniques. Evaluation of the MTF and SSP data from the 7 CT scanner models evaluated demonstrated decreased focal spot blooming for newer scanners, as evidenced by decreasing deviations in MTF and SSP as tube current varied. For scanners A and B, where focal spot blooming effects as a function of tube potential were assessed, the spatial resolution variation in the axial plane was much smaller on scanner A compared to scanner B as tube potential and current changed. On scanner A, the 50% MTF never decreased by more than 2% from the 50% MTF measured at 100 mA. On scanner B, the 50% MTF decreased by as much as 19% from the 50% MTF measured at 100 mA. Assessments of the SSP, the bar patterns in the ACR phantom and the pinhole and slit camera measurements were consistent with the MTF calculations. Focal spot blooming has a noticeable effect on spatial resolution in CT imaging. The focal spot shaping technology of scanner A greatly reduced blooming effects.
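The MTF values reported above derive from the wire phantom's line spread function. A simplified sketch of that reduction (not the vendors' or the authors' algorithm; the LSF here is synthetic) is:

```python
import numpy as np

# MTF from a 1-D line spread function: normalize, Fourier transform, and
# normalize the magnitude to its zero-frequency value.

def mtf_from_lsf(lsf, pixel_spacing_mm):
    """Return (spatial_frequencies_per_mm, normalized_mtf) for a 1-D LSF."""
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf - lsf.min()                      # remove baseline
    lsf /= lsf.sum()                           # normalize area
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_spacing_mm)
    return freqs, mtf / mtf[0]

# Synthetic Gaussian LSF standing in for a measured wire profile.
x = np.arange(-32, 32) * 0.2                   # 0.2 mm sampling
lsf = np.exp(-0.5 * (x / 0.5) ** 2)
freqs, mtf = mtf_from_lsf(lsf, 0.2)
f50 = np.interp(0.5, mtf[::-1], freqs[::-1])   # frequency where MTF drops to 50%
print(f"50% MTF at about {f50:.2f} lp/mm")
```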
Electrically optofluidic zoom system with a large zoom range and high-resolution image.
Li, Lei; Yuan, Rong-Ying; Wang, Jin-Hui; Wang, Qiong-Hua
2017-09-18
We report an electrically controlled optofluidic zoom system which can achieve a large continuous zoom change and a high-resolution image. The zoom system consists of an optofluidic zoom objective and a switchable light path which are controlled by two liquid optical shutters. The proposed zoom system can achieve a large tunable focal length range from 36 mm to 92 mm. In this tuning range, the zoom system can correct aberrations dynamically, so the image resolution is high. Due to the large zoom range, the proposed imaging system incorporates both a camera configuration and a telescope configuration into one system. In addition, the whole system is electrically controlled by three electrowetting liquid lenses and two liquid optical shutters; therefore, the proposed system is very compact and free of mechanical moving parts. The proposed zoom system has the potential to take the place of conventional zoom systems.
Design and test of a high performance off-axis TMA telescope
NASA Astrophysics Data System (ADS)
Fan, Bin; Cai, Wei-jun; Huang, Ying
2017-11-01
A new complete Optical Demonstration Model (ODM) of a high performance off-axis Three Mirror Anastigmatic (TMA) telescope has been successfully developed at BISME. This 1.75-m focal length, 1/9 relative aperture, 6.2° × 1.0° field of view visible telescope, which uses TDICCD detectors with a 7 μm pixel size, can provide a 2.0-m ground sampling distance and a 51-km swath from an altitude of 500 km. With some significant efforts, the main goals of the ODM have been reached: a compact lightweight design while realizing high performance and high stability. The optical system and key technologies have been applied in the multispectral camera of the ZY-3 Satellite (the first high resolution stereo mapping satellite of China), which was successfully launched on January 9th, 2012. The main technologies of the ODM are described, and the test results and applications are outlined.
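The stated 2.0-m ground sampling distance is consistent with the simple pinhole relation GSD = altitude × pixel pitch / focal length; a quick check (a first-order approximation, not the BISME design analysis):

```python
# GSD = altitude * pixel_pitch / focal_length, using the figures quoted above.

ALTITUDE_M = 500e3      # orbit altitude, m
PIXEL_M = 7e-6          # TDICCD pixel pitch, m
FOCAL_M = 1.75          # telescope focal length, m

gsd = ALTITUDE_M * PIXEL_M / FOCAL_M
print(f"GSD = {gsd:.2f} m")   # 2.00 m, matching the stated figure
```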
Saw-tooth refractive lens for high energy x-ray focusing
NASA Astrophysics Data System (ADS)
Antimonov, Mikhail A.; Khounsary, Ali M.
2014-09-01
The saw-tooth refractive lens (SRL) provides a comparatively attractive option for X-ray focusing. An SRL assembly consists of two parts, each with an array of triangular structures (prisms), set tilted symmetrically with respect to the incoming beam. Its main advantage is simple, continuous tunability in energy and focal length. SRLs can be used for both long and short focal length focusing. Long focal distance focusing of an SRL can accurately be predicted using simple analytical relations. However, the focus size at short focal distances may deviate appreciably from the expected demagnified source size when: (1) the length of the SRL is comparable with the focusing distance, (2) the incident beam is not monochromatic, and (3) the distance between adjacent prism tips, the tip step, is large. The first factor was considered in a previous work while the other two are addressed in this paper. This preliminary work is aimed at a better understanding of SRL lenses for focusing an undulator beamline at the Advanced Photon Source (APS).
Liang, Kun; Yang, Cailan; Peng, Li; Zhou, Bo
2017-02-01
In uncooled long-wave IR camera systems, the temperature of a focal plane array (FPA) is variable along with the environmental temperature as well as the operating time. The spatial nonuniformity of the FPA, which is partly affected by the FPA temperature, obviously changes as well, resulting in reduced image quality. This study presents a real-time nonuniformity correction algorithm based on FPA temperature to compensate for nonuniformity caused by FPA temperature fluctuation. First, gain coefficients are calculated using a two-point correction technique. Then offset parameters at different FPA temperatures are obtained and stored in tables. When the camera operates, the offset tables are called to update the current offset parameters via a temperature-dependent interpolation. Finally, the gain coefficients and offset parameters are used to correct the output of the IR camera in real time. The proposed algorithm is evaluated and compared with two representative shutterless algorithms [minimizing the sum of the squares of errors algorithm (MSSE), template-based solution algorithm (TBS)] using IR images captured by a 384×288 pixel uncooled IR camera with a 17 μm pitch. Experimental results show that this method can quickly trace the response drift of the detector units when the FPA temperature changes. The quality of the proposed algorithm is as good as MSSE, while the processing time is as short as TBS, which means the proposed algorithm is good for real-time control and at the same time has a high correction effect.
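A minimal sketch of the correction flow described above is given below, with invented variable names and synthetic arrays: fixed per-pixel gains from the two-point calibration, plus offset tables indexed by FPA temperature that are linearly interpolated at run time.

```python
import numpy as np

def correct_frame(raw, gain, offset_tables, table_temps, fpa_temp):
    """raw, gain: (h, w) arrays; offset_tables: (n_temps, h, w);
    table_temps: sorted 1-D array of the FPA temperatures of the stored tables.
    Returns the nonuniformity-corrected frame gain * raw + offset(T)."""
    # Find the bracketing offset tables and interpolate between them.
    idx = np.clip(np.searchsorted(table_temps, fpa_temp), 1, len(table_temps) - 1)
    t0, t1 = table_temps[idx - 1], table_temps[idx]
    w = 0.0 if t1 == t0 else (fpa_temp - t0) / (t1 - t0)
    offset = (1.0 - w) * offset_tables[idx - 1] + w * offset_tables[idx]
    return gain * raw + offset

# Example with synthetic data for a 384 x 288 detector.
h, w = 288, 384
gain = np.ones((h, w))                 # from two-point calibration (placeholder)
tables = np.zeros((3, h, w))           # stored offset tables (placeholder)
temps = np.array([10.0, 25.0, 40.0])   # FPA temperatures of the tables, degC
frame = np.zeros((h, w))
print(correct_frame(frame, gain, tables, temps, 31.5).shape)
```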
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Xiaoxiao; Tian, Jingxuan; Wen, Weijia, E-mail: phwen@ust.hk
2016-04-18
We report a metasurface for focusing reflected ultrasonic waves over a wide frequency band of 0.45–0.55 MHz. The broadband focusing effect of the reflective metasurface is studied numerically and then confirmed experimentally using near-field scanning techniques. The focusing mechanism can be attributed to the hyperboloidal reflection phase profile imposed by different depths of concentric grooves on the metasurface. In particular, the focal lengths of the reflective metasurface are extracted from simulations and experiments, and both exhibit good linear dependence on frequency over the considered frequency band. The proposed broadband reflective metasurface with tunable focal length has potential applications in the broad field of ultrasonics, such as ultrasonic tomographic imaging, high intensity focused ultrasound treatment, etc.
Optofluidic lens with tunable focal length and asphericity
Mishra, Kartikeya; Murade, Chandrashekhar; Carreel, Bruno; Roghair, Ivo; Oh, Jung Min; Manukyan, Gor; van den Ende, Dirk; Mugele, Frieder
2014-01-01
Adaptive micro-lenses enable the design of very compact optical systems with tunable imaging properties. Conventional adaptive micro-lenses suffer from substantial spherical aberration that compromises the optical performance of the system. Here, we introduce a novel concept of liquid micro-lenses with superior imaging performance that allows for simultaneous and independent tuning of both focal length and asphericity. This is achieved by varying both hydrostatic pressures and electric fields to control the shape of the refracting interface between an electrically conductive lens fluid and a non-conductive ambient fluid. Continuous variation from spherical interfaces at zero electric field to hyperbolic ones with variable ellipticity for finite fields gives access to lenses with positive, zero, and negative spherical aberration (while the focal length can be tuned via the hydrostatic pressure). PMID:25224851
NASA Astrophysics Data System (ADS)
Zhao, Cunhua; Liang, Huiqin; Cui, Dongqing; Hong, Xinhua; Wei, Daling; Gao, Changliu
2011-08-01
In ultralight and ultrathin zoom-lens applications, traditional glass or plastic lenses are limited by manufacturing technology and cost; a liquid lens was therefore put forward to solve these problems. The liquid zoom lens has the merits of lower cost, smaller volume, quicker response, lower energy consumption, continuous zoom, and higher accuracy. In a liquid zoom lens the precise focal length is obtained through the change of the contact angle, which alters the curvature radius of the liquid interface. In our work, the relations between the applied voltage, the contact angle, the curvature radius, and the focal length were investigated and accurately calculated. The calculation of the focal length provides an important theoretical basis for guiding the design of liquid zoom lenses.
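The voltage-to-focal-length chain described above can be sketched as follows; every material parameter in the snippet is an assumed, illustrative value rather than a figure from the paper (Young-Lippmann equation, spherical-cap geometry, paraxial single-surface power).

```python
import math

EPS0 = 8.854e-12               # vacuum permittivity, F/m
EPS_R = 3.0                    # assumed dielectric constant of the insulator
D_INS = 1e-6                   # assumed insulator thickness, m
GAMMA = 0.04                   # assumed interfacial tension, N/m
THETA0 = math.radians(140.0)   # assumed zero-voltage contact angle
APERTURE = 1.5e-3              # assumed contact-line radius, m
DELTA_N = 0.15                 # assumed index difference across the interface

def contact_angle(voltage):
    """Young-Lippmann equation; cos(theta) clamped to stay physical."""
    c = math.cos(THETA0) + EPS0 * EPS_R * voltage**2 / (2.0 * GAMMA * D_INS)
    return math.acos(max(-1.0, min(1.0, c)))

def focal_length(voltage):
    theta = contact_angle(voltage)
    radius = APERTURE / math.sin(theta)   # spherical-cap curvature radius
    return radius / DELTA_N               # paraxial single-surface power dn/R

for v in (0.0, 20.0, 40.0):
    print(f"{v:4.0f} V -> theta = {math.degrees(contact_angle(v)):5.1f} deg, "
          f"f = {focal_length(v)*1e3:6.1f} mm")
```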
Bifocal liquid lens zoom objective for mobile phone applications
NASA Astrophysics Data System (ADS)
Wippermann, F. C.; Schreiber, P.; Bräuer, A.; Craen, P.
2007-02-01
Miniaturized camera systems are an integral part of today's mobile phones which recently possess auto focus functionality. Commercially available solutions without moving parts have been developed using the electrowetting technology. Here, the contact angle of a drop of a conductive or polar liquid placed on an insulating substrate can be influenced by an electric field. Besides the compensation of the axial image shift due to different object distances, mobile phones with zoom functionality are desired as a next evolutionary step. In classical mechanically compensated zoom lenses two independently driven actuators combined with precision guides are needed, leading to a delicate, space consuming and expensive opto-mechanical setup. Liquid lens technology based on the electrowetting effect gives the opportunity to build adaptive lenses without moving parts, thus simplifying the mechanical setup. However, with the recent commercially available liquid lens products a completely motionless and continuously adaptive zoom system with market relevant optical performance is not feasible. This is due to the limited change in optical power the liquid lenses can provide and the dispersion of the used materials. As an intermediate step towards a continuously adjustable and motionless zoom lens we propose a bifocal system sufficient for toggling between two effective focal lengths without any moving parts. The system has its mechanical counterpart in a bifocal zoom lens where only one lens group has to be moved. In a liquid lens bifocal zoom two groups of adaptable liquid lenses are required for adjusting the effective focal length and keeping the image location constant. In order to overcome the difficulties in achromatizing the lens we propose a sequential image acquisition algorithm. Here, the full color image is obtained from a sequence of monochrome images (red, green, blue) leading to a simplified optical setup.
LC-lens array with light field algorithm for 3D biomedical applications
NASA Astrophysics Data System (ADS)
Huang, Yi-Pai; Hsieh, Po-Yuan; Hassanfiroozi, Amir; Martinez, Manuel; Javidi, Bahram; Chu, Chao-Yu; Hsuan, Yun; Chu, Wen-Chun
2016-03-01
In this paper, a liquid crystal lens (LC-lens) array is utilized in 3D bio-medical applications including a 3D endoscope and a light field microscope. Compared with a conventional plastic lens array, which is usually placed in a 3D endoscope or light field microscope system to record image disparity, our LC-lens array has the added flexibility of electrically changing its focal length. By using the LC-lens array, the working distance and image quality of the 3D endoscope and microscope can be enhanced. Furthermore, 2D/3D switching can be achieved by turning the electrical power to the LC-lens array off or on. In the 3D endoscope case, a hexagonal micro LC-lens array with a 350 μm diameter is placed at the front end of a 1 mm diameter endoscope. With an electric field applied to the LC-lens array, the 3D specimen is recorded as if from seven micro-cameras with different disparities, and the 3D reconstruction of the specimen can be calculated from those micro images. On the other hand, if the electric field on the LC-lens array is turned off, a conventional high resolution 2D endoscope image is recorded. In the light field microscope case, the LC-lens array is placed in front of the CMOS sensor. The main purpose of the LC-lens array is to extend the refocusing distance of the light field microscope, which is usually very narrow in a focused light field microscope system, by montaging many light field images sequentially focused at different depths. By adjusting the focal length of the LC-lens array from 2.4 mm to 2.9 mm, the refocusing distance is extended from 1 mm to 11.3 mm. Moreover, an LC wedge can be used to electrically shift the optical axis and increase the resolution of the light field.
A DirtI Application for LBT Commissioning Campaigns
NASA Astrophysics Data System (ADS)
Borelli, J. L.
2009-09-01
In order to characterize the Gregorian focal stations and test the performance achieved by the Large Binocular Telescope (LBT) adaptive optics system, two infrared test cameras were constructed within a joint project between INAF (Osservatorio Astronomico di Bologna, Italy) and the Max Planck Institute for Astronomy (Germany). We describe here the functionality of, and the successful results obtained with, the Daemon for the Infrared Test Camera Interface (DirtI) during the commissioning campaigns.
Optimal speckle noise reduction filter for range gated laser illuminated imaging
NASA Astrophysics Data System (ADS)
Dayton, David; Gonglewski, John; Lasche, James; Hassall, Arthur
2016-09-01
Laser illuminated imaging has a number of applications in the areas of night time air-to-ground target surveillance, ID, and pointing and tracking. Using a laser illuminator, the illumination intensity and thus the signal to noise ratio can be controlled. With the advent of high performance range gated cameras in the short-wave infra-red band, higher spatial resolution can be achieved over passive thermal night imaging cameras in the mid-wave infra-red due to the shorter wave-length. If a coherent illuminator is used the resulting imagery often suffers from speckle noise due to the scattering off of a rough target surface, which gives it a grainy "salt and pepper" appearance. The probability density function for the intensity of focal plane speckle is well understood to follow a negative exponential distribution. This can be exploited to develop a Bayesian speckle noise filter. The filter has the advantage over simple frame averaging approaches in that it preserves target features and motion while reducing speckle noise without smearing or blurring the images. The resulting filtered images have the appearance of passive imagery and so are more amenable to sensor fusion with simultaneous mid-wave infra-red thermal images for enhanced target ID. The noise filter improvement is demonstrated using examples from real world laser imaging tests on tactical targets.
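The exponential intensity statistics of focal plane speckle can be folded into a recursive per-pixel estimator. The toy sketch below is not the authors' filter: it uses the inverse-gamma conjugate prior for an exponential likelihood, with an exponential forgetting factor as a crude stand-in for the motion handling described above.

```python
import numpy as np

class SpecklePixelFilter:
    """Per-pixel MAP estimate of the underlying intensity, assuming fully
    developed speckle (negative exponential likelihood)."""
    def __init__(self, shape, alpha0=2.0, beta0=1.0, forget=0.8):
        self.alpha = np.full(shape, alpha0)   # inverse-gamma shape
        self.beta = np.full(shape, beta0)     # inverse-gamma scale
        self.forget = forget                  # discounts old frames (motion)

    def update(self, frame):
        """frame: speckled intensity image. Returns the MAP intensity estimate."""
        self.alpha = self.forget * self.alpha + 1.0   # one exponential sample
        self.beta = self.forget * self.beta + frame
        return self.beta / (self.alpha + 1.0)         # MAP of the inverse-gamma

# Simulate a constant scene multiplied by fully developed speckle.
rng = np.random.default_rng(1)
truth = np.ones((64, 64)) * 100.0
filt = SpecklePixelFilter(truth.shape)
for _ in range(20):
    speckled = truth * rng.exponential(1.0, truth.shape)
    est = filt.update(speckled)
print("residual std:", float(np.std(est - truth)))
```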
High-speed fiber laser cutting of thick stainless steel for dismantling tasks
NASA Astrophysics Data System (ADS)
Shin, Jae Sung; Oh, Seong Yong; Park, Hyunmin; Chung, Chin-Man; Seon, Sangwoo; Kim, Taek-Soo; Lee, Lim; Choi, Byung-Seon; Moon, Jei-Kwon
2017-09-01
A high-speed fiber laser cutting technology for thick steels for dismantling tasks was achieved using a 6-kW fiber laser system. First, a new cutting head for efficient cutting of thick steels was developed, composed of a collimator with a focal length of 160 mm and a mirror-type focusing objective with a long focal length of 600 mm. The long focal length of the focusing objective kept the beam size small through the thick cutting material, and the cutting efficiency was expected to increase compared with a short focal length. In addition, folding the beam allowed a compact cutting head with a size of 160 mm (width) × 80 mm (height) × 640 mm (length) and a weight of 6.9 kg. In the cutting experiment, the laser beam was delivered to the cutting head by a 25-m long process fiber with a core diameter of 100 μm. The cutting performance was studied against the thickness of the stainless steel plates. A maximum cutting speed of 72 mm/min was obtained for cutting a 60-mm thick stainless steel plate, and the cut specimen showed an excellent kerf shape and a narrow kerf width. To the best of our knowledge, this cutting speed is higher than other previously reported results for cutting with 6-kW laser power.
Focusing hard X-rays with old LPs
NASA Astrophysics Data System (ADS)
Cederström, Björn; Cahn, Robert N.; Danielsson, Mats; Lundqvist, Mats; Nygren, David R.
2000-04-01
We have found that two sections cut from a vinyl long-playing record can form a spherical aberration-free refractive lens for hard X-rays. Our manufactured saw-tooth refractive lens has a focal length of 22 cm for 23-keV X-rays. The low cost and short focal length of this lens make it feasible for use in small-scale experiments with conventional X-ray tubes.
Homogeneity of small-scale earthquake faulting, stress, and fault strength
Hardebeck, J.L.
2006-01-01
Small-scale faulting at seismogenic depths in the crust appears to be more homogeneous than previously thought. I study three new high-quality focal-mechanism datasets of small (M < ~3) earthquakes in southern California, the east San Francisco Bay, and the aftershock sequence of the 1989 Loma Prieta earthquake. I quantify the degree of mechanism variability on a range of length scales by comparing the hypocentral distance between every pair of events and the angular difference between their focal mechanisms. Closely spaced earthquakes (interhypocentral distance ≤2 km) tend to have very similar focal mechanisms, often identical to within the 1-sigma uncertainty of ~25°. This observed similarity implies that in small volumes of crust, while faults of many orientations may or may not be present, only similarly oriented fault planes produce earthquakes contemporaneously. On these short length scales, the crustal stress orientation and fault strength (coefficient of friction) are inferred to be homogeneous as well, to produce such similar earthquakes. Over larger length scales (~2-50 km), focal mechanisms become more diverse with increasing interhypocentral distance (differing on average by 40-70°). Mechanism variability on ~2- to 50-km length scales can be explained by relatively small variations (~30%) in stress or fault strength. It is possible that most of this small apparent heterogeneity in stress or strength comes from measurement error in the focal mechanisms, as negligible variation in stress or fault strength (<10%) is needed if each earthquake is assigned the optimally oriented focal mechanism within the 1-sigma confidence region. This local homogeneity in stress orientation and fault strength is encouraging, implying it may be possible to measure these parameters with enough precision to be useful in studying and modeling large earthquakes.
Loosli, Y; Vianay, B; Luginbuehl, R; Snedeker, J G
2012-05-01
We present a novel approach to modeling cell spreading, and use it to reveal a potentially central mechanism regulating focal adhesion maturation in various cell phenotypes. Actin bundles that span neighboring focal complexes at the lamellipodium-lamellum interface were assumed to be loaded by intracellular forces in proportion to bundle length. We hypothesized that the length of an actin bundle (with the corresponding accumulated force at its adhesions) may thus regulate adhesion maturation to ensure cell mechanical stability and morphological integrity. We developed a model to test this hypothesis, implementing a "top-down" approach to simplify certain cellular processes while explicitly incorporating complexity of other key subcellular mechanisms. Filopodial and lamellipodial activities were treated as modular processes with functional spatiotemporal interactions coordinated by rules regarding focal adhesion turnover and actin bundle dynamics. This theoretical framework was able to robustly predict temporal evolution of cell area and cytoskeletal organization as reported from a wide range of cell spreading experiments using micropatterned substrates. We conclude that a geometric/temporal modeling framework can capture the key functional aspects of the rapid spreading phase and resultant cytoskeletal complexity. Hence the model is used to reveal mechanistic insight into basic cell behavior essential for spreading. It demonstrates that actin bundles spanning nascent focal adhesions such that they are aligned to the leading edge may accumulate centripetal endogenous forces along their length, and could thus trigger focal adhesion maturation in a force-length dependent fashion. We suggest that this mechanism could be a central "integrating" factor that effectively coordinates force-mediated adhesion maturation at the lamellipodium-lamellum interface.
Defining habitat covariates in camera-trap based occupancy studies
Niedballa, Jürgen; Sollmann, Rahel; Mohamed, Azlan bin; Bender, Johannes; Wilting, Andreas
2015-01-01
In species-habitat association studies, both the type and spatial scale of habitat covariates need to match the ecology of the focal species. We assessed the potential of high-resolution satellite imagery for generating habitat covariates using camera-trapping data from Sabah, Malaysian Borneo, within an occupancy framework. We tested the predictive power of covariates generated from satellite imagery at different resolutions and extents (focal patch sizes, 10–500 m around sample points) on estimates of occupancy patterns of six small to medium sized mammal species/species groups. High-resolution land cover information had considerably more model support for small, patchily distributed habitat features, whereas it had no advantage for large, homogeneous habitat features. A comparison of different focal patch sizes including remote sensing data and an in-situ measure showed that patches with a 50-m radius had most support for the target species. Thus, high-resolution satellite imagery proved to be particularly useful in heterogeneous landscapes, and can be used as a surrogate for certain in-situ measures, reducing field effort in logistically challenging environments. Additionally, remote sensed data provide more flexibility in defining appropriate spatial scales, which we show to impact estimates of wildlife-habitat associations. PMID:26596779
NASA Astrophysics Data System (ADS)
Takahashi, Tadayuki; Mitsuda, Kazuhisa; Kelley, Richard; Aarts, Henri; Aharonian, Felix; Akamatsu, Hiroki; Akimoto, Fumie; Allen, Steve; Anabuki, Naohisa; Angelini, Lorella; Arnaud, Keith; Asai, Makoto; Audard, Marc; Awaki, Hisamitsu; Azzarello, Philipp; Baluta, Chris; Bamba, Aya; Bando, Nobutaka; Bautz, Mark; Blandford, Roger; Boyce, Kevin; Brown, Greg; Cackett, Ed; Chernyakova, Mara; Coppi, Paolo; Costantini, Elisa; de Plaa, Jelle; den Herder, Jan-Willem; DiPirro, Michael; Done, Chris; Dotani, Tadayasu; Doty, John; Ebisawa, Ken; Eckart, Megan; Enoto, Teruaki; Ezoe, Yuichiro; Fabian, Andrew; Ferrigno, Carlo; Foster, Adam; Fujimoto, Ryuichi; Fukazawa, Yasushi; Funk, Stefan; Furuzawa, Akihiro; Galeazzi, Massimiliano; Gallo, Luigi; Gandhi, Poshak; Gendreau, Keith; Gilmore, Kirk; Haas, Daniel; Haba, Yoshito; Hamaguchi, Kenji; Hatsukade, Isamu; Hayashi, Takayuki; Hayashida, Kiyoshi; Hiraga, Junko; Hirose, Kazuyuki; Hornschemeier, Ann; Hoshino, Akio; Hughes, John; Hwang, Una; Iizuka, Ryo; Inoue, Yoshiyuki; Ishibashi, Kazunori; Ishida, Manabu; Ishimura, Kosei; Ishisaki, Yoshitaka; Ito, Masayuki; Iwata, Naoko; Iyomoto, Naoko; Kaastra, Jelle; Kallman, Timothy; Kamae, Tuneyoshi; Kataoka, Jun; Katsuda, Satoru; Kawahara, Hajime; Kawaharada, Madoka; Kawai, Nobuyuki; Kawasaki, Shigeo; Khangaluyan, Dmitry; Kilbourne, Caroline; Kimura, Masashi; Kinugasa, Kenzo; Kitamoto, Shunji; Kitayama, Tetsu; Kohmura, Takayoshi; Kokubun, Motohide; Kosaka, Tatsuro; Koujelev, Alex; Koyama, Katsuji; Krimm, Hans; Kubota, Aya; Kunieda, Hideyo; LaMassa, Stephanie; Laurent, Philippe; Lebrun, Francois; Leutenegger, Maurice; Limousin, Olivier; Loewenstein, Michael; Long, Knox; Lumb, David; Madejski, Grzegorz; Maeda, Yoshitomo; Makishima, Kazuo; Marchand, Genevieve; Markevitch, Maxim; Matsumoto, Hironori; Matsushita, Kyoko; McCammon, Dan; McNamara, Brian; Miller, Jon; Miller, Eric; Mineshige, Shin; Minesugi, Kenji; Mitsuishi, Ikuyuki; Miyazawa, Takuya; Mizuno, Tsunefumi; Mori, Hideyuki; Mori, Koji; Mukai, Koji; Murakami, Toshio; Murakami, Hiroshi; Mushotzky, Richard; Nagano, Hosei; Nagino, Ryo; Nakagawa, Takao; Nakajima, Hiroshi; Nakamori, Takeshi; Nakazawa, Kazuhiro; Namba, Yoshiharu; Natsukari, Chikara; Nishioka, Yusuke; Nobukawa, Masayoshi; Nomachi, Masaharu; O'Dell, Steve; Odaka, Hirokazu; Ogawa, Hiroyuki; Ogawa, Mina; Ogi, Keiji; Ohashi, Takaya; Ohno, Masanori; Ohta, Masayuki; Okajima, Takashi; Okamoto, Atsushi; Okazaki, Tsuyoshi; Ota, Naomi; Ozaki, Masanobu; Paerels, Fritzs; Paltani, Stéphane; Parmar, Arvind; Petre, Robert; Pohl, Martin; Porter, F. 
Scott; Ramsey, Brian; Reis, Rubens; Reynolds, Christopher; Russell, Helen; Safi-Harb, Samar; Sakai, Shin-ichiro; Sameshima, Hiroaki; Sanders, Jeremy; Sato, Goro; Sato, Rie; Sato, Yohichi; Sato, Kosuke; Sawada, Makoto; Serlemitsos, Peter; Seta, Hiromi; Shibano, Yasuko; Shida, Maki; Shimada, Takanobu; Shinozaki, Keisuke; Shirron, Peter; Simionescu, Aurora; Simmons, Cynthia; Smith, Randall; Sneiderman, Gary; Soong, Yang; Stawarz, Lukasz; Sugawara, Yasuharu; Sugita, Hiroyuki; Sugita, Satoshi; Szymkowiak, Andrew; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takeda, Shin-ichiro; Takei, Yoh; Tamagawa, Toru; Tamura, Takayuki; Tamura, Keisuke; Tanaka, Takaaki; Tanaka, Yasuo; Tashiro, Makoto; Tawara, Yuzuru; Terada, Yukikatsu; Terashima, Yuichi; Tombesi, Francesco; Tomida, Hiroshi; Tsuboi, Yohko; Tsujimoto, Masahiro; Tsunemi, Hiroshi; Tsuru, Takeshi; Uchida, Hiroyuki; Uchiyama, Yasunobu; Uchiyama, Hideki; Ueda, Yoshihiro; Ueno, Shiro; Uno, Shinichiro; Urry, Meg; Ursino, Eugenio; de Vries, Cor; Wada, Atsushi; Watanabe, Shin; Werner, Norbert; White, Nicholas; Yamada, Takahiro; Yamada, Shinya; Yamaguchi, Hiroya; Yamasaki, Noriko; Yamauchi, Shigeo; Yamauchi, Makoto; Yatsu, Yoichi; Yonetoku, Daisuke; Yoshida, Atsumasa; Yuasa, Takayuki
2012-09-01
The joint JAXA/NASA ASTRO-H mission is the sixth in a series of highly successful X-ray missions initiated by the Institute of Space and Astronautical Science (ISAS). ASTRO-H will investigate the physics of the high-energy universe via a suite of four instruments, covering a very wide energy range, from 0.3 keV to 600 keV. These instruments include a high-resolution, high-throughput spectrometer sensitive over 0.3-12 keV with high spectral resolution of ΔE ≦ 7 eV, enabled by a micro-calorimeter array located in the focal plane of thin-foil X-ray optics; hard X-ray imaging spectrometers covering 5-80 keV, located in the focal plane of multilayer-coated, focusing hard X-ray mirrors; a wide-field imaging spectrometer sensitive over 0.4-12 keV, with an X-ray CCD camera in the focal plane of a soft X-ray telescope; and a non-focusing Compton-camera type soft gamma-ray detector, sensitive in the 40-600 keV band. The simultaneous broad bandpass, coupled with high spectral resolution, will enable the pursuit of a wide variety of important science themes.
640 x 480 MWIR and LWIR camera system developments
NASA Astrophysics Data System (ADS)
Tower, John R.; Villani, Thomas S.; Esposito, Benjamin J.; Gilmartin, Harvey R.; Levine, Peter A.; Coyle, Peter J.; Davis, Timothy J.; Shallcross, Frank V.; Sauer, Donald J.; Meyerhofer, Dietrich
1993-01-01
The performance of a 640 x 480 PtSi, 3-5 μm (MWIR), Stirling cooled camera system with a minimum resolvable temperature of 0.03 is considered. A preliminary specification of a full-TV resolution PtSi radiometer was developed using the measured performance characteristics of the Stirling cooled camera. The radiometer is capable of imaging rapid thermal transients from 25 to 250 C with better than 1 percent temperature resolution. This performance is achieved using the electronic exposure control capability of the MOS focal plane array (FPA). A liquid nitrogen cooled camera with an eight-position filter wheel has been developed using the 640 x 480 PtSi FPA. Low thermal mass packaging for the FPA was developed for Joule-Thomson applications.
NASA Astrophysics Data System (ADS)
Lyuty, V. M.; Abdullayev, B. I.; Alekberov, I. A.; Gulmaliyev, N. I.; Mikayilov, Kh. M.; Rustamov, B. N.
2009-12-01
A short description of the optical and electrical scheme of the CCD photometer with the U-47 camera installed at the Cassegrain focus of the ZEISS-600 telescope of the ShAO NAS Azerbaijan is provided. A focal reducer with a reduction factor of 1.7 is applied. The equivalent focal length of the telescope with the focal reducer is calculated. General calculations of the optimum distance from the focal plane and the sizes of the optical filters of the photometer are presented.
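As an illustration of the focal-reducer arithmetic, the sketch below assumes a nominal Cassegrain focal ratio of f/12.5 for the ZEISS-600 (an assumption, not a figure from the paper) and computes the equivalent focal length and image scale with and without the reducer.

```python
APERTURE_MM = 600.0
F_RATIO = 12.5              # assumed nominal Cassegrain focal ratio
REDUCTION = 1.7             # stated reduction factor

f_nominal = APERTURE_MM * F_RATIO
f_reduced = f_nominal / REDUCTION

for label, f in (("without reducer", f_nominal), ("with reducer", f_reduced)):
    scale = 206265.0 / f    # arcsec per mm in the focal plane
    print(f"{label}: f = {f:.0f} mm, scale = {scale:.1f} arcsec/mm")
```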
The LSST Camera 500 watt -130 degC Mixed Refrigerant Cooling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowden, Gordon B.; Langton, Brian J.; /SLAC
2014-05-28
The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 Giga pixels) and its close coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies are considered for this telescope/camera environment. MMR-Technology’s Mixed Refrigerant technology was chosen. A collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported. Current and future stages of the development program are described.
The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation
NASA Astrophysics Data System (ADS)
Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team
2002-12-01
The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024x2048 Mitel frame transfer CCD detector arrays, each having a 1024x1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel or a rectangular FOV of 16° × 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with a graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from -55°C to +5°C. An onboard calibration target and fiducial marks provide the ability to validate the radiometric and geometric calibration on Mars. Pancam relies heavily on use of the JPL ICER wavelet compression algorithm to maximize data return within stringent mission downlink limits. The scientific goals of the Pancam investigation are to: (a) obtain monoscopic and stereoscopic image mosaics to assess the morphology, topography, and geologic context of each MER landing site; (b) obtain multispectral visible to short-wave near-IR images of selected regions to determine surface color and mineralogic properties; (c) obtain multispectral images over a range of viewing geometries to constrain surface photometric and physical properties; and (d) obtain images of the Martian sky, including direct images of the Sun, to determine dust and aerosol opacity and physical properties. In addition, Pancam also serves a variety of operational functions on the MER mission, including (e) serving as the primary Sun-finding camera for rover navigation; (f) resolving objects on the scale of the rover wheels to distances of ~100 m to help guide navigation decisions; (g) providing stereo coverage adequate for the generation of digital terrain models to help guide and refine rover traverse decisions; (h) providing high resolution images and other context information to guide the selection of the most interesting in situ sampling targets; and (i) supporting acquisition and release of exciting E/PO products.
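The quoted angular scales can be cross-checked from the focal length, assuming a 12-micron CCD pixel pitch (the pitch is not stated above): IFOV = pixel / focal length and FOV ≈ active pixels × IFOV.

```python
import math

PIXEL_M = 12e-6      # assumed CCD pixel pitch
FOCAL_M = 0.042      # stated effective focal length
ACTIVE_PIX = 1024    # stated active imaging area per axis

ifov_mrad = PIXEL_M / FOCAL_M * 1e3
# Small-angle approximation for the full field of view.
fov_deg = math.degrees(ACTIVE_PIX * PIXEL_M / FOCAL_M)
print(f"IFOV approx {ifov_mrad:.2f} mrad/pixel, FOV approx {fov_deg:.1f} deg")
# About 0.29 mrad/pixel and 16.8 deg, consistent with the stated 0.28 mrad and ~16 deg.
```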
A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera
NASA Astrophysics Data System (ADS)
Kroedel, Matthias; Langton, J. Brian; Wahl, Bill
2017-09-01
This paper will present the ceramic design, fabrication and metrology results and assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of the HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" fabrication supporting individual raft plates, which mount the sensor assemblies by way of a rigid kinematic support system, in order to meet extremely stringent requirements for focal plane planarity and stability.
NASA Astrophysics Data System (ADS)
Chatterjee, Abhijit; Verma, Anurag
2016-05-01
The Advanced Wide Field Sensor (AWiFS) camera caters to the high temporal resolution requirement of the Resourcesat-2A mission, with a revisit period of 5 days. The AWiFS camera consists of four spectral bands, three in the visible and near IR and one in the short wave infrared. The imaging concept in the VNIR bands is based on push-broom scanning using a linear-array silicon charge-coupled device (CCD) based Focal Plane Array (FPA). An On-Board Calibration unit for these CCD-based FPAs is used to monitor any degradation of the FPA over the entire mission life. Four LEDs are operated in constant-current mode, and 16 different light intensity levels are generated by electronically changing the CCD exposure throughout the calibration cycle. This paper describes the experimental setup and characterization results of various flight-model visible LEDs (λP=650nm) for the development of the On-Board Calibration unit of the AWiFS camera of RESOURCESAT-2A. Various LED configurations have been studied to cover the dynamic range of the 6000-pixel silicon CCD based focal plane array from 20% to 60% of saturation during the night pass of the satellite, in order to identify degradation of detector elements. The paper also compares simulation and experimental results for the CCD output profile at different LED combinations in constant-current mode.
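A minimal sketch of how 16 exposure-controlled calibration levels spanning 20% to 60% of saturation could be generated at constant LED current; the full-well value and signal rate below are illustrative assumptions, not AWiFS flight parameters.

```python
import numpy as np

# Illustrative only: sweep the mean calibration signal from 20% to 60% of full
# well by varying CCD exposure time while the LED drive current stays fixed.
full_well_dn = 1023          # assumed 10-bit saturation level
rate_dn_per_ms = 40.0        # assumed signal rate under constant LED drive

targets_dn = np.linspace(0.20, 0.60, 16) * full_well_dn   # 16 target levels
exposures_ms = targets_dn / rate_dn_per_ms                # exposure per level

for i, (t, e) in enumerate(zip(targets_dn, exposures_ms), start=1):
    print(f"level {i:2d}: exposure {e:5.2f} ms -> ~{t:6.1f} DN")
```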
Recent developments for the Large Binocular Telescope Guiding Control Subsystem
NASA Astrophysics Data System (ADS)
Golota, T.; De La Peña, M. D.; Biddick, C.; Lesser, M.; Leibold, T.; Miller, D.; Meeks, R.; Hahn, T.; Storm, J.; Sargent, T.; Summers, D.; Hill, J.; Kraus, J.; Hooper, S.; Fisher, D.
2014-07-01
The Large Binocular Telescope (LBT) has eight Acquisition, Guiding, and wavefront Sensing Units (AGw units). They provide guiding and wavefront sensing capability at eight different locations at both direct and bent Gregorian focal stations. The recent addition of focal stations for the PEPSI and MODS instruments doubled the number of focal stations in use, along with the corresponding motion and camera-controller server computers and the software infrastructure communicating with the Guiding Control Subsystem (GCS). This paper describes the improvements made to the LBT GCS and explains how these changes have led to better maintainability and contributed to increased reliability. It also discusses the current GCS status and reviews potential upgrades to further improve its performance.
The design and assembly of aluminum mirrors of a three-mirror-anastigmat telescope
NASA Astrophysics Data System (ADS)
Chang, Shenq-Tsong; Lin, Yu-Chuan; Wu, Kun-Huan; Lien, Chun-Chieh; Huang, Ting-Ming; Tsay, Ho-Lin; Chan, Chia-Yen
2017-09-01
Better ground sampling distance (GSD) has been a trend for Earth-observation satellites, which from a system point of view requires a long-focal-length telescope. On the other hand, such a long-focal-length telescope faces size constraints, especially in space projects. The three-mirror anastigmat (TMA) has proven to offer excellent aberration correction, a wide spectral range, and a shorter physical envelope [1-3].
The design of red-blue 3D video fusion system based on DM642
NASA Astrophysics Data System (ADS)
Fu, Rongguo; Luo, Hao; Lv, Jin; Feng, Shu; Wei, Yifang; Zhang, Hao
2016-10-01
To address the uncertainty in traditional 3D video capture, including the camera focal lengths and the distance and angle parameters between the two cameras, a red-blue 3D video fusion system based on the DM642 hardware processing platform is designed with parallel optical axes. To counter the brightness reduction of traditional 3D video, a brightness enhancement algorithm based on human visual characteristics is proposed, together with a luminance-component processing method based on the YCbCr color space. The BIOS real-time operating system is used to improve real-time performance. The video processing circuit, built around the DM642, enhances the brightness of the images, converts the video signals from YCbCr to RGB, extracts the R component from one camera and the G and B components from the other synchronously, and finally outputs the fused 3D images. Real-time adjustments such as translation and scaling of the two color components are realized through serial communication between the VC software and the BIOS. By adding the red and blue components, the system reduces the loss of the chrominance components and keeps the picture color saturation at more than 95% of the original. The optimized enhancement algorithm reduces the amount of data processed during video fusion, shortening the fusion time and improving the viewing experience. Experimental results show that the system can capture images at near distance, output red-blue 3D video, and provide a pleasant experience for audiences wearing red-blue glasses.
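The sketch below illustrates the colour-space step described above: converting each camera's YCbCr output to RGB (BT.601 full-range conversion assumed) and fusing the left-eye R channel with the right-eye G and B channels into an anaglyph frame. It is a NumPy stand-in for illustration, not the DM642 implementation.

```python
import numpy as np

def ycbcr_to_rgb(ycbcr):
    """Assumed BT.601 YCbCr (8-bit, 128-centred chroma) to RGB conversion."""
    y, cb, cr = [ycbcr[..., i].astype(np.float32) for i in range(3)]
    r = y + 1.402 * (cr - 128.0)
    g = y - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
    b = y + 1.772 * (cb - 128.0)
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)

def fuse_red_blue(left_ycbcr, right_ycbcr):
    """Anaglyph fusion: R channel from the left camera, G and B from the right."""
    left_rgb = ycbcr_to_rgb(left_ycbcr)
    right_rgb = ycbcr_to_rgb(right_ycbcr)
    fused = right_rgb.copy()
    fused[..., 0] = left_rgb[..., 0]   # replace red with the left-eye red component
    return fused
```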
A software package for evaluating the performance of a star sensor operation
NASA Astrophysics Data System (ADS)
Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant
2017-02-01
We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from the star patterns it records in its images. The star sensor implements a centroiding algorithm to find the centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe a software package to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case, where sky background and instrument errors are omitted, and a more realistic case, where noise and camera parameters are added to the simulated images. We evaluate performance parameters of the algorithms such as attitude accuracy, calculation time, required memory, star catalog size, and sky coverage, and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, and distortion effects, and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For hardware implementation on our StarSense, we are currently porting the code to functions written in C, keeping in view easy implementation on any star sensor electronics hardware.
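As an illustration of the first algorithmic step described above, here is a toy intensity-weighted centroider built on SciPy's connected-component labelling; it omits the deblending, windowing and noise handling a real star sensor needs, and the detection threshold is left to the caller.

```python
import numpy as np
from scipy.ndimage import center_of_mass, label

def centroid_stars(image, threshold):
    """Toy centroider: threshold the frame, label connected bright regions, and
    return the flux-weighted centre of each region as (row, col) tuples."""
    image = np.asarray(image, dtype=float)
    mask = image > threshold                 # crude star detection
    labels, n_stars = label(mask)            # connected components above threshold
    return center_of_mass(image, labels, index=range(1, n_stars + 1))
```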
Solar System Portrait - Views of 6 Planets
1996-09-13
These six narrow-angle color images were made from the first ever portrait of the solar system taken by NASA's Voyager 1, which was more than 4 billion miles from Earth and about 32 degrees above the ecliptic. The spacecraft acquired a total of 60 frames for a mosaic of the solar system which shows six of the planets. Mercury is too close to the sun to be seen. Mars was not detectable by the Voyager cameras due to scattered sunlight in the optics, and Pluto was not included in the mosaic because of its small size and distance from the sun. These blown-up images, left to right and top to bottom, are Venus, Earth, Jupiter, Saturn, Uranus, and Neptune. The background features in the images are artifacts resulting from the magnification. The images were taken through three color filters -- violet, blue and green -- and recombined to produce the color images. Jupiter and Saturn were resolved by the camera, but Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposure times. Earth appears to be in a band of light because it coincidentally lies right in the center of the scattered light rays resulting from taking the image so close to the sun. Earth was a crescent only 0.12 pixels in size. Venus was 0.11 pixel in diameter. The planetary images were taken with the narrow-angle camera (1500 mm focal length). http://photojournal.jpl.nasa.gov/catalog/PIA00453
Status of MUSIC, the MUltiwavelength Sub/millimeter Inductance Camera
NASA Astrophysics Data System (ADS)
Golwala, Sunil R.; Bockstiegel, Clint; Brugger, Spencer; Czakon, Nicole G.; Day, Peter K.; Downes, Thomas P.; Duan, Ran; Gao, Jiansong; Gill, Amandeep K.; Glenn, Jason; Hollister, Matthew I.; LeDuc, Henry G.; Maloney, Philip R.; Mazin, Benjamin A.; McHugh, Sean G.; Miller, David; Noroozian, Omid; Nguyen, Hien T.; Sayers, Jack; Schlaerth, James A.; Siegel, Seth; Vayonakis, Anastasios K.; Wilson, Philip R.; Zmuidzinas, Jonas
2012-09-01
We present the status of MUSIC, the MUltiwavelength Sub/millimeter Inductance Camera, a new instrument for the Caltech Submillimeter Observatory. MUSIC is designed to have a 14' diffraction-limited field of view instrumented with 2304 detectors in 576 spatial pixels and four spectral bands at 0.87, 1.04, 1.33, and 1.98 mm. MUSIC will be used to study dusty star-forming galaxies, galaxy clusters via the Sunyaev-Zeldovich effect, and star formation in our own and nearby galaxies. MUSIC uses broadband superconducting phased-array slot-dipole antennas to form beams, lumped-element on-chip bandpass filters to define spectral bands, and microwave kinetic inductance detectors to sense incoming light. The focal plane is fabricated in 8 tiles consisting of 72 spatial pixels each. It is coupled to the telescope via an ambient-temperature ellipsoidal mirror and a cold reimaging lens. A cold Lyot stop sits at the image of the primary mirror formed by the ellipsoidal mirror. Dielectric and metal-mesh filters are used to block thermal infrared and out-of-band radiation. The instrument uses a pulse tube cooler and 3He/3He/4He closed-cycle cooler to cool the focal plane to below 250 mK. A multilayer shield attenuates Earth's magnetic field. Each focal plane tile is read out by a single pair of coaxes and a HEMT amplifier. The readout system consists of 16 copies of custom-designed ADC/DAC and IF boards coupled to the CASPER ROACH platform. We focus on recent updates on the instrument design and results from the commissioning of the full camera in 2012.
First light of Cassis: the stereo surface imaging system onboard the exomars TGO
NASA Astrophysics Data System (ADS)
Gambicorti, L.; Piazza, D.; Pommerol, A.; Roloff, V.; Gerber, M.; Ziethe, R.; El-Maarry, M. R.; Weigel, T.; Johnson, M.; Vernani, D.; Pelo, E.; Da Deppo, V.; Cremonese, G.; Ficai Veltroni, I.; Thomas, N.
2017-09-01
The Colour and Stereo Surface Imaging System (CaSSIS) camera was launched on 14 March 2016 onboard the ExoMars Trace Gas Orbiter (TGO) and it is currently in cruise to Mars. The CaSSIS high resolution optical system is based on a TMA telescope (Three Mirrors Anastigmatic configuration) with a 4th powered folding mirror compacting the CFRP (Carbon Fiber Reinforced Polymer) structure. The camera EPD (Entrance Pupil Diameter) is 135 mm and the focal length is 880 mm, giving an F# 6.5 system; the wavelength range covered by the instrument is 400-1100 nm. The optical system is designed to have distortion of less than 2%, and a worst case Modulation Transfer Function (MTF) of 0.3 at the detector Nyquist spatial frequency (i.e. 50 lp/mm). The Focal Plane Assembly (FPA), including the detector, is a spare from the Simbio-Sys instrument of the Italian Space Agency (ASI). Simbio-Sys will fly on ESA's BepiColombo mission to Mercury in 2018. The detector, developed by Raytheon Vision Systems, is a 2k×2k hybrid Si-PIN array with 10 μm pixel pitch. The detector allows snap shot operation at a read-out rate of 5 Mpx/s with 14-bit resolution. CaSSIS will operate in a push-frame mode with a Filter Strip Assembly (FSA), placed directly above the detector sensitive area, selecting 4 colour bands. The scale of 4.6 m/px at a slant angle from the nominal orbit is foreseen to produce frames of 9.4 km × 6.3 km on the Martian surface, covering a Field of View (FoV) of 1.33° cross track × 0.88° along track. The University of Bern was in charge of the full instrument integration as well as the characterisation of the focal plane of CaSSIS. The paper will present an overview of CaSSIS and the optical performance of the telescope and the FPA. The preliminary results of the on-ground calibration campaign and the first light obtained during the commissioning and pointing campaign (April 2016) will be described in detail. The instrument is acquiring images with an average Point Spread Function at Full-Width-Half-Maximum (PSF FWHM) of < 1.5 px, as expected.
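A back-of-envelope check of the quoted surface scale, assuming a nominal orbital altitude of about 400 km (an assumption, not stated above) together with the 880 mm focal length and 10 μm pixel pitch from the abstract.

```python
# Rough scale check; the altitude is an assumed nominal value, not from the text.
focal_length_m = 0.880       # 880 mm
pixel_pitch_m = 10e-6        # 10 um hybrid Si-PIN pixels
altitude_m = 400e3           # assumed nominal orbital altitude

gsd_m = altitude_m * pixel_pitch_m / focal_length_m   # metres per pixel at nadir
swath_m = gsd_m * 2048                                # cross-track extent of a 2k frame

print(f"scale ~ {gsd_m:.1f} m/px, swath ~ {swath_m / 1e3:.1f} km")  # ~4.5 m/px, ~9.3 km
```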
Calibration of the Lunar Reconnaissance Orbiter Camera
NASA Astrophysics Data System (ADS)
Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.
2008-12-01
The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to the multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped at 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%. However, as this feature is well understood it can be greatly reduced during ground processing. All three cameras were calibrated in the laboratory under ambient conditions. Future thermal vacuum tests will characterize critical behaviors across the full range of lunar operating temperatures. In-flight tests will check for changes in response after launch and provide key data for meeting the requirements of 1% relative and 10% absolute radiometric calibration.
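The quoted signal-to-noise figures are consistent with a simple shot-noise-plus-read-noise model; the sketch below evaluates that model for the NAC-L read noise of 71 e-, with the example signal levels chosen only for illustration.

```python
import math

def snr(signal_e, read_noise_e):
    """Shot noise plus read noise only; dark current and other terms are
    neglected. Illustrative model, not the LROC calibration pipeline."""
    return signal_e / math.sqrt(signal_e + read_noise_e**2)

# With a read noise of 71 e-, a faint terminator-like signal of a few thousand
# electrons already gives an SNR of a few tens; bright scenes give hundreds.
for signal in (4500, 50000, 200000):
    print(f"{signal:7d} e-  ->  SNR ~ {snr(signal, 71):.0f}")
```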
Miniature Wide-Angle Lens for Small-Pixel Electronic Camera
NASA Technical Reports Server (NTRS)
Mouroulis, Pantazis; Blazejewski, Edward
2009-01-01
A proposed wide-angle lens would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The example lens has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also discussed.
Optical registration of spaceborne low light remote sensing camera
NASA Astrophysics Data System (ADS)
Li, Chong-yang; Hao, Yan-hui; Xu, Peng-mei; Wang, Dong-jie; Ma, Li-na; Zhao, Ying-long
2018-02-01
To meet the high-precision optical registration requirement of a spaceborne low-light remote sensing camera, dual-channel optical registration of the CCD and EMCCD is achieved with a high-magnification optical registration system. A system-level optical registration and registration-accuracy scheme for a spaceborne low-light remote sensing camera with short focal depth and wide field of view is proposed in this paper, including an analysis of the parallel misalignment of the CCD and of the registration accuracy. Actual registration results show clear imaging, with the MTF and the registration accuracy meeting requirements, which provides an important guarantee for obtaining high-quality image data in orbit.
C-RED one: ultra-high speed wavefront sensing in the infrared made possible
NASA Astrophysics Data System (ADS)
Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian
2016-07-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, a truly disruptive technology in imaging. We will show the performance of the camera and its main features, and compare them to other high-performance wavefront sensing cameras such as OCAM2 in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
The Volumetric Imaging System for the Ionosphere (VISION)
2011-01-01
... telescope mirror is an 80 mm focal length, 50 mm diameter, first surface mirror with magnesium fluoride over aluminum coating on a zerodur blank ... will be either a wedge-and-strip anode or a codacon. The telescope mirror is an 80 mm focal length, 2.5 cm diameter, first surface mirror with magnesium fluoride over aluminum coating on a zerodur blank. The filters are the three reflection bandpass filters developed for and flown on ...
Hyperspectral Infrared Imaging of Flames Using a Spectrally Scanning Fabry-Perot Filter
NASA Technical Reports Server (NTRS)
Rawlins, W. T.; Lawrence, W. G.; Marinelli, W. J.; Allen, M. G.; Piltch, N. (Technical Monitor)
2001-01-01
The temperatures and compositions of gases in and around flames can be diagnosed using infrared emission spectroscopy to observe molecular band shapes and intensities. We have combined this approach with a low-order scanning Fabry-Perot filter and an infrared camera to obtain spectrally scanned infrared emission images of a laboratory flame and exhaust plume from 3.7 to 5.0 micrometers, at a spectral resolution of 0.043 micrometers, and a spatial resolution of 1 mm. The scanning filter or AIRIS (Adaptive Infrared Imaging Spectroradiometer) is a Fabry-Perot etalon operating in low order (mirror spacing = wavelength) such that the central spot, containing a monochromatic image of the scene, is viewed by the detector array. The detection system is a 128 x 128 liquid-nitrogen-cooled InSb focal plane array. The field of view is controlled by a 50 mm focal length multielement lens and an f/4.8 aperture, resulting in an image 6.4 x 6.4 cm in extent at the flame and a depth of field of approximately 4 cm. Hyperspectral images above a laboratory CH4/air flame show primarily the strong emission from CO2 at 4.3 micrometers, and weaker emissions from CO and H2O. We discuss techniques to analyze the spectra, and plans to use this instrument in microgravity flame spread experiments.
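A minimal sketch of how a low-order Fabry-Perot passband tracks the mirror gap: transmission peaks at λ = 2d/m for an air-gap etalon at normal incidence, and operating near "mirror spacing = wavelength" corresponds to order m = 2. The scan values are illustrative, not the AIRIS drive parameters.

```python
import numpy as np

m = 2                                            # low interference order (d ~ lambda)
spacings_um = np.linspace(3.7, 5.0, 5)           # illustrative mirror-gap scan, micrometres
peaks_um = 2.0 * spacings_um / m                 # transmitted wavelength at each step

for d, lam in zip(spacings_um, peaks_um):
    print(f"gap {d:.2f} um -> passband centre {lam:.2f} um")
```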
Chromatic control in coextruded layered polymer microlenses
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Oder, Tom N.; Andrews, James H.; Zhou, Chuanhong; Petrus, Joshua B.; Merlo, Cory; Bagheri, Cameron; Hetzel, Connor; Tancabel, James; Singer, Kenneth D.; Baer, Eric
2014-12-01
We describe the formation, characterization and theoretical understanding of microlenses comprised of alternating polystyrene and polymethylmethacrylate layers produced by multilayer coextrusion. These lenses are fabricated by photolithography, using a grayscale mask followed by plasma etching, so that the refractive index alternation of the bilayer stack appears across the radius of the microlens. The alternating quarter-wave thick layers form a one-dimensional photonic crystal whose dispersion augments the material dispersion, allowing one to sculpt the chromatic dispersion of the lens by adjusting the layered structure. Using Huygens' principle, we model our experimental measurements of the focal length of these lenses across the reflection band of the multilayer polymer film from which the microlens is fashioned. For a 56 micron diameter multilayered lens of focal length 300 microns, we measured a nearly 25 percent variation in the focal length across a shallow, 50 nm-wide reflection band.
Imaging spectroscopy using embedded diffractive optical arrays
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford
2017-09-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera based on diffractive optic arrays. This approach to hyperspectral imaging has been demonstrated in all three infrared bands: SWIR, MWIR and LWIR. The hyperspectral optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of this infrared hyperspectral sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics comprising an area array of diffractive optical elements, where each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our optical-mechanical design approach, which results in an infrared hyperspectral imaging system small enough to serve as a payload on a small satellite, mini-UAV or commercial quadcopter, or to be man-portable. An application in which this spectral imaging technology is used to quantify the mass and volume flow rates of hydrocarbon gases is also described. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. The detector array is divided into sub-images, one covered by each lenslet. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the number of different spectral images collected simultaneously in each frame of the camera. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame. This system spans the SWIR and MWIR bands with a single optical array and focal plane array.
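An idealised sketch of the sub-image bookkeeping described above: slicing an n x n lenslet mosaic out of a square focal-plane frame (no gaps or registration offsets are modelled).

```python
import numpy as np

def split_subimages(frame, n):
    """Divide a square focal-plane frame into the n x n spectral sub-images
    formed by an n x n lenslet array (idealised, gap-free tiling)."""
    h, w = frame.shape
    sh, sw = h // n, w // n
    return [frame[r*sh:(r+1)*sh, c*sw:(c+1)*sw] for r in range(n) for c in range(n)]

frame = np.zeros((1024, 1024))        # e.g. the 1024 x 1024 array quoted above
subs = split_subimages(frame, 4)      # 4 x 4 lenslets -> 16 spectral sub-images
print(len(subs), subs[0].shape)       # 16 images of 256 x 256 pixels
```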
Camera Concepts for the Advanced Gamma-Ray Imaging System (AGIS)
NASA Astrophysics Data System (ADS)
Nepomuk Otte, Adam
2009-05-01
The Advanced Gamma-Ray Imaging System (AGIS) is a concept for the next generation observatory in ground-based very high energy gamma-ray astronomy. Design goals are ten times better sensitivity, higher angular resolution, and a lower energy threshold than existing Cherenkov telescopes. Each telescope is equipped with a camera that detects and records the Cherenkov-light flashes from air showers. The camera is comprised of a pixelated focal plane of blue-sensitive and fast (nanosecond) photon detectors that detect the photon signal and convert it into an electrical one. The incorporation of trigger electronics and signal digitization into the camera is under study. Given the size of AGIS, the camera must be reliable, robust, and cost effective. We are investigating several directions that include innovative technologies such as Geiger-mode avalanche photodiodes as a possible detector and switched capacitor arrays for the digitization.
Rogers, B.T. Jr.; Davis, W.C.
1957-12-17
This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. The camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, and an image recording surface. The combination of the rotating mirror and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera with this short a resolution time becomes possible.
An Integrated Optimal Estimation Approach to Spitzer Space Telescope Focal Plane Survey
NASA Technical Reports Server (NTRS)
Bayard, David S.; Kang, Bryan H.; Brugarolas, Paul B.; Boussalis, D.
2004-01-01
This paper discusses an accurate and efficient method for focal plane survey that was used for the Spitzer Space Telescope. The approach is based on using a high-order 37-state Instrument Pointing Frame (IPF) Kalman filter that combines both engineering parameters and science parameters into a single filter formulation. In this approach, engineering parameters such as pointing alignments, thermomechanical drift and gyro drifts are estimated along with science parameters such as plate scales and optical distortions. This integrated approach has many advantages compared to estimating the engineering and science parameters separately. The resulting focal plane survey approach is applicable to a diverse range of science instruments such as imaging cameras, spectroscopy slits, and scanning-type arrays alike. The paper will summarize results from applying the IPF Kalman Filter to calibrating the Spitzer Space Telescope focal plane, containing the MIPS, IRAC, and the IRS science Instrument arrays.
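A highly simplified sketch of the basic linear Kalman measurement update that a filter such as the 37-state IPF estimator builds on; the state layout and measurement model here are generic placeholders, not the actual IPF formulation.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One linear Kalman measurement update (sketch, not the flight filter)."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y                      # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P   # updated covariance
    return x_new, P_new

# Hypothetical usage: jointly estimating an alignment offset (engineering
# parameter) and a plate scale (science parameter) would place both in x and
# let H map them to predicted star centroid positions.
```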
Spatial characterization of Bessel-like beams for strong-field physics.
Summers, Adam M; Yu, Xiaoming; Wang, Xinya; Raoul, Maxime; Nelson, Josh; Todd, Daniel; Zigo, Stefan; Lei, Shuting; Trallero-Herrero, Carlos A
2017-02-06
We present a compact, simple design for the generation and tuning of both the spot size and effective focal length of Bessel-like beams. In particular, this setup provides an important tool for the use of Bessel-like beams with high-power, femtosecond laser systems. Using a shallow angle axicon in conjunction with a spherical lens, we show that it is possible to focus Bessel-like modes to comparable focal spot sizes to sharp axicons while maintaining a long effective focal length. The resulting focal profiles are characterized in detail using an accurate high dynamic range imaging technique. Quantitatively, we introduce a metric (R0.8) which defines the spot-size containing 80% of the total energy. Our setup overcomes the typical compromise between long working distances and small spot sizes. This is particularly relevant for strong-field physics where most experiments must operate in vacuum.
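A small sketch of the R0.8-style metric defined above: the radius of a circle, centred on the spot, that encloses 80% of the total energy in a background-subtracted image. The spot centre is assumed known.

```python
import numpy as np

def r_fraction(image, cx, cy, fraction=0.8):
    """Radius enclosing `fraction` of the total energy about (cx, cy).
    Assumes a background-subtracted, non-negative image."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - cx, yy - cy).ravel()         # radial distance of each pixel
    w = image.ravel().astype(float)                # pixel energies
    order = np.argsort(r)                          # sort pixels by radius
    enclosed = np.cumsum(w[order]) / w.sum()       # cumulative enclosed-energy curve
    return r[order][np.searchsorted(enclosed, fraction)]
```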
Biomimetic small scale variable focal length lens unit using synthetic elastomer actuators
NASA Astrophysics Data System (ADS)
Kim, Baek-chul; Chung, Jinah; Lee, Y.; Nam, Jae-Do; Moon, Hyungpil; Choi, Hyouk Ryeol; Koo, J. C.
2011-04-01
With its combination of a gel-like soft lens, ligaments, and the ciliary muscles, the human eye accommodates effectively over a range of focal lengths without a complicated lens group. Such a simple, compact, yet effective optical system deserves attention from various technical fields, especially the portable information technology device industry. Noting the limited physical space in those devices, the demanding shock durability, and the need for high-volume production, the present paper proposes a biomimetic optical lens unit composed of a circular silicone lens and an annular dielectric polymer actuator. Unlike a traditional optical lens mechanism, which normally acquires focus by changing the focal distance with a moving lens or focal plane, the proposed optical system changes its lens thickness using an annularly connected polymer actuator in order to achieve image focus. The proposed biomimetic lens system ensures high shock durability, compact physical dimensions, fast actuation, a simple manufacturing process, and low production cost.
Multistage polymeric lens structure in silica-waveguides for photonic functional circuits
NASA Astrophysics Data System (ADS)
Tate, Atsushi; Suzuki, Takanori; Tsuda, Hiroyuki
2005-04-01
A waveguide lens composed of multistage polymer-filled thin grooves in a silica planar lightwave circuit (PLC) is proposed, and a low-loss structure is designed. Both an imaging optical system and a Fourier-transform optical system can be configured in a PLC by use of a waveguide lens, which makes a PLC functional and its design flexible. Moreover, the focal length of the lens is tunable owing to the large thermo-optic effect of the polymer. A concatenated lens is formed to attain a desirable focal length with low loss. The thickness of each lens and the spacing are about 10-50 microns. Simulation showed that the radiation loss of light propagating through 20 stages of grooves filled with a polymer was only 0.868 dB when the refractive index of the polymer was 1.57, the groove width was 30 microns, and the spacing between adjacent grooves was 15 microns. For example, a single lens structure with a center thickness of 30 microns, a diameter of 300 microns, and a polymer refractive index of 1.57 has a focal length of 4600 microns. A focal length of 450 microns can be obtained with the 20-stage concatenated lens structure. A larger numerical aperture can be realized with a polymer of higher refractive index. We have applied the concatenated lens structure to various photonic circuits, including optical couplers and a variable optical attenuator.
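For readers wanting to reproduce the concatenated-lens reasoning, the sketch below chains ABCD ray matrices for N identical thin lenses separated by a uniform gap; the 45 μm period is an assumption, and the result is only meant to show the method, not to match the 450 μm figure quoted above.

```python
import numpy as np

f_single_um = 4600.0     # focal length of one groove lens (from the abstract)
period_um = 45.0         # assumed lens-to-lens period (30 um lens + 15 um gap)
N = 20                   # number of concatenated stages

lens = np.array([[1.0, 0.0], [-1.0 / f_single_um, 1.0]])   # thin-lens matrix
gap = np.array([[1.0, period_um], [0.0, 1.0]])              # free propagation

M = np.eye(2)
for _ in range(N):
    M = lens @ gap @ M       # one period: propagate a gap, then refract

f_eff = -1.0 / M[1, 0]       # effective focal length from the C element
print(f"effective focal length ~ {f_eff:.0f} um")
```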
Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera
NASA Astrophysics Data System (ADS)
Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu
2016-09-01
We perform an experiment on achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under visible and infrared illumination. As a result, the removal of chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration set assuming the use of a night vision camera in an automobile and showed the effect of the WFC system.
Single-snapshot 2D color measurement by plenoptic imaging system
NASA Astrophysics Data System (ADS)
Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana
2014-03-01
Plenoptic cameras enable capture of directional light ray information, thus allowing applications such as digital refocusing, depth estimation, or multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor high color fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision of this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing and color estimation processing. Optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of displays and show that it achieves color accuracy of ΔE<0.01.
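The colour-accuracy evaluation mentioned above can be illustrated with the classic CIE76 colour difference computed from XYZ via CIELAB; the white point is assumed to be D65, and the paper's actual ΔE variant may differ.

```python
import numpy as np

def xyz_to_lab(xyz, white=(0.9505, 1.0, 1.0890)):
    """CIE XYZ -> CIELAB for a given white point (D65 assumed here)."""
    t = np.asarray(xyz, float) / np.asarray(white, float)
    delta = 6.0 / 29.0
    f = np.where(t > delta**3, np.cbrt(t), t / (3 * delta**2) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    a = 500.0 * (f[0] - f[1])
    b = 200.0 * (f[1] - f[2])
    return np.array([L, a, b])

def delta_e76(xyz_measured, xyz_reference):
    """Classic CIE76 colour difference between two XYZ stimuli (illustrative)."""
    return float(np.linalg.norm(xyz_to_lab(xyz_measured) - xyz_to_lab(xyz_reference)))
```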
NASA Astrophysics Data System (ADS)
Torkildsen, H. E.; Hovland, H.; Opsahl, T.; Haavardsholm, T. V.; Nicolas, S.; Skauli, T.
2014-06-01
In some applications of multi- or hyperspectral imaging, it is important to have a compact sensor. The most compact spectral imaging sensors are based on spectral filtering in the focal plane. For hyperspectral imaging, it has been proposed to use a "linearly variable" bandpass filter in the focal plane, combined with scanning of the field of view. As the image of a given object in the scene moves across the field of view, it is observed through parts of the filter with varying center wavelength, and a complete spectrum can be assembled. However if the radiance received from the object varies with viewing angle, or with time, then the reconstructed spectrum will be distorted. We describe a camera design where this hyperspectral functionality is traded for multispectral imaging with better spectral integrity. Spectral distortion is minimized by using a patterned filter with 6 bands arranged close together, so that a scene object is seen by each spectral band in rapid succession and with minimal change in viewing angle. The set of 6 bands is repeated 4 times so that the spectral data can be checked for internal consistency. Still the total extent of the filter in the scan direction is small. Therefore the remainder of the image sensor can be used for conventional imaging with potential for using motion tracking and 3D reconstruction to support the spectral imaging function. We show detailed characterization of the point spread function of the camera, demonstrating the importance of such characterization as a basis for image reconstruction. A simplified image reconstruction based on feature-based image coregistration is shown to yield reasonable results. Elimination of spectral artifacts due to scene motion is demonstrated.
Multipurpose Hyperspectral Imaging System
NASA Technical Reports Server (NTRS)
Mao, Chengye; Smith, David; Lanoue, Mark A.; Poole, Gavin H.; Heitschmidt, Jerry; Martinez, Luis; Windham, William A.; Lawrence, Kurt C.; Park, Bosoon
2005-01-01
A hyperspectral imaging system of high spectral and spatial resolution that incorporates several innovative features, including a focal plane scanner (U.S. Patent 6,166,373), has been developed. This feature enables the system to be used for both airborne/spaceborne and laboratory hyperspectral imaging with or without relative movement of the imaging system, and it can be used to scan a target of any size as long as the target can be imaged at the focal plane; for example, automated inspection of food items and identification of single-celled organisms. The spectral resolution of this system is greater than that of prior terrestrial multispectral imaging systems. Moreover, unlike prior high-spectral resolution airborne and spaceborne hyperspectral imaging systems, this system does not rely on relative movement of the target and the imaging system to sweep an imaging line across a scene. This compact system consists of a front objective mounted at a translation stage with a motorized actuator, and a line-slit imaging spectrograph mounted within a rotary assembly with a rear adaptor to a charge-coupled-device (CCD) camera. Push-broom scanning is carried out by the motorized actuator, which can be controlled either manually by an operator or automatically by a computer to drive the line-slit across an image at a focal plane of the front objective. To reduce the cost, the system has been designed to integrate as many off-the-shelf components as possible, including the CCD camera and spectrograph. The system has achieved high spectral and spatial resolutions by using a high-quality CCD camera, spectrograph, and front objective lens. Fixtures for attachment of the system to a microscope (U.S. Patent 6,495,818 B1) make it possible to acquire multispectral images of single cells and other microscopic objects.
Zackrisson, Bjørn; Aus, Gunnar; Bergdahl, Svante; Lilja, Hans; Lodding, Pär; Pihl, Carl-Gustav; Hugosson, Jonas
2004-04-01
We evaluated the significance of focal prostate cancer found in sextant biopsies in men participating in a biennial prostate specific antigen (PSA) based screening program. In 1995, 10000 men 50 to 65 years old were randomized to biennial screening with PSA testing. Sextant biopsies were recommended when total PSA was 3 ng/ml or greater at screening rounds 1 and 2, and 2.54 ng/ml or greater at subsequent screening rounds. Focal cancer was defined as a total core cancer length of less than 3 mm in the biopsy specimen. Low volume cancer was defined as a total tumor volume of less than 0.5 cm3 in the radical retropubic prostatectomy specimen. The number of men who underwent biopsy and the number of cancers detected in the 5 possible sets of biopsies were 1725 and 402, 706 and 124, 307 and 36, 103 and 9, and 13 and 0, respectively. The risk of detecting focal cancer was 7.9%, 10.2%, 7.5%, 5.8% and 0%, respectively, but the relative ratio of focal to all cancers increased to 34%, 58%, 64% and 67%, respectively (not applicable at the fifth round). In men with a total core cancer length of less than 10 mm there was no correlation between core cancer length and total tumor volume, as measured in the prostatectomy specimen. Two-thirds of men with a total core cancer length of less than 3 mm had a tumor volume of greater than 0.5 cm3, while the risk of low volume cancer was less than 5% only in men with a total core cancer length of greater than 10 mm. In a repeat PSA based screening program, sextant biopsies are of little or no value for predicting tumor volume.
DotLens smartphone microscopy for biological and biomedical applications (Conference Presentation)
NASA Astrophysics Data System (ADS)
Sung, Yu-Lung; Zhao, Fusheng; Shih, Wei-Chuan
2017-02-01
Recent advances in inkjet-printed optics have created a new class of lens fabrication techniques. Lenses with tunable geometry, magnification, and focal length can be fabricated by dispensing controlled amounts of liquid polymer onto a heated surface. This fabrication technique is highly cost-effective and can achieve an optically smooth surface finish. Dubbed DotLens, a single lens weighs less than 50 mg and occupies a volume of less than 50 μL. A DotLens can be attached onto any smartphone camera akin to a contact lens, enabling smartphones to obtain image resolution as fine as 1 µm. The surface curvature modifies the optical path of light to the image sensor and enables the camera to focus as close as 2 mm. This enables microscopic imaging on a smartphone without any additional attachments and has shown great potential in mobile point-of-care diagnostic systems, particularly for histology of tissue sections and cytology of blood cells. DotLens smartphone microscopy represents an innovative approach fundamentally different from other smartphone microscopes. In this paper, we describe the application and performance of DotLens smartphone microscopy in biological and biomedical research. In particular, we show recent results from images collected from pathology tissue slides with cancer features. In addition, we show performance in cytological analysis of blood smears. This tool has empowered Citizen Science investigators to collect microscopic images from various interesting objects.
NASA Astrophysics Data System (ADS)
Xie, Xingwang; Han, Xinjie; Long, Huabao; Dai, Wanwan; Xin, Zhaowei; Wei, Dong; Zhang, Xinyu; Wang, Haiwei; Xie, Changsheng
2018-02-01
In this paper, a new liquid-crystal microlens array (LCMLA) with patterned ring-electrode arrays (PREAs) is investigated, which has the ability to acquire multiple-mode two-dimensional images with better electrically tunable efficiency than common liquid-crystal devices. This new type of LCMLA can be used to overcome several notable disadvantages of conventional liquid-crystal microlens arrays, which are switched and adjusted electrically through relatively complex mechanisms. There are two electrode layers in the LCMLA developed by us. The top electrode layer consists of PREAs with different feature diameters but a common center for each cell, and the bottom layer is a plate electrode. When the two electrode structures are driven independently by variable AC voltage signals, a gradient electric field distribution is obtained, which drives the liquid-crystal molecules to reorient themselves along the shaped gradient field, so as to produce a satisfactory refractive-index distribution. Experiments were carried out to validate the required performance. As shown, the focal length of the LCMLA can be adjusted continuously according to the variable voltage signal applied. By design, the LCMLA will be integrated with an image sensor to set up a camera with the desired performance. The test results indicate that our camera based on the LCMLA can obtain distinct multiple-mode two-dimensional images at relatively low driving voltages.
VizieR Online Data Catalog: Five years of blazar observations with VERITAS (Archambault+, 2016)
NASA Astrophysics Data System (ADS)
Archambault, S.; Archer, A.; Benbow, W.; Bird, R.; Biteau, J.; Buchovecky, M.; Buckley, J. H.; Bugaev, V.; Byrum, K.; Cerruti, M.; Chen, X.; Ciupik, L.; Connolly, M. P.; Cui, W.; Eisch, J. D.; Errando, M.; Falcone, A.; Feng, Q.; Finley, J. P.; Fleischhack, H.; Fortin, P.; Fortson, L.; Furniss, A.; Gillanders, G. H.; Griffin, S.; Grube, J.; Gyuk, G.; Hutten, M.; Hakansson, N.; Hanna, D.; Holder, J.; Humensky, T. B.; Johnson, C. A.; Kaaret, P.; Kar, P.; Kelley-Hoskins, N.; Kertzman, M.; Kieda, D.; Krause, M.; Krennrich, F.; Kumar, S.; Lang, M. J.; Maier, G.; McArthur, S.; McCann, A.; Meagher, K.; Moriarty, P.; Mukherjee, R.; Nguyen, T.; Nieto, D.; O'Faolain de Bhroithe, A.; Ong, R. A.; Otte, A. N.; Park, N.; Perkins, J. S.; Pichel, A.; Pohl, M.; Popkow, A.; Pueschel, E.; Quinn, J.; Ragan, K.; Reynolds, P. T.; Richards, G. T.; Roache, E.; Rovero, A. C.; Santander, M.; Sembroski, G. H.; Shahinyan, K.; Smith, A. W.; Staszak, D.; Telezhinsky, I.; Tucci, J. V.; Tyler, J.; Vincent, S.; Wakely, S. P.; Weiner, O. M.; Weinstein, A.; Williams, D. A.; Zitzer, B.
2018-04-01
In this paper, we present the results of the analysis of most of the non-detected blazars observed by VERITAS from 2007 (the beginning of full-scale scientific operations) to 2012 August (before the upgrade of the VERITAS array, see Kieda 2013arXiv1308.4849D). The VERITAS (Very Energetic Radiation Imaging Telescope Array System) telescope array is composed of four IACTs of 12 m diameter each, located at the Fred Lawrence Whipple Observatory, on the slopes of Mount Hopkins, in southern Arizona (31°40'N, 110°57'W). Each telescope has a segmented mirror that focuses light onto a camera composed of 499 photomultipliers located at the focal plane. The instrument FOV is 3.5°. For further details on the VERITAS instrument, see Holder et al. (2006APh....25..391H) and Holder (2011ICRC...12..137H). The VERITAS observations here have an average length of 20 minutes (referred to as a run), before switching targets or wobble directions. (7 data files).
Experimental results in autonomous landing approaches by dynamic machine vision
NASA Astrophysics Data System (ADS)
Dickmanns, Ernst D.; Werner, Stefan; Kraus, S.; Schell, R.
1994-07-01
The 4-D approach to dynamic machine vision, exploiting full spatio-temporal models of the process to be controlled, has been applied to onboard autonomous landing approaches of aircraft. Aside from image sequence processing, for which it was developed initially, it is also used for data fusion from a range of sensors. By prediction-error feedback, an internal representation of the aircraft state relative to the runway in 3-D space and time is servo-maintained in the interpretation process, from which the required control applications are derived. The validity and efficiency of the approach have been proven both in hardware-in-the-loop simulations and in flight experiments with a twin turboprop aircraft Do128 under perturbations from cross winds and wind gusts. The software package has been ported to `C' and onto a new transputer image processing platform; the system has been expanded for bifocal vision with two cameras of different focal length mounted fixed relative to each other on a two-axis platform for viewing direction control.
NASA Astrophysics Data System (ADS)
Pareschi, Giovanni; Tagliaferri, Gianpiero; Argan, Andrea; Bellazzini, Ronaldo; Catalano, Osvaldo; Costa, Enrico; Cusumano, Giancarlo; Fiore, Fabrizio; Fiorini, Carlo; Malaguti, Giuseppe; Matt, Giorgio; Mereghetti, Sandro; Micela, Giuseppina; Perola, Giuseppe Cesare; Villa, Gabriele
2010-07-01
The Italian New Hard X-ray Mission (NHXM) is an evolution of the HEXIT-Sat concept, extending up to 80 keV the fine imaging capability today available only at E<10 keV, with the further addition of photoelectric imaging polarimetry. NHXM consists of four identical mirrors, with a 10 m focal length, achieved after launch by means of a deployable structure. Three of the four telescopes will have at their focus three identical spectro-imaging cameras, while an X-ray imaging polarimeter will be placed at the focus of the fourth. In order to ensure a low and stable background, NHXM will be placed in a low Earth equatorial orbit. NHXM will provide a real breakthrough on a number of hot astrophysical issues, broadly falling under two main topics: i) taking a census of the black holes in the Universe and probing the physics of accretion in the most diverse conditions; ii) investigating the particle acceleration mechanisms at work in different contexts, and the effects of radiative transfer in highly magnetized plasmas and strong gravitational fields.
The 1982 control network of Mars
NASA Technical Reports Server (NTRS)
Davies, M. E.; Katayama, F. Y.
1983-01-01
Attention is given to a planet-wide control network of Mars that was computed in September 1982 using a large single-block analytical triangulation with 47,524 measurements of 6853 control points on 1054 Mariner 9 and 757 Viking pictures. In all, 19,139 normal equations were solved, with a resulting standard error of measurement of 18.06 microns. The control points identified by name and letter designation are given, as are the aerographic coordinates of the control points. In addition, the coordinates of the Viking I lander site are given: latitude, 22.480 deg; longitude, 47.962 deg (radius, 3389.32 km). This study expands and updates the previously published network (1978). It is noted that the computation differs in many respects from standard aerial mapping photogrammetric practice. In comparison with aerial mapping photography, the television formats are small and the focal lengths are long; stereo coverage is rare, the scale of the pictures varies greatly, and the residual camera distortions are large.
3D-printed eagle eye: Compound microlens system for foveated imaging
Thiele, Simon; Arzenbacher, Kathrin; Gissibl, Timo; Giessen, Harald; Herkommer, Alois M.
2017-01-01
We present a highly miniaturized camera, mimicking the natural vision of predators, by 3D-printing different multilens objectives directly onto a complementary metal-oxide semiconductor (CMOS) image sensor. Our system combines four printed doublet lenses with different focal lengths (equivalent to f = 31 to 123 mm for a 35-mm film) in a 2 × 2 arrangement to achieve a full field of view of 70° with an increasing angular resolution of up to 2 cycles/deg field of view in the center of the image. The footprint of the optics on the chip is below 300 μm × 300 μm, whereas their height is <200 μm. Because the four lenses are printed in one single step without the necessity for any further assembling or alignment, this approach allows for fast design iterations and can lead to a plethora of different miniaturized multiaperture imaging systems with applications in fields such as endoscopy, optical metrology, optical sensing, surveillance drones, or security. PMID:28246646
NASA Astrophysics Data System (ADS)
Malaguti, G.; Pareschi, G.; Ferrando, P.; Caroli, E.; Di Cocco, G.; Foschini, L.; Basso, S.; Del Sordo, S.; Fiore, F.; Bonati, A.; Lesci, G.; Poulsen, J. M.; Monzani, F.; Stevoli, A.; Negri, B.
2005-08-01
The 10-100 keV region of the electromagnetic spectrum contains the potential for a dramatic improvement in our understanding of a number of key problems in high energy astrophysics. A deep inspection of the universe in this band is, on the other hand, still lacking because of the demanding sensitivity (a fraction of a μCrab in the 20-40 keV band for a 1 Ms integration time) and imaging (≈ 15" angular resolution) requirements. The mission ideas currently being proposed are based on long focal length, grazing incidence, multi-layer optics, coupled with focal plane detectors with spatial resolution of a few hundred μm. The required large focal lengths, ranging between 8 and 50 m, can be realized by means of extendable optical benches (as foreseen e.g. for the HEXITSAT, NEXT and NuSTAR missions) or formation flight scenarios (e.g. Simbol-X and XEUS). While the final telescope design will require a detailed trade-off analysis between all the relevant parameters (focal length, plate scale value, angular resolution, field of view, detector size, and sensitivity degradation due to detector dead area and telescope vignetting), extreme attention must be dedicated to background minimization. In this respect, key issues are represented by the passive baffling system, which in the case of large focal lengths requires particular design assessments, and by the active/passive shielding geometries and materials. In this work, the results of a study of the expected background for a hard X-ray telescope are presented, and their implications for the required sensitivity, together with possible implementation design concepts for active and passive shielding in the framework of future satellite missions, are discussed.
SWUIS-A: A Versatile, Low-Cost UV/VIS/IR Imaging System for Airborne Astronomy and Aeronomy Research
NASA Technical Reports Server (NTRS)
Durda, Daniel D.; Stern, S. Alan; Tomlinson, William; Slater, David C.; Vilas, Faith
2001-01-01
We have developed and successfully flight-tested on 14 different airborne missions the hardware and techniques for routinely conducting valuable astronomical and aeronomical observations from high-performance, two-seater military-type aircraft. The SWUIS-A (Southwest Universal Imaging System - Airborne) system consists of an image-intensified CCD camera with broad band response from the near-UV to the near IR, high-quality foreoptics, a miniaturized video recorder, an aircraft-to-camera power and telemetry interface with associated camera controls, and associated cables, filters, and other minor equipment. SWUIS-A's suite of high-quality foreoptics gives it selectable, variable focal length/variable field-of-view capabilities. The SWUIS-A camera frames at 60 Hz video rates, which is a key requirement for both jitter compensation and high time resolution (useful for occultation, lightning, and auroral studies). Broadband SWUIS-A image coadds can exceed a limiting magnitude of V = 10.5 in <1 sec with dark sky conditions. A valuable attribute of SWUIS-A airborne observations is the fact that the astronomer flies with the instrument, thereby providing Space Shuttle-like "payload specialist" capability to "close-the-loop" in real-time on the research done on each research mission. Key advantages of the small, high-performance aircraft on which we can fly SWUIS-A include significant cost savings over larger, more conventional airborne platforms, worldwide basing obviating the need for expensive, campaign-style movement of specialized large aircraft and their logistics support teams, and ultimately faster reaction times to transient events. Compared to ground-based instruments, airborne research platforms offer superior atmospheric transmission, the mobility to reach remote and often-times otherwise unreachable locations over the Earth, and virtually-guaranteed good weather for observing the sky. Compared to space-based instruments, airborne platforms typically offer substantial cost advantages and the freedom to fly along nearly any groundtrack route for transient event tracking such as occultations and eclipses.
Size-effect of oligomeric cholesteric liquid-crystal microlenses on the optical specifications.
Bayon, Chloé; Agez, Gonzague; Mitov, Michel
2015-10-15
In cholesteric liquid-crystalline microlenses, we have studied the role of the microlens size in the focused light intensity and the focal length. We found that the intensity is maximized by targeting a specific range of microlens diameter and thickness, and that the focal length can be adjusted by controlling the diameter and the annealing time of the optical film. Cholesteric microlenses may be used as wavelength-tunable directional light sources in organic soft-matter circuits.
Focus-tunable liquid cylindrical lens based on electrowetting
NASA Astrophysics Data System (ADS)
Tan, Yanting; Peng, Runling
2017-10-01
The double-liquid focus-tunable lens based on electrowetting on dielectrics is attracting many researchers' attention because of its compact volume, fast response speed, low power consumption, etc. In this paper, a focus-tunable liquid cylindrical lens based on electrowetting is designed, and the structure and operating principles of this lens are introduced. COMSOL Multiphysics is used to simulate the lens chamber, and the focal length is varied continuously. Based on the materials used in our laboratory, the focal length is estimated to range over (-∞, -38.6 mm) ∪ (61.4 mm, +∞).
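A minimal sketch of the textbook electrowetting-lens model behind such a design: the Young-Lippmann equation sets the contact angle, which fixes the interface curvature and hence the focal length. Every numeric value below is an illustrative assumption, not a parameter of the device above, so the printed range will not match the quoted (-∞, -38.6 mm) ∪ (61.4 mm, +∞).

```python
import math

eps0, eps_r = 8.854e-12, 3.0     # vacuum permittivity, assumed dielectric constant
d = 1e-6                         # assumed dielectric thickness (m)
gamma = 0.04                     # assumed liquid-liquid interfacial tension (N/m)
theta0 = math.radians(140.0)     # assumed zero-voltage contact angle
a = 1.5e-3                       # assumed cell half-aperture (m)
dn = 0.15                        # assumed refractive-index difference of the liquids

def focal_length(voltage):
    # Young-Lippmann: cos(theta) grows quadratically with applied voltage.
    cos_t = math.cos(theta0) + eps0 * eps_r * voltage**2 / (2.0 * gamma * d)
    cos_t = max(min(cos_t, 1.0), -1.0)
    if abs(cos_t) < 1e-6:
        return math.inf                  # flat interface: no optical power
    R = a / cos_t                        # interface radius of curvature (spherical-cap approx.)
    return R / dn                        # simple single-surface thin-lens approximation

for V in (0, 20, 40, 60):
    print(f"{V:3d} V -> f ~ {focal_length(V) * 1e3:8.1f} mm")
```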
Surface geometry and optical aberrations of ex-vivo crystalline lenses
NASA Astrophysics Data System (ADS)
Bueno, Juan M.; Schwarz, Christina; Acosta, Eva; Artal, Pablo
2010-02-01
The shape of the surfaces of ex-vivo human crystalline lenses was measured using a shadow photography technique. From these data, the back-focal distance and the contribution of each surface to the main optical aberrations of the lenses were estimated. The aberrations of the lenses were measured separately with two complementary techniques: a Hartmann-Shack wavefront sensor and a point-diffraction interferometer. A laser scanning set-up was also used to measure the actual back-focal length as well as the phase aberration in one meridian section of the lenses. Measured and predicted back-focal length agreed well within the experimental errors. The lens aberrations computed with a ray-tracing approach from the measured surfaces and geometrical data only reproduce quantitatively the measured aberrations.
The NOAO NEWFIRM Data Handling System
NASA Astrophysics Data System (ADS)
Zárate, N.; Fitzpatrick, M.
2008-08-01
The NOAO Extremely Wide-Field IR Mosaic (NEWFIRM) is a new 1-2.4 micron IR camera that is now being commissioned for the 4m Mayall telescope at Kitt Peak. The focal plane consists of a 2x2 mosaic of 2048x2048 arrays offering a field-of-view of 27.6' on a side. The use of dual MONSOON array controllers permits very fast readout, and a scripting interface allows for highly efficient observing modes. We describe the Data Handling System (DHS) for the NEWFIRM camera, which is designed to meet the performance requirements of the instrument as well as the observing environment in which it operates. It is responsible for receiving the data stream from the detector and instrument software, rectifying the image geometry, presenting a real-time display of the image to the user, final assembly of a science-grade image with complete headers, as well as triggering automated pipeline and archival functions. The DHS uses an event-based messaging system to control multiple processes on a distributed network of machines. The asynchronous nature of this processing means the DHS operates independently from the camera readout, and the design of the system is inherently scalable to larger focal planes that use a greater number of array controllers. Current status and future plans for the DHS are also discussed.
Status and performance of HST/Wide Field Camera 3
NASA Astrophysics Data System (ADS)
Kimble, Randy A.; MacKenty, John W.; O'Connell, Robert W.
2006-06-01
Wide Field Camera 3 (WFC3) is a powerful UV/visible/near-infrared camera currently in development for installation into the Hubble Space Telescope. WFC3 provides two imaging channels. The UVIS channel features a 4096 x 4096 pixel CCD focal plane covering 200 to 1000 nm wavelengths with a 160 x 160 arcsec field of view. The UVIS channel provides unprecedented sensitivity and field of view in the near ultraviolet for HST. It is particularly well suited for studies of the star formation history of local galaxies and clusters, searches for Lyman alpha dropouts at moderate redshift, and searches for low surface brightness structures against the dark UV sky background. The IR channel features a 1024 x 1024 pixel HgCdTe focal plane covering 800 to 1700 nm with a 139 x 123 arcsec field of view, providing a major advance in IR survey efficiency for HST. IR channel science goals include studies of dark energy, galaxy formation at high redshift, and star formation. The instrument is being prepared for launch as part of HST Servicing Mission 4, tentatively scheduled for late 2007, contingent upon formal approval of shuttle-based servicing after successful shuttle return-to-flight. We report here on the status and performance of WFC3.
Automated Meteor Fluxes with a Wide-Field Meteor Camera Network
NASA Technical Reports Server (NTRS)
Blaauw, R. C.; Campbell-Brown, M. D.; Cooke, W.; Weryk, R. J.; Gill, J.; Musci, R.
2013-01-01
Within NASA, the Meteoroid Environment Office (MEO) is charged to monitor the meteoroid environment in near-Earth space for the protection of satellites and spacecraft. The MEO has recently established a two-station system to calculate automated meteor fluxes in the millimeter-size range. The cameras each consist of a 17 mm focal length Schneider lens on a Watec 902H2 Ultimate CCD video camera, producing a 21.7 x 16.3 degree field of view. This configuration has a red-sensitive limiting meteor magnitude of about +5. The stations are located in the southeastern USA, 31.8 kilometers apart, and are aimed at a location 90 km above a point 50 km equidistant from each station, which optimizes the common volume. Both single-station and double-station fluxes are found, each having benefits; more meteors will be detected in a single camera than will be seen in both cameras, producing a better-determined flux, but double-station detections allow for non-ambiguous shower associations and permit speed/orbit determinations. Video from the cameras is fed into Linux computers running the ASGARD (All Sky and Guided Automatic Real-time Detection) software, created by Rob Weryk of the University of Western Ontario Meteor Physics Group. ASGARD performs the meteor detection/photometry, and invokes the MILIG and MORB codes to determine the trajectory, speed, and orbit of the meteor. A subroutine in ASGARD allows for approximate shower identification in single-station meteors. The ASGARD output is used in routines to calculate the flux in units of #/sq km/hour. The flux algorithm employed here differs from others currently in use in that it does not assume a single height for all meteors observed in the common camera volume. In the MEO system, the volume is broken up into a set of height intervals, with the collecting areas determined by the radiant of the active shower or sporadic source. The flux per height interval is summed to obtain the total meteor flux. As ASGARD also computes the meteor mass from the photometry, a mass flux can also be calculated. Weather conditions in the southeastern United States are seldom ideal, which introduces the difficulty of a variable sky background. First, a weather algorithm indicates whether sky conditions are clear enough to calculate fluxes, at which point a limiting-magnitude algorithm is employed. The limiting-magnitude algorithm performs a fit of stellar magnitudes vs. camera intensities. The stellar limiting magnitude is derived from this and easily converted to a limiting meteor magnitude for the active shower or sporadic source.
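As an illustrative sketch of the height-binned flux sum described above (not the MEO/ASGARD code; the interval counts, collecting areas, and observing time are hypothetical placeholders):

    # Illustrative sketch of a height-binned meteor flux sum (hypothetical numbers).
    def total_flux(counts, areas_km2, hours):
        """Sum per-height-interval fluxes (#/km^2/hr) to get the total flux."""
        flux = 0.0
        for n, area in zip(counts, areas_km2):
            flux += n / (area * hours)   # flux contributed by this height interval
        return flux

    # Hypothetical example: three height intervals observed for 4 clear hours.
    counts    = [12, 31, 9]              # meteors detected per interval
    areas_km2 = [450.0, 620.0, 380.0]    # collecting area of each interval for the active radiant
    print(f"total flux = {total_flux(counts, areas_km2, 4.0):.4f} meteors/km^2/hr")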
Characterizing volcanic activity: Application of freely-available webcams
NASA Astrophysics Data System (ADS)
Dehn, J.; Harrild, M.; Webley, P. W.
2017-12-01
In recent years, freely-available web-based cameras, or webcams, have become more readily available, allowing an increased level of monitoring at active volcanoes across the globe. While these cameras have been extensively used as qualitative tools, they provide a unique dataset for performing quantitative analyses of the changing behavior of the particular volcano within the camera's field of view. We focus on the multitude of these freely-available webcams and present a new algorithm to detect changes in volcanic activity using nighttime webcam data. Our approach uses a quick, efficient, and fully automated algorithm to identify changes in webcam data in near real-time, including techniques such as edge detection, Gaussian mixture models, and temporal/spatial statistical tests, which are applied to each target image. Often the image metadata (exposure, gain settings, aperture, focal length, etc.) are unknown, so we developed our algorithm to identify the quantity of volcanically incandescent pixels, as well as the number of specific algorithm tests needed to detect thermal activity, instead of directly correlating webcam brightness to eruption temperatures. We compared our algorithm results to a manual analysis of webcam data for several volcanoes and determined a false detection rate of less than 3% for the automated approach. In our presentation, we describe the different tests integrated into our algorithm, lessons learned, and how we applied our method to several volcanoes across the North Pacific during its development and implementation. We will finish with a discussion on the global applicability of our approach and how to build a 24/7, 365-day-a-year tool that can be used as an additional data source for real-time analysis of volcanic activity.
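A minimal sketch of the kind of nighttime incandescence counting mentioned above, assuming an 8-bit grayscale frame and a hypothetical brightness threshold; the published algorithm layers edge detection, Gaussian mixture models, and temporal/spatial statistics on top of a test like this:

    # Minimal sketch: count potentially incandescent pixels in a nighttime webcam frame.
    # Threshold and frame are hypothetical placeholders.
    import numpy as np

    def incandescent_pixel_count(frame: np.ndarray, threshold: int = 200) -> int:
        """Return the number of pixels brighter than `threshold` in an 8-bit grayscale frame."""
        return int(np.count_nonzero(frame >= threshold))

    # Hypothetical frame: mostly dark sky with a small bright (hot) region.
    frame = np.zeros((480, 640), dtype=np.uint8)
    frame[300:310, 400:420] = 230
    print(incandescent_pixel_count(frame))  # -> 200 pixels flagged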
Variable-focus liquid lens for miniature cameras
NASA Astrophysics Data System (ADS)
Kuiper, S.; Hendriks, B. H. W.
2004-08-01
The meniscus between two immiscible liquids can be used as an optical lens. A change in curvature of this meniscus by electrowetting leads to a change in focal distance. It is demonstrated that two liquids in a tube form a self-centered lens with a high optical quality. The motion of the lens during a focusing action was studied by observation through the transparent tube wall. Finally, a miniature achromatic camera module was designed and constructed based on this adjustable lens, showing that it is excellently suited for use in portable applications.
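For context, in the thin-lens limit a single spherical refracting surface of radius R between liquids of refractive indices n1 and n2 has an effective power of (n2 - n1)/R, so f = R/(n2 - n1). The sketch below applies this textbook relation with hypothetical liquid properties; it is not the authors' design data.

    # Thin-lens estimate for a liquid-liquid meniscus: a single refracting surface
    # of radius R separating media with indices n1 and n2 has power (n2 - n1)/R.
    # Numbers below are hypothetical, not taken from the paper.
    def meniscus_focal_length(radius_m: float, n1: float, n2: float) -> float:
        return radius_m / (n2 - n1)

    # Example: water-like liquid (n1 = 1.33) against an oil-like liquid (n2 = 1.50),
    # meniscus radius of curvature 2 mm.
    f = meniscus_focal_length(2e-3, 1.33, 1.50)
    print(f"focal length ~ {f*1e3:.1f} mm")  # ~11.8 mm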
Photogrammetry of the Map Instrument in a Cryogenic Vacuum Environment
NASA Technical Reports Server (NTRS)
Hill, M.; Packard, E.; Pazar, R.
2000-01-01
MAP Instrument requirements dictated that the instrument's Focal Plane Assembly (FPA) and Thermal Reflector System (TRS) maintain a high degree of structural integrity at operational temperatures (< 50 K). To verify integrity at these extremes, an elaborate test fixture was constructed to provide a large cryogenic (< 20 K) radiative environment and a mobile photogrammetry camera. This paper will discuss MAP's instrument requirements, how those requirements were verified using photogrammetry, and the test setup used to provide the environment and camera movement needed to verify the instrument's requirements.
Color camera computed tomography imaging spectrometer for improved spatial-spectral image accuracy
NASA Technical Reports Server (NTRS)
Wilson, Daniel W. (Inventor); Bearman, Gregory H. (Inventor); Johnson, William R. (Inventor)
2011-01-01
Computed tomography imaging spectrometers ("CTIS"s) having color focal plane array detectors are provided. The color FPA detector may comprise a digital color camera including a digital image sensor, such as a Foveon X3® digital image sensor or a Bayer color filter mosaic. In another embodiment, the CTIS includes a pattern imposed either directly on the object scene being imaged or at the field stop aperture. The use of a color FPA detector and the pattern improves the accuracy of the captured spatial and spectral information.
Exact optics - III. Schwarzschild's spectrograph camera revised
NASA Astrophysics Data System (ADS)
Willstrop, R. V.
2004-03-01
Karl Schwarzschild identified a system of two mirrors, each defined by conic sections, free of third-order spherical aberration, coma and astigmatism, and with a flat focal surface. He considered it impractical, because the field was too restricted. This system was rediscovered as a quadratic approximation to one of Lynden-Bell's `exact optics' designs which have wider fields. Thus the `exact optics' version has a moderate but useful field, with excellent definition, suitable for a spectrograph camera. The mirrors are strongly aspheric in both the Schwarzschild design and the exact optics version.
Focal length calibration of an electrically tunable lens by digital holography.
Wang, Zhaomin; Qu, Weijuan; Yang, Fang; Asundi, Anand Krishna
2016-02-01
The electrically tunable lens (ETL) is a novel current-controlled adaptive optical component which can continuously tune its focus in a specific range by changing its surface curvature. To quantitatively characterize its tuning power, here we assume the ETL to be a pure phase object and present a novel calibration method to dynamically measure its wavefront by use of digital holographic microscopy (DHM). The least squares method is then used to fit the radius of curvature of the wavefront. The focal length is obtained by substituting the radius into the Zemax model of the ETL. The behavior curve between the focal length of the ETL and its driving current is drawn, and a quadratic mathematical model is set up to characterize it. To verify our model, an ETL and offset lens combination is proposed and applied to ETL-based transport of intensity equation (TIE) phase retrieval microscopy. The experimental result demonstrates that the calibration works well in TIE phase retrieval in comparison with the phase measured by DHM.
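As an illustration of the quadratic current-to-focal-length model described above, the following sketch fits f(I) = a*I^2 + b*I + c to calibration points with numpy; the data points and resulting coefficients are hypothetical placeholders, not the paper's measurements.

    # Fit a quadratic model f(I) = a*I^2 + b*I + c to (current, focal length) pairs.
    # Calibration points below are hypothetical placeholders.
    import numpy as np

    current_mA   = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
    focal_len_mm = np.array([210.0, 150.0, 118.0, 98.0, 85.0])

    a, b, c = np.polyfit(current_mA, focal_len_mm, deg=2)
    print(f"f(I) ~ {a:.4e}*I^2 + {b:.4f}*I + {c:.2f}  (I in mA, f in mm)")

    # Predict the focal length at an intermediate drive current.
    I = 175.0
    print(f"f({I} mA) ~ {np.polyval([a, b, c], I):.1f} mm")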
Modeling of thermal lensing in side and end-pumped finite solid-state laser rods. M.S. Thesis
NASA Technical Reports Server (NTRS)
Brackett, Vincent G.
1990-01-01
An analytical expression for approximating the time-dependent thermal focal length in finite solid state laser rods was derived. The analysis is based on the temperature variation of the material refractive index caused by optical pumping of these rods. Several quantities were found to be relevant to this analysis. These quantities were the specific thermal profiles of the rods, type of optical pumping employed, type of cooling scheme employed (side and end-cooling parameters), and the specific material characteristics of the rods. The Thermal Lensing Model was formulated using the geometric ray tracing approach. The focal lengths are then approximated, by calculating the phase shift in the index of refraction, as the different rays of an incident plane wave are tracked through a lens-like crystal medium. The approach also applies in the case of Gaussian or parabolic pump beams. It is shown that the prediction of thermal focal length is in good quantitative agreement with experimentally obtained data.
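For orientation, a commonly quoted steady-state estimate for the thermal focal length of an end-pumped rod is reproduced below in LaTeX. This is a standard literature result (an Innocenzi-type expression) given here only for context; it is not necessarily the time-dependent formula derived in the thesis.

    f_{th} \approx \frac{\pi K_c w_p^{2}}{P_h\,(dn/dT)}\;\frac{1}{1 - e^{-\alpha l}}
    % K_c: thermal conductivity, w_p: pump-beam radius, P_h: heat power deposited in the rod,
    % dn/dT: thermo-optic coefficient, \alpha: absorption coefficient, l: rod length.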
Application of preconditioned alternating direction method of multipliers in depth from focal stack
NASA Astrophysics Data System (ADS)
Javidnia, Hossein; Corcoran, Peter
2018-03-01
Postcapture refocusing effect in smartphone cameras is achievable using focal stacks. However, the accuracy of this effect is totally dependent on the combination of the depth layers in the stack. The accuracy of the extended depth of field effect in this application can be improved significantly by computing an accurate depth map, which has been an open issue for decades. To tackle this issue, a framework is proposed based on a preconditioned alternating direction method of multipliers for depth from the focal stack and synthetic defocus application. In addition to its ability to provide high structural accuracy, the optimization function of the proposed framework can, in fact, converge faster and better than state-of-the-art methods. The qualitative evaluation has been done on 21 sets of focal stacks and the optimization function has been compared against five other methods. Later, 10 light field image sets have been transformed into focal stacks for quantitative evaluation purposes. Preliminary results indicate that the proposed framework has a better performance in terms of structural accuracy and optimization in comparison to the current state-of-the-art methods.
Focus control system for stretched-membrane mirror module
Butler, B.L.; Beninga, K.J.
1991-05-21
A focus control system dynamically sets and controls the focal length of a reflective membrane supported between a perimeter frame. A rear membrane is also supported between the perimeter frame rearward and spaced apart from a back side of the reflective membrane. The space between the membranes defines a plenum space into which a mass of gas at a first pressure is inserted. The pressure differential between the first pressure and an external pressure, such as the atmospheric pressure, causes the reflective membrane to assume a first curvature relative to a reference plane associated with the perimeter frame. This curvature defines the focal length of the reflective membrane. The focal length is dynamically controlled by changing the volume of the plenum space, thereby changing the first pressure. The system can be used to change or maintain the pressure differential and hence the front membrane curvature. The plenum volume is changed by pushing or pulling on a central section of the rear membrane using a suitable actuator. Sensing means continuously sense the location of the reflective membrane relative to the reference plane. This sensed position is compared to a reference position, and a resulting error signal, comprising the difference between the sensed position and reference position, drives the actuator in a direction to minimize the difference. A vent valve compensates for temperature changes or leaks in the closed volume by allowing the pressure differential to be adjusted as required to center the working range of the actuator about the desired focal length. 13 figures.
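A minimal sketch of the error-driven actuator loop described above, with hypothetical gain, setpoint, sensor values, and plant response; the patent does not specify a control law, so simple proportional action is assumed here purely for illustration.

    # Illustrative proportional feedback loop for the membrane focus-control concept:
    # drive the actuator so as to minimize (sensed position - reference position).
    # Gain, setpoint, and plant response are hypothetical assumptions.
    def control_step(sensed_mm: float, reference_mm: float, gain: float = 0.5) -> float:
        """Return an actuator command proportional to the position error."""
        error = sensed_mm - reference_mm
        return -gain * error  # push/pull the rear membrane to reduce the error

    position = 1.20   # sensed membrane position relative to the reference plane (mm)
    setpoint = 1.00   # position corresponding to the desired focal length (mm)
    for _ in range(5):
        command = control_step(position, setpoint)
        position += 0.8 * command  # crude model of the membrane's response
        print(f"position = {position:.3f} mm")  # converges toward the setpoint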
Focus control system for stretched-membrane mirror module
Butler, Barry L.; Beninga, Kelly J.
1991-01-01
A focus control system dynamically sets and controls the focal length of a reflective membrane supported between a perimeter frame. A rear membrane is also supported between the perimeter frame rearward and spaced apart from a back side of the reflective membrane. The space between the membranes defines a plenum space into which a mass of gas at a first pressure is inserted. The pressure differential between the first pressure and an external pressure, such as the atmospheric pressure, causes the reflective membrane to assume a first curvature relative to a reference plane associated with the perimeter frame. This curvature defines the focal length of the reflective membrane. The focal length is dynamically controlled by changing the volume of the plenum space, thereby changing the first pressure. The system can be used to change or maintain the pressure differential and hence the front membrane curvature. The plenum volume is changed by pushing or pulling on a central section of the rear membrane using a suitable actuator. Sensing means continuously sense the location of the reflective membrane relative to the reference plane. This sensed position is compared to a reference position, and a resulting error signal, comprising the difference between the sensed position and reference position, drives the actuator in a direction to minimize the difference. A vent valve compensates for temperature changes or leaks in the closed volume by allowing the pressure differential to be adjusted as required to center the working range of the actuator about the desired focal length.
A planar lens based on the electrowetting of two immiscible liquids
NASA Astrophysics Data System (ADS)
Liu, Chao-Xuan; Park, Jihwan; Choi, Jin-Woo
2008-03-01
This paper reports the development and characterization of a planar liquid lens based on electrowetting. The working concept of electrowetting two immiscible liquids is demonstrated with measurement and characterization of contact angles with regard to externally applied electric voltages. Consequently, a planar liquid lens is designed and implemented based on this competitive electrowetting. A droplet of silicone oil confined in an aqueous solution (1% KCl) works as a liquid lens. Electrowetting then controls the shape of the confined silicone oil and the focal length of the liquid lens varies depending upon an applied dc voltage. A unique feature of this lens design is the double-ring planar electrodes beneath the hydrophobic substrate. While an outer ring electrode provides an initial boundary for the silicone oil droplet, an inner ring works as the actuation electrode for the lens. Further, the planar electrodes, instead of vertical or out-of-plane wall electrodes, facilitate the integration of liquid lenses into microfluidic systems. With the voltage applied in the range of 50-250 V, the confined silicone oil droplet changed its shape and the optical magnification of a 3 mm-diameter liquid lens was clearly demonstrated. Moreover, focal lengths of liquid lenses with diameters of 2 mm, 3 mm and 4 mm were characterized, respectively. The obtained results suggest that a larger lens diameter yields a longer focal length and a wider range of focal length change in response to voltage. The demonstrated liquid lens has a simple structure and is easy to fabricate.
System Architecture of the Dark Energy Survey Camera Readout Electronics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, Theresa; /FERMILAB; Ballester, Otger
2010-05-27
The Dark Energy Survey makes use of a new camera, the Dark Energy Camera (DECam). DECam will be installed in the Blanco 4M telescope at Cerro Tololo Inter-American Observatory (CTIO). DECam is presently under construction and is expected to be ready for observations in the fall of 2011. The focal plane will make use of 62 2Kx4K fully depleted Charge-Coupled Devices (CCDs) for imaging and 12 2Kx2K CCDs for guiding, alignment and focus. This paper will describe design considerations of the system, including the entire signal path used to read out the CCDs, the development of a custom crate and backplane, the overall grounding scheme, and early results of system tests.
InGaAs focal plane arrays for low-light-level SWIR imaging
NASA Astrophysics Data System (ADS)
MacDougal, Michael; Hood, Andrew; Geske, Jon; Wang, Jim; Patel, Falgun; Follman, David; Manzo, Juan; Getty, Jonathan
2011-06-01
Aerius Photonics will present their latest developments in large InGaAs focal plane arrays, which are used for low-light-level imaging in the short wavelength infrared (SWIR) regime. Aerius will present imaging in both 1280x1024 and 640x512 formats. Aerius will present characterization of the FPA, including dark current measurements. Aerius will also show the results of development of SWIR FPAs for high temperatures, including imagery and dark current data. Finally, Aerius will show results of using the SWIR camera with Aerius' SWIR illuminators using VCSEL technology.
Design of a long focal length mid-wavelength infrared optical system
NASA Astrophysics Data System (ADS)
Duan, Jing; Zhang, Zhanpeng; Liu, Kai; Shan, Qiusha; Jiang, Kai; Yan, Peipei
2018-02-01
Based on a 640×512 cooled staring focal plane array (FPA) detector with a pixel size of 15 μm × 15 μm, a long focal length mid-wavelength infrared optical system was designed. In this paper, the working waveband is 3 μm–5 μm and the temperature range is -30 °C to +50 °C; the system realizes a 1000 mm focal length with an F-number of 4 and a full field of view of 0.70°, and satisfies 100% cold-shield efficiency. A re-imaging refractive configuration was adopted; the optical system consists of a main objective group and a projection group. First, the structural selection and the initial parameter calculation are introduced. Second, to accommodate the temperature variation, a focusing lens is included in the system to keep the image sharp. Finally, to improve image quality and environmental adaptability, the analysis of temperature change and ghost images is described in detail. The design results show that at the spatial frequency of 33 lp/mm the on-axis MTF of the optical system is greater than 0.35; the system offers high resolution and excellent image quality, and has the advantages of good adaptability, simple structure, ease of adjustment, and high transmittance.
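As a quick consistency check (independent arithmetic, not text from the paper), the quoted 0.70° full field of view matches the detector diagonal divided by the 1000 mm focal length:

    # Consistency check: full diagonal field of view of a 640x512 FPA with 15 um pixels
    # behind a 1000 mm focal length lens. (Independent arithmetic, not from the paper.)
    import math

    nx, ny, pitch_um, f_mm = 640, 512, 15.0, 1000.0
    diag_mm = math.hypot(nx, ny) * pitch_um * 1e-3            # ~12.29 mm detector diagonal
    fov_deg = 2 * math.degrees(math.atan(diag_mm / (2 * f_mm)))
    print(f"full diagonal FOV ~ {fov_deg:.2f} deg")           # ~0.70 deg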
Improved Scanners for Microscopic Hyperspectral Imaging
NASA Technical Reports Server (NTRS)
Mao, Chengye
2009-01-01
Improved scanners to be incorporated into hyperspectral microscope-based imaging systems have been invented. Heretofore, in microscopic imaging, including spectral imaging, it has been customary to either move the specimen relative to the optical assembly that includes the microscope or else move the entire assembly relative to the specimen. It becomes extremely difficult to control such scanning when submicron translation increments are required, because the high magnification of the microscope enlarges all movements in the specimen image on the focal plane. To overcome this difficulty, in a system based on this invention, no attempt would be made to move either the specimen or the optical assembly. Instead, an objective lens would be moved within the assembly so as to cause translation of the image at the focal plane: the effect would be equivalent to scanning in the focal plane. The upper part of the figure depicts a generic proposed microscope-based hyperspectral imaging system incorporating the invention. The optical assembly of this system would include an objective lens (normally, a microscope objective lens) and a charge-coupled-device (CCD) camera. The objective lens would be mounted on a servomotor-driven translation stage, which would be capable of moving the lens in precisely controlled increments, relative to the camera, parallel to the focal-plane scan axis. The output of the CCD camera would be digitized and fed to a frame grabber in a computer. The computer would store the frame-grabber output for subsequent viewing and/or processing of images. The computer would contain a position-control interface board, through which it would control the servomotor. There are several versions of the invention. An essential feature common to all versions is that the stationary optical subassembly containing the camera would also contain a spatial window, at the focal plane of the objective lens, that would pass only a selected portion of the image. In one version, the window would be a slit, the CCD would contain a one-dimensional array of pixels, and the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion. The image built up by scanning in this case would be an ordinary (non-spectral) image. In another version, the optics of which are depicted in the lower part of the figure, the spatial window would be a slit, the CCD would contain a two-dimensional array of pixels, the slit image would be refocused onto the CCD by a relay-lens pair consisting of a collimating and a focusing lens, and a prism-grating-prism optical spectrometer would be placed between the collimating and focusing lenses. Consequently, the image on the CCD would be spatially resolved along the slit axis and spectrally resolved along the axis perpendicular to the slit. As in the first-mentioned version, the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion.
Plenoptic Imager for Automated Surface Navigation
NASA Technical Reports Server (NTRS)
Zollar, Byron; Milder, Andrew; Mayo, Michael
2010-01-01
An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprised of a main aperture lens, a mechanical structure that holds an array of microlenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the microlenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.
The TolTEC Camera for the LMT Telescope
NASA Astrophysics Data System (ADS)
Bryan, Sean
2018-01-01
TolTEC is a new camera being built for the 50-meter Large Millimeter-wave Telescope (LMT) on Sierra Negra in Puebla, Mexico. The instrument will discover and characterize distant galaxies by detecting the thermal emission of dust heated by starlight. The polarimetric capabilities of the camera will measure magnetic fields in star-forming regions in the Milky Way. The optical design of the camera uses mirrors, lenses, and dichroics to simultaneously couple a 4 arcminute diameter field of view onto three single-band focal planes at 150, 220, and 280 GHz. The 7000 polarization-selective detectors are single-band horn-coupled LEKID detectors fabricated at NIST. A rotating half wave plate operates at ambient temperature to modulate the polarized signal. In addition to the galactic and extragalactic surveys already planned, TolTEC installed at the LMT will provide open observing time to the community.
A Combined Laser-Communication and Imager for Microspacecraft (ACLAIM)
NASA Technical Reports Server (NTRS)
Hemmati, H.; Lesh, J.
1998-01-01
ACLAIM is a multi-function instrument consisting of a laser communication terminal and an imaging camera that share a common telescope. A single APS- (Active Pixel Sensor) based focal-plane-array is used to perform both the acquisition and tracking (for laser communication) and science imaging functions.
50 CFR 217.55 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... MAMMALS INCIDENTAL TO SPECIFIED ACTIVITIES Taking of Marine Mammals Incidental To Target and Missile... the following monitoring measures: (1) Visual land-based monitoring. (i) Prior to each missile launch... located varying distances from the missile launch site. Each video camera will be set to record a focal...
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2013 CFR
2013-10-01
... IMPORTING OF MARINE MAMMALS Taking Of Marine Mammals Incidental To Missile Launch Activities from San... monitoring measures: (1) Visual Land-Based Monitoring. (i) Prior to each missile launch, an observer(s) will... from the missile launch site. Each video camera will be set to record a focal subgroup within the...
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2012 CFR
2012-10-01
... IMPORTING OF MARINE MAMMALS Taking Of Marine Mammals Incidental To Missile Launch Activities from San... monitoring measures: (1) Visual Land-Based Monitoring. (i) Prior to each missile launch, an observer(s) will... from the missile launch site. Each video camera will be set to record a focal subgroup within the...
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... IMPORTING OF MARINE MAMMALS Taking Of Marine Mammals Incidental To Missile Launch Activities from San... monitoring measures: (1) Visual Land-Based Monitoring. (i) Prior to each missile launch, an observer(s) will... from the missile launch site. Each video camera will be set to record a focal subgroup within the...
Military Applications of Curved Focal Plane Arrays Developed by the HARDI Program
2011-01-01
considered one of the main founders of geometrical optics, modern photography, and cinematography. Among his inventions are the Petzval portrait lens...still be a problem. B. HARDI Program/Institute for Defense Analyses (IDA) Task 1. HARDI Program State-of-the-art cameras could be improved by
Analysis of Fresnel Zone Plates Focusing Dependence on Operating Frequency
Fuster, José Miguel; Candelas, Pilar; Castiñeira-Ibáñez, Sergio; Pérez-López, Sergio
2017-01-01
The focusing properties of Fresnel Zone Plates (FZPs) against frequency are analyzed in this work. It is shown that the FZP focal length depends almost linearly on the operating frequency. Focal depth and focal distortion are also considered, establishing a limit on the frequency span at which the operating frequency can be shifted. An underwater FZP ultrasound focusing system is demonstrated, and experimental results agree with the theoretical analysis and simulations. PMID:29206137
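For intuition, the standard paraxial zone-plate relation gives F ≈ r1²/λ = r1²·f/c, so the focal length scales nearly linearly with the operating frequency f, consistent with the result above. The sketch below evaluates this textbook relation for a hypothetical underwater ultrasound zone plate; the first-zone radius and frequencies are assumptions, not the paper's parameters.

    # Paraxial Fresnel zone plate relation: F ~ r1^2 / wavelength = r1^2 * f / c,
    # so the focal length scales (nearly) linearly with operating frequency.
    # Parameters below are hypothetical, for an underwater ultrasound example.
    c_water = 1480.0      # speed of sound in water, m/s
    r1 = 30e-3            # first zone radius, m (assumed)

    def focal_length(freq_hz: float) -> float:
        wavelength = c_water / freq_hz
        return r1**2 / wavelength

    for f_khz in (200, 250, 300):
        print(f"{f_khz} kHz -> F ~ {focal_length(f_khz * 1e3) * 100:.1f} cm")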
Broadband diffractive lens or imaging element
Ceglio, Natale M.; Hawryluk, Andrew M.; London, Richard A.; Seppala, Lynn G.
1991-01-01
A broadband diffractive lens or imaging element produces a sharp focus and/or a high resolution image with broad bandwidth illuminating radiation. The diffractive lens is sectored or segmented into regions, each of which focuses or images a distinct narrowband of radiation but all of which have a common focal length. Alternatively, a serial stack of minus filters, each with a diffraction pattern which focuses or images a distinct narrowband of radiation but all of which have a common focal length, is used. The two approaches can be combined. Multifocal broadband diffractive elements can also be formed.
Micro-optic lens for data storage
NASA Technical Reports Server (NTRS)
Milster, T. D.; Trusty, R. M.; Wang, M. S.; Froehlich, F. F.; Erwin, J. Kevin
1991-01-01
A new type of microlens for data storage applications that has improved off-axis performance is described. The lens consists of a micro Fresnel pattern on a curved substrate. The radius of the substrate is equal to the focal length of the lens. If the pattern and substrate are thin, the combination satisfies the Abbe sine condition. Therefore, the lens is free of coma. We analyze a 0.5 numerical aperture, 0.50 mm focal length lens in detail. A 0.16 numerical aperture lens was fabricated holographically, and results are presented.
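For reference (textbook optics background, stated here as a reminder rather than taken from the paper), the Abbe sine condition for an object at infinity requires every ray of entrance height h to converge toward the focus at an angle θ satisfying

    h = f \sin\theta

Diffractive zones laid on a spherical substrate of radius equal to f realize this mapping by construction, which is why the combination described above is free of coma.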
Characterization of a long-focal-length polycapillary optic for high-energy x-rays
NASA Astrophysics Data System (ADS)
Cari, Padiyar; Suparmi, -; Padiyar, Sushil D.; Gibson, Walter M.; MacDonald, Carolyn A.; Alexander, Cheryl D.; Joy, Marshall K.; Russell, Christine H.; Chen, Zewu
2000-11-01
Polycapillary fibers and a prototype collector for high energy x rays with a 2 m focal length have been fabricated and characterized. Measurements of a prototype collector, performed in collimating mode, show that the optic has high transmission, good uniformity, and small exit divergence. The transmission as a function of energy was analyzed using an extended single fiber geometrical optic simulation and the result shows that the simulation fits the data fairly well. Scatter transmission and contrast enhancement were measured in focusing mode using a parallel beam input.
Optics for MUSIC: a new (sub)millimeter camera for the Caltech Submillimeter Observatory
NASA Astrophysics Data System (ADS)
Sayers, Jack; Czakon, Nicole G.; Day, Peter K.; Downes, Thomas P.; Duan, Ran P.; Gao, Jiansong; Glenn, Jason; Golwala, Sunil R.; Hollister, Matt I.; LeDuc, Henry G.; Mazin, Benjamin A.; Maloney, Philip R.; Noroozian, Omid; Nguyen, Hien T.; Schlaerth, James A.; Siegel, Seth; Vaillancourt, John E.; Vayonakis, Anastasios; Wilson, Philip R.; Zmuidzinas, Jonas
2010-07-01
We will present the design and implementation, along with calculations and some measurements of the performance, of the room-temperature and cryogenic optics for MUSIC, a new (sub)millimeter camera we are developing for the Caltech Submm Observatory (CSO). The design consists of two focusing elements in addition to the CSO primary and secondary mirrors: a warm off-axis elliptical mirror and a cryogenic (4 K) lens. These optics will provide a 14 arcmin field of view that is diffraction limited in all four of the MUSIC observing bands (2.00, 1.33, 1.02, and 0.86 mm). A cold (4 K) Lyot stop will be used to define the primary mirror illumination, which will be maximized while keeping spillover at the sub-1% level. The MUSIC focal plane will be populated with broadband phased antenna arrays that efficiently couple over a factor of approximately 3 in bandwidth, and each pixel on the focal plane will be read out via a set of four lumped-element filters that define the MUSIC observing bands (i.e., each pixel on the focal plane simultaneously observes in all four bands). Finally, a series of dielectric and metal-mesh low-pass filters have been implemented to reduce the optical power load on the MUSIC cryogenic stages to a quasi-negligible level while maintaining good transmission in-band.
Determining fast orientation changes of multi-spectral line cameras from the primary images
NASA Astrophysics Data System (ADS)
Wohlfeil, Jürgen
2012-01-01
Fast orientation changes of airborne and spaceborne line cameras cannot always be avoided. In such cases it is essential to measure them with high accuracy to ensure a good quality of the resulting imagery products. Several approaches exist to support the orientation measurement by using optical information received through the main objective/telescope. In this article an approach is proposed that allows the determination of non-systematic orientation changes between every captured line. It does not require any additional camera hardware or onboard processing capabilities; only the payload images and a rough estimate of the camera's trajectory are needed. The approach takes advantage of the typical geometry of multi-spectral line cameras with a set of linear sensor arrays for different spectral bands on the focal plane. First, homologous points are detected within the heavily distorted images of different spectral bands. With their help, a connected network of geometrical correspondences can be built up. This network is used to calculate the orientation changes of the camera with the temporal and angular resolution of the camera. The approach was tested with an extensive set of aerial surveys covering a wide range of different conditions and achieved precise and reliable results.
The PanCam instrument on the 2018 Exomars rover: Scientific objectives
NASA Astrophysics Data System (ADS)
Jaumann, Ralf; Coates, Andrew; Hauber, Ernst; Hoffmann, Harald; Schmitz, Nicole; Le Deit, Laetitia; Tirsch, Daniela; Paar, Gerhard; Griffiths, Andrew
2010-05-01
The Exomars Panoramic Camera System is an imaging suite of three camera heads to be mounted on the ExoMars rover's mast, with the boresight 1.8 m above ground. As of the ExoMars Pasteur Payload Design Review (PDR) in 2009, PanCam consists of two identical wide angle cameras (WAC) with fixed focal length lenses, and a high resolution camera (HRC) with an automatic focus mechanism, placed adjacent to the right WAC. The WAC stereo pair provides binocular vision for stereoscopic studies as well as 12 filter positions (per camera) for stereoscopic colour imaging and scientific multispectral studies. The stereo baseline of the pair is 500 mm. The two WAC have 22 mm focal length, f/10 lenses that illuminate detectors with 1024 × 1024 pixels. WAC lenses are fixed, with an optimal focus set to 4 m, and a focus ranging from 1.2 m (corresponding to the nearest view of the calibration target on the rover deck) to infinity. The HRC is able to focus between 0.9 m (distance to a drill core on the rover's sample tray) and infinity. The instantaneous fields of view of WAC and HRC are 580 μrad/pixel and 83 μrad/pixel, respectively. The corresponding resolutions (in mm/pixel) at a distance of 2 m are 1.2 (WAC) and 0.17 (HRC); at a distance of 100 m they are 58 (WAC) and 8.3 (HRC). WAC and HRC will be geometrically co-aligned. The main scientific goal of PanCam is the geologic characterisation of the environment in which the rover is operating, providing the context for investigations carried out by the other instruments of the Pasteur payload. PanCam data will serve as a bridge between orbital data (high-resolution images from HRSC, CTX, and HiRISE, and spectrometer data from OMEGA and CRISM) and the data acquired in situ on the Martian surface. The position of HRC on top of the rover's mast enables the detailed panoramic inspection of surface features over the full horizontal range of 360° even at large distances, an important prerequisite to identify the scientifically most promising targets and to plan the rover's traverse. Key to the success of PanCam is the provision of data that allow the determination of rock lithology, either of boulders on the surface or of outcrops. This task requires high spatial resolution as well as colour capabilities. The stereo images provide complementary information on the three-dimensional properties (i.e. the shape) of rocks. As an example, the degree of rounding of rocks as a result of fluvial transport can reveal the erosional history of the investigated particles, with possible implications on the chronology and intensity of rock-water interaction. The identification of lithology and geological history of rocks will strongly benefit from the co-aligned views of WAC (colour, stereo) and HRC (high spatial resolution), which will ensure that 3D and multispectral information is available together with fine-scale textural information for each scene. Stereo information is also of utmost importance for the determination of outcrop geometry (e.g., strike and dip of layered sequences), which helps to understand the emplacement history of sedimentary and volcanic rocks (e.g., cross-bedding, unconformities, etc.). PanCam will further reveal physical soil properties such as cohesion by imaging sites where the soil is disturbed by the rover's wheels and the drill. Another essential task of PanCam is the imaging of samples (from the drill) before ingestion into the rover for further analysis by other instruments.
PanCam can be tilted vertically and will also study the atmosphere (e.g., dust loading, opacity, clouds) and aeolian processes related to surface-atmosphere interactions, such as dust devils.
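As a quick check of the quoted numbers (my arithmetic, not text from the abstract), multiplying the instantaneous field of view by the target distance reproduces the stated mm/pixel resolutions:

    # Consistency check: ground resolution = IFOV (rad/pixel) x distance.
    # IFOV values are taken from the abstract; the check itself is independent arithmetic.
    ifov = {"WAC": 580e-6, "HRC": 83e-6}   # rad per pixel
    for dist_m in (2.0, 100.0):
        for cam, i in ifov.items():
            print(f"{cam} @ {dist_m:5.1f} m: {i * dist_m * 1e3:6.2f} mm/pixel")
    # -> ~1.16 and 0.17 mm/pixel at 2 m; ~58 and 8.3 mm/pixel at 100 m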
Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)
NASA Astrophysics Data System (ADS)
Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.
The Lunar Reconnaissance Orbiter Mission Camera (LROC) H. Hiesinger (1,2), M.S. Robinson (3), A.S. McEwen (4), E.P. Turtle (4), E.M. Eliason (4), B.L. Jolliff (5), M.C. Malin (6), and P.C. Thomas (7) (1) Brown Univ., Dept. of Geological Sciences, Providence RI 02912, Harald_Hiesinger@brown.edu, (2) Westfaelische Wilhelms-University, (3) Northwestern Univ., (4) LPL, Univ. of Arizona, (5) Washington Univ., (6) Malin Space Science Systems, (7) Cornell Univ. The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter and smaller-scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets such as 1) meter-scale maps of regions of permanent or near permanent illumination of polar massifs; 2) high resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degree) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. LROC consists of two narrow-angle cameras (NACs) which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degree. The NAC readout noise is better than 100 e-, and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25-km long or a 2x2 binned image 100-km long. The WAC has two 6-mm-focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR, and the other in the UV. Each has a cross-track FOV of 90 degree. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible, and a swath width of ~100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e-, and, as with the NAC, pixel values are digitized to 12-bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages at 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 TeraBytes (TB) of raw data.
Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than any previous mission. [1] Malin et al., JGR, 106, 17651-17672, 2001.
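As a rough cross-check of the NAC numbers (independent arithmetic, not from the abstract), spreading the 2.86° field of view over 5000 pixels gives an IFOV of about 10 µrad, i.e. roughly 0.5 m/pixel and a ~2.5 km swath per NAC from the 50-km orbit; the quoted 5-km swath corresponds to the two NACs side by side.

    # Cross-check of the LROC NAC scale from the quoted FOV, pixel count, and orbit altitude.
    import math

    fov_rad  = math.radians(2.86)   # cross-track FOV of one NAC
    n_pixels = 5000                 # CCD line-array length
    altitude = 50_000.0             # nominal orbit altitude, m

    ifov = fov_rad / n_pixels                                    # ~1.0e-5 rad/pixel
    print(f"pixel scale ~ {ifov * altitude:.2f} m")              # ~0.50 m
    print(f"swath per NAC ~ {fov_rad * altitude / 1e3:.2f} km")  # ~2.5 km (x2 NACs ~ 5 km)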
Modeling of pulse-echo inspections of multi-layer panels containing delaminations
NASA Astrophysics Data System (ADS)
Richter, Nathaniel Lawrence
To meet the needs of counter-insurgency operations, the armor of tanks needs to be lighter. This is accomplished by using a combination of materials: metals, composites, and ceramics. This multi-material composite armor, using any combination of the above materials, will need to be inspected for manufacturing error, shipping damage, and combat damage. Nondestructive inspection, particularly ultrasonic inspection, has a long history of successfully inspecting thick composite structures. To more easily develop inspection plans for many-layered composites, a computational model would be of use. A model of this type would need to have the ability to account for multiple material types and flaws that are larger than the beam size. Also, as a result of armor thickness, any model would need to consider attenuation and the effects of focused transducers. This was accomplished by extending the Thompson-Gray Measurement Model for use with multiple layers at normal incidence to the transducer and large planar defects parallel to the layers. Material values of the armor and the characteristics of the transducers were determined for use in the model. The model results are compared to experimentally collected data to show agreement. The model is then used to determine the requirements of a new inspection plan by varying the frequency and focal length of the transducers. The defect reflection amplitudes for 5 MHz with focal lengths in water of 7.5, 8.5, and 9.5 inches are 0.55178, 0.75270, and 0.44836. The same for 10 MHz are 0.12474, 0.21425, and 0.10637. The 8.5 in focal length is also the equivalent thickness in water for the material leading to the defect interface. This focal length would, from theory, cause the greatest amplitude from the defect. This is supported by the results in that the highest amplitude occurs at 8.5 inches for both frequencies. It is also evident that the response at 5 MHz is greater than that at 10 MHz. As such, the 5 MHz transducer with an 8.5 inch focal length is nominal for this inspection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, L; Duke University Medical Center, Durham, NC; Fudan University Shanghai Cancer Center, Shanghai
Purpose: To investigate prostate imaging onboard radiation therapy machines using a novel robotic, 49-pinhole Single Photon Emission Computed Tomography (SPECT) system. Methods: Computer-simulation studies were performed for region-of-interest (ROI) imaging using a 49-pinhole SPECT collimator and for broad cross-section imaging using a parallel-hole SPECT collimator. A male XCAT phantom was computer-simulated in supine position with one 12 mm-diameter tumor added in the prostate. A treatment couch was added to the phantom. Four-minute detector trajectories for imaging a 7 cm-diameter-sphere ROI encompassing the tumor were investigated with different parameters, including pinhole focal length, pinhole diameter and trajectory starting angle. Pseudo-random Poisson noise was included in the simulated projection data, and SPECT images were reconstructed by OSEM with 4 subsets and up to 10 iterations. Images were evaluated by visual inspection, profiles, and Root-Mean-Square-Error (RMSE). Results: The tumor was well visualized above background by the 49-pinhole SPECT system with different pinhole parameters while it was not visible with parallel-hole SPECT imaging. Minimum RMSEs were 0.30 for 49-pinhole imaging and 0.41 for parallel-hole imaging. For parallel-hole imaging, the detector trajectory from right-to-left yielded slightly lower RMSEs than that from posterior to anterior. For 49-pinhole imaging, near-minimum RMSEs were maintained over a broader range of OSEM iterations with a 5 mm pinhole diameter and 21 cm focal length versus a 2 mm diameter pinhole and 18 cm focal length. The detector with 21 cm pinhole focal length had the shortest rotation radius averaged over the trajectory. Conclusion: On-board functional and molecular prostate imaging may be feasible in 4-minute scan times by robotic SPECT. A 49-pinhole SPECT system could improve such imaging as compared to broad cross-section parallel-hole collimated SPECT imaging. Multi-pinhole imaging can be improved by considering pinhole focal length, pinhole diameter, and trajectory starting angle. The project is supported by the NIH grant 5R21-CA156390.
Electro-optical detector for use in a wide mass range mass spectrometer
NASA Technical Reports Server (NTRS)
Giffin, Charles E. (Inventor)
1976-01-01
An electro-optical detector is disclosed for use in a wide mass range mass spectrometer (MS); in the latter, the focal plane is at or very near the exit end of the magnetic analyzer, so that a strong magnetic field of the order of 1000 G or more is present at the focal plane location. The novel detector includes a microchannel electron multiplier array (MCA) which is positioned at the focal plane to convert ion beams which are focused by the MS at the focal plane into corresponding electron beams which are then accelerated to form visual images on a conductive phosphored surface. These visual images are then converted into images on the target of a vidicon camera or the like for electronic processing. Due to the strong magnetic field at the focal plane, in one embodiment of the invention, the MCA with front and back parallel ends is placed so that its front end forms an angle of not less than several degrees, preferably on the order of 10°-20°, with respect to the focal plane, with the center line of the front end preferably located in the focal plane. In another embodiment the MCA is wedge-shaped, with its back end at an angle of about 10°-20° with respect to the front end. In this embodiment the MCA is placed so that its front end is located at the focal plane.
Adaptive liquid microlenses activated by stimuli-responsive hydrogels.
Dong, Liang; Agarwal, Abhishek K; Beebe, David J; Jiang, Hongrui
2006-08-03
Despite its compactness, the human eye can easily focus on different distances by adjusting the shape of its lens with the help of ciliary muscles. In contrast, traditional man-made optical systems achieve focusing by physical displacement of the lenses used. But in recent years, advances in miniaturization technology have led to optical systems that no longer require complicated mechanical systems to tune and adjust optical performance. These systems have found wide use in photonics, displays and biomedical systems. They are either based on arrays of microlenses with fixed focal lengths, or use external control to adjust the microlens focal length. An intriguing example is the tunable liquid lens, where electrowetting or external pressure manipulates the shape of a liquid droplet and thereby adjusts its optical properties. Here we demonstrate a liquid lens system that allows for autonomous focusing. The central component is a stimuli-responsive hydrogel integrated into a microfluidic system and serving as the container for a liquid droplet, with the hydrogel simultaneously sensing the presence of stimuli and actuating adjustments to the shape--and hence focal length--of the droplet. By working at the micrometre scale where ionic diffusion and surface tension scale favourably, we can use pinned liquid-liquid interfaces to obtain stable devices and realize response times of ten to a few tens of seconds. The microlenses, which can have a focal length ranging from -infinity to +infinity (divergent and convergent), are also readily integrated into arrays that may find use in applications such as sensing, medical diagnostics and lab-on-a-chip technologies.
Adaptive liquid microlenses activated by stimuli-responsive hydrogels
NASA Astrophysics Data System (ADS)
Dong, Liang; Agarwal, Abhishek K.; Beebe, David J.; Jiang, Hongrui
2006-08-01
Despite its compactness, the human eye can easily focus on different distances by adjusting the shape of its lens with the help of ciliary muscles. In contrast, traditional man-made optical systems achieve focusing by physical displacement of the lenses used. But in recent years, advances in miniaturization technology have led to optical systems that no longer require complicated mechanical systems to tune and adjust optical performance. These systems have found wide use in photonics, displays and biomedical systems. They are either based on arrays of microlenses with fixed focal lengths, or use external control to adjust the microlens focal length. An intriguing example is the tunable liquid lens, where electrowetting or external pressure manipulates the shape of a liquid droplet and thereby adjusts its optical properties. Here we demonstrate a liquid lens system that allows for autonomous focusing. The central component is a stimuli-responsive hydrogel integrated into a microfluidic system and serving as the container for a liquid droplet, with the hydrogel simultaneously sensing the presence of stimuli and actuating adjustments to the shape, and hence focal length, of the droplet. By working at the micrometre scale where ionic diffusion and surface tension scale favourably, we can use pinned liquid-liquid interfaces to obtain stable devices and realize response times of ten to a few tens of seconds. The microlenses, which can have a focal length ranging from -∞ to +∞ (divergent and convergent), are also readily integrated into arrays that may find use in applications such as sensing, medical diagnostics and lab-on-a-chip technologies.
FOCAL PLANE WAVEFRONT SENSING USING RESIDUAL ADAPTIVE OPTICS SPECKLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Codona, Johanan L.; Kenworthy, Matthew, E-mail: jlcodona@gmail.com
2013-04-20
Optical imperfections, misalignments, aberrations, and even dust can significantly limit sensitivity in high-contrast imaging systems such as coronagraphs. An upstream deformable mirror (DM) in the pupil can be used to correct or compensate for these flaws, either to enhance the Strehl ratio or suppress the residual coronagraphic halo. Measurement of the phase and amplitude of the starlight halo at the science camera is essential for determining the DM shape that compensates for any non-common-path (NCP) wavefront errors. Using DM displacement ripples to create a series of probe and anti-halo speckles in the focal plane has been proposed for space-based coronagraphs and successfully demonstrated in the lab. We present the theory and first on-sky demonstration of a technique to measure the complex halo using the rapidly changing residual atmospheric speckles at the 6.5 m MMT telescope using the Clio mid-IR camera. The AO system's wavefront sensor measurements are used to estimate the residual wavefront, allowing us to approximately compute the rapidly evolving phase and amplitude of the speckle halo. When combined with relatively short, synchronized science camera images, the complex speckle estimates can be used to interferometrically analyze the images, leading to an estimate of the static diffraction halo with NCP effects included. In an operational system, this information could be collected continuously and used to iteratively correct quasi-static NCP errors or suppress imperfect coronagraphic halos.
NASA Technical Reports Server (NTRS)
Mach, Douglas M.; Rust, W. David
1989-01-01
The present device for lightning channel propagation-velocity determination employs eight photodetectors mounted behind precision horizontal slits in the focal plane of a photographic camera lens. The eight photodetector pulses, IRIG-B time, and slow and fast electric field-change waveforms are recorded on a 14-track analog tape recorder. A comparison of the present results with those obtained by a streaking camera shows no significant differences between the velocities obtained from the same strokes with the two systems; neither is there any difference in pulse characteristics or in the velocities calculated from them.
Fabrication and Characterization of Flexible Electrowetting on Dielectrics (EWOD) Microlens
Li, Chenhui; Jiang, Hongrui
2014-01-01
We present a flexible variable-focus converging microlens actuated by electrowetting on dielectric (EWOD). The microlens is made of two immiscible liquids and a soft polymer, polydimethylsiloxane (PDMS). A Parylene intermediate layer is used to produce a robust flexible electrode on PDMS. A low-temperature, PDMS-compatible fabrication process has been developed to reduce the stress on the lens structure. The lens has been demonstrated to conform smoothly to curved surfaces. The focal length of the microlens is 29–38 mm on a flat surface and 31–41 mm on a curved surface, varying with the applied voltage. The resolving power of the microlens is 25.39 line pairs per mm, as measured with a 1951 United States Air Force (USAF) resolution chart, and the lens aberrations are measured by a Shack-Hartmann wavefront sensor. The focal length behavior on a curved surface is discussed; for the lens demonstrated here, the focal length is slightly longer on the curved surface as a result of the curvature of the PDMS substrate. PMID:25360324
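For electrowetting lenses of this kind, the voltage-to-focal-length link can be sketched by chaining the Young-Lippmann equation (contact angle versus voltage), conservation of droplet volume (spherical-cap radius versus contact angle), and a thin plano-convex approximation for the focal length. The sketch below uses placeholder material and geometric parameters and does not describe the fabricated device:

```python
# Sketch: chaining the Young-Lippmann equation, a constant-volume spherical cap,
# and a thin plano-convex approximation to map EWOD voltage to focal length.
# All material and geometric parameters are illustrative assumptions.
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def contact_angle(V, theta0_deg, eps_r, d, gamma):
    """Young-Lippmann: cos(theta) = cos(theta0) + eps_r*EPS0*V^2 / (2*gamma*d)."""
    c = math.cos(math.radians(theta0_deg)) + eps_r * EPS0 * V**2 / (2 * gamma * d)
    return math.acos(max(-1.0, min(1.0, c)))   # radians, clamped to a valid range

def cap_radius(volume, theta):
    """Radius of curvature of a spherical cap of given volume and contact angle."""
    return (3 * volume / (math.pi * (1 - math.cos(theta))**2 * (2 + math.cos(theta))))**(1 / 3)

def focal_length(R, n_drop, n_ambient):
    """Thin plano-convex approximation: f = R / (n_drop - n_ambient)."""
    return R / (n_drop - n_ambient)

volume = 2e-9  # 2 microlitres of lens liquid (assumed)
for V in (0.0, 20.0, 40.0):
    th = contact_angle(V, theta0_deg=140, eps_r=3.0, d=2e-6, gamma=0.04)
    f = focal_length(cap_radius(volume, th), n_drop=1.50, n_ambient=1.33)
    print(f"V = {V:4.0f} V   theta = {math.degrees(th):5.1f} deg   f = {1e3 * f:5.1f} mm")
```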
Numerical study of the medium thickness in the Z-scan technique
NASA Astrophysics Data System (ADS)
Severiano Carrillo, I.; Méndez Otero, M. M.; Arroyo Carrasco, M. L.; Iturbe Castillo, M. D.
2011-09-01
The optical characterization of nonlinear media through the Z-scan technique initially assumes a thin medium (with a thickness much less than the beam depth of focus). It has been observed that increasing the thickness of the medium increases the transmittance, which implies an apparent increase in n2; for this reason we present a numerical model to determine the thin-medium and thick-medium thickness limits. A thin medium is considered as a thin lens with focal length F1, and a thick medium can be regarded as a set of such thin lenses with focal lengths F2, contained in a medium with a refractive index different from that of air. The analysis is carried out in Matlab using the theory of Gaussian beams, ABCD matrices and the q parameter, the elementary theory underlying this work; the main feature of the model is that the nonlinearity type of the medium is represented as an integer constant in its focal length. We present the graphs obtained from Z-scan for thick media with both thermal and Kerr nonlinearities.
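Although the authors implement their model in Matlab, the q-parameter/ABCD bookkeeping it rests on is compact enough to sketch generically. In the sketch below the slice count, slice focal lengths and beam parameters are illustrative assumptions, and the host-medium interfaces used in the paper are deliberately left out:

```python
# Sketch of the q-parameter / ABCD bookkeeping: a thick nonlinear sample treated
# as a stack of thin lenses separated by short propagation steps. Slice count,
# focal lengths and beam parameters are illustrative assumptions; the host-medium
# interfaces used in the paper are not modelled here.
import math

WAVELENGTH = 532e-9  # m

def q_at_waist(waist):
    """Complex beam parameter at a waist: q = i * z_R."""
    return 1j * math.pi * waist**2 / WAVELENGTH

def apply(M, q):
    """Transform q by an ABCD matrix M = ((A, B), (C, D))."""
    (A, B), (C, D) = M
    return (A * q + B) / (C * q + D)

def free_space(d):  return ((1.0, d), (0.0, 1.0))
def thin_lens(f):   return ((1.0, 0.0), (-1.0 / f, 1.0))

def spot_size(q):
    """Beam radius w from 1/q = 1/R - i*lambda/(pi*w^2)."""
    return math.sqrt(-WAVELENGTH / (math.pi * (1.0 / q).imag))

n_slices, f_slice, slice_length = 10, 0.20, 1e-3   # assumed slice model
q = q_at_waist(20e-6)                              # 20 um waist at the sample entrance
for _ in range(n_slices):
    q = apply(thin_lens(f_slice), q)
    q = apply(free_space(slice_length), q)
print(f"beam radius at sample exit: {1e6 * spot_size(q):.1f} um")
```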
Alignment telescope for Antares
NASA Astrophysics Data System (ADS)
Appert, Q. D.; Swann, T. A.; Ward, J. H.; Hardesty, C.; Wright, L.
The Antares Automatic Alignment System employs a specially designed telescope for alignment of its laser beamlines. There are two telescopes in the system, and since each telescope is a primary alignment reference, stringent boresight accuracy and stability over the focus range were required. Optical and mechanical designs, which meet this requirement as well as that of image quality over a wide wavelength band, are described. Special test techniques for initial assembly and alignment of the telescope are also presented. The telescope, which has a 180-mm aperture FK51-KZF2 type glass doublet objective, requires a boresight accuracy of 2.8 µrad at two focal lengths, and object distances between 11 meters and infinity. Travel of a smaller secondary doublet provides focus from 11 m to infinity with approximately 7.8 m effective focal length. By flipping in a third doublet, the effective focal length is reduced to 2.5 m. Telescope alignment was accomplished by using a rotary air bearing to establish an axis in front of the system and placing the focus of a Laser Unequal Path Interferometer (LUPI) at the image plane.
Alignment Telescope For Antares
NASA Astrophysics Data System (ADS)
Appert, Q. D.; Swann, T. A.; Ward, J. H.; Hardesty, C.; Wright, L.
1983-11-01
The Antares Automatic Alignment System employs a specially designed telescope for alignment of its laser beamlines. There are two telescopes in the system, and since each telescope is a primary alignment reference, stringent boresight accuracy and stability over the focus range were required. Optical and mechanical designs, which meet this requirement as well as that of image quality over a wide wavelength band, are described. Special test techniques for initial assembly and alignment of the telescope are also presented. The telescope, which has a 180-mm aperture FK51-KZF2 type glass doublet objective, requires a boresight accuracy of 2.8 µrad at two focal lengths, and object distances between 11 meters and infinity. Travel of a smaller secondary doublet provides focus from 11 m to infinity with approximately 7.8 m effective focal length. By flipping in a third doublet, the effective focal length is reduced to 2.5 m. Telescope alignment was accomplished by using a rotary air bearing to establish an axis in front of the system and placing the focus of a Laser Unequal Path Interferometer (LUPI) at the image plane.
Method of preparing a tunable-focus liquid-crystal (LC) lens
NASA Astrophysics Data System (ADS)
Li, Xiaolong; Zhou, Zuowei; Ren, Hongwen
2018-02-01
A liquid crystal (LC) lens is prepared by controlling the alignment of the LC using a homogeneous polyimide (PI) layer and a homeotropic PI layer. The rubbed homogeneous PI layer has a concave surface and the homeotropic PI layer is flat. The LC sandwiched between the two PI layers takes on a hybrid alignment with the largest gradient refractive index (GRIN) distribution. The LC layer exhibits a lens character because of its convex shape. Since the effective refractive index of the LC is larger than that of the homogeneous PI, the LC lens focuses light with its shortest focal length in the voltage-off state. By applying an external voltage, the LC molecules can be reoriented along the electric field. As a result, the optical power of the LC lens is reduced and the focal length increases. The focal length of the LC lens can be tuned from 30 to 120 μm when the voltage is changed from 0 to 7 Vrms. This LC lens has the advantages of no threshold, low operating voltage, and simple fabrication.
Confocal microscopy using variable-focal-length microlenses and an optical fiber bundle.
Yang, Lisong; Mac Raighne, Aaron; McCabe, Eithne M; Dunbar, L Andrea; Scharf, Toralf
2005-10-01
The use of variable-focal-length (VFL) microlenses can provide a way to axially scan the foci across a sample under electronic control. We demonstrate an approach to coupling VFL microlenses individually to a fiber bundle as a way to create a high-throughput aperture array with a controllable aperture pattern. It could potentially be applied to real-time in vivo confocal imaging of biological specimens. The VFL microlenses that we used consist of a liquid-crystal film sandwiched between a pair of conductive substrates, one of which has a hole-patterned electrode. The focal length is varied by changing the applied voltage. The fiber bundle has been characterized by coupling with both coherent and incoherent light sources. We further demonstrate the use of a VFL microlens array in combination with the fiber bundle to build up a confocal system. The axial response of the confocal system has been measured without mechanical movement of the sample or the objective, and the FWHM is estimated to be approximately 16 µm, with asymmetric sidelobes.
Generating High resolution surfaces from images: when photogrammetry and applied geophysics meets
NASA Astrophysics Data System (ADS)
Bretar, F.; Pierrot-Deseilligny, M.; Schelstraete, D.; Martin, O.; Quernet, P.
2012-04-01
Airborne digital photogrammetry has been used for some years to create digital models of the Earth's topography from calibrated cameras. In recent years, however, the use of non-professional digital cameras has become valuable for reconstructing topographic surfaces. Today, the multi-megapixel resolution of non-professional digital cameras, used either in a close-range configuration or from low-altitude flights, provides a ground pixel size ranging from a fraction of a millimeter to a couple of centimeters, respectively. Such advances became reality because the data processing chain made a tremendous breakthrough during the last five years. This study investigates the potential of the open source software MICMAC developed by the French National Survey IGN (http://www.micmac.ign.fr) to calibrate unoriented digital images and calculate surface models of extremely high resolution for Earth Science purposes. We report two experiments performed in 2011. The first was performed in the context of risk assessment of rock falls and landslides along the cliffs of the Normandy seashore. The acquisition protocol for the first site, "Criel-sur-Mer", was very simple: a walk along the vertical chalk cliffs, taking photos with an 18 mm focal length every approx. 50 m with an overlap of 80%, allowed the generation of 2.5 km of digital surface at centimeter resolution. The site of "Les Vaches Noires" was more complicated to acquire because of both the geology (dark clays) and the geometry (the landslide direction is parallel to the seashore and there is a large depth of field from the shore). We therefore developed an innovative device mounted on board an autogyro (in between an ultralight power-driven aircraft and a helicopter). The entire area was surveyed with a 70 mm focal length at 400 m asl with a ground pixel of 3 cm. MICMAC makes it possible to georeference digital models directly; here, georeferencing was performed with a network of wireless GPS receivers called Geocubes, also developed at IGN. The second experiment is part of field measurements performed over the flanks of the volcano Piton de la Fournaise, La Réunion island. In order to characterize the roughness of different types of lava flows, extremely high resolution Digital Terrain Models (0.6 mm) have been generated with MICMAC. The use of such high-definition topography made the characterization possible through the calculation of the correlation length, the standard deviation and the fractal dimension. To conclude, we sketch a synthesis of the needs of geoscientists versus the optimal resolution of digital topographic data.
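The quoted ground pixel sizes follow from the usual pinhole relation GSD ≈ object distance x pixel pitch / focal length. A quick hedged check, assuming a 6 µm pixel pitch and a camera-to-cliff distance of roughly 10 m (neither is given in the abstract):

```python
# Rough ground-sample-distance check for the two campaigns described above.
# The 6 um pixel pitch and the ~10 m camera-to-cliff distance are assumptions;
# the abstract does not give the sensor specifications.
def gsd(distance_m, pixel_pitch_m, focal_length_m):
    """Ground sample distance ~ distance * pixel pitch / focal length (pinhole model)."""
    return distance_m * pixel_pitch_m / focal_length_m

pixel = 6e-6
print(f"cliff walk, 18 mm lens at ~10 m:    {100 * gsd(10, pixel, 0.018):.2f} cm/pixel")
print(f"autogyro, 70 mm lens at ~400 m agl: {100 * gsd(400, pixel, 0.070):.2f} cm/pixel")
```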
NASA Astrophysics Data System (ADS)
Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.
1990-10-01
Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and a wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are push-broom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.
Image intensification; Proceedings of the Meeting, Los Angeles, CA, Jan. 17, 18, 1989
NASA Astrophysics Data System (ADS)
Csorba, Illes P.
Various papers on image intensification are presented. Individual topics discussed include: status of high-speed optical detector technologies, super second generation image intensifier, gated image intensifiers and applications, resistive-anode position-sensing photomultiplier tube operational modeling, undersea imaging and target detection with gated image intensifier tubes, image intensifier modules for use with commercially available solid state cameras, specifying the components of an intensified solid state television camera, superconducting IR focal plane arrays, one-inch TV camera tube with very high resolution capacity, CCD-Digicon detector system performance parameters, high-resolution X-ray imaging device, high-output technology microchannel plate, preconditioning of microchannel plate stacks, recent advances in small-pore microchannel plate technology, performance of long-life curved channel microchannel plates, low-noise microchannel plates, development of a quartz envelope heater.
Treatment planning for prostate focal laser ablation in the face of needle placement uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cepek, Jeremy, E-mail: jcepek@robarts.ca; Fenster, Aaron; Lindner, Uri
2014-01-15
Purpose: To study the effect of needle placement uncertainty on the expected probability of achieving complete focal target destruction in focal laser ablation (FLA) of prostate cancer. Methods: Using simplified models of prostate cancer focal target and focal laser ablation region shapes, Monte Carlo simulations of needle placement error were performed to estimate the probability of completely ablating a region of target tissue. Results: Graphs of the probability of complete focal target ablation are presented over clinically relevant ranges of focal target sizes and shapes, ablation region sizes, and levels of needle placement uncertainty. In addition, a table is provided for estimating the maximum target size that is treatable. The results predict that targets whose length is at least 5 mm smaller than the diameter of each ablation region can be confidently ablated using, at most, four laser fibers if the standard deviation in each component of needle placement error is less than 3 mm. However, targets larger than this (i.e., near to or exceeding the diameter of each ablation region) require more careful planning. This process is facilitated by using the table provided. Conclusions: The probability of completely ablating a focal target using FLA is sensitive to the level of needle placement uncertainty, especially as the target length approaches and becomes greater than the diameter of ablated tissue that each individual laser fiber can achieve. The results of this work can be used to help determine individual patient eligibility for prostate FLA, to guide the planning of prostate FLA, and to quantify the clinical benefit of using advanced systems for accurate needle delivery for this treatment modality.
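A minimal Monte Carlo sketch in the spirit of the method described: the target and each ablation zone are idealized as spheres, needle-tip errors are drawn from an isotropic Gaussian, and a trial counts as a success if every sampled target point falls inside at least one realized ablation zone. The shapes and the point-sampling coverage test are simplifications, not the authors' model:

```python
# Minimal Monte Carlo sketch of complete-ablation probability under needle
# placement error. Spherical target/ablation shapes and the point-sampling
# coverage test are simplifications, not the model used in the paper.
import numpy as np

rng = np.random.default_rng(0)

def p_complete_ablation(target_radius, ablation_radius, planned_tips,
                        sigma, n_trials=2000, n_points=500):
    # Sample points uniformly inside the spherical target.
    pts = rng.normal(size=(n_points, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    pts *= target_radius * rng.uniform(size=(n_points, 1)) ** (1 / 3)

    successes = 0
    for _ in range(n_trials):
        tips = planned_tips + rng.normal(scale=sigma, size=planned_tips.shape)
        # distance from every target point to every realized ablation centre
        d = np.linalg.norm(pts[:, None, :] - tips[None, :, :], axis=2)
        if np.all(d.min(axis=1) <= ablation_radius):
            successes += 1
    return successes / n_trials

# Example: 12 mm diameter target, 15 mm diameter ablation zones, four fibres
# planned at the corners of a small square, 3 mm placement SD per axis (all mm).
planned = np.array([[3.0, 3.0, 0.0], [3.0, -3.0, 0.0],
                    [-3.0, 3.0, 0.0], [-3.0, -3.0, 0.0]])
print(p_complete_ablation(target_radius=6.0, ablation_radius=7.5,
                          planned_tips=planned, sigma=3.0))
```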
Development of ultra-precision micro-cavity measurement technique in HIT-UOI
NASA Astrophysics Data System (ADS)
Cui, Jiwen; Li, Lei; Tan, Jiubin
2010-08-01
Micro cavities with high aspect ratio are widely used in different fields, including the aerospace and defense industries, with the development of manufacturing technology, so how to measure the dimensions of these cavities has become one of the major research subjects in the field of measurement and instrumentation. This paper describes activities on precision micro cavity measurement techniques at the Center of Ultra-precision Optoelectronic Instrument (UOI), Harbin Institute of Technology (HIT). The key approach to micro cavity measurement at UOI is a touch-trigger measurement method. The first scheme is double optical fiber coupling, in which light coming from the incident optical fiber is transmitted, in the reverse direction, via optical fiber coupling into the effluent optical fiber; the lateral displacement of the touch-trigger sensor is transformed into a deflection of the light coming out of the effluent optical fiber, and this deflection is transformed into an image signal by the objective lens and CCD capturing system. The second scheme is micro focal-length collimation, in which a fiber stem with a ball mounted on its end is used as a probe, and a small segment of the stem is used as a cylindrical lens to collimate a point light source and image it onto a camera; the deflection of the fiber stem can then be inferred from the change in the image acquired by the camera with ultrahigh displacement sensitivity. Experiments for these activities are presented with a focus on the measurement results and repeatability uncertainty.
The ICE spectrograph for PEPSI at the LBT: preliminary optical design
NASA Astrophysics Data System (ADS)
Pallavicini, Roberto; Zerbi, Filippo M.; Spano, Paolo; Conconi, Paolo; Mazzoleni, Ruben; Molinari, Emilio; Strassmeier, Klaus G.
2003-03-01
We present a preliminary design study for a high-resolution echelle spectrograph (ICE) to be used with the spectropolarimeter PEPSI under development at the LBT. In order to meet the scientific requirements and take full advantage of the peculiarities of the LBT (i.e. its binocular nature and adaptive optics capabilities), we have designed a fiber-fed, bench-mounted instrument for both high resolution (R ≍ 100,000; non-AO polarimetric and integral light modes) and ultra-high resolution (R ≍ 300,000; AO integral light mode). In both cases, 4 spectra per order (two for each primary mirror) shall be accommodated in a 2-dimensional cross-dispersed echelle format. In order to obtain a resolution-slit product of ≍ 100,000 as required by the science case, we have considered two alternative designs, one with two R4 echelles in series and the other with a single R4 echelle and fiber slicing. A white-pupil design, VPH cross-dispersers and two cameras of different focal length for the AO and non-AO modes are adopted in both cases. It is concluded that the single-echelle fiber-slicer solution is to be preferred in terms of performance, complexity and cost. It can be implemented at the LBT in two phases, with the long-camera AO mode added in a second phase depending on the availability of funds and the time-scale for implementation of the AO system.
Observation of interaction of shock wave with gas bubble by image converter camera
NASA Astrophysics Data System (ADS)
Yoshii, M.; Tada, M.; Tsuji, T.; Isuzugawa, Kohji
1995-05-01
When a spark discharge occurs at the first focal point of a semiellipsoidal reflector located in water, a spherical shock wave is produced. A part of the wave spreads without reflecting off the reflector and is called the direct wave in this paper. Another part reflects off the semiellipsoid and converges near the second focal point; this is named the focusing wave and locally produces a high pressure. This phenomenon is applied in kidney stone disintegrators. However, there is concern that cavitation bubbles induced in the body by the expansion wave following the focusing wave may injure the human tissue around the kidney stone. In this paper, in order to examine what happens when shock waves strike bubbles on human tissue, the behavior of an air bubble struck by the spherical shock wave is visualized with a schlieren system and photographed using an image converter camera. In addition, the variation of the pressure amplitude caused by the shock wave and the flow of water around the bubble is measured with a pressure probe.
NASA Astrophysics Data System (ADS)
Reverchon, Jean-Luc; Gourdel, Yves; Robo, Jean-Alexandre; Truffer, Jean-Patrick; Costard, Eric; Brault, Julien; Duboz, Jean-Yves
2017-11-01
The fast development of nitrides has given the opportunity to investigate AlGaN as a material for ultraviolet detection. Such an AlGaN-based camera presents an intrinsic spectral selectivity and an extremely low dark current at room temperature. Firstly, we present results on a focal plane array of 320x256 pixels with a pitch of 30 μm. The peak responsivity is around 280 nm (solar-blind), 310 nm and 360 nm. These results are obtained in a standard SWIR supply chain (readout circuit, electronics). With the existing near-UV camera grown on sapphire, the short-wavelength cutoff is due to a window layer improving the material quality of the active layer. The ultimate shortest wavelength would be 200 nm due to the sapphire substrate. We present here the ways to transfer the standard design of Schottky photodiodes from sapphire to a silicon substrate. We will show the capability to remove the silicon substrate and etch the window layer in order to extend the bandwidth to shorter wavelengths.
Simulation of two-dimensional adjustable liquid gradient refractive index (L-GRIN) microlens
NASA Astrophysics Data System (ADS)
Le, Zichun; Wu, Xiang; Sun, Yunli; Du, Ying
2017-07-01
In this paper, a two-dimensional liquid gradient refractive index (L-GRIN) microlens is designed which can be used to adjust the focusing direction and focal spot of a light beam. The finite element method (FEM) is used to simulate the convection-diffusion process occurring between the core inlet flow and the cladding inlet flow, and the ray-tracing method shows the light-beam focusing effect, including the extrapolation of the focal length and the output beam spot size. When the flow rates of the core and cladding fluids are held the same between the inner and outer, left and right, and upper and lower inlets, the focal length varies from 313 μm to 53.3 μm while the flow rate of the liquids ranges from 500 pL/s to 10,000 pL/s. When the core flow rate is larger than the cladding inlet flow rate, the light beam focuses to a spot with a tunable size. By adjusting the ratios of the cladding inlet flow rates, including Qright/Qleft and Qup/Qdown, we obtain an adjustable two-dimensional focusing direction rather than one-dimensional focusing. In summary, by adjusting the flow rates of the core inlet and cladding inlet, the focal length, output beam spot and focusing direction of the input light beam can be manipulated. We suppose this kind of flexible microlens can be used in integrated optics and lab-on-a-chip systems.
A potential individual cell malignancy indicator: focal length
NASA Astrophysics Data System (ADS)
Wang, Weina; Lear, Kevin L.
2011-03-01
The label-free technique of optofluidic intracavity spectroscopy (OFIS) utilizes the optical transmission spectrum of a cell in a microfluidic Fabry-Pérot (F-P) cavity to distinguish cells from cancerous cell lines and baseline normal blood cells. The classification between canine hemangiosarcoma (HSA) cancer cells and monocytes in canine normal peripheral blood mononuclear cells (PBMCs) had been demonstrated with 95% sensitivity and 98% specificity. Now with a new optical model that treats the cell settled at the bottom of the cavity as a thin lens, the focal length of cells was extracted and used as an individual cell malignancy indicator.
Paraxial ray solution for liquid-filled variable focus lenses
NASA Astrophysics Data System (ADS)
Wang, Lihui; Oku, Hiromasa; Ishikawa, Masatoshi
2017-12-01
We propose a general solution for determining the cardinal points and effective focal length of a liquid-filled variable focus lens to aid in understanding the dynamic behavior of the lens when the focal length is changed. A prototype of a variable focus lens was fabricated and used to validate the solution. A simplified solution was also presented that can be used to quickly and conveniently calculate the performance of the lens. We expect that the proposed solutions will improve the design of optical systems that contain variable focus lenses, such as machine vision systems with zoom and focus functions.
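The standard paraxial bookkeeping behind such a solution can be sketched with the 2x2 system matrix of a thick singlet, from which the effective focal length and the principal-plane positions follow directly; the radii, thickness and index below are placeholders, not the prototype described in the paper:

```python
# Paraxial sketch: effective focal length and principal-plane positions of a
# thick liquid-filled singlet from its 2x2 system matrix (reduced-angle
# convention, air on both sides). Radii, thickness and index are placeholders.

def surface(n1, n2, R):
    """Refraction at a spherical surface of radius R (power (n2 - n1) / R)."""
    return ((1.0, 0.0), (-(n2 - n1) / R, 1.0))

def transfer(t, n):
    """Translation by thickness t inside a medium of index n (reduced distance)."""
    return ((1.0, t / n), (0.0, 1.0))

def mul(M2, M1):
    (a2, b2), (c2, d2) = M2
    (a1, b1), (c1, d1) = M1
    return ((a2 * a1 + b2 * c1, a2 * b1 + b2 * d1),
            (c2 * a1 + d2 * c1, c2 * b1 + d2 * d1))

def cardinal_points(R1, R2, t, n):
    M = mul(surface(n, 1.0, R2), mul(transfer(t, n), surface(1.0, n, R1)))
    (A, _), (C, D) = M
    f = -1.0 / C                 # effective focal length
    front_pp = (D - 1.0) / C     # front principal plane, measured from first surface
    back_pp = (1.0 - A) / C      # back principal plane, measured from last surface
    return f, front_pp, back_pp

f, H, Hp = cardinal_points(R1=+25e-3, R2=-25e-3, t=4e-3, n=1.45)
print(f"EFL = {1e3 * f:.2f} mm, H at {1e3 * H:+.2f} mm, H' at {1e3 * Hp:+.2f} mm")
```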
Broadband diffractive lens or imaging element
Ceglio, Natale M.; Hawryluk, Andrew M.; London, Richard A.; Seppala, Lynn G.
1993-01-01
A broadband diffractive lens or imaging element produces a sharp focus and/or a high resolution image with broad bandwidth illuminating radiation. The diffractive lens is sectored or segmented into regions, each of which focuses or images a distinct narrowband of radiation but all of which have a common focal length. Alternatively, a serial stack of minus filters, each with a diffraction pattern which focuses or images a distinct narrowband of radiation but all of which have a common focal length, is used. The two approaches can be combined. Multifocal broadband diffractive elements can also be formed. Thin film embodiments are described.
Broadband diffractive lens or imaging element
Ceglio, N.M.; Hawryluk, A.M.; London, R.A.; Seppala, L.G.
1993-10-26
A broadband diffractive lens or imaging element produces a sharp focus and/or a high resolution image with broad bandwidth illuminating radiation. The diffractive lens is sectored or segmented into regions, each of which focuses or images a distinct narrowband of radiation but all of which have a common focal length. Alternatively, a serial stack of minus filters, each with a diffraction pattern which focuses or images a distinct narrowband of radiation but all of which have a common focal length, is used. The two approaches can be combined. Multifocal broadband diffractive elements can also be formed. Thin film embodiments are described. 21 figures.
Mikš, Antonín; Novák, Pavel
2018-05-10
In this article, we analyze the problem of the paraxial design of an active optical element with variable focal length, which maintains the positions of its principal planes fixed during the change of its optical power. Such optical elements are important in the process of design of complex optical systems (e.g., zoom systems), where the fixed position of principal planes during the change of optical power is essential for the design process. The proposed solution is based on the generalized membrane tunable-focus fluidic lens with several membrane surfaces.
Optomechanical stability design of space optical mapping camera
NASA Astrophysics Data System (ADS)
Li, Fuqiang; Cai, Weijun; Zhang, Fengqin; Li, Na; Fan, Junjie
2018-01-01
According to the interior orientation elements and imaging quality requirements that the mapping application places on a mapping camera, and combined with an off-axis three-mirror anastigmat (TMA) system, a high optomechanical stability design of a space optical mapping camera is introduced in this paper. The configuration is a coaxial TMA system used in an off-axis situation. Firstly, the overall optical arrangement is described and an overview of the optomechanical packaging is provided. Zerodur glass, carbon fiber composite and carbon-fiber-reinforced silicon carbide (C/SiC) are widely used in the optomechanical structure, because their low coefficients of thermal expansion (CTE) reduce the thermal sensitivity of the mirrors and focal plane. Flexible and unloading supports are used in the reflector and camera supporting structures. The use of epoxy structural adhesive for bonding the optics to the metal structure is also introduced in this paper. The primary mirror is mounted by means of a three-point ball-joint flexure system attached to the back of the mirror. Then, in order to predict flexural displacements due to gravity, static finite element analysis (FEA) is performed on the primary mirror. The optical performance, peak-to-valley (PV) and root-mean-square (RMS) wavefront errors, is measured before and after assembly. Also, dynamic finite element analysis (FEA) of the whole optical arrangement is carried out to investigate the optomechanical performance. Finally, in order to evaluate the stability of the design, thermal vacuum and vibration tests are carried out, and the Modulation Transfer Function (MTF) and elements of interior orientation are presented as the evaluation indices. Before and after the thermal vacuum and vibration tests, the MTF, focal distance and position of the principal point of the optical system are measured, and the results are as expected.
ATTICA family of thermal cameras in submarine applications
NASA Astrophysics Data System (ADS)
Kuerbitz, Gunther; Fritze, Joerg; Hoefft, Jens-Rainer; Ruf, Berthold
2001-10-01
Optronics Mast Systems (US: Photonics Mast Systems) are electro-optical devices which enable a submarine crew to observe the scenery above water during a dive. Unlike classical submarine periscopes they are non-hull-penetrating and therefore have no direct viewing capability. Typically they have electro-optical cameras both for the visual and for an IR spectral band, with panoramic view and a stabilized line of sight. They can optionally be equipped with laser range-finders, antennas, etc. The brand name ATTICA (Advanced Two-dimensional Thermal Imager with CMOS-Array) characterizes a family of thermal cameras using focal-plane-array (FPA) detectors which can be tailored to a variety of requirements. The modular design of the ATTICA components allows the use of various detectors (InSb, CMT 3...5 μm, CMT 7...11 μm) for specific applications. By means of a microscanner, ATTICA cameras achieve full standard TV resolution using detectors with only 288 x 384 (US: 240 x 320) detector elements. A typical requirement for Optronics Mast Systems is a Quick-Look-Around capability. For FPA cameras this implies the need for a 'descan' module, which can be incorporated in the ATTICA cameras without complications.
Design, demonstration and testing of low F-number LWIR panoramic imaging relay optics
NASA Astrophysics Data System (ADS)
Furxhi, Orges; Frascati, Joe; Driggers, Ronald
2018-04-01
Panoramic imaging is inherently wide field of view. High-sensitivity uncooled Long Wave Infrared (LWIR) imaging requires low F-number optics. These two requirements result in short back-working-distance designs that, in addition to being costly, are challenging to integrate with commercially available uncooled LWIR cameras and cores. Common challenges include the relocation of the shutter flag, custom calibration of the camera dynamic range and NUC tables, focusing, and athermalization. Solutions to these challenges add to the system cost and make panoramic uncooled LWIR cameras commercially unattractive. In this paper, we present the design of Panoramic Imaging Relay Optics (PIRO) and show imagery and test results with one of the first prototypes. PIRO designs use several reflective surfaces (generally two) to relay a panoramic scene onto a real, donut-shaped image. The PIRO donut is imaged onto the focal plane of the camera using a commercial-off-the-shelf (COTS) low F-number lens. This approach results in low component cost and effortless integration with pre-calibrated, commercially available cameras and lenses.
Optical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S; Seppala, L; Gilmore, K
2008-07-16
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band, filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.
AO WFS detector developments at ESO to prepare for the E-ELT
NASA Astrophysics Data System (ADS)
Downing, Mark; Casali, Mark; Finger, Gert; Lewis, Steffan; Marchetti, Enrico; Mehrgan, Leander; Ramsay, Suzanne; Reyes, Javier
2016-07-01
ESO has a very active, ongoing AO WFS detector development program, not only to meet the needs of the current crop of instruments for the VLT, but also to gather requirements and to plan and develop detectors and controllers/cameras for the instruments in design and being proposed for the E-ELT. This paper provides an overall summary of the AO WFS detector requirements of the E-ELT instruments currently in design and of the telescope focal units. This is followed by a description of the many interesting detector, controller, and camera developments underway at ESO to meet these needs: a) the rationale behind, and plan to upgrade, the 240x240-pixel, 2000 fps, "zero-noise" L3Vision CCD220 sensor-based AONGC camera; b) the status of the LGSD/NGSD high-QE, 3e- RoN, fast 700 fps, 1760x1680-pixel visible CMOS imager and camera development; c) the status of and development plans for the Selex SAPHIRA NIR eAPD and controller. Most of the instruments and detector/camera developments are described in more detail in other papers at this conference.
An Atomic Lens Using a Focusing Hollow Beam
NASA Astrophysics Data System (ADS)
Xia, Yong; Yin, Jian-Ping; Wang, Yu-Zhu
2003-05-01
We propose a new method to generate a focused hollow laser beam by using an azimuthally distributed 2π-phase plate and a convergent thin lens, and calculate the intensity distribution of the focused hollow beam in free propagation space. The relationship between the waist w0 of the incident collimated Gaussian beam and the dark spot size of the focused hollow beam at the focal point, and the relationship between the focal length f of the thin lens and the dark spot size, are studied respectively. The optical potential of the blue-detuned focused hollow beam for 85Rb atoms is calculated. Our study shows that when a larger waist w0 of the incident Gaussian beam and a shorter focal length f of the lens are chosen, we can obtain an extremely small dark spot size of the focused hollow beam, which can be used to form an atomic lens with a resolution of several angstroms.
Myers, Matthew R; Giridhar, Dushyanth
2011-06-01
In the characterization of high-intensity focused ultrasound (HIFU) systems, it is desirable to know the intensity field within a tissue phantom. Infrared (IR) thermography is a potentially useful method for inferring this intensity field from the heating pattern within the phantom. However, IR measurements require an air layer between the phantom and the camera, making inferences about the thermal field in the absence of the air complicated. For example, convection currents can arise in the air layer and distort the measurements relative to the phantom-only situation. Quantitative predictions of intensity fields based upon IR temperature data are also complicated by axial and radial diffusion of heat. In this paper, mathematical expressions are derived for use with IR temperature data acquired at times long enough that noise is a relatively small fraction of the temperature trace, but short enough that convection currents have not yet developed. The relations were applied to simulated IR data sets derived from computed pressure and temperature fields. The simulation was performed in a finite-element geometry involving a HIFU transducer sonicating upward in a phantom toward an air interface, with an IR camera mounted atop an air layer, looking down at the heated interface. It was found that, when compared to the intensity field determined directly from acoustic propagation simulations, intensity profiles could be obtained from the simulated IR temperature data with an accuracy of better than 10%, at pre-focal, focal, and post-focal locations. © 2011 Acoustical Society of America
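The usual starting point for such inferences is that, before diffusion and convection matter, the absorbed power density is Q = 2*alpha*I, so the local heating rate is dT/dt = 2*alpha*I/(rho*c). A hedged sketch with generic phantom properties (the paper's expressions additionally correct for diffusion and the air layer, which this sketch does not):

```python
# Hedged sketch: inferring focal intensity from an early-time temperature slope
# via dT/dt = 2*alpha*I / (rho*c), valid before diffusion and convection matter.
# Material properties and the synthetic slope are generic placeholder values;
# the paper's full expressions also account for diffusion and the air layer.
import numpy as np

def intensity_from_slope(dTdt, alpha_np_per_m, rho, c):
    """Plane-wave absorption heating: I = rho * c * (dT/dt) / (2 * alpha)."""
    return rho * c * dTdt / (2.0 * alpha_np_per_m)

# Fit a slope to synthetic early-time IR data (2 K/s plus a little noise).
t = np.linspace(0.0, 2.0, 21)                                   # s
T = 22.0 + 2.0 * t + 0.02 * np.random.default_rng(1).normal(size=t.size)
slope = np.polyfit(t, T, 1)[0]                                  # K/s

I = intensity_from_slope(slope, alpha_np_per_m=5.0, rho=1000.0, c=4180.0)
print(f"estimated focal intensity ~ {I / 1e4:.0f} W/cm^2")
```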
Optical zoom lens module using MEMS deformable mirrors for portable device
NASA Astrophysics Data System (ADS)
Lu, Jia-Shiun; Su, Guo-Dung J.
2012-10-01
The thickness of the smart phones in today's market is usually less than 10 mm, and as phone volume shrinks, producing the camera lens becomes increasingly difficult. Therefore, how to give the imaging device more functionality in a smaller space is one of the interesting research topics for today's mobile phone companies. In this paper, we propose a thin optical zoom system that combines micro-electromechanical components with a reflective optical architecture. By adopting MEMS deformable mirrors, we can change their radii of curvature to achieve optical zoom in and zoom out. Because the architecture is all-reflective, the system eliminates the considerable chromatic aberrations that would arise with lenses. In our system, the thickness of the zoom module is about 11 mm. The smallest EFL (effective focal length) is 4.61 mm at a diagonal field angle of 52° and f/# of 5.24. The longest EFL of the module is 9.22 mm at a diagonal field angle of 27.4° with f/# of 5.03.
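The zoom principle can be illustrated with the standard two-element combination formula 1/f = 1/f1 + 1/f2 - d/(f1*f2): changing the mirror curvatures changes f1 and f2, and hence the system EFL. The element values below are invented only to span roughly the quoted 4.6 to 9.2 mm range and are not the actual design:

```python
# Hedged illustration of the two-deformable-mirror zoom principle with the
# standard two-element combination 1/f = 1/f1 + 1/f2 - d/(f1*f2). The element
# focal lengths and separation are invented values chosen only to span roughly
# the quoted 4.6-9.2 mm EFL range; they are not the actual design.
def combined_efl(f1, f2, d):
    return 1.0 / (1.0 / f1 + 1.0 / f2 - d / (f1 * f2))

d = 6.0  # mm, assumed separation between the two powered mirrors
for f1, f2 in [(4.5, 60.0), (8.8, -60.0)]:
    print(f"f1 = {f1:5.1f} mm, f2 = {f2:6.1f} mm  ->  EFL = {combined_efl(f1, f2, d):.2f} mm")
```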
The Mars Surveyor '01 Rover and Robotic Arm
NASA Technical Reports Server (NTRS)
Bonitz, Robert G.; Nguyen, Tam T.; Kim, Won S.
1999-01-01
The Mars Surveyor 2001 Lander will carry with it both a Robotic Arm and Rover to support various science and technology experiments. The Marie Curie Rover, the twin sister to Sojourner Truth, is expected to explore the surface of Mars in early 2002. Scientific investigations to determine the elemental composition of surface rocks and soil using the Alpha Proton X-Ray Spectrometer (APXS) will be conducted along with several technology experiments including the Mars Experiment on Electrostatic Charging (MEEC) and the Wheel Abrasion Experiment (WAE). The Rover will follow uplinked operational sequences each day, but will be capable of autonomous reactions to the unpredictable features of the Martian environment. The Mars Surveyor 2001 Robotic Arm will perform rover deployment, and support various positioning, digging, and sample acquiring functions for MECA (Mars Environmental Compatibility Assessment) and Mossbauer Spectrometer experiments. The Robotic Arm will also collect its own sensor data for engineering data analysis. The Robotic Arm Camera (RAC) mounted on the forearm of the Robotic Arm will capture various images with a wide range of focal length adjustment during scientific experiments and rover deployment.
High-frame-rate infrared and visible cameras for test range instrumentation
NASA Astrophysics Data System (ADS)
Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.
1995-09-01
Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.
Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.
We present the final optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well-defined internal pupil, making it possible to reduce the thermal background with a cryogenic pupil stop. A mosaic of four 2k x 2k Hawaii-2RG detectors, made by Teledyne, will give a field of view of 31.9 arcmin x 31.9 arcmin.
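The quoted plate scale fixes the effective focal length of the camera optics through plate_scale[arcsec/pixel] = 206265 x pixel_pitch / f_eff; a quick check from the numbers given in the abstract:

```python
# Quick check of the effective focal length implied by the quoted plate scale
# (0.45 arcsec per 18 um pixel): plate_scale = 206265 * pixel_pitch / f_eff.
ARCSEC_PER_RAD = 206265.0

pixel_pitch = 18e-6   # m
plate_scale = 0.45    # arcsec per pixel

f_eff = ARCSEC_PER_RAD * pixel_pitch / plate_scale
print(f"effective focal length ~ {f_eff:.2f} m (f/{f_eff / 2.2:.1f} on the 2.2 m aperture)")
```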
Optical Arc-Length Sensor For TIG Welding
NASA Technical Reports Server (NTRS)
Smith, Matthew A.
1990-01-01
A proposed subsystem of a tungsten/inert-gas (TIG) welding system measures the length of the welding arc optically. The arc is viewed by a video camera in one of three alternative optical configurations. The length of the arc is measured directly rather than inferred from the arc voltage.
VizieR Online Data Catalog: Field RR Lyrae stars (Liska+, 2016)
NASA Astrophysics Data System (ADS)
Liska, J.; Skarka, M.; Zejda, M.; Mikulasek, Z.; de Villiers, S. N.
2016-05-01
Differential photometry for VX Her is in the 'table1.dat' file. New photometric measurements for VX Her were performed at the Masaryk University Observatory, Brno, Czech Republic during 13 nights (April-August 2014) with a 0.6-m (24-inch) Newtonian telescope and a CCD G2-0402 camera in BVRI bands. CCD images were calibrated in the standard way (dark frame and flat field corrections). The C-Munipack software (Motl 2009) was used for this processing as well as for the differential photometry. TYC 1510-269-1 and TYC 1510-149-1 were used as comparison and check stars, respectively. Differential photometry for AT Ser and SS Leo is in the 'table2.dat' file. New photometric measurements for these two stars were obtained using a 1-inch refractor (a Sonnar 4/135 mm photographic lens, the designation giving the lens focal ratio/focal length) and an ATIK 16IC CCD camera with a green photometric filter with similar throughput to the Johnson V filter. Exposures were 30 s, and every five frames were combined into a single image to achieve a better signal-to-noise ratio. The time resolution of such a combined frame is about 170 s. The comparison stars were HD 142799 for AT Ser and HD 100763 for SS Leo. A list of candidates for binaries with an RR Lyrae component, the RRLyrBinCan database (version 2016 May 5), is in the 'table3.dat' file. The 'table4.dat' file contains false-positive binary candidates among RR Lyrae stars. The 'table5.dat' and 'table6.dat' files contain the maxima timings used, either given in the GEOS RR Lyr database or newly determined in this study. (7 data files).
Mars reconnaissance orbiter's high resolution imaging science experiment (HiRISE)
McEwen, A.S.; Eliason, E.M.; Bergstrom, J.W.; Bridges, N.T.; Hansen, C.J.; Delamere, W.A.; Grant, J. A.; Gulick, V.C.; Herkenhoff, K. E.; Keszthelyi, L.; Kirk, R.L.; Mellon, M.T.; Squyres, S. W.; Thomas, N.; Weitz, C.M.
2007-01-01
The HiRISE camera features a 0.5 m diameter primary mirror, 12 m effective focal length, and a focal plane system that can acquire images containing up to 28 Gb (gigabits) of data in as little as 6 seconds. HiRISE will provide detailed images (0.25 to 1.3 m/pixel) covering ~1% of the Martian surface during the 2-year Primary Science Phase (PSP) beginning November 2006. Most images will include color data covering 20% of the potential field of view. A top priority is to acquire ~1000 stereo pairs and apply precision geometric corrections to enable topographic measurements to better than 25 cm vertical precision. We expect to return more than 12 Tb of HiRISE data during the 2-year PSP, and use pixel binning, conversion from 14 to 8 bit values, and a lossless compression system to increase coverage. HiRISE images are acquired via 14 CCD detectors, each with 2 output channels, and with multiple choices for pixel binning and number of Time Delay and Integration lines. HiRISE will support Mars exploration by locating and characterizing past, present, and future landing sites, unsuccessful landing sites, and past and potentially future rover traverses. We will investigate cratering, volcanism, tectonism, hydrology, sedimentary processes, stratigraphy, aeolian processes, mass wasting, landscape evolution, seasonal processes, climate change, spectrophotometry, glacial and periglacial processes, polar geology, and regolith properties. An Internet Web site (HiWeb) will enable anyone in the world to suggest HiRISE targets on Mars and to easily locate, view, and download HiRISE data products. Copyright 2007 by the American Geophysical Union.
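The coverage gain from the three data-reduction steps mentioned can be estimated by multiplying their individual factors; the compression ratio used below is an assumed round number, since the abstract does not quote one:

```python
# Rough coverage-gain estimate from the three data-reduction steps mentioned:
# 14-to-8 bit conversion, 2x2 pixel binning (when selected) and lossless
# compression. The ~2:1 compression ratio is an assumed round number.
raw_bits = 28e9           # largest single observation, bits

bit_depth_factor = 14 / 8
binning_factor = 2 * 2
compression_factor = 2.0  # assumed

total_factor = bit_depth_factor * binning_factor * compression_factor
print(f"~{raw_bits / total_factor / 1e9:.1f} Gb downlinked per 28 Gb observation "
      f"({total_factor:.0f}x reduction)")
```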
Reticle stage based linear dosimeter
Berger, Kurt W [Livermore, CA
2007-03-27
A detector to measure EUV intensity employs a linear array of photodiodes. The detector is particularly suited for photolithography systems that includes: (i) a ringfield camera; (ii) a source of radiation; (iii) a condenser for processing radiation from the source of radiation to produce a ringfield illumination field for illuminating a mask; (iv) a reticle that is positioned at the ringfield camera's object plane and from which a reticle image in the form of an intensity profile is reflected into the entrance pupil of the ringfield camera, wherein the reticle moves in a direction that is transverse to the length of the ringfield illumination field that illuminates the reticle; (v) detector for measuring the entire intensity along the length of the ringfield illumination field that is projected onto the reticle; and (vi) a wafer onto which the reticle imaged is projected from the ringfield camera.
Reticle stage based linear dosimeter
Berger, Kurt W.
2005-06-14
A detector to measure EUV intensity employs a linear array of photodiodes. The detector is particularly suited for photolithography systems that includes: (i) a ringfield camera; (ii) a source of radiation; (iii) a condenser for processing radiation from the source of radiation to produce a ringfield illumination field for illuminating a mask; (iv) a reticle that is positioned at the ringfield camera's object plane and from which a reticle image in the form of an intensity profile is reflected into the entrance pupil of the ringfield camera, wherein the reticle moves in a direction that is transverse to the length of the ringfield illumination field that illuminates the reticle; (v) detector for measuring the entire intensity along the length of the ringfield illumination field that is projected onto the reticle; and (vi) a wafer onto which the reticle imaged is projected from the ringfield camera.
Clinical Gait Evaluation of Patients with Lumbar Spine Stenosis.
Sun, Jun; Liu, Yan-Cheng; Yan, Song-Hua; Wang, Sha-Sha; Lester, D Kevin; Zeng, Ji-Zhou; Miao, Jun; Zhang, Kuan
2018-02-01
The third generation Intelligent Device for Energy Expenditure and Activity (IDEEA3, MiniSun, CA) has been developed for clinical gait evaluation, and this study was designed to evaluate the accuracy and reliability of IDEEA3 for the gait measurement of lumbar spinal stenosis (LSS) patients. Twelve healthy volunteers were recruited to compare gait cycle, cadence, step length, velocity, and number of steps between a motion analysis system and a high-speed video camera. Twenty hospitalized LSS patients were recruited for the comparison of the five parameters between the IDEEA3 and GoPro camera. Paired t-test, intraclass correlation coefficient, concordance correlation coefficient, and Bland-Altman plots were used for the data analysis. The ratios of GoPro camera results to motion analysis system results, and the ratios of IDEEA3 results to GoPro camera results were all around 1.00. All P-values of paired t-tests for gait cycle, cadence, step length, and velocity were greater than 0.05, while all the ICC and CCC results were above 0.950 with P < 0.001. The measurements for gait cycle, cadence, step length, velocity, and number of steps with the GoPro camera are highly consistent with the measurements with the motion analysis system. The measurements for IDEEA3 are consistent with those for the GoPro camera. IDEEA3 can be effectively used in the gait measurement of LSS patients. © 2018 Chinese Orthopaedic Association and John Wiley & Sons Australia, Ltd.
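For reference, two of the agreement statistics named above, Lin's concordance correlation coefficient and Bland-Altman limits of agreement, have compact standard forms; the sketch below is a generic implementation with made-up step-length data, not the analysis of the paper:

```python
# Generic implementations of two agreement statistics named above: Lin's
# concordance correlation coefficient and Bland-Altman limits of agreement.
# The step-length data are made up; this is not the analysis of the paper.
import numpy as np

def concordance_cc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

def bland_altman(x, y):
    d = np.asarray(x, float) - np.asarray(y, float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

device_a = [0.61, 0.58, 0.64, 0.55, 0.60, 0.62]   # step length (m), device A
device_b = [0.60, 0.59, 0.63, 0.56, 0.61, 0.61]   # step length (m), device B
print("CCC =", round(concordance_cc(device_a, device_b), 3))
print("bias and 95% limits of agreement:", bland_altman(device_a, device_b))
```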
Investigation of the flow structure in thin polymer films using 3D µPTV enhanced by GPU
NASA Astrophysics Data System (ADS)
Cavadini, Philipp; Weinhold, Hannes; Tönsmann, Max; Chilingaryan, Suren; Kopmann, Andreas; Lewkowicz, Alexander; Miao, Chuan; Scharfer, Philip; Schabel, Wilhelm
2018-04-01
To understand the effects of inhomogeneous drying on the quality of polymer coatings, an experimental setup to resolve the flow field occurring throughout the drying film has been developed. Deconvolution microscopy is used to analyze the flow field in 3D and time. Since the dimension of the spatial component in the direction of the line of sight is limited compared to the lateral components, a multi-focal approach is used. Here, the beam of light is distributed equally among up to five cameras using cubic beam splitters. Adding a meniscus lens between each pair of camera and beam splitter and setting different distances between each camera and its meniscus lens creates multi-focality and allows one to increase the depth of the observed volume. Resolving the spatial component in the line-of-sight direction is based on analyzing the point spread function. The analysis of the PSF is computationally expensive and introduces a high complexity compared to traditional particle image velocimetry approaches. A new algorithm tailored to the parallel computing architecture of recent graphics processing units has been developed. The algorithm is able to process typical images in less than a second and has further potential to realize online analysis in the future. As a proof of principle, the flow fields occurring in thin polymer solutions drying at ambient conditions and at boundary conditions that force inhomogeneous drying are presented.
Fine Guidance Sensing for Coronagraphic Observatories
NASA Technical Reports Server (NTRS)
Brugarolas, Paul; Alexander, James W.; Trauger, John T.; Moody, Dwight C.
2011-01-01
Three options have been developed for Fine Guidance Sensing (FGS) for coronagraphic observatories using a Fine Guidance Camera within a coronagraphic instrument. Coronagraphic observatories require very fine precision pointing in order to image faint objects at very small distances from a target star. The Fine Guidance Camera measures the direction to the target star. The first option, referred to as Spot, was to collect all of the light reflected from a coronagraph occulter onto a focal plane, producing an Airy-type point spread function (PSF). This would allow almost all of the starlight from the central star to be used for centroiding. The second approach, referred to as Punctured Disk, collects the light that bypasses a central obscuration, producing a PSF with a punctured central disk. The final approach, referred to as Lyot, collects light at the Lyot stop after it has passed through the occulter. The study includes generation of representative images for each option by the science team, followed by an engineering evaluation of a centroiding or a photometric algorithm for each option. After alignment of the coronagraph to the fine guidance system, a "nulling" point on the FGS focal plane is determined by calibration. This alignment is implemented by a fine alignment mechanism that is part of the fine guidance camera selection mirror. If the star images meet the modeling assumptions, and the star "centroid" can be driven to that nulling point, the contrast for the coronagraph will be maximized.
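Whatever the optical option, the sensing step reduces to locating the stellar image on the FGS detector. Below is a minimal intensity-weighted centroid sketch on a synthetic spot; the window size and background handling are illustrative choices, not the specific algorithms evaluated in the study:

```python
# Minimal sketch of intensity-weighted centroiding on a fine-guidance image.
# The window size and background handling are illustrative choices, not the
# specific algorithms evaluated for the three sensing options.
import numpy as np

def centroid(img):
    """Background-subtracted centre of mass (row, col) of an image window."""
    img = np.asarray(img, float)
    img = np.clip(img - np.median(img), 0.0, None)   # crude background removal
    rows, cols = np.indices(img.shape)
    total = img.sum()
    return (rows * img).sum() / total, (cols * img).sum() / total

# Synthetic Gaussian star image offset from the window centre.
yy, xx = np.mgrid[0:32, 0:32]
spot = np.exp(-((yy - 16.3) ** 2 + (xx - 15.6) ** 2) / (2 * 2.0 ** 2)) + 0.01
frame = np.random.default_rng(2).poisson(5000 * spot).astype(float)

print("centroid (row, col):", centroid(frame))   # expected near (16.3, 15.6)
```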
KWFC: four square degrees camera for the Kiso Schmidt Telescope
NASA Astrophysics Data System (ADS)
Sako, Shigeyuki; Aoki, Tsutomu; Doi, Mamoru; Ienaka, Nobuyuki; Kobayashi, Naoto; Matsunaga, Noriyuki; Mito, Hiroyuki; Miyata, Takashi; Morokuma, Tomoki; Nakada, Yoshikazu; Soyano, Takao; Tarusawa, Ken'ichi; Miyazaki, Satoshi; Nakata, Fumiaki; Okada, Norio; Sarugaku, Yuki; Richmond, Michael W.
2012-09-01
The Kiso Wide Field Camera (KWFC) is a facility instrument for the 105-cm Schmidt telescope being operated by the Kiso Observatory of the University of Tokyo. This camera has been designed for wide-field observations by taking advantage of a large focal-plane area of the Schmidt telescope. Eight CCD chips with a total of 8k x 8k pixels cover a field-of-view of 2.2 degrees x 2.2 degrees on the sky. The dewar window works as a field flattener lens minimizing an image distortion across the field of view. Two shutter plates moving in parallel achieve uniform exposures on all the CCD pixels. The KWFC is equipped with a filter exchanger composed of an industrial robotic arm, a filter magazine capable of storing 12 filters, and a filter holder at the focal plane. Both the arm and the magazine are installed inside the tube framework of the telescope but without vignetting the beam. Wide-field survey programs searching for supernovae and late-type variable stars have begun in April 2012. The survey observations are performed with a management software system for facility instruments including the telescope and the KWFC. This system automatically carries out observations based on target lists registered in advance and makes appropriate decisions for implementation of observations by referring to weather conditions and status of the instruments. Image data obtained in the surveys are processed with pipeline software in real time to search for candidates of time-variable sources.
Microelectromechanical-System-Based Variable-Focus Liquid Lens for Capsule Endoscopes
NASA Astrophysics Data System (ADS)
Seo, Sang Won; Han, Seungoh; Seo, Jun Ho; Kim, Young Mok; Kang, Moon Sik; Min, Nam Ki; Choi, Woo Beom; Sung, Man Young
2009-05-01
A liquid lens based on the electrowetting phenomenon was designed to be cylindrical to minimize dead area. The lens was fabricated with microelectromechanical-system (MEMS) technology using silicon thin film and wafer bonding processes. A multiple dielectric layer comprising Teflon, silicon nitride, and thermal oxide was formed on the cylinder wall. With a change of 11 Vrms in the applied bias, the lens module, including the fabricated liquid lens, showed a focal length change of approximately 166 mm. A capsule endoscope was assembled, including the lens module, and was successfully used to take images of a pig colon at various focal lengths.
Direct generation of abruptly focusing vortex beams using a 3/2 radial phase-only pattern.
Davis, Jeffrey A; Cottrell, Don M; Zinn, Jonathan M
2013-03-20
Abruptly focusing Airy beams have previously been generated using a radial cubic phase pattern that represents the Fourier transform of the Airy beam. The Fourier transform of this pattern is formed using a system length of 2f, where f is the focal length of the Fourier transform lens. In this work, we directly generate these abruptly focusing Airy beams using a 3/2 radial phase pattern encoded onto a liquid crystal display. The resulting optical system is much shorter. In addition, we can easily produce vortex patterns at the focal point of these beams. Experimental results match theoretical predictions.
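The pattern in question is a phase-only profile phi(r, theta) = a*r^(3/2) + m*theta, wrapped modulo 2*pi for display on a liquid-crystal device. A short sketch with an arbitrary scaling constant and vortex charge (both are illustration values, not those used in the experiments):

```python
# Sketch of the phase-only pattern described above: a 3/2 radial phase plus an
# optional vortex term, phi(r, theta) = a * r**1.5 + m * theta, wrapped to 2*pi.
# The scaling constant, vortex charge and grid size are illustrative values.
import numpy as np

def radial_32_vortex_phase(n_pix=512, a=40.0, charge=2):
    y, x = np.mgrid[-1:1:n_pix * 1j, -1:1:n_pix * 1j]   # normalized pupil coordinates
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    return np.mod(a * r ** 1.5 + charge * theta, 2 * np.pi)

pattern = radial_32_vortex_phase()
gray = np.uint8(255 * pattern / (2 * np.pi))   # 8-bit gray levels for an LC display
print(gray.shape, gray.min(), gray.max())
```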
Lenticular accommodation in relation to ametropia: the chick model.
Choh, Vivian; Sivak, Jacob G
2005-03-04
Our goal was to determine whether experimentally induced ametropias have an effect on lenticular accommodation and spherical aberration. Form-deprivation myopia and hyperopia were induced in one eye of hatchling chicks by application of a translucent goggle and +15 D lens, respectively. After 7 days, eyes were enucleated and lenses were optically scanned prior to accommodation, during accommodation, and after accommodation. Accommodation was induced by electrical stimulation of the ciliary nerve. Lenticular focal lengths for form-deprived eyes were significantly shorter than for their controls and accommodation-associated changes in focal length were significantly smaller in myopic eyes compared to their controls. For eyes imposed with +15 D blur, focal lengths were longer than those for their controls and accommodative changes were greater. Spherical aberration of the lens increased with accommodation in both form-deprived and lens-treated birds, but induction of ametropia had no effect on lenticular spherical aberration in general. Nonmonotonicity from lenticular spherical aberration increased during accommodation but effects of refractive error were equivocal. The crystalline lens contributes to refractive error changes of the eye both in the case of myopia and hyperopia. These changes are likely attributable to global changes in the size and shape of the eye.
Low Cost DIY Lenses kit For High School Teaching
NASA Astrophysics Data System (ADS)
Thepnurat, Meechai; Saphet, Parinya; Tong-on, Anusorn
2017-09-01
A set of lenses was fabricated from low-cost materials in a DIY (do it yourself) process. The purpose was to demonstrate to teachers and students in high schools how to construct lenses by themselves from locally available materials. The lenses can be used in teaching physics about the nature of a lens, such as its focal length and the paths of light rays passing through it in either direction, employing a set of simple laser pointers. The instrument kit was made from transparent 2-mm-thick acrylic Perspex. It was cut into rectangular pieces of 2 cm x 15 cm and bent into a curved shape with a hot air blower on a cylindrical wooden rod with curvature radii of about 3-4.5 cm. A pair of these Perspex pieces was then formed into a hollow thick lens with a supporting base platform, so that any appropriate liquid could be filled in. The focal length of the lens was measured from the laser beam traces drawn on paper. The refractive index n of the filling liquid could then be calculated from the measured focal length f. The kit is low cost and DIY, yet readily applicable to optics teaching in the high school laboratory.
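For a thin, symmetric biconvex liquid lens, the lensmaker's equation reduces to 1/f ≈ 2(n-1)/R, so the liquid's refractive index follows directly from the measured focal length and the bending radius. A hedged sketch that ignores the thin acrylic walls and uses illustrative numbers rather than measurements from the kit:

```python
# Estimating the refractive index of the filling liquid from the measured focal
# length with the thin-lens lensmaker's equation for a symmetric biconvex shell:
# 1/f ~ 2*(n - 1)/R. The thin acrylic walls are neglected and the numbers are
# illustrative, not measurements from the kit.
def refractive_index(focal_length_cm, radius_cm):
    return 1.0 + radius_cm / (2.0 * focal_length_cm)

# Water-filled lens bent on a rod of ~4 cm radius, focal length read off as ~6 cm.
print(f"n ~ {refractive_index(focal_length_cm=6.0, radius_cm=4.0):.2f}")
```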
Design of a nanopatterned long focal-length planar focusing collector for concentrated solar power
NASA Astrophysics Data System (ADS)
Ding, Qing; Choubal, Aakash; Toussaint, Kimani C.
2017-02-01
Concentrated solar power (CSP) facilities heavily utilize parabolic troughs to collect and concentrate sunlight onto receivers that deliver solar thermal energy to heat engines for generating electricity. However, parabolic troughs are bulky and heavy and result in a large capital investment for CSP plants, thereby making it difficult for CSP technology to be competitive with photovoltaics. We present the design of a planar focusing collector (PFC) with a focal length beyond the micron scale. The PFC design is based on the use of a nanostructured silver surface for linearly polarized single-wavelength light. The designed PFC consists of metallic nanogrooves on a dielectric substrate. The geometric properties, namely the width and depth, of a single-unit nanogroove allow for full control of the optical phase at desired spatial coordinates along the nanogroove short axis for a single wavelength. Moreover, we show numerically that such phase control can be used to construct a phase front that mimics that of a cylindrical lens. In addition, we determine the concentration ratio by comparing the width of our PFC design to the cross-sectional width of its focal spot. We also determine the conversion efficiency at long focal lengths by evaluating the ratio of the collected optical power to the incoming optical power. Finally, we examine the focusing behavior across multiple wavelengths and angles of incidence. Our work shows how nano-optics and plasmonics could contribute to this important area of CSP technology.
The statistics of calcium-mediated focal excitations on a one-dimensional cable.
Chen, Wei; Asfaw, Mesfin; Shiferaw, Yohannes
2012-02-08
It is well known that various cardiac arrhythmias are initiated by an ill-timed excitation that originates from a focal region of the heart. However, up to now, it is not known what governs the timing, location, and morphology of these focal excitations. Recent studies have shown that these excitations can be caused by abnormalities in the calcium (Ca) cycling system. However, the cause-and-effect relationships linking subcellular Ca dynamics and focal activity in cardiac tissue are not completely understood. In this article, we present a minimal model of Ca-mediated focal excitations in cardiac tissue. This model accounts for the stochastic nature of spontaneous Ca release on a one-dimensional cable of cardiac cells. Using this model, we show that the timing of focal excitations is equivalent to a first passage time problem in a spatially extended system. In particular, we find that for a short cable the mean first passage time increases exponentially with the number of cells in the tissue, and is critically dependent on the ratio of inward to outward currents near the threshold for an action potential. For long cables, excitations occur at ectopic foci on a length scale determined by the minimum length of tissue that can induce an action potential. Furthermore, we find that for long cables the mean first passage time decreases as a power law in the number of cells. These results provide precise criteria for the occurrence of focal excitations in cardiac tissue, and will serve as a guide to determine the propensity of Ca-mediated triggered arrhythmias in the heart. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Rieke-Zapp, D.; Tecklenburg, W.; Peipe, J.; Hastedt, H.; Haig, Claudia
Recent tests on the geometric stability of several digital cameras that were not designed for photogrammetric applications have shown that the accomplished accuracies in object space are either limited or that the accuracy potential is not exploited to the fullest extent. A total of 72 calibrations were calculated with four different software products for eleven digital camera models with different hardware setups, some with mechanical fixation of one or more parts. The calibration procedure was chosen in accordance with a German guideline for the evaluation of optical 3D measuring systems [VDI/VDE, VDI/VDE 2634 Part 1, 2002. Optical 3D Measuring Systems-Imaging Systems with Point-by-point Probing. Beuth Verlag, Berlin]. All images were taken with ringflashes, which is considered a standard method for close-range photogrammetry. In cases where the flash was mounted to the lens, the force exerted on the lens tube and the camera mount greatly reduced the accomplished accuracy. Mounting the ringflash to the camera instead resulted in a large improvement of accuracy in object space. For standard calibration, the best accuracies in object space were accomplished with a Canon EOS 5D and a 35 mm Canon lens whose focusing tube was fixed with epoxy (47 μm maximum absolute length measurement error in object space). The fixation of the Canon lens was fairly easy and inexpensive, resulting in a sevenfold increase in accuracy compared with the same lens type without modification. A similar accuracy was accomplished with a Nikon D3 when mounting the ringflash to the camera instead of the lens (52 μm maximum absolute length measurement error in object space). Parameterisation of geometric instabilities by introduction of an image-variant interior orientation in the calibration process improved results for most cameras. In this case, a modified Alpa 12 WA yielded the best results (29 μm maximum absolute length measurement error in object space). Extending the parameter model with FiBun software to model not only an image-variant interior orientation but also deformations in the sensor domain of the cameras showed significant improvements only for a small group of cameras. The Nikon D3 camera yielded the best overall accuracy (25 μm maximum absolute length measurement error in object space) with this calibration procedure, indicating at the same time the presence of image-invariant error in the sensor domain. Overall, the calibration results showed that digital cameras can be applied for an accurate photogrammetric survey and that only a little effort was sufficient to greatly improve the accuracy potential of digital cameras.
Miniature infrared hyperspectral imaging sensor for airborne applications
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl
2017-05-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, in both MWIR and LWIR versions, small enough to serve as a payload on a miniature unmanned aerial vehicle. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to the infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics consist of an area array of diffractive optical elements where each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough to serve as a payload on a mini-UAV or commercial quadcopter. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.
Infrared hyperspectral imaging miniaturized for UAV applications
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl
2017-02-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, in both MWIR and LWIR versions, small enough to serve as a payload on a miniature unmanned aerial vehicle. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to the infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics consist of an area array of diffractive optical elements where each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough to serve as a payload on a mini-UAV or commercial quadcopter. We also present an example of how this technology can be used to quantify the volume and mass flow rates of a hydrocarbon gas leak. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.
NASA Astrophysics Data System (ADS)
Gacal, G. F. B.; Lagrosas, N.
2016-12-01
Nowadays, cameras are commonly used by students. In this study, we use this instrument to look at moon signals and relate these signals to Gaussian functions. To implement this as a classroom activity, students need computers, software to visualize signals, and moon images. A normalized Gaussian function is often used to represent the probability density function of a normal distribution. It is described by its mean m and standard deviation s; a smaller standard deviation implies less spread about the mean. For the 2-dimensional Gaussian function, the mean can be described by coordinates (x0, y0), while the standard deviations can be described by sx and sy. In modelling moon signals obtained from sky cameras, the position of the mean (x0, y0) is found by locating the coordinates of the maximum signal of the moon. The two standard deviations are the mean-square weighted deviations based on the sums of the total pixel values of all rows/columns. If visualized in three dimensions, the 2D Gaussian function appears as a 3D bell surface (Fig. 1a). This shape is similar to the pixel value distribution of moon signals as captured by a sky camera. An example of this is illustrated in Fig. 1b, taken around 22:20 (local time) on January 31, 2015. The local time is 8 hours ahead of coordinated universal time (UTC). This image was produced by a commercial camera (Canon Powershot A2300) with 1 s exposure time, f-stop of f/2.8, and 5 mm focal length. One has to choose a camera with high sensitivity for nighttime operation to effectively detect these signals. Fig. 1b is obtained by converting the red-green-blue (RGB) photo to grayscale values. The grayscale values are then converted to a double data type matrix; this last conversion is done so that the Gaussian model and the pixel distribution of the raw signals share the same scale. Subtracting the Gaussian model from the raw data produces a moonless image, as shown in Fig. 1c. This moonless image can be used for quantifying cloud cover as captured by ordinary cameras (Gacal et al., 2016). Cloud cover can be defined as the ratio of the number of pixels whose values exceed 0.07 to the total number of pixels. In this particular image, the cloud cover value is 0.67.
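A minimal sketch of the moon-removal and cloud-cover procedure described above (the width estimation and thresholding are simplified assumptions, not the authors' exact implementation):

```python
# Sketch: model the moon as a 2-D Gaussian centred on the brightest pixel,
# subtract it from the grayscale image, and compute cloud cover as the
# fraction of pixels above 0.07. Width estimation here is a simplification.
import numpy as np

def remove_moon_and_cloud_cover(img: np.ndarray, threshold: float = 0.07):
    """img: grayscale sky image scaled to [0, 1] (double data type)."""
    ny, nx = img.shape
    y0, x0 = np.unravel_index(np.argmax(img), img.shape)   # Gaussian mean (row, col)
    amp = img[y0, x0]

    # Mean-square weighted deviations of the column/row sums give sx and sy.
    x = np.arange(nx); y = np.arange(ny)
    col_sum = img.sum(axis=0); row_sum = img.sum(axis=1)
    sx = np.sqrt(np.sum(col_sum * (x - x0) ** 2) / np.sum(col_sum))
    sy = np.sqrt(np.sum(row_sum * (y - y0) ** 2) / np.sum(row_sum))

    xx, yy = np.meshgrid(x, y)
    moon_model = amp * np.exp(-((xx - x0) ** 2 / (2 * sx ** 2)
                                + (yy - y0) ** 2 / (2 * sy ** 2)))

    moonless = np.clip(img - moon_model, 0.0, 1.0)
    cloud_cover = np.count_nonzero(moonless > threshold) / moonless.size
    return moonless, cloud_cover
```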
The Mars NetLander panoramic camera
NASA Astrophysics Data System (ADS)
Jaumann, Ralf; Langevin, Yves; Hauber, Ernst; Oberst, Jürgen; Grothues, Hans-Georg; Hoffmann, Harald; Soufflot, Alain; Bertaux, Jean-Loup; Dimarellis, Emmanuel; Mottola, Stefano; Bibring, Jean-Pierre; Neukum, Gerhard; Albertz, Jörg; Masson, Philippe; Pinet, Patrick; Lamy, Philippe; Formisano, Vittorio
2000-10-01
The panoramic camera (PanCam) imaging experiment is designed to obtain high-resolution multispectral stereoscopic panoramic images from each of the four Mars NetLander 2005 sites. The main scientific objectives to be addressed by the PanCam experiment are (1) to locate the landing sites and support the NetLander network sciences, (2) to geologically investigate and map the landing sites, and (3) to study the properties of the atmosphere and of variable phenomena. To place in situ measurements at a landing site into a proper regional context, it is necessary to determine the lander orientation on the ground and to exactly locate the position of the landing site with respect to the available cartographic database. This is not possible by tracking alone due to the lack of on-ground orientation and the so-called map-tie problem. Images provided by the PanCam allow accurate tilt and north directions to be determined for each lander and the lander locations to be identified based on landmarks that can also be recognized in appropriate orbiter imagery. With this information, it will further be possible to improve the Mars-wide geodetic control point network and the resulting geometric precision of global map products. The major geoscientific objectives of the PanCam lander images are the recognition of surface features like ripples, ridges and troughs, and the identification and characterization of different rock and surface units based on their morphology, distribution, spectral characteristics, and physical properties. The analysis of the PanCam imagery will finally result in the generation of precise map products for each of the landing sites. So far, comparative geologic studies of the Martian surface have been restricted to the temporally separated Mars Pathfinder and the two Viking Lander missions. Further lander missions are in preparation (Beagle-2, Mars Surveyor 03). NetLander provides the unique opportunity to nearly double the number of accessible landing site data by providing simultaneous and long-term observations at four different surface locations, which becomes especially important for studies of variable surface features as well as properties and phenomena of the atmosphere. Major changes on the surface that can be detected by PanCam are caused by eolian activity and condensation processes, which directly reflect variations in the prevailing near-surface wind regime and the diurnal and seasonal volatile and dust cycles. Atmospheric studies will concentrate on the detection of clouds, measurements of the aerosol content and the water vapor absorption at 936 nm. In order to meet these objectives, the proposed PanCam instrument is a highly miniaturized, dedicated stereo and multispectral imaging device. The camera consists of two identical camera cubes, which are arranged in a common housing at a fixed stereo base length of 11 cm. Each camera cube is equipped with a CCD frame-transfer detector with 1024×1024 active pixels and optics with a focal length of 13 mm, yielding a field of view of 53°×53° and an instantaneous field of view of 1.1 mrad. A filter swivel with six positions provides different color band passes in the wavelength range of 400-950 nm. The camera head is mounted on top of a deployable scissors boom and can be rotated by 360° to obtain a full panorama, which is covered by as few as eight images. The boom raises the camera head to a final altitude of 90 cm above the surface. Most camera activities will take place within the first week and the first month of the mission.
During the remainder of the mission, the camera will operate with a reduced data rate to monitor time-dependent variations on a daily basis. PanCam is a joint German/French project with contributions from DLR, Institute of Space Sensor Technology and Planetary Exploration, Berlin, Institut d'Astrophysique Spatiale, CNRS, Orsay, and Service d'Aéronomie, CNRS, Verrières-le-Buisson.
Cai, Xi; Han, Guang; Song, Xin; Wang, Jinkuan
2017-11-01
Single-camera-based gait monitoring is unobtrusive, inexpensive, and easy to use for monitoring the daily gait of seniors in their homes. However, most studies require subjects to walk perpendicularly to the camera's optical axis or along specified routes, which limits its application in elderly home monitoring. To build unconstrained monitoring environments, we propose a method to measure the step length symmetry ratio (a useful gait parameter representing gait symmetry without a significant relationship with age) from unconstrained straight walking using a single camera, without strict restrictions on walking directions or routes. According to projective geometry theory, we first develop a calculation formula for the step length ratio for the case of unconstrained straight-line walking. Then, to adapt to general cases, we propose to modify noncollinear footprints, and accordingly provide a general procedure for step length ratio extraction from unconstrained straight walking. Our method achieves a mean absolute percentage error (MAPE) of 1.9547% for 15 subjects' normal and abnormal side-view gaits, and also obtains satisfactory MAPEs for non-side-view gaits (2.4026% for 45°-view gaits and 3.9721% for 30°-view gaits). The performance is much better than that of a well-established monocular gait measurement system suitable only for side-view gaits, which has a MAPE of 3.5538%. Independently of walking directions, our method can accurately estimate step length ratios from unconstrained straight walking. This demonstrates that our method is applicable for elders' daily gait monitoring and can provide valuable information for elderly health care, such as abnormal gait recognition and fall risk assessment.
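For illustration only (a generic ground-plane homography treatment under assumed inputs, not the authors' projective-geometry formulation), step lengths and their symmetry ratio can be recovered from image footprint coordinates once an image-to-ground homography is known:

```python
# Illustrative sketch: given a known image-to-ground-plane homography H and the
# image coordinates of successive footprints, recover ground positions, step
# lengths, and a step length symmetry ratio. H and the footprints are assumed
# inputs obtained from a separate calibration / detection step.
import numpy as np

def to_ground(H: np.ndarray, pts_img: np.ndarray) -> np.ndarray:
    """Apply a 3x3 homography to Nx2 image points, return Nx2 ground points."""
    pts_h = np.hstack([pts_img, np.ones((len(pts_img), 1))])
    mapped = (H @ pts_h.T).T
    return mapped[:, :2] / mapped[:, 2:3]

def step_length_symmetry_ratio(H: np.ndarray, footprints_img: np.ndarray) -> float:
    """Footprints ordered left-right-left-...; ratio of mean alternate step lengths."""
    ground = to_ground(H, np.asarray(footprints_img, dtype=float))
    steps = np.linalg.norm(np.diff(ground, axis=0), axis=1)
    return steps[0::2].mean() / steps[1::2].mean()
```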
Realization of integral 3-dimensional image using fabricated tunable liquid lens array
NASA Astrophysics Data System (ADS)
Lee, Muyoung; Kim, Junoh; Kim, Cheol Joong; Lee, Jin Su; Won, Yong Hyub
2015-03-01
Electrowetting has been widely studied for various optical applications such as optical switches, sensors, prisms, and displays. In this study, a vari-focal liquid lens array is developed using the electrowetting principle to construct integral 3-dimensional imaging. The electrowetting principle, in which the surface tension is changed by an applied voltage, has several advantages for realizing active optical devices, such as fast response time, low electrical consumption, and no mechanical moving parts. Two immiscible liquids, water and oil, are used to form the lens. By applying a voltage to the water, the focal length of the lens can be tuned as the contact angle of the water changes. The fabricated electrowetting vari-focal liquid lens array consists of 1 mm diameter spherical lenses with a 1.6 mm pitch between lenses. The number of lenses on the panel is 23 x 23, and the optical power of the lens array is tuned simultaneously from -125 to 110 diopters depending on the applied voltage. The fabricated lens array is implemented in integral 3-dimensional imaging. A 3D object is reconstructed by the fabricated liquid lens array with 23 x 23 elemental images generated by 3D max tools when the liquid lens array is tuned to a convex state. From the integral imaging system implemented with the vari-focal liquid lens array, we expect that depth-enhanced integral imaging can be realized in the near future.
13.5 nm High Harmonic Generation Driven by a Visible Noncollinear Optical Parametric Amplifier
2011-11-11
compressed through a CaF2 prism pair at Brewster angle, and directed to the second OPA stage after a periscope flipping its polarization. The 90% part of...FWHM pulse duration. HHG setup The OPA pulses are sent into a vacuum chamber and focused in an Argon (lens focal length 150 mm) or Helium (focal
Design optimization of ultra-precise elliptical mirrors for hard x-ray nanofocusing at Nanoscopium
NASA Astrophysics Data System (ADS)
Kewish, Cameron M.; Polack, François; Signorato, Riccardo; Somogyi, Andrea
2013-09-01
The design and implementation of a pair of 100 mm-long grazing-incidence total-reflection mirrors for the hard X-ray beamline Nanoscopium at Synchrotron Soleil is presented. A vertically and horizontally nanofocusing mirror pair, oriented in Kirkpatrick-Baez geometry, has been designed and fabricated with the aim of creating a diffraction-limited high-intensity 5 - 20 keV beam with a focal spot size as small as 50 nm. We describe the design considerations, including wave-optical calculations of figures-of-merit that are relevant for spectromicroscopy, such as the focal spot size, depth of field and integrated intensity. The mechanical positioning tolerance in the pitch angle that is required to avoid introducing high-intensity features in the neighborhood of the focal spot is demonstrated with simulations to be of the order of microradians, becoming tighter for shorter focal lengths and therefore directly affecting all nanoprobe mirror systems. Metrology results for the completed mirrors are presented, showing that better than 1.5 Å-rms figure error has been achieved over the full mirror lengths with respect to the designed elliptical surfaces, with less than 60 nrad-rms slope errors.
Prototypic Development and Evaluation of a Medium Format Metric Camera
NASA Astrophysics Data System (ADS)
Hastedt, H.; Rofallski, R.; Luhmann, T.; Rosenbauer, R.; Ochsner, D.; Rieke-Zapp, D.
2018-05-01
Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2-3 m in each direction) and large volumes (around 20 x 20 x 1-10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1-0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, focusing on large-volume applications, the availability of a metric camera would have several advantages: 1) high-quality optical components and stabilisations allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables a priori camera calibration, and 3) a higher resulting precision can be expected. This article presents the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric. Its general accuracy potential is tested against calibrated lengths in a small-volume test environment based on the German guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm were achieved across the different scenarios tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements for deformation estimation on a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2 mm-0.4 mm is reached over a length of 28 m (given by a distance from a laser tracker network measurement). All analyses have proven high stability of the interior orientation of the camera and indicate the applicability of a priori camera calibration for subsequent 3D measurements.
NASA Technical Reports Server (NTRS)
Stefanov, William L.
2017-01-01
The NASA Earth observations dataset obtained by humans in orbit using handheld film and digital cameras is freely accessible to the global community through the online searchable database at https://eol.jsc.nasa.gov, and offers a useful complement to traditional ground-commanded sensor data. The dataset includes imagery from the NASA Mercury (1961) through present-day International Space Station (ISS) programs, and currently totals over 2.6 million individual frames. Geographic coverage of the dataset includes land and ocean areas between approximately 52 degrees North and South latitude, but is spatially and temporally discontinuous. The photographic dataset includes some significant impediments to immediate research, applied, and educational use: commercial RGB films and camera systems with overlapping bandpasses; use of different focal length lenses, unconstrained look angles, and variable spacecraft altitudes; and no native geolocation information. Such factors led to this dataset being underutilized by the community, but recent advances in automated and semi-automated image geolocation, image feature classification, and web-based services are adding new value to the astronaut-acquired imagery. A coupled ground-software and on-orbit hardware system for the ISS is in development for planned deployment in mid-2017; this system will capture camera pose information for each astronaut photograph to allow automated, full georegistration of the data. The ground component of the system is currently in use to fully georeference imagery collected in response to International Disaster Charter activations, and the auto-registration procedures are being applied to the extensive historical database of imagery to add value for research and educational purposes. In parallel, machine learning techniques are being applied to automate feature identification and classification throughout the dataset, in order to build descriptive metadata that will improve search capabilities. It is expected that these value additions will increase interest in and use of the dataset by the global community.
SO2 camera measurements at Lastarria volcano and Lascar volcano in Chile
NASA Astrophysics Data System (ADS)
Lübcke, Peter; Bobrowski, Nicole; Dinger, Florian; Klein, Angelika; Kuhn, Jonas; Platt, Ulrich
2015-04-01
The SO2 camera is a remote-sensing technique that measures volcanic SO2 emissions via the strong SO2 absorption structures in the UV, using scattered solar radiation as a light source. The 2D imagery (usually recorded with a frame rate of up to 1 Hz) allows new insights into degassing processes of volcanoes. Besides the large advantage of high-frequency sampling, the spatial resolution allows investigation of SO2 emissions from individual fumaroles and not only the total SO2 emission flux of a volcano, which is often dominated by the volcanic plume. Here we present SO2 camera measurements that were made during the CCVG workshop in Chile in November 2014. Measurements were performed at Lastarria volcano, a 5700 m high stratovolcano, and Lascar volcano, a 5600 m high stratovolcano, both in northern Chile, on 21-22 November 2014 and on 26-27 November 2014, respectively. At both volcanoes measurements were conducted from a distance of roughly 6-7 km under close to ideal conditions (low solar zenith angle, a very dry and cloudless atmosphere, and an only slightly condensed plume). However, determination of absolute SO2 emission rates proved challenging as part of the volcanic plume hovered close to the ground; the volcanic plume therefore appears in front of the mountain in our camera images. An SO2 camera system consisting of a UV-sensitive CCD and two UV band-pass filters (centered at 315 nm and 330 nm) was used. The two band-pass filters are installed in a rotating wheel and images are taken with both filters sequentially. The instrument used a CCD with 1024 x 1024 pixels and an imaging area of 13.3 mm x 13.3 mm. In combination with the focal length of 32 mm this results in a field of view of 25° x 25°. The calibration of the instrument was performed with the help of a DOAS instrument that is co-aligned with the SO2 camera. We will present images and SO2 emission rates from both volcanoes. At Lastarria, gases are emitted from three different fumarole fields and we will attempt to investigate the degassing behavior of the individual fumaroles. Lascar volcano had only a very weak plume originating from the active central crater, with maximum SO2 column densities of only up to 5 × 10^17 molecules/cm^2 during our measurements. These low SO2 column densities, in combination with the almost ideal measurement conditions, will be used to assess the detection limit of our current SO2 camera system.
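As a quick pinhole-geometry check of the quoted field of view (illustrative only; it ignores distortion), the 13.3 mm imaging area behind the 32 mm focal length gives roughly the stated 25° per axis:

```python
# Quick pinhole-model check of the quoted field of view: a 13.3 mm x 13.3 mm
# imaging area behind a 32 mm focal length lens. The simple formula gives
# ~23.5 deg per axis, close to the ~25 deg quoted above.
import math

def fov_deg(sensor_size_mm: float, focal_length_mm: float) -> float:
    return math.degrees(2.0 * math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

print(f"{fov_deg(13.3, 32.0):.1f} deg per axis")
```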
Shake, Rattle and Roll: James Webb Telescope Components Pass Tests
NASA Technical Reports Server (NTRS)
2008-01-01
This image shows a model of one of three detectors for the Mid-Infrared Instrument on NASA's upcoming James Webb Space Telescope. The detector, which looks green in this picture, and is similar to the charge-coupled devices, or 'CCDs,' in digital cameras, is housed in the brick-like unit shown here, called a focal plane module.
NASA Astrophysics Data System (ADS)
Mazin, Ben
2014-07-01
Microwave Kinetic Inductance Detectors (MKIDs) are single photon counting, energy resolving detectors applicable across the UVOIR. The first MKID instrument, ARCONS, has been taking data on the Palomar 200" for several years, and we have recently published the first papers using ARCONS data. There are currently two UVOIR MKID instruments fully funded and under construction for planet hunting, DARKNESS for the Palomar P1640 coronagraph, and MEC for Subaru's SCExAO. There are significant opportunities available in pairing MKIDs with TMT. MKIDs can serve as a combined science camera and fast focal plane speckle sensor, allowing rapid feedback to cancel atmospheric speckles. A MKID-based TMT Planet Imager (potentially just a visiting SCExAO+MEC) could discover and take spectra of planets in the habitable zones of nearby M dwarfs, potentially discovering life by looking at spectral signatures in their atmospheres. Another promising application is using the outer part of the focal plane that is ignored by NFIRAOS for a large MKID array. This instrument would serve as a serendipitous camera, providing imaging and spectroscopy for galaxies that would rotate through the field during the normal use of IRIS and IRMOS. This "free" 30-m time would yield a very deep imaging catalog with R~30 spectroscopy.
Improved calibration-based non-uniformity correction method for uncooled infrared camera
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao
2017-08-01
With the latest improvements in microbolometer focal plane arrays (FPAs), uncooled infrared (IR) cameras are becoming the most widely used devices in thermography, especially in handheld devices. However, the influences of changing ambient conditions and the non-uniform response of the sensors make it more difficult to correct the non-uniformity of an uncooled infrared camera. In this paper, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera, a novel model is proposed for calibration-based non-uniformity correction (NUC). In this model, we introduce the FPA temperature, together with the responses of the microbolometers under different ambient temperatures, to calculate the correction parameters. Based on the proposed model, we can work out the correction parameters from calibration measurements under controlled ambient conditions with a uniform blackbody. All correction parameters can be determined after the calibration process and then be used to correct the non-uniformity of the infrared camera in real time. This paper presents the details of the compensation procedure and the performance of the proposed calibration-based non-uniformity correction method. Our method was evaluated on realistic IR images obtained by a 384x288 pixel uncooled long-wave infrared (LWIR) camera operated under changing ambient conditions. The results show that our method can remove the influence of the changing ambient conditions and ensure that the infrared camera has a stable performance.
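For comparison, a minimal two-point calibration-based NUC (per-pixel gain and offset derived from two uniform-blackbody frames) is sketched below; this is a simplified baseline, not the FPA-temperature-dependent model proposed in the paper:

```python
# Minimal two-point calibration-based NUC sketch: per-pixel gain/offset that
# maps each pixel's raw response onto the array mean, computed from two
# uniform-blackbody calibration frames at different temperatures.
import numpy as np

def two_point_nuc_parameters(frame_cold: np.ndarray, frame_hot: np.ndarray):
    """Return per-pixel gain and offset; assumes frame_hot != frame_cold pixelwise."""
    gain = (frame_hot.mean() - frame_cold.mean()) / (frame_hot - frame_cold)
    offset = frame_cold.mean() - gain * frame_cold
    return gain, offset

def correct(raw_frame: np.ndarray, gain: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Apply the correction to a raw frame acquired in operation."""
    return gain * raw_frame + offset
```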
NASA Technical Reports Server (NTRS)
Defrere, D.; Hinz, P.; Downey, E.; Boehm, M.; Danchi, W. C.; Durney, O.; Ertel, S.; Hill, J. M.; Hoffmann, W. F.; Mennesson, B.;
2016-01-01
The Large Binocular Telescope Interferometer (LBTI) uses a near-infrared camera to measure the optical path length variations between the two AO-corrected apertures and provide high-angular-resolution observations for all its science channels (1.5-13 microns). There is, however, a wavelength-dependent component to the atmospheric turbulence, which can introduce optical path length errors when observing at a wavelength different from that of the fringe-sensing camera. Water vapor in particular is highly dispersive, and its effect must be taken into account for high-precision infrared interferometric observations, as described previously for VLTI/MIDI or the Keck Interferometer Nuller. In this paper, we describe the new sensing approach that has been developed at the LBT to measure and monitor the optical path length fluctuations due to dry air and water vapor separately. After reviewing the current performance of the system for dry-air seeing compensation, we present simultaneous H-, K-, and N-band observations that illustrate the feasibility of our feed-forward approach to stabilize the path length fluctuations seen by the LBTI nuller.
NASA Astrophysics Data System (ADS)
Luquet, Ph.; Brouard, L.; Chinal, E.
2017-11-01
Astrium has developed a product line of compact and versatile instruments for HR and VHR missions in Earth observation. These cameras consist of a silicon carbide Korsch-type telescope, a focal plane with one or several retina modules - including five-line CCDs, optical filters and front-end electronics - and the instrument main electronics. Several versions have been developed with a telescope pupil diameter from 200 mm up to 650 mm, covering a large range of GSD (from 2.5 m down to sub-metric) and swath (from 10 km up to 30 km) and compatible with different types of platform. Nine cameras have already been manufactured for five different programs: ALSAT2 (Algeria), SSOT (Chile), SPOT6 & SPOT7 (France), KRS (Kazakhstan) and VNREDSat (Vietnam). Two of them have already been launched and are delivering high quality images.
CANICA: The Cananea Near-Infrared Camera at the 2.1 m OAGH Telescope
NASA Astrophysics Data System (ADS)
Carrasco, L.; Hernández Utrera, O.; Vázquez, S.; Mayya, Y. D.; Carrasco, E.; Pedraza, J.; Castillo-Domínguez, E.; Escobedo, G.; Devaraj, R.; Luna, A.
2017-10-01
The Cananea near-infrared camera (CANICA) is an instrument commissioned at the 2.12 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA operates in the near-infrared in multiple bands, including the J (1.24 μm), H (1.63 μm) and K' (2.12 μm) broad bands. CANICA is located at the Ritchey-Chrétien focal plane of the telescope, reimaging the f/12 beam into an f/6 beam. The detector is a 1024 × 1024 HgCdTe HAWAII array with 18.5 μm pixel size, covering a field of view of 5.5 × 5.5 arcmin2 for a plate scale of 0.32 arcsec/pixel. The camera is enclosed in a cryostat cooled with liquid nitrogen to 77 K. The cryostat contains the collimator, two 15-position filter wheels, a single fixed reimaging optic and the detector.
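A quick consistency check of the quoted optical parameters, using only the numbers in the abstract (illustrative arithmetic, not instrument documentation):

```python
# Plate scale implied by the field of view and pixel count, and the effective
# focal length implied by that plate scale and the 18.5 um pixels; the result
# lands near the quoted 0.32 arcsec/pixel and the quoted ~f/6 reimaged beam.
ARCSEC_PER_RAD = 206265.0

fov_arcsec = 5.5 * 60.0          # 5.5 arcmin field of view per axis
npix = 1024                      # HAWAII array pixels per axis
pixel_m = 18.5e-6                # pixel size in meters

plate_scale = fov_arcsec / npix                  # ~0.32 arcsec/pixel
f_eff = ARCSEC_PER_RAD * pixel_m / plate_scale   # ~11.8 m effective focal length
print(f"plate scale ~ {plate_scale:.2f} arcsec/pixel, "
      f"effective focal length ~ {f_eff:.1f} m (~f/{f_eff / 2.12:.1f} on a 2.12 m aperture)")
```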
Cryogenic solid Schmidt camera as a base for future wide-field IR systems
NASA Astrophysics Data System (ADS)
Yudin, Alexey N.
2011-11-01
This work studies the capability of a solid Schmidt camera to serve as a wide-field infrared lens for an aircraft system with whole-sphere coverage, working in the 8-14 um spectral range and coupled with a spherical focal array of megapixel class. Designs of a 16 mm f/0.2 lens with 60 and 90 degree sensor diagonals are presented, and their image quality is compared with a conventional solid design. An achromatic design with significantly improved performance, containing an enclosed soft correcting lens behind the protective front lens, is proposed. One of the main goals of the work is to estimate the benefits of curved detector arrays in wide-field systems for the 8-14 um spectral range. Coupling of the photodetector with the solid Schmidt camera by means of frustrated total internal reflection is considered, with a corresponding tolerance analysis. The whole lens, except the front element, is considered to be cryogenic, with the solid Schmidt unit to be flown by hydrogen for improvement of bulk transmission.
The CAOS camera platform: ushering in a paradigm change in extreme dynamic range imager design
NASA Astrophysics Data System (ADS)
Riza, Nabeel A.
2017-02-01
Multi-pixel imaging devices such as CCD, CMOS and Focal Plane Array (FPA) photo-sensors dominate the imaging world. These Photo-Detector Array (PDA) devices certainly have their merits including increasingly high pixel counts and shrinking pixel sizes, nevertheless, they are also being hampered by limitations in instantaneous dynamic range, inter-pixel crosstalk, quantum full well capacity, signal-to-noise ratio, sensitivity, spectral flexibility, and in some cases, imager response time. Recently invented is the Coded Access Optical Sensor (CAOS) Camera platform that works in unison with current Photo-Detector Array (PDA) technology to counter fundamental limitations of PDA-based imagers while providing high enough imaging spatial resolution and pixel counts. Using for example the Texas Instruments (TI) Digital Micromirror Device (DMD) to engineer the CAOS camera platform, ushered in is a paradigm change in advanced imager design, particularly for extreme dynamic range applications.
Initial astronomical results with a new 5-14 micron Si:Ga 58x62 DRO array camera
NASA Technical Reports Server (NTRS)
Gezari, Dan; Folz, Walter; Woods, Larry
1989-01-01
A new array camera system was developed using a 58 x 62 pixel Si:Ga (gallium doped silicon) DRO (direct readout) photoconductor array detector manufactured by Hughes/Santa Barbara Research Center (SBRC). The camera system is a broad band photometer designed for 5 to 14 micron imaging with large ground-based optical telescopes. In a typical application a 10 micron photon flux of about 10^9 photons s^-1 m^-2 micron^-1 arcsec^-2 is incident in the telescope focal plane, while the detector well capacity of these arrays is 10^5 to 10^6 electrons. However, when the real efficiencies and operating conditions are accounted for, the 2-channel 3596 pixel array operates with about 1/2 full wells at 10 microns and 10% bandwidth with high duty cycle and no real experimental compromises.
A solid state lightning propagation speed sensor
NASA Technical Reports Server (NTRS)
Mach, Douglas M.; Rust, W. David
1989-01-01
A device to measure the propagation speeds of cloud-to-ground lightning has been developed. The lightning propagation speed (LPS) device consists of eight solid-state silicon photodetectors mounted behind precision horizontal slits in the focal plane of a 50-mm lens on a 35-mm camera. Although the LPS device produces results similar to those obtained from a streaking camera, the LPS device has the advantages of smaller size, lower cost, mobile use, and easier data collection and analysis. The maximum accuracy of the LPS is 0.2 microseconds, compared with about 0.8 microseconds for the streaking camera. It is found that the return stroke propagation speed for triggered lightning is different from that for natural lightning if measurements are taken over channel segments less than 500 m. It is suggested that there are no significant differences between the propagation speeds of positive and negative flashes. Also, differences between natural and triggered dart leaders are discussed.
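A generic sketch of the slit-geometry arithmetic behind such a measurement (slit spacing, range, and timing below are assumed example values, not the instrument's actual data reduction):

```python
# Adjacent slits in the focal plane of a 50 mm lens view sky strips separated
# by (slit spacing / focal length) radians; at a known range, the channel
# segment length divided by the detector trigger time difference gives the
# propagation speed (small-angle approximation).
def propagation_speed(slit_spacing_mm: float, focal_length_mm: float,
                      range_m: float, dt_s: float) -> float:
    angular_sep = slit_spacing_mm / focal_length_mm   # radians
    segment_m = angular_sep * range_m                 # channel segment length
    return segment_m / dt_s                           # m/s

# Example: 2 mm slit spacing, 50 mm lens, 5 km range, 3 us between detectors
print(f"{propagation_speed(2.0, 50.0, 5000.0, 3e-6):.2e} m/s")   # ~6.7e7 m/s
```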
Tofte, Josef N; Westerlind, Brian O; Martin, Kevin D; Guetschow, Brian L; Uribe-Echevarria, Bastián; Rungprai, Chamnanni; Phisitkul, Phinit
2017-03-01
To validate the knee, shoulder, and virtual Fundamentals of Arthroscopic Training (FAST) modules on a virtual arthroscopy simulator via correlations with arthroscopy case experience and postgraduate year. Orthopaedic residents and faculty from one institution performed a standardized sequence of knee, shoulder, and FAST modules to evaluate baseline arthroscopy skills. Total operation time, camera path length, and composite total score (metric derived from multiple simulator measurements) were compared with case experience and postgraduate level. Values reported are Pearson r; alpha = 0.05. 35 orthopaedic residents (6 per postgraduate year), 2 fellows, and 3 faculty members (2 sports, 1 foot and ankle), including 30 male and 5 female residents, were voluntarily enrolled March to June 2015. Knee: training year correlated significantly with year-averaged knee composite score, r = 0.92, P = .004, 95% confidence interval (CI) = 0.84, 0.96; operation time, r = -0.92, P = .004, 95% CI = -0.96, -0.84; and camera path length, r = -0.97, P = .0004, 95% CI = -0.98, -0.93. Knee arthroscopy case experience correlated significantly with composite score, r = 0.58, P = .0008, 95% CI = 0.27, 0.77; operation time, r = -0.54, P = .002, 95% CI = -0.75, -0.22; and camera path length, r = -0.62, P = .0003, 95% CI = -0.8, -0.33. Shoulder: training year correlated strongly with average shoulder composite score, r = 0.90, P = .006, 95% CI = 0.81, 0.95; operation time, r = -0.94, P = .001, 95% CI = -0.97, -0.89; and camera path length, r = -0.89, P = .007, 95% CI = -0.95, -0.80. Shoulder arthroscopy case experience correlated significantly with average composite score, r = 0.52, P = .003, 95% CI = 0.2, 0.74; strongly with operation time, r = -0.62, P = .0002, 95% CI = -0.8, -0.33; and camera path length, r = -0.37, P = .044, 95% CI = -0.64, -0.01, by training year. FAST: training year correlated significantly with 3 combined FAST activity average composite scores, r = 0.81, P = .0279, 95% CI = 0.65, 0.90; operation times, r = -0.86, P = .012, 95% CI = -0.93, -0.74; and camera path lengths, r = -0.85, P = .015, 95% CI = -0.92, -0.72. Total arthroscopy cases performed did not correlate significantly with overall FAST performance. We found significant correlations between both training year and knee and shoulder arthroscopy experience when compared with performance as measured by composite score, camera path length, and operation time during a simulated diagnostic knee and shoulder arthroscopy, respectively. Three FAST activities demonstrated significant correlations with training year but not arthroscopy case experience as measured by composite score, camera path length, and operation time. We attempt to validate an arthroscopy simulator that could be used to supplement arthroscopy skills training for orthopaedic residents. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Comparison of parameters of modern cooled and uncooled thermal cameras
NASA Astrophysics Data System (ADS)
Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał
2017-10-01
During the design of a system employing thermal cameras, one always faces the problem of choosing the camera types best suited for the task. In many cases such a choice is far from optimal, and there are several reasons for that. System designers often favor tried and tested solutions they are used to. They do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice and not on facts. The paper presents the results of measurements of basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive in terms of the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards. However, the camera settings were not optimized for specific test conditions or parameter measurements. Instead, the real settings used in normal camera operation were applied to obtain realistic camera performance figures. For example, there were significant differences between measured values of noise parameters and catalogue data provided by manufacturers, due to the application of edge detection filters to increase detection and recognition ranges. The purpose of this paper is to provide help in choosing the optimal thermal camera for a particular application, answering the question of whether to opt for a cheaper microbolometer device or a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing thermal camera systems with both cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best performing devices were selected.
Strategic options towards an affordable high-performance infrared camera
NASA Astrophysics Data System (ADS)
Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.
2016-05-01
The promise of infrared (IR) imaging attaining low cost akin to the success of CMOS sensors has been hampered by the inability to achieve the cost advantages necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm, despite well-documented advantages. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640 x 512 pixel uncooled InGaAs system with high sensitivity, low noise (< 50 e-), high dynamic range (100 dB), high frame rates (> 500 frames per second (FPS)) at full resolution, and low power consumption (< 1 W) in a compact system. This camera paves the way towards mass-market adoption by not only demonstrating the high-performance IR imaging capability demanded by military and industrial applications, but also by illuminating a path towards the justifiable price points essential for consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably towards automation, readout electronics compatible with multiple focal plane arrays, and dense or ultra-small pixel pitch devices.
In vitro near-infrared imaging of occlusal dental caries using a germanium-enhanced CMOS camera
NASA Astrophysics Data System (ADS)
Lee, Chulsung; Darling, Cynthia L.; Fried, Daniel
2010-02-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
In vitro near-infrared imaging of occlusal dental caries using germanium enhanced CMOS camera.
Lee, Chulsung; Darling, Cynthia L; Fried, Daniel
2010-03-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
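The contrast metric mentioned in both reports can be illustrated with a short sketch (a common Weber-style definition under assumed region masks; the study's exact definition may differ):

```python
# Sketch of a sound-vs-lesion contrast metric for an NIR transillumination
# image; in transillumination the carious region appears darker than sound
# enamel, so (I_sound - I_lesion) / I_sound is positive and grows with severity.
# The region masks are assumed inputs from manual or automatic segmentation.
import numpy as np

def lesion_contrast(img: np.ndarray, sound_mask: np.ndarray,
                    lesion_mask: np.ndarray) -> float:
    i_sound = img[sound_mask].mean()
    i_lesion = img[lesion_mask].mean()
    return (i_sound - i_lesion) / i_sound
```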
Multispectral Snapshot Imagers Onboard Small Satellite Formations for Multi-Angular Remote Sensing
NASA Technical Reports Server (NTRS)
Nag, Sreeja; Hewagama, Tilak; Georgiev, Georgi; Pasquale, Bert; Aslam, Shahid; Gatebe, Charles K.
2017-01-01
Multispectral snapshot imagers are capable of producing 2D spatial images with a single exposure at numerous selected wavelengths using the same camera, and therefore operate differently from push-broom or whisk-broom imagers. They are payloads of choice in multi-angular, multispectral imaging missions that use small satellites flying in controlled formation to retrieve Earth science measurements dependent on the target's Bidirectional Reflectance Distribution Function (BRDF). Narrow fields of view are needed to capture images with moderate spatial resolution. This paper quantifies the dependencies of the imager's optical system, spectral elements, and camera on the requirements of the formation mission and their impact on performance metrics such as spectral range, swath, and signal-to-noise ratio (SNR). All variables and metrics have been generated from a comprehensive payload design tool. The baseline optical parameters selected (diameter 7 cm, focal length 10.5 cm, pixel size 20 micron, field of view 1.15 deg) can be met with available snapshot imaging technologies. The spectral components shortlisted were waveguide spectrometers, acousto-optic tunable filters (AOTF), electronically actuated Fabry-Perot interferometers, and integral field spectrographs. Qualitative evaluation favored AOTFs because of their low weight, small size, and flight heritage. Quantitative analysis showed that waveguide spectrometers perform better in terms of achievable swath (10-90 km) and SNR (greater than 20) for 86 wavebands, but the data volume generated would require very high-bandwidth communication to downlink. AOTFs meet the external data volume caps as well as the minimum spectral (wavebands) and radiometric (SNR) requirements, and are therefore found to be currently feasible in spite of lower swath and SNR.
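For orientation, a back-of-envelope sketch of the imaging geometry implied by the baseline optics quoted above; the orbital altitude is an assumed example, since the abstract does not state it:

```python
# Simple pinhole-geometry estimates of ground sample distance and swath for
# the baseline optics (f = 10.5 cm, 20 um pixels, 1.15 deg field of view).
# The 500 km altitude is an assumed example value.
import math

altitude_m = 500e3            # assumed LEO altitude
focal_m = 0.105               # baseline focal length
pixel_m = 20e-6               # baseline pixel size
fov_deg = 1.15                # baseline full field of view

gsd_m = altitude_m * pixel_m / focal_m                            # ground sample distance
swath_m = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
print(f"GSD ~ {gsd_m:.0f} m, swath ~ {swath_m / 1e3:.0f} km")      # ~95 m, ~10 km
```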
Calibration, Projection, and Final Image Products of MESSENGER's Mercury Dual Imaging System
NASA Astrophysics Data System (ADS)
Denevi, Brett W.; Chabot, Nancy L.; Murchie, Scott L.; Becker, Kris J.; Blewett, David T.; Domingue, Deborah L.; Ernst, Carolyn M.; Hash, Christopher D.; Hawkins, S. Edward; Keller, Mary R.; Laslo, Nori R.; Nair, Hari; Robinson, Mark S.; Seelos, Frank P.; Stephens, Grant K.; Turner, F. Scott; Solomon, Sean C.
2018-02-01
We present an overview of the operations, calibration, geodetic control, photometric standardization, and processing of images from the Mercury Dual Imaging System (MDIS) acquired during the orbital phase of the MESSENGER spacecraft's mission at Mercury (18 March 2011-30 April 2015). We also provide a summary of all of the MDIS products that are available in NASA's Planetary Data System (PDS). Updates to the radiometric calibration included slight modification of the frame-transfer smear correction, updates to the flat fields of some wide-angle camera (WAC) filters, a new model for the temperature dependence of narrow-angle camera (NAC) and WAC sensitivity, and an empirical correction for temporal changes in WAC responsivity. Further, efforts to characterize scattered light in the WAC system are described, along with a mosaic-dependent correction for scattered light that was derived for two regional mosaics. Updates to the geometric calibration focused on the focal lengths and distortions of the NAC and all WAC filters, NAC-WAC alignment, and calibration of the MDIS pivot angle and base. Additionally, two control networks were derived so that the majority of MDIS images can be co-registered with sub-pixel accuracy; the larger of the two control networks was also used to create a global digital elevation model. Finally, we describe the image processing and photometric standardization parameters used in the creation of the MDIS advanced products in the PDS, which include seven large-scale mosaics, numerous targeted local mosaics, and a set of digital elevation models ranging in scale from local to global.
Sensors integration for smartphone navigation: performances and future challenges
NASA Astrophysics Data System (ADS)
Aicardi, I.; Dabove, P.; Lingua, A.; Piras, M.
2014-08-01
Modern smartphones include several sensors commonly adopted in geomatics applications, such as a digital camera, GNSS (Global Navigation Satellite System) receivers, an inertial platform, RFID, and Wi-Fi systems. In this paper the authors test the performance of the internal sensors (Inertial Measurement Unit, IMU) of three modern smartphones (Samsung Galaxy S4, Samsung Galaxy S5, and iPhone 4) against an external mass-market IMU platform in order to verify their accuracy levels in terms of positioning. The Image Based Navigation (IBN) approach is also investigated: it can be very useful in dense urban environments or for indoor positioning, as an alternative to GNSS positioning. IBN can achieve sub-metre accuracy, but it requires a database of georeferenced images (Image DataBase, IDB) as well as dedicated algorithms to resize the images collected by the smartphone so that they can be shared with the server hosting the IDB. It is also necessary to characterize the smartphone camera lens in terms of focal length and lens distortions. The authors have developed a method that is innovative with respect to those available today, which was tested in a covered area using a special support on which all the sensors under test were installed. Geomatic instruments were used to define the reference trajectory, against which the path obtained with the IBN solution was compared. First results show horizontal and vertical accuracies better than 60 cm with respect to the reference trajectories. The IBN method, sensors, tests, and results are described in the paper.
Li, Jin; Liu, Zilong
2017-07-24
Remote sensing cameras in the visible/near-infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e., image quality, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself, including its optical system, image sensor, and electronics, limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which is stable and invariant to changes in ground targets, atmosphere, and environment on orbit or on the ground and depends only on the camera itself, is extracted using a pixel optical focal plane (PFP). The PFP produces multiple spatial-frequency targets, which are used to calculate the IMTF at different frequencies. The resulting IMTF, in combination with a constrained least-squares filter, compensates for the IMTF, i.e., it removes the imaging degradation imposed by the camera itself. The method is confirmed experimentally. Experiments on an on-orbit panchromatic camera indicate that, compared with the case in which the IMTF is not used, the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56. This opens a door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
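For readers unfamiliar with constrained least-squares (CLS) restoration, the sketch below shows a generic frequency-domain CLS filter with a Laplacian smoothness constraint, which is the standard form of the filter named in the abstract. It assumes a real, zero-phase MTF sampled on the image frequency grid and is an illustration only, not the authors' PFP-based on-orbit procedure.

```python
import numpy as np

def cls_restore(blurred, mtf, lam=0.01):
    """Constrained least-squares restoration in the Fourier domain.

    blurred : 2-D degraded image
    mtf     : 2-D system MTF/OTF sampled on the same frequency grid
              (assumed real and zero-phase for simplicity)
    lam     : regularization weight on the Laplacian smoothness constraint
    """
    rows, cols = blurred.shape

    # Laplacian kernel zero-padded to image size -> frequency response of the constraint
    lap = np.zeros((rows, cols))
    lap[:3, :3] = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]])
    P = np.fft.fft2(lap)

    G = np.fft.fft2(blurred)
    H = mtf
    # CLS estimate: F = H* G / (|H|^2 + lam |P|^2)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + lam * np.abs(P) ** 2)
    return np.real(np.fft.ifft2(F_hat))
```

The regularization weight trades sharpening against noise amplification; the abstract does not state how the authors chose it, so `lam` here is an assumed free parameter.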
Efficient volumetric estimation from plenoptic data
NASA Astrophysics Data System (ADS)
Anglin, Paul; Reeves, Stanley J.; Thurow, Brian S.
2013-03-01
The commercial release of the Lytro camera, and greater availability of plenoptic imaging systems in general, have given the image processing community cost-effective tools for light-field imaging. While this data is most commonly used to generate planar images at arbitrary focal depths, reconstruction of volumetric fields is also possible. Similarly, deconvolution is a technique that is conventionally used in planar image reconstruction, or deblurring, algorithms. However, when leveraged with the ability of a light-field camera to quickly reproduce multiple focal planes within an imaged volume, deconvolution offers a computationally efficient method of volumetric reconstruction. Related research has shown than light-field imaging systems in conjunction with tomographic reconstruction techniques are also capable of estimating the imaged volume and have been successfully applied to particle image velocimetry (PIV). However, while tomographic volumetric estimation through algorithms such as multiplicative algebraic reconstruction techniques (MART) have proven to be highly accurate, they are computationally intensive. In this paper, the reconstruction problem is shown to be solvable by deconvolution. Deconvolution offers significant improvement in computational efficiency through the use of fast Fourier transforms (FFTs) when compared to other tomographic methods. This work describes a deconvolution algorithm designed to reconstruct a 3-D particle field from simulated plenoptic data. A 3-D extension of existing 2-D FFT-based refocusing techniques is presented to further improve efficiency when computing object focal stacks and system point spread functions (PSF). Reconstruction artifacts are identified; their underlying source and methods of mitigation are explored where possible, and reconstructions of simulated particle fields are provided.
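To make the FFT-based deconvolution idea concrete, the sketch below applies a simple Wiener-type 3-D inverse filter to a computationally refocused focal stack given a system PSF. It is a minimal stand-in for the algorithm described in the paper; the regularization constant and array conventions are assumptions.

```python
import numpy as np

def deconvolve_volume(focal_stack, psf, eps=1e-3):
    """Wiener-type 3-D deconvolution of a refocused focal stack.

    focal_stack : 3-D array (z, y, x) of computationally refocused planes
    psf         : 3-D system point spread function, same shape (zero-padded, centered)
    eps         : noise/regularization term
    Returns an estimate of the 3-D intensity field.
    """
    F = np.fft.fftn(focal_stack)
    H = np.fft.fftn(np.fft.ifftshift(psf))      # shift PSF center to the origin
    W = np.conj(H) / (np.abs(H) ** 2 + eps)     # Wiener-type inverse filter
    return np.real(np.fft.ifftn(W * F))
```

Because everything reduces to a handful of FFTs, the cost scales as N log N in the number of voxels, which is the efficiency advantage over iterative tomographic methods such as MART that the paper emphasizes.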
NASA Astrophysics Data System (ADS)
Smee, Stephen A.; Prochaska, Travis; Shectman, Stephen A.; Hammond, Randolph P.; Barkhouser, Robert H.; DePoy, D. L.; Marshall, J. L.
2012-09-01
We describe the conceptual optomechanical design for GMACS, a wide-field, multi-object, moderate-resolution optical spectrograph for the Giant Magellan Telescope (GMT). GMACS is a candidate first-light instrument for the GMT and will be one of several instruments housed in the Gregorian Instrument Rotator (GIR) located at the Gregorian focus. The instrument samples a 9 arcminute x 18 arcminute field of view, providing two resolution modes (i.e., low resolution, R ~ 2000, and moderate resolution, R ~ 4000) over a 3700 Å to 10200 Å wavelength range. To minimize the size of the optics, four fold mirrors at the GMT focal plane redirect the full field into four individual "arms", each of which comprises a double spectrograph with a red and a blue channel. Hence, each arm samples a 4.5 arcminute x 9 arcminute field of view. The optical layout naturally leads to three separate optomechanical assemblies: a focal plane assembly and two identical optics modules. The focal plane assembly contains the last element of the telescope's wide-field corrector, the slit-mask, the tent-mirror assembly, and the slit-mask magazine. Each of the two optics modules supports two of the four instrument arms and houses the aft optics (i.e., collimators, dichroics, gratings, and cameras). A grating exchange mechanism and articulated gratings and cameras facilitate multiple resolution modes. In this paper we describe the details of the GMACS optomechanical design, including the requirements and considerations leading to the design, mechanism details, optics mounts, and predicted flexure performance.
NASA Astrophysics Data System (ADS)
Tomiyasu, Kentaro; Takagi, Ryo; Iwasaki, Ryosuke; Yoshizawa, Shin; Umemura, Shin-ichiro
2017-07-01
In high-intensity focused ultrasound (HIFU) treatment, controlling the ultrasound dose at each focal target spot is important because the length of the coagulated region in front of the focal point varies with the absorption at each focal target spot and the attenuation in the intervening tissues. In this study, detected changes in the power spectra of HIFU echoes were used to control the HIFU duration in a "trigger HIFU" sequence, which aims to increase coagulation size by enhancing ultrasonic heating with cavitation induced by a preceding extremely high-intensity, short "trigger" pulse. The results show that this method can detect boiling bubbles, and the cavitation bubbles subsequently generated, at an early stage. By automatically stopping the HIFU exposure immediately after detecting the bubbles, overheating was prevented and the variation in the length of the coagulated region was reduced.
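As a rough illustration of echo-based triggering, the sketch below estimates broadband (inharmonic) energy in an echo power spectrum and stops exposure when it rises above a baseline. The harmonic masking, guard band, and threshold margin are assumptions; the authors' actual detection criterion for boiling and cavitation bubbles may differ.

```python
import numpy as np

def broadband_noise_db(echo, fs, f_drive, n_harmonics=5, guard_hz=50e3):
    """Estimate broadband (inharmonic) energy in an echo power spectrum,
    a common proxy for cavitation activity; an assumed illustration, not
    the authors' exact detection criterion."""
    spec = np.abs(np.fft.rfft(echo * np.hanning(echo.size))) ** 2
    freqs = np.fft.rfftfreq(echo.size, d=1.0 / fs)
    # Mask out the drive frequency and its harmonics (plus a guard band)
    mask = np.ones_like(freqs, dtype=bool)
    for k in range(1, n_harmonics + 1):
        mask &= np.abs(freqs - k * f_drive) > guard_hz
    return 10.0 * np.log10(np.mean(spec[mask]) + 1e-30)

def should_stop(echo, fs, f_drive, baseline_db, margin_db=6.0):
    """Trigger HIFU shutoff when broadband noise rises above the pre-exposure baseline."""
    return broadband_noise_db(echo, fs, f_drive) > baseline_db + margin_db
```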
Big capabilities in small packages: hyperspectral imaging from a compact platform
NASA Astrophysics Data System (ADS)
Beasley, Matthew; Goldberg, Hannah; Voorhees, Christopher; Illsley, Peter
2016-09-01
We present the Compact Holographic Aberration-corrected Platform (CHAP) instrument, designed and developed at Planetary Resources Development Corporation. By combining a dispersive element with the secondary of a telescope, we are able to produce a relatively long focal length with moderate dispersion at the focal plane. This design enables us to build a capable hyperspectral imaging instrument within the size constraints of the Cubesat form factor. The advantage of our design lies in its simplicity: there are only two optical elements, producing both a white-light and a diffracted image. With the use of a replicated grating, we can produce a long-focal-length hyperspectral imager at a price point far below that of other spaceflight instruments. The design is scalable to larger platforms and, since it has no transmissive optics and only two reflective surfaces, it could be designed to function at any desired wavelength. Our system will be capable of spectral imaging across the 400 to 900 nm spectral range for use in small-body surveys.
Nearby Exo-Earth Astrometric Telescope (NEAT)
NASA Technical Reports Server (NTRS)
Shao, M.; Nemati, B.; Zhai, C.; Goullioud, R.
2011-01-01
NEAT (Nearby Exo-Earths Astrometric Telescope) is a modest-sized (1 m diameter) telescope. It will be capable of searching approximately 100 nearby stars down to 1 Earth-mass planets in the habitable zone, and 200 stars down to 5 Earth masses at 1 AU. The concept addresses the major issues for ultra-precise astrometry: (1) photon noise (0.5 deg diameter field of view); (2) optical errors (beam walk), with a long-focal-length telescope; (3) focal-plane errors, with laser metrology of the focal plane; and (4) PSF centroiding errors, with measurement of the "true" PSF instead of a "guess" of the true PSF, and correction for intra-pixel QE non-uniformities. The technology is close to complete: focal-plane geometry is measured to 2e-5 pixels and centroiding to approximately 4e-5 pixels.
Development of ATHENA mirror modules
NASA Astrophysics Data System (ADS)
Collon, Maximilien J.; Vacanti, Giuseppe; Barrière, Nicolas M.; Landgraf, Boris; Günther, Ramses; Vervest, Mark; van der Hoeven, Roy; Dekker, Danielle; Chatbi, Abdel; Girou, David; Sforzini, Jessica; Beijersbergen, Marco W.; Bavdaz, Marcos; Wille, Eric; Fransen, Sebastiaan; Shortt, Brian; Haneveld, Jeroen; Koelewijn, Arenda; Booysen, Karin; Wijnperle, Maurice; van Baren, Coen; Eigenraam, Alexander; Müller, Peter; Krumrey, Michael; Burwitz, Vadim; Pareschi, Giovanni; Massahi, Sonny; Christensen, Finn E.; Della Monica Ferreira, Desirée.; Valsecchi, Giuseppe; Oliver, Paul; Checquer, Ian; Ball, Kevin; Zuknik, Karl-Heinz
2017-08-01
Silicon Pore Optics (SPO), developed at cosine with the European Space Agency (ESA) and several academic and industrial partners, provides lightweight, yet stiff, high-resolution X-ray optics. This technology enables ATHENA to reach an unprecedentedly large effective area in the 0.2 - 12 keV band with an angular resolution better than 5''. After the technology was developed for 50 m and 20 m focal lengths, the first 12 m focal length mirror modules were produced this year. The technology development is also gaining momentum, with three different radii under study: mirror modules for the inner radii (Rmin = 250 mm), outer radii (Rmax = 1500 mm), and middle radii (Rmid = 737 mm) are being developed in parallel.
High-speed varifocal imaging with a tunable acoustic gradient index of refraction lens.
Mermillod-Blondin, Alexandre; McLeod, Euan; Arnold, Craig B
2008-09-15
Fluidic lenses allow for varifocal optical elements, but current approaches are limited by the speed at which focal length can be changed. Here we demonstrate the use of a tunable acoustic gradient (TAG) index of refraction lens as a fast varifocal element. The optical power of the TAG lens varies continuously, allowing for rapid selection and modification of the effective focal length at time scales of 1 μs and shorter. The wavefront curvature applied to the incident light is experimentally quantified as a function of time, and single-frame imaging is demonstrated. Results indicate that the TAG lens can successfully be employed to perform high-rate imaging at multiple locations.
Spatial filters for high average power lasers
Erlandson, Alvin C
2012-11-27
A spatial filter includes a first filter element and a second filter element overlapping with the first filter element. The first filter element includes a first pair of cylindrical lenses separated by a first distance. Each of the first pair of cylindrical lenses has a first focal length. The first filter element also includes a first slit filter positioned between the first pair of cylindrical lenses. The second filter element includes a second pair of cylindrical lenses separated by a second distance. Each of the second pair of cylindrical lenses has a second focal length. The second filter element also includes a second slit filter positioned between the second pair of cylindrical lenses.
Spatial filters for high power lasers
Erlandson, Alvin Charles; Bayramian, Andrew James
2014-12-02
A spatial filter includes a first filter element and a second filter element overlapping with the first filter element. The first filter element includes a first pair of cylindrical lenses separated by a first distance. Each of the first pair of cylindrical lenses has a first focal length. The first filter element also includes a first longitudinal slit filter positioned between the first pair of cylindrical lenses. The second filter element includes a second pair of cylindrical lenses separated by a second distance. Each of the second pair of cylindrical lenses has a second focal length. The second filter element also includes a second longitudinal slit filter positioned between the second pair of cylindrical lenses.
Hubble Space Telescope faint object camera instrument handbook (Post-COSTAR), version 5.0
NASA Technical Reports Server (NTRS)
Nota, A. (Editor); Jedrzejewski, R. (Editor); Greenfield, P. (Editor); Hack, W. (Editor)
1994-01-01
The faint object camera (FOC) is a long-focal-ratio, photon-counting device capable of taking high-resolution two-dimensional images of the sky up to 14 by 14 arc seconds squared in size with pixel dimensions as small as 0.014 by 0.014 arc seconds squared in the 1150 to 6500 A wavelength range. Its performance approaches that of an ideal imaging system at low light levels. The FOC is the only instrument on board the Hubble Space Telescope (HST) to fully use the spatial resolution capabilities of the optical telescope assembly (OTA) and is one of the European Space Agency's contributions to the HST program.
Classification of Kiwifruit Grades Based on Fruit Shape Using a Single Camera
Fu, Longsheng; Sun, Shipeng; Li, Rui; Wang, Shaojin
2016-01-01
This study aims to demonstrate the feasibility for classifying kiwifruit into shape grades by adding a single camera to current Chinese sorting lines equipped with weight sensors. Image processing methods are employed to calculate fruit length, maximum diameter of the equatorial section, and projected area. A stepwise multiple linear regression method is applied to select significant variables for predicting minimum diameter of the equatorial section and volume and to establish corresponding estimation models. Results show that length, maximum diameter of the equatorial section and weight are selected to predict the minimum diameter of the equatorial section, with the coefficient of determination of only 0.82 when compared to manual measurements. Weight and length are then selected to estimate the volume, which is in good agreement with the measured one with the coefficient of determination of 0.98. Fruit classification based on the estimated minimum diameter of the equatorial section achieves a low success rate of 84.6%, which is significantly improved using a linear combination of the length/maximum diameter of the equatorial section and projected area/length ratios, reaching 98.3%. Thus, it is possible for Chinese kiwifruit sorting lines to reach international standards of grading kiwifruit on fruit shape classification by adding a single camera. PMID:27376292
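The volume-estimation step reduces to a multiple linear regression on weight and length; the sketch below shows that step with NumPy on invented example data. The numbers and fitted coefficients are purely illustrative and are not taken from the study.

```python
import numpy as np

# Hypothetical example data (fruit weight in g, length in mm, measured volume in cm^3);
# the values are made up to illustrate the regression step only.
weight = np.array([80.0, 95.0, 102.0, 110.0, 121.0])
length = np.array([58.0, 62.0, 65.0, 67.0, 70.0])
volume = np.array([78.0, 92.0, 99.0, 106.0, 118.0])

# Multiple linear regression: volume ~ b0 + b1*weight + b2*length
X = np.column_stack([np.ones_like(weight), weight, length])
coef, *_ = np.linalg.lstsq(X, volume, rcond=None)

predicted = X @ coef
r2 = 1.0 - np.sum((volume - predicted) ** 2) / np.sum((volume - volume.mean()) ** 2)
print("coefficients:", coef, " R^2:", round(r2, 3))
```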
Thin plastic foil X-ray optics with spiral geometry
NASA Astrophysics Data System (ADS)
Barbera, Marco; Mineo, Teresa; Perinati, Emanuele; Schnopper, Herbert W.; Taibi, Angelo
2007-09-01
By winding a plastic foil ribbon into a spiral cylinder or spiral cone, we can design and build single- or multiple-reflection grazing-incidence X-ray focusing optics with potential applications in astronomy as well as experimental physics. The use of thin plastic foils from common industrial applications, and of a mounting technique that does not require the construction of mandrels, makes these optics very cost effective. A spiral-geometry focusing optic produces an annular image of a point source, with the angular size of the annulus depending mainly on the pitch of the winding and the focal length. We use a ray-tracing code to evaluate the performance of cylindrical and double-conical spiral geometries as a function of the design parameters, e.g., focal length, diameter, and optic length. Some preliminary results are presented from X-ray imaging tests performed on spiral cylindrical optics.
Liquid Lens module with wide field-of-view and variable focal length
NASA Astrophysics Data System (ADS)
Seo, Sang Won; Han, Seungoh; Seo, Jun Ho; Choi, Woo Bum; Sung, Man Young
2010-12-01
A novel wide angle and variable-focus imaging module based on a miniaturized liquid lens is presented for capsule endoscopy applications. For these applications, it is desirable to have features such as a wide field of view (FOV), variable focus, small size, and low power consumption, thereby taking full advantage of the miniaturized liquid lens. The proposed imaging module has three aspheric plastic lenses for a wide FOV, and one liquid lens that can change the focal length by as much as 24.5 cm with a bias voltage difference of 23 Vrms for variable focusing. The assembled lens module has an overall length of 8.4 mm and a FOV of 120.5°. The realized imaging module including the proposed lenses is small enough to be inserted into a capsule endoscope, and it is expected to improve the diagnostic capability of capsule endoscopes.
Radiometric infrared focal plane array imaging system for thermographic applications
NASA Technical Reports Server (NTRS)
Esposito, B. J.; Mccafferty, N.; Brown, R.; Tower, J. R.; Kosonocky, W. F.
1992-01-01
This document describes research performed under the Radiometric Infrared Focal Plane Array Imaging System for Thermographic Applications contract. This research investigated the feasibility of using platinum silicide (PtSi) Schottky-barrier infrared focal plane arrays (IR FPAs) for NASA Langley's specific radiometric thermal imaging requirements. The initial goal of this design was to develop a high spatial resolution radiometer with an NETD of 1 percent of the temperature reading over the range of 0 to 250 C. The proposed camera design developed during this study and described in this report provides: (1) high spatial resolution (full-TV resolution); (2) high thermal dynamic range (0 to 250 C); (3) the ability to image rapid, large thermal transients utilizing electronic exposure control (commandable dynamic range of 2,500,000:1 with exposure control latency of 33 ms); (4) high uniformity (0.5 percent nonuniformity after correction); and (5) high thermal resolution (0.1 C at 25 C background and 0.5 C at 250 C background).
A lab-on-phone instrument with varifocal microscope via a liquid-actuated aspheric lens (LAL)
Fuh, Yiin-Kuen; Lai, Zheng-Hong; Kau, Li-Han; Huang, Hung-Jui
2017-01-01
In this paper, we introduce a novel liquid-actuated aspheric lens (LAL) with a built-in aspheric polydimethylsiloxane (PDMS) lens (APL) to enable compact optical systems with varifocal microscopic imaging. The varifocal lens module consists of a sandwiched structure in which a 3D-printed syringe pump serves as the liquid controller. Other key components include two acrylic cylinders, a rigid separator, and a PDMS membrane with an embedded APL/membrane composite (APLMC). In operation, the fluidic controller is driven to regulate the pressure difference and thus the APLMC deformation; the focal length is changed through this pressure difference by adjusting the volume of injected liquid, yielding a widely tunable focal length. The proposed LAL can be switched among three modes: a microscopic mode (APLMC only), a convex-concave mode, and a biconcave mode. Notably, the LAL in microscopic mode is focus-tunable via actuation of the APLMC (focal length from 4.3 to 2.3 mm at a magnification of 50X) and can rival the image quality of commercial microscopes. The new lab-on-phone device is economically feasible and functionally versatile, offering great potential for point-of-care applications. PMID:28650971
Solution for the nonuniformity correction of infrared focal plane arrays.
Zhou, Huixin; Liu, Shangqian; Lai, Rui; Wang, Dabao; Cheng, Yubao
2005-05-20
Based on the S-curve model of the detector response of infrared focal plane arrays (IRFPAs), an improved two-point correction algorithm is presented. The algorithm first transforms the nonlinear image data into linear data and then uses the normal two-point algorithm to correct the linear data. The algorithm effectively overcomes the influence of the nonlinearity of the detector response, and it improves the correction precision and extends the dynamic range of the response. A real-time imaging-signal-processing system for IRFPAs based on a digital signal processor and field-programmable gate arrays is also presented. The nonuniformity-correction capability of the presented solution is validated by experimental imaging with a 128 x 128 pixel IRFPA camera prototype.
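A minimal sketch of the two-step idea, assuming a user-supplied linearization function (for example, the inverse of a fitted S-curve): first linearize the raw response, then apply a conventional two-point gain/offset correction derived from two uniform blackbody views. Variable names and the calling convention are illustrative, not the paper's implementation.

```python
import numpy as np

def two_point_nuc(frames_low, frames_high, t_low, t_high, linearize):
    """Two-point nonuniformity correction after linearizing the detector response.

    frames_low/high : stacks of raw frames viewing uniform blackbodies at t_low/t_high
    linearize       : callable mapping the raw (S-curve) response to a linearized
                      value, e.g. the inverse of a fitted S-curve model (assumed supplied)
    Returns per-pixel gain and offset so that  corrected = gain * linearize(raw) + offset.
    """
    x_low = linearize(frames_low).mean(axis=0)
    x_high = linearize(frames_high).mean(axis=0)
    gain = (t_high - t_low) / (x_high - x_low)
    offset = t_low - gain * x_low
    return gain, offset

def correct(raw, gain, offset, linearize):
    """Apply the per-pixel correction to a raw frame."""
    return gain * linearize(raw) + offset
```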
Three-dimensional particle tracking via tunable color-encoded multiplexing.
Duocastella, Martí; Theriault, Christian; Arnold, Craig B
2016-03-01
We present a novel 3D tracking approach capable of locating single particles with nanometric precision over wide axial ranges. Our method uses a fast acousto-optic liquid lens implemented in a bright field microscope to multiplex light based on color into different and selectable focal planes. By separating the red, green, and blue channels from an image captured with a color camera, information from up to three focal planes can be retrieved. Multiplane information from the particle diffraction rings enables precisely locating and tracking individual objects up to an axial range about 5 times larger than conventional single-plane approaches. We apply our method to the 3D visualization of the well-known coffee-stain phenomenon in evaporating water droplets.
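The color-multiplexing idea can be illustrated by splitting a color frame into its R, G, and B channels, each mapped by the acousto-optic lens to a different, selectable focal plane, and localizing the particle in each channel independently. The sketch below uses a simple intensity-weighted centroid as a placeholder localizer; the paper's actual axial localization uses the particle diffraction rings, which is not reproduced here.

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid of a single-channel image (toy localizer)."""
    y, x = np.indices(img.shape)
    w = img.astype(float)
    w -= w.min()
    total = w.sum() + 1e-12
    return (x * w).sum() / total, (y * w).sum() / total

def locate_three_planes(rgb_frame):
    """Split an H x W x 3 color frame into R, G, B channels (one per focal plane)
    and localize the particle in each channel independently."""
    return [centroid(rgb_frame[..., c]) for c in range(3)]
```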
New Focal Plane Array Controller for the Instruments of the Subaru Telescope
NASA Astrophysics Data System (ADS)
Nakaya, Hidehiko; Komiyama, Yutaka; Miyazaki, Satoshi; Yamashita, Takuya; Yagi, Masafumi; Sekiguchi, Maki
2006-03-01
We have developed a next-generation data acquisition system, MESSIA5 (Modularized Extensible System for Image Acquisition), which comprises the digital part of a focal plane array controller. The new data acquisition system was constructed based on a 64 bit, 66 MHz PCI (peripheral component interconnect) bus architecture and runs on an x86 CPU computer with (non-real-time) Linux. The system, including the CPU board, is placed at the telescope focus, and standard gigabit Ethernet is adopted for the data transfer, as opposed to a dedicated fiber link. During the summer of 2002, we installed the new system for the first time on the Subaru prime-focus camera Suprime-Cam and successfully improved the observing performance.
History of the formerly top secret KH-9 Hexagon spy satellite
NASA Astrophysics Data System (ADS)
Pressel, Phil
2014-12-01
This paper is about the development, design, fabrication, and use of the KH-9 Hexagon spy-satellite camera system, which was finally declassified by the National Reconnaissance Office on September 17, 2011, twenty-five years after the program ended. It was the last film-based reconnaissance camera and was known by experts in the field as "the most complicated system ever put up in orbit." It provided important intelligence for the United States government: it was the reason President Nixon was able to sign the SALT treaty, and when President Reagan said "Trust but Verify," it provided the means of verification. Each satellite weighed 30,000 pounds and carried two cameras, permitting photographs of the entire landmass of the Earth to be taken in stereo. Each camera carried up to 30 miles of film, for a total of 60 miles of film. Ultra-complex mechanisms controlled the structurally "wimpy" film, which traveled at speeds up to 204 inches per second at the focal plane and was perfectly synchronized to the optical image.
Development of the FPI+ as facility science instrument for SOFIA cycle four observations
NASA Astrophysics Data System (ADS)
Pfüller, Enrico; Wiedemann, Manuel; Wolf, Jürgen; Krabbe, Alfred
2016-08-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a heavily modified Boeing 747SP aircraft accommodating a 2.5 m infrared telescope. This airborne observation platform takes astronomers to flight altitudes of up to 13.7 km (45,000 ft) and therefore allows an unobstructed view of the infrared universe at wavelengths between 0.3 μm and 1600 μm. SOFIA is currently completing its fourth cycle of observations and utilizes eight different imaging and spectroscopic science instruments. New instruments for SOFIA's cycle 4 observations are the High-resolution Airborne Wideband Camera-plus (HAWC+) and the Focal Plane Imager (FPI+). The latter is an integral part of the telescope assembly and is used on every SOFIA flight to ensure precise tracking on the desired targets. The FPI+ is used as a visible-light photometer in its role as facility science instrument. Since the upgrade of the FPI camera and electronics in 2013, it uses a thermo-electrically cooled, science-grade EM-CCD sensor inside a commercial-off-the-shelf Andor camera. The back-illuminated sensor has a peak quantum efficiency of 95% and the dark current is as low as 0.01 e-/pix/sec. With this new hardware the telescope has successfully tracked on 16th magnitude stars, and thus the sky coverage, i.e., the fraction of the sky with suitable tracking stars, has increased to 99%. Before its use as an integrated tracking imager, the same type of camera was used as a standalone diagnostic tool to analyze the telescope pointing stability at frequencies up to 200 Hz (imaging at 400 fps). These measurements help to improve the telescope pointing-control algorithms and therefore reduce the image jitter in the focal plane. Science instruments benefit from this improvement with smaller image sizes for longer exposure times. The FPI+ has also been used to support astronomical observations such as stellar occultations by the dwarf planet Pluto and a number of exoplanet transits. The observation of occultation events especially benefits from the high camera sensitivity, fast readout capability, and low read noise, and it was possible to achieve high time resolution in the photometric light curves. This paper gives an overview of the development from the standalone diagnostic camera, to the upgraded guiding/tracking camera fully integrated into the telescope while still offering the diagnostic capabilities, and finally to its use as a facility science instrument on SOFIA.
Design, Analysis, and On-Sun Evaluation of Reflective Strips Under Controlled Buckling
NASA Technical Reports Server (NTRS)
Jaworske, Donald A.; Sechkar, Edward A.; Colozza, Anthony J.
2014-01-01
Solar concentrators are envisioned for use in a variety of space-based applications, including applications involving in situ resource utilization. Identifying solar concentrators that minimize mass and cost is of great interest, especially since launch cost is driven in part by the mass of the payload. Concentrators must also be able to survive the wide temperature excursions on the lunar surface. Identifying smart structures that compensate for changes in concentrator geometry brought about by temperature extremes is of interest. Some applications may benefit from the ability to change the concentrator's focal pattern at will. This paper addresses a method of designing a single reflective strip to produce a desired focal pattern through the use of controlled buckling. Small variations in the cross section over the length of the reflective strip influence the distribution of light in the focal region. A finite element method of analysis is utilized here which calculates the curve produced for a given strip cross section and axial load. Varying the axial force and strip cross section over the length of the reflective strip provides a means of optimizing ray convergence in the focal region. Careful selection of a tapered cross section yields a reflective strip that approximates a parabola. An array of reflective strips under controlled buckling produces a lightweight concentrator, and adjustments in the compression of individual strips provide a means of compensating for temperature excursions or changing the focal pattern at will.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, X; Grimes, J; Yu, L
Purpose: Focal spot blooming is an increase in the focal spot size at increased tube current and/or decreased tube potential. In this work, we evaluated the influence of tube current on the focal spot size at low kV for two CT systems, one of which used a tube designed to reduce blooming effects. Methods: A slit camera (10 micron slit) was used to measure focal spot size on two CT scanners from the same manufacturer (Siemens Somatom Force and Definition Flash) at 70 kV and low, medium, and maximum tube currents, according to the capabilities of each system (Force: 100, 800 and 1300 mA; Flash: 100, 200 and 500 mA). Exposures were made with a stationary tube in service mode using a raised stand, without table movement or the flying-focal-spot technique. Focal spot size, nominally 0.8 and 1.2 mm, respectively, was measured parallel and perpendicular to the cathode-anode axis by calculating the full width at half maximum of the slit profile recorded on computed radiographic plates. Results: Focal spot size perpendicular to the anode-cathode axis increased at the maximum mA by 5.7% on the Force and 39.1% on the Flash relative to that at the minimum mA, even though the mA was increased 13-fold on the Force and only 5-fold on the Flash. Focal spot size increased parallel to the anode-cathode axis by 70.4% on the Force and 40.9% on the Flash. Conclusion: CT protocols using low kV typically require high mA. These protocols are relevant in children and smaller adults, and for dual-energy scanning. Technical measures to limit focal spot blooming are important in these settings to avoid reduced spatial resolution. The x-ray tube on a recently introduced scanner appears to greatly reduce blooming effects, even at very high mA values. CHM has research support from Siemens Healthcare.
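The width measurement described in the Methods reduces to a full-width-at-half-maximum of a 1-D slit profile; the sketch below computes that with linear interpolation at the half-maximum crossings. It ignores the slit-camera magnification correction and uses only a simple minimum-offset background subtraction, both assumptions made for illustration.

```python
import numpy as np

def fwhm(profile, pixel_pitch_mm):
    """Full width at half maximum of a 1-D slit-camera profile, with linear
    interpolation at the half-maximum crossings (illustrative; no magnification
    correction applied)."""
    p = np.asarray(profile, dtype=float)
    p = p - p.min()                       # crude background removal
    half = p.max() / 2.0
    above = np.where(p >= half)[0]
    left, right = float(above[0]), float(above[-1])
    li, ri = above[0], above[-1]
    # Sub-pixel crossings on both edges
    if li > 0:
        left = li - (p[li] - half) / (p[li] - p[li - 1])
    if ri < p.size - 1:
        right = ri + (p[ri] - half) / (p[ri] - p[ri + 1])
    return (right - left) * pixel_pitch_mm
```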
Standards and Procedures for Application of Radiometric Sensors
2010-07-01
radiation from slowly oscillating electrons. Optical radiation is defined to be the narrow portion of the electromagnetic spectrum spanning five orders... Radiometry is the science and the craft of measuring radiant power across the entire optical ...distances compared to its focal length (which is the most common application) the image of the source is at the focal distance of the optical system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vishnyakov, A. V.; Stuchinsky, V. A., E-mail: stuchin@isp.nsc.ru; Brunev, D. V.
2014-03-03
In the present paper, we propose a method for evaluating the bulk diffusion length of minority charge carriers in the photosensing layer of photovoltaic focal plane array (FPA) photodetectors. The method is based on scanning a strip-shaped illumination spot with one of the detector diodes at a low level of registered photocurrent j_ph; such scanning provides data for subsequent analysis of the measured spot-scan profiles within a simple diffusion model. The asymptotic behavior of the effective (at j_ph ≠ 0) charge-carrier diffusion length l_d,eff as a function of j_ph for j_ph → 0, inferred from our experimental data, proved to be consistent with the behavior of l_d,eff vs. j_ph predicted by the model, while the obtained values of the bulk diffusion length of minority carriers (electrons) in the p-HgCdTe film of the investigated HgCdTe n-on-p FPA photodetectors were found to be in good agreement with previously reported carrier diffusion-length values for HgCdTe.
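As a simplified illustration of extracting a diffusion length from a spot-scan profile, the sketch below fits the exponential tail of the measured signal, photocurrent proportional to exp(-x/l_d), by a log-linear least-squares fit. The full analysis in the paper uses a diffusion model and the j_ph → 0 asymptotics, which this sketch does not reproduce.

```python
import numpy as np

def diffusion_length(x_um, signal):
    """Estimate an effective diffusion length from the exponential tail of a
    spot-scan profile, assuming signal ~ exp(-x / l_d), via a log-linear fit.
    This is an illustrative simplification of the diffusion-model analysis."""
    x = np.asarray(x_um, dtype=float)
    y = np.asarray(signal, dtype=float)
    keep = y > 0                       # log is only defined for positive samples
    slope, _ = np.polyfit(x[keep], np.log(y[keep]), 1)
    return -1.0 / slope                # micrometres, for x given in micrometres
```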
Performance of NICER flight x-ray concentrator
NASA Astrophysics Data System (ADS)
Okajima, Takashi; Soong, Yang; Balsamo, Erin R.; Enoto, Teruaki; Olsen, Larry; Koenecke, Richard; Lozipone, Larry; Kearney, John; Fitzsimmons, Sean; Numata, Ai; Kenyon, Steven J.; Arzoumanian, Zaven; Gendreau, Keith
2016-07-01
The Neutron star Interior Composition ExploreR (NICER) is a NASA instrument to be flown onboard the International Space Station, equipped with 56 pairs of an X-ray concentrator (XRC) and a silicon drift detector for high-timing observations. The XRC is based on epoxy-replicated thin aluminum foil X-ray mirrors, similar to those of Suzaku and ASTRO-H (Hitomi), but is only a single-stage parabolic grazing-incidence optic. Each has a focal length of 1.085 m and a diameter of 105 mm, with 24 confocally aligned parabolic shells. Grazing incidence angles to individual shells range from 0.4 to 1.4 deg. The 56 flight XRCs have been completed and successfully delivered for payload integration. All the XRCs were characterized at the NASA/GSFC 100-m X-ray beamline using 1.5 keV X-rays (some also at 4.5 keV). The XRC performance, effective area and point spread function, was measured with a CCD camera and a proportional counter. The average effective area is about 44 cm2 at 1.5 keV and about 18 cm2 at 4.5 keV, which is consistent with a micro-roughness of 0.5 nm from individual shell reflectivity measurements. The XRC focuses about 91% of X-rays into a 2 mm aperture at the focal plane, which is the NICER detector window size. Each XRC weighs only 325 g. This performance meets the project requirements. In this paper, we present a summary of the flight XRC performance as well as co-alignment results for the 56 XRCs on the flight payload, as this is important for estimating the total effective area for astronomical observations.
Wang, Zhengzhou; Hu, Bingliang; Yin, Qinye
2017-01-01
The schlieren method of measuring far-field focal spots offers many advantages at the Shenguang III laser facility, such as low cost and automatic laser-path collimation. However, current methods of far-field focal-spot measurement often suffer from low precision and efficiency because the final focal spot is merged manually, which reduces the accuracy of reconstruction. In this paper, we introduce an improved schlieren method to construct a high-dynamic-range image of far-field focal spots and to improve the reconstruction accuracy and efficiency. First, a detection method based on weak-beam sampling and magnified imaging was designed; images of the main and side lobes of the focused laser irradiance in the far field were obtained using two scientific CCD cameras. Second, using a self-correlation template-matching algorithm, a circle the same size as the schlieren ball was cut from the main-lobe image, and the relative position between the images was varied within a 100×100 pixel region; the position that gave the largest correlation coefficient between the side-lobe image and the main-lobe image with the circle cut out was identified as the best matching point. Finally, the least-squares method was used to fit the center of the side-lobe schlieren ball, and the error was less than 1 pixel. The experimental results show that this method enables accurate, high-dynamic-range measurement of a far-field focal spot and automatic image reconstruction. Because the best matching point is obtained through image processing rather than traditional reconstruction based on manual splicing, the method is less dependent on manual intervention and thus offers better precision and efficiency in focal-spot reconstruction. PMID:28207758
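A generic stand-in for the template-matching step: the sketch below finds the offset of best normalized cross-correlation between a patch taken from the side-lobe image and the main-lobe image, scanning candidate positions by brute force (adequate for a search region on the order of 100×100 pixels). It is illustrative only; the authors' self-correlation algorithm and circle-cutting details are not reproduced.

```python
import numpy as np

def best_match(side_lobe_patch, main_lobe_image):
    """Locate the position in the main-lobe image that best matches a patch cut
    from the side-lobe image, using normalized cross-correlation (a generic
    stand-in for the self-correlation template-matching step)."""
    ph, pw = side_lobe_patch.shape
    t = side_lobe_patch - side_lobe_patch.mean()
    best_score, best_pos = -np.inf, (0, 0)
    H, W = main_lobe_image.shape
    for r in range(H - ph + 1):
        for c in range(W - pw + 1):
            win = main_lobe_image[r:r + ph, c:c + pw]
            w = win - win.mean()
            denom = np.sqrt((t * t).sum() * (w * w).sum()) + 1e-12
            score = (t * w).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```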
Theoretical analysis for double-liquid variable focus lens
NASA Astrophysics Data System (ADS)
Peng, Runling; Chen, Jiabi; Zhuang, Songlin
2007-09-01
In this paper, various structures for double-liquid variable-focus lenses are introduced. Based on an energy-minimization method, explicit calculations and detailed analyses of an extended Young-type equation are given for double-liquid lenses with a cylindrical electrode. Such an equation is especially applicable to liquid-liquid-solid tri-phase systems; it differs slightly from the traditional Young equation, which was derived for vapor-liquid-solid tri-phase systems. The electrowetting effect caused by an external voltage changes the shape of the interface between the two liquids and hence the focal length of the lens. Based on the extended Young-type equation, the relationship between the focal length and the external voltage can also be derived. The corresponding equations and simulation results are presented.
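As a simpler, hedged illustration of how an external voltage maps to focal length in an electrowetting lens, the sketch below combines the standard Young-Lippmann relation with a spherical-cap meniscus in a cylindrical cell and a thin-interface approximation for the optical power. All material parameters are assumed values, and this is not the extended Young-type analysis of the paper.

```python
import math

# Illustrative (assumed) parameters: not taken from the paper.
eps0 = 8.854e-12        # vacuum permittivity, F/m
eps_r = 3.0             # relative permittivity of the insulating layer (assumed)
d = 1e-6                # insulator thickness, m (assumed)
gamma = 0.04            # oil/water interfacial tension, N/m (assumed)
theta0_deg = 160.0      # contact angle at zero voltage (assumed)
R = 1.5e-3              # cylindrical cell radius, m (assumed)
n_water, n_oil = 1.33, 1.48

def focal_length(voltage):
    """Focal length of the liquid-liquid interface vs. applied voltage, via the
    Young-Lippmann equation and a spherical-cap meniscus in a cylindrical cell."""
    cos_theta = math.cos(math.radians(theta0_deg)) + eps_r * eps0 * voltage**2 / (2 * gamma * d)
    cos_theta = max(-1.0, min(1.0, cos_theta))
    if abs(cos_theta) < 1e-6:          # flat interface: no focusing power
        return math.inf
    r_curv = R / cos_theta             # meniscus radius of curvature
    return r_curv / (n_oil - n_water)  # thin-interface approximation

for v in (0.0, 20.0, 40.0, 60.0):
    print(f"V = {v:4.0f} V  ->  f = {focal_length(v) * 100:8.2f} cm")
```

With these assumed numbers the lens passes from diverging to converging as the voltage increases, which is the qualitative behavior electrowetting lenses are designed to exploit.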
NASA Astrophysics Data System (ADS)
Zeng, Zhi; Peng, Runling; He, Mei
2017-02-01
The double-liquid variable-focus lens based on electrowetting has the advantages of small size, light weight, fast response, and low cost. In this paper, the principle and structure of the double-liquid variable-focus lens are introduced. The causes of contact-angle hysteresis, and ways to reduce it, are discussed on the basis of an improved Young equation. Finally, 1-bromododecane is mixed with silicone oil in different proportions to obtain insulating oil liquids of different viscosities. External voltages are applied to three such liquid lenses, and the focal length of each lens is investigated as a function of the applied voltage. Experiments show that decreasing the viscosity of the oil liquid can reduce the focal-length hysteresis.