Measuring the Flatness of Focal Plane for Very Large Mosaic CCD Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, Jiangang; Estrada, Juan; Cease, Herman
2010-06-08
Large mosaic multi-CCD cameras are the key instruments for modern digital sky surveys. DECam is an extremely red-sensitive 520-megapixel camera designed for the upcoming Dark Energy Survey (DES). It consists of sixty-two 4k x 2k and twelve 2k x 2k 250-micron-thick fully depleted CCDs, with a focal plane 44 cm in diameter and a field of view of 2.2 square degrees. It will be attached to the Blanco 4-meter telescope at CTIO. The DES will cover 5000 square degrees of the southern galactic cap in 5 color bands (g, r, i, z, Y) over 5 years starting in 2011. To achieve the science goal of constraining the dark energy evolution, stringent requirements are laid down for the design of DECam. Among them, the flatness of the focal plane needs to be controlled within a 60-micron envelope in order to achieve the specified PSF variation limit. It is very challenging to measure the flatness of the focal plane to such precision when it is placed in a high-vacuum dewar at 173 K. We developed two image-based techniques to measure the flatness of the focal plane. By imaging a regular grid of dots onto the focal plane, a CCD offset along the optical axis is converted into a variation of the grid spacing at different positions on the focal plane. After extracting the patterns and comparing the changes in spacing, we can measure the flatness to high precision. In method 1, the grid of dots is fabricated to sub-micron precision and covers the whole focal plane. In method 2, no high precision is required for the grid; instead, a precise XY stage moves the pattern across the whole focal plane and we compare the variations in spacing as it is imaged by different CCDs. Simulations and real measurements show that the two methods work very well for our purpose and are in good agreement with direct optical measurements.
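For readers wanting the geometric intuition behind the dot-grid approach, the sketch below converts measured dot spacings into height offsets using simple pinhole (chief-ray) geometry, in which the imaged spacing scales linearly with the lens-to-detector distance. The distances, spacings, and function name are illustrative assumptions, not DECam's actual setup or the authors' code.

```python
import numpy as np

def height_offsets_from_spacings(spacings_mm, reference_spacing_mm, image_distance_mm):
    """Convert locally measured dot spacings into focal-plane height offsets.

    Chief-ray (pinhole) model: the imaged grid spacing s grows linearly with
    the distance from the lens to the detector surface, so a CCD displaced by
    dz along the optical axis from the nominal image plane (at distance z_img)
    shows a fractional spacing change ds/s = dz/z_img, i.e.
    dz = z_img * (s - s_ref) / s_ref.
    """
    s = np.asarray(spacings_mm, dtype=float)
    return image_distance_mm * (s - reference_spacing_mm) / reference_spacing_mm

# Hypothetical numbers, for illustration only: nominal image distance 500 mm,
# nominal imaged spacing 1.0000 mm, spacings measured on four CCD regions.
measured = [1.0000, 1.0001, 0.9999, 1.00005]
dz_mm = height_offsets_from_spacings(measured, 1.0000, 500.0)
print(dz_mm * 1000.0, "micron offsets along the optical axis")
```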
PreCam Work at ANL: The Argonne/HEP Dark Energy Survey (DES) group, working on the Dark Energy Camera (DECam), built a mini-DECam camera called PreCam. This camera has provided valuable ...
Micro-Imagers for Spaceborne Cell-Growth Experiments
NASA Technical Reports Server (NTRS)
Behar, Alberto; Matthews, Janet; SaintAnge, Beverly; Tanabe, Helen
2006-01-01
A document discusses selected aspects of a continuing effort to develop five micro-imagers for both still and video monitoring of cell cultures to be grown aboard the International Space Station. The approach taken in this effort is to modify and augment pre-existing electronic micro-cameras. Each such camera includes an image-detector integrated-circuit chip, signal-conditioning and image-compression circuitry, and connections for receiving power from, and exchanging data with, external electronic equipment. Four white and four multicolor light-emitting diodes are to be added to each camera for illuminating the specimens to be monitored. The lens used in the original version of each camera is to be replaced with a shorter-focal-length, more-compact singlet lens to make it possible to fit the camera into the limited space allocated to it. Initially, the lenses in the five cameras are to have different focal lengths: the focal lengths are to be 1, 1.5, 2, 2.5, and 3 cm. Once one of the focal lengths is determined to be the most nearly optimum, the remaining four cameras are to be fitted with lenses of that focal length.
Lee, Sahmin; Yoon, Chang-Hwan; Oh, Il-Young; Suh, Jung-Won; Cho, Young-Seok; Cho, Goo-Yeong; Chae, In-Ho; Choi, Dong-Ju; Youn, Tae-Jin
2015-01-01
The angiographic features of restenosis contain prognostic information. However, restenosis patterns of the new-generation drug-eluting stents (DES), the everolimus- (EES) and resolute zotarolimus-eluting stent (ZES), have not been described. A total of 210 consecutive patients with DES restenosis were enrolled from 2003 to 2012. We analyzed 217 restenotic lesions after DES implantation, and compared the morphologic characteristics of second-generation DES restenosis to those of restenosis with two first-generation DES, the sirolimus- (SES) and paclitaxel-eluting stent (PES). Baseline characteristics were comparable between the different stent groups. The incidence of focal restenosis was significantly lower for PES than for the other stents (49.5% versus 87.0%, 76.2%, and 82.1% for PES versus SES, EES, and ZES, respectively, P < 0.001). When considering the pattern of restenosis solely within the stent margins, a further clear distinction between PES and the other stents was observed (40.0% versus 92.9%, 88.9%, and 81.2% in PES versus SES, EES, and ZES, respectively, P < 0.001). There were no significant differences in restenosis patterns among SES, EES, and ZES. In multivariate analysis, PES implantation, hypertension, and age were associated with a non-focal type of restenosis after DES implantation. After the introduction of EES and ZES into routine clinical practice in 2008, focal restenosis significantly increased from 63.9% to 76.7% and diffuse restenosis significantly decreased from 26.4% to 11.0% (P = 0.045). Focal restenosis was the most common pattern of restenosis in the new-generation DES, and the incidence of diffuse restenosis decreased significantly with the introduction of the second-generation DES.
Thermal Effects on Camera Focal Length in Messenger Star Calibration and Orbital Imaging
NASA Astrophysics Data System (ADS)
Burmeister, S.; Elgner, S.; Preusker, F.; Stark, A.; Oberst, J.
2018-04-01
We analyse images taken by the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft for the camera's thermal response in the harsh thermal environment near Mercury. Specifically, we study thermally induced variations in the focal length of the Mercury Dual Imaging System (MDIS). Within the several hundreds of images of star fields, the Wide Angle Camera (WAC) typically captures up to 250 stars in one frame of the panchromatic channel. We measure star positions and relate these to the known star coordinates taken from the Tycho-2 catalogue. We solve for camera pointing, the focal length parameter and two non-symmetrical distortion parameters for each image. Using data from the temperature sensors on the camera focal plane, we model a linear focal length function of the form f(T) = A0 + A1 T. Next, we use images from MESSENGER's orbital mapping mission. We deal with large image blocks, typically used for the production of high-resolution digital terrain models (DTMs). We analysed images from the combined quadrangles H03 and H07, a selected region covered by approx. 10,600 images, in which we identified about 83,900 tiepoints. Using bundle block adjustments, we solved for the unknown coordinates of the control points, the pointing of the camera, as well as the camera's focal length. We then fit the above linear function with respect to the focal plane temperature. As a result, we find a complex response of the camera to the thermal conditions of the spacecraft. To first order, we see a linear increase of approx. 0.0107 mm per degree temperature for the Narrow-Angle Camera (NAC). This is in agreement with the observed thermal response seen in images of the panchromatic channel of the WAC. Unfortunately, further comparisons of results from the two methods, both of which use different portions of the available image data, are limited. If left uncorrected, these effects may pose significant difficulties in the photogrammetric analysis; specifically, they may be responsible for erroneous long-wavelength trends in topographic models.
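A minimal sketch of the linear thermal model quoted in the abstract, f(T) = A0 + A1 T, fitted by least squares; the temperatures and focal lengths below are invented for illustration (only the ~0.0107 mm/deg slope scale comes from the abstract).

```python
import numpy as np

# Synthetic per-image solutions: focal-plane temperature (deg C) and the
# focal length (mm) recovered from each star-field image.  Values are made up.
T = np.array([-20.0, -5.0, 10.0, 25.0, 40.0])
f = np.array([549.60, 549.76, 549.92, 550.08, 550.24])

# Ordinary least-squares fit of the linear model f(T) = A0 + A1*T.
A1, A0 = np.polyfit(T, f, deg=1)
print(f"f(T) = {A0:.3f} mm + {A1:.5f} mm/degC * T")
```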
Mitigation of Atmospheric Effects on Imaging Systems
2004-03-31
focal length. The imaging system had two cameras: an Electrim camera sensitive in the visible (0.6 µm) waveband and an Amber QWIP infrared camera...sensitive in the 9-micron region. The Amber QWIP infrared camera had 256x256 pixels, a pixel pitch of 38 µm, a focal length of 1.8 m, and a FOV of 5.4 x 5.4 mrad...each day. Unfortunately, signals from the different read ports of the Electrim camera picked up noise on their way to the digitizer, and this resulted
The imaging system design of three-line LMCCD mapping camera
NASA Astrophysics Data System (ADS)
Zhou, Huai-de; Liu, Jin-Guo; Wu, Xing-Xing; Lv, Shi-Liang; Zhao, Ying; Yu, Da
2011-08-01
In this paper, the authors first introduce the theory of the LMCCD (line-matrix CCD) mapping camera and the composition of its imaging system. Secondly, several pivotal designs of the imaging system are introduced, such as the design of the focal plane module, the video signal processing, the controller design of the imaging system, and synchronous photography of the forward, nadir and backward cameras and of the nadir line-matrix CCD. Finally, the test results of the LMCCD mapping camera imaging system are presented. The results are as follows: the precision of synchronous photography of the forward, nadir and backward cameras is better than 4 ns, as is that of the nadir line-matrix CCD; the photography interval of the nadir camera's line-matrix CCD satisfies the buffer requirements of the LMCCD focal plane module; the SNR of each CCD image, tested in the laboratory, is better than 95 under typical working conditions (solar incidence angle of 30 degrees, earth-surface reflectivity of 0.3); and the temperature of the focal plane module is kept under 30 °C over a 15-minute working period. These results satisfy the requirements for synchronous photography, focal plane temperature control and SNR, guaranteeing the precision needed for satellite photogrammetry.
Mechanical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordby, Martin; Bowden, Gordon; Foss, Mike
2008-06-13
The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It is comprised of three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.
Instrumental Response Model and Detrending for the Dark Energy Camera
Bernstein, G. M.; Abbott, T. M. C.; Desai, S.; ...
2017-09-14
We describe the model for mapping from sky brightness to the digital output of the Dark Energy Camera (DECam) and the algorithms adopted by the Dark Energy Survey (DES) for inverting this model to obtain photometric measures of celestial objects from the raw camera output. This calibration aims for fluxes that are uniform across the camera field of view and across the full angular and temporal span of the DES observations, approaching the accuracy limits set by shot noise for the full dynamic range of DES observations. The DES pipeline incorporates several substantive advances over standard detrending techniques, including principal-components-based sky and fringe subtraction; correction of the "brighter-fatter" nonlinearity; use of internal consistency in on-sky observations to disentangle the influences of quantum efficiency, pixel-size variations, and scattered light in the dome flats; and pixel-by-pixel characterization of instrument spectral response, through combination of internal-consistency constraints with auxiliary calibration data. This article provides conceptual derivations of the detrending/calibration steps, and the procedures for obtaining the necessary calibration data. Other publications will describe the implementation of these concepts for the DES operational pipeline, the detailed methods, and the validation that the techniques can bring DECam photometry and astrometry within ≈2 mmag and ≈3 mas, respectively, of fundamental atmospheric and statistical limits. In conclusion, the DES techniques should be broadly applicable to wide-field imagers.
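As a conceptual illustration of what "inverting the forward model" means, the sketch below undoes a simplified bias/flat/fringe model; it is not the DES pipeline, and the frame names and numbers are placeholders.

```python
import numpy as np

def detrend(raw_adu, bias_adu, gain_e_per_adu, flat, fringe_e):
    """Invert the simplified forward model raw = bias + (flat*sky + fringe)/gain.

    Returns the sky signal in electrons per pixel.  The real DES pipeline adds
    brighter-fatter correction, PCA sky/fringe subtraction, star flats, and
    pixel-level spectral-response terms on top of this basic inversion.
    """
    electrons = (np.asarray(raw_adu, dtype=float) - bias_adu) * gain_e_per_adu
    return (electrons - fringe_e) / flat

# Toy frames with made-up calibration values.
raw = np.full((4, 4), 1200.0)                       # raw counts in ADU
sky = detrend(raw, bias_adu=500.0, gain_e_per_adu=4.0,
              flat=np.ones((4, 4)), fringe_e=np.zeros((4, 4)))
print(sky.mean(), "e-/pixel")
```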
Russo, Paolo; Mettivier, Giovanni
2011-04-01
The goal of this study is to evaluate a new method based on a coded aperture mask combined with a digital x-ray imaging detector for measurements of the focal spot sizes of diagnostic x-ray tubes. Common techniques for focal spot size measurements employ a pinhole camera, a slit camera, or a star resolution pattern. The coded aperture mask is a radiation collimator consisting of a large number of apertures disposed on a predetermined grid in an array, through which the radiation source is imaged onto a digital x-ray detector. The coded mask camera method allows one to obtain a one-shot, accurate and direct measurement of the two dimensions of the focal spot (as with a pinhole camera) but at a low tube loading (as with a slit camera). The large number of small apertures in the coded mask operates as a "multipinhole" with greater efficiency than a single pinhole, while keeping the resolution of a single pinhole. X-ray images result from the multiplexed output on the detector image plane of such a multiple aperture array, and the image of the source is digitally reconstructed with a deconvolution algorithm. Images of the focal spot of a laboratory x-ray tube (W anode; 35-80 kVp; focal spot size of 0.04 mm) were acquired at different geometrical magnifications with two different types of digital detector (a photon counting hybrid silicon pixel detector with 0.055 mm pitch and a flat panel CMOS digital detector with 0.05 mm pitch) using a high resolution coded mask (a no-two-holes-touching modified uniformly redundant array) with 480 apertures of 0.07 mm, designed for imaging at energies below 35 keV. Measurements with a slit camera were performed for comparison. A test with a pinhole camera and with the coded mask on a computed radiography mammography unit with a 0.3 mm focal spot was also carried out. The full width at half maximum focal spot sizes were obtained from the line profiles of the decoded images, showing a focal spot of 0.120 mm x 0.105 mm at 35 kVp and M = 6.1, with a detector entrance exposure as low as 1.82 mR (0.125 mA s tube load). The slit camera indicated a focal spot of 0.112 mm x 0.104 mm at 35 kVp and M = 3.15, with an exposure at the detector of 72 mR. Focal spot measurements with the coded mask could be performed up to 80 kVp. Tolerance to angular misalignment with respect to the reference beam of up to 7 degrees for in-plane rotations and 1 degree for out-of-plane rotations was observed. The axial distance of the focal spot from the coded mask could also be determined. It is possible to determine the beam intensity via measurement of the intensity of the decoded image of the focal spot and a calibration procedure. Coded aperture masks coupled to a digital area detector provide precise determinations of the focal spot of an x-ray tube with reduced tube loading and measurement time, together with a large tolerance in the alignment of the mask.
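The decoding step can be illustrated with a generic correlation-based reconstruction, sketched below under the usual coded-aperture assumption that the recorded image is the source convolved with the mask pattern; the mask, decoder, and test spot are synthetic, and this is not the deconvolution code used in the paper.

```python
import numpy as np

def decode_coded_aperture(recorded, mask):
    """Recover the source (focal spot) by cross-correlating the recorded
    image with a balanced decoding array G = 2*mask - 1 (URA/MURA style),
    implemented here as an FFT-based circular correlation."""
    decoder = 2.0 * mask - 1.0
    F = np.fft.fft2(recorded)
    G = np.fft.fft2(decoder, s=recorded.shape)
    return np.real(np.fft.ifft2(F * np.conj(G)))

# Synthetic test: a random binary mask and a two-pixel "focal spot".
rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=(64, 64)).astype(float)
spot = np.zeros((64, 64))
spot[32, 32], spot[32, 33] = 1.0, 0.5
# Ideal multiplexed recording: circular convolution of spot and mask.
recorded = np.real(np.fft.ifft2(np.fft.fft2(spot) * np.fft.fft2(mask)))
decoded = decode_coded_aperture(recorded, mask)
print(np.unravel_index(decoded.argmax(), decoded.shape))  # peak near the spot
```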
Voyager spacecraft images of Jupiter and Saturn
NASA Technical Reports Server (NTRS)
Birnbaum, M. M.
1982-01-01
The Voyager imaging system is described, noting that it is made up of a narrow-angle and a wide-angle TV camera, each in turn consisting of optics, a filter wheel and shutter assembly, a vidicon tube, and an electronics subsystem. The narrow-angle camera has a focal length of 1500 mm; its field of view is 0.42 deg and its focal ratio is f/8.5. For the wide-angle camera, the focal length is 200 mm, the field of view 3.2 deg, and the focal ratio f/3.5. Images are exposed by each camera through one of eight filters in the filter wheel onto the photoconductive surface of a magnetically focused and deflected vidicon having a diameter of 25 mm. The vidicon storage surface (target) is a selenium-sulfur film having an active area of 11.14 x 11.14 mm; it holds a frame consisting of 800 lines with 800 picture elements per line. Pictures of Jupiter, Saturn, and their moons are presented, with short descriptions given of the area being viewed.
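As a consistency check derived from the numbers quoted above (not stated in the original), the fields of view follow from the vidicon target size d = 11.14 mm and the focal lengths via θ = 2 arctan(d/2f):

$$ \theta_{\rm NA} = 2\arctan\!\frac{11.14\ \mathrm{mm}}{2\times 1500\ \mathrm{mm}} \approx 0.43^\circ, \qquad \theta_{\rm WA} = 2\arctan\!\frac{11.14\ \mathrm{mm}}{2\times 200\ \mathrm{mm}} \approx 3.2^\circ, $$

in agreement (to rounding) with the quoted 0.42 deg and 3.2 deg fields of view.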
A digital gigapixel large-format tile-scan camera.
Ben-Ezra, M
2011-01-01
Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications in cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and thus can provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
Photodetectors for the Advanced Gamma-ray Imaging System (AGIS)
NASA Astrophysics Data System (ADS)
Wagner, Robert G.; Advanced Gamma-ray Imaging System AGIS Collaboration
2010-03-01
The Advanced Gamma-Ray Imaging System (AGIS) is a concept for the next generation very high energy gamma-ray observatory. Design goals include an order of magnitude better sensitivity, better angular resolution, and a lower energy threshold than existing Cherenkov telescopes. Each telescope is equipped with a camera that detects and records the Cherenkov-light flashes from air showers. The camera comprises a pixelated focal plane of blue-sensitive and fast (nanosecond) photon detectors that detect the photon signal and convert it into an electrical one. Given the scale of AGIS, the camera must be reliable and cost effective. The Schwarzschild-Couder optical design yields a smaller plate scale than present-day Cherenkov telescopes, enabling the use of more compact, multi-pixel devices, including multianode photomultipliers or Geiger avalanche photodiodes. We present the conceptual design of the focal plane for the camera and results from testing candidate focal plane sensors.
Radiometric calibration of an ultra-compact microbolometer thermal imaging module
NASA Astrophysics Data System (ADS)
Riesland, David W.; Nugent, Paul W.; Laurie, Seth; Shaw, Joseph A.
2017-05-01
As microbolometer focal plane array formats steadily decrease in size, new challenges arise in correcting for thermal drift in the calibration coefficients. As the thermal mass of the cameras decreases, the focal plane becomes more sensitive to external thermal inputs. This paper shows results from a temperature compensation algorithm for characterizing and radiometrically calibrating a FLIR Lepton camera.
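A minimal sketch of one way such a temperature compensation can be formulated: model the raw output as a linear function of scene temperature plus an FPA-temperature-dependent offset and fit it to blackbody observations. The model form and all numbers are assumptions for illustration, not the authors' algorithm or actual Lepton calibration data.

```python
import numpy as np

# Columns: blackbody (scene) temperature [C], FPA temperature [C], mean raw DN.
# All values are synthetic.
obs = np.array([
    [20.0, 25.0, 8100.0],
    [40.0, 25.0, 8600.0],
    [20.0, 35.0, 8040.0],
    [40.0, 35.0, 8540.0],
])
T_scene, T_fpa, DN = obs.T

# Fit DN = r*T_scene + b0 + b1*T_fpa by least squares.
A = np.column_stack([T_scene, np.ones_like(T_fpa), T_fpa])
(r, b0, b1), *_ = np.linalg.lstsq(A, DN, rcond=None)

def scene_temperature(dn, t_fpa):
    """Apply the calibration, removing the FPA-temperature-dependent offset."""
    return (dn - b0 - b1 * t_fpa) / r

print(round(scene_temperature(8500.0, 30.0), 2), "C")
```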
640x480 PtSi Stirling-cooled camera system
NASA Astrophysics Data System (ADS)
Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; Coyle, Peter J.; Feder, Howard L.; Gilmartin, Harvey R.; Levine, Peter A.; Sauer, Donald J.; Shallcross, Frank V.; Demers, P. L.; Smalser, P. J.; Tower, John R.
1992-09-01
A Stirling-cooled 3-5 micron camera system has been developed. The camera employs a monolithic 640 x 480 PtSi-MOS focal plane array. The camera system achieves an NEDT of 0.10 K at a 30 Hz frame rate with f/1.5 optics (300 K background). At a spatial frequency of 0.02 cycles/mrad, the vertical and horizontal minimum resolvable temperature (MRT) are approximately 0.03 K (f/1.5 optics, 300 K background). The MOS focal plane array achieves a resolution of 480 TV lines per picture height, independent of background level and position within the frame.
Motion camera based on a custom vision sensor and an FPGA architecture
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel
1998-09-01
A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing. The camera is used with a custom sensor for motion detection to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques. The sensor communicates motion events using the event-address protocol associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC computer, which is used for high-level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing, such as spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation and the FPGA architecture used in the motion camera system.
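The time-of-travel idea can be sketched in a few lines of software (the paper implements it in FPGA hardware): when neighbouring pixels report edge events at times t_i and t_{i+1}, the local speed is the pixel pitch divided by the time difference. The pitch, event list, and function below are hypothetical.

```python
PIXEL_PITCH_UM = 20.0   # hypothetical pixel pitch, microns

def velocities_from_events(events, pitch_um=PIXEL_PITCH_UM):
    """events: iterable of (pixel_index, timestamp_us) from the event-address bus.

    Returns (pixel_index, speed_um_per_us) estimates, signed by the direction
    the edge travelled (from the lower- or higher-indexed neighbour).
    """
    last_time = {}
    estimates = []
    for pixel, t in events:
        for neighbour in (pixel - 1, pixel + 1):
            if neighbour in last_time:
                dt = t - last_time[neighbour]
                if dt > 0:
                    direction = 1 if neighbour < pixel else -1
                    estimates.append((pixel, direction * pitch_um / dt))
        last_time[pixel] = t
    return estimates

print(velocities_from_events([(10, 0.0), (11, 50.0), (12, 101.0)]))
# -> edge moving in +x at roughly 0.4 um/us
```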
NASA Technical Reports Server (NTRS)
1976-01-01
Trade studies were conducted to ensure the overall feasibility of the focal plane camera in a radial module. The primary variable in the trade studies was the location of the pickoff mirror, on axis versus off-axis. Two alternatives were: (1) the standard (electromagnetic focus) SECO submodule, and (2) the MOD 15 permanent magnet focus SECO submodule. The technical areas of concern were the packaging affected parameters of thermal dissipation, focal plane obscuration, and image quality.
Comparison of photogrammetric and astrometric data reduction results for the wild BC-4 camera
NASA Technical Reports Server (NTRS)
Hornbarger, D. H.; Mueller, I. I.
1971-01-01
The results of astrometric and photogrammetric plate reduction techniques for a short focal length camera are compared. Several astrometric models are tested on entire and limited plate areas to analyze their ability to remove systematic errors from interpolated satellite directions using a rigorous photogrammetric reduction as a standard. Residual plots are employed to graphically illustrate the analysis. Conclusions are made as to what conditions will permit the astrometric reduction to achieve comparable accuracies to those of photogrammetric reduction when applied for short focal length ballistic cameras.
Fabrication of multi-focal microlens array on curved surface for wide-angle camera module
NASA Astrophysics Data System (ADS)
Pan, Jun-Gu; Su, Guo-Dung J.
2017-08-01
In this paper, we present a wide-angle and compact camera module that consists of a microlens array with different focal lengths on a curved surface. The design integrates the principles of an insect's compound eye and the human eye. It contains a curved hexagonal microlens array and a spherical lens. Normal mobile phone cameras usually need no fewer than four lenses, whereas our proposed system uses only one. Furthermore, the thickness of our proposed system is only 2.08 mm and the diagonal full field of view is about 100 degrees. In order to make the critical microlens array, we used inkjet printing to control the surface shape of each microlens to achieve different focal lengths, and used a replication method to form the curved hexagonal microlens array.
Bennett, C.L.
1996-07-23
An imaging Fourier transform spectrometer is described, having a Fourier transform infrared spectrometer providing a series of images to a focal plane array camera. The focal plane array camera is clocked to a multiple of the zero crossing occurrences caused by a moving mirror of the Fourier transform infrared spectrometer and detected by a laser detector, such that the frame capture rate of the focal plane array camera corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer. The images are transmitted to a computer for processing such that representations of the images, as viewed in the light of an arbitrary spectral "fingerprint" pattern, can be displayed on a monitor or otherwise stored and manipulated by the computer. 2 figs.
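For reference (standard Michelson interferometer relations, not stated in the patent abstract): with a reference laser of wavelength λ_ref and a mirror moving at speed v, the optical path difference changes at 2v and the laser interferogram crosses zero every λ_ref/2 of path difference, so the zero-crossing rate and the synchronized frame rate are

$$ f_{\rm zc} \;=\; \frac{2\,\dot{\mathrm{OPD}}}{\lambda_{\rm ref}} \;=\; \frac{4\,v_{\rm mirror}}{\lambda_{\rm ref}}, \qquad f_{\rm frame} \;=\; k\, f_{\rm zc}, \quad k \in \mathbb{N}. $$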
A telephoto camera system with shooting direction control by gaze detection
NASA Astrophysics Data System (ADS)
Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro
2015-05-01
For safe driving, it is important for the driver to check traffic conditions such as traffic lights or traffic signs as early as possible. If an on-vehicle camera captures images of the important objects needed to understand traffic conditions from a long distance and shows these to the driver, the driver can understand the traffic conditions earlier. To image distant objects clearly, the focal length of the camera must be long. When the focal length is long, however, an on-vehicle camera does not have a sufficient field of view to check traffic conditions. Therefore, in order to obtain the necessary images from a long distance, the camera must have a long focal length and a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took a telescopic image and displayed it to the driver. However, the driver used a touch panel to indicate the shooting direction, which can disturb driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, to avoid disturbing driving. The proposed system is composed of a gaze detector and an active telephoto camera whose shooting direction is controlled. We adopt a non-wearable detection method to avoid hindering driving. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners and can be switched within a few milliseconds. Experiments confirmed that the proposed system captures images of the region straight ahead at which the subject is gazing.
PNIC - A near infrared camera for testing focal plane arrays
NASA Astrophysics Data System (ADS)
Hereld, Mark; Harper, D. A.; Pernic, R. J.; Rauscher, Bernard J.
1990-07-01
This paper describes the design and the performance of the Astrophysical Research Consortium prototype near-infrared camera (pNIC) designed to test focal plane arrays both on and off the telescope. Special attention is given to the detector in pNIC, the mechanical and optical designs, the electronics, and the instrument interface. Experiments performed to illustrate the most salient aspects of pNIC are described.
SPARTAN Near-IR Camera (Instrumentation at SOAR) - System Overview: The Spartan Infrared Camera is a high spatial resolution near-IR imager. Spartan has a focal plane consisting of four ...
Space infrared telescope facility wide field and diffraction limited array camera (IRAC)
NASA Technical Reports Server (NTRS)
Fazio, G. G.
1986-01-01
IRAC focal plane detector technology was developed and studies of alternate focal plane configurations were supported. While any of the alternate focal planes under consideration would have a major impact on the Infrared Array Camera, it was possible to proceed with detector development and optical analysis research based on the proposed design since, to a large degree, the studies undertaken are generic to any SIRTF imaging instrument. Development of the proposed instrument was also important in a situation in which none of the alternate configurations has received the approval of the Science Working Group.
Plenoptic camera based on a liquid crystal microlens array
NASA Astrophysics Data System (ADS)
Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Xie, Changsheng
2015-09-01
A liquid crystal microlens array (LCMLA) whose focal length is tuned by voltage signals applied between its top and bottom electrodes is fabricated, and its common optical focusing characteristics are tested. The relationship between the focal length and the applied voltage signals is given. The LCMLA is integrated with an image sensor and further coupled with a main lens so as to construct a plenoptic camera. Several raw images are acquired with the LCMLA-based plenoptic camera we constructed at different applied voltage signals and compared. Our experiments demonstrate that, by utilizing an LCMLA in a plenoptic camera, the focused zone of the LCMLA-based plenoptic camera can be shifted effectively simply by changing the voltage signals applied between the electrodes of the LCMLA, which is equivalent to an extension of the depth of field.
HandSight: Supporting Everyday Activities through Touch-Vision
2015-10-01
...switches between IR and RGB; large, low resolution, and fixed focal length > 1 ft. Raspberry Pi NoIR (https://www.raspberrypi.org/products/pi-noir-camera/): Raspberry Pi NoIR camera with external visible-light filters; good image quality, manually adjustable focal length, small, programmable.
Bennett, Charles L.
1996-01-01
An imaging Fourier transform spectrometer (10, 210) having a Fourier transform infrared spectrometer (12) providing a series of images (40) to a focal plane array camera (38). The focal plane array camera (38) is clocked to a multiple of zero crossing occurrences as caused by a moving mirror (18) of the Fourier transform infrared spectrometer (12) and as detected by a laser detector (50) such that the frame capture rate of the focal plane array camera (38) corresponds to a multiple of the zero crossing rate of the Fourier transform infrared spectrometer (12). The images (40) are transmitted to a computer (45) for processing such that representations of the images (40) as viewed in the light of an arbitrary spectral "fingerprint" pattern can be displayed on a monitor (60) or otherwise stored and manipulated by the computer (45).
Costless Platform for High Resolution Stereoscopic Images of a High Gothic Facade
NASA Astrophysics Data System (ADS)
Héno, R.; Chandelier, L.; Schelstraete, D.
2012-07-01
In October 2011, the PPMD specialized master's degree students (Photogrammetry, Positioning and Deformation Measurement) of the French ENSG (IGN's School of Geomatics, the Ecole Nationale des Sciences Géographiques) were asked to survey the main facade of the cathedral of Amiens, which is very complex as far as size and decoration are concerned. Although it was first planned to use a lift truck for the image survey, budget considerations and a taste for experimentation led the project in another direction: images shot from ground level with a long-focal-length camera are combined with complementary images shot from the higher galleries available on the main facade with a wide-angle camera fixed on a horizontal 2.5-meter-long pole. This heterogeneous image survey is being processed by the PPMD master's degree students during this academic year. Among other types of products, 3D point clouds will be calculated on specific parts of the facade from both sources of images. If the proposed device and methodology for obtaining full image coverage of the main facade prove fruitful, the image acquisition phase will be completed later by another team. This article focuses on the production of 3D point clouds with wide-angle images on the rose window of the main facade.
An electrically tunable plenoptic camera using a liquid crystal microlens array.
Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng
2015-05-01
Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited, which restricts their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototype LCMLA-based plenoptic camera for acquiring raw radiation from targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently by electrically tuning the LCMLA, which is equivalent to an extension of the DOF.
The GCT camera for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Bose, R.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Buckley, J.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jankowsky, D.; Jegouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Laporte, P.; Leach, S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Minaya, I. A.; Mohrmann, L.; Molyneux, P.; Moore, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayede, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Varner, G.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium
2017-12-01
The Gamma Cherenkov Telescope (GCT) is one of the designs proposed for the Small Sized Telescope (SST) section of the Cherenkov Telescope Array (CTA). The GCT uses dual-mirror optics, resulting in a compact telescope with good image quality and a large field of view with a smaller, more economical, camera than is achievable with conventional single mirror solutions. The photon counting GCT camera is designed to record the flashes of atmospheric Cherenkov light from gamma and cosmic ray initiated cascades, which last only a few tens of nanoseconds. The GCT optics require that the camera detectors follow a convex surface with a radius of curvature of 1 m and a diameter of 35 cm, which is approximated by tiling the focal plane with 32 modules. The first camera prototype is equipped with multi-anode photomultipliers, each comprising an 8×8 array of 6×6 mm2 pixels to provide the required angular scale, adding up to 2048 pixels in total. Detector signals are shaped, amplified and digitised by electronics based on custom ASICs that provide digitisation at 1 GSample/s. The camera is self-triggering, retaining images where the focal plane light distribution matches predefined spatial and temporal criteria. The electronics are housed in the liquid-cooled, sealed camera enclosure. LED flashers at the corners of the focal plane provide a calibration source via reflection from the secondary mirror. The first GCT camera prototype underwent preliminary laboratory tests last year. In November 2015, the camera was installed on a prototype GCT telescope (SST-GATE) in Paris and was used to successfully record the first Cherenkov light of any CTA prototype, and the first Cherenkov light seen with such a dual-mirror optical system. A second full-camera prototype based on Silicon Photomultipliers is under construction. Up to 35 GCTs are envisaged for CTA.
Performance of the e2v 1.2 GPix cryogenic camera for the J-PAS 2.5m survey telescope
NASA Astrophysics Data System (ADS)
Robbins, M. S.; Bastable, M.; Bates, A.; Dryer, M.; Eames, S.; Fenemore-Jones, G.; Haddow, G.; Jorden, P. R.; Lane, B.; Marin-Franch, A.; Mortimer, J.; Palmer, I.; Puttay, N.; Renshaw, R.; Smith, M.; Taylor, K.; Tearle, J.; Weston, P.; Wheeler, P.; Worley, J.
2016-08-01
The J-PAS project will perform a five-year survey of the northern sky from a new 2.5 m telescope in Teruel, Spain. In this paper the build and factory testing of the commercially supplied cryogenic camera are described. The 1.2 giga-pixel focal plane is contained within a novel liquid-nitrogen-cooled vacuum cryostat, which maintains the flatness of the cooled, 0.45 m diameter focal plane to better than 27 μm peak to valley. The cooling system controls the focal plane to a temperature of -100 °C with a variation across the focal plane of better than 2.5 °C and a stability of better than +/- 0.5 °C over the long periods of operation required. The proximity drive electronics achieves a total system-level noise performance of better than 5 e- for the 224-channel CCD system.
Studies on a silicon-photomultiplier-based camera for Imaging Atmospheric Cherenkov Telescopes
NASA Astrophysics Data System (ADS)
Arcaro, C.; Corti, D.; De Angelis, A.; Doro, M.; Manea, C.; Mariotti, M.; Rando, R.; Reichardt, I.; Tescaro, D.
2017-12-01
Imaging Atmospheric Cherenkov Telescopes (IACTs) represent a class of instruments which are dedicated to the ground-based observation of cosmic VHE gamma ray emission based on the detection of the Cherenkov radiation produced in the interaction of gamma rays with the Earth atmosphere. One of the key elements of such instruments is a pixelized focal-plane camera consisting of photodetectors. To date, photomultiplier tubes (PMTs) have been the common choice given their high photon detection efficiency (PDE) and fast time response. Recently, silicon photomultipliers (SiPMs) are emerging as an alternative. This rapidly evolving technology has strong potential to become superior to that based on PMTs in terms of PDE, which would further improve the sensitivity of IACTs, and see a price reduction per square millimeter of detector area. We are working to develop a SiPM-based module for the focal-plane cameras of the MAGIC telescopes to probe this technology for IACTs with large focal plane cameras of an area of few square meters. We will describe the solutions we are exploring in order to balance a competitive performance with a minimal impact on the overall MAGIC camera design using ray tracing simulations. We further present a comparative study of the overall light throughput based on Monte Carlo simulations and considering the properties of the major hardware elements of an IACT.
An astronomy camera for low background applications in the 1. 0 to 2. 5. mu. m spectral region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaki, S.A.; Bailey, G.C.; Hagood, R.W.
1989-02-01
A short wavelength (1.0-2.5 μm) 128 x 128 focal plane array forms the heart of this low background astronomy camera system. The camera is designed to accept either a 128 x 128 HgCdTe array for the 1-2.5 μm spectral region or an InSb array for the 3-5 μm spectral region. A cryogenic folded optical system is utilized to control excess stray light, along with a cold eight-position filter wheel for spectral filtering. The camera head and electronics will also accept a 256 x 256 focal plane. Engineering evaluation of the complete system is complete, along with two engineering runs at the JPL Table Mountain Observatory. System design, engineering performance, and sample imagery are presented in this paper.
Ultra-fast framing camera tube
Kalibjian, Ralph
1981-01-01
An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.
Jin, Xin; Liu, Li; Chen, Yanqin; Dai, Qionghai
2017-05-01
This paper derives a mathematical point spread function (PSF) and a depth-invariant focal sweep point spread function (FSPSF) for plenoptic camera 2.0. The derivation of the PSF is based on the Fresnel diffraction equation and an image formation analysis of a self-built imaging system, which is divided into two sub-systems to reflect the relay imaging properties of plenoptic camera 2.0. The variations in the PSF caused by changes in object depth and sensor position are analyzed. A mathematical model of the FSPSF is further derived and verified to be depth-invariant. Experiments on real imaging systems demonstrate the consistency between the proposed PSF and the actual imaging results.
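For reference, the Fresnel diffraction equation that the PSF derivation builds on is, in its standard paraxial form (the paper's specific two-sub-system cascade for plenoptic camera 2.0 is not reproduced here),

$$ U(x,y;z) \;=\; \frac{e^{ikz}}{i\lambda z}\iint U_0(\xi,\eta)\, \exp\!\left\{\frac{ik}{2z}\Big[(x-\xi)^2+(y-\eta)^2\Big]\right\}\,\mathrm{d}\xi\,\mathrm{d}\eta, \qquad k=\frac{2\pi}{\lambda}. $$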
Photographic zoom fisheye lens design for DSLR cameras
NASA Astrophysics Data System (ADS)
Yan, Yufeng; Sasian, Jose
2017-09-01
Photographic fisheye lenses with fixed focal length for cameras with different sensor formats have been well developed for decades. However, photographic fisheye lenses with variable focal length are rare on the market due in part to the greater design difficulty. This paper presents a large aperture zoom fisheye lens for DSLR cameras that produces both circular and diagonal fisheye imaging for 35-mm sensors and diagonal fisheye imaging for APS-C sensors. The history and optical characteristics of fisheye lenses are briefly reviewed. Then, a 9.2- to 16.1-mm F/2.8 to F/3.5 zoom fisheye lens design is presented, including the design approach and aberration control. Image quality and tolerance performance analysis for this lens are also presented.
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Park, Jin S.; Sarusi, Gabby; Lin, True-Lon; Liu, John K.; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Hoelter, Ted
1997-01-01
In this paper, we discuss the development of very sensitive, very long wavelength infrared GaAs/Al(x)Ga(1-x)As quantum well infrared photodetectors (QWIPs) based on a bound-to-quasi-bound intersubband transition, the fabrication of random reflectors for efficient light coupling, and the demonstration of a 15 μm cutoff 128 x 128 focal plane array imaging camera. Excellent imagery, with a noise equivalent differential temperature (NEΔT) of 30 mK, has been achieved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, Andrew P.; Hale, Layton; Kim, Peter
Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
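A toy version of the stitching problem, sketched under strong simplifying assumptions (each differential scan is modelled as the true surface plus a single unknown piston; the paper's generalized method also handles tilts and full 2-D geometry): overlapping measurements are combined in one sparse least-squares solve, with one gauge constraint to fix the global constant. All point IDs, scan IDs, and heights below are invented.

```python
import numpy as np

# Each measurement: (point_id, scan_id, measured_height_um), modelled as
# z = h[point] + p[scan].  Point 2 is seen by both scans, tying them together.
measurements = [
    (0, 0, 0.0), (1, 0, 2.0), (2, 0, 1.0),      # scan 0
    (2, 1, 4.0), (3, 1, 6.0), (4, 1, 5.5),      # scan 1 (unknown piston offset)
]
n_pts, n_scans = 5, 2

A = np.zeros((len(measurements) + 1, n_pts + n_scans))
b = np.zeros(len(measurements) + 1)
for row, (i, j, z) in enumerate(measurements):
    A[row, i] = 1.0              # height of point i
    A[row, n_pts + j] = 1.0      # piston of scan j
    b[row] = z
A[-1, n_pts] = 1.0               # gauge: pin the first scan's piston to zero

solution, *_ = np.linalg.lstsq(A, b, rcond=None)
heights, pistons = solution[:n_pts], solution[n_pts:]
print("stitched heights:", heights)   # consistent surface, e.g. point 3 at 3.0 um
print("scan pistons:", pistons)       # scan 1 piston recovered as 3.0 um
```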
15 CFR 742.4 - National security.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
15 CFR 742.4 - National security.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
15 CFR 742.4 - National security.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
15 CFR 742.4 - National security.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or..., South Korea, Spain, Sweden, Switzerland, Turkey, and the United Kingdom for those cameras in ECCN 6A003...
Di Domenico, Giovanni; Cardarelli, Paolo; Contillo, Adriano; Taibi, Angelo; Gambaccini, Mauro
2016-01-01
The quality of a radiography system is affected by several factors, a major one being the focal spot size of the x-ray tube. In fact, the measurement of this size is recognized to be of primary importance during acceptance tests and image quality evaluations of clinical radiography systems. The most common device providing an image of the focal spot emission distribution is a pin-hole camera, which requires a high tube loading in order to produce a measurable signal. This work introduces an alternative technique to obtain an image of the focal spot through the processing of a single radiograph of a simple test object, acquired with a suitable magnification. The radiograph of a magnified sharp edge is a well-established method to evaluate the extension of the focal spot profile along the direction perpendicular to the edge. From a single radiograph of a circular x-ray absorber, it is possible to extract simultaneously the radial profiles of several sharp edges with different orientations. The authors propose a technique that allows one to obtain an image of the focal spot through the processing of these radial profiles by means of a pseudo-CT reconstruction technique. In order to validate this technique, the reconstruction has been applied to simulated radiographs of an ideal disk-shaped absorber, generated by various simulated focal spot distributions. Furthermore, the method has been applied to the focal spot of a commercially available mammography unit. In the case of simulated radiographs, the results of the reconstructions have been compared to the original distributions, showing excellent agreement as regards both the overall distribution and the full width at half maximum measurements. In the case of the experimental test, the method yielded images of the focal spot that have been compared with the results obtained through standard techniques, namely, the pin-hole camera and the slit camera. The method was proven to be effective for simulated images, and the results of the experimental test suggest that it could be considered as an alternative technique for focal spot distribution evaluation. The method offers the possibility to measure the actual focal spot size and emission distribution at the same exposure conditions as clinical routine, avoiding the high tube loading required by the pin-hole imaging technique.
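The pseudo-CT idea can be illustrated with a synthetic example: the derivative of the magnified edge profile at each angle is a line-spread function, i.e. a parallel projection of the focal-spot distribution, so filtered back-projection recovers the spot. The sketch below assumes scikit-image is available and generates the projections directly with a Radon transform instead of differentiating real edge profiles; it is not the authors' reconstruction code.

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic "focal spot" well inside the inscribed circle of the image.
spot = np.zeros((64, 64))
spot[28:36, 30:38] = 1.0

# Angles at which edge profiles (and hence projections) would be extracted
# around the circular absorber.
angles = np.linspace(0.0, 180.0, 60, endpoint=False)

# In the real measurement the sinogram rows come from differentiating the
# magnified edge profiles; here we create them directly for demonstration.
sinogram = radon(spot, theta=angles)

# Filtered back-projection (the "pseudo-CT" step).
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
print(reconstruction.shape, reconstruction.max())
```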
Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery
NASA Astrophysics Data System (ADS)
Kwoh, L. K.; Huang, X.; Tan, W. J.
2012-07-01
XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52~0.60 μm for green, 0.63~0.69 μm for red and 0.76~0.89 μm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, red and green). These CCDs were physically separated in the focal plane and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd to 4th order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs red and green vs red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixels for the NIR vs red and green vs red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
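As an illustration of the polynomial attitude compensation, the sketch below fits a 3rd-order polynomial in line time to a synthetic roll residual; the real camera model solves the angles jointly through the full imaging geometry rather than independently as done here, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)                  # normalised line time over the scene
# Synthetic per-line roll residual (radians) from band-to-band registration.
roll_residual = 1e-4 * (0.3 - 1.2 * t + 0.8 * t**2) + 2e-6 * rng.normal(size=t.size)

# Fit roll(t) as a 3rd-order polynomial, as in the XSAT camera model.
coeffs = np.polyfit(t, roll_residual, deg=3)
model = np.polyval(coeffs, t)
print("rms residual after fit:", np.std(roll_residual - model))
```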
A math model for high velocity sensoring with a focal plane shuttered camera.
NASA Technical Reports Server (NTRS)
Morgan, P.
1971-01-01
A new mathematical model is presented which describes the image produced by a focal plane shutter-equipped camera. The model is based upon the well-known collinearity condition equations and incorporates both the translational and rotational motion of the camera during the exposure interval. The first differentials of the model with respect to exposure interval, delta t, yield the general matrix expressions for image velocities which may be simplified to known cases. The exposure interval, delta t, may be replaced under certain circumstances with a function incorporating blind velocity and image position if desired. The model is tested using simulated Lunar Orbiter data and found to be computationally stable as well as providing excellent results, provided that some external information is available on the velocity parameters.
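For reference, the standard collinearity condition equations that the model extends are, in generic photogrammetric notation (not copied from the paper: f is the focal length, (x_0, y_0) the principal point, (X_c, Y_c, Z_c) the camera position, and m_ij the rotation matrix elements; in the focal-plane-shutter model the exterior orientation becomes a function of the exposure time of each image strip):

$$ x - x_0 = -f\,\frac{m_{11}(X-X_c)+m_{12}(Y-Y_c)+m_{13}(Z-Z_c)}{m_{31}(X-X_c)+m_{32}(Y-Y_c)+m_{33}(Z-Z_c)}, \qquad y - y_0 = -f\,\frac{m_{21}(X-X_c)+m_{22}(Y-Y_c)+m_{23}(Z-Z_c)}{m_{31}(X-X_c)+m_{32}(Y-Y_c)+m_{33}(Z-Z_c)}. $$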
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flaugher, B.; Diehl, H. T.; Alvarez, O.
2015-11-15
The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
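As a quick consistency check derived from the quoted numbers (not stated in the abstract), the 15 μm pixels and 0.263″ per pixel plate scale imply an effective focal length of

$$ f_{\rm eff} \approx \frac{206265 \times 15\ \mu\mathrm{m}}{0.263} \approx 11.8\ \mathrm{m}, $$

i.e. roughly f/2.9 for the 4 m aperture.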
Characterization and performance of PAUCam filters
NASA Astrophysics Data System (ADS)
Casas, R.; Cardiel-Sas, L.; Castander, F. J.; Díaz, C.; Gaweda, J.; Jiménez Rojas, J.; Jiménez, S.; Lamensans, M.; Padilla, C.; Rodriguez, F. J.; Sanchez, E.; Sevilla Noarbe, I.
2016-08-01
PAUCam is a large field of view camera designed to exploit the field delivered by the prime focus corrector of the William Herschel Telescope, at the Observatorio del Roque de los Muchachos. One of the new features of this camera is its filter system, placed within a few millimeters of the focal plane using eleven trays containing 40 narrow-band and 6 broad-band filters, working in vacuum at an operational temperature of 250 K and in a converging beam. In this contribution, we describe the performance of these filters in the characterization tests at the laboratory.
The Dark Energy Survey and Operations: Years 1 to 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diehl, H. T.
2016-01-01
The Dark Energy Survey (DES) is an operating optical survey aimed at understanding the accelerating expansion of the universe using four complementary methods: weak gravitational lensing, galaxy cluster counts, baryon acoustic oscillations, and Type Ia supernovae. To perform the 5000 sq-degree wide field and 30 sq-degree supernova surveys, the DES Collaboration built the Dark Energy Camera (DECam), a 3 square-degree, 570-Megapixel CCD camera that was installed at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory (CTIO). DES has completed its third observing season out of a nominal five. This paper describes DES “Year 1” (Y1) to “Year 3” (Y3), the strategy, an outline of the survey operations procedures, the efficiency of operations and the causes of lost observing time. It provides details about the quality of the first three seasons' data, and describes how we are adjusting the survey strategy in the face of the El Niño Southern Oscillation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Domenico, Giovanni, E-mail: didomenico@fe.infn.it; Cardarelli, Paolo; Taibi, Angelo
Purpose: The quality of a radiography system is affected by several factors, a major one being the focal spot size of the x-ray tube. In fact, the measurement of such size is recognized to be of primary importance during acceptance tests and image quality evaluations of clinical radiography systems. The most common device providing an image of the focal spot emission distribution is a pin-hole camera, which requires a high tube loading in order to produce a measurable signal. This work introduces an alternative technique to obtain an image of the focal spot, through the processing of a single radiograph of a simple test object, acquired with a suitable magnification. Methods: The radiograph of a magnified sharp edge is a well-established method to evaluate the extension of the focal spot profile along the direction perpendicular to the edge. From a single radiograph of a circular x-ray absorber, it is possible to extract simultaneously the radial profiles of several sharp edges with different orientations. The authors propose a technique that allows an image of the focal spot to be obtained through the processing of these radial profiles by means of a pseudo-CT reconstruction technique. In order to validate this technique, the reconstruction has been applied to the simulated radiographs of an ideal disk-shaped absorber, generated by various simulated focal spot distributions. Furthermore, the method has been applied to the focal spot of a commercially available mammography unit. Results: In the case of simulated radiographs, the results of the reconstructions have been compared to the original distributions, showing excellent agreement with regard to both the overall distribution and the full width at half maximum measurements. In the case of the experimental test, the method allowed images of the focal spot to be obtained and compared with the results obtained through standard techniques, namely, pin-hole camera and slit camera. Conclusions: The method was proven to be effective for simulated images and the results of the experimental test suggest that it could be considered as an alternative technique for focal spot distribution evaluation. The method offers the possibility to measure the actual focal spot size and emission distribution at the same exposure conditions as clinical routine, avoiding high tube loading as in the case of the pin-hole imaging technique.
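The reconstruction step can be sketched as follows. This is a minimal illustration of the general idea only, not the authors' implementation; the helper name, the use of scikit-image's iradon, and the input format of the edge profiles are all assumptions.

import numpy as np
from skimage.transform import iradon

def focal_spot_from_edges(edge_profiles, angles_deg):
    # Each edge-spread function, differentiated, gives a line-spread function,
    # i.e. one 1D projection of the focal spot at the corresponding edge angle.
    lsfs = [np.gradient(np.asarray(p, dtype=float)) for p in edge_profiles]
    sinogram = np.stack(lsfs, axis=1)                    # shape (n_samples, n_angles)
    sinogram /= sinogram.sum(axis=0, keepdims=True)      # normalize each projection
    # Filtered back-projection ("pseudo-CT") recovers the 2D focal-spot image.
    return iradon(sinogram, theta=angles_deg, filter_name="ramp", circle=False)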
Focal plane wavefront sensor achromatization: The multireference self-coherent camera
NASA Astrophysics Data System (ADS)
Delorme, J. R.; Galicher, R.; Baudoz, P.; Rousset, G.; Mazoyer, J.; Dupuis, O.
2016-04-01
Context. High contrast imaging and spectroscopy provide unique constraints for exoplanet formation models as well as for planetary atmosphere models. But this can be challenging because of the small planet-to-star angular separation (<1 arcsec) and high flux ratio (>10⁵). Recently, optimized instruments like VLT/SPHERE and Gemini/GPI were installed on 8m-class telescopes. These will probe young gaseous exoplanets at large separations (≳1 au) but, because of uncalibrated phase and amplitude aberrations that induce speckles in the coronagraphic images, they are not able to detect older and fainter planets. Aims: There are always aberrations that are slowly evolving in time. They create quasi-static speckles that cannot be calibrated a posteriori with sufficient accuracy. An active correction of these speckles is thus needed to reach very high contrast levels (>10⁶-10⁷). This requires a focal plane wavefront sensor. Our team proposed the self-coherent camera, the performance of which was demonstrated in the laboratory. Like all focal plane wavefront sensors, it is sensitive to chromatism, and we propose an upgrade that mitigates the chromatism effects. Methods: First, we recall the principle of the self-coherent camera and we explain its limitations in polychromatic light. Then, we present and numerically study two upgrades to mitigate chromatism effects: the optical path difference method and the multireference self-coherent camera. Finally, we present laboratory tests of the latter solution. Results: We demonstrate in the laboratory that the multireference self-coherent camera can be used as a focal plane wavefront sensor in polychromatic light using an 80 nm bandwidth at 640 nm (bandwidth of 12.5%). We reach a performance that is close to the chromatic limitations of our bench: 1σ contrast of 4.5 × 10⁻⁸ between 5 and 17 λ₀/D. Conclusions: The performance of the MRSCC is promising for future high-contrast imaging instruments that aim to actively minimize the speckle intensity so as to detect and spectrally characterize faint old or light gaseous planets.
Depth Perception In Remote Stereoscopic Viewing Systems
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Von Sydow, Marika
1989-01-01
Report describes theoretical and experimental studies of perception of depth by human operators through stereoscopic video systems. Purpose of such studies is to optimize dual-camera configurations used to view workspaces of remote manipulators at distances of 1 to 3 m from cameras. According to the analysis, static stereoscopic depth distortion is decreased, without decreasing stereoscopic depth resolution, by increasing camera-to-object and intercamera distances and camera focal length. Analysis further predicts that dynamic stereoscopic depth distortion is reduced by rotating cameras around center of circle passing through point of convergence of viewing axes and first nodal points of two camera lenses.
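As a rough quantitative companion to this kind of optimization, the standard parallel-axis stereo relations below show how baseline and focal length enter the depth error budget; this is a generic sketch, not the report's camera model, and the example numbers are hypothetical.

def depth_from_disparity(baseline_m, focal_px, disparity_px):
    # Parallel-axis stereo: Z = B * f / d
    return baseline_m * focal_px / disparity_px

def depth_error(baseline_m, focal_px, depth_m, disparity_err_px):
    # dZ ~= Z^2 * d(disparity) / (f * B): larger baseline and focal length help.
    return depth_m**2 * disparity_err_px / (focal_px * baseline_m)

# Hypothetical example: 2 m working distance, 20 cm baseline, 2000-pixel focal length.
print(depth_error(0.20, 2000.0, 2.0, 0.5))   # ~0.005 m per half-pixel disparity error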
Advanced imaging research and development at DARPA
NASA Astrophysics Data System (ADS)
Dhar, Nibir K.; Dat, Ravi
2012-06-01
Advances in imaging technology have a huge impact on our daily lives. Innovations in optics, focal plane arrays (FPA), microelectronics and computation have revolutionized camera design. As a result, new approaches to camera design and low-cost manufacturing are now possible. These advances are clearly evident in the visible wavelength band due to pixel scaling and improvements in silicon material and CMOS technology. CMOS cameras are available in cell phones and many other consumer products. Advances in infrared imaging technology have been slow due to market volume and many technological barriers in detector materials, optics and fundamental limits imposed by the scaling laws of optics. There is of course much room for improvement in both visible and infrared imaging technology. This paper highlights various technology development projects at DARPA to advance imaging technology for both visible and infrared. Challenges and potential solutions are highlighted in areas related to wide field-of-view camera design, small-pitch pixels, and broadband and multiband detectors and focal plane arrays.
Shared Focal Plane Investigation for Serial Frame Cameras.
1980-03-01
...capability will be restored. [OCR fragment of Table 1-1, "System Leading Particulars": lens focal length (inches), range (ft), contrast, coverage.] It can be expected that signature bands will be apparent in the imagery. Such bands are at best distracting and at worst hindrances to image interpretation.
NASA Astrophysics Data System (ADS)
Malin, Michal C.; Ravine, Michael A.; Caplinger, Michael A.; Tony Ghaemi, F.; Schaffner, Jacob A.; Maki, Justin N.; Bell, James F.; Cameron, James F.; Dietrich, William E.; Edgett, Kenneth S.; Edwards, Laurence J.; Garvin, James B.; Hallet, Bernard; Herkenhoff, Kenneth E.; Heydari, Ezat; Kah, Linda C.; Lemmon, Mark T.; Minitti, Michelle E.; Olson, Timothy S.; Parker, Timothy J.; Rowland, Scott K.; Schieber, Juergen; Sletten, Ron; Sullivan, Robert J.; Sumner, Dawn Y.; Aileen Yingst, R.; Duston, Brian M.; McNair, Sean; Jensen, Elsa H.
2017-08-01
The Mars Science Laboratory Mast camera and Descent Imager investigations were designed, built, and operated by Malin Space Science Systems of San Diego, CA. They share common electronics and focal plane designs but have different optics. There are two Mastcams of dissimilar focal length. The Mastcam-34 has an f/8, 34 mm focal length lens, and the M-100 an f/10, 100 mm focal length lens. The M-34 field of view is about 20° × 15° with an instantaneous field of view (IFOV) of 218 μrad; the M-100 field of view (FOV) is 6.8° × 5.1° with an IFOV of 74 μrad. The M-34 can focus from 0.5 m to infinity, and the M-100 from 1.6 m to infinity. All three cameras can acquire color images through a Bayer color filter array, and the Mastcams can also acquire images through seven science filters. Images are ≤1600 pixels wide by 1200 pixels tall. The Mastcams, mounted on the 2 m tall Remote Sensing Mast, have a 360° azimuth and 180° elevation field of regard. Mars Descent Imager is fixed-mounted to the bottom left front side of the rover at 66 cm above the surface. Its fixed focus lens is in focus from 2 m to infinity, but out of focus at 66 cm. The f/3 lens has a FOV of 70° by 52° across and along the direction of motion, with an IFOV of 0.76 mrad. All cameras can acquire video at 4 frames/second for full frames or 720p HD at 6 fps. Images can be processed using lossy Joint Photographic Experts Group and predictive lossless compression.
NASA Astrophysics Data System (ADS)
de Villiers, Jason; Jermy, Robert; Nicolls, Fred
2014-06-01
This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb line method, allows for many radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low-order models, despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted-to-undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion), allowing deterministic rates far exceeding real time. The focal length is determined to minimise the error in absolute photogrammetric positional measurement for both multi-camera and monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/LWIR camera array, and a simple laboratory optical helmet tracker.
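For orientation, a minimal sketch of the forward (undistorted-to-distorted) Brown model with the coefficient counts quoted above is given below; the variable names, normalization, and the optional higher-order tangential term are assumptions, not the paper's code.

import numpy as np

def brown_distort(x, y, k, p, cx=0.0, cy=0.0):
    """Apply Brown radial (k[0..4]) and tangential (p[0..2]) distortion.
    (x, y) are normalized, principal-point-centred coordinates; illustrative only."""
    xc, yc = x - cx, y - cy
    r2 = xc**2 + yc**2
    radial = 1.0 + sum(ki * r2**(i + 1) for i, ki in enumerate(k))  # 1 + k1 r^2 + k2 r^4 + ...
    x_t = 2*p[0]*xc*yc + p[1]*(r2 + 2*xc**2)
    y_t = p[0]*(r2 + 2*yc**2) + 2*p[1]*xc*yc
    if len(p) > 2:                       # optional higher-order tangential scaling term
        x_t *= (1.0 + p[2]*r2)
        y_t *= (1.0 + p[2]*r2)
    return cx + xc*radial + x_t, cy + yc*radial + y_t

Because this forward mapping is not trivially invertible, the distorted-to-undistorted direction is usually fitted or iterated separately, which is consistent with the paper reporting coefficients for both directions.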
Exploring the imaging properties of thin lenses for cryogenic infrared cameras
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura
2016-05-01
Designing a cryogenic camera is a good strategy to miniaturize and simplify an infrared camera using a cooled detector. Indeed, the integration of optics inside the cold shield makes it simple to athermalize the design, guarantees a cold pupil and relaxes the constraint of a long back focal length for short-focal-length systems. In this way, cameras made of a single lens or two lenses are viable systems with good optical features and good stability in image correction. However, it involves a relatively significant additional optical mass inside the dewar and thus increases the cool-down time of the camera. ONERA is currently exploring a minimalist strategy consisting of giving an imaging function to the thin optical plates that are found in conventional dewars. In this way, we could make a cryogenic camera that has the same cool-down time as a traditional dewar without an imaging function. Two examples will be presented: the first one is a camera using a dual-band infrared detector made of a lens outside the dewar and a lens inside the cold shield, the latter having the main optical power of the system. We were able to design a cold plano-convex lens with a thickness of less than 1 mm. The second example is an evolution of a former cryogenic camera called SOIE. We replaced the cold meniscus with a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performances of both cameras will be compared.
Optimization of femtosecond white-light continuum emission between 600 nm and 800 nm
NASA Astrophysics Data System (ADS)
Ramstein, S.; Mottin, S.
2005-06-01
A spectroscopy setup with photon time-of-flight resolution in diffuse media has been developed. It relies on the use of a white-light continuum generated by focusing an amplified laser (830 nm, 1 kHz, 0.5 W, 170 fs) into demineralized water. To optimize the white-light source both spectrally and in power over the 600-800 nm spectral window, a study of the spatio-temporal shaping of the laser pulse before self-focusing by the medium was carried out. This shaping is performed spatially by changing the focal length of the focusing lens and temporally by changing the compression ratio of the pulse. The study shows that the emitted light cone carries more power in the spectral window of interest for long focal lengths. Over the 600-800 nm window, the integrated energy yield varies from 5%, with a focal length f = 6 cm, to 15%, with a focal length f = 30 cm. Temporal shaping shows similar effects of the same order of magnitude.
First SN Discoveries from the Dark Energy Survey
NASA Astrophysics Data System (ADS)
Abbott, T.; Abdalla, F.; Achitouv, I.; Ahn, E.; Aldering, G.; Allam, S.; Alonso, D.; Amara, A.; Annis, J.; Antonik, M.; Aragon-Salamanca, A.; Armstrong, R.; Ashall, C.; Asorey, J.; Bacon, D.; Balbinot, E.; Banerji, M.; Barbary, K.; Barkhouse, W.; Baruah, L.; Bauer, A.; Bechtol, K.; Becker, M.; Bender, R.; Benoist, C.; Benoit-Levy, A.; Bernardi, M.; Bernstein, G.; Bernstein, J. P.; Bernstein, R.; Bertin, E.; Beynon, E.; Bhattacharya, S.; Biesiadzinski, T.; Biswas, R.; Blake, C.; Bloom, J. S.; Bocquet, S.; Brandt, C.; Bridle, S.; Brooks, D.; Brown, P. J.; Brunner, R.; Buckley-Geer, E.; Burke, D.; Burkert, A.; Busha, M.; Campa, J.; Campbell, H.; Cane, R.; Capozzi, D.; Carlstrom, J.; Carnero Rosell, A.; Carollo, M.; Carrasco-Kind, M.; Carretero, J.; Carter, M.; Casas, R.; Castander, F. J.; Chen, Y.; Chiu, I.; Chue, C.; Clampitt, J.; Clerkin, L.; Cohn, J.; Colless, M.; Copeland, E.; Covarrubias, R. A.; Crittenden, R.; Crocce, M.; Cunha, C.; da Costa, L.; d'Andrea, C.; Das, S.; Das, R.; Davis, T. M.; Deb, S.; DePoy, D.; Derylo, G.; Desai, S.; de Simoni, F.; Devlin, M.; Diehl, H. T.; Dietrich, J.; Dodelson, S.; Doel, P.; Dolag, K.; Efstathiou, G.; Eifler, T.; Erickson, B.; Eriksen, M.; Estrada, J.; Etherington, J.; Evrard, A.; Farrens, S.; Fausti Neto, A.; Fernandez, E.; Ferreira, P. C.; Finley, D.; Fischer, J. A.; Flaugher, B.; Fosalba, P.; Frieman, J.; Furlanetto, C.; Garcia-Bellido, J.; Gaztanaga, E.; Gelman, M.; Gerdes, D.; Giannantonio, T.; Gilhool, S.; Gill, M.; Gladders, M.; Gladney, L.; Glazebrook, K.; Gray, M.; Gruen, D.; Gruendl, R.; Gupta, R.; Gutierrez, G.; Habib, S.; Hall, E.; Hansen, S.; Hao, J.; Heitmann, K.; Helsby, J.; Henderson, R.; Hennig, C.; High, W.; Hirsch, M.; Hoffmann, K.; Holhjem, K.; Honscheid, K.; Host, O.; Hoyle, B.; Hu, W.; Huff, E.; Huterer, D.; Jain, B.; James, D.; Jarvis, M.; Jarvis, M. J.; Jeltema, T.; Johnson, M.; Jouvel, S.; Kacprzak, T.; Karliner, I.; Katsaros, J.; Kent, S.; Kessler, R.; Kim, A.; Kim-Vy, T.; King, L.; Kirk, D.; Kochanek, C.; Kopp, M.; Koppenhoefer, J.; Kovacs, E.; Krause, E.; Kravtsov, A.; Kron, R.; Kuehn, K.; Kuemmel, M.; Kuhlmann, S.; Kunder, A.; Kuropatkin, N.; Kwan, J.; Lahav, O.; Leistedt, B.; Levi, M.; Lewis, P.; Liddle, A.; Lidman, C.; Lilly, S.; Lin, H.; Liu, J.; Lopez-Arenillas, C.; Lorenzon, W.; LoVerde, M.; Ma, Z.; Maartens, R.; Maccrann, N.; Macri, L.; Maia, M.; Makler, M.; Manera, M.; Maraston, C.; March, M.; Markovic, K.; Marriner, J.; Marshall, J.; Marshall, S.; Martini, P.; Marti Sanahuja, P.; Mayers, J.; McKay, T.; McMahon, R.; Melchior, P.; Merritt, K. W.; Merson, A.; Miller, C.; Miquel, R.; Mohr, J.; Moore, T.; Mortonson, M.; Mosher, J.; Mould, J.; Mukherjee, P.; Neilsen, E.; Ngeow, C.; Nichol, R.; Nidever, D.; Nord, B.; Nugent, P.; Ogando, R.; Old, L.; Olsen, J.; Ostrovski, F.; Paech, K.; Papadopoulos, A.; Papovich, C.; Patton, K.; Peacock, J.; Pellegrini, P. S. S.; Peoples, J.; Percival, W.; Perlmutter, S.; Petravick, D.; Plazas, A.; Ponce, R.; Poole, G.; Pope, A.; Refregier, A.; Reyes, R.; Ricker, P.; Roe, N.; Romer, K.; Roodman, A.; Rooney, P.; Ross, A.; Rowe, B.; Rozo, E.; Rykoff, E.; Sabiu, C.; Saglia, R.; Sako, M.; Sanchez, A.; Sanchez, C.; Sanchez, E.; Sanchez, J.; Santiago, B.; Saro, A.; Scarpine, V.; Schindler, R.; Schmidt, B. P.; Schmitt, R. L.; Schubnell, M.; Seitz, S.; Senger, R.; Sevilla, I.; Sharp, R.; Sheldon, E.; Sheth, R.; Smith, R. 
C.; Smith, M.; Snigula, J.; Soares-Santos, M.; Sobreira, F.; Song, J.; Soumagnac, M.; Spinka, H.; Stebbins, A.; Stoughton, C.; Suchyta, E.; Suhada, R.; Sullivan, M.; Sun, F.; Suntzeff, N.; Sutherland, W.; Swanson, M. E. C.; Sypniewski, A. J.; Szepietowski, R.; Talaga, R.; Tarle, G.; Tarrant, E.; Balan, S. Thaithara; Thaler, J.; Thomas, D.; Thomas, R. C.; Tucker, D.; Uddin, S. A.; Ural, S.; Vikram, V.; Voigt, L.; Walker, A. R.; Walker, T.; Wechsler, R.; Weinberg, D.; Weller, J.; Wester, W.; Wetzstein, M.; White, M.; Wilcox, H.; Wilman, D.; Yanny, B.; Young, J.; Zablocki, A.; Zenteno, A.; Zhang, Y.; Zuntz, J.
2012-12-01
The Dark Energy Survey (DES) reports the discovery of the first set of supernovae (SN) from the project. Images were observed as part of the DES Science Verification phase using the newly installed 570-Megapixel Dark Energy Camera on the CTIO Blanco 4-m telescope by observers J. Annis, E. Buckley-Geer, and H. Lin. SN observations are planned throughout the observing campaign on a regular cadence of 4-6 days in each of the ten 3-deg² fields in the DES griz filters.
Temporal Coding of Volumetric Imagery
NASA Astrophysics Data System (ADS)
Llull, Patrick Ryan
'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration of other information within that video; namely, focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions. Geometrical optics-related tradeoffs, such as the classic challenges of wide-field-of-view and high-resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke. Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position. We demonstrate this technique's capability to generate arbitrary point spread functions.
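The coded-exposure forward model at the heart of CACTI-style compressive temporal imaging can be sketched in a few lines; this is an illustrative model under the usual shifting-mask assumption, not code from the dissertation.

import numpy as np

def cacti_measure(video, mask):
    """Coded-exposure snapshot: one 2D measurement from a (T, H, W) video volume.
    video[t] is the scene at subframe t; mask[t] is the (shifted) coded aperture
    pattern applied during subframe t. Illustrative model only."""
    return np.sum(mask * video, axis=0)

# Example: 8 subframes of a 64x64 scene collapsed into a single coded snapshot.
rng = np.random.default_rng(0)
video = rng.random((8, 64, 64))
mask = (rng.random((8, 64, 64)) > 0.5).astype(float)   # per-subframe binary code
snapshot = cacti_measure(video, mask)                   # shape (64, 64)

Recovering the T subframes from the single snapshot is then posed as a regularized inverse problem (e.g. with sparsity priors), which is where the compressive-sensing machinery enters.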
Concept of electro-optical sensor module for sniper detection system
NASA Astrophysics Data System (ADS)
Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz
2010-10-01
The paper presents an initial concept of the electro-optical sensor unit for sniper detection purposes. This unit, comprising thermal and daylight cameras, can operate as a standalone device but its primary application is a multi-sensor sniper and shot detection system. Being a part of a larger system, it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data and sensor fusion techniques. Additionally, it is expected to provide some pre-shot detection capabilities. Generally, acoustic (or radar) systems used for shot detection offer only "after-the-shot" information and they cannot prevent an enemy attack, which in the case of a skilled sniper opponent usually means trouble. The passive imaging sensors presented in this paper, together with active systems detecting pointed optics, are capable of detecting specific shooter signatures or at least the presence of suspected objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters such as focal plane array size and type, focal length and aperture were chosen on the basis of assumed tactical characteristics of the system (mainly detection range) and the current technology level. In order to provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image acquisition rate. The daylight camera operates as a support, providing a corresponding visual image that is easier for a human operator to comprehend. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot-recording sequences are presented.
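The link between focal plane array format, focal length and detection range mentioned above is, to first order, a pixels-on-target calculation; the sketch below is a generic estimate with hypothetical numbers, not the authors' design values.

def pixels_on_target(target_size_m, range_m, focal_length_m, pixel_pitch_m):
    """Approximate number of pixels subtended by a target of a given size."""
    return target_size_m / range_m * focal_length_m / pixel_pitch_m

# Example (all values hypothetical): 0.5 m target at 1 km, 100 mm lens, 17 um pitch.
print(pixels_on_target(0.5, 1000.0, 0.100, 17e-6))   # ~2.9 pixels across the target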
Remote Sensing Simulation Activities for Earthlings
ERIC Educational Resources Information Center
Krockover, Gerald H.; Odden, Thomas D.
1977-01-01
Suggested are activities using a Polaroid camera to illustrate the capabilities of remote sensing. Reading materials from the National Aeronautics and Space Administration (NASA) are suggested. Methods for (1) finding a camera's focal length, (2) calculating ground dimension photograph simulation, and (3) limiting size using film resolution are…
Application of imaging to the atmospheric Cherenkov technique
NASA Technical Reports Server (NTRS)
Cawley, M. F.; Fegan, D. J.; Gibbs, K.; Gorham, P. W.; Hillas, A. M.; Lamb, R. C.; Liebing, D. F.; Mackeown, P. K.; Porter, N. A.; Stenger, V. J.
1985-01-01
Turver and Weekes proposed using a system of phototubes in the focal plane of a large reflector to give an air Cherenkov camera for gamma ray astronomy. Preliminary results with a 19-element camera have been reported previously. In 1983 the camera was increased to 37 pixels; it has now been routinely operated for two years. A brief physical description of the camera, its mode of operation, and the data reduction procedures are presented. The Monte Carlo simulations on which these are based are also reviewed.
Lytro camera technology: theory, algorithms, performance analysis
NASA Astrophysics Data System (ADS)
Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio
2013-03-01
The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization aided by the increase in computational power characterizing mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, considering the Lytro camera as a black box, and uses our interpretation of the Lytro image data saved by the camera. We present our findings based on our interpretation of the Lytro camera file structure, image calibration and image rendering; in this context, artifacts and final image resolution are discussed.
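As context for the rendering step, plenoptic processing commonly relies on a shift-and-add refocus over sub-aperture views; the sketch below illustrates that generic operation on already-extracted views and is not Lytro's actual pipeline (the data structure and the alpha parameterization are assumptions).

import numpy as np

def refocus(subaperture_images, alpha):
    """Shift-and-add refocusing of a plenoptic light field.
    subaperture_images: dict mapping angular coordinates (u, v), in pixel-shift
    units, to 2D views. alpha selects the synthetic focal plane. Illustrative only."""
    acc = None
    for (u, v), view in subaperture_images.items():
        shifted = np.roll(view, (int(round(alpha * u)), int(round(alpha * v))), axis=(0, 1))
        acc = shifted if acc is None else acc + shifted
    return acc / len(subaperture_images)

Varying alpha moves the synthetic focal plane; the trade between spatial and angular sampling in this scheme is what drives the final image resolution discussed in the paper.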
Cat-eye effect reflected beam profiles of an optical system with sensor array.
Gong, Mali; He, Sifeng; Guo, Rui; Wang, Wei
2016-06-01
In this paper, we propose an applicable propagation model for Gaussian beams passing through any cat-eye target, instead of the traditional simplification consisting of only a mirror placed at the focal plane of a lens. According to the model, the cat-eye effect of CCD cameras affected by defocus is numerically simulated. Excellent agreement of experimental results with theoretical analysis is obtained. It is found that the reflectivity distribution at the focal plane of the cat-eye optical lens has a great influence on the results, while the cat-eye effect reflected beam profiles of CCD cameras show obvious periodicity.
Plenoptic background oriented schlieren imaging
NASA Astrophysics Data System (ADS)
Klemkowsky, Jenna N.; Fahringer, Timothy W.; Clifford, Christopher J.; Bathel, Brett F.; Thurow, Brian S.
2017-09-01
The combination of the background oriented schlieren (BOS) technique with the unique imaging capabilities of a plenoptic camera, termed plenoptic BOS, is introduced as a new addition to the family of schlieren techniques. Compared to conventional single camera BOS, plenoptic BOS is capable of sampling multiple lines-of-sight simultaneously. Displacements from each line-of-sight are collectively used to build a four-dimensional displacement field, which is a vector function structured similarly to the original light field captured in a raw plenoptic image. The displacement field is used to render focused BOS images, which qualitatively are narrow depth of field slices of the density gradient field. Unlike focused schlieren methods that require manually changing the focal plane during data collection, plenoptic BOS synthetically changes the focal plane position during post-processing, such that all focal planes are captured in a single snapshot. Through two different experiments, this work demonstrates that plenoptic BOS is capable of isolating narrow depth of field features, qualitatively inferring depth, and quantitatively estimating the location of disturbances in 3D space. Such results motivate future work to transition this single-camera technique towards quantitative reconstructions of 3D density fields.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rilling, M; Centre de Recherche sur le Cancer, Hôtel-Dieu de Québec, Quebec City, QC; Département de radio-oncologie, CHU de Québec, Quebec City, QC
2015-06-15
Purpose: The purpose of this work is to simulate a multi-focus plenoptic camera used as the measuring device in a real-time three-dimensional scintillation dosimeter. Simulating and optimizing this realistic optical system will bridge the technological gap between concept validation and a clinically viable tool that can provide highly efficient, accurate and precise measurements for dynamic radiotherapy techniques. Methods: The experimental prototype, previously developed for proof of concept purposes, uses an off-the-shelf multi-focus plenoptic camera. With an array of interleaved microlenses of different focal lengths, this camera records spatial and angular information of light emitted by a plastic scintillator volume. The three distinct microlens focal lengths were determined experimentally for use as baseline parameters by measuring image-to-object magnification for different distances in object space. A simulated plenoptic system was implemented using the non-sequential ray tracing software Zemax: this tool allows complete simulation of multiple optical paths by modeling interactions at interfaces such as scatter, diffraction, reflection and refraction. The active sensor was modeled based on the camera manufacturer specifications by a 2048×2048, 5 µm-pixel pitch sensor. Planar light sources, simulating the plastic scintillator volume, were employed for ray tracing simulations. Results: The microlens focal lengths were determined to be 384, 327 and 290 µm. A realistic multi-focus plenoptic system, with independently defined and optimizable specifications, was fully simulated. An f/2.9, 54 mm-focal length Double Gauss objective was modeled as the system’s main lens. A three-focal length hexagonal microlens array of 250-µm thickness was designed, acting as an image-relay system between the main lens and sensor. Conclusion: Simulation of a fully modeled multi-focus plenoptic camera enables the decoupled optimization of the main lens and microlens specifications. This work leads the way to improving the 3D dosimeter’s achievable resolution, efficiency and build for providing a quality assurance tool fully meeting clinical needs. M.R. is financially supported by a Master’s Canada Graduate Scholarship from the NSERC. This research is also supported by the NSERC Industrial Research Chair in Optical Design.
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2010 CFR
2010-10-01
... place 3 autonomous digital video cameras overlooking chosen haul-out sites located varying distances from the missile launch site. Each video camera will be set to record a focal subgroup within the... presence and activity will be conducted and recorded in a field logbook or recorded on digital video for...
Adaptation of the Camera Link Interface for Flight-Instrument Applications
NASA Technical Reports Server (NTRS)
Randall, David P.; Mahoney, John C.
2010-01-01
COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost, but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.
Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera
NASA Technical Reports Server (NTRS)
Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.
1988-01-01
The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 Å wavelength range.
The NACA High-Speed Motion-Picture Camera Optical Compensation at 40,000 Photographs Per Second
NASA Technical Reports Server (NTRS)
Miller, Cearcy D
1946-01-01
The principle of operation of the NACA high-speed camera is completely explained. This camera, operating at the rate of 40,000 photographs per second, took the photographs presented in numerous NACA reports concerning combustion, preignition, and knock in the spark-ignition engine. Many design details are presented and discussed, details of an entirely conventional nature are omitted. The inherent aberrations of the camera are discussed and partly evaluated. The focal-plane-shutter effect of the camera is explained. Photographs of the camera are presented. Some high-speed motion pictures of familiar objects -- photoflash bulb, firecrackers, camera shutter -- are reproduced as an illustration of the quality of the photographs taken by the camera.
A novel SPECT camera for molecular imaging of the prostate
NASA Astrophysics Data System (ADS)
Cebula, Alan; Gilland, David; Su, Li-Ming; Wagenaar, Douglas; Bahadori, Amir
2011-10-01
The objective of this work is to develop an improved SPECT camera for dedicated prostate imaging. Complementing the recent advancements in agents for molecular prostate imaging, this device has the potential to assist in distinguishing benign from aggressive cancers, to improve site-specific localization of cancer, to improve the accuracy of needle-guided prostate biopsy of cancer sites, and to aid in focal therapy procedures such as cryotherapy and radiation. Theoretical calculations show that the spatial resolution/detection sensitivity of the proposed SPECT camera can rival or exceed that of 3D PET, and a further signal-to-noise advantage is attained with the better energy resolution of the CZT modules. Based on photon transport simulation studies, the system has a reconstructed spatial resolution of 4.8 mm with a sensitivity of 0.0001. Reconstruction of a simulated prostate distribution demonstrates the focal imaging capability of the system.
Goyal, Anish; Myers, Travis; Wang, Christine A; Kelly, Michael; Tyrrell, Brian; Gokden, B; Sanchez, Antonio; Turner, George; Capasso, Federico
2014-06-16
We demonstrate active hyperspectral imaging using a quantum-cascade laser (QCL) array as the illumination source and a digital-pixel focal-plane-array (DFPA) camera as the receiver. The multi-wavelength QCL array used in this work comprises 15 individually addressable QCLs in which the beams from all lasers are spatially overlapped using wavelength beam combining (WBC). The DFPA camera was configured to integrate the laser light reflected from the sample and to perform on-chip subtraction of the passive thermal background. A 27-frame hyperspectral image was acquired of a liquid contaminant on a diffuse gold surface at a range of 5 meters. The measured spectral reflectance closely matches the calculated reflectance. Furthermore, the high-speed capabilities of the system were demonstrated by capturing differential reflectance images of sand and KClO3 particles that were moving at speeds of up to 10 m/s.
Helmet-mounted uncooled FPA camera for use in firefighting applications
NASA Astrophysics Data System (ADS)
Wu, Cheng; Feng, Shengrong; Li, Kai; Pan, Shunchen; Su, Junhong; Jin, Weiqi
2000-05-01
Starting from the concept and the needs of firefighters for thermal imaging, we discuss how a helmet-mounted camera can be applied in the harsh environment of a conflagration, especially at high temperature, and how better matching between the thermal imager and the helmet can be achieved in terms of weight, size, etc. Finally, we present a practical helmet-mounted IR camera based on an uncooled focal plane array detector for use in firefighting.
Neil A. Clark; Sang-Mook Lee
2004-01-01
This paper demonstrates how a digital video camera with a long lens can be used with pulse laser ranging in order to collect very large-scale tree crown measurements. The long focal length of the camera lens provides the magnification required for precise viewing of distant points with the trade-off of spatial coverage. Multiple video frames are mosaicked into a single...
Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System
NASA Astrophysics Data System (ADS)
Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki
In this paper, we present an automatic vision-based traffic sign recognition system, which can detect and classify traffic signs at long distance under different lighting conditions. To realize this purpose, the traffic sign recognition is developed in an originally proposed dual-focal active camera system. In this system, a telephoto camera is equipped as an assistant of a wide angle camera. The telephoto camera can capture a high-accuracy image of an object of interest in the field of view of the wide angle camera. The image from the telephoto camera provides enough information for recognition when the resolution of a traffic sign in the wide angle camera image is too low. In the proposed system, traffic sign detection and classification are processed separately for the different images from the wide angle camera and the telephoto camera. In addition, in order to detect traffic signs against complex backgrounds in different lighting conditions, we propose a type of color transformation which is invariant to lighting changes. This color transformation is applied to highlight the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide angle camera. After detection, the system actively captures a high-accuracy image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on the information from the wide angle camera. Moreover, in classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-accuracy image from the telephoto camera. Finally, based on the proposed system, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution in different lighting conditions.
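One widely used family of illumination-robust transformations of this general kind is chromaticity normalization; the sketch below is only an example of the idea and is not claimed to be the transformation proposed in the paper.

import numpy as np

def rgb_chromaticity(image_rgb, eps=1e-6):
    """Convert an RGB image to (r, g) chromaticity coordinates.
    Dividing by the channel sum removes much of the dependence on overall
    illumination intensity; illustrative only, not the paper's transformation."""
    rgb = image_rgb.astype(np.float64)
    s = rgb.sum(axis=-1, keepdims=True) + eps
    chroma = rgb / s
    return chroma[..., 0], chroma[..., 1]   # r and g channels; b = 1 - r - g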
In-situ calibration of nonuniformity in infrared staring and modulated systems
NASA Astrophysics Data System (ADS)
Black, Wiley T.
Infrared cameras can directly measure the apparent temperature of objects, providing thermal imaging. However, the raw output from most infrared cameras suffers from a strong, often limiting noise source called nonuniformity. Manufacturing imperfections in infrared focal planes lead to high pixel-to-pixel sensitivity to electronic bias, focal plane temperature, and other effects. The resulting imagery can only provide useful thermal imaging after a nonuniformity calibration has been performed. Traditionally, these calibrations are performed by momentarily blocking the field of view with a flat-temperature plate or blackbody cavity. However, because the pattern is a coupling of manufactured sensitivities with operational variations, periodic recalibration is required, sometimes on the order of tens of seconds. A class of computational methods called Scene-Based Nonuniformity Correction (SBNUC) has been researched for over 20 years, in which the nonuniformity calibration is estimated in digital processing by analysis of the video stream in the presence of camera motion. The most sophisticated SBNUC methods can completely and robustly eliminate the high-spatial-frequency component of nonuniformity with only an initial reference calibration or potentially no physical calibration. I will demonstrate a novel algorithm that advances these SBNUC techniques to support all spatial frequencies of nonuniformity correction. Long-wave infrared microgrid polarimeters are a class of camera that incorporate a microscale per-pixel wire-grid polarizer directly affixed to each pixel of the focal plane. These cameras have the capability of simultaneously measuring thermal imagery and polarization in a robust integrated package with no moving parts. I will describe the necessary adaptations of my SBNUC method to operate on this class of sensor as well as demonstrate SBNUC performance in LWIR polarimetry video collected on the UA mall.
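To make the SBNUC idea concrete, the classic constant-statistics correction is sketched below; it is shown only to illustrate the general approach and is not the algorithm developed in this dissertation.

import numpy as np

def constant_statistics_nuc(frames, eps=1e-6):
    """Classic constant-statistics scene-based nonuniformity correction.
    frames: (T, H, W) stack acquired with sufficient camera/scene motion.
    Assumes every pixel sees the same temporal mean and deviation over time."""
    offset = frames.mean(axis=0)                   # per-pixel offset estimate
    gain = frames.std(axis=0) + eps                # per-pixel gain estimate
    corrected = (frames - offset) / gain           # remove fixed-pattern noise
    # Rescale to a common radiometric reference using the global statistics.
    return corrected * gain.mean() + offset.mean()

Like most simple SBNUC schemes, this removes only the fixed pattern consistent with its statistical assumption and tends to leave low-spatial-frequency errors behind, which is the gap an all-spatial-frequency method targets.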
NASA Astrophysics Data System (ADS)
Bell, J. F.; Godber, A.; McNair, S.; Caplinger, M. A.; Maki, J. N.; Lemmon, M. T.; Van Beek, J.; Malin, M. C.; Wellington, D.; Kinch, K. M.; Madsen, M. B.; Hardgrove, C.; Ravine, M. A.; Jensen, E.; Harker, D.; Anderson, R. B.; Herkenhoff, K. E.; Morris, R. V.; Cisneros, E.; Deen, R. G.
2017-07-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted 2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) "true color" images, multispectral images in nine additional bands spanning 400-1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration.
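As a quick sanity check on these numbers, the quoted IFOV and FOV follow directly from the focal lengths and the detector pixel pitch (7.4 μm is assumed here from the KAI-2020 datasheet; it is not stated in the abstract):

import math

pitch = 7.4e-6                      # m, assumed KAI-2020 pixel pitch
for name, f in (("M-34", 0.034), ("M-100", 0.100)):
    ifov_urad = pitch / f * 1e6
    print(f"{name}: IFOV ~ {ifov_urad:.0f} urad")   # ~218 and ~74 urad

# Full-frame M-34 field of view from the 1648 x 1200 pixel format:
fov_x = 2 * math.degrees(math.atan(1648 * pitch / (2 * 0.034)))   # ~20 deg
fov_y = 2 * math.degrees(math.atan(1200 * pitch / (2 * 0.034)))   # ~15 deg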
15 CFR 742.4 - National security.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Requirements” section except those cameras in ECCN 6A003.b.4.b that have a focal plane array with 111,000 or... Albania, Australia, Austria, Belgium, Bulgaria, Canada, Croatia, Cyprus, Czech Republic, Denmark, Estonia....b.4.b that have a focal plane array with 111,000 or fewer elements and a frame rate of 60 Hz or less...
Camera for detection of cosmic rays of energy more than 10 EeV on the ISS orbit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garipov, G. K.; Khrenov, B. A.; Panasyuk, M. I.
1998-06-15
The concept of EHE CR observation from the ISS orbit is discussed. A design of the camera at the Russian segment of the ISS, comprising a large-area (60 m²) parabolic mirror with a photomultiplier pixel retina in its focal plane, is described.
NASA Technical Reports Server (NTRS)
Gunapala, S.; Bandara, S. V.; Liu, J. K.; Hong, W.; Sundaram, M.; Maker, P. D.; Muller, R. E.
1997-01-01
In this paper, we discuss the development of this very sensitive long-wavelength infrared (LWIR) camera based on a GaAs/AlGaAs QWIP focal plane array (FPA) and its performance in quantum efficiency, NEΔT, uniformity, and operability.
A new high-speed IR camera system
NASA Technical Reports Server (NTRS)
Travis, Jeffrey W.; Shu, Peter K.; Jhabvala, Murzy D.; Kasten, Michael S.; Moseley, Samuel H.; Casey, Sean C.; Mcgovern, Lawrence K.; Luers, Philip J.; Dabney, Philip W.; Kaipa, Ravi C.
1994-01-01
A multi-organizational team at the Goddard Space Flight Center is developing a new far infrared (FIR) camera system which furthers the state of the art for this type of instrument by incorporating recent advances in several technological disciplines. All aspects of the camera system are optimized for operation at the high data rates required for astronomical observations in the far infrared. The instrument is built around a Blocked Impurity Band (BIB) detector array which exhibits responsivity over a broad wavelength band and which is capable of operating at 1000 frames/sec, and consists of a focal plane dewar, a compact camera head electronics package, and a Digital Signal Processor (DSP)-based data system residing in a standard 486 personal computer. In this paper we discuss the overall system architecture, the focal plane dewar, and advanced features and design considerations for the electronics. This system, or one derived from it, may prove useful for many commercial and/or industrial infrared imaging or spectroscopic applications, including thermal machine vision for robotic manufacturing, photographic observation of short-duration thermal events such as combustion or chemical reactions, and high-resolution surveillance imaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riot, V J; Olivier, S; Bauman, B
2012-05-24
The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.
Multi-Angle Snowflake Camera Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shkurko, Konstantin; Garrett, T.; Gaustad, K
The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of depth of field for its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via a FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: image processing relies on the OpenCV library, and derived aggregated statistics rely on some clever averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
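The fallspeed derivation described above reduces to a simple ratio over the 32 mm array separation; a minimal sketch (not the operational VAP code) is:

ARRAY_SEPARATION_M = 0.032   # vertical distance between the upper and lower trigger arrays

def fallspeed(t_upper_s, t_lower_s):
    """Fallspeed (m/s) from the trigger times at the upper and lower IR arrays."""
    dt = t_lower_s - t_upper_s
    if dt <= 0:
        raise ValueError("lower array must trigger after the upper array")
    return ARRAY_SEPARATION_M / dt

print(fallspeed(0.000, 0.032))   # a 1 m/s hydrometeor takes 32 ms to cross the gap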
Thermal Imaging with Novel Infrared Focal Plane Arrays and Quantitative Analysis of Thermal Imagery
NASA Technical Reports Server (NTRS)
Gunapala, S. D.; Rafol, S. B.; Bandara, S. V.; Liu, J. K.; Mumolo, J. M.; Soibel, A.; Ting, D. Z.; Tidrow, Meimei
2012-01-01
We have developed a single long-wavelength infrared (LWIR) quantum well infrared photodetector (QWIP) camera for thermography. This camera has been used to measure the temperature profile of patients. A pixel-coregistered, simultaneously reading mid-wavelength infrared (MWIR)/LWIR dual-band QWIP camera was developed to improve the accuracy of temperature measurements, especially with objects of unknown emissivity. Even the dual-band measurement can provide inaccurate results due to the fact that emissivity is a function of wavelength. Thus we have been developing a four-band QWIP camera for accurate temperature measurement of remote objects.
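The benefit of a pixel-coregistered dual-band measurement can be illustrated with textbook gray-body ratio thermometry; the sketch below is a generic illustration (the band choices and numbers are hypothetical), not the authors' calibration procedure, and its gray-body assumption is exactly what breaks down when emissivity varies with wavelength, motivating the four-band approach.

import numpy as np
from scipy.optimize import brentq

H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(lam_m, T):
    """Spectral radiance of a blackbody at wavelength lam_m (m) and temperature T (K)."""
    return (2*H*C**2 / lam_m**5) / (np.exp(H*C / (lam_m*KB*T)) - 1.0)

def ratio_temperature(L1, L2, lam1, lam2, T_lo=150.0, T_hi=500.0):
    """Solve for T from the measured radiance ratio L1/L2 in two bands.
    Under a gray-body assumption the (unknown) emissivity cancels in the ratio."""
    target = L1 / L2
    f = lambda T: planck(lam1, T) / planck(lam2, T) - target
    return brentq(f, T_lo, T_hi)

# Example with synthetic measurements at 5 um (MWIR) and 9 um (LWIR), emissivity 0.7:
T_true, eps = 310.0, 0.7
T_est = ratio_temperature(eps*planck(5e-6, T_true), eps*planck(9e-6, T_true), 5e-6, 9e-6)
print(round(T_est, 1))   # ~310.0 K regardless of the (gray) emissivity value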
Light ray field capture using focal plane sweeping and its optical reconstruction using 3D displays.
Park, Jae-Hyeung; Lee, Sung-Keun; Jo, Na-Young; Kim, Hee-Jae; Kim, Yong-Soo; Lim, Hong-Gi
2014-10-20
We propose a method to capture the light ray field of a three-dimensional scene using focal plane sweeping. Multiple images are captured using a conventional camera at different focal distances, spanning the three-dimensional scene. The captured images are then back-projected to four-dimensional spatio-angular space to obtain the light ray field. The obtained light ray field can be visualized either using digital processing or optical reconstruction using various three-dimensional display techniques including integral imaging, layered display, and holography.
Liao, Jun; Wang, Zhe; Zhang, Zibang; Bian, Zichao; Guo, Kaikai; Nambiar, Aparna; Jiang, Yutong; Jiang, Shaowei; Zhong, Jingang; Choma, Michael; Zheng, Guoan
2018-02-01
We report the development of a multichannel microscopy platform for whole-slide multiplane, multispectral and phase imaging. We use trinocular heads to split the beam path into 6 independent channels and employ a camera array for parallel data acquisition, achieving a maximum data throughput of approximately 1 gigapixel per second. To perform single-frame rapid autofocusing, we place 2 near-infrared light-emitting diodes (LEDs) at the back focal plane of the condenser lens to illuminate the sample from 2 different incident angles. A hot mirror is used to direct the near-infrared light to an autofocusing camera. For multiplane whole-slide imaging (WSI), we acquire 6 different focal planes of a thick specimen simultaneously. For multispectral WSI, we relay the 6 independent image planes to the same focal position and simultaneously acquire information at 6 spectral bands. For whole-slide phase imaging, we acquire images at 3 focal positions simultaneously and use the transport-of-intensity equation to recover the phase information. We also provide an open-source design to further increase the number of channels from 6 to 15. The reported platform provides a simple solution for multiplexed fluorescence imaging and multimodal WSI. Acquiring an instant focal stack without z-scanning may also enable fast 3-dimensional dynamic tracking of various biological samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
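For the phase channel, a minimal FFT-based transport-of-intensity solver under a uniform-intensity simplification is sketched below; this is a common textbook formulation, not necessarily the authors' implementation, and the function name, arguments, and sign convention are assumptions.

import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, I0, pixel_size):
    """Recover phase from two defocused images via the transport-of-intensity
    equation, assuming a nearly uniform intensity I0 (illustrative simplification)."""
    k = 2 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2 * dz)                 # axial intensity derivative
    fy = np.fft.fftfreq(dIdz.shape[0], d=pixel_size)
    fx = np.fft.fftfreq(dIdz.shape[1], d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    denom = 4 * np.pi**2 * (FX**2 + FY**2)
    denom[0, 0] = 1.0                                    # placeholder; DC term zeroed below
    # TIE with uniform intensity: dI/dz = -(I0/k) * laplacian(phase)
    phase_hat = (k / I0) * np.fft.fft2(dIdz) / denom
    phase_hat[0, 0] = 0.0                                # the piston (DC) phase is undefined
    return np.real(np.fft.ifft2(phase_hat))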
NASA Astrophysics Data System (ADS)
Buisset, Christophe; Prasit, Apirat; Lépine, Thierry; Poshyachinda, Saran; Soonthornthum, Boonrucksar; Deboos, Alexis
2016-07-01
The National Astronomical Research Institute of Thailand (NARIT) is currently developing an all-spherical five-lens focal reducer to image a circular FOV of diameter Δθ = 14.6' on the 4K camera with a pixel scale equal to 0.42''/pixel. The spatial resolution will be better than 1.2'' over the full visible spectral domain [400 nm, 800 nm]. The relative irradiance between the ghost and the science images will be lower than 10^-4. The maximum distortion will be lower than 1% and the maximum angle of incidence on the filters will be equal to 8°. The focal reducer comprises one doublet L1 located at the fork entrance and one triplet L2 located in front of the camera. The doublet L1 will be mounted on a tip-tilt mount placed on a robotic sliding rail. L1 will thus be placed in the optical path during observations with the 4K camera and removed during observations with the other instruments. The triplet L2 will be installed on the instrument cube in front of the camera equipped with the filter wheel. The glass will be manufactured by a specialized company, the mechanical parts will be manufactured using the NARIT Computer Numerical Control machine, and the lenses will be integrated at NARIT. In this paper, we describe the optical and mechanical designs and we present the geometrical performance, the transmission budget and the results of the stray light analyses.
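As a quick sanity check of the quoted figures, a 14.6' field sampled at 0.42''/pixel spans roughly 2100 pixels, which indeed fits on a 4K detector; the short sketch below just carries out that arithmetic.

    # Arithmetic check of the field/pixel-scale figures quoted above.
    pixel_scale_arcsec = 0.42              # arcsec per pixel on the 4K camera
    fov_diameter_arcmin = 14.6             # circular field of view

    fov_diameter_arcsec = fov_diameter_arcmin * 60.0
    fov_diameter_pixels = fov_diameter_arcsec / pixel_scale_arcsec
    print(round(fov_diameter_pixels))      # ~2086 pixels across the field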
Coherent infrared imaging camera (CIRIC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.
1995-07-01
New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.
An all-silicone zoom lens in an optical imaging system
NASA Astrophysics Data System (ADS)
Zhao, Cun-Hua
2013-09-01
An all-silicone zoom lens is fabricated. A tunable metal ring is fastened around the side edge of the lens. A nylon rope linked to a motor is tied around the notch in the metal ring. While the motor operates, the rope can tighten or release to change the focal length of the lens. A calculation method is developed to obtain the focal length and the zoom ratio. Testing is then carried out, and the measured values agree well with the calculated ones. Finally, the imaging performance of the all-silicone lens is demonstrated. The all-silicone lens has potential uses in cellphone cameras, notebook cameras, micro monitor lenses, etc.
CCD TV focal plane guider development and comparison to SIRTF applications
NASA Technical Reports Server (NTRS)
Rank, David M.
1989-01-01
It is expected that the SIRTF payload will use a CCD TV focal plane fine guidance sensor to provide acquisition of sources and tracking stability of the telescope. CCD TV cameras and guiders have been developed at Lick Observatory for several years, producing state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory to characterize present CCD autoguiding technology and relate it to SIRTF applications are presented. Two different designs of CCD cameras were constructed, using virtual phase and buried channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon, and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.
Wide field NEO survey 1.0-m telescope with 10 2k×4k mosaic CCD camera
NASA Astrophysics Data System (ADS)
Isobe, Syuzo; Asami, Atsuo; Asher, David J.; Hashimoto, Toshiyasu; Nakano, Shi-ichi; Nishiyama, Kota; Ohshima, Yoshiaki; Terazono, Junya; Umehara, Hiroaki; Yoshikawa, Makoto
2002-12-01
We developed a new 1.0 m telescope with a 3 degree flat focal plane to which a mosaic CCD camera with ten 2k×4k chips is fixed. The system was set up in February 2002 and is now undergoing final fine adjustments. Since the telescope has a focal length of 3 m, a field of 7.5 square degrees is covered in one image. In good seeing conditions (1.5 arcseconds) at the site, located in Bisei town, Okayama prefecture, Japan, we can expect to detect stars down to 20th magnitude with an exposure time of 60 seconds. Allowing for the 46-second read-out time of the CCD camera, one image is taken every two minutes, and about 2,100 square degrees of sky are expected to be covered in one clear night. This system is very effective for survey work, especially for Near-Earth-Asteroid detection.
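The survey cadence quoted above can be checked with a few lines of arithmetic: a 60 s exposure plus a 46 s readout gives about one 7.5-square-degree image every two minutes, so covering ~2,100 square degrees corresponds to roughly eight hours of clear-sky observing.

    # Back-of-the-envelope check of the cadence and nightly coverage above.
    exposure_s, readout_s = 60.0, 46.0
    field_sq_deg = 7.5
    night_coverage_sq_deg = 2100.0

    cycle_s = exposure_s + readout_s                       # ~106 s per image
    images_per_night = night_coverage_sq_deg / field_sq_deg
    hours_observing = images_per_night * cycle_s / 3600.0
    print(round(cycle_s), round(images_per_night), round(hours_observing, 1))
    # -> 106 280 8.2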
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rachel F. Brem; Jocelyn A. Rapelyea; Gilat Zisman
2005-08-01
To prospectively evaluate a high-resolution breast-specific gamma camera for depicting occult breast cancer in women at high risk for breast cancer but with normal mammographic and physical examination findings. MATERIALS AND METHODS: Institutional Review Board approval and informed consent were obtained. The study was HIPAA compliant. Ninety-four high-risk women (age range, 36-78 years; mean, 55 years) with normal mammographic (Breast Imaging Reporting and Data System [BI-RADS] 1 or 2) and physical examination findings were evaluated with scintimammography. After injection with 25-30 mCi (925-1110 MBq) of technetium 99m sestamibi, patients were imaged with a high-resolution small-field-of-view breast-specific gamma camera in craniocaudal and mediolateral oblique projections. Scintimammograms were prospectively classified according to focal radiotracer uptake as normal (score of 1), with no focal or diffuse uptake; benign (score of 2), with minimal patchy uptake; probably benign (score of 3), with scattered patchy uptake; probably abnormal (score of 4), with mild focal radiotracer uptake; and abnormal (score of 5), with marked focal radiotracer uptake. Mammographic breast density was categorized according to BI-RADS criteria. Patients with normal scintimammograms (scores of 1, 2, or 3) were followed up for 1 year with an annual mammogram, physical examination, and repeat scintimammography. Patients with abnormal scintimammograms (scores of 4 or 5) underwent ultrasonography (US), and those with focal hypoechoic lesions underwent biopsy. If no lesion was found during US, patients were followed up with scintimammography. Specific pathologic findings were compared with scintimammographic findings. RESULTS: Of 94 women, 78 (83%) had normal scintimammograms (score of 1, 2, or 3) at initial examination and 16 (17%) had abnormal scintimammograms (score of 4 or 5). Fourteen (88%) of the 16 patients had either benign findings at biopsy or no focal abnormality at US; in two (12%) patients, invasive carcinoma was diagnosed at US-guided biopsy (9 mm each at pathologic examination). CONCLUSION: High-resolution breast-specific scintimammography can depict small (<1-cm), mammographically occult, nonpalpable lesions in women at increased risk for breast cancer not otherwise identified at mammography or physical examination.
Exploring the Universe with the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
1990-01-01
A general overview is given of the operations, engineering challenges, and components of the Hubble Space Telescope. Deployment, checkout and servicing in space are discussed. The optical telescope assembly, focal plane scientific instruments, wide field/planetary camera, faint object spectrograph, faint object camera, Goddard high resolution spectrograph, high speed photometer, fine guidance sensors, second generation technology, and support systems and services are reviewed.
Hyper Suprime-Cam: Camera dewar design
NASA Astrophysics Data System (ADS)
Komiyama, Yutaka; Obuchi, Yoshiyuki; Nakaya, Hidehiko; Kamata, Yukiko; Kawanomoto, Satoshi; Utsumi, Yousuke; Miyazaki, Satoshi; Uraguchi, Fumihiro; Furusawa, Hisanori; Morokuma, Tomoki; Uchida, Tomohisa; Miyatake, Hironao; Mineo, Sogo; Fujimori, Hiroki; Aihara, Hiroaki; Karoji, Hiroshi; Gunn, James E.; Wang, Shiang-Yu
2018-01-01
This paper describes the detailed design of the CCD dewar and the camera system which is a part of the wide-field imager Hyper Suprime-Cam (HSC) on the 8.2 m Subaru Telescope. On the 1.°5 diameter focal plane (497 mm in physical size), 116 four-side buttable 2 k × 4 k fully depleted CCDs are tiled with 0.3 mm gaps between adjacent chips, which are cooled down to -100°C by two pulse tube coolers with a capability to exhaust 100 W heat at -100°C. The design of the dewar is basically a natural extension of Suprime-Cam, incorporating some improvements such as (1) a detailed CCD positioning strategy to avoid any collision between CCDs while maximizing the filling factor of the focal plane, (2) a spherical washers mechanism adopted for the interface points to avoid any deformation caused by the tilt of the interface surface to be transferred to the focal plane, (3) the employment of a truncated-cone-shaped window, made of synthetic silica, to save the back focal space, and (4) a passive heat transfer mechanism to exhaust efficiently the heat generated from the CCD readout electronics which are accommodated inside the dewar. Extensive simulations using a finite-element analysis (FEA) method are carried out to verify that the design of the dewar is sufficient to satisfy the assigned errors. We also perform verification tests using the actually assembled CCD dewar to supplement the FEA and demonstrate that the design is adequate to ensure an excellent image quality which is key to the HSC. The details of the camera system, including the control computer system, are described as well as the assembling process of the dewar and the process of installation on the telescope.
Bell, James F.; Godber, A.; McNair, S.; Caplinger, M.A.; Maki, J.N.; Lemmon, M.T.; Van Beek, J.; Malin, M.C.; Wellington, D.; Kinch, K.M.; Madsen, M.B.; Hardgrove, C.; Ravine, M.A.; Jensen, E.; Harker, D.; Anderson, Ryan; Herkenhoff, Kenneth E.; Morris, R.V.; Cisneros, E.; Deen, R.G.
2017-01-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted ~2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) “true color” images, multispectral images in nine additional bands spanning ~400–1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration
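The geometric quantities quoted above are internally consistent: the field of view is approximately the IFOV times the detector format, and the IFOV times the focal length gives the implied pixel pitch (about 7.4 µm, matching the KAI-2020). The sketch below repeats that small-angle arithmetic; it is illustrative only and not part of the Mastcam calibration pipeline.

    import math

    # Consistency check of the Mastcam IFOV, focal length, and FOV figures.
    def fov_deg(ifov_mrad: float, n_pixels: int) -> float:
        """Small-angle field of view in degrees for a given IFOV and pixel count."""
        return math.degrees(ifov_mrad * 1e-3 * n_pixels)

    for name, ifov_mrad, f_mm in [("M-34", 0.22, 34.0), ("M-100", 0.074, 100.0)]:
        h = fov_deg(ifov_mrad, 1648)                # horizontal FOV
        v = fov_deg(ifov_mrad, 1200)                # vertical FOV
        pitch_um = ifov_mrad * f_mm                 # mrad * mm = micrometres
        print(f"{name}: {h:.1f} x {v:.1f} deg, pixel pitch ~{pitch_um:.1f} um")
    # M-34: 20.8 x 15.1 deg, ~7.5 um;  M-100: 7.0 x 5.1 deg, ~7.4 um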
Image Processing for Cameras with Fiber Bundle Image Relay
… length. Optical fiber bundles have been used to couple between this focal surface and planar image sensors. However, such fiber-coupled imaging systems … coupled to six discrete CMOS focal planes. We characterize the locally space-variant system impulse response at various stages: monocentric lens image … vignetting, and stitch together the image data from discrete sensors into a single panorama. We compare processed images from the prototype to those taken with …
Observation of Planetary Motion Using a Digital Camera
ERIC Educational Resources Information Center
Meyn, Jan-Peter
2008-01-01
A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8^m apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…
Visualization of Subsurface Defects in Composites using a Focal Plane Array Infrared Camera
NASA Technical Reports Server (NTRS)
Plotnikov, Yuri A.; Winfree, William P.
1999-01-01
A technique for enhanced defect visualization in composites via transient thermography is presented in this paper. The effort targets automated defect map construction for multiple defects located in the observed area. Experimental data were collected on composite panels of different thicknesses with square inclusions and flat-bottom holes of different depth and orientation. The time evolution of the thermal response and spatial thermal profiles are analyzed. The pattern generated by carbon fibers and the vignetting effect of the focal plane array camera make defect visualization difficult. Defect visibility is improved by the pulse phase technique and the spatial background treatment. The relationship between the size of a defect and its reconstructed image is analyzed as well. The image processing technique for noise reduction is discussed.
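For readers unfamiliar with the pulse phase technique mentioned above, the idea is to Fourier-transform each pixel's thermal response in time and work with the phase of a low-frequency component, which is less sensitive to non-uniform heating and vignetting than raw amplitude. A minimal NumPy sketch (with a hypothetical frame stack, not the paper's data) follows.

    import numpy as np

    # Pulse-phase image: FFT each pixel's time history and keep the phase of a
    # chosen low-frequency bin. 'frames' has shape (n_frames, height, width).
    def pulse_phase_image(frames: np.ndarray, bin_index: int = 1) -> np.ndarray:
        spectrum = np.fft.rfft(frames, axis=0)     # transform along the time axis
        return np.angle(spectrum[bin_index])       # phase map in radians

    rng = np.random.default_rng(0)
    frames = rng.random((128, 64, 64))             # stand-in for a thermogram sequence
    print(pulse_phase_image(frames).shape)         # (64, 64)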
Periodicity analysis on cat-eye reflected beam profiles of optical detectors
NASA Astrophysics Data System (ADS)
Gong, Mali; He, Sifeng
2017-05-01
The cat-eye reflected beam profiles of most optical detectors exhibit a characteristic periodicity, which is caused by the array arrangement of sensors at their optical focal planes. We find and prove, for the first time, that the reflected beam profile breaks up into several periodic spots at a reflected propagation distance corresponding to half the imaging distance of a CCD camera. Furthermore, the spatial period of these spots is approximately constant and independent of the CCD camera's imaging distance; it is related only to the focal length and pixel size of the CCD sensor. Thus, we can obtain the imaging distance and intrinsic parameters of the optical detector by analyzing its cat-eye reflected beam profiles. This conclusion can be applied in the field of non-cooperative cat-eye target recognition.
Long-Wavelength 640 x 486 GaAs/AlGaAs Quantum Well Infrared Photodetector Snap-Shot Camera
NASA Technical Reports Server (NTRS)
Gunapala, Sarath D.; Bandara, Sumith V.; Liu, John K.; Hong, Winn; Sundaram, Mani; Maker, Paul D.; Muller, Richard E.; Shott, Craig A.; Carralejo, Ronald
1998-01-01
A 9-micrometer cutoff 640 x 486 snap-shot quantum well infrared photodetector (QWIP) camera has been demonstrated. The performance of this QWIP camera is reported, including indoor and outdoor imaging. A noise equivalent differential temperature (NEΔT) of 36 mK has been achieved at 300 K background with f/2 optics. This is in good agreement with expected focal plane array sensitivity due to the practical limitations on charge handling capacity of the multiplexer, read noise, bias voltage, and operating temperature.
Manned observations technology development, FY 1992 report
NASA Technical Reports Server (NTRS)
Israel, Steven
1992-01-01
This project evaluated the suitability of the NASA/JSC developed electronic still camera (ESC) digital image data for Earth observations from the Space Shuttle, as a first step to aid planning for Space Station Freedom. Specifically, image resolution achieved from the Space Shuttle using the current ESC system, which is configured with a Loral 15 mm x 15 mm (1024 x 1024 pixel array) CCD chip on the focal plane of a Nikon F4 camera, was compared to that of current handheld 70 mm Hasselblad 500 EL/M film cameras.
Expected progress based on aluminium gallium nitride Focal Plane Arrays for near and deep Ultraviolet
NASA Astrophysics Data System (ADS)
Reverchon, J.-L.; Robin, K.; Bansropun, S.; Gourdel, Y.; Robo, J.-A.; Truffer, J.-P.; Costard, E.; Brault, J.; Frayssinet, E.; Duboz, J.-Y.
The fast development of nitrides has given the opportunity to investigate AlGaN as a material for ultraviolet detection. A camera based on such a material presents an extremely low dark current at room temperature. It can compete with technologies based on photocathodes, MCP intensifiers, back-thinned CCDs, or hybrid CMOS focal plane arrays for low-flux measurements. First, we present results on a focal plane array of 320 × 256 pixels with a pitch of 30 μm. The peak responsivity is tuned from 260 nm to 360 nm in different cameras. All these results are obtained in a standard SWIR supply chain and with AlGaN Schottky diodes grown on sapphire. We then present the first attempts to transfer the standard-design Schottky photodiodes from sapphire to silicon substrates. We show the capability to remove the silicon substrate, to etch the window layer in order to extend the bandwidth to shorter wavelengths, and to maintain the integrity of the AlGaN membrane.
SU-D-BRC-07: System Design for a 3D Volumetric Scintillation Detector Using SCMOS Cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darne, C; Robertson, D; Alsanea, F
2016-06-15
Purpose: The purpose of this project is to build a volumetric scintillation detector for quantitative imaging of 3D dose distributions of proton beams accurately in near real-time. Methods: The liquid scintillator (LS) detector consists of a transparent acrylic tank (20×20×20 cm³) filled with a liquid scintillator that when irradiated with protons generates scintillation light. To track rapid spatial and dose variations in spot scanning proton beams we used three scientific-complementary metal-oxide semiconductor (sCMOS) imagers (2560×2160 pixels). The cameras collect optical signal from three orthogonal projections. To reduce system footprint, two mirrors oriented at 45° to the tank surfaces redirect scintillation light to cameras for capturing top and right views. Selection of fixed focal length objective lenses for these cameras was based on their ability to provide large depth of field (DoF) and required field of view (FoV). Multiple cross-hairs imprinted on the tank surfaces allow for image corrections arising from camera perspective and refraction. Results: We determined that by setting the sCMOS to 16-bit dynamic range, truncating its FoV (1100×1100 pixels) to image the entire volume of the LS detector, and using a 5.6 ms integration time, the imaging rate can be ramped up to 88 frames per second (fps). A 20 mm focal length lens provides a 20 cm imaging DoF and 0.24 mm/pixel resolution. The master-slave camera configuration enables the slaves to initiate image acquisition instantly (within 2 µs) after receiving a trigger signal. A computer with 128 GB RAM was used for spooling images from the cameras and can sustain a maximum recording time of 2 min per camera at 75 fps. Conclusion: The three sCMOS cameras are capable of high speed imaging. They can therefore be used for quick, high-resolution, and precise mapping of dose distributions from scanned spot proton beams in three dimensions.
Inverse Relationship between Serum VEGF Levels and Late In-Stent Restenosis of Drug-Eluting Stents
Shen, Li; Ji, Meng; Cai, Sishi; Chen, Jiahui; Yao, Zhifeng
2017-01-01
Late in-stent restenosis (ISR) has raised concerns regarding the long-term efficacy of drug-eluting stents (DES). The role of vascular endothelial growth factor (VEGF) in the pathological process of ISR is controversial. This retrospective study aimed to investigate the relationship between serum VEGF levels and late ISR in patients with DES implantation. A total of 158 patients who underwent angiography follow-up beyond 1 year after intervention were included. The study population was classified into ISR and non-ISR groups. The ISR group was further divided according to follow-up duration and Mehran classification. VEGF levels were significantly lower in the ISR group than in the non-ISR group [96.34 (48.18, 174.14) versus 179.14 (93.59, 307.74) pg/mL, p < 0.0001]. Multivariate regression revealed that VEGF level, procedure age, and low-density lipoprotein cholesterol were independent risk factors for late ISR formation. Subgroup analysis demonstrated that VEGF levels were even lower in the very late (≥5 years) and diffuse ISR group (Mehran patterns II, III, and IV) than in the late ISR group (1–4 years) and the focal ISR group (Mehran pattern I), respectively. Furthermore, significant difference was found between diffuse and focal ISR groups. Serum VEGF levels were inversely associated with late ISR after DES implantation. PMID:28373989
Optomechanical System Development of the AWARE Gigapixel Scale Camera
NASA Astrophysics Data System (ADS)
Son, Hui S.
Electronic focal plane arrays (FPA) such as CMOS and CCD sensors have dramatically improved to the point that digital cameras have essentially phased out film (except in very niche applications such as hobby photography and cinema). However, the traditional method of mating a single lens assembly to a single detector plane, as required for film cameras, is still the dominant design used in cameras today. The use of electronic sensors and their ability to capture digital signals that can be processed and manipulated post acquisition offers much more freedom of design at system levels and opens up many interesting possibilities for the next generation of computational imaging systems. The AWARE gigapixel scale camera is one such computational imaging system. By utilizing a multiscale optical design, in which a large aperture objective lens is mated with an array of smaller, well corrected relay lenses, we are able to build an optically simple system that is capable of capturing gigapixel scale images via post acquisition stitching of the individual pictures from the array. Properly shaping the array of digital cameras allows us to form an effectively continuous focal surface using off the shelf (OTS) flat sensor technology. This dissertation details developments and physical implementations of the AWARE system architecture. It illustrates the optomechanical design principles and system integration strategies we have developed through the course of the project by summarizing the results of the two design phases for AWARE: AWARE-2 and AWARE-10. These systems represent significant advancements in the pursuit of scalable, commercially viable snapshot gigapixel imaging systems and should serve as a foundation for future development of such systems.
NASA Astrophysics Data System (ADS)
Daly, Michael J.; Muhanna, Nidal; Chan, Harley; Wilson, Brian C.; Irish, Jonathan C.; Jaffray, David A.
2014-02-01
A freehand, non-contact diffuse optical tomography (DOT) system has been developed for multimodal imaging with intraoperative cone-beam CT (CBCT) during minimally-invasive cancer surgery. The DOT system is configured for near-infrared fluorescence imaging with indocyanine green (ICG) using a collimated 780 nm laser diode and a nearinfrared CCD camera (PCO Pixelfly USB). Depending on the intended surgical application, the camera is coupled to either a rigid 10 mm diameter endoscope (Karl Storz) or a 25 mm focal length lens (Edmund Optics). A prototype flatpanel CBCT C-Arm (Siemens Healthcare) acquires low-dose 3D images with sub-mm spatial resolution. A 3D mesh is extracted from CBCT for finite-element DOT implementation in NIRFAST (Dartmouth College), with the capability for soft/hard imaging priors (e.g., segmented lymph nodes). A stereoscopic optical camera (NDI Polaris) provides real-time 6D localization of reflective spheres mounted to the laser and camera. Camera calibration combined with tracking data is used to estimate intrinsic (focal length, principal point, non-linear distortion) and extrinsic (translation, rotation) lens parameters. Source/detector boundary data is computed from the tracked laser/camera positions using radiometry models. Target registration errors (TRE) between real and projected boundary points are ~1-2 mm for typical acquisition geometries. Pre-clinical studies using tissue phantoms are presented to characterize 3D imaging performance. This translational research system is under investigation for clinical applications in head-and-neck surgery including oral cavity tumour resection, lymph node mapping, and free-flap perforator assessment.
NASA Technical Reports Server (NTRS)
1997-01-01
Passive millimeter wave (PMMW) sensors have the ability to see through fog, clouds, dust and sandstorms and thus have the potential to support all-weather operations, both military and commercial. Many of the applications, such as military transport or commercial aircraft landing, are technologically stressing in that they require imaging of a scene with a large field of view in real time and with high spatial resolution. The development of a low cost PMMW focal plane array camera is essential to obtain real-time video images to fulfill the above needs. The overall objective of this multi-year project (Phase 1) was to develop and demonstrate the capabilities of a W-band PMMW camera with a microwave/millimeter wave monolithic integrated circuit (MMIC) focal plane array (FPA) that can be manufactured at low cost for both military and commercial applications. This overall objective was met in July 1997 when the first video images from the camera were generated of an outdoor scene. In addition, our consortium partner McDonnell Douglas was to develop a real-time passive millimeter wave flight simulator to permit pilot evaluation of a PMMW-equipped aircraft in a landing scenario. A working version of this simulator was completed. This work was carried out under the DARPA-funded PMMW Camera Technology Reinvestment Project (TRP), also known as the PMMW Camera DARPA Joint Dual-Use Project. In this final report for the Phase 1 activities, a year by year description of what the specific objectives were, the approaches taken, and the progress made is presented, followed by a description of the validation and imaging test results obtained in 1997.
The influence of focal spot blooming on high-contrast spatial resolution in CT imaging.
Grimes, Joshua; Duan, Xinhui; Yu, Lifeng; Halaweish, Ahmed F; Haag, Nicole; Leng, Shuai; McCollough, Cynthia
2015-10-01
The objective of this work was to investigate focal spot blooming effects on the spatial resolution of CT images and to evaluate an x-ray tube that uses dynamic focal spot control for minimizing focal spot blooming. The influence of increasing tube current at a fixed tube potential of 80 kV on high-contrast spatial resolution of seven different CT scanner models (scanners A-G), including one scanner that uses dynamic focal spot control to reduce focal spot blooming (scanner A), was evaluated. Spatial resolution was assessed using a wire phantom for the modulation transfer function (MTF) calculation and a copper disc phantom for measuring the slice sensitivity profile (SSP). The impact of varying the tube potential was investigated on two scanner models (scanners A and B) by measuring the MTF and SSP and also by using the resolution bar pattern module of the ACR CT phantom. The phantoms were scanned at 70-150 kV on scanner A and 80-140 kV on scanner B, with tube currents from 100 mA up to the maximum tube current available on each scanner. The images were reconstructed using a slice thickness of 0.6 mm with both smooth and sharp kernels. Additionally, focal spot size at varying tube potentials and currents was directly measured using pinhole and slit camera techniques. Evaluation of the MTF and SSP data from the 7 CT scanner models evaluated demonstrated decreased focal spot blooming for newer scanners, as evidenced by decreasing deviations in MTF and SSP as tube current varied. For scanners A and B, where focal spot blooming effects as a function of tube potential were assessed, the spatial resolution variation in the axial plane was much smaller on scanner A compared to scanner B as tube potential and current changed. On scanner A, the 50% MTF never decreased by more than 2% from the 50% MTF measured at 100 mA. On scanner B, the 50% MTF decreased by as much as 19% from the 50% MTF measured at 100 mA. Assessments of the SSP, the bar patterns in the ACR phantom and the pinhole and slit camera measurements were consistent with the MTF calculations. Focal spot blooming has a noticeable effect on spatial resolution in CT imaging. The focal spot shaping technology of scanner A greatly reduced blooming effects.
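As background to the wire-phantom MTF measurements described above, a common processing chain is to collapse the wire image along the wire direction into a line spread function and take the normalized magnitude of its Fourier transform. The sketch below illustrates that generic approach; the inputs are hypothetical and the paper's exact algorithm may differ.

    import numpy as np

    # Generic wire/LSF-based MTF estimate. 'roi' is a 2-D array containing the
    # wire image (wire running along axis 0); 'pixel_mm' is the pixel spacing.
    def mtf_from_wire(roi: np.ndarray, pixel_mm: float):
        lsf = roi.mean(axis=0)                         # average along the wire
        lsf = lsf - 0.5 * (lsf[0] + lsf[-1])           # crude background removal
        spectrum = np.abs(np.fft.rfft(lsf))
        mtf = spectrum / spectrum[0]                   # normalize to 1 at zero frequency
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)  # spatial frequency in cycles/mm
        return freqs, mtf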
Poerner, Tudor C; Otto, Sylvia; Gassdorf, Johannes; Nitsche, Kristina; Janiak, Florian; Scheller, Bruno; Goebel, Björn; Jung, Christian; Figulla, Hans R
2014-12-01
In this randomized trial, strut coverage and neointimal proliferation of a therapy of bare metal stents (BMSs) postdilated with the paclitaxel drug-eluting balloon (DEB) was compared with everolimus drug-eluting stents (DESs) at 6-month follow-up using optical coherence tomography. We hypothesized sufficient stent coverage at follow-up. A total of 105 lesions in 90 patients were treated with either XIENCE V DES (n=51) or BMS postdilated with the SeQuent Please DEB (n=54). At follow-up, comparable results on the primary optical coherence tomography end point (percentage uncovered struts 5.64±9.65% in BMS+DEB versus 4.93±9.29% in DES; P=0.366) were found. Thus, BMS+DEB achieved the prespecified noninferiority margin of 5% uncovered struts versus DES (difference between treatment means, 0.71%; one-sided upper 95% confidence interval, 4.14%; noninferiority P=0.04). Optical coherence tomography analysis showed significantly more global neointimal proliferation in the BMS+DEB group (15.7±7.8 versus 11.0±5.2 mm³ proliferation volume/cm stent length; P=0.002). No significant focal in-stent stenosis analyzed with angiography (percentage diameter stenosis at follow-up, 22.8±11.9 versus 16.9±10.4; P=0.014) and optical coherence tomography (peak local area stenosis, 39.5±13.8% versus 36.8±15.6%; P=0.409) was found. Good stent strut coverage of >94% was found in both therapy groups. Despite greater suppression of global neointimal growth in DES, both DES and BMS+DEB effectively prevented clinically relevant focal restenosis at 6-month follow-up. http://www.clinicaltrials.gov. Unique identifier: NCT01056744. © 2014 American Heart Association, Inc.
Supernova Argonne/HEP Dark Energy Survey Group Ravi Gupta, Eve Kovacs, Steve Kuhlmann, Hal Spinka, Kasia Pomian The Argonne/HEP Dark Energy Survey (DES) group worked to build and test the Dark Energy Camera
Bifocal Stereo for Multipath Person Re-Identification
NASA Astrophysics Data System (ADS)
Blott, G.; Heipke, C.
2017-11-01
This work presents an approach to person re-identification that exploits bifocal stereo cameras. Current monocular person re-identification approaches show a decreasing working distance when the image resolution is increased to obtain higher re-identification performance. We propose a novel 3D multipath bifocal approach, combining a rectilinear lens with a larger focal length for long-range distances and a fisheye lens with a smaller focal length for the near range. The person re-identification performance is at least on par with 2D approaches, while the working distance is increased and, on average, 10% higher re-identification performance is achieved in the overlapping field of view compared to a single camera. In addition, the 3D information from the overlapping field of view is exploited to resolve potential 2D ambiguities.
Liang, Kun; Yang, Cailan; Peng, Li; Zhou, Bo
2017-02-01
In uncooled long-wave IR camera systems, the temperature of the focal plane array (FPA) varies with the environmental temperature as well as with operating time. The spatial nonuniformity of the FPA, which is partly affected by the FPA temperature, changes accordingly, resulting in reduced image quality. This study presents a real-time nonuniformity correction algorithm based on FPA temperature to compensate for nonuniformity caused by FPA temperature fluctuation. First, gain coefficients are calculated using a two-point correction technique. Then offset parameters at different FPA temperatures are obtained and stored in tables. When the camera operates, the offset tables are called to update the current offset parameters via a temperature-dependent interpolation. Finally, the gain coefficients and offset parameters are used to correct the output of the IR camera in real time. The proposed algorithm is evaluated and compared with two representative shutterless algorithms [minimizing the sum of the squares of errors algorithm (MSSE), template-based solution algorithm (TBS)] using IR images captured by a 384×288 pixel uncooled IR camera with a 17 μm pitch. Experimental results show that this method can quickly trace the response drift of the detector units when the FPA temperature changes. The correction quality of the proposed algorithm is as good as that of MSSE, while its processing time is as short as that of TBS, which means the proposed algorithm is suitable for real-time operation while achieving a high correction effect.
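A minimal sketch of the correction scheme described above follows: the two-point gain map is fixed, while the offset map is linearly interpolated between tables stored at discrete FPA temperatures. The array names and the linear gain-plus-offset form are assumptions for illustration, not the paper's exact implementation.

    import numpy as np

    # Temperature-dependent nonuniformity correction: fixed gain, interpolated offset.
    def correct_frame(raw, gain, offset_tables, table_temps, fpa_temp):
        """raw, gain, and each entry of offset_tables are (h, w) arrays."""
        temps = np.asarray(table_temps, dtype=float)          # ascending FPA temperatures
        i = int(np.clip(np.searchsorted(temps, fpa_temp), 1, len(temps) - 1))
        t0, t1 = temps[i - 1], temps[i]
        w = (fpa_temp - t0) / (t1 - t0)                       # interpolation weight
        offset = (1.0 - w) * offset_tables[i - 1] + w * offset_tables[i]
        return gain * raw + offset                            # corrected frame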
Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.
Song, Kai-Tai; Tai, Jen-Chao
2006-10-01
Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Interesting experimental results are presented to validate the robustness and accuracy of the proposed method.
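One building block of such lane-marking-based calibration is the vanishing point where the images of two parallel markings intersect; focal length, tilt, and pan are then derived from it. The sketch below computes only that intersection (in homogeneous coordinates) with illustrative endpoints; the paper's closed-form camera-parameter formulas are not reproduced here.

    import numpy as np

    # Vanishing point of two parallel lane markings given two image points on each.
    def vanishing_point(p1, p2, q1, q2):
        to_h = lambda p: np.array([p[0], p[1], 1.0])       # homogeneous coordinates
        l1 = np.cross(to_h(p1), to_h(p2))                  # line through first marking
        l2 = np.cross(to_h(q1), to_h(q2))                  # line through second marking
        v = np.cross(l1, l2)                               # homogeneous intersection
        return v[:2] / v[2]

    print(vanishing_point((0, 480), (200, 0), (640, 480), (440, 0)))  # -> [ 320. -288.]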
Potentialities of HTS Superconductor Technology in Telecommunication Satellites
2005-07-13
… telecommunications satellites, using high-critical-temperature superconducting components and cooled electronics, are presented … (superconductor) is the focal array of a receiving antenna (FAFR). The target application is a receive multi-beam satellite antenna in Ka-band. Small …
A DirtI Application for LBT Commissioning Campaigns
NASA Astrophysics Data System (ADS)
Borelli, J. L.
2009-09-01
In order to characterize the Gregorian focal stations and test the performance achieved by the Large Binocular Telescope (LBT) adaptive optics system, two infrared test cameras were constructed within a joint project between INAF (Observatorio Astronomico di Bologna, Italy) and the Max Planck Institute for Astronomy (Germany). This contribution describes the functionality of, and the successful results obtained with, the Daemon for the Infrared Test Camera Interface (DirtI) during commissioning campaigns.
Graphic design of pinhole cameras
NASA Technical Reports Server (NTRS)
Edwards, H. B.; Chu, W. P.
1979-01-01
The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
Image quality testing of assembled IR camera modules
NASA Astrophysics Data System (ADS)
Winters, Daniel; Erichsen, Patrik
2013-10-01
Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are becoming more and more a mass market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters such as best focus plane, image plane tilt, auto-focus quality, and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, the suitability for fully automated measurements in mass production.
Defining habitat covariates in camera-trap based occupancy studies
Niedballa, Jürgen; Sollmann, Rahel; Mohamed, Azlan bin; Bender, Johannes; Wilting, Andreas
2015-01-01
In species-habitat association studies, both the type and spatial scale of habitat covariates need to match the ecology of the focal species. We assessed the potential of high-resolution satellite imagery for generating habitat covariates using camera-trapping data from Sabah, Malaysian Borneo, within an occupancy framework. We tested the predictive power of covariates generated from satellite imagery at different resolutions and extents (focal patch sizes, 10–500 m around sample points) on estimates of occupancy patterns of six small to medium sized mammal species/species groups. High-resolution land cover information had considerably more model support for small, patchily distributed habitat features, whereas it had no advantage for large, homogeneous habitat features. A comparison of different focal patch sizes including remote sensing data and an in-situ measure showed that patches with a 50-m radius had most support for the target species. Thus, high-resolution satellite imagery proved to be particularly useful in heterogeneous landscapes, and can be used as a surrogate for certain in-situ measures, reducing field effort in logistically challenging environments. Additionally, remote sensed data provide more flexibility in defining appropriate spatial scales, which we show to impact estimates of wildlife-habitat associations. PMID:26596779
NASA Astrophysics Data System (ADS)
Takahashi, Tadayuki; Mitsuda, Kazuhisa; Kelley, Richard; Aarts, Henri; Aharonian, Felix; Akamatsu, Hiroki; Akimoto, Fumie; Allen, Steve; Anabuki, Naohisa; Angelini, Lorella; Arnaud, Keith; Asai, Makoto; Audard, Marc; Awaki, Hisamitsu; Azzarello, Philipp; Baluta, Chris; Bamba, Aya; Bando, Nobutaka; Bautz, Mark; Blandford, Roger; Boyce, Kevin; Brown, Greg; Cackett, Ed; Chernyakova, Mara; Coppi, Paolo; Costantini, Elisa; de Plaa, Jelle; den Herder, Jan-Willem; DiPirro, Michael; Done, Chris; Dotani, Tadayasu; Doty, John; Ebisawa, Ken; Eckart, Megan; Enoto, Teruaki; Ezoe, Yuichiro; Fabian, Andrew; Ferrigno, Carlo; Foster, Adam; Fujimoto, Ryuichi; Fukazawa, Yasushi; Funk, Stefan; Furuzawa, Akihiro; Galeazzi, Massimiliano; Gallo, Luigi; Gandhi, Poshak; Gendreau, Keith; Gilmore, Kirk; Haas, Daniel; Haba, Yoshito; Hamaguchi, Kenji; Hatsukade, Isamu; Hayashi, Takayuki; Hayashida, Kiyoshi; Hiraga, Junko; Hirose, Kazuyuki; Hornschemeier, Ann; Hoshino, Akio; Hughes, John; Hwang, Una; Iizuka, Ryo; Inoue, Yoshiyuki; Ishibashi, Kazunori; Ishida, Manabu; Ishimura, Kosei; Ishisaki, Yoshitaka; Ito, Masayuki; Iwata, Naoko; Iyomoto, Naoko; Kaastra, Jelle; Kallman, Timothy; Kamae, Tuneyoshi; Kataoka, Jun; Katsuda, Satoru; Kawahara, Hajime; Kawaharada, Madoka; Kawai, Nobuyuki; Kawasaki, Shigeo; Khangaluyan, Dmitry; Kilbourne, Caroline; Kimura, Masashi; Kinugasa, Kenzo; Kitamoto, Shunji; Kitayama, Tetsu; Kohmura, Takayoshi; Kokubun, Motohide; Kosaka, Tatsuro; Koujelev, Alex; Koyama, Katsuji; Krimm, Hans; Kubota, Aya; Kunieda, Hideyo; LaMassa, Stephanie; Laurent, Philippe; Lebrun, Francois; Leutenegger, Maurice; Limousin, Olivier; Loewenstein, Michael; Long, Knox; Lumb, David; Madejski, Grzegorz; Maeda, Yoshitomo; Makishima, Kazuo; Marchand, Genevieve; Markevitch, Maxim; Matsumoto, Hironori; Matsushita, Kyoko; McCammon, Dan; McNamara, Brian; Miller, Jon; Miller, Eric; Mineshige, Shin; Minesugi, Kenji; Mitsuishi, Ikuyuki; Miyazawa, Takuya; Mizuno, Tsunefumi; Mori, Hideyuki; Mori, Koji; Mukai, Koji; Murakami, Toshio; Murakami, Hiroshi; Mushotzky, Richard; Nagano, Hosei; Nagino, Ryo; Nakagawa, Takao; Nakajima, Hiroshi; Nakamori, Takeshi; Nakazawa, Kazuhiro; Namba, Yoshiharu; Natsukari, Chikara; Nishioka, Yusuke; Nobukawa, Masayoshi; Nomachi, Masaharu; O'Dell, Steve; Odaka, Hirokazu; Ogawa, Hiroyuki; Ogawa, Mina; Ogi, Keiji; Ohashi, Takaya; Ohno, Masanori; Ohta, Masayuki; Okajima, Takashi; Okamoto, Atsushi; Okazaki, Tsuyoshi; Ota, Naomi; Ozaki, Masanobu; Paerels, Fritzs; Paltani, Stéphane; Parmar, Arvind; Petre, Robert; Pohl, Martin; Porter, F. 
Scott; Ramsey, Brian; Reis, Rubens; Reynolds, Christopher; Russell, Helen; Safi-Harb, Samar; Sakai, Shin-ichiro; Sameshima, Hiroaki; Sanders, Jeremy; Sato, Goro; Sato, Rie; Sato, Yohichi; Sato, Kosuke; Sawada, Makoto; Serlemitsos, Peter; Seta, Hiromi; Shibano, Yasuko; Shida, Maki; Shimada, Takanobu; Shinozaki, Keisuke; Shirron, Peter; Simionescu, Aurora; Simmons, Cynthia; Smith, Randall; Sneiderman, Gary; Soong, Yang; Stawarz, Lukasz; Sugawara, Yasuharu; Sugita, Hiroyuki; Sugita, Satoshi; Szymkowiak, Andrew; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takeda, Shin-ichiro; Takei, Yoh; Tamagawa, Toru; Tamura, Takayuki; Tamura, Keisuke; Tanaka, Takaaki; Tanaka, Yasuo; Tashiro, Makoto; Tawara, Yuzuru; Terada, Yukikatsu; Terashima, Yuichi; Tombesi, Francesco; Tomida, Hiroshi; Tsuboi, Yohko; Tsujimoto, Masahiro; Tsunemi, Hiroshi; Tsuru, Takeshi; Uchida, Hiroyuki; Uchiyama, Yasunobu; Uchiyama, Hideki; Ueda, Yoshihiro; Ueno, Shiro; Uno, Shinichiro; Urry, Meg; Ursino, Eugenio; de Vries, Cor; Wada, Atsushi; Watanabe, Shin; Werner, Norbert; White, Nicholas; Yamada, Takahiro; Yamada, Shinya; Yamaguchi, Hiroya; Yamasaki, Noriko; Yamauchi, Shigeo; Yamauchi, Makoto; Yatsu, Yoichi; Yonetoku, Daisuke; Yoshida, Atsumasa; Yuasa, Takayuki
2012-09-01
The joint JAXA/NASA ASTRO-H mission is the sixth in a series of highly successful X-ray missions initiated by the Institute of Space and Astronautical Science (ISAS). ASTRO-H will investigate the physics of the high-energy universe via a suite of four instruments, covering a very wide energy range, from 0.3 keV to 600 keV. These instruments include a high-resolution, high-throughput spectrometer sensitive over 0.3-12 keV with high spectral resolution of ΔE ≦ 7 eV, enabled by a micro-calorimeter array located in the focal plane of thin-foil X-ray optics; hard X-ray imaging spectrometers covering 5-80 keV, located in the focal plane of multilayer-coated, focusing hard X-ray mirrors; a wide-field imaging spectrometer sensitive over 0.4-12 keV, with an X-ray CCD camera in the focal plane of a soft X-ray telescope; and a non-focusing Compton-camera type soft gamma-ray detector, sensitive in the 40-600 keV band. The simultaneous broad bandpass, coupled with high spectral resolution, will enable the pursuit of a wide variety of important science themes.
NASA Technical Reports Server (NTRS)
1978-01-01
The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.
640 x 480 MWIR and LWIR camera system developments
NASA Astrophysics Data System (ADS)
Tower, John R.; Villani, Thomas S.; Esposito, Benjamin J.; Gilmartin, Harvey R.; Levine, Peter A.; Coyle, Peter J.; Davis, Timothy J.; Shallcross, Frank V.; Sauer, Donald J.; Meyerhofer, Dietrich
1993-01-01
The performance of a 640 x 480 PtSi, 3-5 micron (MWIR), Stirling-cooled camera system with a minimum resolvable temperature of 0.03 is considered. A preliminary specification of a full-TV-resolution PtSi radiometer was developed using the measured performance characteristics of the Stirling-cooled camera. The radiometer is capable of imaging rapid thermal transients from 25 to 250 C with better than 1 percent temperature resolution. This performance is achieved using the electronic exposure control capability of the MOS focal plane array (FPA). A liquid-nitrogen-cooled camera with an eight-position filter wheel has been developed using the 640 x 480 PtSi FPA. Low thermal mass packaging for the FPA was developed for Joule-Thomson applications.
Note: Simple hysteresis parameter inspector for camera module with liquid lens
NASA Astrophysics Data System (ADS)
Chen, Po-Jui; Liao, Tai-Shan; Hwang, Chi-Hung
2010-05-01
A method to inspect the hysteresis parameter is presented in this article. The hysteresis of the whole camera module with a liquid lens can be measured, rather than that of a single lens only. Because the variation in focal length influences image quality, we propose using the sharpness of images captured by the camera module for hysteresis evaluation. Experiments reveal that the profile of the sharpness hysteresis corresponds to the contact-angle characteristic of the liquid lens. It can therefore be inferred that the hysteresis of the camera module is induced by the contact angle of the liquid lens. An inspection takes only 20 s to complete. Compared with other instruments, this inspection method is thus better suited to integration into mass production lines for online quality assurance.
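The sharpness measure itself is not specified in the abstract; a common choice, shown below as a hedged OpenCV sketch, is the variance of the Laplacian, which rises and falls with focus quality and can therefore trace the hysteresis loop as the lens drive signal is swept up and down.

    import cv2

    # Illustrative focus/sharpness metric: variance of the Laplacian.
    def sharpness(image_path: str) -> float:
        img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        if img is None:
            raise FileNotFoundError(image_path)
        return float(cv2.Laplacian(img, cv2.CV_64F).var())

    # Sweeping the liquid-lens drive signal up and then down while recording
    # sharpness(frame) traces out the hysteresis profile of the whole module.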
Phenomenology of a Water Venting in Low Earth Orbit
1992-01-01
… of the transport of outgas, the interaction of the vehicle with the ionospheric plasma, the energy balance of cometary material, and the uses of … [Fig. 8(a): equi-photocurrent plot of the water trail from the aft; axis: distance along profile (pixels)] … distance of the onboard camera's short focal-length lens. [Section: ONBOARD-CAMERA IMAGES] … their corresponding mean irradiance at the focal plane is …
NASA Astrophysics Data System (ADS)
Lyuty, V. M.; Abdullayev, B. I.; Alekberov, I. A.; Gulmaliyev, N. I.; Mikayilov, Kh. M.; Rustamov, B. N.
2009-12-01
A short description of the optical and electrical scheme of the CCD photometer with the U-47 camera installed at the Cassegrain focus of the ZEISS-600 telescope of the ShAO NAS Azerbaijan is provided. A focal reducer with a reduction factor of 1.7 is applied. The equivalent focal distances of the telescope with the focal reducer are calculated. General calculations of the optimum distance from the focal plane and of the sizes of the optical filters of the photometer are presented.
The LSST Camera 500 watt -130 degC Mixed Refrigerant Cooling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowden, Gordon B.; Langton, Brian J.; /SLAC
2014-05-28
The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 gigapixels) and its close-coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies are considered for this telescope/camera environment. MMR-Technology’s Mixed Refrigerant technology was chosen. A collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported. Current and future stages of the development program are described.
First results on video meteors from Crete, Greece
NASA Astrophysics Data System (ADS)
Maravelias, G.
2012-01-01
This work presents the first systematic video meteor observations from a forthcoming permanent station in Crete, Greece, operating as the first official node within the International Meteor Organization's Video Network. It consists of a Watec 902 H2 Ultimate camera equipped with a Panasonic WV-LA1208 lens (focal length 12 mm, f/0.8) running MetRec. The system operated for 42 nights during 2011 (August 19-December 30, 2011), recording 1905 meteors. It performs significantly better than a previous system used by the author during the 2010 Perseids (DMK 21AF04.AS camera by The Imaging Source, CCTV lens of focal length 2.8 mm, UFO Capture v2.22), which operated for 17 nights (August 4-22, 2010), recording 32 meteors. Differences between the two software packages (MetRec, UFO Capture), according to the author's experience, are discussed along with a short guide to video meteor hardware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katsanos, Konstantinos, E-mail: katsanos@med.upatras.gr; Spiliopoulos, Stavros; Diamantopoulos, Athanasios
2013-06-15
Introduction: Drug-eluting stents (DES) have been proposed for the treatment of infrapopliteal arterial disease. We performed a systematic review to provide a qualitative analysis and quantitative data synthesis of randomized controlled trials (RCTs) assessing infrapopliteal DES. Materials and Methods: PubMed (Medline), EMBASE (Excerpta Medical Database), AMED (Allied and Complementary Medicine Database), Scopus, CENTRAL (Cochrane Central Register of Controlled Trials), online content, and abstract meetings were searched in September 2012 for eligible RCTs according to the preferred reporting items for systematic reviews and meta-analyses selection process. Risk of bias was assessed using the Cochrane Collaboration's tool. The primary endpoint was primary patency, defined as absence of ≥50 % vessel restenosis at 1 year. Secondary outcome measures included patient survival, limb amputations, change of Rutherford-Becker class, target lesion revascularization (TLR) events, complete wound healing, and event-free survival at 1 year. Risk ratios (RRs) were calculated using the Mantel-Haenszel fixed effects model, and number-needed-to-treat values are reported. Results: Three RCTs involving 501 patients with focal infrapopliteal lesions were analyzed (YUKON-BTX, DESTINY, and ACHILLES trials). All three RCTs included relatively short and focal infrapopliteal lesions. At 1 year, there was clear superiority of infrapopliteal DES compared with control treatments in terms of significantly higher primary patency (80.0 vs. 58.5 %; pooled RR = 1.37, 95 % confidence interval [CI] = 1.18-1.58, p < 0.0001; number-needed-to-treat (NNT) value = 4.8), improvement of Rutherford-Becker class (79.0 vs. 69.6 %; pooled RR = 1.13, 95 % CI = 1.002-1.275, p = 0.045; NNT = 11.1), decreased TLR events (9.9 vs. 22.0 %; pooled RR = 0.45, 95 % CI = 0.28-0.73, p = 0.001; NNT = 8.3), improved wound healing (76.8 vs. 59.7 %; pooled RR = 1.29, 95 % CI = 1.02-1.62, p = 0.04; NNT = 5.9), and better overall event-free survival (72.2 vs. 57.3 %; pooled RR = 1.26, 95 % CI = 1.10-1.44, p = 0.0006; NNT = 6.7). Conclusion: DES for focal infrapopliteal lesions significantly inhibit vascular restenosis and thereby improve primary patency, decrease repeat procedures, improve wound healing, and prolong overall event-free survival.
The development of large-aperture test system of infrared camera and visible CCD camera
NASA Astrophysics Data System (ADS)
Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying
2015-10-01
Infrared camera and visible CCD camera dual-band imaging systems are widely used in many types of equipment and applications. If such a system is tested using the traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces the cost and the time of installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator focal position when the environmental temperature changes, improving both the image quality of the large-field-of-view collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost. It will have a good market.
A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera
NASA Astrophysics Data System (ADS)
Kroedel, Matthias; Langton, J. Brian; Wahl, Bill
2017-09-01
This paper presents the ceramic design, fabrication and metrology results, and assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of the HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" fabrication supporting individual raft plates mounting sensor assemblies by way of a rigid kinematic support system, to meet extremely stringent requirements for focal plane planarity and stability.
NASA Astrophysics Data System (ADS)
Aguilar, J. A.; Basili, A.; Boccone, V.; Cadoux, F.; Christov, A.; della Volpe, D.; Montaruli, T.; Płatos, Ł.; Rameez, M.
2015-01-01
The focal-plane cameras of γ-ray telescopes frequently use light concentrators in front of the light sensors. The purpose of these concentrators is to increase the effective area of the camera as well as to reduce the stray light coming in at large incident angles. These light concentrators are usually based on the Winston cone design. In this contribution we present the design of a hexagonal hollow light concentrator with a lateral profile optimized using a cubic Bézier function to achieve a higher collection efficiency in the angular region of interest. The design presented here is optimized for a Davies-Cotton telescope with a primary mirror of about 4 m in diameter and a focal length of 5.6 m. The described concentrators are part of an innovative camera made up of silicon-photomultiplier sensors, although a similar approach can be used for other sizes of single-mirror telescopes with different camera sensors, including photomultipliers. The challenge of our approach is to achieve a cost-effective design suitable for standard industrial production of both the plastic concentrator substrate and the reflective coating, while maximizing the optical performance. In this paper we also describe the optical set-up used to measure the absolute collection efficiency of the light concentrators and demonstrate our good understanding of the measured data using a professional ray-tracing simulation.
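For illustration, the sketch below evaluates a concentrator wall profile defined by a cubic Bézier curve, in the spirit of the lateral-profile parameterization described above. The four control points are hypothetical; in the actual design they would be tuned by ray tracing to maximize collection efficiency over the angular region of interest.

```python
import numpy as np

# Minimal sketch of a lateral concentrator profile defined by a cubic Bezier
# curve.  The four control points are hypothetical assumptions.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameters t in [0, 1]."""
    t = np.asarray(t)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Control points (z along the cone axis, r the lateral half-width), in mm.
p0, p1, p2, p3 = map(np.array, ([0.0, 12.0], [8.0, 11.0], [16.0, 8.0], [22.0, 6.0]))
profile = cubic_bezier(p0, p1, p2, p3, np.linspace(0.0, 1.0, 50))
print(profile[:3])   # first few (z, r) points of the lateral wall profile
```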
NASA Astrophysics Data System (ADS)
Chatterjee, Abhijit; Verma, Anurag
2016-05-01
The Advanced Wide Field Sensor (AWiFS) camera caters to the high temporal resolution requirement of the Resourcesat-2A mission, with a repeat cycle of 5 days. The AWiFS camera consists of four spectral bands, three in the visible and near IR and one in the short-wave infrared. The imaging concept in the VNIR bands is based on push-broom scanning that uses a linear-array silicon charge-coupled device (CCD) based Focal Plane Array (FPA). An on-board calibration unit for these CCD-based FPAs is used to monitor any degradation of the FPA during the entire mission life. Four LEDs are operated in constant-current mode, and 16 different light intensity levels are generated by electronically changing the exposure of the CCD throughout the calibration cycle. This paper describes the experimental setup and characterization results of various flight-model visible LEDs (λp = 650 nm) for development of the on-board calibration unit of the Advanced Wide Field Sensor (AWiFS) camera of RESOURCESAT-2A. Various LED configurations have been studied to cover the dynamic range of the 6000-pixel silicon CCD based focal plane array from 20% to 60% of saturation during the night pass of the satellite, in order to identify degradation of detector elements. The paper also compares simulation and experimental results for the CCD output profile at different LED combinations in constant-current mode.
Petroll, W M; Jester, J V; Cavanagh, H D
1996-01-01
A new depth encoding system (DES) is presented, which makes it possible to calculate, display, and record the z-axis position continuously during in vivo imaging using tandem scanning confocal microscopy (TSCM). In order to verify the accuracy of the DES for calculating the position of the focal plane in the cornea both in vitro and in vivo, we compared TSCM measurements of corneal thickness to measurements made using an ultrasonic pachymeter (UP, a standard clinical instrument) in enucleated rabbit, cat, and human eyes (n = 15), and in human patients (n = 7). Very close agreement was found between the UP and TSCM measurements in enucleated eyes; the mean percent difference was 0.50 +/- 2.58% (mean +/- SD, not significant). A significant correlation (R = 0.995, n = 15, p < 0.01) was found between UP and TSCM measurements. These results verify that the theoretical equation for calculating focal depth provided by the TSCM manufacturer is accurate for corneal imaging. Similarly, close agreement was found between the in vivo UP and TSCM measurements; the mean percent difference was 1.67 +/- 1.38% (not significant), confirming that z-axis drift can be minimized with proper applanation of the objective. These results confirm the accuracy of the DES for imaging of the cornea both ex vivo and in vivo. This system should be of great utility for applications where quantitation of the three-dimensional location of cellular structures is needed.
Recent developments for the Large Binocular Telescope Guiding Control Subsystem
NASA Astrophysics Data System (ADS)
Golota, T.; De La Peña, M. D.; Biddick, C.; Lesser, M.; Leibold, T.; Miller, D.; Meeks, R.; Hahn, T.; Storm, J.; Sargent, T.; Summers, D.; Hill, J.; Kraus, J.; Hooper, S.; Fisher, D.
2014-07-01
The Large Binocular Telescope (LBT) has eight Acquisition, Guiding, and wavefront Sensing Units (AGw units). They provide guiding and wavefront sensing capability at eight different locations at both direct and bent Gregorian focal stations. The recent addition of focal stations for the PEPSI and MODS instruments doubled the number of focal stations in use, together with the respective motion and camera-controller server computers and the software infrastructure communicating with the Guiding Control Subsystem (GCS). This paper describes the improvements made to the LBT GCS and explains how these changes have led to better maintainability and contributed to increased reliability. It also discusses the current GCS status and reviews potential upgrades to further improve its performance.
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.
2015-10-01
Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method based entirely on pedestrian detections in surveillance video from multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first is intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length and camera height, important for the conversion from pixels to meters and vice versa. The second is inter-camera topology inference, which leads to an estimate of the distance between cameras, important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
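As a minimal illustration of why the intra-camera quantities above (tilt angle, focal length, camera height) enable a pixels-to-meters conversion, the sketch below applies a generic pinhole, flat-ground model. It is not the authors' estimation pipeline, and the parameter values are assumptions.

```python
import math

# Minimal sketch of pixel-to-metre conversion for a camera at known height,
# tilted down over a flat ground plane (generic pinhole model, illustrative
# parameters only).

def ground_distance(y_px, f_px, tilt_rad, cam_height_m):
    """Distance on the ground plane to the point imaged y_px pixels below the
    principal point, for a camera tilted down by tilt_rad above flat ground."""
    angle_below_horizon = tilt_rad + math.atan2(y_px, f_px)
    if angle_below_horizon <= 0:
        raise ValueError("ray does not intersect the ground plane")
    return cam_height_m / math.tan(angle_below_horizon)

# Example: 1000 px focal length, 10 degree tilt, camera mounted at 4 m.
print(ground_distance(y_px=150, f_px=1000.0, tilt_rad=math.radians(10), cam_height_m=4.0))
```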
Status of MUSIC, the MUltiwavelength Sub/millimeter Inductance Camera
NASA Astrophysics Data System (ADS)
Golwala, Sunil R.; Bockstiegel, Clint; Brugger, Spencer; Czakon, Nicole G.; Day, Peter K.; Downes, Thomas P.; Duan, Ran; Gao, Jiansong; Gill, Amandeep K.; Glenn, Jason; Hollister, Matthew I.; LeDuc, Henry G.; Maloney, Philip R.; Mazin, Benjamin A.; McHugh, Sean G.; Miller, David; Noroozian, Omid; Nguyen, Hien T.; Sayers, Jack; Schlaerth, James A.; Siegel, Seth; Vayonakis, Anastasios K.; Wilson, Philip R.; Zmuidzinas, Jonas
2012-09-01
We present the status of MUSIC, the MUltiwavelength Sub/millimeter Inductance Camera, a new instrument for the Caltech Submillimeter Observatory. MUSIC is designed to have a 14' diffraction-limited field-of-view instrumented with 2304 detectors in 576 spatial pixels and four spectral bands at 0.87, 1.04, 1.33, and 1.98 mm. MUSIC will be used to study dusty star-forming galaxies, galaxy clusters via the Sunyaev-Zeldovich effect, and star formation in our own and nearby galaxies. MUSIC uses broadband superconducting phased-array slot-dipole antennas to form beams, lumped-element on-chip bandpass filters to define spectral bands, and microwave kinetic inductance detectors to sense incoming light. The focal plane is fabricated in 8 tiles consisting of 72 spatial pixels each. It is coupled to the telescope via an ambient-temperature ellipsoidal mirror and a cold reimaging lens. A cold Lyot stop sits at the image of the primary mirror formed by the ellipsoidal mirror. Dielectric and metal-mesh filters are used to block thermal infrared and out-of-band radiation. The instrument uses a pulse tube cooler and 3He/3He/4He closed-cycle cooler to cool the focal plane to below 250 mK. A multilayer shield attenuates Earth's magnetic field. Each focal plane tile is read out by a single pair of coaxes and a HEMT amplifier. The readout system consists of 16 copies of custom-designed ADC/DAC and IF boards coupled to the CASPER ROACH platform. We focus on recent updates on the instrument design and results from the commissioning of the full camera in 2012.
Brown, David M; Juarez, Juan C; Brown, Andrea M
2013-12-01
A laser differential image-motion monitor (DIMM) system was designed and constructed as part of a turbulence characterization suite during the DARPA free-space optical experimental network experiment (FOENEX) program. The developed link measurement system measures the atmospheric coherence length (r0), atmospheric scintillation, and power in the bucket for the 1550 nm band. DIMM measurements are made with two separate apertures coupled to a single InGaAs camera. The angle of arrival (AoA) for the wavefront at each aperture can be calculated based on focal spot movements imaged by the camera. By utilizing a single camera for the simultaneous measurement of the focal spots, the correlation of the variance in the AoA allows a straightforward computation of r0 as in traditional DIMM systems. Standard measurements of scintillation and power in the bucket are made with the same apertures by redirecting a percentage of the incoming signals to InGaAs detectors integrated with logarithmic amplifiers for high sensitivity and high dynamic range. By leveraging two small apertures, the instrument forms a configuration of small size and weight for mounting to actively tracking laser communication terminals for characterizing link performance.
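The angle-of-arrival computation described above reduces to simple geometry: the wavefront tilt over an aperture is the focal-spot displacement divided by the focal length. The sketch below computes per-aperture AoA and the differential AoA variance between the two apertures, which standard DIMM theory relates to r0 (roughly r0 proportional to variance^(-3/5)); the exact constants, which depend on aperture diameter, separation, and wavelength, are omitted, and the input numbers are synthetic.

```python
import numpy as np

# Minimal sketch of the DIMM angle-of-arrival (AoA) computation:
# AoA ~ (focal-spot displacement) / (focal length).  The differential AoA
# variance between the two apertures is the quantity standard DIMM theory
# relates to r0; the proportionality constants are omitted here.

def angle_of_arrival(spot_px, pixel_pitch_m, focal_length_m):
    """Convert focal-spot centroids (N x 2, pixels) to AoA in radians."""
    return np.asarray(spot_px) * pixel_pitch_m / focal_length_m

def differential_aoa_variance(spots_a_px, spots_b_px, pixel_pitch_m, focal_length_m):
    """Per-axis variance of the AoA difference between the two DIMM apertures."""
    aoa_a = angle_of_arrival(spots_a_px, pixel_pitch_m, focal_length_m)
    aoa_b = angle_of_arrival(spots_b_px, pixel_pitch_m, focal_length_m)
    return np.var(aoa_a - aoa_b, axis=0)

# Example with synthetic centroid jitter (pixels), 20 um pixels, 1 m focal length.
rng = np.random.default_rng(1)
a = rng.normal(0.0, 0.3, size=(500, 2)); b = rng.normal(0.0, 0.3, size=(500, 2))
print(differential_aoa_variance(a, b, 20e-6, 1.0))
```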
Spatial calibration of an optical see-through head mounted display
Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew
2010-01-01
We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125
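For context, the photogrammetric core that such a procedure builds on is standard camera calibration from 3-D/2-D correspondences. The sketch below uses OpenCV's calibrateCamera on a synthetic planar target projected through a known camera so the recovered intrinsics can be checked; it does not reproduce the HMD-specific step of relating the camera to features drawn in the HMD display, and the grid, view count, and camera parameters are illustrative assumptions.

```python
import numpy as np
import cv2

# Minimal sketch of intrinsic calibration (focal length, optic centre) from
# views of a known target.  A synthetic planar grid is projected through a
# known camera so the recovery can be checked; all values are illustrative.

# Planar calibration target: 9 x 6 grid of points, 30 mm spacing, z = 0.
grid = np.array([[x, y, 0.0] for y in range(6) for x in range(9)], np.float32) * 30.0
K_true = np.array([[800.0, 0.0, 640.0], [0.0, 800.0, 512.0], [0.0, 0.0, 1.0]])

object_points, image_points = [], []
rng = np.random.default_rng(2)
for _ in range(8):                                   # 8 simulated views
    rvec = rng.normal(0.0, 0.2, 3)                   # small random rotation
    tvec = np.array([-120.0, -80.0, 600.0]) + rng.normal(0.0, 20.0, 3)
    proj, _ = cv2.projectPoints(grid, rvec, tvec, K_true, None)
    object_points.append(grid)
    image_points.append(proj.reshape(-1, 2).astype(np.float32))

rms, K_est, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, (1280, 1024), None, None)
print("re-projection RMS (px):", rms)
print("recovered focal lengths and optic centre:\n", K_est)
```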
Miniature Wide-Angle Lens for Small-Pixel Electronic Camera
NASA Technical Reports Server (NTRS)
Mouroulils, Pantazis; Blazejewski, Edward
2009-01-01
A proposed wide-angle lens is shown that would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The lens example shown has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also shown.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Rick Wetherington checks out one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen works on the recently acquired Contraves-Goerz Kineto Tracking Mount (KTM). Trailer-mounted with a center console/seat and electric drive tracking mount, the KTM includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff. There are 10 KTMs certified for use on the Eastern Range.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen checks out one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
Optical registration of spaceborne low light remote sensing camera
NASA Astrophysics Data System (ADS)
Li, Chong-yang; Hao, Yan-hui; Xu, Peng-mei; Wang, Dong-jie; Ma, Li-na; Zhao, Ying-long
2018-02-01
To meet the high-precision requirement for optical registration of a spaceborne low-light remote sensing camera, dual-channel optical registration of the CCD and EMCCD is achieved with a high-magnification optical registration system. This paper proposes a system-integration optical registration scheme, and an accuracy analysis of that scheme, for a spaceborne low-light remote sensing camera with short focal depth and wide field of view; it also includes an analysis of the parallel misalignment of the CCD and of the registration accuracy. Actual registration results show that the imaging is sharp and that the MTF and registration accuracy meet the requirements, providing an important guarantee for obtaining high-quality image data in orbit.
C-RED one: ultra-high speed wavefront sensing in the infrared made possible
NASA Astrophysics Data System (ADS)
Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian
2016-07-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, a genuinely disruptive technology in imaging. We show the performance of the camera and its main features, and compare them to other high-performance wavefront sensing cameras such as OCAM2 in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
NASA Technical Reports Server (NTRS)
Lane, Marc; Hsieh, Cheng; Adams, Lloyd
1989-01-01
In undertaking the design of a 2000-mm focal length camera for the Mariner Mark II series of spacecraft, JPL sought novel materials with the requisite dimensional and thermal stability, outgassing and corrosion resistance, low mass, high stiffness, and moderate cost. Metal-matrix composites and Al-Li alloys have, in addition to excellent mechanical properties and low density, a suitably low coefficient of thermal expansion, high specific stiffness, and good electrical conductivity. The greatest single obstacle to application of these materials to camera structure design is noted to have been the lack of information regarding long-term dimensional stability.
The Wide Field Imager instrument for Athena
NASA Astrophysics Data System (ADS)
Meidinger, Norbert; Barbera, Marco; Emberger, Valentin; Fürmetz, Maria; Manhart, Markus; Müller-Seidlitz, Johannes; Nandra, Kirpal; Plattner, Markus; Rau, Arne; Treberspurg, Wolfgang
2017-08-01
ESA's next large X-ray mission ATHENA is designed to address the Cosmic Vision science theme 'The Hot and Energetic Universe'. It will provide answers to two key astrophysical questions: how does ordinary matter assemble into the large-scale structures we see today, and how do black holes grow and shape the Universe? The ATHENA spacecraft will be equipped with two focal plane cameras, a Wide Field Imager (WFI) and an X-ray Integral Field Unit (X-IFU). The WFI instrument is optimized for state-of-the-art resolution spectroscopy over a large field of view of 40 amin x 40 amin and high count rates up to and beyond 1 Crab source intensity. The cryogenic X-IFU camera is designed for high-spectral-resolution imaging. The two cameras alternately share a mirror system based on silicon pore optics with a focal length of 12 m and a large effective area of about 2 m2 at an energy of 1 keV. Although the mission is still in phase A, i.e. studying the feasibility and developing the necessary technology, the definition and development of the instrumentation have already made significant progress. The WFI focal plane camera described here covers the energy band from 0.2 keV to 15 keV with 450 μm thick, fully depleted, back-illuminated silicon active pixel sensors of DEPFET type. The spatial resolution will be provided by one million pixels, each with a size of 130 μm x 130 μm. The time resolution requirement for the WFI large detector array is 5 ms and for the WFI fast detector 80 μs. The large effective area of the mirror system will be complemented by a high quantum efficiency above 90% for medium and higher energies. The status of the various WFI subsystems needed to achieve this performance is described, and recent changes are explained here.
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele
2011-06-01
Recent advances in micro-optical element fabrication using gray-scale technology have opened up the opportunity to create simultaneous multi-spectral imaging with fine-structure diffractive lenses. This paper discusses an approach that uses diffractive optical lenses configured in an array (lenslet array) and placed in close proximity to the focal plane array, which enables a small, compact, simultaneous multispectral imaging camera [1]. The lenslet array is designed so that all lenslets have a common focal length, with each lenslet tuned for a different wavelength. The number of simultaneous spectral images is determined by the number of individually configured lenslets in the array. The number of spectral images can be increased by a factor of 2 when the array is used with a dual-band focal plane array (MWIR/LWIR) by exploiting multiple diffraction orders. In addition, modulation of the focal length of the lenslet array with piezoelectric actuation will enable spectral bin fill-in, allowing additional spectral coverage while giving up simultaneity. Different lenslet array spectral imaging concept designs are presented in this paper, along with a unique concept for prefiltering the radiation focused on the detector. This approach to spectral imaging has applications in the detection of chemical agents in both aerosolized form and as a liquid on a surface. It can also be applied to the detection of weaponized biological agents and IEDs in various forms, from manufacturing to deployment, and to post-detection forensic analysis.
Stereo depth distortions in teleoperation
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Vonsydow, Marika
1988-01-01
In teleoperation, a typical application of stereo vision is to view a work space located a short distance (1 to 3 m) in front of the cameras. The work presented here treats converged camera placement and studies the effects of intercamera distance, camera-to-object viewing distance, and focal length of the camera lenses on both stereo depth resolution and stereo depth distortion. While viewing the fronto-parallel plane 1.4 m in front of the cameras, depth errors on the order of 2 cm are measured. A geometric analysis was made of the distortion of the fronto-parallel plane of divergence for stereo TV viewing. The results of the analysis were then verified experimentally. The objective was to determine the optimal camera configuration that gives high stereo depth resolution while minimizing stereo depth distortion. It is found that for converged cameras at a fixed camera-to-object viewing distance, larger intercamera distances allow higher depth resolutions but cause greater depth distortions. Thus, with larger intercamera distances, operators will make greater depth errors (because of the greater distortions) but will be more certain that they are not errors (because of the higher resolution).
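As a reminder of the underlying geometry, the sketch below triangulates a target from two converged cameras given the intercamera distance, the convergence distance, and the image-plane angles measured in each camera. The parameter values are illustrative, not those of the experiment, and the simple model ignores lens distortion.

```python
import math

# Minimal sketch of triangulation with two converged (toed-in) cameras.
# Angles are measured positive to the right; parameter values are illustrative.

def target_position(alpha_left, alpha_right, baseline_m, convergence_dist_m):
    """alpha_*: image-plane angles of the target from each camera's optical
    axis (radians).  Returns (x, depth) in metres, origin midway between cameras."""
    toe_in = math.atan2(baseline_m / 2.0, convergence_dist_m)
    theta_l = toe_in + alpha_left        # bearing of target from left camera
    theta_r = -toe_in + alpha_right      # bearing of target from right camera
    depth = baseline_m / (math.tan(theta_l) - math.tan(theta_r))
    x = depth * math.tan(theta_l) - baseline_m / 2.0
    return x, depth

# A point on the convergence axis should come back at (0, D):
print(target_position(0.0, 0.0, baseline_m=0.2, convergence_dist_m=1.4))
```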
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
An Unusual View: MISR sees the Moon
2017-08-17
The job of the Multiangle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite is to view Earth. For more than 17 years, its nine cameras have stared downward 24 hours a day, faithfully collecting images used to study Earth's surface and atmosphere. On August 5, however, MISR captured some very unusual data as the Terra satellite performed a backflip in space. This maneuver was performed to allow MISR and the other instruments on Terra to catch a glimpse of the Moon, something that has been done only once before, in 2003. Why task an elderly satellite with such a radical maneuver? Since we can be confident that the Moon's brightness has remained very constant over the mission, MISR's images of the Moon can be used as a check of the instrument's calibration, allowing an independent verification of the procedures used to correct the images for any changes the cameras have experienced over their many years in space. If changes in the cameras' responses to light aren't properly accounted for, the images captured by MISR would make it appear as if Earth were growing darker or lighter, which would throw off scientists' efforts to characterize air pollution, cloud cover and Earth's climate. Because of this, the MISR team uses several methods to calibrate the data, all of which involve imaging something with a known (or independently measured) brightness and correcting the images to match that brightness. Every month, MISR views two panels of a special material called Spectralon, which reflects sunlight in a very particular way, onboard the instrument. Periodically, this calibration is checked by a field team who measures the brightness of a flat, uniformly colored surface on Earth, usually a dry desert lakebed, as MISR flies overhead. The lunar maneuver offers a third opportunity to check the brightness calibration of MISR's images. While viewing Earth, MISR's cameras are fixed at nine different angles, with one (called An) pointed straight down, four canted forwards (Af, Bf, Cf, and Df) and four angled backwards (Aa, Ba, Ca, and Da). The A, B, C, and D cameras have different focal lengths, with the most oblique (D) cameras having the longest focal lengths in order to preserve spatial resolution on the ground. During the lunar maneuver, however, the spacecraft rotated so that each camera saw the almost-full Moon straight on. This means that the different focal lengths produce images with different resolutions. The D cameras produce the sharpest images. These grayscale images were made with raw data from the red spectral band of each camera. Because the spacecraft is constantly rotating while these images were taken, the images are "smeared" in the vertical direction, producing an oval-shaped Moon. These have been corrected to restore the Moon to its true circular shape. https://photojournal.jpl.nasa.gov/catalog/PIA21876
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11298 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of the commander's side, or port side, of Atlantis' cabin. Distance between the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-118 R-Bar Pitch Maneuver
2007-08-10
ISS015-E-21344 (10 Aug. 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Endeavour as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows the nose cone of Endeavour and surrounding area. Distance between the station and shuttle at this time was approximately 600 feet.
Imaging spectroscopy using embedded diffractive optical arrays
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford
2017-09-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera based on diffractive optic arrays. This approach to hyperspectral imaging has been demonstrated in all three infrared bands: SWIR, MWIR and LWIR. The hyperspectral optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of this infrared hyperspectral sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics that are made up of an area array of diffractive optical elements where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our optical-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough for a payload on a small satellite, mini-UAV, commercial quadcopter, or man-portable system. We also present an application in which this spectral imaging technology is used to quantify the mass and volume flow rates of hydrocarbon gases. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. The detector array is divided into sub-images covered by each lenslet. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the number of different spectral images collected simultaneously in each frame of the camera. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame. This system spans the SWIR and MWIR bands with a single optical array and focal plane array.
Camera Concepts for the Advanced Gamma-Ray Imaging System (AGIS)
NASA Astrophysics Data System (ADS)
Nepomuk Otte, Adam
2009-05-01
The Advanced Gamma-Ray Imaging System (AGIS) is a concept for the next-generation observatory in ground-based very high energy gamma-ray astronomy. Design goals are ten times better sensitivity, higher angular resolution, and a lower energy threshold than existing Cherenkov telescopes. Each telescope is equipped with a camera that detects and records the Cherenkov-light flashes from air showers. The camera comprises a pixelated focal plane of blue-sensitive and fast (nanosecond) photon detectors that detect the photon signal and convert it into an electrical one. The incorporation of trigger electronics and signal digitization into the camera is under study. Given the size of AGIS, the camera must be reliable, robust, and cost effective. We are investigating several directions, including innovative technologies such as Geiger-mode avalanche photodiodes as a possible detector and switched capacitor arrays for the digitization.
Rogers, B.T. Jr.; Davis, W.C.
1957-12-17
This patent relates to high-speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. This camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirror and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera with such a short resolution time becomes possible.
Personal Visual Aids for Aircrew.
1981-06-01
Fragments from the report discuss oedema or areas of pigmentation or depigmentation, such as central serous retinopathy or focal choroiditis of differing aetiology, and heat-induced lenticular changes. Fig. 11 shows Amsler grids illustrating visual distortions: (a) normal grid pattern, (b) pincushion distortion, (c) astigmatic distortion, (d) paracentral scotoma. French index terms (translated): ophthalmology, eye disease, astigmatism, cornea, prosthesis, pilots, vision, spectacles; "On flying and the correction of presbyopia."
An Integrated Optimal Estimation Approach to Spitzer Space Telescope Focal Plane Survey
NASA Technical Reports Server (NTRS)
Bayard, David S.; Kang, Bryan H.; Brugarolas, Paul B.; Boussalis, D.
2004-01-01
This paper discusses an accurate and efficient method for focal plane survey that was used for the Spitzer Space Telescope. The approach is based on using a high-order 37-state Instrument Pointing Frame (IPF) Kalman filter that combines both engineering parameters and science parameters into a single filter formulation. In this approach, engineering parameters such as pointing alignments, thermomechanical drift and gyro drifts are estimated along with science parameters such as plate scales and optical distortions. This integrated approach has many advantages compared to estimating the engineering and science parameters separately. The resulting focal plane survey approach is applicable to a diverse range of science instruments such as imaging cameras, spectroscopy slits, and scanning-type arrays alike. The paper will summarize results from applying the IPF Kalman Filter to calibrating the Spitzer Space Telescope focal plane, containing the MIPS, IRAC, and the IRS science Instrument arrays.
Optics Near the Snell Angle in a Water-to-Air Change of Medium
2007-01-01
Fragments from the report: the seawater wedge at the focus of a notional 57.3-mm lens modeled in ZEMAX [5]; the boxes are plotted in units of µm, and the lens focal length is ... earlier lenses had insufficient focal-plane coverage; the ZEMAX spot diagram of this layout is depicted in Fig. 4 and is corrected for the horizon angle ... the Fig. 9 ZEMAX layout is a two-prism design, but only one prism need be built and carried within the camera, with the forward prism being ...
Wavefront Sensing With Switched Lenses for Defocus Diversity
NASA Technical Reports Server (NTRS)
Dean, Bruce H.
2007-01-01
In an alternative hardware design for an apparatus used in image-based wavefront sensing, defocus diversity is introduced by means of fixed lenses that are mounted in a filter wheel (see figure) so that they can be alternately switched into a position in front of the focal plane of an electronic camera recording the image formed by the optical system under test. [The terms image-based, wavefront sensing, and defocus diversity are defined in the first of the three immediately preceding articles, Broadband Phase Retrieval for Image-Based Wavefront Sensing (GSC-14899-1).] Each lens in the filter wheel is designed so that the optical effect of placing it at the assigned position is equivalent to the optical effect of translating the camera a specified defocus distance along the optical axis. Heretofore, defocus diversity has been obtained by translating the imaging camera along the optical axis to various defocus positions. Because data must be taken at multiple, accurately measured defocus positions, it is necessary to mount the camera on a precise translation stage that must be calibrated for each defocus position and/or to use an optical encoder for measurement and feedback control of the defocus positions. Additional latency is introduced into the wavefront sensing process as the camera is translated to the various defocus positions. Moreover, if the optical system under test has a large focal length, the required defocus values are large, making it necessary to use a correspondingly bulky translation stage. By eliminating the need for translation of the camera, the alternative design simplifies and accelerates the wavefront-sensing process. This design is cost-effective in that the filterwheel/lens mechanism can be built from commercial catalog components. After initial calibration of the defocus value of each lens, a selected defocus value is introduced by simply rotating the filter wheel to place the corresponding lens in front of the camera. The rotation of the wheel can be automated by use of a motor drive, and further calibration is not necessary. Because a camera-translation stage is no longer needed, the size of the overall apparatus can be correspondingly reduced.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen makes adjustments on one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with a center console/seat and electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operators Rick Worthington (left) and Kenny Allen work on one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with a center console/seat and electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Kenny Allen stands in the center console area of one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric-drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operator Rick Wetherington sits in the center console seat of one of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with an electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
2004-05-19
KENNEDY SPACE CENTER, FLA. -- Johnson Controls operators Rick Wetherington (left) and Kenny Allen work on two of the recently acquired Contraves-Goerz Kineto Tracking Mounts (KTM). There are 10 KTMs certified for use on the Eastern Range. The KTM, which is trailer-mounted with a center console/seat and electric drive tracking mount, includes a two-camera, camera control unit that will be used during launches. The KTM is designed for remotely controlled operations and offers a combination of film, shuttered and high-speed digital video, and FLIR cameras configured with 20-inch to 150-inch focal length lenses. The KTMs are generally placed in the field and checked out the day before a launch and manned 3 hours prior to liftoff.
Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera
NASA Astrophysics Data System (ADS)
Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu
2016-09-01
We perform an experiment on achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under visible and infrared illumination. As a result, the removal of chromatic aberration by the WFC system was successfully demonstrated. Moreover, we fabricated a demonstration setup assuming the use of a night vision camera in an automobile and showed the effect of the WFC system.
Single-snapshot 2D color measurement by plenoptic imaging system
NASA Astrophysics Data System (ADS)
Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana
2014-03-01
Plenoptic cameras enable capture of directional light ray information, thus allowing applications such as digital refocusing, depth estimation, or multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor high color fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision of this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing and color estimation processing. Optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of displays and show that it achieves color accuracy of ΔE<0.01.
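For readers unfamiliar with the ΔE figure quoted above, the sketch below computes a color difference from two XYZ measurements using the simple CIE76 (Euclidean CIELAB) form. The paper does not state which ΔE variant it reports, so this is purely illustrative, and the D65 white point and sample values are assumptions.

```python
# Minimal sketch of a CIE76 color difference computed from XYZ tristimulus
# values; illustrative only (the paper's exact ΔE variant is not specified).

def xyz_to_lab(xyz, white=(95.047, 100.0, 108.883)):   # assumed D65 reference white
    def f(t):
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d * d) + 4.0 / 29.0
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, white))
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

def delta_e76(xyz_measured, xyz_reference):
    lab1, lab2 = xyz_to_lab(xyz_measured), xyz_to_lab(xyz_reference)
    return sum((u - v) ** 2 for u, v in zip(lab1, lab2)) ** 0.5

print(delta_e76((41.24, 21.26, 1.93), (41.25, 21.27, 1.93)))  # near-identical reds
```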
NASA Astrophysics Data System (ADS)
Torkildsen, H. E.; Hovland, H.; Opsahl, T.; Haavardsholm, T. V.; Nicolas, S.; Skauli, T.
2014-06-01
In some applications of multi- or hyperspectral imaging, it is important to have a compact sensor. The most compact spectral imaging sensors are based on spectral filtering in the focal plane. For hyperspectral imaging, it has been proposed to use a "linearly variable" bandpass filter in the focal plane, combined with scanning of the field of view. As the image of a given object in the scene moves across the field of view, it is observed through parts of the filter with varying center wavelength, and a complete spectrum can be assembled. However if the radiance received from the object varies with viewing angle, or with time, then the reconstructed spectrum will be distorted. We describe a camera design where this hyperspectral functionality is traded for multispectral imaging with better spectral integrity. Spectral distortion is minimized by using a patterned filter with 6 bands arranged close together, so that a scene object is seen by each spectral band in rapid succession and with minimal change in viewing angle. The set of 6 bands is repeated 4 times so that the spectral data can be checked for internal consistency. Still the total extent of the filter in the scan direction is small. Therefore the remainder of the image sensor can be used for conventional imaging with potential for using motion tracking and 3D reconstruction to support the spectral imaging function. We show detailed characterization of the point spread function of the camera, demonstrating the importance of such characterization as a basis for image reconstruction. A simplified image reconstruction based on feature-based image coregistration is shown to yield reasonable results. Elimination of spectral artifacts due to scene motion is demonstrated.
Multipurpose Hyperspectral Imaging System
NASA Technical Reports Server (NTRS)
Mao, Chengye; Smith, David; Lanoue, Mark A.; Poole, Gavin H.; Heitschmidt, Jerry; Martinez, Luis; Windham, William A.; Lawrence, Kurt C.; Park, Bosoon
2005-01-01
A hyperspectral imaging system of high spectral and spatial resolution that incorporates several innovative features has been developed around a focal plane scanner (U.S. Patent 6,166,373). This feature enables the system to be used for both airborne/spaceborne and laboratory hyperspectral imaging with or without relative movement of the imaging system, and it can be used to scan a target of any size as long as the target can be imaged at the focal plane; examples include automated inspection of food items and identification of single-celled organisms. The spectral resolution of this system is greater than that of prior terrestrial multispectral imaging systems. Moreover, unlike prior high-spectral-resolution airborne and spaceborne hyperspectral imaging systems, this system does not rely on relative movement of the target and the imaging system to sweep an imaging line across a scene. This compact system (see figure) consists of a front objective mounted on a translation stage with a motorized actuator, and a line-slit imaging spectrograph mounted within a rotary assembly with a rear adaptor to a charge-coupled-device (CCD) camera. Push-broom scanning is carried out by the motorized actuator, which can be controlled either manually by an operator or automatically by a computer to drive the line slit across an image at a focal plane of the front objective. To reduce cost, the system has been designed to integrate as many off-the-shelf components as possible, including the CCD camera and spectrograph. The system achieves high spectral and spatial resolution by using a high-quality CCD camera, spectrograph, and front objective lens. Fixtures for attachment of the system to a microscope (U.S. Patent 6,495,818 B1) make it possible to acquire multispectral images of single cells and other microscopic objects.
The NOAO NEWFIRM Data Handling System
NASA Astrophysics Data System (ADS)
Zárate, N.; Fitzpatrick, M.
2008-08-01
The NOAO Extremely Wide-Field IR Mosaic (NEWFIRM) is a new 1-2.4 micron IR camera that is now being commissioned for the 4m Mayall telescope at Kitt Peak. The focal plane consists of a 2x2 mosaic of 2048x2048 arrays offering a field of view of 27.6' on a side. The use of dual MONSOON array controllers permits very fast readout, and a scripting interface allows for highly efficient observing modes. We describe the Data Handling System (DHS) for the NEWFIRM camera, which is designed to meet the performance requirements of the instrument as well as the observing environment in which it operates. It is responsible for receiving the data stream from the detector and instrument software, rectifying the image geometry, presenting a real-time display of the image to the user, final assembly of a science-grade image with complete headers, as well as triggering automated pipeline and archival functions. The DHS uses an event-based messaging system to control multiple processes on a distributed network of machines. The asynchronous nature of this processing means the DHS operates independently from the camera readout, and the design of the system is inherently scalable to larger focal planes that use a greater number of array controllers. Current status and future plans for the DHS are also discussed.
Status and performance of HST/Wide Field Camera 3
NASA Astrophysics Data System (ADS)
Kimble, Randy A.; MacKenty, John W.; O'Connell, Robert W.
2006-06-01
Wide Field Camera 3 (WFC3) is a powerful UV/visible/near-infrared camera currently in development for installation into the Hubble Space Telescope. WFC3 provides two imaging channels. The UVIS channel features a 4096 x 4096 pixel CCD focal plane covering 200 to 1000 nm wavelengths with a 160 x 160 arcsec field of view. The UVIS channel provides unprecedented sensitivity and field of view in the near ultraviolet for HST. It is particularly well suited for studies of the star formation history of local galaxies and clusters, searches for Lyman alpha dropouts at moderate redshift, and searches for low surface brightness structures against the dark UV sky background. The IR channel features a 1024 x 1024 pixel HgCdTe focal plane covering 800 to 1700 nm with a 139 x 123 arcsec field of view, providing a major advance in IR survey efficiency for HST. IR channel science goals include studies of dark energy, galaxy formation at high redshift, and star formation. The instrument is being prepared for launch as part of HST Servicing Mission 4, tentatively scheduled for late 2007, contingent upon formal approval of shuttle-based servicing after successful shuttle return-to-flight. We report here on the status and performance of WFC3.
Sky camera geometric calibration using solar observations
Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan
2016-09-05
A camera model and associated automated calibration procedure for stationary daytime sky imaging cameras is presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. In conclusion, calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
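The projection model at the heart of this calibration is compact enough to sketch: an ideal equisolid-angle fisheye maps a ray at angle θ from the optical axis to radial image distance r = 2f sin(θ/2). The sketch below predicts the sun's pixel position from its zenith and azimuth angles and accumulates a re-projection error of the kind a calibration routine would minimize; it assumes an untilted, north-aligned camera and omits the lens distortion and orientation terms that the full camera model estimates.

```python
import math

# Minimal sketch of equisolid-angle fisheye projection and a sun-based
# re-projection error.  Assumes an untilted, north-aligned, zenith-pointing
# camera; all parameter values are illustrative.

def equisolid_project(zenith_rad, azimuth_rad, f_px, cx, cy):
    """Map a sky direction to pixel coordinates: r = 2 f sin(theta / 2)."""
    r = 2.0 * f_px * math.sin(zenith_rad / 2.0)
    return cx + r * math.sin(azimuth_rad), cy - r * math.cos(azimuth_rad)

def reprojection_error(detections, sun_angles, f_px, cx, cy):
    """RMS pixel distance between detected sun centroids and model predictions;
    a calibration routine would minimize this over f_px, cx, cy (and more)."""
    sq = 0.0
    for (u, v), (zen, az) in zip(detections, sun_angles):
        pu, pv = equisolid_project(zen, az, f_px, cx, cy)
        sq += (u - pu) ** 2 + (v - pv) ** 2
    return (sq / len(detections)) ** 0.5

# Example: one detection offset 5 px from the prediction -> RMS error of 5 px.
pred = equisolid_project(math.radians(40.0), math.radians(120.0), 700.0, 960.0, 960.0)
print(reprojection_error([(pred[0] + 3.0, pred[1] + 4.0)],
                         [(math.radians(40.0), math.radians(120.0))],
                         700.0, 960.0, 960.0))
```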
The DES Science Verification Weak Lensing Shear Catalogs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarvis, M.
We present weak lensing shear catalogs for 139 square degrees of data taken during the Science Verification (SV) time for the new Dark Energy Camera (DECam) being used for the Dark Energy Survey (DES). We describe our object selection, point spread function estimation and shear measurement procedures using two independent shear pipelines, IM3SHAPE and NGMIX, which produce catalogs of 2.12 million and 3.44 million galaxies respectively. We also detail a set of null tests for the shear measurements and find that they pass the requirements for systematic errors at the level necessary for weak lensing science applications using the SV data. Furthermore, we discuss some of the planned algorithmic improvements that will be necessary to produce sufficiently accurate shear catalogs for the full 5-year DES, which is expected to cover 5000 square degrees.
Variable-focus liquid lens for miniature cameras
NASA Astrophysics Data System (ADS)
Kuiper, S.; Hendriks, B. H. W.
2004-08-01
The meniscus between two immiscible liquids can be used as an optical lens. A change in curvature of this meniscus by electrowetting leads to a change in focal distance. It is demonstrated that two liquids in a tube form a self-centered lens with a high optical quality. The motion of the lens during a focusing action was studied by observation through the transparent tube wall. Finally, a miniature achromatic camera module was designed and constructed based on this adjustable lens, showing that it is excellently suited for use in portable applications.
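A rough sketch of the relation behind this tuning: in a tube of inner radius a, a meniscus meeting the wall at contact angle θ has radius of curvature R ≈ a / cos θ, and a single refracting interface between liquids of indices n1 and n2 has power (n2 − n1)/R. The code below is illustrative only; the refractive indices, tube radius, and contact angles are assumptions, and sign conventions as well as the voltage-to-contact-angle (Young-Lippmann) step are not modeled.

```python
import math

# Minimal sketch of meniscus-lens focusing: R = a / cos(theta) for a meniscus
# in a tube of radius a, and surface power (n2 - n1) / R.  All values are
# illustrative assumptions, not the device parameters from the paper.

def meniscus_focal_length(tube_radius_m, contact_angle_rad, n1, n2):
    """Image-side focal length of the liquid-liquid meniscus (thin-interface model)."""
    R = tube_radius_m / math.cos(contact_angle_rad)   # meniscus radius of curvature
    power = (n2 - n1) / R                              # surface power (1/m)
    return n2 / power                                  # focal length in medium n2

# A smaller contact angle gives a more curved meniscus and a shorter focus:
for deg in (60.0, 45.0, 30.0):
    f = meniscus_focal_length(1.5e-3, math.radians(deg), n1=1.33, n2=1.49)
    print(f"contact angle {deg:.0f} deg -> f = {f * 1000:.1f} mm")
```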
Photogrammetry of the Map Instrument in a Cryogenic Vacuum Environment
NASA Technical Reports Server (NTRS)
Hill, M.; Packard, E.; Pazar, R.
2000-01-01
MAP instrument requirements dictated that the instrument's Focal Plane Assembly (FPA) and Thermal Reflector System (TRS) maintain a high degree of structural integrity at operational temperatures (< 50 K). To verify integrity at these extremes, an elaborate test fixture was constructed to provide a large cryogenic (< 20 K) radiative environment and a mobile photogrammetry camera. This paper discusses MAP's instrument requirements, how those requirements were verified using photogrammetry, and the test setup used to provide the environment and camera movement needed to verify the instrument's requirements.
Color camera computed tomography imaging spectrometer for improved spatial-spectral image accuracy
NASA Technical Reports Server (NTRS)
Wilson, Daniel W. (Inventor); Bearman, Gregory H. (Inventor); Johnson, William R. (Inventor)
2011-01-01
Computed tomography imaging spectrometers (CTISs) having color focal plane array detectors are provided. The color FPA detector may comprise a digital color camera including a digital image sensor, such as a Foveon X3® digital image sensor or a Bayer color filter mosaic. In another embodiment, the CTIS includes a pattern imposed either directly on the object scene being imaged or at the field stop aperture. The use of a color FPA detector and the pattern improves the accuracy of the captured spatial and spectral information.
Mapping Sequence performed during the STS-118 R-Bar Pitch Maneuver
2007-08-10
ISS015-E-21335 (10 Aug. 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Endeavour as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image is an almost nadir perspective over Endeavour's aft cabin and its docking system. Distance between the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11354 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows a view of the underside of the nose/nosecap and forward landing gear doors. Distance between the station and shuttle at this time was approximately 600 feet.
Exact optics - III. Schwarzschild's spectrograph camera revised
NASA Astrophysics Data System (ADS)
Willstrop, R. V.
2004-03-01
Karl Schwarzschild identified a system of two mirrors, each defined by conic sections, free of third-order spherical aberration, coma and astigmatism, and with a flat focal surface. He considered it impractical, because the field was too restricted. This system was rediscovered as a quadratic approximation to one of Lynden-Bell's `exact optics' designs which have wider fields. Thus the `exact optics' version has a moderate but useful field, with excellent definition, suitable for a spectrograph camera. The mirrors are strongly aspheric in both the Schwarzschild design and the exact optics version.
Adaptive metalenses with simultaneous electrical control of focal length, astigmatism, and shift.
She, Alan; Zhang, Shuyan; Shian, Samuel; Clarke, David R; Capasso, Federico
2018-02-01
Focal adjustment and zooming are universal features of cameras and advanced optical systems. Such tuning is usually performed longitudinally along the optical axis by mechanical or electrical control of focal length. However, the recent advent of ultrathin planar lenses based on metasurfaces (metalenses), which opens the door to future drastic miniaturization of mobile devices such as cell phones and wearable displays, mandates fundamentally different forms of tuning based on lateral motion rather than longitudinal motion. Theory shows that the strain field of a metalens substrate can be directly mapped into the outgoing optical wavefront to achieve large diffraction-limited focal length tuning and control of aberrations. We demonstrate electrically tunable large-area metalenses controlled by artificial muscles capable of simultaneously performing focal length tuning (>100%) as well as on-the-fly astigmatism and image shift corrections, which until now were only possible in electron optics. The device thickness is only 30 μm. Our results demonstrate the possibility of future optical microscopes that fully operate electronically, as well as compact optical systems that use the principles of adaptive optics to correct many orders of aberrations simultaneously.
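The reported >100% focal-length tuning can be connected to a simple scaling argument: for a paraxial lens phase profile, a uniform in-plane stretch by a factor s rescales the focal length by s². The sketch below illustrates that scaling with invented numbers; it is a back-of-the-envelope aid, not the authors' device model.

```python
# Scaling of metalens focal length under uniform in-plane stretch.
# For a lens phase profile phi(r) = -pi * r**2 / (lam * f) (paraxial
# approximation), stretching the substrate by a factor s moves every
# scatterer from r to s*r while its imparted phase is unchanged, so the
# new profile corresponds to a focal length f' = s**2 * f.
# Numbers below are illustrative, not taken from the paper.

f0 = 50.0          # unstretched focal length, mm (assumed)
for strain in (0.0, 0.10, 0.25, 0.41):   # engineering strain = s - 1
    s = 1.0 + strain
    f = s**2 * f0
    print(f"strain {strain:4.0%}: focal length {f:6.1f} mm "
          f"({(f - f0) / f0:+.0%} tuning)")
```

At roughly 40% strain the quadratic scaling already gives about a doubling of the focal length, consistent in spirit with the >100% tuning quoted above.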
Application of preconditioned alternating direction method of multipliers in depth from focal stack
NASA Astrophysics Data System (ADS)
Javidnia, Hossein; Corcoran, Peter
2018-03-01
Postcapture refocusing effect in smartphone cameras is achievable using focal stacks. However, the accuracy of this effect is totally dependent on the combination of the depth layers in the stack. The accuracy of the extended depth of field effect in this application can be improved significantly by computing an accurate depth map, which has been an open issue for decades. To tackle this issue, a framework is proposed based on a preconditioned alternating direction method of multipliers for depth from the focal stack and synthetic defocus application. In addition to its ability to provide high structural accuracy, the optimization function of the proposed framework can, in fact, converge faster and better than state-of-the-art methods. The qualitative evaluation has been done on 21 sets of focal stacks and the optimization function has been compared against five other methods. Later, 10 light field image sets have been transformed into focal stacks for quantitative evaluation purposes. Preliminary results indicate that the proposed framework has a better performance in terms of structural accuracy and optimization in comparison to the current state-of-the-art methods.
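For readers who want a concrete point of reference, the sketch below implements only a naive depth-from-focus baseline (per-pixel argmax of a local Laplacian focus measure over the stack), not the preconditioned ADMM optimization the paper proposes; the array sizes and synthetic stack are placeholders.

```python
import numpy as np

def depth_from_focus(stack, window=9):
    """Naive depth-from-focus baseline: pick, per pixel, the stack slice with
    the strongest local Laplacian response. `stack` has shape (S, H, W)."""
    s, h, w = stack.shape
    focus = np.empty_like(stack, dtype=np.float64)
    k = np.ones(window) / window
    for i in range(s):
        img = stack[i].astype(np.float64)
        # discrete Laplacian as a crude sharpness measure
        lap = (-4 * img
               + np.roll(img, 1, 0) + np.roll(img, -1, 0)
               + np.roll(img, 1, 1) + np.roll(img, -1, 1))
        # box-filter the squared response to stabilise the per-pixel choice
        e = lap ** 2
        e = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, e)
        e = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, e)
        focus[i] = e
    return np.argmax(focus, axis=0)   # index of the sharpest slice per pixel

# Example with a synthetic 5-slice stack of random texture
stack = np.random.rand(5, 64, 64)
print(depth_from_focus(stack).shape)   # (64, 64)
```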
Miniaturization of dielectric liquid microlens in package
Yang, Chih-Cheng; Tsai, C. Gary; Yeh, J. Andrew
2010-01-01
This study presents packaged microscale liquid lenses, actuated by dielectric force manipulation, with liquid droplets of 300–700 μm in diameter. The liquid microlens demonstrated focal length tunability in a plastic package. The focal length of a liquid lens with a 500 μm diameter droplet is shortened from 4.4 to 2.2 mm as the applied voltage changes from 0 to 79 Vrms. Dynamic responses, analyzed using a 2000 frames/s high-speed camera, show advancing and receding times of 90 and 60 ms, respectively. The size effect of the dielectric liquid microlens on focal length is characterized for lens droplets of 300–700 μm in diameter. PMID:21267438
Embrace the Dark Side: Advancing the Dark Energy Survey
NASA Astrophysics Data System (ADS)
Suchyta, Eric
The Dark Energy Survey (DES) is an ongoing cosmological survey intended to study the properties of the accelerated expansion of the Universe. In this dissertation, I present work of mine that has advanced the progress of DES. First is an introduction, which explores the physics of the cosmos, as well as how DES intends to probe it. Attention is given to developing the theoretical framework cosmologists use to describe the Universe, and to explaining observational evidence which has furnished our current conception of the cosmos. Emphasis is placed on the dark sector - dark matter and dark energy - the content of the Universe not explained by the Standard Model of particle physics. As its name suggests, the Dark Energy Survey has been specially designed to measure the properties of dark energy. DES will use a combination of galaxy cluster, weak gravitational lensing, angular clustering, and supernovae measurements to derive its state-of-the-art constraints, each of which is discussed in the text. The work described in this dissertation includes science measurements directly related to the first three of these probes. The dissertation presents my contributions to the readout and control system of the Dark Energy Camera (DECam); the name of this software is SISPI. SISPI uses client-server and publish-subscribe communication patterns to coordinate and command actions among the many hardware components of DECam - the survey instrument for DES, a 570 megapixel CCD camera, mounted at prime focus of the Blanco 4-m Telescope. The SISPI work I discuss includes coding applications for DECam's filter changer mechanism and hexapod, as well as developing the Scripts Editor, a GUI application for DECam users to edit and export observing sequences that SISPI can load and execute. Next, the dissertation describes the processing of early DES data, to which I contributed. This furnished the data products used in the first-completed DES science analysis, and contributed to improving the collaboration-wide treatment of the data. The science measurements themselves are also detailed. We verified DES's capabilities for performing weak lensing analyses by measuring the masses of four galaxy clusters, finding consistency with previous measurements, and utilized DECam's wide field-of-view for a photometric study of filament-like structures in the fields. Finally, my recent work with Balrog is presented. Balrog is a simulation toolkit for embedding fake objects into real survey images in order to correct for systematic biases. We have used Balrog to extend DES galaxy clustering measurements down to fainter limits than previously possible, finding results consistent with higher-resolution space-based data. The methodology used in this analysis generalizes beyond galaxy clustering alone, and promises to be useful in future imaging survey measurements.
NASA Technical Reports Server (NTRS)
Diner, Daniel B. (Inventor)
1989-01-01
A method and apparatus are developed for obtaining a stereo image with reduced depth distortion and optimum depth resolution. A tradeoff between static and dynamic depth distortion and depth resolution is provided. Cameras obtaining the images for a stereo view are converged at a convergence point behind the object to be presented in the image, and the collection-surface-to-object distance, the camera separation distance, and the focal lengths of zoom lenses for the cameras are all increased. Doubling the distances cuts the static depth distortion in half while maintaining image size and depth resolution. Dynamic depth distortion is minimized by panning a stereo view-collecting camera system about a circle which passes through the convergence point and the camera's first nodal points. Horizontal field shifting of the television fields on a television monitor brings both the monitor and the stereo views within the viewer's limit of binocular fusion.
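Under a simplified parallel-axis pinhole stereo model (not the converged geometry analyzed in the patent), one can check numerically that doubling the object distance, camera separation, and focal length together preserves image magnification and depth resolution, consistent with the "maintaining image size and depth resolution" statement above; the values below are illustrative.

```python
# Quick check, under a simplified parallel-axis pinhole stereo model, that
# doubling object distance z, camera baseline b, and focal length f together
# preserves image magnification and depth resolution. Values are illustrative.

def stereo_metrics(f_mm, b_mm, z_mm, disparity_step_mm=0.01):
    magnification = f_mm / z_mm                               # image size per unit object size
    disparity = f_mm * b_mm / z_mm                            # image-plane disparity
    depth_res = z_mm**2 * disparity_step_mm / (f_mm * b_mm)   # depth change per disparity step
    return magnification, disparity, depth_res

base    = stereo_metrics(f_mm=25.0, b_mm=100.0, z_mm=2000.0)
doubled = stereo_metrics(f_mm=50.0, b_mm=200.0, z_mm=4000.0)

for name, (m, d, dz) in (("base", base), ("doubled", doubled)):
    print(f"{name:8s} magnification={m:.4f}  disparity={d:.3f} mm  depth step={dz:.3f} mm")
```

Running the sketch shows identical magnification and depth step for the two configurations, while the disparity itself grows; the halving of static depth distortion claimed above depends on the converged-camera geometry, which this simplified model does not capture.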
System Architecture of the Dark Energy Survey Camera Readout Electronics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, Theresa; /FERMILAB; Ballester, Otger
2010-05-27
The Dark Energy Survey makes use of a new camera, the Dark Energy Camera (DECam). DECam will be installed in the Blanco 4-m telescope at Cerro Tololo Inter-American Observatory (CTIO). DECam is presently under construction and is expected to be ready for observations in the fall of 2011. The focal plane will make use of 62 2k x 4k fully depleted Charge-Coupled Devices (CCDs) for imaging and 12 2k x 2k CCDs for guiding, alignment and focus. This paper describes design considerations of the system, including the entire signal path used to read out the CCDs, the development of a custom crate and backplane, the overall grounding scheme, and early results of system tests.
InGaAs focal plane arrays for low-light-level SWIR imaging
NASA Astrophysics Data System (ADS)
MacDougal, Michael; Hood, Andrew; Geske, Jon; Wang, Jim; Patel, Falgun; Follman, David; Manzo, Juan; Getty, Jonathan
2011-06-01
Aerius Photonics will present their latest developments in large InGaAs focal plane arrays, which are used for low light level imaging in the short wavelength infrared (SWIR) regime. Aerius will present imaging in both 1280x1024 and 640x512 formats. Aerius will present characterization of the FPA including dark current measurements. Aerius will also show the results of development of SWIR FPAs for high temperatures, including imagery and dark current data. Finally, Aerius will show results of using the SWIR camera with Aerius' SWIR illuminators using VCSEL technology.
Improved Scanners for Microscopic Hyperspectral Imaging
NASA Technical Reports Server (NTRS)
Mao, Chengye
2009-01-01
Improved scanners to be incorporated into hyperspectral microscope-based imaging systems have been invented. Heretofore, in microscopic imaging, including spectral imaging, it has been customary to either move the specimen relative to the optical assembly that includes the microscope or else move the entire assembly relative to the specimen. It becomes extremely difficult to control such scanning when submicron translation increments are required, because the high magnification of the microscope enlarges all movements in the specimen image on the focal plane. To overcome this difficulty, in a system based on this invention, no attempt would be made to move either the specimen or the optical assembly. Instead, an objective lens would be moved within the assembly so as to cause translation of the image at the focal plane: the effect would be equivalent to scanning in the focal plane. The upper part of the figure depicts a generic proposed microscope-based hyperspectral imaging system incorporating the invention. The optical assembly of this system would include an objective lens (normally, a microscope objective lens) and a charge-coupled-device (CCD) camera. The objective lens would be mounted on a servomotor-driven translation stage, which would be capable of moving the lens in precisely controlled increments, relative to the camera, parallel to the focal-plane scan axis. The output of the CCD camera would be digitized and fed to a frame grabber in a computer. The computer would store the frame-grabber output for subsequent viewing and/or processing of images. The computer would contain a position-control interface board, through which it would control the servomotor. There are several versions of the invention. An essential feature common to all versions is that the stationary optical subassembly containing the camera would also contain a spatial window, at the focal plane of the objective lens, that would pass only a selected portion of the image. In one version, the window would be a slit, the CCD would contain a one-dimensional array of pixels, and the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion. The image built up by scanning in this case would be an ordinary (non-spectral) image. In another version, the optics of which are depicted in the lower part of the figure, the spatial window would be a slit, the CCD would contain a two-dimensional array of pixels, the slit image would be refocused onto the CCD by a relay-lens pair consisting of a collimating and a focusing lens, and a prism-grating-prism optical spectrometer would be placed between the collimating and focusing lenses. Consequently, the image on the CCD would be spatially resolved along the slit axis and spectrally resolved along the axis perpendicular to the slit. As in the first-mentioned version, the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion.
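A minimal sketch of the cube-assembly step implied by the pushbroom version: each scan position contributes one (slit pixels x spectral bands) frame, and stacking the frames over scan positions yields an (x, y, wavelength) cube. The frame source and array sizes below are stand-ins for real detector readout.

```python
import numpy as np

# Minimal sketch of pushbroom hyperspectral cube assembly: at each scan step
# the 2-D detector records one spatial line (along the slit) x all spectral
# bands; stepping the objective lens sweeps the scene line by line.
# The "acquire_frame" function below is a stand-in for real detector readout.

N_SLIT, N_BANDS, N_STEPS = 256, 64, 200   # illustrative sizes

def acquire_frame(step):
    """Hypothetical readout: one (slit pixels x spectral bands) frame."""
    rng = np.random.default_rng(step)
    return rng.random((N_SLIT, N_BANDS))

cube = np.empty((N_STEPS, N_SLIT, N_BANDS))   # (scan position, slit pixel, band)
for step in range(N_STEPS):
    cube[step] = acquire_frame(step)

# A single-band image is one slice through the cube; a pixel spectrum is a row.
band_image = cube[:, :, 10]     # spatial image at band index 10
spectrum = cube[50, 128, :]     # spectrum of one specimen point
print(cube.shape, band_image.shape, spectrum.shape)
```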
NASA Astrophysics Data System (ADS)
Zhang, Rumin; Liu, Peng; Liu, Dijun; Su, Guobin
2015-12-01
In this paper, we establish a forward simulation model of a plenoptic camera, which is implemented by inserting a micro-lens array into a conventional camera. The simulation model is used to emulate how objects at different depths are imaged by the main lens, remapped by the micro-lenses, and finally captured on the 2D sensor. We can easily modify the parameters of the simulation model, such as the focal lengths and diameters of the main lens and micro-lenses and the number of micro-lenses. Employing spatial integration, refocused images and all-in-focus images are rendered based on the plenoptic images produced by the model. The forward simulation model can be used to determine the trade-offs between different configurations and to test new research related to plenoptic cameras without the need for a prototype.
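The rendering step described above (refocused images by spatial integration) is often expressed as shift-and-add over sub-aperture views; the sketch below assumes the simulated light field has already been reorganized into sub-aperture images and uses an arbitrary refocus parameter alpha, so it illustrates the idea rather than the authors' exact pipeline.

```python
import numpy as np

def refocus(subaperture, alpha):
    """Shift-and-add refocusing of a light field given as sub-aperture images.

    subaperture : array of shape (U, V, H, W); (u, v) indexes the position on
                  the main-lens aperture, (y, x) the pixel in that view.
    alpha       : synthetic refocus parameter; each view is shifted in
                  proportion to its aperture offset before averaging.
    """
    U, V, H, W = subaperture.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round(alpha * (u - uc)))
            dx = int(round(alpha * (v - vc)))
            out += np.roll(subaperture[u, v], shift=(dy, dx), axis=(0, 1))
    return out / (U * V)

# Illustrative use with a random synthetic light field
lf = np.random.rand(5, 5, 128, 128)
img_near = refocus(lf, alpha=1.5)    # focus on a nearer synthetic plane
img_far = refocus(lf, alpha=-1.5)    # focus on a farther synthetic plane
print(img_near.shape, img_far.shape)
```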
Plenoptic Imager for Automated Surface Navigation
NASA Technical Reports Server (NTRS)
Zollar, Byron; Milder, Andrew; Mayo, Michael
2010-01-01
An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved the feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem composed of a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.
NASA Astrophysics Data System (ADS)
Liu, L.; Huang, Zh.; Qiu, Zh.; Li, B.
2018-01-01
A handheld RGB camera was developed to monitor the in vivo distribution of porphyrin-based photosensitizer (PS) hematoporphyrin monomethyl ether (HMME) in blood vessels during photodynamic therapy (PDT). The focal length, f-number, International Standardization Organization (ISO) sensitivity, and shutter speed of the camera were optimized for the solution sample with various HMME concentrations. After the parameter optimization, it was found that the red intensity value of the fluorescence image was linearly related to the fluorescence intensity under investigated conditions. The RGB camera was then used to monitor the in vivo distribution of HMME in blood vessels in a skin-fold window chamber model. The red intensity value of the recorded RGB fluorescence image was found to be linearly correlated to HMME concentrations in the range 0-24 μM. Significant differences in the red to green intensity ratios were observed between the blood vessels and the surrounding tissue.
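The reported linear red-intensity-versus-concentration relation corresponds to a simple least-squares calibration that can then be inverted to estimate concentration; the sketch below uses made-up intensity readings, not the paper's data.

```python
import numpy as np

# Least-squares calibration of red-channel intensity against photosensitizer
# concentration, mirroring the linear relation reported in the abstract.
# The numbers below are made up for illustration only.

conc_uM = np.array([0, 4, 8, 12, 16, 20, 24], dtype=float)        # known samples
red_mean = np.array([12, 31, 52, 70, 93, 110, 131], dtype=float)  # measured red values

slope, intercept = np.polyfit(conc_uM, red_mean, deg=1)

def estimate_concentration(red_value):
    """Invert the calibration line to estimate concentration from red intensity."""
    return (red_value - intercept) / slope

print(f"fit: red = {slope:.2f} * conc + {intercept:.2f}")
print(f"red intensity 80 -> approx. {estimate_concentration(80.0):.1f} uM")
```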
The multifocus plenoptic camera
NASA Astrophysics Data System (ADS)
Georgiev, Todor; Lumsdaine, Andrew
2012-01-01
The focused plenoptic camera is based on the Lippmann sensor: an array of microlenses focused on the pixels of a conventional image sensor. This device samples the radiance, or plenoptic function, as an array of cameras with large depth of field, focused at a certain plane in front of the microlenses. For the purpose of digital refocusing (which is one of the important applications) the depth of field needs to be large, but there are fundamental optical limitations to this. The solution to the above problem is to use an array of interleaved microlenses of different focal lengths, focused at two or more different planes. In this way a focused image can be constructed at any depth of focus, and a really wide range of digital refocusing can be achieved. This paper presents our theory and the results of implementing such a camera. Real-world images demonstrate the extended capabilities, and limitations are discussed.
To catch a comet: Technical overview of CAN DO G-324
NASA Technical Reports Server (NTRS)
Obrien, T. J. (Editor)
1986-01-01
The primary objective of the C. E. Williams Middle School Get Away Special CAN DO is the photographing of Comet Halley. The project will involve middle school students, grades 6 through 8, in the study and interpretation of astronomical photographs and techniques. G-324 is contained in a 5 cubic foot GAS Canister with an opening door and pyrex window for photography. It will be pressurized with one atmosphere of dry nitrogen. Three 35mm still cameras with 250 exposure film backs and different focal length lenses will be fired by a combination of automatic timer and an active comet detector. A lightweight 35mm movie camera will shoot single exposures at about 1/2 minute intervals to give an overlapping skymap of the mission. The fifth camera is a solid state television camera specially constructed for detection of the comet by microprocessor.
Wrist Camera Orientation for Effective Telerobotic Orbital Replaceable Unit (ORU) Changeout
NASA Technical Reports Server (NTRS)
Jones, Sharon Monica; Aldridge, Hal A.; Vazquez, Sixto L.
1997-01-01
The Hydraulic Manipulator Testbed (HMTB) is the kinematic replica of the Flight Telerobotic Servicer (FTS). One use of the HMTB is to evaluate advanced control techniques for accomplishing robotic maintenance tasks on board the Space Station. Most maintenance tasks involve the direct manipulation of the robot by a human operator when high-quality visual feedback is important for precise control. An experiment was conducted in the Systems Integration Branch at the Langley Research Center to compare several configurations of the manipulator wrist camera for providing visual feedback during an Orbital Replaceable Unit changeout task. Several variables were considered, such as wrist camera angle, camera focal length, target location, and lighting. Each study participant performed the maintenance task by using eight combinations of the variables based on a Latin square design. The results of this experiment and conclusions based on data collected are presented.
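A Latin square of order eight balances the presentation order of the eight variable combinations across participants; the sketch below builds a basic cyclic Latin square with hypothetical condition labels to illustrate the design (the actual square used in the experiment is not specified in the abstract).

```python
def cyclic_latin_square(n):
    """Return an n x n Latin square built by cyclic shifts: row i, column j
    holds (i + j) mod n, so each condition appears once per row and column."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

# Eight camera-configuration conditions, one row of the square per participant:
conditions = [f"config_{k}" for k in range(8)]   # hypothetical condition labels
square = cyclic_latin_square(8)
for participant, row in enumerate(square):
    order = [conditions[k] for k in row]
    print(f"participant {participant + 1}: {order}")
```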
The TolTEC Camera for the LMT Telescope
NASA Astrophysics Data System (ADS)
Bryan, Sean
2018-01-01
TolTEC is a new camera being built for the 50-meter Large Millimeter-wave Telescope (LMT) on Sierra Negra in Puebla, Mexico. The instrument will discover and characterize distant galaxies by detecting the thermal emission of dust heated by starlight. The polarimetric capabilities of the camera will measure magnetic fields in star-forming regions in the Milky Way. The optical design of the camera uses mirrors, lenses, and dichroics to simultaneously couple a 4 arcminute diameter field of view onto three single-band focal planes at 150, 220, and 280 GHz. The 7000 polarization-selective detectors are single-band horn-coupled LEKID detectors fabricated at NIST. A rotating half wave plate operates at ambient temperature to modulate the polarized signal. In addition to the galactic and extragalactic surveys already planned, TolTEC installed at the LMT will provide open observing time to the community.
A Combined Laser-Communication and Imager for Microspacecraft (ACLAIM)
NASA Technical Reports Server (NTRS)
Hemmati, H.; Lesh, J.
1998-01-01
ACLAIM is a multi-function instrument consisting of a laser communication terminal and an imaging camera that share a common telescope. A single APS- (Active Pixel Sensor) based focal-plane-array is used to perform both the acquisition and tracking (for laser communication) and science imaging functions.
50 CFR 217.55 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... MAMMALS INCIDENTAL TO SPECIFIED ACTIVITIES Taking of Marine Mammals Incidental To Target and Missile... the following monitoring measures: (1) Visual land-based monitoring. (i) Prior to each missile launch... located varying distances from the missile launch site. Each video camera will be set to record a focal...
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2013 CFR
2013-10-01
... IMPORTING OF MARINE MAMMALS Taking Of Marine Mammals Incidental To Missile Launch Activities from San... monitoring measures: (1) Visual Land-Based Monitoring. (i) Prior to each missile launch, an observer(s) will... from the missile launch site. Each video camera will be set to record a focal subgroup within the...
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2012 CFR
2012-10-01
... IMPORTING OF MARINE MAMMALS Taking Of Marine Mammals Incidental To Missile Launch Activities from San... monitoring measures: (1) Visual Land-Based Monitoring. (i) Prior to each missile launch, an observer(s) will... from the missile launch site. Each video camera will be set to record a focal subgroup within the...
50 CFR 216.155 - Requirements for monitoring and reporting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... IMPORTING OF MARINE MAMMALS Taking Of Marine Mammals Incidental To Missile Launch Activities from San... monitoring measures: (1) Visual Land-Based Monitoring. (i) Prior to each missile launch, an observer(s) will... from the missile launch site. Each video camera will be set to record a focal subgroup within the...
Military Applications of Curved Focal Plane Arrays Developed by the HARDI Program
2011-01-01
considered one of the main founders of geometrical optics, modern photography, and cinematography. Among his inventions are the Petzval portrait lens...still be a problem. B. HARDI Program/Institute for Defense Analyses (IDA) Task 1. HARDI Program: State-of-the-art cameras could be improved by
Optics for MUSIC: a new (sub)millimeter camera for the Caltech Submillimeter Observatory
NASA Astrophysics Data System (ADS)
Sayers, Jack; Czakon, Nicole G.; Day, Peter K.; Downes, Thomas P.; Duan, Ran P.; Gao, Jiansong; Glenn, Jason; Golwala, Sunil R.; Hollister, Matt I.; LeDuc, Henry G.; Mazin, Benjamin A.; Maloney, Philip R.; Noroozian, Omid; Nguyen, Hien T.; Schlaerth, James A.; Siegel, Seth; Vaillancourt, John E.; Vayonakis, Anastasios; Wilson, Philip R.; Zmuidzinas, Jonas
2010-07-01
We will present the design and implementation, along with calculations and some measurements of the performance, of the room-temperature and cryogenic optics for MUSIC, a new (sub)millimeter camera we are developing for the Caltech Submm Observatory (CSO). The design consists of two focusing elements in addition to the CSO primary and secondary mirrors: a warm off-axis elliptical mirror and a cryogenic (4K) lens. These optics will provide a 14 arcmin field of view that is diffraction limited in all four of the MUSIC observing bands (2.00, 1.33, 1.02, and 0.86 mm). A cold (4K) Lyot stop will be used to define the primary mirror illumination, which will be maximized while keeping spillover at the sub 1% level. The MUSIC focal plane will be populated with broadband phased antenna arrays that efficiently couple to a factor of ~3 in bandwidth [1, 2], and each pixel on the focal plane will be read out via a set of four lumped element filters that define the MUSIC observing bands (i.e., each pixel on the focal plane simultaneously observes in all four bands). Finally, a series of dielectric and metal-mesh low pass filters have been implemented to reduce the optical power load on the MUSIC cryogenic stages to a quasi-negligible level while maintaining good transmission in-band.
Determining fast orientation changes of multi-spectral line cameras from the primary images
NASA Astrophysics Data System (ADS)
Wohlfeil, Jürgen
2012-01-01
Fast orientation changes of airborne and spaceborne line cameras cannot always be avoided. In such cases it is essential to measure them with high accuracy to ensure a good quality of the resulting imagery products. Several approaches exist to support the orientation measurement by using optical information received through the main objective/telescope. In this article an approach is proposed that allows the determination of non-systematic orientation changes between every captured line. It does not require any additional camera hardware or onboard processing capabilities, only the payload images and a rough estimate of the camera's trajectory. The approach takes advantage of the typical geometry of multi-spectral line cameras with a set of linear sensor arrays for different spectral bands on the focal plane. First, homologous points are detected within the heavily distorted images of different spectral bands. With their help a connected network of geometrical correspondences can be built up. This network is used to calculate the orientation changes of the camera at the temporal and angular resolution of the camera. The approach was tested with an extensive set of aerial surveys covering a wide range of different conditions and achieved precise and reliable results.
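The first step described above, detecting homologous points between the band images, can be prototyped with off-the-shelf feature matching; the sketch below uses ORB features from OpenCV on placeholder image files and is not the detector the author used.

```python
import cv2
import numpy as np

# Minimal sketch of band-to-band homologous point detection with ORB features.
# The two inputs stand in for line-scanner images of two spectral bands;
# file names are placeholders.
band_a = cv2.imread("band_red.png", cv2.IMREAD_GRAYSCALE)
band_b = cv2.imread("band_nir.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp_a, des_a = orb.detectAndCompute(band_a, None)
kp_b, des_b = orb.detectAndCompute(band_b, None)

# Hamming-distance brute-force matching with cross-checking
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)

# Each match links the same ground feature seen by two sensor lines at
# slightly different times; such correspondences feed the orientation estimate.
pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])
print(f"{len(matches)} homologous points, e.g. {pts_a[0]} <-> {pts_b[0]}")
```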
Huesch, Marco D
2013-06-01
Assessing the real-world comparative effectiveness of common interventions is challenged by unmeasured confounding. The objective was to determine whether the mortality benefit shown for drug-eluting stents (DES) over bare metal stents (BMS) in observational studies persists after controlling and testing for confounding. This was a retrospective observational study involving 38,019 patients, 65 years or older, admitted for an index percutaneous coronary intervention and receiving DES or BMS in Pennsylvania in 2004-2005, followed up for death through 3 years. Analysis was at the patient level. Mortality was analyzed with Cox proportional hazards models allowing for stratification by disease severity or DES use propensity, accounting for clustering of patients. Instrumental variables analysis used lagged physician stent usage to proxy for the focal stent type decision. A method originating in work by Cornfield and others in 1954 and popularized by Greenland in 1996 was used to assess robustness to confounding. DES was associated with a significantly lower adjusted risk of death at 3 years in Cox and in instrumented analyses. An implausibly strong hypothetical unobserved confounder would be required to fully explain these results. Confounding by indication can bias observational studies. No strong evidence of such selection biases was found in the reduced risk of death among elderly patients receiving DES instead of BMS in a Pennsylvania state-wide population. © Health Research and Educational Trust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, T. S.
Meeting the science goals for many current and future ground-based optical large-area sky surveys requires that the calibrated broadband photometry is stable in time and uniform over the sky to 1% precision or better. Past surveys have achieved photometric precision of 1-2% by calibrating the survey's stellar photometry with repeated measurements of a large number of stars observed in multiple epochs. The calibration techniques employed by these surveys only consider the relative frame-by-frame photometric zeropoint offset and the focal plane position-dependent illumination corrections, which are independent of the source color. However, variations in the wavelength dependence of the atmospheric transmission and the instrumental throughput induce source color-dependent systematic errors. These systematic errors must also be considered to achieve the most precise photometric measurements. In this paper, we examine such systematic chromatic errors using photometry from the Dark Energy Survey (DES) as an example. We define a natural magnitude system for DES and calculate the systematic errors on stellar magnitudes, when the atmospheric transmission and instrumental throughput deviate from the natural system. We conclude that the systematic chromatic errors caused by the change of airmass in each exposure, the change of the precipitable water vapor and aerosol in the atmosphere over time, and the non-uniformity of instrumental throughput over the focal plane, can be up to 2% in some bandpasses. We compare the calculated systematic chromatic errors with the observed DES data. For the test sample data, we correct these errors using measurements of the atmospheric transmission and instrumental throughput. The residual after correction is less than 0.3%. We also find that the errors for non-stellar objects are redshift-dependent and can be larger than those for stars at certain redshifts.
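The effect described above can be reproduced qualitatively with toy synthetic photometry: the same throughput perturbation shifts the synthetic magnitudes of a blue and a red source by different amounts. The bandpass, perturbation, and power-law SEDs below are invented for illustration and are not DES throughputs.

```python
import numpy as np

# Toy illustration of a systematic chromatic error: a single throughput change
# shifts the synthetic magnitude of a blue source and a red source differently.

lam = np.linspace(400.0, 550.0, 301)        # wavelength grid, nm
dlam = lam[1] - lam[0]

def synthetic_mag(sed, throughput):
    """Instrumental magnitude from a simple photon-weighted synthetic integral."""
    return -2.5 * np.log10(np.sum(sed * throughput * lam) * dlam)

nominal = np.exp(-0.5 * ((lam - 475.0) / 40.0) ** 2)        # toy g-like bandpass
perturbed = nominal * (1.0 - 0.05 * (lam - 475.0) / 75.0)   # mild throughput tilt

blue_sed = (lam / 475.0) ** -2.0    # blue power-law source
red_sed = (lam / 475.0) ** 2.0      # red power-law source

for name, sed in (("blue source", blue_sed), ("red source", red_sed)):
    dm = synthetic_mag(sed, perturbed) - synthetic_mag(sed, nominal)
    print(f"{name}: magnitude shift {dm * 1e3:+.1f} mmag")
```

The difference between the two printed shifts is the color-dependent term that a purely gray (color-independent) zeropoint calibration cannot remove.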
NASA Technical Reports Server (NTRS)
Graff, Paige Valderrama; Baker, Marshalyn (Editor); Graff, Trevor (Editor); Lindgren, Charlie (Editor); Mailhot, Michele (Editor); McCollum, Tim (Editor); Runco, Susan (Editor); Stefanov, William (Editor); Willis, Kim (Editor)
2010-01-01
Scientists from the Image Science and Analysis Laboratory (ISAL) at NASA's Johnson Space Center (JSC) work with astronauts onboard the International Space Station (ISS) who take images of Earth. Astronaut photographs, sometimes referred to as Crew Earth Observations, are taken using hand-held digital cameras onboard the ISS. These digital images allow scientists to study our Earth from the unique perspective of space. Astronauts have taken images of Earth since the 1960s. There is a database of over 900,000 astronaut photographs available at http://eol.jsc.nasa.gov . Images are requested by ISAL scientists at JSC and astronauts in space personally frame and acquire them from the Destiny Laboratory or other windows in the ISS. By having astronauts take images, they can specifically frame them according to a given request and need. For example, they can choose to use different lenses to vary the amount of area (field of view) an image will cover. Images can be taken at different times of the day which allows different lighting conditions to bring out or highlight certain features. The viewing angle at which an image is acquired can also be varied to show the same area from different perspectives. Pointing the camera straight down gives you a nadir shot. Pointing the camera at an angle to get a view across an area would be considered an oblique shot. Being able to change these variables makes astronaut photographs a unique and useful data set. Astronaut photographs are taken from the ISS from altitudes of 300 - 400 km (185 to 250 miles). One of the current cameras being used, the Nikon D3X digital camera, can take images using a 50, 100, 250, 400 or 800mm lens. These different lenses allow for a wider or narrower field of view. The higher the focal length (800mm for example) the narrower the field of view (less area will be covered). Higher focal lengths also show greater detail of the area on the surface being imaged. There are four major systems or spheres of Earth. They are: Atmosphere, Biosphere, Hydrosphere, and Litho/Geosphere.
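A hedged worked example of the focal-length trade-off described above, assuming a nominal full-frame 36 x 24 mm sensor with roughly 6000 pixels across and a 400 km orbital altitude for nadir viewing; the numbers are approximate.

```python
# Worked example of the focal-length / field-of-view trade-off for nadir
# photography from the ISS. Sensor size, pixel count, and altitude are nominal
# values used for illustration only.

ALTITUDE_M = 400e3
SENSOR_W_M, SENSOR_H_M = 36e-3, 24e-3
PIXEL_PITCH_M = 36e-3 / 6048        # approx. pixel pitch of a 6048-px-wide sensor

for focal_mm in (50, 100, 250, 400, 800):
    f = focal_mm * 1e-3
    footprint_w_km = ALTITUDE_M * SENSOR_W_M / f / 1e3
    footprint_h_km = ALTITUDE_M * SENSOR_H_M / f / 1e3
    gsd_m = ALTITUDE_M * PIXEL_PITCH_M / f
    print(f"{focal_mm:4d} mm lens: footprint ~{footprint_w_km:5.0f} x "
          f"{footprint_h_km:5.0f} km, ground sample distance ~{gsd_m:4.1f} m")
```

The printout confirms the trend stated above: the 800mm lens covers roughly an 18 x 12 km footprint at metre-scale detail, while the 50mm lens covers hundreds of kilometres at correspondingly coarser detail.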
The Dark Energy Survey: more than dark energy – an overview
Abbott, T.
2016-03-21
This overview article describes the legacy prospect and discovery potential of the Dark Energy Survey (DES) beyond cosmological studies, illustrating it with examples from the DES early data. DES is using a wide-field camera (DECam) on the 4m Blanco Telescope in Chile to image 5000 sq deg of the sky in five filters (grizY). By its completion the survey is expected to have generated a catalogue of 300 million galaxies with photometric redshifts and 100 million stars. In addition, a time-domain survey search over 27 sq deg is expected to yield a sample of thousands of Type Ia supernovae and other transients. The main goals of DES are to characterise dark energy and dark matter, and to test alternative models of gravity; these goals will be pursued by studying large scale structure, cluster counts, weak gravitational lensing and Type Ia supernovae. However, DES also provides a rich data set which allows us to study many other aspects of astrophysics. In this paper we focus on additional science with DES, emphasizing areas where the survey makes a difference with respect to other current surveys. The paper illustrates, using early data (from `Science Verification', and from the first, second and third seasons of observations), what DES can tell us about the solar system, the Milky Way, galaxy evolution, quasars, and other topics. In addition, we show that if the cosmological model is assumed to be Lambda + Cold Dark Matter (LCDM) then important astrophysics can be deduced from the primary DES probes. Lastly, highlights from DES early data include the discovery of 34 Trans Neptunian Objects, 17 dwarf satellites of the Milky Way, one published z > 6 quasar (and more confirmed) and two published superluminous supernovae (and more confirmed).
The Dark Energy Survey: more than dark energy – an overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vikram, Vinu; Abbott, T; Abdalla, F. B.
This overview paper describes the legacy prospect and discovery potential of the Dark Energy Survey (DES) beyond cosmological studies, illustrating it with examples from the DES early data. DES is using a wide-field camera (DECam) on the 4 m Blanco Telescope in Chile to image 5000 sq deg of the sky in five filters (grizY). By its completion, the survey is expected to have generated a catalogue of 300 million galaxies with photometric redshifts and 100 million stars. In addition, a time-domain survey search over 27 sq deg is expected to yield a sample of thousands of Type Ia supernovae and other transients. The main goals of DES are to characterize dark energy and dark matter, and to test alternative models of gravity; these goals will be pursued by studying large-scale structure, cluster counts, weak gravitational lensing and Type Ia supernovae. However, DES also provides a rich data set which allows us to study many other aspects of astrophysics. In this paper, we focus on additional science with DES, emphasizing areas where the survey makes a difference with respect to other current surveys. The paper illustrates, using early data (from ‘Science Verification’, and from the first, second and third seasons of observations), what DES can tell us about the Solar system, the Milky Way, galaxy evolution, quasars and other topics. In addition, we show that if the cosmological model is assumed to be Λ+cold dark matter, then important astrophysics can be deduced from the primary DES probes. Highlights from DES early data include the discovery of 34 trans-Neptunian objects, 17 dwarf satellites of the Milky Way, one published z > 6 quasar (and more confirmed) and two published superluminous supernovae (and more confirmed).
Focus collimator press for a collimator for gamma ray cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
York, R.N.; York, D.L.
A focus collimator press for collimators for gamma ray cameras is described comprising a pivot arm of fixed length mounted on a travelling pivot which is movable in the plane of a spaced apart work table surface in a direction toward and away from the work table. A press plate is carried at the opposite end of the fixed length pivot arm, and is maintained in registration with the same portion of the work table for pressing engagement with each undulating radiation opaque strip as it is added to the top of a collimator stack in process by movement of the travelling pivot inward toward the work table. This enables the press plate to maintain its relative position above the collimator stack and at the same time the angle of the press plate changes, becoming less acute in relation to the work table as the travelling pivot moves inwardly toward the work table. The fixed length of the pivot arm is substantially equal to the focal point of the converging apertures formed by each pair of undulating strips stacked together. Thus, the focal point of each aperture row falls substantially on the axis of the travelling pivot, and since it moves in the plane of the work table surface the focal point of each aperture row is directed to lie in the same common plane. When one of two collimator stacks made in this way is rotated 180 degrees and the two bonded together along their respective first strips, all focal points of every aperture row lie on the central axis of the completed collimator.
Electro-optical detector for use in a wide mass range mass spectrometer
NASA Technical Reports Server (NTRS)
Giffin, Charles E. (Inventor)
1976-01-01
An electro-optical detector is disclosed for use in a wide mass range mass spectrometer (MS). In the latter, the focal plane is at or very near the exit end of the magnetic analyzer, so that a strong magnetic field of the order of 1000G or more is present at the focal plane location. The novel detector includes a microchannel electron multiplier array (MCA) which is positioned at the focal plane to convert ion beams which are focused by the MS at the focal plane into corresponding electron beams which are then accelerated to form visual images on a conductive phosphored surface. These visual images are then converted into images on the target of a vidicon camera or the like for electronic processing. Due to the strong magnetic field at the focal plane, in one embodiment of the invention, the MCA with front and back parallel ends is placed so that its front end forms an angle of not less than several degrees, preferably on the order of 10°-20°, with respect to the focal plane, with the center line of the front end preferably located in the focal plane. In another embodiment the MCA is wedge-shaped, with its back end at an angle of about 10°-20° with respect to the front end. In this embodiment the MCA is placed so that its front end is located at the focal plane.
Calibration and accuracy analysis of a focused plenoptic camera
NASA Astrophysics Data System (ADS)
Zeller, N.; Quint, F.; Stilla, U.
2014-08-01
In this article we introduce new methods for the calibration of depth images from focused plenoptic cameras and validate the results. We start with a brief description of the concept of a focused plenoptic camera and how a depth map can be estimated from the recorded raw image. For this camera, an analytical expression of the depth accuracy is derived for the first time. In the main part of the paper, methods to calibrate a focused plenoptic camera are developed and evaluated. The optical imaging process is calibrated by using a method which is already known from the calibration of traditional cameras. For the calibration of the depth map, two new model-based methods, which make use of the projection concept of the camera, are developed. These new methods are compared to a common curve fitting approach, which is based on Taylor series approximation. Both model-based methods show significant advantages compared to the curve fitting method. They need fewer reference points for calibration than the curve fitting method and, moreover, supply a function which is valid beyond the range of calibration. In addition, the depth map accuracy of the plenoptic camera was experimentally investigated for different focal lengths of the main lens and is compared to the analytical evaluation.
FOCAL PLANE WAVEFRONT SENSING USING RESIDUAL ADAPTIVE OPTICS SPECKLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Codona, Johanan L.; Kenworthy, Matthew, E-mail: jlcodona@gmail.com
2013-04-20
Optical imperfections, misalignments, aberrations, and even dust can significantly limit sensitivity in high-contrast imaging systems such as coronagraphs. An upstream deformable mirror (DM) in the pupil can be used to correct or compensate for these flaws, either to enhance the Strehl ratio or suppress the residual coronagraphic halo. Measurement of the phase and amplitude of the starlight halo at the science camera is essential for determining the DM shape that compensates for any non-common-path (NCP) wavefront errors. Using DM displacement ripples to create a series of probe and anti-halo speckles in the focal plane has been proposed for space-based coronagraphs and successfully demonstrated in the lab. We present the theory and first on-sky demonstration of a technique to measure the complex halo using the rapidly changing residual atmospheric speckles at the 6.5 m MMT telescope using the Clio mid-IR camera. The AO system's wavefront sensor measurements are used to estimate the residual wavefront, allowing us to approximately compute the rapidly evolving phase and amplitude of speckle halo. When combined with relatively short, synchronized science camera images, the complex speckle estimates can be used to interferometrically analyze the images, leading to an estimate of the static diffraction halo with NCP effects included. In an operational system, this information could be collected continuously and used to iteratively correct quasi-static NCP errors or suppress imperfect coronagraphic halos.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11351 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length, featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of Atlantis' underside thermal protection system and part of the port side cabin, including the hatch, as well as a section of the open payload bay cover. Distance between the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11320 (10 June 2007) --- This is one of a series of images, photographed with a digital still camera using an 800mm focal length, featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of Atlantis' cabin and its docking system, which a short time later was involved in linking up with the orbital outpost. Distance between the station and shuttle at this time was approximately 600 feet.
Mapping Sequence performed during the STS-118 R-Bar Pitch Maneuver
2007-08-10
ISS015-E-21340 (10 Aug. 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Endeavour as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of the commander's side or port side of Endeavour's cabin, including the hatch, as well as a section of the open payload bay cover. Distance between the station and shuttle at this time was approximately 600 feet.
NASA Technical Reports Server (NTRS)
Mach, Douglas M.; Rust, W. David
1989-01-01
The present device for lightning channel propagation-velocity determination employs eight photodetectors mounted behind precision horizontal slits in the focal plane of a photographic camera lens. The eight photodetector pulses, IRIG-B time, and slow and fast electric field-change waveforms are recorded on a 14-track analog tape recorder. A comparison of the present results with those obtained by a streaking camera shows no significant differences between the velocities obtained from the same strokes with the two systems; neither is there any difference in pulse characteristics or in the velocities calculated from them.
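To make the conversion from slit timing to propagation velocity concrete, the sketch below assumes a simple pinhole geometry in which each slit views an altitude band of width (range x slit spacing / focal length); all parameter values and pulse times are invented for illustration.

```python
import numpy as np

# Sketch of converting photodetector pulse timing into a lightning channel
# propagation velocity. Geometry: each horizontal slit in the focal plane views
# a thin altitude band; at range R a slit spacing s subtends R * s / f metres.
# All numbers are invented for illustration.

FOCAL_LENGTH_M = 0.105          # camera lens focal length (assumed)
SLIT_SPACING_M = 2.0e-3         # spacing between adjacent slits (assumed)
RANGE_M = 8.0e3                 # distance to the lightning channel (assumed)

pulse_times_s = np.array([0.0, 2.1e-6, 4.0e-6, 6.2e-6])   # one pulse per slit

height_step_m = RANGE_M * SLIT_SPACING_M / FOCAL_LENGTH_M
dt = np.diff(pulse_times_s)
velocities = height_step_m / dt
print(f"altitude band per slit: {height_step_m:.1f} m")
print("segment velocities (m/s):", np.round(velocities, -5))
```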
NASA Astrophysics Data System (ADS)
Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.
1990-10-01
Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are pushbroom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.
Image intensification; Proceedings of the Meeting, Los Angeles, CA, Jan. 17, 18, 1989
NASA Astrophysics Data System (ADS)
Csorba, Illes P.
Various papers on image intensification are presented. Individual topics discussed include: status of high-speed optical detector technologies, super second generation image intensifier, gated image intensifiers and applications, resistive-anode position-sensing photomultiplier tube operational modeling, undersea imaging and target detection with gated image intensifier tubes, image intensifier modules for use with commercially available solid state cameras, specifying the components of an intensified solid state television camera, superconducting IR focal plane arrays, one-inch TV camera tube with very high resolution capacity, CCD-Digicon detector system performance parameters, high-resolution X-ray imaging device, high-output technology microchannel plate, preconditioning of microchannel plate stacks, recent advances in small-pore microchannel plate technology, performance of long-life curved channel microchannel plates, low-noise microchannel plates, and development of a quartz envelope heater.
Observation of interaction of shock wave with gas bubble by image converter camera
NASA Astrophysics Data System (ADS)
Yoshii, M.; Tada, M.; Tsuji, T.; Isuzugawa, Kohji
1995-05-01
When a spark discharge occurs at the first focal point of a semiellipsoidal reflector located in water, a spherical shock wave is produced. A part of the wave spreads without reflecting on the reflector and is called the direct wave in this paper. Another part reflects on the semiellipsoid and converges near the second focal point; it is named the focusing wave and locally produces a high pressure. This phenomenon is applied to disintegrators of kidney stones, but there is concern that cavitation bubbles induced in the body by the expansion wave following the focusing wave may injure human tissue around the kidney stone. In this paper, in order to examine what happens when shock waves strike bubbles on human tissue, an air bubble struck by the spherical shock wave is visualized with a schlieren system and its behavior is photographed using an image converter camera. In addition, the variation of the pressure amplitude caused by the shock wave and the flow of water around the bubble is measured with a pressure probe.
NASA Astrophysics Data System (ADS)
Reverchon, Jean-Luc; Gourdel, Yves; Robo, Jean-Alexandre; Truffer, Jean-Patrick; Costard, Eric; Brault, Julien; Duboz, Jean-Yves
2017-11-01
The fast development of nitrides has given the opportunity to investigate AlGaN as a material for ultraviolet detection. Such an AlGaN-based camera presents intrinsic spectral selectivity and an extremely low dark current at room temperature. Firstly, we will present results on a focal plane array of 320x256 pixels with a pitch of 30 μm. The peak responsivity is around 280nm (solar-blind), 310nm and 360nm. These results are obtained in a standard SWIR supply chain (readout circuit, electronics). With the existing near-UV camera grown on sapphire, the short-wavelength cutoff is due to a window layer improving the material quality of the active layer. The ultimate shortest wavelength would be 200nm due to the sapphire substrate. We present here the ways to transfer the standard design of Schottky photodiodes from sapphire to a silicon substrate. We will show the capability to remove the silicon substrate and etch the window layer in order to extend the bandwidth to lower wavelengths.
Optomechanical stability design of space optical mapping camera
NASA Astrophysics Data System (ADS)
Li, Fuqiang; Cai, Weijun; Zhang, Fengqin; Li, Na; Fan, Junjie
2018-01-01
Driven by the interior orientation elements and imaging quality requirements that mapping applications place on a mapping camera, and combined with an off-axis three-mirror anastigmat (TMA) system, a high optomechanical stability design for a space optical mapping camera is introduced in this paper. The configuration is a coaxial TMA system used in an off-axis situation. Firstly, the overall optical arrangement is described and an overview of the optomechanical packaging is provided. Zerodur glass, carbon fiber composite and carbon-fiber-reinforced silicon carbide (C/SiC) are widely used in the optomechanical structure, because their low coefficients of thermal expansion (CTE) reduce the thermal sensitivity of the mirrors and focal plane. Flexible and unloading supports are used in the reflector and camera supporting structures. The use of epoxy structural adhesive for bonding optics to the metal structure is also introduced. The primary mirror is mounted by means of a three-point ball-joint flexure system attached to the back of the mirror. Then, in order to predict flexural displacements due to gravity, static finite element analysis (FEA) is performed on the primary mirror. The optical performance peak-to-valley (PV) and root-mean-square (RMS) wavefront errors are measured before and after assembly. Also, dynamic finite element analysis (FEA) of the whole optical arrangement is carried out to investigate the optomechanical performance. Finally, in order to evaluate the stability of the design, thermal vacuum and vibration tests are carried out, and the Modulation Transfer Function (MTF) and elements of interior orientation are presented as the evaluation index. Before and after the thermal vacuum and vibration tests, the MTF, focal distance and position of the principal point of the optical system are measured, and the results are as expected.
Calibration of Action Cameras for Photogrammetric Purposes
Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo
2014-01-01
The use of action cameras for photogrammetry purposes is not widespread due to the fact that, until recently, the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and the wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
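The abstract above does not give the actual calibration code, but a minimal chessboard-based self-calibration of the kind OpenCV supports might look like the following sketch. The board size, square size and frame directory are assumptions, and a strongly distorted action-camera lens may warrant OpenCV's fisheye model instead of the standard one used here.
    import glob
    import cv2
    import numpy as np

    board_size = (9, 6)      # inner corners of the assumed chessboard target
    square_size = 0.025      # assumed square size in metres

    # 3D points of the chessboard corners in the board plane (z = 0)
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

    obj_points, img_points = [], []
    for fname in glob.glob("gopro_frames/*.jpg"):    # assumed directory of extracted frames
        img = cv2.imread(fname)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)

    # Estimate intrinsics and distortion coefficients, then undistort the last frame
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    undistorted = cv2.undistort(img, K, dist)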
ATTICA family of thermal cameras in submarine applications
NASA Astrophysics Data System (ADS)
Kuerbitz, Gunther; Fritze, Joerg; Hoefft, Jens-Rainer; Ruf, Berthold
2001-10-01
Optronics Mast Systems (US: Photonics Mast Systems) are electro-optical devices which enable a submarine crew to observe the scenery above water while the boat is submerged. Unlike classical submarine periscopes they are non-hull-penetrating and therefore have no direct-viewing capability. Typically they have electro-optical cameras both for the visual and for an IR spectral band, with panoramic view and a stabilized line of sight. They can optionally be equipped with laser range-finders, antennas, etc. The brand name ATTICA (Advanced Two-dimensional Thermal Imager with CMOS-Array) characterizes a family of thermal cameras using focal-plane-array (FPA) detectors which can be tailored to a variety of requirements. The modular design of the ATTICA components allows the use of various detectors (InSb, CMT 3...5 μm, CMT 7...11 μm) for specific applications. By means of a microscanner, ATTICA cameras achieve full standard TV resolution using detectors with only 288 x 384 (US: 240 x 320) detector elements. A typical requirement for Optronics Mast Systems is a Quick-Look-Around capability. For FPA cameras this implies the need for a 'descan' module, which can be incorporated in the ATTICA cameras without complications.
Design, demonstration and testing of low F-number LWIR panoramic imaging relay optics
NASA Astrophysics Data System (ADS)
Furxhi, Orges; Frascati, Joe; Driggers, Ronald
2018-04-01
Panoramic imaging is inherently wide field of view. High-sensitivity uncooled Long Wave Infrared (LWIR) imaging requires low F-number optics. These two requirements result in short back-working-distance designs that, in addition to being costly, are challenging to integrate with commercially available uncooled LWIR cameras and cores. Common challenges include the relocation of the shutter flag, custom calibration of the camera dynamic range and NUC tables, focusing, and athermalization. Solutions to these challenges add to the system cost and make panoramic uncooled LWIR cameras commercially unattractive. In this paper, we present the design of Panoramic Imaging Relay Optics (PIRO) and show imagery and test results from one of the first prototypes. PIRO designs use several reflective surfaces (generally two) to relay a panoramic scene onto a real, donut-shaped image. The PIRO donut is imaged on the focal plane of the camera using a commercial-off-the-shelf (COTS) low F-number lens. This approach results in low component cost and effortless integration with pre-calibrated commercially available cameras and lenses.
Dense depth maps from correspondences derived from perceived motion
NASA Astrophysics Data System (ADS)
Kirby, Richard; Whitaker, Ross
2017-01-01
Many computer vision applications require finding corresponding points between images and using the corresponding points to estimate disparity. Today's correspondence finding algorithms primarily use image features or pixel intensities common between image pairs. Some 3-D computer vision applications, however, do not produce the desired results using correspondences derived from image features or pixel intensities. Two examples are the multimodal camera rig and the center region of a coaxial camera rig. We present an image correspondence finding technique that aligns pairs of image sequences using optical flow fields. The optical flow fields provide information about the structure and motion of the scene, which are not available in still images but can be used in image alignment. We apply the technique to a dual focal length stereo camera rig consisting of a visible light-infrared camera pair and to a coaxial camera rig. We test our method on real image sequences and compare our results with the state-of-the-art multimodal and structure from motion (SfM) algorithms. Our method produces more accurate depth and scene velocity reconstruction estimates than the state-of-the-art multimodal and SfM algorithms.
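As background for the approach described above, and not the authors' exact algorithm, the sketch below shows how a dense optical flow field can be computed for each camera's sequence and how a recovered horizontal disparity maps to depth under the usual pinhole/stereo relation. The Farneback parameters and the rig geometry (focal length in pixels, baseline) are assumptions.
    import cv2
    import numpy as np

    def dense_flow(prev_gray, next_gray):
        # Farneback dense optical flow: one (dx, dy) vector per pixel
        # positional arguments: flow, pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
        return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)

    def depth_from_disparity(d_pixels, f_px=1200.0, baseline_m=0.10):
        # Assumed rig geometry; once flow-field alignment yields a disparity d,
        # depth follows from Z = f * B / d (guarding against divide-by-zero).
        d = np.where(np.abs(d_pixels) < 1e-6, np.nan, d_pixels)
        return f_px * baseline_m / d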
Optical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S; Seppala, L; Gilmore, K
2008-07-16
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band, filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.
AO WFS detector developments at ESO to prepare for the E-ELT
NASA Astrophysics Data System (ADS)
Downing, Mark; Casali, Mark; Finger, Gert; Lewis, Steffan; Marchetti, Enrico; Mehrgan, Leander; Ramsay, Suzanne; Reyes, Javier
2016-07-01
ESO has a very active ongoing AO WFS detector development program that not only meets the needs of the current crop of instruments for the VLT, but also covers gathering requirements, planning, and developing detectors and controllers/cameras for the instruments in design and being proposed for the E-ELT. This paper provides an overall summary of the AO WFS detector requirements of the E-ELT instruments currently in design and of the telescope focal units. This is followed by a description of the many interesting detector, controller, and camera developments underway at ESO to meet these needs: a) the rationale behind and plan to upgrade the 240x240 pixel, 2000 fps, "zero noise", L3Vision CCD220 sensor based AONGC camera; b) the status of the LGSD/NGSD high-QE, 3e- RoN, fast 700 fps, 1760x1680 pixel, visible CMOS imager and camera development; c) the status of and development plans for the Selex SAPHIRA NIR eAPD and controller. Most of the instruments and detector/camera developments are described in more detail in other papers at this conference.
Warren, Sean C; Kim, Youngchan; Stone, James M; Mitchell, Claire; Knight, Jonathan C; Neil, Mark A A; Paterson, Carl; French, Paul M W; Dunsby, Chris
2016-09-19
This paper demonstrates multiphoton excited fluorescence imaging through a polarisation maintaining multicore fiber (PM-MCF) while the fiber is dynamically deformed using all-proximal detection. Single-shot proximal measurement of the relative optical path lengths of all the cores of the PM-MCF in double pass is achieved using a Mach-Zehnder interferometer read out by a scientific CMOS camera operating at 416 Hz. A non-linear least squares fitting procedure is then employed to determine the deformation-induced lateral shift of the excitation spot at the distal tip of the PM-MCF. An experimental validation of this approach is presented that compares the proximally measured deformation-induced lateral shift in focal spot position to an independent distally measured ground truth. The proximal measurement of deformation-induced shift in focal spot position is applied to correct for deformation-induced shifts in focal spot position during raster-scanning multiphoton excited fluorescence imaging.
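The fitting step described above can be illustrated, very loosely, by the sketch below: a linear phase-ramp (tip/tilt plus piston) model is fitted to the per-core optical path lengths, and the fitted tilt is converted to a lateral shift of the distal focal spot under a small-angle assumption. This is not the published pipeline; the core coordinates, distal focal length and data are placeholders.
    import numpy as np
    from scipy.optimize import least_squares

    core_xy = np.random.uniform(-0.5e-3, 0.5e-3, size=(120, 2))   # assumed core positions (m)
    opl = np.random.normal(0.0, 50e-9, size=120)                  # assumed measured path lengths (m)

    def residuals(p, xy, opl):
        # p = (tilt_x, tilt_y, piston): linear phase ramp plus common offset
        return (p[0] * xy[:, 0] + p[1] * xy[:, 1] + p[2]) - opl

    fit = least_squares(residuals, x0=np.zeros(3), args=(core_xy, opl))
    tilt_x, tilt_y, _ = fit.x

    f_distal = 4.5e-3                                # assumed distal effective focal length (m)
    shift = f_distal * np.array([tilt_x, tilt_y])    # small-angle estimate of spot displacement (m)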
Myers, Matthew R; Giridhar, Dushyanth
2011-06-01
In the characterization of high-intensity focused ultrasound (HIFU) systems, it is desirable to know the intensity field within a tissue phantom. Infrared (IR) thermography is a potentially useful method for inferring this intensity field from the heating pattern within the phantom. However, IR measurements require an air layer between the phantom and the camera, making inferences about the thermal field in the absence of the air complicated. For example, convection currents can arise in the air layer and distort the measurements relative to the phantom-only situation. Quantitative predictions of intensity fields based upon IR temperature data are also complicated by axial and radial diffusion of heat. In this paper, mathematical expressions are derived for use with IR temperature data acquired at times long enough that noise is a relatively small fraction of the temperature trace, but small enough that convection currents have not yet developed. The relations were applied to simulated IR data sets derived from computed pressure and temperature fields. The simulation was performed in a finite-element geometry involving a HIFU transducer sonicating upward in a phantom toward an air interface, with an IR camera mounted atop an air layer, looking down at the heated interface. It was found that, when compared to the intensity field determined directly from acoustic propagation simulations, intensity profiles could be obtained from the simulated IR temperature data with an accuracy of better than 10%, at pre-focal, focal, and post-focal locations. © 2011 Acoustical Society of America
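The paper's derived expressions are not reproduced in the abstract, but the basic idea can be illustrated with the common early-time approximation dT/dt ≈ 2·alpha·I/(rho·c), valid before diffusion and convection matter: the intensity then follows from the initial slope of the IR temperature trace. The material properties and the synthetic trace below are assumptions, not the paper's values.
    import numpy as np

    alpha = 4.0                 # assumed pressure absorption coefficient of the phantom (Np/m)
    rho, c = 1050.0, 3700.0     # assumed density (kg/m^3) and specific heat (J/kg/K)

    t = np.linspace(0.0, 0.5, 50)                               # time samples (s)
    T = 22.0 + 0.8 * t + np.random.normal(0, 0.01, t.size)      # assumed IR temperature trace (deg C)

    slope = np.polyfit(t, T, 1)[0]                  # early-time heating rate dT/dt (K/s)
    intensity = rho * c * slope / (2.0 * alpha)     # W/m^2
    print(f"estimated focal intensity ~ {intensity / 1e4:.1f} W/cm^2")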
Determination of the scale of a filar micrometer.
NASA Astrophysics Data System (ADS)
Angelo, O. S.
1987-09-01
The author analyses the procedure used to determine the scale of the micrometer fitted to the refractor (diameter 150 mm, focal length 3000 mm) of the private E. Dembowski Observatory at Polpenazze (Brescia), Italy. He believes this experience may be useful to all those who wish to take up the observation of visual double stars.
High-frame-rate infrared and visible cameras for test range instrumentation
NASA Astrophysics Data System (ADS)
Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.
1995-09-01
Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.
First results from the TOPSAT camera
NASA Astrophysics Data System (ADS)
Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve
2017-11-01
The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on the 27th October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate provision of rapid response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera" presented by RAL at the 5th ICSO in 2004 described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.
NASA Astrophysics Data System (ADS)
Lee, Kyuhang; Ko, Jinseok; Wi, Hanmin; Chung, Jinil; Seo, Hyeonjin; Jo, Jae Heung
2018-06-01
The visible TV system used in the Korea Superconducting Tokamak Advanced Research device has been equipped with a periscope to minimize the damage to its CCD pixels from neutron radiation. The periscope, more than 2.3 m in overall length, has been designed for the visible camera system with a semi-diagonal field of view as wide as 30° and an effective focal length as short as 5.57 mm. The design performance of the periscope includes a modulation transfer function greater than 0.25 at 68 cycles/mm with low distortion. The installed periscope system has delivered image quality as designed and comparable to that of its predecessor, but with a far lower probability of neutron damage to the camera.
Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.
We present the final optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well-defined internal pupil, making it possible to reduce the thermal background with a cryogenic pupil stop. A mosaic of four 2k × 2k Hawaii-2RG detectors, made by Teledyne, will give a field of view of 31.9 arcmin × 31.9 arcmin.
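As a quick consistency check on the quoted numbers, the standard plate-scale relation scale[arcsec/pixel] = 206265 · pixel_size / f_eff can be applied; the implied effective focal length below is our own back-calculation, not a figure from the paper.
    pixel_size = 18e-6                              # m, as quoted above
    plate_scale = 0.45                              # arcsec per pixel, as quoted above
    f_eff = 206265.0 * pixel_size / plate_scale     # ~8.25 m implied effective focal length
    fov_per_detector = plate_scale * 2048 / 60.0    # ~15.4 arcmin across one 2k Hawaii-2RG
    # A 2 x 2 mosaic then spans ~30.7 arcmin plus inter-detector gaps,
    # consistent with the quoted 31.9 arcmin x 31.9 arcmin field of view.
    print(f_eff, fov_per_detector)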
NASA Astrophysics Data System (ADS)
Fuh, Yiin-Kuen; Chen, Pin-Wen; Lai, Zheng-Hong
2016-07-01
Mechanically deformable lenses with dynamically tunable focal lengths have been developed in this work. The five types of aspheric polydimethylsiloxane (PDMS) lenses fabricated here have initial focal lengths of 7.0, 7.8, 9.0, 10.0 and 10.2 mm. Incorporating two modes of operation, in biconvex and concave-convex configurations, the focal lengths can be tuned dynamically over 5.2-10.2, 5.5-9.9, 6.6-11.9, 6.1-13.5 and 6.6-13.5 mm, respectively. Additive manufacturing was utilized to fabricate these five types of aspheric lenses (APLs) via sequential layering of PDMS materials. Complex structures with three-dimensional features and shorter focal lengths can be successfully produced by repeatedly depositing, inverting and curing controlled PDMS volumes onto previously cured PDMS droplets. From our experiments, we empirically found a direct dependence of the focal length of the lenses on the amount (volume) of deposited PDMS droplets. This new mouldless, low-cost, and flexible lens fabrication method is able to transform an ordinary commercial smartphone camera into a low-cost portable microscope. A few microscopic features can be readily visualized, such as the wrinkles of a ladybird pupa and a printed circuit board. The fabrication technique of successively applying hanging droplets, together with the facile mechanical focal-length-tuning set-up, can be easily adopted in the development of high-performance optical lenses.
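The volume-to-focal-length trend noted above can be illustrated with simple spherical-cap geometry, assuming the cured droplet forms a plano-convex cap of base radius a and height h on a flat surface, so that V = πh(3a² + h²)/6, R = (a² + h²)/(2h) and f ≈ R/(n − 1). The refractive index, base radius and deposited volumes below are assumptions, not the authors' process parameters.
    import numpy as np
    from scipy.optimize import brentq

    n_pdms = 1.41      # assumed refractive index of cured PDMS
    a = 2.0e-3         # assumed base (contact) radius of the droplet, m

    def focal_length(volume_m3):
        cap = lambda h: np.pi * h * (3 * a**2 + h**2) / 6.0 - volume_m3
        h = brentq(cap, 1e-6, 2 * a)        # cap height consistent with the deposited volume
        R = (a**2 + h**2) / (2.0 * h)       # radius of curvature of the cap
        return R / (n_pdms - 1.0)           # thin plano-convex lens focal length

    for vol_ul in (5.0, 10.0, 20.0):        # assumed deposited volumes in microlitres
        print(vol_ul, "uL ->", focal_length(vol_ul * 1e-9) * 1e3, "mm")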
NASA Astrophysics Data System (ADS)
Pomares, Jorge; Felicetti, Leonard; Pérez, Javier; Emami, M. Reza
2018-02-01
An image-based servo controller for the guidance of a spacecraft during non-cooperative rendezvous is presented in this paper. The controller directly utilizes the visual features from image frames of a target spacecraft for computing both attitude and orbital maneuvers concurrently. The utilization of adaptive optics, such as zooming cameras, is also addressed through developing an invariant-image servo controller. The controller allows for performing rendezvous maneuvers independently from the adjustments of the camera focal length, improving the performance and versatility of maneuvers. The stability of the proposed control scheme is proven analytically in the invariant space, and its viability is explored through numerical simulations.
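For orientation only, the classical (non-invariant) image-based visual servoing law on which such controllers build is v = −λ·L⁺·(s − s*), with L the interaction matrix of the point features; the sketch below assumes point features with estimated depths and is not the paper's invariant controller.
    import numpy as np

    def interaction_matrix(points, Z):
        # points: (N, 2) normalized image coordinates (x, y); Z: (N,) depth estimates
        rows = []
        for (x, y), z in zip(points, Z):
            rows.append([-1.0 / z, 0.0, x / z, x * y, -(1.0 + x**2), y])
            rows.append([0.0, -1.0 / z, y / z, 1.0 + y**2, -x * y, -x])
        return np.array(rows)

    def ibvs_velocity(s, s_star, Z, gain=0.5):
        e = (s - s_star).reshape(-1)             # stacked feature error (x1, y1, x2, y2, ...)
        L = interaction_matrix(s, Z)
        return -gain * np.linalg.pinv(L) @ e     # camera velocity twist (vx, vy, vz, wx, wy, wz)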
Fixed-focus camera objective for small remote sensing satellites
NASA Astrophysics Data System (ADS)
Topaz, Jeremy M.; Braun, Ofer; Freiman, Dov
1993-09-01
An athermalized objective has been designed for a compact, lightweight push-broom camera which is under development at El-Op Ltd. for use in small remote-sensing satellites. The high-performance objective has a fixed focus setting, but maintains focus passively over the full range of temperatures encountered in small satellites. The lens is an F/5.0, 320 mm focal length Tessar type, operating over the range 0.5-0.9 μm. It has a 16° field of view and accommodates various state-of-the-art silicon detector arrays. The design and performance of the objective are described in this paper.
Mapping Sequence performed during the STS-117 R-Bar Pitch Maneuver
2007-06-10
ISS015-E-11328 (10 June 2007) --- This is one of a series of images photographed with a digital still camera using an 800mm focal length featuring the different areas of the Space Shuttle Atlantis as it approached the International Space Station and performed a back-flip to accommodate close scrutiny by eyeballs and cameras. This image shows part of the commander's side or port side of Atlantis' cabin, including the hatch, as well as a section of the open payload bay cover and part of the docking system. Distance from the station and shuttle at this time was approximately 600 feet.
Investigation of the flow structure in thin polymer films using 3D µPTV enhanced by GPU
NASA Astrophysics Data System (ADS)
Cavadini, Philipp; Weinhold, Hannes; Tönsmann, Max; Chilingaryan, Suren; Kopmann, Andreas; Lewkowicz, Alexander; Miao, Chuan; Scharfer, Philip; Schabel, Wilhelm
2018-04-01
To understand the effects of inhomogeneous drying on the quality of polymer coatings, an experimental setup has been developed to resolve the flow field that occurs throughout the drying film. Deconvolution microscopy is used to analyze the flow field in 3D and time. Since the dimension of the spatial component in the direction of the line of sight is limited compared to the lateral components, a multi-focal approach is used. Here, the beam of light is distributed equally over up to five cameras using cubic beam splitters. Adding a meniscus lens between each pair of camera and beam splitter and setting different distances between each camera and its meniscus lens creates multi-focality and allows one to increase the depth of the observed volume. Resolving the spatial component in the line-of-sight direction is based on analyzing the point spread function (PSF). The analysis of the PSF is computationally expensive and introduces a high complexity compared to traditional particle image velocimetry approaches. A new algorithm tailored to the parallel computing architecture of recent graphics processing units has been developed. The algorithm is able to process typical images in less than a second and has further potential to enable online analysis in the future. As a proof of principle, the flow fields occurring in thin polymer solutions drying at ambient conditions and at boundary conditions that force inhomogeneous drying are presented.
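One simple way to picture the PSF-based depth estimation mentioned above (not the authors' GPU implementation) is to fit each particle image with a Gaussian and map the fitted width to depth through a calibration curve; the linear calibration below is purely an assumed placeholder.
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss2d(coords, amp, x0, y0, sigma, offset):
        x, y = coords
        return amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2)) + offset

    def spot_sigma(patch):
        # Fit a symmetric 2D Gaussian to a small image patch containing one particle
        ys, xs = np.indices(patch.shape)
        p0 = (patch.max() - patch.min(), patch.shape[1] / 2, patch.shape[0] / 2, 2.0, patch.min())
        popt, _ = curve_fit(gauss2d, (xs.ravel(), ys.ravel()), patch.ravel(), p0=p0)
        return abs(popt[3])

    def sigma_to_depth(sigma, calib=(0.0, 12.5)):
        # Assumed linear calibration z = a + b * sigma, in micrometres; a real setup
        # would calibrate this curve per camera of the multi-focal stack.
        return calib[0] + calib[1] * sigma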
Fine Guidance Sensing for Coronagraphic Observatories
NASA Technical Reports Server (NTRS)
Brugarolas, Paul; Alexander, James W.; Trauger, John T.; Moody, Dwight C.
2011-01-01
Three options have been developed for Fine Guidance Sensing (FGS) for coronagraphic observatories using a Fine Guidance Camera within a coronagraphic instrument. Coronagraphic observatories require very fine precision pointing in order to image faint objects at very small distances from a target star. The Fine Guidance Camera measures the direction to the target star. The first option, referred to as Spot, was to collect all of the light reflected from a coronagraph occulter onto a focal plane, producing an Airy-type point spread function (PSF). This would allow almost all of the starlight from the central star to be used for centroiding. The second approach, referred to as Punctured Disk, collects the light that bypasses a central obscuration, producing a PSF with a punctured central disk. The final approach, referred to as Lyot, collects light after passing through the occulter at the Lyot stop. The study includes generation of representative images for each option by the science team, followed by an engineering evaluation of a centroiding or a photometric algorithm for each option. After the alignment of the coronagraph to the fine guidance system, a "nulling" point on the FGS focal plane is determined by calibration. This alignment is implemented by a fine alignment mechanism that is part of the fine guidance camera selection mirror. If the star images meet the modeling assumptions, and the star "centroid" can be driven to that nulling point, the contrast for the coronagraph will be maximized.
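As one simple example of the kind of centroiding algorithm evaluated for each option, an intensity-weighted centroid can be computed as below; the thresholding and the use of the centroid-to-nulling-point offset as the guidance error are assumptions about a typical implementation, not the study's specific algorithm.
    import numpy as np

    def centroid(image, threshold=0.0):
        # Intensity-weighted (center-of-mass) centroid after background suppression
        img = np.clip(image - threshold, 0.0, None)
        total = img.sum()
        ys, xs = np.indices(img.shape)
        return (xs * img).sum() / total, (ys * img).sum() / total   # (x, y) in pixels

    # The guidance error is then the offset of this centroid from the calibrated nulling point:
    # err_x, err_y = centroid(frame)[0] - x_null, centroid(frame)[1] - y_null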
KWFC: four square degrees camera for the Kiso Schmidt Telescope
NASA Astrophysics Data System (ADS)
Sako, Shigeyuki; Aoki, Tsutomu; Doi, Mamoru; Ienaka, Nobuyuki; Kobayashi, Naoto; Matsunaga, Noriyuki; Mito, Hiroyuki; Miyata, Takashi; Morokuma, Tomoki; Nakada, Yoshikazu; Soyano, Takao; Tarusawa, Ken'ichi; Miyazaki, Satoshi; Nakata, Fumiaki; Okada, Norio; Sarugaku, Yuki; Richmond, Michael W.
2012-09-01
The Kiso Wide Field Camera (KWFC) is a facility instrument for the 105-cm Schmidt telescope being operated by the Kiso Observatory of the University of Tokyo. This camera has been designed for wide-field observations by taking advantage of a large focal-plane area of the Schmidt telescope. Eight CCD chips with a total of 8k x 8k pixels cover a field-of-view of 2.2 degrees x 2.2 degrees on the sky. The dewar window works as a field flattener lens minimizing an image distortion across the field of view. Two shutter plates moving in parallel achieve uniform exposures on all the CCD pixels. The KWFC is equipped with a filter exchanger composed of an industrial robotic arm, a filter magazine capable of storing 12 filters, and a filter holder at the focal plane. Both the arm and the magazine are installed inside the tube framework of the telescope but without vignetting the beam. Wide-field survey programs searching for supernovae and late-type variable stars have begun in April 2012. The survey observations are performed with a management software system for facility instruments including the telescope and the KWFC. This system automatically carries out observations based on target lists registered in advance and makes appropriate decisions for implementation of observations by referring to weather conditions and status of the instruments. Image data obtained in the surveys are processed with pipeline software in real time to search for candidates of time-variable sources.
Data filtering with support vector machines in geometric camera calibration.
Ergun, B; Kavzoglu, T; Colkesen, I; Sahin, C
2010-02-01
The use of non-metric digital cameras in close-range photogrammetric applications and machine vision has become a popular research agenda. Being an essential component of photogrammetric evaluation, camera calibration is a crucial stage for non-metric cameras. Therefore, accurate camera calibration and orientation procedures have become prerequisites for the extraction of precise and reliable 3D metric information from images. The lack of accurate inner orientation parameters can lead to unreliable results in the photogrammetric process. A camera can be well defined by its principal distance, principal point offset and lens distortion parameters. Different camera models have been formulated and used in close-range photogrammetry, but generally sensor orientation and calibration are performed with a perspective geometrical model by means of the bundle adjustment. In this study, support vector machines (SVMs) with a radial basis function kernel are employed to model the distortions measured for an Olympus E10 camera system with an aspherical zoom lens, which are later used in the geometric calibration process. The intention is to introduce an alternative approach for the on-the-job photogrammetric calibration stage. Experimental results for the DSLR camera with three focal length settings (9, 18 and 36 mm) were estimated using bundle adjustment with additional parameters, and analyses were conducted based on object point discrepancies and standard errors. Results show the robustness of the SVM approach for the correction of image coordinates by modelling total distortions in the on-the-job calibration process using a limited number of images.
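A hedged sketch of the general idea (not the authors' exact pipeline): an RBF-kernel support vector regressor is trained to predict the distortion correction at each image position, here on synthetic radial-distortion residuals; data shapes and hyperparameters are assumptions.
    import numpy as np
    from sklearn.svm import SVR

    xy = np.random.uniform(-1.0, 1.0, size=(500, 2))              # normalized image coordinates
    r2 = (xy**2).sum(axis=1)
    dx = 0.05 * xy[:, 0] * r2 + np.random.normal(0, 1e-3, 500)    # synthetic x-distortion residuals

    # One regressor per coordinate component; a second SVR would model the y-residuals.
    svr_x = SVR(kernel="rbf", C=10.0, epsilon=1e-3).fit(xy, dx)
    corrected_x = xy[:, 0] - svr_x.predict(xy)                    # apply the learned correction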
Miniature infrared hyperspectral imaging sensor for airborne applications
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl
2017-05-01
Pacific Advanced Technology (PAT) has developed infrared hyperspectral cameras, both MWIR and LWIR, small enough to serve as payloads on miniature unmanned aerial vehicles. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics are made up of an area array of diffractive optical elements where each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our optomechanical design approach, which results in an infrared hyperspectral imaging system that is small enough to serve as a payload on a mini-UAV or commercial quadcopter. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.
Infrared hyperspectral imaging miniaturized for UAV applications
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl
2017-02-01
Pacific Advanced Technology (PAT) has developed infrared hyperspectral cameras, both MWIR and LWIR, small enough to serve as payloads on miniature unmanned aerial vehicles. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics are made up of an area array of diffractive optical elements where each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our optomechanical design approach, which results in an infrared hyperspectral imaging system that is small enough to serve as a payload on a mini-UAV or commercial quadcopter. We also present an example of how this technology can easily be used to quantify a hydrocarbon gas leak's volume and mass flow rates. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.
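The bookkeeping behind the lenslet-array figures quoted in both abstracts is simple: an N x N lenslet array on an M x M focal plane yields N² simultaneous spectral images of (M/N) x (M/N) pixels each, as the small sketch below shows.
    def spectral_layout(fpa_pixels, lenslets_per_side):
        # Number of simultaneous spectral images and the per-image pixel format
        sub = fpa_pixels // lenslets_per_side
        return lenslets_per_side**2, (sub, sub)

    print(spectral_layout(512, 2))     # -> (4, (256, 256)), matching the 2 x 2 example
    print(spectral_layout(1024, 4))    # -> (16, (256, 256)), matching the 4 x 4 example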
NASA Astrophysics Data System (ADS)
Lachaine, Remi
Surgeons have been generating bubbles in the human body with laser irradiation for several decades. They use these bubbles as tiny scalpels, allowing them to make precise, localized incisions. One application of this surgical tool is cell perforation. Instead of using a needle to perforate the cell membrane, laser pulses can be focused at the surface of a cell, forming a plasma at the laser focal point and generating a bubble that perforates the cell membrane. However, this process is rather slow, and massive in-vivo cell perforation is not feasible this way. To speed up the process, plasmonic nanoparticles can be used. These act as nano-antennas that concentrate light on a nanometric scale. The possibility of irradiating a large number of nanoparticles simultaneously has given new momentum to bubble generation as a cell perforation tool. The use of nanoparticles in a biomedical context nevertheless carries certain risks. In particular, nanoparticle fragmentation can increase the toxicity of the treatment. Ideally, one would prefer nanoparticles that are not damaged by the laser irradiation. The goal of this thesis is to develop a method for engineering robust nanoparticles that allow efficient bubble generation for biomedical purposes. It is first demonstrated experimentally that plasma formation is indeed the main physical mechanism leading to bubble generation during infrared (800 nm wavelength), ultrafast (pulse durations between 45 fs and 1 ps) irradiation of 100 nm gold nanoparticles. To carry out this demonstration, a pump-probe method for detecting bubbles of about 1 μm was developed. This method revealed an 18% difference in the size of the bubbles generated with linearly polarized irradiation compared to circular polarization when the pulse duration was below one picosecond. For longer pulses, the bubble sizes are shown to be independent of the polarization of the incident pulses. This particular behavior agrees with theoretical predictions that include non-linear plasma formation and cannot be explained by considering only the absorption of the particles. Next, a method for designing robust nanoparticles for bubble generation is developed. This method is based on the optical properties of the nanostructures.
The Dark Energy Survey Data Release 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, T.M.C.; et al.
We describe the first public data release of the Dark Energy Survey, DES DR1, consisting of reduced single epoch images, coadded images, coadded source catalogs, and associated products and services assembled over the first three years of DES science operations. DES DR1 is based on optical/near-infrared imaging from 345 distinct nights (August 2013 to February 2016) by the Dark Energy Camera mounted on the 4-m Blanco telescope at Cerro Tololo Inter-American Observatory in Chile. We release data from the DES wide-area survey covering ~5,000 sq. deg. of the southern Galactic cap in five broad photometric bands, grizY. DES DR1 has a median delivered point-spread function of g = 1.12, r = 0.96, i = 0.88, z = 0.84, and Y = 0.90 arcsec FWHM, a photometric precision of < 1% in all bands, and an astrometric precision of 151 mas. The median coadded catalog depth for a 1.95" diameter aperture at S/N = 10 is g = 24.33, r = 24.08, i = 23.44, z = 22.69, and Y = 21.44 mag. DES DR1 includes nearly 400M distinct astronomical objects detected in ~10,000 coadd tiles of size 0.534 sq. deg. produced from ~39,000 individual exposures. Benchmark galaxy and stellar samples contain ~310M and ~80M objects, respectively, following a basic object quality selection. These data are accessible through a range of interfaces, including query web clients, image cutout servers, jupyter notebooks, and an interactive coadd image visualization tool. DES DR1 constitutes the largest photometric data set to date at the achieved depth and photometric precision.
Shake, Rattle and Roll: James Webb Telescope Components Pass Tests
NASA Technical Reports Server (NTRS)
2008-01-01
This image shows a model of one of three detectors for the Mid-Infrared Instrument on NASA's upcoming James Webb Space Telescope. The detector, which looks green in this picture, and is similar to the charge-coupled devices, or 'CCDs,' in digital cameras, is housed in the brick-like unit shown here, called a focal plane module.
NASA Astrophysics Data System (ADS)
Mazin, Ben
2014-07-01
Microwave Kinetic Inductance Detectors (MKIDs) are single photon counting, energy resolving detectors applicable across the UVOIR. The first MKID instrument, ARCONS, has been taking data on the Palomar 200" for several years, and we have recently published the first papers using ARCONS data. There are currently two UVOIR MKID instruments fully funded and under construction for planet hunting, DARKNESS for the Palomar P1640 coronagraph, and MEC for Subaru's SCExAO. There are significant opportunities available in pairing MKIDs with TMT. MKIDs can serve as a combined science camera and fast focal plane speckle sensor, allowing rapid feedback to cancel atmospheric speckles. A MKID-based TMT Planet Imager (potentially just a visiting SCExAO+MEC) could discover and take spectra of planets in the habitable zones of nearby M dwarfs, potentially discovering life by looking at spectral signatures in their atmospheres. Another promising application is using the outer part of the focal plane that is ignored by NFIRAOS for a large MKID array. This instrument would serve as a serendipitous camera, providing imaging and spectroscopy for galaxies that would rotate through the field during the normal use of IRIS and IRMOS. This "free" 30-m time would yield a very deep imaging catalog with R~30 spectroscopy.
A Three-Line Stereo Camera Concept for Planetary Exploration
NASA Technical Reports Server (NTRS)
Sandau, Rainer; Hilbert, Stefan; Venus, Holger; Walter, Ingo; Fang, Wai-Chi; Alkalai, Leon
1997-01-01
This paper presents a low-weight stereo camera concept for planetary exploration. The camera uses three CCD lines within the image plane of one single objective. Some of the main features of the camera include: focal length 90 mm, FOV 18.5 deg, IFOV 78 μrad, convergence angles ±10 deg, radiometric dynamics 14 bit, weight 2 kg, and power consumption 12.5 Watts. From an orbit altitude of 250 km the ground pixel size is 20 m x 20 m and the swath width is 82 km. The CCD line data is buffered in the camera internal mass memory of 1 Gbit. After performing radiometric correction and application-dependent preprocessing, the data is compressed and ready for downlink. Due to the aggressive application of advanced technologies in the area of microelectronics and innovative optics, the low mass and power budgets of 2 kg and 12.5 Watts are achieved, while still maintaining high performance. The design of the proposed light-weight camera is also general-purpose enough to be applicable to other planetary missions such as the exploration of Mars, Mercury, and the Moon. Moreover, it is an example of excellent international collaboration on advanced technology concepts developed at DLR, Germany, and NASA's Jet Propulsion Laboratory, USA.
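The quoted geometry is self-consistent under the usual small-angle relations, ground pixel ≈ IFOV × altitude and swath ≈ 2 × altitude × tan(FOV/2), as the quick check below shows.
    import math

    ifov = 78e-6              # rad
    fov = math.radians(18.5)  # full field of view
    h = 250e3                 # orbit altitude, m

    ground_pixel = ifov * h                    # ~19.5 m, consistent with the quoted 20 m
    swath = 2.0 * h * math.tan(fov / 2.0)      # ~81 km, consistent with the quoted 82 km
    print(ground_pixel, swath / 1e3)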
Improved calibration-based non-uniformity correction method for uncooled infrared camera
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao
2017-08-01
With the latest improvements of microbolometer focal plane arrays (FPAs), uncooled infrared (IR) cameras are becoming the most widely used devices in thermography, especially in handheld devices. However, the influence of changing ambient conditions and the non-uniform response of the sensors make it more difficult to correct the non-uniformity of an uncooled infrared camera. In this paper, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera, a novel model is proposed for calibration-based non-uniformity correction (NUC). In this model, we introduce the FPA temperature, together with the responses of the microbolometers under different ambient temperatures, to calculate the correction parameters. Based on the proposed model, we can work out the correction parameters from calibration measurements under controlled ambient conditions with a uniform blackbody. All correction parameters can be determined after the calibration process and then used to correct the non-uniformity of the infrared camera in real time. This paper presents the details of the compensation procedure and the performance of the proposed calibration-based non-uniformity correction method. Our method was evaluated on realistic IR images obtained by a 384x288 pixel uncooled long wave infrared (LWIR) camera operated under changing ambient conditions. The results show that our method can exclude the influence caused by the changing ambient conditions and ensure that the infrared camera has stable performance.
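For background only, the generic two-point (gain/offset) non-uniformity correction that calibration-based schemes start from is sketched below; the paper's model additionally folds in the FPA temperature, which this sketch does not attempt to reproduce.
    import numpy as np

    def two_point_nuc(raw, cold_ref, hot_ref, t_cold, t_hot):
        # cold_ref / hot_ref: mean raw frames viewing uniform blackbodies at t_cold / t_hot
        gain = (t_hot - t_cold) / (hot_ref - cold_ref)   # per-pixel responsivity correction
        offset = t_cold - gain * cold_ref                # per-pixel offset
        return gain * raw + offset                       # corrected frame in blackbody-temperature units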
NASA Astrophysics Data System (ADS)
Luquet, Ph.; Brouard, L.; Chinal, E.
2017-11-01
Astrium has developed a product line of compact and versatile instruments for HR and VHR missions in Earth Observation. These cameras consist of a Silicon Carbide Korsch-type telescope, a focal plane with one or several retina modules - including five-line CCDs, optical filters and front-end electronics - and the instrument main electronics. Several versions have been developed, with telescope pupil diameters from 200 mm up to 650 mm, covering a large range of GSD (from 2.5 m down to sub-metric) and swath (from 10 km up to 30 km) and compatible with different types of platform. Nine cameras have already been manufactured for five different programs: ALSAT2 (Algeria), SSOT (Chile), SPOT6 & SPOT7 (France), KRS (Kazakhstan) and VNREDSat (Vietnam). Two of them have already been launched and are delivering high quality images.
CANICA: The Cananea Near-Infrared Camera at the 2.1 m OAGH Telescope
NASA Astrophysics Data System (ADS)
Carrasco, L.; Hernández Utrera, O.; Vázquez, S.; Mayya, Y. D.; Carrasco, E.; Pedraza, J.; Castillo-Domínguez, E.; Escobedo, G.; Devaraj, R.; Luna, A.
2017-10-01
The Cananea near-infrared camera (CANICA) is an instrument commissioned at the 2.12 m telescope of the Guillermo Haro Astrophysical Observatory (OAGH) located in Cananea, Sonora, México. CANICA operates in the near-infrared in multiple bands, including the J (1.24 μm), H (1.63 μm) and K' (2.12 μm) broad bands. CANICA is located at the Ritchey-Chrétien focal plane of the telescope, reimaging the f/12 beam into an f/6 beam. The detector is a 1024 × 1024 HgCdTe HAWAII array with 18.5 μm pixel size, covering a field of view of 5.5 × 5.5 arcmin2, for a plate scale of 0.32 arcsec/pixel. The camera is enclosed in a cryostat cooled with liquid nitrogen to 77 K. The cryostat contains the collimator, two 15-position filter wheels, single fixed reimaging optics and the detector.
Cryogenic solid Schmidt camera as a base for future wide-field IR systems
NASA Astrophysics Data System (ADS)
Yudin, Alexey N.
2011-11-01
This work studies the capability of a solid Schmidt camera to serve as a wide-field infrared lens for an aircraft system with whole-sphere coverage, working in the 8-14 μm spectral range and coupled with a spherical focal array of megapixel class. Designs of a 16 mm f/0.2 lens with 60 and 90 degree sensor diagonals are presented, and their image quality is compared with a conventional solid design. An achromatic design with significantly improved performance, containing an enclosed soft correcting lens behind the protective front lens, is proposed. One of the main goals of the work is to estimate the benefits of curved detector arrays in 8-14 μm spectral range wide-field systems. Coupling of the photodetector with the solid Schmidt camera by means of frustrated total internal reflection is considered, with the corresponding tolerance analysis. The whole lens, except the front element, is considered to be cryogenic, with the solid Schmidt unit to be flown by hydrogen for improvement of bulk transmission.
The CAOS camera platform: ushering in a paradigm change in extreme dynamic range imager design
NASA Astrophysics Data System (ADS)
Riza, Nabeel A.
2017-02-01
Multi-pixel imaging devices such as CCD, CMOS and Focal Plane Array (FPA) photo-sensors dominate the imaging world. These Photo-Detector Array (PDA) devices certainly have their merits including increasingly high pixel counts and shrinking pixel sizes, nevertheless, they are also being hampered by limitations in instantaneous dynamic range, inter-pixel crosstalk, quantum full well capacity, signal-to-noise ratio, sensitivity, spectral flexibility, and in some cases, imager response time. Recently invented is the Coded Access Optical Sensor (CAOS) Camera platform that works in unison with current Photo-Detector Array (PDA) technology to counter fundamental limitations of PDA-based imagers while providing high enough imaging spatial resolution and pixel counts. Using for example the Texas Instruments (TI) Digital Micromirror Device (DMD) to engineer the CAOS camera platform, ushered in is a paradigm change in advanced imager design, particularly for extreme dynamic range applications.
Camera sensor arrangement for crop/weed detection accuracy in agronomic images.
Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo
2013-04-02
In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning in the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination, existing in outdoor environments, is also an important factor affecting the image accuracy. This paper is exclusively focused on two main issues, always with the goal to achieve the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters and (b) design of strategies for controlling the adverse illumination effects.
Initial astronomical results with a new 5-14 micron Si:Ga 58x62 DRO array camera
NASA Technical Reports Server (NTRS)
Gezari, Dan; Folz, Walter; Woods, Larry
1989-01-01
A new array camera system was developed using a 58 x 62 pixel Si:Ga (gallium doped silicon) DRO (direct readout) photoconductor array detector manufactured by Hughes/Santa Barbara Research Center (SBRC). The camera system is a broad band photometer designed for 5 to 14 micron imaging with large ground-based optical telescopes. In a typical application a 10 micron photon flux of about 10^9 photons s^-1 m^-2 micron^-1 arcsec^-2 is incident in the telescope focal plane, while the detector well capacity of these arrays is 10^5 to 10^6 electrons. However, when the real efficiencies and operating conditions are accounted for, the 2-channel 3596 pixel array operates with about 1/2 full wells at 10 microns and 10% bandwidth with high duty cycle and no real experimental compromises.
A solid state lightning propagation speed sensor
NASA Technical Reports Server (NTRS)
Mach, Douglas M.; Rust, W. David
1989-01-01
A device to measure the propagation speeds of cloud-to-ground lightning has been developed. The lightning propagation speed (LPS) device consists of eight solid-state silicon photodetectors mounted behind precision horizontal slits in the focal plane of a 50-mm lens on a 35-mm camera. Although the LPS device produces results similar to those obtained from a streaking camera, the LPS device has the advantages of smaller size, lower cost, mobile use, and easier data collection and analysis. The maximum accuracy for the LPS is 0.2 microseconds, compared with about 0.8 microseconds for the streaking camera. It is found that the return stroke propagation speed for triggered lightning is different from that for natural lightning if measurements are taken over channel segments less than 500 m. It is suggested that there are no significant differences between the propagation speeds of positive and negative flashes. Also, differences between natural and triggered dart leaders are discussed.
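A back-of-the-envelope illustration of how slit timing turns into a propagation speed: each slit images a horizontal slice of the channel, so the height difference between slices divided by the arrival-time difference of the light pulses gives the speed along the channel. The slit spacing, range and delay below are assumptions for illustration, not measured values from the paper.
    focal_length = 0.050       # m (50-mm lens, as above)
    slit_spacing = 0.002       # m between adjacent slits in the focal plane (assumed)
    channel_range = 5000.0     # m, distance to the lightning channel (assumed)

    # Vertical extent of channel seen between adjacent slits, by similar triangles
    segment_height = slit_spacing / focal_length * channel_range   # ~200 m of channel
    dt = 2.0e-6                                                     # s, assumed inter-slit delay
    speed = segment_height / dt                                     # ~1e8 m/s, typical return-stroke order
    print(speed)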
Comparison of parameters of modern cooled and uncooled thermal cameras
NASA Astrophysics Data System (ADS)
Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał
2017-10-01
When designing a system employing thermal cameras, one always faces the problem of choosing the camera types best suited for the task. In many cases such a choice is far from optimal, and there are several reasons for that. System designers often favor tried and tested solutions they are used to. They do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice rather than on facts. The paper presents the results of measurements of basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive for the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards. However, the camera settings were not optimized for specific test conditions or parameter measurements. Instead, the real settings used in normal camera operation were applied to obtain realistic camera performance figures. For example, there were significant differences between measured values of noise parameters and the catalogue data provided by manufacturers, due to the application of edge detection filters to increase detection and recognition ranges. The purpose of this paper is to provide help in choosing the optimal thermal camera for a particular application, answering the question whether to opt for a cheaper microbolometer device or to apply a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing of thermal camera systems with both cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best performing devices were selected.
Strategic options towards an affordable high-performance infrared camera
NASA Astrophysics Data System (ADS)
Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.
2016-05-01
The promise of infrared (IR) imaging attaining low cost akin to the success of CMOS sensors has been hampered by the inability to achieve the cost advantages necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm, despite well documented advantages. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640x512 pixel InGaAs uncooled system with high sensitivity and low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second (FPS)) at full resolution, and low power consumption (<1 W) in a compact system. This camera paves the way towards mass market adoption by not only demonstrating the high-performance IR imaging capability demanded by military and industrial applications, but also illuminating a path towards the justifiable price points essential for consumer-facing industries such as automotive, medical, and security imaging. Among the strategic options presented are new sensor manufacturing technologies that scale favorably towards automation, multi-focal-plane-array compatible readout electronics, and dense or ultra-small pixel pitch devices.
In vitro near-infrared imaging of occlusal dental caries using a germanium-enhanced CMOS camera
NASA Astrophysics Data System (ADS)
Lee, Chulsung; Darling, Cynthia L.; Fried, Daniel
2010-02-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
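A simple contrast metric of the kind used to compare sound and carious regions in NIR transillumination images is sketched below; the exact definition and the ROI selection are assumptions, since the abstract does not spell them out. In transillumination the lesion scatters light and appears darker than sound enamel.
    import numpy as np

    def lesion_contrast(image, lesion_mask, sound_mask):
        # Mean intensities over manually or automatically selected regions of interest
        i_lesion = image[lesion_mask].mean()
        i_sound = image[sound_mask].mean()
        return (i_sound - i_lesion) / i_sound   # larger values for darker (more severe) lesions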
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, T. S.; DePoy, D. L.; Marshall, J. L.
Here, we report that meeting the science goals for many current and future ground-based optical large-area sky surveys requires that the calibrated broadband photometry is both stable in time and uniform over the sky to 1% precision or better. Past and current surveys have achieved photometric precision of 1%–2% by calibrating the survey's stellar photometry with repeated measurements of a large number of stars observed in multiple epochs. The calibration techniques employed by these surveys only consider the relative frame-by-frame photometric zeropoint offset and the focal plane position-dependent illumination corrections, which are independent of the source color. However, variations in the wavelength dependence of the atmospheric transmission and the instrumental throughput induce source color-dependent systematic errors. These systematic errors must also be considered to achieve the most precise photometric measurements. In this paper, we examine such systematic chromatic errors (SCEs) using photometry from the Dark Energy Survey (DES) as an example. We first define a natural magnitude system for DES and calculate the systematic errors on stellar magnitudes when the atmospheric transmission and instrumental throughput deviate from the natural system. We conclude that the SCEs caused by the change of airmass in each exposure, the change of the precipitable water vapor and aerosol in the atmosphere over time, and the non-uniformity of instrumental throughput over the focal plane can be up to 2% in some bandpasses. We then compare the calculated SCEs with the observed DES data. For the test sample data, we correct these errors using measurements of the atmospheric transmission and instrumental throughput from auxiliary calibration systems. In conclusion, the residual after correction is less than 0.3%. Moreover, we calculate such SCEs for Type Ia supernovae and elliptical galaxies and find that the chromatic errors for non-stellar objects are redshift-dependent and can be larger than those for stars at certain redshifts.
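A hedged sketch of the kind of synthetic-photometry calculation behind such estimates: the chromatic offset for a given source is the difference between magnitudes synthesized with the natural-system throughput and with a perturbed (atmosphere plus instrument) throughput. The SED and throughput curves below are placeholders, not DES data, and the zeropoint is arbitrary since only the difference matters.
    import numpy as np

    def synth_mag(wave_nm, flux, throughput):
        # Photon-counting synthetic magnitude with an arbitrary zeropoint
        num = np.trapz(flux * throughput * wave_nm, wave_nm)
        den = np.trapz(throughput * wave_nm, wave_nm)
        return -2.5 * np.log10(num / den)

    wave = np.linspace(400.0, 1000.0, 601)                   # nm
    sed = wave**-2                                           # placeholder stellar-like SED
    natural = np.exp(-0.5 * ((wave - 640.0) / 60.0)**2)      # placeholder natural r-band throughput
    perturbed = natural * (1.0 - 2e-4 * (wave - 640.0))      # mild wavelength-dependent perturbation

    sce_mmag = 1000.0 * (synth_mag(wave, sed, perturbed) - synth_mag(wave, sed, natural))
    print(f"systematic chromatic offset ~ {sce_mmag:.1f} mmag for this source colour")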
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, T. S.; DePoy, D. L.; Marshall, J. L.
Meeting the science goals for many current and future ground-based optical large-area sky surveys requires that the calibrated broadband photometry is both stable in time and uniform over the sky to 1% precision or better. Past and current surveys have achieved photometric precision of 1%–2% by calibrating the survey’s stellar photometry with repeated measurements of a large number of stars observed in multiple epochs. The calibration techniques employed by these surveys only consider the relative frame-by-frame photometric zeropoint offset and the focal plane position-dependent illumination corrections, which are independent of the source color. However, variations in the wavelength dependence ofmore » the atmospheric transmission and the instrumental throughput induce source color-dependent systematic errors. These systematic errors must also be considered to achieve the most precise photometric measurements. In this paper, we examine such systematic chromatic errors (SCEs) using photometry from the Dark Energy Survey (DES) as an example. We first define a natural magnitude system for DES and calculate the systematic errors on stellar magnitudes when the atmospheric transmission and instrumental throughput deviate from the natural system. We conclude that the SCEs caused by the change of airmass in each exposure, the change of the precipitable water vapor and aerosol in the atmosphere over time, and the non-uniformity of instrumental throughput over the focal plane can be up to 2% in some bandpasses. We then compare the calculated SCEs with the observed DES data. For the test sample data, we correct these errors using measurements of the atmospheric transmission and instrumental throughput from auxiliary calibration systems. The residual after correction is less than 0.3%. Moreover, we calculate such SCEs for Type Ia supernovae and elliptical galaxies and find that the chromatic errors for non-stellar objects are redshift-dependent and can be larger than those for stars at certain redshifts.« less
Li, T. S.; DePoy, D. L.; Marshall, J. L.; ...
2016-06-01
Here, we report that meeting the science goals for many current and future ground-based optical large-area sky surveys requires that the calibrated broadband photometry is both stable in time and uniform over the sky to 1% precision or better. Past and current surveys have achieved photometric precision of 1%–2% by calibrating the survey's stellar photometry with repeated measurements of a large number of stars observed in multiple epochs. The calibration techniques employed by these surveys only consider the relative frame-by-frame photometric zeropoint offset and the focal plane position-dependent illumination corrections, which are independent of the source color. However, variations in the wavelength dependence of the atmospheric transmission and the instrumental throughput induce source color-dependent systematic errors. These systematic errors must also be considered to achieve the most precise photometric measurements. In this paper, we examine such systematic chromatic errors (SCEs) using photometry from the Dark Energy Survey (DES) as an example. We first define a natural magnitude system for DES and calculate the systematic errors on stellar magnitudes when the atmospheric transmission and instrumental throughput deviate from the natural system. We conclude that the SCEs caused by the change of airmass in each exposure, the change of the precipitable water vapor and aerosol in the atmosphere over time, and the non-uniformity of instrumental throughput over the focal plane can be up to 2% in some bandpasses. We then compare the calculated SCEs with the observed DES data. For the test sample data, we correct these errors using measurements of the atmospheric transmission and instrumental throughput from auxiliary calibration systems. In conclusion, the residual after correction is less than 0.3%. Moreover, we calculate such SCEs for Type Ia supernovae and elliptical galaxies and find that the chromatic errors for non-stellar objects are redshift-dependent and can be larger than those for stars at certain redshifts.
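As a rough illustration of how a systematic chromatic error can be quantified, the sketch below computes synthetic magnitudes of a source spectrum through two passbands and takes their difference; the photon-counting convention, function names, and inputs are assumptions and do not reproduce the DES calibration pipeline.

```python
import numpy as np

def synthetic_mag(wave, f_lambda, throughput):
    """Synthetic broadband magnitude of a source spectrum f_lambda through a
    passband 'throughput' defined on the same wavelength grid.  Photon-counting
    convention; the zero point is omitted because only magnitude *differences*
    matter for chromatic errors."""
    num = np.trapz(f_lambda * throughput * wave, wave)
    den = np.trapz(throughput * wave, wave)
    return -2.5 * np.log10(num / den)

def chromatic_error(wave, f_lambda, observed_band, natural_band):
    """How much the magnitude measured through the as-observed passband
    deviates from the natural-system passband for this particular spectrum."""
    return (synthetic_mag(wave, f_lambda, observed_band)
            - synthetic_mag(wave, f_lambda, natural_band))
```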
Li, Jin; Liu, Zilong
2017-07-24
Remote sensing cameras in the visible/near-infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e., image quality here, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself, including its optical system, image sensor, and electronics, limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which depends only on the camera itself and is stable and invariant to changes in ground targets, atmosphere, and environment whether on orbit or on the ground, is extracted using a pixel optical focal-plane (PFP). The PFP produces multiple spatial frequency targets, which are used to calculate the IMTF at different frequencies. The resulting IMTF, in combination with a constrained least-squares filter, is then used to compensate for the camera's transfer function, i.e., to remove the imaging degradation imposed by the camera itself. This method is experimentally confirmed. Experiments on an on-orbit panchromatic camera indicate that the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56 compared to the case in which the IMTF is not used. This opens a door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
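The constrained least-squares filter mentioned above is a standard frequency-domain restoration technique; the sketch below shows one common textbook form with a Laplacian smoothness constraint. The measured on-orbit IMTF itself is not reproduced here, and the array layout and regularization weight are assumptions.

```python
import numpy as np

def cls_restore(blurred, mtf, gamma=0.01):
    """Constrained least-squares restoration in the frequency domain.
    'mtf' is the real-valued system transfer function sampled on the same grid
    as np.fft.fft2 of the image (DC at element [0, 0]); 'gamma' weights a
    Laplacian smoothness constraint.  Illustrative only -- the paper's IMTF is
    measured on orbit from a pixel optical focal-plane target, not shown here."""
    h, w = blurred.shape
    G = np.fft.fft2(blurred)
    H = mtf.astype(complex)
    # Frequency response of the discrete Laplacian used as the constraint.
    lap = np.zeros((h, w))
    lap[0, 0], lap[0, 1], lap[1, 0], lap[0, -1], lap[-1, 0] = -4, 1, 1, 1, 1
    P = np.fft.fft2(lap)
    F = np.conj(H) * G / (np.abs(H) ** 2 + gamma * np.abs(P) ** 2)
    return np.real(np.fft.ifft2(F))
```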
Efficient volumetric estimation from plenoptic data
NASA Astrophysics Data System (ADS)
Anglin, Paul; Reeves, Stanley J.; Thurow, Brian S.
2013-03-01
The commercial release of the Lytro camera, and greater availability of plenoptic imaging systems in general, have given the image processing community cost-effective tools for light-field imaging. While this data is most commonly used to generate planar images at arbitrary focal depths, reconstruction of volumetric fields is also possible. Similarly, deconvolution is a technique that is conventionally used in planar image reconstruction, or deblurring, algorithms. However, when leveraged with the ability of a light-field camera to quickly reproduce multiple focal planes within an imaged volume, deconvolution offers a computationally efficient method of volumetric reconstruction. Related research has shown that light-field imaging systems in conjunction with tomographic reconstruction techniques are also capable of estimating the imaged volume and have been successfully applied to particle image velocimetry (PIV). However, while tomographic volumetric estimation through algorithms such as multiplicative algebraic reconstruction techniques (MART) have proven to be highly accurate, they are computationally intensive. In this paper, the reconstruction problem is shown to be solvable by deconvolution. Deconvolution offers significant improvement in computational efficiency through the use of fast Fourier transforms (FFTs) when compared to other tomographic methods. This work describes a deconvolution algorithm designed to reconstruct a 3-D particle field from simulated plenoptic data. A 3-D extension of existing 2-D FFT-based refocusing techniques is presented to further improve efficiency when computing object focal stacks and system point spread functions (PSFs). Reconstruction artifacts are identified; their underlying source and methods of mitigation are explored where possible, and reconstructions of simulated particle fields are provided.
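For readers unfamiliar with computational refocusing, the sketch below shows the simplest spatial-domain shift-and-sum form of light-field refocusing; the FFT-based variants the paper extends to 3-D are mathematically related but more efficient, and the input format and scaling parameter here are assumptions.

```python
import numpy as np
from scipy.ndimage import shift

def refocus(subapertures, coords, alpha):
    """Synthetically refocus a light field by shifting each sub-aperture image
    in proportion to its (u, v) position and summing.  'subapertures' is a list
    of 2-D images, 'coords' the matching list of (u, v) offsets in pixels, and
    'alpha' selects the relative focal depth.  A spatial-domain stand-in for
    the FFT-based refocusing mentioned above."""
    acc = np.zeros_like(subapertures[0], dtype=float)
    for img, (u, v) in zip(subapertures, coords):
        acc += shift(img.astype(float), (alpha * v, alpha * u), order=1)
    return acc / len(subapertures)
```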
NASA Astrophysics Data System (ADS)
Thomas, N.
2016-12-01
The Sheath Transport Observer for the Redistribution of Mass (STORM) and the Cusp Plasma Imaging Detector (CuPID) instruments are soft X-ray cameras that utilize slumped micropore ('lobster-eye') optics. These lobster-eye optics, developed by the University of Leicester and the Photonis Corporation, provide for wide field-of-view imaging of X-ray line emission produced via charge exchange between hydrogen in the Earth's exosphere and heavy ions in the solar wind. Both instruments have position-sensitive, chevron-configuration microchannel plate detectors in their respective focal planes. STORM possesses two 4 cm by 4 cm lobster-eye optics, each with a focal length of 37.5 cm. It flew as a piggyback payload on the Diffuse X-ray emission from the Local galaxy (DXL) sounding rocket mission which was launched in December of 2012 from White Sands Missile Range, New Mexico. STORM operated successfully during this mission and represents the first use of lobster-eye optics in space. A future version of STORM, in high orbit, could image a significant portion of the magnetosheath to infer the locations of the magnetopause and the bow shock. CuPID is a 3U CubeSat variant of STORM that uses a single optic with a 27.5 cm focal length. A sounding-rocket-borne CuPID flew as a science payload with DXL from White Sands in December of 2015, with results forthcoming.
NASA Astrophysics Data System (ADS)
Smee, Stephen A.; Prochaska, Travis; Shectman, Stephen A.; Hammond, Randolph P.; Barkhouser, Robert H.; DePoy, D. L.; Marshall, J. L.
2012-09-01
We describe the conceptual optomechanical design for GMACS, a wide-field, multi-object, moderate-resolution optical spectrograph for the Giant Magellan Telescope (GMT). GMACS is a candidate first-light instrument for the GMT and will be one of several instruments housed in the Gregorian Instrument Rotator (GIR) located at the Gregorian focus. The instrument samples a 9 arcminute x 18 arcminute field of view providing two resolution modes (i.e., low resolution, R ~ 2000, and moderate resolution, R ~ 4000) over a 3700 Å to 10200 Å wavelength range. To minimize the size of the optics, four fold mirrors at the GMT focal plane redirect the full field into four individual "arms," each of which comprises a double spectrograph with a red and a blue channel. Hence, each arm samples a 4.5 arcminute x 9 arcminute field of view. The optical layout naturally leads to three separate optomechanical assemblies: a focal plane assembly, and two identical optics modules. The focal plane assembly contains the last element of the telescope's wide-field corrector, slit-mask, tent-mirror assembly, and slit-mask magazine. Each of the two optics modules supports two of the four instrument arms and houses the aft-optics (i.e., collimators, dichroics, gratings, and cameras). A grating exchange mechanism, and articulated gratings and cameras facilitate multiple resolution modes. In this paper we describe the details of the GMACS optomechanical design, including the requirements and considerations leading to the design, mechanism details, optics mounts, and predicted flexure performance.
Hubble Space Telescope faint object camera instrument handbook (Post-COSTAR), version 5.0
NASA Technical Reports Server (NTRS)
Nota, A. (Editor); Jedrzejewski, R. (Editor); Greenfield, P. (Editor); Hack, W. (Editor)
1994-01-01
The faint object camera (FOC) is a long-focal-ratio, photon-counting device capable of taking high-resolution two-dimensional images of the sky up to 14 by 14 arc seconds squared in size with pixel dimensions as small as 0.014 by 0.014 arc seconds squared in the 1150 to 6500 A wavelength range. Its performance approaches that of an ideal imaging system at low light levels. The FOC is the only instrument on board the Hubble Space Telescope (HST) to fully use the spatial resolution capabilities of the optical telescope assembly (OTA) and is one of the European Space Agency's contributions to the HST program.
Geometric calibration of Colour and Stereo Surface Imaging System of ESA's Trace Gas Orbiter
NASA Astrophysics Data System (ADS)
Tulyakov, Stepan; Ivanov, Anton; Thomas, Nicolas; Roloff, Victoria; Pommerol, Antoine; Cremonese, Gabriele; Weigel, Thomas; Fleuret, Francois
2018-01-01
There are many geometric calibration methods for "standard" cameras. These methods, however, cannot be used for the calibration of telescopes with large focal lengths and complex off-axis optics. Moreover, specialized calibration methods for telescopes are scarce in the literature. We describe the calibration method that we developed for the Colour and Stereo Surface Imaging System (CaSSIS) telescope, on board the ExoMars Trace Gas Orbiter (TGO). Although our method is described in the context of CaSSIS, with camera-specific experiments, it is general and can be applied to other telescopes. We further encourage re-use of the proposed method by making our calibration code and data available on-line.
Radiometric infrared focal plane array imaging system for thermographic applications
NASA Technical Reports Server (NTRS)
Esposito, B. J.; Mccafferty, N.; Brown, R.; Tower, J. R.; Kosonocky, W. F.
1992-01-01
This document describes research performed under the Radiometric Infrared Focal Plane Array Imaging System for Thermographic Applications contract. This research investigated the feasibility of using platinum silicide (PtSi) Schottky-barrier infrared focal plane arrays (IR FPAs) for NASA Langley's specific radiometric thermal imaging requirements. The initial goal of this design was to develop a high spatial resolution radiometer with an NETD of 1 percent of the temperature reading over the range of 0 to 250 C. The proposed camera design developed during this study and described in this report provides: (1) high spatial resolution (full-TV resolution); (2) high thermal dynamic range (0 to 250 C); (3) the ability to image rapid, large thermal transients utilizing electronic exposure control (commandable dynamic range of 2,500,000:1 with exposure control latency of 33 ms); (4) high uniformity (0.5 percent nonuniformity after correction); and (5) high thermal resolution (0.1 C at 25 C background and 0.5 C at 250 C background).
Solution for the nonuniformity correction of infrared focal plane arrays.
Zhou, Huixin; Liu, Shangqian; Lai, Rui; Wang, Dabao; Cheng, Yubao
2005-05-20
Based on the S-curve model of the detector response of infrared focal plane arrays (IRFPAs), an improved two-point correction algorithm is presented. The algorithm first transforms the nonlinear image data into linear data and then uses the normal two-point algorithm to correct the linear data. The algorithm can effectively overcome the influence of the nonlinearity of the detector's response, and it improves the correction precision and enlarges the dynamic range of the response. A real-time imaging-signal-processing system for IRFPAs that is based on a digital signal processor and field-programmable gate arrays is also presented. The nonuniformity correction capability of the presented solution is validated by experimental imaging procedures of a 128 x 128 pixel IRFPA camera prototype.
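For reference, one common formulation of the two-point nonuniformity correction that the improved algorithm builds on is sketched below; the S-curve linearization step described in the abstract is not shown, and the variable names and units are assumptions.

```python
import numpy as np

def two_point_nuc(raw, low_ref, high_ref, t_low, t_high):
    """Classic two-point nonuniformity correction for an IRFPA.
    'low_ref' and 'high_ref' are per-pixel mean frames of uniform blackbody
    scenes at levels t_low and t_high (temperature or radiance); a per-pixel
    gain and offset then map every pixel onto a common linear response.
    The paper linearizes the S-curve response first, which is omitted here."""
    gain = (t_high - t_low) / (high_ref - low_ref)
    offset = t_low - gain * low_ref
    return gain * raw + offset
```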
Thermally tunable-focus lenticular lens using liquid crystal.
Heo, Kyong Chan; Yu, Seung Hun; Kwon, Jin Hyuk; Gwag, Jin Seog
2013-12-10
A thermally tunable focusing lenticular liquid crystal (LC) lens array was fabricated using a polymer LC component, including a polarizer that produces linearly polarized light. The focal length of the proposed structure can be tuned by adjusting the voltage applied to a transparent heater in the lenticular LC lens cell, because the resulting temperature change alters the birefringence of the LC and varies the difference in refractive index between the LC and the polymer. The results showed that the focal length of the lens using the E7 LC varied continuously from 5.6 to 8.7 mm as the temperature increased from 25°C to 54°C. The proposed lenticular LC lens has potential use in photonic devices such as biological imaging, phone cameras, and optical sensors.
Three-dimensional particle tracking via tunable color-encoded multiplexing.
Duocastella, Martí; Theriault, Christian; Arnold, Craig B
2016-03-01
We present a novel 3D tracking approach capable of locating single particles with nanometric precision over wide axial ranges. Our method uses a fast acousto-optic liquid lens implemented in a bright field microscope to multiplex light based on color into different and selectable focal planes. By separating the red, green, and blue channels from an image captured with a color camera, information from up to three focal planes can be retrieved. Multiplane information from the particle diffraction rings enables precisely locating and tracking individual objects up to an axial range about 5 times larger than conventional single-plane approaches. We apply our method to the 3D visualization of the well-known coffee-stain phenomenon in evaporating water droplets.
New Focal Plane Array Controller for the Instruments of the Subaru Telescope
NASA Astrophysics Data System (ADS)
Nakaya, Hidehiko; Komiyama, Yutaka; Miyazaki, Satoshi; Yamashita, Takuya; Yagi, Masafumi; Sekiguchi, Maki
2006-03-01
We have developed a next-generation data acquisition system, MESSIA5 (Modularized Extensible System for Image Acquisition), which comprises the digital part of a focal plane array controller. The new data acquisition system was constructed based on a 64 bit, 66 MHz PCI (peripheral component interconnect) bus architecture and runs on an x86 CPU computer with (non-real-time) Linux. The system, including the CPU board, is placed at the telescope focus, and standard gigabit Ethernet is adopted for the data transfer, as opposed to a dedicated fiber link. During the summer of 2002, we installed the new system for the first time on the Subaru prime-focus camera Suprime-Cam and successfully improved the observing performance.
Coaxial fundus camera for ophthalmology
NASA Astrophysics Data System (ADS)
de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.
2015-09-01
A fundus camera for ophthalmology is a high-definition device that must provide low-light illumination of the human retina, high resolution at the retina, and a reflection-free image. Those constraints make its optical design very sophisticated, but the most difficult requirements to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is performed by a LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with autofocus and zoom built in, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.
History of the formerly top secret KH-9 Hexagon spy satellite
NASA Astrophysics Data System (ADS)
Pressel, Phil
2014-12-01
This paper is about the development, design, fabrication and use of the KH-9 Hexagon spy in the sky satellite camera system that was finally declassified by the National Reconnaissance Office on September 17, 2011 twenty five years after the program ended. It was the last film based reconnaissance camera and was known by experts in the field as "the most complicated system ever put up in orbit." It provided important intelligence for the United States government and was the reason that President Nixon was able to sign the SALT treaty, and when President Reagan said "Trust but Verify" it provided the means of verification. Each satellite weighed 30,000 pounds and carried two cameras thereby permitting photographs of the entire landmass of the earth to be taken in stereo. Each camera carried up to 30 miles of film for a total of 60 miles of film. Ultra-complex mechanisms controlled the structurally "wimpy" film that traveled at speeds up to 204 inches per second at the focal plane and was perfectly synchronized to the optical image.
2015-01-16
beamsplitters. A longpass dichroic beamsplitter (Semrock, FF347-Di01-50.4x71.2) reflects the ultraviolet hydroxyl radical (OH) chemiluminescence, an...and an 80 nm FWHM bandpass filter centered at 300 nm (Semrock, FF01-300/80-25) fitted to a UV F/4.5 105 mm focal length Nikkor lens. The camera is
Development of the FPI+ as facility science instrument for SOFIA cycle four observations
NASA Astrophysics Data System (ADS)
Pfüller, Enrico; Wiedemann, Manuel; Wolf, Jürgen; Krabbe, Alfred
2016-08-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a heavily modified Boeing 747SP aircraft accommodating a 2.5 m infrared telescope. This airborne observation platform takes astronomers to flight altitudes of up to 13.7 km (45,000 ft) and therefore allows an unobstructed view of the infrared universe at wavelengths between 0.3 µm and 1600 µm. SOFIA is currently completing its fourth cycle of observations and utilizes eight different imaging and spectroscopic science instruments. New instruments for SOFIA's cycle 4 observations are the High-resolution Airborne Wideband Camera-plus (HAWC+) and the Focal Plane Imager (FPI+). The latter is an integral part of the telescope assembly and is used on every SOFIA flight to ensure precise tracking on the desired targets. The FPI+ is used as a visual-light photometer in its role as facility science instrument. Since the upgrade of the FPI camera and electronics in 2013, it uses a thermo-electrically cooled science-grade EM-CCD sensor inside a commercial-off-the-shelf Andor camera. The back-illuminated sensor has a peak quantum efficiency of 95% and the dark current is as low as 0.01 e-/pix/sec. With this new hardware the telescope has successfully tracked on 16th magnitude stars and thus the sky coverage, i.e., the area of sky that has suitable tracking stars, has increased to 99%. Before its use as an integrated tracking imager, the same type of camera was used as a standalone diagnostic tool to analyze the telescope pointing stability at frequencies up to 200 Hz (imaging at 400 fps). These measurements help to improve the telescope pointing control algorithms and therefore reduce the image jitter in the focal plane. Science instruments benefit from this improvement with smaller image sizes for longer exposure times. The FPI+ has also been used to support astronomical observations such as stellar occultations by the dwarf planet Pluto and a number of exoplanet transits. The occultation observations especially benefit from the high camera sensitivity, fast readout capability and low read noise, and it was possible to achieve high time resolution on the photometric light curves. This paper gives an overview of the development from the standalone diagnostic camera to the upgraded guiding/tracking camera, fully integrated into the telescope while still offering the diagnostic capabilities, and finally to its use as a facility science instrument on SOFIA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, X; Grimes, J; Yu, L
Purpose: Focal spot blooming is an increase in the focal spot size at increased tube current and/or decreased tube potential. In this work, we evaluated the influence of tube current on the focal spot size at low kV for two CT systems, one of which used a tube designed to reduce blooming effects. Methods: A slit camera (10 micron slit) was used to measure focal spot size on two CT scanners from the same manufacturer (Siemens Somatom Force and Definition Flash) at 70 kV and low, medium and maximum tube currents, according to the capabilities of each system (Force: 100, 800 and 1300 mA; Flash: 100, 200 and 500 mA). Exposures were made with a stationary tube in service mode using a raised stand without table movement or flying focal spot technique. Focal spot size, nominally 0.8 and 1.2 mm, respectively, was measured parallel and perpendicular to the cathode-anode axis by calculating the full-width-at-half-maximum of the slit profile recorded using computed radiography plates. Results: Focal spot sizes perpendicular to the anode-cathode axis increased at the maximum mA by 5.7% on the Force and 39.1% on the Flash relative to that at the minimal mA, even though the mA was increased 13-fold on the Force and only 5-fold on the Flash. Focal spot size increased parallel to the anode-cathode axis by 70.4% on Force and 40.9% on Flash. Conclusion: For CT protocols using low kV, high mA is typically required. These protocols are relevant in children and smaller adults, and for dual-energy scanning. Technical measures to limit focal spot blooming are important in these settings to avoid reduced spatial resolution. The x-ray tube on a recently-introduced scanner appears to greatly reduce blooming effects, even at very high mA values. CHM has research support from Siemens Healthcare.
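A minimal sketch of the full-width-at-half-maximum measurement on a 1-D slit profile, with linear interpolation at the half-maximum crossings; background subtraction, magnification correction, and plate calibration used in the actual measurement are omitted, and the function name and inputs are assumptions.

```python
import numpy as np

def fwhm(profile, pixel_pitch_mm):
    """Full width at half maximum of a 1-D slit profile, interpolating the
    half-maximum crossings linearly (assumed helper, not the analysis
    software used in the study)."""
    p = profile - profile.min()
    half = 0.5 * p.max()
    above = np.where(p >= half)[0]
    left, right = above[0], above[-1]
    # Interpolate the crossing positions on either side of the peak.
    x_left = left - (p[left] - half) / (p[left] - p[left - 1]) if left > 0 else left
    x_right = right + (p[right] - half) / (p[right] - p[right + 1]) if right < len(p) - 1 else right
    return (x_right - x_left) * pixel_pitch_mm
```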
Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation
Bell, J.F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.N.; Arneson, H.M.; Brown, D.; Collins, S.A.; Dingizian, A.; Elliot, S.T.; Hagerott, E.C.; Hayes, A.G.; Johnson, M.J.; Johnson, J. R.; Joseph, J.; Kinch, K.; Lemmon, M.T.; Morris, R.V.; Scherr, L.; Schwochert, M.; Shepard, M.K.; Smith, G.H.; Sohl-Dickstein, J. N.; Sullivan, R.J.; Sullivan, W.T.; Wadsworth, M.
2003-01-01
The Panoramic Camera (Pancam) investigation is part of the Athena science payload launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The scientific goals of the Pancam investigation are to assess the high-resolution morphology, topography, and geologic context of each MER landing site, to obtain color images to constrain the mineralogic, photometric, and physical properties of surface materials, and to determine dust and aerosol opacity and physical properties from direct imaging of the Sun and sky. Pancam also provides mission support measurements for the rovers, including Sun-finding for rover navigation, hazard identification and digital terrain modeling to help guide long-term rover traverse decisions, high-resolution imaging to help guide the selection of in situ sampling targets, and acquisition of education and public outreach products. The Pancam optical, mechanical, and electronics design were optimized to achieve these science and mission support goals. Pancam is a multispectral, stereoscopic, panoramic imaging system consisting of two digital cameras mounted on a mast 1.5 m above the Martian surface. The mast allows Pancam to image the full 360° in azimuth and ±90° in elevation. Each Pancam camera utilizes a 1024 × 1024 active imaging area frame transfer CCD detector array. The Pancam optics have an effective focal length of 43 mm and a focal ratio f/20, yielding an instantaneous field of view of 0.27 mrad/pixel and a field of view of 16° × 16°. Each rover's two Pancam "eyes" are separated by 30 cm and have a 1° toe-in to provide adequate stereo parallax. Each eye also includes a small eight position filter wheel to allow surface mineralogic studies, multispectral sky imaging, and direct Sun imaging in the 400-1100 nm wavelength region. Pancam was designed and calibrated to operate within specifications on Mars at temperatures from -55° to +5°C. An onboard calibration target and fiducial marks provide the capability to validate the radiometric and geometric calibration on Mars. Copyright 2003 by the American Geophysical Union.
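A quick consistency check of the quoted Pancam numbers, assuming a 12-micron CCD pixel pitch (the pitch is not stated above, so it is an assumption):

```python
import numpy as np

pixel_pitch_m = 12e-6      # assumed pixel pitch
focal_length_m = 43e-3     # quoted effective focal length
ifov_mrad = pixel_pitch_m / focal_length_m * 1e3          # ~0.28 mrad/pixel (quoted: 0.27)
fov_deg = np.degrees(1024 * pixel_pitch_m / focal_length_m)  # ~16 degrees (quoted: 16 x 16)
print(ifov_mrad, fov_deg)
```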
Wang, Zhengzhou; Hu, Bingliang; Yin, Qinye
2017-01-01
The schlieren method of measuring far-field focal spots offers many advantages at the Shenguang III laser facility such as low cost and automatic laser-path collimation. However, current methods of far-field focal spot measurement often suffer from low precision and efficiency when the final focal spot is merged manually, thereby reducing the accuracy of reconstruction. In this paper, we introduce an improved schlieren method to construct the high dynamic-range image of far-field focal spots and improve the reconstruction accuracy and efficiency. First, a detection method based on weak light beam sampling and magnification imaging was designed; images of the main and side lobes of the focused laser irradiance in the far field were obtained using two scientific CCD cameras. Second, using a self-correlation template matching algorithm, a circle the same size as the schlieren ball was dug from the main lobe cutting image and used to change the relative region of the main lobe cutting image within a 100×100 pixel region. The position that had the largest correlation coefficient between the side lobe cutting image and the main lobe cutting image when a circle was dug was identified as the best matching point. Finally, the least squares method was used to fit the center of the side lobe schlieren small ball, and the error was less than 1 pixel. The experimental results show that this method enables the accurate, high-dynamic-range measurement of a far-field focal spot and automatic image reconstruction. Because the best matching point is obtained through image processing rather than traditional reconstruction methods based on manual splicing, this method is less sensitive to the efficiency of focal-spot reconstruction and thus offers better experimental precision. PMID:28207758
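The core matching step can be illustrated with off-the-shelf normalized cross-correlation, as sketched below; the circular dug-out mask, schlieren-ball geometry, and sub-pixel least-squares fit of the actual method are omitted, and the function and variable names are assumptions.

```python
import cv2
import numpy as np

def best_match(main_lobe_img, side_lobe_img, template_size=100):
    """Find where a central patch of the main-lobe image best aligns with the
    side-lobe image using normalized cross-correlation, in the spirit of the
    self-correlation template matching described above (simplified)."""
    h, w = main_lobe_img.shape
    template = main_lobe_img[h//2 - template_size//2: h//2 + template_size//2,
                             w//2 - template_size//2: w//2 + template_size//2]
    result = cv2.matchTemplate(side_lobe_img.astype(np.float32),
                               template.astype(np.float32),
                               cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val
```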
Alfonso, Fernando; Pérez-Vizcayno, María José; García Del Blanco, Bruno; García-Touchard, Arturo; López-Mínguez, José-Ramón; Masotti, Mónica; Zueco, Javier; Melgares, Rafael; Mainar, Vicente; Moreno, Raul; Domínguez, Antonio; Sanchís, Juan; Bethencourt, Armando; Moreu, José; Cequier, Angel; Martí, Vicens; Otaegui, Imanol; Bastante, Teresa; Gonzalo, Nieves; Jiménez-Quevedo, Pilar; Cárdenas, Alberto; Fernández, Cristina
2016-07-01
Treatment of patients with drug-eluting stent (DES) in-stent restenosis (ISR) is more challenging than that of patients with bare-metal stent ISR. However, the results of everolimus-eluting stents (EES) in these distinct scenarios remain unsettled. A pooled analysis of the RIBS IV (Restenosis Intra-Stent of Drug-Eluting Stents: Paclitaxel-Eluting Balloon vs Everolimus-Eluting Stent) and RIBS V (Restenosis Intra-Stent of Bare Metal Stents: Paclitaxel-Eluting Balloon vs Everolimus-Eluting Stent) randomized trials was performed using patient-level data to compare the efficacy of EES in bare-metal stent ISR and DES-ISR. Inclusion and exclusion criteria were identical in both trials. Results of 94 patients treated with EES for bare-metal stent ISR were compared with those of 155 patients treated with EES for DES-ISR. Baseline characteristics were more adverse in patients with DES-ISR, although they presented later and more frequently with a focal pattern. After intervention, minimal lumen diameter (2.22±0.5 versus 2.38±0.5 mm, P=0.01) was smaller in the DES-ISR group. Late angiographic findings (89.3% of eligible patients), including minimal lumen diameter (2.03±0.7 versus 2.36±0.6 mm, P<0.001) and diameter stenosis (23±22 versus 13±17%, P<0.001) were poorer in patients with DES-ISR. Results were consistent in the in-segment and in-lesion analyses. On multiple linear regression analysis, minimal lumen diameter at follow-up remained significantly smaller in patients with DES-ISR. Finally, at 1-year clinical follow-up (100% of patients), mortality (2.6 versus 0%, P<0.01) and need for target vessel revascularization (8 versus 2%, P=0.03) were higher in the DES-ISR group. This patient-level pooled analysis of the RIBS IV and RIBS V randomized clinical trials suggests that EES provide favorable outcomes in patients with ISR. However, the results of EES are less satisfactory in patients with DES-ISR than in those with bare-metal stent ISR. URL: http://www.clinicaltrials.gov. Unique identifiers: NCT01239953 and NCT01239940. © 2016 American Heart Association, Inc.
Intraocular camera for retinal prostheses: Refractive and diffractive lens systems
NASA Astrophysics Data System (ADS)
Hauer, Michelle Christine
The focus of this thesis is on the design and analysis of refractive, diffractive, and hybrid refractive/diffractive lens systems for a miniaturized camera that can be surgically implanted in the crystalline lens sac and is designed to work in conjunction with current and future generation retinal prostheses. The development of such an intraocular camera (IOC) would eliminate the need for an external head-mounted or eyeglass-mounted camera. Placing the camera inside the eye would allow subjects to use their natural eye movements for foveation (attention) instead of more cumbersome head tracking, would notably aid in personal navigation and mobility, and would also be significantly more psychologically appealing from the standpoint of personal appearances. The capability for accommodation with no moving parts or feedback control is incorporated by employing camera designs that exhibit nearly infinite depth of field. Such an ultracompact optical imaging system requires a unique combination of refractive and diffractive optical elements and relaxed system constraints derived from human psychophysics. This configuration necessitates an extremely compact, short focal-length lens system with an f-number close to unity. Initially, these constraints appear highly aggressive from an optical design perspective. However, after careful analysis of the unique imaging requirements of a camera intended to work in conjunction with the relatively low pixellation levels of a retinal microstimulator array, it becomes clear that such a design is not only feasible, but could possibly be implemented with a single lens system.
The ideal subject distance for passport pictures.
Verhoff, Marcel A; Witzel, Carsten; Kreutz, Kerstin; Ramsthaler, Frank
2008-07-04
In an age of global combat against terrorism, the recognition and identification of people on document images is of increasing significance. Experiments and calculations have shown that the camera-to-subject distance - not the focal length of the lens - can have a significant effect on facial proportions. Modern passport pictures should be able to function as a reference image for automatic and manual picture comparisons. This requires a defined subject distance. It is completely unclear which subject distance, in the taking of passport photographs, is ideal for the recognition of the actual person. We show here that the camera-to-subject distance that is perceived as ideal is dependent on the face being photographed, even if the distance of 2m was most frequently preferred. So far the problem of the ideal camera-to-subject distance for faces has only been approached through technical calculations. We have, for the first time, answered this question experimentally with a double-blind experiment. Even if there is apparently no ideal camera-to-subject distance valid for every face, 2m can be proposed as ideal for the taking of passport pictures. The first step would actually be the determination of a camera-to-subject distance for the taking of passport pictures within the standards. From an anthropological point of view it would be interesting to find out which facial features allow the preference of a shorter camera-to-subject distance and which allow the preference of a longer camera-to-subject distance.
Automatic Orientation of Large Blocks of Oblique Images
NASA Astrophysics Data System (ADS)
Rupnik, E.; Nex, F.; Remondino, F.
2013-05-01
Nowadays, multi-camera platforms combining nadir and oblique cameras are experiencing a revival. Due to their advantages, such as ease of interpretation, completeness through mitigation of occluded areas, as well as system accessibility, they have found their place in numerous civil applications. However, automatic post-processing of such imagery still remains a topic of research. The configuration of the cameras poses a challenge to the traditional photogrammetric pipeline used in commercial software, and manual measurements are inevitable. For large image blocks this is certainly an impediment. In the theoretical part of the work we review three common least squares adjustment methods and recap possible ways to orient a multi-camera system. In the practical part we present an approach that successfully oriented a block of 550 images acquired with an imaging system composed of 5 cameras (Canon Eos 1D Mark III) with different focal lengths. The oblique cameras are rotated in the four looking directions (forward, backward, left and right) by 45° with respect to the nadir camera. The workflow relies only upon open-source software: a developed tool to analyse image connectivity and Apero to orient the image block. The benefits of the connectivity tool are twofold: in terms of computational time and success of the Bundle Block Adjustment. It exploits the georeferenced information provided by the Applanix system to constrain feature point extraction to relevant images only, and guides the concatenation of images during the relative orientation. Ultimately an absolute transformation is performed, resulting in mean re-projection residuals of 0.6 pixels.
Thermal design and simulation of an attitude-varied space camera
NASA Astrophysics Data System (ADS)
Wang, Chenjie; Yang, Wengang; Feng, Liangjie; Li, XuYang; Wang, Yinghao; Fan, Xuewu; Wen, Desheng
2015-10-01
An attitude-varied space camera changes attitude continually while it is working; its large-angle attitude changes over short times lead to significant changes in heat flux. Moreover, the complicated inner heat sources, other payloads and the satellite platform also introduce thermal coupling effects to the space camera. For a space camera located on a two-dimensional rotating platform, a detailed thermal design is accomplished by means of thermal isolation, thermal transmission and temperature compensation, etc. The ultimate simulation cases of both high and low temperature are then chosen, considering the obscuration by the satellite platform and other payloads, as well as the heat flux analysis of the light entrance and radiator surface of the camera. NEVEDA and SindaG are used to establish the simulation model of the camera and the analysis is carried out. The results indicate that, under both passive and active thermal control, the temperature of the optical components is 20±1°C, both their radial and axial temperature gradients are less than 0.3°C, the temperature of the main structural components is 20±2°C, and the temperature fluctuation of the focal plane assemblies is 3.0-9.5°C. The simulation shows that the thermal control system can meet the needs of the mission, and the thermal design is efficient and reasonable.
High-precision method of binocular camera calibration with a distortion model.
Li, Weimin; Shan, Siyu; Liu, Hui
2017-03-10
A high-precision camera calibration method for binocular stereo vision system based on a multi-view template and alternative bundle adjustment is presented in this paper. The proposed method could be achieved by taking several photos on a specially designed calibration template that has diverse encoded points in different orientations. In this paper, the method utilized the existing algorithm used for monocular camera calibration to obtain the initialization, which involves a camera model, including radial lens distortion and tangential distortion. We created a reference coordinate system based on the left camera coordinate to optimize the intrinsic parameters of left camera through alternative bundle adjustment to obtain optimal values. Then, optimal intrinsic parameters of the right camera can be obtained through alternative bundle adjustment when we create a reference coordinate system based on the right camera coordinate. We also used all intrinsic parameters that were acquired to optimize extrinsic parameters. Thus, the optimal lens distortion parameters and intrinsic and extrinsic parameters were obtained. Synthetic and real data were used to test the method. The simulation results demonstrate that the maximum mean absolute relative calibration errors are about 3.5e-6 and 1.2e-6 for the focal length and the principal point, respectively, under zero-mean Gaussian noise with 0.05 pixels standard deviation. The real result shows that the reprojection error of our model is about 0.045 pixels with the relative standard deviation of 1.0e-6 over the intrinsic parameters. The proposed method is convenient, cost-efficient, highly precise, and simple to carry out.
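For comparison, the sketch below shows a conventional stereo-calibration pipeline with a radial-plus-tangential distortion model using OpenCV; it is not the alternative-bundle-adjustment scheme of the paper, and the input variables are hypothetical.

```python
import cv2
import numpy as np

# Hypothetical inputs: obj_pts is a list of (N, 3) float32 arrays of template
# point coordinates, img_pts_l / img_pts_r the matching detections in the left
# and right cameras, img_size = (width, height).
def stereo_calibrate(obj_pts, img_pts_l, img_pts_r, img_size):
    # Per-camera intrinsics with radial + tangential distortion.
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, img_pts_l, img_size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, img_pts_r, img_size, None, None)
    # Refine the relative pose only, keeping the per-camera intrinsics fixed.
    err, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
        obj_pts, img_pts_l, img_pts_r, K1, d1, K2, d2, img_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T, err
```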
Application of infrared uncooled cameras in surveillance systems
NASA Astrophysics Data System (ADS)
Dulski, R.; Bareła, J.; Trzaskawka, P.; Piątkowski, T.
2013-10-01
The recent need to protect military bases, convoys and patrols has given a serious impetus to the development of multi-sensor security systems for perimeter protection. One of the most important devices used in such systems is the IR camera. The paper discusses technical possibilities and limitations of using an uncooled IR camera in a multi-sensor surveillance system for perimeter protection. Effective detection ranges depend on the class of the sensor used and the observed scene itself. Application of an IR camera increases the probability of intruder detection regardless of the time of day or weather conditions, and it also decreases the false alarm rate produced by the surveillance system. The role of IR cameras in the system is discussed, as well as the technical possibilities to detect a human being. A comparison of commercially available IR cameras capable of achieving the desired ranges was made. The required spatial resolution for detection, recognition and identification was calculated. The simulation of detection ranges was done using a new model for predicting target acquisition performance which uses the Targeting Task Performance (TTP) metric. Like its predecessor, the Johnson criteria, the new model bounds the range performance with image quality. The scope of the presented analysis is limited to the estimation of detection, recognition and identification ranges for typical thermal cameras with uncooled microbolometer focal plane arrays. This type of camera is most widely used in security systems because of its competitive price-to-performance ratio. Detection, recognition and identification range calculations were made, and the results for devices with selected technical specifications were compared and discussed.
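A very simplified, detector-limited range estimate in the spirit of the Johnson criteria is sketched below for orientation; the cycle counts, pixel pitch, and focal length are illustrative assumptions, and a real TTP-based prediction also accounts for optics blur, atmosphere, and sensitivity.

```python
def acquisition_range_m(target_size_m, cycles_required, pixel_pitch_m, focal_length_m):
    """Detector-limited range at which 'cycles_required' resolvable cycles
    (roughly 1 for detection, 4 for recognition, 8 for identification --
    values vary by source) fit across the target's critical dimension.
    One cycle is taken as two detector pixels; optics blur, atmosphere and
    NETD are ignored, so TTP-based predictions will differ."""
    ifov_rad = pixel_pitch_m / focal_length_m   # angle subtended by one pixel
    cycle_rad = 2.0 * ifov_rad                  # one resolvable cycle = 2 pixels
    return target_size_m / (cycles_required * cycle_rad)

# Example with assumed values: a 1.8 m target, 17 um pixels and a 50 mm lens
# give roughly 2.6 km / 0.7 km / 0.3 km for detection / recognition / identification.
```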
Diabetes-associated dry eye syndrome in a new humanized transgenic model of type 1 diabetes.
Imam, Shahnawaz; Elagin, Raya B; Jaume, Juan Carlos
2013-01-01
Patients with Type 1 Diabetes (T1D) are at high risk of developing lacrimal gland dysfunction. We have developed a new model of human T1D using double-transgenic mice carrying HLA-DQ8 diabetes-susceptibility haplotype instead of mouse MHC-class II and expressing the human beta cell autoantigen Glutamic Acid Decarboxylase in pancreatic beta cells. We report here the development of dry eye syndrome (DES) after diabetes induction in our humanized transgenic model. Double-transgenic mice were immunized with DNA encoding human GAD65, either naked or in adenoviral vectors, to induce T1D. Mice monitored for development of diabetes developed lacrimal gland dysfunction. Animals developed lacrimal gland disease (classically associated with diabetes in Non Obese Diabetic [NOD] mice and with T1D in humans) as they developed glucose intolerance and diabetes. Animals manifested obvious clinical signs of dry eye syndrome (DES), from corneal erosions to severe keratitis. Histological studies of peri-bulbar areas revealed lymphocytic infiltration of glandular structures. Indeed, infiltrative lesions were observed in lacrimal/Harderian glands within weeks following development of glucose intolerance. Lesions ranged from focal lymphocytic infiltration to complete acinar destruction. We observed a correlation between the severity of the pancreatic infiltration and the severity of the ocular disease. Our results demonstrate development of DES in association with antigen-specific insulitis and diabetes following immunization with clinically relevant human autoantigen concomitantly expressed in pancreatic beta cells of diabetes-susceptible mice. As in the NOD mouse model and as in human T1D, our animals developed diabetes-associated DES. This specific finding stresses the relevance of our model for studying these human diseases. We believe our model will facilitate studies to prevent/treat diabetes-associated DES as well as human diabetes.
A high-sensitivity EM-CCD camera for the open port telescope cavity of SOFIA
NASA Astrophysics Data System (ADS)
Wiedemann, Manuel; Wolf, Jürgen; McGrotty, Paul; Edwards, Chris; Krabbe, Alfred
2016-08-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) has three target acquisition and tracking cameras. All three imagers originally used the same cameras, which did not meet the sensitivity requirements due to low quantum efficiency and high dark current. The Focal Plane Imager (FPI) suffered the most from high dark current, since it operated in the aircraft cabin at room temperature without active cooling. In early 2013 the FPI was upgraded with an iXon3 888 from Andor Technology. Compared to the original cameras, the iXon3 has a factor of five higher QE, thanks to its back-illuminated sensor, and orders of magnitude lower dark current, due to a thermo-electric cooler and "inverted mode operation." This leads to an increase in sensitivity of about five stellar magnitudes. The Wide Field Imager (WFI) and Fine Field Imager (FFI) shall now be upgraded with equally sensitive cameras. However, they are exposed to stratospheric conditions in flight (typical conditions: T ≈ -40°C, p ≈ 0.1 atm) and there are no off-the-shelf CCD cameras with the performance of an iXon3 suited for these conditions. Therefore, Andor Technology and the Deutsches SOFIA Institut (DSI) are jointly developing and qualifying a camera for these conditions, based on the iXon3 888. The changes include replacement of electrical components with MIL-SPEC or industrial grade components and various system optimizations, a new data interface that allows image data transmission over 30 m of cable from the camera to the controller, a new power converter in the camera to generate all necessary operating voltages of the camera locally, and a new housing that fulfills airworthiness requirements. A prototype of this camera has been built and tested in an environmental test chamber at temperatures down to T = -62°C and a pressure equivalent to 50,000 ft altitude. In this paper, we report on the development of the camera and present results from the environmental testing.
NASA Astrophysics Data System (ADS)
Scaduto, Lucimara C. N.; Malavolta, Alexandre T.; Modugno, Rodrigo G.; Vales, Luiz F.; Carvalho, Erica G.; Evangelista, Sérgio; Stefani, Mario A.; de Castro Neto, Jarbas C.
2017-11-01
The first Brazilian remote sensing multispectral camera (MUX) is currently under development at Opto Eletronica S.A. It consists of a four-spectral-band sensor covering a 450 nm to 890 nm wavelength range. This camera will provide images with a 20 m ground resolution at nadir. The MUX camera is part of the payload of the upcoming Sino-Brazilian satellites CBERS 3&4 (China-Brazil Earth Resource Satellite). The preliminary alignment between the optical system and the CCD sensor, which is located at the focal plane assembly, was obtained in air, in a clean-room environment. A collimator was used for the performance evaluation of the camera. The preliminary performance of the optical channel was registered by compensating the collimator focus position for changes in the test environment, since an air-to-vacuum transition leads to a defocus of this camera. Therefore, it is necessary to confirm that the alignment of the camera ensures that its best performance is reached under orbital vacuum conditions. For this reason, and as a further step in the development process, the MUX camera Qualification Model was tested and evaluated inside a thermo-vacuum chamber, subjected to an orbit-like vacuum environment. In this study, the influence of temperature fields was neglected. This paper reports on the performance evaluation and discusses the results for this camera when operating within the mentioned test conditions. The overall optical tests and results show that the "in air" adjustment method, a critical activity, was suitable to guarantee that the equipment meets its design requirements.
Hubble Space Telescope: Faint object camera instrument handbook. Version 2.0
NASA Technical Reports Server (NTRS)
Paresce, Francesco (Editor)
1990-01-01
The Faint Object Camera (FOC) is a long focal ratio, photon counting device designed to take high resolution two dimensional images of areas of the sky up to 44 by 44 arcseconds squared in size, with pixel dimensions as small as 0.0007 by 0.0007 arcseconds squared in the 1150 to 6500 A wavelength range. The basic aim of the handbook is to make relevant information about the FOC available to a wide range of astronomers, many of whom may wish to apply for HST observing time. The FOC, as presently configured, is briefly described, and some basic performance parameters are summarized. Also included are detailed performance parameters and instructions on how to derive approximate FOC exposure times for the proposed targets.
Optimization design of periscope type 3X zoom lens design for a five megapixel cellphone camera
NASA Astrophysics Data System (ADS)
Sun, Wen-Shing; Tien, Chuen-Lin; Pan, Jui-Wen; Chao, Yu-Hao; Chu, Pu-Yi
2016-11-01
This paper presents a periscope-type 3X zoom lens design for a five-megapixel cellphone camera. The optical configuration uses a right-angle prism in front of the zoom lenses to fold the optical path by 90°, resulting in a zoom-lens length of 6 mm. The zoom lenses can therefore be embedded in a mobile phone with a thickness of 6 mm. The zoom lenses have three groups with six elements. The half field of view varies from 30° to 10.89°, the effective focal length is adjusted from 3.142 mm to 9.426 mm, and the F-number changes from 2.8 to 5.13.
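A quick consistency check of the quoted zoom parameters, assuming the usual relation between focal length and half field of view for a fixed image size:

```python
import math

# Image half-diagonal h' = f * tan(half FOV) should stay fixed across the zoom.
h_wide = 3.142 * math.tan(math.radians(30.0))    # ~1.81 mm at the wide end
h_tele = 9.426 * math.tan(math.radians(10.89))   # ~1.81 mm at the tele end
zoom_ratio = 9.426 / 3.142                       # 3.0x, matching the quoted 3X
print(h_wide, h_tele, zoom_ratio)
```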
Owuor, Theresa O; Reid, Michaela; Reschke, Lauren; Hagemann, Ian; Greco, Suellen; Modi, Zeel; Moley, Kelle H
2018-01-01
Thirty-eight percent of US adult women are obese, meaning that more children are now born of overweight and obese mothers, leading to an increase in predisposition to several adult onset diseases. To explore this phenomenon, we developed a maternal obesity animal model by feeding mice a diet composed of high fat/ high sugar (HF/HS) and assessed both maternal diet and offspring diet on the development of endometrial cancer (ECa). We show that maternal diet by itself did not lead to ECa initiation in wildtype offspring of the C57Bl/6J mouse strain. While offspring fed a HF/HS post-weaning diet resulted in poor metabolic health and decreased uterine weight (regardless of maternal diet), it did not lead to ECa. We also investigated the effects of the maternal obesogenic diet on ECa development in a Diethylstilbestrol (DES) carcinogenesis mouse model. All mice injected with DES had reproductive tract lesions including decreased number of glands, condensed and hyalinized endometrial stroma, and fibrosis and increased collagen deposition that in some mice extended into the myometrium resulting in extensive disruption and loss of the inner and outer muscular layers. Fifty percent of DES mice that were exposed to maternal HF/HS diet developed several features indicative of the initial stages of carcinogenesis including focal glandular and atypical endometrial hyperplasia versus 0% of their Chow counterparts. There was an increase in phospho-Akt expression in DES mice exposed to maternal HF/HS diet, a regulator of persistent proliferation in the endometrium, and no difference in total Akt, phospho-PTEN and total PTEN expression. In summary, maternal HF/HS diet exposure induces endometrial hyperplasia and other precancerous phenotypes in mice treated with DES. This study suggests that maternal obesity alone is not sufficient for the development of ECa, but has an additive effect in the presence of a secondary insult such as DES.
NASA Astrophysics Data System (ADS)
Rustan, Pedro L.
1995-01-01
The U.S. Department of Defense (DoD) and the National Aeronautics and Space Administration (NASA) started a cooperative program in 1992 to flight qualify recently developed lightweight technologies in a radiation stressed environment. The spacecraft, referred to as Clementine, was designed, built, and launched in less than a two year period. The spacecraft was launched into a high inclination orbit from Vandenburg Air Force Base in California on a Titan IIG launch vehicle in January 1994. The spacecraft was injected into a 420 by 3000 km orbit around the Moon and remained there for over two months. Unfortunately, after successfully completing the Lunar phase of the mission, a software malfunction prevented the accomplishment of the near-Earth asteroid (NEA) phase. Some of the technologies incorporated in the Clementine spacecraft include: a 370 gram, 7 watt star tracker camera; a 500 gram, 6 watt, UV/Vis camera; a 1600 gram, 30 watt Indium Antimonide focal plane array NIR camera; a 1650 gram, 30 watt, Mercury Cadmium Telluride LWIR camera; a LIDAR camera which consists of a Nd:YAG diode pumped laser for ranging and an intensified photocathode charge-coupled detector for imaging. The scientific results of the mission will be first analyzed by a NASA selected team, and then will be available to the entire community.
Embedded Augmented Reality Training System for Dynamic Human-Robot Cooperation
2009-10-01
through (OST) head-mounted displays (HMDs) still lack in usability and ergonomics because of their size, weight, resolution, and the hard-to-realize...with addressable focal planes [10], for example. Accurate and easy-to-use calibration routines for OST HMDs remain a challenging task; established...methods are based on matching of virtual over real objects [11], newer approaches use cameras looking directly through the HMD optics to exploit both
Optimal design of an earth observation optical system with dual spectral and high resolution
NASA Astrophysics Data System (ADS)
Yan, Pei-pei; Jiang, Kai; Liu, Kai; Duan, Jing; Shan, Qiusha
2017-02-01
With the increasing demand for high-resolution remote sensing images from military and civilian users, countries around the world are optimistic about the prospects of higher-resolution remote sensing imagery. Moreover, designing an integrated visible/infrared optical system has important value in Earth observation: because a visible system cannot identify camouflage or perform reconnaissance at night, the visible camera should be combined with an infrared camera. An Earth observation optical system with dual spectral bands and high resolution is designed. The paper mainly addresses the integrated design of the visible and infrared optical system, which makes the system lighter and smaller and achieves one satellite with two uses. The working waveband of the system covers the visible and the middle infrared (3-5 um). Clear imaging in both wavebands is achieved with a dispersive RC system. The focal length of the visible system is 3056 mm with an F/# of 10.91, and the focal length of the middle infrared system is 1120 mm with an F/# of 4. In order to suppress middle infrared thermal radiation and stray light, a second imaging stage is adopted and the narcissus phenomenon is analyzed. The system is characterized by a simple structure, and the special requirements on the Modulation Transfer Function (MTF), spot size, energy concentration, distortion, etc. are all satisfied.
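A small sanity check on the quoted optical parameters, using the basic relation D = f / (F/#) for the entrance aperture (the shared-aperture interpretation is an inference, not stated above):

```python
# Entrance aperture diameter D = f / (F/#).
d_visible_mm = 3056 / 10.91   # ~280 mm
d_mwir_mm = 1120 / 4.0        # 280 mm
# Both channels come out at ~280 mm, consistent with the visible and
# mid-wave infrared paths plausibly sharing a single telescope aperture.
print(d_visible_mm, d_mwir_mm)
```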
A novel dual-color bifocal imaging system for single-molecule studies.
Jiang, Chang; Kaul, Neha; Campbell, Jenna; Meyhofer, Edgar
2017-05-01
In this paper, we report the design and implementation of a dual-color bifocal imaging (DBI) system that is capable of acquiring two spectrally distinct, spatially registered images of objects located either in the same focal plane or in two distinct focal planes. We achieve this by separating an image into two channels with distinct chromatic properties and independently focusing both images onto a single CCD camera. The two channels in our device are registered with subpixel accuracy, and long-term stability of the registered images with nanometer precision was accomplished by reducing the drift of the images to ∼5 nm. We demonstrate the capabilities of our DBI system by imaging biomolecules labeled with spectrally distinct dyes and micro- and nano-sized spheres located in different focal planes.
Ultrathin zoom telescopic objective.
Li, Lei; Wang, Di; Liu, Chao; Wang, Qiong-Hua
2016-08-08
We report an ultrathin zoom telescopic objective that achieves continuous zoom change in a compact volume. The objective consists of an annular folded lens and three electrowetting liquid lenses. The annular folded lens provides the main part of the focal power of the lens system. Due to a multiple-fold design, the optical path is folded within a lens of ~1.98 mm thickness. The electrowetting liquid lenses constitute the zoom part. Based on the proposed objective, an ultrathin zoom telescopic camera is demonstrated. We analyze the properties of the proposed objective. The aperture of the proposed objective is ~15 mm. The total length of the system is ~18 mm with a focal length tunable from ~48 mm to ~65 mm. Compared with a conventional zoom telescopic objective, the total length is greatly reduced.
NASA Technical Reports Server (NTRS)
Leviton, Douglas B.; Madison, Timothy J.; Petrone, Peter
1998-01-01
The focal shift of an optical filter used in non-collimated light depends directly on substrate thickness and index of refraction. The HST Advanced Camera for Surveys (ACS) requires a set of filters whose focal shifts are tightly matched. Knowing the index of refraction of the substrate glasses allows precise substrate thicknesses to be specified. Two refractometers have been developed at the Goddard Space Flight Center (GSFC) to determine the indices of refraction of the materials from which ACS filters are made. Modern imaging detectors for the near-infrared, visible, and far-ultraviolet spectral regions make these simple yet sophisticated refractometers possible. A new-technology, high-accuracy angular encoder, also developed at GSFC, makes high-precision index measurement possible in the vacuum ultraviolet.
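The dependence of focal shift on substrate thickness and index can be illustrated with the standard paraxial result for a plane-parallel plate in a converging beam, Δz ≈ t(1 − 1/n). The filter values below are hypothetical and are not the ACS numbers:

```python
# Longitudinal focal shift of a plane-parallel plate in a converging beam
# (paraxial approximation): delta_z ≈ t * (1 - 1/n).
def focal_shift(thickness_mm, refractive_index):
    return thickness_mm * (1.0 - 1.0 / refractive_index)

# Hypothetical example: two 5 mm filter substrates with slightly different indices.
for n in (1.45, 1.52):
    print(f"n = {n:.2f}: focal shift = {focal_shift(5.0, n):.3f} mm")
# Matching focal shifts across a filter set means adjusting thickness when the index differs.
```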
Apollo 12 photography 70 mm, 16 mm, and 35 mm frame index
NASA Technical Reports Server (NTRS)
1970-01-01
For each 70-mm frame, the index presents information on: (1) the focal length of the camera, (2) the photo scale at the principal point of the frame, (3) the selenographic coordinates at the principal point of the frame, (4) the percentage of forward overlap of the frame, (5) the sun angle (medium, low, high), (6) the quality of the photography, (7) the approximate tilt (minimum and maximum) of the camera, and (8) the direction of tilt. A brief description of each frame is also included. The index to the 16-mm sequence photography includes information concerning the approximate surface coverage of the photographic sequence and a brief description of the principal features shown. A column of remarks is included to indicate: (1) if the sequence is plotted on the photographic index map and (2) the quality of the photography. The pictures taken using the lunar surface closeup stereoscopic camera (35 mm) are also described in this same index format.
The Use of Video-Tacheometric Technology for Documenting and Analysing Geometric Features of Objects
NASA Astrophysics Data System (ADS)
Woźniak, Marek; Świerczyńska, Ewa; Jastrzębski, Sławomir
2015-12-01
This paper analyzes selected aspects of the use of video-tacheometric technology for inventorying and documenting the geometric features of objects. Data was collected with the video-tacheometer Topcon Image Station IS-3 and the professional camera Canon EOS 5D Mark II. During the field work and data processing the following experiments were performed: multiple determination of the camera interior orientation parameters and distortion parameters for five lenses with different focal lengths, and reflectorless measurements of profiles for the elevation and inventory of the decorative surface wall of the Warsaw Ballet School building. During the research the process of acquiring and integrating video-tacheometric data was analysed, as well as the process of combining the "point cloud" acquired by the video-tacheometer during scanning with independent photographs taken by a digital camera. On the basis of the tests performed, the utility of video-tacheometric technology in geodetic surveys of the geometric features of buildings has been established.
Innovative optronics for the new PUMA tank
NASA Astrophysics Data System (ADS)
Fritze, J.; Münzberg, M.; Schlemmer, H.
2010-04-01
The new PUMA tank is equipped with a fully stabilized 360° periscope. The thermal imager in the periscope is identical to the imager in the gunner sight. All optronic images from the cameras can be fed to every electronic display within the tank. The thermal imagers operate with a long-wave 384×288 MCT staring focal plane array. The high quantum efficiency of MCT provides low NETD values at short integration times. The thermal imager achieves an image resolution of 768×576 pixels by means of a micro scanner. The MCT detector operates at high temperatures above 75 K with high stability in noise and correctability, and offers high reliability (MTTF) values for the complete camera in a very compact design. The paper discusses the principle and functionality of the optronic combination of the direct-view optical channel, thermal imager and visible camera, and discusses in detail the performance of the subcomponents with respect to the demands of new tank applications.
Challenges and solutions for high performance SWIR lens design
NASA Astrophysics Data System (ADS)
Gardner, M. C.; Rogers, P. J.; Wilde, M. F.; Cook, T.; Shipton, A.
2016-10-01
Shortwave infrared (SWIR) cameras are becoming increasingly attractive due to the improving size, resolution and decreasing prices of InGaAs focal plane arrays (FPAs). The rapid development of competitively priced HD-performance SWIR cameras has not been matched in SWIR imaging lenses, with the result that the lens is now more likely to be the limiting factor in imaging quality than the FPA. Adapting existing lens designs from the visible region by re-coating for SWIR will improve total transmission, but diminished image-quality metrics such as MTF, and in particular degraded large-field-angle performance (vignetting, field curvature and distortion), are serious consequences. To meet this challenge, original SWIR solutions are presented, including a wide-field-of-view fixed-focal-length lens for commercial machine vision (CMV) and a wide-angle, small, lightweight defence lens, and their relevant design considerations are discussed. Issues restricting suitable glass types are examined. The index and dispersion properties at SWIR wavelengths can differ significantly from their visible values, resulting in unusual glass combinations when matching doublet elements. The materials chosen simultaneously allow athermalization of the design and provide matched CTEs within the elements of doublets. Recently, thinned backside-illuminated InGaAs devices have made Vis-SWIR cameras viable. The SWIR band is sufficiently close to the visible that the same constituent materials can be used for AR coatings covering both bands. Keeping the lens short and the mass low can easily result in high incidence angles, which in turn complicate coating design, especially when extended beyond SWIR into the visible band. This paper also explores the potential performance of wideband Vis-SWIR AR coatings.
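The "unusual glass combinations when matching doublet elements" follow from the standard thin-lens achromat condition φ1/V1 + φ2/V2 = 0 with φ1 + φ2 equal to the total power, where the V-numbers are evaluated over the SWIR band rather than the visible. A sketch with hypothetical SWIR-band V-numbers:

```python
# Thin-lens achromatic doublet: total power phi = phi1 + phi2,
# color correction phi1/V1 + phi2/V2 = 0, with V a band-specific Abbe-like number.
def achromat_powers(total_power, v1, v2):
    phi1 = total_power * v1 / (v1 - v2)
    phi2 = -total_power * v2 / (v1 - v2)
    return phi1, phi2

# Hypothetical SWIR-band V-numbers for two candidate glasses; 50 mm doublet.
phi = 1.0 / 50.0                                  # total power in 1/mm
phi1, phi2 = achromat_powers(phi, v1=90.0, v2=35.0)
print(f"f1 = {1/phi1:.1f} mm, f2 = {1/phi2:.1f} mm")  # positive crown-like, negative flint-like
```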
Orion Optical Navigation Progress Toward Exploration Mission 1
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher N.; Saley, David
2018-01-01
Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is past its Critical Design Review, and is progressing through test and certification for human rating. The filter architecture uses a square-root-free UDU covariance factorization. Linear Covariance Analysis (LinCov) was used to analyze the measurement models and the measurement error models on a representative EM-1 trajectory. The Orion EM-1 flight camera was calibrated at the Johnson Space Center (JSC) electro-optics lab. To permanently stake the focal length of the camera a 500 mm focal length refractive collimator was used. Two Engineering Design Unit (EDU) cameras and an EDU star tracker were used for a live-sky test in Denver. In-space imagery with high-fidelity truth metadata is rare so these live-sky tests provide one of the closest real-world analogs to operational use. A hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. The software is verified with synthetic images. Several hundred off-nominal images are also used to analyze robustness and fault detection in the software. These include effects such as stray light, excess radiation damage, and specular reflections, and are used to help verify the tuning parameters chosen for the algorithms such as earth atmosphere bias, minimum pixel intensity, and star detection thresholds.
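The square-root-free UDU covariance factorization mentioned above stores the covariance as P = U D Uᵀ with U unit upper-triangular and D diagonal. The following NumPy sketch of a Bierman-style factorization is illustrative only (it is not the Orion flight code) and is handy for checking a filter covariance numerically:

```python
import numpy as np

def udu_factorize(P):
    """Factor a symmetric positive-definite P as P = U @ diag(d) @ U.T,
    with U unit upper-triangular (square-root-free UDU' form)."""
    P = np.array(P, dtype=float)
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j] - np.sum(d[j + 1:] * U[j, j + 1:] ** 2)
        for i in range(j):
            U[i, j] = (P[i, j] - np.sum(d[j + 1:] * U[i, j + 1:] * U[j, j + 1:])) / d[j]
    return U, d

# Quick self-check on a random symmetric positive-definite matrix.
A = np.random.rand(4, 4)
P = A @ A.T + 4 * np.eye(4)
U, d = udu_factorize(P)
assert np.allclose(U @ np.diag(d) @ U.T, P)
```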
Development of the focal plane PNCCD camera system for the X-ray space telescope eROSITA
NASA Astrophysics Data System (ADS)
Meidinger, Norbert; Andritschke, Robert; Ebermayer, Stefanie; Elbs, Johannes; Hälker, Olaf; Hartmann, Robert; Herrmann, Sven; Kimmel, Nils; Schächner, Gabriele; Schopper, Florian; Soltau, Heike; Strüder, Lothar; Weidenspointner, Georg
2010-12-01
A so-called PNCCD, a special type of CCD, was developed twenty years ago as focal plane detector for the XMM-Newton X-ray astronomy mission of the European Space Agency ESA. Based on this detector concept and taking into account the experience of almost ten years of operation in space, a new X-ray CCD type was designed by the ‘MPI semiconductor laboratory’ for an upcoming X-ray space telescope, called eROSITA (extended Roentgen survey with an imaging telescope array). This space telescope will be equipped with seven X-ray mirror systems of Wolter-I type and seven CCD cameras, placed in their foci. The instrumentation permits the exploration of the X-ray universe in the energy band from 0.3 up to 10 keV by spectroscopic measurements with a time resolution of 50 ms for a full image comprising 384×384 pixels. Main scientific goals are an all-sky survey and investigation of the mysterious ‘Dark Energy’. The eROSITA space telescope, which is developed under the responsibility of the ‘Max-Planck-Institute for extraterrestrial physics’, is a scientific payload on the new Russian satellite ‘Spectrum-Roentgen-Gamma’ (SRG). The mission is already approved by the responsible Russian and German space agencies. After launch in 2012 the destination of the satellite is Lagrange point L2. The planned observational program takes about seven years. We describe the design of the eROSITA camera system and present important test results achieved recently with the eROSITA prototype PNCCD detector. This includes a comparison of the eROSITA detector with the XMM-Newton detector.
Astrometric Calibration and Performance of the Dark Energy Camera
Bernstein, G. M.; Armstrong, R.; Plazas, A. A.; ...
2017-05-30
We characterize the ability of the Dark Energy Camera (DECam) to perform relative astrometry across its 500 Mpix, 3 deg² science field of view, and across 4 years of operation. This is done using internal comparisons of ~4×10⁷ measurements of high-S/N stellar images obtained in repeat visits to fields of moderate stellar density, with the telescope dithered to move the sources around the array. An empirical astrometric model includes terms for: optical distortions; stray electric fields in the CCD detectors; chromatic terms in the instrumental and atmospheric optics; shifts in CCD relative positions of up to ≈10 μm when the DECam temperature cycles; and low-order distortions to each exposure from changes in atmospheric refraction and telescope alignment. Errors in this astrometric model are dominated by stochastic variations with typical amplitudes of 10-30 mas (in a 30 s exposure) and 5-10 arcmin coherence length, plausibly attributed to Kolmogorov-spectrum atmospheric turbulence. The size of these atmospheric distortions is not closely related to the seeing. Given an astrometric reference catalog at density ≈0.7 arcmin⁻², e.g. from Gaia, the typical atmospheric distortions can be interpolated to ≈7 mas RMS accuracy (for 30 s exposures) with 1 arcmin coherence length for residual errors. Remaining detectable error contributors are 2-4 mas RMS from unmodelled stray electric fields in the devices, and another 2-4 mas RMS from focal plane shifts between camera thermal cycles. Thus the astrometric solution for a single DECam exposure is accurate to 3-6 mas (≈0.02 pixels, or ≈300 nm) on the focal plane, plus the stochastic atmospheric distortion.
Cooling the dark energy camera instrument
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, R.L.; Cease, H.; /Fermilab
2008-06-01
DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al. in these proceedings; for a description of the imager, see Cease, et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods: liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thompson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper describes the engineering basis including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow model.
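As a rough illustration of the sizing that feeds such a thermal model, a lumped-capacity estimate divides the enthalpy to be removed by the net refrigeration margin. The cold-mass weight, heat capacity and cooling power below are purely hypothetical and are not DECam values; only the 173 K set point and 113 W load come from the text:

```python
# Lumped-capacity cooldown estimate: t ≈ m * c * (T_start - T_end) / (Q_cool - Q_load).
# Assumes a constant specific heat, which overestimates the energy at low temperature.
def cooldown_hours(mass_kg, c_J_per_kgK, t_start_K, t_end_K, q_cool_W, q_load_W):
    energy_J = mass_kg * c_J_per_kgK * (t_start_K - t_end_K)
    net_W = q_cool_W - q_load_W
    return energy_J / net_W / 3600.0

# Hypothetical 200 kg copper-like cold mass, cooled 300 K -> 173 K,
# with 500 W of refrigeration against the stated 113 W parasitic load.
print(f"{cooldown_hours(200, 385, 300, 173, 500, 113):.1f} h")
```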
Imaging characteristics of photogrammetric camera systems
Welch, R.; Halliday, J.
1973-01-01
In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were assessed, along with procedures for analyzing image quality and predicting and comparing performance capabilities.
Rapid-cadence optical monitoring for short-period variability of ɛ Aurigae
NASA Astrophysics Data System (ADS)
Billings, Gary
2013-07-01
ɛ Aurigae was observed with CCD cameras and 35 mm SLR camera lenses, at rapid cadence (>1/minute), for long runs (up to 11 hours), on multiple occasions during 2009 - 2011, to monitor for variability of the system at scales of minutes to hours. The lens and camera were changed during the period to improve results, finalizing on a 135 mm focal length Canon f/2 lens (at f/2.8), an ND8 neutral density filter, a Johnson V filter, and an SBIG ST-8XME camera (Kodak KAF-1603ME microlensed chip). Differential photometry was attempted, but because of the large separation between the variable and comparison star (η Aur), noise caused by transient extinction variations was not consistently eliminated. The lowest-noise time series for searching for short-period variability proved to be the extinction-corrected instrumental magnitude of ɛ Aur obtained on "photometric nights", with η Aur used to determine and monitor the extinction coefficient for the night. No flares or short-period variations of ɛ Aur were detected by visual inspection of the light curves from observing runs with noise levels as low as 0.008 magnitudes rms.
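The extinction-corrected photometry described above amounts to fitting the comparison star's instrumental magnitude against airmass to obtain the nightly coefficient k, then applying m0 = m_inst − kX to the target. A minimal sketch with made-up numbers (the comparison-star values below are hypothetical, not the paper's data):

```python
import numpy as np

# Determine the nightly extinction coefficient k from a comparison star (e.g. eta Aur)
# observed over a range of airmasses: m_inst = m0 + k * X  (linear fit).
airmass = np.array([1.05, 1.20, 1.45, 1.80, 2.10])   # hypothetical airmasses
m_comp  = np.array([5.42, 5.45, 5.50, 5.57, 5.63])   # hypothetical instrumental magnitudes
k, m0_comp = np.polyfit(airmass, m_comp, 1)

# Apply the correction to the target's instrumental magnitude at its own airmass.
def extinction_corrected(m_inst, X, k):
    return m_inst - k * X

print(f"k = {k:.3f} mag/airmass")
print(f"corrected target mag: {extinction_corrected(3.85, 1.30, k):.3f}")
```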
NASA Technical Reports Server (NTRS)
1980-01-01
Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.
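For a frame camera like this, the photo scale and ground footprint follow directly from the focal length and flying height (scale = f/H, ground size = format × H/f). The sketch below uses the LFC's 12-inch focal length and 9 × 18-inch format; the 300 km altitude is an illustrative assumption, not a value from the document:

```python
# Frame-camera photo scale and ground footprint: scale = f / H, ground size = format * H / f.
INCH = 0.0254  # m

f = 12 * INCH                 # focal length, m
fmt = (9 * INCH, 18 * INCH)   # film format, m (cross-track, along-track)
H = 300e3                     # assumed flying height, m (illustrative only)

scale = f / H
footprint_km = tuple(s * H / f / 1e3 for s in fmt)
print(f"photo scale 1:{1/scale:,.0f}")
print(f"ground footprint ~{footprint_km[0]:.0f} km x {footprint_km[1]:.0f} km")
```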
Opto-mechanical design of the G-CLEF flexure control camera system
NASA Astrophysics Data System (ADS)
Oh, Jae Sok; Park, Chan; Kim, Jihun; Kim, Kang-Min; Chun, Moo-Young; Yu, Young Sam; Lee, Sungho; Nah, Jakyoung; Park, Sung-Joon; Szentgyorgyi, Andrew; McMuldroch, Stuart; Norton, Timothy; Podgorski, William; Evans, Ian; Mueller, Mark; Uomoto, Alan; Crane, Jeffrey; Hare, Tyson
2016-08-01
The GMT-Consortium Large Earth Finder (G-CLEF) is the first-light instrument of the Giant Magellan Telescope (GMT). The G-CLEF is a fiber-fed, optical-band echelle spectrograph that is capable of extremely precise radial velocity measurement. KASI (Korea Astronomy and Space Science Institute) is responsible for the Flexure Control Camera (FCC) included in the G-CLEF Front End Assembly (GCFEA). The FCC is a kind of guide camera, which monitors the field images focused on a fiber mirror to control the flexure and focus errors within the GCFEA. The FCC consists of five optical components: a collimator including triple lenses for producing a pupil, neutral density filters allowing a much brighter star to be used as a target or a guide, a tent prism as a focus analyzer for measuring the focus offset at the fiber mirror, a reimaging camera with three pairs of lenses for focusing the beam on a CCD focal plane, and a CCD detector for capturing the image on the fiber mirror. In this article, we present the optical and mechanical FCC designs, which have been modified after the PDR in April 2015.
Lee, Chulsung; Lee, Dustin; Darling, Cynthia L; Fried, Daniel
2010-01-01
The high transparency of dental enamel in the near-infrared (NIR) at 1310 nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study is to determine whether the lesion contrast derived from NIR imaging in both transmission and reflectance can be used to estimate lesion severity. Two NIR imaging detector technologies are investigated: a new Ge-enhanced complementary metal-oxide-semiconductor (CMOS)-based NIR imaging camera, and an InGaAs focal plane array (FPA). Natural occlusal caries lesions are imaged with both cameras at 1310 nm, and the image contrast between sound and carious regions is calculated. After NIR imaging, teeth are sectioned and examined using polarized light microscopy (PLM) and transverse microradiography (TMR) to determine lesion severity. Lesions are then classified into four categories according to lesion severity. Lesion contrast increases significantly with lesion severity for both cameras (p<0.05). The Ge-enhanced CMOS camera equipped with the larger array and smaller pixels yields higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity.
Camera for Quasars in the Early Universe (CQUEAN)
NASA Astrophysics Data System (ADS)
Kim, Eunbin; Park, W.; Lim, J.; Jeong, H.; Kim, J.; Oh, H.; Pak, S.; Im, M.; Kuehne, J.
2010-05-01
The early universe at z ≳ 7 is where the first stars, galaxies, and quasars formed, starting the re-ionization of the universe. The discovery and study of quasars in the early universe allow us to witness the beginning of the history of astronomical objects. In order to perform a medium-deep, medium-wide imaging survey of quasars, we are developing an optical CCD camera, CQUEAN (Camera for QUasars in EArly uNiverse), which uses a 1024×1024 pixel deep-depletion CCD. It has enhanced QE compared with a conventional CCD in the wavelength band around 1 μm, and thus will be an efficient tool for the observation of quasars at z > 7. It will be attached to the 2.1 m telescope at McDonald Observatory, USA. A focal reducer is designed to secure a larger field of view at the Cassegrain focus of the 2.1 m telescope. For long stable exposures, an auto-guiding system will be implemented using another CCD camera viewing an off-axis field. All these instruments will be controlled by software written in Python on a Linux platform. CQUEAN is expected to see first light during the summer of 2010.
Skupsky, Stanley; Kessler, Terrance J.; Letzring, Samuel A.
1993-01-01
A temporally shaped or modified optical output pulse is generated from a bandwidth-encoded optical input pulse in a system in which the input pulse is in the form of a beam which is spectrally spread into components contained within the bandwidth, followed by deflection of the spectrally spread beam (SBD) thereby spatially mapping the components in correspondence with the temporal input pulse profile in the focal plane of a lens, and by spatially selective attenuation of selected components in that focal plane. The shaped or modified optical output pulse is then reconstructed from the attenuated spectral components. The pulse-shaping system is particularly useful for generating optical pulses of selected temporal shape over a wide range of pulse duration, such pulses finding application in the fields of optical communication, optical recording and data storage, atomic and molecular spectroscopy and laser fusion. An optical streak camera is also provided which uses SBD to display the beam intensity in the focal plane as a function of time during the input pulse.
NASA Astrophysics Data System (ADS)
Fuh, Yiin-Kuen; Lai, Zheng-Hong
2017-02-01
A fast processing route for an aspheric polydimethylsiloxane (PDMS) lens array (APLA) is proposed via the combined effect of inverted gravitational and heat-assisted forces. The fabrication time can be dramatically reduced to 30 s, which compares favorably with the traditional duration of about 2 hours of repeated addition-curing cycles. In this paper, a low-cost flexible lens is fabricated by repeatedly depositing, inverting, and curing a hanging transparent PDMS elastomer droplet on a previously deposited curved structure. Complex structures with aspheric curve features and various focal lengths can be successfully produced, and the four fabricated APLA types have focal lengths of 7.03 mm, 6.00 mm, 5.33 mm, and 4.43 mm, respectively. Empirically, a direct relationship between the PDMS volume and the focal lengths of the lenses can be deduced. Using these fabricated APLAs, an ordinary commercial smartphone camera can easily be transformed into a low-cost, portable digital microscope (50× magnification), so that point-of-care diagnostics can be implemented pervasively.
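The reported link between deposited PDMS volume and focal length can be illustrated with the thin plano-convex lensmaker relation f ≈ R/(n − 1), treating each cured droplet as a spherical cap of base radius a and sag h. The real lenses are aspheric, so this is only a rough sketch; the refractive index n ≈ 1.41 and the cap dimensions below are assumptions:

```python
# Spherical-cap geometry: a droplet of base radius a and sag h has radius of curvature
# R = (a**2 + h**2) / (2*h); a thin plano-convex lens then has f ≈ R / (n - 1).
def focal_length_mm(base_radius_mm, sag_mm, n_pdms=1.41):
    R = (base_radius_mm**2 + sag_mm**2) / (2.0 * sag_mm)
    return R / (n_pdms - 1.0)

# Illustrative cap shapes: more deposited volume -> larger sag -> smaller R -> shorter f.
for a, h in [(2.5, 0.8), (2.5, 1.0), (2.5, 1.2)]:
    print(f"a={a} mm, h={h} mm -> f ≈ {focal_length_mm(a, h):.1f} mm")
```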
Optical Characterization of the SPT-3G Focal Plane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pan, Z.; et al.
The third-generation South Pole Telescope camera is designed to measure the cosmic microwave background across three frequency bands (95, 150 and 220 GHz) with ~16,000 transition-edge sensor (TES) bolometers. Each multichroic pixel on a detector wafer has a broadband sinuous antenna that couples power to six TESs, one for each of the three observing bands and both polarization directions, via lumped element filters. Ten detector wafers populate the focal plane, which is coupled to the sky via a large-aperture optical system. Here we present the frequency band characterization with Fourier transform spectroscopy, measurements of optical time constants, beam properties, and optical and polarization efficiencies of the focal plane. The detectors have frequency bands consistent with our simulations and have high average optical efficiency of 86%, 77% and 66% for the 95, 150 and 220 GHz detectors, respectively. The time constants of the detectors are mostly between 0.5 ms and 5 ms. The beam is round with the correct size, and the polarization efficiency is more than 90% for most of the bolometers.
Target-Tracking Camera for a Metrology System
NASA Technical Reports Server (NTRS)
Liebe, Carl; Bartman, Randall; Chapsky, Jacob; Abramovici, Alexander; Brown, David
2009-01-01
An analog electronic camera that is part of a metrology system measures the varying direction to a light-emitting diode that serves as a bright point target. In the original application for which the camera was developed, the metrological system is used to determine the varying relative positions of radiating elements of an airborne synthetic aperture-radar (SAR) antenna as the airplane flexes during flight; precise knowledge of the relative positions as a function of time is needed for processing SAR readings. It has been common metrology system practice to measure the varying direction to a bright target by use of an electronic camera of the charge-coupled-device or active-pixel-sensor type. A major disadvantage of this practice arises from the necessity of reading out and digitizing the outputs from a large number of pixels and processing the resulting digital values in a computer to determine the centroid of a target: Because of the time taken by the readout, digitization, and computation, the update rate is limited to tens of hertz. In contrast, the analog nature of the present camera makes it possible to achieve an update rate of hundreds of hertz, and no computer is needed to determine the centroid. The camera is based on a position-sensitive detector (PSD), which is a rectangular photodiode with output contacts at opposite ends. PSDs are usually used in triangulation for measuring small distances. PSDs are manufactured in both one- and two-dimensional versions. Because it is very difficult to calibrate two-dimensional PSDs accurately, the focal-plane sensors used in this camera are two orthogonally mounted one-dimensional PSDs.
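A one-dimensional PSD encodes the spot centroid in the ratio of the photocurrents collected at its two end contacts, x = (L/2)(I2 − I1)/(I1 + I2) for an active length L. A minimal sketch (the currents and detector length below are illustrative, not the instrument's calibration):

```python
# Spot position on a 1-D position-sensitive detector (PSD) of active length L:
# x = (L / 2) * (I2 - I1) / (I1 + I2), measured from the detector centre.
def psd_position(i1_A, i2_A, length_mm):
    return 0.5 * length_mm * (i2_A - i1_A) / (i1_A + i2_A)

# Illustrative photocurrents from the two end contacts of a 10 mm PSD.
x = psd_position(i1_A=3.2e-6, i2_A=4.8e-6, length_mm=10.0)
print(f"spot offset from centre: {x:+.3f} mm")   # +1.000 mm
```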
The opto-cryo-mechanical design of the short wavelength camera for the CCAT Observatory
NASA Astrophysics Data System (ADS)
Parshley, Stephen C.; Adams, Joseph; Nikola, Thomas; Stacey, Gordon J.
2014-07-01
The CCAT observatory is a 25-m class Gregorian telescope designed for submillimeter observations that will be deployed at Cerro Chajnantor (~5600 m) in the high Atacama Desert region of Chile. The Short Wavelength Camera (SWCam) for CCAT is an integral part of the observatory, enabling the study of star formation at high and low redshifts. SWCam will be a facility instrument, available at first light and operating in the telluric windows at wavelengths of 350, 450, and 850 μm. In order to trace the large curvature of the CCAT focal plane, and to suit the available instrument space, SWCam is divided into seven sub-cameras, each configured to a particular telluric window. A fully refractive optical design in each sub-camera will produce diffraction-limited images. The material of choice for the optical elements is silicon, due to its excellent transmission in the submillimeter and its high index of refraction, enabling thin lenses of a given power. The cryostat's vacuum windows double as the sub-cameras' field lenses and are ~30 cm in diameter. The other lenses are mounted at 4 K. The sub-cameras will share a single cryostat providing thermal intercepts at 80, 15, 4, 1 and 0.1 K, with cooling provided by pulse tube cryocoolers and a dilution refrigerator. The use of the intermediate temperature stage at 15 K minimizes the load at 4 K and reduces operating costs. We discuss our design requirements, specifications, key elements and expected performance of the optical, thermal and mechanical design for the short wavelength camera for CCAT.
Low-cost uncooled VOx infrared camera development
NASA Astrophysics Data System (ADS)
Li, Chuan; Han, C. J.; Skidmore, George D.; Cook, Grady; Kubala, Kenny; Bates, Robert; Temple, Dorota; Lannon, John; Hilton, Allan; Glukh, Konstantin; Hardy, Busbee
2013-06-01
The DRS Tamarisk® 320 camera, introduced in 2011, is a low cost commercial camera based on the 17 µm pixel pitch 320×240 VOx microbolometer technology. A higher resolution 17 µm pixel pitch 640×480 Tamarisk®640 has also been developed and is now in production serving the commercial markets. Recently, under the DARPA sponsored Low Cost Thermal Imager-Manufacturing (LCTI-M) program and internal project, DRS is leading a team of industrial experts from FiveFocal, RTI International and MEMSCAP to develop a small form factor uncooled infrared camera for the military and commercial markets. The objective of the DARPA LCTI-M program is to develop a low SWaP camera (<3.5 cm3 in volume and <500 mW in power consumption) that costs less than US $500 based on a 10,000 units per month production rate. To meet this challenge, DRS is developing several innovative technologies including a small pixel pitch 640×512 VOx uncooled detector, an advanced digital ROIC and low power miniature camera electronics. In addition, DRS and its partners are developing innovative manufacturing processes to reduce production cycle time and costs including wafer scale optic and vacuum packaging manufacturing and a 3-dimensional integrated camera assembly. This paper provides an overview of the DRS Tamarisk® project and LCTI-M related uncooled technology development activities. Highlights of recent progress and challenges will also be discussed. It should be noted that BAE Systems and Raytheon Vision Systems are also participants of the DARPA LCTI-M program.
Comparison Between RGB and Rgb-D Cameras for Supporting Low-Cost Gnss Urban Navigation
NASA Astrophysics Data System (ADS)
Rossi, L.; De Gaetani, C. I.; Pagliari, D.; Realini, E.; Reguzzoni, M.; Pinto, L.
2018-05-01
A pure GNSS navigation solution is often unreliable in urban areas because of the presence of obstructions, which prevent correct reception of the satellite signal. Bridging GNSS outages, as well as reconstructing the vehicle attitude, can be achieved by using complementary information, such as visual data acquired by RGB-D or RGB cameras. In this work, the possibility of integrating low-cost GNSS and visual data by means of an extended Kalman filter has been investigated. The focus is on the comparison between the use of RGB-D and RGB cameras. In particular, a Microsoft Kinect device (second generation) and a mirrorless Canon EOS M RGB camera have been compared. The former is an interesting RGB-D camera because of its low cost, ease of use and raw data accessibility. The latter has been selected for the high quality of the acquired images and for the possibility of mounting fixed focal length lenses with a lower weight and cost with respect to a reflex camera. The designed extended Kalman filter takes as input the GNSS-only trajectory and the relative orientation between subsequent pairs of images. Depending on the visual data acquisition system, the filter differs because RGB-D cameras acquire both RGB and depth data, allowing the scale problem, which is typical of image-only solutions, to be solved. The two systems and filtering approaches were assessed by ad-hoc experimental tests, showing that the use of a Kinect device to support a u-blox low-cost receiver led to a trajectory with decimeter accuracy, about 15% better than the one obtained when using the Canon EOS M camera.
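As a rough illustration of the fusion idea (this is not the authors' filter), a minimal Kalman-style sketch can use the camera-derived relative displacement between epochs in the prediction step and GNSS position fixes, when available, in the update step; the class and variable names below are hypothetical:

```python
import numpy as np

# Minimal Kalman-style fusion sketch: visual odometry supplies a relative displacement
# between epochs (prediction); GNSS supplies absolute 2-D fixes (update, when available).
class GnssVisionFilter:
    def __init__(self, x0, P0, q_vo, r_gnss):
        self.x = np.asarray(x0, float)   # state: [east, north]
        self.P = np.asarray(P0, float)   # state covariance
        self.Q = q_vo * np.eye(2)        # visual-odometry process noise
        self.R = r_gnss * np.eye(2)      # GNSS measurement noise

    def predict(self, delta_xy):         # camera-derived displacement
        self.x = self.x + np.asarray(delta_xy, float)
        self.P = self.P + self.Q

    def update(self, z_gnss):            # GNSS position fix; skip this call during outages
        S = self.P + self.R
        K = self.P @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z_gnss, float) - self.x)
        self.P = (np.eye(2) - K) @ self.P

kf = GnssVisionFilter(x0=[0, 0], P0=np.eye(2), q_vo=0.05, r_gnss=1.0)
kf.predict([1.0, 0.2])    # motion from the camera
kf.update([1.1, 0.15])    # GNSS fix; omit to simulate an outage
print(kf.x)
```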
Medical Applications of IR Focal Plane Arrays
1998-03-15
(Snippet of report front matter: author affiliations at the University of Memphis and the University of Tennessee at Memphis; a figure caption comparing optical aperture versus thermal sensitivity at two different resolution settings for an optimized medical IR camera; a list of tables; and a technology-transfer note stating that, through this work, infrared imaging in medicine was exposed to ever-growing audiences.)
Simon Newcomb, America’s First Great Astronomer
2009-02-01
1874 and 1882 transits of Venus across the Sun. A heliostat tracked the Sun and reflected its light through a fixed telescope, where the image was...a new and unique camera consisting of a heliostat, long-focal-length telescope, and photographic plate assembly (see figures 2 and 3). While the...and relays or solenoids qualified as leading- (Figure labels: remote mirror, objective lenses, rotating mirror, fixed mirror, observer's eyepiece, adjustable slit, heliostat.)
The readout system for the ArTeMis camera
NASA Astrophysics Data System (ADS)
Doumayrou, E.; Lortholary, M.; Dumaye, L.; Hamon, G.
2014-07-01
During ArTeMiS observations at the APEX telescope (Chajnantor, Chile), 5760 bolometric pixels from 20 arrays at 300 mK, corresponding to 3 submillimeter focal planes at 450 μm, 350 μm and 200 μm, have to be read out simultaneously at 40 Hz. The readout system, made of electronics and software, is the full chain from the cryostat to the telescope. The readout electronics consists of cryogenic buffers at 4 K (NABU), based on CMOS technology, and of warm electronic acquisition systems called BOLERO. The bolometric signal given by each pixel has to be amplified, sampled, converted, time stamped and formatted into data packets by the BOLERO electronics. The time stamping is obtained by decoding an IRIG-B signal provided by APEX and is key to ensuring the synchronization of the data with the telescope. Specifically developed for ArTeMiS, BOLERO is an assembly of analogue and digital FPGA boards connected directly on top of the cryostat. Two detector arrays (18×16 pixels), one NABU and one BOLERO interconnected by ribbon cables constitute the unit of the electronic architecture of ArTeMiS. In total, the 20 detectors for the three focal planes are read by 10 BOLEROs. The software runs on a Linux operating system, on 2 back-end computers (called BEAR), which are small and robust PCs with solid state disks. They gather the 10 BOLERO data fluxes and reconstruct the focal plane images. When the telescope scans the sky, the acquisitions are triggered thanks to a specific network protocol. This interface with APEX makes it possible to synchronize the acquisition with the observations on sky: the time-stamped data packets are sent during the scans to the APEX software that builds the observation FITS files. A graphical user interface enables the setting of the camera and the real-time display of the focal plane images, which is essential in laboratory and commissioning phases. The software is a set of C++, LabVIEW and Python components, whose qualities are respectively used for speed, powerful graphical interfacing and scripting. The commands to the camera can be sequenced in Python scripts. The paper describes the whole electronic and software readout chain designed to fulfill the specific requirements of ArTeMiS and its performance. The specific options used are explained; for example, the limited room in the Cassegrain cabin of APEX has led us to a quite compact design. This system was successfully used in summer 2013 for the commissioning and the first scientific observations with a preliminary set of 4 detectors at 350 μm.
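As an illustration of the kind of time-stamped frame packet described above, one 18×16-pixel frame could be serialized with an IRIG-B-derived timestamp as sketched below; the actual BOLERO packet layout is not given in the text, so the field names and sizes here are hypothetical:

```python
import struct
import numpy as np

# Hypothetical packet layout (illustrative only): array id, frame counter,
# IRIG-B-derived timestamp in nanoseconds, then 18*16 uint16 samples.
HEADER = struct.Struct("<HIQ")   # array_id, frame_counter, timestamp_ns

def pack_frame(array_id, counter, timestamp_ns, samples):
    assert samples.shape == (18, 16) and samples.dtype == np.uint16
    return HEADER.pack(array_id, counter, timestamp_ns) + samples.tobytes()

def unpack_frame(packet):
    array_id, counter, t_ns = HEADER.unpack_from(packet)
    samples = np.frombuffer(packet, dtype=np.uint16, offset=HEADER.size).reshape(18, 16)
    return array_id, counter, t_ns, samples

pkt = pack_frame(3, 12045, 1_372_636_800_000_000_000, np.zeros((18, 16), np.uint16))
print(len(pkt), unpack_frame(pkt)[:3])   # 14-byte header + 576 bytes of samples
```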
Focal plane arrays based on Type-II indium arsenide/gallium antimonide superlattices
NASA Astrophysics Data System (ADS)
Delaunay, Pierre-Yves
The goal of this work is to demonstrate that Type-II InAs/GaSb superlattices can perform high-quality infrared imaging from the middle (MWIR) to the long (LWIR) wavelength infrared range. Theoretically, focal plane arrays (FPAs) based on this technology could be operated at higher temperatures, with lower dark currents than the leading HgCdTe platform. This effort focuses on the fabrication of MWIR and LWIR FPAs with performance similar to existing infrared cameras. Some applications in the MWIR require fast, sensitive imagers able to sustain frame rates up to 100 Hz. Such speed can only be achieved with photon detectors. However, these cameras need to be operated below 170 K. Current research in this spectral band focuses on increasing the operating temperature of the FPA to a point where cooling could be performed with compact and reliable thermoelectric coolers. A Type-II superlattice was used to demonstrate a camera that presented similar performance to HgCdTe and that could be operated up to room temperature. At 80 K, the camera could detect temperature differences as low as 10 mK for an integration time shorter than 25 ms. In the LWIR, the electrical performance of Type-II photodiodes is mainly limited by surface leakage. Aggressive processing steps such as hybridization and underfill can increase the dark current of the devices by several orders of magnitude. New cleaning and passivation techniques were used to reduce the dark current of FPA diodes by two orders of magnitude. The absorbing GaSb substrate was also removed to increase the quantum efficiency of the devices up to 90%. At 80 K, a FPA with a 9.6 μm 50% cutoff in responsivity was able to detect temperature differences as low as 19 mK, limited only by the performance of the testing system. The non-uniformity in responsivity reached 3.8% for a 98.2% operability. The third generation of infrared cameras is based on multi-band imaging in order to improve the recognition capabilities of the imager. Preliminary detectors based on back-to-back diodes presented similar performance to single-color devices; the quantum efficiency was measured to be higher than 40% for both bands. Preliminary imaging results were demonstrated in the LWIR.
NASA Astrophysics Data System (ADS)
Swain, Pradyumna; Mark, David
2004-09-01
The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised with curved CCD applications, in conjunction with large-format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associated wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than their flat CCD counterparts, lightening the cameras and enabling faster scanning and 3D integration of objects moving within a planetary terrain environment. Preliminary experiments conducted at the Sarnoff Corporation indicate the feasibility of curved CCD imagers with acceptable electro-optic integrity. Currently, we are in the process of evaluating the electro-optic performance of a curved wafer-scale CCD imager. Detailed ray-trace modeling and experimental electro-optical performance data obtained from the curved imager will be presented at the conference.
TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope
NASA Astrophysics Data System (ADS)
Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.
Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle Helium cryo-cooled imaging camera equipped with a Raytheon 512×512 pixels InSb Aladdin III Quadrant focal plane array (FPA) having sensitivity to photons in the 1-5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on the calibration observations carried out during 2017 May 11-14 and 2017 October 7-31. After the preliminary characterization, the camera has been released to the Indian and Belgian astronomical community for science observations since 2017 May. The camera offers a field-of-view (FoV) of ~86.5″×86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands comparable to other 4 m class telescopes available world-wide. Another highlight of this camera is the observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λcen ~ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer-Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.
2003-03-07
File name :DSC_0749.JPG File size :1.1MB(1174690Bytes) Date taken :2003/03/07 13:51:29 Image size :2000 x 1312 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D1H Quality mode :FINE Metering mode :Matrix Exposure mode :Shutter priority Speed light :No Focal length :20 mm Shutter speed :1/500second Aperture :F11.0 Exposure compensation :0 EV White Balance :Auto Lens :20 mm F 2.8 Flash sync mode :N/A Exposure difference :0.0 EV Flexible program :No Sensitivity :ISO200 Sharpening :Normal Image Type :Color Color Mode :Mode II(Adobe RGB) Hue adjustment :3 Saturation Control :N/A Tone compensation :Normal Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A
2002-02-19
File name :DSC_0028.JPG File size :2.8MB(2950833Bytes) Date taken :2002/02/19 09:49:01 Image size :3008 x 2000 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D100 Quality mode :N/A Metering mode :Matrix Exposure mode :Shutter priority Speed light :Yes Focal length :24 mm Shutter speed :1/60second Aperture :F3.5 Exposure compensation :0 EV White Balance :N/A Lens :N/A Flash sync mode :N/A Exposure difference :N/A Flexible program :N/A Sensitivity :N/A Sharpening :N/A Image Type :Color Color Mode :N/A Hue adjustment :N/A Saturation Control :N/A Tone compensation :N/A Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A
2002-02-24
File name :DSC_0047.JPG File size :2.8MB(2931574Bytes) Date taken :2002/02/24 10:06:57 Image size :3008 x 2000 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D100 Quality mode :N/A Metering mode :Matrix Exposure mode :Shutter priority Speed light :Yes Focal length :24 mm Shutter speed :1/180second Aperture :F20.0 Exposure compensation :+0.3 EV White Balance :N/A Lens :N/A Flash sync mode :N/A Exposure difference :N/A Flexible program :N/A Sensitivity :N/A Sharpening :N/A Image Type :Color Color Mode :N/A Hue adjustment :N/A Saturation Control :N/A Tone compensation :N/A Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A
IRAIT project: future mid-IR operations at Dome C during summer
NASA Astrophysics Data System (ADS)
Tosti, Gino; IRAIT Collaboration
The IRAIT project consists of a robotic mid-infrared telescope that will be hosted at Dome C at the Italian-French Concordia station on the Antarctic Plateau. The telescope was built in collaboration with the PNRA (Technology and Earth-Sun Interaction and Astrophysics sectors). Its focal plane instrumentation is a mid-infrared camera (5-25 μm), based on the TIRCAM II prototype, which is the result of a joint effort between institutes of CNR and INAF. International collaborations with French and Spanish institutes for the construction of a near-infrared spectrographic camera have also been started. We present the status of the project and the ongoing developments that will make it possible to start infrared observations at Dome C during the 2005-2006 summer Antarctic campaign.
Theoretical performance model for single image depth from defocus.
Trouvé-Peloux, Pauline; Champagnat, Frédéric; Le Besnerais, Guy; Idier, Jérôme
2014-12-01
In this paper we present a performance model for depth estimation using single image depth from defocus (SIDFD). Our model is based on an original expression of the Cramér-Rao bound (CRB) in this context. We show that this model is consistent with the expected behavior of SIDFD. We then study the influence on the performance of the optical parameters of a conventional camera such as the focal length, the aperture, and the position of the in-focus plane (IFP). We derive an approximate analytical expression of the CRB away from the IFP, and we propose an interpretation of the SIDFD performance in this domain. Finally, we illustrate the predictive capacity of our performance model on experimental data comparing several settings of a consumer camera.
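The depth dependence that SIDFD exploits can be written with the thin-lens model: an object at depth z images at v(z) = 1/(1/f − 1/z), and with the sensor fixed at the image distance of the in-focus plane the blur-circle diameter is b(z) = D·|v_IFP − v(z)|/v(z), with aperture D = f/N. A sketch with consumer-camera-like numbers (illustrative values, not the paper's settings):

```python
# Thin-lens defocus blur: an object at depth z images at v(z) = 1/(1/f - 1/z);
# with the sensor fixed at v_ifp (image distance of the in-focus plane),
# the blur-circle diameter is b(z) = D * |v_ifp - v(z)| / v(z), with D = f / N.
def blur_diameter_mm(z_m, f_mm=50.0, f_number=2.8, z_focus_m=2.0):
    f = f_mm / 1000.0
    D = f / f_number
    v = lambda z: 1.0 / (1.0 / f - 1.0 / z)
    return 1000.0 * D * abs(v(z_focus_m) - v(z_m)) / v(z_m)

for z in (1.0, 1.5, 2.0, 3.0, 5.0):
    print(f"z = {z:>3} m -> blur ≈ {blur_diameter_mm(z):.3f} mm")   # zero at the in-focus plane
```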
The NIKA2 Large Field-of-View Millimeter Continuum Camera for the 30-M IRAM Telescope
NASA Astrophysics Data System (ADS)
Monfardini, Alessandro
2018-01-01
We have constructed and deployed a multi-thousand-pixel dual-band (150 and 260 GHz, respectively 2 mm and 1.15 mm wavelengths) camera to image an instantaneous field-of-view of 6.5 arcmin, configurable to map the linear polarization at 260 GHz. We provide a detailed description of this instrument, named NIKA2 (New IRAM KID Arrays 2), focusing in particular on the cryogenics, the optics, the focal plane arrays based on Kinetic Inductance Detectors (KID) and the readout electronics. We present the performance measured on the sky during the commissioning runs that took place between October 2015 and April 2017 at the 30-meter IRAM (Institute of Millimetric Radio Astronomy) telescope at Pico Veleta, together with preliminary science-grade results.
Valero-Cabré, Antoni; Pascual-Leone, Alvaro; Coubard, Olivier A.
2011-01-01
Introduction: Non-invasive brain stimulation methods such as Transcranial Magnetic Stimulation (TMS) are widely used to establish causal inferences about the relationships between brain and behavior. Clinical applications based on TMS have also been developed to treat neurological or psychiatric conditions such as depression, dystonia, pain, tinnitus, or the sequelae of stroke. State of the art: TMS works by inducing electrical currents non-invasively and focally in cortical regions, thereby modulating their level of activity in a way that depends on the frequency, number of pulses, intervals, and duration of the stimulation used. For the motor cortex, for example, it is known that low-frequency or continuously delivered TMS pulse patterns tend to depress local activity, whereas high-frequency and intermittent TMS tends to potentiate it. Beyond its local effects, TMS can also affect remote brain regions, via anatomical connections and depending on the efficacy and sign of those connections. Perspectives: In both basic research and therapeutic applications, the effective use of TMS nevertheless requires a thorough understanding of its operating principles, risks, potential, and limits. In this article, we present the principles by which non-invasive brain stimulation methods, and TMS in particular, operate. Conclusion: After reading it, the reader will be able to critically discuss scientific and clinical studies using TMS, and to design hypothesis-driven TMS applications in basic and/or clinical neuroscience research. PMID:21420698
Automated translating beam profiler for in situ laser beam spot-size and focal position measurements
NASA Astrophysics Data System (ADS)
Keaveney, James
2018-03-01
We present a simple and convenient, high-resolution solution for automated laser-beam profiling with axial translation. The device is based on a Raspberry Pi computer, Pi Noir CMOS camera, stepper motor, and commercial translation stage. We also provide software to run the device. The CMOS sensor is sensitive over a large wavelength range between 300 and 1100 nm and can be translated over 25 mm along the beam axis. The sensor head can be reversed without changing its axial position, allowing for a quantitative estimate of beam overlap with counter-propagating laser beams. Although not limited to this application, the intended use for this device is the automated measurement of the focal position and spot-size of a Gaussian laser beam. We present example data of one such measurement to illustrate device performance.
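The intended spot-size and focal-position measurement amounts to fitting the measured beam radius versus axial position to the Gaussian caustic w(z) = w0·sqrt(1 + ((z − z0)/zR)²) with zR = πw0²/λ. A minimal sketch on synthetic data (the wavelength, waist and noise level are assumptions, not data from the device):

```python
import numpy as np
from scipy.optimize import curve_fit

WAVELENGTH = 780e-9  # m, illustrative

def caustic(z, w0, z0):
    """Gaussian-beam 1/e^2 radius versus axial position z."""
    zR = np.pi * w0**2 / WAVELENGTH
    return w0 * np.sqrt(1.0 + ((z - z0) / zR)**2)

# Synthetic 'measurements' over the 25 mm travel, with a little noise.
z = np.linspace(0.0, 25e-3, 26)
rng = np.random.default_rng(0)
w_meas = caustic(z, 60e-6, 12e-3) + rng.normal(0, 1e-6, z.size)

(w0_fit, z0_fit), _ = curve_fit(caustic, z, w_meas, p0=(50e-6, 10e-3))
print(f"waist w0 ≈ {w0_fit*1e6:.1f} um at focal position z0 ≈ {z0_fit*1e3:.2f} mm")
```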
Variable Shadow Screens for Imaging Optical Devices
NASA Technical Reports Server (NTRS)
Lu, Ed; Chretien, Jean L.
2004-01-01
Variable shadow screens have been proposed for reducing the apparent brightnesses of very bright light sources relative to other sources within the fields of view of diverse imaging optical devices, including video and film cameras and optical devices for imaging directly into the human eye. In other words, variable shadow screens would increase the effective dynamic ranges of such devices. Traditionally, imaging sensors are protected against excessive brightness by use of dark filters and/or reduction of iris diameters. These traditional means do not increase dynamic range; they reduce the ability to view or image dimmer features of an image because they reduce the brightness of all parts of an image by the same factor. On the other hand, a variable shadow screen would darken only the excessively bright parts of an image. For example, dim objects in a field of view that included the setting Sun or bright headlights could be seen more readily in a picture taken through a variable shadow screen than in a picture of the same scene taken through a dark filter or a narrowed iris. The figure depicts one of many potential variations of the basic concept of the variable shadow screen. The shadow screen would be a normally transparent liquid-crystal matrix placed in front of a focal-plane array of photodetectors in a charge-coupled-device video camera. The shadow screen would be placed far enough from the focal plane so as not to disrupt the focal-plane image to an unacceptable degree, yet close enough so that the out-of-focus shadows cast by the screen would still be effective in darkening the brightest parts of the image. The image detected by the photodetector array itself would be used as feedback to drive the variable shadow screen: The video output of the camera would be processed by suitable analog and/or digital electronic circuitry to generate a negative partial version of the image to be impressed on the shadow screen. The parts of the shadow screen in front of those parts of the image with brightness below a specified threshold would be left transparent; the parts of the shadow screen in front of those parts of the image where the brightness exceeded the threshold would be darkened by an amount that would increase with the excess above the threshold.
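A minimal sketch of the feedback idea described above, darkening only the parts of the screen in front of image regions whose brightness exceeds a threshold, is given below; the function name, threshold, and gain parameter are illustrative assumptions rather than part of the proposed design.

```python
import numpy as np

def shadow_mask(image, threshold=0.8, gain=1.0):
    """Compute a per-cell attenuation map for a normally transparent shadow screen.

    image     : 2-D array of detected brightness, normalised to [0, 1]
    threshold : brightness above which the screen starts to darken
    gain      : how strongly the darkening grows with the excess brightness
    Returns a transmission map in [0, 1]; 1 means fully transparent.
    """
    excess = np.clip(image - threshold, 0.0, None)          # only over-threshold pixels
    return np.clip(1.0 - gain * excess, 0.0, 1.0)

# Example: a saturating source (1.0) is attenuated, a dim region (0.3) is left untouched.
frame = np.array([[0.3, 0.95], [1.0, 0.5]])
print(shadow_mask(frame))
```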
NASA Astrophysics Data System (ADS)
Zhang, Hua; Zeng, Luan
2017-11-01
Binocular stereoscopic vision can be used for close-range observation of space targets from space-based platforms. To address the problem that a traditional binocular vision system no longer works correctly after a disturbance, an online self-referenced calibration method for a binocular stereo measuring camera is proposed. The method uses an auxiliary optical imaging device to insert the image of a standard reference object at the edge of the main optical path, so that it is imaged on the same focal plane as the target and acts as a built-in reference for the binocular imaging system. When the system geometry or the imaging device parameters are disturbed, the image of the standard reference shifts accordingly in the image plane even though the reference object itself has not moved, and the cameras' external parameters can be re-calibrated from the observed geometry of the reference. Experimental results show that the maximum mean square error for the same object can be reduced from 72.88 mm to 1.65 mm when the right camera is deflected by 0.4° and the left camera is rotated by 0.2° in elevation. The method thus enables online calibration of a binocular stereoscopic vision measurement system and effectively improves its robustness against disturbances.
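The abstract does not detail the re-calibration algorithm; below is a generic, hedged sketch of how camera extrinsics can be re-estimated from a known reference object with OpenCV's solvePnP, assuming the intrinsics are unchanged and that the reference's 3-D geometry and its detected image points (both hypothetical here) are available. It illustrates the principle only, not the authors' specific method.

```python
import numpy as np
import cv2

# Known 3-D coordinates of fiducial points on the standard reference object (metres),
# and their detected pixel positions in the disturbed image (hypothetical values).
object_pts = np.array([[0, 0, 0], [0.05, 0, 0], [0.05, 0.05, 0], [0, 0.05, 0]], dtype=np.float64)
image_pts = np.array([[312.4, 240.1], [402.7, 238.9], [404.1, 330.5], [313.0, 331.8]], dtype=np.float64)

# Intrinsics assumed known and unchanged from an off-line calibration.
K = np.array([[1500.0, 0.0, 320.0],
              [0.0, 1500.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Recover the camera's rotation and translation relative to the planar reference target.
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist, flags=cv2.SOLVEPNP_IPPE)
R, _ = cv2.Rodrigues(rvec)
print("rotation:\n", R, "\ntranslation:", tvec.ravel())
```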
Report on the eROSITA camera system
NASA Astrophysics Data System (ADS)
Meidinger, Norbert; Andritschke, Robert; Bornemann, Walter; Coutinho, Diogo; Emberger, Valentin; Hälker, Olaf; Kink, Walter; Mican, Benjamin; Müller, Siegfried; Pietschner, Daniel; Predehl, Peter; Reiffers, Jonas
2014-07-01
The eROSITA space telescope is currently being developed for the determination of cosmological parameters and the equation of state of dark energy via the evolution of clusters of galaxies. Furthermore, the instrument development was strongly motivated by the goal of a first imaging X-ray all-sky survey with measurements above 2 keV. eROSITA is a scientific payload on the Russian research satellite SRG; its destination after launch is the Lagrangian point L2. The observational program of the observatory divides into an all-sky survey and pointed observations and takes about 7.5 years in total. The instrument comprises an array of 7 identical, parallel-aligned telescopes. Each of the seven focal plane cameras is equipped with a PNCCD detector, an enhanced type of the XMM-Newton focal plane detector. This instrumentation permits spectroscopy and imaging of X-rays in the energy band from 0.3 keV to 10 keV with a field of view of 1.0 degree. The camera development is done at the Max Planck Institute for extraterrestrial physics. The key component of each camera is the PNCCD chip. This silicon sensor is a back-illuminated, fully depleted, column-parallel type of charge-coupled device. The image area of the 450-micron-thick frame-transfer CCD comprises an array of 384 x 384 pixels, each with a size of 75 micron x 75 micron. Readout of the signal charge generated by an incident X-ray photon in the CCD is accomplished by an ASIC, the so-called eROSITA CAMEX, which provides 128 parallel analog signal-processing channels and finally multiplexes the signals to one output feeding a fast 14-bit ADC. The read noise of this system is equivalent to a noise charge of about 2.5 electrons rms. We achieve an energy resolution close to the theoretical limit set by Fano noise (except at very low energies); for example, the FWHM at an energy of 5.9 keV is approximately 140 eV. The complete camera assembly comprises the camera head with the detector as the key component, the electronics for detector operation and data acquisition, and the filter wheel unit. In addition to the on-chip light-blocking filter deposited directly on the photon entrance window of the PNCCD, an external filter can be moved in front of the sensor, which also serves as contamination protection. Furthermore, an on-board calibration source emitting several fluorescence lines is accommodated on the filter wheel mechanism for in-orbit calibration. Since the spectroscopic silicon sensors need to be cooled to -95°C to best mitigate radiation damage effects, an elaborate cooling system is necessary; it consists of two different types of heat pipes linking the seven detectors to two radiators. Based on tests with an engineering model, a flight design was developed for the camera and a qualification model has been built. The tests and the performance of this camera are presented in the following. In conclusion, an outlook on the flight cameras is given.
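For context, a minimal calculation of the Fano-limited energy resolution quoted above is sketched below, assuming the usual silicon values (Fano factor ≈ 0.115, 3.65 eV per electron-hole pair) and the stated 2.5 e⁻ rms read noise; it yields roughly 120 eV at 5.9 keV, consistent with the statement that the measured ~140 eV is close to the theoretical limit.

```python
import math

def fwhm_ev(energy_ev, read_noise_e=2.5, fano=0.115, w_ev=3.65):
    """Approximate CCD energy resolution (FWHM, eV) from Fano noise plus readout noise."""
    sigma_e = math.sqrt(fano * energy_ev / w_ev + read_noise_e**2)  # charge noise in electrons
    return 2.355 * w_ev * sigma_e

print("FWHM at 5.9 keV: %.0f eV" % fwhm_ev(5900))   # ~120 eV, somewhat below the measured ~140 eV
```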
Visual Tour Based on Panaromic Images for Indoor Places in Campus
NASA Astrophysics Data System (ADS)
Bakirman, T.
2012-07-01
This paper aims to create a visual tour based on panoramic images for the Civil Engineering Faculty of Yildiz Technical University. To obtain the panoramic images, photos were taken with a tripod so that every photo shares the same point of view, and the panoramas were then created by stitching the photos. Two cameras with different focal lengths were used. From the panoramic images, a visual tour with navigation tools was created.
Method for stitching microbial images using a neural network
NASA Astrophysics Data System (ADS)
Semenishchev, E. A.; Voronin, V. V.; Marchuk, V. I.; Tolstova, I. V.
2017-05-01
Analog microscopes are currently widely used in fields such as medicine, animal husbandry, monitoring of technological objects, oceanography, and agriculture. Automated methods are preferred because they greatly reduce the manual work involved. Stepper motors move the microscope slide and allow the focus to be adjusted in semi-automatic or automatic mode, while images of microbiological objects are transferred from the microscope eyepiece to the computer screen. Scene analysis makes it possible to locate regions with pronounced abnormalities and focus the specialist's attention on them. This paper considers a method for stitching microbial images obtained with a semi-automatic microscope. The method preserves the boundaries of objects located in the field captured by the optical system. Object search is based on analysis of the data within the camera's field of view, and we propose using a neural network to detect object boundaries. The stitching boundary between images is derived from the analysis of the detected object borders. For autofocus, we use a criterion of minimum thickness of the object boundary lines, applied to the object located on the focal axis of the camera. Object borders that are shifted relative to the focal axis are recovered using a boundary-restoration method and a projective transform. Several examples on test images show the effectiveness of the proposed approach.
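The paper's own pipeline relies on neural-network boundary detection; as a point of comparison only, the hedged sketch below shows a conventional feature-based way to estimate the projective transform between two overlapping tiles with OpenCV (ORB features plus RANSAC homography). It is not the authors' method, and the function name and parameters are illustrative; inputs are assumed to be grayscale uint8 arrays.

```python
import cv2
import numpy as np

def stitch_pair(img_left, img_right):
    """Estimate a projective transform between two overlapping microscope tiles and warp
    the right tile into the left tile's frame (standard feature-based approach)."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img_left, None)
    k2, d2 = orb.detectAndCompute(img_right, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # robust projective transform
    h, w = img_left.shape[:2]
    return cv2.warpPerspective(img_right, H, (2 * w, h))    # right tile in left tile's frame
```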
Small form-factor VGA camera with variable focus by liquid lens
NASA Astrophysics Data System (ADS)
Oikarinen, Kari A.; Aikio, Mika
2010-05-01
We present the design of a 24 mm long variable focus lens for a 1/4" sensor. The chosen CMOS color sensor has VGA (640×480) resolution and 5.6 μm pixel size. The lens utilizes one Varioptic Arctic 320 liquid lens, which has a voltage-controllable focal length due to the electrowetting effect; there are no mechanical moving parts. The principle of operation of the liquid lens is explained briefly. We discuss designing optical systems with this type of lens, including a modeling approach that allows entering a voltage value to modify the configuration of the liquid lens. The presented design consists only of spherical glass surfaces. The choice to use spherical surfaces was made in order to decrease manufacturing costs and provide more predictable performance through a better-established method. Fabrication tolerances are compensated by the adjustability of the liquid lens, further increasing the feasibility of manufacturing. The lens has been manufactured and assembled into a demonstrator camera. It has an f-number of 2.5 and a 40-degree full field of view. The effective focal length varies around 6 millimeters as the liquid lens is adjusted. In simulations we have achieved a focus distance controllable between 20 millimeters and infinity. The design differs from previous approaches by having the aperture stop in the middle of the system instead of in front.
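A thin-lens back-of-the-envelope calculation, sketched below, illustrates why the effective focal length only needs to vary around 6 mm to move the focus between infinity and 20 mm; the fixed 6 mm image distance and the helper function are assumptions for illustration, not values taken from the actual design.

```python
def required_focal_length(object_distance_m, image_distance_m):
    """Thin-lens focal length needed for the given object and image distances."""
    return 1.0 / (1.0 / object_distance_m + 1.0 / image_distance_m)

# Assume the sensor sits ~6 mm behind the (thin) lens, i.e. the infinity-focus EFL.
image_dist = 6e-3
for obj_dist in (float('inf'), 0.5, 0.02):          # infinity, 50 cm, 20 mm
    f = required_focal_length(obj_dist, image_dist)
    print("object at %s m -> f = %.2f mm" % (obj_dist, f * 1e3))
```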
Loehfelm, Thomas W; Prater, Adam B; Debebe, Tequam; Sekhar, Aarti K
2017-02-01
We digitized the radiography teaching file at Black Lion Hospital (Addis Ababa, Ethiopia) during a recent trip, using a standard digital camera and a fluorescent light box. Our goal was to photograph every radiograph in the existing library while optimizing the final image size to the maximum resolution of a high quality tablet computer, preserving the contrast resolution of the radiographs, and minimizing total library file size. A secondary important goal was to minimize the cost and time required to take and process the images. Three workers were able to efficiently remove the radiographs from their storage folders, hang them on the light box, operate the camera, catalog the image, and repack the radiographs back to the storage folder. Zoom, focal length, and film speed were fixed, while aperture and shutter speed were manually adjusted for each image, allowing for efficiency and flexibility in image acquisition. Keeping zoom and focal length fixed, which kept the view box at the same relative position in all of the images acquired during a single photography session, allowed unused space to be batch-cropped, saving considerable time in post-processing, at the expense of final image resolution. We present an analysis of the trade-offs in workflow efficiency and final image quality, and demonstrate that a few people with minimal equipment can efficiently digitize a teaching file library.
2003-09-08
KENNEDY SPACE CENTER, FLA. - The Window Observational Research Facility (WORF), seen in the Space Station Processing Facility, was designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.
2003-09-08
KENNEDY SPACE CENTER, FLA. - Workers in the Space Station Processing Facility check out the Window Observational Research Facility (WORF), designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.
The optical design of a visible adaptive optics system for the Magellan Telescope
NASA Astrophysics Data System (ADS)
Kopon, Derek
The Magellan Adaptive Optics system will achieve first light in November of 2012. This AO system contains several subsystems, including the 585-actuator concave adaptive secondary mirror (ASM), the Calibration Return Optic (CRO) alignment and calibration system, the CLIO 1-5 μm IR science camera, the movable guider camera and active optics assembly, and the W-Unit, which contains both the Pyramid Wavefront Sensor (PWFS) and the VisAO visible science camera. In this dissertation, we present details of the design, fabrication, assembly, alignment, and laboratory performance of the VisAO camera and its optical components. Many of these components required a custom design, such as the Spectral Differential Imaging Wollaston prisms and filters and the coronagraphic spots. One component, the Atmospheric Dispersion Corrector (ADC), required a unique triplet design that had until now never been fabricated and tested on sky. We present the design, laboratory, and on-sky results for our triplet ADC. We also present details of the CRO test setup and alignment. Because Magellan is a Gregorian telescope, the ASM is a concave ellipsoidal mirror. By simulating a star with a white light point source at the far conjugate, we can create a double-pass test of the whole system without the need for a real on-sky star. This allows us to test the AO system closed loop in the Arcetri test tower at its nominal design focal length and optical conjugates. The CRO test will also allow us to calibrate and verify the system off-sky at the Magellan telescope during commissioning and periodically thereafter. We present a design for a possible future upgrade path for a new visible Integral Field Spectrograph. By integrating a fiber array bundle at the VisAO focal plane, we can send light to a pre-existing facility spectrograph, such as LDSS3, which will allow 20 mas spatial sampling and R~1,800 spectra over the band 0.6-1.05 μm. This would be the highest spatial resolution IFU to date, either from the ground or in space.
Automated optical testing of LWIR objective lenses using focal plane array sensors
NASA Astrophysics Data System (ADS)
Winters, Daniel; Erichsen, Patrik; Domagalski, Christian; Peter, Frank; Heinisch, Josef; Dumitrescu, Eugen
2012-10-01
The image quality of today's state-of-the-art IR objective lenses is constantly improving, while at the same time the market for thermography and vision is growing strongly. Because of increasing demands on the quality of IR optics and increasing production volumes, the standards for image quality testing are rising and tests need to be performed in shorter times. Most high-precision MTF testing equipment for the IR spectral bands in use today relies on the scanning slit method, which scans a 1D detector over a pattern in the image generated by the lens under test, followed by image analysis to extract performance parameters. The disadvantages of this approach are that it is relatively slow, it requires highly trained operators to align the sample, and the number of parameters that can be extracted is limited. In this paper we present lessons learned from the R&D process of using focal plane array (FPA) sensors for testing long-wave IR (LWIR, 8-12 μm) optics. Factors that need to be taken into account when switching from scanning slit to FPAs include the thermal background from the environment, the low scene contrast in the LWIR, the need for advanced image processing algorithms to pre-process camera images for analysis, and camera artifacts. Finally, we discuss two measurement systems for LWIR lens characterization that we recently developed for different target applications: 1) a fully automated system suitable for production testing and metrology that uses uncooled microbolometer cameras to automatically measure MTF (on-axis and at several off-axis positions) and parameters like EFL, FFL, autofocus curves, image plane tilt, etc., for LWIR objectives with an EFL between 1 and 12 mm; the measurement cycle time for one sample is typically between 6 and 8 s. 2) A high-precision research-grade system, again using an uncooled LWIR camera as detector, that is very simple to align and operate; a wide range of lens parameters (MTF, EFL, astigmatism, distortion, etc.) can be easily and accurately measured with this system.
Metrology camera system of prime focus spectrograph for Subaru telescope
NASA Astrophysics Data System (ADS)
Wang, Shiang-Yu; Chou, Richard C. Y.; Huang, Pin-Jie; Ling, Hung-Hsu; Karr, Jennifer; Chang, Yin-Chang; Hu, Yen-Sang; Hsu, Shu-Fu; Chen, Hsin-Yo; Gunn, James E.; Reiley, Dan J.; Tamura, Naoyuki; Takato, Naruhisa; Shimono, Atsushi
2016-08-01
The Prime Focus Spectrograph (PFS) is a new optical/near-infrared multi-fiber spectrograph designed for the prime focus of the 8.2 m Subaru telescope. PFS will cover a 1.3-degree-diameter field with 2394 fibers to complement the imaging capabilities of Hyper Suprime-Cam. To retain high throughput, the final positioning accuracy between the PFS fibers and the observing targets is required to be better than 10 microns. The metrology camera system (MCS) serves as the optical encoder of the fiber motors during fiber configuration, providing the fiber positions with an error below 5 microns over the 45 cm focal plane. The information from MCS is fed to the fiber positioner control system for closed-loop control. MCS will be located at the Cassegrain focus of the Subaru telescope in order to cover the whole focal plane with a single 50-megapixel Canon CMOS camera. It is a 380 mm Schmidt-type telescope that generates a uniform spot size with a 10 micron FWHM across the field, giving reasonable sampling of the point spread function. Carbon fiber tubes provide a stable structure over the operating conditions without focus adjustments. The CMOS sensor can be read out in 0.8 s to reduce the overhead of the fiber configuration, and the positions of all fibers can be obtained within 0.5 s after the frame readout; this keeps the overall fiber configuration time under 2 minutes. MCS will be installed inside a standard Subaru Cassegrain box, with all heat-generating components located inside a glycol-cooled cabinet to reduce image motion due to heat. The optics and camera for MCS have been delivered and tested, and the mechanical parts and supporting structure are ready as of spring 2016. The integration of MCS will start in the summer of 2016. In this report, the performance of the MCS components, the alignment and testing procedures, and the status of the PFS MCS are presented.
Back-illuminate fiber system research for multi-object fiber spectroscopic telescope
NASA Astrophysics Data System (ADS)
Zhou, Zengxiang; Liu, Zhigang; Hu, Hongzhuan; Wang, Jianping; Zhai, Chao; Chu, Jiaru
2016-07-01
Using a parallel-controlled fiber positioner as the spectroscopic receiver is an efficient observation scheme for spectral surveys; it has recently been used in LAMOST and has been proposed for CFHT and the rebuilt Mayall telescope. During telescope observations, the position of each fiber strongly influences how efficiently the spectra are fed into the fiber and on to the spectrograph. When the fibers are back-illuminated at the spectrograph end, they emit light at the positioner end, so CCD cameras can capture images of the fiber tips covering the focal plane, calculate precise position information using the light centroid method, and feed it back to the control system. After many years of research, back-illuminated fiber measurement has proven to be the best method for acquiring precise fiber positions. In LAMOST, a back-illuminated fiber system was developed and combined with the low-resolution spectrograph instruments; it provides uniform light output from the fibers, meets the requirements for CCD camera measurement, and is controlled by the high-level observation system, which can shut it down during telescope observations. This paper presents the design of the back-illumination system and tests of different light sources. After optimization, the illumination system is comparable to an integrating sphere and meets the requirements of fiber position measurement.
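A minimal sketch of the light centroid method mentioned above is given below: an intensity-weighted centroid of the back-illuminated fiber spot, demonstrated on a synthetic Gaussian spot. The function name and the synthetic data are illustrative assumptions, not the LAMOST software.

```python
import numpy as np

def spot_centroid(image, background=None):
    """Intensity-weighted centroid of a single back-illuminated fiber spot.

    image      : 2-D array containing one illuminated fiber tip
    background : optional constant or 2-D background level to subtract first
    Returns (x, y) in pixel coordinates.
    """
    data = image.astype(float)
    if background is not None:
        data = np.clip(data - background, 0.0, None)
    total = data.sum()
    y_idx, x_idx = np.indices(data.shape)
    return (x_idx * data).sum() / total, (y_idx * data).sum() / total

# Example: a synthetic Gaussian spot centred at (12.3, 7.8)
yy, xx = np.mgrid[0:20, 0:25]
spot = np.exp(-((xx - 12.3)**2 + (yy - 7.8)**2) / (2 * 2.0**2))
print(spot_centroid(spot))   # close to (12.3, 7.8)
```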
Nishino, Masami; Lee, Yasuharu; Nakamura, Daisuke; Yoshimura, Takahiro; Taniike, Masayuki; Makino, Nobuhiko; Kato, Hiroyasu; Egami, Yasuyuki; Shutta, Ryu; Tanouchi, Jun; Yamada, Yoshio
2012-10-01
In-stent restenosis (ISR), especially focal ISR, after percutaneous coronary intervention (PCI) remains one of the major clinical problems in the drug-eluting stent (DES) era. Several reports have revealed that excimer laser coronary angioplasty (ELCA) is useful for ISR; however, detailed findings after ELCA are unknown. Therefore, we investigated the condition of the neointima after ELCA for ISR with optical coherence tomography (OCT) and compared the OCT findings and clinical outcome between ELCA and cutting-balloon angioplasty (CBA). Twenty-one consecutive patients with focal ISR who underwent ELCA or CBA were enrolled. All patients underwent 12- to 15-month follow-up coronary angiography. OCT was performed immediately after successful PCI to evaluate the neointimal condition in the ISR lesion. We compared the following OCT parameters between ELCA and CBA groups: maximal thickness of remaining in-stent neointima (MTN), number of tears, minimum lumen dimension (MLD), and minimum lumen area (MLA). We also evaluated clinical outcomes, including target vessel revascularization, acute myocardial infarction, death, and stent thrombosis. MLA in the ELCA group (n = 10) was significantly larger than in the CBA group, and number of tears in the ELCA group was significantly lower than in the CBA group. A trend was shown toward lower TLR with ELCA versus CBA (10.0% vs 45.5%). OCT immediately after ELCA for ISR lesions revealed larger lumen area and smaller number of tears compared with CBA, which may support favorable effects of ELCA for focal ISR.
C-RED One and C-RED2: SWIR high-performance cameras using Saphira e-APD and Snake InGaAs detectors
NASA Astrophysics Data System (ADS)
Gach, Jean-Luc; Feautrier, Philippe; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Carmignani, Thomas; Wanwanscappel, Yann; Boutolleau, David
2018-02-01
After the development of the OCAM2 EMCCD fast visible camera dedicated to advanced adaptive optics wavefront sensing, First Light Imaging moved to the SWIR fast cameras with the development of the C-RED One and the C-RED 2 cameras. First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with a subelectron readout noise and very low background. C-RED One is based on the last version of the SAPHIRA detector developed by Leonardo UK. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array which is a real disruptive technology in imagery. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of read out techniques and processes video on-board thanks to an FPGA. We will show its performances and expose its main features. In addition to this project, First Light Imaging developed an InGaAs 640x512 fast camera with unprecedented performances in terms of noise, dark and readout speed based on the SNAKE SWIR detector from Sofradir. The camera was called C-RED 2. The C-RED 2 characteristics and performances will be described. The C-RED One project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944. The C-RED 2 development is supported by the "Investments for the future" program and the Provence Alpes Côte d'Azur Region, in the frame of the CPER.
Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA
NASA Astrophysics Data System (ADS)
Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki
2017-11-01
SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid-and far-infrared astronomy with a cryogenically cooled 3-m class telescope, envisioned for launch in early 2020s. Mid-infrared Camera and Spectrometer (MCS) is a focal plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38μm. MCS consists of two relay optical modules and following four scientific optical modules of WFC (Wide Field Camera; 5'x 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2'.5 long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000) and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here, we present optical design and expected optical performance of MCS. Most parts of MCS optics adopt off-axis reflective system for covering the wide wavelength range of 5-38μm without chromatic aberration and minimizing problems due to changes in shapes and refractive indices of materials from room temperature to cryogenic temperature. In order to achieve the high specification requirements of wide field of view, small F-number and large spectral resolving power with compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]) which is a design method using free-form surfaces for compact reflective optics such as head mount displays. As a result, we have successfully designed compact reflective optics for MCS with as-built performance of diffraction-limited image resolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banerji, M.; Jouvel, S.; Lin, H.
2014-11-25
We present the combination of optical data from the Science Verification phase of the Dark Energy Survey (DES) with near-infrared (NIR) data from the European Southern Observatory VISTA Hemisphere Survey (VHS). The deep optical detections from DES are used to extract fluxes and associated errors from the shallower VHS data. Joint seven-band (grizYJK) photometric catalogues are produced in a single 3 sq-deg dedicated camera field centred at 02h26m-04d36m where the availability of ancillary multiwavelength photometry and spectroscopy allows us to test the data quality. Dual photometry increases the number of DES galaxies with measured VHS fluxes by a factor of ~4.5 relative to a simple catalogue level matching and results in a ~1.5 mag increase in the 80 per cent completeness limit of the NIR data. Almost 70 per cent of DES sources have useful NIR flux measurements in this initial catalogue. Photometric redshifts are estimated for a subset of galaxies with spectroscopic redshifts and initial results, although currently limited by small number statistics, indicate that the VHS data can help reduce the photometric redshift scatter at both z < 0.5 and z > 1. We present example DES+VHS colour selection criteria for high-redshift luminous red galaxies (LRGs) at z ~ 0.7 as well as luminous quasars. Using spectroscopic observations in this field we show that the additional VHS fluxes enable a cleaner selection of both populations with <10 per cent contamination from galactic stars in the case of spectroscopically confirmed quasars and <0.5 per cent contamination from galactic stars in the case of spectroscopically confirmed LRGs. The combined DES+VHS data set, which will eventually cover almost 5000 sq-deg, will therefore enable a range of new science and be ideally suited for target selection for future wide-field spectroscopic surveys.
Broadband Achromatic Telecentric Lens
NASA Technical Reports Server (NTRS)
Mouroulis, Pantazis
2007-01-01
A new type of lens design features broadband achromatic performance as well as telecentricity, using a minimum number of spherical elements. With appropriate modifications, the lens design form can be tailored to cover the range of response of the focal-plane array, from Si (400-1,000 nm) to InGaAs (400-1,700 or 2,100 nm) or InSb/HgCdTe reaching to 2,500 nm. For reference, lenses typically are achromatized over the visible wavelength range of 480-650 nm. In remote sensing applications, there is a need for broadband achromatic telescopes, normally satisfied with mirror-based systems. However, mirror systems are not always feasible due to size or geometry restrictions, and they require expensive aspheric surfaces. Non-obscured mirror systems can be difficult to align and have a limited (essentially one-dimensional) field of view; centrally obscured types have a two-dimensional but very limited field in addition to the obscuration. Telecentricity is a highly desirable property for matching typical spectrometer types, as well as for reducing the variation of the angle of incidence and cross-talk on the detector for simple camera types. This rotationally symmetric telescope with no obscuration, using spherical surfaces and selected glass types, fills a need in the range of short focal lengths. It can be used as a compact front unit for a matched spectrometer, as an ultra-broadband camera objective lens, or as the optics of an integrated camera/spectrometer in which the wavelength information is obtained by the use of strip or linear variable filters on the focal plane array. This kind of camera and spectrometer system can find applications in remote sensing, as well as in-situ applications for geological mapping and characterization of minerals, ecological studies, and target detection and identification through spectral signatures. Commercially, the lens can be used in quality-control applications via spectral analysis. The lens design is based on the rear landscape lens with the aperture stop in front of all elements. This allows sufficient room for telecentricity in addition to making the stop easily accessible. The crucial design features are the use of a doublet with an ultra-low dispersion glass (fluorite or S-FPL53), and the use of a strong negative element, which enables a flat field and telecentricity in conjunction with the last (field lens) element. The field lens can also be designed to be in contact with the array, a feature that is desirable in some applications. The lens has a 20° field of view for a 50-mm focal length and is corrected over the wavelength range of 450-2,300 nm. Transverse color, which is the most pernicious aberration for spectroscopic work, is controlled at the level of 1 μm or below at 0.7 field and 5 μm at full field. The maximum chief ray angle is less than 1.7°, providing good telecentricity. An additional feature of this lens is that it is made exclusively with glasses that provide good transmission up to 2,300 nm and even some transmission to 2,500 nm; thus, the lens can be used in applications that cover the entire solar-reflected spectrum. Alternative realizations are possible that provide enhanced resolution and even less transverse color over a narrower wavelength range.
MMW/THz imaging using upconversion to visible, based on glow discharge detector array and CCD camera
NASA Astrophysics Data System (ADS)
Aharon, Avihai; Rozban, Daniel; Abramovich, Amir; Yitzhaky, Yitzhak; Kopeika, Natan S.
2017-10-01
An inexpensive upconverting MMW/THz imaging method is suggested here. The method is based on a glow discharge detector (GDD) array and a silicon photodiode or a simple CCD/CMOS camera. The GDD was previously found to be an excellent room-temperature MMW radiation detector when read out via its electrical current. It is very inexpensive and is advantageous due to its wide dynamic range, broad spectral range, room-temperature operation, immunity to high-power radiation, and more. An upconversion method is demonstrated here, based on measuring the visual light emitted by the GDD rather than its electrical current. The experimental setup simulates a system composed of a GDD array, an MMW source, and a basic CCD/CMOS camera. The visual light emitted by the GDD array is directed to the CCD/CMOS camera and the change in the GDD light is measured using image processing algorithms. The combination of a CMOS camera and GDD focal plane arrays can yield a faster, more sensitive, and very inexpensive MMW/THz camera, eliminating the complexity of the electronic circuits and the internal electronic noise of the GDD. Furthermore, three-dimensional imaging systems based on scanning cannot operate in real time; this is easily and economically solved with a GDD array, which enables us to acquire distance and magnitude information from all the GDD pixels in the array simultaneously. The 3D image can be obtained using methods such as frequency-modulated continuous wave (FMCW) direct chirp modulation or time-of-flight (TOF) measurement.
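As a worked illustration of the FMCW ranging mentioned above, the sketch below converts a measured beat frequency into a target range via R = c·T·f_beat/(2B); the chirp parameters and beat frequency are hypothetical values, not parameters of the described system.

```python
def fmcw_range(beat_frequency_hz, chirp_bandwidth_hz, chirp_duration_s):
    """Target range from the FMCW beat frequency: R = c * T * f_beat / (2 * B)."""
    c = 3.0e8  # speed of light, m/s
    return c * chirp_duration_s * beat_frequency_hz / (2.0 * chirp_bandwidth_hz)

# Hypothetical numbers: a 10 GHz sweep in 1 ms and a 1 MHz measured beat frequency.
print("range = %.1f m" % fmcw_range(1.0e6, 10.0e9, 1.0e-3))   # -> 15.0 m
```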
Concepts, laboratory, and telescope test results of the plenoptic camera as a wavefront sensor
NASA Astrophysics Data System (ADS)
Rodríguez-Ramos, L. F.; Montilla, I.; Fernández-Valdivia, J. J.; Trujillo-Sevilla, J. L.; Rodríguez-Ramos, J. M.
2012-07-01
The plenoptic camera has been proposed as an alternative wavefront sensor suitable for extended objects within the context of the design of the European Solar Telescope (EST), but it can also be used with point sources. Originating in the field of electronic photography, the plenoptic camera directly samples the light field function, which is the four-dimensional representation of all the light entering a camera. Image formation can then be seen as the result of the photography operator applied to this function, and many other features of the light field can be exploited to extract information about the scene, such as depth computation for 3D imaging or, as specifically addressed in this paper, wavefront sensing. The underlying concept of the plenoptic camera can be adapted to a telescope by placing a lenslet array of the same f-number at the focal plane, thus obtaining at the detector a set of pupil images corresponding to every sampled point of view. This approach generalizes the Shack-Hartmann, curvature, and pyramid wavefront sensors, in the sense that all of these could be considered particular cases of the plenoptic wavefront sensor, because the information needed as the starting point for those sensors can be derived from the plenoptic image. Laboratory results obtained with extended objects, phase plates, and commercial interferometers, and even telescope observations using stars and the Moon as an extended object, are presented in the paper, clearly showing the capability of the plenoptic camera to behave as a wavefront sensor.
Fusion of light-field and photogrammetric surface form data
NASA Astrophysics Data System (ADS)
Sims-Waterhouse, Danny; Piano, Samanta; Leach, Richard K.
2017-08-01
Photogrammetry based systems are able to produce 3D reconstructions of an object given a set of images taken from different orientations. In this paper, we implement a light-field camera within a photogrammetry system in order to capture additional depth information, as well as the photogrammetric point cloud. Compared to a traditional camera that only captures the intensity of the incident light, a light-field camera also provides angular information for each pixel. In principle, this additional information allows 2D images to be reconstructed at a given focal plane, and hence a depth map can be computed. Through the fusion of light-field and photogrammetric data, we show that it is possible to improve the measurement uncertainty of a millimetre scale 3D object, compared to that from the individual systems. By imaging a series of test artefacts from various positions, individual point clouds were produced from depth-map information and triangulation of corresponding features between images. Using both measurements, data fusion methods were implemented in order to provide a single point cloud with reduced measurement uncertainty.
Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain
NASA Astrophysics Data System (ADS)
Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.
2018-04-01
High-resolution satellites with long focal lengths and large apertures have been widely used in recent years for georeferencing observed scenes. A consistent end-to-end model of the high-resolution remote sensing satellite geometric chain is presented, consisting of the scene, the three-line-array camera, the platform (including attitude and position information), the time system, and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for the geolocation accuracy is put forward by introducing a new index: the angle between the camera and the star tracker. The model is validated by rigorously simulating the geolocation accuracy following the test method used for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results, and that the geolocation accuracy can be improved by about 7 m through the integrated design. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.
Mitsubishi thermal imager using the 512 x 512 PtSi focal plane arrays
NASA Astrophysics Data System (ADS)
Fujino, Shotaro; Miyoshi, Tetsuo; Yokoh, Masataka; Kitahara, Teruyoshi
1990-01-01
MITSUBISHI THERMAL IMAGER model IR-5120A is a high-resolution, high-sensitivity infrared television imaging system. It was exhibited at SPIE's 1988 Technical Symposium on Optics, Electro-Optics, and Sensors, held in April 1988 in Orlando, and attracted the interest of many attendees for its high performance. The detector is a Platinum Silicide Charge Sweep Device (CSD) array containing more than 260,000 individual pixels, manufactured by Mitsubishi Electric Co. The IR-5120A consists of a Camera Head, containing the CSD, a Stirling cycle cooler and support electronics, and a Camera Control Unit containing the pixel fixed-pattern noise corrector, video controller, cooler driver and support power supplies. The Stirling cycle cooler built into the Camera Head keeps the CSD at a temperature of approximately 80 K, with features such as light weight, a long life of more than 2000 hours and low acoustical noise. This paper describes an improved Thermal Imager with lighter weight, more compact size and higher performance, and presents its design philosophy, characteristics and field imagery.
The optical design of the G-CLEF Spectrograph: the first light instrument for the GMT
NASA Astrophysics Data System (ADS)
Ben-Ami, Sagi; Epps, Harland; Evans, Ian; Mueller, Mark; Podgorski, William; Szentgyorgyi, Andrew
2016-08-01
The GMT-Consortium Large Earth Finder (G-CLEF), the first light instrument for the GMT, is a fiber-fed, high-resolution echelle spectrograph. In this paper, we present the optical design of G-CLEF. We emphasize the unique solutions derived for the spectrograph fiber feed: the Mangin mirror that corrects the cylindrical field curvature, the implementation of VPH grisms as cross-dispersers, and our novel solution for a multi-colored exposure meter. We describe the spectrograph's blue and red cameras, comprising 7 and 8 elements respectively, with one aspheric surface in each camera, and present the expected echellogram imaged on the instrument focal planes. Finally, we present a ghost analysis and mitigation strategy that takes into account both single-reflection and double-reflection back-scattering from various elements in the optical train.
C-RED One : the infrared camera using the Saphira e-APD detector
NASA Astrophysics Data System (ADS)
Greffe, Timothée.; Feautrier, Philippe; Gach, Jean-Luc; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Boutolleau, David; Baker, Ian
2016-08-01
First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with a sub-electron readout noise and very low background. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array, which is a real disruptive technology in imagery. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on-board thanks to an FPGA. We will show its performance and expose its main features. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.
First light observations with TIFR Near Infrared Imaging Camera (TIRCAM-II)
NASA Astrophysics Data System (ADS)
Ojha, D. K.; Ghosh, S. K.; D'Costa, S. L. A.; Naik, M. B.; Sandimani, P. R.; Poojary, S. S.; Bhagat, S. B.; Jadhav, R. B.; Meshram, G. S.; Bakalkar, C. B.; Ramaprakash, A. N.; Mohan, V.; Joshi, J.
TIFR near infrared imaging camera (TIRCAM-II) is based on the Aladdin III Quadrant InSb focal plane array (512×512 pixels; 27.6 μm pixel size; sensitive between 1 and 5.5 μm). TIRCAM-II had its first engineering run with the 2 m IUCAA telescope at Girawali during February - March 2011. The first light observations with TIRCAM-II were quite successful. Several infrared standard stars, the Trapezium Cluster in the Orion region, McNeil's nebula, etc., were observed in the J and K bands and in a narrow band at 3.6 μm (nbL). In the nbL band, some bright stars could be detected from the Girawali site. The performance of TIRCAM-II is discussed in the light of preliminary observations in the near infrared bands.
NASA Technical Reports Server (NTRS)
Cameron, R.; Aldcroft, T.; Podgorski, W. A.; Freeman, M. D.
2000-01-01
The aspect determination system of the Chandra X-ray Observatory plays a key role in realizing the full potential of Chandra's X-ray optics and detectors. We review the performance of the spacecraft hardware components and sub-systems, which provide information for both real time control of the attitude and attitude stability of the Chandra Observatory and also for more accurate post-facto attitude reconstruction. These flight components are comprised of the aspect camera (star tracker) and inertial reference units (gyros), plus the fiducial lights and fiducial transfer optics which provide an alignment null reference system for the science instruments and X-ray optics, together with associated thermal and structural components. Key performance measures will be presented for aspect camera focal plane data, gyro performance both during stable pointing and during maneuvers, alignment stability and mechanism repeatability.
The In-flight Spectroscopic Performance of the Swift XRT CCD Camera During 2006-2007
NASA Technical Reports Server (NTRS)
Godet, O.; Beardmore, A.P.; Abbey, A.F.; Osborne, J.P.; Page, K.L.; Evans, P.; Starling, R.; Wells, A.A.; Angelini, L.; Burrows, D.N.;
2007-01-01
The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 135 eV FWHM at 5.9 keV as measured before launch. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Comparison of observed spectra with models folded through the instrument response produces negative residuals around and below the Oxygen edge. We discuss several possible causes for such residuals. Traps created by proton damage on the CCD increase the charge transfer inefficiency (CTI) over time. We describe the evolution of the CTI since the launch and its effect on the CCD spectral resolution and the gain.
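As a generic illustration of how CTI scales with the number of transfers and can be corrected to first order (not the specific Swift XRT calibration procedure), the sketch below applies the relation Q_measured ≈ Q_true·(1 − CTI)^N; the event size, transfer count, and CTI value are hypothetical.

```python
def correct_cti(measured_charge_e, n_transfers, cti):
    """First-order correction of charge lost to traps over N transfers:
    Q_true ~ Q_measured / (1 - CTI)**N."""
    return measured_charge_e / (1.0 - cti) ** n_transfers

# Hypothetical illustration: a ~1600 e- event (roughly 5.9 keV in Si) read out after 300
# transfers with a CTI of 2e-5 loses about 0.6% of its charge.
q_meas = 1600.0 * (1.0 - 2e-5) ** 300
print("measured: %.1f e-, corrected: %.1f e-" % (q_meas, correct_cti(q_meas, 300, 2e-5)))
```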
The Multi-site All-Sky CAmeRA (MASCARA). Finding transiting exoplanets around bright (mV < 8) stars
NASA Astrophysics Data System (ADS)
Talens, G. J. J.; Spronck, J. F. P.; Lesage, A.-L.; Otten, G. P. P. L.; Stuik, R.; Pollacco, D.; Snellen, I. A. G.
2017-05-01
This paper describes the design, operations, and performance of the Multi-site All-Sky CAmeRA (MASCARA). Its primary goal is to find new exoplanets transiting bright stars, 4 < mV < 8, by monitoring the full sky. MASCARA consists of one northern station on La Palma, Canary Islands (fully operational since February 2015), one southern station at La Silla Observatory, Chile (operational from early 2017), and a data centre at Leiden Observatory in the Netherlands. Both MASCARA stations are equipped with five interline CCD cameras using wide field lenses (24 mm focal length) with fixed pointings, which together provide coverage down to airmass 3 of the local sky. The interline CCD cameras allow for back-to-back exposures, taken at fixed sidereal times with exposure times of 6.4 sidereal seconds. The exposures are short enough that the motion of stars across the CCD does not exceed one pixel during an integration. Astrometry and photometry are performed on-site, after which the resulting light curves are transferred to Leiden for further analysis. The final MASCARA archive will contain light curves for 70 000 stars down to mV = 8.4, with a precision of 1.5% per 5 minutes at mV = 8.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Posada, C. M.; Ade, P. A. R.; Ahmed, Z.
2015-08-11
This work presents the procedures used by Argonne National Laboratory to fabricate large arrays of multichroic transition-edge sensor (TES) bolometers for cosmic microwave background (CMB) measurements. These detectors will be assembled into the focal plane for the SPT-3G camera, the third generation CMB camera to be installed in the South Pole Telescope. The complete SPT-3G camera will have approximately 2690 pixels, for a total of 16,140 TES bolometric detectors. Each pixel is comprised of a broad-band sinuous antenna coupled to a Nb microstrip line. In-line filters are used to define the different band-passes before the millimeter-wavelength signal is fed to the respective Ti/Au TES bolometers. There are six TES bolometer detectors per pixel, which allow for measurements of three band-passes (95 GHz, 150 GHz and 220 GHz) and two polarizations. The steps involved in the monolithic fabrication of these detector arrays are presented here in detail. Patterns are defined using a combination of stepper and contact lithography. The misalignment between layers is kept below 200 nm. The overall fabrication involves a total of 16 processes, including reactive and magnetron sputtering, reactive ion etching, inductively coupled plasma etching and chemical etching.
Zhong, Hua; Redo-Sanchez, Albert; Zhang, X-C
2006-10-02
We present terahertz (THz) reflective spectroscopic focal-plane imaging of four explosive and bio-chemical materials (2,4-DNT, Theophylline, RDX and Glutamic Acid) at a standoff imaging distance of 0.4 m. The two-dimensional (2-D) nature of this technique enables a fast acquisition time and camera-like operation, in contrast to the most commonly used point emission-detection and raster-scanning configuration. The samples are identified by their absorption peaks, extracted from the negative derivative of the reflection coefficient with respect to frequency (-dr/dν) at each pixel. Classification of the samples is achieved using minimum-distance classifier and neural network methods, with an accuracy above 80% and a false-alarm rate below 8%. This result supports the future application of THz time-domain spectroscopy (TDS) in standoff-distance sensing, imaging, and identification.
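A minimal sketch of the minimum-distance classification step is shown below, operating on hypothetical -dr/dν feature vectors; the class means, feature sampling, and function name are illustrative assumptions, not the paper's actual data.

```python
import numpy as np

def minimum_distance_classify(spectrum, class_means):
    """Assign a pixel's feature vector to the class whose mean is nearest (Euclidean)."""
    names = list(class_means)
    dists = [np.linalg.norm(spectrum - class_means[n]) for n in names]
    return names[int(np.argmin(dists))]

# Hypothetical -dr/dv feature vectors sampled at a few frequencies (arbitrary units).
class_means = {
    "RDX":          np.array([0.10, 0.80, 0.20]),
    "2,4-DNT":      np.array([0.70, 0.15, 0.10]),
    "Theophylline": np.array([0.20, 0.25, 0.75]),
}
pixel = np.array([0.12, 0.70, 0.25])
print(minimum_distance_classify(pixel, class_means))   # -> "RDX"
```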
Noise-cancellation-based nonuniformity correction algorithm for infrared focal-plane arrays.
Godoy, Sebastián E; Pezoa, Jorge E; Torres, Sergio N
2008-10-10
The spatial fixed-pattern noise (FPN) inherently generated in infrared (IR) imaging systems severely compromises the quality of the acquired imagery, even making such images inappropriate for some applications. The FPN refers to the inability of the photodetectors in the focal-plane array to render a uniform output image when a uniform-intensity scene is being imaged. We present a noise-cancellation-based algorithm that compensates for the additive component of the FPN. The proposed method relies on the assumption that a source of noise correlated to the additive FPN is available to the IR camera. An important feature of the algorithm is that all the calculations are reduced to a simple equation, which allows for the bias compensation of the raw imagery. The algorithm performance is tested using real IR image sequences and is compared to some classical methodologies.
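Since the paper reduces additive FPN compensation to a simple per-pixel bias correction, the hedged sketch below illustrates that idea on synthetic data. The way the offset is estimated here (time-averaging an assumed uniform reference) stands in for the paper's correlated-noise source and is not the authors' exact algorithm; all names and numbers are illustrative.

```python
import numpy as np

def remove_additive_fpn(frames, offset_estimate):
    """Subtract a per-pixel additive offset (fixed-pattern bias) from a stack of IR frames."""
    return frames - offset_estimate[None, :, :]

# Synthetic illustration: a flat scene plus a static per-pixel offset plus temporal noise.
rng = np.random.default_rng(0)
offset = rng.normal(0.0, 5.0, size=(4, 4))                  # additive FPN pattern
scene = 100.0 + rng.normal(0.0, 1.0, size=(50, 4, 4))       # 50 frames of a uniform scene
raw = scene + offset
# Offset estimate from an (assumed available) reference, e.g. time-averaged uniform frames.
offset_est = raw.mean(axis=0) - raw.mean()
corrected = remove_additive_fpn(raw, offset_est)
print("spatial std before: %.2f  after: %.2f"
      % (raw.mean(axis=0).std(), corrected.mean(axis=0).std()))
```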
Smart lens: tunable liquid lens for laser tracking
NASA Astrophysics Data System (ADS)
Lin, Fan-Yi; Chu, Li-Yu; Juan, Yu-Shan; Pan, Sih-Ting; Fan, Shih-Kang
2007-05-01
A tracking system utilizing a tunable liquid lens is proposed and demonstrated. Adopting the concept of EWOD (electrowetting-on-dielectric), the curvature of a droplet on a dielectric film can be controlled by varying the applied voltage. When the droplet is used as an optical lens, the focal length of this adaptive liquid lens can be adjusted as desired, and the light passing through it can therefore be focused to different positions in space. In this paper, the tuning ranges of the curvature and focal length of the tunable liquid lens are investigated. Droplet transformation is observed and analyzed under a CCD camera. A tracking system combining the tunable liquid lens with a laser detection system is also proposed. With a feedback circuit that maximizes the returned signal by controlling the tunable lens, the laser beam can remain locked on a distant reflective target while it moves.
Manuel, Anastacia M; Phillion, Donald W; Olivier, Scot S; Baker, Kevin L; Cannon, Brice
2010-01-18
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary, along with three refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. In order to maintain image quality during operation, the deformations and rigid body motions of the three large mirrors must be actively controlled to minimize optical aberrations, which arise primarily from forces due to gravity and thermal expansion. We describe the methodology for measuring the telescope aberrations using a set of curvature wavefront sensors located in the four corners of the LSST camera focal plane. We present a comprehensive analysis of the wavefront sensing system, including the availability of reference stars, demonstrating that this system will perform to the specifications required to meet the LSST performance goals.
NASA Astrophysics Data System (ADS)
Mazzinghi, Piero; Bratina, Vojko; Gambicorti, Lisa; Simonetti, Francesca; Zuccaro Marchi, Alessandro
2017-11-01
New technologies are proposed for large-aperture, wide field-of-view (FOV) space telescopes dedicated to the detection of Ultra High Energy Cosmic Rays and Neutrinos, through observation of fluorescence traces in the atmosphere and diffused Cherenkov signals. The presented detection system is a spaceborne LEO telescope with better performance than ground-based observatories, detecting up to 10^3 - 10^4 events/year. Different design approaches are implemented, all with very large FOV and focal-surface detectors with sufficient segmentation and time resolution to allow precise reconstruction of the arrival direction. In particular, two Schmidt cameras are suggested as an appropriate solution to match most of the optical and technical requirements: large FOV, low f/#, reduction of stray light, an optionally flat focal surface, and already proven low-cost construction technologies. Finally, a preliminary proposal for a wide-FOV retrofocus catadioptric telescope is described.
Variable-focus liquid lens for portable applications
NASA Astrophysics Data System (ADS)
Kuiper, Stein; Hendriks, Benno H.; Huijbregts, Laura J.; Hirschberg, A. Mico; Renders, Christel A.; van As, Marco A.
2004-10-01
The meniscus between two immiscible liquids can be used as an optical lens. A change in curvature of this meniscus by electrowetting leads to a change in focal distance. We demonstrate that two liquids in a tube form a self-centered tunable lens of high optical quality. Several properties were studied, such as optical performance, electrical characteristics and dynamic behavior. We designed and constructed a miniature camera module based on this tunable lens and show that it is very well suited for use in portable applications.
Earth Observations taken by Expedition 41 crewmember
2014-09-17
ISS041-E-016740 (17 Sept. 2014) --- One of the Expedition 41 crew members aboard the Earth-orbiting International Space Station exposed this Sept. 17 nocturnal scene featuring most of the largest cities on the central eastern seaboard. Even at 221 nautical miles above Earth, the 28mm focal length on the still camera was able to pick up detail in the image, for example, Central Park on Manhattan at right frame. The nation's capital is very near frame center.
VizieR Online Data Catalog: UWISH2 extended H2 emission line sources (Froebrich+, 2015)
NASA Astrophysics Data System (ADS)
Froebrich, D.; Makin, S. V.; Davis, C. J.; Gledhill, T. M.; Kim, Y.; Koo, B.-C.; Rowles, J.; Eisloffel, J.; Nicholas, J.; Lee, J. J.; Williamson, J.; Buckner, A. S. M.
2016-07-01
All data were acquired using the Wide Field Camera (WFCAM) on the United Kingdom Infrared Telescope (UKIRT), Mauna Kea, Hawaii. WFCAM houses four Rockwell Hawaii-II (HgCdTe 2048x2048-pixel) arrays spaced by 94 per cent of a detector width in the focal plane. The pixel scale is 0.4 arcsec, although microstepping is used to generate reduced mosaics with a 0.2-arcsec pixel scale and thereby fully sample the expected seeing. (3 data files).
NASA Astrophysics Data System (ADS)
Soulié, G.
2007-09-01
This paper contains 403 measures of double stars. These measures have been made with a 12-inch, F/10 Meade LX 200 telescope and a 2X Barlow lens, giving an effective focal length of about 5.5 meters. The calibration is calculated from measures of standard pairs. Frames have been obtained with an MX516 CCD camera.
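As a rough cross-check under nominal assumptions (not stated in the note itself): a 12-inch (about 305 mm) aperture at F/10 has a native focal length of roughly 305 mm x 10 = 3050 mm, and a 2X Barlow would nominally double this to about 6.1 m. The quoted effective focal length of about 5.5 m therefore corresponds to an effective amplification of about 5500/3050 = 1.8X, which is plausible since the working magnification of a Barlow lens depends on its separation from the detector.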
Infrared-thermographic screening of the activity and enantioselectivity of enzymes.
Reetz, M T; Hermes, M; Becker, M H
2001-05-01
The infrared radiation caused by the heat of reaction of an enantioselective enzyme-catalyzed transformation can be detected by modern photovoltaic infrared (IR)-thermographic cameras equipped with focal-plane array detectors. Specifically, in the lipase-catalyzed enantioselective acylation of racemic 1-phenylethanol, the (R)- and (S)-substrates are allowed to react separately in the wells of microtiter plates, the (R)-alcohol showing hot spots in the IR-thermographic images. Thus, highly enantioselective enzymes can be identified at kinetic resolution.
1989-01-24
coherent noise. To overcome these disadvantages, a new holographic inverse filtering system has been developed by the authors. The inverse filter is...beam is blocked. The deblurred aerial image is formed in the image plane IP (the back focal plane of L2). The frequency of the grating used in this... impulse response of the optical system. For certain types of blurs, which include linear motion of the camera under the assumption that the picture
Focal plane instrument for the Solar UV-Vis-IR Telescope aboard SOLAR-C
NASA Astrophysics Data System (ADS)
Katsukawa, Yukio; Suematsu, Yoshinori; Shimizu, Toshifumi; Ichimoto, Kiyoshi; Takeyama, Norihide
2011-10-01
We present the conceptual design of a focal-plane instrument for the Solar UV-Vis-IR Telescope (SUVIT) aboard the next Japanese solar mission, SOLAR-C. A primary purpose of the telescope is to achieve precise, high-resolution spectroscopic and polarimetric measurements of the solar chromosphere with a large aperture of 1.5 m, which is expected to bring significant progress in understanding basic MHD processes in the solar atmosphere. The focal-plane instrument consists of two packages: a filtergraph package, which acquires not only monochromatic images but also Dopplergrams and magnetograms using a tunable narrow-band filter and interference filters, and a spectrograph package, which performs accurate spectro-polarimetric observations for measuring chromospheric magnetic fields and employs a Littrow-type spectrograph. The most challenging aspect of the instrument design is the wide wavelength coverage, from 280 nm to 1.1 μm, needed to observe multiple chromospheric lines, which is to be realized with a lens unit including fluoride glasses. A high-speed camera for correlation tracking of granular motion is also implemented in one of the packages for an image stabilization system, which is essential to achieve high spatial resolution and high polarimetric accuracy.
Verification of the Sentinel-4 focal plane subsystem
NASA Astrophysics Data System (ADS)
Williges, Christian; Uhlig, Mathias; Hilbert, Stefan; Rossmann, Hannes; Buchwinkler, Kevin; Babben, Steffen; Sebastian, Ilse; Hohn, Rüdiger; Reulke, Ralf
2017-09-01
The Sentinel-4 payload is a multi-spectral camera system designed to monitor atmospheric conditions over Europe from a geostationary orbit. The German Aerospace Center, DLR Berlin, conducted the verification campaign of the Focal Plane Subsystem (FPS) during the second half of 2016. The FPS consists of two Focal Plane Assemblies (FPAs), two Front End Electronics (FEEs), one Front End Support Electronic (FSE) and one Instrument Control Unit (ICU). The FPAs are designed for two spectral ranges: UV-VIS (305 nm - 500 nm) and NIR (750 nm - 775 nm). In this publication, we present in detail the set-up of the verification campaign of the Sentinel-4 Qualification Model (QM). This set-up will also be used for the upcoming Flight Model (FM) verification, planned for early 2018. The FPAs have to be operated at 215 K +/- 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. The test campaign consists mainly of radiometric tests. This publication focuses on the challenge of remotely and homogeneously illuminating both Sentinel-4 detectors as well as a reference detector over a distance of approximately 1 m from outside the TVC. Selected test analyses and results are presented.
Idiopathic nephrotic syndrome (INS) in children in Dakar: a report of 40 cases
Keita, Younoussa; Lemrabott, Ahmed Tall; Sylla, Assane; Niang, Babacar; Ka, El Hadji Fary; Dial, Chérif Mohamed; Ndongo, Aliou Abdoulaye; Sow, Amadou; Moreira, Claude; Niang, Abdou; Ndiaye, Ousmane; Diouf, Boucar; Sall, Mouhamadou Guélaye
2017-01-01
Introduction: The objective of this work was to analyze the diagnostic, therapeutic, and outcome characteristics of children with nephrosis in a pediatric department in Dakar. Methods: The study was carried out in the pediatric department of the Aristide Le Dantec hospital. It was a retrospective study over a period of three years, from January 1, 2012 to December 31, 2014. All patients aged 2 to 12 years presenting with idiopathic nephrotic syndrome were included. Results: Forty cases of nephrosis were collected, a prevalence of 23% among the nephropathies managed in the department. The mean age was 7.11 ± 3.14 years. The nephrotic syndrome was pure in 72.5% (n=29) of patients. Lower-limb edema was present in 100% of patients, oliguria in 55% (n=22) and hypertension in 5% (n=2) of cases. Mean proteinuria was 145.05 ± 85.54 mg/kg/24 h. Mean serum protein was 46.42 ± 7.88 g/L and mean serum albumin 17.90 ± 7.15 g/L. Thirty-nine patients received prednisone-based corticosteroid therapy. Steroid sensitivity was found in 77% (n=30) of patients and steroid resistance in 13% (n=5) of cases. The factor associated with poor response to corticosteroid therapy was an initial proteinuria above 150 mg/kg/day (p = 0.024). Renal biopsy was performed in 18% (n=7) of patients and showed focal segmental glomerulosclerosis (FSGS) in 57.2% (n=4) of cases. Cyclophosphamide and azathioprine were each combined with corticosteroids in 10% (n=4) of cases. The overall remission rate was 89.8%. Progression to chronic renal failure was noted in three (3) patients. Conclusion: Nephrosis accounted for nearly a quarter of the nephropathies managed in our department. The overall remission rate was high. The only factor associated with poor response to corticosteroid therapy was a high initial proteinuria level. When renal biopsy was indicated in our patients, FSGS was the most frequently found lesion. PMID:28533882
Recent results obtained on the APEX 12 m antenna with the ArTeMiS prototype camera
NASA Astrophysics Data System (ADS)
Talvard, M.; André, P.; Rodriguez, L.; Le-Pennec, Y.; De Breuck, C.; Revéret, V.; Agnèse, P.; Boulade, O.; Doumayrou, E.; Dubreuil, D.; Ercolani, E.; Gallais, P.; Horeau, B.; Lagage, PO; Leriche, B.; Lortholary, M.; Martignac, J.; Minier, V.; Pantin, E.; Rabanus, D.; Relland, J.; Willmann, G.
2008-07-01
ArTeMiS is a camera designed to operate on large ground-based submillimetric telescopes in the three atmospheric windows at 200, 350 and 450 µm. The focal plane of this camera will be equipped with 5760 bolometric pixels cooled to 300 mK with an autonomous cryogenic system. The pixels have been manufactured based on the same technology processes as used for the Herschel-PACS space photometer. We review in this paper the present status and future plans of this project. A prototype camera, named P-ArTeMiS, was developed and successfully tested in 2006 on the KOSMA telescope at Gornergrat (3100 m), Switzerland. Preliminary results were presented at the previous SPIE conference in Orlando (Talvard et al, 2006). Since then, the prototype camera has been proposed and successfully installed on APEX, a 12 m antenna operated by the Max-Planck-Institut für Radioastronomie, the European Southern Observatory and the Onsala Space Observatory on the Chajnantor site at 5100 m altitude in Chile. Two runs were carried out in 2007, the first in March and the second in November. We present in the second part of this paper the first processed images obtained on star-forming regions and on circumstellar and debris disks. Calculated sensitivities are compared with expectations. These illustrate the improvements achieved on P-ArTeMiS during the three experimental campaigns.
Temporal variability of T dwarfs and construction of a wide-field infrared camera
NASA Astrophysics Data System (ADS)
Artigau, Etienne
The thesis work described here is divided into two distinct parts: the first is a study of the temporal variability of T dwarfs, and the second covers the construction and performance of the Caméra PAnoramique Proche InfraRouge (CPAPIR). Brown dwarfs are objects that form like stars, through the gravitational collapse of a molecular gas cloud, but whose mass is too low to sustain nuclear fusion reactions. About 70% of L-type brown dwarfs, which have temperatures between 2200 K and 1500 K, show temporal variability whose exact mechanisms are still debated. We extended the search for temporal variability to brown dwarfs with temperatures below ~1500 K that show methane signatures, the T dwarfs. Our observations at the Observatoire du mont Mégantic show that a significant fraction of T dwarfs are variable at 1.2 μm and 1.6 μm at levels ranging from 17 mmag to 53 mmag RMS. The photometric properties of this variability are consistent with an evolving dust-cloud cover at the surface of several T dwarfs. Complementary spectroscopic observations at the Canada-France-Hawaii telescope show, for one T dwarf, near-infrared spectroscopic variability that is also consistent with the evolution of such dust clouds. CPAPIR is an infrared camera designed for use at the Observatoire du mont Mégantic. It has a 30' × 30' field, the largest field of view among astronomical infrared cameras currently in service. CPAPIR is equipped with a Hawaii-II detector sensitive from 0.8 μm to 2.4 μm with 2048×2048 pixels. The cryogenic optics of CPAPIR comprise 8 cryogenic lenses and 10 filters arranged in two filter wheels. Observations at the telescope with CPAPIR show that the image quality obtained and the overall transmission agree with predictions based on the optical design and the transmission curves of the coatings used for its various optical components. Keywords: astronomy, brown dwarfs, T dwarfs, instrumentation, camera, survey, infrared
Infrared detection with thin-film ferroelectrics
NASA Astrophysics Data System (ADS)
Audaire, L.; Agnèse, P.; Rambaud, Ph.; Pirot, M.
1994-07-01
Ferroelectric materials are used in solid-state infrared detectors. The thermal scene is focused onto a 2D staring array, and the pyroelectric charge variations or the permittivity variations are read by an electronic circuit implemented in the focal plane. Various developments have been carried out and have led to user-friendly sensors, which are now available in several laboratories and even as commercial products in England and the USA, but also in France.
Implementation of a 4x8 NIR and CCD Mosaic Focal Plane Technology
NASA Astrophysics Data System (ADS)
Jelinsky, Patrick; Bebek, C. J.; Besuner, R. W.; Haller, G. M.; Harris, S. E.; Hart, P. A.; Heetderks, H. D.; Levi, M. E.; Maldonado, S. E.; Roe, N. A.; Roodman, A. J.; Sapozhnikov, L.
2011-01-01
Mission concepts for NASA's Wide Field Infrared Survey Telescope (WFIRST), ESA's EUCLID mission, as well as ground-based observations, have requirements for large mosaic focal planes that image visible and near-infrared (NIR) wavelengths. We have developed detectors, readout electronics and focal-plane design techniques that can be used to create very large, scalable focal-plane mosaic cameras. In our technology, CCDs and HgCdTe detectors can be intermingled on a single silicon carbide (SiC) cold plate. This enables optimized, wideband observing strategies. The CCDs, developed at Lawrence Berkeley National Laboratory, are fully depleted, p-channel, backside-illuminated devices capable of operating at temperatures as low as 110 K and optimized for the weak-lensing dark energy technique. The NIR detectors are 1.7-µm and 2.0-µm cutoff-wavelength H2RG® HgCdTe devices, manufactured by Teledyne Imaging Sensors under contract to LBL. Both the CCDs and the NIR detectors are packaged on 4-side-abuttable SiC pedestals with a common mounting footprint supporting a 44.16 mm mosaic pitch, and they are coplanar. Both types of detectors have direct-attached readout electronics that convert the detector signal directly to serial digital data streams and allow a flexible, low-cost data acquisition strategy despite the large data volume. A mosaic of these detectors can be operated at a common temperature that achieves the dark current and read noise performance required in both types of detectors for dark energy observations. We report here the design and integration of a focal plane designed to accommodate a 4x8 heterogeneous array of CCDs and HgCdTe detectors. Our current implementation contains over 1/4-billion pixels.
The optical design concept of SPICA-SAFARI
NASA Astrophysics Data System (ADS)
Jellema, Willem; Kruizinga, Bob; Visser, Huib; van den Dool, Teun; Pastor Santos, Carmen; Torres Redondo, Josefina; Eggens, Martin; Ferlet, Marc; Swinyard, Bruce; Dohlen, Kjetil; Griffin, Doug; Gonzalez Fernandez, Luis Miguel; Belenguer, Tomas; Matsuhara, Hideo; Kawada, Mitsunobu; Doi, Yasuo
2012-09-01
The Safari instrument on the Japanese SPICA mission is a zodiacal-background-limited imaging spectrometer offering a photometric imaging mode (R ≈ 2) and low (R = 100) and medium (R = 2000 at 100 μm) spectral-resolution spectroscopy modes in three photometric bands covering the 34-210 μm wavelength range. The instrument utilizes Nyquist-sampled filled arrays of very sensitive TES detectors providing a 2'x2' instantaneous field of view. The all-reflective optical system of Safari is highly modular and consists of an input optics module containing the entrance shutter, a calibration source and a pair of filter wheels, followed by an interferometer and finally the camera-bay optics accommodating the focal-plane arrays. The optical design is largely driven and constrained by volume, calling for a compact three-dimensional arrangement of the interferometer and camera-bay optics without compromising the optical performance requirements associated with a diffraction- and background-limited spectroscopic imaging instrument. Central to the optics we present a flexible and compact non-polarizing Mach-Zehnder interferometer layout, with dual input and output ports, employing a novel FTS scan mechanism based on magnetic bearings and a linear motor. In this paper we discuss the conceptual design of the focal-plane optics and describe how we implement the optical instrument functions, define the photometric bands, deal with straylight control, diffraction and thermal emission in the long-wavelength limit, and interface to the large-format FPA arrays at one end and the SPICA telescope assembly at the other.
NASA Astrophysics Data System (ADS)
Butts, Robert R.
1997-08-01
A low-noise, high-resolution Shack-Hartmann wavefront sensor was included in the ABLE-ACE instrument suite to obtain direct high-resolution phase measurements of the 0.53-micrometer pulsed laser beam propagated through high-altitude atmospheric turbulence. The wavefront sensor employed a Fried geometry using a lenslet array which provided approximately 17 sub-apertures across the pupil. The lenslets focused the light in each sub-aperture onto a 21 by 21 array of pixels in the camera focal plane, with 8 pixels across the central lobe of the diffraction-limited spot. The goal of the experiment was to measure the effects of turbulence in the free atmosphere on propagation, but the wavefront sensor also detected the aberrations induced by the aircraft boundary layer and the receiver aircraft's internal beam path. Data analysis methods used to extract the desired atmospheric contribution to the phase measurements from data corrupted by non-atmospheric aberrations are described. Approaches which were used included a reconstruction of the phase as a linear combination of Zernike polynomials, coupled with optimal estimators, and computation of structure functions of the sub-aperture slopes. The theoretical basis for the data analysis techniques is presented. Results are described, and comparisons with theory and simulations are shown. Estimates of average turbulence strength along the propagation path from the wavefront sensor showed good agreement with other sensors. The Zernike spectra calculated from the wavefront sensor data were consistent with the standard Kolmogorov model of turbulence.
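The Zernike reconstruction step mentioned above can be sketched as a least-squares fit of the measured sub-aperture slopes to the slope responses of the Zernike modes. This is a hedged, generic version, not the ABLE-ACE pipeline; the slope-influence matrices are assumed to be precomputed.

import numpy as np

def fit_zernike_from_slopes(slopes_x, slopes_y, dzdx, dzdy):
    """slopes_x, slopes_y: (nsub,) measured x/y slopes per sub-aperture.
    dzdx, dzdy: (nsub, nmodes) x/y derivatives of each Zernike mode,
    averaged over each sub-aperture (assumed precomputed)."""
    a = np.vstack([dzdx, dzdy])                 # (2*nsub, nmodes)
    b = np.concatenate([slopes_x, slopes_y])    # (2*nsub,)
    coeffs, *_ = np.linalg.lstsq(a, b, rcond=None)
    return coeffs                               # Zernike coefficients (piston excluded upstream)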
Real-time optical multiple object recognition and tracking system and method
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin (Inventor); Liu, Hua-Kuang (Inventor)
1990-01-01
System for optically recognizing and tracking a plurality of objects within a field of vision. Laser (46) produces a coherent beam (48). Beam splitter (24) splits the beam into object (26) and reference (28) beams. Beam expanders (50) and collimators (52) transform the beams (26, 28) into coherent collimated light beams (26', 28'). A two-dimensional SLM (54), disposed in the object beam (26'), modulates the object beam with optical information as a function of signals from a first camera (16) which develops X and Y signals reflecting the contents of its field of vision. A hololens (38), positioned in the object beam (26') subsequent to the modulator (54), focuses the object beam at a plurality of focal points (42). A planar transparency-forming film (32), disposed with the focal points on an exposable surface, forms a multiple position interference filter (62) upon exposure of the surface and development processing of the film (32). A reflector (53) directing the reference beam (28') onto the film (32), exposes the surface, with images focused by the hololens (38), to form interference patterns on the surface. There is apparatus (16', 64) for sensing and indicating light passage through respective ones of the positions of the filter (62), whereby recognition of objects corresponding to respective ones of the positions of the filter (62) is effected. For tracking, apparatus (64) focuses light passing through the filter (62) onto a matrix of CCD's in a second camera (16') to form a two-dimensional display of the recognized objects.
NASA Astrophysics Data System (ADS)
Lin, Han; Baoqi, Mao; Wen, Sun; Weimin, Shen
2016-10-01
Since Skybox's success, there has been a race to develop spaceborne high-resolution video cameras. For low manufacturing cost and adaptation to micro- and small satellites, there is a pressing need for compact, long-focal-length optical systems that offer not only small volume, light weight and easy implementation, but also a two-dimensional field. Our focus is on the Coaxial Three-Mirror Anastigmat (CTMA) with an intermediate real image, chosen because it needs no external hood, is compact, and offers easy alignment, low-order aspheric surfaces and low cost. The means of deflecting its image-space beam to make the focal-plane array detector accessible, and of eliminating the inherent secondary obscuration caused by the primary-mirror central hole and the deflecting flat mirror, are discussed. The conditions to satisfy these requirements are presented, based on a relationship we derive among the optical and structural parameters using Gaussian optics and geometry. One flat mirror near the exit pupil can be used to deflect the image plane from the axis, and the total length can be decreased further with additional flat mirrors. A method for determining the initial structure with the derived formulae is described through a design example. Furthermore, an optimized CTMA without secondary obscuration and with an effective focal length (EFFL) of 10 m is reported. Its full field, F-number and total length are 1.1°×1°, F/14.3, and one eighth of its EFFL, respectively, and its imaging quality is near the diffraction limit.
Search for Near-Earth Objects with Small Aphelion Distances
NASA Technical Reports Server (NTRS)
Tholen, David J.
2003-01-01
An essential component of our ability to efficiently find NEOs at small solar elongation is a focal reducer, whose construction is being separately funded by a grant from NSF. This focal reducer will increase the field of view of the 8k CCD mosaic camera from 19 arc min to about 32 arc min at the Cassegrain focus of the University of Hawaii 2.24-m telescope. As of January, all but one of the lenses for the focal reducer were in hand. The final lens had been delayed due to problems with the availability of the rather exotic material out of which the manufacturer was to fabricate the lens. Perhaps as a result of their rush to deliver that final lens, it developed a crack during the annealing process at the manufacturer, so they had to start over. The total delay in delivery of that last lens was nearly ten months, and therefore the focal reducer was not completed on schedule and could not be used on the telescope this semester. A postdoctoral research associate was recruited to handle the day-to-day operations. The closing date for applications was 2002 December 31, and seven were received. One applicant was not qualified, and two were marginal. Of the four qualified candidates, Fabrizio Bernardi stood out as being best qualified. He was a student of Andrea Carusi and had worked on the CINEOS project in Italy, which includes a component of searching for NEOs at small solar elongations.
Preliminary assessment of the ATHENA/WFI non-X-ray background
NASA Astrophysics Data System (ADS)
Perinati, Emanuele; Barbera, Marco; Diebold, Sebastian; Guzman, Alejandro; Santangelo, Andrea; Tenzer, Chris
2017-12-01
We present a preliminary assessment of the non-X-ray background for the WFI on board ATHENA conducted at IAAT in the context of the collaborative background and radiation damage working group activities. Our main result is that in the baseline configuration originally assumed for the camera the requirement on the level of non-X-ray background could not be met. In light of the results of Geant4 simulations we propose and discuss a possible optimization of the camera design and pinpoint some open issues to be addressed in the next phase of investigation. One of these concerns the possible contribution to the non-X-ray background from soft protons and ions funneled to the focal plane through the optics. This is not quantified at this stage, here we just briefly report on our ongoing activities aimed at validating the mechanisms of proton scattering at grazing incidence.
NASA Astrophysics Data System (ADS)
Simoens, François; Meilhan, Jérôme; Nicolas, Jean-Alain
2015-10-01
Sensitive and large-format terahertz focal-plane arrays (FPAs) integrated in compact, hand-held cameras that deliver real-time terahertz (THz) imaging are required for many application fields, such as non-destructive testing (NDT), security, quality control of food, and the agricultural products industry. Two technologies of uncooled THz arrays being studied at CEA-Leti, i.e., bolometers and complementary metal oxide semiconductor (CMOS) field effect transistors (FETs), are able to meet these requirements. This paper reviews the technological approaches followed and focuses on the latest modeling and performance analysis. The applicability of these arrays to NDT and security is then demonstrated with experimental tests. In particular, the high technological maturity of the THz bolometer camera is illustrated by fast scanning of a large field of view of opaque scenes achieved in a complete body-scanner prototype.
Continuous-wave terahertz digital holography by use of a pyroelectric array camera.
Ding, Sheng-Hui; Li, Qi; Li, Yun-Da; Wang, Qi
2011-06-01
Terahertz (THz) digital holography is realized based on a 2.52 THz far-IR gas laser and a commercial 124 × 124 pyroelectric array camera. Off-axis THz holograms are obtained by recording interference patterns between light passing through the sample and the reference wave. A numerical reconstruction process is performed to obtain the field distribution at the object surface. Different targets were imaged to test the system's imaging capability. Compared with THz focal-plane images, the image quality of the reconstructed images is substantially improved. The results show that the system's imaging resolution can reach at least 0.4 mm. The system also has potential for real-time imaging applications. This study confirms that digital holography is a promising technique for real-time, high-resolution THz imaging, which has extensive application prospects. © 2011 Optical Society of America
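A standard off-axis digital-holography reconstruction of the kind referred to above can be sketched as follows (a generic angular-spectrum version, not necessarily the exact processing chain of this paper): the +1 diffraction order is isolated in the Fourier domain of the hologram, re-centered, and propagated back to the object plane. The filter size, wavelength, pixel pitch and propagation distance are assumptions.

import numpy as np

def reconstruct(hologram, wavelength, pixel_pitch, distance, order_center, radius):
    """order_center: (row, col) of the +1 order in the fftshifted spectrum (assumed known)."""
    ny, nx = hologram.shape
    H = np.fft.fftshift(np.fft.fft2(hologram))
    yy, xx = np.ogrid[:ny, :nx]
    mask = (yy - order_center[0])**2 + (xx - order_center[1])**2 <= radius**2
    sideband = np.roll(H * mask,
                       (ny // 2 - order_center[0], nx // 2 - order_center[1]), axis=(0, 1))
    # angular-spectrum propagation back to the object plane
    fy = np.fft.fftshift(np.fft.fftfreq(ny, pixel_pitch))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(nx, pixel_pitch))[None, :]
    arg = 1.0 - (wavelength * fx)**2 - (wavelength * fy)**2
    kernel = np.exp(1j * 2 * np.pi / wavelength * distance * np.sqrt(np.maximum(arg, 0.0)))
    field = np.fft.ifft2(np.fft.ifftshift(sideband * kernel))
    return np.abs(field), np.angle(field)       # amplitude and phase at the object plane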
Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu
2016-01-01
Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single-pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method is also time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed for CS video to enhance video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block-matching algorithm is used for motion estimation. Experiments demonstrate that using the block-matching algorithm can reduce motion estimation time by 30%. PMID:26950127
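The block-matching step mentioned above can be illustrated with a minimal full-search sketch (a generic version, not the paper's implementation); the block size, search range and SAD cost are assumptions.

import numpy as np

def block_match(ref_frame, cur_frame, block=8, search=4):
    """Return per-block (dy, dx) motion vectors from cur_frame to ref_frame."""
    h, w = cur_frame.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur = cur_frame[by:by + block, bx:bx + block].astype(float)
            best, best_v = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y and y + block <= h and 0 <= x and x + block <= w:
                        cand = ref_frame[y:y + block, x:x + block].astype(float)
                        sad = np.abs(cur - cand).sum()       # sum of absolute differences
                        if sad < best:
                            best, best_v = sad, (dy, dx)
            vectors[by // block, bx // block] = best_v
    return vectors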
Report of the facility definition team spacelab UV-Optical Telescope Facility
NASA Technical Reports Server (NTRS)
1975-01-01
Scientific requirements for the Spacelab Ultraviolet-Optical Telescope (SUOT) facility are presented. Specific programs involving high angular resolution imagery over wide fields, far ultraviolet spectroscopy, precisely calibrated spectrophotometry and spectropolarimetry over a wide wavelength range, and planetary studies, including high resolution synoptic imagery, are recommended. Specifications for the mounting configuration, instrument mounting system, optical parameters, and the pointing and stabilization system are presented. Concepts for the focal plane instruments are defined. The functional requirements of the direct imaging camera, far ultraviolet spectrograph, and the precisely calibrated spectrophotometer are detailed, and the planetary camera concept is outlined. Operational concepts described in detail are: the makeup and functions of the shuttle payload crew, extravehicular activity requirements, telescope control and data management, payload operations control room, orbital constraints, and orbital interfaces (stabilization, maneuvering requirements and attitude control, contamination, utilities, and payload weight considerations).
Application of phase matching autofocus in airborne long-range oblique photography camera
NASA Astrophysics Data System (ADS)
Petrushevsky, Vladimir; Guberman, Asaf
2014-06-01
The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, day and night. The front Ritchey-Chretien optics is made of highly stable materials. However, the camera temperature varies considerably in flight conditions. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease of the structure length. The temperature and humidity effects change the distance between the mirrors by just a few microns. The distance change is small, but it nevertheless alters the camera's infinity focus setpoint significantly, especially in the EO band. To realize the optics' resolution potential, the optimal focus must be constantly maintained. In-flight best-focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To obtain even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of an existing beam-sharer prism FPA arrangement in which an aperture partition exists inherently in the area of overlap between adjacent detectors. The defocus is proportional to the image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content. The closed-loop control converges robustly to the precise focus position. The algorithm uses the temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly in various conditions of scene illumination and contrast.
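The phase-matching principle described above can be sketched generically (this is not the Condor2 flight code): the relative image shift between the two aperture halves seen in the detector-overlap region is estimated by phase correlation, and the defocus command is taken as proportional to that shift after low-pass filtering. The calibration gain, filter constant and split direction are assumptions.

import numpy as np

def overlap_shift(img_a, img_b):
    """Integer-pixel phase-correlation shift (dy, dx) between the two overlap images."""
    A, B = np.fft.fft2(img_a), np.fft.fft2(img_b)
    r = A * np.conj(B)
    corr = np.fft.ifft2(r / np.maximum(np.abs(r), 1e-12))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    shift = np.array(peak, float)
    half = np.array(corr.shape) / 2.0
    return np.where(shift > half, shift - np.array(corr.shape), shift)

def update_focus(prev_estimate, img_a, img_b, gain=1.0, alpha=0.1):
    """One closed-loop step: raw defocus from the measured shift, then low-pass filtered."""
    raw_defocus = gain * overlap_shift(img_a, img_b)[1]   # shift along the assumed split direction
    return (1 - alpha) * prev_estimate + alpha * raw_defocus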
Curiosity Rover View of Alluring Martian Geology Ahead
2015-08-05
A southward-looking panorama combining images from both cameras of the Mast Camera (Mastcam) instrument on NASA's Curiosity Mars Rover shows diverse geological textures on Mount Sharp. Three years after landing on Mars, the mission is investigating this layered mountain for evidence about changes in Martian environmental conditions, from an ancient time when conditions were favorable for microbial life to the much-drier present. Gravel and sand ripples fill the foreground, typical of terrains that Curiosity traversed to reach Mount Sharp from its landing site. Outcrops in the midfield are of two types: dust-covered, smooth bedrock that forms the base of the mountain, and sandstone ridges that shed boulders as they erode. Rounded buttes in the distance contain sulfate minerals, perhaps indicating a change in the availability of water when they formed. Some of the layering patterns on higher levels of Mount Sharp in the background are tilted at different angles than others, evidence of complicated relationships still to be deciphered. The scene spans from southeastward at left to southwestward at right. The component images were taken on April 10 and 11, 2015, the 952nd and 953rd Martian days (or sols) since the rover's landing on Mars on Aug. 6, 2012, UTC (Aug. 5, PDT). Images in the central part of the panorama are from Mastcam's right-eye camera, which is equipped with a 100-millimeter-focal-length telephoto lens. Images used in outer portions, including the most distant portions of the mountain in the scene, were taken with Mastcam's left-eye camera, using a wider-angle, 34-millimeter lens. http://photojournal.jpl.nasa.gov/catalog/PIA19803
An automated calibration method for non-see-through head mounted displays.
Gilson, Stuart J; Fitzgibbon, Andrew W; Glennerster, Andrew
2011-08-15
Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet, existing calibration methods are time-consuming and depend on human judgements, making them error-prone, and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements. Copyright © 2011 Elsevier B.V. All rights reserved.
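The final estimation step can be sketched with a generic calibration routine as a stand-in (OpenCV here, not necessarily the authors' toolchain): once the tracked 3-D marker positions and their re-expressed HMD-grid coordinates are available, a standard camera calibration recovers the display's intrinsics and extrinsics. Data loading, tracking and the grid re-expression are assumed to have been done already, and the function and variable names are hypothetical.

import numpy as np
import cv2

def calibrate_hmd(object_points, display_points, display_size):
    """object_points: list of (N,3) float32 arrays of tracked 3-D marker positions.
    display_points: list of (N,1,2) float32 arrays of the same markers in HMD pixel coords.
    display_size: (width, height) of the HMD display in pixels."""
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, display_points, display_size, None, None)
    fx, fy = K[0, 0], K[1, 1]          # focal lengths in pixels
    cx, cy = K[0, 2], K[1, 2]          # principal point (optic centre)
    return rms, (fx, fy, cx, cy), rvecs, tvecs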
NASA Astrophysics Data System (ADS)
Wong, Erwin
2000-03-01
Traditional linear imaging methods limit the viewer to a single fixed-point perspective. By means of a single-lens, multiple-perspective mirror system, a 360-degree representation of the area around the camera is reconstructed. This reconstruction is used to overcome the limitations of a traditional camera by providing the viewer with many different perspectives. By constructing the mirror as a hemispherical surface with multiple focal lengths at various diameters, and by placing a parabolic mirror overhead, a stereoscopic image can be extracted from the image captured by a high-resolution camera placed beneath the mirror. Image extraction and correction are performed by computer processing of the image obtained by the camera; the image presents up to five distinguishable viewpoints from which a computer can extrapolate pseudo-perspective data. Geometric and depth-of-field information can be extrapolated via comparison and isolation of objects within a virtual scene post-processed by the computer. Combining these data with scene-rendering software provides the viewer with the ability to choose a desired viewing position, multiple dynamic perspectives, and virtually constructed perspectives based on minimal existing data. An examination of the workings of the mirror relay system is provided, including possible image extrapolation and correction methods. Generation of virtual interpolated and constructed data is also discussed.
2011-07-01
taken with the same camera head, operating temperature, range of calibrated blackbody illuminations, and using the same long-wavelength IR (LWIR) f/2...measurements shown in this article and are tabulated for comparison purposes only. Images were taken with all four devices using an f/2 LWIR lens (8-12 μm...These were acquired after a nonuniformity correction. A custom image-scaling algorithm was used to avoid the standard nonuniformity-corrected scaling
NASA Astrophysics Data System (ADS)
Smith, H. J.; Barnes, T. G., III; Tull, R. G.; Nather, R. E.; Angel, R.; Meinel, A.; Macfarlane, M.; Brault, J.; Neugebauer, G.; Gillett, F.; Richardson, E. H.
Contents: Introductions (H. J. Smith). History of the project (H. J. Smith). Project constraints (T. G. Barnes III).Project constraints (R. G. Tull). Telescope concept (R. E. Nather). Auxiliary instruments (R. E. Nather). Paul-Baker prime focus (R. Angel). Prime focus and Nasmyth cameras (A. Meinel). Nasmyth focal reducers (M. MacFarlane). Spectrometry (R. Angel, R. G. Tull, J. Brault). Infrared sites (G. Neugebauer). IR instrumentation (F. Gillett). Prime focus imaging (E. H. Richardson). Primary mirror figure control (R. G. Tull).
Earth Observations taken by Expedition 41 crewmember
2014-09-27
ISS041-E-045469 (27 Sept. 2014) --- One of the Expedition 41 crew members aboard the International Space Station, flying at an altitude of 222 nautical miles above a point in the Atlantic Ocean several hundred miles off the coast of Africa near the Tropic of Cancer, photographed this eye-catching panorama of the night sky on Sept. 27. NASA astronaut Reid Wiseman, flight engineer, tweeted the image, which was taken with an electronic still camera, set with a 24mm focal length. In his accompanying comments, Wiseman stated, "Sahara sands make the Earth glow orange."
Design of a 2-mm Wavelength KIDs Prototype Camera for the Large Millimeter Telescope
NASA Astrophysics Data System (ADS)
Velázquez, M.; Ferrusca, D.; Castillo-Dominguez, E.; Ibarra-Medel, E.; Ventura, S.; Gómez-Rivera, V.; Hughes, D.; Aretxaga, I.; Grant, W.; Doyle, S.; Mauskopf, P.
2016-08-01
A new camera is being developed for the Large Millimeter Telescope (Sierra Negra, México) by an international collaboration with the University of Massachusetts, the University of Cardiff, and Arizona State University. The camera is based on kinetic inductance detectors (KIDs), a very promising technology due to their sensitivity and, especially, their compatibility with frequency-domain multiplexing at microwave frequencies, which allows large-format arrays in comparison with other detection technologies for mm-wavelength astronomy. The instrument will have a 100-pixel array of KIDs to image the 2-mm wavelength band and is designed for closed-cycle operation using a pulse-tube cryocooler along with a three-stage sub-kelvin 3He cooler to provide a 250 mK detector stage. RF cabling is used to read out the detectors from room temperature to the 250 mK focal plane, and the amplification stage is achieved with a low-noise amplifier operating at 4 K. The readout electronics will be based on open-source, reconfigurable open-architecture computing hardware in order to perform real-time microwave transmission measurements and monitor the resonance frequency of each detector, as well as the detection process.
Inferred UV Fluence Focal-Spot Profiles from Soft X-Ray Pinhole Camera Measurements on OMEGA
NASA Astrophysics Data System (ADS)
Theobald, W.; Sorce, C.; Epstein, R.; Keck, R. L.; Kellogg, C.; Kessler, T. J.; Kwiatkowski, J.; Marshall, F. J.; Seka, W.; Shvydky, A.; Stoeckl, C.
2017-10-01
The drive uniformity of OMEGA cryogenic implosions is affected by UV beam-fluence variations on target, which require careful monitoring at full laser power. This is routinely performed with multiple pinhole cameras equipped with charge-injection devices (CIDs) that record the x-ray emission in the 3- to 7-keV photon energy range from an Au-coated target. The technique relies on knowledge of the relation between x-ray fluence F_x and UV fluence F_UV, F_x ∝ F_UV^γ, with a measured γ = 3.42 for the CID-based diagnostic and a 1-ns laser pulse. It is demonstrated here that using a back-thinned charge-coupled-device camera with softer filtration, for x-rays with photon energies <2 keV, and a well-calibrated pinhole provides a lower γ ≈ 2 and a larger dynamic range in the measured UV fluence. Inferred UV fluence profiles were measured for 100-ps and 1-ns laser pulses and were compared to directly measured profiles from a UV equivalent-target-plane diagnostic. Good agreement between the two techniques is reported for selected beams. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
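As a numerical illustration of why the lower exponent widens the usable range (based only on the power-law relation quoted above, not on additional data from the paper): since F_x ∝ F_UV^γ implies ΔF_UV/F_UV = (1/γ) ΔF_x/F_x, an x-ray measurement spanning a factor of 100 in F_x covers only a factor of 100^(1/3.42) ≈ 3.8 in UV fluence at γ = 3.42, but a factor of 100^(1/2) = 10 at γ ≈ 2.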
Multi-band infrared camera systems
NASA Astrophysics Data System (ADS)
Davis, Tim; Lang, Frank; Sinneger, Joe; Stabile, Paul; Tower, John
1994-12-01
The program resulted in an IR camera system that utilizes a unique MOS-addressable focal plane array (FPA) with full TV resolution, electronic control capability, and windowing capability. Two systems were delivered, each with two different camera heads: a Stirling-cooled 3-5 micron band head and a liquid nitrogen-cooled, filter-wheel-based, 1.5-5 micron band head. Signal processing features include averaging of up to 16 frames, flexible compensation modes, gain and offset control, and real-time dither. The primary digital interface is a Hewlett-Packard standard GPIB (IEEE-488) port that is used to upload and download data. The FPA employs an X-Y addressed PtSi photodiode array, CMOS horizontal and vertical scan registers, horizontal signal line (HSL) buffers followed by a high-gain preamplifier, and a depletion NMOS output amplifier. The 640 x 480 MOS X-Y addressed FPA has a high degree of flexibility in operational modes. By changing the digital data pattern applied to the vertical scan register, the FPA can be operated in either an interlaced or a noninterlaced format. The thermal sensitivity performance of the second system's Stirling-cooled head was the best of the systems produced.
Review of terahertz technology development at INO
NASA Astrophysics Data System (ADS)
Dufour, Denis; Marchese, Linda; Terroux, Marc; Oulachgar, Hassane; Généreux, Francis; Doucet, Michel; Mercier, Luc; Tremblay, Bruno; Alain, Christine; Beaupré, Patrick; Blanchard, Nathalie; Bolduc, Martin; Chevalier, Claude; D'Amato, Dominic; Desroches, Yan; Duchesne, François; Gagnon, Lucie; Ilias, Samir; Jerominek, Hubert; Lagacé, François; Lambert, Julie; Lamontagne, Frédéric; Le Noc, Loïc; Martel, Anne; Pancrati, Ovidiu; Paultre, Jacques-Edmond; Pope, Tim; Provençal, Francis; Topart, Patrice; Vachon, Carl; Verreault, Sonia; Bergeron, Alain
2015-10-01
Over the past decade, INO has leveraged its expertise in the development of uncooled microbolometer detectors for infrared imaging to produce terahertz (THz) imaging systems. By modifying its microbolometer-based focal plane arrays to enhance absorption in the THz bands and by developing custom THz imaging lenses, INO has developed a leading-edge THz imaging system, the IRXCAM-THz-384 camera, capable of exploring novel applications in the emerging field of terahertz imaging and sensing. Using appropriate THz sources, results show that the IRXCAM-THz-384 camera is able to image a variety of concealed objects of interest for applications such as non-destructive testing and weapons detection. By using a longer-wavelength (94 GHz) source, it is also capable of sensing the signatures of various objects hidden behind a drywall panel. This article, written as a review of THz research at INO over the past decade, describes the technical components that form the IRXCAM-THz-384 camera and the experimental setup used for active THz imaging. Image results for concealed weapons detection experiments, an exploration of the effect of wavelength choice on image quality, and the detection of hidden objects behind drywall are also presented.
Inauguration and first light of the GCT-M prototype for the Cherenkov telescope array
NASA Astrophysics Data System (ADS)
Watson, J. J.; De Franco, A.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jegouzo, I.; Jogler, T.; Kraus, M.; Lapington, J. S.; Laporte, P.; Lefaucheur, J.; Markoff, S.; Melse, T.; Mohrmann, L.; Molyneux, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayède, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Vink, J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium
2017-01-01
The Gamma-ray Cherenkov Telescope (GCT) is a candidate for the Small Size Telescopes (SSTs) of the Cherenkov Telescope Array (CTA). Its purpose is to extend the sensitivity of CTA to gamma-ray energies reaching 300 TeV. Its dual-mirror optical design and curved focal plane enable the use of a compact camera of 0.4 m diameter, while achieving a field of view of above 8 degrees. Through the use of the digitising TARGET ASICs, the Cherenkov flash is sampled once per nanosecond continuously and then digitised when triggering conditions are met within the analogue outputs of the photosensors. Entire waveforms (typically covering 96 ns) for all 2048 pixels are then stored for analysis, allowing a broad spectrum of investigations to be performed on the data. Two prototypes of the GCT camera are under development, with differing photosensors: Multi-Anode Photomultipliers (MAPMs) and Silicon Photomultipliers (SiPMs). During November 2015, the GCT MAPM (GCT-M) prototype camera was integrated onto the GCT structure at the Observatoire de Paris-Meudon, where it observed the first Cherenkov light detected by a prototype instrument for CTA.
Extended depth of field system for long distance iris acquisition
NASA Astrophysics Data System (ADS)
Chen, Yuan-Lin; Hsieh, Sheng-Hsun; Hung, Kuo-En; Yang, Shi-Wen; Li, Yung-Hui; Tien, Chung-Hao
2012-10-01
Using biometric signatures for identity recognition has been practiced for centuries. Recently, iris recognition systems have attracted much attention due to their high accuracy and high stability. The texture of the iris provides a signature that is unique to each subject. Currently, most commercial iris recognition systems acquire images at less than 50 cm, a serious constraint that must be relaxed for applications such as airport access or entrances requiring a high turn-over rate. In order to capture iris patterns from a distance, in this study we developed a telephoto imaging system with image processing techniques. With a cubic phase mask positioned in front of the camera, the point spread function is kept nearly constant over a wide range of defocus. With an adequate decoding filter, the blurred image is restored, allowing a working distance between the subject and the camera of over 3 m with a 500 mm focal length and an F/6.3 aperture. The simulation and experimental results validated the proposed scheme: the depth of focus of the iris camera was extended threefold over traditional optics, while keeping sufficient recognition accuracy.
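The extended-depth-of-field idea described above follows the generic wavefront-coding recipe, sketched below (not the authors' exact decoding filter): a cubic phase mask makes the point spread function nearly defocus-invariant, so a single Wiener filter built from the in-focus coded PSF restores images over a wide defocus range. The mask strength and noise level are assumptions.

import numpy as np

def cubic_pupil(n, alpha):
    """Pupil with cubic phase mask exp(i*alpha*(x^3 + y^3)) inside a circular aperture."""
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    pupil = (X**2 + Y**2 <= 1).astype(float)
    return pupil * np.exp(1j * alpha * (X**3 + Y**3))

def coded_psf(pupil, defocus=0.0):
    """Incoherent PSF of the coded pupil with an added quadratic defocus phase (in waves)."""
    x = np.linspace(-1, 1, pupil.shape[0])
    X, Y = np.meshgrid(x, x)
    field = pupil * np.exp(1j * 2 * np.pi * defocus * (X**2 + Y**2))
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.sum()

def wiener_restore(blurred, psf, nsr=1e-2):
    """Restore a coded image with a Wiener filter built from the coded PSF.
    blurred and psf are assumed to be sampled on the same grid."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    W = np.conj(H) / (np.abs(H)**2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))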
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Matallah, Noura; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Jenouvrier, Pierre; Mallet, Eric; Reibel, Yann
2014-06-01
Today, both military and civilian applications require miniaturized optical systems in order to give an imagery function to vehicles with small payload capacity. After the development of megapixel focal plane arrays (FPA) with micro-sized pixels, this miniaturization will become feasible with the integration of optical functions in the detector area. In the field of cooled infrared imaging systems, the detector area is the Detector-Dewar-Cooler Assembly (DDCA). SOFRADIR and ONERA have launched a new research and innovation partnership, called OSMOSIS, to develop disruptive technologies for DDCA to improve the performance and compactness of optronic systems. With this collaboration, we will break down the technological barriers of DDCA, a sealed and cooled environment dedicated to the infrared detectors, to explore Dewar-level integration of optics. This technological breakthrough will bring more compact multipurpose thermal imaging products, as well as new thermal capabilities such as 3D imagery or multispectral imagery. Previous developments will be recalled (SOIE and FISBI cameras) and new developments will be presented. In particular, we will focus on a dual-band MWIR-LWIR camera and a multichannel camera.
KAPAO Prime: Design and Simulation
NASA Astrophysics Data System (ADS)
McGonigle, Lorcan
2012-11-01
KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration from Pomona College's telescope atop Table Mountain. We present here the final optical system, referred to as Prime, designed in Zemax Optical Design Software. Prime is characterized by diffraction-limited imaging over the full 73'' field of view of our Andor camera at f/33 as well as for our NIR Xenics camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of the Andor camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity. Modeling of the temperature deformation of the breadboard in SolidWorks revealed 70-micron contractions along the edges of the board for a decrease of 75 F; when applied to OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nanometers of wavefront error. Flexure modeling of the breadboard due to gravity is ongoing. We hope to begin alignment and testing of ``Prime'' in Q1 2013.
Meteor44 Video Meteor Photometry
NASA Technical Reports Server (NTRS)
Swift, Wesley R.; Suggs, Robert M.; Cooke, William J.
2004-01-01
Meteor44 is a software system developed at MSFC for the calibration and analysis of video meteor data. The dynamic range of the (8-bit) video data is extended by approximately 4 magnitudes for both meteors and stellar images using saturation compensation. Camera- and lens-specific saturation compensation coefficients are derived from artificial variable star laboratory measurements. Saturation compensation significantly increases the number of meteors with measured intensity and improves the estimation of the meteoroid mass distribution. Astrometry is automated to determine each image's plate coefficients using appropriate star catalogs. The images are simultaneously intensity-calibrated from the contained stars to determine the photon sensitivity and the saturation level referenced above the atmosphere. The camera's spectral response is used to compensate for stellar color index and typical meteor spectra in order to report meteor light curves in traditional visual magnitude units. Recent efforts include improved camera calibration procedures, long-focal-length "streak" meteor photometry, and two-station track determination. Meteor44 has been used to analyze data from the 2001, 2002, and 2003 MSFC Leonid observational campaigns as well as several lesser showers. The software is interactive and can be demonstrated using data from recent Leonid campaigns.
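The per-image photometric calibration step can be illustrated with a simplified sketch (not the Meteor44 code itself): instrumental magnitudes of matched catalog stars in the frame yield a zero point, which is then applied to the saturation-compensated meteor fluxes. The variable names and the use of a median are assumptions.

import numpy as np

def zero_point(star_counts, catalog_mags):
    """star_counts: background-subtracted integrated counts of matched field stars.
    catalog_mags: their catalog magnitudes (ideally color-corrected upstream)."""
    inst_mag = -2.5 * np.log10(np.asarray(star_counts, float))
    return np.median(np.asarray(catalog_mags, float) - inst_mag)   # robust zero point

def meteor_magnitude(meteor_counts, zp):
    """Convert saturation-compensated meteor counts to a visual-equivalent magnitude."""
    return zp - 2.5 * np.log10(float(meteor_counts))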
Automatic Camera Calibration for Cultural Heritage Applications Using Unstructured Planar Objects
NASA Astrophysics Data System (ADS)
Adam, K.; Kalisperakis, I.; Grammatikopoulos, L.; Karras, G.; Petsa, E.
2013-07-01
As a rule, image-based documentation of cultural heritage relies today on ordinary digital cameras and commercial software. As such projects often involve researchers not familiar with photogrammetry, the question of camera calibration is important. Freely available, open-source, user-friendly software for automatic camera calibration, often based on simple 2D chess-board patterns, is an answer to the demand for simplicity and automation. However, such tools cannot respond to all requirements met in cultural heritage conservation regarding possible imaging distances and focal lengths. Here we investigate the practical possibility of camera calibration from unknown planar objects, i.e. any planar surface with adequate texture; we have focused on the example of urban walls covered with graffiti. Images are connected pair-wise with inter-image homographies, which are estimated automatically through a RANSAC-based approach after extracting and matching interest points with the SIFT operator. All valid points are identified on all images on which they appear. Provided that the image set includes a "fronto-parallel" view, inter-image homographies with this image are regarded as emulations of image-to-world homographies and allow computing initial estimates for the interior and exterior orientation elements. Following this initialization step, the estimates are introduced into a final self-calibrating bundle adjustment. Measures are taken to discard unsuitable images and verify object planarity. Results from practical experimentation indicate that this method may produce satisfactory results. The authors intend to incorporate the described approach into their freely available user-friendly software tool, which relies on chess-boards, to assist non-experts in their projects with image-based approaches.
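The pair-wise matching and homography step described above can be condensed into a few standard OpenCV calls (a generic sketch, not the authors' tool): SIFT keypoints are matched between two images of the planar surface and an inter-image homography is estimated robustly with RANSAC. The ratio-test value and RANSAC threshold are assumptions.

import numpy as np
import cv2

def pairwise_homography(img1, img2, ratio=0.75, ransac_thresh=3.0):
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img1, None)
    k2, d2 = sift.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    # Lowe ratio test on the two nearest neighbours of each descriptor
    matches = [m for m, n in matcher.knnMatch(d1, d2, k=2) if m.distance < ratio * n.distance]
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(pts1, pts2, cv2.RANSAC, ransac_thresh)
    return H, inliers   # homography mapping img1 points onto img2, with inlier mask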
BOREAS Level-0 C-130 Aerial Photography
NASA Technical Reports Server (NTRS)
Newcomer, Jeffrey A.; Dominguez, Roseanne; Hall, Forrest G. (Editor)
2000-01-01
For the BOReal Ecosystem-Atmosphere Study (BOREAS), C-130 and other aerial photography was collected to provide finely detailed and spatially extensive documentation of the condition of the primary study sites. The NASA C-130 Earth Resources aircraft can accommodate two mapping cameras during flight, each of which can be fitted with 6- or 12-inch focal-length lenses and black-and-white, natural-color, or color-IR film, depending upon requirements. Both cameras were often in operation simultaneously, although sometimes only the lower-resolution camera was deployed. When both cameras were in operation, the higher-resolution camera was often used in a more limited fashion. The acquired photography covers the period of April to September 1994. The aerial photography was delivered as rolls of large-format (9 x 9 inch) color transparency prints, with imagery from multiple missions (hundreds of prints) often contained within a single roll. A total of 1533 frames were collected from the C-130 platform for BOREAS in 1994. Note that the level-0 C-130 transparencies are not contained on the BOREAS CD-ROM set. An inventory file is supplied on the CD-ROM to inform users of all the data that were collected. Some photographic prints were made from the transparencies. In addition, BORIS staff digitized a subset of the transparencies and stored the images in JPEG format. The CD-ROM set contains a small subset of the collected aerial photography that was digitally scanned and stored as JPEG files for most tower and auxiliary sites in the NSA and SSA. See Section 15 for information about how to acquire additional imagery.
Best practices to optimize intraoperative photography.
Gaujoux, Sébastien; Ceribelli, Cecilia; Goudard, Geoffrey; Khayat, Antoine; Leconte, Mahaut; Massault, Pierre-Philippe; Balagué, Julie; Dousset, Bertrand
2016-04-01
Intraoperative photography is used extensively for communication, research, or teaching. The objective of the present work was to define, using a standardized methodology and literature review, the best technical conditions for intraoperative photography. Using either a smartphone camera, a bridge camera, or a single-lens reflex (SLR) camera, photographs were taken under various standard conditions by a professional photographer. All images were independently assessed, blinded to technical conditions, to define the best shooting conditions and methods. For better photographs, an SLR camera with manual settings should be used. Photographs should be centered and taken vertically and orthogonal to the surgical field with a linear scale to avoid error in perspective. The shooting distance should be about 75 cm using an 80-100 mm focal length lens. Flash should be avoided and scialytic low-powered light should be used without focus. The operative field should be clean, wet surfaces should be avoided, and metal instruments should be hidden to avoid reflections. For an SLR camera, International Organization for Standardization (ISO) speed should be as low as possible, autofocus area selection mode should be on single point AF, shutter speed should be above 1/100 second, and aperture should be as narrow as possible, above f/8. For a smartphone, use the high dynamic range setting if available; use of flash, digital filters, effect apps, and digital zoom is not recommended. If a few basic technical rules are known and applied, high-quality photographs can be taken by amateur photographers and fit the standards accepted in clinical practice, academic communication, and publications. Copyright © 2016 Elsevier Inc. All rights reserved.
Camera Development for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Moncada, Roberto Jose
2017-01-01
With the Cherenkov Telescope Array (CTA), the very-high-energy gamma-ray universe, between 30 GeV and 300 TeV, will be probed at an unprecedented resolution, allowing deeper studies of known gamma-ray emitters and the possible discovery of new ones. This exciting project could also confirm the particle nature of dark matter by looking for the gamma rays produced by self-annihilating weakly interacting massive particles (WIMPs). The telescopes will use the imaging atmospheric Cherenkov technique (IACT) to record Cherenkov photons that are produced by the gamma-ray induced extensive air shower. One telescope design features dual-mirror Schwarzschild-Couder (SC) optics that allows the light to be finely focused on the high-resolution silicon photomultipliers of the camera modules starting from a 9.5-meter primary mirror. Each camera module will consist of a focal plane module and front-end electronics, and will have four TeV Array Readout with GSa/s Sampling and Event Trigger (TARGET) chips, giving them 64 parallel input channels. The TARGET chip has a self-trigger functionality for readout that can be used in higher logic across camera modules as well as across individual telescopes, which will each have 177 camera modules. There will be two sites, one in the northern and the other in the southern hemisphere, for full sky coverage, each spanning at least one square kilometer. A prototype SC telescope is currently under construction at the Fred Lawrence Whipple Observatory in Arizona. This work was supported by the National Science Foundation's REU program through NSF award AST-1560016.
In-Flight performance of MESSENGER's Mercury dual imaging system
Hawkins, S.E.; Murchie, S.L.; Becker, K.J.; Selby, C.M.; Turner, F.S.; Noble, M.W.; Chabot, N.L.; Choo, T.H.; Darlington, E.H.; Denevi, B.W.; Domingue, D.L.; Ernst, C.M.; Holsclaw, G.M.; Laslo, N.R.; Mcclintock, W.E.; Prockter, L.M.; Robinson, M.S.; Solomon, S.C.; Sterner, R.E.
2009-01-01
The Mercury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft, launched in August 2004 and planned for insertion into orbit around Mercury in 2011, has already completed two flybys of the innermost planet. The Mercury Dual Imaging System (MDIS) acquired nearly 2500 images from the first two flybys and viewed portions of Mercury's surface not viewed by Mariner 10 in 1974-1975. Mercury's proximity to the Sun and its slow rotation present challenges to the thermal design for a camera on an orbital mission around Mercury. In addition, strict limitations on spacecraft pointing and the highly elliptical orbit create challenges in attaining coverage at desired geometries and relatively uniform spatial resolution. The instrument designed to meet these challenges consists of dual imagers, a monochrome narrow-angle camera (NAC) with a 1.5° field of view (FOV) and a multispectral wide-angle camera (WAC) with a 10.5° FOV, co-aligned on a pivoting platform. The focal-plane electronics of each camera are identical and use a 1024 × 1024 charge-coupled device detector. The cameras are passively cooled but use diode heat pipes and phase-change-material thermal reservoirs to maintain the thermal configuration during the hot portions of the orbit. Here we present an overview of the instrument design and how the design meets its technical challenges. We also review results from the first two flybys, discuss the quality of MDIS data from the initial periods of data acquisition and how that compares with requirements, and summarize how in-flight tests are being used to improve the quality of the instrument calibration. © 2009 SPIE.
Portable retinal imaging for eye disease screening using a consumer-grade digital camera
NASA Astrophysics Data System (ADS)
Barriga, Simon; Larichev, Andrey; Zamora, Gilberto; Soliz, Peter
2012-03-01
The development of affordable means to image the retina is an important step toward the implementation of eye disease screening programs. In this paper we present the i-RxCam, a low-cost, hand-held, retinal camera for widespread applications such as tele-retinal screening for eye diseases like diabetic retinopathy (DR), glaucoma, and age-related ocular diseases. Existing portable retinal imagers do not meet the requirements of a low-cost camera with sufficient technical capabilities (field of view, image quality, portability, battery power, and ease-of-use) to be distributed widely to low volume clinics, such as the offices of single primary care physicians serving rural communities. The i-RxCam uses a Nikon D3100 digital camera body. The camera has a CMOS sensor with 14.8 million pixels. We use a 50mm focal lens that gives a retinal field of view of 45 degrees. The internal autofocus can compensate for about 2D (diopters) of focusing error. The light source is an LED produced by Philips with a linear emitting area that is transformed using a light pipe to the optimal shape at the eye pupil, an annulus. To eliminate corneal reflex we use a polarization technique in which the light passes through a nano-wire polarizer plate. This is a novel type of polarizer featuring high polarization separation (contrast ratio of more than 1000) and very large acceptance angle (>45 degrees). The i-RxCam approach will yield a significantly more economical retinal imaging device that would allow mass screening of the at-risk population.
Development of low-cost high-performance multispectral camera system at Banpil
NASA Astrophysics Data System (ADS)
Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.
2014-05-01
Banpil Photonics (Banpil) has developed a low-cost high-performance multispectral camera system for Visible to Short-Wave Infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512 pixel InGaAs uncooled camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity requiring less than 100 electrons, high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and low power consumption below 1 W. These are practically all the feature benefits highly desirable in military imaging applications for expanding deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor and an innovation integrating advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g., focal plane array (FPA) and Read-Out Integrated Circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve a goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high performance imaging system and their forecast cost structure is presented.
Marcot, Bruce G.; Jorgenson, M. Torre; DeGange, Anthony R.
2014-01-01
5. A Canon® Rebel 3Ti with a Sigma zoom lens (18–200 mm focal length). The Drift® HD-170 and GoPro® Hero3 cameras were secured to the struts and underwing for nadir (direct downward) imaging. The Panasonic® and Canon® cameras were each hand-held for oblique-angle landscape images, shooting through the airplanes' windows, targeting both general landscape conditions as well as landscape features of special interest, such as tundra fire scars and landslips. The Drift® and GoPro® cameras each were set for time-lapse photography at 5-second intervals for overlapping coverage. Photographs from all cameras (100 percent .jpg format) were date- and time-synchronized to Global Positioning System waypoints taken during the flights, also at 5-second intervals, providing precise geotagging (latitude-longitude) of all files. All photographs were adjusted for color saturation and gamma, and nadir photographs were corrected for lens distortion for the Drift® and GoPro® cameras' 170° wide-angle distortion. EXIF (exchangeable image file format) data on camera settings and geotagging were extracted into spreadsheet databases. An additional 1 hour, 20 minutes, and 43 seconds of high-resolution videos were recorded at 60 frames per second with the GoPro® camera along selected transect segments, and also were image-adjusted and corrected for lens distortion. Geotagged locations of 12,395 nadir photographs from the Drift® and GoPro® cameras were overlaid in a geographic information system (ArcMap 10.0) onto a map of 44 ecotypes (land- and water-cover types) of the Arctic Network study area. Presence and area of each ecotype occurring within a geographic information system window centered on the location of each photograph were recorded and included in the spreadsheet databases. All original and adjusted photographs, videos, Global Positioning System flight tracks, and photograph databases are available by contacting ascweb@usgs.gov.
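An illustrative sketch of the geotagging step described above, assuming the workflow reduces to matching each photograph's EXIF capture time to the nearest GPS waypoint (the function name geotag and the sample coordinates are hypothetical, not taken from the study).

```python
# Sketch: assign each photo the GPS waypoint whose timestamp is closest to the
# photo's EXIF time; photos and waypoints are both logged at ~5-second intervals.
from datetime import datetime
from bisect import bisect_left

def geotag(photos, waypoints):
    """photos: list of (filename, datetime); waypoints: time-sorted list of (datetime, lat, lon)."""
    times = [w[0] for w in waypoints]
    tagged = []
    for name, t in photos:
        i = bisect_left(times, t)
        # Compare the neighbors on either side and keep the closer waypoint.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(waypoints)]
        j = min(candidates, key=lambda k: abs((times[k] - t).total_seconds()))
        tagged.append((name, waypoints[j][1], waypoints[j][2]))
    return tagged

# Example: a photo taken at 12:00:07 is tagged with the 12:00:05 waypoint.
photos = [("IMG_0001.jpg", datetime(2013, 7, 1, 12, 0, 7))]
waypoints = [(datetime(2013, 7, 1, 12, 0, 5), 68.123, -158.456),
             (datetime(2013, 7, 1, 12, 0, 10), 68.124, -158.457)]
print(geotag(photos, waypoints))
```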
Marshall Grazing Incidence X-ray Spectrometer (MaGIXS) Slit-Jaw Imaging System
NASA Astrophysics Data System (ADS)
Wilkerson, P.; Champey, P. R.; Winebarger, A. R.; Kobayashi, K.; Savage, S. L.
2017-12-01
The Marshall Grazing Incidence X-ray Spectrometer is a NASA sounding rocket payload providing a 0.6 - 2.5 nm spectrum with unprecedented spatial and spectral resolution. The instrument comprises a novel optical design, featuring a Wolter-1 grazing incidence telescope, which produces a focused solar image on a slit plate, an identical pair of stigmatic optics, a planar diffraction grating and a low-noise detector. When MaGIXS flies on a suborbital launch in 2019, a slit-jaw camera system will reimage the focal plane of the telescope, providing a reference for pointing the telescope on the solar disk and aligning the data to supporting observations from satellites and other rockets. The telescope focuses the X-ray and EUV image of the sun onto a plate covered with a phosphor coating that absorbs EUV photons, which then fluoresces in visible light. This 10-week REU project was aimed at optimizing an off-axis mounted camera with 600-line resolution NTSC video for extremely low light imaging of the slit plate. Radiometric calculations indicate an intensity of less than 1 lux at the slit jaw plane, which set the requirement for camera sensitivity. We selected a Watec 910DB EIA charge-coupled device (CCD) monochrome camera, which has a manufacturer-quoted sensitivity of 0.0001 lux at F1.2. A high magnification and low distortion lens was then identified to image the slit jaw plane from a distance of approximately 10 cm. With the selected CCD camera, tests show that at extreme low-light levels, we achieve a higher resolution than expected, with only a moderate drop in frame rate. Based on sounding rocket flight heritage, the launch vehicle attitude control system is known to stabilize the instrument pointing such that jitter does not degrade video quality for context imaging. Future steps towards implementation of the imaging system will include ruggedizing the flight camera housing and mounting the selected camera and lens combination to the instrument structure.
The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation
NASA Astrophysics Data System (ADS)
Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team
2002-12-01
The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024x2048 Mitel frame transfer CCD detector arrays, each having a 1024x1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel or a rectangular FOV of 16° × 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with a graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from -55°C to +5°C. An onboard calibration target and fiducial marks provide the ability to validate the radiometric and geometric calibration on Mars. Pancam relies heavily on use of the JPL ICER wavelet compression algorithm to maximize data return within stringent mission downlink limits. The scientific goals of the Pancam investigation are to: (a) obtain monoscopic and stereoscopic image mosaics to assess the morphology, topography, and geologic context of each MER landing site; (b) obtain multispectral visible to short-wave near-IR images of selected regions to determine surface color and mineralogic properties; (c) obtain multispectral images over a range of viewing geometries to constrain surface photometric and physical properties; and (d) obtain images of the Martian sky, including direct images of the Sun, to determine dust and aerosol opacity and physical properties. In addition, Pancam also serves a variety of operational functions on the MER mission, including (e) serving as the primary Sun-finding camera for rover navigation; (f) resolving objects on the scale of the rover wheels to distances of ~100 m to help guide navigation decisions; (g) providing stereo coverage adequate for the generation of digital terrain models to help guide and refine rover traverse decisions; (h) providing high resolution images and other context information to guide the selection of the most interesting in situ sampling targets; and (i) supporting acquisition and release of exciting E/PO products.
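A quick consistency check of the quoted Pancam geometry; the pixel pitch below is inferred from the stated IFOV and focal length, not given explicitly in the abstract.

```python
# Back-of-envelope check: IFOV = pixel pitch / focal length, FOV = N_pixels * IFOV.
import math

focal_length_mm = 42.0
ifov_mrad = 0.28                                              # quoted IFOV per pixel
pixel_pitch_um = ifov_mrad * 1e-3 * focal_length_mm * 1e3     # ≈ 11.8 μm (inferred)
fov_deg = math.degrees(1024 * ifov_mrad * 1e-3)               # ≈ 16.4°, matching the ~16° FOV
print(pixel_pitch_um, fov_deg)
```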
Verification of the SENTINEL-4 Focal Plane Subsystem
NASA Astrophysics Data System (ADS)
Williges, C.; Hohn, R.; Rossmann, H.; Hilbert, S.; Uhlig, M.; Buchwinkler, K.; Reulke, R.
2017-05-01
The Sentinel-4 payload is a multi-spectral camera system which is designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defense and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs), one for the UV-VIS spectral range (305 nm … 500 nm), the second for NIR (750 nm … 775 nm). In this publication, we will present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as homogenous light source. The FPAs have to be operated mainly at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors as well as a reference detector homogeneously over a distance of approximately 1 m from outside the TVC. Furthermore, selected test analyses and results will be presented, showing that the Sentinel-4 FPS meets specifications.
NASA Astrophysics Data System (ADS)
Schuster, Norbert; Franks, John
2011-06-01
In the 8-12 micron waveband, Focal Plane Arrays (FPAs) with a 17 micron pixel pitch are available in different array sizes (e.g. 512 x 480 pixels and 320 x 240 pixels) and with excellent electrical properties. Many applications become possible using this new type of IR detector, which will become the future standard in uncooled technology. Lenses with an f-number faster than f/1.5 minimize the diffraction impact on the spatial resolution and guarantee a high thermal resolution for uncooled cameras. Both effects will be quantified. The distinction between Traditional f-number (TF) and Radiometric f-number (RF) is discussed. Lenses with different focal lengths are required for applications in a variety of markets. They are classified by their horizontal field of view (HFOV). In view of the requirements of high-volume markets, several two-lens solutions will be discussed. A commonly accepted parameter of spatial resolution is the Modulation Transfer Function (MTF) value at the Nyquist frequency of the detector (here 30 cy/mm). This parameter of resolution will be presented versus field of view. Wide-angle and super-wide-angle lenses are susceptible to low relative illumination in the corner of the detector. Measures to reduce this drop to an acceptable value are presented.
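A hedged worked example of the diffraction argument above: the diffraction-limited MTF of an ideal f/1.5 lens at the 17-micron detector's Nyquist frequency, taking λ = 10 μm as a representative LWIR wavelength (an assumption, not a value from the abstract).

```python
# Diffraction-limited MTF of an aberration-free circular-pupil lens.
import math

def diffraction_mtf(nu_cy_per_mm, f_number, wavelength_mm):
    nu_cutoff = 1.0 / (wavelength_mm * f_number)    # incoherent cutoff frequency
    x = min(nu_cy_per_mm / nu_cutoff, 1.0)
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

nyquist = 1.0 / (2 * 0.017)                          # ≈ 29.4 cy/mm for a 17 μm pitch
print(diffraction_mtf(nyquist, 1.5, 10e-3))          # ≈ 0.45 for an ideal f/1.5 lens at 10 μm
```

Faster lenses push the cutoff frequency higher, which is why f-numbers below f/1.5 keep the diffraction penalty at Nyquist small.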
Cao, Yanpeng; Tisse, Christel-Loic
2013-09-01
In uncooled long-wave infrared (LWIR) microbolometer imaging systems, temperature fluctuations of the focal plane array (FPA) result in thermal drift and spatial nonuniformity. In this paper, we present a novel approach based on single-image processing to simultaneously estimate temperature variances of FPAs and compensate the resulting temperature-dependent nonuniformity. Through well-controlled thermal calibrations, empirical behavioral models are derived to characterize the relationship between the responses of microbolometer and FPA temperature variations. Then, under the assumption that strong dependency exists between spatially adjacent pixels, we estimate the optimal FPA temperature so as to minimize the global intensity variance across the entire thermal infrared image. We make use of the estimated FPA temperature to infer an appropriate nonuniformity correction (NUC) profile. The performance and robustness of the proposed temperature-adaptive NUC method are evaluated on realistic IR images obtained by a 640 × 512 pixels uncooled LWIR microbolometer imaging system operating in a significantly changed temperature environment.
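A conceptual sketch of the estimation idea described above; the per-pixel behavioral model below is a placeholder linear model (an assumption for illustration), not the empirically derived model of the paper.

```python
# Sketch: search for the FPA temperature whose nonuniformity profile, once
# subtracted, minimizes the global intensity variance of the frame.
import numpy as np

def estimate_fpa_temperature(frame, offset_model, t_candidates):
    """offset_model(T) -> per-pixel offset map predicted for FPA temperature T."""
    best_t, best_var = None, np.inf
    for t in t_candidates:
        corrected = frame - offset_model(t)
        v = corrected.var()
        if v < best_var:
            best_t, best_var = t, v
    return best_t

# Usage sketch with placeholder calibration coefficients (a + b*T per pixel).
a = np.random.rand(512, 640)
b = np.random.rand(512, 640) * 0.01
offset_model = lambda T: a + b * T
frame = np.random.rand(512, 640)
t_hat = estimate_fpa_temperature(frame, offset_model, np.linspace(-10, 50, 61))
corrected_frame = frame - offset_model(t_hat)   # temperature-adaptive NUC applied
```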
NASA Astrophysics Data System (ADS)
Li, Shichun; Chen, Genyu; Katayama, Seiji; Zhang, Yi
2014-06-01
The spatter and the molten pool behavior, which are important phenomena affecting welding quality, were observed and studied using a high-speed camera and an X-ray transmission imaging system during laser welding under different welding parameters. The formation mechanism of spatter and the corresponding relationships between the spatter and molten pool behavior were investigated. An increase of laser power could cause more intense evaporation and lead to more spatter. When the focal position of the laser beam was changed, different forms of spatter were generated, and the flow trends of molten metal on the front keyhole wall and at the rear of the molten pool changed as well. The results revealed that the behavior of the molten pool, which could be affected by the absorbed energy distribution in the keyhole, was the key factor determining spatter formation during laser welding. A relatively sound weld seam could be obtained during laser welding with the focal position located inside the metal.
2009-04-16
This star chart illustrates the large patch of sky that NASA's Kepler mission will stare at for the duration of its three-and-a-half-year lifetime. The planet hunter's full field of view occupies 100 square degrees of our Milky Way galaxy, in the constellations Cygnus and Lyra. Kepler's focal plane, or the area where starlight is focused, is depicted on the star chart as a series of 42 vertical and horizontal rectangles. These rectangles represent the 95-megapixel camera's 42 charge-coupled devices, or CCDs. Scientists selected the orientation of the focal plane's field of view to avoid the region's brightest stars, which are shown as the largest black dots. Some of these bright stars can be seen falling in between the CCD modules, in areas that are not imaged. This was done so that the brightest stars will not saturate large portions of the detectors. Saturation causes signals from the bright stars to spill, or "bloom," into nearby planet-hunting territory. http://photojournal.jpl.nasa.gov/catalog/PIA11983
Performance of the dark energy camera liquid nitrogen cooling system
NASA Astrophysics Data System (ADS)
Cease, H.; Alvarez, M.; Alvarez, R.; Bonati, M.; Derylo, G.; Estrada, J.; Flaugher, B.; Flores, R.; Lathrop, A.; Munoz, F.; Schmidt, R.; Schmitt, R. L.; Schultz, K.; Kuhlmann, S.; Zhao, A.
2014-01-01
The Dark Energy Camera imager and its cooling system were installed onto the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory in Chile in September 2012. The imager cooling system is an LN2 two-phase closed-loop cryogenic cooling system. The cryogenic circulation equipment is located off the telescope. Liquid nitrogen vacuum-jacketed transfer lines are run up the outside of the telescope truss tubes to the imager inside the prime focus cage. The design of the cooling system, along with commissioning experiences and initial cooling system performance, is described. The LN2 cooling system with the DES imager was initially operated at Fermilab for testing, then shipped and tested in the Blanco Coudé room. Now the imager is operating inside the prime focus cage. It is shown that the cooling performance sufficiently cools the imager in a closed-loop mode, which can operate for extended time periods without maintenance or LN2 fills.
Tunable liquid microlens array driven by pyroelectric effect: full interferometric characterization
NASA Astrophysics Data System (ADS)
Miccio, Lisa; Grilli, Simonetta; Vespini, Veronica; Ferraro, Pietro
2008-09-01
Liquid lenses with adjustable focal length are of great interest in the field of microfluidic devices. They are usually realized by the electrowetting effect after electrode patterning on a hydrophobic substrate. Applications are possible in many fields ranging from commercial products such as digital cameras to biological cell sorting. We realized an open array of liquid lenses with adjustable focal length without electrode patterning. We used a z-cut Lithium Niobate crystal (LN) as substrate and a few microliters of an oily substance to obtain the droplet array. The spontaneous polarization of LN crystals is reversed by the electric field poling process, thus enabling the realization of periodically poled LN (PPLN) crystals. The substrate consists of a two-dimensional square array of reversed domains with a period around 200 μm. Each domain presents a hexagonal geometry due to the crystal structure. PPLN is first covered by a thin and homogeneous layer of the above-mentioned liquid, and then its temperature is changed by means of a digitally controlled hot plate. During the heating and cooling process there is a rearrangement of the liquid layer until it reaches the final topography. Lens formation is due to the change in surface tension at the liquid-solid interface induced by the pyroelectric effect. This effect allows the creation of a two-dimensional lens pattern of tunable focal length without electrodes. The temporal evolution of both the shape and the focal length of the lenses is quantitatively measured by Digital Holographic Microscopy. Array imaging properties and quantitative analysis of the lenses' features and aberrations are presented.
NASA Technical Reports Server (NTRS)
Connelly, Joseph A.; Ohl, Raymond G.; Mink, Ronald G.; Mentzell, J. Eric; Saha, Timo T.; Tveekrem, June L.; Hylan, Jason E.; Sparr, Leroy M.; Chambers, V. John; Hagopian, John G.
2003-01-01
The Infrared Multi-Object Spectrometer (IRMOS) is a facility instrument for the Kitt Peak National Observatory 4 and 2.1 meter telescopes. IRMOS is a near-IR (0.8 - 2.5 micron) spectrometer with low- to mid-resolving power (R = 300 - 3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arc-min field of view using a commercial Micro Electro-Mechanical Systems (MEMS) Digital Micro-mirror Device (DMD) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems. The focal reducer images the focal plane of the telescope onto the DMD field stop, and the spectrograph images the DMD onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and the ambient and cryogenic imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve to verify alignment, and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides further verification of the optomechanical alignment method and a measurement of near-angle scattered light due to mirror small-scale surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides spectral lines at 546.1 nm and 1550 nm, and a CCD camera and IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard test results validate this prediction. We conclude with an instrument performance prediction for first light.
Determining the phase and amplitude distortion of a wavefront using a plenoptic sensor.
Wu, Chensheng; Ko, Jonathan; Davis, Christopher C
2015-05-01
We have designed a plenoptic sensor to retrieve phase and amplitude changes resulting from a laser beam's propagation through atmospheric turbulence. Compared with the commonly restricted domain of (-π,π) in phase reconstruction by interferometers, the reconstructed phase obtained by the plenoptic sensors can be continuous up to a multiple of 2π. When compared with conventional Shack-Hartmann sensors, ambiguities caused by interference or low intensity, such as branch points and branch cuts, are less likely to happen and can be adaptively avoided by our reconstruction algorithm. In the design of our plenoptic sensor, we modified the fundamental structure of a light field camera into a mini Keplerian telescope array by accurately cascading the back focal plane of its object lens with a microlens array's front focal plane and matching the numerical aperture of both components. Unlike light field cameras designed for incoherent imaging purposes, our plenoptic sensor operates on the complex amplitude of the incident beam and distributes it into a matrix of images that are simpler and less subject to interference than a global image of the beam. Then, with the proposed reconstruction algorithms, the plenoptic sensor is able to reconstruct the wavefront and a phase screen at an appropriate depth in the field that causes the equivalent distortion on the beam. The reconstructed results can be used to guide adaptive optics systems in directing beam propagation through atmospheric turbulence. In this paper, we will show the theoretical analysis and experimental results obtained with the plenoptic sensor and its reconstruction algorithms.
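A short illustration of the cascading and aperture-matching condition mentioned above, under the interpretation that matching numerical apertures amounts to matching the f-numbers of the objective and the lenslets; all numerical values are hypothetical, not taken from the paper.

```python
# Sketch: matching the lenslet f-number to the objective f-number fixes the
# lenslet focal length once the lenslet pitch is chosen.
objective_focal_mm = 300.0       # hypothetical objective focal length
objective_aperture_mm = 30.0     # hypothetical clear aperture
lenslet_pitch_mm = 0.3           # hypothetical microlens pitch

f_number = objective_focal_mm / objective_aperture_mm   # f/10 working f-number
lenslet_focal_mm = lenslet_pitch_mm * f_number           # 3 mm lenslet focal length
# Cascading: the lenslet array's front focal plane is placed at the objective's
# back focal plane, so each objective-lenslet pair forms a mini Keplerian telescope.
print(f_number, lenslet_focal_mm)
```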
Determining the phase and amplitude distortion of a wavefront using a plenoptic sensor
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.
2015-05-01
We have designed a plenoptic sensor to retrieve phase and amplitude changes resulting from a laser beam's propagation through atmospheric turbulence. Compared with the commonly restricted domain of (-pi, pi) in phase reconstruction by interferometers, the reconstructed phase obtained by the plenoptic sensors can be continuous up to a multiple of 2pi. When compared with conventional Shack-Hartmann sensors, ambiguities caused by interference or low intensity, such as branch points and branch cuts, are less likely to happen and can be adaptively avoided by our reconstruction algorithm. In the design of our plenoptic sensor, we modified the fundamental structure of a light field camera into a mini Keplerian telescope array by accurately cascading the back focal plane of its object lens with a microlens array's front focal plane and matching the numerical aperture of both components. Unlike light field cameras designed for incoherent imaging purposes, our plenoptic sensor operates on the complex amplitude of the incident beam and distributes it into a matrix of images that are simpler and less subject to interference than a global image of the beam. Then, with the proposed reconstruction algorithms, the plenoptic sensor is able to reconstruct the wavefront and a phase screen at an appropriate depth in the field that causes the equivalent distortion on the beam. The reconstructed results can be used to guide adaptive optics systems in directing beam propagation through atmospheric turbulence. In this paper we will show the theoretical analysis and experimental results obtained with the plenoptic sensor and its reconstruction algorithms.
An overview of instrumentation for the Large Binocular Telescope
NASA Astrophysics Data System (ADS)
Wagner, R. Mark
2006-06-01
An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27' × 27') mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6' field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4' × 4') imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction limited (FOV: 0'.5 × 0'.5) imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench near-infrared beam combiner utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC-NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra high resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support.
An overview of instrumentation for the Large Binocular Telescope
NASA Astrophysics Data System (ADS)
Wagner, R. Mark
2004-09-01
An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27' × 27') UB/VRI optimized mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6' field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4' × 4') imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction limited (FOV: 0'.5 × 0'.5) imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench beam combiner with visible and near-infrared imagers utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC/NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra high resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support.
An overview of instrumentation for the Large Binocular Telescope
NASA Astrophysics Data System (ADS)
Wagner, R. Mark
2008-07-01
An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27' × 27') mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6' field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4' × 4') imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction limited (FOV: 0.5' × 0.5') imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench near-infrared beam combiner utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC-NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra high resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support.
Stray light calibration of the Dawn Framing Camera
NASA Astrophysics Data System (ADS)
Kovacs, Gabor; Sierks, Holger; Nathues, Andreas; Richards, Michael; Gutierrez-Marques, Pablo
2013-10-01
Sensitive imaging systems with high dynamic range onboard spacecraft are susceptible to ghost and stray-light effects. During the design phase, the Dawn Framing Camera was laid out and optimized to minimize those unwanted, parasitic effects. However, the requirement of low distortion on the optical design and the use of a front-lit focal plane array induced an additional stray light component. This paper presents the ground-based and in-flight procedures characterizing the stray-light artifacts. The in-flight test used the Sun as the stray light source, at different angles of incidence. The spacecraft was commanded to point to predefined solar elongation positions, and long exposure images were recorded. The PSNIT function was calculated from the known illumination and the ground-based calibration information. In the ground-based calibration, several extended and point sources were used with long exposure times in dedicated imaging setups. The tests revealed that the major contribution to the stray light comes from the ghost reflections between the focal plane array and the band-pass interference filters. Various laboratory experiments and computer modeling simulations were carried out to quantify the amount of this effect, including the analysis of the diffractive reflection pattern generated by the imaging sensor. The accurate characterization of the detector reflection pattern is the key to successfully predicting the intensity distribution of the ghost image. Based on the results, and the properties of the optical system, a novel correction method is applied in the image processing pipeline. The effect of this correction procedure is also demonstrated with the first images of asteroid Vesta.
Optical follow-up of gravitational wave triggers with DECam
NASA Astrophysics Data System (ADS)
Herner, K.; Annis, J.; Berger, E.; Brout, D.; Butler, R.; Chen, H.; Cowperthwaite, P.; Diehl, H.; Doctor, Z.; Drlica-Wagner, A.; Farr, B.; Finley, D.; Frieman, J.; Holz, D.; Kessler, R.; Lin, H.; Marriner, J.; Nielsen, E.; Palmese, A.; Sako, M.; Soares-Santos, M.; Sobreira, F.; Yanny, B.
2017-10-01
Gravitational wave (GW) events have several possible progenitors, including black hole mergers, cosmic string cusps, supernovae, neutron star mergers, and black hole-neutron star mergers. A subset of GW events are expected to produce electromagnetic (EM) emission that, once detected, will provide complementary information about their astrophysical context. To that end, the LIGO-Virgo Collaboration has partnered with other teams to send GW candidate alerts so that searches for their EM counterparts can be pursued. One such partner is the Dark Energy Survey (DES) and Dark Energy Camera (DECam) Gravitational Waves Program (DES-GW). Situated on the 4m Blanco Telescope at the Cerro Tololo Inter-American Observatory in Chile, DECam is an ideal instrument for optical followup observations of GW triggers in the southern sky. The DES-GW program performs subtraction of new search images with respect to preexisting overlapping images to select candidate sources. Due to the short decay timescale of the expected EM counterparts and the need to quickly eliminate survey areas with no counterpart candidates, it is critical to complete the initial analysis of each night’s images within 24 hours. The computational challenges in achieving this goal include maintaining robust I/O pipelines during the processing, being able to quickly acquire template images of new sky regions outside of the typical DES observing regions, and being able to rapidly provision additional batch computing resources with little advance notice. We will discuss the search area determination, imaging pipeline, general data transfer strategy, and methods to quickly increase the available amount of batch computing. We will present results from the first season of observations from September 2015 to January 2016 and conclude by presenting improvements planned for the second observing season.
Optical follow-up of gravitational wave triggers with DECam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herner, K.; Annis, J.; Berger, E.
Gravitational wave (GW) events have several possible progenitors, including black hole mergers, cosmic string cusps, supernovae, neutron star mergers, and black hole-neutron star mergers. A subset of GW events are expected to produce electromagnetic (EM) emission that, once detected, will provide complementary information about their astrophysical context. To that end, the LIGO-Virgo Collaboration has partnered with other teams to send GW candidate alerts so that searches for their EM counterparts can be pursued. One such partner is the Dark Energy Survey (DES) and Dark Energy Camera (DECam) Gravitational Waves Program (DES-GW). Situated on the 4m Blanco Telescope at themore » Cerro Tololo Inter-American Observatory in Chile, DECam is an ideal instrument for optical followup observations of GW triggers in the southern sky. The DES-GW program performs subtraction of new search images with respect to preexisting overlapping images to select candidate sources. Due to the short decay timescale of the expected EM counterparts and the need to quickly eliminate survey areas with no counterpart candidates, it is critical to complete the initial analysis of each night's images within 24 hours. The computational challenges in achieving this goal include maintaining robust I/O pipelines during the processing, being able to quickly acquire template images of new sky regions outside of the typical DES observing regions, and being able to rapidly provision additional batch computing resources with little advance notice. We will discuss the search area determination, imaging pipeline, general data transfer strategy, and methods to quickly increase the available amount of batch computing. We will present results from the first season of observations from September 2015 to January 2016 and conclude by presenting improvements planned for the second observing season.« less
NASA Astrophysics Data System (ADS)
Peltoniemi, Mikko; Aurela, Mika; Böttcher, Kristin; Kolari, Pasi; Loehr, John; Karhu, Jouni; Kubin, Eero; Linkosalmi, Maiju; Melih Tanis, Cemal; Nadir Arslan, Ali
2017-04-01
Ecosystems' potential to provide services, e.g. to sequester carbon, is largely driven by the phenological cycle of vegetation. Timing of phenological events is required for understanding and predicting the influence of climate change on ecosystems and to support various analyses of ecosystem functioning. We established a network of cameras for automated monitoring of phenological activity of vegetation in boreal ecosystems of Finland. Cameras were mounted on 14 sites, each site having 1-3 cameras. In this study, we used cameras at 11 of these sites to investigate how well networked cameras detect the phenological development of birches (Betula spp.) along the latitudinal gradient. Birches are interesting focal species for the analyses as they are common throughout Finland. In our images they often appear in small quantities among the dominant species. Here, we tested whether small scattered birch image elements allow reliable extraction of color indices and changes therein. We compared automatically derived phenological dates from these birch image elements to visually determined dates from the same image time series, and to independent observations recorded in the phenological monitoring network from the same region. Automatically extracted season start dates based on the change of green color fraction in the spring corresponded well with the visually interpreted start of season and field-observed budburst dates. During the declining season, the red color fraction turned out to be superior to green-color-based indices in predicting leaf yellowing and fall. The latitudinal gradients derived using automated phenological date extraction corresponded well with gradients based on phenological field observations from the same region. We conclude that even small and scattered birch image elements allow reliable extraction of key phenological dates for birch species. Devising cameras for species-specific analyses of phenological timing will be useful for explaining variation in time series of satellite-based indices, and it will also benefit models describing ecosystem functioning at species or plant functional type level. With the contribution of the LIFE+ financial instrument of the European Union (LIFE12 ENV/FI/000409 Monimet, http://monimet.fmi.fi)
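A minimal sketch of the color-index extraction described above, assuming the indices are the usual green and red chromatic fractions computed over a birch region of interest; the exact index definitions and smoothing used by the network may differ.

```python
# Sketch: green and red chromatic fractions over a masked birch image element.
import numpy as np

def chromatic_fractions(image_rgb, roi_mask):
    """image_rgb: HxWx3 array; roi_mask: boolean HxW mask of birch pixels."""
    r = image_rgb[..., 0][roi_mask].astype(float)
    g = image_rgb[..., 1][roi_mask].astype(float)
    b = image_rgb[..., 2][roi_mask].astype(float)
    total = r + g + b + 1e-9
    gcc = np.mean(g / total)   # green fraction: rises at budburst in spring
    rcc = np.mean(r / total)   # red fraction: rises at leaf yellowing in autumn
    return gcc, rcc
```

A season start date can then be read off as the day the smoothed green-fraction time series crosses a threshold, e.g. the midpoint between its winter baseline and summer plateau.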
A High Resolution TDI CCD Camera for Microsatellite (HRCM)
NASA Astrophysics Data System (ADS)
Hao, Yuncai; Zheng, You; Dong, Ying; Li, Tao; Yu, Shijie
In recent years, obtaining (1-5) m ground resolution from space using microsatellites has become an important development direction in the commercial remote sensing field. Thanks to progress in new technologies, new materials and new detectors, it is possible to develop a 1 m ground resolution space imaging system weighing less than 20 kg. Based on many years of work on optical system design, a project for a very high resolution TDI CCD camera for use in space is proposed by the authors of this paper. The performance parameters and optical layout of the HRCM are presented, together with a compact optical design and an analysis of its results. A small fold mirror provides a line field of view suitable for the TDI CCD and a short outer size; the length along the largest dimension is about 1/4 of the focal length. Two 4096 x 96 (stages) line TDI CCDs will be used as the focal plane detector. Special optical parts are fixed just before the final image to obtain a ground pixel resolution higher than the Nyquist resolution of the detector using the sub-pixel technique, which is explained in the paper. In the system, SiC will be used as the mirror material, and C-C composite material as the material of the mechanical structure framework. The circular frames of the primary and secondary mirrors will be turned in a single setting on a machine tool in order to ensure the concentricity required for alignment of the system. In general, the HRCM has the following performance parameters: 2.5 m focal length, 2° FOV, 1/11 relative aperture, (0.4-0.8) micrometer spectral range, 10 micron TDI CCD pixel size, weight less than 20 kg, and 1 m ground pixel resolution at a flight orbit 500 km high. The design and analysis of the HRCM presented in the paper indicate that it has many advantages for use in space. Keywords: High resolution TDI CCD; Sub-pixel imaging; Light-weighted optical system; SiC mirror
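A quick check of the quoted geometry; the factor-of-two gain from the sub-pixel technique is interpreted here as half-pixel-offset sampling between the two TDI lines, which is an assumption rather than a statement from the abstract.

```python
# Ground sample distance (GSD) from orbit height, pixel pitch and focal length.
altitude_m = 500e3
pixel_pitch_m = 10e-6
focal_length_m = 2.5

gsd_m = altitude_m * pixel_pitch_m / focal_length_m   # 2.0 m native GSD
subpixel_gsd_m = gsd_m / 2                            # 1.0 m with half-pixel-offset sampling
print(gsd_m, subpixel_gsd_m)
```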
Focal plane alignment and detector characterization for the Subaru prime focus spectrograph
NASA Astrophysics Data System (ADS)
Hart, Murdock; Barkhouser, Robert H.; Carr, Michael; Golebiowski, Mirek; Gunn, James E.; Hope, Stephen C.; Smee, Stephen A.
2014-07-01
We describe the infrastructure being developed to align and characterize the detectors for the Subaru Measurement of Images and Redshifts (SuMIRe) Prime Focus Spectrograph (PFS). PFS will employ four three-channel spectrographs with an operating wavelength range of 3800 Å to 12600 Å. Each spectrograph will be comprised of two visible channels and one near infrared (NIR) channel, where each channel will use a separate Schmidt camera to image the captured spectra onto their respective detectors. In the visible channels, Hamamatsu 2k × 4k CCDs will be mounted in pairs to create a single 4k × 4k detector, while the NIR channel will use a single Teledyne 4k × 4k H4RG HgCdTe device. The fast f/1.1 optics of the Schmidt cameras will give a shallow depth of focus, necessitating an optimization of the focal plane array flatness. The minimum departure from flatness of the focal plane array for the visible channels is set by the CCD flatness, typically 10 μm peak-to-valley. We will adjust the coplanarity for a pair of CCDs such that the flatness of the array is consistent with the flatness of the detectors themselves. To achieve this we will use an optical non-contact measurement system to measure surface flatness and coplanarity at both ambient and operating temperatures, and use shims to adjust the coplanarity of the CCDs. We will characterize the performance of the detectors for PFS consistent with the scientific goals for the project. To this end we will measure the gain, linearity, full well, quantum efficiency (QE), charge diffusion, charge transfer inefficiency (CTI), and noise properties of these devices. We also desire to better understand the non-linearity of the photon transfer curve for the CCDs, and the charge persistence/reciprocity problems of the HgCdTe devices. To enable the metrology and characterization of these detectors we are building two test cryostats nearly identical in design. The first test cryostat will primarily be used for the coplanarity measurements and sub-pixel illumination testing, and the second will be dedicated to performance characterization requiring flat-field illumination. In this paper we will describe the design of the test cryostats. We will also describe the system we have built for measuring focal plane array flatness, and examine the precision and error with which it operates. Finally we will detail the methods by which we plan to characterize the performance of the detectors for PFS, and provide preliminary results.
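A hedged sketch of one of the characterization steps listed above, a standard photon-transfer-curve (PTC) gain measurement; the PFS test procedure itself is not described at this level of detail in the abstract, so the pairing scheme and fit below are generic assumptions.

```python
# Sketch: gain (e-/ADU) from the slope of mean signal vs. variance for flat-field pairs.
import numpy as np

def ptc_gain(flat_pairs, bias_level=0.0):
    """flat_pairs: list of (flat1, flat2) frame pairs at increasing illumination levels."""
    means, variances = [], []
    for f1, f2 in flat_pairs:
        means.append(0.5 * (f1.mean() + f2.mean()) - bias_level)
        # Differencing two flats removes fixed-pattern noise; the variance of the
        # difference is twice the per-frame temporal variance.
        variances.append(np.var(f1.astype(float) - f2.astype(float)) / 2.0)
    # In the shot-noise regime, signal[ADU] = gain[e-/ADU] * variance[ADU^2].
    return np.polyfit(variances, means, 1)[0]
```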
Geometric facial comparisons in speed-check photographs.
Buck, Ursula; Naether, Silvio; Kreutz, Kerstin; Thali, Michael
2011-11-01
In many cases, it is not possible to call the motorists to account for their considerable excess in speeding, because they deny being the driver on the speed-check photograph. An anthropological comparison of facial features using a photo-to-photo comparison can be very difficult depending on the quality of the photographs. One difficulty of that analysis method is that the comparison photographs of the presumed driver are taken with a different camera or camera lens and from a different angle than for the speed-check photo. To take a comparison photograph with exactly the same camera setup is almost impossible. Therefore, only an imprecise comparison of the individual facial features is possible. The geometry and position of each facial feature, for example the distances between the eyes or the positions of the ears, etc., cannot be taken into consideration. We applied a new method using 3D laser scanning, optical surface digitalization, and photogrammetric calculation of the speed-check photo, which enables a geometric comparison. Thus, the influence of the focal length and the distortion of the objective lens are eliminated and the precise position and the viewing direction of the speed-check camera are calculated. Even in cases of low-quality images or when the face of the driver is partly hidden, good results are delivered using this method. This new method, Geometric Comparison, is evaluated and validated in a prepared study which is described in this article.
Estimation of bone mineral content using gamma camera: A real possibility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, L.M.; Hoory, S.; Bandyopadhyay, D.
1985-05-01
Osteopenia and osteoporosis are diseases related to loss of bone mineral. At present, dual photon absorptiometry using a dedicated, specially built scanner along with a very high activity source of Gd-153 is being used as a diagnostic tool for the early detection of bone loss. The present study was undertaken to explore the possibility that gamma cameras, which are widely available in all Nuclear Medicine departments, could be used successfully to evaluate bone mineral content. A Siemens LFOV gamma camera equipped with a converging collimator was used for this purpose. A fixed source (100 mCi) of Gd-153 was placed atmore » the focal point of the collimator. A series of calcium chloride solutions of varying concentrations in plastic vials were placed near the center of the collimator and imaged both in air and water. Both 44 keV and 100 keV images were digitized in 128 x 128 matrices and processed in a CD and A Delta system attached to a VAX 11-750 computer. Uniformity corrections for each field of view were applied and the attenuation coefficients of calcium chloride for both peaks of Gd-153 were evaluated. In addition, due to the high count rate, corrections for the dead time losses were also found to be essential. Excellent concordance between the estimated calcium contents and those actually present was obtained with this technique. In conclusion, use of a gamma camera for the routine evaluation of osteoporosis appears to be highly promising and worth pursuing.« less
KAPAO Prime: Design and Simulation
NASA Astrophysics Data System (ADS)
McGonigle, Lorcan; Choi, P. I.; Severson, S. A.; Spjut, E.
2013-01-01
KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration over UV-NIR wavelengths from Pomona College's telescope atop Table Mountain. We present here the final optical system, KAPAO Prime, designed in Zemax Optical Design Software, which uses custom off-axis paraboloid mirrors (OAPs) to manipulate light appropriately for a Shack-Hartmann wavefront sensor, deformable mirror, and science cameras. KAPAO Prime is characterized by diffraction limited imaging over the full 81" field of view of our optical camera at f/33 as well as over the smaller field of view of our NIR camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of our optical camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity (632.8 nm). Modeling of the temperature deformation of the breadboard in SolidWorks revealed 70 micron contractions along the edges of the board for a decrease of 75°F; when applied to OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nanometers of wavefront error. Flexure modeling of the breadboard due to gravity is on-going. We hope to begin alignment and testing of KAPAO Prime in Q1 2013.
SOFIA tracking image simulation
NASA Astrophysics Data System (ADS)
Taylor, Charles R.; Gross, Michael A. K.
2016-09-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) tracking camera simulator is a component of the Telescope Assembly Simulator (TASim). TASim is a software simulation of the telescope optics, mounting, and control software. Currently in its fifth major version, TASim is relied upon for telescope operator training, mission planning and rehearsal, and mission control and science instrument software development and testing. TASim has recently been extended for hardware-in-the-loop operation in support of telescope and camera hardware development and control and tracking software improvements. All three SOFIA optical tracking cameras are simulated, including the Focal Plane Imager (FPI), which has recently been upgraded to the status of a science instrument that can be used on its own or in parallel with one of the seven infrared science instruments. The simulation includes tracking camera image simulation of starfields based on the UCAC4 catalog at real-time rates of 4-20 frames per second. For its role in training and planning, it is important for the tracker image simulation to provide images with a realistic appearance and response to changes in operating parameters. For its role in tracker software improvements, it is vital to have realistic signal and noise levels and precise star positions. The design of the software simulation for precise subpixel starfield rendering (including radial distortion), realistic point-spread function as a function of focus, tilt, and collimation, and streaking due to telescope motion will be described. The calibration of the simulation for light sensitivity, dark and bias signal, and noise will also be presented.
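A simplified sketch of subpixel starfield rendering of the kind described above, under strong assumptions (isotropic Gaussian PSF, no radial distortion or streaking, an arbitrary photometric zero point); the real simulator's PSF, distortion and calibration models are more involved.

```python
# Sketch: render stars at fractional-pixel positions with a Gaussian PSF,
# then add bias, dark and noise to approximate a tracker frame.
import numpy as np

def render_stars(shape, stars, fwhm_px, zero_point=25.0, exptime_s=0.25):
    """stars: list of (x, y, magnitude) with x, y in fractional pixel coordinates."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    sigma = fwhm_px / 2.3548
    image = np.zeros(shape)
    for x, y, mag in stars:
        flux = exptime_s * 10 ** (-0.4 * (mag - zero_point))    # counts for this star
        psf = np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2))
        image += flux * psf / (2 * np.pi * sigma ** 2)           # unit-normalized PSF
    # Photon noise on signal + dark, plus Gaussian read noise on a bias pedestal.
    return np.random.poisson(image + 2.0) + np.random.normal(100.0, 5.0, shape)
```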
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao; Gu, Guohua; Chen, Qian
2018-02-01
For the uncooled long-wave infrared (LWIR) camera, the infrared (IR) irradiation the focal plane array (FPA) receives is a crucial factor that affects the image quality. Ambient temperature fluctuation as well as system power consumption can result in changes in FPA temperature and radiation characteristics inside the IR camera; these will further degrade the imaging performance. In this paper, we present a novel shutterless non-uniformity correction method to compensate for non-uniformity derived from the variation of ambient temperature. Our method combines a calibration-based method and the properties of a scene-based method to obtain correction parameters at different ambient temperature conditions, so that the IR camera performance is less influenced by ambient temperature fluctuation or system power consumption. The calibration process is carried out in a temperature chamber with slowly changing ambient temperature and a black body as a uniform radiation source. Enough uniform images are captured and the gain coefficients are calculated during this period. Then, in practical application, the offset parameters are calculated via the least squares method based on the gain coefficients, the captured uniform images, and the actual scene. Thus we can get a corrected output through the gain coefficients and offset parameters. The performance of our proposed method is evaluated on realistic IR images and compared with two existing methods. The images used in the experiments were obtained by a 384 × 288 pixel uncooled LWIR camera. Results show that our proposed method can adaptively update correction parameters as the actual target scene changes and is more stable to temperature fluctuation than the other two methods.
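A minimal sketch of the gain/offset arithmetic behind such a correction is shown below, assuming uniform blackbody frames at two flux levels for the gain and a uniform frame for the offset update. The paper updates the offsets from the actual scene via least squares; the function names here are placeholders.

```python
import numpy as np

def calibrate_gain(frames_low, frames_high):
    """Per-pixel gain from stacks of uniform (blackbody) frames at two flux
    levels; the gain maps each pixel's response onto the FPA-mean response."""
    lo = frames_low.mean(axis=0)
    hi = frames_high.mean(axis=0)
    return (hi.mean() - lo.mean()) / (hi - lo)

def update_offset(gain, uniform_frame):
    """Offset chosen so that gain*raw + offset reproduces the frame mean on a
    uniform scene (stand-in for the scene-based least-squares update)."""
    return uniform_frame.mean() - gain * uniform_frame

def correct(gain, offset, raw):
    """Apply the two-parameter non-uniformity correction to a raw frame."""
    return gain * raw + offset
```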
Geiger-mode APD camera system for single-photon 3D LADAR imaging
NASA Astrophysics Data System (ADS)
Entwistle, Mark; Itzler, Mark A.; Chen, Jim; Owens, Mark; Patel, Ketan; Jiang, Xudong; Slomkowski, Krystyna; Rangwala, Sabbir
2012-06-01
The unparalleled sensitivity of 3D LADAR imaging sensors based on single photon detection provides substantial benefits for imaging at long stand-off distances and minimizing laser pulse energy requirements. To obtain 3D LADAR images with single photon sensitivity, we have demonstrated focal plane arrays (FPAs) based on InGaAsP Geiger-mode avalanche photodiodes (GmAPDs) optimized for use at either 1.06 μm or 1.55 μm. These state-of-the-art FPAs exhibit excellent pixel-level performance and the capability for 100% pixel yield on a 32 x 32 format. To realize the full potential of these FPAs, we have recently developed an integrated camera system providing turnkey operation based on FPGA control. This system implementation enables the extremely high frame-rate capability of the GmAPD FPA, and frame rates in excess of 250 kHz (for 0.4 μs range gates) can be accommodated using an industry-standard CameraLink interface in full configuration. Real-time data streaming for continuous acquisition of 2 μs range gate point cloud data with 13-bit time-stamp resolution at 186 kHz frame rates has been established using multiple solid-state storage drives. Range gate durations spanning 4 ns to 10 μs provide broad operational flexibility. The camera also provides real-time signal processing in the form of multi-frame gray-scale contrast images and single-frame time-stamp histograms, and automated bias control has been implemented to maintain a constant photon detection efficiency in the presence of ambient temperature changes. A comprehensive graphical user interface has been developed to provide complete camera control using a simple serial command set, and this command set supports highly flexible end-user customization.
Low-cost panoramic infrared surveillance system
NASA Astrophysics Data System (ADS)
Kecskes, Ian; Engel, Ezra; Wolfe, Christopher M.; Thomson, George
2017-05-01
A nighttime surveillance concept consisting of a single-surface omnidirectional mirror assembly and an uncooled vanadium oxide (VOx) longwave infrared (LWIR) camera has been developed. This configuration provides a continuous field of view spanning 360° in azimuth and more than 110° in elevation. Both the camera and the mirror are readily available, off-the-shelf, inexpensive products. The mirror assembly is marketed for use in the visible spectrum and requires only minor modifications to function in the LWIR spectrum. The compactness and portability of this optical package offer significant advantages over many existing infrared surveillance systems. The developed system was evaluated on its ability to detect moving, human-sized heat sources at ranges between 10 m and 70 m. Raw camera images captured by the system are converted from rectangular coordinates in the camera focal plane to polar coordinates and then unwrapped into the user's azimuth and elevation system. Digital background subtraction and color mapping are applied to the images to increase the user's ability to extract moving items from background clutter. A second optical system consisting of a commercially available 50 mm f/1.2 ATHERM lens and a second LWIR camera is used to examine the details of objects of interest identified using the panoramic imager. A description of the components of the proof of concept is given, followed by a presentation of raw images taken by the panoramic LWIR imager. A description of the method by which these images are analyzed is given, along with a presentation of these results side-by-side with the output of the 50 mm LWIR imager and a panoramic visible-light imager. Finally, a discussion of the concept and its future development is given.
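The coordinate-unwrapping step can be sketched as a resampling from image (x, y) to (elevation, azimuth), assuming for simplicity that the mirror maps radius roughly linearly to elevation; the true mapping depends on the mirror profile, and all parameters below are placeholders.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def unwrap_panorama(img, center, r_min, r_max, n_az=1440, n_el=240):
    """Map an omnidirectional (donut) image onto an azimuth/elevation strip.

    center: (row, col) of the mirror axis in the raw frame.
    r_min, r_max: radii (pixels) bounding the usable annulus."""
    cy, cx = center
    az = np.linspace(0.0, 2.0 * np.pi, n_az, endpoint=False)
    r = np.linspace(r_min, r_max, n_el)          # radius ~ elevation (approx.)
    rr, aa = np.meshgrid(r, az, indexing="ij")
    ys = cy + rr * np.sin(aa)
    xs = cx + rr * np.cos(aa)
    # Bilinear resampling of the raw frame at the polar sample points.
    return map_coordinates(img, [ys, xs], order=1, mode="nearest")
```

Background subtraction then amounts to differencing each unwrapped strip against a running mean of previous strips before color mapping.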
Toward real-time endoscopically-guided robotic navigation based on a 3D virtual surgical field model
NASA Astrophysics Data System (ADS)
Gong, Yuanzheng; Hu, Danying; Hannaford, Blake; Seibel, Eric J.
2015-03-01
The challenge is to accurately guide the surgical tool within the three-dimensional (3D) surgical field for robotically-assisted operations such as tumor margin removal from a debulked brain tumor cavity. The proposed technique is 3D image-guided surgical navigation based on matching intraoperative video frames to a 3D virtual model of the surgical field. A small laser-scanning endoscopic camera was attached to a mock minimally-invasive surgical tool that was manipulated toward a region of interest (residual tumor) within a phantom of a debulked brain tumor. Video frames from the endoscope provided features that were matched to the 3D virtual model, which was reconstructed earlier by raster scanning over the surgical field. Camera pose (position and orientation) is recovered by implementing a constrained bundle adjustment algorithm. Navigational error during the approach to the fluorescence target (residual tumor) is determined by comparing the calculated camera pose to the measured camera pose using a micro-positioning stage. From these preliminary results, the computational efficiency of the algorithm in MATLAB code is near real-time (2.5 sec for each estimation of pose), which can be improved by implementation in C++. Error analysis produced a 3-mm distance error and a 2.5-degree orientation error on average. The sources of these errors are 1) inaccuracy of the 3D virtual model, generated on a calibrated RAVEN robotic platform with stereo tracking; 2) inaccuracy of endoscope intrinsic parameters, such as focal length; and 3) any endoscopic image distortion from scanning irregularities. This work demonstrates the feasibility of micro-camera 3D guidance of a robotic surgical tool.
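The pose-recovery step can be illustrated with a plain perspective-n-point solve from model-to-image correspondences; the paper itself uses a constrained bundle adjustment, so the OpenCV-based snippet below is only a stand-in, and all variable names are placeholders.

```python
import numpy as np
import cv2

def estimate_pose(object_pts, image_pts, K, dist):
    """Recover camera pose from matched 3D model points and 2D image features.

    object_pts: (N, 3) points on the 3D virtual surgical-field model.
    image_pts:  (N, 2) matched endoscope-frame feature locations.
    K, dist:    intrinsic matrix and distortion coefficients from a prior
                endoscope calibration."""
    ok, rvec, tvec = cv2.solvePnP(
        object_pts.astype(np.float64), image_pts.astype(np.float64),
        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)     # rotation matrix of the camera w.r.t. model
    return R, tvec                 # camera orientation and translation
```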
A LiDAR data-based camera self-calibration method
NASA Astrophysics Data System (ADS)
Xu, Lijun; Feng, Jing; Li, Xiaolu; Chen, Jianjun
2018-07-01
To find the intrinsic parameters of a camera, a LiDAR data-based camera self-calibration method is presented here. Parameters are estimated using particle swarm optimization (PSO), which searches for the optimal solution of a multivariate cost function. The main procedure of camera intrinsic parameter estimation has three parts: extraction and fine matching of interest points in the images, establishment of a cost function based on the Kruppa equations, and PSO optimization using LiDAR data as the initialization input. To improve the precision of matching pairs, a new method combining the maximal information coefficient (MIC) and maximum asymmetry score (MAS) was used to remove false matching pairs based on the RANSAC algorithm. Highly precise matching pairs were used to calculate the fundamental matrix, so that the new cost function (deduced from the Kruppa equations in terms of the fundamental matrix) was more accurate. The cost function involving four intrinsic parameters was minimized by PSO for the optimal solution. To avoid the optimization being pushed to a local optimum, LiDAR data were used to determine the scope of initialization, based on the solution to the P4P problem for the camera focal length. To verify the accuracy and robustness of the proposed method, simulations and experiments were implemented and compared with two typical methods. Simulation results indicated that the intrinsic parameters estimated by the proposed method had absolute errors less than 1.0 pixel and relative errors smaller than 0.01%. Based on ground truth obtained from a meter ruler, the distance inversion accuracy in the experiments was smaller than 1.0 cm. Experimental and simulated results demonstrated that the proposed method is highly accurate and robust.
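The optimization step can be sketched with a generic particle swarm over the four intrinsics, where the LiDAR-derived bounds act as the initialization scope and the cost callable stands in for the Kruppa-equation residual described above. Hyperparameters and names are illustrative, not taken from the paper.

```python
import numpy as np

def pso(cost, lo, hi, n_particles=40, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer over box bounds lo..hi.

    cost: callable on a parameter vector, e.g. (fx, fy, cx, cy) -> residual."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))           # positions
    v = np.zeros_like(x)                                  # velocities
    pbest = x.copy()
    pbest_f = np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()                  # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()
```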
Abbott, Andrew P; Azam, Muhammad; Ryder, Karl S; Saleem, Saima
2013-07-16
This study has shown for the first time that digital holographic microscopy (DHM) can be used as a new analytical tool for the analysis of kinetic mechanism and growth during electrolytic deposition processes. Unlike many established electrochemical microscopy methods such as probe microscopy, DHM is both noninvasive and noncontact; its unique holographic imaging allows observations and measurements to be made remotely. DHM also provides interferometric resolution (nanometer vertical scale) with a very short acquisition time. It is a surface metrology technique that enables the retrieval of information about a 3D structure from the phase contrast of a single hologram acquired using a conventional digital camera. Here DHM has been applied to investigate directly the electro-crystallization of a metal on a substrate in real time (in situ) from two deep eutectic solvent (DES) systems based on mixtures of choline chloride and either urea or ethylene glycol. We show, using electrochemical DHM, that the nucleation and growth of silver deposits in these systems are quite distinct and influenced strongly by the hydrogen bond donor of the DES.
Seam tracking with adaptive image capture for fine-tuning of a high power laser welding process
NASA Astrophysics Data System (ADS)
Lahdenoja, Olli; Säntti, Tero; Laiho, Mika; Paasio, Ari; Poikonen, Jonne K.
2015-02-01
This paper presents the development of methods for real-time fine-tuning of a high power laser welding process for thick steel using a compact smart camera system. When performing welding in a butt-joint configuration, the laser beam's location needs to be adjusted exactly according to the seam line in order to allow the injected energy to be absorbed uniformly into both steel sheets. In this paper, on-line extraction of seam parameters is targeted by combining dynamic image intensity compression, image segmentation on a focal-plane processor ASIC, and a Hough transform on an associated FPGA. Additional filtering of Hough line candidates based on temporal windowing is applied to reduce unrealistic frame-to-frame tracking variations. The proposed methods are implemented in Matlab using image data captured with adaptive integration time. The simulations are performed in a hardware-oriented way to allow real-time implementation of the algorithms on the smart camera system.
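A software stand-in for that processing chain (intensity compression, edge segmentation, Hough line extraction, temporal filtering) is sketched below. On the real system the segmentation runs on the focal-plane ASIC and the Hough transform on the FPGA; thresholds and window length here are made up for illustration.

```python
import numpy as np
import cv2
from collections import deque

history = deque(maxlen=7)   # temporal window of accepted seam-line parameters

def track_seam(gray):
    """One frame of a simplified seam tracker: compress intensities, find
    edges, take the first Hough line candidate, and median-filter its
    (rho, theta) parameters over the temporal window."""
    # Logarithmic intensity compression of the 8-bit frame, rescaled to 0..255.
    comp = cv2.convertScaleAbs(np.log1p(gray.astype(np.float32)),
                               alpha=255.0 / np.log(256.0))
    edges = cv2.Canny(comp, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=80)
    if lines is None:
        return None                      # no seam candidate in this frame
    rho, theta = lines[0][0]             # first returned candidate
    history.append((rho, theta))
    return tuple(np.median(np.array(history), axis=0))   # smoothed seam line
```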
Causes of cine image quality deterioration in cardiac catheterization laboratories.
Levin, D C; Dunham, L R; Stueve, R
1983-10-01
Deterioration of cineangiographic image quality can result from malfunctions or technical errors at a number of points along the cine imaging chain: generator and automatic brightness control, x-ray tube, x-ray beam geometry, image intensifier, optics, cine camera, cine film, film processing, and cine projector. Such malfunctions or errors can result in loss of image contrast, loss of spatial resolution, improper control of film optical density (brightness), or some combination thereof. While the electronic and photographic technology involved is complex, physicians who perform cardiac catheterization should be conversant with the problems and what can be done to solve them. Catheterization laboratory personnel have control over a number of factors that directly affect image quality, including radiation dose rate per cine frame, kilovoltage or pulse width (depending on type of automatic brightness control), cine run time, selection of small or large focal spot, proper object-intensifier distance and beam collimation, aperture of the cine camera lens, selection of cine film, processing temperature, processing immersion time, and selection of developer.
Intra-cavity upconversion to 631 nm of images illuminated by an eye-safe ASE source at 1550 nm.
Torregrosa, A J; Maestre, H; Capmany, J
2015-11-15
We report an image wavelength upconversion system. The system mixes an incoming image at around 1550 nm (eye-safe region) illuminated by an amplified spontaneous emission (ASE) fiber source with a Gaussian beam at 1064 nm generated in a continuous-wave diode-pumped Nd³⁺:GdVO₄ laser. Mixing takes place in a periodically poled lithium niobate (PPLN) crystal placed intra-cavity. The upconverted image obtained by sum-frequency mixing falls around the 631 nm red spectral region, well within the spectral response of standard silicon focal plane array bi-dimensional sensors, commonly used in charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) video cameras, and of most image intensifiers. The use of ASE illumination benefits from a noticeable increase in the field of view (FOV) that can be upconverted with regard to using coherent laser illumination. The upconverted power allows us to capture real-time video in a standard nonintensified CCD camera.
Method for acquiring, storing and analyzing crystal images
NASA Technical Reports Server (NTRS)
Gester, Thomas E. (Inventor); Rosenblum, William M. (Inventor); Christopher, Gayle K. (Inventor); Hamrick, David T. (Inventor); Delucas, Lawrence J. (Inventor); Tillotson, Brian (Inventor)
2003-01-01
A system utilizing a digital computer for acquiring, storing and evaluating crystal images. The system includes a video camera (12) which produces a digital output signal representative of a crystal specimen positioned within its focal window (16). The digitized output from the camera (12) is then stored on data storage media (32) together with other parameters inputted by a technician and relevant to the crystal specimen. Preferably, the digitized images are stored on removable media (32) while the parameters for different crystal specimens are maintained in a database (40) with indices to the digitized optical images on the other data storage media (32). Computer software is then utilized to identify not only the presence and number of crystals and the edges of the crystal specimens from the optical image, but to also rate the crystal specimens by various parameters, such as edge straightness, polygon formation, aspect ratio, surface clarity, crystal cracks and other defects or lack thereof, and other parameters relevant to the quality of the crystals.
Front-end multiplexing—applied to SQUID multiplexing: Athena X-IFU and QUBIC experiments
NASA Astrophysics Data System (ADS)
Prele, D.
2015-08-01
As in the digital camera market, where sensor resolution has increased to megapixels, all scientific and high-tech imagers (whatever the wavelength, from radio to the X-ray range) tend to increase their pixel count, so the constraints on front-end signal transmission increase too. An almost unavoidable solution to simplify the integration of large arrays of pixels is front-end multiplexing. Moreover, "simple" and "efficient" techniques allow integration of read-out multiplexers in the focal plane itself. For instance, CCD (Charge Coupled Device) technology has boosted the number of pixels in digital cameras; indeed, it is precisely a planar technology that integrates both the sensors and a front-end multiplexed readout. In this context, front-end multiplexing techniques will be discussed for a better understanding of their advantages and their limits. Finally, the cases of astronomical instruments in the millimeter and X-ray ranges using SQUIDs (Superconducting QUantum Interference Devices) will be described.
Acquisition of 3d Information for Vanished Structure by Using Only AN Ancient Picture
NASA Astrophysics Data System (ADS)
Kunii, Y.; Sakamoto, R.
2016-06-01
In order to acquire 3D information for the reconstruction of a vanished historical structure, we attempted to grasp the 3D shape of such a structure using an ancient picture. Generally, 3D information about a structure is acquired by photogrammetric methods, which require two or more pictures. This paper shows that geometrical information about the structure can be obtained from a single ancient picture and that 3D information can be acquired. The method was applied to an ancient picture of the Old Imperial Theatre. The Old Imperial Theatre in the picture is rendered in two-point perspective. Therefore, estimates of the camera focal length, the distance from the camera to the Old Imperial Theatre, and other parameters were calculated by estimating the field angle, using body height as an index of length together with some geometrical information. Consequently, the 3D coordinates of 120 measurement points on the surface of the Old Imperial Theatre were calculated, and 3DCG modeling of the Old Imperial Theatre was realized.
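For a two-point-perspective photograph, the two horizontal vanishing points constrain the focal length once a principal point is assumed (usually the image centre). The sketch below uses the standard projective relation f² = −(v₁ − p)·(v₂ − p) for orthogonal vanishing directions; it illustrates the geometry rather than the paper's exact procedure.

```python
import numpy as np

def focal_from_vanishing_points(v1, v2, principal_point):
    """Focal length (in pixels) from two vanishing points of orthogonal
    horizontal directions, given an assumed principal point p:
        f^2 = -(v1 - p) . (v2 - p)."""
    p = np.asarray(principal_point, float)
    d = np.dot(np.asarray(v1, float) - p, np.asarray(v2, float) - p)
    if d >= 0:
        raise ValueError("vanishing points inconsistent with orthogonal directions")
    return np.sqrt(-d)

# Example with made-up pixel coordinates: two vanishing points on either side
# of a 4000 x 3000 image whose centre is taken as the principal point.
f_px = focal_from_vanishing_points((-1500.0, 1400.0), (5600.0, 1450.0), (2000.0, 1500.0))
```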
Three dimensional measurement with an electrically tunable focused plenoptic camera
NASA Astrophysics Data System (ADS)
Lei, Yu; Tong, Qing; Xin, Zhaowei; Wei, Dong; Zhang, Xinyu; Liao, Jing; Wang, Haiwei; Xie, Changsheng
2017-03-01
A liquid crystal microlens array (LCMLA) with an arrayed microhole pattern electrode based on nematic liquid crystal materials using a fabrication method including traditional UV-photolithography and wet etching is presented. Its focusing performance is measured under different voltage signals applied between the electrodes of the LCMLA. The experimental outcome shows that the focal length of the LCMLA can be tuned easily by only changing the root mean square value of the voltage signal applied. The developed LCMLA is further integrated with a main lens and an imaging sensor to construct a LCMLA-based focused plenoptic camera (LCFPC) prototype. The focused range of the LCFPC can be shifted electrically along the optical axis of the imaging system. The principles and methods for acquiring several key parameters such as three dimensional (3D) depth, positioning, and motion expression are given. The depth resolution is discussed in detail. Experiments are carried out to obtain the static and dynamic 3D information of objects chosen.
Nanometric depth resolution from multi-focal images in microscopy.
Dalgarno, Heather I C; Dalgarno, Paul A; Dada, Adetunmise C; Towers, Catherine E; Gibson, Gavin J; Parton, Richard M; Davis, Ilan; Warburton, Richard J; Greenaway, Alan H
2011-07-06
We describe a method for tracking the position of small features in three dimensions from images recorded on a standard microscope with an inexpensive attachment between the microscope and the camera. The depth-measurement accuracy of this method is tested experimentally on a wide-field, inverted microscope and is shown to give approximately 8 nm depth resolution, over a specimen depth of approximately 6 µm, when using a 12-bit charge-coupled device (CCD) camera and very bright but unresolved particles. To assess low-flux limitations a theoretical model is used to derive an analytical expression for the minimum variance bound. The approximations used in the analytical treatment are tested using numerical simulations. It is concluded that approximately 14 nm depth resolution is achievable with flux levels available when tracking fluorescent sources in three dimensions in live-cell biology and that the method is suitable for three-dimensional photo-activated localization microscopy resolution. Sub-nanometre resolution could be achieved with photon-counting techniques at high flux levels.
Rapid estimation of frequency response functions by close-range photogrammetry
NASA Technical Reports Server (NTRS)
Tripp, J. S.
1985-01-01
The accuracy of a rapid method which estimates the frequency response function from stereoscopic dynamic data is computed. It is shown that reversal of the order of the operations of coordinate transformation and Fourier transformation, which provides a significant increase in computational speed, introduces error. A portion of the error, proportional to the perturbation components normal to the camera focal planes, cannot be eliminated. The remaining error may be eliminated by proper scaling of frequency data prior to coordinate transformation. Methods are developed for least squares estimation of the full 3x3 frequency response matrix for a three dimensional structure.
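A generic per-frequency least-squares estimate of a 3x3 frequency response matrix from Fourier-transformed excitation and response records can be sketched as below. This is a textbook MIMO estimator under assumed array shapes (at least three independent records per frequency), not necessarily the report's exact formulation.

```python
import numpy as np

def estimate_frf(X, Y):
    """Least-squares 3x3 frequency response matrix per frequency bin.

    X, Y: complex arrays of shape (n_freq, 3, n_records) holding the Fourier
    transforms of the 3-component input and response time histories."""
    n_freq = X.shape[0]
    H = np.empty((n_freq, 3, 3), dtype=complex)
    for k in range(n_freq):
        Xk, Yk = X[k], Y[k]
        # H_k = Y_k X_k^H (X_k X_k^H)^{-1}, the standard least-squares estimate.
        H[k] = Yk @ Xk.conj().T @ np.linalg.inv(Xk @ Xk.conj().T)
    return H
```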
Stokes image reconstruction for two-color microgrid polarization imaging systems.
Lemaster, Daniel A
2011-07-18
The Air Force Research Laboratory has developed a new microgrid polarization imaging system capable of simultaneously reconstructing linear Stokes parameter images in two colors on a single focal plane array. In this paper, an effective method for extracting Stokes images is presented for this type of camera system. It is also shown that correlations between the color bands can be exploited to significantly increase overall spatial resolution. Test data is used to show the advantages of this approach over bilinear interpolation. The bounds (in terms of available reconstruction bandwidth) on image resolution are also provided.
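Once each polarization channel has been interpolated onto the full grid, the linear Stokes images follow from per-pixel arithmetic. The sketch below shows only that last step; the paper's contribution, the reconstruction that exploits correlations between the two color bands, is not reproduced here.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from four interpolated polarization channels
    (analyzer angles 0, 45, 90, 135 degrees), plus degree and angle of
    linear polarization."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)       # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)
    aolp = 0.5 * np.arctan2(s2, s1)
    return s0, s1, s2, dolp, aolp
```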
Electrowetting-Based Variable-Focus Lens for Miniature Systems
NASA Astrophysics Data System (ADS)
Hendriks, B. H. W.; Kuiper, S.; van As, M. A. J.; et al.
The meniscus between two immiscible liquids of different refractive indices can be used as a lens. A change in curvature of this meniscus by electrostatic control of the solid/liquid interfacial tension leads to a change in focal distance. It is demonstrated that two liquids in a tube form a self-centred variable-focus lens. The optical properties of this lens were investigated experimentally. We designed and constructed a miniature camera module based on this variable lens suitable for mobile applications. Furthermore, the liquid lens was applied in a Blu-ray Disc optical recording system to enable dual layer disc reading/writing.
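Two textbook relations illustrate the mechanism (they are quoted as generic physics, not from the paper): the Young-Lippmann equation for the voltage-dependent contact angle across an insulating layer of thickness d and relative permittivity εr, and the power of a liquid-liquid meniscus of curvature radius R between indices n₁ and n₂.

```latex
% Electrowetting changes the contact angle, hence the meniscus radius R,
% hence the focal length of the liquid-liquid lens (illustrative relations).
\cos\theta(V) = \cos\theta_0 + \frac{\varepsilon_0\varepsilon_r}{2\gamma d}\,V^{2},
\qquad
\frac{1}{f} = \frac{n_2 - n_1}{R}.
```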
VizieR Online Data Catalog: GJ 1214b optical and near-IR transit phot. (Angerhausen+, 2017)
NASA Astrophysics Data System (ADS)
Angerhausen, D.; Dreyer, C.; Placek, B.; Csizmadia, Sz.; Eigmueller, P.; Godolt, M.; Kitzmann, D.; Mallonn, M.; Becklin, E.; Collins, P.; Dunham, E. W.; Grenfell, J. L.; Hamilton, R. T.; Kabath, P.; Logsdon, S. E.; Mandell, A.; Mandushev, G.; McElwain, M.; McLean, I. S.; Pfueller, E.; Rauer, H.; Savage, M.; Shenoy, S.; Vacca, W. D.; van Cleve, J. E.; Wiedemann, M.; Wolf, J.
2017-11-01
The joint US-German Cycle 2 Guest Investigator (GI) programme - US proposal: Angerhausen (2013); German proposal: Dreyer (2013) - was performed on SOFIA's flight number 149 on UT February 27, 2014. Observations were conducted simultaneously in two optical HIPO channels: open blue at 0.3-0.6 um and Sloan z' at 0.9 um; and one infrared FLITECAM filter: Paschen-α cont. at 1.9 um. Complementary data were also obtained with the optical focal plane guiding camera FPI+ in the Sloan i' band (0.8 um). (5 data files).
Sung, Yu-Lung; Jeang, Jenn; Lee, Chia-Hsiung; Shih, Wei-Chuan
2015-04-01
We present a highly repeatable, lithography-free and mold-free method for fabricating flexible optical lenses by in situ curing liquid polydimethylsiloxane droplets on a preheated smooth surface with an inkjet printing process. This method enables us to fabricate lenses with a focal length as short as 5.6 mm, which can be controlled by varying the droplet volume and the temperature of the preheated surface. Furthermore, the lens can be attached to a smartphone camera without any accessories and can produce high-resolution (1 μm) images for microscopy applications.
Optical Flow in a Smart Sensor Based on Hybrid Analog-Digital Architecture
Guzmán, Pablo; Díaz, Javier; Agís, Rodrigo; Ros, Eduardo
2010-01-01
The purpose of this study is to develop a motion sensor (delivering optical flow estimations) using a platform that includes the sensor itself, focal plane processing resources, and co-processing resources on a general purpose embedded processor. All this is implemented on a single device as a SoC (System-on-a-Chip). Optical flow is the 2-D projection onto the camera plane of the 3-D motion present in the world scene. This motion representation is widely known and applied in the science community to solve a wide variety of problems. Most applications based on motion estimation require real-time operation; hence, this restriction must be taken into account. In this paper, we show an efficient approach to estimate the motion velocity vectors with an architecture based on a focal plane processor combined on-chip with a 32-bit NIOS II processor. Our approach relies on the simplification of the original optical flow model and its efficient implementation in a platform that combines an analog (focal-plane) and digital (NIOS II) processor. The system is fully functional and is organized in different stages, where the early processing (focal plane) stage mainly pre-processes the input image stream to reduce the computational cost in the post-processing (NIOS II) stage. We present the employed co-design techniques and analyze this novel architecture. We evaluate the system's performance and accuracy with respect to the different approaches described in the literature. We also discuss the advantages of the proposed approach as well as the degree of efficiency which can be obtained from the focal plane processing capabilities of the system. The final outcome is a low cost smart sensor for optical flow computation with real-time performance and reduced power consumption that can be used for very diverse application domains. PMID:22319283
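For reference, a dense optical-flow field can be computed off-line in a few lines using OpenCV's Farneback estimator; this is only a software stand-in, not the simplified model implemented on the focal-plane/NIOS II platform, and inputs are assumed to be 8-bit grayscale frames.

```python
import cv2

def flow_field(prev_gray, next_gray):
    """Dense (dx, dy) optical-flow field between two consecutive 8-bit
    grayscale frames, computed with the Farneback method."""
    return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
```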
Low-cost thermo-electric infrared FPAs and their automotive applications
NASA Astrophysics Data System (ADS)
Hirota, Masaki; Ohta, Yoshimi; Fukuyama, Yasuhiro
2008-04-01
This paper describes three low-cost infrared focal plane arrays (FPAs) having 1,536, 2,304, and 10,800 elements, and experimental vehicle systems. They have low-cost potential because each element consists of p-n polysilicon thermocouples, which allows the use of low-cost ultra-fine microfabrication technology commonly employed in conventional semiconductor manufacturing processes. To increase the responsivity of the FPA, we have developed a precisely patterned Au-black absorber that has high infrared absorptivity of more than 90%. The FPA having 2,304 elements achieved a high responsivity of 4,300 V/W. In order to reduce package cost, we developed a vacuum-sealed package integrated with a molded ZnS lens. The camera aimed at temperature measurement of a passenger cabin is a compact and lightweight device that measures 45 x 45 x 30 mm and weighs 190 g. The camera achieves a noise equivalent temperature difference (NETD) of less than 0.7°C from 0 to 40°C. In this paper, we also present several experimental systems that use infrared cameras. One experimental system is a blind spot pedestrian warning system that employs four infrared cameras. It can detect the infrared radiation emitted from a human body and alerts the driver when a pedestrian is in a blind spot. The system can also prevent the vehicle from moving in the direction of the pedestrian. Another system uses a visible-light camera and infrared sensors to detect the presence of a pedestrian in a rear blind spot and alerts the driver. The third system is a new type of human-machine interface system that enables the driver to control the car's audio system without letting go of the steering wheel. Uncooled infrared cameras are still costly, which limits their automotive use to high-end luxury cars at present. To promote widespread use of IR imaging sensors on vehicles, we need to reduce their cost further.
Broadband Terahertz Computed Tomography Using a 5k-pixel Real-time THz Camera
NASA Astrophysics Data System (ADS)
Trichopoulos, Georgios C.; Sertel, Kubilay
2015-07-01
We present a novel THz computed tomography system that enables fast 3-dimensional imaging and spectroscopy in the 0.6-1.2 THz band. The system is based on a new real-time broadband THz camera that enables rapid acquisition of multiple cross-sectional images required in computed tomography. Tomographic reconstruction is achieved using digital images from the densely-packed large-format (80×64) focal plane array sensor located behind a hyper-hemispherical silicon lens. Each pixel of the sensor array consists of an 85 μm × 92 μm lithographically fabricated wideband dual-slot antenna, monolithically integrated with an ultra-fast diode tuned to operate in the 0.6-1.2 THz regime. Concurrently, optimum impedance matching was implemented for maximum pixel sensitivity, enabling 5 frames-per-second image acquisition speed. As such, the THz computed tomography system generates diffraction-limited resolution cross-section images as well as the three-dimensional models of various opaque and partially transparent objects. As an example, an over-the-counter vitamin supplement pill is imaged and its material composition is reconstructed. The new THz camera enables, for the first time, a practical application of THz computed tomography for non-destructive evaluation and biomedical imaging.
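The tomographic reconstruction stage amounts to filtered back-projection of the projections gathered by the focal-plane array as the object is viewed from many angles. A sketch using scikit-image is shown below; array names and shapes are assumptions, not the authors' code.

```python
import numpy as np
from skimage.transform import iradon

def reconstruct_slice(sinogram, angles_deg):
    """Filtered back-projection of one cross-sectional slice.

    sinogram: (n_detector_pixels, n_angles) array of line projections taken
    from one row of the THz focal-plane array as the object rotates.
    angles_deg: projection angles in degrees, length n_angles."""
    return iradon(sinogram, theta=angles_deg, circle=True)

# Example call with a random placeholder sinogram over 180 one-degree steps.
angles = np.arange(180.0)
slice_img = reconstruct_slice(np.random.rand(64, angles.size), angles)
```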
Use of Vertical Aerial Images for Semi-Oblique Mapping
NASA Astrophysics Data System (ADS)
Poli, D.; Moe, K.; Legat, K.; Toschi, I.; Lago, F.; Remondino, F.
2017-05-01
The paper proposes a methodology for the use of the oblique sections of images from large-format photogrammetric cameras, exploiting the effect of the central perspective geometry in the lateral parts of the nadir images ("semi-oblique" images). The point of origin of the investigation was the execution of a photogrammetric flight over Norcia (Italy), which was seriously damaged by the earthquake of 30/10/2016. Contrary to the original plan of oblique acquisitions, the flight was executed on 15/11/2017 using an UltraCam Eagle camera with a focal length of 80 mm, combining two flight plans rotated by 90° ("crisscross" flight). The images (GSD 5 cm) were used to extract a 2.5D DSM sampled to an XY-grid size of 2 GSD, a 3D point cloud with a mean spatial resolution of 1 GSD, and a 3D mesh model at a resolution of 10 cm of the historic centre of Norcia for a quantitative assessment of the damage. From the acquired nadir images, the "semi-oblique" images (forward, backward, left and right views) could be extracted and processed in a modified version of the GEOBLY software for measurement and restitution purposes. The potential of such semi-oblique image acquisitions from nadir-view cameras is shown and discussed hereafter.
Gamma Ray Burst Optical Counterpart Search Experiment (GROCSE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, H.S.; Ables, E.; Bionta, R.M.
GROCSE (Gamma-Ray Optical Counterpart Search Experiments) is a system of automated telescopes that search for simultaneous optical activity associated with gamma ray bursts in response to real-time burst notifications provided by the BATSE/BACODINE network. The first generation system, GROCSE 1, is sensitive down to Mv ~ 8.5 and requires an average of 12 seconds to obtain the first images of the gamma ray burst error box defined by the BACODINE trigger. The collaboration is now constructing a second generation system which has a 4 second slewing time and can reach Mv ~ 14 with a 5 second exposure. GROCSE 2 consists of 4 cameras on a single mount. Each camera views the night sky through a commercial Canon lens (f/1.8, focal length 200 mm) and utilizes a 2K x 2K Loral CCD. Lightweight and low-noise custom readout electronics were designed and fabricated for these CCDs. The total field of view of the 4 cameras is 17.6 x 17.6°. GROCSE II will be operated by the end of 1995. In this paper, the authors present an overview of the GROCSE system and the results of measurements with a GROCSE 2 prototype unit.
Model of an optical system's influence on sensitivity of microbolometric focal plane array
NASA Astrophysics Data System (ADS)
Gogler, Sławomir; Bieszczad, Grzegorz; Zarzycka, Alicja; Szymańska, Magdalena; Sosnowski, Tomasz
2012-10-01
Thermal imagers and the infrared array sensors used in them are subject to a calibration procedure and evaluation of their voltage sensitivity to incident radiation during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represents temperature variations across the scene. The detectors used in a thermal camera are illuminated by infrared radiation transmitted through a specialized optical system. Each optical system influences the irradiation distribution across the sensor array. In this article, a model describing the irradiation distribution across an array sensor working with the optical system used in the calibration set-up is proposed. The model takes the optical and geometrical properties of the array set-up into account. By means of Monte Carlo simulation, a large number of rays were traced to the sensor plane, which allowed the irradiation distribution across the image plane to be determined for different aperture-limiting configurations. The simulated results were compared with the proposed analytical expression. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
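A toy version of such a Monte Carlo estimate is sketched below: random rays launched from sensor pixels are accepted if they pass a single circular aperture, and the accepted rays are histogrammed into a relative irradiance map. Geometry and numbers are arbitrary placeholders, far simpler than the modelled optical system.

```python
import numpy as np

def irradiance_map(n_rays=2_000_000, ap_radius=10.0, ap_dist=50.0,
                   sensor_half=8.0, n_bins=64, seed=0):
    """Relative irradiance across a square sensor behind a circular aperture,
    estimated by reverse Monte Carlo ray tracing (arbitrary units)."""
    rng = np.random.default_rng(seed)
    # Ray start points uniformly over the sensor plane.
    x = rng.uniform(-sensor_half, sensor_half, n_rays)
    y = rng.uniform(-sensor_half, sensor_half, n_rays)
    # Ray directions drawn uniformly over a cone toward the aperture plane
    # (u, v are tangents of the ray angles).
    u = rng.uniform(-0.4, 0.4, n_rays)
    v = rng.uniform(-0.4, 0.4, n_rays)
    xa, ya = x + ap_dist * u, y + ap_dist * v      # intersection with aperture plane
    hit = xa**2 + ya**2 <= ap_radius**2
    H, _, _ = np.histogram2d(x[hit], y[hit], bins=n_bins,
                             range=[[-sensor_half, sensor_half]] * 2)
    return H / H.max()                             # normalize to peak irradiance
```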
NASA Astrophysics Data System (ADS)
Tchernykh, Valerij; Dyblenko, Sergej; Janschek, Klaus; Seifart, Klaus; Harnisch, Bernd
2005-08-01
The cameras commonly used for Earth observation from satellites require high attitude stability during the image acquisition. For some types of cameras (high-resolution "pushbroom" scanners in particular), instantaneous attitude changes of even less than one arcsecond result in significant image distortion and blurring. Especially problematic are the effects of high-frequency attitude variations originating from micro-shocks and vibrations produced by the momentum and reaction wheels, mechanically activated coolers, and steering and deployment mechanisms on board. The resulting high attitude-stability requirements for Earth-observation satellites are one of the main reasons for their complexity and high cost. The novel SmartScan imaging concept, based on an opto-electronic system with no moving parts, offers the promise of high-quality imaging with only moderate satellite attitude stability. SmartScan uses real-time recording of the actual image motion in the focal plane of the camera during frame acquisition to correct the distortions in the image. Exceptional real-time performances with subpixel-accuracy image-motion measurement are provided by an innovative high-speed onboard opto-electronic correlation processor. SmartScan will therefore allow pushbroom scanners to be used for hyper-spectral imaging from satellites and other space platforms not primarily intended for imaging missions, such as micro- and nano-satellites with simplified attitude control, low-orbiting communications satellites, and manned space stations.
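A toy software equivalent of the in-focal-plane image-motion measurement is FFT-based phase correlation between consecutive sub-frames; the sketch below recovers only the integer-pixel shift, whereas the onboard opto-electronic correlator works in real time at sub-pixel accuracy.

```python
import numpy as np

def image_shift(frame_a, frame_b):
    """Integer-pixel (dy, dx) shift of frame_b relative to frame_a, from the
    peak of the phase-correlation surface (sign convention depends on which
    frame is taken as reference)."""
    F = np.fft.fft2(frame_a) * np.conj(np.fft.fft2(frame_b))
    corr = np.fft.ifft2(F / np.maximum(np.abs(F), 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap the cyclic peak position into signed shifts.
    if dy > corr.shape[0] // 2:
        dy -= corr.shape[0]
    if dx > corr.shape[1] // 2:
        dx -= corr.shape[1]
    return dy, dx
```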
Image interpolation and denoising for division of focal plane sensors using Gaussian processes.
Gilboa, Elad; Cunningham, John P; Nehorai, Arye; Gruev, Viktor
2014-06-16
Image interpolation and denoising are important techniques in image processing. These methods are inherent to digital image acquisition, as most digital cameras are composed of a 2D grid of heterogeneous imaging sensors. Current polarization imagers employ four different pixelated polarization filters, commonly referred to as division of focal plane polarization sensors. The sensors capture only partial information of the true scene, leading to a loss of spatial resolution as well as inaccuracy of the captured polarization information. Interpolation is a standard technique to recover the missing information and increase the accuracy of the captured polarization information. Here we focus specifically on Gaussian process regression as a way to perform a statistical image interpolation, where estimates of sensor noise are used to improve the accuracy of the estimated pixel information. We further exploit the inherent grid structure of this data to create a fast exact algorithm that operates in O(N^(3/2)) (vs. the naive O(N^3)), thus making the Gaussian process method computationally tractable for image data. This modeling advance and the enabling computational advance combine to produce significant improvements over previously published interpolation methods for polarimeters, which is most pronounced in cases of low signal-to-noise ratio (SNR). We provide the comprehensive mathematical model as well as experimental results of the GP interpolation performance for division of focal plane polarimeters.
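A small-scale illustration of GP interpolation of a single polarization channel with an explicit sensor-noise term is shown below, using scikit-learn; this generic solver costs O(N³) and is only practical on small tiles, whereas the paper's grid-structured algorithm achieves the O(N^(3/2)) scaling quoted above. Kernel choices and names are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def gp_interpolate(coords, values, query_coords, noise_std=0.01):
    """Interpolate one polarization channel at the pixel locations it does not
    cover.

    coords:       (n, 2) pixel coordinates where this channel was measured.
    values:       (n,)   measured intensities at those pixels.
    query_coords: (m, 2) pixel coordinates to fill in."""
    kernel = RBF(length_scale=2.0) + WhiteKernel(noise_level=noise_std**2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(coords, values)
    return gp.predict(query_coords)
```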
Two-Arm Flexible Thermal Strap
NASA Technical Reports Server (NTRS)
Urquiza, Eugenio; Vasquez, Cristal; Rodriquez, Jose I.; Leland, Robert S.; VanGorp, Byron E.
2011-01-01
Airborne and space infrared cameras require highly flexible direct cooling of mechanically-sensitive focal planes. A thermal electric cooler is often used together with a thermal strap as a means to transport the thermal energy removed from the infrared detector. While effective, traditional thermal straps are only truly flexible in one direction. In this scenario, a cooling solution must be highly conductive, lightweight, able to operate within a vacuum, and highly flexible in all axes to accommodate adjustment of the focal plane while transmitting minimal force. A two-armed thermal strap using three end pieces and a twisted section offers enhanced elastic movement, significantly beyond the motion permitted by existing thermal straps. This design innovation allows for large elastic displacements in two planes and moderate elasticity in the third plane. By contrast, a more conventional strap of the same conductance offers less flexibility and asymmetrical elasticity. The two-arm configuration reduces the bending moment of inertia for a given conductance by creating the same cross-sectional area for thermal conduction, but with only half the thickness. This reduction in the thickness has a significant effect on the flexibility since there is a cubic relationship between the thickness and the rigidity or bending moment of inertia. The novelty of the technology lies in the mechanical design and manufacturing of the thermal strap. The enhanced flexibility will facilitate cooling of mechanically sensitive components (example: optical focal planes). This development is a significant contribution to the thermal cooling of optics. It is known to be especially important in the thermal control of optical focal planes due to their highly sensitive alignment requirements and mechanical sensitivity; however, many other applications exist including the cooling of gimbal-mounted components.
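The cubic dependence mentioned above is the standard rectangular-section bending relation: splitting one strap of thickness t into two arms of thickness t/2 preserves the conducting cross-section (hence the conductance) but reduces the bending stiffness by a factor of four. The formula below is an illustrative textbook relation, not taken from the source.

```latex
% Rectangular-section area moment of inertia; one strap of width w and
% thickness t versus two arms of thickness t/2 with the same total
% cross-sectional (conducting) area:
I_{\mathrm{single}} = \frac{w\,t^{3}}{12},
\qquad
I_{\mathrm{two\ arms}} = 2\cdot\frac{w\,(t/2)^{3}}{12} = \frac{1}{4}\,I_{\mathrm{single}}.
```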
Newly discovered globular clusters in NGC 147 and NGC 185 from PAndAS
NASA Astrophysics Data System (ADS)
Veljanoski, J.; Ferguson, A. M. N.; Huxor, A. P.; Mackey, A. D.; Fishlock, C. K.; Irwin, M. J.; Tanvir, N.; Chapman, S. C.; Ibata, R. A.; Lewis, G. F.; McConnachie, A.
2013-11-01
Using data from the Pan-Andromeda Archaeological Survey (PAndAS), we have discovered four new globular clusters (GCs) associated with the M31 dwarf elliptical (dE) satellites NGC 147 and NGC 185. Three of these are associated with NGC 147 and one with NGC 185. All lie beyond the main optical boundaries of the galaxies and are the most remote clusters yet known in these systems. Radial velocities derived from low-resolution spectra are used to argue that the GCs are bound to the dwarfs and are not part of the M31 halo population. Combining PAndAS with United Kingdom Infrared Telescope (UKIRT)/WFCAM (Wide-Field Camera) data, we present the first homogeneous optical and near-IR photometry for the entire GC systems of these dEs. Colour-colour plots and published colour-metallicity relations are employed to constrain GC ages and metallicities. It is demonstrated that the clusters are in general metal poor ([Fe/H] < -1.25 dex), while the ages are more difficult to constrain. The mean (V - I)0 colours of the two GC systems are very similar to those of the GC systems of dEs in the Virgo and Fornax clusters, as well as the extended halo GC population in M31. The new clusters bring the GC-specific frequency (SN) to ˜9 in NGC 147 and ˜5 in NGC 185, consistent with values found for dEs of similar luminosity residing in a range of environments.
NASA Astrophysics Data System (ADS)
Hoefflinger, Bernd
Silicon charge-coupled-device (CCD) imagers have been, and remain, a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access and global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.
Alignment and Performance of the Infrared Multi-Object Spectrometer
NASA Technical Reports Server (NTRS)
Connelly, Joseph A.; Ohl, Raymond G.; Mentzell, J. Eric; Madison, Timothy J.; Hylan, Jason E.; Mink, Ronald G.; Saha, Timo T.; Tveekrem, June L.; Sparr, Leroy M.; Chambers, V. John;
2004-01-01
The Infrared Multi-Object Spectrometer (IRMOS) is a principal investigator class instrument for the Kitt Peak National Observatory 4 and 2.1 meter telescopes. IRMOS is a near-IR (0.8 - 2.5 micron) spectrometer with low- to mid-resolving power (R = 300 - 3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arc-min field of view (4 m telescope) using a commercial Micro Electro-Mechanical Systems (MEMS) micro-mirror array (MMA) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems. The focal reducer images the focal plane of the telescope onto the MMA field stop, and the spectrograph images the MMA onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and ambient imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve as a qualitative alignment guide, and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides verification of the optomechanical alignment method and a measurement of near-angle scattered light due to mirror small-scale surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides a spectral line at 546.1 nanometers, a blackbody source provides a line at 1550 nanometers, and a CCD camera and IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard and instrument level test results validate this prediction. We conclude with an instrument performance prediction for cryogenic operation and first light in late 2003.
An overview of instrumentation for the Large Binocular Telescope
NASA Astrophysics Data System (ADS)
Wagner, R. Mark
2010-07-01
An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27' × 27') mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6' field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4' × 4') imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction-limited (FOV: 0.5' × 0.5') imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench near-infrared beam combiner utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC-NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra-high-resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support. Over the past two years the LBC and the first LUCIFER instrument have been brought into routine scientific operation and MODS1 commissioning is set to begin in the fall of 2010.
The (In)Effectiveness of Simulated Blur for Depth Perception in Naturalistic Images.
Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J
2015-01-01
We examine depth perception in images of real scenes with naturalistic variation in pictorial depth cues, simulated dioptric blur and binocular disparity. Light field photographs of natural scenes were taken with a Lytro plenoptic camera that simultaneously captures images at up to 12 focal planes. When accommodation at any given plane was simulated, the corresponding defocus blur at other depth planes was extracted from the stack of focal plane images. Depth information from pictorial cues, relative blur and stereoscopic disparity was separately introduced into the images. In 2AFC tasks, observers were required to indicate which of two patches extracted from these images was farther. Depth discrimination sensitivity was highest when geometric and stereoscopic disparity cues were both present. Blur cues impaired sensitivity by reducing the contrast of geometric information at high spatial frequencies. While simulated generic blur may not assist depth perception, it remains possible that dioptric blur from the optics of an observer's own eyes may be used to recover depth information on an individual basis. The implications of our findings for virtual reality rendering technology are discussed.
Extreme depth-of-field intraocular lenses
NASA Astrophysics Data System (ADS)
Baker, Kenneth M.
1996-05-01
A new technology brings the full-aperture single-vision pseudophakic eye's effective hyperfocal distance within the half-meter range. A modulated-index IOL containing a subsurface zeroth-order coherent microlenticular mosaic, defined by an index gradient, adds a normalizing function to the vergences or parallactic angles of incoming light rays subtended from field object points and redirects them, in the case of near-field images, to those of far-field images. Along with a scalar reduction of the IOL's linear focal range, this results in an extreme depth of field with a narrow depth of focus and avoids the focal split-up, halo, and inherent reduction in contrast of multifocal IOLs. A high microlenticular spatial frequency, while still retaining an anisotropic medium, results in nearly total zeroth-order propagation throughout the visible spectrum. The curved lens surfaces still provide most of the refractive power of the IOL, and the unique holographic fabrication technology is especially suitable not only for IOLs but also for contact lenses, artificial corneas, and miniature lens elements for cameras and other optical devices.
NASA Astrophysics Data System (ADS)
Fukuda, Takahito; Shinomura, Masato; Xia, Peng; Awatsuji, Yasuhiro; Nishio, Kenzo; Matoba, Osamu
2017-04-01
We constructed a parallel phase-shifting digital holographic microscopy (PPSDHM) system using an inverted magnification optical system and succeeded in three-dimensional (3D) motion-picture imaging of the 3D displacement of a microscopic object. In the PPSDHM system, the inverted and afocal magnification optical system consisted of a microscope objective (16.56 mm focal length and 0.25 numerical aperture) and a convex lens (300 mm focal length and 82 mm aperture diameter). A polarization-imaging camera was used to record multiple phase-shifted holograms with a single-shot exposure. We recorded an alum crystal sinking in an aqueous solution of alum with the constructed PPSDHM system at 60 frames/s for about 20 s and reconstructed a high-quality 3D motion-picture image of the crystal. We then calculated the displacements of the crystal from the in-focus plane positions and the magnification of the optical system, and obtained the 3D trajectory of the crystal from these values.
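A minimal numerical sketch of the reconstruction chain assumed by such a system is shown below: a standard four-step phase-shifting combination of the four polarization channels (assumed already interpolated to a common grid, with unit reference amplitude), followed by angular-spectrum propagation to refocus at depth z. This is a generic textbook formulation, not the authors' exact code.

```python
import numpy as np

def complex_object_wave(i0, i90, i180, i270):
    """Four-step phase-shifting reconstruction of the complex object wave from
    holograms with reference phase shifts 0, pi/2, pi, 3*pi/2."""
    return 0.25 * ((i0 - i180) + 1j * (i90 - i270))

def angular_spectrum(u0, wavelength, dx, z):
    """Propagate the reconstructed field a distance z (same units as dx and
    wavelength) to refocus numerically; evanescent components are suppressed."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, dx)
    fy = np.fft.fftfreq(ny, dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.where(arg > 0,
                 np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(u0) * H)
```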
NASA Astrophysics Data System (ADS)
Kim, Sang-Youn; Yeo, Myoung; Shin, Eun-Jae; Park, Won-Hyeong; Jang, Jong-Seok; Nam, Byeong-Uk; Bae, Jin Woo
2015-11-01
In this paper, we propose a variable focus microlens module based on a transparent, electroactive, and non-ionic PVC/DBA gel. A non-ionic PVC/DBA (nPVC) gel on an ITO glass was confined beneath a rigid annular electrode, and applied pressure squeezed a bulge of the nPVC gel into the annular electrode, resulting in a hemispherical plano-convex nPVC gel microlens. The proposed nPVC gel microlens was analyzed and optimized. When voltage is applied to the circular perimeter (the annular electrode) of this fabricated microlens, electrically induced creep deformation of the nPVC gel occurs, changing its optical focal length. The focal length remarkably increases from 3.8 mm up to 14.3 mm with increasing applied voltages from 300 V to 800 V. Due to its compact, transparent, and electroactive characteristics, the proposed nPVC gel microlens can be easily inserted into small consumer electronic devices, such as digital cameras, camcorders, cell phones, and other portable optical devices.