Sample records for common aperture camera

  1. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  2. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  3. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many kinds of equipment. If such a system is tested with a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed. The large-aperture test system for infrared and visible CCD cameras shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces both cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design reduces the shift of the collimator's focal position as the environmental temperature changes, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, giving it good market prospects.
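
    Multiple-frame averaging works because zero-mean random noise shrinks by roughly the square root of the number of frames averaged, while the static test-target signal is preserved. A minimal NumPy sketch of the idea (the frame size and Gaussian noise model are illustrative assumptions, not details from the paper):

```python
import numpy as np

def average_frames(frames):
    """Average a stack of frames to suppress zero-mean random noise.

    Averaging N frames reduces the noise standard deviation by ~sqrt(N),
    while the static target signal is preserved.
    """
    stack = np.asarray(frames, dtype=np.float64)
    return stack.mean(axis=0)

# Illustrative use: a flat target corrupted by Gaussian read noise.
rng = np.random.default_rng(0)
target = np.full((64, 64), 100.0)
frames = [target + rng.normal(0.0, 5.0, target.shape) for _ in range(25)]
averaged = average_frames(frames)
```

    With 25 frames, the residual noise should drop by about a factor of five relative to a single frame.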

  4. 4D Light Field Imaging System Using Programmable Aperture

    NASA Technical Reports Server (NTRS)

    Bae, Youngsam

    2012-01-01

    Complete depth information can be extracted by analyzing all angles of light rays emanating from a source. However, this angular information is lost in a typical 2D imaging system. In order to record it, a standard stereo imaging system uses two cameras to obtain information from two view angles; sometimes more cameras are used to capture more angles. A 4D light field imaging technique, however, can achieve this multiple-camera effect with a single-lens camera. Two methods are available: one uses a microlens array, and the other a moving aperture. The moving-aperture method can obtain more complete stereo information. The existing literature suggests a modified liquid crystal (LC) panel, similar to ones commonly used in the display industry, to achieve a moving aperture. However, LC panels cannot withstand harsh environments and are not qualified for spaceflight. In this regard, different hardware is proposed for the moving aperture: a digital micromirror device (DMD) will replace the liquid crystal, and it can be qualified for harsh environments for 4D light field imaging. This will enable an imager to record near-complete stereo information. The approach to building a proof-of-concept is to use existing, or slightly modified, off-the-shelf components. An SLR (single-lens reflex) lens system, which typically has a large aperture for fast imaging, will be modified so that the DMD can be integrated. The shape of the aperture will be programmed for single-viewpoint imaging, multiple-viewpoint imaging, and coded aperture imaging. The novelty lies in using a DMD instead of an LC panel to move the apertures for 4D light field imaging. The DMD uses reflecting mirrors, so any loss of light transmission (which would be expected from an LC panel) will be minimal. Also, the MEMS-based DMD can withstand greater temperature and pressure fluctuations than an LC panel can. Robots need near-complete stereo images for autonomous navigation, manipulation, and depth estimation; the imaging system can provide that visual feedback.

  5. Common aperture multispectral spotter camera: Spectro XR

    NASA Astrophysics Data System (ADS)

    Petrushevsky, Vladimir; Freiman, Dov; Diamant, Idan; Giladi, Shira; Leibovich, Maor

    2017-10-01

    The Spectro XR™ is an advanced color/NIR/SWIR/MWIR 16'' payload recently developed by Elbit Systems / ELOP. The payload's primary sensor is a spotter camera with a common 7'' aperture. The sensor suite also includes an MWIR zoom, an EO zoom, a laser designator or rangefinder, a laser pointer/illuminator, and a laser spot tracker. A rigid structure, vibration damping, and 4-axis gimbals enable a high level of line-of-sight stabilization. The payload's feature list includes a multi-target video tracker, precise boresight, strap-on IMU, embedded moving map, geodetic calculations suite, and image fusion. The paper describes the main technical characteristics of the spotter camera. A visible-quality, all-metal front catadioptric telescope maintains optical performance over a wide range of environmental conditions. High-efficiency coatings separate the incoming light into EO, SWIR, and MWIR band channels. Both the EO and SWIR bands have dual FOV and 3 spectral filters each. Several focal plane array formats are supported. The common aperture design facilitates superior DRI performance in EO and SWIR compared with conventionally configured payloads. Special spectral calibration and color correction extend the effective range of color imaging. An advanced CMOS FPA and the low F-number of the optics facilitate low-light performance. The SWIR band provides further atmospheric penetration, as well as see-spot capability at especially long ranges thanks to asynchronous pulse detection. The MWIR band has good sharpness across the entire field of view and (with a full-HD FPA) delivers an amount of detail far exceeding that of VGA-equipped FLIRs. The Spectro XR offers a level of performance typically associated with larger and heavier payloads.

  6. Coded aperture solution for improving the performance of traffic enforcement cameras

    NASA Astrophysics Data System (ADS)

    Masoudifar, Mina; Pourreza, Hamid Reza

    2016-10-01

    A coded aperture camera is proposed for automatic license plate recognition (ALPR) systems. It captures images using a noncircular aperture. The aperture pattern is designed for the rapid acquisition of high-resolution images while preserving high spatial frequencies of defocused regions. It is obtained by minimizing an objective function, which computes the expected value of perceptual deblurring error. The imaging conditions and camera sensor specifications are also considered in the proposed function. The designed aperture improves the depth of field (DoF) and subsequently ALPR performance. The captured images can be directly analyzed by the ALPR software up to a specific depth, which is 13 m in our case, though it is 11 m for the circular aperture. Moreover, since the deblurring results of images captured by our aperture yield fewer artifacts than those captured by the circular aperture, images can be first deblurred and then analyzed by the ALPR software. In this way, the DoF and recognition rate can be improved at the same time. Our case study shows that the proposed camera can improve the DoF up to 17 m while it is limited to 11 m in the conventional aperture.
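
    The benefit of a well-designed aperture pattern shows up at the deblurring stage: a blur kernel without deep spectral nulls can be inverted stably, leaving fewer artifacts. The paper's objective function and aperture pattern are not reproduced here; the sketch below only illustrates the standard Wiener deconvolution step such a pipeline would apply to a defocused image, in 1D with an assumed box kernel and noise level:

```python
import numpy as np

def wiener_deconvolve(blurred, kernel, snr=100.0):
    """Deblur a 1D signal with a known blur kernel via Wiener filtering.

    The 1/snr regularization term keeps the inverse filter stable at
    frequencies where the kernel's response is small.
    """
    n = len(blurred)
    K = np.fft.fft(kernel, n)
    B = np.fft.fft(blurred)
    wiener = np.conj(K) / (np.abs(K) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(wiener * B))

# Illustrative use: blur a step edge with a small box kernel, then restore it.
signal = np.r_[np.zeros(32), np.ones(32)]
kernel = np.ones(5) / 5.0
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel, len(signal))))
restored = wiener_deconvolve(blurred, kernel, snr=1e4)
```

    A coded (non-box) kernel would make the `np.abs(K)` term better conditioned, which is exactly why the designed aperture deblurs with fewer artifacts than the circular one.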

  7. Method for measuring the focal spot size of an x-ray tube using a coded aperture mask and a digital detector.

    PubMed

    Russo, Paolo; Mettivier, Giovanni

    2011-04-01

    The goal of this study is to evaluate a new method based on a coded aperture mask combined with a digital x-ray imaging detector for measurements of the focal spot sizes of diagnostic x-ray tubes. Common techniques for focal spot size measurements employ a pinhole camera, a slit camera, or a star resolution pattern. The coded aperture mask is a radiation collimator consisting of a large number of apertures disposed on a predetermined grid in an array, through which the radiation source is imaged onto a digital x-ray detector. The coded mask camera method allows one to obtain a one-shot, accurate, and direct measurement of the two dimensions of the focal spot (like a pinhole camera) but at a low tube loading (like a slit camera). The large number of small apertures in the coded mask operates as a "multipinhole" with greater efficiency than a single pinhole while keeping the resolution of a single pinhole. X-ray images result from the multiplexed output on the detector image plane of such a multiple aperture array, and the image of the source is digitally reconstructed with a deconvolution algorithm. Images of the focal spot of a laboratory x-ray tube (W anode; 35-80 kVp; focal spot size of 0.04 mm) were acquired at different geometrical magnifications with two different types of digital detector (a photon counting hybrid silicon pixel detector with 0.055 mm pitch and a flat panel CMOS digital detector with 0.05 mm pitch) using a high resolution coded mask (type no-two-holes-touching modified uniformly redundant array) with 480 apertures of 0.07 mm, designed for imaging at energies below 35 keV. Measurements with a slit camera were performed for comparison. A test with a pinhole camera and with the coded mask on a computed radiography mammography unit with a 0.3 mm focal spot was also carried out.
    The full width at half maximum focal spot sizes were obtained from the line profiles of the decoded images, showing a focal spot of 0.120 mm x 0.105 mm at 35 kVp and M = 6.1, with a detector entrance exposure as low as 1.82 mR (0.125 mA s tube load). The slit camera indicated a focal spot of 0.112 mm x 0.104 mm at 35 kVp and M = 3.15, with an exposure at the detector of 72 mR. Focal spot measurements with the coded mask could be performed up to 80 kVp. Tolerance to angular misalignment with the reference beam of up to 7 degrees in in-plane rotations and 1 degree in out-of-plane rotations was observed. The axial distance of the focal spot from the coded mask could also be determined. It is possible to determine the beam intensity via measurement of the intensity of the decoded image of the focal spot and a calibration procedure. Coded aperture masks coupled to a digital area detector produce precise determinations of the focal spot of an x-ray tube with reduced tube loading and measurement time, coupled with a large tolerance in the alignment of the mask.
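
    The focal spot sizes quoted above are full widths at half maximum (FWHM) of line profiles through the decoded image. A small sketch of that final measurement step on a synthetic Gaussian profile (the profile shape and pixel pitch are illustrative, not the paper's data):

```python
import numpy as np

def fwhm(profile, pixel_pitch=1.0):
    """Full width at half maximum of a 1D profile, in units of pixel_pitch.

    Linear interpolation locates the half-maximum crossings on each side
    of the peak, so the result is not limited to whole-pixel steps.
    """
    y = np.asarray(profile, dtype=np.float64)
    y = y - y.min()
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    left, right = above[0], above[-1]
    # Interpolate the exact half-maximum crossings on either side of the peak.
    xl = left - (y[left] - half) / (y[left] - y[left - 1])
    xr = right + (y[right] - half) / (y[right] - y[right + 1])
    return (xr - xl) * pixel_pitch

# Illustrative use: a Gaussian with sigma = 4 px has FWHM ~ 2.355 * 4 px.
x = np.arange(64)
profile = np.exp(-0.5 * ((x - 32) / 4.0) ** 2)
width = fwhm(profile, pixel_pitch=0.055)  # e.g. a 0.055 mm pitch detector
```

    The returned width is in millimeters once the detector pitch (and geometric magnification, omitted here) is applied.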

  8. Hyperspectral imaging for food processing automation

    NASA Astrophysics Data System (ADS)

    Park, Bosoon; Lawrence, Kurt C.; Windham, William R.; Smith, Doug P.; Feldner, Peggy W.

    2002-11-01

    This paper presents research results demonstrating that hyperspectral imaging can be used effectively for detecting feces (from the duodenum, ceca, and colon) and ingesta on the surface of poultry carcasses, with potential application to real-time, on-line processing of poultry for automatic safety inspection. The hyperspectral imaging system included a line scan camera with a prism-grating-prism spectrograph, fiber optic line lighting, motorized lens control, and hyperspectral image processing software. Hyperspectral image processing algorithms, specifically a band ratio of dual-wavelength (565/517) images followed by thresholding, were effective for identifying fecal and ingesta contamination of poultry carcasses. A multispectral imaging system comprising a common aperture camera with three optical trim filters (515.4 nm with 8.6-nm FWHM, 566.4 nm with 8.8-nm FWHM, and 631 nm with 10.2-nm FWHM), selected and validated with the hyperspectral imaging system, was developed for real-time, on-line application. The total image processing time for the multispectral images captured by the common aperture camera was approximately 251 ms, or 3.99 frames/s. A preliminary test showed that the accuracy of the real-time multispectral imaging system in detecting feces and ingesta on corn/soybean-fed poultry carcasses was 96%. However, many false-positive spots that cause system errors were also detected.
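
    The dual-wavelength band-ratio step is simple to state: divide the 565 nm image by the 517 nm image pixel-wise and threshold the ratio. A minimal sketch (the threshold value and the tiny test arrays are illustrative assumptions; the paper's calibrated threshold is not reproduced here):

```python
import numpy as np

def band_ratio_mask(band_565, band_517, threshold=1.05, eps=1e-6):
    """Flag pixels whose 565/517 band ratio exceeds a threshold.

    Contaminant material reflects relatively more at 565 nm than at
    517 nm, so a high ratio marks candidate contamination pixels.
    """
    ratio = band_565.astype(np.float64) / (band_517.astype(np.float64) + eps)
    return ratio > threshold

# Illustrative use: one "contaminated" pixel among clean carcass pixels.
b565 = np.array([[100.0, 100.0], [100.0, 130.0]])
b517 = np.array([[100.0, 101.0], [99.0, 100.0]])
mask = band_ratio_mask(b565, b517)
```

    Only the bottom-right pixel, whose ratio is 1.3, should be flagged here.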

  9. Dynamically reconfigurable holographic metasurface aperture for a Mills-Cross monochromatic microwave camera.

    PubMed

    Yurduseven, Okan; Marks, Daniel L; Fromenteze, Thomas; Smith, David R

    2018-03-05

    We present a reconfigurable, dynamic beam steering holographic metasurface aperture to synthesize a microwave camera at K-band frequencies. The aperture consists of a 1D printed microstrip transmission line with the front surface patterned into an array of slot-shaped subwavelength metamaterial elements (or meta-elements) dynamically tuned between "ON" and "OFF" states using PIN diodes. The proposed aperture synthesizes a desired radiation pattern by converting the waveguide mode to free-space radiation by means of a binary modulation scheme. This is achieved in a holographic manner, by letting the waveguide mode (reference wave) interact with the metasurface layer (hologram layer). Full-wave simulations show that, using the developed metasurface aperture, the radiated wavefronts can be engineered in an all-electronic manner without the need for complex phase-shifting circuits or mechanical scanning apparatus. Using the dynamic beam steering capability of the developed antenna, we synthesize a Mills-Cross composite aperture, forming a single-frequency all-electronic microwave camera.

  10. The Effect of a Pre-Lens Aperture on the Temperature Range and Image Uniformity of Microbolometer Infrared Cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinwiddie, Ralph Barton; Parris, Larkin S.; Lindal, John M.

    This paper explores the temperature range extension of long-wavelength infrared (LWIR) cameras by placing an aperture in front of the lens. An aperture smaller than the lens reduces the radiance reaching the sensor, allowing the camera to image targets much hotter than typically allowable. These higher temperatures were accurately determined after developing a correction factor applied to the built-in temperature calibration. The relationship between aperture diameter and temperature range is linear. The effect of pre-lens apertures on image uniformity is a form of anti-vignetting, meaning the corners appear brighter (hotter) than the rest of the image. An example of using this technique to measure temperatures of high-melting-point polymers during 3D printing provides valuable information on the time required for the weld-line temperature to fall below the glass transition temperature.
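
    Because the reported relationship between aperture diameter and usable temperature range is linear, the correction amounts to fitting a line through calibration points and evaluating it at the aperture in use. A hedged sketch with made-up calibration numbers (the actual coefficients depend on the camera and lens and must be measured):

```python
import numpy as np

def fit_aperture_correction(diameters_mm, max_temps_c):
    """Fit the reported linear aperture-diameter vs. temperature-range law.

    Returns (slope, intercept) so that max_temp = slope * diameter + intercept.
    """
    slope, intercept = np.polyfit(diameters_mm, max_temps_c, 1)
    return slope, intercept

# Illustrative calibration points (hypothetical, not from the paper):
# a smaller pre-lens aperture admits less radiance, so hotter targets fit
# within the sensor's range.
diameters = np.array([5.0, 10.0, 15.0, 20.0])
max_temps = np.array([1100.0, 850.0, 600.0, 350.0])
slope, intercept = fit_aperture_correction(diameters, max_temps)
predicted = slope * 12.5 + intercept  # usable max temp at a 12.5 mm aperture
```

    With these invented points the fit is exact, so the 12.5 mm prediction falls midway between the 10 mm and 15 mm calibration values.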

  11. Recent Developments In High Speed Lens Design At The NPRL

    NASA Astrophysics Data System (ADS)

    Mcdowell, M. W.; Klee, H. W.

    1987-09-01

    Although the lens provides the link between the high speed camera and the outside world, there has over the years been little evidence of co-operation between the optical design and high speed photography communities. It is still only too common for a manufacturer to develop a camera of improved performance and resolution and then to combine this with a standard camera lens. These lenses were often designed for a completely different recording medium and, more often than not, their use results in avoidable degradation of the overall system performance. There is a tendency to assume that a specialized lens would be too expensive and that pushing the aperture automatically implies more complex optical systems. In the present paper some recent South African developments in the design of large aperture lenses are described. The application of a new design principle, based on the work earlier this century of Bernhard Schmidt, shows that ultra-fast lenses need not be overly complex and a basic four-element lens configuration can be adapted to a wide variety of applications.

  12. Joint estimation of high resolution images and depth maps from light field cameras

    NASA Astrophysics Data System (ADS)

    Ohashi, Kazuki; Takahashi, Keita; Fujii, Toshiaki

    2014-03-01

    Light field cameras are attracting much attention as tools for acquiring 3D information of a scene through a single camera. The main drawback of typical lenslet-based light field cameras is their limited resolution. This limitation comes from the structure in which a microlens array is inserted between the sensor and the main lens. The microlens array projects the 4D light field onto a single 2D image sensor at the sacrifice of resolution; the angular resolution and the positional resolution trade off under the fixed resolution of the image sensor. This fundamental trade-off remains after the raw light field image is converted to a set of sub-aperture images. The purpose of our study is to estimate a higher resolution image from low resolution sub-aperture images using a super-resolution reconstruction framework. In this reconstruction, the sub-aperture images should be registered as accurately as possible; this registration is equivalent to depth estimation. Therefore, we propose a method in which super-resolution and depth refinement are performed alternately. Most of our method is implemented with image processing operations. We present several experimental results using a Lytro camera, in which we increased the resolution of a sub-aperture image by three times horizontally and vertically. Our method produces clearer images than the original sub-aperture images and than the case without depth refinement.
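
    In the alternating scheme above, registering sub-aperture images is the depth-estimation half of the loop; for a pair of views the displacement can be found from the peak of their cross-correlation. A minimal NumPy sketch of that registration step for an integer shift (the actual method refines sub-pixel shifts jointly with the super-resolution step, which is not shown):

```python
import numpy as np

def estimate_shift(ref, shifted):
    """Estimate the integer (row, col) shift that maps ref onto shifted,
    via FFT-based cross-correlation. The correlation surface peaks at the
    displacement that best aligns the two images.
    """
    corr = np.fft.ifft2(np.fft.fft2(shifted) * np.conj(np.fft.fft2(ref)))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Map peaks in the upper half-range to negative shifts.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)

# Illustrative use: roll a random texture by a known amount and recover it.
rng = np.random.default_rng(1)
ref = rng.random((32, 32))
moved = np.roll(ref, shift=(3, -2), axis=(0, 1))
dy, dx = estimate_shift(ref, moved)
```

    For sub-aperture images the recovered shift is proportional to disparity, and hence to depth, which is why registration and depth estimation are the same problem here.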

  13. A higher-speed compressive sensing camera through multi-diode design

    NASA Astrophysics Data System (ADS)

    Herman, Matthew A.; Tidman, James; Hewitt, Donna; Weston, Tyler; McMackin, Lenore

    2013-05-01

    Obtaining high frame rates is a challenge with compressive sensing (CS) systems that gather measurements in a sequential manner, such as the single-pixel CS camera. One strategy for increasing the frame rate is to divide the FOV into smaller areas that are sampled and reconstructed in parallel. Following this strategy, InView has developed a multi-aperture CS camera using an 8×4 array of photodiodes that essentially act as 32 individual simultaneously operating single-pixel cameras. Images reconstructed from each of the photodiode measurements are stitched together to form the full FOV. To account for crosstalk between the sub-apertures, novel modulation patterns have been developed to allow neighboring sub-apertures to share energy. Regions of overlap not only account for crosstalk energy that would otherwise be reconstructed as noise, but they also allow for tolerance in the alignment of the DMD to the lenslet array. Currently, the multi-aperture camera is built into a computational imaging workstation configuration useful for research and development purposes. In this configuration, modulation patterns are generated in a CPU and sent to the DMD via PCI express, which allows the operator to develop and change the patterns used in the data acquisition step. The sensor data is collected and then streamed to the workstation via an Ethernet or USB connection for the reconstruction step. Depending on the amount of data taken and the amount of overlap between sub-apertures, frame rates of 2-5 frames per second can be achieved. In a stand-alone camera platform, currently in development, pattern generation and reconstruction will be implemented on-board.

  14. A versatile calibration procedure for portable coded aperture gamma cameras and RGB-D sensors

    NASA Astrophysics Data System (ADS)

    Paradiso, V.; Crivellaro, A.; Amgarou, K.; de Lanaute, N. Blanc; Fua, P.; Liénard, E.

    2018-04-01

    The present paper proposes a versatile procedure for the geometrical calibration of coded aperture gamma cameras and RGB-D depth sensors, using only one radioactive point source and a simple experimental set-up. Calibration data is then used for accurately aligning radiation images retrieved by means of the γ-camera with the respective depth images computed with the RGB-D sensor. The system resulting from such a combination is thus able to retrieve, automatically, the distance of radioactive hotspots by means of pixel-wise mapping between gamma and depth images. This procedure is of great interest for a wide number of applications, ranging from precise automatic estimation of the shape and distance of radioactive objects to Augmented Reality systems. Incidentally, the corresponding results validated the choice of a perspective design model for a coded aperture γ-camera.

  15. Polarizing aperture stereoscopic cinema camera

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny

    2012-03-01

    The art of stereoscopic cinematography has been held back by the lack of a convenient way to reduce the stereo camera lenses' interaxial separation to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor (the size of the standard 35 mm frame) with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

  16. Polarizing aperture stereoscopic cinema camera

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny

    2012-07-01

    The art of stereoscopic cinematography has been held back by the lack of a convenient way to reduce the stereo camera lenses' interaxial separation to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor, the size of the standard 35 mm frame, with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

  17. Analysis of the covariance function and aperture averaged fluctuations of irradiance to calculate Cn2

    NASA Astrophysics Data System (ADS)

    Cauble, Galen D.; Wayne, David T.

    2017-09-01

    The growth of optical communication has created a need to correctly characterize the atmospheric channel. Atmospheric turbulence along a given channel can drastically affect optical communication signal quality. One means of characterizing atmospheric turbulence is through measurement of the refractive index structure parameter, Cn2. When calculating Cn2 from the scintillation index, σI², the point aperture scintillation index is required. Direct measurement of the point aperture scintillation index is difficult at long ranges due to the limited light collecting ability of small apertures. When the aperture size is increased past the atmospheric correlation width, aperture averaging decreases the scintillation index below that of the point aperture. While the aperture averaging factor can be calculated from theory, it often disagrees with experimental results. Direct measurement of the aperture averaging factor via the pupil plane irradiance covariance function allows conversion from the aperture averaged scintillation index to the point aperture scintillation index. Using a finite aperture, camera, and detector, the aperture averaged scintillation index and the aperture averaging factor are measured in parallel, and the point aperture scintillation index is calculated. A new instrument built by SSC Pacific was used to collect scintillation data at the Townes Institute Science and Technology Experimentation Facility (TISTEF); its data were then compared with BLS900 data. The results show that direct measurement of the aperture averaging factor is achievable using a camera and matches ground-truth instrumentation well.
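
    The two quantities measured in parallel are straightforward statistics of the irradiance samples: the scintillation index σI² = ⟨I²⟩/⟨I⟩² − 1 for each aperture, and the aperture averaging factor as the ratio of the aperture-averaged value to the point-aperture value. A minimal sketch with synthetic irradiance series (the log-normal fluctuation model and damping factor are illustrative assumptions, not the instrument's data):

```python
import numpy as np

def scintillation_index(irradiance):
    """Scintillation index: sigma_I^2 = <I^2> / <I>^2 - 1."""
    i = np.asarray(irradiance, dtype=np.float64)
    return np.mean(i ** 2) / np.mean(i) ** 2 - 1.0

def aperture_averaging_factor(aperture_averaged_i, point_aperture_i):
    """A = sigma_I^2(finite aperture) / sigma_I^2(point aperture); A < 1
    once the aperture exceeds the atmospheric correlation width."""
    return scintillation_index(aperture_averaged_i) / scintillation_index(point_aperture_i)

# Illustrative use: a log-normal point-aperture series (a common
# weak-turbulence model) and a crudely damped finite-aperture series.
rng = np.random.default_rng(2)
point_i = np.exp(rng.normal(0.0, 0.3, 100_000))
large_i = 1.0 + 0.25 * (point_i - 1.0)
A = aperture_averaging_factor(large_i, point_i)
```

    Dividing the measured aperture-averaged σI² by A recovers the point-aperture value needed for the Cn2 calculation.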

  18. Multi-color pyrometry imaging system and method of operating the same

    DOEpatents

    Estevadeordal, Jordi; Nirmalan, Nirm Velumylum; Tralshawala, Nilesh; Bailey, Jeremy Clyde

    2017-03-21

    A multi-color pyrometry imaging system for a high-temperature asset includes at least one viewing port in optical communication with at least one high-temperature component of the high-temperature asset. The system also includes at least one camera device in optical communication with the at least one viewing port. The at least one camera device includes a camera enclosure and at least one camera aperture defined in the camera enclosure. The at least one camera aperture is in optical communication with the at least one viewing port. The at least one camera device also includes a multi-color filtering mechanism coupled to the enclosure. The multi-color filtering mechanism is configured to sequentially transmit photons within a first predetermined wavelength band and transmit photons within a second predetermined wavelength band that is different from the first predetermined wavelength band.

  19. Material of LAPAN's thermal IR camera equipped with two microbolometers in one aperture

    NASA Astrophysics Data System (ADS)

    Bustanul, A.; Irwan, P.; Andi M., T.

    2017-11-01

    Besides the wavelength range used, another factor must be considered when designing an optical system: a material suited to the chosen spectral bands. Because of the limited range of available materials and their expense, selecting and specifying materials for infrared (IR) wavelengths is more difficult and complex than for the visible spectrum. We faced the same problem while designing our thermal IR camera, which has two microbolometers sharing one aperture. Two spectral bands, 3-4 μm (MWIR) and 8-12 μm (LWIR), were chosen for the camera to address its missions: peat-land fires, volcanic activity, and sea surface temperature (SST). Based on these bands, we chose the appropriate materials for the optics of LAPAN's IR camera. This paper describes the materials of LAPAN's thermal IR camera equipped with two microbolometers in one aperture. We first studied the properties of optical materials across the relevant IR bands. The analysis then considered several aspects: transmission, index of refraction, and thermal properties, including the index gradient and the coefficient of thermal expansion (CTE). Commercial software, Thermal Desktop/SINDA FLUINT, was used to strengthen the analysis. Constraints such as the space environment, low cost, and performance (mainly durability and transmission) were also taken into account in the trade-off studies. The results of these analyses, in both graphs and measurements, indicate that the lens of LAPAN's shared-aperture IR camera should be based on germanium and zinc selenide.

  20. 3D reconstruction based on light field images

    NASA Astrophysics Data System (ADS)

    Zhu, Dong; Wu, Chunhong; Liu, Yunluo; Fu, Dongmei

    2018-04-01

    This paper proposed a method of reconstructing a three-dimensional (3D) scene from two light field images captured by a Lytro Illum camera. The work first extracts the sub-aperture images from the light field images and uses the scale-invariant feature transform (SIFT) for feature registration on selected sub-aperture images. The structure-from-motion (SFM) algorithm is then applied to the registered sub-aperture images to reconstruct the three-dimensional scene, yielding a sparse 3D point cloud. The method shows that 3D reconstruction can be implemented with only two light field captures, rather than the dozen or more captures required by traditional cameras. This effectively addresses the time-consuming, laborious nature of 3D reconstruction based on traditional digital cameras, achieving a more rapid, convenient, and accurate reconstruction.
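
    The geometric core of the SFM step is triangulation: given projection matrices for the two captures and a pair of matched feature points, each 3D point is recovered by linear least squares (DLT). A self-contained NumPy sketch of that sub-step (the camera matrices below are invented for illustration; in the pipeline they come from SIFT matches and pose estimation):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 projection matrices; x1, x2 : (u, v) image points.
    Builds the homogeneous system A X = 0 and solves it via SVD.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Illustrative use: two axis-aligned cameras separated by a 1-unit baseline.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.5, -0.2, 4.0])
x1 = P1 @ np.append(point, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(point, 1.0); x2 = x2[:2] / x2[2]
recovered = triangulate(P1, P2, x1, x2)
```

    Running this over all matched SIFT features, one triangulation per match, is what produces the sparse point cloud described above.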

  1. Medium-sized aperture camera for Earth observation

    NASA Astrophysics Data System (ADS)

    Kim, Eugene D.; Choi, Young-Wan; Kang, Myung-Seok; Kim, Ee-Eul; Yang, Ho-Soon; Rasheed, Ad. Aziz Ad.; Arshad, Ahmad Sabirin

    2017-11-01

    Satrec Initiative and ATSB have been developing a medium-sized aperture camera (MAC) as an Earth observation payload for a small satellite. Developed as a push-broom high-resolution camera, it has one panchromatic and four multispectral channels, with ground sampling distances of 2.5 m and 5 m, respectively, at a nominal altitude of 685 km. The 300 mm-aperture Cassegrain telescope contains two aspheric mirrors and two spherical correction lenses. In keeping with the philosophy of a simple and cost-effective camera, the mirrors incorporate no light-weighting, and the linear CCDs are mounted on a single PCB with no beam splitters. MAC is the main payload of RazakSAT, to be launched in 2005. RazakSAT is a 180 kg satellite, including MAC, designed to provide high-resolution imagery with a 20 km swath width from a near-equatorial orbit (NEqO). The mission objective is to demonstrate the capability of a high-resolution remote sensing satellite system in a near-equatorial orbit. This paper gives an overview of the MAC and RazakSAT programmes and presents the current development status of MAC, focusing on key optical aspects of the Qualification Model.

  2. Optical performance analysis of plenoptic camera systems

    NASA Astrophysics Data System (ADS)

    Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas

    2014-09-01

    Adding an array of microlenses in front of the sensor transforms the capabilities of a conventional camera to capture both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g. artificial re-focusing of photographs. Without the need of active illumination it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in a decreased depth resolution. Besides, the gain of angular information implies a degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated in order to verify the predicted optical characteristics.

  3. Laser differential image-motion monitor for characterization of turbulence during free-space optical communication tests.

    PubMed

    Brown, David M; Juarez, Juan C; Brown, Andrea M

    2013-12-01

    A laser differential image-motion monitor (DIMM) system was designed and constructed as part of a turbulence characterization suite during the DARPA free-space optical experimental network experiment (FOENEX) program. The developed link measurement system measures the atmospheric coherence length (r0), atmospheric scintillation, and power in the bucket for the 1550 nm band. DIMM measurements are made with two separate apertures coupled to a single InGaAs camera. The angle of arrival (AoA) for the wavefront at each aperture can be calculated based on focal spot movements imaged by the camera. By utilizing a single camera for the simultaneous measurement of the focal spots, the correlation of the variance in the AoA allows a straightforward computation of r0 as in traditional DIMM systems. Standard measurements of scintillation and power in the bucket are made with the same apertures by redirecting a percentage of the incoming signals to InGaAs detectors integrated with logarithmic amplifiers for high sensitivity and high dynamic range. By leveraging two small apertures, the instrument forms a small size and weight configuration for mounting to actively tracking laser communication terminals for characterizing link performance.
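
    The r0 retrieval described above can be illustrated with the classical DIMM relation. A minimal sketch, using hypothetical aperture geometry (5 cm sub-apertures separated by 20 cm; the instrument's actual dimensions are not stated in the abstract) and the 1550 nm band:

```python
# Hypothetical instrument parameters (not taken from the abstract)
WAVELENGTH = 1550e-9  # observation band, m
D = 0.05              # sub-aperture diameter, m
d = 0.20              # sub-aperture separation, m

def r0_from_differential_motion(var_long):
    """Estimate the Fried parameter r0 (m) from the variance (rad^2) of the
    longitudinal differential angle of arrival between the two apertures,
    using the standard DIMM relation (Sarazin & Roddier, 1990)."""
    k = 2.0 * WAVELENGTH**2 * (0.179 * D**(-1 / 3) - 0.0968 * d**(-1 / 3))
    return (k / var_long) ** (3 / 5)
```

    Because both apertures share one camera, common tilt (e.g. platform jitter) cancels in the differential motion, which is why the variance of the AoA difference, rather than the absolute AoA, enters the relation.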

  4. Camera sensor arrangement for crop/weed detection accuracy in agronomic images.

    PubMed

    Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo

    2013-04-02

    In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning in the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination, existing in outdoor environments, is also an important factor affecting the image accuracy. This paper is exclusively focused on two main issues, always with the goal to achieve the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters and (b) design of strategies for controlling the adverse illumination effects.

  5. Focus collimator press for a collimator for gamma ray cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    York, R.N.; York, D.L.

    A focus collimator press for collimators for gamma ray cameras is described comprising a pivot arm of fixed length mounted on a travelling pivot which is movable in the plane of a spaced apart work table surface in a direction toward and away from the work table. A press plate is carried at the opposite end of the fixed length pivot arm, and is maintained in registration with the same portion of the work table for pressing engagement with each undulating radiation opaque strip as it is added to the top of a collimator stack in process by movement of the travelling pivot inward toward the work table. This enables the press plate to maintain its relative position above the collimator stack while the angle of the press plate changes, becoming less acute in relation to the work table as the travelling pivot moves inwardly toward the work table. The fixed length of the pivot arm is substantially equal to the focal length of the converging apertures formed by each pair of undulating strips stacked together. Thus, the focal point of each aperture row falls substantially on the axis of the travelling pivot, and since it moves in the plane of the work table surface the focal point of each aperture row is directed to lie in the same common plane. When one of two collimator stacks made in this way is rotated 180 degrees and the two bonded together along their respective first strips, all focal points of every aperture row lie on the central axis of the completed collimator.

  6. Vision Aided Inertial Navigation System Augmented with a Coded Aperture

    DTIC Science & Technology

    2011-03-24

    as the change in blur at different distances from the pixel plane can be inferred. Cameras with a micro lens array (called plenoptic cameras...images from 8 slightly different perspectives [14,43]. Dappled photography is similar to the plenoptic camera approach except that a cosine mask

  7. [Results of testing of MINISKAN mobile gamma-ray camera and specific features of its design].

    PubMed

    Utkin, V M; Kumakhov, M A; Blinov, N N; Korsunskiĭ, V N; Fomin, D K; Kolesnikova, N V; Tultaev, A V; Nazarov, A A; Tararukhina, O B

    2007-01-01

    The main results of engineering, biomedical, and clinical testing of MINISKAN mobile gamma-ray camera are presented. Specific features of the camera hardware and software, as well as the main technical specifications, are described. The gamma-ray camera implements a new technology based on reconstructive tomography, aperture encoding, and digital processing of signals.

  8. Studies of auroral X-ray imaging from high altitude spacecraft

    NASA Technical Reports Server (NTRS)

    Mckenzie, D. L.; Mizera, P. F.; Rice, C. J.

    1980-01-01

    Results of a study of techniques for imaging the aurora from a high altitude satellite at X-ray wavelengths are summarized. The X-ray observations allow the straightforward derivation of the primary auroral X-ray spectrum and can be made at all local times, day and night. Five candidate imaging systems are identified: X-ray telescope, multiple pinhole camera, coded aperture, rastered collimator, and imaging collimator. Examples of each are specified, subject to common weight and size limits which allow them to be intercompared. The imaging ability of each system is tested using a wide variety of sample spectra which are based on previous satellite observations. The study shows that the pinhole camera and coded aperture are both good auroral imaging systems. The two collimated detectors are significantly less sensitive. The X-ray telescope provides better image quality than the other systems in almost all cases, but a limitation to energies below about 4 keV prevents this system from providing the spectral data essential to deriving electron spectra, energy input to the atmosphere, and atmospheric densities and conductivities. The orbit selection requires a tradeoff between spatial resolution and duty cycle.

  9. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.

  10. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE PAGES

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; ...

    2015-08-13

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
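
    The coded-aperture temporal multiplexing described in these two records can be sketched as a simple forward model. A toy simulation (all dimensions and the random binary mask are illustrative assumptions; the paper's actual coding hardware and statistical inversion are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 8 sub-frames multiplexed into one 32x32 camera frame
T, H, W = 8, 32, 32
video = rng.random((T, H, W))           # fast dynamic scene (sub-frames)
masks = rng.integers(0, 2, (T, H, W))   # per-sub-frame binary coded aperture

# Forward model: each sub-frame is modulated by its mask, and the modulated
# sub-frames integrate into a single measured frame during one exposure.
measurement = (masks * video).sum(axis=0)

# Recovering `video` from `measurement` is underdetermined (T unknowns per
# pixel, one equation), which is why the paper applies statistical CS
# inversion with sparsity/structure priors across space and time (not shown).
```

    The frame-rate gain comes from the fact that a single readout now carries T coded sub-frames, so the effective temporal sampling is T times the camera's native rate.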

  11. A panoramic coded aperture gamma camera for radioactive hotspots localization

    NASA Astrophysics Data System (ADS)

    Paradiso, V.; Amgarou, K.; Blanc De Lanaute, N.; Schoepff, V.; Amoyal, G.; Mahe, C.; Beltramello, O.; Liénard, E.

    2017-11-01

    A known disadvantage of the coded aperture imaging approach is its limited field-of-view (FOV), which often proves insufficient when analysing complex dismantling scenes such as post-accidental scenarios, where multiple measurements are needed to fully characterize the scene. In order to overcome this limitation, a panoramic coded aperture γ-camera prototype has been developed. The system is based on a 1 mm thick CdTe detector directly bump-bonded to a Timepix readout chip, developed by the Medipix2 collaboration (256 × 256 pixels, 55 μm pitch, 14.08 × 14.08 mm2 sensitive area). A MURA pattern coded aperture is used, allowing for background subtraction without the use of heavy shielding. This system is then combined with a USB color camera. The output of each measurement is a semi-spherical image covering a FOV of 360 degrees horizontally and 80 degrees vertically, rendered in spherical coordinates (θ, φ). The geometrical shapes of the radiation-emitting objects are preserved by first registering and stitching the optical images captured by the prototype, and then applying the same transformations to their corresponding radiation images. Panoramic gamma images generated using the technique proposed in this paper are described and discussed, along with the main experimental results obtained in laboratory campaigns.
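
    A MURA mask like the one used on this prototype can be generated from quadratic residues modulo a prime. A minimal sketch following the Gottesman-Fenimore construction (the rank p below is illustrative; the prototype's actual mask rank is not given in the abstract):

```python
import numpy as np

def mura_mask(p):
    """Generate a p x p MURA coded-aperture pattern for prime p
    (Gottesman & Fenimore, 1989). 1 = open element, 0 = opaque."""
    residues = {(i * i) % p for i in range(1, p)}   # quadratic residues mod p
    C = [1 if i in residues else -1 for i in range(p)]
    A = np.zeros((p, p), dtype=int)
    for i in range(p):
        for j in range(p):
            if i == 0:
                A[i, j] = 0          # first row fully opaque
            elif j == 0:
                A[i, j] = 1          # first column (below the corner) open
            elif C[i] * C[j] == 1:
                A[i, j] = 1          # open where both indices share residue class
    return A
```

    The background subtraction mentioned in the abstract typically exploits a mask/anti-mask pair of measurements; a convenient property of MURAs is that rotating the mask by 90 degrees yields its own anti-mask.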

  12. An image-tube camera for cometary spectrography

    NASA Astrophysics Data System (ADS)

    Mamadov, O.

    The paper discusses the mounting of an image tube camera. The cathode is of antimony, sodium, potassium, and cesium. The parts used for mounting are of acrylic plastic and a fabric-based laminate. A mounting design that does not include cooling is presented. The aperture ratio of the camera is 1:27. Also discussed is the way that the camera is joined to the spectrograph.

  13. Dual-camera design for coded aperture snapshot spectral imaging.

    PubMed

    Wang, Lizhi; Xiong, Zhiwei; Gao, Dahua; Shi, Guangming; Wu, Feng

    2015-02-01

    Coded aperture snapshot spectral imaging (CASSI) provides an efficient mechanism for recovering 3D spectral data from a single 2D measurement. However, since the reconstruction problem is severely underdetermined, the quality of recovered spectral data is usually limited. In this paper we propose a novel dual-camera design to improve the performance of CASSI while maintaining its snapshot advantage. Specifically, a beam splitter is placed in front of the objective lens of CASSI, which allows the same scene to be simultaneously captured by a grayscale camera. This uncoded grayscale measurement, in conjunction with the coded CASSI measurement, greatly eases the reconstruction problem and yields high-quality 3D spectral data. Both simulation and experimental results demonstrate the effectiveness of the proposed method.
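
    The dual-arm acquisition described above can be sketched as a toy forward model: one arm applies the coded aperture and spectral shearing of CASSI, while the beam-splitter arm records an uncoded grayscale image of the same scene. Dimensions, the random mask, and the one-pixel-per-band dispersion are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, L = 16, 16, 8                 # toy spatial size and spectral band count
x = rng.random((H, W, L))           # 3D spectral scene (the unknown)
mask = rng.integers(0, 2, (H, W))   # coded aperture pattern

# CASSI arm: code the scene, shear each band by one pixel (dispersion),
# then integrate over wavelength onto the detector.
coded = x * mask[:, :, None]
cassi = np.zeros((H, W + L - 1))
for band in range(L):
    cassi[:, band:band + W] += coded[:, :, band]

# Grayscale arm: the beam splitter gives an uncoded panchromatic measurement,
# which constrains the spatial structure during reconstruction.
gray = x.sum(axis=2)
```

    Reconstruction then jointly inverts both measurements; the extra uncoded image is what eases the otherwise severely underdetermined problem.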

  14. Plenoptic Imager for Automated Surface Navigation

    NASA Technical Reports Server (NTRS)

    Zollar, Byron; Milder, Andrew; Mayo, Michael

    2010-01-01

    An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem composed of a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.
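
    The ranging principle in such a plenoptic system ultimately reduces to triangulation between sub-aperture views drawn from different regions of the main lens pupil. A minimal sketch of that relation (the function and all parameter values are hypothetical illustrations, not Nanohmics' algorithm):

```python
def range_from_disparity(focal_length_m, baseline_m, disparity_px, pixel_pitch_m):
    """Triangulated range (m) to a feature that shifts by `disparity_px`
    pixels between two sub-aperture views whose effective viewpoints are
    separated by `baseline_m` within the main lens aperture."""
    return focal_length_m * baseline_m / (disparity_px * pixel_pitch_m)

# Example: 50 mm main lens, 10 mm effective baseline inside the pupil,
# a 10-pixel shift on a sensor with 10 um pixels -> 5.0 m range.
distance = range_from_disparity(0.05, 0.01, 10, 10e-6)
```

    The single-aperture advantage claimed in the abstract follows from the baseline living entirely inside the main lens pupil, so no second camera or active emitter is needed.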

  15. Flow Interactions and Control

    DTIC Science & Technology

    2012-03-08

    to-Use 3-D Camera For Measurements in Turbulent Flow Fields (B. Thurow, Auburn). Conventional imaging: ...depth-of-field and blur; reduced aperture (restricted angular information) leads to low signal levels. Lightfield imaging: plenoptic camera records

  16. The system analysis of light field information collection based on the light field imaging

    NASA Astrophysics Data System (ADS)

    Wang, Ye; Li, Wenhua; Hao, Chenyang

    2016-10-01

    Augmented reality (AR) technology is becoming a focus of study, and the AR effect of light field imaging makes research on light field cameras attractive. The micro array structure has been adopted in most light field information acquisition systems (LFIAS) since the emergence of the light field camera, mainly including micro lens array (MLA) and micro pinhole array (MPA) systems. This paper reviews the structures of the LFIAS commonly used in light field cameras in recent years. LFIAS have been analyzed based on the theory of geometrical optics. Meanwhile, this paper presents a novel LFIAS, a plane grating system, which we call a "micro aperture array (MAA)." The LFIAS are analyzed based on the knowledge of information optics. This paper shows that there is little difference in the multiple images produced by the plane grating system, and that the plane grating system can collect and record the amplitude and phase information of the light field.

  17. Ultra-compact imaging system based on multi-aperture architecture

    NASA Astrophysics Data System (ADS)

    Meyer, Julia; Brückner, Andreas; Leitel, Robert; Dannberg, Peter; Bräuer, Andreas; Tünnermann, Andreas

    2011-03-01

    As a matter of course, cameras are integrated in the field of information and communication technology. It can be observed that there is a trend for those cameras to become smaller and at the same time cheaper. Because single-aperture systems have a limit of miniaturization while simultaneously keeping the same space-bandwidth product and transmitting a wide field of view, there is a need for new ideas such as multi-aperture optical systems. In the proposed camera system the image is formed by many different channels, each consisting of four microlenses which are arranged one after another in different microlens arrays. A partial image which fits together with the neighbouring one is formed in every single channel, so that a real erect image is generated and a conventional image sensor can be used. The microoptical fabrication process and the assembly are well established and can be carried out on wafer level. Laser writing is used for the fabrication of the masks. UV-lithography, a reflow process and UV-molding are needed for the fabrication of the apertures and the lenses. The developed system is very small in terms of both length and lateral dimensions, and has VGA resolution and a diagonal field of view of 65 degrees. This microoptical vision system is appropriate for implementation in electronic devices such as webcams integrated in notebook displays.

  18. Digital micromirror device-based common-path quantitative phase imaging.

    PubMed

    Zheng, Cheng; Zhou, Renjie; Kuang, Cuifang; Zhao, Guangyuan; Yaqoob, Zahid; So, Peter T C

    2017-04-01

    We propose a novel common-path quantitative phase imaging (QPI) method based on a digital micromirror device (DMD). The DMD is placed in a plane conjugate to the objective back-aperture plane for the purpose of generating two plane waves that illuminate the sample. A pinhole is used in the detection arm to filter one of the beams after the sample to create a reference beam. Additionally, a transmission-type liquid crystal device, placed at the objective back-aperture plane, eliminates the specular reflection noise arising from all the "off" state DMD micromirrors, which is common in all DMD-based illuminations. We have demonstrated high-sensitivity QPI, with a measured spatial and temporal noise of 4.92 nm and 2.16 nm, respectively. Experiments with calibrated polystyrene beads illustrate the desired phase measurement accuracy. In addition, we have measured the dynamic height maps of red blood cell membrane fluctuations, showing the efficacy of the proposed system for live cell imaging. Most importantly, the DMD grants the system convenience in varying the interference fringe period on the camera to easily satisfy the pixel sampling conditions. This feature also alleviates the pinhole alignment complexity. We envision that the proposed DMD-based common-path QPI system will allow for system miniaturization and automation for broader adoption.

  19. Digital micromirror device-based common-path quantitative phase imaging

    PubMed Central

    Zheng, Cheng; Zhou, Renjie; Kuang, Cuifang; Zhao, Guangyuan; Yaqoob, Zahid; So, Peter T. C.

    2017-01-01

    We propose a novel common-path quantitative phase imaging (QPI) method based on a digital micromirror device (DMD). The DMD is placed in a plane conjugate to the objective back-aperture plane for the purpose of generating two plane waves that illuminate the sample. A pinhole is used in the detection arm to filter one of the beams after the sample to create a reference beam. Additionally, a transmission-type liquid crystal device, placed at the objective back-aperture plane, eliminates the specular reflection noise arising from all the “off” state DMD micromirrors, which is common in all DMD-based illuminations. We have demonstrated high-sensitivity QPI, with a measured spatial and temporal noise of 4.92 nm and 2.16 nm, respectively. Experiments with calibrated polystyrene beads illustrate the desired phase measurement accuracy. In addition, we have measured the dynamic height maps of red blood cell membrane fluctuations, showing the efficacy of the proposed system for live cell imaging. Most importantly, the DMD grants the system convenience in varying the interference fringe period on the camera to easily satisfy the pixel sampling conditions. This feature also alleviates the pinhole alignment complexity. We envision that the proposed DMD-based common-path QPI system will allow for system miniaturization and automation for broader adoption. PMID:28362789

  20. Optical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Seppala, L; Gilmore, K

    2008-07-16

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band, filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.

  1. Modified plenoptic camera for phase and amplitude wavefront sensing

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Davis, Christopher C.

    2013-09-01

    Shack-Hartmann sensors have been widely applied in wavefront sensing. However, they are limited to measuring slightly distorted wavefronts whose local tilt doesn't surpass the numerical aperture of the micro-lens array, and cross talk of incident waves on the micro-lens array must be strictly avoided. In medium to strong turbulence cases of optical communication, where large jitter in angle of arrival and local interference caused by break-up of the beam are common phenomena, Shack-Hartmann sensors no longer serve as effective tools in revealing distortions in a signal wave. Our design of a modified Plenoptic Camera shows great potential in observing and extracting useful information from severely disturbed wavefronts. Furthermore, by separating complex interference patterns into several minor interference cases, it may also be capable of telling the regional phase difference of coherently illuminated objects.

  2. Comparison of 10 digital SLR cameras for orthodontic photography.

    PubMed

    Bister, D; Mordarai, F; Aveling, R M

    2006-09-01

    Digital photography is now widely used to document orthodontic patients. High quality intra-oral photography depends on a satisfactory 'depth of field' focus and good illumination. Automatic 'through the lens' (TTL) metering is ideal to achieve both the above aims. Ten current digital single lens reflex (SLR) cameras were tested for use in intra- and extra-oral photography as used in orthodontics. The manufacturers' recommended macro-lens and macro-flash were used with each camera. Handling characteristics, colour-reproducibility, quality of the viewfinder and flash recharge time were investigated. No camera took acceptable images in factory default setting or 'automatic' mode: this mode was not present for some cameras (Nikon, Fujifilm); led to overexposure (Olympus) or poor depth of field (Canon, Konica-Minolta, Pentax), particularly for intra-oral views. Once adjusted, only Olympus cameras were able to take intra- and extra-oral photographs without the need to change settings, and were therefore the easiest to use. All other cameras needed adjustments of aperture (Canon, Konica-Minolta, Pentax), or aperture and flash (Fujifilm, Nikon), making the latter the most complex to use. However, all cameras produced high quality intra- and extra-oral images, once appropriately adjusted. The resolution of the images is more than satisfactory for all cameras. There were significant differences relating to the quality of colour reproduction, size and brightness of the viewfinders. The Nikon D100 and Fujifilm S 3 Pro consistently scored best for colour fidelity. Pentax and Konica-Minolta had the largest and brightest viewfinders.

  3. Small Unmanned Aerial Vehicles; DHS’s Answer to Border Surveillance Requirements

    DTIC Science & Technology

    2013-03-01

    ...of more than 4000 illegal aliens, including the seizure of more than 15,000 pounds of marijuana. In addition to the Predator UAVs being...payload includes two color video cameras, an infrared camera that offers night vision capability, and synthetic aperture radar that provides high

  4. UXO Forum 1996

    DTIC Science & Technology

    1996-01-01

    used to locate and characterize a magnetic dipole source, and this finding accelerated the development of superconducting tensor gradiometers for... superconducting magnetic field gradiometer, two-color infrared camera, synthetic aperture radar, and a visible spectrum camera. The combination of these...

  5. Design and Construction of an X-ray Lightning Camera

    NASA Astrophysics Data System (ADS)

    Schaal, M.; Dwyer, J. R.; Rassoul, H. K.; Uman, M. A.; Jordan, D. M.; Hill, J. D.

    2010-12-01

    A pinhole-type camera was designed and built for the purpose of producing high-speed images of the x-ray emissions from rocket-and-wire-triggered lightning. The camera consists of 30 7.62-cm diameter NaI(Tl) scintillation detectors, each sampling at 10 million frames per second. The steel structure of the camera is encased in 1.27-cm thick lead, which blocks x-rays that are less than 400 keV, except through a 7.62-cm diameter “pinhole” aperture located at the front of the camera. The lead and steel structure is covered in 0.16-cm thick aluminum to block RF noise, water and light. All together, the camera weighs about 550-kg and is approximately 1.2-m x 0.6-m x 0.6-m. The image plane, which is adjustable, was placed 32-cm behind the pinhole aperture, giving a field of view of about ±38° in both the vertical and horizontal directions. The elevation of the camera is adjustable between 0 and 50° from horizontal and the camera may be pointed in any azimuthal direction. In its current configuration, the camera’s angular resolution is about 14°. During the summer of 2010, the x-ray camera was located 44-m from the rocket-launch tower at the UF/Florida Tech International Center for Lightning Research and Testing (ICLRT) at Camp Blanding, FL and several rocket-triggered lightning flashes were observed. In this presentation, I will discuss the design, construction and operation of this x-ray camera.
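
    The quoted ±38° field of view follows from simple pinhole geometry with the 32-cm pinhole-to-image-plane distance given in the abstract. A minimal sketch (the ~25 cm image-plane half-extent used in the example is an assumption, not stated in the abstract):

```python
import math

def pinhole_half_fov_deg(image_half_extent_m, pinhole_to_plane_m):
    """Half field of view (degrees) of a pinhole camera: the largest ray
    angle that still lands on the image plane, from simple geometry."""
    return math.degrees(math.atan(image_half_extent_m / pinhole_to_plane_m))

# With the 0.32 m pinhole-to-plane distance, an assumed 0.25 m image-plane
# half-extent reproduces a half field of view of about 38 degrees.
half_fov = pinhole_half_fov_deg(0.25, 0.32)
```

    Moving the adjustable image plane closer to the pinhole widens the field of view at the cost of angular resolution, which is consistent with the camera's stated ~14° resolution for 30 discrete detectors.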

  6. Time-of-Flight Microwave Camera.

    PubMed

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-05

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
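
    The FMCW depth measurement above maps round-trip delay to a beat frequency between the transmitted and received chirps. A minimal sketch of that relation (the sweep time in the example is an illustrative assumption; only the 8-12 GHz band is taken from the abstract):

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz, sweep_bandwidth_hz, sweep_time_s):
    """Target range (m) from an FMCW beat frequency. A chirp of bandwidth B
    swept over time T has slope B/T; a target at range R delays the echo by
    tau = 2R/c, producing a beat tone f = (B/T) * (2R/c)."""
    return C * beat_hz * sweep_time_s / (2.0 * sweep_bandwidth_hz)

# Example: a 4 GHz sweep (8-12 GHz) over an assumed 1 ms; a ~400 kHz beat
# tone then corresponds to a target roughly 15 m away.
```

    The same bandwidth also sets the depth resolution: dR = c / (2B), about 3.7 cm for a 4 GHz sweep, consistent in order of magnitude with the 6 cm optical path quoted for the 200 ps time resolution.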

  7. Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera

    NASA Technical Reports Server (NTRS)

    Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.

    1988-01-01

    The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 A wavelength range.

  8. Visible-regime polarimetric imager: a fully polarimetric, real-time imaging system.

    PubMed

    Barter, James D; Thompson, Harold R; Richardson, Christine L

    2003-03-20

    A fully polarimetric optical camera system has been constructed to obtain polarimetric information simultaneously from four synchronized charge-coupled device imagers at video frame rates of 60 Hz and a resolution of 640 x 480 pixels. The imagers view the same scene along the same optical axis by means of a four-way beam-splitting prism similar to ones used for multiple-imager, common-aperture color TV cameras. Appropriate polarizing filters in front of each imager provide the polarimetric information. Mueller matrix analysis of the polarimetric response of the prism, analyzing filters, and imagers is applied to the detected intensities in each imager as a function of the applied state of polarization over a wide range of linear and circular polarization combinations to obtain an average polarimetric calibration consistent to approximately 2%. Higher accuracies can be obtained by improvement of the polarimetric modeling of the splitting prism and by implementation of a pixel-by-pixel calibration.

  9. Using focused plenoptic cameras for rich image capture.

    PubMed

    Georgiev, T; Lumsdaine, A; Chunev, G

    2011-01-01

    This approach uses a focused plenoptic camera to capture the plenoptic function's rich "non 3D" structure. It employs two techniques. The first simultaneously captures multiple exposures (or other aspects) based on a microlens array having an interleaved set of different filters. The second places multiple filters at the main lens aperture.

  10. Ground Testing of Prototype Hardware and Processing Algorithms for a Wide Area Space Surveillance System (WASSS)

    NASA Astrophysics Data System (ADS)

    Goldstein, N.; Dressler, R. A.; Richtsmeier, S. S.; McLean, J.; Dao, P. D.; Murray-Krezan, J.; Fulcoly, D. O.

    2013-09-01

    Recent ground testing of a wide area camera system and automated star removal algorithms has demonstrated the potential to detect, quantify, and track deep space objects using small aperture cameras and on-board processors. The camera system, which was originally developed for a space-based Wide Area Space Surveillance System (WASSS), operates in a fixed-stare mode, continuously monitoring a wide swath of space and differentiating celestial objects from satellites based on differential motion across the field of view. It would have greatest utility in a LEO orbit to provide automated and continuous monitoring of deep space with high refresh rates, and with particular emphasis on the GEO belt and GEO transfer space. Continuous monitoring allows a concept of change detection and custody maintenance not possible with existing sensors. The detection approach is equally applicable to Earth-based sensor systems. A distributed system of such sensors, either Earth-based or space-based, could provide automated, persistent night-time monitoring of all of deep space. The continuous monitoring provides a daily record of the light curves of all GEO objects above a certain brightness within the field of view. The daily updates of satellite light curves offer a means to identify specific satellites, to note changes in orientation and operational mode, and to cue other SSA assets for higher resolution queries. The data processing approach may also be applied to larger-aperture, higher resolution camera systems to extend the sensitivity towards dimmer objects. In order to demonstrate the utility of the WASSS system and data processing, a ground-based field test was conducted in October 2012. We report here the results of the observations made at Magdalena Ridge Observatory using the prototype WASSS camera, which has a 4×60° field-of-view, <0.05° resolution, a 2.8 cm2 aperture, and the ability to view within 4° of the sun.
A single camera pointed at the GEO belt provided a continuous night-long record of the intensity and location of more than 50 GEO objects detected within the camera's 60° field of view, with a detection sensitivity near the camera's shot-noise limit of Mv = 13.7. Performance is anticipated to scale with aperture area, allowing the detection of dimmer objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and an image-processing algorithm that exploits the different angular velocities of celestial objects and space objects (SOs). Principal Components Analysis (PCA) is used to filter out all objects moving with the velocity of the celestial frame of reference. The resulting filtered images are projected back into an Earth-centered frame of reference, or into any other relevant frame of reference, and co-added to form a series of images of the GEO objects as a function of time. The PCA approach not only removes the celestial background but also removes systematic variations in system calibration, sensor pointing, and atmospheric conditions. The resulting images are shot-noise limited, and can be exploited to automatically identify deep-space objects, produce approximate state vectors, and track their locations and intensities as a function of time.
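The PCA filtering step described in the abstract can be sketched as follows: frames registered to the celestial frame are stacked, and the leading principal components (which capture everything co-moving with the stars, plus common-mode calibration drifts) are projected out, leaving the movers. This is an illustrative numpy sketch on synthetic data, not the WASSS flight algorithm; the array sizes, noise levels, and number of components are arbitrary choices.

```python
import numpy as np

def pca_background_filter(frames, n_components=3):
    """Project out the leading principal components of a co-registered
    frame stack; anything moving with the celestial frame (stars, slow
    calibration drifts) concentrates there, while objects with a
    different angular velocity survive in the residual."""
    T, H, W = frames.shape
    X = frames.reshape(T, H * W).astype(float)
    Xc = X - X.mean(axis=0)                 # remove the static scene
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components]                   # top spatial eigenimages
    residual = Xc - (Xc @ V.T) @ V          # project out background modes
    return residual.reshape(T, H, W)

# Synthetic demo: static star field plus one drifting point source.
rng = np.random.default_rng(0)
T, H, W = 20, 32, 32
stars = rng.random((H, W)) < 0.02
frames = np.tile(stars * 100.0, (T, 1, 1)) + rng.normal(0.0, 1.0, (T, H, W))
for t in range(T):
    frames[t, 5, 3 + t] += 50.0             # "satellite" moving along a row
filtered = pca_background_filter(frames)    # stars removed, mover remains
```

After filtering, the static stars are suppressed to roughly the noise floor while the drifting source retains most of its signal, which is what allows the filtered frames to be re-projected and co-added as the abstract describes.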

  11. Time-Lapse Motion Picture Technique Applied to the Study of Geological Processes.

    PubMed

    Miller, R D; Crandell, D R

    1959-09-25

Lightweight, battery-operated timers were built and coupled to 16-mm motion-picture cameras having apertures controlled by photoelectric cells. The cameras were placed adjacent to Emmons Glacier on Mount Rainier. The film obtained confirms the view that exterior time-lapse photography can be applied to the study of slow-acting geologic processes.

  12. Multiple-aperture optical design for micro-level cameras using 3D-printing method

    NASA Astrophysics Data System (ADS)

    Peng, Wei-Jei; Hsu, Wei-Yao; Cheng, Yuan-Chieh; Lin, Wen-Lung; Yu, Zong-Ru; Chou, Hsiao-Yu; Chen, Fong-Zhi; Fu, Chien-Chung; Wu, Chong-Syuan; Huang, Chao-Tsung

    2018-02-01

The design of an ultra-miniaturized camera fabricated by 3D printing directly onto a complementary metal-oxide semiconductor (CMOS) imaging sensor is presented in this paper. The 3D-printed micro-optics is manufactured by femtosecond two-photon direct laser writing, whose submicron figure accuracy is suitable for the optical system. Because the size of a micro-level camera is approximately several hundred micrometers, its resolution is greatly reduced and is limited by the Nyquist frequency of the pixel pitch. To improve the resolution, a single lens can be replaced by multiple-aperture lenses with dissimilar fields of view (FOV); stitching sub-images with different FOVs then achieves high resolution within the central region of the image. The reason is that the angular resolution of a lens with a smaller FOV is higher than that of one with a larger FOV, so after stitching the angular resolution of the central area can be several times that of the outer area. For the same image circle, the image quality of the central area of the multi-lens system is significantly superior to that of a single lens. Foveated imaging by FOV stitching breaks the resolution limit of the ultra-miniaturized imaging system, enabling applications such as biomedical endoscopy, optical sensing, and machine vision. In this study, an ultra-miniaturized camera with multi-aperture optics is designed and simulated for optimum optical performance.

  13. Target-Tracking Camera for a Metrology System

    NASA Technical Reports Server (NTRS)

    Liebe, Carl; Bartman, Randall; Chapsky, Jacob; Abramovici, Alexander; Brown, David

    2009-01-01

An analog electronic camera that is part of a metrology system measures the varying direction to a light-emitting diode that serves as a bright point target. In the original application for which the camera was developed, the metrology system is used to determine the varying relative positions of radiating elements of an airborne synthetic-aperture radar (SAR) antenna as the airplane flexes during flight; precise knowledge of the relative positions as a function of time is needed for processing SAR readings. It has been common metrology-system practice to measure the varying direction to a bright target by use of an electronic camera of the charge-coupled-device or active-pixel-sensor type. A major disadvantage of this practice arises from the necessity of reading out and digitizing the outputs from a large number of pixels and processing the resulting digital values in a computer to determine the centroid of a target: because of the time taken by the readout, digitization, and computation, the update rate is limited to tens of hertz. In contrast, the analog nature of the present camera makes it possible to achieve an update rate of hundreds of hertz, and no computer is needed to determine the centroid. The camera is based on a position-sensitive detector (PSD), which is a rectangular photodiode with output contacts at opposite ends. PSDs are usually used in triangulation for measuring small distances. PSDs are manufactured in both one- and two-dimensional versions. Because it is very difficult to calibrate two-dimensional PSDs accurately, the focal-plane sensors used in this camera are two orthogonally mounted one-dimensional PSDs.
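The PSD readout that enables the high update rate reduces to a ratio of two photocurrents rather than a pixel scan. A minimal sketch of the standard one-dimensional PSD position formula (the 10 mm detector length is a made-up illustrative value, not a parameter from this record):

```python
# One-dimensional PSD: the spot position follows from the ratio of the
# photocurrents at the two end contacts -- no pixel readout or centroid
# computation, which is why an analog implementation can update at
# hundreds of hertz. The 10 mm length is an invented example value.
def psd_position(i1, i2, length_mm=10.0):
    """Spot position (mm from detector center) from end-contact currents."""
    return (length_mm / 2.0) * (i2 - i1) / (i1 + i2)

# Equal currents -> spot at center; all current at one end -> spot at edge.
center = psd_position(1.0, 1.0)   # 0.0 mm
edge = psd_position(0.0, 1.0)     # +5.0 mm
```

In the analog camera this division is performed by circuitry rather than software, which is the design choice the abstract contrasts with CCD/APS centroiding.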

  14. Time-of-Flight Microwave Camera

    PubMed Central

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598

  15. Time-of-Flight Microwave Camera

    NASA Astrophysics Data System (ADS)

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-01

Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
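The quoted numbers can be checked directly: 200 ps of time resolution corresponds to about 6 cm of optical path in free space, and an FMCW sweep across the full X band (bandwidth B = 4 GHz) gives the standard range resolution c/(2B). The c/(2B) formula is the textbook FMCW result, not a figure taken from the paper:

```python
# Sanity check of the abstract's numbers: 200 ps is ~6 cm of path in
# free space; sweeping 8-12 GHz (B = 4 GHz) gives the standard FMCW
# range resolution c / (2B) for the two-way path.
c = 299_792_458.0            # speed of light, m/s
dt = 200e-12                 # time resolution, s
path_m = c * dt              # ~0.06 m, i.e. ~6 cm of free-space path
B = 4e9                      # FMCW sweep bandwidth, Hz
range_res_m = c / (2 * B)    # ~0.0375 m range resolution
```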

  16. Development of a Compact & Easy-to-Use 3-D Camera for High Speed Turbulent Flow Fields

    DTIC Science & Technology

    2013-12-05

    resolved. Also, in the case of a single camera system, the use of an aperture greatly reduces the amount of collected light. The combination of these...a study on wall-bounded turbulence [Sheng_2006]. Nevertheless, these techniques are limited to small measurement volumes, while maintaining a high...It has also been adapted to kHz rates using high-speed cameras for aeroacoustic studies (see Violato et al. [17, 18]. Tomo-PIV, however, has some

  17. Two degree of freedom camera mount

    NASA Technical Reports Server (NTRS)

    Ambrose, Robert O. (Inventor)

    2003-01-01

    A two degree of freedom camera mount. The camera mount includes a socket, a ball, a first linkage and a second linkage. The socket includes an interior surface and an opening. The ball is positioned within an interior of the socket. The ball includes a coupling point for rotating the ball relative to the socket and an aperture for mounting a camera. The first and second linkages are rotatably connected to the socket and slidably connected to the coupling point of the ball. Rotation of the linkages with respect to the socket causes the ball to rotate with respect to the socket.

  18. Focusing and depth of field in photography: application in dermatology practice.

    PubMed

    Taheri, Arash; Yentzer, Brad A; Feldman, Steven R

    2013-11-01

    Conventional photography obtains a sharp image of objects within a given 'depth of field'; objects not within the depth of field are out of focus. In recent years, digital photography revolutionized the way pictures are taken, edited, and stored. However, digital photography does not result in a deeper depth of field or better focusing. In this article, we briefly review the concept of depth of field and focus in photography as well as new technologies in this area. A deep depth of field is used to have more objects in focus; a shallow depth of field can emphasize a subject by blurring the foreground and background objects. The depth of field can be manipulated by adjusting the aperture size of the camera, with smaller apertures increasing the depth of field at the cost of lower levels of light capture. Light-field cameras are a new generation of digital cameras that offer several new features, including the ability to change the focus on any object in the image after taking the photograph. Understanding depth of field and camera technology helps dermatologists to capture their subjects in focus more efficiently. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
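The aperture/depth-of-field trade-off described above can be made concrete with the standard thin-lens hyperfocal-distance approximation; the formulas and the 0.03 mm circle of confusion are textbook values, not taken from this article:

```python
# Thin-lens depth-of-field sketch via the hyperfocal distance H:
#   H = f^2 / (N * c) + f
#   near = s * (H - f) / (H + s - 2f)
#   far  = s * (H - f) / (H - s)   (infinite when s >= H)
# f: focal length, N: f-number, s: subject distance, c: circle of
# confusion (0.03 mm is a common full-frame value). All in millimeters.
def depth_of_field(f, N, s, coc=0.03):
    H = f * f / (N * coc) + f
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

# Stopping down from f/2.8 to f/11 widens the in-focus zone, while
# admitting roughly 16x less light, as the article notes.
n1, f1 = depth_of_field(50.0, 2.8, 1000.0)   # 50 mm lens, subject at 1 m
n2, f2 = depth_of_field(50.0, 11.0, 1000.0)
```

Running the two cases shows the f/11 in-focus zone spanning several times the f/2.8 zone around the same subject distance, which is exactly the trade-off a photographer manipulates with the aperture ring.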

  19. Polarized fluorescence for skin cancer diagnostic with a multi-aperture camera

    NASA Astrophysics Data System (ADS)

    Kandimalla, Haripriya; Ramella-Roman, Jessica C.

    2008-02-01

Polarized fluorescence has shown some promising results in the assessment of skin cancer margins. Researchers have used tetracycline and cross-polarization imaging for nonmelanoma skin cancer demarcation, as well as investigating endogenous skin polarized fluorescence. In this paper we present a new instrument for polarized fluorescence imaging, able to calculate the full fluorescence Stokes vector in one snapshot. The core of our system is a multi-aperture camera constructed with a two-by-two lenslet array. Three of the lenses have polarizing elements in front of them, oriented at 0°, +45°, and 90° with respect to the light source polarization. A flash lamp combined with a polarizer parallel to the source-camera-sample plane and a UV filter is used as an excitation source. A blue filter in front of the camera system is used to collect only the fluorescent emission of interest and filter out the incident light. In-vitro tests of endogenous and exogenous polarized fluorescence on collagen-rich material such as bovine tendon were performed, and the Stokes vector of the polarized fluorescence was calculated. The system has the advantage of eliminating motion artifacts by collecting the different polarization states and the Stokes vector in a single snapshot.

  20. Imaging through strong turbulence with a light field approach.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C

    2016-05-30

Under strong turbulence conditions, an object's image can be severely distorted and unrecognizable throughout the observing time. Conventional image-restoration algorithms do not perform effectively in these circumstances due to the loss of good references on the object. We propose using a plenoptic sensor as a light field camera to map a conventional camera image onto a cell-image array in the image's sub-angular spaces. Accordingly, each cell image on the plenoptic sensor is equivalent to the image acquired by a sub-aperture of the imaging lens. The wavefront distortion over the lens aperture can be analyzed by comparing cell images in the plenoptic sensor. By using a modified "Laplacian" metric, we can identify a good cell image in a plenoptic image sequence. The good cell image corresponds to the time and sub-aperture area on the imaging lens where the wavefront distortion becomes relatively and momentarily "flat". As a result, it reveals features of the object that would be severely distorted in normal cameras. In this paper, we introduce the underlying physical principles and mechanisms of our approach and experimentally demonstrate its effectiveness under strong turbulence conditions. In application, our approach can provide a good reference for conventional image-restoration approaches under strong turbulence conditions, and can also be used as an independent device to perform object-recognition tasks through severe turbulence distortions.

  1. Aperture shape dependencies in extended depth of focus for imaging camera by wavefront coding

    NASA Astrophysics Data System (ADS)

    Sakita, Koichi; Ohta, Mitsuhiko; Shimano, Takeshi; Sakemoto, Akito

    2015-02-01

Optical transfer functions (OTFs) along various directional spatial-frequency axes are investigated for a cubic phase mask (CPM) with circular and square apertures. Although the OTF has no zero points, for a circular aperture it comes very close to zero at low frequencies on the diagonal axis, which degrades the restored images. The reason for the close-to-zero OTF values is analyzed in connection with point-spread-function profiles using the Fourier slice theorem. To avoid the close-to-zero condition, a square aperture with a CPM is indispensable in wavefront coding (WFC). We optimized the cubic coefficient α of the CPM and the coefficients of the digital filter, and obtained excellent de-blurred images over a large depth of field.
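The square-aperture case can be reproduced numerically: the pupil is a square aperture carrying the cubic phase exp(iα(x³ + y³)), the PSF is the squared magnitude of its Fourier transform, and the OTF is the Fourier transform of the PSF. A hedged sketch with an arbitrary cubic coefficient, not the optimized α from the paper:

```python
import numpy as np

# Square aperture with cubic phase mask: pupil = exp(i*alpha*(x^3 + y^3)).
# PSF = |FFT(pupil)|^2, OTF = FFT(PSF). alpha = 20 is an arbitrary
# illustrative value, not the optimized coefficient from the paper.
n = 128
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x)
alpha = 20.0
pupil = np.exp(1j * alpha * (X**3 + Y**3))                # unit square aperture

psf = np.abs(np.fft.fft2(pupil, s=(2 * n, 2 * n))) ** 2   # zero-padded FFT
otf = np.abs(np.fft.fft2(psf))
mtf = otf / otf[0, 0]                                     # normalize to DC

# The square-aperture MTF stays clearly above zero at low frequencies on
# the diagonal axis -- the condition the paper says a circular aperture
# fails to satisfy.
diag_val = mtf[10, 10]
```

A fuller reproduction would repeat the computation with a circular pupil and compare the diagonal-axis OTF profiles, following the paper's analysis.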

  2. Forming images with thermal neutrons

    NASA Astrophysics Data System (ADS)

    Vanier, Peter E.; Forman, Leon

    2003-01-01

Thermal neutrons passing through air have scattering lengths of about 20 meters. At greater distances, the majority of neutrons emanating from a moderated source will scatter multiple times in the air before being detected, and will not retain information about the location of the source, except that their density will fall off somewhat faster than 1/r². However, a significant fraction of the neutrons will travel 20 meters or more without scattering and can be used to create an image of the source. A few years ago, a proof-of-principle "camera" was demonstrated that could produce images of a scene containing sources of thermalized neutrons and could locate a source comparable in strength to an improvised nuclear device at ranges over 60 meters. The instrument makes use of a coded aperture with a uniformly redundant array of openings, analogous to those used in x-ray and gamma cameras. The detector is a position-sensitive He-3 proportional chamber, originally used for neutron diffraction. A neutron camera has many features in common with those designed for non-focusable photons, as well as some important differences. Potential applications include detecting nuclear smuggling, locating non-metallic land mines, assaying nuclear waste, and surveying for health-physics purposes.
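The coded-aperture principle behind such a camera is easy to demonstrate numerically: the detector records the source distribution convolved with the mask pattern, and correlating the recording with a decoding array recovers the source. A true uniformly redundant array has ideal (flat-sidelobe) correlation properties; in this sketch a random binary mask stands in for it, so the sidelobes are merely small rather than exactly flat:

```python
import numpy as np

# Detector = scene (*) mask (circular convolution); reconstruction =
# detector correlated with a decoding array. A 17x17 random binary mask
# stands in for a true uniformly redundant array here.
rng = np.random.default_rng(1)
mask = (rng.random((17, 17)) < 0.5).astype(float)
decoder = 2.0 * mask - 1.0                  # balanced decoding array

scene = np.zeros((17, 17))
scene[8, 8] = 1.0                           # one point source

F = np.fft.fft2
detector = np.real(np.fft.ifft2(F(scene) * F(mask)))              # encode
recon = np.real(np.fft.ifft2(F(detector) * np.conj(F(decoder))))  # decode
peak = np.unravel_index(np.argmax(recon), recon.shape)            # source location
```

The correlation peak lands at the source position (8, 8), illustrating how a position-sensitive detector behind a mask can localize a neutron source without any focusing optics.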

  3. Light-efficient photography.

    PubMed

    Hasinoff, Samuel W; Kutulakos, Kiriakos N

    2011-11-01

    In this paper, we consider the problem of imaging a scene with a given depth of field at a given exposure level in the shortest amount of time possible. We show that by 1) collecting a sequence of photos and 2) controlling the aperture, focus, and exposure time of each photo individually, we can span the given depth of field in less total time than it takes to expose a single narrower-aperture photo. Using this as a starting point, we obtain two key results. First, for lenses with continuously variable apertures, we derive a closed-form solution for the globally optimal capture sequence, i.e., that collects light from the specified depth of field in the most efficient way possible. Second, for lenses with discrete apertures, we derive an integer programming problem whose solution is the optimal sequence. Our results are applicable to off-the-shelf cameras and typical photography conditions, and advocate the use of dense, wide-aperture photo sequences as a light-efficient alternative to single-shot, narrow-aperture photography.

  4. Spatial imaging of UV emission from Jupiter and Saturn

    NASA Technical Reports Server (NTRS)

    Clarke, J. T.; Moos, H. W.

    1981-01-01

    Spatial imaging with the IUE is accomplished both by moving one of the apertures in a series of exposures and within the large aperture in a single exposure. The image of the field of view subtended by the large aperture is focussed directly onto the detector camera face at each wavelength; since the spatial resolution of the instrument is 5 to 6 arc sec and the aperture extends 23.0 by 10.3 arc sec, imaging both parallel and perpendicular to dispersion is possible in a single exposure. The correction for the sensitivity variation along the slit at 1216 A is obtained from exposures of diffuse geocoronal H Ly alpha emission. The relative size of the aperture superimposed on the apparent discs of Jupiter and Saturn in typical observation is illustrated. By moving the planet image 10 to 20 arc sec along the major axis of the aperture (which is constrained to point roughly north-south) maps of the discs of these planets are obtained with 6 arc sec spatial resolution.

  5. Large-mirror testing facility at the National Optical Astronomy Observatories.

    NASA Astrophysics Data System (ADS)

    Barr, L. D.; Coudé du Foresto, V.; Fox, J.; Poczulp, G. A.; Richardson, J.; Roddier, C.; Roddier, F.

    1991-09-01

    A method for testing the surfaces of large mirrors has been developed to be used even when conditions of vibration and thermal turbulence in the light path cannot be eliminated. The full aperture of the mirror under test is examined by means of a scatterplate interferometer that has the property of being a quasi-common-path method, although any means for obtaining interference fringes will do. The method uses a remotely operated CCD camera system to record the fringe pattern from the workpiece. The typical test is done with a camera exposure of about a millisecond to "freeze" the fringe pattern on the detector. Averaging up to 10 separate exposures effectively eliminates the turbulence effects. The method described provides the optician with complete numerical information and visual plots for the surface under test and the diffracted image the method will produce, all within a few minutes, to an accuracy of 0.01 μm measured peak-to-valley.
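The freeze-and-average strategy can be illustrated with synthetic data: each millisecond exposure carries an independent random turbulence error, and averaging roughly ten of them reduces the random component by about 1/√N. The surface shape and noise level below are invented for illustration, not measurements from the facility:

```python
import numpy as np

# Each short exposure "freezes" an independent turbulence error; the
# mean of N such frames shrinks the random error by ~1/sqrt(N).
# Surface and noise values are invented (units: micrometers).
rng = np.random.default_rng(2)
true_surface = 0.05 * np.outer(np.hanning(64), np.hanning(64))
N = 10
frames = [true_surface + rng.normal(0.0, 0.02, true_surface.shape)
          for _ in range(N)]
averaged = np.mean(frames, axis=0)

err_single = (frames[0] - true_surface).std()   # ~0.02 um per frame
err_avg = (averaged - true_surface).std()       # ~0.02 / sqrt(10) um
```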

  6. Photometric Characterization of the Dark Energy Camera

    DOE PAGES

    Bernstein, G. M.; Abbott, T. M. C.; Armstrong, R.; ...

    2018-04-02

We characterize the variation in photometric response of the Dark Energy Camera (DECam) across its 520 Mpix science array during 4 years of operation. These variations are measured using high signal-to-noise aperture photometry of >10^7 stellar images in thousands of exposures of a few selected fields, with the telescope dithered to move the sources around the array. A calibration procedure based on these results brings the rms variation in aperture magnitudes of bright stars on cloudless nights down to 2–3 mmag, with <1 mmag of correlated photometric errors for stars separated by ≥20''. On cloudless nights, any departures of the exposure zeropoints from a secant airmass law exceeding 1 mmag are plausibly attributable to spatial/temporal variations in aperture corrections. These variations can be inferred and corrected by measuring the fraction of stellar light in an annulus between 6'' and 8'' diameter. Key elements of this calibration include: correction of amplifier nonlinearities; distinguishing pixel-area variations and stray light from quantum-efficiency variations in the flat fields; field-dependent color corrections; and the use of an aperture-correction proxy. The DECam response pattern across the 2° field drifts over months by up to ±9 mmag, in a nearly wavelength-independent low-order pattern. We find no fundamental barriers to pushing global photometric calibrations toward mmag accuracy.

  7. Astrometric Calibration and Performance of the Dark Energy Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernstein, G. M.; Armstrong, R.; Plazas, A. A.

    2017-05-30

We characterize the variation in photometric response of the Dark Energy Camera (DECam) across its 520 Mpix science array during 4 years of operation. These variations are measured using high signal-to-noise aperture photometry of >10^7 stellar images in thousands of exposures of a few selected fields, with the telescope dithered to move the sources around the array. A calibration procedure based on these results brings the RMS variation in aperture magnitudes of bright stars on cloudless nights down to 2–3 mmag, with <1 mmag of correlated photometric errors for stars separated by ≥20". On cloudless nights, any departures of the exposure zeropoints from a secant airmass law exceeding 1 mmag are plausibly attributable to spatial/temporal variations in aperture corrections. These variations can be inferred and corrected by measuring the fraction of stellar light in an annulus between 6" and 8" diameter. Key elements of this calibration include: correction of amplifier nonlinearities; distinguishing pixel-area variations and stray light from quantum-efficiency variations in the flat fields; field-dependent color corrections; and the use of an aperture-correction proxy. The DECam response pattern across the 2-degree field drifts over months by up to ±7 mmag, in a nearly wavelength-independent low-order pattern. We find no fundamental barriers to pushing global photometric calibrations toward mmag accuracy.

  8. Photometric Characterization of the Dark Energy Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernstein, G. M.; Abbott, T. M. C.; Armstrong, R.

We characterize the variation in photometric response of the Dark Energy Camera (DECam) across its 520 Mpix science array during 4 years of operation. These variations are measured using high signal-to-noise aperture photometry of >10^7 stellar images in thousands of exposures of a few selected fields, with the telescope dithered to move the sources around the array. A calibration procedure based on these results brings the rms variation in aperture magnitudes of bright stars on cloudless nights down to 2–3 mmag, with <1 mmag of correlated photometric errors for stars separated by ≥20''. On cloudless nights, any departures of the exposure zeropoints from a secant airmass law exceeding 1 mmag are plausibly attributable to spatial/temporal variations in aperture corrections. These variations can be inferred and corrected by measuring the fraction of stellar light in an annulus between 6'' and 8'' diameter. Key elements of this calibration include: correction of amplifier nonlinearities; distinguishing pixel-area variations and stray light from quantum-efficiency variations in the flat fields; field-dependent color corrections; and the use of an aperture-correction proxy. The DECam response pattern across the 2° field drifts over months by up to ±9 mmag, in a nearly wavelength-independent low-order pattern. Here, we find no fundamental barriers to pushing global photometric calibrations toward mmag accuracy.

  9. Photometric Characterization of the Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Bernstein, G. M.; Abbott, T. M. C.; Armstrong, R.; Burke, D. L.; Diehl, H. T.; Gruendl, R. A.; Johnson, M. D.; Li, T. S.; Rykoff, E. S.; Walker, A. R.; Wester, W.; Yanny, B.

    2018-05-01

We characterize the variation in photometric response of the Dark Energy Camera (DECam) across its 520 Mpix science array during 4 years of operation. These variations are measured using high signal-to-noise aperture photometry of >10^7 stellar images in thousands of exposures of a few selected fields, with the telescope dithered to move the sources around the array. A calibration procedure based on these results brings the rms variation in aperture magnitudes of bright stars on cloudless nights down to 2–3 mmag, with <1 mmag of correlated photometric errors for stars separated by ≥20″. On cloudless nights, any departures of the exposure zeropoints from a secant airmass law exceeding 1 mmag are plausibly attributable to spatial/temporal variations in aperture corrections. These variations can be inferred and corrected by measuring the fraction of stellar light in an annulus between 6″ and 8″ diameter. Key elements of this calibration include: correction of amplifier nonlinearities; distinguishing pixel-area variations and stray light from quantum-efficiency variations in the flat fields; field-dependent color corrections; and the use of an aperture-correction proxy. The DECam response pattern across the 2° field drifts over months by up to ±9 mmag, in a nearly wavelength-independent low-order pattern. We find no fundamental barriers to pushing global photometric calibrations toward mmag accuracy.

  10. Double Star Measurements at the Southern Sky with 50 cm Reflectors and Fast CCD Cameras in 2012

    NASA Astrophysics Data System (ADS)

    Anton, Rainer

    2014-07-01

    A Cassegrain and a Ritchey-Chrétien reflector, both with 50 cm aperture, were used in Namibia for recordings of double stars with fast CCD cameras and a notebook computer. From superposition of "lucky images", measurements of 39 double and multiple systems were obtained and compared with literature data. Occasional deviations are discussed. Images of some remarkable systems are also presented.

  11. Camera array based light field microscopy

    PubMed Central

    Lin, Xing; Wu, Jiamin; Zheng, Guoan; Dai, Qionghai

    2015-01-01

This paper proposes a novel approach for high-resolution light field microscopy imaging by using a camera array. In this approach, we apply a two-stage relay system for expanding the aperture plane of the microscope into the size of an imaging lens array, and utilize a sensor array for acquiring the different sub-aperture images formed by the corresponding imaging lenses. By combining the rectified and synchronized images from 5 × 5 viewpoints with our prototype system, we successfully recovered color light field videos for various fast-moving microscopic specimens with a spatial resolution of 0.79 megapixels at 30 frames per second, corresponding to an unprecedented data throughput of 562.5 MB/s for light field microscopy. We also demonstrated the use of the reported platform for different applications, including post-capture refocusing, phase reconstruction, 3D imaging, and optical metrology. PMID:26417490

  12. Calibration of scintillation-light filters for neutron time-of-flight spectrometers at the National Ignition Facility.

    PubMed

    Sayre, D B; Barbosa, F; Caggiano, J A; DiPuccio, V N; Eckart, M J; Grim, G P; Hartouni, E P; Hatarik, R; Weber, F A

    2016-11-01

    Sixty-four neutral density filters constructed of metal plates with 88 apertures of varying diameter have been radiographed with a soft x-ray source and CCD camera at National Security Technologies, Livermore. An analysis of the radiographs fits the radial dependence of the apertures' image intensities to sigmoid functions, which can describe the rapidly decreasing intensity towards the apertures' edges. The fitted image intensities determine the relative attenuation value of each filter. Absolute attenuation values of several imaged filters, measured in situ during calibration experiments, normalize the relative quantities which are now used in analyses of neutron spectrometer data at the National Ignition Facility.
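The sigmoid fit to an aperture's radial intensity falloff can be sketched as follows. For a logistic profile I(r) = I0 / (1 + exp((r − r0)/w)) with a known plateau I0, a logit transform makes the fit a straight line in r. The profile parameters below are invented; a real analysis would also fit I0 and handle noise:

```python
import numpy as np

# Logistic edge profile I(r) = I0 / (1 + exp((r - r0)/w)). With the
# plateau I0 known, log(I0/I - 1) = (r - r0)/w is linear in r, so a
# degree-1 polyfit recovers the edge radius r0 and edge width w.
# All parameter values here are invented for illustration.
I0, r0, w = 1000.0, 2.0, 0.15
r = np.linspace(0.5, 3.5, 60)
I = I0 / (1.0 + np.exp((r - r0) / w))

y = np.log(I0 / I - 1.0)                 # logit transform
slope, intercept = np.polyfit(r, y, 1)   # y = r/w - r0/w
w_fit = 1.0 / slope
r0_fit = -intercept / slope
```

Integrating the fitted profile over the aperture then yields the relative attenuation value of the filter, which is the quantity the radiograph analysis extracts.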

  13. Double Star Measurements at the Southern Sky with a 50 cm Reflector and a Fast CCD Camera in 2014

    NASA Astrophysics Data System (ADS)

    Anton, Rainer

    2015-04-01

    A Ritchey-Chrétien reflector with 50 cm aperture was used in Namibia for recordings of double stars with a fast CCD camera and a notebook computer. From superposition of "lucky images", measurements of 91 pairings in 79 double and multiple systems were obtained and compared with literature data. Occasional deviations are discussed. Some images of noteworthy systems are also presented.

  14. Research into a Single-aperture Light Field Camera System to Obtain Passive Ground-based 3D Imagery of LEO Objects

    NASA Astrophysics Data System (ADS)

    Bechis, K.; Pitruzzello, A.

    2014-09-01

    This presentation describes our ongoing research into using a ground-based light field camera to obtain passive, single-aperture 3D imagery of LEO objects. Light field cameras are an emerging and rapidly evolving technology for passive 3D imaging with a single optical sensor. The cameras use an array of lenslets placed in front of the camera focal plane, which provides angle of arrival information for light rays originating from across the target, allowing range to target and 3D image to be obtained from a single image using monocular optics. The technology, which has been commercially available for less than four years, has the potential to replace dual-sensor systems such as stereo cameras, dual radar-optical systems, and optical-LIDAR fused systems, thus reducing size, weight, cost, and complexity. We have developed a prototype system for passive ranging and 3D imaging using a commercial light field camera and custom light field image processing algorithms. Our light field camera system has been demonstrated for ground-target surveillance and threat detection applications, and this paper presents results of our research thus far into applying this technology to the 3D imaging of LEO objects. The prototype 3D imaging camera system developed by Northrop Grumman uses a Raytrix R5 C2GigE light field camera connected to a Windows computer with an nVidia graphics processing unit (GPU). The system has a frame rate of 30 Hz, and a software control interface allows for automated camera triggering and light field image acquisition to disk. Custom image processing software then performs the following steps: (1) image refocusing, (2) change detection, (3) range finding, and (4) 3D reconstruction. In Step (1), a series of 2D images are generated from each light field image; the 2D images can be refocused at up to 100 different depths. Currently, steps (1) through (3) are automated, while step (4) requires some user interaction. 
A key requirement for light field camera operation is that the target must be within the near-field (Fraunhofer distance) of the collecting optics. For example, in visible light the near-field of a 1-m telescope extends out to about 3,500 km, while the near-field of the AEOS telescope extends out over 46,000 km. For our initial proof of concept, we have integrated our light field camera with a 14-inch Meade LX600 advanced coma-free telescope, to image various surrogate ground targets at up to tens of kilometers range. Our experiments with the 14-inch telescope have assessed factors and requirements that are traceable and scalable to a larger-aperture system that would have the near-field distance needed to obtain 3D images of LEO objects. The next step would be to integrate a light field camera with a 1-m or larger telescope and evaluate its 3D imaging capability against LEO objects. 3D imaging of LEO space objects with light field camera technology can potentially provide a valuable new tool for space situational awareness, especially for those situations where laser or radar illumination of the target objects is not feasible.

  15. Laboratory demonstration of image reconstruction for coherent optical system of modular imaging collectors (COSMIC)

    NASA Technical Reports Server (NTRS)

    Traub, W. A.

    1984-01-01

    The first physical demonstration of the principle of image reconstruction using a set of images from a diffraction-blurred elongated aperture is reported. This is an optical validation of previous theoretical and numerical simulations of the COSMIC telescope array (coherent optical system of modular imaging collectors). The present experiment utilizes 17 diffraction blurred exposures of a laboratory light source, as imaged by a lens covered by a narrow-slit aperture; the aperture is rotated 10 degrees between each exposure. The images are recorded in digitized form by a CCD camera, Fourier transformed, numerically filtered, and added; the sum is then filtered and inverse Fourier transformed to form the final image. The image reconstruction process is found to be stable with respect to uncertainties in values of all physical parameters such as effective wavelength, rotation angle, pointing jitter, and aperture shape. Future experiments will explore the effects of low counting rates, autoguiding on the image, various aperture configurations, and separated optics.
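    The transform-filter-sum-inverse pipeline above can be sketched as follows. This toy version (assumed strip width, image size, and a synthetic source) only illustrates the principle that each rotated narrow-slit exposure contributes one strip of the Fourier plane, and summing the filtered transforms over rotations fills the plane:

```python
import numpy as np

def strip_mask(n, angle_deg, width):
    """Fourier-plane strip passed by a narrow-slit aperture at a given rotation."""
    fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
    t = np.deg2rad(angle_deg)
    # distance of each spatial frequency from the line through the origin at angle t
    d = np.abs(-np.sin(t) * fx + np.cos(t) * fy)
    return (d < width).astype(float)

n = 64
obj = np.zeros((n, n))
obj[28:36, 28:36] = 1.0                 # toy extended source
F = np.fft.fft2(obj)

acc = np.zeros((n, n), dtype=complex)   # sum of filtered transforms
cov = np.zeros((n, n))                  # Fourier-plane coverage count
for k in range(17):                     # 17 exposures, rotated 10 deg apart
    m = strip_mask(n, 10 * k, 0.03)
    acc += F * m
    cov += m

# normalize by coverage (strips overlap near the origin), then invert
recon = np.fft.ifft2(acc / np.maximum(cov, 1)).real
```

The reconstruction is band-limited by the strips' coverage but recovers the source's position and extent, which is the behavior the laboratory experiment validates.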

  16. 3D SAPIV particle field reconstruction method based on adaptive threshold.

    PubMed

    Qu, Xiangju; Song, Yang; Jin, Ying; Li, Zhenhua; Wang, Xuezhen; Guo, ZhenYan; Ji, Yunjing; He, Anzhi

    2018-03-01

    Particle image velocimetry (PIV) is an essential flow field diagnostic technique that provides instantaneous velocimetry information non-intrusively. Three-dimensional (3D) PIV methods can supply a full understanding of the 3D structure, the complete stress tensor, and the vorticity vector in complex flows. In synthetic aperture particle image velocimetry (SAPIV), the flow field can be measured at high particle densities by multiple cameras imaging from the same direction. During SAPIV particle reconstruction, particles are commonly reconstructed by manually setting a threshold to filter out unfocused particles in the refocused images. In this paper, the particle intensity distribution in refocused images is analyzed, and a SAPIV particle field reconstruction method based on an adaptive threshold is presented. By using the adaptive threshold to filter the 3D measurement volume as a whole, the three-dimensional locations of the focused particles can be reconstructed. The cross correlations between the images captured by the cameras and the images projected from the reconstructed particle field are calculated for different threshold values. The optimal threshold is determined by cubic curve fitting and is defined as the threshold value at which the correlation coefficient reaches its maximum. A numerical simulation of a 16-camera array and a particle field at two adjacent time instants quantitatively evaluates the performance of the proposed method. An experimental system consisting of an array of 16 cameras was used to reconstruct four adjacent frames in a vortex flow field. The results show that the proposed reconstruction method can effectively reconstruct the 3D particle fields.
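    The threshold-selection step described above can be sketched as follows; the correlation samples here are synthetic stand-ins for the measured camera-vs-projection correlation coefficients:

```python
import numpy as np

def optimal_threshold(thresholds, correlations):
    """Fit a cubic to (threshold, correlation) samples and return the threshold
    at which the fitted curve is maximal, as in the adaptive SAPIV method
    described above (illustrative re-implementation, not the authors' code)."""
    coeffs = np.polyfit(thresholds, correlations, 3)
    dense = np.linspace(min(thresholds), max(thresholds), 1001)
    fit = np.polyval(coeffs, dense)
    return dense[np.argmax(fit)]

# Synthetic example: correlation peaks near a threshold of 0.4
t = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
c = -(t - 0.4) ** 2 + 0.9        # parabolic stand-in for measured values
t_opt = optimal_threshold(t, c)
```

The cubic fit smooths sampling noise in the correlation curve, so the maximum of the fit is a more stable estimate than the best raw sample.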

  17. Teaching Fraunhofer diffraction via experimental and simulated images in the laboratory

    NASA Astrophysics Data System (ADS)

    Peinado, Alba; Vidal, Josep; Escalera, Juan Carlos; Lizana, Angel; Campos, Juan; Yzuel, Maria

    2012-10-01

    Diffraction is an important phenomenon introduced to Physics university students in a subject of Fundamentals of Optics. In addition, in the Physics Degree syllabus of the Universitat Autònoma de Barcelona, there is an elective subject in Applied Optics. In this subject, diverse diffraction concepts are discussed in depth from different points of view: theory, experiments in the laboratory, and computing exercises. In this work, we have focused on the process of teaching Fraunhofer diffraction through laboratory training. Our approach involves students working in small groups. They visualize and acquire some important diffraction patterns with a CCD camera, such as those produced by a slit, a circular aperture or a grating. First, each group calibrates the CCD camera; that is to say, they obtain the relation between distances in the diffraction plane, in millimeters, and on the computer screen, in pixels. Afterwards, they measure the significant distances in the diffraction patterns and, using the appropriate diffraction formalism, they calculate the size of the analyzed apertures. Concomitantly, students grasp the convolution theorem in the Fourier domain by analyzing the diffraction of 2-D gratings of elemental apertures. Finally, the learners use specific software to simulate diffraction patterns of different apertures. They can control several parameters: shape, size and number of apertures, 1-D or 2-D gratings, wavelength, lens focal length, or pixel size. Therefore, the program allows them to reproduce the images obtained experimentally, and to generate others by changing certain parameters. This software has been created in our research group, and it is freely distributed to the students in order to support their learning of diffraction. We have observed that these hands-on experiments help students to consolidate their theoretical knowledge of diffraction in a pedagogical and stimulating learning process.
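    The aperture-size calculation the students perform can be sketched for the single-slit case. Assuming the pattern is formed at the focal plane of a lens of focal length f, the minima of a slit of width a fall at x_m = m * lambda * f / a, so the slit width follows from the spacing of adjacent minima and the camera calibration; the numerical values below are illustrative:

```python
def slit_width(minima_spacing_px, mm_per_px, wavelength_mm, focal_mm):
    """Slit width a from the spacing of adjacent single-slit diffraction minima:
    x_m = m * lambda * f / a  =>  a = lambda * f / (spacing in mm).
    mm_per_px is the students' CCD calibration factor."""
    spacing_mm = minima_spacing_px * mm_per_px
    return wavelength_mm * focal_mm / spacing_mm

# HeNe laser (632.8 nm), 500 mm lens, minima 40 px apart at 0.02 mm/px
a_mm = slit_width(40, 0.02, 632.8e-6, 500)   # about 0.40 mm
```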

  18. TU-F-CAMPUS-T-03: A Novel Iris Quality Assurance Phantom for the CyberKnife Radiosurgery System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Descovich, M; Pinnaduwage, D; Sudhyadhom, A

    Purpose: A novel CCD camera and conical scintillator based phantom that is capable of measuring the targeting and field size accuracy of a robotic radiosurgery system has been developed. This work investigates its application in measuring the field sizes and beam divergence of the CyberKnife variable aperture collimator (Iris). Methods: The phantom was placed on the treatment couch and the robot position was adjusted to obtain an anterior-posterior beam perpendicular to the cone's central axis. The FWHMs of the 12 Iris apertures (5, 7.5, 10, 12.5, 15, 20, 25, 30, 35, 40, 50, and 60 mm) were measured from the beam flux map on the conical scintillator surface as seen by the CCD camera. For each measurement 30 MU were delivered to the phantom at a dose rate of 1000 MU/min. The measurements were repeated at 4 SAD distances between 75 and 85 cm. These readings were used to project the aperture size as if the flux map on the scintillator were located 80 cm from the source (SSD). These projected FWHM beam diameters were then compared to the commissioning data. Results: A series of 12 beam divergence equations were obtained from the 4 sets of data using linear trend lines in Excel scatter plots. These equations were then used to project the FWHM measurements at 80 cm SSD. The average aperture accuracy for beams from 5 through 40 mm was 0.08 mm. The accuracies for the 50 and 60 mm beams were 0.33 and 0.58 mm when compared to film commissioning data. Conclusion: The experimental results for 10 apertures agree with the stated Iris accuracy of ±0.2 mm at 80 cm SAD. The results for the 50 and 60 mm apertures were repeatable and can serve as a reliable trend indicator of any deviations away from the commissioning values. Brett Nelson is President/CTO of Logos Systems.
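    The trend-line projection to 80 cm reduces to a linear fit of FWHM against source distance, evaluated at the reference distance. A minimal sketch with synthetic data (the study used Excel scatter-plot trend lines; the numbers below are illustrative):

```python
import numpy as np

def project_fwhm(distances_cm, fwhm_mm, target_cm=80.0):
    """Linear fit of measured FWHM vs. source distance, evaluated at the
    80 cm reference distance, mirroring the trend-line projection above."""
    slope, intercept = np.polyfit(distances_cm, fwhm_mm, 1)
    return slope * target_cm + intercept

# Synthetic divergent beam: a 60 mm aperture at 80 cm scales as d / 80
d = np.array([75.0, 78.0, 82.0, 85.0])
f = 60.0 * d / 80.0
fwhm_80 = project_fwhm(d, f)
```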

  19. Mirror Illumination and Spillover Measurements of the Atacama Cosmology Telescope

    NASA Technical Reports Server (NTRS)

    Gallardo, Patricio; Dunner, Rolando; Wollack, Ed; Jerez-Hanckes, Carlos

    2012-01-01

    The Atacama Cosmology Telescope (ACT) is a 6 m telescope designed to map the Cosmic Microwave Background (CMB) simultaneously at 145 GHz, 220 GHz and 280 GHz. The receiver in ACT, the Millimeter Bolometer Array Camera, features 1000 TES bolometers in each band. The detector performance depends critically on the total optical loading, requiring the spillover contributions from the optics to be minimal. This inspired the use of a cold Lyot stop to limit the illumination of the primary and the use of guard rings surrounding the primary and secondary reflectors. Here, we present a direct measurement of the illumination aperture for both reflectors and of the attenuation level outside the main optical path. We used a 145 GHz, 1 mW source and a chopper wheel to produce a time-varying signal with a broad beam profile. We sampled the response of the camera for different locations of the source, placed in front of and beside the primary and secondary mirrors. The aperture of the primary was measured to be 5.72 ± 0.17 m in diameter (95 ± 3% of its geometrical size), while the aperture of the secondary yielded 2.00 ± 0.12 m in diameter. Both apertures are consistent with the optical design. Comparing to previous measurements of the beam solid angle from planet observations, we estimate an optical efficiency of 72.3 ± 4.8%. We found that the attenuation outside the primary aperture was -16 ± 2 dB, which is below theoretical expectations, and -22 ± 1 dB outside the secondary aperture, which is consistent with simulations. These results motivated the extension of the baffles surrounding the secondary mirror, with the consequent reduction in detector optical loading from 2.24 pW to 1.88 pW.

  20. Applying and extending ISO/TC42 digital camera resolution standards to mobile imaging products

    NASA Astrophysics Data System (ADS)

    Williams, Don; Burns, Peter D.

    2007-01-01

    There are no fundamental differences between today's mobile telephone cameras and consumer digital still cameras that suggest many existing ISO imaging performance standards do not apply. To the extent that they have lenses, color filter arrays, detectors, apertures, image processing, and are hand held, there really are no operational or architectural differences. Despite this, there are currently differences in the levels of imaging performance. These are driven by physical and economic constraints, and image-capture conditions. Several ISO standards for resolution, well established for consumer digital cameras, require care when applied to the current generation of cell phone cameras. In particular, accommodation of optical flare, shading non-uniformity, and distortion is recommended. We offer proposals for the application of existing ISO imaging resolution performance standards to mobile imaging products, and suggestions for extending performance standards to the characteristic behavior of camera phones.

  1. Design and realization of an AEC&AGC system for the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Hao yun

    2015-08-01

    An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. Conventional AEC and AGC algorithms are not suitable for an aerial camera, since the camera always takes high-resolution photographs while moving at high speed. The AEC and AGC system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. Automatic gamma correction is applied before the image is output, so that the image is better suited to viewing and analysis by human observers. The AEC and AGC system avoids underexposure, overexposure, and image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment speed, high adaptability, and high reliability in severe, complex environments.
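    A minimal sketch of the gamma-correction step, assuming intensities normalized to [0, 1] and a fixed gamma value (the camera described above selects the correction automatically):

```python
import numpy as np

def gamma_correct(image, gamma=2.2):
    """Map normalized intensities through out = in ** (1 / gamma), which
    brightens mid-tones for display while leaving black and white fixed."""
    img = np.clip(image, 0.0, 1.0)
    return img ** (1.0 / gamma)

frame = np.array([0.0, 0.25, 0.5, 1.0])
corrected = gamma_correct(frame)   # mid-tones raised, endpoints unchanged
```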

  2. Application of phase matching autofocus in airborne long-range oblique photography camera

    NASA Astrophysics Data System (ADS)

    Petrushevsky, Vladimir; Guberman, Asaf

    2014-06-01

    The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, by day and night. The front Ritchey-Chretien optics are made of highly stable materials. However, the camera temperature varies considerably in flight conditions. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease in the structure's length. The temperature and humidity effects change the distance between the mirrors by just a few microns. The distance change is small, but it nevertheless alters the camera's infinity focus setpoint significantly, especially in the EO band. To realize the optics' resolution potential, optimal focus must be constantly maintained. In-flight best-focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To obtain even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of an existing beam-sharer prism FPA arrangement, in which an aperture partition exists inherently in the area of overlap between adjacent detectors. The defocus is proportional to the image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content. The closed-loop control converges robustly to the precise focus position. The algorithm uses the temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly in various conditions of scene illumination and contrast.
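    The phase-matching principle, defocus proportional to the image shift between the two sub-aperture views of the overlap region, rests on estimating that shift. A minimal integer-pixel phase-correlation sketch (the camera's actual algorithm, including its sub-pixel handling and low-pass filtering, is not reproduced here):

```python
import numpy as np

def image_shift(a, b):
    """Integer-pixel shift of image a relative to image b via phase correlation.
    In the autofocus scheme above, the shift measured in the overlap region
    between adjacent detectors is proportional to defocus."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.maximum(np.abs(cross), 1e-12)   # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap the cyclic peak position into a signed shift
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (0, 3), axis=(0, 1))
dy, dx = image_shift(shifted, img)   # recovers the (0, 3) displacement
```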

  3. Analysis of accuracy in photogrammetric roughness measurements

    NASA Astrophysics Data System (ADS)

    Olkowicz, Marcin; Dąbrowski, Marcin; Pluymakers, Anne

    2017-04-01

    Regarding permeability, one of the most important features of shale gas reservoirs is the effective aperture of cracks opened during hydraulic fracturing, both propped and unpropped. In a propped fracture, the aperture is controlled mostly by proppant size and its embedment, and fracture surface roughness has only a minor influence. In contrast, in an unpropped fracture the aperture is controlled by the fracture roughness and the wall displacement. To measure fracture surface roughness, we have used the photogrammetric method, since it is time- and cost-efficient. To estimate the accuracy of this method we compare the photogrammetric measurements with reference measurements taken with a White Light Interferometer (WLI). Our photogrammetric setup is based on a high-resolution 50 Mpx camera combined with a focus-stacking technique. The first step for photogrammetric measurements is to determine the optimal camera positions and lighting. We compare multiple scans of one sample, taken with different settings of lighting and camera positions, with the reference WLI measurement. The second step is to perform measurements of all studied fractures with the parameters that produced the best results in the first step. To compare photogrammetric and WLI measurements we regridded both data sets onto a regular 10 μm grid and determined the best fit, followed by a calculation of the difference between the measurements. The first results of the comparison show that for 90% of measured points the absolute vertical distance between WLI and photogrammetry is less than 10 μm, while the mean absolute vertical distance is 5 μm. This proves that our setup can be used for fracture roughness measurements in shales.
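    The regrid-and-compare step can be sketched in one dimension (the study compared 2-D surfaces on a 10 μm grid; the profile data below are synthetic and the bias value is illustrative):

```python
import numpy as np

def compare_profiles(x_a, z_a, x_b, z_b, step_um=10.0):
    """Resample two roughness profiles onto a common grid and report the
    90th percentile and mean of the absolute height difference, mirroring
    the paper's <10 um / 5 um accuracy criteria."""
    lo = max(x_a.min(), x_b.min())
    hi = min(x_a.max(), x_b.max())
    grid = np.arange(lo, hi, step_um)
    za = np.interp(grid, x_a, z_a)
    zb = np.interp(grid, x_b, z_b)
    diff = np.abs(za - zb)
    return np.percentile(diff, 90), diff.mean()

x = np.linspace(0, 1000, 201)           # positions in um
z_wli = np.sin(x / 50.0) * 20.0         # reference surface (WLI)
z_photo = z_wli + 4.0                   # photogrammetry with a 4 um bias
p90, mean_abs = compare_profiles(x, z_wli, x, z_photo)
```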

  4. Ground Testing of Prototype Hardware and Processing Algorithms for a Wide Area Space Surveillance System (WASSS)

    DTIC Science & Technology

    2013-09-01

    Ground testing of prototype hardware and processing algorithms for a Wide Area Space Surveillance System (WASSS) Neil Goldstein, Rainer A... at Magdalena Ridge Observatory using the prototype Wide Area Space Surveillance System (WASSS) camera, which has a 4 x 60 field-of-view, < 0.05... objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and a Principal Component Analysis based image

  5. VizieR Online Data Catalog: Observed light curve of (3200) Phaethon (Ansdell+, 2014)

    NASA Astrophysics Data System (ADS)

    Ansdell, M.; Meech, K. J.; Hainaut, O.; Buie, M. W.; Kaluna, H.; Bauer, J.; Dundon, L.

    2017-04-01

    We obtained time series photometry over 15 nights from 1994 to 2013. All but three nights used the Tektronix 2048x2048 pixel CCD camera on the University of Hawaii 2.2 m telescope on Mauna Kea. Two nights used the PRISM 2048x2048 pixel CCD camera on the Perkins 72 inch telescope at the Lowell Observatory in Flagstaff, Arizona, while one night used the Optic 2048x4096 CCD camera also on the University of Hawaii 2.2 m telescope. All observations used the standard Kron-Cousins R filter with the telescope guiding on (3200) Phaethon at non-sidereal rates. Raw images were processed with standard IRAF routines for bias subtraction, flat-fielding, and cosmic ray removal (Tody, 1986SPIE..627..733T). We constructed reference flat fields by median combining dithered images of either twilight or the object field (in both cases, flattening reduced gradients to <1% across the CCD). We performed photometry using the IRAF phot routine with circular apertures typically 5'' in radius, although aperture sizes changed depending on the night and/or exposure as they were chosen to consistently include 99.5% of the object's light. (1 data file).
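    The phot-style aperture measurement reduces to summing pixel values inside a circular mask. A minimal, background-free sketch (not IRAF's actual implementation, which also centroids the source and fits a sky annulus):

```python
import numpy as np

def aperture_flux(image, cx, cy, r):
    """Sum pixel values inside a circular aperture of radius r pixels,
    the basic operation behind the phot measurements described above."""
    yy, xx = np.indices(image.shape)
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
    return image[mask].sum()

img = np.zeros((21, 21))
img[10, 10] = 100.0                     # point source, no background
flux = aperture_flux(img, 10, 10, 5)    # all source flux falls in the aperture
```

Growing r until the enclosed flux plateaus is how an aperture capturing a fixed fraction (here, 99.5%) of the object's light would be chosen.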

  6. Calibration of the Auger Fluorescence Telescopes

    NASA Astrophysics Data System (ADS)

    Klages, H.; Pierre Auger Observatory Collaboration

    Thirty fluorescence telescopes in four stations will overlook the detector array of the southern hemisphere experiment of the Pierre Auger project. The main aim of these telescopes is tracking of EHE air showers, measurement of the longitudinal shower development (Xmax) and determination of the absolute energy of EHE events. A telescope camera contains 440 PMTs, each covering a 1.5 x 1.5 degree pixel of the sky. The response of every pixel is converted into the number of charged particles at the observed part of the shower. This reconstruction includes the shower/observer geometry and the details of the atmospheric photon production and transport. The remaining experimental task is to convert the ADC counts of the camera pixel electronics into the light flux entering the Schmidt aperture. Three types of calibration and control are necessary: a) Monitoring of time dependent variations has to be performed for all parts of the optics and for all pixels frequently. Common illumination for all pixels of a camera allows the detection of individual deviations. Properties of windows, filters and mirrors have to be measured separately. b) Differences in pixel-to-pixel efficiency are mainly due to PMT gain and to differences in effective area (camera shadow, mirror size limits). Homogeneous and isotropic illumination will enable cross calibration. c) An absolute calibration has to be performed once in a while using trusted light monitors. The calibration methods used for the Pierre Auger FD telescopes in Argentina are discussed.

  7. Visible camera cryostat design and performance for the SuMIRe Prime Focus Spectrograph (PFS)

    NASA Astrophysics Data System (ADS)

    Smee, Stephen A.; Gunn, James E.; Golebiowski, Mirek; Hope, Stephen C.; Madec, Fabrice; Gabriel, Jean-Francois; Loomis, Craig; Le fur, Arnaud; Dohlen, Kjetil; Le Mignant, David; Barkhouser, Robert; Carr, Michael; Hart, Murdock; Tamura, Naoyuki; Shimono, Atsushi; Takato, Naruhisa

    2016-08-01

    We describe the design and performance of the SuMIRe Prime Focus Spectrograph (PFS) visible camera cryostats. SuMIRe PFS is a massively multi-plexed ground-based spectrograph consisting of four identical spectrograph modules, each receiving roughly 600 fibers from a 2394 fiber robotic positioner at the prime focus. Each spectrograph module has three channels covering wavelength ranges 380 nm - 640 nm, 640 nm - 955 nm, and 955 nm - 1.26 um, with the dispersed light being imaged in each channel by a f/1.07 vacuum Schmidt camera. The cameras are very large, having a clear aperture of 300 mm at the entrance window, and a mass of 280 kg. In this paper we describe the design of the visible camera cryostats and discuss various aspects of cryostat performance.

  8. An array of virtual Frisch-grid CdZnTe detectors and a front-end application-specific integrated circuit for large-area position-sensitive gamma-ray cameras.

    PubMed

    Bolotnikov, A E; Ackley, K; Camarda, G S; Cherches, C; Cui, Y; De Geronimo, G; Fried, J; Hodges, D; Hossain, A; Lee, W; Mahler, G; Maritato, M; Petryk, M; Roy, U; Salwen, C; Vernon, E; Yang, G; James, R B

    2015-07-01

    We developed a robust and low-cost array of virtual Frisch-grid CdZnTe detectors coupled to a front-end readout application-specific integrated circuit (ASIC) for spectroscopy and imaging of gamma rays. The array operates as a self-reliant detector module. It is comprised of 36 close-packed 6 × 6 × 15 mm(3) detectors grouped into 3 × 3 sub-arrays of 2 × 2 detectors with the common cathodes. The front-end analog ASIC accommodates up to 36 anode and 9 cathode inputs. Several detector modules can be integrated into a single- or multi-layer unit operating as a Compton or a coded-aperture camera. We present the results from testing two fully assembled modules and readout electronics. The further enhancement of the arrays' performance and reduction of their cost are possible by using position-sensitive virtual Frisch-grid detectors, which allow for accurate corrections of the response of material non-uniformities caused by crystal defects.

  9. An array of virtual Frisch-grid CdZnTe detectors and a front-end application-specific integrated circuit for large-area position-sensitive gamma-ray cameras

    DOE PAGES

    Bolotnikov, A. E.; Ackley, K.; Camarda, G. S.; ...

    2015-07-28

    We developed a robust and low-cost array of virtual Frisch-grid CdZnTe (CZT) detectors coupled to a front-end readout ASIC for spectroscopy and imaging of gamma rays. The array operates as a self-reliant detector module. It is comprised of 36 close-packed 6 × 6 × 15 mm(3) detectors grouped into 3 × 3 sub-arrays of 2 × 2 detectors with common cathodes. The front-end analog ASIC accommodates up to 36 anode and 9 cathode inputs. Several detector modules can be integrated into a single- or multi-layer unit operating as a Compton or a coded-aperture camera. We present the results from testing two fully assembled modules and readout electronics. The further enhancement of the arrays' performance and reduction of their cost are made possible by using position-sensitive virtual Frisch-grid detectors, which allow for accurate corrections of the response to material non-uniformities caused by crystal defects.

  10. Miniature Wide-Angle Lens for Small-Pixel Electronic Camera

    NASA Technical Reports Server (NTRS)

    Mouroulils, Pantazis; Blazejewski, Edward

    2009-01-01

    A proposed wide-angle lens would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The example lens has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also shown.

  11. Scalar wave-optical reconstruction of plenoptic camera images.

    PubMed

    Junker, André; Stenau, Tim; Brenner, Karl-Heinz

    2014-09-01

    We investigate the reconstruction of plenoptic camera images in a scalar wave-optical framework. Previous publications relating to this topic numerically simulate light propagation on the basis of ray tracing. However, due to continuing miniaturization of hardware components it can be assumed that in combination with low-aperture optical systems this technique may not be generally valid. Therefore, we study the differences between ray- and wave-optical object reconstructions of true plenoptic camera images. For this purpose we present a wave-optical reconstruction algorithm, which can be run on a regular computer. Our findings show that a wave-optical treatment is capable of increasing the detail resolution of reconstructed objects.

  12. Extracting spatial information from large aperture exposures of diffuse sources

    NASA Technical Reports Server (NTRS)

    Clarke, J. T.; Moos, H. W.

    1981-01-01

    The spatial properties of large aperture exposures of diffuse emission can be used both to investigate spatial variations in the emission and to filter out camera noise in exposures of weak emission sources. Spatial imaging can be accomplished both parallel and perpendicular to dispersion with a resolution of 5-6 arc sec, and a narrow median filter running perpendicular to dispersion across a diffuse image selectively filters out point source features, such as reseaux marks and fast particle hits. Spatial information derived from observations of solar system objects is presented.

  13. Characterization of fracture aperture for groundwater flow and transport

    NASA Astrophysics Data System (ADS)

    Sawada, A.; Sato, H.; Tetsu, K.; Sakamoto, K.

    2007-12-01

    This paper presents experiments and numerical analyses of flow and transport carried out on natural fractures and transparent replicas of fractures. The purpose of this study was to improve the understanding of the role of heterogeneous aperture patterns in the channelization of groundwater flow and dispersion in solute transport. The research proceeded as follows: First, a precision plane grinder was applied perpendicular to the fracture plane to characterize the aperture distribution of a natural fracture at 1 mm increments. Although both time- and labor-intensive, this approach provided a detailed, three-dimensional picture of the pattern of fracture aperture. This information was analyzed to provide quantitative measures of the fracture aperture distribution, including the JRC (Joint Roughness Coefficient) and the fracture contact area ratio. These parameters were used to develop numerical models with corresponding synthetic aperture patterns. The transparent fracture replica and numerical models were then used to study how transport is affected by the aperture spatial pattern. In the transparent replica, transmitted light intensity measured by a CCD camera was used to image channeling and dispersion due to the fracture aperture spatial pattern. The CCD image data were analyzed to obtain quantitative fracture aperture and tracer concentration data according to the Lambert-Beer law. The experimental results were analyzed using the numerical models. Comparison of the numerical models to the transparent replica provided information about the nature of channeling and dispersion due to aperture spatial patterns. These results support the development of a methodology for defining a representative fracture aperture in a simplified parallel-fracture model of flow and transport in heterogeneous fractures for contaminant transport analysis.
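    The Lambert-Beer conversion from transmitted light intensity to local aperture can be sketched as follows; the absorptivity, concentration, and intensity values below are illustrative, not the study's calibration:

```python
import numpy as np

def aperture_from_intensity(I, I0, epsilon, conc):
    """Local fracture aperture b from transmitted light via the Lambert-Beer law:
    I = I0 * 10 ** (-epsilon * conc * b)  =>  b = log10(I0 / I) / (epsilon * conc).
    epsilon: molar absorptivity of the dye, conc: dye concentration
    (illustrative units chosen so epsilon * conc is per mm of path length)."""
    return np.log10(I0 / I) / (epsilon * conc)

# A pixel transmitting 50% of the incident light, with epsilon * conc = 0.301 / mm
b_mm = aperture_from_intensity(0.5, 1.0, 0.301, 1.0)   # about 1 mm aperture
```

Applied pixel-by-pixel to the CCD image, this yields the aperture and tracer-concentration maps used to quantify channeling.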

  14. Water window imaging x ray microscope

    NASA Technical Reports Server (NTRS)

    Hoover, Richard B. (Inventor)

    1992-01-01

    A high resolution x ray microscope for imaging microscopic structures within biological specimens has an optical system including a highly polished primary and secondary mirror coated with identical multilayer coatings, the mirrors acting at normal incidence. The coatings have a high reflectivity in the narrow wave bandpass between 23.3 and 43.7 angstroms and have low reflectivity outside of this range. The primary mirror has a spherical concave surface and the secondary mirror has a spherical convex surface. The radii of the mirrors are concentric about a common center of curvature on the optical axis of the microscope extending from the object focal plane to the image focal plane. The primary mirror has an annular configuration with a central aperture and the secondary mirror is positioned between the primary mirror and the center of curvature for reflecting radiation through the aperture to a detector. An x ray filter is mounted at the stage end of the microscope, and film sensitive to x rays in the desired band width is mounted in a camera at the image plane of the optical system. The microscope is mounted within a vacuum chamber for minimizing the absorption of x rays in air from a source through the microscope.

  15. Multiplexed two in-line holographic recordings for flow characterization in a flexible vessel

    NASA Astrophysics Data System (ADS)

    Lobera, Julia; Palero, Virginia; Roche, Eva M.; Gómez Climente, Marina; López Torres, Ana M.; Andrés, Nieves; Arroyo, M. Pilar

    2017-06-01

    The simultaneous presence of the real and virtual images in the hologram reconstruction is inherent to in-line holography. This drawback can be overcome with a shifted knife-edge aperture at the focal plane of the imaging lens. The shifted-aperture DIH produces holograms in which the real and virtual images are completely separated. In this paper we propose a modification of the shifted-aperture DIH that allows two holograms to be recorded simultaneously with one camera, while retaining the simplicity of the in-line configuration and the advantage of the shifted-aperture strategy. As in typical stereoscopy, the performance of this configuration is limited by the angle between the two illuminating beams, and therefore by the aperture size. Some improvement in the out-of-plane resolution can be expected from a combined analysis of the multiplexed holograms. In order to compare this technique with other in-line holographic configurations, several experiments have been performed to study the spatial resolution along the optical axis. The capabilities of the different techniques for characterizing the flow in a flexible and transparent model of a carotid bifurcation are also investigated.

  16. Multi-Aperture-Based Probabilistic Noise Reduction of Random Telegraph Signal Noise and Photon Shot Noise in Semi-Photon-Counting Complementary-Metal-Oxide-Semiconductor Image Sensor

    PubMed Central

    Ishida, Haruki; Kagawa, Keiichiro; Komuro, Takashi; Zhang, Bo; Seo, Min-Woong; Takasawa, Taishi; Yasutomi, Keita; Kawahito, Shoji

    2018-01-01

    A probabilistic method to remove random telegraph signal (RTS) noise and to increase the signal level is proposed, and was verified by simulation based on measured real sensor noise. Although semi-photon-counting-level (SPCL) ultra-low-noise complementary-metal-oxide-semiconductor (CMOS) image sensors (CISs) with high-conversion-gain pixels have emerged, they still suffer from large RTS noise, which is inherent to CISs. The proposed method utilizes a multi-aperture (MA) camera composed of multiple sets of an SPCL CIS and a moderately fast, compact imaging lens, to emulate a very fast single lens. Owing to the redundancy of the MA camera, the RTS noise is removed by maximum likelihood estimation, where the noise characteristics are modeled by a probability density distribution. In the proposed method, the photon shot noise is also relatively reduced because of the averaging effect, where the pixel values of all the apertures are considered. An extremely low-light condition, in which the maximum number of electrons per aperture was only 2 e-, was simulated. The PSNRs of a test image for simple averaging, selective averaging (our previous method), and the proposed method were 11.92 dB, 11.61 dB, and 13.14 dB, respectively. Selective averaging, which can remove RTS noise, was worse than simple averaging because it ignores the pixels with RTS noise, so the photon shot noise was less improved. The simulation results showed that the proposed method provided the best noise reduction performance. PMID:29587424

  17. Maximum likelihood estimation in calibrating a stereo camera setup.

    PubMed

    Muijtjens, A M; Roos, J M; Arts, T; Hasman, A

    1999-02-01

Motion and deformation of the cardiac wall may be measured by following the positions of implanted radiopaque markers in three dimensions, using two x-ray cameras simultaneously. Typically, calibration of the position measurement system is obtained by registration of the images of a calibration object containing 10-20 radiopaque markers at known positions. Unfortunately, an accidental change of the position of a camera after calibration requires complete recalibration. Alternatively, redundant information in the measured image positions of stereo pairs can be used for calibration, so that a separate calibration procedure can be avoided. In the current study a model is developed that describes the geometry of the camera setup by five dimensionless parameters. Maximum likelihood (ML) estimates of these parameters were obtained in an error analysis. It is shown that the ML estimates can be found by application of a nonlinear least squares procedure. Compared to the standard unweighted least squares procedure, the ML method resulted in more accurate estimates without noticeable bias. The accuracy of the ML method was investigated in relation to the object aperture. The reconstruction problem appeared well conditioned as long as the object aperture was larger than 0.1 rad. The angle between the two viewing directions appeared to be the parameter most likely to cause major inaccuracies in the reconstruction of the 3-D positions of the markers. Hence, attempts to improve the robustness of the method should primarily focus on reduction of the error in this parameter.
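The central role of the angle between the two viewing directions can be illustrated with a toy least-squares fit. This is a stand-in for the paper's five-parameter nonlinear procedure, with invented orthographic cameras and marker positions: camera 1 measures a marker's x coordinate directly, camera 2 measures the projection along a direction rotated by the unknown angle, and the angle is recovered by least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true = 0.6  # angle between the two viewing directions (rad), invented

# Invented marker coordinates in a plane; camera 1 measures x directly,
# camera 2 measures the projection along a direction rotated by theta.
pts = rng.uniform(-1.0, 1.0, size=(20, 2))
x, z = pts[:, 0], pts[:, 1]
u2 = x * np.cos(theta_true) + z * np.sin(theta_true) + rng.normal(0.0, 0.01, 20)

# Least-squares fit of u2 = a*x + b*z, then recover the angle from (a, b).
A = np.column_stack([x, z])
(a, b), *_ = np.linalg.lstsq(A, u2, rcond=None)
theta_est = float(np.arctan2(b, a))
```

With small image noise the angle is recovered accurately; as the true angle shrinks toward zero, the two projections become nearly identical and the fit degrades, mirroring the conditioning behavior reported in the abstract.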

  18. A multi-cone x-ray imaging Bragg crystal spectrometer

    DOE PAGES

    Bitter, M.; Hill, K. W.; Gao, Lan; ...

    2016-08-26

This article describes a new x-ray imaging Bragg crystal spectrometer, which, in combination with a streak camera or a gated strip detector, can be used for time-resolved measurements of x-ray line spectra at the National Ignition Facility and other high power laser facilities. The main advantage of this instrument is that it produces perfect images of a point source for each wavelength in a selectable spectral range and that the detector plane can be perpendicular to the crystal surface or inclined by an arbitrary angle with respect to the crystal surface. Furthermore, these unique imaging properties are obtained by bending the x-ray diffracting crystal into a certain shape, which is generated by arranging multiple cones with different aperture angles on a common nodal line.

  19. UrtheCast Second-Generation Earth Observation Sensors

    NASA Astrophysics Data System (ADS)

    Beckett, K.

    2015-04-01

UrtheCast's Second-Generation state-of-the-art Earth Observation (EO) remote sensing platform will be hosted on the NASA segment of the International Space Station (ISS). This platform comprises a high-resolution dual-mode (pushbroom and video) optical camera and a dual-band (X and L) Synthetic Aperture RADAR (SAR) instrument. These new sensors will complement the first-generation medium-resolution pushbroom and high-definition video cameras that were mounted on the Russian segment of the ISS in early 2014. The new cameras are expected to be launched to the ISS in late 2017 via the Space Exploration Technologies Corporation Dragon spacecraft. The Canadarm will then be used to install the remote sensing platform onto a CBM (Common Berthing Mechanism) hatch on Node 3, allowing the sensor electronics to be accessible from the inside of the station, thus limiting their exposure to the space environment and allowing for future capability upgrades. The UrtheCast second-generation system will be able to take full advantage of the strengths that each of the individual sensors offers, such that the data exploitation capabilities of the combined sensors are significantly greater than from either sensor alone. This represents a truly novel platform that will lead to significant advances in many Earth Observation applications such as environmental monitoring, energy and natural resources management, and humanitarian response, with data availability anticipated to begin after commissioning is completed in early 2018.

  20. "Sturdy as a house with four windows," the star tracker of the future

    NASA Astrophysics Data System (ADS)

    Duivenvoorde, Tom; Leijtens, Johan; van der Heide, Erik J.

    2017-11-01

Ongoing miniaturization of spacecraft demands a reduction in the size of Attitude and Orbit Control Systems (AOCS). TNO has therefore created a new design for a multi-aperture, high-performance, miniaturized star tracker. The innovative design incorporates the latest developments in camera technology, attitude calculation, and mechanical design into a system with 5 arc second accuracy, making the system usable for many applications. This paper presents the results of the system design and analysis, as well as the performance predictions, for the Multi Aperture Baffled Star Tracker (MABS). The highly integrated system consists of multiple apertures without the need for external baffles, resulting in major advantages in mass, volume, alignment with the spacecraft, and relative aperture stability. In the analysis part of this paper, the thermal and mechanical stability are discussed. In the final part, the simulation results that have led to the predicted accuracy of the star tracker system are described, and a peek into the future of attitude sensors is given.

1. A new compact, high sensitivity neutron imaging system

    NASA Astrophysics Data System (ADS)

    Caillaud, T.; Landoas, O.; Briat, M.; Rossé, B.; Thfoin, I.; Philippe, F.; Casner, A.; Bourgade, J. L.; Disdier, L.; Glebov, V. Yu.; Marshall, F. J.; Sangster, T. C.; Park, H. S.; Robey, H. F.; Amendt, P.

    2012-10-01

We have developed a new small neutron imaging system (SNIS) diagnostic for the OMEGA laser facility. The SNIS uses a penumbral coded aperture and has been designed to record images from low yield (10⁹-10¹⁰ neutrons) implosions such as those using deuterium as the fuel. This camera was tested at OMEGA in 2009 on a rugby hohlraum energetics experiment where it recorded an image at a yield of 1.4 × 10¹⁰. The resolution of this image was 54 μm and the camera was located only 4 meters from target chamber centre. We recently improved the instrument by adding a cooled CCD camera. The sensitivity of the new camera has been fully characterized using a linear accelerator and a ⁶⁰Co γ-ray source. The calibration showed that the signal-to-noise ratio could be improved by using raw binning detection.

  2. Photometry of compact galaxies.

    NASA Technical Reports Server (NTRS)

    Shen, B. S. P.; Usher, P. D.; Barrett, J. W.

    1972-01-01

Photometric histories are presented for the N galaxies 3C 390.3 and PKS 0521-36. Four other compact galaxies, Markarian 9, I Zw 92, 2 Zw 136, and III Zw 77, showed no evidence of variability. The photometric histories were obtained from an exhaustive study of those plates of the Harvard collection taken with large aperture cameras. The images of all galaxies reported were indistinguishable from stars due to the camera f-ratios and the low surface brightness of the outlying nebulosities of the galaxies. Standard techniques for the study of variable stars are therefore applicable.

  3. Composite x-ray pinholes for time-resolved microphotography of laser compressed targets.

    PubMed

    Attwood, D T; Weinstein, B W; Wuerker, R F

    1977-05-01

    Composite x-ray pinholes having dichroic properties are presented. These pinholes permit both x-ray imaging and visible alignment with micron accuracy by presenting different apparent apertures in these widely disparate regions of the spectrum. Their use is mandatory in certain applications in which the x-ray detection consists of a limited number of resolvable elements whose use one wishes to maximize. Mating the pinhole camera with an x-ray streaking camera is described, along with experiments which spatially and temporally resolve the implosion of laser irradiated targets.

  4. Color camera computed tomography imaging spectrometer for improved spatial-spectral image accuracy

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Bearman, Gregory H. (Inventor); Johnson, William R. (Inventor)

    2011-01-01

    Computed tomography imaging spectrometers ("CTIS"s) having color focal plane array detectors are provided. The color FPA detector may comprise a digital color camera including a digital image sensor, such as a Foveon X3.RTM. digital image sensor or a Bayer color filter mosaic. In another embodiment, the CTIS includes a pattern imposed either directly on the object scene being imaged or at the field stop aperture. The use of a color FPA detector and the pattern improves the accuracy of the captured spatial and spectral information.

  5. The Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Mather, John C.; Seery, Bernard (Technical Monitor)

    2001-01-01

The Next Generation Space Telescope (NGST) is a 6-7 m class radiatively cooled telescope, planned for launch to the Lagrange point L2 in 2009, to be built by a partnership of NASA, ESA, and CSA. The NGST science program calls for three core instruments: 1) Near IR camera, 0.6 - 5 micrometer; 2) Near IR multiobject spectrometer, 1 - 5 micrometer; and 3) Mid IR camera and spectrometer, 5 - 28 micrometers. I will report on the scientific goals, project status, and the recent reduction in aperture from the target of 8 m.

  6. Beam Combination for Stellar Imager and its Application to Full-Aperture Imaging

    NASA Technical Reports Server (NTRS)

    Mozurkewich, D.; Carpenter, K. G.; Lyon, R. G.

    2007-01-01

    Stellar Imager (SI) will be a Space-Based telescope consisting of 20 to 30 separated apertures. It is designed for UV/Optical imaging of stellar surfaces and asteroseismology. This report describes details of an alternative optical design for the beam combiner, dubbed the Spatial Frequency Remapper (SFR). It sacrifices the large field of view of the Fizeau combiner. In return, spectral resolution is obtained with a diffraction grating rather than an array of energy-resolving detectors. The SFR design works in principle and has been implemented with MIRC at CHARA for a small number of apertures. Here, we show the number of optical surfaces can be reduced and the concept scales gracefully to the large number of apertures needed for Stellar Imager. We also describe a potential application of this spatial frequency remapping to improved imaging with filled aperture systems. For filled-aperture imaging, the SFR becomes the core of an improved aperture masking system. To date, aperture-masking has produced the best images with ground-based telescopes but at the expense of low sensitivity due to short exposures and discarding most of the light collected by the telescope. This design eliminates the light-loss problem previously claimed to be inherent in all aperture-masking designs. We also argue that at least in principle, the short-integration time limit can also be overcome. With these improvements, it becomes an ideal camera for TPF-C; since it can form speckle-free images in the presence of wavefront errors, it should significantly relax the stability requirements of the current designs.

  7. Evaluation of multispectral plenoptic camera

    NASA Astrophysics Data System (ADS)

    Meng, Lingfei; Sun, Ting; Kosoglow, Rich; Berkner, Kathrin

    2013-01-01

Plenoptic cameras enable capture of a 4D lightfield, allowing digital refocusing and depth estimation from data captured with a compact portable camera. Whereas most of the work on plenoptic camera design has been based on a simplistic geometric-optics characterization of the optical path only, little work has been done on optimizing end-to-end system performance for a specific application. Such design optimization requires design tools that include careful parameterization of the main lens elements, as well as microlens array and sensor characteristics. In this paper we are interested in evaluating the performance of a multispectral plenoptic camera, i.e. a camera with spectral filters inserted into the aperture plane of the main lens. Such a camera enables single-snapshot spectral data acquisition. We first describe in detail an end-to-end imaging system model for a spectrally coded plenoptic camera that we briefly introduced in earlier work. Different performance metrics are defined to evaluate the spectral reconstruction quality. We then present a prototype developed based on a modified DSLR camera containing a lenslet array on the sensor and a filter array in the main lens. Finally we evaluate the spectral reconstruction performance of a spectral plenoptic camera based on both simulation and measurements obtained from the prototype.

  8. Photographic zoom fisheye lens design for DSLR cameras

    NASA Astrophysics Data System (ADS)

    Yan, Yufeng; Sasian, Jose

    2017-09-01

    Photographic fisheye lenses with fixed focal length for cameras with different sensor formats have been well developed for decades. However, photographic fisheye lenses with variable focal length are rare on the market due in part to the greater design difficulty. This paper presents a large aperture zoom fisheye lens for DSLR cameras that produces both circular and diagonal fisheye imaging for 35-mm sensors and diagonal fisheye imaging for APS-C sensors. The history and optical characteristics of fisheye lenses are briefly reviewed. Then, a 9.2- to 16.1-mm F/2.8 to F/3.5 zoom fisheye lens design is presented, including the design approach and aberration control. Image quality and tolerance performance analysis for this lens are also presented.

  9. Dermoscopy-guided reflectance confocal microscopy of skin using high-NA objective lens with integrated wide-field color camera

    NASA Astrophysics Data System (ADS)

    Dickensheets, David L.; Kreitinger, Seth; Peterson, Gary; Heger, Michael; Rajadhyaksha, Milind

    2016-02-01

    Reflectance Confocal Microscopy, or RCM, is being increasingly used to guide diagnosis of skin lesions. The combination of widefield dermoscopy (WFD) with RCM is highly sensitive (~90%) and specific (~ 90%) for noninvasively detecting melanocytic and non-melanocytic skin lesions. The combined WFD and RCM approach is being implemented on patients to triage lesions into benign (with no biopsy) versus suspicious (followed by biopsy and pathology). Currently, however, WFD and RCM imaging are performed with separate instruments, while using an adhesive ring attached to the skin to sequentially image the same region and co-register the images. The latest small handheld RCM instruments offer no provision yet for a co-registered wide-field image. This paper describes an innovative solution that integrates an ultra-miniature dermoscopy camera into the RCM objective lens, providing simultaneous wide-field color images of the skin surface and RCM images of the subsurface cellular structure. The objective lens (0.9 NA) includes a hyperhemisphere lens and an ultra-miniature CMOS color camera, commanding a 4 mm wide dermoscopy view of the skin surface. The camera obscures the central portion of the aperture of the objective lens, but the resulting annular aperture provides excellent RCM optical sectioning and resolution. Preliminary testing on healthy volunteers showed the feasibility of combined WFD and RCM imaging to concurrently show the skin surface in wide-field and the underlying microscopic cellular-level detail. The paper describes this unique integrated dermoscopic WFD/RCM lens, and shows representative images. The potential for dermoscopy-guided RCM for skin cancer diagnosis is discussed.

  10. Dermoscopy-guided reflectance confocal microscopy of skin using high-NA objective lens with integrated wide-field color camera.

    PubMed

    Dickensheets, David L; Kreitinger, Seth; Peterson, Gary; Heger, Michael; Rajadhyaksha, Milind

    2016-02-01

    Reflectance Confocal Microscopy, or RCM, is being increasingly used to guide diagnosis of skin lesions. The combination of widefield dermoscopy (WFD) with RCM is highly sensitive (~90%) and specific (~ 90%) for noninvasively detecting melanocytic and non-melanocytic skin lesions. The combined WFD and RCM approach is being implemented on patients to triage lesions into benign (with no biopsy) versus suspicious (followed by biopsy and pathology). Currently, however, WFD and RCM imaging are performed with separate instruments, while using an adhesive ring attached to the skin to sequentially image the same region and co-register the images. The latest small handheld RCM instruments offer no provision yet for a co-registered wide-field image. This paper describes an innovative solution that integrates an ultra-miniature dermoscopy camera into the RCM objective lens, providing simultaneous wide-field color images of the skin surface and RCM images of the subsurface cellular structure. The objective lens (0.9 NA) includes a hyperhemisphere lens and an ultra-miniature CMOS color camera, commanding a 4 mm wide dermoscopy view of the skin surface. The camera obscures the central portion of the aperture of the objective lens, but the resulting annular aperture provides excellent RCM optical sectioning and resolution. Preliminary testing on healthy volunteers showed the feasibility of combined WFD and RCM imaging to concurrently show the skin surface in wide-field and the underlying microscopic cellular-level detail. The paper describes this unique integrated dermoscopic WFD/RCM lens, and shows representative images. The potential for dermoscopy-guided RCM for skin cancer diagnosis is discussed.

  11. Lens and Camera Arrays for Sky Surveys and Space Surveillance

    NASA Astrophysics Data System (ADS)

    Ackermann, M.; Cox, D.; McGraw, J.; Zimmer, P.

    2016-09-01

In recent years, a number of sky survey projects have chosen to use arrays of commercial cameras coupled with commercial photographic lenses to enable low-cost, wide-area observation. Projects such as SuperWASP, FAVOR, RAPTOR, Lotis, PANOPTES, and DragonFly rely on multiple cameras with commercial lenses to image wide areas of the sky each night. The sensors are usually commercial astronomical charge coupled devices (CCDs) or digital single lens reflex (DSLR) cameras, while the lenses are large-aperture, high-end consumer items intended for general photography. While much of this equipment is very capable and relatively inexpensive, this approach comes with a number of significant limitations that reduce the sensitivity and overall utility of the image data. The most frequently encountered limitations include lens vignetting, narrow spectral bandpass, and a relatively large point spread function. Understanding these limits helps to assess the utility of the data and identify areas where advanced optical designs could significantly improve survey performance.

  12. Grisms Developed for FOCAS

    NASA Astrophysics Data System (ADS)

    Ebizuka, Noboru; Kawabata, Koji S.; Oka, Keiko; Yamada, Akiko; Kashiwagi, Masako; Kodate, Kashiko; Hattori, Takashi; Kashikawa, Nobunari; Iye, Masanori

    2011-03-01

    Faint Object Camera and Spectrograph (FOCAS) is a versatile common-use optical instrument for the 8.2 m Subaru Telescope, offering imaging and spectroscopic observations. FOCAS employs grisms with resolving powers ranging from 280 to 8200 as dispersive optical elements. A grism is a direct-vision grating composed of a transmission grating and prism(s). FOCAS has five grisms with replica surface-relief gratings including an echelle-type grism, and eight grisms with volume-phase holographic (VPH) gratings. The size of these grisms is 110 mm × 106 mm in aperture with a maximum thickness of 110 mm. We employ not only the dichromated gelatin, but also the hologram resin as a recording material for VPH gratings. We discuss the performance of these FOCAS grisms measured in the laboratory, and verify it by test observations, and show examples of astronomical spectroscopic observations.

  13. X-band noise temperature effects of rain on DSN antenna feedhorns

    NASA Technical Reports Server (NTRS)

    Slobin, S. D.; Franco, M. M.; Clauss, R. C.

    1982-01-01

    Simulated rain tests were carried out to determine the noise temperature contribution of liquid water adhering to the aperture cover material on both a standard DSN X-band feedhorn and on an S/X-band common aperture feedhorn. It was found that for the particular common aperture feedhorn tested, system noise temperature increases were much greater when the plastic horn cover material was old and weathered than when it was new. The age and condition of the aperture cover material is believed to be a major factor in the amount of degradation experienced by a telecommunications system during rain events.

  14. Overview of LBTI: A Multipurpose Facility for High Spatial Resolution Observations

    NASA Technical Reports Server (NTRS)

    Hinz, P. M.; Defrere, D.; Skemer, A.; Bailey, V.; Stone, J.; Spalding, E.; Vaz, A.; Pinna, E.; Puglisi, A.; Esposito, S.; hide

    2016-01-01

The Large Binocular Telescope Interferometer (LBTI) is a high spatial resolution instrument developed for coherent imaging and nulling interferometry using the 14.4 m baseline of the 2x8.4 m LBT. The unique telescope design, comprising dual apertures on a common elevation-azimuth mount, enables a broad range of observing modes. The full system comprises dual adaptive optics systems, a near-infrared phasing camera, a 1-5 micrometer camera (called LMIRCam), and an 8-13 micrometer camera (called NOMIC). The key program for LBTI is the Hunt for Observable Signatures of Terrestrial planetary Systems (HOSTS), a survey using nulling interferometry to constrain the typical brightness from exozodiacal dust around nearby stars. Additional observations focus on the detection and characterization of giant planets in the thermal infrared, high spatial resolution imaging of complex scenes such as Jupiter's moon Io, planets forming in transition disks, and the structure of active Galactic Nuclei (AGN). Several instrumental upgrades are currently underway to improve and expand the capabilities of LBTI. These include: improving the performance and limiting magnitude of the parallel adaptive optics systems; quadrupling the field of view of LMIRCam (increasing to 20"x20"); adding an integral field spectrometry mode; and implementing a new algorithm for path length correction that accounts for dispersion due to atmospheric water vapor. We present the current architecture and performance of LBTI, as well as an overview of the upgrades.

  15. Universal ICT Picosecond Camera

    NASA Astrophysics Data System (ADS)

    Lebedev, Vitaly B.; Syrtzev, V. N.; Tolmachyov, A. M.; Feldman, Gregory G.; Chernyshov, N. A.

    1989-06-01

The paper reports on the design of an ICT camera operating in the mode of linear or three-frame image scan. The camera incorporates two tubes: the time-analyzing ICT PIM-107 with cathode S-11, and the brightness amplifier PMU-2V (gain about 10⁴) for the image shaped by the first tube. The camera is designed on the basis of the streak camera AGAT-SF3, with almost the same power sources but substantially modified pulse electronics. Schematically, the design of tube PIM-107 is depicted in the figure. The tube consists of cermet housing 1 and photocathode 2, made in a separate vacuum volume and introduced into the housing by means of a manipulator. In the direct vicinity of the photocathode, an accelerating electrode is located, made of a fine-structure grid. An electrostatic lens formed by focusing electrode 4 and anode diaphragm 5 produces a beam of electrons with a "remote crossover". The authors have suggested this term for an electron beam whose crossover is 40 to 60 mm away from the anode diaphragm plane, which guarantees high sensitivity of scan plates 6 with respect to multiaperture framing diaphragm 7. Beyond every diaphragm aperture, a pair of deflecting plates 8 is found, shielded from compensation plates 10 by diaphragm 9. The electronic image produced by the photocathode is focused on luminescent screen 11. The tube is controlled with the help of two saw-tooth voltages applied in antiphase across plates 6 and 10. Plates 6 serve for sweeping the electron beam over the surface of diaphragm 7. The beam is either allowed toward the screen, or delayed by the diaphragm walls. In this manner, three frames are obtained, the number corresponding to that of the diaphragm apertures. Plates 10 serve for stopping the compensation of the image streak sweep on the screen. To avoid overlapping of frames, plates 8 receive static potentials responsible for shifting the frames on the screen. Changing the potentials applied to plates 8, one can control the spacing between frames and partially or fully overlap the frames. This sort of control is independent of the frequency of frame running and of their duration, and only determines frame positioning on the screen. Since diaphragm 7 is located in the area of the crossover, and electron trajectories cross in the crossover, the frame is not decomposed into separate elements during its formation. The image is transferred onto the screen practically within the entire time of frame duration, increasing the aperture ratio of the tube as compared to that in Ref. 3.

  16. High-speed and high-resolution quantitative phase imaging with digital-micromirror device-based illumination (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Zhou, Renjie; Jin, Di; Yaqoob, Zahid; So, Peter T. C.

    2017-02-01

Due to the large number of available mirrors, the patterning speed, low cost, and compactness, digital micromirror devices (DMDs) have been extensively used in biomedical imaging systems. Recently, DMDs have been brought to the quantitative phase microscopy (QPM) field to achieve synthetic-aperture imaging and tomographic imaging. Last year, our group demonstrated using a DMD for QPM, where the phase retrieval is based on a recently developed Fourier ptychography algorithm. In our previous system, the illumination angle was varied by coding the aperture plane of the illumination system, which has a low efficiency in utilizing the laser power. In our new DMD-based QPM system, we use Lee holograms, conjugated to the sample plane, to change the illumination angles with much higher power efficiency. Multiple-angle illumination can also be achieved with this method. With this versatile system, we can achieve FPM-based high-resolution phase imaging with 250 nm lateral resolution using the Rayleigh criterion. Due to the use of a powerful laser, the imaging speed is limited only by the camera acquisition speed. With a fast camera, we expect to achieve close to 100 fps phase imaging speed, which has not been achieved in current FPM imaging systems. By adding a reference beam, we also expect to achieve synthetic-aperture imaging while directly measuring the phase of the sample fields. This would reduce the phase-retrieval processing time to allow for real-time imaging applications in the future.
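The resolution figure quoted above is consistent with the textbook relation for coherent imaging under angled illumination, in which the effective numerical aperture is the sum of the objective and illumination NAs. A back-of-the-envelope check with assumed values (the actual wavelength and NAs are not stated in this record):

```python
# Back-of-the-envelope resolution check; all values are assumptions.
# Coherent imaging with angled illumination: half-pitch resolution
# is approximately wavelength / (NA_objective + NA_illumination).
wavelength_nm = 532.0        # assumed green laser wavelength
na_obj, na_illum = 1.0, 1.0  # illustrative numerical apertures
res_nm = wavelength_nm / (na_obj + na_illum)
```

With these assumed numbers the estimate lands in the mid-200 nm range, the same order as the 250 nm lateral resolution quoted in the abstract.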

  17. The ABLE ACE wavefront sensor

    NASA Astrophysics Data System (ADS)

    Butts, Robert R.

    1997-08-01

A low-noise, high-resolution Shack-Hartmann wavefront sensor was included in the ABLE-ACE instrument suite to obtain direct high-resolution phase measurements of the 0.53 micrometer pulsed laser beam propagated through high-altitude atmospheric turbulence. The wavefront sensor employed a Fried geometry using a lenslet array which provided approximately 17 sub-apertures across the pupil. The lenslets focused the light in each sub-aperture onto a 21 by 21 array of pixels in the camera focal plane, with 8 pixels across the central lobe of the diffraction-limited spot. The goal of the experiment was to measure the effects of turbulence in the free atmosphere on propagation, but the wavefront sensor also detected the aberrations induced by the aircraft boundary layer and the receiver aircraft internal beam path. Data analysis methods used to extract the desired atmospheric contribution to the phase measurements from data corrupted by non-atmospheric aberrations are described. Approaches included a reconstruction of the phase as a linear combination of Zernike polynomials coupled with optical estimators, and computation of structure functions of the sub-aperture slopes. The theoretical basis for the data analysis techniques is presented. Results are described, and comparisons with theory and simulations are shown. Estimates of average turbulence strength along the propagation path from the wavefront sensor showed good agreement with other sensors. The Zernike spectra calculated from the wavefront sensor data were consistent with the standard Kolmogorov model of turbulence.
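The slope-based Zernike reconstruction can be sketched as an ordinary least-squares fit: each sub-aperture contributes an x- and a y-slope measurement, both linear in the Zernike coefficients. A minimal sketch with invented low-order modes, geometry, and noise levels (not the ABLE-ACE pipeline):

```python
import numpy as np

# Analytic x/y gradients of a few low-order Zernike-like modes in
# Cartesian form on the unit pupil; names and normalization are illustrative.
modes = {
    "tip":     (lambda x, y: np.ones_like(x),  lambda x, y: np.zeros_like(x)),
    "tilt":    (lambda x, y: np.zeros_like(x), lambda x, y: np.ones_like(x)),
    "defocus": (lambda x, y: 4.0 * x,          lambda x, y: 4.0 * y),
    "astig":   (lambda x, y: 2.0 * x,          lambda x, y: -2.0 * y),
}

rng = np.random.default_rng(2)
g = np.linspace(-0.9, 0.9, 17)            # ~17 sub-apertures across, as in the text
xx, yy = np.meshgrid(g, g)
keep = xx**2 + yy**2 <= 1.0               # sub-aperture centers inside the pupil
x, y = xx[keep], yy[keep]

true_c = np.array([0.5, -0.3, 0.2, 0.1])  # invented mode coefficients
# Stack all x-slopes then all y-slopes; each column is one mode's gradient.
G = np.vstack([np.column_stack([dx(x, y) for dx, _ in modes.values()]),
               np.column_stack([dy(x, y) for _, dy in modes.values()])])
slopes = G @ true_c + rng.normal(0.0, 0.01, G.shape[0])  # noisy measurements
coeffs, *_ = np.linalg.lstsq(G, slopes, rcond=None)
```

With many sub-apertures and modest noise, the least-squares fit recovers the mode coefficients closely; structure functions of the residual slopes would then carry the turbulence statistics discussed in the abstract.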

  18. Optical design of common aperture, common focal plane, multispectral optics for military applications

    NASA Astrophysics Data System (ADS)

    Thompson, Nicholas Allan

    2013-06-01

With recent developments in multispectral detector technology, interest in common aperture, common focal plane multispectral imaging systems is increasing. Such systems are particularly desirable for military applications, where increased levels of target discrimination and identification are required in cost-effective, rugged, lightweight systems. During the optical design of dual waveband or multispectral systems, the options for material selection are limited. This selection becomes even more restrictive for military applications, where material resilience, thermal properties, and color correction must be considered. We discuss the design challenges that lightweight multispectral common aperture systems present, along with some potential design solutions. Consideration is given to material selection for optimum color correction, as well as material resilience and thermal correction. The discussion is supported by design examples currently in development at Qioptiq.

  19. Survey of United States Commercial Satellites in Geosynchronous Earth Orbit

    DTIC Science & Technology

    1994-09-01

Excerpt from the report's table of contents: Imaging Sensors (Return Beam Vidicon Camera; Scanners); Nonimaging Sensors; Imaging Microwave Sensors (Synthetic Aperture Radar); Nonimaging Microwave Sensors (Radar).

  20. Methods for increasing the sensitivity of gamma-ray imagers

    DOEpatents

    Mihailescu, Lucian [Pleasanton, CA; Vetter, Kai M [Alameda, CA; Chivers, Daniel H [Fremont, CA

    2012-02-07

    Methods are presented that increase the position resolution and granularity of double sided segmented semiconductor detectors. These methods increase the imaging resolution capability of such detectors, either used as Compton cameras, or as position sensitive radiation detectors in imagers such as SPECT, PET, coded apertures, multi-pinhole imagers, or other spatial or temporal modulated imagers.

  1. Systems for increasing the sensitivity of gamma-ray imagers

    DOEpatents

    Mihailescu, Lucian; Vetter, Kai M.; Chivers, Daniel H.

    2012-12-11

    Systems that increase the position resolution and granularity of double sided segmented semiconductor detectors are provided. These systems increase the imaging resolution capability of such detectors, either used as Compton cameras, or as position sensitive radiation detectors in imagers such as SPECT, PET, coded apertures, multi-pinhole imagers, or other spatial or temporal modulated imagers.

  2. Optimizing Radiometric Fidelity to Enhance Aerial Image Change Detection Utilizing Digital Single Lens Reflex (DSLR) Cameras

    NASA Astrophysics Data System (ADS)

    Kerr, Andrew D.

    Determining optimal imaging settings and best practices for capturing aerial imagery with consumer-grade digital single lens reflex (DSLR) cameras should enable remote sensing scientists to generate consistent, high-quality, low-cost image data sets. Radiometric optimization, image fidelity, and image capture consistency and repeatability were evaluated in the context of detailed image-based change detection. The impetus for this research is, in part, a dearth of relevant contemporary literature on the use of consumer-grade DSLR cameras for remote sensing and on the best practices associated with their use. The main radiometric control settings on a DSLR camera, namely EV (exposure value), WB (white balance), light metering, ISO, and aperture (f-stop), were altered and controlled over the course of several image capture missions. These variables were compared for their effects on dynamic range, intra-frame brightness variation, visual acuity, temporal consistency, and the detectability of simulated cracks placed in the images. Testing was conducted from a terrestrial rather than an airborne collection platform, owing to the large number of images per collection and the desire to minimize inter-image misregistration. The results point to a range of slightly underexposed exposure values as preferable for change detection and noise minimization. The makeup of the scene, the sensor, and the aerial platform influence the selection of aperture and shutter speed, which, along with other variables, allow estimation of the apparent image motion (AIM) blur in the resulting images. The importance of image edges to the application will in part dictate the lowest usable f-stop, allowing the user to select a more optimal shutter speed and ISO.
    The single most important capture variable is exposure bias (EV): a full dynamic range, a wide distribution of DN values, and high visual contrast and acuity occur around -0.7 to -0.3 EV exposure bias. The ideal sensor gain was found to be ISO 100, with ISO 200 less desirable. This study offers researchers a better understanding of the effects of camera capture settings on RSI pairs and their influence on image-based change detection.
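The exposure relationships discussed in this record can be sketched numerically. The following is a minimal illustration, not taken from the study itself: the standard photographic exposure value EV = log2(N²/t) for f-number N and shutter time t, the fraction of metered light delivered by a negative EV bias, and a simple apparent-image-motion blur estimate from an assumed platform speed and ground sample distance.

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """Standard photographic exposure value: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)

def exposure_scale(ev_bias: float) -> float:
    """Fraction of the metered exposure delivered by an EV bias."""
    return 2.0 ** ev_bias

def motion_blur_px(speed_mps: float, shutter_s: float, gsd_m: float) -> float:
    """Apparent image motion during the exposure, in pixels."""
    return speed_mps * shutter_s / gsd_m

ev = exposure_value(8.0, 1 / 500)           # f/8 at 1/500 s -> EV ~ 15.0
frac = exposure_scale(-0.7)                 # -0.7 EV bias -> ~62% of metered light
blur = motion_blur_px(30.0, 1 / 500, 0.05)  # 30 m/s at 5 cm GSD -> ~1.2 px
```

The -0.7 EV figure corresponds to the low end of the underexposure range the study found preferable; the 30 m/s speed and 5 cm GSD are illustrative assumptions.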

  3. Wide-Field Optic for Autonomous Acquisition of Laser Link

    NASA Technical Reports Server (NTRS)

    Page, Norman A.; Charles, Jeffrey R.; Biswas, Abhijit

    2011-01-01

    An innovation reported in Two-Camera Acquisition and Tracking of a Flying Target, NASA Tech Briefs, Vol. 32, No. 8 (August 2008), p. 20, used a commercial fish-eye lens and an electronic imaging camera for initially locating objects, with subsequent handover to an actuated narrow-field camera, but it operated only against a dark-sky background. An improved solution involves an optical design based on custom optical components for the wide-field optical system that directly addresses the key limitations in acquiring a laser signal from a moving source such as an aircraft or a spacecraft. The first challenge was to increase the light collection entrance aperture diameter, which was approximately 1 mm in the first prototype. The new design presented here increases this entrance aperture diameter to 4.2 mm, equivalent to a more than 16 times larger collection area. One of the trades made in realizing this improvement was to restrict the field of view to +80 deg. elevation and 360 deg. azimuth. This trade stems from practical considerations: laser beam propagation through the excessively high air mass in the line of sight (LOS) at low elevation angles results in vulnerability to severe atmospheric turbulence and attenuation. An additional benefit of the new design is that the large entrance aperture is maintained even at large off-axis angles when the optic is pointed at zenith. The second critical limitation, implementing spectral filtering in the design, was tackled by collimating the light prior to focusing it onto the focal plane. This allows the placement of the narrow spectral filter in the collimated portion of the beam. For the narrow-band spectral filter to function properly, it is necessary to adequately control the range of incident angles at which received light intercepts the filter. When this angle is restricted via collimation, narrower spectral filtering can be implemented.
    The collimated beam (and the filter) must be relatively large to reduce the incident angle down to only a few degrees. In the presented embodiment, the filter diameter is more than ten times larger than the entrance aperture; specifically, the filter has a clear aperture of about 51 mm. The optical design is refractive and comprises nine custom refractive elements and an interference filter. The restricted maximum angle through the narrow-band filter ensures the efficient use of a 2 nm noise-equivalent-bandwidth optical filter at low elevation angles (where the range is longest), at the expense of lower efficiency at high elevations, which can be tolerated because the range at high elevation angles is shorter. The image circle is 12 mm in diameter, mapped to 80 deg. × 360 deg. of sky, centered on the zenith.
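The reason incidence angle must be controlled is that an interference filter's passband blue-shifts with angle, approximately as λ(θ) = λ0·sqrt(1 − (sin θ / n_eff)²). A rough sketch of that shift, where both the center wavelength (1064 nm) and the effective index (n_eff = 2.0) are assumed illustrative values, not figures from this design:

```python
import math

def filter_center_nm(lam0_nm: float, theta_deg: float, n_eff: float = 2.0) -> float:
    """Center wavelength of an interference filter at incidence angle theta,
    using the standard thin-film blue-shift approximation."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lam0_nm * math.sqrt(1.0 - s * s)

lam0 = 1064.0                                  # assumed center wavelength, nm
shift_nm = lam0 - filter_center_nm(lam0, 5.0)  # ~1 nm blue-shift at 5 deg
```

Even a few degrees of incidence angle produces a shift comparable to a 2 nm bandwidth, which is why the collimated beam (and hence the filter) must be large.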

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolotnikov, A. E., E-mail: bolotnik@bnl.gov; Ackley, K.; Camarda, G. S.

    We developed a robust and low-cost array of virtual Frisch-grid CdZnTe detectors coupled to a front-end readout application-specific integrated circuit (ASIC) for spectroscopy and imaging of gamma rays. The array operates as a self-reliant detector module. It is comprised of 36 close-packed 6 × 6 × 15 mm³ detectors grouped into 3 × 3 sub-arrays of 2 × 2 detectors with common cathodes. The front-end analog ASIC accommodates up to 36 anode and 9 cathode inputs. Several detector modules can be integrated into a single- or multi-layer unit operating as a Compton or a coded-aperture camera. We present the results from testing two fully assembled modules and readout electronics. Further enhancement of the arrays' performance and reduction of their cost are possible by using position-sensitive virtual Frisch-grid detectors, which allow for accurate correction of response non-uniformities caused by crystal defects.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolotnikov, A. E.; Ackley, K.; Camarda, G. S.

    We developed a robust and low-cost array of virtual Frisch-grid CdZnTe (CZT) detectors coupled to a front-end readout ASIC for spectroscopy and imaging of gamma rays. The array operates as a self-reliant detector module. It is comprised of 36 close-packed 6 × 6 × 15 mm³ detectors grouped into 3 × 3 sub-arrays of 2 × 2 detectors with common cathodes. The front-end analog ASIC accommodates up to 36 anode and 9 cathode inputs. Several detector modules can be integrated into a single- or multi-layer unit operating as a Compton or a coded-aperture camera. We present the results from testing two fully assembled modules and readout electronics. Further enhancement of the arrays' performance and reduction of their cost are made possible by using position-sensitive virtual Frisch-grid detectors, which allow for accurate correction of response non-uniformities caused by crystal defects.

  6. THE DARK ENERGY CAMERA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaugher, B.; Diehl, H. T.; Alvarez, O.

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
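As a quick consistency check on the quoted numbers (illustrative arithmetic only, not from the paper), the plate scale, pixel size, and effective focal length are tied together by the small-angle relation scale[″/px] = 206265 · pixel / f:

```python
ARCSEC_PER_RAD = 206265.0

def focal_length_m(pixel_m: float, plate_scale_arcsec_per_px: float) -> float:
    """Effective focal length implied by a plate scale (small-angle approximation)."""
    return ARCSEC_PER_RAD * pixel_m / plate_scale_arcsec_per_px

f_eff = focal_length_m(15e-6, 0.263)   # ~11.8 m effective focal length
```

An effective focal length of roughly 11.8 m is consistent with the prime focus of a 4 m telescope at roughly f/2.9.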

  7. The Dark Energy Camera

    DOE PAGES

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15 μm x 15 μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  8. A Planar Two-Dimensional Superconducting Bolometer Array for the Green Bank Telescope

    NASA Technical Reports Server (NTRS)

    Benford, Dominic; Staguhn, Johannes G.; Chervenak, James A.; Chen, Tina C.; Moseley, S. Harvey; Wollack, Edward J.; Devlin, Mark J.; Dicker, Simon R.; Supanich, Mark

    2004-01-01

    In order to provide high sensitivity rapid imaging at 3.3mm (90GHz) for the Green Bank Telescope - the world's largest steerable aperture - a camera is being built by the University of Pennsylvania, NASA/GSFC, and NRAO. The heart of this camera is an 8x8 close-packed, Nyquist-sampled detector array. We have designed and are fabricating a functional superconducting bolometer array system using a monolithic planar architecture. Read out by SQUID multiplexers, the superconducting transition edge sensors will provide fast, linear, sensitive response for high performance imaging. This will provide the first ever superconducting bolometer array on a facility instrument.
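The quoted band can be checked with λ = c/f, and a "Nyquist-sampled" focal-plane detector spacing follows the usual F#·λ/2 rule. A small sketch, where the F/2 value is an assumed illustration rather than a figure from the instrument:

```python
C_MPS = 299_792_458.0  # speed of light, m/s

wavelength_mm = C_MPS / 90e9 * 1e3   # 90 GHz -> ~3.33 mm, matching the quoted 3.3 mm

def nyquist_pitch_mm(f_number: float, wavelength_mm: float) -> float:
    """Focal-plane pitch for Nyquist sampling of the diffraction-limited beam."""
    return f_number * wavelength_mm / 2.0

pitch = nyquist_pitch_mm(2.0, wavelength_mm)   # ~3.3 mm pitch at an assumed F/2
```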

  9. Spatial and Angular Resolution Enhancement of Light Fields Using Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Gul, M. Shahzeb Khan; Gunturk, Bahadir K.

    2018-05-01

    Light field imaging extends traditional photography by capturing both the spatial and angular distribution of light, which enables new capabilities, including post-capture refocusing, post-capture aperture control, and depth estimation from a single shot. Micro-lens array (MLA) based light field cameras offer a cost-effective approach to capturing light fields. A major drawback of MLA-based light field cameras is low spatial resolution, because a single image sensor is shared to capture both spatial and angular information. In this paper, we present a learning-based light field enhancement approach in which both the spatial and angular resolution of the captured light field are enhanced using convolutional neural networks. The proposed method is tested with real light field data captured with a Lytro light field camera, clearly demonstrating spatial and angular resolution improvement.

  10. Spatial and Angular Resolution Enhancement of Light Fields Using Convolutional Neural Networks.

    PubMed

    Gul, M Shahzeb Khan; Gunturk, Bahadir K

    2018-05-01

    Light field imaging extends traditional photography by capturing both the spatial and angular distribution of light, which enables new capabilities, including post-capture refocusing, post-capture aperture control, and depth estimation from a single shot. Micro-lens array (MLA) based light field cameras offer a cost-effective approach to capturing light fields. A major drawback of MLA-based light field cameras is low spatial resolution, because a single image sensor is shared to capture both spatial and angular information. In this paper, we present a learning-based light field enhancement approach in which both the spatial and angular resolution of the captured light field are enhanced using convolutional neural networks. The proposed method is tested with real light field data captured with a Lytro light field camera, clearly demonstrating spatial and angular resolution improvement.

  11. Prototype color field sequential television lens assembly

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design, development, and evaluation of a prototype modular lens assembly with a self-contained field sequential color wheel is presented. The design of a color wheel of maximum efficiency, the selection of spectral filters, and the design of a quiet, efficient wheel drive system are included. Design tradeoffs considered for each aspect of the modular assembly are discussed. Emphasis is placed on achieving a design which can be attached directly to an unmodified camera, thus permitting use of the assembly in evaluating various candidate camera and sensor designs. A technique is described which permits maintaining high optical efficiency with an unmodified camera. A motor synchronization system is developed which requires only the vertical synchronization signal as a reference frequency input. Equations and tradeoff curves are developed to permit optimizing the filter wheel aperture shapes for a variety of different design conditions.

  12. Progress in the hyperspectral payload for PRISMA programme

    NASA Astrophysics Data System (ADS)

    Meini, Marco; Battazza, Fabrizio; Formaro, Roberto; Bini, Alessandro

    2013-10-01

    The PRISMA (PRecursore IperSpettrale della Missione Applicativa) Programme is an ASI (Agenzia Spaziale Italiana) hyperspectral mission for Earth observation based on a single mono-payload satellite. An Italian consortium is in charge of realizing the mission; Selex ES has full responsibility for the hyperspectral payload, composed of a high-spectral-resolution spectrometer optically integrated with a medium-resolution panchromatic camera. The optical design covers the wavelength range from 400 to 2500 nm and is based on high-transmittance optical assemblies, including a reflective common telescope in Three-Mirror Anastigmat (TMA) configuration, a single slit aperture, a panchromatic camera (700-900 nm), and a spectrometer with two channels (VNIR and SWIR), each using a suitable prism configuration and spectrally separated by a beam splitter, conceived to minimize the number of optical elements. High-performance MCT-based detectors represent the core of the instrument. To provide the required data quality for the entire mission lifetime (5 years), an accurate and stable calibration unit (radiometric and spectral) is integrated for in-flight instrument calibration. The thermal design is based on a passive cooling system: a double-stage radiator, suitably oriented and protected from unwanted heat fluxes, high-performance heat pipes, and an operational heater network are the solutions adopted to achieve the required thermal stability.

  13. Design of a high-numerical-aperture digital micromirror device camera with high dynamic range.

    PubMed

    Qiao, Yang; Xu, Xiping; Liu, Tao; Pan, Yue

    2015-01-01

    A high-NA imaging system with high dynamic range, based on a digital micromirror device (DMD), is presented. The DMD camera consists of an objective imaging system and a relay imaging system, connected by a DMD chip. With the introduction of a total internal reflection prism system, the objective imaging system is designed with a working F/# of 1.97, breaking through the F/2.45 limitation of conventional DMD projection lenses. For the relay imaging system, an off-axis design that corrects the off-axis aberrations of the tilted relay imaging system is developed. This structure has the advantage of increasing the NA of the imaging system while maintaining a compact size. Investigation revealed that the dynamic range of a DMD camera could be increased substantially, by a factor of 2.41. We built a prototype DMD camera with a working F/# of 1.23, and field experiments proved the validity and reliability of our work.

  14. Single-event transient imaging with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor.

    PubMed

    Mochizuki, Futa; Kagawa, Keiichiro; Okihara, Shin-ichiro; Seo, Min-Woong; Zhang, Bo; Takasawa, Taishi; Yasutomi, Keita; Kawahito, Shoji

    2016-02-22

    In the work described in this paper, an image reproduction scheme with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor was demonstrated. The sensor captures an object by compressing a sequence of images with focal-plane temporally random-coded shutters, followed by reconstruction of time-resolved images. Because signals are modulated pixel-by-pixel during capturing, the maximum frame rate is defined only by the charge transfer speed and can thus be higher than those of conventional ultra-high-speed cameras. The frame rate and optical efficiency of the multi-aperture scheme are discussed. To demonstrate the proposed imaging method, a 5×3 multi-aperture image sensor was fabricated. The average rising and falling times of the shutters were 1.53 ns and 1.69 ns, respectively. The maximum skew among the shutters was 3 ns. The sensor observed plasma emission by compressing it to 15 frames, and a series of 32 images at 200 Mfps was reconstructed. In the experiment, by correcting disparities and considering temporal pixel responses, artifacts in the reconstructed images were reduced. An improvement in PSNR from 25.8 dB to 30.8 dB was confirmed in simulations.
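For reference, the PSNR figures quoted above relate to mean squared error as PSNR = 10·log10(peak²/MSE), so the reported 5 dB gain corresponds to roughly a 3.2× reduction in MSE. A small sketch of that relation:

```python
import math

def psnr_db(mse: float, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB for a given mean squared error."""
    return 10.0 * math.log10(peak ** 2 / mse)

def mse_for_psnr(psnr: float, peak: float = 1.0) -> float:
    """Inverse of psnr_db: the MSE implied by a PSNR figure."""
    return peak ** 2 / 10.0 ** (psnr / 10.0)

mse_ratio = mse_for_psnr(25.8) / mse_for_psnr(30.8)   # ~3.16x lower error
```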

  15. A Simple Model of the Accommodating Lens of the Human Eye

    ERIC Educational Resources Information Center

    Oommen, Vinay; Kanthakumar, Praghalathan

    2014-01-01

    The human eye is often discussed as optically equivalent to a photographic camera. The iris is compared with the shutter, the pupil to the aperture, and the retina to the film, and both have lens systems to focus rays of light. Although many similarities exist, a major difference between the two systems is the mechanism involved in focusing an…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsutani, Takaomi; Taya, Masaki; Ikuta, Takashi

    A parallel image detection system using an annular pupil for electron optics was developed to realize an increased depth of focus, aberration-free imaging, and separation of amplitude and phase images in scanning transmission electron microscopy (STEM). Apertures for annular pupils able to suppress high-energy electron scattering were fabricated using a focused ion beam (FIB) technique. The annular apertures were designed with an outer diameter of 40 μm and an inner diameter of 32 μm. A taper angle varying from 20 deg. to 1 deg. was applied to the slits of the annular apertures to suppress the influence of high-energy electron scattering. Each azimuth-angle image on the scintillator was detected by a multi-anode photomultiplier tube assembly through 40 optical fibers bundled in a ring shape. To focus the image appearing on the scintillator onto the optical fibers, an optical lens relay system with an attached CCD camera was developed. The system enables 40 images to be taken simultaneously from different scattering directions.

  17. KC-135 aero-optical turbulent boundary layer/shear layer experiment revisited

    NASA Technical Reports Server (NTRS)

    Craig, J.; Allen, C.

    1987-01-01

    The aero-optical effects associated with propagating a laser beam through both an aircraft turbulent boundary layer and artificially generated shear layers are examined. The data compare observed optical performance with that inferred from aerodynamic measurements of unsteady density and correlation lengths within the same random flow fields. Using optical instrumentation with tens-of-microseconds temporal resolution through a finite aperture, optical performance degradation was determined and contrasted with the infinite-aperture, time-averaged aerodynamic measurement. In addition, the optical data were artificially clipped for comparison with theoretical scaling calculations. Optical instrumentation consisted of a custom Q-switched Nd:YAG double-pulsed laser and a holographic camera which recorded the random flow field in a double-pass, double-pulse mode. Aerodynamic parameters were measured using hot-film anemometer probes and a five-hole pressure probe. Each technique is described with its associated theoretical basis for comparison. The effects of finite aperture and of the spatial and temporal frequencies of the random flow are considered.

  18. Improvements in low-cost label-free QPI microscope for live cell imaging

    NASA Astrophysics Data System (ADS)

    Seniya, C.; Towers, C. E.; Towers, D. P.

    2017-07-01

    This paper reports an improvement in the development of a low-cost QPI microscope, offering new capabilities in terms of phase measurement accuracy for label-free live samples over longer terms (i.e., hours to days). The spatially separated scattered and non-scattered image light fields are reshaped in the Fourier plane and modulated to form an interference image at a CCD camera. The apertures that generate these two beams have been optimised by means of laser-cut apertures placed on the mirrors of a Michelson interferometer, which has improved the phase measuring and reconstruction capability of the QPI microscope. The microscope was tested with transparent onion cells as the object of interest.

  19. Double Donut Schmidt Camera, a wide-field, large-aperture, and lightweight space telescope for the detection of ultrahigh energy cosmic rays.

    PubMed

    Sandri, Paolo; Mazzinghi, Piero; Da Deppo, Vania

    2018-04-20

    A wide-field, large-aperture, and lightweight Schmidt configuration has been studied for a space mission proposal named Extreme Universe Space Observatory free flyer (EUSO-FF). EUSO-FF will be devoted to the study of ultrahigh energy cosmic rays, i.e., with energy >5×10¹⁹ eV, through the detection of UV fluorescence light emitted by air showers in the Earth's atmosphere. The proposed telescope has a field of view of about 50° and an entrance pupil diameter of 4.2 m. The mirror is deployable and segmented to fit the diameter of the launcher fairing; the corrector is a lightweight annular corona.

  20. Science observations with the IUE using the one-gyro mode

    NASA Technical Reports Server (NTRS)

    Imhoff, C.; Pitts, R.; Arquilla, R.; Shrader, Chris R.; Perez, M. R.; Webb, J.

    1990-01-01

    The International Ultraviolet Explorer (IUE) attitude control system originally included an inertial reference package containing six gyroscopes for three axis stabilization. The science instrument includes a prime and redundant Field Error Sensor (FES) camera for target acquisition and offset guiding. Since launch, four of the six gyroscopes have failed. The current attitude control system utilizes the remaining two gyros and a Fine Sun Sensor (FSS) for three axis stabilization. When the next gyro fails, a new attitude control system will be uplinked which will rely on the remaining gyro and the FSS for general three axis stabilization. In addition to the FSS, the FES cameras will be required to assist in maintaining fine attitude control during target acquisition. This has required thoroughly determining the characteristics of the FES cameras and the spectrograph aperture plate as well as devising new target acquisition procedures. The results of this work are presented.

  1. Science observations with the IUE using the one-gyro mode

    NASA Technical Reports Server (NTRS)

    Imhoff, C.; Pitts, R.; Arquilla, R.; Shrader, C.; Perez, M.; Webb, J.

    1990-01-01

    The International Ultraviolet Explorer (IUE) attitude control system originally included an inertial reference package containing six gyroscopes for three axis stabilization. The science instrument includes a prime and redundant Field Error Sensor (FES) camera for target acquisition and offset guiding. Since launch, four of the six gyroscopes have failed. The current attitude control system utilizes the remaining two gyros and a Fine Sun Sensor (FSS) for three axis stabilization. When the next gyro fails, a new attitude control system will be uplinked, which will rely on the remaining gyro and the FSS for general three axis stabilization. In addition to the FSS, the FES cameras will be required to assist in maintaining fine attitude control during target acquisition. This has required thoroughly determining the characteristics of the FES cameras and the spectrograph aperture plate as well as devising new target acquisition procedures. The results of this work are presented.

  2. Common aperture multispectral optics for military applications

    NASA Astrophysics Data System (ADS)

    Thompson, N. A.

    2012-06-01

    With recent developments in multi-spectral detector technology, interest in common aperture, common focal plane multi-spectral imaging systems is increasing. Such systems are particularly desirable for military applications, where increased levels of target discrimination and identification are required in cost-effective, rugged, lightweight systems. During the optical design of dual waveband or multi-spectral systems, the options for material selection are limited. This selection becomes even more restrictive for military applications, as material resilience and thermal properties must be considered in addition to colour correction. In this paper we discuss the design challenges that lightweight multi-spectral common aperture systems present, along with some potential design solutions. Consideration will be given to material selection for optimum colour correction, as well as material resilience and thermal correction. This discussion is supported using design examples currently in development at Qioptiq.

  3. Collimator of multiple plates with axially aligned identical random arrays of apertures

    NASA Technical Reports Server (NTRS)

    Hoover, R. B.; Underwood, J. H. (Inventor)

    1973-01-01

    A collimator is disclosed for examining the spatial location of distant sources of radiation and for imaging by projection, small, near sources of radiation. The collimator consists of a plurality of plates, all of which are pierced with an identical random array of apertures. The plates are mounted perpendicular to a common axis, with like apertures on consecutive plates axially aligned so as to form radiation channels parallel to the common axis. For near sources, the collimator is interposed between the source and a radiation detector and is translated perpendicular to the common axis so as to project radiation traveling parallel to the common axis incident to the detector. For far sources the collimator is scanned by rotating it in elevation and azimuth with a detector to determine the angular distribution of the radiation from the source.

  4. Light field reconstruction robust to signal dependent noise

    NASA Astrophysics Data System (ADS)

    Ren, Kun; Bian, Liheng; Suo, Jinli; Dai, Qionghai

    2014-11-01

    Capturing four-dimensional light field data sequentially using a coded aperture camera is an effective approach but suffers from a low signal-to-noise ratio. Although multiplexing can help raise acquisition quality, noise is still a major issue, especially for fast acquisition. To address this problem, this paper proposes a noise-robust light field reconstruction method. First, a scene-dependent noise model is studied and incorporated into the light field reconstruction framework. Then, we derive an optimization algorithm for the final reconstruction. We built a prototype by hacking an off-the-shelf camera for data capture and proved the concept. The effectiveness of this method is validated with experiments on the real captured data.
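The multiplexing idea can be sketched as a linear inverse problem: each capture sums a subset of light field "slices" through a binary mask, and reconstruction inverts the stacked mask matrix, here weighting each measurement by its signal-dependent noise variance. This is an illustrative toy (triangular masks chosen purely to guarantee invertibility), not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
x_true = rng.random(n) + 0.5          # true per-slice intensities

# Binary multiplexing masks; row i opens slots i..n-1 (invertible by construction)
A = np.triu(np.ones((n, n)))

y = A @ x_true
y_noisy = y + rng.normal(0.0, 0.01 * np.sqrt(y))   # signal-dependent (Poisson-like) noise

# Weighted least squares: down-weight measurements with larger noise variance
w = 1.0 / np.sqrt(np.clip(y_noisy, 1e-6, None))
x_hat, *_ = np.linalg.lstsq(A * w[:, None], y_noisy * w, rcond=None)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

Weighting by the modeled noise variance is the standard way a scene-dependent noise model enters a least-squares reconstruction.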

  5. Evaluation of S190A radiometric exposure test data

    NASA Technical Reports Server (NTRS)

    Lockwood, H. E.; Goodding, R. A.

    1974-01-01

    The S190A preflight radiometric exposure test data generated as part of preflight and system test of KM-002 Sequence 29 on flight camera S/N 002 was analyzed. The analysis was to determine camera system transmission using available data which included: (1) films exposed to a calibrated light source subject; (2) filter transmission data; (3) calibrated light source data; (4) density vs. log10 exposure curves for the films; and (5) spectral sensitometric data for the films. The procedure used is outlined, and includes the data and a transmission matrix as a function of field position for nine measured points on each station-film-filter-aperture-shutter speed combination.

  6. High-speed light field camera and frequency division multiplexing for fast multi-plane velocity measurements.

    PubMed

    Fischer, Andreas; Kupsch, Christian; Gürtler, Johannes; Czarske, Jürgen

    2015-09-21

    Non-intrusive, fast 3D measurements of volumetric velocity fields are necessary for understanding complex flows. Using high-speed cameras and spectroscopic measurement principles, where the Doppler frequency of scattered light is evaluated within the illuminated plane, each pixel allows one measurement, and thus planar measurements with high data rates are possible. While scanning is a standard technique for adding the third dimension, the volumetric data are not acquired simultaneously. To overcome this drawback, a high-speed light field camera is proposed for obtaining volumetric data in each single frame. The high-speed light field camera approach is applied to a Doppler global velocimeter with sinusoidal laser frequency modulation. As a result, a frequency multiplexing technique is required in addition to the plenoptic refocusing to eliminate crosstalk between the measurement planes. However, plenoptic refocusing is still necessary to achieve a large refocusing range for a high numerical aperture, which minimizes measurement uncertainty. Finally, two spatially separated measurement planes with 25×25 pixels each are simultaneously acquired at a measurement rate of 0.5 kHz with a single high-speed camera.
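The spectroscopic principle referenced above rests on the laser Doppler shift; for direct backscatter it is f_D = 2v/λ (other observation geometries introduce a direction factor). A minimal sketch, where the 532 nm wavelength is an assumed illustration rather than a parameter of this instrument:

```python
def doppler_shift_hz(velocity_mps: float, wavelength_m: float) -> float:
    """Doppler frequency for direct backscatter: f_D = 2 v / lambda.
    Other scattering geometries scale this by a direction factor."""
    return 2.0 * velocity_mps / wavelength_m

f_d = doppler_shift_hz(10.0, 532e-9)   # ~37.6 MHz for 10 m/s at an assumed 532 nm
```

Shifts in the tens of MHz explain why a frequency-resolving (spectroscopic) detection scheme, rather than direct imaging, is needed to read velocity per pixel.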

  7. Preliminary calibration results of the wide angle camera of the imaging instrument OSIRIS for the Rosetta mission

    NASA Astrophysics Data System (ADS)

    Da Deppo, V.; Naletto, G.; Nicolosi, P.; Zambolin, P.; De Cecco, M.; Debei, S.; Parzianello, G.; Ramous, P.; Zaccariotto, M.; Fornasier, S.; Verani, S.; Thomas, N.; Barthol, P.; Hviid, S. F.; Sebastian, I.; Meller, R.; Sierks, H.; Keller, H. U.; Barbieri, C.; Angrilli, F.; Lamy, P.; Rodrigo, R.; Rickman, H.; Wenzel, K. P.

    2017-11-01

    Rosetta is one of the cornerstone missions of the European Space Agency and will rendezvous with comet 67P/Churyumov-Gerasimenko in 2014. The imaging instrument on board the satellite is OSIRIS (Optical, Spectroscopic and Infrared Remote Imaging System), a collaboration among several European institutes, which consists of two cameras: a Narrow Angle Camera (NAC) and a Wide Angle Camera (WAC). The WAC optical design is innovative: it adopts an all-reflecting, unvignetted, and unobstructed two-mirror configuration which covers a 12° × 12° field of view with an F/5.6 aperture and gives a nominal contrast ratio of about 10⁻⁴. The flight model of this camera has been successfully integrated and tested in our laboratories, and has now been integrated on the satellite, which is awaiting launch in February 2004. In this paper we describe the optical characteristics of the camera and summarize the results obtained so far with the preliminary calibration data. The analysis of the optical performance of this model shows good agreement between theoretical performance and experimental results.

  8. Experimental demonstration of tri-aperture Differential Synthetic Aperture Ladar

    NASA Astrophysics Data System (ADS)

    Zhao, Zhilong; Huang, Jianyu; Wu, Shudong; Wang, Kunpeng; Bai, Tao; Dai, Ze; Kong, Xinyi; Wu, Jin

    2017-04-01

    A tri-aperture Differential Synthetic Aperture Ladar (DSAL) is demonstrated in the laboratory, configured by using one common aperture to transmit the illuminating laser and another two along-track receiving apertures to collect the back-scattered laser signal for optical heterodyne detection. The image formation theory for this tri-aperture DSAL shows that there are two possible methods to reconstruct the azimuth Phase History Data (PHD) for aperture synthesis following the standard DSAL principle, with each method resulting in a different matched filter and hence a different azimuth image resolution. The experimental setup of the tri-aperture DSAL adopts a frequency-chirped laser of about 40 mW in the 1550 nm wavelength range as the illuminating source, and an optical isolator composed of a polarizing beam-splitter and a quarter-wave plate to virtually align the three apertures in the along-track direction. Various DSAL images at target distances up to 12.9 m are demonstrated using both PHD reconstruction methods.

  9. VizieR Online Data Catalog: Astrometric monitoring of ultracool dwarf binaries (Dupuy+, 2017)

    NASA Astrophysics Data System (ADS)

    Dupuy, T. J.; Liu, M. C.

    2017-09-01

    In Table 1 we list all 33 binaries in our Keck+CFHT astrometric monitoring sample, along with three other binaries that have published orbit and parallax measurements. We began obtaining resolved Keck AO astrometry in 2007-2008, and we combined our new astrometry with available data in the literature or public archives (e.g., HST and Gemini) to refine our orbital period estimates and thereby our prioritization for Keck observations. We present here new Keck/NIRC2 AO imaging and non-redundant aperture-masking observations, in addition to a re-analysis of our own previously published data and publicly available archival data for our sample binaries. Table 2 gives our measured astrometry and flux ratios for all Keck AO data used in our orbital analysis spanning 2003 Apr 15 to 2016 May 13. In total there are 339 distinct measurements (unique bandpass and epoch for a given target), where 302 of these are direct imaging and 37 are non-redundant aperture masking. Eight of the imaging measurements are from six unpublished archival data sets. See section 3.1.1 for further details. In addition to our Keck AO monitoring, we also obtained data for three T dwarf binaries over a three-year HST program using the Advanced Camera for Surveys (ACS) Wide Field Camera (WFC) in the F814W bandpass. See section 3.1.2 for further details. Many of our sample binaries have HST imaging data in the public archive. We have re-analyzed the available archival data coming from the WFPC2 Planetary Camera (WFPC2-PC1), ACS High Resolution Channel (ACS-HRC), and NICMOS Camera 1 (NICMOS-NIC1). See section 3.1.3 for further details. We present here an updated analysis of our data from the Hawaii Infrared Parallax Program that uses the CFHT facility infrared camera WIRCam. Our observing strategy and custom astrometry pipeline are described in detail in Dupuy & Liu (2012, J/ApJS/201/19). See section 3.2 for further explanations. (10 data files).

  10. Defining ray sets for the analysis of lenslet-based optical systems including plenoptic cameras and Shack-Hartmann wavefront sensors

    NASA Astrophysics Data System (ADS)

    Moore, Lori

    Plenoptic cameras and Shack-Hartmann wavefront sensors are lenslet-based optical systems that do not form a conventional image. The addition of a lens array into these systems allows for the aberrations generated by the combination of the object and the optical components located prior to the lens array to be measured or corrected with post-processing. This dissertation provides a ray selection method to determine the rays that pass through each lenslet in a lenslet-based system. This first-order, ray trace method is developed for any lenslet-based system with a well-defined fore optic, where in this dissertation the fore optic is all of the optical components located prior to the lens array. For example, in a plenoptic camera the fore optic is a standard camera lens. Because a lens array at any location after the exit pupil of the fore optic is considered in this analysis, it is applicable to both plenoptic cameras and Shack-Hartmann wavefront sensors. Only a generic, unaberrated fore optic is considered, but this dissertation establishes a framework for considering the effect of an aberrated fore optic in lenslet-based systems. The rays from the fore optic that pass through a lenslet placed at any location after the fore optic are determined. This collection of rays is reduced to three rays that describe the entire lenslet ray set. The lenslet ray set is determined at the object, image, and pupil planes of the fore optic. The consideration of the apertures that define the lenslet ray set for an on-axis lenslet leads to three classes of lenslet-based systems. Vignetting of the lenslet rays is considered for off-axis lenslets. Finally, the lenslet ray set is normalized into terms similar to the field and aperture vector used to describe the aberrated wavefront of the fore optic. 
The analysis in this dissertation is complementary to other first-order models that have been developed for a specific plenoptic camera layout or Shack-Hartmann wavefront sensor application. This general analysis determines the location where the rays of each lenslet pass through the fore optic, establishing a framework for considering the effect of an aberrated fore optic in a future analysis.
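    The first-order machinery this dissertation builds on can be illustrated with paraxial ABCD ray tracing: propagate a ray from the object through a thin-lens fore optic to the lens-array plane, then test its height against a lenslet aperture. All quantities below (focal length, distances, semi-aperture, lenslet pitch) are hypothetical numbers for illustration.

```python
import numpy as np

def thin_lens(f):
    # Paraxial thin-lens matrix acting on a ray state [height, angle]
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def propagate(d):
    # Free-space propagation over distance d
    return np.array([[1.0, d], [0.0, 1.0]])

# Hypothetical fore optic: f = 50 mm thin lens, lens array 75 mm behind it.
system = propagate(75.0) @ thin_lens(50.0)

# Marginal ray from an axial object 100 mm in front of the lens,
# striking the lens at its 10 mm semi-aperture: state = [height, angle].
ray_at_lens = propagate(100.0) @ np.array([0.0, 10.0 / 100.0])
height_at_array, angle_at_array = system @ ray_at_lens

# The ray enters the lenslet centered at yc (pitch p) if |height - yc| <= p/2;
# here we test the on-axis lenslet with an assumed 0.5 mm pitch.
hits_central_lenslet = abs(height_at_array) <= 0.5 / 2.0
```

    Repeating this trace for the chief and both marginal rays of the fore optic yields the three rays that bound each lenslet's ray set, in the spirit of the dissertation's ray-selection method.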

  11. Differential Optical Synthetic Aperture Radar

    DOEpatents

    Stappaerts, Eddy A.

    2005-04-12

    A new differential technique for forming optical images using a synthetic aperture is introduced. This differential technique utilizes a single aperture to obtain unique (N) phases that can be processed to produce a synthetic aperture image at points along a trajectory. This is accomplished by dividing the aperture into two equal "subapertures", each having a width that is less than the actual aperture, along the direction of flight. As the platform flies along a given trajectory, a source illuminates objects and the two subapertures are configured to collect return signals. The technique of the invention is designed to cancel common-mode errors, trajectory deviations from a straight line, and laser phase noise to provide the set of resultant (N) phases that can produce an image having a spatial resolution corresponding to a synthetic aperture.
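    The common-mode cancellation at the heart of this patent can be seen in a toy model: the two subaperture returns share the same laser/trajectory phase noise, so their conjugate product leaves only the differential target phase. The phase history and noise values below are invented for the demonstration and are not from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
target_phase = np.linspace(0.0, 4.0 * np.pi, n)   # assumed azimuth phase history
common_noise = rng.normal(0.0, 2.0, n)            # shared laser/trajectory phase noise

# Returns on the two subapertures: the leading subaperture sees the target
# phase advanced by one along-track sample (a toy stand-in for its offset;
# the last sample wraps around and would be discarded in practice).
s_trail = np.exp(1j * (target_phase + common_noise))
s_lead = np.exp(1j * (np.roll(target_phase, -1) + common_noise))

# The conjugate product cancels the common-mode phase exactly,
# leaving the differential phase from which the image is formed.
diff_phase = np.angle(s_lead * np.conj(s_trail))
```

    Even with phase noise far exceeding 2π, the recovered differential phase matches the clean phase increment, which is why the method tolerates laser phase noise and trajectory deviations.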

  12. A generalized method for determining radiation patterns of aperture antennas and its application to reflector antennas. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Paknys, J. R.

    1982-01-01

    The reflector antenna may be thought of as an aperture antenna. The classical solution for the radiation pattern of such an antenna is found by the aperture integration (AI) method. Success with this method depends on how accurately the aperture currents are known beforehand. In the past, geometrical optics (GO) has been employed to find the aperture currents. This approximation is suitable for calculating the main beam and possibly the first few sidelobes. A better approximation is to use aperture currents calculated from the geometrical theory of diffraction (GTD). Integration of the GTD currents over an extended aperture yields more accurate results for the radiation pattern. This approach is useful when conventional AI and GTD solutions have no common region of validity. This problem arises in reflector antennas. Two-dimensional models of parabolic reflectors are studied; however, the techniques discussed can be applied to any aperture antenna.
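    The aperture-integration step itself is straightforward to state numerically: the far-field pattern is the squared magnitude of a Fourier-type integral over the aperture currents. The sketch below uses a uniform one-dimensional aperture with illustrative dimensions, not the thesis's GTD-derived currents.

```python
import numpy as np

def far_field_pattern(aperture_field, x, theta, wavelength):
    """Scalar aperture integration: P(theta) is proportional to
    |integral of E(x) * exp(j k x sin(theta)) dx|^2, evaluated as a
    Riemann sum and normalized to its peak."""
    k = 2.0 * np.pi / wavelength
    kernel = np.exp(1j * k * np.outer(np.sin(theta), x))
    integral = (kernel * aperture_field).sum(axis=1) * (x[1] - x[0])
    pattern = np.abs(integral) ** 2
    return pattern / pattern.max()

# Uniform currents over a 10-wavelength aperture (illustrative choice).
x = np.linspace(-5.0, 5.0, 1001)                   # aperture coordinate
theta = np.radians(np.linspace(-30.0, 30.0, 601))  # observation angles
p = far_field_pattern(np.ones_like(x), x, theta, wavelength=1.0)
```

    Replacing the uniform field with GO- or GTD-derived currents changes only the `aperture_field` argument, which is the sense in which AI accuracy depends on how well the aperture currents are known beforehand.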

  13. Aspheric and freeform surfaces metrology with software configurable optical test system: a computerized reverse Hartmann test

    NASA Astrophysics Data System (ADS)

    Su, Peng; Khreishi, Manal A. H.; Su, Tianquan; Huang, Run; Dominguez, Margaret Z.; Maldonado, Alejandro; Butel, Guillaume; Wang, Yuhao; Parks, Robert E.; Burge, James H.

    2014-03-01

    A software configurable optical test system (SCOTS) based on deflectometry was developed at the University of Arizona for rapidly, robustly, and accurately measuring precision aspheric and freeform surfaces. SCOTS uses a camera with an external stop to realize a Hartmann test in reverse. With the external camera stop as the reference, a coordinate measuring machine can be used to calibrate the SCOTS test geometry to a high accuracy. Systematic errors from the camera are carefully investigated and controlled. Camera pupil imaging aberration is removed with the external aperture stop. Imaging aberration and other inherent errors are suppressed with an N-rotation test. The performance of the SCOTS test is demonstrated with the measurement results from a 5-m-diameter Large Synoptic Survey Telescope tertiary mirror and an 8.4-m-diameter Giant Magellan Telescope primary mirror. The results show that SCOTS can be used as a large-dynamic-range, high-precision, and non-null test method for precision aspheric and freeform surfaces. The SCOTS test can achieve measurement accuracy comparable to traditional interferometric tests.
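    Deflectometry tests such as SCOTS return surface slopes rather than the surface itself, so a reconstruction (integration) step is always implied. A minimal one-dimensional sketch, with an invented profile standing in for a real mirror:

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 201)
true_surface = x**4 - x**2           # stand-in aspheric profile (assumed)
slopes = 4.0 * x**3 - 2.0 * x        # slope data a SCOTS-like test would measure

# Cumulative trapezoidal integration recovers the surface up to a piston term.
dx = x[1] - x[0]
surface = np.concatenate([[0.0], np.cumsum(0.5 * (slopes[1:] + slopes[:-1]) * dx)])
surface += true_surface[0] - surface[0]   # remove the arbitrary piston for comparison
```

    Real SCOTS reconstructions integrate two-dimensional slope maps (e.g., with zonal or modal fitting), but the slope-to-surface step and the piston ambiguity carry over directly.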

  14. Improving the photometric precision of IRAC Channel 1

    NASA Astrophysics Data System (ADS)

    Mighell, Kenneth J.; Glaccum, William; Hoffmann, William

    2008-07-01

    Planning is underway for a possible post-cryogenic mission with the Spitzer Space Telescope. Only Channels 1 and 2 (3.6 and 4.5 μm) of the Infrared Array Camera (IRAC) will be operational; they will have unmatched sensitivity from 3 to 5 microns until the James Webb Space Telescope is launched. At SPIE Orlando, Mighell described his NASA-funded MATPHOT algorithm for precision stellar photometry and astrometry and presented MATPHOT-based simulations that suggested Channel 1 stellar photometry may be significantly improved by modeling the nonuniform RQE within each pixel, which, when not taken into account in aperture photometry, causes the derived flux to vary according to where the centroid falls within a single pixel (the pixel-phase effect). We analyze archival observations of calibration stars and compare the precision of stellar aperture photometry, with the recommended 1-dimensional and a new 2-dimensional pixel-phase aperture-flux correction, and MATPHOT-based PSF-fitting photometry which accounts for the observed loss of stellar flux due to the nonuniform intrapixel quantum efficiency. We show how the precision of aperture photometry of bright isolated stars corrected with the new 2-dimensional aperture-flux correction function can yield photometry that is almost as precise as that produced by PSF-fitting procedures. This timely research effort is intended to enhance the science return not only of observations already in Spitzer data archive but also those that would be made during the Spitzer Warm Mission.
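    The pixel-phase correction concept can be sketched as dividing the measured aperture flux by an intrapixel response evaluated at the centroid's sub-pixel offset. The quadratic response model and its coefficients below are illustrative assumptions, not the published IRAC correction.

```python
import numpy as np

def pixel_phase_correct(flux, xc, yc, bx=0.08, by=0.08):
    """Divide out a separable 2-D intrapixel response that peaks at the
    pixel center and falls off toward the edges (hypothetical model)."""
    dx = (xc % 1.0) - 0.5            # centroid offset from the pixel center
    dy = (yc % 1.0) - 0.5
    response = 1.0 - bx * dx**2 - by * dy**2
    return flux / response

# A star centered on a pixel needs no correction...
f_center = pixel_phase_correct(100.0, 10.5, 20.5)
# ...while one landing near a pixel corner is boosted to compensate.
f_corner = pixel_phase_correct(100.0, 10.02, 20.98)
```

    Applying such a 2-D correction removes the centroid-dependent scatter from aperture photometry, which is the effect the record reports as approaching PSF-fitting precision.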

  15. EXPERIMENTS IN LITHOGRAPHY FROM REMOTE SENSOR IMAGERY.

    USGS Publications Warehouse

    Kidwell, R. H.; McSweeney, J.; Warren, A.; Zang, E.; Vickers, E.

    1983-01-01

    Imagery from remote sensing systems such as the Landsat multispectral scanner and return beam vidicon, as well as synthetic aperture radar and conventional optical camera systems, contains information at resolutions far in excess of that which can be reproduced by the lithographic printing process. The data often require special handling to produce both standard and special map products. Some conclusions have been drawn regarding processing techniques, procedures for production, and printing limitations.

  16. Dust Science with SPICA/MCS

    NASA Astrophysics Data System (ADS)

    Sakon, I.; Onaka, T.; Kataza, H.; Wada, T.; Sarugaku, Y.; Matsuhara, H.; Nakagawa, T.; Kobayashi, N.; Kemper, C.; Ohyama, Y.; Matsumoto, T.; Seok, J. Y.

    The Mid-Infrared Camera and Spectrometers (MCS) is one of the focal-plane instruments proposed for the SPICA mission in the pre-project phase. SPICA MCS is equipped with two spectrometers with different spectral resolving powers (R = λ/δλ): a medium-resolution spectrometer (MRS), which covers 12-38 µm with R≃1100-3000, and a high-resolution spectrometer (HRS), which covers 12-18 µm with R≃30000. MCS is also equipped with a Wide Field Camera (WFC), which is capable of performing multi-object grism spectroscopy in addition to imaging observations. A small slit aperture for low-resolution slit spectroscopy is planned to be placed just next to the field-of-view (FOV) aperture for imaging and slit-less spectroscopic observations. MCS covers an important part of the core spectral range of SPICA and, complementary with SAFARI (SpicA FAR-infrared Instrument), can make crucial observations for a number of key science cases to revolutionize our understanding of the lifecycle of dust in the universe. In this article, the latest design specification and the expected performance of SPICA/MCS are introduced. Key science cases that should be targeted by SPICA/MCS have been discussed by the MCS science working group. Among these, some cases related to dust science are briefly introduced.

  17. Design of an adjustable bipod flexure for a large-aperture mirror of a space camera.

    PubMed

    Liu, Bei; Wang, Wei; Qu, Yan-Jun; Li, Xu-Peng; Wang, Xiao; Zhao, Hui

    2018-05-20

    An adjustable bipod flexure (ABF) technique for a large-aperture mirror of a space camera is presented. The proposed flexure mount can decrease the surface distortions caused by the machining error and the assembly error of the mirror assembly (MA) in a horizontal optical testing layout. Through analysis of the compliance matrix of a conventional bipod flexure, the positional relationship between the rotation center and the apex of the flexure is investigated. Then, the principle of the adjustable flexure, known as the trapezoidal switching principle, is proposed based on the analysis result. The structure and application of the flexure are also described. The optical performance of the mirror mounted on the adjustable flexures was analyzed for different misalignments using finite element methods. The result shows that the astigmatic aberration due to gravity is effectively reduced by adjusting the mount, and the root-mean-square value of the mirror surface can be minimized when the misalignment between the flexure pivot and the neutral plane is minimized. New monolithic bipod flexures, based on the optimal regulating variable Δu according to the measurement results, were manufactured to replace the ABFs to secure the mirror's safety against launch loads. Modal analysis verified the mechanical safety of the MA with the new monolithic flexures.

  18. Light field image denoising using a linear 4D frequency-hyperfan all-in-focus filter

    NASA Astrophysics Data System (ADS)

    Dansereau, Donald G.; Bongiorno, Daniel L.; Pizarro, Oscar; Williams, Stefan B.

    2013-02-01

    Imaging in low light is problematic as sensor noise can dominate imagery, and increasing illumination or aperture size is not always effective or practical. Computational photography offers a promising solution in the form of the light field camera, which by capturing redundant information offers an opportunity for elegant noise rejection. We show that the light field of a Lambertian scene has a 4D hyperfan-shaped frequency-domain region of support at the intersection of a dual-fan and a hypercone. By designing and implementing a filter with an appropriately shaped passband we accomplish denoising with a single all-in-focus linear filter. Drawing examples from the Stanford Light Field Archive and images captured using a commercially available lenslet-based plenoptic camera, we demonstrate that the hyperfan outperforms competing methods including synthetic focus, fan-shaped antialiasing filters, and a range of modern nonlinear image and video denoising techniques. We show the hyperfan preserves depth of field, making it a single-step all-in-focus denoising filter suitable for general-purpose light field rendering. We include results for different noise types and levels, over a variety of metrics, and in real-world scenarios. Finally, we show that the hyperfan's performance scales with aperture count.
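    The hyperfan passband can be pictured one epipolar slice at a time: in a 2-D spatial/angular frequency plane, a Lambertian scene's energy lies in a fan of slopes bounded by the scene's depth range. Below is a simplified 2-D fan-mask filter, not the authors' 4-D implementation; the scene, noise level, and slope bounds are assumed.

```python
import numpy as np

def fan_mask(shape, slope_min, slope_max):
    """Keep frequency components whose angular/spatial frequency ratio lies
    between two depth-dependent slopes (2-D analog of the 4-D hyperfan)."""
    fu = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    safe_fx = np.where(fx == 0.0, 1.0, fx)          # avoid division by zero
    ratio = np.where(fx != 0.0, fu / safe_fx, 0.0)  # fx == 0 column treated as DC-like
    return (ratio >= slope_min) & (ratio <= slope_max)

def fan_filter(epi, slope_min=-1.0, slope_max=1.0):
    spec = np.fft.fft2(epi)
    return np.real(np.fft.ifft2(spec * fan_mask(epi.shape, slope_min, slope_max)))

# Fronto-parallel Lambertian scene: identical rows in the epipolar image.
rng = np.random.default_rng(1)
epi = np.outer(np.ones(16), np.sin(np.linspace(0.0, 4.0 * np.pi, 64)))
noisy = epi + 0.5 * rng.normal(size=epi.shape)
denoised = fan_filter(noisy)
```

    The signal passes the mask untouched while part of the noise energy is rejected, which is the linear, single-pass, depth-of-field-preserving denoising the record describes.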

  19. Electron diffraction covering a wide angular range from Bragg diffraction to small-angle diffraction.

    PubMed

    Nakajima, Hiroshi; Kotani, Atsuhiro; Harada, Ken; Mori, Shigeo

    2018-04-09

    We construct an electron optical system to investigate Bragg diffraction (from crystal lattice planes, 10^-2 to 10^-3 rad) with the objective lens turned off, by adjusting the current in the intermediate lenses. A crossover was located on the selected-area aperture plane. Thus, dark-field imaging can be performed using a selected-area aperture to select Bragg diffraction spots. The camera length can be controlled in the range of 0.8-4 m without exciting the objective lens. Furthermore, we can observe the magnetic-field dependence of electron diffraction using the objective lens under weak excitation conditions. The diffraction mode for Bragg diffraction can easily be switched to a small-angle electron diffraction mode having a camera length of more than 100 m. We propose this experimental method to acquire electron diffraction patterns that cover an extensive angular range from 10^-2 to 10^-7 rad. This method is applied to analyze the magnetic microstructures in three distinct magnetic materials, i.e., a uniaxial magnetic structure of BaFe10.35Sc1.6Mg0.05O19, a martensite of a Ni-Mn-Ga alloy, and a helical magnetic structure of Ba0.5Sr1.5Zn2Fe12O22.

  20. MEGARA Optics: Sub-aperture Stitching Interferometry for Large Surfaces

    NASA Astrophysics Data System (ADS)

    Aguirre-Aguirre, Daniel; Carrasco, Esperanza; Izazaga-Pérez, Rafael; Páez, Gonzalo; Granados-Agustín, Fermín; Percino-Zacarías, Elizabeth; Gil de Paz, Armando; Gallego, Jesús; Iglesias-Páramo, Jorge; Villalobos-Mendoza, Brenda

    2018-04-01

    In this work, we present a detailed analysis of sub-aperture interferogram stitching software to test circular and elliptical clear apertures with diameters and long axes up to 272 and 180 mm, respectively, from the Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía (MEGARA). MEGARA is a new spectrograph for the Gran Telescopio Canarias (GTC). It offers a resolution between 6000 and 20000 via the use of volume phase holographic gratings. It has an integral field unit and a set of robots for multi-object spectroscopy at the telescope focal plane. The output end of the fibers forms the spectrograph pseudo-slit. The fixed geometry of the collimator and camera configuration requires prisms in addition to the flat windows of the volume phase holographic gratings. There are 73 optical elements of large aperture and high precision manufactured in Mexico at the Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE) and the Centro de Investigaciones en Óptica (CIO). The principle of stitching interferometry is to divide the surface being tested into overlapping small sections, which allows an easier analysis (Kim & Wyant 1981). This capability is ideal for non-contact tests for unique and large optics as required by astronomical instruments. We show that the results obtained with our sub-aperture stitching algorithm were consistent with other methods that analyze the entire aperture. We used this method to analyze the 24 MEGARA prisms that could not be tested otherwise. The instrument has been successfully commissioned at GTC in all the spectral configurations. The fulfillment of the irregularity specifications was one of the necessary conditions to comply with the spectral requirements.
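    The core stitching step, referencing one overlapping sub-aperture map to another by removing the piston and tilt difference fitted over their overlap, can be sketched in one dimension with invented data:

```python
import numpy as np

# Full profile we pretend to measure (unknown to the stitching step).
x = np.linspace(0.0, 1.0, 101)
surface = 0.5 * x**2

# Two overlapping sub-aperture measurements, each carrying its own
# unknown piston and tilt (values chosen for the demo).
a = surface[:60] + 0.30 + 0.10 * x[:60]
b = surface[40:] - 0.20 + 0.05 * x[40:]

# Fit the piston+tilt of (b - a) over the overlap region and subtract it
# from b, referencing the second sub-aperture to the first.
overlap = slice(40, 60)
coeffs = np.polyfit(x[overlap], b[:20] - a[overlap], 1)
b_aligned = b - np.polyval(coeffs, x[40:])

stitched = np.concatenate([a, b_aligned[20:]])
```

    The stitched map reproduces the full profile up to a single global piston and tilt, which is immaterial in interferometric surface testing; real implementations perform the same fit in 2-D over many sub-apertures.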

  1. Tibial tunnel aperture irregularity after drilling with 5 reamer designs: a qualitative micro-computed tomography analysis.

    PubMed

    Geeslin, Andrew G; Jansson, Kyle S; Wijdicks, Coen A; Chapman, Mark A; Fok, Alex S; LaPrade, Robert F

    2011-04-01

    There is limited information in the literature on comparisons of antegrade versus retrograde reaming techniques and the effect on the creation of anterior cruciate ligament (ACL) tibial tunnel entry and exit apertures. Proximal and distal apertures of ACL tibial tunnels, as created with different reamers, will be affected by the type of reamer design. Controlled laboratory study. Forty skeletally mature porcine tibias with bone mineral density values comparable with a young athletic population were included in this study. Five 9-mm reamer models were used (3 antegrade: A1, smooth-bore reamer; A2, acorn-head reamer; A3, flat-head reamer; 2 retrograde: R1, retrograde acorn reamer; R2, single-blade retrograde reamer), and a new reamer was used for each tibia (8 reamer-tibia pairs per reamer model). All specimens underwent micro-computed tomography scanning, and images were reconstructed and analyzed using 3-dimensional image analysis software. Aperture rim fractures were graded on a 0-IV scale that described the proportion of the fractured aperture circumference. Specimens with incomplete apertures were also recorded. Because of the unique characteristics of various tunnels, intratunnel characteristics were observed and recorded. In sum, 1 proximal and 7 distal aperture rim fractures were found; 3, 0, and 4 distal aperture rim fractures were found with groups A1, A2, and A3, respectively. Incomplete apertures were more commonly found at the distal aperture (n = 15) than the proximal aperture (n = 8); there were no tibias with this finding at both apertures. All incomplete distal apertures occurred with the retrograde technique, and all incomplete proximal apertures occurred with the antegrade technique, most commonly with reamer design A3. An added finding of tunnel curvature at the distal aspect of the tunnel was observed in all 8 tibias with R1 reamers and 5 tibias with R2 reamers. 
This phenomenon was not observed in any of the tibias reamed with the antegrade technique. Anterior cruciate ligament tibial tunnel aperture characteristics were highly dependent on reamer design. Optimal proximal aperture characteristics were produced by the retrograde reamers, whereas optimal distal aperture characteristics were obtained with the antegrade reamers. In addition, a phenomenon of tunnel curvature in retrograde-type reamers was found, which may have effects on ACL graft or screw fixation. Differences in tunnel aperture shapes and fractures depend on reamer design. This information is important for the creation of ACL reconstruction tunnels with different reamer designs.

  2. A control system of a mini survey facility for photometric monitoring

    NASA Astrophysics Data System (ADS)

    Tsutsui, Hironori; Yanagisawa, Kenshi; Izumiura, Hideyuki; Shimizu, Yasuhiro; Hanaue, Takumi; Ita, Yoshifusa; Ichikawa, Takashi; Komiyama, Takahiro

    2016-08-01

    We have built a control system for a mini survey facility dedicated to photometric monitoring of nearby bright (K<5) stars in the near-infrared region. The facility comprises a 4-m-diameter rotating dome and a small (30-mm aperture) wide-field (5 × 5 sq. deg. field of view) infrared (1.0-2.5 microns) camera on an equatorial fork mount, as well as power sources and other associated equipment. All the components other than the camera are controlled by microcomputer-based I/O boards that were developed in-house and are used in many of the open-use instruments in our observatory. We present the specifications and configuration of the facility hardware, as well as the structure of its control software.

  3. Polarimetric optical imaging of scattering surfaces.

    PubMed

    Barter, J D; Lee, P H

    1996-10-20

    A polarimetric optical specular event detector (OSED) has been developed to provide spatially and temporally resolved polarimetric data of backscattering in the visible from water wave surfaces. The OSED acquires simultaneous, two-dimensionally resolved images of the remote target in two orthogonal planes of polarization. With the use of plane-polarized illumination the OSED presently can measure, in an ensemble of breaking waves, the equivalent four-element polarization matrix common to polarimetric radars. Upgrade to full Stokes parameter state of polarization measurements is straightforward with the use of present single-aperture, multi-imager CCD camera technology. The OSED is used in conjunction with a coherent pulse-chirped radar (PCR), which also measures the four-element polarization matrix, to provide direct time-correlated identification of backscattering mechanisms operative during wave-breaking events which heretofore have not been described theoretically. We describe the instrument and its implementation, and examples of spatially resolved polarimetric data are displayed as correlated with the PCR backscatter cross section and polarization ratio records.

  4. Electronic still camera

    NASA Astrophysics Data System (ADS)

    Holland, S. Douglas

    1992-09-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  5. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  6. Developing a bright 17 keV x-ray source for probing high-energy-density states of matter at high spatial resolution

    NASA Astrophysics Data System (ADS)

    Huntington, C. M.; Park, H.-S.; Maddox, B. R.; Barrios, M. A.; Benedetti, R.; Braun, D. G.; Hohenberger, M.; Landen, O. L.; Regan, S. P.; Wehrenberg, C. E.; Remington, B. A.

    2015-04-01

    A set of experiments was performed on the National Ignition Facility (NIF) to develop and optimize a bright, 17 keV x-ray backlighter probe using laser-irradiated Nb foils. High-resolution one-dimensional imaging was achieved using a 15 μm wide slit in a Ta substrate to aperture the Nb Heα x-rays onto an open-aperture, time-integrated camera. To optimize the x-ray source for imaging applications, the effect of laser pulse shape and spatial profile on the target was investigated. Two laser pulse shapes were used: a "prepulse" shape that included a 3 ns, low-intensity laser foot preceding the high-energy 2 ns square main laser drive, and a pulse without the laser foot. The laser spatial profile was varied by the use of continuous phase plates (CPPs) on a pair of shots compared to beams at best focus, without CPPs. A comprehensive set of common diagnostics allowed for a direct comparison of imaging resolution, total x-ray conversion efficiency, and x-ray spectrum between shots. The use of CPPs was seen to reduce the high-energy tail of the x-ray spectrum, whereas the laser pulse shape had little effect on the high-energy tail. The measured imaging resolution was comparably high for all combinations of laser parameters, but a higher x-ray flux was achieved without phase plates. This increased flux was the result of smaller laser spot sizes, which allowed us to arrange the laser focal spots from multiple beams and produce an x-ray source which was more localized behind the slit aperture. Our experiments are a first demonstration of point-projection geometry imaging at NIF at the energies (>10 keV) necessary for imaging denser, higher-Z targets than have previously been investigated.

  7. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

    Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions to applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have enabled highly reliable, low-cost, hand-held instruments. Initially these instruments displayed only relative temperature differences, which limited their usefulness in applications such as thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR Thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperature ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high resolution video imaging creates some unique challenges when using uncooled detectors. A temperature-controlled, field-of-view limiting aperture (cold shield) is not typically included in the small-volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the Germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.

  8. The Effects of Radiation on Imagery Sensors in Space

    NASA Technical Reports Server (NTRS)

    Mathis, Dylan

    2007-01-01

    Recent experience using high definition video on the International Space Station reveals camera pixel degradation due to particle radiation to be a much more significant problem with high definition cameras than with standard definition video. Although it may at first appear that increased pixel density on the imager is the logical explanation for this, the ISS implementations of high definition suggest a more complex mix of causal and mediating factors. The degree of damage seems to vary from one type of camera to another, and this variation prompts a reconsideration of the possible factors in pixel loss, such as imager size, number of pixels, pixel aperture ratio, imager type (CCD or CMOS), method of error correction/concealment, and the method of compression used for recording or transmission. The problem of imager pixel loss due to particle radiation is not limited to out-of-atmosphere applications. Since particle radiation increases with altitude, it is not surprising to find anecdotal evidence that video cameras subjected to many hours of airline travel show an increased incidence of pixel loss. This is already evident in some standard definition video applications, and pixel loss due to particle radiation only stands to become a more salient issue given the continued diffusion of high definition video cameras in the marketplace.

  9. An efficient multiple exposure image fusion in JPEG domain

    NASA Astrophysics Data System (ADS)

    Hebbalaguppe, Ramya; Kakarala, Ramakrishna

    2012-01-01

    In this paper, we describe a method to fuse multiple images taken with varying exposure times in the JPEG domain. The proposed algorithm finds its application in HDR image acquisition and image stabilization for hand-held devices like mobile phones, music players with cameras, digital cameras, etc. Image acquisition in low light typically results in blurry and noisy images for hand-held cameras. Altering camera settings like ISO sensitivity, exposure time, and aperture for low-light image capture results in noise amplification, motion blur, and reduction of depth-of-field, respectively. The purpose of fusing multiple exposures is to combine the sharp details of the shorter-exposure images with the high signal-to-noise ratio (SNR) of the longer-exposure images. The algorithm requires only a single pass over all images, making it efficient. It comprises sigmoidal boosting of the shorter-exposure images, image fusion, artifact removal, and saturation detection. The algorithm needs no more memory than a single JPEG macroblock, making it feasible to implement as part of a digital camera's hardware image-processing engine. The artifact-removal step reuses JPEG's built-in frequency analysis and hence benefits from the considerable optimization and design experience that is available for JPEG.
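
    As an illustration of the sigmoidal-boosting and fusion steps described above, a minimal sketch in Python (the gain and midpoint values, the function names, and the per-pixel weights are illustrative assumptions, not the paper's actual parameters; the paper operates on JPEG blocks rather than raw pixels):

```python
import numpy as np

def sigmoid_boost(img, gain=8.0, midpoint=0.5):
    # Boost mid-tones of a normalized short-exposure image with a sigmoid curve
    # (illustrative stand-in for the paper's sigmoidal boosting step;
    # gain/midpoint are assumed values).
    return 1.0 / (1.0 + np.exp(-gain * (img - midpoint)))

def fuse_exposures(images, weights):
    # Weighted per-pixel fusion of normalized exposures in a single pass.
    stack = np.stack(images)
    w = np.stack(weights)
    return (stack * w).sum(axis=0) / w.sum(axis=0)
```

    In the JPEG-domain setting, the same boosting and weighting would be applied block by block, so only one macroblock needs to be resident in memory at a time.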

  10. Study on characteristics of the aperture-averaging factor of atmospheric scintillation in terrestrial optical wireless communication

    NASA Astrophysics Data System (ADS)

    Shen, Hong; Liu, Wen-xing; Zhou, Xue-yun; Zhou, Li-ling; Yu, Long-Kun

    2018-02-01

    In order to thoroughly understand the characteristics of the aperture-averaging effect of atmospheric scintillation in terrestrial optical wireless communication, and to provide references for the engineering design and performance evaluation of optical systems employed in the atmosphere, we have theoretically derived a general analytic expression for the aperture-averaging factor of atmospheric scintillation and numerically investigated the characteristics of the aperture-averaging factor under different propagation conditions. The limitations of the currently used approximate formula for the aperture-averaging factor are discussed, and the results show that it is not applicable for small receiving apertures under a non-uniform turbulence link. Numerical calculation shows that the aperture-averaging factor of atmospheric scintillation follows an exponential decline model for small receiving apertures under a non-uniform turbulent link, and a general expression for the model is given. This model has certain guiding significance for evaluating the aperture-averaging effect in terrestrial optical wireless communication.
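
    For reference, the commonly used plane-wave approximation that the abstract critiques (often attributed to Andrews) can be sketched as follows; treat the exact constant and exponent as assumptions to be checked against the literature:

```python
import math

def aperture_averaging_factor(D, L, wavelength):
    # Plane-wave weak-turbulence approximation:
    #   A = [1 + 1.062 * k * D^2 / (4 * L)]^(-7/6),  k = 2*pi/wavelength,
    # with D the receiver aperture diameter (m) and L the path length (m).
    k = 2.0 * math.pi / wavelength
    return (1.0 + 1.062 * k * D ** 2 / (4.0 * L)) ** (-7.0 / 6.0)
```

    As the abstract points out, this form breaks down for small apertures on non-uniform turbulence links, where an exponential decline model fits better.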

  11. Medical Applications of IR Focal Plane Arrays

    DTIC Science & Technology

    1998-03-15

    University of Memphis, USA; E. Wolf, H. Bada, C. Leffler - University of Tennessee at Memphis, USA; M. Daley - University of Memphis, USA. A two channel ...optical aperture versus thermal sensitivity at two different resolution settings for an optimized medical IR camera. LIST OF TABLES: TABLE 1 Advantages... Technology Transferred: Through this work, infrared imaging in medicine was exposed to ever-growing audiences. For the first time, the work of the last two

  12. Development of a CCD array as an imaging detector for advanced X-ray astrophysics facilities

    NASA Technical Reports Server (NTRS)

    Schwartz, D. A.

    1981-01-01

    The development of a charge coupled device (CCD) X-ray imager for a large aperture, high angular resolution X-ray telescope is discussed. Existing CCDs were surveyed and three candidate concepts were identified. An electronic camera control and computer interface, including software to drive a Fairchild 211 CCD, is described. In addition, a vacuum mounting and cooling system is discussed. Performance data for the various components are given.

  13. Sky light polarization detection with linear polarizer triplet in light field camera inspired by insect vision.

    PubMed

    Zhang, Wenjing; Cao, Yu; Zhang, Xuanzhe; Liu, Zejin

    2015-10-20

    Stable information from the sky light polarization pattern can be used for navigation, with advantages such as good anti-interference performance and no cumulative error effect. However, existing methods of sky light polarization measurement either have poor real-time performance or require a complex system. Inspired by the navigational capability of the Cataglyphis ant with its compound eyes, we introduce a new approach to acquire the all-sky image under different polarization directions with one camera and without a rotating polarizer, so as to detect the polarization pattern across the full sky in a single snapshot. Our system is based on a handheld light field camera with a wide-angle lens and a triplet linear polarizer placed over its aperture stop. Experimental results agree with the theoretical predictions. Both real-time detection and a simple, low-cost architecture demonstrate the advantages of the proposed approach.
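
    As a sketch of how a fixed polarizer triplet can replace a rotating polarizer, the per-pixel linear Stokes parameters can be recovered from three intensities measured behind linear polarizers at 0°, 60°, and 120° (these particular orientations, and the function name, are assumptions for illustration; the paper's triplet geometry may differ):

```python
import math

def stokes_from_triplet(i0, i60, i120):
    # Malus-law model: I(theta) = (S0 + S1*cos(2*theta) + S2*sin(2*theta)) / 2.
    # Solving for theta = 0, 60, 120 degrees gives the linear Stokes parameters.
    s0 = 2.0 * (i0 + i60 + i120) / 3.0
    s1 = 2.0 * i0 - s0
    s2 = 2.0 * (i60 - i120) / math.sqrt(3.0)
    dolp = math.hypot(s1, s2) / s0        # degree of linear polarization
    aop = 0.5 * math.atan2(s2, s1)        # angle of polarization (radians)
    return s0, s1, s2, dolp, aop
```

    Applying this per pixel to the three sub-aperture images of the light field camera yields the full-sky polarization pattern from a single snapshot.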

  14. Background and imaging simulations for the hard X-ray camera of the MIRAX mission

    NASA Astrophysics Data System (ADS)

    Castro, M.; Braga, J.; Penacchioni, A.; D'Amico, F.; Sacahui, R.

    2016-07-01

    We report the results of detailed Monte Carlo simulations of the performance expected both at balloon altitudes and at the probable satellite orbit of a hard X-ray coded-aperture camera being developed for the Monitor e Imageador de RAios X (MIRAX) mission. Based on a thorough mass model of the instrument and detailed specifications of the spectra and angular dependence of the various relevant radiation fields in both the stratospheric and orbital environments, we have used the well-known package GEANT4 to simulate the instrumental background of the camera. We also show simulated images of source fields to be observed and calculate the detailed sensitivity of the instrument in both situations. The results reported here are especially useful to researchers in this field, since we provide information, not easily found in the literature, on how to prepare input files and calculate crucial instrumental parameters when performing GEANT4 simulations for high-energy astrophysics space experiments.

  15. Fuel injector nozzle for an internal combustion engine

    DOEpatents

    Cavanagh, Mark S. (Bloomington, IL); Urven, Roger L., Jr.; Lawrence, Keith E. (Peoria, IL)

    2011-03-22

    A direct injection fuel injector includes a nozzle tip having a plurality of passages allowing fluid communication between an inner nozzle tip surface portion and an outer nozzle tip surface portion and directly into a combustion chamber of an internal combustion engine. A first group of the passages have inner surface apertures located substantially in a first common plane. A second group of the passages have inner surface apertures located substantially in at least a second common plane substantially parallel to the first common plane. The second group has more passages than the first group.

  16. Fuel Injector Nozzle For An Internal Combustion Engine

    DOEpatents

    Cavanagh, Mark S.; Urven, Roger L., Jr.; Lawrence, Keith E.

    2006-04-25

    A direct injection fuel injector includes a nozzle tip having a plurality of passages allowing fluid communication between an inner nozzle tip surface portion and an outer nozzle tip surface portion and directly into a combustion chamber of an internal combustion engine. A first group of the passages have inner surface apertures located substantially in a first common plane. A second group of the passages have inner surface apertures located substantially in at least a second common plane substantially parallel to the first common plane. The second group has more passages than the first group.

  17. Fuel injector nozzle for an internal combustion engine

    DOEpatents

    Cavanagh, Mark S.; Urven, Jr., Roger L.; Lawrence, Keith E.

    2007-11-06

    A direct injection fuel injector includes a nozzle tip having a plurality of passages allowing fluid communication between an inner nozzle tip surface portion and an outer nozzle tip surface portion and directly into a combustion chamber of an internal combustion engine. A first group of the passages have inner surface apertures located substantially in a first common plane. A second group of the passages have inner surface apertures located substantially in at least a second common plane substantially parallel to the first common plane. The second group has more passages than the first group.

  18. Fuel injector nozzle for an internal combustion engine

    DOEpatents

    Cavanagh, Mark S.; Urven, Jr., Roger L.; Lawrence, Keith E.

    2008-11-04

    A direct injection fuel injector includes a nozzle tip having a plurality of passages allowing fluid communication between an inner nozzle tip surface portion and an outer nozzle tip surface portion and directly into a combustion chamber of an internal combustion engine. A first group of the passages have inner surface apertures located substantially in a first common plane. A second group of the passages have inner surface apertures located substantially in at least a second common plane substantially parallel to the first common plane. The second group has more passages than the first group.

  19. Imaging characteristics of photogrammetric camera systems

    USGS Publications Warehouse

    Welch, R.; Halliday, J.

    1973-01-01

    In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were assessed, along with procedures for analyzing image quality and for predicting and comparing performance capabilities. © 1973.
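
    The response-function analysis described above is commonly carried out by cascading component transfer functions: the system MTF at each spatial frequency is the product of the lens, film, and image-motion MTFs. A minimal sketch (the Gaussian component model and the f50 values are illustrative assumptions, not U.S.G.S. data):

```python
import math

def gaussian_mtf(f, f50):
    # Illustrative Gaussian component MTF with 50% response at f50 (cycles/mm).
    return math.exp(-math.log(2.0) * (f / f50) ** 2)

def system_mtf(f, component_f50s):
    # Cascade rule: the system MTF is the product of the component MTFs.
    mtf = 1.0
    for f50 in component_f50s:
        mtf *= gaussian_mtf(f, f50)
    return mtf
```

    The cascade shows why the system always resolves less than its weakest component: multiplying responses below 1.0 can only lower the combined curve.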

  20. Non-iterative volumetric particle reconstruction near moving bodies

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra

    2017-11-01

    When multi-camera 3D PIV experiments are performed around a moving body, the body often obscures visibility of regions of interest in the flow field in a subset of cameras. We evaluate the performance of non-iterative particle reconstruction algorithms used for synthetic aperture PIV (SAPIV) in these partially-occluded regions. We show that when partial occlusions are present, the quality and availability of 3D tracer particle information depends on the number of cameras and reconstruction procedure used. Based on these findings, we introduce an improved non-iterative reconstruction routine for SAPIV around bodies. The reconstruction procedure combines binary masks, already required for reconstruction of the body's 3D visual hull, and a minimum line-of-sight algorithm. This approach accounts for partial occlusions without performing separate processing for each possible subset of cameras. We combine this reconstruction procedure with three-dimensional imaging on both sides of the free surface to reveal multi-fin wake interactions generated by a jumping archer fish. Sufficient particle reconstruction in near-body regions is crucial to resolving the wake structures of upstream fins (i.e., dorsal and anal fins) before and during interactions with the caudal tail.
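
    A minimal sketch of the masked minimum-line-of-sight idea described above, assuming each camera's image has already been back-projected onto the voxel grid and that per-camera occlusion masks come from the body's visual hull (array shapes and function names are illustrative assumptions):

```python
import numpy as np

def min_los_reconstruct(projections, occlusion_masks):
    # Minimum line-of-sight: a voxel's intensity is the minimum of the
    # back-projected camera intensities, ignoring cameras whose view of the
    # voxel is blocked by the body (mask == True).
    stack = np.stack(projections).astype(float)
    occluded = np.stack(occlusion_masks)
    stack[occluded] = np.inf          # occluded views cannot veto a particle
    vol = stack.min(axis=0)
    vol[np.isinf(vol)] = 0.0          # voxels seen by no camera
    return vol
```

    Because the masks are applied per voxel, one pass handles every possible subset of occluded cameras, which is the point of the non-iterative routine.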

  1. The space telescope scientific instruments

    NASA Technical Reports Server (NTRS)

    Leckrone, D. S.

    1980-01-01

    The paper describes the Space Telescope, with a 2.4 m aperture, to be launched into a 500 km altitude orbit in late 1983. Four axial-bay instruments and one radial-bay scientific instrument, including a wide-field and planetary camera, a faint-object camera, a faint-object spectrograph, and a high-speed photometer, are to be installed to conduct the initial observations. The axial instruments are constrained to envelopes with dimensions 0.9 x 0.9 x 2.2 m and their masses cannot exceed 317 kg. The observatory will also be equipped with fine-guidance sensors and a microprocessor. The design concepts of the instruments are outlined, and some of the astronomical capabilities, including studies of distant and local galaxies, the physical properties of quasars, and interrelations between quasars and active galactic nuclei, are mentioned.

  2. 03pd0676

    NASA Image and Video Library

    2003-03-07

    File name :DSC_0749.JPG File size :1.1MB(1174690Bytes) Date taken :2003/03/07 13:51:29 Image size :2000 x 1312 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D1H Quality mode :FINE Metering mode :Matrix Exposure mode :Shutter priority Speed light :No Focal length :20 mm Shutter speed :1/500second Aperture :F11.0 Exposure compensation :0 EV White Balance :Auto Lens :20 mm F 2.8 Flash sync mode :N/A Exposure difference :0.0 EV Flexible program :No Sensitivity :ISO200 Sharpening :Normal Image Type :Color Color Mode :Mode II(Adobe RGB) Hue adjustment :3 Saturation Control :N/A Tone compensation :Normal Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A

  3. 03pd0517

    NASA Image and Video Library

    2002-02-19

    File name :DSC_0028.JPG File size :2.8MB(2950833Bytes) Date taken :2002/02/19 09:49:01 Image size :3008 x 2000 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D100 Quality mode :N/A Metering mode :Matrix Exposure mode :Shutter priority Speed light :Yes Focal length :24 mm Shutter speed :1/60second Aperture :F3.5 Exposure compensation :0 EV White Balance :N/A Lens :N/A Flash sync mode :N/A Exposure difference :N/A Flexible program :N/A Sensitivity :N/A Sharpening :N/A Image Type :Color Color Mode :N/A Hue adjustment :N/A Saturation Control :N/A Tone compensation :N/A Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A

  4. 03pd0535

    NASA Image and Video Library

    2002-02-24

    File name :DSC_0047.JPG File size :2.8MB(2931574Bytes) Date taken :2002/02/24 10:06:57 Image size :3008 x 2000 Resolution :300 x 300 dpi Number of bits :8bit/channel Protection attribute :Off Hide Attribute :Off Camera ID :N/A Camera :NIKON D100 Quality mode :N/A Metering mode :Matrix Exposure mode :Shutter priority Speed light :Yes Focal length :24 mm Shutter speed :1/180second Aperture :F20.0 Exposure compensation :+0.3 EV White Balance :N/A Lens :N/A Flash sync mode :N/A Exposure difference :N/A Flexible program :N/A Sensitivity :N/A Sharpening :N/A Image Type :Color Color Mode :N/A Hue adjustment :N/A Saturation Control :N/A Tone compensation :N/A Latitude(GPS) :N/A Longitude(GPS) :N/A Altitude(GPS) :N/A

  5. Theoretical performance model for single image depth from defocus.

    PubMed

    Trouvé-Peloux, Pauline; Champagnat, Frédéric; Le Besnerais, Guy; Idier, Jérôme

    2014-12-01

    In this paper we present a performance model for depth estimation using single image depth from defocus (SIDFD). Our model is based on an original expression of the Cramér-Rao bound (CRB) in this context. We show that this model is consistent with the expected behavior of SIDFD. We then study the influence on the performance of the optical parameters of a conventional camera such as the focal length, the aperture, and the position of the in-focus plane (IFP). We derive an approximate analytical expression of the CRB away from the IFP, and we propose an interpretation of the SIDFD performance in this domain. Finally, we illustrate the predictive capacity of our performance model on experimental data comparing several settings of a consumer camera.

  6. Investigation of the Production of High Density Uniform Plasmas.

    DTIC Science & Technology

    1980-10-01

    first time with the framing camera. These are a considerable improvement upon the black and white films taken in earlier experiments. The different... [figure labels: electron beam, oil, reflecting prisms, Pyrex cell, sustainer cathode, lens, mirror, miniature arc lamp, aperture, framing camera] ...was run to test the opposite limit. This cathode also arced earlier than the more conventional materials. The first run left several holes in the kap

  7. Sutureless Corneal Transplantation Apparatus And Method

    DOEpatents

    Rowsey, J. James

    1996-12-17

    An eye with a cornea repaired without sutures comprising a donor material in the shape of a partial sphere having a generally central extent, the central extent being of the size and shape of the central portion of a cornea of an eye, the central extent having a periphery of a fixed diameter with an exterior surface in a convex configuration and an interior surface in a concave configuration and with an essentially common thickness throughout, the central extent having a plurality of corneal flaps extending radially from the periphery of the central extent, the flaps having exterior surfaces as a continuation of the exterior surface of the central extent; and a recipient eye in the shape of a partial sphere having a circular aperture in the cornea at its central portion, the central aperture being of a size and shape essentially that of the periphery of the central extent of the donor material, the aperture being of a common thickness at the periphery of the aperture, the central portion having pockets and with the central extent of the donor material located within the aperture of the recipient eye and with the flaps of the central extent being imbricated into the pockets of the recipient eye.

  8. Transport of Particle Swarms Through Fractures

    NASA Astrophysics Data System (ADS)

    Boomsma, E.; Pyrak-Nolte, L. J.

    2011-12-01

    The transport of engineered micro- and nano-scale particles through fractured rock is often assumed to occur as dispersions or emulsions. Another potential transport mechanism is the release of particle swarms from natural or industrial processes where small liquid drops, containing thousands to millions of colloidal-size particles, are released over time from seepage or leaks. Swarms have higher velocities than any individual colloid because the interactions among the particles maintain the cohesiveness of the swarm as it falls under gravity. Thus particle swarms give rise to the possibility that engineered particles may be transported farther and faster in fractures than predicted by traditional dispersion models. In this study, the effect of fractures on colloidal swarm cohesiveness and evolution was studied as a swarm falls under gravity and interacts with fracture walls. Transparent acrylic was used to fabricate synthetic fracture samples with either (1) a uniform aperture or (2) a converging aperture followed by a uniform aperture (funnel-shaped). The samples consisted of two blocks that measured 100 x 100 x 50 mm. The separation between these blocks determined the aperture (0.5 mm to 50 mm). During experiments, a fracture was fully submerged in water and swarms were released into it. The swarms consisted of dilute suspensions of either 25 micron soda-lime glass beads (2% by mass) or 3 micron polystyrene fluorescent beads (1% by mass) with an initial volume of 5μL. The swarms were illuminated with a green (525 nm) LED array and imaged optically with a CCD camera. In the uniform aperture fracture, the speed of the swarm prior to bifurcation increased with aperture up to a maximum at a fracture width of approximately 10 mm. For apertures greater than ~15 mm, the velocity was essentially constant with fracture width (but less than at 10 mm). This peak suggests that two competing mechanisms affect swarm velocity in fractures. 
The wall provides both drag, which slows the swarm, and a cohesive force that prevents swarm expansion and the corresponding decrease in particle density. For apertures >15 mm, though the drag force is small, the loss of swarm cohesion dominates. In small apertures (<5 mm), the drag from the wall dominates, causing a loss in speed even though there is strong confinement. From a force-based particle interaction approach, the initial simulation did not capture the observed experimental behavior, i.e., the distinct peak in swarm velocities was not observed. For the funnel-shaped aperture, the swarm was observed to bifurcate immediately upon reaching the intersection between the converging-aperture and uniform-aperture portions of the fracture. Furthermore, converging apertures resulted in the deceleration of a swarm. Thus, the rate of transport of particle swarms is strongly affected by fracture aperture. Acknowledgment: The authors wish to acknowledge support of this work by the Geosciences Research Program, Office of Basic Energy Sciences, US Department of Energy (DE-FG02-09ER16022).

  9. Visual imaging control systems of the Mariner to Jupiter and Saturn spacecraft

    NASA Technical Reports Server (NTRS)

    Larks, L.

    1979-01-01

    The design and fabrication of optical systems for the Mariner Jupiter Saturn (Voyager) mission are described. Because of the large distances of these planets from the sun, the spacecraft was designed without solar panels, with the electricity generated on board by radioisotope thermoelectric generators (RTGs). The presence of the RTGs and the Jovian radiation environment required that the optical systems be fabricated out of radiation-stabilized materials. A narrow-angle and a wide-angle camera are located on the spacecraft scan platform, with the narrow-angle lens a modification of the Mariner 10 lens. The optical system is described, noting that the lens was modified by moving the aperture correctors forward and placing a spider-mounted secondary mirror in the original back surface of the second aperture corrector. The wide-angle lens was made of cerium-doped, radiation-stabilized optical glass with the greatest blue transmittance, which would be resistant to RTG and Jupiter radiation.

  10. Large aperture and wide field of view space telescope for the detection of ultra high energy cosmic rays and neutrinos

    NASA Astrophysics Data System (ADS)

    Mazzinghi, Piero; Bratina, Vojko; Gambicorti, Lisa; Simonetti, Francesca; Zuccaro Marchi, Alessandro

    2017-11-01

    New technologies are proposed for large-aperture, wide field of view (FOV) space telescopes dedicated to detection of the Ultra High Energy Cosmic Ray and Neutrino flux through observation of fluorescence traces in the atmosphere and diffused Cerenkov signals. The presented advanced detection system is a spaceborne LEO telescope with better performance than ground-based observatories, detecting up to 10³-10⁴ events/year. Different design approaches are implemented, all with very large FOV and focal surface detectors with sufficient segmentation and time resolution to allow precise reconstruction of the arrival direction. In particular, two Schmidt cameras are suggested as an appropriate solution to match most of the optical and technical requirements: large FOV, low f/#, reduction of stray light, an optionally flat focal surface, and already-proven low-cost construction technologies. Finally, a preliminary proposal of a wide-FOV retrofocus catadioptric telescope is explained.

  11. In-situ measurement of concentrated solar flux and distribution at the aperture of a central solar receiver

    NASA Astrophysics Data System (ADS)

    Ferriere, Alain; Volut, Mikael; Perez, Antoine; Volut, Yann

    2016-05-01

    A flux mapping system has been designed, implemented, and tested at the top of the Themis solar tower in France. This system features a moving bar associated with a CCD video camera and a flux gauge mounted on the bar, used as a reference measurement for calibration purposes. Images and the flux signal are acquired separately. The paper describes the equipment and focuses on the data processing used to produce the distribution of flux density and concentration at the aperture of the solar receiver. Finally, the solar power entering the receiver is estimated by integration of the flux density. The processing is largely automated in the form of dedicated software with fast execution. Special attention is paid to the accuracy of the results, the robustness of the algorithm, and the speed of the processing.
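
    The final integration step reduces to summing the calibrated flux-density map over the aperture area; a minimal sketch, assuming a uniform pixel area (the function name and units are illustrative, not the authors' software):

```python
import numpy as np

def receiver_power(flux_map, pixel_area):
    # Total power (W) = sum over pixels of flux density (W/m^2) times
    # the area each pixel covers at the receiver aperture (m^2).
    return float(np.sum(flux_map) * pixel_area)
```

    For example, a uniform 1 kW/m^2 map over 100 pixels of 0.01 m^2 each integrates to 1 kW entering the receiver.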

  12. Optical design for CETUS: a wide-field 1.5m aperture UV payload being studied for a NASA probe class mission study

    NASA Astrophysics Data System (ADS)

    Woodruff, Robert A.; Hull, Tony; Heap, Sara R.; Danchi, William; Kendrick, Stephen E.; Purves, Lloyd

    2017-09-01

    We are developing a NASA Headquarters-selected Probe-class mission concept called the Cosmic Evolution Through UV Spectroscopy (CETUS) mission, which includes a 1.5-m aperture diameter large field-of-view (FOV) telescope optimized for UV imaging, multi-object spectroscopy, and point-source spectroscopy. The optical system includes a Three Mirror Anastigmatic (TMA) telescope that simultaneously feeds three separate scientific instruments: the near-UV (NUV) Multi-Object Spectrograph (MOS) with a next-generation Micro-Shutter Array (MSA); the two-channel camera covering the far-UV (FUV) and NUV spectrum; and the point-source spectrograph covering the FUV and NUV region with selectable R ~ 40,000 echelle modes and R ~ 2,000 first-order modes. The optical system includes fine guidance sensors, wavefront sensing, and spectral and flat-field in-flight calibration sources. This paper will describe the current optical design of CETUS.

  13. Optical design for CETUS: a wide-field 1.5m aperture UV payload being studied for a NASA probe class mission study

    NASA Astrophysics Data System (ADS)

    Woodruff, Robert (Goddard Space Flight Center; Kendrick Optical Consulting)

    2018-01-01

    We are developing a NASA Headquarters selected Probe-class mission concept called the Cosmic Evolution Through UV Spectroscopy (CETUS) mission, which includes a 1.5-m aperture diameter large field-of-view (FOV) telescope optimized for UV imaging, multi-object spectroscopy, and point-source spectroscopy. The optical system includes a Three Mirror Anastigmatic (TMA) telescope that simultaneously feeds three separate scientific instruments: the near-UV (NUV) Multi-Object Spectrograph (MOS) with a next-generation Micro-Shutter Array (MSA); the two-channel camera covering the far-UV (FUV) and NUV spectrum; and the point-source spectrograph covering the FUV and NUV region with selectable R~ 40,000 echelle modes and R~ 2,000 first order modes. The optical system includes fine guidance sensors, wavefront sensing, and spectral and flat-field in-flight calibration sources. This paper will describe the current optical design of CETUS.

  14. Simultaneous Water Vapor and Dry Air Optical Path Length Measurements and Compensation with the Large Binocular Telescope Interferometer

    NASA Technical Reports Server (NTRS)

    Defrere, D.; Hinz, P.; Downey, E.; Boehm, M.; Danchi, W. C.; Durney, O.; Ertel, S.; Hill, J. M.; Hoffmann, W. F.; Mennesson, B.; hide

    2016-01-01

    The Large Binocular Telescope Interferometer uses a near-infrared camera to measure the optical path length variations between the two AO-corrected apertures and provide high-angular resolution observations for all its science channels (1.5-13 microns). There is however a wavelength dependent component to the atmospheric turbulence, which can introduce optical path length errors when observing at a wavelength different from that of the fringe sensing camera. Water vapor in particular is highly dispersive and its effect must be taken into account for high-precision infrared interferometric observations, as described previously for VLTI/MIDI or the Keck Interferometer Nuller. In this paper, we describe the new sensing approach that has been developed at the LBT to measure and monitor the optical path length fluctuations due to dry air and water vapor separately. After reviewing the current performance of the system for dry air seeing compensation, we present simultaneous H-, K-, and N-band observations that illustrate the feasibility of our feed-forward approach to stabilize the path length fluctuations seen by the LBTI nuller.

  15. Integrated electrochromic aperture diaphragm

    NASA Astrophysics Data System (ADS)

    Deutschmann, T.; Oesterschulze, E.

    2014-05-01

    In recent years, the triumphal march of handheld electronics with integrated cameras has opened amazing fields for small, high-performance optical systems. For this purpose, miniaturized iris apertures are of practical importance because they are essential to control both the dynamic range of the imaging system and the depth of focus. We therefore invented a micro-optical iris based on an electrochromic (EC) material, which changes its absorption in response to an applied voltage. A coaxial arrangement of annular rings of the EC material is used to establish an iris aperture without the need for any moving mechanical parts. The advantages of this device arise not only from the space-saving design, with a device layer only 50 μm thick, but also from its low power consumption: its transmission state is stable in an open circuit (termed the memory effect), and only changes of the absorption require a voltage of up to 2 V. In contrast to mechanical iris apertures, the absorption may be controlled on an analog scale, offering the opportunity for apodization. These properties make our device an ideal candidate for battery-powered, space-constrained systems. We present optical measurements concerning control of the transmitted intensity and depth of focus, together with studies of switching times, light scattering, and stability. While the EC polymer used in this study still has limitations concerning color and contrast, the presented device features all functions of an iris aperture. In contrast to conventional devices, it offers some special features: owing to the variable chemistry of the EC material, its spectral response may be adjusted to certain applications, such as color filtering in different spectral regimes (UV, visible, infrared), and all segments may be switched individually to establish functions like spatial Fourier filtering or laterally tunable intensity filters.

  16. A numerical study on the correlation between fracture transmissivity, hydraulic aperture and transport aperture

    NASA Astrophysics Data System (ADS)

    Sawada, A.; Takebe, A.; Sakamoto, K.

    2006-12-01

    Quantitative evaluation of the groundwater velocity in the fractures is a key part of contaminants transport assessment especially in the radioactive waste disposal programs. In a hydrogeological model such as the discrete fracture network model, the transport aperture of water conducting fracture is one of the important parameters for evaluating groundwater velocity. Tracer tests that measure velocity (or transport aperture) are few compared with flow tests that measure transmissivity (or hydraulic aperture). Thus it is useful to estimate transport properties from flow properties. It is commonly assumed that flow and transport aperture are the same, and that aperture is related to the cube root of transmissivity by the parallel-plate analog. Actual field experiments, however, show transport and hydraulic apertures are not always the same, and that transport aperture relates to an empirical constant times the square root of transmissivity. Compared with these field results, the cubic law underestimates transport aperture and overestimates velocity. A possible source of this discrepancy is in-plane heterogeneity of aperture and transmissivity. To study this behavior, numerical simulations using MAFIC were conducted for a single fracture model with a heterogeneous aperture distribution. The simulations varied three parameters - the mean geometrical aperture, JRC (Joint Roughness Coefficient), and the contact area ratio (fracture contact area divided by total fracture area). For each model we determined the equivalent transmissivity and cubic-law aperture under steady flow conditions. Then we simulated mass transport using particle tracking through the same fracture. The transport aperture was estimated from the particle peak arrival time at the downstream boundary. The results show that the mean geometrical aperture is the most sensitive parameter among the three variable parameters in this study. 
It is also found that the contact area ratio affects transmissivity more than the JRC does, while the JRC strongly affects the velocity and transport aperture. Based on these results, a correlation between the transmissivity, the hydraulic aperture, and the transport aperture will be discussed.
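The parallel-plate relations discussed above are compact enough to state in code. A minimal sketch, assuming water at standard conditions; the constant c in the empirical square-root relation is a placeholder that field studies fit per site:

```python
import numpy as np

RHO_G = 1000.0 * 9.81   # water density [kg/m^3] times gravity [m/s^2]
MU = 1.0e-3             # dynamic viscosity of water [Pa s]

def hydraulic_aperture(T):
    """Cubic-law (parallel-plate) aperture [m] from transmissivity T [m^2/s]:
    T = rho*g*b^3 / (12*mu), inverted for b."""
    return (12.0 * MU * T / RHO_G) ** (1.0 / 3.0)

def transport_aperture_empirical(T, c=2.0):
    """Empirical field relation: transport aperture ~ c * sqrt(T).
    The constant c is a placeholder fit from tracer tests."""
    return c * np.sqrt(T)
```

For a given transmissivity, comparing the two functions shows the systematic gap between cubic-law and tracer-test apertures that the simulations above investigate.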

  17. Theoretical Limits of Lunar Vision Aided Navigation with Inertial Navigation System

    DTIC Science & Technology

    2015-03-26

    camera model. Light reflected or projected from objects in the scene of the outside world is taken in by the aperture (or opening) shaped as a double...model’s analog aspects with an analog-to-digital interface converting raw images of the outside world scene into digital information a computer can use to...Figure 2.7. Digital Image Coordinate System. Used with permission [30]. Angular Field of View. The angular field of view is the angle of the world scene

  18. General Astrophysics with the HabEx Workhorse Camera

    NASA Astrophysics Data System (ADS)

    Stern, Daniel; Clarke, John; Gaudi, B. Scott; Kiessling, Alina; Krause, Oliver; Martin, Stefan; Scowen, Paul; Somerville, Rachel; HabEx STDT

    2018-01-01

    The Habitable Exoplanet Imaging Mission (HabEx) concept has been designed to enable an extensive suite of science, broadly put under the rubric of General Astrophysics, in addition to its exoplanet direct imaging science. General astrophysics directly addresses multiple NASA programmatic branches, and HabEx will enable investigations ranging from cosmology, to galaxy evolution, to stellar population studies, to exoplanet transit spectroscopy, to Solar System studies. This poster briefly describes one of the two primary HabEx General Astrophysics instruments, the HabEx Workhorse Camera (HWC). HWC will be a dual-detector UV-to-near-IR imager and multi-object grism spectrometer with a microshutter array and a moderate (3' x 3') field-of-view. We detail some of the key science we expect HWC to undertake, emphasizing unique capabilities enabled by a large-aperture, highly stable space-borne platform at these wavelengths.

  19. Stellar photometry with the Wide Field/Planetary Camera of the Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Holtzman, Jon A.

    1990-07-01

    Simulations of Wide Field/Planetary Camera (WF/PC) images are analyzed in order to discover the most effective techniques for stellar photometry and to evaluate the accuracy and limitations of these techniques. The capabilities and operation of the WF/PC and the simulations employed in the study are described. The basic techniques of stellar photometry and methods to improve these techniques for the WF/PC are discussed. The correct parameters for star detection, aperture photometry, and point-spread function (PSF) fitting with the DAOPHOT software of Stetson (1987) are determined. Consideration is given to undersampling of the stellar images by the detector, variations in the PSF, and crowding of the stellar images. It is noted that, with some changes, DAOPHOT is able to generate photometry almost to the level of photon statistics.
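As a point of reference for the aperture-photometry step discussed above, the basic operation (sum counts inside a circular aperture, subtract a sky level estimated in a surrounding annulus) can be sketched as follows. This is a generic illustration, not DAOPHOT's implementation; the radii are arbitrary defaults:

```python
import numpy as np

def aperture_photometry(img, x0, y0, r_ap=3.0, r_in=6.0, r_out=9.0):
    """Sum counts in a circular aperture of radius r_ap around (x0, y0),
    subtracting the median sky level estimated in an annulus [r_in, r_out]."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - x0, yy - y0)
    ap = r <= r_ap
    sky = img[(r >= r_in) & (r <= r_out)]
    return img[ap].sum() - np.median(sky) * ap.sum()
```

On undersampled WF/PC-like images, the choice of aperture radius relative to the PSF width dominates the error budget, which is why the parameter study above matters.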

  20. Sub-aperture stitching test of a cylindrical mirror with large aperture

    NASA Astrophysics Data System (ADS)

    Xue, Shuai; Chen, Shanyong; Shi, Feng; Lu, Jinfeng

    2016-09-01

    Cylindrical mirrors are key optics in high-end defense and scientific research equipment such as high-energy laser weapons and synchrotron radiation systems. However, the technology for testing their surface error has developed slowly; as a result, their optical fabrication quality cannot meet requirements, hindering the development of the associated equipment. A Computer-Generated Hologram (CGH) is commonly utilized as a null for testing cylindrical optics. However, since the fabrication process for large-aperture CGHs is not yet mature, the null test of large-aperture cylindrical optics is limited by the aperture of the CGH. Hence, a CGH null test combined with sub-aperture stitching is proposed to break the CGH aperture limit for testing cylindrical optics, and the design of CGHs for testing cylindrical surfaces is analyzed. Besides, because of their special shape, the misalignment aberrations of cylindrical surfaces differ from those of rotationally symmetric surfaces, and the existing stitching algorithms for rotationally symmetric surfaces cannot meet the requirements of stitching cylindrical surfaces. We therefore analyze the misalignment aberrations of cylindrical surfaces and study a stitching algorithm for measuring large-aperture cylindrical optics. Finally, we test a large-aperture cylindrical mirror to verify the validity of the proposed method.

  1. Reconfigurable metasurface aperture for security screening and microwave imaging

    NASA Astrophysics Data System (ADS)

    Sleasman, Timothy; Imani, Mohammadreza F.; Boyarsky, Michael; Pulido-Mancera, Laura; Reynolds, Matthew S.; Smith, David R.

    2017-05-01

    Microwave imaging systems have seen growing interest in recent decades for applications ranging from security screening to space/earth observation. However, hardware architectures commonly used for this purpose have not seen drastic changes. With the advent of metamaterials, a wealth of opportunities has emerged for honing metasurface apertures for microwave imaging systems. Recent thrusts have introduced dynamic reconfigurability directly into the aperture layer, providing powerful capabilities from a physical layer of considerable simplicity. The waveforms generated by such dynamic metasurfaces make them suitable for application in synthetic aperture radar (SAR) and, more generally, computational imaging. In this paper, we investigate a dynamic metasurface aperture capable of performing microwave imaging in the K-band (17.5-26.5 GHz). The proposed aperture is planar and promises an inexpensive fabrication process via printed circuit board techniques. These traits are further augmented by the tunability of dynamic metasurfaces, which provides the dexterity necessary to generate field patterns ranging from a sequence of steered beams to a series of uncorrelated radiation patterns. Imaging is experimentally demonstrated with a voltage-tunable metasurface aperture. We also demonstrate the aperture's utility in real-time measurements and perform volumetric SAR imaging. The capabilities of a prototype are detailed and the future prospects of general dynamic metasurface apertures are discussed.
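The computational-imaging mode mentioned above can be illustrated with a toy forward model: each tuning state of the aperture projects a quasi-random field pattern onto the scene, and the scene is recovered by inverting the measurement matrix. A minimal least-squares sketch with assumed random patterns (real reconstructions use measured patterns and regularized solvers):

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_meas = 16, 64            # scene pixels, aperture tuning states

# Each row models the field pattern of one tuning state (assumption:
# quasi-uncorrelated random patterns, as a dynamic metasurface produces).
H = rng.standard_normal((n_meas, n_pix))

scene = np.zeros(n_pix)
scene[5] = 1.0                    # a single point scatterer

g = H @ scene                     # simulated measurements
estimate = np.linalg.pinv(H) @ g  # least-squares inversion of the scene
```

With more measurements than pixels and well-conditioned patterns, the pseudoinverse recovers the scatterer; in practice the pattern diversity of the metasurface sets the conditioning.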

  2. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    USGS Publications Warehouse

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  3. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    PubMed

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  4. Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain

    NASA Astrophysics Data System (ADS)

    Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.

    2018-04-01

    High resolution satellites with longer focal lengths and larger apertures have been widely used for georeferencing observed scenes in recent years. A consistent end-to-end model of the high resolution remote sensing satellite geometric chain is presented, consisting of the scene, the three-line-array camera, the platform (including attitude and position information), the time system, and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for geolocation accuracy is put forward by introducing a new index: the angle between the camera and the star tracker. The model is validated by rigorously simulating the geolocation accuracy according to the test method for ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results, and that the integrated design improves the geolocation accuracy by about 7 m. The model, combined with the simulation method, is applicable to estimating geolocation accuracy before satellite launch.

  5. Transport of Particle Swarms Through Variable Aperture Fractures

    NASA Astrophysics Data System (ADS)

    Boomsma, E.; Pyrak-Nolte, L. J.

    2012-12-01

    Particle transport through fractured rock is a key concern with the increased use of micro- and nano-size particles in consumer products as well as from other activities in the sub- and near surface (e.g. mining, industrial waste, hydraulic fracturing, etc.). While particle transport is often studied as the transport of emulsions or dispersions, particles may also enter the subsurface from leaks or seepage that lead to particle swarms. Swarms are drop-like collections of millions of colloidal-sized particles that exhibit a number of unique characteristics when compared to dispersions and emulsions. Any contaminant or engineered particle that forms a swarm can be transported farther, faster, and more cohesively in fractures than would be expected from a traditional dispersion model. In this study, the effects of several variable aperture fractures on colloidal swarm cohesiveness and evolution were studied as a swarm fell under gravity and interacted with the fracture walls. Transparent acrylic was used to fabricate synthetic fracture samples with (1) a uniform aperture, (2) a converging region followed by a uniform region (funnel shaped), (3) a uniform region followed by a diverging region (inverted funnel), and (4) a cast of an induced fracture from a carbonate rock. All of the samples consisted of two blocks that measured 100 x 100 x 50 mm. The minimum separation between these blocks determined the nominal aperture (0.5 mm to 20 mm). During experiments a fracture was fully submerged in water and swarms were released into it. The swarms consisted of a dilute suspension of 3 micron polystyrene fluorescent beads (1% by mass) with an initial volume of 5 μL. The swarms were illuminated with a green (525 nm) LED array and imaged optically with a CCD camera. The variation in fracture aperture controlled swarm behavior. 
Diverging apertures caused a sudden loss of confinement that resulted in a rapid change in the swarm's shape as well as a sharp increase in its velocity. Converging apertures caused swarms to decelerate rapidly and become trapped at the transition point between the converging and parallel regions for apertures less than 2.5 mm. In uniform aperture fractures, an optimal aperture range (5 mm to 15 mm) exists where swarm velocity was higher and the swarm maintained cohesion over a longer distance. For apertures below this range the swarms were strongly slowed by drag from the walls, while for larger apertures the swarm velocity approached an asymptote due to the loss of the walls' influence. The transport of particle swarms in fractures is strongly controlled by the aperture distribution. While drag from the fracture does slow swarms, especially at small apertures, much of the interesting behavior (shape changes in diverging fractures, the optimal aperture in parallel fractures) is best explained by fracture-induced preferential confinement that controls the evolution of the swarm. When this confinement is suddenly changed, the swarm responds quickly and dramatically to its new environment. This has important implications for the understanding of contaminant dispersal in subsurface fracture networks, because the type of aperture variation can exert a strong influence on particle swarm transport. Acknowledgment: The authors wish to acknowledge support of this work by the Geosciences Research Program, Office of Basic Energy Sciences, US Department of Energy (DE-FG02-09ER16022).

  6. Fast Solar Polarimeter: Prototype Characterization and First Results

    NASA Astrophysics Data System (ADS)

    Iglesias, F. A.; Feller, A.; Krishnappa, N.; Solanki, S. K.

    2016-04-01

    Due to the differential and non-simultaneous nature of polarization measurements, seeing-induced crosstalk (SIC) and seeing-limited spatial resolution can easily counterbalance the benefits of solar imaging polarimetry from the ground. The development of instrumental techniques to treat these issues is necessary to fully exploit the next generation of large-aperture solar facilities and to keep ground-based data competitive with its space-based counterpart, particularly considering that many open questions in modern solar physics demand data with challenging specifications of resolution and polarimetric sensitivity that can only be achieved with large telescope apertures (Stenflo 1999). Even if state-of-the-art adaptive optics systems greatly improve image quality, their limited correction (due to finite bandwidth, mode number, and seeing anisoplanatism) produces large residual values of SIC (Krishnappa & Feller 2012). Dual-beam polarimeters are commonly used to reduce SIC between the intensity and polarization signals; however, they cannot compensate for the SIC introduced between circular and linear polarization, which can be relevant for high-precision polarimetry. It is known that fast modulation effectively reduces SIC, but the demodulation of the corresponding intensity signals imposes hard requirements on the frame rate of the associated cameras. One way to avoid a fast sensor is to decouple the camera readout from the intensity demodulation step. This concept is the cornerstone of the very successful Zurich Imaging Polarimeter (ZIMPOL). Even though the ZIMPOL solution allows the detection of very faint signals (~10^-5), its design is not suitable for high-spatial-resolution applications. We are developing a polarimeter that targets both high spatial resolution (<0.5 arcsec) and high polarimetric sensitivity (10^-4). 
The prototype of this Fast Solar Polarimeter (FSP, see Feller et al. 2014) employs a high-frame-rate (400 fps), low-noise (<4 e- RMS) pnCCD camera (Hartmann et al. 2006) that is read out in synchronization with a polarization modulator based on ferroelectric liquid crystals. The modulator package is similar to the SOLIS design (Keller et al. 2003) and optimized for an achromatic total polarimetric efficiency above 80% in the 400-700 nm wavelength range. The fast modulation frequency of FSP, yielding up to 100 full-Stokes measurements per second, and its high duty cycle (>95%) have the double benefit of reducing seeing-induced artifacts and improving the final spatial resolution by providing an optimal regime for the application of post-facto image reconstruction techniques. In this poster we describe the FSP prototype, including the characterization results, a technique to correct image smearing due to the sensor frame transfer (Iglesias et al. 2015), and some of the first measurements obtained with the 68-cm Vacuum Tower Telescope at the Observatorio del Teide, Spain.
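The demodulation step discussed above, recovering the Stokes vector from a cycle of modulated intensities, reduces per pixel to applying the pseudoinverse of the modulation matrix. A sketch with an idealized 4-state modulation matrix, which is an assumption for illustration and not the FSP's actual modulation scheme:

```python
import numpy as np

# Idealized 4-state modulation: each row maps the Stokes vector
# (I, Q, U, V) to one measured intensity. Illustrative only.
M = 0.5 * np.array([[1.0,  1.0,  0.0,  0.0],
                    [1.0, -1.0,  0.0,  0.0],
                    [1.0,  0.0,  1.0,  0.0],
                    [1.0,  0.0,  0.0,  1.0]])

D = np.linalg.pinv(M)             # demodulation matrix

def demodulate(intensities):
    """Recover the Stokes vector from the 4 modulated intensities."""
    return D @ np.asarray(intensities)
```

The polarimetric efficiency quoted above is a property of M: better-conditioned modulation matrices amplify photon noise less during this inversion.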

  7. Assessment of skin wound healing with a multi-aperture camera

    NASA Astrophysics Data System (ADS)

    Nabili, Marjan; Libin, Alex; Kim, Loan; Groah, Susan; Ramella-Roman, Jessica C.

    2009-02-01

    A clinical trial was conducted at the National Rehabilitation Hospital on 15 individuals to assess whether Rheparan Skin, a bio-engineered component of the extracellular matrix of the skin, is effective at promoting healing of a variety of wounds. Along with standard clinical outcome measures, a spectroscopic camera was used to assess the efficacy of Rheparan Skin. Gauzes soaked with Rheparan Skin were placed on volunteers' wounds for 5 minutes twice weekly for four weeks. Images of the wounds were taken using a multispectral camera and a digital camera at baseline and weekly thereafter. Spectral images collected at different wavelengths were combined with optical skin models to quantify parameters of interest such as oxygen saturation (SO2), water content, and melanin concentration. A digital wound measurement system (VERG) was also used to measure the size of the wound. Nine of the 15 measured subjects showed a definitive improvement post treatment in the form of a decrease in wound area. Seven of these nine individuals also showed an increase in oxygen saturation in the ulcerated area during the trial. A similar trend was seen in other metrics. Spectral imaging of skin wounds can be a valuable tool to establish wound-healing trends and to clarify healing mechanisms.
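The oxygen-saturation estimate mentioned above typically rests on a Beer-Lambert inversion: with absorbance measured at two wavelengths, the relative hemoglobin concentrations follow from a 2x2 linear solve. A sketch with extinction coefficients rounded from standard compilations; the exact values and wavelengths are assumptions, not the study's calibration:

```python
import numpy as np

# Molar extinction coefficients [cm^-1 / M] for Hb and HbO2 at two
# wavelengths (values rounded from standard compilations; illustrative).
#                 Hb      HbO2
EPS = np.array([[3227.0,  319.6],   # 660 nm
                [ 694.0, 1214.0]])  # 940 nm

def oxygen_saturation(absorbance):
    """Two-wavelength Beer-Lambert inversion: solve for relative Hb and
    HbO2 concentrations, then SO2 = HbO2 / (Hb + HbO2)."""
    c_hb, c_hbo2 = np.linalg.solve(EPS, np.asarray(absorbance))
    return c_hbo2 / (c_hb + c_hbo2)
```

A multispectral camera with more than two bands extends this to an overdetermined fit that can also separate water and melanin contributions, as in the study.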

  8. Hyperspectral imager for components identification in the atmosphere

    NASA Astrophysics Data System (ADS)

    Dewandel, Jean-Luc; Beghuin, Didier; Dubois, Xavier; Antoine, Philippe

    2017-11-01

    Several applications require the identification of chemical elements during the re-entry of material into the atmosphere. The materials can be of human origin or meteorites. The Automated Transfer Vehicle (ATV) re-entry has been filmed with conventional cameras under manual airborne operation. In order to permit the identification of the separate elements from their glow, spectral analysis needs to be added to the video data. In a LET-SME contract with ESA, Lambda-X has built a Fourier Transform Imaging Spectrometer to permit, in future work, bringing the technology to the readiness level required for the application. In this paper, the principles of Fourier Transform Imaging spectroscopy are recalled, the different interferometers suitable for supporting the technique are reviewed, and the selection process is explained. The selected interferometer is a birefringent-prism-based common-path shear interferometer. The design of the breadboard and its performance are presented in terms of spatial resolution, aperture, and spectral resolution. A discussion is opened regarding perspectives of the technique for other remote sensing applications compared to the more usual push-broom configurations.
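The core of Fourier transform imaging spectroscopy, recovering a spectrum from an interferogram recorded as a function of optical path difference, can be sketched per pixel as follows. A monochromatic toy source is assumed; real instruments also apply windowing and phase correction:

```python
import numpy as np

n = 1024
d = 0.1e-6                              # OPD sampling step [m]
opd = np.arange(n) * d                  # optical path difference axis
sigma = 1.0 / 550e-9                    # wavenumber of a 550 nm line [1/m]

# Interferogram of a monochromatic source: 1 + cos(2*pi*sigma*x)
interferogram = 1.0 + np.cos(2 * np.pi * sigma * opd)

# Spectrum = Fourier transform of the mean-subtracted interferogram.
spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
wavenumbers = np.fft.rfftfreq(n, d=d)
peak = wavenumbers[np.argmax(spectrum)]   # recovered line position
```

The spectral resolution is set by the maximum OPD (1/(n*d) here), which is the design trade the breadboard characterization addresses.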

  9. Optomechanical System Development of the AWARE Gigapixel Scale Camera

    NASA Astrophysics Data System (ADS)

    Son, Hui S.

    Electronic focal plane arrays (FPA) such as CMOS and CCD sensors have dramatically improved to the point that digital cameras have essentially phased out film (except in very niche applications such as hobby photography and cinema). However, the traditional method of mating a single lens assembly to a single detector plane, as required for film cameras, is still the dominant design used in cameras today. The use of electronic sensors and their ability to capture digital signals that can be processed and manipulated post acquisition offers much more freedom of design at system levels and opens up many interesting possibilities for the next generation of computational imaging systems. The AWARE gigapixel scale camera is one such computational imaging system. By utilizing a multiscale optical design, in which a large aperture objective lens is mated with an array of smaller, well corrected relay lenses, we are able to build an optically simple system that is capable of capturing gigapixel scale images via post acquisition stitching of the individual pictures from the array. Properly shaping the array of digital cameras allows us to form an effectively continuous focal surface using off the shelf (OTS) flat sensor technology. This dissertation details developments and physical implementations of the AWARE system architecture. It illustrates the optomechanical design principles and system integration strategies we have developed through the course of the project by summarizing the results of the two design phases for AWARE: AWARE-2 and AWARE-10. These systems represent significant advancements in the pursuit of scalable, commercially viable snapshot gigapixel imaging systems and should serve as a foundation for future development of such systems.

  10. Minimizing camera-eye optical aberrations during the 3D reconstruction of retinal structures

    NASA Astrophysics Data System (ADS)

    Aldana-Iuit, Javier; Martinez-Perez, M. Elena; Espinosa-Romero, Arturo; Diaz-Uribe, Rufino

    2010-05-01

    3D reconstruction of blood vessels is a powerful visualization tool for physicians, since it allows them to refer to a qualitative representation of their subject of study. In this paper we propose a 3D reconstruction method for retinal vessels from fundus images. The reconstruction method proposed herein uses images of the same retinal structure in epipolar geometry. Images are preprocessed by the RISA system to segment blood vessels and obtain feature points for correspondences. The point correspondence problem is solved using correlation. LMedS analysis and the Graph Transformation Matching algorithm are used for outlier suppression. Camera projection matrices are computed with the normalized eight-point algorithm. Finally, we retrieve the 3D position of the retinal tree points by linear triangulation. In order to increase the power of visualization, 3D tree skeletons are represented by surfaces via generalized cylinders whose radii correspond to morphological measurements obtained by RISA. In this paper the complete calibration process, including the fundus camera and the optical properties of the eye (the so-called camera-eye system), is proposed. On one hand, the internal parameters of the fundus camera are obtained by classical algorithms using a reference pattern. On the other hand, we minimize the undesirable effects of the aberrations induced by the eyeball optical system, assuming that a contact enlarging lens corrects astigmatism, that spherical and coma aberrations are reduced by changing the aperture size, and that eye refractive errors are suppressed by adjusting camera focus during image acquisition. Evaluation of two self-calibration proposals and results of 3D blood vessel surface reconstruction are presented.
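The linear triangulation step named above can be sketched with the standard direct linear transform (DLT): each image point contributes two homogeneous equations, and the 3D point is the null vector of the stacked system. A minimal version, assuming 3x4 projection matrices and Euclidean image coordinates:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: intersect the rays through image
    points x1, x2 given 3x4 camera projection matrices P1, P2."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # null vector = homogeneous 3D point
    return X[:3] / X[3]        # dehomogenize
```

In a real pipeline the projection matrices come from the calibrated camera-eye model, and noisy correspondences make the SVD a least-squares intersection rather than an exact one.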

  11. Coded-aperture Compton camera for gamma-ray imaging

    NASA Astrophysics Data System (ADS)

    Farber, Aaron M.

    This dissertation describes the development of a novel gamma-ray imaging system concept and presents results from Monte Carlo simulations of the new design. Current designs for large field-of-view gamma cameras suitable for homeland security applications implement either a coded aperture or a Compton scattering geometry to image a gamma-ray source. Both of these systems require large, expensive position-sensitive detectors in order to work effectively. By combining characteristics of both of these systems, a new design can be implemented that does not require such expensive detectors and that can be scaled down to a portable size. This new system has significant promise in homeland security, astronomy, botany and other fields, while future iterations may prove useful in medical imaging, other biological sciences and other areas, such as non-destructive testing. A proof-of-principle study of the new gamma-ray imaging system has been performed by Monte Carlo simulation. Various reconstruction methods have been explored and compared. General-Purpose Graphics-Processor-Unit (GPGPU) computation has also been incorporated. The resulting code is a primary design tool for exploring variables such as detector spacing, material selection and thickness and pixel geometry. The advancement of the system from a simple 1-dimensional simulation to a full 3-dimensional model is described. Methods of image reconstruction are discussed and results of simulations consisting of both a 4 x 4 and a 16 x 16 object space mesh have been presented. A discussion of the limitations and potential areas of further study is also presented.
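For the coded-aperture half of the hybrid design described above, the classic decoding step is a cross-correlation of the detector image with a balanced decoding array. A 1-D toy sketch with a random mask; this is a generic illustration (an optimized URA pattern would give flatter sidelobes), not the dissertation's reconstruction method:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 128
mask = rng.integers(0, 2, n).astype(float)  # open (1) / closed (0) elements
decoder = 2.0 * mask - 1.0                  # balanced decoding array

source = np.zeros(n)
source[20] = 1.0                            # point-source position

# Detector counts: periodic convolution of the source with the mask shadow.
detector = np.real(np.fft.ifft(np.fft.fft(source) * np.fft.fft(mask)))

# Decoded image: periodic cross-correlation of the detector with the
# decoding array; the peak lands at the source position.
image = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(decoder))))
```

In the hybrid system, the Compton-scatter information further constrains the source direction, relaxing the position resolution the detector alone must provide.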

  12. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    PubMed Central

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing. PMID:28981533

  13. VizieR Online Data Catalog: MOST photometry of Proxima (Kipping+, 2017)

    NASA Astrophysics Data System (ADS)

    Kipping, D. M.; Cameron, C.; Hartman, J. D.; Davenport, J. R. A.; Matthews, J. M.; Sasselov, D.; Rowe, J.; Siverd, R. J.; Chen, J.; Sandford, E.; Bakos, G. A.; Jordan, A.; Bayliss, D.; Henning, T.; Mancini, L.; Penev, K.; Csubry, Z.; Bhatti, W.; da Silva Bento, J.; Guenther, D. B.; Kuschnig, R.; Moffat, A. F. J.; Rucinski, S. M.; Weiss, W. W.

    2017-06-01

    The Microvariability and Oscillations of STars (MOST) telescope is a 53 kg satellite in low Earth orbit with a 15 cm aperture visible-band camera (350-700 nm). MOST observed Proxima Centauri in 2014 May (beginning on HJD(2000) 2456793.18) for about 12.5 days. MOST again observed Proxima Centauri in 2015 May (starting on HJD(2000) 2457148.54), this time for a total of 31 days. Independent of the MOST observations, Proxima Cen was also monitored by the HATSouth ground-based telescope network. The network consists of six wide-field photometric instruments located at three observatories in the Southern Hemisphere (Las Campanas Observatory [LCO] in Chile, the High Energy Stereoscopic System [HESS] site in Namibia, and Siding Spring Observatory [SSO] in Australia), with two instruments per site. Each instrument consists of four 18 cm diameter astrographs and associated 4K x 4K backside-illuminated CCD cameras and Sloan r-band filters, placed on a common robotic mount. The four astrographs and cameras together cover an 8.2° x 8.2° mosaic field of view at a pixel scale of 3.7''/pixel. Observations of a field containing Proxima Cen were collected as part of the general HATSouth transit survey, with a total of 11071 composite 3 x 80 s exposures gathered between 2012 June 14 and 2014 September 20 (this number does not count observations that were rejected as not useful for high-precision photometry, or those that produced large-amplitude outliers in the Proxima Cen light curve). These include 3430 observations made with the HS-2 unit at LCO, 4630 observations made with the HS-4 unit at the HESS site, and 3011 observations made with the HS-6 unit at the SSO site. Due to weather and other factors, the cadence was nonuniform. The median time difference between consecutive observations in the full time series is 368 s. (2 data files).

  14. Optical Measurement Center Status

    NASA Technical Reports Server (NTRS)

    Rodriguez, H.; Abercromby, K.; Mulrooney, M.; Barker, E.

    2007-01-01

    Beginning in 2005, an optical measurement center (OMC) was created to measure the photometric signatures of debris pieces. Initially, the OMC was equipped with a 300 W xenon arc lamp, an SBIG 512 x 512 ST8X MEI CCD camera with standard Johnson filters, and a Lynx 6 robotic arm with five degrees of freedom. As research progressed, modifications were made to the equipment. A customized rotary table was built to overcome the robot's limitation of 180-degree wrist rotation and provide complete 360-degree rotation with little human interaction. This change allowed an initial phase angle (source-object-camera angle) of roughly 5 degrees to be adjusted to 7, 10, 15, 18, 20, 25, or 28 degrees. Additionally, the Johnson R and I CCD filters were replaced with the standard astronomical filter suite (Bessell R, I). In an effort to reduce object saturation, the two generic aperture stops were replaced with neutral density filters. Initially, data were taken with aluminum debris pieces from the European Space Operations Centre ESOC2 ground test and more recently with samples of a thermal multi-layered insulation (MLI) commonly used on rocket bodies and satellites. The ESOC2 data provided light curve analysis for one type of material but many different shapes, including flat, bent, curled, folded, and torn. The MLI samples are roughly the same size and shape, but have different surfaces that give rise to interesting photometric light curves. In addition, filter photometry was conducted on the MLI pieces, a process that also will be used on the ESOC2 samples. While obtaining light curve data, an anomalous drop in intensity was observed when the table revolved through the second 180-degree rotation. Investigation revealed that the robot's wrist rotation is not reliable past 80 degrees, so the object may be at slightly different angles at the 180-degree transition. To limit this effect, the initial rotation position begins with the object's minimal surface area facing the camera.

  15. The design of common aperture and multi-band optical system based on day light telescope

    NASA Astrophysics Data System (ADS)

    Chen, Jiao; Wang, Ling; Zhang, Bo; Teng, Guoqi; Wang, Meng

    2017-02-01

    With the development of electro-optical weapon systems, common-path, multi-sensor designs have become increasingly popular. To meet the miniaturization and lightweight requirements of electro-optical stabilized sighting systems, this paper presents a common-aperture design that integrates a daylight telescope, a television viewing-aim system, and a laser rangefinder in a multi-band, common-aperture scheme. The daylight telescope has a magnification of 8×, a 6° field of view, and an exit-pupil distance of more than 20 mm. The television viewing-aim system, designed for a 1/3" CCD, has a 156 mm focal length, and the laser ranging system achieves a 10 km ranging distance. The paper outlines the principle of using the daylight telescope as the optical reference for correcting the optical axis. By sharing the objective, erecting the image with an inverting prism, and coating a beam-splitting film on the inclined plane of a cube prism, the system has been applied to electro-optical weapon systems with high-resolution imaging and high-precision ranging.
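    For an afocal telescope of this kind, the magnification fixes the ratio between the objective aperture and the exit-pupil diameter. A quick sketch (the 40 mm objective aperture is an assumption for illustration; the abstract quotes only the 8× magnification and the >20 mm exit-pupil distance):

```python
def exit_pupil_mm(aperture_mm, magnification):
    """Exit-pupil diameter of an afocal telescope: objective aperture / magnification."""
    return aperture_mm / magnification

# Hypothetical 40 mm objective at the stated 8x magnification:
print(exit_pupil_mm(40.0, 8))  # 5.0 mm exit pupil
```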

  16. Intelligent viewing control for robotic and automation systems

    NASA Astrophysics Data System (ADS)

    Schenker, Paul S.; Peters, Stephen F.; Paljug, Eric D.; Kim, Won S.

    1994-10-01

    We present a new system for supervisory automated control of multiple remote cameras. Our primary purpose in developing this system has been to provide capability for knowledge-based, `hands-off' viewing during execution of teleoperation/telerobotic tasks. The reported technology has broader applicability to remote surveillance, telescience observation, automated manufacturing workcells, etc. We refer to this new capability as `Intelligent Viewing Control (IVC),' distinguishing it from simple programmed camera motion control. In the IVC system, camera viewing assignment, sequencing, positioning, panning, and parameter adjustment (zoom, focus, aperture, etc.) are invoked and interactively executed in real time by a knowledge-based controller, drawing on a priori known task models and constraints, including operator preferences. This multi-camera control is integrated with a real-time, high-fidelity 3D graphics simulation, which is correctly calibrated in perspective to the actual cameras and their platform kinematics (translation/pan-tilt). This merged graphics-with-video design allows the system user to preview and modify the planned (`choreographed') viewing sequences. Further, during actual task execution, the system operator has available both the resulting optimized video sequence and supplementary graphics views from arbitrary perspectives. IVC, including operator-interactive designation of robot task actions, is presented to the user as a well-integrated, single-screen video-graphic user interface allowing easy access to all relevant telerobot communication/command/control resources. We describe and show pictorial results of a preliminary IVC system implementation for telerobotic servicing of a satellite.

  17. Single-frequency 3D synthetic aperture imaging with dynamic metasurface antennas.

    PubMed

    Boyarsky, Michael; Sleasman, Timothy; Pulido-Mancera, Laura; Diebold, Aaron V; Imani, Mohammadreza F; Smith, David R

    2018-05-20

    Through aperture synthesis, an electrically small antenna can be used to form a high-resolution imaging system capable of reconstructing three-dimensional (3D) scenes. However, the large spectral bandwidth typically required in synthetic aperture radar systems to resolve objects in range often requires costly and complex RF components. We present here an alternative approach based on a hybrid imaging system that combines a dynamically reconfigurable aperture with synthetic aperture techniques, demonstrating the capability to resolve objects in three dimensions (3D), with measurements taken at a single frequency. At the core of our imaging system are two metasurface apertures, both of which consist of a linear array of metamaterial irises that couple to a common waveguide feed. Each metamaterial iris has integrated within it a diode that can be biased so as to switch the element on (radiating) or off (non-radiating), such that the metasurface antenna can produce distinct radiation profiles corresponding to different on/off patterns of the metamaterial element array. The electrically large size of the metasurface apertures enables resolution in range and one cross-range dimension, while aperture synthesis provides resolution in the other cross-range dimension. The demonstrated imaging capabilities of this system represent a step forward in the development of low-cost, high-performance 3D microwave imaging systems.
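    The cross-range resolution supplied by aperture synthesis follows the usual far-field SAR rule of thumb, δ ≈ λR/(2L). A sketch with hypothetical numbers (the abstract does not quote these values):

```python
def cross_range_resolution_m(wavelength_m, standoff_m, synthetic_aperture_m):
    """Far-field SAR cross-range resolution: lambda * R / (2 * L_syn)."""
    return wavelength_m * standoff_m / (2.0 * synthetic_aperture_m)

# Hypothetical: 10 GHz (3 cm wavelength), 1 m standoff, 0.3 m synthetic aperture.
print(cross_range_resolution_m(0.03, 1.0, 0.3))  # 0.05 m
```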

  18. 4-mm-diameter three-dimensional imaging endoscope with steerable camera for minimally invasive surgery (3-D-MARVEL).

    PubMed

    Bae, Sam Y; Korniski, Ronald J; Shearn, Michael; Manohara, Harish M; Shahinian, Hrayr

    2017-01-01

    High-resolution three-dimensional (3-D) imaging (stereo imaging) by endoscopes in minimally invasive surgery, especially in space-constrained applications such as brain surgery, is one of the most desired capabilities. Such capability exists at larger than 4-mm overall diameters. We report the development of a stereo imaging endoscope of 4-mm maximum diameter, called Multiangle, Rear-Viewing Endoscopic Tool (MARVEL) that uses a single-lens system with complementary multibandpass filter (CMBF) technology to achieve 3-D imaging. In addition, the system is endowed with the capability to pan from side-to-side over an angle of [Formula: see text], which is another unique aspect of MARVEL for such a class of endoscopes. The design and construction of a single-lens, CMBF aperture camera with integrated illumination to generate 3-D images, and the actuation mechanism built into it is summarized.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, A. S., E-mail: alastair.moore@physics.org; Ahmed, M. F.; Soufli, R.

    A dual-channel streaked soft x-ray imager has been designed and used on high-energy-density physics experiments at the National Ignition Facility. This streaked imager creates two images of the same x-ray source using two slit apertures and a single shallow-angle reflection from a nickel mirror. Thin filters are used to create narrow-bandpass images at 510 eV and 360 eV. When measuring a Planckian spectrum, the brightness ratio of the two images can be translated into a color temperature, provided that the spectral sensitivity of the two images is well known. To reduce uncertainty and remove spectral features of the streak camera photocathode from this photon energy range, a thin 100 nm CsI on 50 nm Al streak camera photocathode was implemented. Provided that the spectral shape is well known, uncertainty in the spectral sensitivity limits the accuracy of the temperature measurement to approximately 4.5% at 100 eV.
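    For a Planckian source, the brightness ratio of two narrow bands is a monotonic function of temperature, so it can be inverted numerically. A minimal sketch assuming idealized delta-function passbands at 510 eV and 360 eV (the real analysis folds in the measured spectral sensitivity of each channel):

```python
import math

def planck_ratio(kT_eV, e1=510.0, e2=360.0):
    """Ratio of Planck spectral radiance (per unit photon energy) at e1 vs e2."""
    b = lambda e: e**3 / math.expm1(e / kT_eV)  # B_E ~ E^3 / (exp(E/kT) - 1)
    return b(e1) / b(e2)

def color_temperature_eV(ratio, lo=10.0, hi=1000.0):
    """Invert the band ratio for kT by bisection (the ratio rises with T)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if planck_ratio(mid) < ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, at kT = 100 eV the ideal 510/360 eV ratio is about 0.62, and the bisection recovers 100 eV from it.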

  20. Molecular Imaging in the College of Optical Sciences – An Overview of Two Decades of Instrumentation Development

    PubMed Central

    Furenlid, Lars R.; Barrett, Harrison H.; Barber, H. Bradford; Clarkson, Eric W.; Kupinski, Matthew A.; Liu, Zhonglin; Stevenson, Gail D.; Woolfenden, James M.

    2015-01-01

    During the past two decades, researchers at the University of Arizona’s Center for Gamma-Ray Imaging (CGRI) have explored a variety of approaches to gamma-ray detection, including scintillation cameras, solid-state detectors, and hybrids such as the intensified Quantum Imaging Device (iQID) configuration where a scintillator is followed by optical gain and a fast CCD or CMOS camera. We have combined these detectors with a variety of collimation schemes, including single and multiple pinholes, parallel-hole collimators, synthetic apertures, and anamorphic crossed slits, to build a large number of preclinical molecular-imaging systems that perform Single-Photon Emission Computed Tomography (SPECT), Positron Emission Tomography (PET), and X-Ray Computed Tomography (CT). In this paper, we discuss the themes and methods we have developed over the years to record and fully use the information content carried by every detected gamma-ray photon. PMID:26236069

  1. September 2006 Monthly Report- ITER Visible/IRTV Optical Design Scoping Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lasnier, C

    LLNL received a request from the US ITER organization to perform a scoping study of optical design for visible/IR camera systems for the 6 upper ports of ITER. A contract was put in place and the LLNL account number was opened July 19, 2006. A kickoff meeting was held at LLNL July 26. The principal work under the contract is being performed by Lynn Seppala (optical designer), Kevin Morris (mechanical designer), Max Fenstermacher (visible cameras), Mathias Groth (assisting with visible cameras), and Charles Lasnier (IR cameras and Principal Investigator), all LLNL employees. Kevin Morris has imported ITER CAD files and developed a simplified 3D view of the ITER tokamak with upper ports, which he used to determine the optimum viewing angle from an upper port to see the outer target. He also determined the minimum angular field of view needed to see the largest possible coverage of the outer target. We examined the CEA-Cadarache report on their optical design for ITER visible/IRTV equatorial ports. We found that the resolution was diffraction-limited by the 5-mm aperture through the tile. Lynn Seppala developed a similar front-end design for an upper port but with a larger 6-inch-diameter beam. This allows the beam to pass through the port plug and port interspace without further focusing optics until outside the bioshield, which simplifies the design and eliminates the requirement for complex relay lenses in the port interspace. The focusing optics are all mirrors, which allows the system to handle light from 0.4 {micro}m to 5 {micro}m wavelength without chromatic aberration. The window material chosen is sapphire, as in the CEA design. Sapphire has good transmission in the desired wavelengths up to 4.8 {micro}m, as well as good mechanical strength. We have verified that sapphire windows of the needed size are commercially available. The diffraction-limited resolution permitted by the 5 mm aperture falls short of the ITER specification value but is well matched to the resolution of current detectors. A large increase in resolution would require a similar increase in the linear pixel count on a detector; however, we cannot increase the aperture much without affecting the image quality. Lynn Seppala is writing a memo detailing the resolution trade-offs. Charles Lasnier is calculating the radiated power that will fall on the detector, in order to estimate the signal-to-noise ratio and maximum frame rate. The signal will be reduced by the fact that the outer target plates are tungsten, which radiates less than carbon at the same temperature. The tungsten will also reflect radiation from the carbon tiles of the private-flux dome, which will radiate efficiently although at a lower temperature than the target plates. The analysis will include estimates of these effects. Max Fenstermacher is investigating the intensity of line emission in the visible band, in order to predict the signal-to-noise ratio and maximum frame rate for the visible camera. Andre Kukushkin has modeling results that will give local emission of deuterium and carbon lines. Line integrals of the emission must be done to produce the emitted intensity. The model is not able to handle tungsten and beryllium, so we will only be able to estimate deuterium and carbon emission. Total costs as of September 30, 2006 are $87,834.43. Manpower was 0.58 FTEs in July, 1.48 in August, and 1.56 in September.

  2. TEM Video Compressive Sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ TEM experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame times approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded-aperture compressive sensing methods [1, 2, 3, 4] to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded-aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical compressive sensing inversion. Our simulations show that it should be possible to increase the speed of any camera by at least an order of magnitude. Compressive sensing (CS) combines sensing and compression in one operation, and thus provides an approach that could further improve the temporal resolution while correspondingly reducing the electron dose rate. Because the signal is measured in a compressive manner, fewer total measurements are required. When applied to TEM video capture, compressive imaging could improve acquisition speed and reduce the electron dose rate. CS is a recent concept, and has come to the forefront due to the seminal work of Candès [5]. Since that publication, there has been enormous growth in the application of CS and the development of CS variants. For electron microscopy applications, the concept of CS has also recently been applied to electron tomography [6] and to reduction of electron dose in scanning transmission electron microscopy (STEM) imaging [7]. To demonstrate the applicability of coded-aperture CS video reconstruction for atomic-level imaging, we simulate compressive sensing on observations of Pd nanoparticles and Ag nanoparticles during exposure to high temperatures and other environmental conditions. Figure 1 highlights the results from the Pd nanoparticle experiment. On the left, 10 frames are reconstructed from a single coded frame; the original frames are shown for comparison. On the right, a selection of three frames is shown from reconstructions at compression levels of 10, 20, and 30. The reconstructions, which are not post-processed, are true to the originals and degrade in a straightforward manner. The final choice of compression level will depend on both the temporal and spatial resolution required for a specific imaging task, but the results indicate that an increase in speed of better than an order of magnitude should be possible for all experiments. References: [1] P Llull, X Liao, X Yuan et al. Optics Express 21(9), (2013), p. 10526. [2] J Yang, X Yuan, X Liao et al. Image Processing, IEEE Trans 23(11), (2014), p. 4863. [3] X Yuan, J Yang, P Llull et al. In ICIP 2013 (IEEE), p. 14. [4] X Yuan, P Llull, X Liao et al. In CVPR 2014, p. 3318. [5] EJ Candès, J Romberg and T Tao. Information Theory, IEEE Trans 52(2), (2006), p. 489. [6] P Binev, W Dahmen, R DeVore et al. In Modeling Nanoscale Imaging in Electron Microscopy, eds. T Vogt, W Dahmen and P Binev (Springer US), Nanostructure Science and Technology (2012), p. 73. [7] A Stevens, H Yang, L Carin et al. Microscopy 63(1), (2014), p. 41.
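    The acquisition step described above can be written as a toy forward model: each of T sub-frames is multiplied by its own binary code, and the masked sub-frames integrate into a single detector readout. This is an illustrative sketch only; the reconstruction step, omitted here, inverts this model with a sparsity prior as in the paper's references.

```python
import numpy as np

rng = np.random.default_rng(1)
T, H, W = 10, 32, 32                         # 10:1 temporal compression
video = rng.random((T, H, W))                # "true" sub-frames (unknown to camera)
codes = rng.integers(0, 2, size=(T, H, W))   # known per-sub-frame binary masks

# One exposure: the coded sub-frames sum into a single camera frame.
coded_frame = (codes * video).sum(axis=0)

# Recovery (not shown) estimates `video` from `coded_frame` and the known
# `codes` via statistical compressive-sensing inversion.
print(coded_frame.shape)
```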

  3. Spirit Mini-TES Observations: From Bonneville Crater to the Columbia Hills.

    NASA Astrophysics Data System (ADS)

    Blaney, D. L.; Athena Science

    2004-11-01

    During the Mars Exploration Rover Extended Mission, the Spirit rover traveled from the rim of the crater informally known as "Bonneville Crater" into the hills informally known as the "Columbia Hills" in Gusev Crater. During this >3 km drive, Mini-TES (Miniature Thermal Emission Spectrometer) collected systematic observations to characterize spectral diversity, along with targeted observations of rocks, soils, rover tracks, and trenches. Surface temperatures steadily decreased during the drive and arrival in the Columbia Hills with the approach of winter. Mini-TES covers the 5-29 micron spectral region with a 20 mrad aperture that is co-registered with the panoramic and navigation cameras. As at the landing site (Christensen et al., Science, 2004), many dark rocks in the plains around "Bonneville Crater" show long-wavelength (15-25 μm) absorptions due to olivine, consistent with the detection of olivine-bearing basalt at this site from orbital TES infrared spectroscopy. Rocks with the spectral signature of olivine are rarer in the Columbia Hills. Measurements of outcrops of presumably intact bedrock lack any olivine signature and are consistent with other results indicating that these rocks are highly altered. Rock coatings and fine dust on rocks are common. Soils have thin dust coatings, and disturbed soil (e.g., rover tracks and trenches) is consistent with basalt. Mini-TES observations were coordinated with Panoramic Camera (Pancam) observations to allow us to search for correlations of visible spectral properties with the infrared. This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract to NASA.

  4. Determining the phase and amplitude distortion of a wavefront using a plenoptic sensor.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C

    2015-05-01

    We have designed a plenoptic sensor to retrieve phase and amplitude changes resulting from a laser beam's propagation through atmospheric turbulence. Compared with the commonly restricted domain of (-π,π) in phase reconstruction by interferometers, the reconstructed phase obtained by the plenoptic sensors can be continuous up to a multiple of 2π. When compared with conventional Shack-Hartmann sensors, ambiguities caused by interference or low intensity, such as branch points and branch cuts, are less likely to happen and can be adaptively avoided by our reconstruction algorithm. In the design of our plenoptic sensor, we modified the fundamental structure of a light field camera into a mini Keplerian telescope array by accurately cascading the back focal plane of its object lens with a microlens array's front focal plane and matching the numerical aperture of both components. Unlike light field cameras designed for incoherent imaging purposes, our plenoptic sensor operates on the complex amplitude of the incident beam and distributes it into a matrix of images that are simpler and less subject to interference than a global image of the beam. Then, with the proposed reconstruction algorithms, the plenoptic sensor is able to reconstruct the wavefront and a phase screen at an appropriate depth in the field that causes the equivalent distortion on the beam. The reconstructed results can be used to guide adaptive optics systems in directing beam propagation through atmospheric turbulence. In this paper, we will show the theoretical analysis and experimental results obtained with the plenoptic sensor and its reconstruction algorithms.

  5. Determining the phase and amplitude distortion of a wavefront using a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2015-05-01

    We have designed a plenoptic sensor to retrieve phase and amplitude changes resulting from a laser beam's propagation through atmospheric turbulence. Compared with the commonly restricted domain of (-π, π) in phase reconstruction by interferometers, the reconstructed phase obtained by the plenoptic sensors can be continuous up to a multiple of 2π. When compared with conventional Shack-Hartmann sensors, ambiguities caused by interference or low intensity, such as branch points and branch cuts, are less likely to happen and can be adaptively avoided by our reconstruction algorithm. In the design of our plenoptic sensor, we modified the fundamental structure of a light field camera into a mini Keplerian telescope array by accurately cascading the back focal plane of its object lens with a microlens array's front focal plane and matching the numerical aperture of both components. Unlike light field cameras designed for incoherent imaging purposes, our plenoptic sensor operates on the complex amplitude of the incident beam and distributes it into a matrix of images that are simpler and less subject to interference than a global image of the beam. Then, with the proposed reconstruction algorithms, the plenoptic sensor is able to reconstruct the wavefront and a phase screen at an appropriate depth in the field that causes the equivalent distortion on the beam. The reconstructed results can be used to guide adaptive optics systems in directing beam propagation through atmospheric turbulence. In this paper, we will show the theoretical analysis and experimental results obtained with the plenoptic sensor and its reconstruction algorithms.

  6. Infrared cameras are potential traceable "fixed points" for future thermometry studies.

    PubMed

    Yap Kannan, R; Keresztes, K; Hussain, S; Coats, T J; Bown, M J

    2015-01-01

    The National Physical Laboratory (NPL) requires that "fixed points" whose temperatures have been established by the International Temperature Scale of 1990 (ITS-90) be used for device calibration. In practice, a "near" blackbody radiator together with the standard platinum resistance thermometer is accepted as a standard. The aim of this study was to report the correlation and limits of agreement (LOA) of a thermal infrared camera and a non-contact infrared temporal thermometer against each other and against the "near" blackbody radiator. Temperature readings from an infrared thermography camera (FLIR T650sc) and a non-contact infrared temporal thermometer (Hubdic FS-700) were compared to a near blackbody (Hyperion R blackbody model 982) at 0.5 °C increments between 20-40 °C. At each increment, the blackbody cavity temperature was confirmed with the platinum resistance thermometer. Measurements were taken first with the thermal infrared camera and then with the infrared thermometer, with each device mounted in turn on a stand at a fixed distance of 20 cm and 5 cm from the blackbody aperture, respectively. The platinum thermometer under-estimated the blackbody temperature by 0.015 °C (95% LOA: -0.08 °C to 0.05 °C), in contrast to the thermal infrared camera and infrared thermometer, which over-estimated the blackbody temperature by 0.16 °C (95% LOA: 0.03 °C to 0.28 °C) and 0.75 °C (95% LOA: -0.30 °C to 1.79 °C), respectively. The infrared thermometer over-estimated the thermal infrared camera measurements by 0.6 °C (95% LOA: -0.46 °C to 1.65 °C). In conclusion, the thermal infrared camera is a potential temperature reference "fixed point" that could substitute for mercury thermometers. However, further repeatability and reproducibility studies will be required with different models of thermal infrared cameras.
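    The bias and 95% limits of agreement quoted above are standard Bland-Altman quantities: the mean of the paired differences and that mean ± 1.96 standard deviations. A minimal sketch (the readings below are hypothetical, not the study's data):

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman bias and 95% limits of agreement between paired methods."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = float(d.mean())
    half_width = 1.96 * float(d.std(ddof=1))  # 95% interval on the differences
    return bias, bias - half_width, bias + half_width

# Hypothetical paired temperature readings (degrees C):
camera = [36.1, 36.4, 36.8, 37.0, 37.3]
blackbody = [36.0, 36.2, 36.7, 36.8, 37.1]
print(limits_of_agreement(camera, blackbody))
```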

  7. Thermal Neutron Imaging Using A New Pad-Based Position Sensitive Neutron Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dioszegi I.; Vanier P.E.; Salwen C.

    2016-10-29

    Thermal neutrons (with a mean energy of 25 meV) have a scattering mean free path of about 20 m in air. It is therefore feasible to find localized thermal neutron sources at up to ~30 m standoff distance using thermal neutron imaging. Coded-aperture thermal neutron imaging was developed in our laboratory in the nineties, using He-3 filled wire chambers. Recently a new generation of coded-aperture neutron imagers has been developed. In the new design the ionization chamber has anode and cathode planes, where the anode is composed of an array of individual pads. The charge is collected on each of the individual 5x5 mm2 anode pads (48x48 in total, corresponding to a 24x24 cm2 sensitive area) and read out by application-specific integrated circuits (ASICs). The high sensitivity of the ASICs allows a unity-gain operation mode. The new design has several advantages for field-deployable imaging applications compared to the previous generation of wire-grid based neutron detectors, among them the rugged design, lighter weight, and use of non-flammable stopping gas. For standoff localization of thermalized neutron sources, a low-resolution (11x11 pixel) coded-aperture mask has been fabricated. Using the new larger-area detector and the coarse-resolution mask, we performed several standoff experiments using moderated californium and plutonium sources at Idaho National Laboratory. In this paper we report on the development and performance of the new pad-based neutron camera and present long-range coded-aperture images of various thermalized neutron sources.
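    Coded-aperture images of this kind are commonly decoded by cross-correlating the detector counts with a balanced decoding array derived from the mask pattern. A toy sketch using a random binary mask (the instrument itself uses an 11x11 coded-aperture mask with its matched decoding array):

```python
import numpy as np

def decode(detector, mask):
    """Decode a coded-aperture image by circular cross-correlation with the
    balanced array G = 2*mask - 1 (random-mask approximation)."""
    G = 2.0 * mask - 1.0
    corr = np.fft.ifft2(np.fft.fft2(detector) * np.conj(np.fft.fft2(G)))
    return np.real(corr)

# A point source casts a shifted shadow of the mask; decoding recovers the shift.
rng = np.random.default_rng(2)
mask = rng.integers(0, 2, size=(32, 32)).astype(float)
detector = np.roll(mask, shift=(3, 5), axis=(0, 1))  # shadow of one point source
image = decode(detector, mask)
print(np.unravel_index(np.argmax(image), image.shape))
```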

  8. Video semaphore decoding for free-space optical communication

    NASA Astrophysics Data System (ADS)

    Last, Matthew; Fisher, Brian; Ezekwe, Chinwuba; Hubert, Sean M.; Patel, Sheetal; Hollar, Seth; Leibowitz, Brian S.; Pister, Kristofer S. J.

    2001-04-01

    Using real-time image processing we have demonstrated a low bit-rate free-space optical communication system at a range of more than 20 km with an average optical transmission power of less than 2 mW. The transmitter is an autonomous one-cubic-inch microprocessor-controlled sensor node with a laser diode output. The receiver is a standard CCD camera with a 1-inch aperture lens, and both hardware and software implementations of the video semaphore decoding algorithm. With this system, sensor data can be reliably transmitted 21 km from San Francisco to Berkeley.

  9. STS-93 orbiter Columbia streaks across Houston sky

    NASA Image and Video Library

    1999-07-27

    S99-08357 (27 July 1999) --- The fly-over of Space Shuttle Columbia's STS-93 re-entry is seen above the Johnson Space Center's Rocket Park. The Saturn V is below the streak that was left by Columbia re-entering the atmosphere. The image was captured with a Hasselblad 503cx medium format camera with a 30mm Hasselblad lens using an 8-second exposure and an aperture setting of f/8. The film was Kodak PMZ 1000 color negative film. The photographer was Mark Sowa of the NASA Johnson Space Center's photography group.

  10. Wafer chamber having a gas curtain for extreme-UV lithography

    DOEpatents

    Kanouff, Michael P.; Ray-Chaudhuri, Avijit K.

    2001-01-01

    An EUVL device includes a wafer chamber that is separated from the upstream optics by a barrier having an aperture that is permeable to the inert gas. Maintaining an inert gas curtain in the proximity of a wafer positioned in a chamber of an extreme ultraviolet lithography device can effectively prevent contaminants from reaching the optics in an extreme ultraviolet photolithography device even though solid window filters are not employed between the source of reflected radiation, e.g., the camera, and the wafer. The inert gas removes the contaminants by entrainment.

  11. NEWFIRM Software--System Integration Using OPC

    NASA Astrophysics Data System (ADS)

    Daly, P. N.

    2004-07-01

    The NOAO Extremely Wide-Field Infra-Red Mosaic (NEWFIRM) camera is being built to satisfy the survey science requirements on the KPNO Mayall and CTIO Blanco 4m telescopes in an era of 8m+ aperture telescopes. Rather than re-invent the wheel, the software system to control the instrument has taken existing software packages and re-used what is appropriate. The result is an end-to-end observation control system using technology components from DRAMA, ORAC, observing tools, GWC, existing in-house motor controllers and new developments like the MONSOON pixel server.

  12. The importance of the boundary condition in the transport of intensity equation based phase measurement

    NASA Astrophysics Data System (ADS)

    Zhang, Jialin; Chen, Qian; Li, Jiaji; Zuo, Chao

    2017-02-01

    The transport of intensity equation (TIE) is a powerful tool for direct quantitative phase retrieval in microscopy imaging. However, problems can arise when dealing with the boundary condition of the TIE. Previous work introduced a hard-edged aperture at the camera port of a traditional bright-field microscope to generate the boundary signal for the TIE solver. Under this Neumann boundary condition, the quantitative phase can be obtained without any assumption or prior knowledge about the test object or the setup. In this paper, we demonstrate the effectiveness of this method with practical experiments. A micro-lens array is used to compare the results of TIE solvers with and without the introduced aperture, and this accurate quantitative phase imaging technique allows measurement of cell dry mass, which is used in biology to follow the cell cycle, investigate cell metabolism, and assess the effects of drugs.
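    For a uniform intensity I0, the TIE reduces to a Poisson equation, ∇²φ = -(k/I0) ∂I/∂z, which can be solved with an FFT-based inverse Laplacian. A minimal sketch assuming periodic boundaries; note that the paper's point is precisely that a hard-edged aperture supplies the physically correct boundary signal, which this idealized solver glosses over:

```python
import numpy as np

def tie_phase(dIdz, I0, k, dx):
    """Solve laplacian(phi) = -(k/I0) * dI/dz by FFT (periodic boundaries)."""
    n, m = dIdz.shape
    fy = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    fx = 2 * np.pi * np.fft.fftfreq(m, d=dx)
    KY, KX = np.meshgrid(fy, fx, indexing="ij")
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                 # avoid 0/0; the mean phase is set to zero
    rhs_hat = np.fft.fft2(-(k / I0) * dIdz)
    phi_hat = -rhs_hat / k2        # laplacian <-> multiplication by -|k|^2
    phi_hat[0, 0] = 0.0
    return np.real(np.fft.ifft2(phi_hat))
```

For a single Fourier mode this inversion is exact; with real data, mishandled boundaries appear as low-order phase artifacts, which is what the aperture-based boundary condition corrects.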

  13. Single Lens Dual-Aperture 3D Imaging System: Color Modeling

    NASA Technical Reports Server (NTRS)

    Bae, Sam Y.; Korniski, Ronald; Ream, Allen; Fritz, Eric; Shearn, Michael

    2012-01-01

    In an effort to miniaturize a 3D imaging system, we created two viewpoints in a single-objective-lens camera. This was accomplished by placing a pair of Complementary Multi-band Bandpass Filters (CMBFs) in the aperture area. Two key characteristics of the CMBFs are that the passbands are staggered, so only one viewpoint is open at a time when a light band matched to that passband is illuminated, and that the passbands are positioned throughout the visible spectrum, so each viewpoint can render color by taking RGB spectral images. Each viewpoint takes a different spectral image from the other, hence yielding a slightly different color image. This color mismatch between the two viewpoints could lead to color rivalry, where the human vision system fails to resolve two different colors. The mismatch decreases as the number of passbands in a CMBF increases; however, the number of passbands is constrained by cost and fabrication technique. In this paper, a simulation predicting the color mismatch is reported.

  14. Large-aperture ground glass surface profile measurement using coherence scanning interferometry.

    PubMed

    Bae, Eundeok; Kim, Yunseok; Park, Sanguk; Kim, Seung-Woo

    2017-01-23

    We present a coherence scanning interferometer configured to deal with rough glass surfaces exhibiting very low reflectance due to severe sub-surface light scattering. A compound light source is prepared by combining a superluminescent light-emitting diode with an ytterbium-doped fiber amplifier. The light source is attuned to offer a short temporal coherence length of 15 μm but high spatial coherence, securing adequate correlogram contrast by delivering strongly unbalanced optical power to the low-reflectance target. In addition, the infrared spectral range of the light source is shifted toward the visible, to a 1,038 nm center wavelength, so that a multi-megapixel digital camera of the kind available for industrial machine vision can be used to further improve the correlogram contrast with better lateral image resolution. Experimental results obtained from a ground Zerodur mirror of 200 mm aperture size and 0.9 μm rms roughness are discussed to validate the proposed interferometer system.

  15. Preliminary Opto-Mechanical Design for the X2000 Transceiver

    NASA Technical Reports Server (NTRS)

    Hemmati, H.; Page, N. A.

    2000-01-01

    Preliminary optical design and mechanical conceptual design for a 30 cm aperture transceiver are described. A common aperture is used for both transmit and receive. Special attention was given to off-axis and scattered light rejection and isolation of the receive channel from the transmit channel. Requirements, details of the design and preliminary performance analysis of the transceiver are provided.

  16. Sub-aperture switching based ptychographic iterative engine (sasPIE) method for quantitative imaging

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Kong, Yan; Jiang, Zhilong; Yu, Wei; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-03-01

    Though the ptychographic iterative engine (PIE) has been widely adopted for quantitative micro-imaging with various illuminations such as visible light, X-rays, and electron beams, mechanical inaccuracy in the raster scanning of the sample relative to the illumination seriously degrades the reconstruction quality and keeps the achieved resolution far below that determined by the numerical aperture of the optical system. To overcome this disadvantage, the sub-aperture switching based PIE method is proposed: the mechanical scanning of common PIE is replaced by sub-aperture switching, and the reconstruction error related to positioning inaccuracy is completely avoided. The proposed technique remarkably improves the reconstruction quality, reduces the complexity of the experimental setup, and fundamentally accelerates data acquisition and reconstruction.
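
    All PIE variants, sasPIE included, share the same exit-wave update at their core; the following single-position ePIE-style sketch (illustrative, not the authors' code) shows how one measured diffraction intensity corrects the object estimate:

```python
import numpy as np

def pie_update(obj, probe, intensity, alpha=1.0):
    """One PIE object update: propagate the exit wave, replace its modulus
    with the measured one, back-propagate, and blend the correction in."""
    psi = probe * obj                          # exit wave under this illumination
    Psi = np.fft.fft2(psi)                     # far-field (Fraunhofer) propagation
    Psi = np.sqrt(intensity) * np.exp(1j * np.angle(Psi))  # impose measured modulus
    psi_new = np.fft.ifft2(Psi)
    return obj + alpha * np.conj(probe) / np.max(np.abs(probe))**2 * (psi_new - psi)
```

    In sasPIE the set of probes comes from switching sub-apertures electronically rather than mechanically translating the sample, so the probe positions are known exactly and the positioning error term vanishes.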

  17. Accuracy evaluation of optical distortion calibration by digital image correlation

    NASA Astrophysics Data System (ADS)

    Gao, Zeren; Zhang, Qingchuan; Su, Yong; Wu, Shangquan

    2017-11-01

    Due to its convenience of operation, camera calibration based on a plane template is widely used in image measurement, computer vision, and other fields. How to select a suitable distortion model, however, remains an open problem, so there is a need for experimental evaluation of the accuracy of camera distortion calibrations. This paper presents an experimental method for evaluating camera distortion calibration accuracy that is easy to implement, has high precision, and is suitable for a variety of commonly used lenses. First, we use the digital image correlation method to calculate the in-plane rigid-body displacement field of an image displayed on a liquid crystal display before and after translation, as captured with a camera. Next, we use a calibration board to calibrate the camera and obtain calibration parameters, which are used to correct the calculation points of the images before and after deformation. The displacement fields before and after correction are compared to analyze the distortion calibration results. Experiments were carried out to evaluate the performance of two commonly used industrial camera lenses for four commonly used distortion models.
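
    As an illustration of the kind of model under evaluation, a two-parameter radial (Brown) distortion mapping — one common member of the class of distortion models such a calibration estimates, with hypothetical coefficients — can be written as:

```python
import numpy as np

def radial_distort(xy, k1, k2, center=(0.0, 0.0)):
    """Map ideal image points to distorted ones with a two-parameter
    radial model: x_d = x * (1 + k1*r^2 + k2*r^4). The coefficients
    k1, k2 are what a plane-template calibration estimates."""
    p = np.asarray(xy, dtype=float) - center
    r2 = np.sum(p**2, axis=-1, keepdims=True)
    return p * (1.0 + k1 * r2 + k2 * r2**2) + center
```

    The paper's evaluation amounts to checking that, after undoing such a mapping with the calibrated coefficients, the DIC-measured displacement field of a rigidly translated target becomes spatially uniform.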

  18. Multiocular image sensor with on-chip beam-splitter and inner meta-micro-lens for single-main-lens stereo camera.

    PubMed

    Koyama, Shinzo; Onozawa, Kazutoshi; Tanaka, Keisuke; Saito, Shigeru; Kourkouss, Sahim Mohamed; Kato, Yoshihisa

    2016-08-08

    We developed multiocular 1/3-inch 2.75-μm-pixel-size 2.1M-pixel image sensors by co-design of an on-chip beam-splitter and a 100-nm-width, 800-nm-depth patterned inner meta-micro-lens for single-main-lens stereo camera systems. A camera with the multiocular image sensor can capture a horizontally one-dimensional light field: the on-chip beam-splitter divides rays horizontally according to incident angle, and the inner meta-micro-lens collects the divided rays into pixels with small optical loss. Cross-talk between adjacent light field images of a fabricated binocular image sensor and of a quad-ocular image sensor is as low as 6% and 7%, respectively. By selecting two images from the one-dimensional light field images, a selectable baseline for stereo vision is realized to view close objects with a single main lens. In addition, by adding multiple light field images with different ratios, the baseline distance can be tuned within the aperture of the main lens. We suggest this electrically selectable or tunable baseline stereo vision to reduce the 3D fatigue of viewers.

  19. Extended depth of field system for long distance iris acquisition

    NASA Astrophysics Data System (ADS)

    Chen, Yuan-Lin; Hsieh, Sheng-Hsun; Hung, Kuo-En; Yang, Shi-Wen; Li, Yung-Hui; Tien, Chung-Hao

    2012-10-01

    Using biometric signatures for identity recognition has been practiced for centuries. Recently, iris recognition systems have attracted much attention due to their high accuracy and high stability. The texture of the iris provides a signature that is unique for each subject. Currently, most commercial iris recognition systems acquire images at less than 50 cm, a serious constraint that needs to be overcome if the technology is to be used for airport access or entrances requiring a high turnover rate. In order to capture iris patterns from a distance, in this study we developed a telephoto imaging system with image processing techniques. By using a cubic phase mask positioned in front of the camera, the point spread function was kept constant over a wide range of defocus. With an adequate decoding filter, the blurred image was restored, and a working distance between the subject and the camera of over 3 m was achieved with a 500 mm focal length and an aperture of F/6.3. The simulation and experimental results validated the proposed scheme: the depth of focus of the iris camera was extended to three times that of traditional optics, while sufficient recognition accuracy was maintained.
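
    The defocus invariance claimed here can be reproduced numerically by comparing the PSF of a clear circular pupil with that of a pupil carrying a cubic phase mask, at zero and at strong defocus. The grid size and the values α = 30 rad and ψ = 5 rad of edge defocus below are illustrative, not the paper's:

```python
import numpy as np

def psf(alpha, defocus, n=128):
    """|FFT(pupil)|^2 for a circular pupil carrying a cubic phase
    alpha*(x^3 + y^3) plus a quadratic defocus phase defocus*r^2
    (both in radians at the pupil edge), normalized to unit energy."""
    u = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(u, u, indexing="ij")
    r2 = X**2 + Y**2
    pupil = (r2 <= 1.0) * np.exp(1j * (alpha * (X**3 + Y**3) + defocus * r2))
    p = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
    return p / p.sum()

def similarity(a, b):
    """Correlation coefficient between two (flattened) PSFs."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]
```

    With these parameters the cubic-phase PSF barely changes with defocus, while the clear-aperture PSF degrades sharply; this near-invariance is what the decoding filter exploits to restore a sharp image over the extended range.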

  20. Sparse synthetic aperture with Fresnel elements (S-SAFE) using digital incoherent holograms

    PubMed Central

    Kashter, Yuval; Rivenson, Yair; Stern, Adrian; Rosen, Joseph

    2015-01-01

    Creating a large-scale synthetic aperture makes it possible to break the resolution boundaries dictated by the wave nature of light in common optical systems. However, implementation is challenging, since the generation of a large continuous mosaic synthetic aperture composed of many patterns is complicated in terms of both phase matching and time-multiplexing duration. In this study we present an advanced configuration for an incoherent holographic imaging system with super-resolution qualities that creates a partial synthetic aperture. The new system, termed sparse synthetic aperture with Fresnel elements (S-SAFE), significantly decreases the number of recorded elements and is free from positional constraints on their location. Additionally, in order to obtain the best image quality, we propose an optimal mosaicking structure derived from physical and numerical considerations, and introduce three reconstruction approaches which are compared and discussed. The super-resolution capabilities of the proposed scheme and its limitations are analyzed, numerically simulated, and experimentally demonstrated. PMID:26367947

  1. Water flow in high-speed handpieces.

    PubMed

    Cavalcanti, Bruno Neves; Serairdarian, Paulo Isaías; Rode, Sigmar Mello

    2005-05-01

    This study measured the water flow commonly used in high-speed handpieces to evaluate the water flow's influence on temperature generation. Different flow rates were evaluated between turbines that had different numbers of cooling apertures. Two water samples were collected from each high-speed handpiece at private practices and at the School of Dentistry at São José dos Campos. The first sample was collected at the customary flow and the second was collected with the terminal opened for maximum flow. The two samples were collected into weighed glass receptacles after 15 seconds of turbine operation. The glass receptacles were reweighed and the difference between weights was recorded to calculate the water flow in mL/min for further statistical analysis. The average water flow for 137 samples was 29.48 mL/min. The flow rates obtained were 42.38 mL/min for turbines with one coolant aperture, 34.31 mL/min for turbines with two coolant apertures, and 30.44 mL/min for turbines with three coolant apertures. There were statistical differences between turbines with one and three coolant apertures (Tukey-Kramer multiple comparisons test with P < .05). Turbine handpieces with one cooling aperture delivered more water to the burs than high-speed handpieces with more than one aperture.
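
    The flow computation described (mass collected over 15 s, converted to mL/min taking water density as 1 g/mL) is simple arithmetic; a sketch with made-up sample masses:

```python
def flow_ml_per_min(mass_before_g, mass_after_g, seconds=15.0, density_g_per_ml=1.0):
    """Convert a weighed water sample to a flow rate in mL/min,
    following the paper's 15-second collection protocol."""
    volume_ml = (mass_after_g - mass_before_g) / density_g_per_ml
    return volume_ml * 60.0 / seconds
```

    For example, a receptacle gaining 7.37 g over 15 s corresponds to 29.48 mL/min, matching the reported overall average.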

  2. Possible Overlaps Between Blobs, Grism Apertures, and Dithers

    NASA Astrophysics Data System (ADS)

    Ryan, R. E.; McCullough, P. R.

    2017-06-01

    We present an investigation into possible overlaps between the known IR blobs and the grism aperture reference positions and IR dither patterns. Each aperture was designed to place the science target (e.g. a specific star) on a cosmetically clean area of the IR detector. Similarly, the dither patterns were designed to mitigate cosmetic defects by rarely (or ideally never) placing such targets on known defects. Because blobs accumulate with time, the originally defined apertures and dither patterns may no longer accomplish their goals, so it is important to reverify these combinations. We find two potential overlaps between blob, aperture, and dither combinations, but do not recommend any changes to the current suite of aperture reference positions and/or dither patterns, for two reasons. First, one of the overlaps occurs with a dither/aperture combination that is seldom used for high-value science operations and is more common for wide-field surveys/mosaics. Second, the other overlap is 8.7 pix from a blob that has a fiducial radius of 10 pix, which already represents a very conservative distance. We conclude that a similar analysis should be repeated as new blobs occur, to continue to ensure ideal operations for high-value science targets. The purpose of this report is to document the analysis in order to facilitate its repetition in the future.

  3. Numerical simulations of imaging satellites with optical interferometry

    NASA Astrophysics Data System (ADS)

    Ding, Yuanyuan; Wang, Chaoyan; Chen, Zhendong

    2015-08-01

    An optical interferometric imaging system, composed of multiple sub-apertures, is a type of sensor that can break through the single-aperture limit and achieve high-resolution imaging. This technique can be used to precisely measure the shapes, sizes, and positions of astronomical objects and satellites; it can also be applied to space exploration, space debris tracking, and satellite monitoring and survey. A Fizeau-type optical aperture synthesis telescope has the advantages of short baselines, a common mount, and multiple sub-apertures, so it is feasible for instantaneous direct imaging through focal-plane combination. Since 2002, researchers at Shanghai Astronomical Observatory have studied optical interferometry techniques. For array configurations, two optimal configurations have been proposed instead of the symmetrical circular distribution: the asymmetrical circular distribution and the Y-type distribution. On this basis, two kinds of structure were proposed based on the Fizeau interferometric telescope: one is a Y-type independent sub-aperture telescope, the other a segmented-mirror telescope with a common secondary mirror. In this paper, we describe the interferometric telescope and image acquisition, and then focus on simulations of image restoration based on the Y-type telescope and the segmented-mirror telescope. The Richardson-Lucy (RL) method, the Wiener method, and the Ordered Subsets Expectation Maximization (OS-EM) method are studied, and the influence of different stopping rules is analyzed. Finally, we present reconstruction results for images of several satellites.
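
    Of the restoration methods compared, Richardson-Lucy is the easiest to sketch; the following is a minimal FFT-based (circular-convolution) version, not the authors' implementation:

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=50):
    """Richardson-Lucy deconvolution: iteratively rescale the estimate by
    the back-projected ratio of the data to the reblurred estimate.
    psf must have the same shape as the image and be centered."""
    otf = np.fft.fft2(np.fft.ifftshift(psf / psf.sum()))
    est = np.full_like(blurred, blurred.mean())
    for _ in range(n_iter):
        reblurred = np.real(np.fft.ifft2(np.fft.fft2(est) * otf))
        ratio = blurred / np.maximum(reblurred, 1e-12)
        est = est * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(otf)))
    return est
```

    The Wiener method is instead a single non-iterative division in the Fourier domain, and OS-EM applies the same multiplicative correction over ordered subsets of the data; the iteration count here is exactly the kind of stopping rule whose influence the paper analyzes.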

  4. Concept of electro-optical sensor module for sniper detection system

    NASA Astrophysics Data System (ADS)

    Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz

    2010-10-01

    The paper presents an initial concept of an electro-optical sensor unit for sniper detection. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. As part of a larger system, it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data and sensor fusion techniques. Additionally, it is expected to provide some pre-shot detection capability. Acoustic (or radar) systems used for shot detection generally offer only "after-the-shot" information and cannot prevent an enemy attack, which in the case of a skilled sniper opponent usually means trouble. The passive imaging sensors presented in this paper, together with active systems that detect pointed optics, are capable of detecting specific shooter signatures, or at least the presence of suspicious objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters, such as focal plane array size and type, focal length, and aperture, were chosen on the basis of the assumed tactical characteristics of the system (mainly detection range) and the current technology level. In order to provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image acquisition rate. The daylight camera operates in a supporting role, providing a corresponding visual image that is easier for a human operator to comprehend. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot recording sequences are presented.

  5. Best practices to optimize intraoperative photography.

    PubMed

    Gaujoux, Sébastien; Ceribelli, Cecilia; Goudard, Geoffrey; Khayat, Antoine; Leconte, Mahaut; Massault, Pierre-Philippe; Balagué, Julie; Dousset, Bertrand

    2016-04-01

    Intraoperative photography is used extensively for communication, research, and teaching. The objective of the present work was to define, using a standardized methodology and literature review, the best technical conditions for intraoperative photography. Using either a smartphone camera, a bridge camera, or a single-lens reflex (SLR) camera, photographs were taken under various standard conditions by a professional photographer. All images were independently assessed, blinded to technical conditions, to define the best shooting conditions and methods. For better photographs, an SLR camera with manual settings should be used. Photographs should be centered and taken vertically and orthogonal to the surgical field, with a linear scale to avoid errors in perspective. The shooting distance should be about 75 cm using an 80-100-mm focal lens. Flash should be avoided and scialytic low-powered light should be used without focus. The operative field should be clean, wet surfaces should be avoided, and metal instruments should be hidden to avoid reflections. For an SLR camera, the International Organization for Standardization (ISO) speed should be as low as possible, the autofocus area selection mode should be single-point AF, shutter speed should be above 1/100 second, and the aperture should be as narrow as possible, above f/8. For a smartphone, the high dynamic range setting should be used if available; flash, digital filters, effect apps, and digital zoom are not recommended. If a few basic technical rules are known and applied, high-quality photographs can be taken by amateur photographers and meet the standards accepted in clinical practice, academic communication, and publications. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Variable aperture-based ptychographical iterative engine method

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Kong, Yan; Meng, Xin; He, Xiaoliang; Du, Ruijun; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-02-01

    A variable aperture-based ptychographical iterative engine (vaPIE) is demonstrated both numerically and experimentally to reconstruct the sample phase and amplitude rapidly. By adjusting the size of a tiny aperture under parallel-beam illumination to change the illumination on the sample step by step, and recording the corresponding diffraction patterns sequentially, both the sample phase and amplitude can be faithfully reconstructed with a modified ptychographical iterative engine (PIE) algorithm. Since far fewer diffraction patterns are required than in common PIE, and the shape, size, and position of the aperture need not be known exactly, the proposed vaPIE method remarkably reduces the data acquisition time and makes PIE less dependent on the mechanical accuracy of the translation stage; the technique can therefore potentially be applied in various fields of scientific research.

  7. Nano-scale zero valent iron transport in a variable aperture dolomite fracture and a glass fracture

    NASA Astrophysics Data System (ADS)

    Mondal, P.; Sleep, B. E.; Cui, Z.; Zhou, Z.

    2014-12-01

    Experiments and numerical simulations are being performed to understand the transport behavior of carboxymethyl cellulose polymer stabilized nano-scale zero valent iron (nZVI) in a variable aperture dolomite rock fracture and a variable aperture glass replica of a fractured slate. The rock fracture was prepared by artificially inducing a fracture in a dolomite block along a stylolite, and the glass fracture was prepared by creating molds with melted glass on two opposing sides of a fractured slate rock block. Both of the fractures were 0.28 m in length and 0.21 m in width. Equivalent hydraulic apertures are about 110 microns for the rock fracture and 250 microns for the glass replica fracture. Sodium bromide and lissamine green B (LGB) serve as conservative tracers in the rock fracture and glass replica fracture, respectively. A dark box set-up with a light source and digital camera is being used to visualize the LGB and CMC-nZVI movement in the glass fracture. Experiments are being performed to determine the effects of water specific discharge and CMC concentration on nZVI transport in the fractures. Transmission electron microscopy, dynamic light scattering, and UV-visual spectrophotometry were performed to determine the stability and characteristics of the CMC-nZVI mixture. The transport of bromide, LGB, CMC, and CMC-nZVI in both fractures is being evaluated through analysis of the effluent concentrations. Time-lapse images are also being captured for the glass fracture. Bromide, LGB, and CMC recoveries have exceeded 95% in both fractures. Significant channeling has been observed in the fractures for CMC transport due to viscous effects.

  8. Identifying fecal matter contamination in produce fields using multispectral reflectance imaging under ambient solar illumination

    NASA Astrophysics Data System (ADS)

    Everard, Colm D.; Kim, Moon S.; Lee, Hoonsoo; O'Donnell, Colm P.

    2016-05-01

    An imaging device able to detect fecal contamination in fresh produce fields could allow producers to avoid harvesting fecally contaminated produce. E. coli O157:H7 outbreaks have been associated with fecally contaminated leafy greens. In this study, in-field spectral profiles of bovine fecal matter, soil, and spinach leaves are compared. A common-aperture imager designed with two identical monochromatic cameras, a beam splitter, and optical filters was used to simultaneously capture two spectral images of leaves contaminated with both fecal matter and soil. The optical filters were 10 nm full-width-half-maximum bandpass filters, one at 690 nm and the second at 710 nm, mounted in front of the objective lenses. New images were created from the ratio of the two spectral images on a pixel-by-pixel basis. Image analysis results showed that fecal matter contamination could be distinguished from soil and leaf in the ratio images. This technology has the potential to allow detection of fecal contamination in produce fields, which can be a source of foodborne illnesses, with the added benefit of mitigating cross-contamination during harvesting and processing.
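
    The pixel-by-pixel ratio step can be sketched as follows; the threshold value is illustrative only, not taken from the paper:

```python
import numpy as np

def ratio_image(band_690, band_710, eps=1e-6):
    """Pixel-by-pixel ratio of the two co-registered spectral images
    captured through the common-aperture imager (690 nm over 710 nm)."""
    return band_690.astype(float) / (band_710.astype(float) + eps)

def flag_contamination(ratio, threshold=0.8):
    """Healthy leaf reflectance rises steeply across the ~700 nm red edge,
    so leaf pixels give low 690/710 ratios while fecal matter and soil do
    not. The 0.8 threshold here is an illustrative value, not the paper's."""
    return ratio > threshold
```

    Because the two cameras share one aperture, the bands are geometrically registered and the ratio can be taken directly without image alignment.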

  9. Applications of superconducting bolometers in security imaging

    NASA Astrophysics Data System (ADS)

    Luukanen, A.; Leivo, M. M.; Rautiainen, A.; Grönholm, M.; Toivanen, H.; Grönberg, L.; Helistö, P.; Mäyrä, A.; Aikio, M.; Grossman, E. N.

    2012-12-01

    Millimeter-wave (MMW) imaging systems are currently being deployed worldwide for airport security screening. Security screening through MMW imaging is facilitated by the relatively good transmission of these wavelengths through common clothing materials. Given the long wavelengths of operation (frequencies between 20 GHz and ~100 GHz, corresponding to wavelengths between 1.5 cm and 3 mm), existing systems are suited for close-range imaging only, due to the substantial diffraction effects associated with practical aperture diameters. Present and emerging security challenges call for systems capable of imaging concealed threat items at stand-off ranges beyond 5 meters at near-video frame rates, requiring a substantial increase in operating frequency to achieve useful spatial resolution. The construction of such imaging systems operating at several hundred GHz has been hindered by the lack of submm-wave low-noise amplifiers. In this paper we summarize our efforts in developing a submm-wave video camera which utilizes cryogenic antenna-coupled microbolometers as detectors. While superconducting detectors require a cryogenic system, we argue that the resulting increase in back-end complexity is a favorable trade-off, in both performance and system cost, compared to complex and expensive room-temperature submm-wave LNAs.

  10. Supernova 2009kf: An Ultraviolet Bright Type IIP Supernova Discovered With Pan-Starrs 1 and Galex

    DTIC Science & Technology

    2010-07-01

    The 7 deg2 camera and 1.8 m aperture could allow IIP SNe to be used as cosmological probes at z ∼ 0.2 and the brightest events to be found out to z...ultraviolet (NUV). We discuss the implication of this rare SN for understanding the explosions and the use of Type IIP events for probing cosmology and...SFR at high redshifts. We adopt the cosmological parameters H0 = 70 km s−1 Mpc−1, ΩM = 0.3, ΩΛ = 0.7. 2. DISCOVERY AND OBSERVATIONAL DATA SN 2009kf

  11. VizieR Online Data Catalog: 1992-1997 binary star speckle measurements (Balega+, 1999)

    NASA Astrophysics Data System (ADS)

    Balega, I. I.; Balega, Y. Y.; Maksimov, A. F.; Pluzhnik, E. A.; Shkhagosheva, Z. U.; Vasyuk, V. A.

    2000-11-01

    We present the results of speckle interferometric measurements of binary stars made with the television photon-counting camera at the 6-m Big Azimuthal Telescope (BTA) and 1-m telescope of the Special Astrophysical Observatory (SAO) between August 1992 and May 1997. The data contain 89 observations of 62 star systems on the large telescope and 21 on the smaller one. For the 6-m aperture 18 systems remained unresolved. The measured angular separation ranged from 39 mas, two times above the BTA diffraction limit, to 1593 mas. (3 data files).
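
    The quoted 39 mas floor, "two times above the BTA diffraction limit," is consistent with the usual λ/D estimate; a quick check assuming visible-band observation at λ = 550 nm (the actual filter is not stated here):

```python
import math

MAS_PER_RAD = 180.0 / math.pi * 3600.0 * 1000.0  # milliarcseconds per radian

def diffraction_limit_mas(wavelength_m, aperture_m):
    """Rayleigh criterion 1.22*lambda/D, converted to milliarcseconds."""
    return 1.22 * wavelength_m / aperture_m * MAS_PER_RAD

limit = diffraction_limit_mas(550e-9, 6.0)  # roughly 23 mas for the 6-m BTA
```

    39 mas divided by ~23 mas gives about 1.7, i.e. roughly the factor of two quoted.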

  12. Binary star speckle measurements during 1992-1997 from the SAO 6-m and 1-m telescopes in Zelenchuk

    NASA Astrophysics Data System (ADS)

    Balega, I. I.; Balega, Y. Y.; Maksimov, A. F.; Pluzhnik, E. A.; Shkhagosheva, Z. U.; Vasyuk, V. A.

    1999-12-01

    We present the results of speckle interferometric measurements of binary stars made with the television photon-counting camera at the 6-m Big Azimuthal Telescope (BTA) and 1-m telescope of the Special Astrophysical Observatory (SAO) between August 1992 and May 1997. The data contain 89 observations of 62 star systems on the large telescope and 21 on the smaller one. For the 6-m aperture 18 systems remained unresolved. The measured angular separation ranged from 39 mas, two times above the BTA diffraction limit, to 1593 mas.

  13. Are camera surveys useful for assessing recruitment in white-tailed deer?

    Treesearch

    M. Colter Chitwood; Marcus A. Lashley; John C. Kilgo; Michael J. Cherry; L. Mike Conner; Mark Vukovich; H. Scott Ray; Charles Ruth; Robert J. Warren; Christopher S. DePerno; Christopher E. Moorman

    2017-01-01

    Camera surveys commonly are used by managers and hunters to estimate white-tailed deer Odocoileus virginianus density and demographic rates. Though studies have documented biases and inaccuracies in the camera survey methodology, camera traps remain popular due to ease of use, cost-effectiveness, and ability to survey large areas. Because recruitment is a key parameter...

  14. Energy, momentum and propagation of non-paraxial high-order Gaussian beams in the presence of an aperture

    NASA Astrophysics Data System (ADS)

    Stilgoe, Alexander B.; Nieminen, Timo A.; Rubinsztein-Dunlop, Halina

    2015-12-01

    Non-paraxial theories of wave propagation are essential to model the interaction of highly focused light with matter. Here we investigate the energy, momentum and propagation of the Laguerre-, Hermite- and Ince-Gaussian solutions (LG, HG, and IG) of the paraxial wave equation in an apertured non-paraxial regime. We investigate the far-field relationships between the LG, HG, and IG solutions and the vector spherical wave function (VSWF) solutions of the vector Helmholtz wave equation. We investigate the convergence of the VSWF and the various Gaussian solutions in the presence of an aperture. Finally, we investigate the differences in linear and angular momentum evaluated in the paraxial and non-paraxial regimes. The non-paraxial model we develop can be applied to calculations of the focusing of high-order Gaussian modes in high-resolution microscopes. We find that the addition of an aperture in high numerical aperture optical systems does not greatly affect far-field properties except when the beam is significantly clipped by an aperture. Diffraction from apertures causes large distortions in the near-field and will influence light-matter interactions. The method is not limited to a particular solution of the paraxial wave equation. Our model is constructed in a formalism that is commonly used in scattering calculations. It is thus applicable to optical trapping and other optical investigations of matter.

  15. Diffraction patterns in Fresnel approximation of periodic objects for a colorimeter of two apertures

    NASA Astrophysics Data System (ADS)

    Cortes-Reynoso, Jose-German R.; Suarez-Romero, Jose G.; Hurtado-Ramos, Juan B.; Tepichin-Rodriguez, Eduardo; Solorio-Leyva, Juan Carlos

    2004-10-01

    In this work, we present a study of Fresnel diffraction by periodic structures in an optical system of two apertures. This two-aperture system has been used successfully for measuring color in textile samples, solving the problems of illumination and directionality present in current commercial equipment. However, the system is sensitive to the spatial frequency of the periodic sample area enclosed in its optical field of view. The study of Fresnel diffraction allows us to establish criteria for the geometrical parameters of the measurements in order to ensure invariance under angular rotations and spatial positions. We use the theory of partial coherence to calculate the diffraction through two continuous apertures. In the calculation we use the point-spread function of the system for partial coherence, thereby avoiding the complicated statistical processes commonly used in partial coherence theory.

  16. Variable aperture-based ptychographical iterative engine method.

    PubMed

    Sun, Aihui; Kong, Yan; Meng, Xin; He, Xiaoliang; Du, Ruijun; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-02-01

    A variable aperture-based ptychographical iterative engine (vaPIE) is demonstrated both numerically and experimentally to reconstruct the sample phase and amplitude rapidly. By adjusting the size of a tiny aperture under parallel-beam illumination to change the illumination on the sample step by step, and recording the corresponding diffraction patterns sequentially, both the sample phase and amplitude can be faithfully reconstructed with a modified ptychographical iterative engine (PIE) algorithm. Since far fewer diffraction patterns are required than in common PIE, and the shape, size, and position of the aperture need not be known exactly, the proposed vaPIE method remarkably reduces the data acquisition time and makes PIE less dependent on the mechanical accuracy of the translation stage; the technique can therefore potentially be applied in various fields of scientific research. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  17. Improving land vehicle situational awareness using a distributed aperture system

    NASA Astrophysics Data System (ADS)

    Fortin, Jean; Bias, Jason; Wells, Ashley; Riddle, Larry; van der Wal, Gooitzen; Piacentino, Mike; Mandelbaum, Robert

    2005-05-01

    U.S. Army Research, Development, and Engineering Command (RDECOM) Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (NVESD) has performed early work to develop a Distributed Aperture System (DAS). The DAS aims to improve the situational awareness of armored fighting vehicle crews under closed-hatch conditions. The concept is based on a plurality of sensors configured to create a day-and-night dome of surveillance, coupled with heads-up displays slaved to the operator's head to give a "glass turret" feel. State-of-the-art image processing is used to produce multiple seamless hemispherical views simultaneously available to the vehicle commander, crew members, and dismounting infantry. On-the-move automatic cueing of multiple moving/pop-up low-silhouette threats is also performed, with the possibility to save, revisit, and share past events. As a first step in this development program, a contract was awarded to United Defense to further develop the Eagle VisionTM system. The second-generation prototype features two camera heads, each comprising four high-resolution (2048x1536) color sensors and each covering a field of view of 270°hx150°v. High-bandwidth digital links interface the camera heads with a field programmable gate array (FPGA) based custom processor developed by Sarnoff Corporation. The processor computes the hemispherical stitch and warp functions required for real-time, low-latency, immersive viewing (360°hx120°v, 30° down) and generates up to six simultaneous extended graphics array (XGA) video outputs for independent display, either on a helmet-mounted display (with associated head-tracking device) or on a flat panel display (with joystick). The prototype is currently in its last stage of development and will be integrated on a vehicle for user evaluation and testing.
    Near-term improvements include the replacement of the color camera heads with a pixel-level fused combination of uncooled long-wave infrared (LWIR) and low-light-level intensified imagery. It is believed that the DAS will significantly increase situational awareness by providing users with a day-and-night, wide-area-coverage, immersive visualization capability.

  18. Causes of cine image quality deterioration in cardiac catheterization laboratories.

    PubMed

    Levin, D C; Dunham, L R; Stueve, R

    1983-10-01

    Deterioration of cineangiographic image quality can result from malfunctions or technical errors at a number of points along the cine imaging chain: generator and automatic brightness control, x-ray tube, x-ray beam geometry, image intensifier, optics, cine camera, cine film, film processing, and cine projector. Such malfunctions or errors can result in loss of image contrast, loss of spatial resolution, improper control of film optical density (brightness), or some combination thereof. While the electronic and photographic technology involved is complex, physicians who perform cardiac catheterization should be conversant with the problems and what can be done to solve them. Catheterization laboratory personnel have control over a number of factors that directly affect image quality, including radiation dose rate per cine frame, kilovoltage or pulse width (depending on type of automatic brightness control), cine run time, selection of small or large focal spot, proper object-intensifier distance and beam collimation, aperture of the cine camera lens, selection of cine film, processing temperature, processing immersion time, and selection of developer.

  19. A novel optical system design of light field camera

    NASA Astrophysics Data System (ADS)

    Wang, Ye; Li, Wenhua; Hao, Chenyang

    2016-01-01

    The optical system of a light field camera usually adopts the structure main lens - micro lens array (MLA) - imaging sensor, in which the MLA is the most important part, collecting and recording the amplitude and phase information of the light field. In this paper, a novel optical system structure is proposed. The novel optical system is based on the 4f optical structure, and a micro-aperture array (MAA) is used instead of the MLA to acquire the 4D light field information. We analyze the principle by which the novel optical system acquires the light field information. A simple MAA, a line-grating optical system, is also designed in ZEMAX. The novel optical system is simulated with this line-grating optical system, multiple images are obtained in the image plane, and the imaging quality of the novel optical system is analyzed.

  20. Astronomical Polarimetry with the RIT Polarization Imaging Camera

    NASA Astrophysics Data System (ADS)

    Vorobiev, Dmitry V.; Ninkov, Zoran; Brock, Neal

    2018-06-01

    In the last decade, imaging polarimeters based on micropolarizer arrays have been developed for use in terrestrial remote sensing and metrology applications. Micropolarizer-based sensors are dramatically smaller and more mechanically robust than other polarimeters with similar spectral response and snapshot capability. To determine the suitability of these new polarimeters for astronomical applications, we developed the RIT Polarization Imaging Camera to investigate the performance of these devices, with special attention to the low signal-to-noise regime. We characterized the device performance in the lab by determining the relative throughput, efficiency, and orientation of every pixel as a function of wavelength. Using the resulting pixel response model, we developed demodulation procedures for aperture photometry and imaging polarimetry observing modes. We found that, using the current calibration, RITPIC is capable of detecting polarization signals as small as ∼0.3%. The relative ease of data collection, calibration, and analysis provided by these sensors suggests that they may become an important tool for a number of astronomical targets.
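    The demodulation step for a micropolarizer-array sensor can be sketched in a few lines. This is a generic illustration, not RITPIC's calibrated pipeline: it assumes an ideal 2x2 superpixel layout with polarizer orientations [[0°, 45°], [135°, 90°]] (actual layouts and per-pixel responses vary by device, which is exactly why the lab calibration described above is needed).

```python
import numpy as np

def demodulate_superpixels(frame):
    """Estimate the linear Stokes parameters from a micropolarizer frame.

    Assumes an ideal 2x2 superpixel layout with polarizer orientations
    [[0, 45], [135, 90]] degrees (illustrative only; real sensors need
    per-pixel calibration).
    """
    i0   = frame[0::2, 0::2].astype(float)
    i45  = frame[0::2, 1::2].astype(float)
    i135 = frame[1::2, 0::2].astype(float)
    i90  = frame[1::2, 1::2].astype(float)

    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # 0/90 difference
    s2 = i45 - i135                      # 45/135 difference

    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)  # degree of linear polarization
    aolp = 0.5 * np.arctan2(s2, s1)                        # angle of linear polarization
    return s0, dolp, aolp
```

    For fully polarized light at 0°, the 0° pixels read maximum intensity, the 90° pixels read zero, and the demodulation returns a degree of linear polarization of 1 at angle 0.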

  1. Real-time millimeter-wave imaging radiometer for avionic synthetic vision

    NASA Astrophysics Data System (ADS)

    Lovberg, John A.; Chou, Ri-Chee; Martin, Christopher A.

    1994-07-01

    ThermoTrex Corporation (TTC) has developed an imaging radiometer, the passive microwave camera (PMC), that uses an array of frequency-scanned antennas coupled to a multi-channel acousto-optic (Bragg cell) spectrum analyzer to form visible images of a scene through acquisition of thermal blackbody radiation in the millimeter-wave spectrum. The output of the Bragg cell is imaged by a standard video camera and passed to a computer for normalization and display at real-time frame rates. One application of this system could be its incorporation into an enhanced vision system to provide pilots with a clear view of the runway during fog and other adverse weather conditions. The unique PMC system architecture will allow compact large-aperture implementations because of its flat antenna sensor. Other potential applications include air traffic control, all-weather area surveillance, fire detection, and security. This paper describes the architecture of the TTC PMC and shows examples of images acquired with the system.

  2. LCD-based digital eyeglass for modulating spatial-angular information.

    PubMed

    Bian, Zichao; Liao, Jun; Guo, Kaikai; Heng, Xin; Zheng, Guoan

    2015-05-04

    Using a programmable aperture to modulate the spatial-angular information of a light field is well known in computational photography and microscopy. Inspired by this concept, we report a digital eyeglass design that adaptively modulates the light field entering human eyes. The main hardware includes a transparent liquid crystal display (LCD) and a mini-camera. The device analyzes the spatial-angular information of the camera image in real time and subsequently sends a command to form a corresponding pattern on the LCD. We show that the eyeglass prototype can adaptively reduce light transmission from bright sources by ~80% while retaining transparency to dimmer objects. One application of the reported device is to reduce discomforting glare caused by vehicle headlamps. To this end, we report the preliminary result of using the device in a road test. The reported device may also find applications in military operations (sniper scopes), laser countermeasures, STEM education, and enhancing visual contrast for visually impaired patients and elderly people with low vision.
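    The adaptive dimming step can be caricatured as follows. Everything here is illustrative rather than the authors' algorithm: the function name, the simple global threshold, and the assumption that camera pixels and LCD cells are already registered are all invented for the sketch.

```python
import numpy as np

def lcd_pattern(camera_img, threshold=0.8, attenuation=0.2):
    """Return a per-pixel LCD transmission map (hypothetical sketch):
    regions the camera sees as brighter than `threshold` (image normalized
    to 0..1) are dimmed to `attenuation` (an ~80% reduction, matching the
    prototype's figure); everything else stays fully transparent."""
    pattern = np.ones_like(camera_img, dtype=float)
    pattern[camera_img > threshold] = attenuation
    return pattern
```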

  3. James Webb Space Telescope (JWST) and Star Formation

    NASA Technical Reports Server (NTRS)

    Greene, Thomas P.

    2010-01-01

    The 6.5-m aperture James Webb Space Telescope (JWST) will be a powerful tool for studying and advancing numerous areas of astrophysics. Its Fine Guidance Sensor, Near-Infrared Camera, Near-Infrared Spectrograph, and Mid-Infrared Instrument will be capable of making very sensitive, high-angular-resolution imaging and spectroscopic observations spanning 0.7-28 μm in wavelength. These capabilities are very well suited for probing the conditions of star formation in the distant and local Universe. Indeed, JWST has been designed to detect first-light objects as well as to study the fine details of jets, disks, chemistry, envelopes, and the central cores of nearby protostars. We will be able to use its cameras, coronagraphs, and spectrographs (including multi-object and integral field capabilities) to study many aspects of star-forming regions throughout the galaxy, the Local Group, and more distant regions. I will describe the basic JWST scientific capabilities and illustrate a few ways in which they can be applied to star formation questions, with a focus on Galactic regions.

  4. A 90GHz Bolometer Camera Detector System for the Green Bank Telescope

    NASA Technical Reports Server (NTRS)

    Benford, Dominic J.; Allen, Christine A.; Buchanan, Ernest D.; Chen, Tina C.; Chervenak, James A.; Devlin, Mark J.; Dicker, Simon R.; Forgione, Joshua B.

    2004-01-01

    We describe a close-packed, two-dimensional imaging detector system for operation at 90GHz (3.3mm) for the 100 m Green Bank Telescope (GBT). This system will provide high sensitivity (<1 mJy in 1 s) rapid imaging (15'x15' to 250 microJy in 1 hr) at the world's largest steerable aperture. The heart of this camera is an 8x8 close-packed, Nyquist-sampled array of superconducting transition edge sensor bolometers. We have designed and are producing a functional superconducting bolometer array system using a monolithic planar architecture and high-speed multiplexed readout electronics. With an NEP of approx. 2 x 10(exp -17) W/square root Hz, the TES bolometers will provide fast, linear, sensitive response for high performance imaging. The detectors are read out by an 8x8 time domain SQUID multiplexer. A digital/analog electronics system has been designed to enable read out by SQUID multiplexers. First light for this instrument on the GBT is expected within a year.

  5. A 90GHz Bolometer Camera Detector System for the Green Bank Telescope

    NASA Technical Reports Server (NTRS)

    Benford, Dominic J.; Allen, Christine A.; Buchanan, Ernest; Chen, Tina C.; Chervenak, James A.; Devlin, Mark J.; Dicker, Simon R.; Forgione, Joshua B.

    2004-01-01

    We describe a close-packed, two-dimensional imaging detector system for operation at 90GHz (3.3 mm) for the 100m Green Bank Telescope (GBT). This system will provide high sensitivity (less than 1 mJy in 1 s) rapid imaging (15'x15' to 150 microJy in 1 hr) at the world's largest steerable aperture. The heart of this camera is an 8x8 close-packed, Nyquist-sampled array of superconducting transition edge sensor (TES) bolometers. We have designed and are producing a functional superconducting bolometer array system using a monolithic planar architecture and high-speed multiplexed readout electronics. With an NEP of approximately 2 x 10(exp -17) W/square root of Hz, the TES bolometers will provide fast, linear, sensitive response for high performance imaging. The detectors are read out by an 8x8 time domain SQUID multiplexer. A digital/analog electronics system has been designed to enable read out by SQUID multiplexers. First light for this instrument on the GBT is expected within a year.

  6. A high resolution IR/visible imaging system for the W7-X limiter

    NASA Astrophysics Data System (ADS)

    Wurden, G. A.; Stephey, L. A.; Biedermann, C.; Jakubowski, M. W.; Dunn, J. P.; Gamradt, M.

    2016-11-01

    A high-resolution imaging system, consisting of megapixel mid-IR and visible cameras along the same line of sight, has been prepared for the new W7-X stellarator and was operated during Operational Period 1.1 to view one of the five inboard graphite limiters. The radial line of sight, through a large diameter (184 mm clear aperture) uncoated sapphire window, couples a direct viewing 1344 × 784 pixel FLIR SC8303HD camera. A germanium beam-splitter sends visible light to a 1024 × 1024 pixel Allied Vision Technologies Prosilica GX1050 color camera. Both achieve sub-millimeter resolution on the 161 mm wide, inertially cooled, segmented graphite tiles. The IR and visible cameras are controlled via optical fibers over full Camera Link and dual GigE Ethernet (2 Gbit/s data rates) interfaces, respectively. While they are mounted outside the cryostat at a distance of 3.2 m from the limiter, they are close to a large magnetic trim coil and require soft iron shielding. We have taken IR data at 125 Hz to 1.25 kHz frame rates and observed surface temperature increases in excess of 350 °C, especially on leading edges or defect hot spots. The IR camera sees heat-load stripe patterns on the limiter and has been used to infer limiter power fluxes (~1-4.5 MW/m2) during the ECRH heating phase. IR images have also been used calorimetrically between shots to measure equilibrated bulk tile temperature, and hence tile energy inputs (in the range of 30 kJ/tile with 0.6 MW, 6 s heating pulses). Small UFOs can be seen and tracked by the FLIR camera in some discharges. The calibrated visible color camera (100 Hz frame rate) has also been equipped with narrow band C-III and H-alpha filters, to compare with other diagnostics, and is used for absolute particle flux determination from the limiter surface. Sometimes, but not always, hot-spots in the IR are also seen to be bright in C-III light.

  7. A high resolution IR/visible imaging system for the W7-X limiter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurden, G. A., E-mail: wurden@lanl.gov; Dunn, J. P.; Stephey, L. A.

    A high-resolution imaging system, consisting of megapixel mid-IR and visible cameras along the same line of sight, has been prepared for the new W7-X stellarator and was operated during Operational Period 1.1 to view one of the five inboard graphite limiters. The radial line of sight, through a large diameter (184 mm clear aperture) uncoated sapphire window, couples a direct viewing 1344 × 784 pixel FLIR SC8303HD camera. A germanium beam-splitter sends visible light to a 1024 × 1024 pixel Allied Vision Technologies Prosilica GX1050 color camera. Both achieve sub-millimeter resolution on the 161 mm wide, inertially cooled, segmented graphite tiles. The IR and visible cameras are controlled via optical fibers over full Camera Link and dual GigE Ethernet (2 Gbit/s data rates) interfaces, respectively. While they are mounted outside the cryostat at a distance of 3.2 m from the limiter, they are close to a large magnetic trim coil and require soft iron shielding. We have taken IR data at 125 Hz to 1.25 kHz frame rates and observed surface temperature increases in excess of 350 °C, especially on leading edges or defect hot spots. The IR camera sees heat-load stripe patterns on the limiter and has been used to infer limiter power fluxes (∼1–4.5 MW/m{sup 2}) during the ECRH heating phase. IR images have also been used calorimetrically between shots to measure equilibrated bulk tile temperature, and hence tile energy inputs (in the range of 30 kJ/tile with 0.6 MW, 6 s heating pulses). Small UFOs can be seen and tracked by the FLIR camera in some discharges. The calibrated visible color camera (100 Hz frame rate) has also been equipped with narrow band C-III and H-alpha filters, to compare with other diagnostics, and is used for absolute particle flux determination from the limiter surface. Sometimes, but not always, hot-spots in the IR are also seen to be bright in C-III light.

  8. Combined hostile fire and optics detection

    NASA Astrophysics Data System (ADS)

    Brännlund, Carl; Tidström, Jonas; Henriksson, Markus; Sjöqvist, Lars

    2013-10-01

    Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (short-wave infrared) camera-based system with the capability to detect and locate snipers both before and after a shot, over a large field of view. The high-frame-rate SWIR camera allows resolution of the temporal profile of muzzle flashes, the infrared signature associated with the ejection of the bullet from the rifle. The capability to detect and discriminate sniper muzzle flashes with this system was verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with a slit-shaped beam profile is scanned over the camera field of view to detect retro-reflections from optical sights. The optics detection system has been tested at distances up to 1.15 km, showing the feasibility of detecting rifle scopes in full daylight. The high-speed camera makes it possible to discriminate false alarms by analyzing the temporal data. The intensity variation caused by atmospheric turbulence enables discrimination of small sights from larger reflectors through aperture averaging, even though the targets cover only a single pixel. It is shown that optics detection can be integrated with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability provided by continuous surveillance of a relatively large field of view looks promising. This type of multifunctional system may become an important tool for detecting snipers before and after a shot.

  9. APT: Aperture Photometry Tool

    NASA Astrophysics Data System (ADS)

    Laher, Russ

    2012-08-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including image histogram, aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
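    The core aperture-plus-sky-annulus calculation that tools like APT automate can be sketched as follows. This is a minimal stand-in, not APT's actual source/sky models or outlier rejection: source counts are summed inside a circular aperture, and the median of a surrounding sky annulus estimates the local background.

```python
import numpy as np

def aperture_photometry(img, x0, y0, r_ap, r_in, r_out):
    """Circular-aperture photometry sketch: background-subtracted source
    counts inside radius r_ap, using the median sky level measured in the
    annulus r_in..r_out around (x0, y0)."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - x0, yy - y0)
    ap = r <= r_ap                      # source aperture mask
    sky = (r >= r_in) & (r <= r_out)    # sky annulus mask
    sky_level = np.median(img[sky])     # robust local background estimate
    intensity = img[ap].sum() - sky_level * ap.sum()
    return intensity, sky_level
```

    On a uniform background of 10 counts/pixel with a single 100-count source pixel at the aperture center, the routine recovers a background-subtracted flux of 100 and a sky level of 10.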

  10. Condenser for illuminating a ringfield camera with synchrotron emission light

    DOEpatents

    Sweatt, W.C.

    1996-04-30

    The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into a ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of a camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, that images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser is comprised of a plurality of beams with four mirrors corresponding to a single beam plus two common mirrors. 9 figs.

  11. Condenser for illuminating a ringfield camera with synchrotron emission light

    DOEpatents

    Sweatt, William C.

    1996-01-01

    The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into a ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of a camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, that images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser is comprised of a plurality of beams with four mirrors corresponding to a single beam plus two common mirrors.

  12. Common aperture multispectral sensor flight test program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, R.S.; Kaufman, C.S.

    1996-11-01

    This paper will provide an overview of the Common Aperture Multispectral Sensor (CAMS) Hardware Demonstrator. CAMS is a linescanning sensor that simultaneously collects digital imagery over the far-IR (8 to 12 μm) and visible (0.55 to 1.1 μm) spectral bands, correlated at the pixel level. CAMS was initially sponsored by the U.S. Naval Air Systems Command's F/A-18 program office (PMA-265). The current CAMS field tests are under the direction of Northrop-Grumman for the Defense Nuclear Agency (DNA) in support of the Follow-On Open Skies Sensor Evaluation Program (FOSEP) and are scheduled to be conducted in April 1996. 8 figs., 4 tabs.

  13. Characterization of lunar surface materials for use in construction

    NASA Technical Reports Server (NTRS)

    Johnson, Stewart W.; Burns, Jack O.

    1992-01-01

    The Workshop on the Concept of a Common Lunar Lander, which was held at the NASA Johnson Space Center on July 1 and 2, 1991, discussed potential payloads to be placed on the Moon by a common, generic, unmanned, vehicle beginning late in this decade. At this workshop, a variety of payloads were identified including a class of one-meter (and larger) optical telescopes to operate on the lunar surface. These telescopes for lunar-based astronomy are presented in an earlier section of this report. The purpose of this section is to suggest that these and other payloads for the Common Lunar Lander be used to facilitate technology development for the proposed 16-meter Aperture UV/Visible/IR Large Lunar Telescope (LLT) and a large optical aperture-synthesis instrument analogous to the Very Large Array of the National Radio Astronomy Observatory.

  14. Characterising rock fracture aperture-spacing relationships using power-law relationships: some considerations

    NASA Astrophysics Data System (ADS)

    Brook, Martin; Hebblewhite, Bruce; Mitra, Rudrajit

    2016-04-01

    The size-scaling of rock fractures is a well-studied problem in geology, especially for permeability quantification. The intensity of fractures may control the economic exploitation of fractured reservoirs, because fracture intensity describes the abundance of fractures potentially available for fluid flow. Moreover, in geotechnical engineering, fractures are important for the parameterisation of stress models and excavation design. As fracture data are often collected from widely spaced boreholes where core recovery is often incomplete, accurate interpretation and representation of fracture aperture-frequency relationships from sparse datasets is important. Fracture intensity is the number of fractures encountered per unit length along a sample scanline oriented perpendicular to the fractures in a set. Cumulative frequency of fractures (F) is commonly related to fracture aperture (A) in the form of a power law (F = aA^(-b)), with variations in the coefficient a between sites interpreted as differences in fracture frequency for a given aperture (A). However, a common flaw in this approach is that even a small change in b can have a large effect on the fracture frequency (F). We compare fracture data from the Late Permian Rangal Coal Measures of Australia's Bowen Basin with fracture data from Jurassic carbonates of the Sierra Madre Oriental, northeastern Mexico. The power-law coefficient a and exponent b control the fracture aperture-frequency relationship in conjunction with each other; that is, power laws with relatively low a coefficients have relatively high b exponents, and vice versa. Hence, any comparison of different power laws must take both a and b into consideration. The corollary is that different sedimentary beds in the Sierra Madre carbonates do not show ~8x the fracture frequency for a given fracture aperture, as based solely on the comparison of coefficient a. 
Rather, power-law "sensitivity factors" developed from both Sierra Madre and the Bowen Basin span similar ranges, indicating that the factor of increase in frequency (F) for a doubling of aperture size (A) shows similar relationships and variability from both sites. Despite their limitations, we conclude that fracture aperture-frequency power-law relationships are valid and, when interpreted carefully, provide a useful basis for comparing rock fracture distributions across different sites.
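    The arithmetic behind this caution is easy to make concrete. The sketch below evaluates the power law F = aA^(-b) and the factor by which F changes when the aperture doubles; the numerical values in the usage are invented for illustration, not taken from either field site.

```python
def fracture_frequency(a, b, aperture):
    """Cumulative fracture frequency from the power law F = a * A**(-b)."""
    return a * aperture ** (-b)

def doubling_factor(b):
    """Factor by which F changes when the aperture A doubles: 2**(-b).

    This kind of 'sensitivity factor' depends only on the exponent b,
    which is why comparing sites on the coefficient a alone, while
    ignoring b, can be misleading."""
    return 2.0 ** (-b)

# Illustration: a low-a/high-b law and a high-a/low-b law can predict
# similar frequencies over part of the measured aperture range even
# though their a coefficients differ greatly.
```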

  15. Extended depth of field integral imaging using multi-focus fusion

    NASA Astrophysics Data System (ADS)

    Piao, Yongri; Zhang, Miao; Wang, Xiaohui; Li, Peihua

    2018-03-01

    In this paper, we propose a new method for depth-of-field extension in integral imaging by applying image fusion to multi-focus elemental images. In the proposed method, a camera is translated on a 2D grid to capture multi-focus elemental images while sweeping the focus plane across the scene. Simply applying an image fusion method to the elemental images, which hold rich parallax information, does not work effectively, because accurate registration of the images is a prerequisite for image fusion. To solve this problem, an elemental image generalization method is proposed. The aim of this generalization process is to geometrically align the objects in all elemental images so that the correct regions of the multi-focus elemental images can be extracted. The all-in-focus elemental images are then generated by fusing the generalized elemental images using a block-based fusion method. The experimental results demonstrate that the depth of field of the synthetic aperture integral imaging system is extended by combining the generalization method with image fusion of the multi-focus elemental images.
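    A block-based fusion step of the kind described can be sketched as below. The variance focus measure and the block size are illustrative choices (the paper does not specify them here), and the input images are assumed already aligned, which is exactly what the generalization step provides.

```python
import numpy as np

def fuse_multifocus(stack, block=8):
    """Toy block-based multi-focus fusion: for each block, keep the pixels
    from whichever image in the stack has the highest local variance, a
    simple focus measure. Assumes all images are registered."""
    h, w = stack[0].shape
    fused = np.zeros((h, w), dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tiles = [img[y:y+block, x:x+block] for img in stack]
            best = int(np.argmax([t.var() for t in tiles]))  # sharpest tile
            fused[y:y+block, x:x+block] = tiles[best]
    return fused
```

    Given one flat (defocused-looking) image and one textured (in-focus-looking) image, every block is taken from the textured one, since its local variance is higher everywhere.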

  16. Tomographic Aperture-Encoded Particle Tracking Velocimetry: A New Approach to Volumetric PIV

    NASA Astrophysics Data System (ADS)

    Troolin, Dan; Boomsma, Aaron; Lai, Wing; Pothos, Stamatios; Fluid Mechanics Research Instruments Team

    2016-11-01

    Volumetric velocity fields are useful in a wide variety of fluid mechanics applications. Several types of three-dimensional imaging methods have been used in the past with varying degrees of success, for example, 3D PTV (Maas et al., 1993), DDPIV (Pereira et al., 2006), Tomographic PIV (Elsinga, 2006), and V3V (Troolin and Longmire, 2009), among others. Each of these techniques has shown advantages and disadvantages in different areas. With the advent of higher-resolution, lower-noise cameras with higher stability levels, new techniques are emerging that combine the advantages of the existing techniques. This talk describes a new technique called Tomographic Aperture-Encoded Particle Tracking Velocimetry (TAPTV), in which segmented triangulation and diameter tolerance are used to achieve three-dimensional particle tracking with extremely high particle densities (on the order of ppp = 0.2 or higher) without the drawbacks normally associated with ghost particles (for example, in TomoPIV). The result is highly spatially resolved data with very fast processing times. A detailed explanation of the technique as well as plots, movies, and experimental considerations will be discussed.

  17. Three-dimensional motion-picture imaging of dynamic object by parallel-phase-shifting digital holographic microscopy using an inverted magnification optical system

    NASA Astrophysics Data System (ADS)

    Fukuda, Takahito; Shinomura, Masato; Xia, Peng; Awatsuji, Yasuhiro; Nishio, Kenzo; Matoba, Osamu

    2017-04-01

    We constructed a parallel-phase-shifting digital holographic microscopy (PPSDHM) system using an inverted magnification optical system, and succeeded in three-dimensional (3D) motion-picture imaging of the 3D displacement of a microscopic object. In the PPSDHM system, the inverted, afocal magnification optical system consisted of a microscope objective (16.56 mm focal length and 0.25 numerical aperture) and a convex lens (300 mm focal length and 82 mm aperture diameter). A polarization-imaging camera was used to record multiple phase-shifted holograms in a single-shot exposure. We recorded an alum crystal sinking in an aqueous solution of alum with the constructed PPSDHM system at 60 frames/s for about 20 s, and reconstructed a high-quality 3D motion picture of the crystal. We then calculated the displacements of the crystal from its positions in the focus plane and the magnifications of the magnification optical system, and from these obtained the 3D trajectory of the crystal.

  18. Incoherent digital holograms acquired by interferenceless coded aperture correlation holography system without refractive lenses.

    PubMed

    Kumar, Manoj; Vijayakumar, A; Rosen, Joseph

    2017-09-14

    We present a lensless, interferenceless incoherent digital holography technique based on the principle of coded aperture correlation holography. The digital hologram acquired by this technique contains a three-dimensional image of the observed scene. Light diffracted by a point object (pinhole) is modulated using a random-like coded phase mask (CPM), and the intensity pattern is recorded and composed as a point spread hologram (PSH). A library of PSHs is created using the same CPM by moving the pinhole to all possible axial locations. Intensity diffracted through the same CPM from an object placed within the axial limits of the PSH library is recorded by a digital camera; the recorded intensity is this time composed as the object hologram. The image of the object at any axial plane is reconstructed by cross-correlating the object hologram with the corresponding component of the PSH library. The reconstruction noise attached to the image is suppressed by various methods. The reconstruction results of multiplane and thick objects obtained by this technique are compared with regular lens-based imaging.
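    The reconstruction step amounts to a 2D cross-correlation, which can be sketched with FFTs. This is only the bare correlation; the paper's filtering and noise-suppression refinements are omitted, and the circular-shift behavior of FFT correlation is accepted as-is.

```python
import numpy as np

def reconstruct_plane(object_hologram, psh):
    """Cross-correlate the object hologram with the point spread hologram
    (PSH) recorded for one axial plane; the correlation magnitude peaks
    where the scene content matches that plane's PSH."""
    OH = np.fft.fft2(object_hologram)
    PSH = np.fft.fft2(psh)
    corr = np.fft.ifft2(OH * np.conj(PSH))   # circular cross-correlation
    return np.fft.fftshift(np.abs(corr))     # center the zero-lag term
```

    As a sanity check, correlating a PSH against a shifted copy of itself produces a peak displaced from the center by exactly the applied shift.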

  19. Image degradation characteristics and restoration based on regularization for diffractive imaging

    NASA Astrophysics Data System (ADS)

    Zhi, Xiyang; Jiang, Shikai; Zhang, Wei; Wang, Dawei; Li, Yun

    2017-11-01

    The diffractive membrane optical imaging system is an important development trend for ultra-large-aperture, lightweight space cameras. However, physics-based diffractive imaging degradation characteristics and the corresponding image restoration methods have received comparatively little study. In this paper, the image quality degradation model for the diffractive imaging system is first deduced mathematically from diffraction theory, and the degradation characteristics are analyzed. On this basis, a novel regularization model of image restoration that contains multiple prior constraints is established. An approach is then presented for solving the resulting equation, in which multiple norms and multiple regularization (prior) parameters coexist. Subsequently, a space-variant-PSF image restoration method for large-aperture diffractive imaging systems is proposed, combined with a block decomposition into isoplanatic regions. Experimentally, the proposed algorithm demonstrates its capacity to achieve multi-objective improvement, including MTF enhancement, dispersion correction, noise and artifact suppression, and detail preservation, and produces satisfactory visual quality. This provides a scientific basis for future space applications of diffractive membrane imaging technology.
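    As a baseline for this kind of regularized restoration, a single-prior Tikhonov model has a closed-form solution per spatial frequency. The paper's multi-prior, space-variant model is far richer; the sketch below only illustrates the simplest case, assuming a known, spatially invariant, centered PSF.

```python
import numpy as np

def tikhonov_deconvolve(blurred, psf, lam=1e-2):
    """Frequency-domain Tikhonov-regularized restoration: the minimizer of
    ||h * x - y||^2 + lam * ||x||^2 is, per frequency,
    X = conj(H) Y / (|H|^2 + lam). psf is assumed centered and the same
    size as the image."""
    H = np.fft.fft2(np.fft.ifftshift(psf))   # PSF spectrum (origin at [0,0])
    Y = np.fft.fft2(blurred)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(X))
```

    With a delta PSF (no blur) and a tiny lam, the restoration returns the input essentially unchanged, which makes a convenient correctness check.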

  20. A laboratory verification sensor

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H.

    1988-01-01

    The use of a variant of the Hartmann test to sense the coalignment of the 36 primary mirror segments of the Keck 10-meter Telescope is described. The Shack-Hartmann alignment camera is a surface-tilt-error-sensing device, operable with high sensitivity over a wide range of tilt errors. An interferometer, on the other hand, is a surface-height-error-sensing device. In general, if the surface height error exceeds a few wavelengths of the incident illumination, an interferogram is difficult to interpret and loses utility. The Shack-Hartmann alignment camera is, therefore, likely to be attractive as a development tool for segmented mirror telescopes, particularly at early stages of development in which the surface quality of developmental segments may be too poor to justify interferometric testing. The constraints that would define the first-order properties of a Shack-Hartmann alignment camera are examined, and the precision and range of measurement one could expect to achieve with it are investigated. Fundamental constraints arise from consideration of geometrical imaging, diffraction, and the density of sampling of images at the detector array. Geometrical imaging determines the linear size of the image, and depends on the primary mirror diameter and the f-number of a lenslet. Diffraction is another constraint; it depends on the lenslet aperture. Finally, the sampling density at the detector array is important since the number of pixels in the image determines how accurately the centroid of the image can be measured. When these factors are considered under realistic assumptions, it is apparent that the first-order design of a Shack-Hartmann alignment camera is completely determined by the constraints considered, and that in the case of a 20-meter telescope with seeing-limited imaging, such a camera, used with a suitable detector array, will achieve useful precision.
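    The centroid measurement at the heart of the sampling-density constraint can be sketched as follows: a minimal intensity-weighted centroid of one lenslet spot, with a synthetic Gaussian spot standing in for a real detector patch (all sizes here are illustrative assumptions).

```python
import numpy as np

def spot_centroid(img):
    # Intensity-weighted centroid (x, y) of one lenslet spot; its shift
    # from the reference position measures the local wavefront tilt.
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

# Synthetic spot: a Gaussian sampled on a 32x32 detector patch.
yy, xx = np.indices((32, 32))
spot = np.exp(-((xx - 12.3) ** 2 + (yy - 8.7) ** 2) / (2 * 2.0 ** 2))
cx, cy = spot_centroid(spot)
```

With adequate pixel sampling the centroid is recovered to a small fraction of a pixel, which is what sets the achievable tilt precision.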

  1. A Simple Approach to Collecting Useful Wildlife Data Using Remote Camera-Traps in Undergraduate Biology Courses

    ERIC Educational Resources Information Center

    Christensen, David R.

    2016-01-01

    Remote camera-traps are commonly used to estimate the abundance, diversity, behavior and habitat use of wildlife in an inexpensive and nonintrusive manner. Because of the increasing use of remote-cameras in wildlife studies, students interested in wildlife biology should be exposed to the use of remote-cameras early in their academic careers.…

  2. The Fate of Colloidal Swarms in Fractures

    NASA Astrophysics Data System (ADS)

    Pyrak-Nolte, L. J.; Olander, M. K.

    2009-12-01

    In the next 10-20 years, nano- and micro-sensor engineering will advance to the stage where sensor swarms could be deployed in the subsurface to probe rock formations and the fluids contained in them. Sensor swarms are groups of nano- or micro- sensors that are maintained as a coherent group to enable either sensor-to-sensor communication and/or coherent transmission of information as a group. The ability to maintain a swarm of sensors depends on the complexity of the flow paths in the rock, on the size and shape of the sensors and on the chemical interaction among the sensors, fluids, and rock surfaces. In this study, we investigate the effect of fracture aperture and fluid currents on the formation, evolution and break-up of colloidal swarms under gravity. Transparent cubic samples (100 mm x 100 mm x 100 mm) containing synthetic fractures with uniform and non-uniform aperture distributions were used to quantify the effect of aperture on swarm formation, swarm velocity, and swarm geometry using optical imaging. A fracture with a uniform aperture distribution was fabricated from two polished rectangular prisms of acrylic. A fracture with a non-uniform aperture distribution was created with a polished rectangular acrylic prism and an acrylic replica of an induced fracture surface from a carbonate rock. A series of experiments were performed to determine how swarm movement and geometry are affected as the walls of the fracture are brought closer together from 50 mm to 1 mm. During the experiments, the fracture was fully saturated with water. We created the swarms using two different particle sizes in dilute suspension (~ 1.0% by mass) . The particles were 3 micron diameter fluorescent polymer beads and 25 micron diameter soda-lime glass beads. The swarm behavior was imaged using an optical fluorescent imaging system composed of a CCD camera illuminated by a 100 mW diode-pumped doubled YAG laser. 
A swarm was created when an approximately 0.01 g drop of the suspension was released under gravity into the water. The swarm density is slightly greater than that of water, and the swarm falls faster than the terminal velocity of an individual particle in water. The cohesiveness of the swarm was maintained over 50 mm to 95 mm of travel even in the presence of fluid currents. The swarm velocity decreased with decreasing fracture aperture. When the apertures are small, swarms break up and reform as they pass through a variable aperture fracture. Acknowledgment: The authors wish to acknowledge support of this work by the Geosciences Research Program, Office of Basic Energy Sciences, US Department of Energy (DE-FG02-09ER16022) and the Summer Undergraduate Research Fellowship program at Purdue University.
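    The single-particle baseline mentioned above (a swarm falls faster than an individual particle's terminal velocity) is the Stokes settling speed. A quick estimate for the 25-micron soda-lime beads, assuming a bead density of 2500 kg/m³ and water at 1.0 mPa·s (both assumed values, not from the abstract):

```python
def stokes_terminal_velocity(radius, rho_p, rho_f, mu, g=9.81):
    # Stokes settling speed v = 2 r^2 (rho_p - rho_f) g / (9 mu)
    # for a single small sphere in a viscous fluid (SI units).
    return 2.0 * radius ** 2 * (rho_p - rho_f) * g / (9.0 * mu)

# 25-micron-diameter soda-lime bead (assumed 2500 kg/m^3) in water.
v = stokes_terminal_velocity(12.5e-6, 2500.0, 1000.0, 1.0e-3)
```

This gives roughly half a millimeter per second, so a coherent swarm falling faster than this is readily distinguished in the imaging described.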

  3. ESTADIUS: A High Motion "One Arcsec" Daytime Attitude Estimation System for Stratospheric Applications

    NASA Astrophysics Data System (ADS)

    Montel, J.; Andre, Y.; Mirc, F.; Etcheto, P.; Evrard, J.; Bray, N.; Saccoccio, M.; Tomasini, L.; Perot, E.

    2015-09-01

    ESTADIUS is an autonomous, accurate, daytime attitude estimation system for stratospheric balloons that require a high level of attitude measurement and stability. The system has been developed by CNES. ESTADIUS is based on star sensor and gyrometer data fusion within an extended Kalman filter. The star sensor is composed of a 16 Mpixel visible CCD camera and a large-aperture camera lens (focal length of 135 mm, aperture f/1.8, 10ºx15º field of view or FOV) which provides very accurate star measurements due to the very small angular size of a pixel. This also allows stars to be detected against a bright sky background. The gyrometer is a 0.01º/h performance class Fiber Optic Gyroscope (FOG). The system is adapted to work down to an altitude of ~25 km, even under highly dynamic conditions. Key elements of ESTADIUS are: daytime use (as well as night time), autonomy (automatic recognition of constellations), high angular rate robustness (a few deg/s, thanks to the high performance of attitude propagation), stray-light robustness (thanks to a high-performance baffle), and high accuracy (<1", 1σ). Four stratospheric qualification flights were performed very successfully in 2010/2011 and 2013/2014 in Kiruna (Sweden) and Timmins (Canada). ESTADIUS will allow long stratospheric flights with a unique attitude estimation system, avoiding the restriction of night/day conditions at launch. The first operational flight of ESTADIUS will be in 2015 for the PILOT scientific mission (led by IRAP and CNES in France). Further balloon missions such as CIDRE will use the system. ESTADIUS is probably the first autonomous, large-FOV, daytime stellar attitude measurement system. This paper details the technical features and in-flight results.
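    The star sensor/gyrometer fusion can be illustrated with a drastically simplified, single-axis Kalman filter (ESTADIUS itself uses a full 3-D extended Kalman filter; all rates, noise levels and timings below are invented for the demo): the gyro propagates the angle between star fixes, and each star measurement corrects both the angle and the gyro bias.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 0.01, 2000
omega_true, bias_true = 0.1, 0.02            # rad/s (hypothetical values)

# State [angle, gyro bias]; propagate with the gyro, correct with star fixes.
F = np.array([[1.0, -dt], [0.0, 1.0]])
Q = np.diag([1e-8, 1e-10])                   # process noise
R = 1e-6                                     # star measurement variance (rad^2)
H = np.array([[1.0, 0.0]])

x = np.zeros(2)                              # initial estimate: zero angle, zero bias
P = np.eye(2) * 1e-2
theta_true = 0.0

for k in range(n_steps):
    theta_true += omega_true * dt
    gyro = omega_true + bias_true + rng.normal(0.0, 1e-3)
    # Propagation: integrate the bias-corrected gyro rate.
    x = np.array([x[0] + (gyro - x[1]) * dt, x[1]])
    P = F @ P @ F.T + Q
    if k % 10 == 0:                          # a star fix every 10 steps
        z = theta_true + rng.normal(0.0, 1e-3)
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = (P @ H.T) / S                    # Kalman gain
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
```

The filter converges to both the true angle and the gyro bias, which is what lets the real system ride out high angular rates between star identifications.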

  4. Developments of a multi-wavelength spectro-polarimeter on the Domeless Solar Telescope at Hida Observatory

    NASA Astrophysics Data System (ADS)

    Anan, Tetsu; Huang, Yu-Wei; Nakatani, Yoshikazu; Ichimoto, Kiyoshi; UeNo, Satoru; Kimura, Goichi; Ninomiya, Shota; Okada, Sanetaka; Kaneda, Naoki

    2018-05-01

    To obtain full Stokes spectra in multi-wavelength windows simultaneously, we developed a new spectro-polarimeter on the Domeless Solar Telescope at Hida Observatory. The new polarimeter consists of a 60 cm aperture vacuum telescope on an altazimuth mounting, an image rotator, a high-dispersion spectrograph, and a polarization modulator and an analyzer composed of a continuously rotating waveplate with a retardation that is nearly constant at around 127° in 500-1100 nm. There are also a polarizing beam splitter located close behind the focus of the telescope, fast and large format CMOS cameras, and an infrared camera. A slit spectrograph allows us to obtain spectra in as many wavelength windows as the number of cameras. We characterized the instrumental polarization of the entire system and established a polarization calibration procedure. The cross-talks among the Stokes Q, U, and V have been evaluated to be about 0.06%-1.2%, depending on the degree of the intrinsic polarizations. In a typical observing setup, a sensitivity of 0.03% can be achieved in 20-60 seconds for 500-1100 nm. The new polarimeter is expected to provide a powerful tool for diagnosing the 3D magnetic field and other vector physical quantities in the solar atmosphere.

  5. An effective rectification method for lenselet-based plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Jin, Jing; Cao, Yiwei; Cai, Weijia; Zheng, Wanlu; Zhou, Ping

    2016-10-01

    The lenselet-based plenoptic camera has recently drawn considerable attention in the field of computational photography. The additional information inherent in the light field allows a wide range of applications, but some preliminary processing of the raw image is necessary before further operations. In this paper, an effective method is presented for the rotation rectification of the raw image. The rotation is caused by the imperfect positioning of the micro-lens array relative to the sensor plane in commercially available Lytro plenoptic cameras. The key to our method is locating the center of each micro-lens image. Because of vignetting, the pixel values at the center of a micro-lens image are higher than those at its periphery. A mask is applied to probe the micro-lens image and locate the center area by finding the local maximum response. The error of the center coordinate estimate is corrected, and the angle of rotation is computed via a subsequent line fitting. The algorithm was performed on two images captured by different Lytro cameras. The angles of rotation are -0.3600° and -0.0621° respectively, and the rectified raw image is useful and reliable for further operations, such as extraction of the sub-aperture images. The experimental results demonstrate that our method is efficient and accurate.
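    The final line-fitting step can be sketched simply: given the detected centers of one row of micro-lens images, fit a straight line and read the rotation off its slope. The centers below are synthetic (generated at one of the angles the abstract reports), and the ~14-pixel lenslet pitch is an assumption for the demo.

```python
import numpy as np

def rotation_angle_deg(centers):
    # Fit a straight line through the centers of one row of micro-lens
    # images; its slope gives the array's rotation relative to the sensor.
    xs, ys = centers[:, 0], centers[:, 1]
    slope, _ = np.polyfit(xs, ys, 1)
    return np.degrees(np.arctan(slope))

# Synthetic row of centers rotated by -0.36 degrees:
xs = np.arange(0.0, 1000.0, 14.0)            # assumed ~14-px lenslet pitch
ys = np.tan(np.radians(-0.36)) * xs + 5.0
angle = rotation_angle_deg(np.column_stack([xs, ys]))
```

In practice the fit is run over many rows of detected centers, which averages down the per-center localization error.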

  6. Model of an optical system's influence on sensitivity of microbolometric focal plane array

    NASA Astrophysics Data System (ADS)

    Gogler, Sławomir; Bieszczad, Grzegorz; Zarzycka, Alicja; Szymańska, Magdalena; Sosnowski, Tomasz

    2012-10-01

    Thermal imagers and the infrared array sensors used in them undergo a calibration procedure, and their voltage sensitivity to incident radiation is evaluated, during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represent temperature variations across the scene. The detectors used in a thermal camera are illuminated by infrared radiation transmitted through a specialized optical system, and each optical system influences the irradiation distribution across the sensor array. In this article a model is proposed that describes the irradiation distribution across an array sensor working with the optical system used in the calibration set-up. The method takes the optical and geometrical properties of the array set-up into account. By means of Monte-Carlo simulation, a large number of rays was traced to the sensor plane, which allowed the irradiation distribution across the image plane to be determined for different aperture-limiting configurations. The simulated results have been compared with a proposed analytical expression. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
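    The kind of analytical expression such a Monte-Carlo result is compared against is, in the simplest thin-lens case, the classical cos⁴ falloff of image-plane irradiance with field angle. This closed-form stand-in (not the paper's actual model) shows the off-axis roll-off that drives non-uniformity correction:

```python
import numpy as np

def relative_irradiance(x, y, focal_length):
    # cos^4 falloff of image-plane irradiance with field angle:
    # E(theta) = E0 * cos(theta)^4 for a simple unvignetted thin lens.
    theta = np.arctan(np.hypot(x, y) / focal_length)
    return np.cos(theta) ** 4

E_center = relative_irradiance(0.0, 0.0, 50.0)   # on-axis pixel
E_corner = relative_irradiance(50.0, 0.0, 50.0)  # 45 degrees off-axis
```

At 45° off-axis the irradiance drops to a quarter of the on-axis value, illustrating why an aperture-dependent model is needed before sensitivity calibration.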

  7. Improved depth estimation with the light field camera

    NASA Astrophysics Data System (ADS)

    Wang, Huachun; Sang, Xinzhu; Chen, Duo; Guo, Nan; Wang, Peng; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu

    2017-10-01

    Light-field cameras are used in consumer and industrial applications. An array of micro-lenses captures enough information that one can refocus images after acquisition, as well as shift one's viewpoint within the sub-apertures of the main lens, effectively obtaining multiple views. Thus, depth estimation from both defocus and correspondence is available in a single capture. Lytro, Inc. also provides a depth estimate from a single-shot capture with its light field cameras, such as the Lytro Illum. This Lytro depth estimate, which contains much correct depth information, can be used for a higher quality estimation. In this paper, we present a novel, simple and principled algorithm that computes dense depth estimates by combining the defocus, correspondence and Lytro depth estimates. We analyze 2D epipolar images (EPIs) to get defocus and correspondence depth maps: defocus depth is obtained by computing the spatial gradient after angular integration, and correspondence depth by computing the angular variance from the EPIs. The Lytro depth can be extracted from the Lytro Illum with software. We then show how to combine the three cues into a high quality depth map. Our method for depth estimation is suitable for computer vision applications such as matting, full control of depth-of-field, and surface reconstruction, as well as light field displays.
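    The correspondence cue — angular variance over a sheared EPI — can be demonstrated on a tiny synthetic 1-D example (integer disparities only, with a randomly textured scene row standing in for real data; not the authors' implementation):

```python
import numpy as np

def angular_variance(epi, slope):
    # Shear the EPI by an integer `slope` (pixel shift per view) and
    # average the variance along the angular axis: the correspondence
    # cue, minimized when the shear matches the true disparity.
    sheared = np.stack([np.roll(row, -slope * u) for u, row in enumerate(epi)])
    return sheared.var(axis=0).mean()

rng = np.random.default_rng(1)
n_views, width = 9, 128
scene = rng.random(width)                    # one textured scene row
true_slope = 2                               # disparity (px/view) encodes depth
epi = np.stack([np.roll(scene, true_slope * u) for u in range(n_views)])

best = min(range(-4, 5), key=lambda s: angular_variance(epi, s))
```

The defocus cue is the complementary operation: integrate over the angular axis at each candidate shear and look for the maximal spatial gradient instead of the minimal variance.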

  8. Design of a nano-satellite demonstrator of an infrared imaging space interferometer: the HyperCube

    NASA Astrophysics Data System (ADS)

    Dohlen, Kjetil; Vives, Sébastien; Rakotonimbahy, Eddy; Sarkar, Tanmoy; Tasnim Ava, Tanzila; Baccichet, Nicola; Savini, Giorgio; Swinyard, Bruce

    2014-07-01

    The construction of a kilometer-baseline far infrared imaging interferometer is one of the big instrumental challenges for astronomical instrumentation in the coming decades. Recent proposals such as FIRI, SPIRIT, and PFI illustrate both science cases, from exo-planetary science to study of interstellar media and cosmology, and ideas for construction of such instruments, both in space and on the ground. An interesting option for an imaging multi-aperture interferometer with km baseline is the space-based hyper telescope (HT) where a giant, sparsely populated primary mirror is constituted of several free-flying satellites each carrying a mirror segment. All the segments point the same object and direct their part of the pupil towards a common focus where another satellite, containing recombiner optics and a detector unit, is located. In Labeyrie's [1] original HT concept, perfect phasing of all the segments was assumed, allowing snap-shot imaging within a reduced field of view and coronagraphic extinction of the star. However, for a general purpose observatory, a posteriori image reconstruction using closure phases is possible as long as the pupil is fully non-redundant. Such reconstruction allows for much reduced alignment tolerances, since optical path length control is only required to within several tens of wavelengths, rather than within a fraction of a wavelength. In this paper we present preliminary studies for such an instrument and plans for building a miniature version to be flown on a nano satellite. A design for the recombiner optics, including a scheme for exit pupil re-organization, is proposed, indicating that the focal-plane satellite of a km-baseline interferometer could be contained within a 1 m³ unit.
Different options for the realization of a miniature version are presented, including instruments for solar observations in the visible and the thermal infrared and giant-planet observations in the visible, and an algorithm for the design of an optimal aperture layout based on least-squares minimization is described. A first experimental setup realized by master's students is presented, in which a 20 mm baseline interferometer with 1 mm apertures associated with a thermal infrared camera was pointed at the sun. The absence of fringes in this setup is discussed in terms of spatial spectrum analysis. Finally, we discuss the satellite pointing requirements for such a miniature interferometer.
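    The non-redundancy condition the abstract relies on (closure-phase reconstruction requires every pairwise baseline to be unique) is easy to check for a candidate aperture layout. The coordinates below are hypothetical sub-aperture positions in arbitrary units:

```python
from itertools import combinations

def is_nonredundant(positions):
    # True when every pairwise baseline (vector separation) is unique:
    # the condition for closure-phase image reconstruction.
    baselines = [(bx - ax, by - ay)
                 for (ax, ay), (bx, by) in combinations(positions, 2)]
    return len(baselines) == len(set(baselines))

# A Golomb-ruler layout has all-distinct baselines; an evenly
# spaced layout does not.
golomb = [(0, 0), (1, 0), (4, 0), (6, 0)]
regular = [(0, 0), (1, 0), (2, 0), (3, 0)]
```

A least-squares layout optimizer, as mentioned in the abstract, would use a test like this as a hard constraint while shaping the baseline coverage.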

  9. Particle Swarms in Fractures: Open Versus Partially Closed Systems

    NASA Astrophysics Data System (ADS)

    Boomsma, E.; Pyrak-Nolte, L. J.

    2014-12-01

    In the field, fractures may be isolated or connected to fluid reservoirs anywhere along the perimeter of a fracture. These boundaries affect fluid circulation, flow paths and communication with external reservoirs. The transport of drop-like collections of colloidal-sized particles (particle swarms) in open and partially closed systems was studied. A uniform-aperture synthetic fracture was constructed using two blocks (100 x 100 x 50 mm) of transparent acrylic placed parallel to each other. The fracture was fully submerged in a tank filled with 100 cSt silicone oil. Fracture apertures were varied from 5-80 mm. Partially closed systems were created by sealing the sides of the fracture with plastic film. The four boundary conditions studied were: (Case 1) open, (Case 2) closed on the sides, (Case 3) closed on the bottom, and (Case 4) closed on both the sides and bottom of the fracture. A 15 μL dilute suspension of soda-lime glass particles in oil (2% by mass) was released into the fracture. Particle swarms were illuminated using a green (525 nm) LED array and imaged with a CCD camera. The presence of the additional boundaries modified the speed of the particle swarms (see figure). In Case 1, enhanced swarm transport was observed for a middle range of apertures, with swarms traveling faster than in either very small or very large apertures. In Case 2, swarm velocities were enhanced over a larger range of fracture apertures than in any of the other cases. Case 3 shifted the enhanced transport regime to smaller apertures and also reduced swarm speed compared to Case 2. Finally, Case 4 eliminated the enhanced transport regime entirely. Communication between the fluid in the fracture and an external fluid reservoir resulted in enhanced swarm transport in Cases 1-3. The non-rigid nature of a swarm enables drag from the fracture walls to modify the swarm geometry. The particles composing a swarm reorganize in response to the fracture, elongating the swarm and maintaining its density.
Unlike the case of a drop or a solid sphere, fracture boundaries do not exclusively decelerate swarm motion but can instead produce enhanced swarm transport. Acknowledgments: This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Geosciences Research Program under Award Number DE-FG02-09ER16022.

  10. Person re-identification over camera networks using multi-task distance metric learning.

    PubMed

    Ma, Lianyang; Yang, Xiaokang; Tao, Dacheng

    2014-08-01

    Person reidentification in a camera network is a valuable yet challenging problem to solve. Existing methods learn a common Mahalanobis distance metric by using the data collected from different cameras and then exploit the learned metric for identifying people in the images. However, the cameras in a camera network have different settings and the recorded images are seriously affected by variability in illumination conditions, camera viewing angles, and background clutter. Using a common metric to conduct person reidentification tasks on different camera pairs overlooks the differences in camera settings; however, it is very time-consuming to label people manually in images from surveillance videos. For example, in most existing person reidentification data sets, only one image of a person is collected from each of only two cameras; therefore, directly learning a unique Mahalanobis distance metric for each camera pair is susceptible to over-fitting given such insufficiently labeled data. In this paper, we reformulate person reidentification in a camera network as a multitask distance metric learning problem. The proposed method designs multiple Mahalanobis distance metrics to cope with the complicated conditions that exist in typical camera networks. We address the fact that these Mahalanobis distance metrics are different but related, and learn them by adding joint regularization to alleviate over-fitting. Furthermore, by extending this formulation, we present a novel multitask maximally collapsing metric learning (MtMCML) model for person reidentification in a camera network. Experimental results demonstrate that formulating person reidentification over camera networks as a multitask distance metric learning problem can improve performance, and our proposed MtMCML works substantially better than other current state-of-the-art person reidentification methods.
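    The building block of all these methods is the (squared) Mahalanobis distance between feature vectors; in the multi-task setting each camera pair learns its own positive semi-definite matrix M, regularized toward the others. A minimal sketch of the distance itself:

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    # Squared Mahalanobis distance d_M(x, y) = (x - y)^T M (x - y),
    # with M positive semi-definite; M = I recovers squared Euclidean.
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(d @ M @ d)

# Sanity check with the identity metric on toy 2-D features:
d2 = mahalanobis_sq([1.0, 2.0], [0.0, 0.0], np.eye(2))
```

Learning then amounts to choosing each M so that distances between images of the same person shrink while distances between different people grow.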

  11. Dual FOV infrared lens design with the laser common aperture optics

    NASA Astrophysics Data System (ADS)

    Chang, Wei-jun; Zhang, Xuan-zhi; Luan, Ya-dong; Zhang, Bo

    2015-02-01

    To meet the demand for autonomous precision guidance of air-defense missiles, a system scheme for an IR imaging/Ladar dual-mode seeker with a common aperture was proposed, and the optical system used in it was designed. The system has a common receiving aperture and a very compact structure, so it can meet the requirement for miniaturization of the seeker. It also meets the demands of a wide field of view for searching for the target, and of accurately recognizing and tracking the target, at the same time. To increase narrow-FOV tracking performance, the dual-FOV infrared optics use a zooming mode in which some components flip into or out of the optical system to form the target signal. The dual-FOV optics are divided into a zooming part, with dual variable focal lengths, and a reimaging part, which was chosen in such a way as to minimize the objective lens while maintaining 100% cold-shield efficiency. The final infrared optics, covering 4°×3° (NFOV) and 16°×12° (WFOV), were designed. The NFOV path is composed of two common IR/Ladar lenses, three relay lenses, a beam splitter and two reflective fold mirrors, while the WFOV path adds two further lenses, one of germanium and one of silicon. The common IR/Ladar lenses, made of ZnS and ZnSe, refract both the IR and laser beams. The beam splitter, which transmits the IR beam and reflects the laser beam, is located between the germanium and silicon lenses. The designed optical system has good image quality and fulfills the performance requirements of the seeker system.

  12. SU-E-J-17: A Study of Accelerator-Induced Cerenkov Radiation as a Beam Diagnostic and Dosimetry Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bateman, F; Tosh, R

    2014-06-01

    Purpose: To investigate accelerator-induced Cerenkov radiation imaging as a possible beam diagnostic and medical dosimetry tool. Methods: Cerenkov emission produced by clinical accelerator beams in a water phantom was imaged using a camera system comprised of a high-sensitivity thermoelectrically-cooled CCD camera coupled to a large aperture (f/0.75) objective lens with 16:1 magnification. This large format lens allows a significant amount of the available Cerenkov light to be collected and focused onto the CCD camera to form the image. Preliminary images, obtained with 6 MV photon beams, used an unshielded camera mounted horizontally with the beam normal to the water surface, and confirmed the detection of Cerenkov radiation. Several improvements were subsequently made including the addition of radiation shielding around the camera, and altering of the beam and camera angles to give a more favorable geometry for Cerenkov light collection. A detailed study was then undertaken over a range of electron and photon beam energies and dose rates to investigate the possibility of using this technique for beam diagnostics and dosimetry. Results: A series of images were obtained at a fixed dose rate over a range of electron energies from 6 to 20 MeV. The location of maximum intensity was found to vary linearly with the energy of the beam. A linear relationship was also found between the light observed from a fixed point on the central axis and the dose rate for both photon and electron beams. Conclusion: We have found that the analysis of images of beam-induced Cerenkov light in a water phantom has potential for use as a beam diagnostic and medical dosimetry tool. Our future goals include the calibration of the light output in terms of radiation dose and development of a tomographic system for 3D Cerenkov imaging in water phantoms and other media.

  13. Effect of camera resolution and bandwidth on facial affect recognition.

    PubMed

    Cruz, Mario; Cruz, Robyn Flaum; Krupinski, Elizabeth A; Lopez, Ana Maria; McNeeley, Richard M; Weinstein, Ronald S

    2004-01-01

    This preliminary study explored the effect of camera resolution and bandwidth on facial affect recognition, an important process and clinical variable in mental health service delivery. Sixty medical students and mental health-care professionals were recruited and randomized to four different combinations of commonly used teleconferencing camera resolutions and bandwidths: (1) a one-chip charge-coupled device (CCD) camera, commonly used for VHS-grade taping and in teleconferencing systems costing less than $4,000, with a resolution of 280 lines, at a bandwidth of 128 kilobits per second (kbps); (2) VHS and 768 kbps; (3) a three-chip CCD camera, commonly used for Betacam (Beta) grade taping and in teleconferencing systems costing more than $4,000, with a resolution of 480 lines, at 128 kbps; and (4) Betacam and 768 kbps. The subjects were asked to identify four facial affects dynamically presented on videotape by an actor and actress via a video monitor at 30 frames per second. Two-way analysis of variance (ANOVA) revealed a significant interaction effect for camera resolution and bandwidth (p = 0.02) and a significant main effect for camera resolution (p = 0.006), but no main effect for bandwidth was detected. Post hoc testing of interaction means, using the Tukey Honestly Significant Difference (HSD) test and the critical difference (CD) at the 0.05 alpha level (CD = 1.71), revealed that subjects in the VHS/768 kbps (M = 7.133) and VHS/128 kbps (M = 6.533) conditions were significantly better at recognizing the displayed facial affects than those in the Betacam/768 kbps (M = 4.733) or Betacam/128 kbps (M = 6.333) conditions. Camera resolution and bandwidth combinations differ in their capacity to influence facial affect recognition. For service providers, this study's results support the use of VHS cameras with either 768 kbps or 128 kbps bandwidths for facial affect recognition, compared to Betacam cameras.
The authors argue that the results of this study are a consequence of the VHS camera resolution/bandwidth combinations' ability to improve signal detection (i.e., facial affect recognition) by subjects in comparison to Betacam camera resolution/bandwidth combinations.

  14. Gamma-Ray Imaging Probes.

    NASA Astrophysics Data System (ADS)

    Wild, Walter James

    1988-12-01

    External nuclear medicine diagnostic imaging of early primary and metastatic lung cancer tumors is difficult due to the poor sensitivity and resolution of existing gamma cameras. Nonimaging counting detectors used for internal tumor detection give ambiguous results because distant background variations are difficult to discriminate from neighboring tumor sites. This suggests that an internal imaging nuclear medicine probe, particularly an esophageal probe, may be advantageously used to detect small tumors because of the ability to discriminate against background variations and the capability to get close to sites neighboring the esophagus. The design, theory of operation, preliminary bench tests, characterization of noise behavior, and optimization of such an imaging probe are the central theme of this work. The central concept lies in the representation of the aperture shell by a sequence of binary digits. This, coupled with the mode of operation, which is data encoding within an axial slice of space, leads to the fundamental imaging equation in which the coding operation is conveniently described by a circulant matrix operator. The coding/decoding process is a classic coded-aperture problem, and various estimators to achieve decoding are discussed. Some estimators require a priori information about the object (or object class) being imaged; the only unbiased estimator that does not impose this requirement is the simple inverse-matrix operator. The effects of noise on the estimate (or reconstruction) are discussed for general noise models and various codes/decoding operators. The choice of an optimal aperture for detector count times of clinical relevance is examined using a statistical class-separability formalism.
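    The circulant coding/decoding described above can be sketched in a few lines: the binary aperture code generates a circulant matrix that encodes the source distribution, and the inverse-matrix operator (the unbiased estimator mentioned in the abstract) decodes it. The length-7 code below is a quadratic-residue pattern chosen here as an example of an invertible code, not the probe's actual aperture.

```python
import numpy as np

def circulant(code):
    # Circulant coding matrix: row k is the aperture code rotated by k,
    # mirroring the binary-shell representation of the aperture.
    return np.array([np.roll(code, k) for k in range(len(code))])

# Length-7 quadratic-residue code (1s at positions 1, 2, 4).
code = np.array([0, 1, 1, 0, 1, 0, 0], dtype=float)
C = circulant(code)

f = np.arange(1.0, 8.0)        # hypothetical axial source distribution
g = C @ f                      # encoded detector counts
f_hat = np.linalg.solve(C, g)  # unbiased inverse-matrix decoding
```

With noisy counts the inverse-matrix estimate remains unbiased but its variance depends on the code's spectrum, which is why the abstract weighs alternative estimators and code choices.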

  15. Process for producing laser-formed video calibration markers.

    PubMed

    Franck, J B; Keller, P N; Swing, R A; Silberberg, G G

    1983-08-15

    A process for producing calibration markers directly on the photoconductive surface of video camera tubes has been developed. The process uses a Nd:YAG laser operating at 1.06 μm with a 9.5-nsec pulse width (full width at half-maximum). The laser was constrained to operate in the TEM(00) spatial mode by intracavity aperturing. The use of this technology has produced an increase of up to 50 times in the accuracy of geometric measurement. This is accomplished by a decrease in geometric distortion and an increase in geometric scaling. The process by which these laser-formed video calibration markers are made is discussed.

  16. SHIELD: FITGALAXY -- A Software Package for Automatic Aperture Photometry of Extended Sources

    NASA Astrophysics Data System (ADS)

    Marshall, Melissa

    2013-01-01

    Determining the parameters of extended sources, such as galaxies, is a common but time-consuming task. Finding a photometric aperture that encompasses the majority of the flux of a source, and identifying and excluding contaminating objects, is often done by hand - a lengthy and difficult-to-reproduce process. To make extracting information from large data sets both quick and repeatable, I have developed a program called FITGALAXY, written in IDL. This program uses minimal user input to automatically fit an aperture to, and perform aperture and surface photometry on, an extended source. FITGALAXY also automatically traces the outlines of surface brightness thresholds and creates surface brightness profiles, which can then be used to determine the radial properties of a source. Finally, the program performs automatic masking of contaminating sources. Masks and apertures can be applied to multiple images (regardless of the WCS solution or plate scale) in order to accurately measure the same source at different wavelengths. I present the fluxes, as measured by the program, of a selection of galaxies from the Local Volume Legacy Survey. I then compare these results with the fluxes given by Dale et al. (2009) in order to assess the accuracy of FITGALAXY.
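    The core operation of any such tool — summing flux inside a fitted aperture — reduces to a pixel mask. This toy sketch (plain numpy, not FITGALAXY's IDL code) measures a uniform synthetic disk source:

```python
import numpy as np

def aperture_flux(img, cx, cy, r):
    # Sum of pixel values inside a circular aperture centered on (cx, cy).
    ys, xs = np.indices(img.shape)
    inside = (xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2
    return img[inside].sum()

# Toy frame: a uniform disk source of value 2.0 and radius 3 at (30, 30).
img = np.zeros((64, 64))
yy, xx = np.indices(img.shape)
img[(xx - 30) ** 2 + (yy - 30) ** 2 <= 9] = 2.0

flux = aperture_flux(img, 30, 30, 5)   # aperture fully encloses the source
```

Real photometry adds background subtraction (typically from an annulus) and the contamination masking the abstract describes, but the aperture sum itself is this simple.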

  17. Design, Fabrication, Optical Testing, and Performance of Diamond Machined Aspheric Mirrors for Ground-Based Near-IR Astronomy

    NASA Technical Reports Server (NTRS)

    Ohl, Raymond G.; Mink, Ronald; Chambers, V. John; Connelly, Joseph A.; Mentzell, J. Eric; Tveekrem, June L.; Howard, Joseph M.; Preuss, Werner; Schroeder, Mechthild; Sohn, Alex; hide

    2002-01-01

    Challenges in fabrication and testing have historically limited the choice of surfaces available for the design of reflective optical instruments. Spherical and conic mirrors are common, but, for future science instruments, more degrees of freedom are necessary to meet challenging performance and packaging requirements. These instruments will be composed of unusual aspheres located far off-axis with large spherical departure, and some designs will require asymmetric surface profiles. In particular, single-surface astigmatism correction in spectrographs necessitates a toroidal surface, which lacks an axis of rotational symmetry. We describe the design, fabrication, optical testing, and performance of three rotationally symmetric, off-axis, aspheric mirrors and one toroidal, off-axis, biconic camera mirror on aluminum substrates for the Infrared Multi-Object Spectrograph (IRMOS) instrument. IRMOS is a facility instrument for the Kitt Peak National Observatory's Mayall Telescope (3.8 m) and an engineering prototype for a possible design of the Next Generation Space Telescope/Multi-Object Spectrograph. The symmetric mirrors range in aperture from 94x86 mm to 286x269 mm and in f-number from 0.9 to 2.4. They are various off-axis, convex and concave, prolate and oblate ellipsoids. The concave biconic mirror has a 94x76 mm aperture, Rx=377 mm, kx=0.0778, Ry=407 mm, and ky=0.1265 and is decentered by -2 mm in x and 227 mm in y. The mirrors have an aspect ratio of approximately 4:1. The surface error fabrication tolerances are less than 63.3 nm RMS figure error and less than 10 nm RMS microroughness. The mirrors are attached to the instrument bench via a semi-kinematic, integral flexure mount. We describe mirror design, diamond machining, the results of figure testing using computer-generated holograms, and imaging and scattered light modeling and performance.

  18. Demonstrations of Optical Spectra with a Video Camera

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2012-01-01

    The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera…

  19. Automatic exposure for panoramic systems in uncontrolled lighting conditions: a football stadium case study

    NASA Astrophysics Data System (ADS)

    Gaddam, Vamsidhar Reddy; Griwodz, Carsten; Halvorsen, Pål

    2014-02-01

    One of the most common ways of capturing wide field-of-view scenes is by recording panoramic videos. Using an array of cameras with limited overlap in the corresponding images, one can generate good panorama images. Using the panorama, several immersive display options can be explored. There is a twofold synchronization problem associated with such a system. One is temporal synchronization, but this challenge can easily be handled by using a common triggering solution to control the shutters of the cameras. The other is automatic exposure synchronization, which does not have a straightforward solution, especially in a wide-area scenario where the light conditions are uncontrolled, as in the case of an open, outdoor football stadium. In this paper, we present the challenges and approaches for creating a completely automatic real-time panoramic capture system with a particular focus on the camera settings. One of the main challenges in building such a system is that there is no single area of the pitch visible to all the cameras that can be used for metering the light in order to find appropriate camera parameters. One approach we tested is to use the green color of the field grass. Such an approach provided us with acceptable results only in limited light conditions. A second approach was devised where the overlapping areas between adjacent cameras are exploited, thus creating pairs of perfectly matched video streams. However, there still existed some disparity between different pairs. We finally developed an approach where the time between two temporal frames is exploited to communicate the exposures among the cameras, with which we achieve a perfectly synchronized array. An analysis of the system and some experimental results are presented in this paper. In summary, a pilot-camera approach running in auto-exposure mode and then distributing the used exposure values to the other cameras seems to give the best visual results.
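    The pilot-camera scheme described in the summary can be sketched as follows; the metering constant K = 12.5 and the dictionary-based "broadcast" are illustrative assumptions, not details from the paper:

```python
import math

def metered_ev(scene_luminance, iso=100, k=12.5):
    """Pilot camera: exposure value from average scene luminance using the
    standard reflected-light metering relation L = K * 2^EV / ISO."""
    return math.log2(scene_luminance * iso / k)

def broadcast_exposure(camera_ids, ev):
    """Distribute the pilot's EV so every camera in the array applies the
    same shutter/gain settings on the next common frame trigger."""
    return {cam: ev for cam in camera_ids}

ev = metered_ev(12.5, iso=100)  # average field luminance of 12.5 cd/m^2
settings = broadcast_exposure(["cam%d" % i for i in range(5)], ev)
```

    Because all shutters are already triggered in common, piggybacking the EV update on the inter-frame gap keeps the array exposure-synchronized without a shared metering region.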

  20. On Variable Geometric Factor Systems for Top-Hat Electrostatic Space Plasma Analyzers

    NASA Technical Reports Server (NTRS)

    Kataria, Dhiren O.; Collinson, Glyn A.

    2010-01-01

    Even in the relatively small region of space that is the Earth's magnetosphere, ion and electron fluxes can vary by several orders of magnitude. Top-hat electrostatic analyzers currently do not possess the dynamic range required to sample plasma under all conditions. The purpose of this study was to compare, through computer simulation, three new electrostatic methods that would allow the sensitivity of a sensor to be varied through control of its geometric factor (GF) (much like an aperture on a camera). The methods studied were inner filter plates, split hemispherical analyzer (SHA) and top-cap electrode. This is the first discussion of the filter plate concept and also the first study where all three systems are studied within a common analyzer design, so that their relative merits could be fairly compared. Filter plates were found to have the important advantage that they facilitate the reduction in instrument sensitivity whilst keeping all other instrument parameters constant. However, it was discovered that filter plates have numerous disadvantages that make such a system impracticable for a top-hat electrostatic analyzer. It was found that both the top-cap electrode and SHA are promising variable geometric factor system (VGFS) concepts for implementation into a top-hat electrostatic analyzer, each with distinct advantages over the other.

  1. Large-mirror testing facility at the National Optical Astronomy Observatories

    NASA Astrophysics Data System (ADS)

    Coudé du Foresto, V.; Fox, J.; Poczulp, G. A.; Richardson, J.; Roddier, Claude; Roddier, Francois; Barr, L. D.

    1991-09-01

    A method for testing the surfaces of large mirrors has been developed to be used even when conditions of vibration and thermal turbulence in the light path cannot be eliminated. The full aperture of the mirror under test is examined by means of a scatterplate interferometer, which has the property of being a quasi-common-path method, although any means for obtaining interference fringes can be used. By operating the test equipment remotely, the optician does not cause unnecessary vibrations or heat in the testing area. The typical test is done with a camera exposure of about a millisecond to 'freeze' the fringe pattern on the detector. Averaging up to 10 separate exposures effectively eliminates the turbulence effects. From the intensity information, a phase map of the wavefront reflected from the surface is obtained using a phase-unwrapping technique. The method provides the optician with complete numerical information and visual plots for the surface under test and the diffracted image the mirror will produce, to an accuracy of 0.01 micron measured peak-to-valley. The method has been extensively used for a variety of tests of a 1.8-m-diameter borosilicate-glass honeycomb mirror, where it was shown to have a sensitivity equal to a Foucault test.
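    The phase-unwrapping step mentioned above can be illustrated in one dimension; this is the textbook 2π jump-removal rule, not necessarily the facility's implementation:

```python
import math

def unwrap(phases):
    """Remove 2*pi ambiguities from a wrapped phase sequence: shift each
    step so the change between neighbours stays within (-pi, pi]."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))  # fold step into (-pi, pi]
        out.append(out[-1] + d)
    return out

# A steadily increasing phase, wrapped into (-pi, pi] as an interferometer
# would report it, then recovered:
true = [0.5 * i for i in range(15)]
wrapped = [math.atan2(math.sin(p), math.cos(p)) for p in true]
recovered = unwrap(wrapped)
```

    Two-dimensional phase maps need more care (branch cuts, noisy pixels), but the same principle applies along each unwrapping path.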

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, K; University of Western Ontario, London, ON, CA

    Purpose: Film densitometers optimized for radiochromic EBT3 film are not commercially available, and flatbed document scanners are limited by broad spectral overlap of the red, green, and blue channels and by polarization effects. Non-polarized, narrow-bandwidth, intensity-stabilized light-emitting diode (LED) light sources were constructed, and transmission measurements were compared for point, slot, and full-aperture geometries. Methods: LEDs with 10 nm bandwidths were focused to 1 mm spots for point scanning. The LEDs back-illuminated a 6 mm diameter diffuser near the focal point of a 25 cm diameter Fresnel lens for slot and full-aperture transmission imaging. Films were located at the aperture plane, 15 cm from the lens, and imaged with a 16-bit digital camera with a 50 mm lens. Sheets, 10×8 cm, were irradiated to 25 Gy with a 4×4 cm 6 MV photon beam. Transmission measurements with a 633 nm, linearly polarized laser were used as a low-acceptance-angle reference geometry. Results: LED intensity stabilized to <1% within one minute of startup. The useful transmission range increased as the acceptance angle decreased: laser (~2%) > spot (~3%) > slot (~5%) > full aperture (~10%). The Fresnel lens introduced a 4-fold polarization symmetry that increased radially, causing a 2% change in transmission at 10 cm from the optic axis for EBT3 film. Polarized spot densitometer and laser transmission were similar. Conclusion: Spectrally filtered LED sources were effective for transmission measurements of radiochromic films. Because of the polarization sensitivity of EBT3 film, the scanning-spot geometry is the most general for measuring unpolarized transmission of larger film sizes, i.e., >15×15 cm. Due to EBT3 film scatter, spot scanning should provide the most accurate profiles of high dose gradients, for a given acceptance angle. The spot-scanning densitometer provided useful transmission to ~5, 25, and >25 Gy with 635, 590, and 530 nm light, respectively.

  3. A Simple Spectrophotometer Using Common Materials and a Digital Camera

    ERIC Educational Resources Information Center

    Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal

    2011-01-01

    A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a…

  4. Compressive Coded-Aperture Multimodal Imaging Systems

    NASA Astrophysics Data System (ADS)

    Rueda-Chacon, Hoover F.

    Multimodal imaging refers to the framework of capturing images that span different physical domains such as space, spectrum, depth, time, polarization, and others. For instance, spectral images are modeled as 3D cubes with two spatial and one spectral coordinate. Three-dimensional cubes spanning just the space domain are referred to as depth volumes. Imaging cubes varying in time, spectrum, or depth are referred to as 4D images. Nature itself spans different physical domains, so imaging our real world demands capturing information in at least 6 different domains simultaneously, giving rise to 3D-spatial+spectral+polarized dynamic sequences. Conventional imaging devices, however, can capture dynamic sequences with up to 3 spectral channels in real time by the use of color sensors. Capturing more spectral channels requires scanning methodologies, which demand long acquisition times. In general, to-date multimodal imaging requires a sequence of different imaging sensors, placed in tandem, to simultaneously capture the different physical properties of a scene. Different fusion techniques are then employed to merge all the individual information into a single image. Therefore, new ways to efficiently capture more than 3 spectral channels of 3D time-varying spatial information, in a single sensor or a few sensors, are of high interest. Compressive spectral imaging (CSI) is an imaging framework that seeks to optimally capture spectral imagery (tens of spectral channels of 2D spatial information) using fewer measurements than required by traditional sensing procedures, which follow Shannon-Nyquist sampling. Instead of capturing direct one-to-one representations of natural scenes, CSI systems acquire linear random projections of the scene and then solve an optimization algorithm to estimate the 3D spatio-spectral data cube by exploiting the theory of compressive sensing (CS).
To date, the coding procedure in CSI has been realized through the use of "block-unblock" coded apertures, commonly implemented as chrome-on-quartz photomasks. These apertures block or transmit the entire spectrum from the scene at given spatial locations, thus modulating the spatial characteristics of the scene. In the first part, this thesis aims to expand the framework of CSI by replacing the traditional block-unblock coded apertures with patterned optical filter arrays, referred to as "color" coded apertures. These apertures are formed by tiny pixelated optical filters, which in turn allow the input image to be modulated not only spatially but spectrally as well, entailing more powerful coding strategies. The proposed colored coded apertures are either synthesized through linear combinations of low-pass, high-pass, and band-pass filters, paired with binary pattern ensembles realized by a digital micromirror device (DMD), or experimentally realized through thin-film color-patterned filter arrays. The optical forward model of the proposed CSI architectures is presented along with the design and proof-of-concept implementations, which achieve noticeable improvements in the quality of the reconstructions compared with conventional block-unblock coded-aperture-based CSI architectures. On another front, due to the rich information contained in the infrared spectrum as well as the depth domain, this thesis aims to explore multimodal imaging by extending the range sensitivity of current CSI systems to a dual-band visible+near-infrared spectral domain, and it also proposes, for the first time, a new imaging device that captures 4D data cubes (2D spatial + 1D spectral + depth) with as few as a single snapshot. Due to the snapshot advantage of this camera, video sequences are possible, thus enabling the joint capture of 5D imagery. It aims to create super-human sensing that will enable the perception of our world in new and exciting ways.
With this, we intend to advance the state of the art in compressive sensing systems to extract depth while accurately capturing spatial and spectral material properties. The applications of such a sensor are self-evident in fields such as computer/robotic vision, because they would allow an artificial intelligence to make informed decisions about not only the location of objects within a scene but also their material properties.
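    The block-unblock versus "color" coded-aperture distinction can be captured in a toy forward model (a sketch only: the spectral shearing present in real CSI systems is omitted, and all dimensions and codes are arbitrary):

```python
import random

def csi_measurement(cube, code):
    """Toy snapshot measurement for coded-aperture spectral imaging: each
    detector pixel integrates the spectrum at its location, weighted
    band-by-band by the coded aperture. A code that is constant across
    bands at each pixel models a classic block-unblock (chrome-on-quartz)
    aperture; letting it vary per band models a 'color' patterned-filter
    aperture. Real CSI systems also shear the cube spectrally before
    integration; that step is omitted here."""
    rows, cols, bands = len(cube), len(cube[0]), len(cube[0][0])
    return [[sum(cube[r][c][b] * code[r][c][b] for b in range(bands))
             for c in range(cols)] for r in range(rows)]

random.seed(1)
cube = [[[random.random() for _ in range(8)] for _ in range(4)] for _ in range(4)]
# Binary block-unblock aperture: the same 0/1 weight for every band at a pixel.
block = [[[random.randint(0, 1)] * 8 for _ in range(4)] for _ in range(4)]
y = csi_measurement(cube, block)
```

    Reconstruction then amounts to solving a sparsity-regularized inverse problem for the cube given many fewer measurements than voxels.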

  5. Portraiture lens concept in a mobile phone camera

    NASA Astrophysics Data System (ADS)

    Sheil, Conor J.; Goncharov, Alexander V.

    2017-11-01

    A small form-factor lens was designed for the purpose of portraiture photography, the size of which allows use within smartphone casing. The current general requirement of mobile cameras having good all-round performance results in a typical, familiar, many-element design. Such designs have little room for improvement, in terms of the available degrees of freedom and highly-demanding target metrics such as low f-number and wide field of view. However, the specific application of the current portraiture lens relaxed the requirement of an all-round high-performing lens, allowing improvement of certain aspects at the expense of others. With a main emphasis on reducing depth of field (DoF), the current design takes advantage of the simple geometrical relationship between DoF and pupil diameter. The system has a large aperture, while a reasonable f-number gives a relatively large focal length, requiring a catadioptric lens design with double ray path; hence, field of view is reduced. Compared to typical mobile lenses, the large diameter reduces depth of field by a factor of four.
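    The "simple geometrical relationship between DoF and pupil diameter" can be made concrete with a thin-lens approximation; the focal length, pupil diameters, subject distance, and circle-of-confusion values below are illustrative, not the paper's:

```python
def total_dof(focal_mm, pupil_mm, subject_mm, coc_mm=0.005):
    """Approximate total depth of field, 2*c*s^2/(f*D), obtained from the
    thin-lens DoF formula 2*N*c*s^2/f^2 with N = f/D, valid for subject
    distance s much larger than focal length f."""
    return 2 * coc_mm * subject_mm**2 / (focal_mm * pupil_mm)

# Quadrupling the pupil diameter at fixed focal length cuts DoF fourfold,
# which is the portraiture effect the design targets:
shallow = total_dof(6.0, 8.0, 1500.0)  # large-aperture portraiture lens
deep = total_dof(6.0, 2.0, 1500.0)     # typical small mobile pupil
```

    The catadioptric double ray path is what lets a physically large pupil and long effective focal length fit inside a smartphone-thickness package.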

  6. Optical Characterization of the SPT-3G Camera

    NASA Astrophysics Data System (ADS)

    Pan, Z.; Ade, P. A. R.; Ahmed, Z.; Anderson, A. J.; Austermann, J. E.; Avva, J. S.; Thakur, R. Basu; Bender, A. N.; Benson, B. A.; Carlstrom, J. E.; Carter, F. W.; Cecil, T.; Chang, C. L.; Cliche, J. F.; Cukierman, A.; Denison, E. V.; de Haan, T.; Ding, J.; Dobbs, M. A.; Dutcher, D.; Everett, W.; Foster, A.; Gannon, R. N.; Gilbert, A.; Groh, J. C.; Halverson, N. W.; Harke-Hosemann, A. H.; Harrington, N. L.; Henning, J. W.; Hilton, G. C.; Holzapfel, W. L.; Huang, N.; Irwin, K. D.; Jeong, O. B.; Jonas, M.; Khaire, T.; Kofman, A. M.; Korman, M.; Kubik, D.; Kuhlmann, S.; Kuo, C. L.; Lee, A. T.; Lowitz, A. E.; Meyer, S. S.; Michalik, D.; Montgomery, J.; Nadolski, A.; Natoli, T.; Nguyen, H.; Noble, G. I.; Novosad, V.; Padin, S.; Pearson, J.; Posada, C. M.; Rahlin, A.; Ruhl, J. E.; Saunders, L. J.; Sayre, J. T.; Shirley, I.; Shirokoff, E.; Smecher, G.; Sobrin, J. A.; Stark, A. A.; Story, K. T.; Suzuki, A.; Tang, Q. Y.; Thompson, K. L.; Tucker, C.; Vale, L. R.; Vanderlinde, K.; Vieira, J. D.; Wang, G.; Whitehorn, N.; Yefremenko, V.; Yoon, K. W.; Young, M. R.

    2018-05-01

    The third-generation South Pole Telescope camera is designed to measure the cosmic microwave background across three frequency bands (centered at 95, 150 and 220 GHz) with ~16,000 transition-edge sensor (TES) bolometers. Each multichroic array element on a detector wafer has a broadband sinuous antenna that couples power to six TESs, one for each of the three observing bands and both polarizations, via lumped element filters. Ten detector wafers populate the detector array, which is coupled to the sky via a large-aperture optical system. Here we present the frequency band characterization with Fourier transform spectroscopy, measurements of optical time constants, beam properties, and optical and polarization efficiencies of the detector array. The detectors have frequency bands consistent with our simulations and high average optical efficiency: 86%, 77%, and 66% for the 95, 150, and 220 GHz detectors, respectively. The time constants of the detectors are mostly between 0.5 and 5 ms. The beam is round with the correct size, and the polarization efficiency is more than 90% for most of the bolometers.

  7. Deep Rapid Optical Follow-Up of Gravitational Wave Sources with the Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Cowperthwaite, Philip

    2018-01-01

    The detection of an electromagnetic counterpart associated with a gravitational wave detection by the Advanced LIGO and VIRGO interferometers is one of the great observational challenges of our time. The large localization regions and potentially faint counterparts require the use of wide-field, large aperture telescopes. As a result, the Dark Energy Camera, a 3.3 sq deg CCD imager on the 4-m Blanco telescope at CTIO in Chile is the most powerful instrument for this task in the Southern Hemisphere. I will report on the results from our joint program between the community and members of the dark energy survey to conduct rapid and efficient follow-up of gravitational wave sources. This includes systematic searches for optical counterparts, as well as developing an understanding of contaminating sources on timescales not normally probed by traditional untargeted supernova surveys. I will additionally comment on the immense science gains to be made by a joint detection and discuss future prospects from the standpoint of both next generation wide-field telescopes and next generation gravitational wave detectors.

  8. Characterization of a digital camera as an absolute tristimulus colorimeter

    NASA Astrophysics Data System (ADS)

    Martinez-Verdu, Francisco; Pujol, Jaume; Vilaseca, Meritxell; Capilla, Pascual

    2003-01-01

    An algorithm is proposed for the spectral and colorimetric characterization of digital still cameras (DSCs) that allows them to be used as tele-colorimeters with CIE-XYZ color output, in cd/m². The spectral characterization consists of the calculation of the color-matching functions from the previously measured spectral sensitivities. The colorimetric characterization consists of transforming the RGB digital data into absolute tristimulus values CIE-XYZ (in cd/m²) under variable and unknown spectroradiometric conditions. Thus, in the first stage, a gray balance is applied to the RGB digital data to convert them into RGB relative colorimetric values. In the second stage, an algorithm of luminance adaptation vs. lens aperture is inserted into the basic colorimetric profile. Capturing the ColorChecker chart under different light sources, the DSC color analysis accuracy indexes, both in a raw state and with the corrections from a linear model of color correction, were evaluated using the Pointer '86 color reproduction index with the unrelated Hunt '91 color appearance model. The results indicate that our digital image capture device, in raw performance, lightens and desaturates the colors.
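    The second-stage colorimetric transform amounts to a 3x3 matrix applied to gray-balanced RGB, followed by an absolute-luminance scaling tied to the lens aperture; a minimal sketch, using the familiar sRGB/D65 matrix as a stand-in for a camera-specific characterization matrix:

```python
def rgb_to_xyz(rgb, m, luminance_scale=1.0):
    """Apply a 3x3 characterization matrix to gray-balanced, linear RGB,
    then scale by an aperture-dependent luminance factor to obtain
    absolute XYZ in cd/m^2. The matrix here is a generic stand-in, not
    the matrix derived in the paper."""
    return [luminance_scale * sum(m[i][j] * rgb[j] for j in range(3))
            for i in range(3)]

# sRGB-to-XYZ (D65) matrix, used purely for illustration:
M = [[0.4124, 0.3576, 0.1805],
     [0.2126, 0.7152, 0.0722],
     [0.0193, 0.1192, 0.9505]]
# A gray-balanced white patch, with the aperture/exposure calibration
# implying 100 cd/m^2 for full-scale white:
xyz = rgb_to_xyz([1.0, 1.0, 1.0], M, luminance_scale=100.0)
```

    In practice the luminance scale is refit per f-number, which is what lets the camera report absolute rather than merely relative tristimulus values.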

  9. Simultaneous water vapor and dry air optical path length measurements and compensation with the large binocular telescope interferometer

    NASA Astrophysics Data System (ADS)

    Defrère, D.; Hinz, P.; Downey, E.; Böhm, M.; Danchi, W. C.; Durney, O.; Ertel, S.; Hill, J. M.; Hoffmann, W. F.; Mennesson, B.; Millan-Gabet, R.; Montoya, M.; Pott, J.-U.; Skemer, A.; Spalding, E.; Stone, J.; Vaz, A.

    2016-08-01

    The Large Binocular Telescope Interferometer uses a near-infrared camera to measure the optical path length variations between the two AO-corrected apertures and provide high-angular-resolution observations for all its science channels (1.5-13 microns). There is, however, a wavelength-dependent component to the atmospheric turbulence, which can introduce optical path length errors when observing at a wavelength different from that of the fringe-sensing camera. Water vapor in particular is highly dispersive, and its effect must be taken into account for high-precision infrared interferometric observations, as described previously for VLTI/MIDI or the Keck Interferometer Nuller. In this paper, we describe the new sensing approach that has been developed at the LBT to measure and monitor the optical path length fluctuations due to dry air and water vapor separately. After reviewing the current performance of the system for dry air seeing compensation, we present simultaneous H-, K-, and N-band observations that illustrate the feasibility of our feedforward approach to stabilize the path length fluctuations seen by the LBTI nuller.

  10. High resolution imaging at Palomar

    NASA Technical Reports Server (NTRS)

    Kulkarni, Shrinivas R.

    1992-01-01

    For the last two years we have embarked on a program of understanding the ultimate limits of ground-based optical imaging. We have designed and fabricated a camera specifically for high resolution imaging. This camera has now been pressed into service at the prime focus of the Hale 5 m telescope. We have concentrated on two techniques: the Non-Redundant Masking (NRM) and Weigelt's Fully Filled Aperture (FFA) method. The former is the optical analog of radio interferometry and the latter is a higher order extension of the Labeyrie autocorrelation method. As in radio Very Long Baseline Interferometry (VLBI), both these techniques essentially measure the closure phase and, hence, true image construction is possible. We have successfully imaged binary stars and asteroids with angular resolution approaching the diffraction limit of the telescope and image quality approaching that of a typical radio VLBI map. In addition, we have carried out analytical and simulation studies to determine the ultimate limits of ground-based optical imaging, the limits of space-based interferometric imaging, and investigated the details of imaging tradeoffs of beam combination in optical interferometers.
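    The closure-phase property that both NRM and radio VLBI exploit (per-aperture piston errors cancel around a triangle of baselines) can be demonstrated numerically; the phase values below are arbitrary illustrative numbers:

```python
import math

def closure_phase(phi12, phi23, phi31):
    """Sum the measured phases around a closed triangle of sub-apertures,
    wrapped back into (-pi, pi]. Atmospheric piston errors attached to
    individual apertures cancel in this sum."""
    s = phi12 + phi23 + phi31
    return math.atan2(math.sin(s), math.cos(s))

# True object phases on three baselines, corrupted by per-aperture piston
# errors e1, e2, e3: the measured phase is phi_ij = true_ij + e_i - e_j.
true12, true23, true31 = 0.3, -0.7, 0.5
e1, e2, e3 = 1.9, -2.4, 0.8
m12 = true12 + e1 - e2
m23 = true23 + e2 - e3
m31 = true31 + e3 - e1
cp = closure_phase(m12, m23, m31)  # equals true12 + true23 + true31
```

    Because only closure quantities survive the atmosphere, image reconstruction proceeds from them rather than from raw baseline phases.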

  11. Retrieval of Garstang's emission function from all-sky camera images

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio; Kundracik, František

    2015-10-01

    The emission function of ground-based light sources predetermines the skyglow features to a large extent, and most mathematical models used to predict night sky brightness require information on this function. The radiant intensity distribution on a clear sky is experimentally determined as a function of zenith angle using the theoretical approach published only recently in MNRAS, 439, 3405-3413. We have made the experiments in two localities in Slovakia and Mexico by means of two digital single-lens reflex professional cameras operating with different lenses that limit the system's field of view to either 180° or 167°. The purpose of using two cameras was to identify variances between two different apertures. Images are taken at different distances from an artificial light source (a city) with the intention of determining the ratio of zenith radiance relative to horizontal irradiance. Subsequently, the information on the fraction of the light radiated directly into the upward hemisphere (F) is extracted. The results show that inexpensive devices can properly identify the upward emissions with adequate reliability as long as the clear-sky radiance distribution is dominated by the largest ground-based light source. Highly unstable turbidity conditions can also make the parameter F difficult or even impossible to retrieve. Measurements at low elevation angles should be avoided due to a potentially parasitic effect of direct light emissions from luminaires surrounding the measuring site.

  12. Design study for a 16x zoom lens system for visible surveillance camera

    NASA Astrophysics Data System (ADS)

    Vella, Anthony; Li, Heng; Zhao, Yang; Trumper, Isaac; Gandara-Montano, Gustavo A.; Xu, Di; Nikolov, Daniel K.; Chen, Changchen; Brown, Nicolas S.; Guevara-Torres, Andres; Jung, Hae Won; Reimers, Jacob; Bentley, Julie

    2015-09-01

    High zoom ratio zoom lenses have extensive applications in broadcasting, cinema, and surveillance. Here, we present a design study on a 16x zoom lens with 4 groups (including two internal moving groups), designed for, but not limited to, a visible-spectrum surveillance camera. Fifteen different solutions were discovered with nearly diffraction-limited performance, using PNPX or PNNP design forms with the stop located in either the third or fourth group. Some interesting patterns and trends in the summarized results include the following: (a) in designs with such a large zoom ratio, the potential of locating the aperture stop in the front half of the system is limited, with ray height variations through zoom necessitating a very large lens diameter; (b) in many cases, the lens zoom motion has significant freedom to vary due to near-zero total power in the middle two groups; and (c) we discuss the trade-offs between zoom configuration, stop location, packaging factors, and zoom group aberration sensitivity.

  13. Overview of the Multi-Spectral Imager on the NEAR spacecraft

    NASA Astrophysics Data System (ADS)

    Hawkins, S. E., III

    1996-07-01

    The Multi-Spectral Imager on the Near Earth Asteroid Rendezvous (NEAR) spacecraft is a 1 Hz frame rate CCD camera sensitive in the visible and near infrared bands (~400-1100 nm). MSI is the primary instrument on the spacecraft to determine morphology and composition of the surface of asteroid 433 Eros. In addition, the camera will be used to assist in navigation to the asteroid. The instrument uses refractive optics and has an eight-position spectral filter wheel to select different wavelength bands. The MSI optical focal length of 168 mm gives a 2.9° × 2.25° field of view. The CCD is passively cooled and the 537×244-pixel array output is digitized to 12 bits. Electronic shuttering increases the effective dynamic range of the instrument by more than a factor of 100. A one-time deployable cover protects the instrument during ground testing operations and launch. A reduced-aperture viewport permits full field of view imaging while the cover is in place. A Data Processing Unit (DPU) provides the digital interface between the spacecraft and the Camera Head and uses an RTX2010 processor. The DPU provides an eight-frame image buffer, lossy and lossless data compression routines, and automatic exposure control. An overview of the instrument is presented and design parameters and trade-offs are discussed.
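    The quoted 2.9° × 2.25° field of view follows from the 168 mm focal length and the detector size via simple pinhole geometry; the ~8.5 mm detector width used below is back-calculated for illustration, not a figure from the paper:

```python
import math

def fov_deg(detector_mm, focal_mm):
    """Full angular field of view subtended by a focal-plane detector of
    the given width behind a lens of the given focal length."""
    return 2 * math.degrees(math.atan(detector_mm / (2 * focal_mm)))

fov = fov_deg(8.5, 168.0)  # close to the quoted 2.9 degrees
```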

  14. Assessment of the UV camera sulfur dioxide retrieval for point source plumes

    USGS Publications Warehouse

    Dalton, M.P.; Watson, I.M.; Nadeau, P.A.; Werner, C.; Morrow, W.; Shannon, J.M.

    2009-01-01

    Digital cameras, sensitive to specific regions of the ultraviolet (UV) spectrum, have been employed for quantifying sulfur dioxide (SO2) emissions in recent years. The instruments make use of the selective absorption of UV light by SO2 molecules to determine pathlength concentration. Many monitoring advantages are gained by using this technique, but the accuracy and limitations have not been thoroughly investigated. The effect of some user-controlled parameters, including image exposure duration, the diameter of the lens aperture, the frequency of calibration cell imaging, and the use of single or paired bandpass filters, have not yet been addressed. In order to clarify methodological consequences and quantify accuracy, laboratory and field experiments were conducted. Images were collected of calibration cells under varying observational conditions, and our conclusions provide guidance for enhanced image collection. Results indicate that the calibration cell response is reliably linear below 1500 ppm m, but that the response is significantly affected by changing light conditions. Exposure durations that produce maximum image digital numbers above 32 500 counts can reduce noise in plume images. Sulfur dioxide retrieval results from a coal-fired power plant plume were compared to direct sampling measurements, and the results indicate that the accuracy of the UV camera retrieval method is within the range of current spectrometric methods. © 2009 Elsevier B.V.
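    At its core the retrieval is a Beer-Lambert calculation from plume and clear-sky image intensities; the absorption coefficient below is a made-up calibration-cell value for illustration, not an instrument constant from the paper:

```python
import math

def pathlength_concentration(i_plume, i_clear, sigma_eff):
    """Beer-Lambert retrieval: absorbance A = ln(I0/I), divided by an
    effective absorption coefficient (per ppm*m, derived from calibration
    cells), gives the SO2 column density in ppm*m."""
    return math.log(i_clear / i_plume) / sigma_eff

# Hypothetical calibration: a 1000 ppm*m cell absorbs ~10% of the light,
# within the reported linear range (<1500 ppm*m):
k = math.log(1.0 / 0.9) / 1000.0
col = pathlength_concentration(0.9, 1.0, k)
```

    Two-filter (on/off SO2 absorption band) designs ratio out broadband aerosol scattering before applying this relation.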

  15. Building EOS capability for Malaysia - the options

    NASA Astrophysics Data System (ADS)

    Subari, M. D.; Hassan, A.

    2014-06-01

    Earth observation satellites (EOS) are currently a major tool to monitor earth dynamics and increase human understanding of earth surface processes. Since the early 80s, Malaysia has been using EOS images for various applications, such as weather forecasting, land use mapping, agriculture, environment monitoring and others. Until now, all EOS images were obtained from foreign satellite systems. Realising the strategic need of having its own capability, Malaysia embarked on EOS development programs in the early 90s. Starting with TiungSAT-1, a micro-satellite carrying a small camera, followed by RazakSAT, a small satellite carrying a 2.5 m panchromatic (PAN) medium aperture camera, the current satellite development program, RazakSAT-2, designed to carry a 1.0 m high-resolution PAN and 4.0 m multi-spectral camera, is a strategic initiative of the government to develop and accelerate the nation's capability in satellite technology and its applications. Will this effort continue until all the needs of the remote sensing community are fulfilled by its own EOS? This paper analyzes the intention of the Malaysian government through its National Space Policy and other related policy documents, and proposes some policy options. Key factors to be considered are the specific data needs of the EOS community, data availability, and more subjective political motivations such as national pride.

  16. Comparative Analysis of Gene Expression for Convergent Evolution of Camera Eye Between Octopus and Human

    PubMed Central

    Ogura, Atsushi; Ikeo, Kazuho; Gojobori, Takashi

    2004-01-01

    Although the camera eye of the octopus is very similar to that of humans, phylogenetic and embryological analyses have suggested that their camera eyes were acquired independently; this is known as a typical example of convergent evolution. To study the molecular basis of the convergent evolution of camera eyes, we conducted a comparative analysis of gene expression in octopus and human camera eyes. We sequenced 16,432 ESTs of the octopus eye, yielding 1052 nonredundant genes with matches in the protein database. Comparing these 1052 genes with 13,303 already-known ESTs of the human eye, 729 (69.3%) genes were commonly expressed between the human and octopus eyes. In contrast, when we compared octopus eye ESTs with human connective tissue ESTs, the expression similarity was quite low. To trace the evolutionary changes that are potentially responsible for camera eye formation, we also compared octopus-eye ESTs with the completed genome sequences of other organisms. We found that 1019 of the 1052 genes had already existed in the common ancestor of bilateria, and 875 genes were conserved between humans and octopuses. This suggests that a large number of conserved genes and their similar gene expression may be responsible for the convergent evolution of the camera eye. PMID:15289475

  17. Automated stent defect detection and classification with a high numerical aperture optical system

    NASA Astrophysics Data System (ADS)

    Bermudez, Carlos; Laguarta, Ferran; Cadevall, Cristina; Matilla, Aitor; Ibañez, Sergi; Artigas, Roger

    2017-06-01

    Stent quality control is a highly critical process. Cardiovascular stents must be 100% inspected so that no defective stent is implanted in a human body. However, this visual control is currently performed manually, and every stent can take tens of minutes to inspect. In this paper, a novel optical inspection system is presented. By combining a high numerical aperture (NA) optical system, a rotational stage and a line-scan camera, unrolled sections of the outer and inner surfaces of the stent are obtained and image-processed at high speed. Defects appearing on those surfaces, and also on the edges, are strongly contrasted due to the shadowing effect of the high-NA illumination and acquisition approach. Defects are therefore detected by means of morphological operations and a sensitivity parameter. Based on a trained defect library, a binary classifier sorts each kind of defect through a set of scoring vectors, providing the quality operator with all the information required to make a final decision. We expect this new approach to make defect detection completely objective and to dramatically reduce the time and cost of the stent quality control stage.
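    A minimal sketch of defect detection by morphological operations, in the spirit of the approach described (the opening-based rule and the toy binary image are illustrative assumptions; the paper's actual operators and sensitivity parameter are not reproduced here):

```python
def _neigh(img, y, x, border=0):
    """3x3 neighbourhood values around (y, x), zero-padded at borders."""
    h, w = len(img), len(img[0])
    return [img[y + dy][x + dx] if 0 <= y + dy < h and 0 <= x + dx < w else border
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

def erode(img):
    return [[min(_neigh(img, y, x)) for x in range(len(img[0]))] for y in range(len(img))]

def dilate(img):
    return [[max(_neigh(img, y, x)) for x in range(len(img[0]))] for y in range(len(img))]

def detect_defects(img):
    """Flag pixels removed by a morphological opening (erosion then dilation):
    bright specks too small to survive erosion are reported as defects."""
    opened = dilate(erode(img))
    return [(y, x) for y in range(len(img)) for x in range(len(img[0]))
            if img[y][x] and not opened[y][x]]

# Toy binary image: a solid strut region plus one isolated speck at (2, 4)
img = [[1, 1, 1, 0, 0],
       [1, 1, 1, 0, 0],
       [1, 1, 1, 0, 1],
       [0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0]]
print(detect_defects(img))  # the speck is flagged, the strut is not
```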

  18. Synthetic Aperture Microwave Imaging (SAMI) of the plasma edge on NSTX-U

    NASA Astrophysics Data System (ADS)

    Vann, Roddy; Taylor, Gary; Brunner, Jakob; Ellis, Bob; Thomas, David

    2016-10-01

    The Synthetic Aperture Microwave Imaging (SAMI) system is a unique phased-array microwave camera with a +/-40° field of view in both directions. It can image cut-off surfaces corresponding to frequencies in the range 10-34.5 GHz; these surfaces are typically in the plasma edge. SAMI operates in two modes: either imaging thermal emission from the plasma (often modified by its interaction with the plasma edge, e.g. via BXO mode conversion) or ``active probing'', i.e. injecting a broad beam at the plasma surface and imaging the reflected/back-scattered signal. SAMI was successfully pioneered on the Mega-Amp Spherical Tokamak (MAST) at Culham Centre for Fusion Energy, and has now been installed and commissioned on the National Spherical Torus Experiment Upgrade (NSTX-U) at Princeton Plasma Physics Laboratory. The firmware has been upgraded to include real-time digital filtering, which enables continuous acquisition of the Doppler back-scattered active probing data. In this poster we present SAMI's analysis of the plasma edge on NSTX-U, including measurements of the edge pitch angle using SAMI's unique 2-D Doppler-backscattering capability.

  19. Habitable Exoplanet Imaging Mission (HabEx): Architecture of the 4m Mission Concept

    NASA Astrophysics Data System (ADS)

    Kuan, Gary M.; Warfield, Keith R.; Mennesson, Bertrand; Kiessling, Alina; Stahl, H. Philip; Martin, Stefan; Shaklan, Stuart B.; amini, rashied

    2018-01-01

    The Habitable Exoplanet Imaging Mission (HabEx) study is tasked by NASA to develop a scientifically compelling and technologically feasible exoplanet direct imaging mission concept, with extensive general astrophysics capabilities, for the 2020 Decadal Survey in Astrophysics. The baseline architecture of this space-based observatory concept encompasses an unobscured 4-meter diameter aperture telescope flying in formation with a 72-meter diameter starshade occulter. This large-aperture, ultra-stable observatory concept extends and enhances the legacy of the Hubble Space Telescope, allowing us to probe even fainter objects and peer deeper into the Universe in the same ultraviolet, visible, and near-infrared wavelengths, and giving us the capability, for the first time, to image and characterize potentially habitable, Earth-sized exoplanets orbiting nearby stars. Revolutionary direct imaging of exoplanets will be undertaken using a high-contrast coronagraph and a starshade imager. General astrophysics science will be undertaken with two world-class instruments – a wide-field workhorse camera for imaging and multi-object grism spectroscopy, and a multi-object, multi-resolution ultraviolet spectrograph. This poster outlines the baseline architecture of the HabEx flagship mission concept.

  20. Sub-micrometer resolution proximity X-ray microscope with digital image registration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chkhalo, N. I.; Salashchenko, N. N.; Sherbakov, A. V., E-mail: SherbakovAV@ipm.sci-nnov.ru

    A compact laboratory proximity soft X-ray microscope providing submicrometer spatial resolution and digital image registration is described. The microscope consists of a laser-plasma soft X-ray radiation source, a Schwarzschild objective to illuminate the test sample, and a two-coordinate detector for image registration. Radiation passing through the sample under study generates an absorption image on the front surface of the detector. Optical ceramic YAG:Ce was used to convert the X-rays into visible light. The image was transferred from the scintillator to a charge-coupled device camera with a Mitutoyo Plan Apo series lens. The detector's design allows the use of lenses with numerical apertures of NA = 0.14, 0.28, and 0.55 without changing the dimensions and arrangement of the elements of the device, which allows one to change the magnification, spatial resolution, and field of view of the X-ray microscope. A spatial resolution better than 0.7 μm, and an energy conversion efficiency of 7.2% for converting X-ray radiation with a wavelength of 13.5 nm into visible light collected by the detector, were achieved with the largest-aperture lens.
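    For reference, the diffraction limit of the scintillator relay lens can be estimated from the Abbe formula d = λ/(2·NA); the 550 nm wavelength below is an assumed value for YAG:Ce scintillator emission, not a figure from the paper:

```python
def diffraction_limit_um(wavelength_nm, na):
    """Abbe resolution d = lambda / (2*NA), returned in micrometres."""
    return wavelength_nm / (2.0 * na) / 1000.0

# Estimated limits for the three relay-lens numerical apertures at ~550 nm
for na in (0.14, 0.28, 0.55):
    print(na, round(diffraction_limit_um(550.0, na), 2))
```

    The NA = 0.55 lens gives an estimate of about 0.5 μm, consistent with the reported sub-0.7 μm resolution.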

  1. Active liquid-crystal deflector and lens with Fresnel structure

    NASA Astrophysics Data System (ADS)

    Shibuya, Giichi; Yamano, Shohei; Yoshida, Hiroyuki; Ozaki, Masanori

    2017-02-01

    A new type of tunable Fresnel deflector and lens composed of liquid crystal was developed. A combined structure of multiple interdigitated electrodes and a high-resistivity (HR) layer implements a saw-tooth distribution of electrical potential using only the planar surfaces of the transparent substrates. Following numerical calculation and design, experimental devices were manufactured with the liquid crystal (LC) material sealed between sandwiched flat glass plates of 0.7 mm thickness, with rubbed alignment layers set in an anti-parallel configuration. The fabricated beam deflector, which has no moving parts, shows a maximum tilt angle of +/-1.3 deg, suitable for the optical image stabilizer (OIS) of a micro camera. We also discuss and verify its lens characteristics with a view to more advanced applications. Transparent interdigitated electrodes were concentrically aligned on the lens aperture with insulator gaps under their boundary area. The diameter of the lens aperture was 30 mm and the total number of Fresnel zones was 100. Phase retardation of the beam wavefront from the LC lens device was evaluated from polarizing microscope images with a monochromatic filter. The radial positions of each observed fringe were plotted and fitted with a 2nd-degree polynomial approximation. The number of observed fringes is over 600 across the whole lens aperture, and the correlation coefficients of all approximations are over 0.993, indicating a nearly ideal optical wavefront. The maximum lens powers obtained from the approximations are about +/-4 m-1, covering both convex and concave lens characteristics, which makes practical use in tunable-power eyeglasses more promising.
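    The fringe-fitting step can be sketched as follows, assuming one wave of retardation per fringe so that the m-th fringe radius obeys r_m² = m·λ·f (an illustrative simplification of the paper's polynomial fit; the wavelength and synthetic radii are hypothetical):

```python
def lens_power_from_fringes(radii_mm, wavelength_nm=550.0):
    """Least-squares fit of r_m^2 = m*lambda*f through the origin;
    returns the lens power 1/f in m^-1."""
    lam_mm = wavelength_nm * 1e-6                       # nm -> mm
    num = sum(m * r * r for m, r in enumerate(radii_mm, start=1))
    den = sum(m * m for m, _ in enumerate(radii_mm, start=1))
    f_mm = (num / den) / lam_mm                         # fitted slope = lambda*f
    return 1000.0 / f_mm                                # mm -> m, then invert

# Synthetic fringe radii for a 4 m^-1 (f = 250 mm) lens at 550 nm
f_mm = 250.0
radii = [(m * 550e-6 * f_mm) ** 0.5 for m in range(1, 11)]
print(round(lens_power_from_fringes(radii), 3))  # recovers 4.0
```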

  2. Genetic mechanisms involved in the evolution of the cephalopod camera eye revealed by transcriptomic and developmental studies

    PubMed Central

    2011-01-01

    Background Coleoid cephalopods (squids and octopuses) have evolved a camera eye, the structure of which is very similar to that found in vertebrates and which is considered a classic example of convergent evolution. Other molluscs, however, possess mirror, pin-hole, or compound eyes, all of which differ from the camera eye in the degree of complexity of the eye structures and neurons participating in the visual circuit. Therefore, genes expressed in the cephalopod eye after divergence from the common molluscan ancestor could be involved in eye evolution through association with the acquisition of new structural components. To clarify the genetic mechanisms that contributed to the evolution of the cephalopod camera eye, we applied comprehensive transcriptomic analysis and conducted developmental validation of candidate genes involved in coleoid cephalopod eye evolution. Results We compared gene expression in the eyes of 6 molluscan (3 cephalopod and 3 non-cephalopod) species and selected 5,707 genes as cephalopod camera eye-specific candidate genes on the basis of homology searches against 3 molluscan species without camera eyes. First, we confirmed the expression of these 5,707 genes in the cephalopod camera eye formation processes by developmental array analysis. Second, using molecular evolutionary (dN/dS) analysis to detect positive selection in the cephalopod lineage, we identified 156 of these genes in which functions appeared to have changed after the divergence of cephalopods from the molluscan ancestor and which contributed to structural and functional diversification. Third, we selected 1,571 genes, expressed in the camera eyes of both cephalopods and vertebrates, which could have independently acquired a function related to eye development at the expression level. 
Finally, as experimental validation, we identified three functionally novel cephalopod camera eye genes related to optic lobe formation by in situ hybridization analysis in embryonic pygmy squid. Conclusion We identified 156 genes positively selected in the cephalopod lineage and 1,571 genes commonly found in the cephalopod and vertebrate camera eyes from the analysis of cephalopod camera eye specificity at the expression level. Experimental validation showed that the cephalopod camera eye-specific candidate genes include those expressed in the outer part of the optic lobes, which are unique to coleoid cephalopods. The results of this study suggest that changes in gene expression and in the primary structure of proteins (through positive selection) relative to the common molluscan ancestor could have contributed, at least in part, to cephalopod camera eye acquisition. PMID:21702923

  3. Numerical investigation of a vortex ring impinging on a coaxial aperture

    NASA Astrophysics Data System (ADS)

    Hu, Jiacheng; Peterson, Sean D.

    2017-11-01

    Recent advancements in smart materials have sparked an interest in the development of small scale fluidic energy harvesters for powering distributed applications in aquatic environments, where coherent vortex structures are prevalent. Thus, it is crucial to investigate the interaction of viscous vortices in the proximity of a thin plate (a common harvester configuration). Hence, the present study systematically examines the interaction of a vortex ring impinging on an infinitesimally thin wall with a coaxially aligned annular aperture. The rigid aperture serves as an axisymmetric counterpart of the thin plate, and the vortex ring represents a typical coherent vortex structure. The results indicate that the vortex dynamics can be categorized into two regimes based on the aperture to ring radius ratio (Rr). The rebound regime (Rr < 0.9) exhibits the classical unsteady boundary layer interaction in a vortex ring-wall collision. The vortex ring is able to slip past the aperture when Rr >= 0.9 , and an increase in the vortex ring impulse is observed for 1.0 <= Rr <= 1.3 due to fluid entrainment. Furthermore, pressure loadings are also compared to elucidate an optimal energy harvesting strategy in vortex impact configurations. This work was supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery Grant (RGPIN-05778) and Alexander Graham Bell Canada Graduate Scholarship (CGS-D).

  4. Focal volume optics and experimental artifacts in confocal fluorescence correlation spectroscopy.

    PubMed Central

    Hess, Samuel T; Webb, Watt W

    2002-01-01

    Fluorescence correlation spectroscopy (FCS) can provide a wealth of information about biological and chemical systems on a broad range of time scales (<1 μs to >1 s). Numerical modeling of the FCS observation volume combined with measurements has revealed, however, that the standard assumption of a three-dimensional Gaussian FCS observation volume is not a valid approximation under many common measurement conditions. As a result, the FCS autocorrelation will contain significant, systematic artifacts that are most severe with confocal optics when using a large detector aperture and aperture-limited illumination. These optical artifacts manifest themselves in the fluorescence correlation as an apparent additional exponential component or diffusing species with significant (>30%) amplitude that can imply extraneous kinetics, shift the measured diffusion time by as much as approximately 80%, and cause the axial ratio to diverge. Artifacts can be minimized or virtually eliminated by using a small confocal detector aperture, an underfilled objective back-aperture, or two-photon excitation. However, using a detector aperture that is smaller or larger than the optimal value (approximately 4.5 optical units) greatly reduces both the count rate per molecule and the signal-to-noise ratio. Thus, there is a tradeoff between optimizing signal-to-noise and reducing experimental artifacts in one-photon FCS. PMID:12324447

  5. Subaperture correlation based digital adaptive optics for full field optical coherence tomography.

    PubMed

    Kumar, Abhishek; Drexler, Wolfgang; Leitgeb, Rainer A

    2013-05-06

    This paper proposes a sub-aperture correlation based numerical phase correction method for interferometric full field imaging systems, provided the complex object field information can be extracted. The method corrects for wavefront aberration at the pupil/Fourier transform plane without the need for adaptive optics, spatial light modulators (SLMs) or additional cameras. We show that the method does not require knowledge of any system parameters. In the simulation study, we consider a full field swept source OCT (FF SSOCT) system to show the working principle of the algorithm. Experimental results are presented for a technical and a biological sample as a proof of principle.
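    The core of a sub-aperture correlation method is estimating the relative image shift between sub-apertures, which is proportional to the local wavefront tilt. A minimal 1-D sketch of shift estimation by cross-correlation (illustrative only; the paper operates on complex 2-D fields, and the signals below are hypothetical):

```python
def best_shift(ref, sub, max_shift=4):
    """Integer displacement of `sub` relative to `ref` (1-D) that maximises
    the zero-mean cross-correlation over the overlapping samples."""
    def corr(shift):
        pairs = [(ref[i], sub[i + shift]) for i in range(len(ref))
                 if 0 <= i + shift < len(sub)]
        mr = sum(r for r, _ in pairs) / len(pairs)
        ms = sum(s for _, s in pairs) / len(pairs)
        return sum((r - mr) * (s - ms) for r, s in pairs)
    return max(range(-max_shift, max_shift + 1), key=corr)

signal = [0, 0, 1, 3, 7, 3, 1, 0, 0, 0]   # a feature seen in one sub-aperture
shifted = signal[2:] + [0, 0]             # same feature, displaced by 2 samples
print(best_shift(signal, shifted))        # -2: the estimated displacement
```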

  6. Analysis of single quantum-dot mobility inside 1D nanochannel devices

    NASA Astrophysics Data System (ADS)

    Hoang, H. T.; Segers-Nolten, I. M.; Tas, N. R.; van Honschoten, J. W.; Subramaniam, V.; Elwenspoek, M. C.

    2011-07-01

    We visualized individual quantum dots using a combination of a confining nanochannel and an ultra-sensitive microscope system, equipped with a high numerical aperture lens and a highly sensitive camera. The diffusion coefficients of the confined quantum dots were determined from the experimentally recorded trajectories according to the classical diffusion theory for Brownian motion in two dimensions. The calculated diffusion coefficients were three times smaller than those in bulk solution. These observations confirm and extend the results of Eichmann et al (2008 Langmuir 24 714-21) to smaller particle diameters and more narrow confinement. A detailed analysis shows that the observed reduction in mobility cannot be explained by conventional hydrodynamic theory.
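    The diffusion-coefficient estimate follows the classical 2-D Brownian relation ⟨Δr²⟩ = 4DΔt. A minimal sketch with a hypothetical trajectory (not data from the study):

```python
def diffusion_coefficient(track, dt):
    """Estimate D from a 2-D trajectory using <dr^2> = 4*D*dt at lag 1."""
    steps = [(x2 - x1) ** 2 + (y2 - y1) ** 2
             for (x1, y1), (x2, y2) in zip(track, track[1:])]
    msd = sum(steps) / len(steps)       # mean squared displacement per frame
    return msd / (4.0 * dt)

# Hypothetical track (positions in um, one frame every 10 ms)
track = [(0.0, 0.0), (0.2, 0.0), (0.2, 0.2), (0.0, 0.2), (0.0, 0.0)]
print(round(diffusion_coefficient(track, 0.01), 6))  # ~1.0 um^2/s
```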

  7. Calibration of scintillation-light filters for neutron time-of-flight spectrometers at the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sayre, D. B., E-mail: sayre4@llnl.gov; Barbosa, F.; Caggiano, J. A.

    Sixty-four neutral density filters constructed of metal plates with 88 apertures of varying diameter have been radiographed with a soft x-ray source and CCD camera at National Security Technologies, Livermore. An analysis of the radiographs fits the radial dependence of the apertures’ image intensities to sigmoid functions, which can describe the rapidly decreasing intensity towards the apertures’ edges. The fitted image intensities determine the relative attenuation value of each filter. Absolute attenuation values of several imaged filters, measured in situ during calibration experiments, normalize the relative quantities which are now used in analyses of neutron spectrometer data at the National Ignition Facility.

  8. Calibration of scintillation-light filters for neutron time-of-flight spectrometers at the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sayre, D. B.; Barbosa, F.; Caggiano, J. A.

    Sixty-four neutral density filters constructed of metal plates with 88 apertures of varying diameter have been radiographed with a soft x-ray source and CCD camera at National Security Technologies, Livermore. An analysis of the radiographs fits the radial dependence of the apertures’ image intensities to sigmoid functions, which can describe the rapidly decreasing intensity towards the apertures’ edges. Here, the fitted image intensities determine the relative attenuation value of each filter. Absolute attenuation values of several imaged filters, measured in situ during calibration experiments, normalize the relative quantities which are now used in analyses of neutron spectrometer data at the National Ignition Facility.

  9. Calibration of scintillation-light filters for neutron time-of-flight spectrometers at the National Ignition Facility

    DOE PAGES

    Sayre, D. B.; Barbosa, F.; Caggiano, J. A.; ...

    2016-07-26

    Sixty-four neutral density filters constructed of metal plates with 88 apertures of varying diameter have been radiographed with a soft x-ray source and CCD camera at National Security Technologies, Livermore. An analysis of the radiographs fits the radial dependence of the apertures’ image intensities to sigmoid functions, which can describe the rapidly decreasing intensity towards the apertures’ edges. Here, the fitted image intensities determine the relative attenuation value of each filter. Absolute attenuation values of several imaged filters, measured in situ during calibration experiments, normalize the relative quantities which are now used in analyses of neutron spectrometer data at the National Ignition Facility.

  10. Differential high-speed digital micromirror device based fluorescence speckle confocal microscopy.

    PubMed

    Jiang, Shihong; Walker, John

    2010-01-20

    We report a differential fluorescence speckle confocal microscope that acquires an image in a fraction of a second by exploiting the very high frame rate of modern digital micromirror devices (DMDs). The DMD projects a sequence of predefined binary speckle patterns to the sample and modulates the intensity of the returning fluorescent light simultaneously. The fluorescent light reflecting from the DMD's "on" and "off" pixels is modulated by correlated speckle and anticorrelated speckle, respectively, to form two images on two CCD cameras in parallel. The sum of the two images recovers a widefield image, but their difference gives a near-confocal image in real time. Experimental results for both low and high numerical apertures are shown.
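    The sum/difference recovery can be illustrated in a few lines (the 1-D toy images are hypothetical; they only mimic in-focus light following the "on" speckle while out-of-focus blur splits evenly between the two cameras):

```python
def widefield_and_confocal(img_on, img_off):
    """Per-pixel sum recovers the widefield image; the difference keeps only
    light modulated in phase with the 'on' speckle (near-confocal signal)."""
    wide = [a + b for a, b in zip(img_on, img_off)]
    conf = [a - b for a, b in zip(img_on, img_off)]
    return wide, conf

# Toy 1-D line: in-focus light follows the speckle, blur splits half/half
in_focus = [0.0, 5.0, 0.0, 2.0]
blur     = [3.0, 3.0, 3.0, 3.0]
img_on   = [f + b / 2 for f, b in zip(in_focus, blur)]
img_off  = [b / 2 for b in blur]
wide, conf = widefield_and_confocal(img_on, img_off)
print(wide)  # [3.0, 8.0, 3.0, 5.0]  total light (widefield)
print(conf)  # [0.0, 5.0, 0.0, 2.0]  blur rejected (near-confocal)
```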

  11. Activities of JAXA's Innovative Technology Center on Space Debris Observation

    NASA Astrophysics Data System (ADS)

    Yanagisawa, T.; Kurosaki, H.; Nakajima, A.

    The Innovative Technology Research Center of JAXA is developing observational technologies for GEO objects in order to cope with the space debris problem. The center constructed an optical observational facility for space debris at Mt. Nyukasa, Nagano in 2006. Once observational equipment such as CCD cameras and telescopes was set up, normal observation started. In this paper, the details of the facility and its activities are introduced. The observational facility contains two telescopes and two CCD cameras. The apertures of the telescopes are 35 cm and 25 cm, respectively. One CCD camera, in which a 2K×2K chip is installed, can observe a 1.3 × 1.3-degree sky region using the 35 cm telescope. The other CCD camera, which contains two 4K×2K chips, can observe a 2.6 × 2.6-degree region with the 25 cm telescope. One of our main objectives is to detect faint GEO objects that are not catalogued. Generally, the detection limit for GEO objects is determined by the aperture of the telescope. However, by improving image processing techniques, the limit may be lowered. We are developing image processing methods that use many CCD frames to detect faint objects, and we are trying to use an FPGA (Field Programmable Gate Array) system to reduce processing time. By applying these methods to data taken by a large telescope, the detection limit will be significantly lowered. Orbit determination of detected GEO debris is another important task. In particular, the narrow field of view of an optical telescope hinders re-detection of GEO debris for orbit determination, and the long observation time required per object is inefficient, so an effective observation strategy is needed. We are testing an observation method invented by Umehara that observes one inertial position in space. By observing one inertial position for two nights, a GEO object that passed through the position on the first night must pass through it again on the second night, and a rough orbit is determined from the two nights' data. Test observations showed that this method was able to detect many GEO objects and to determine their orbits from three nights' observations. We also joined the campaign observations of the IADC (Inter-Agency Space Debris Coordination Committee). By analyzing the observed data with the method we developed, 88 catalogued and 38 un-catalogued objects were detected. The magnitude of the faintest object detected in this campaign observation was 18.5; this object is undetectable by human inspection.
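    The benefit of the multi-frame methods mentioned above can be sketched with simple frame averaging, where uncorrelated noise drops roughly as 1/√N (a generic illustration with synthetic frames, not JAXA's actual algorithm):

```python
import random

def stack(frames):
    """Average N aligned frames pixel-by-pixel; noise falls ~1/sqrt(N)."""
    n = len(frames)
    return [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]

random.seed(1)
true_signal = [0.0] * 9 + [1.0]    # a faint object in the last pixel
frames = [[s + random.gauss(0.0, 1.0) for s in true_signal] for _ in range(100)]

def rms_error(est):
    return (sum((e - s) ** 2 for e, s in zip(est, true_signal)) / len(est)) ** 0.5

# A single frame buries the object in noise; the 100-frame stack recovers it
print(rms_error(frames[0]) > rms_error(stack(frames)))
```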

  12. Camera traps reveal an apparent mutualism between a common mesocarnivore and an endangered ungulate

    USGS Publications Warehouse

    Cove, Michael V.; Maurer, Andrew S.; O'Connell, Allan F.

    2017-01-01

    Camera traps are commonly used to study mammal ecology and they occasionally capture previously undocumented species interactions. The key deer (Odocoileus virginianus clavium) is an endangered endemic subspecies of the Florida Keys, where it exists with few predators. We obtained a camera trap sequence of 80 photos in which a key deer interacted with two northern raccoons (Procyon lotor). One of the raccoons groomed the deer’s face for ∼1 min. This interaction is peculiar and appears mutualistic because the deer was not concerned and willingly remained still throughout the physical contact. Although mutualistic relationships between deer and birds are common, we are unaware of any previously documented mesocarnivore-deer mutualisms. Key deer have evolved in the absence of mammalian predators and we hypothesize that they exhibit reduced vigilance or concern when encountering other species because of predator naivety. Key deer and raccoons are commonly associated with humans and urbanization and an alternative hypothesis is that the interactions are a consequence of heightened deer density, causing a greater probability of sustained interactions with the common mesocarnivores.

  13. Nonholonomic camera-space manipulation using cameras mounted on a mobile base

    NASA Astrophysics Data System (ADS)

    Goodwine, Bill; Seelinger, Michael J.; Skaar, Steven B.; Ma, Qun

    1998-10-01

    The body of work called `Camera Space Manipulation' is an effective and proven method of robotic control. Essentially, this technique identifies and refines the input-output relationship of the plant using estimation methods and drives the plant open-loop to its target state. 3D `success' of the desired motion, i.e., the end effector of the manipulator engaging a target at a particular location with a particular orientation, is guaranteed when there is camera-space success in two cameras which are adequately separated. Very accurate, sub-pixel positioning of a robotic end effector is possible using this method. To date, however, most efforts in this area have primarily considered holonomic systems. This work addresses the problem of nonholonomic camera-space manipulation by considering a nonholonomic robot with two cameras and a holonomic manipulator on board the nonholonomic platform. While perhaps not as common in robotics, such a combination of holonomic and nonholonomic degrees of freedom is ubiquitous in industry: fork lifts and earth-moving equipment are common examples of a nonholonomic system with an on-board holonomic actuator. The nonholonomic nature of the system makes the automation problem more difficult for a variety of reasons; in particular, the target location is not fixed in the image planes, as it is for holonomic systems (since the cameras are attached to a moving platform), and nonholonomic kinematics are fundamentally path-dependent. This work focuses on the sensor-space or camera-space-based control laws necessary for effectively implementing an autonomous system of this type.

  14. Optimization of a miniature short-wavelength infrared objective optics of a short-wavelength infrared to visible upconversion layer attached to a mobile-devices visible camera

    NASA Astrophysics Data System (ADS)

    Kadosh, Itai; Sarusi, Gabby

    2017-10-01

    The use of dual cameras in parallax to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept in which the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night vision capability while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. In order to maintain commonality of the two cameras, we propose to attach to one of the cameras a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and the additional upconversion layer, whose thickness is <1 μm. Such a layer should be attached in close proximity to the mobile device's visible-range camera sensor (the CMOS sensor). This paper presents such a SWIR objective optical design and optimization, formed and fitted mechanically to the visible objective design but with different lenses, in order to maintain commonality and serve as a proof of concept. Such a SWIR objective design is very challenging since it requires mimicking the original visible mobile camera lenses' sizes and mechanical housing so that we can adhere to the visible optical and mechanical design. We present an in-depth feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore optics design.

  15. Using a plenoptic camera to measure distortions in wavefronts affected by atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Eslami, Mohammed; Wu, Chensheng; Rzasa, John; Davis, Christopher C.

    2012-10-01

    Ideally, as planar wavefronts travel through an imaging system, all rays, or vectors pointing in the direction of the propagation of energy, are parallel, and thus the wavefront is focused to a particular point. If the wavefront arrives at an imaging system with energy vectors that point in different directions, each part of the wavefront will be focused at a slightly different point on the sensor plane, resulting in a distorted image. The Hartmann test, which involves the insertion of a series of pinholes between the imaging system and the sensor plane, was developed to sample the wavefront at different locations and measure the distortion angles at different points in the wavefront. An adaptive optic system, such as a deformable mirror, is then used to correct for these distortions and allow the planar wavefront to focus at the desired point on the sensor plane, thereby correcting the distorted image. The apertures of a pinhole array limit the amount of light that reaches the sensor plane; by replacing the pinholes with a microlens array, each bundle of rays is focused, brightening the image. Microlens arrays are making their way into newer imaging technologies, such as "light field" or "plenoptic" cameras. In these cameras, the microlens array is used to recover the ray information of the incoming light, with post-processing techniques used to focus on objects at different depths. The goal of this paper is to demonstrate the use of these plenoptic cameras to recover the distortions in wavefronts. Taking advantage of the microlens array within the plenoptic camera, CODE-V simulations show that its performance can provide more information than a Shack-Hartmann sensor. Using the microlens array to retrieve the ray information and then back-stepping through the imaging system provides information about distortions in the arriving wavefront.
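    As in a Shack-Hartmann sensor, the local wavefront slope behind each microlens follows from the focal-spot displacement divided by the lens focal length. A minimal sketch with hypothetical numbers (not measurements from the paper):

```python
def wavefront_slopes(centroids_mm, reference_mm, focal_length_mm):
    """Local wavefront tilt (radians) behind each microlens from the spot
    displacement: slope ~ dx / f (small-angle approximation)."""
    return [(c - r) / focal_length_mm
            for c, r in zip(centroids_mm, reference_mm)]

# Spot positions behind three microlenses (mm); f = 2 mm lenslets assumed
ref  = [0.0, 1.0, 2.0]
meas = [0.002, 1.0, 1.997]
print(wavefront_slopes(meas, ref, 2.0))  # ~ +1.0, 0.0, -1.5 mrad
```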

  16. A survey of camera error sources in machine vision systems

    NASA Astrophysics Data System (ADS)

    Jatko, W. B.

    In machine vision applications, such as an automated inspection line, television cameras are commonly used to record scene intensity in a computer memory or frame buffer. Scene data from the image sensor can then be analyzed with a wide variety of feature-detection techniques. Many algorithms found in textbooks on image processing make the implicit simplifying assumption of an ideal input image with clearly defined edges and uniform illumination. The ideal image model is helpful to aid the student in understanding the principles of operation, but when these algorithms are blindly applied to real-world images the results can be unsatisfactory. This paper examines some common measurement errors found in camera sensors and their underlying causes, and possible methods of error compensation. The role of the camera in a typical image-processing system is discussed, with emphasis on the origination of signal distortions. The effects of such things as lighting, optics, and sensor characteristics are considered.

  17. The PLATO camera

    NASA Astrophysics Data System (ADS)

    Laubier, D.; Bodin, P.; Pasquier, H.; Fredon, S.; Levacher, P.; Vola, P.; Buey, T.; Bernardi, P.

    2017-11-01

    PLATO (PLAnetary Transits and Oscillations of stars) is a candidate for the M3 Medium-size mission of the ESA Cosmic Vision programme (2015-2025 period). It is aimed at detecting Earth-size and Earth-mass planets in the habitable zone of bright stars and characterising them using the transit method and the asteroseismology of their host stars. That means observing more than 100 000 stars brighter than magnitude 11, and more than 1 000 000 brighter than magnitude 13, with long continuous observing time for 20 % of them (2 to 3 years). This demands unusually good long-term signal stability. For the brighter stars, the noise requirement is less than 34 ppm·hr^-1/2, from a frequency of 40 mHz down to 20 μHz, including all sources of noise such as the motion of the star images on the detectors and frequency beatings. Those extremely tight requirements result in a payload consisting of 32 synchronised, large-aperture, wide-field cameras thermally regulated down to -80°C, whose data are combined to increase the signal-to-noise performance. They are split into 4 subsets pointing in 4 directions to widen the total field of view; stars in the centre of that field of view are observed by all 32 cameras. Two extra cameras with color filters provide pointing measurements to the spacecraft Attitude and Orbit Control System (AOCS) loop. The satellite orbits the Sun at the L2 Lagrange point. This paper presents the optical, electronic and electrical, thermal and mechanical designs devised to achieve those requirements, and the results from breadboards developed for the optics, the focal plane, the power supply and video electronics.
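
    The 34 ppm·hr^-1/2 figure quoted above is a white-noise budget: the attainable noise over an integration time T scales as 1/sqrt(T). A back-of-the-envelope sketch under that assumption (purely uncorrelated noise):

```python
import math

def noise_ppm(budget_ppm_sqrt_hr, hours):
    """Residual white-noise amplitude after averaging over `hours`,
    for a budget quoted in ppm per square-root hour: uncorrelated
    noise integrates down as 1/sqrt(T)."""
    return budget_ppm_sqrt_hr / math.sqrt(hours)

# Over one hour the requirement is 34 ppm; over four hours, half that.
one_hr = noise_ppm(34.0, 1.0)
four_hr = noise_ppm(34.0, 4.0)
```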

  18. Image Intensifier Modules For Use With Commercially Available Solid State Cameras

    NASA Astrophysics Data System (ADS)

    Murphy, Howard; Tyler, Al; Lake, Donald W.

    1989-04-01

    A modular approach to design has contributed greatly to the success of the family of machine vision video equipment produced by EG&G Reticon during the past several years. Internal modularity allows high-performance area (matrix) and line scan cameras to be assembled with two or three electronic subassemblies with very low labor costs, and permits camera control and interface circuitry to be realized by assemblages of various modules suiting the needs of specific applications. Product modularity benefits equipment users in several ways. Modular matrix and line scan cameras are available in identical enclosures (Fig. 1), which allows enclosure components to be purchased in volume for economies of scale and allows field replacement or exchange of cameras within a customer-designed system to be easily accomplished. The cameras are optically aligned (boresighted) at final test; modularity permits optical adjustments to be made with the same precise test equipment for all camera varieties. The modular cameras contain two, or sometimes three, hybrid microelectronic packages (Fig. 2). These rugged and reliable "submodules" perform all of the electronic operations internal to the camera except for the job of image acquisition performed by the monolithic image sensor. Heat produced by electrical power dissipation in the electronic modules is conducted through low-resistance paths to the camera case by the metal plates, which results in a thermally efficient and environmentally tolerant camera with low manufacturing costs. A modular approach has also been followed in the design of the camera control, video processor, and computer interface accessory called the Formatter (Fig. 3). This unit can be attached directly onto either a line scan or matrix modular camera to form a self-contained unit, or connected via a cable to retain the advantages inherent to a small, lightweight, and rugged image sensing component.
Available modules permit the bus-structured Formatter to be configured as required by a specific camera application. Modular line and matrix scan cameras incorporating sensors with fiber optic faceplates (Fig. 4) are also available. These units retain the advantages of interchangeability, simple construction, ruggedness, and optical precision offered by the more common lens-input units. Fiber optic faceplate cameras are used for a wide variety of applications. A common usage involves mating the Reticon-supplied camera to a customer-supplied intensifier tube for low-light-level and/or short-exposure-time situations.

  19. Aperture Photometry Tool

    NASA Astrophysics Data System (ADS)

    Laher, Russ R.; Gorjian, Varoujan; Rebull, Luisa M.; Masci, Frank J.; Fowler, John W.; Helou, George; Kulkarni, Shrinivas R.; Law, Nicholas M.

    2012-07-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It is a graphical user interface (GUI) designed to allow the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. The finely tuned layout of the GUI, along with judicious use of color-coding and alerting, is intended to give maximal user utility and convenience. Simply mouse-clicking on a source in the displayed image will instantly draw a circular or elliptical aperture and sky annulus around the source and will compute the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs with just the push of a button, including image histogram, x and y aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has many functions for customizing the calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. 
The radial-profile-interpolation source model, which is accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
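
    The per-click calculation APT performs, an aperture sum minus a sky estimate from an annulus, can be sketched in a few lines of NumPy; the median sky estimator and the geometry here are a minimal illustration, not APT's exact models:

```python
import numpy as np

def aperture_photometry(image, cx, cy, r_ap, r_in, r_out):
    """Minimal circular-aperture photometry with a sky annulus: sum
    pixels inside the aperture, estimate the local background as the
    median of an annulus, and subtract background * aperture area."""
    y, x = np.indices(image.shape)
    r = np.hypot(x - cx, y - cy)
    ap = r <= r_ap
    sky = (r >= r_in) & (r <= r_out)
    background = np.median(image[sky])
    return image[ap].sum() - background * ap.sum()

# A uniform sky with no source yields ~0 net intensity; adding a
# 100-count point source at the center recovers ~100 counts net.
flat_sky = np.full((51, 51), 10.0)
net_sky = aperture_photometry(flat_sky, 25, 25, r_ap=5, r_in=8, r_out=12)
with_src = flat_sky.copy()
with_src[25, 25] += 100.0
net_src = aperture_photometry(with_src, 25, 25, r_ap=5, r_in=8, r_out=12)
```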

  20. Predicting mineral precipitation in fractures: The influence of local heterogeneity on the feedback between precipitation and permeability

    NASA Astrophysics Data System (ADS)

    Jones, T.; Detwiler, R. L.

    2016-12-01

    Long-term subsurface energy production and contaminant storage strategies often rely on induced-mineralization to control the transport of dissolved ions. In low-permeability rocks, precipitation is most likely to occur in fractures that act as leakage pathways for fluids that are in chemical disequilibrium with the formation minerals. These fractures are commonly idealized as parallel-plate channels with uniform surface mineralogy, and as a result, our predictions often suggest that precipitation leads to fast permeability reduction. However, natural fractures contain both heterogeneous mineralogy and three-dimensional surface roughness, and our understanding of how precipitation affects local permeability in these environments is limited. To examine the impacts of local heterogeneity on the feedback between mineral precipitation and permeability, we performed two long-term experiments in transparent analog fractures: (i) uniform-aperture and (ii) variable-aperture. We controlled the initial heterogeneous surface mineralogy in both experiments by seeding the bottom borosilicate fracture surfaces with randomly distributed clusters of CaCO3 crystals. Continuous flow ISCO pumps injected a well-mixed CaCl2-NaHCO3 solution, log(ΩCaCO3) = 1.44, into the fracture at 0.5 ml/min and transmitted-light techniques provided high-resolution (83 x 83 µm), direct measurements of aperture and fluid transport across the fracture. In experiment (i), precipitation decreased local aperture at discrete CaCO3 reaction sites near the fracture inlet, but transport variations across the fracture remained relatively small due to the initial lack of aperture heterogeneity. In contrast, the feedback between precipitation and aperture in experiment (ii) focused flow into large-aperture, preferential flow paths that contained significantly less CaCO3 area than the fracture scale average. 
Precipitation-induced aperture reduction in (ii) reduced dissolved ion transport into small-aperture regions of the fracture that were rich in CaCO3 and led to a 72% decrease in the measured precipitation rate. These results suggest that incorporating the effects of local heterogeneity may dramatically improve our ability to predict precipitation-induced permeability alterations in fractured rocks.
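
    The precipitation-permeability feedback the experiments probe is often modeled with the local cubic law, in which fracture transmissivity scales as the cube of the local aperture (standard background, not stated in the abstract):

```python
def transmissivity_ratio(aperture_after, aperture_before):
    """Local cubic law: fracture transmissivity scales with the cube of
    the aperture, so a modest aperture reduction cuts flow sharply."""
    return (aperture_after / aperture_before) ** 3

# Precipitation that halves the local aperture leaves only 1/8 of the
# original flow capacity through that region.
ratio = transmissivity_ratio(0.5, 1.0)
```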

  1. Multiple Sensor Camera for Enhanced Video Capturing

    NASA Astrophysics Data System (ADS)

    Nagahara, Hajime; Kanki, Yoshinori; Iwai, Yoshio; Yachida, Masahiko

    Camera resolution has improved dramatically in response to the demand for high-quality digital images; a digital still camera, for example, offers several megapixels. Although a video camera has a higher frame rate, its resolution is lower than that of a still camera. Thus, high resolution and high frame rate are incompatible in ordinary cameras on the market. This problem is difficult to solve with a single sensor, since it stems from the physical limitation of the pixel transfer rate. In this paper, we propose a multi-sensor camera for capturing resolution- and frame-rate-enhanced video. A common multi-CCD camera, such as a 3CCD color camera, uses identical CCDs to capture different spectral information. Our approach is to use sensors of different spatio-temporal resolution in a single camera cabinet to capture higher-resolution and higher-frame-rate information separately. We built a prototype camera that can capture high-resolution (2588×1958 pixels, 3.75 fps) and high-frame-rate (500×500 pixels, 90 fps) videos. We also propose a calibration method for the camera. As one application, we demonstrate an enhanced video (2128×1952 pixels, 90 fps) generated from the captured videos, showing the utility of the camera.

  2. Optimal camera exposure for video surveillance systems by predictive control of shutter speed, aperture, and gain

    NASA Astrophysics Data System (ADS)

    Torres, Juan; Menéndez, José Manuel

    2015-02-01

    This paper establishes a real-time auto-exposure method to guarantee that surveillance cameras in uncontrolled light conditions take advantage of their whole dynamic range while providing neither under- nor overexposed images. State-of-the-art auto-exposure methods base their control on the brightness of the image measured in a limited region where the foreground objects are mostly located. Unlike these methods, the proposed algorithm establishes a set of indicators based on the image histogram that define its shape and position. Furthermore, the location of the objects to be inspected is usually unknown in surveillance applications, so the whole image is monitored in this approach. To control the camera settings, we defined a parameters function (Ef) that depends linearly on the shutter speed and the electronic gain, and is inversely proportional to the square of the lens aperture diameter. When the current acquired image is not overexposed, our algorithm computes the value of Ef that would move the histogram to the maximum value that does not overexpose the capture. When the current acquired image is overexposed, it computes the value of Ef that would move the histogram to a value that does not underexpose the capture and remains close to the overexposed region. If the image is both under- and overexposed, the whole dynamic range of the camera is already in use, and a default value of Ef that does not overexpose the capture is selected. This decision follows the idea that underexposed images are preferable to overexposed ones, because the noise produced in the lower regions of the histogram can be removed in a post-processing step, while the saturated pixels of the higher regions cannot be recovered. The proposed algorithm was tested on a video surveillance camera placed at an outdoor parking lot surrounded by buildings and trees which produce moving shadows on the ground.
During the daytime of seven days, the algorithm ran alternately with a representative auto-exposure algorithm from the recent literature. Besides the sunrises and the nightfalls, multiple weather conditions produced light changes in the scene: sunny hours with sharp shadows and highlights; cloud cover that softened the shadows; and cloudy and rainy hours that dimmed the scene. Several indicators were used to measure the performance of the algorithms, providing objective quality in terms of the time the algorithms take to recover from under- or overexposure, the brightness stability, and the deviation from the optimal exposure. The results demonstrated that our algorithm reacts faster to all the light changes than the selected state-of-the-art algorithm. It is also capable of acquiring well-exposed images and keeping the brightness stable for longer. Summing up the results, we concluded that the proposed algorithm provides a fast and stable auto-exposure method that maintains an optimal exposure for video surveillance applications. Future work will involve the evaluation of this algorithm in robotics.
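
    The stated dependence of the parameters function Ef on the camera settings can be written directly; the proportionality constant is a placeholder since the abstract gives no coefficients:

```python
def exposure_factor(shutter_s, gain, aperture_diameter_mm, k=1.0):
    """Sketch of the paper's parameters function Ef: linear in shutter
    speed and electronic gain, inversely proportional to the square of
    the lens aperture diameter. The constant k is a placeholder."""
    return k * shutter_s * gain / aperture_diameter_mm ** 2

# Doubling the shutter time doubles Ef; doubling the aperture diameter
# quarters it (four times the light-gathering area).
base = exposure_factor(1 / 128, 2.0, 4.0)
```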

  3. A direct-view customer-oriented digital holographic camera

    NASA Astrophysics Data System (ADS)

    Besaga, Vira R.; Gerhardt, Nils C.; Maksimyak, Peter P.; Hofmann, Martin R.

    2018-01-01

    In this paper, we propose a direct-view digital holographic camera system consisting mostly of customer-oriented components. The camera system is based on standard photographic units such as camera sensor and objective and is adapted to operate under off-axis external white-light illumination. The common-path geometry of the holographic module of the system ensures direct-view operation. The system can operate in both self-reference and self-interference modes. As a proof of system operability, we present reconstructed amplitude and phase information of a test sample.

  4. Quantum geodesy

    NASA Astrophysics Data System (ADS)

    Jitrik, Oliverio; Lanzagorta, Marco; Uhlmann, Jeffrey; Venegas-Andraca, Salvador E.

    2017-05-01

    The study of plate tectonic motion is important for generating theoretical models of the structure and dynamics of the Earth. In turn, understanding tectonic motion provides insight for developing sophisticated models that can be used for earthquake early-warning systems and for nuclear forensics. Tectonic geodesy uses the positions of a network of points on the surface of the Earth to determine the motion of tectonic plates and the deformation of the Earth's crust. GPS and interferometric synthetic aperture radar are techniques commonly used in tectonic geodesy. In this paper we describe the feasibility of interferometric synthetic aperture quantum radar and its theoretical performance for tectonic geodesy.
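
    For context on the InSAR side, the textbook repeat-pass relation converts an interferometric phase change into line-of-sight displacement (background material, not from the paper):

```python
import math

def los_displacement(delta_phase_rad, wavelength_m):
    """Repeat-pass InSAR relation: a ground shift of half a wavelength
    along the line of sight adds a full 2*pi cycle to the
    interferometric phase, so d = wavelength * delta_phi / (4 * pi)."""
    return wavelength_m * delta_phase_rad / (4.0 * math.pi)

# One full fringe (2*pi) at C-band (~5.6 cm wavelength) corresponds to
# ~2.8 cm of line-of-sight motion.
d = los_displacement(2.0 * math.pi, 0.056)
```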

  5. Automated interferometric synthetic aperture microscopy and computational adaptive optics for improved optical coherence tomography.

    PubMed

    Xu, Yang; Liu, Yuan-Zhi; Boppart, Stephen A; Carney, P Scott

    2016-03-10

    In this paper, we introduce an algorithm framework for the automation of interferometric synthetic aperture microscopy (ISAM). Under this framework, common processing steps such as dispersion correction, Fourier domain resampling, and computational adaptive optics aberration correction are carried out as metrics-assisted parameter search problems. We further present the results of this algorithm applied to phantom and biological tissue samples and compare with manually adjusted results. With the automated algorithm, near-optimal ISAM reconstruction can be achieved without manual adjustment. At the same time, the technical barrier for the nonexpert using ISAM imaging is also significantly lowered.

  6. Extended capture range for focus-diverse phase retrieval in segmented aperture systems using geometrical optics.

    PubMed

    Jurling, Alden S; Fienup, James R

    2014-03-01

    Extending previous work by Thurman on wavefront sensing for segmented-aperture systems, we developed an algorithm for estimating segment tips and tilts from multiple point spread functions in different defocused planes. We also developed methods for overcoming two common modes for stagnation in nonlinear optimization-based phase retrieval algorithms for segmented systems. We showed that when used together, these methods largely solve the capture range problem in focus-diverse phase retrieval for segmented systems with large tips and tilts. Monte Carlo simulations produced a rate of success better than 98% for the combined approach.

  7. Earthquake Building Damage Mapping Based on Feature Analyzing Method from Synthetic Aperture Radar Data

    NASA Astrophysics Data System (ADS)

    An, L.; Zhang, J.; Gong, L.

    2018-04-01

    Playing an important role in gathering information on damage to social infrastructure, Synthetic Aperture Radar (SAR) remote sensing is a useful tool for monitoring earthquake disasters. With the wide application of this technique, a standard method, comparing post-seismic to pre-seismic data, has become common. However, multi-temporal SAR processing is not always achievable, so developing a building damage detection method that uses post-seismic data only is of great importance. In this paper, the authors initiate an experimental investigation to establish an object-based, feature-analysing classification method for building damage recognition.

  8. Comparison of three different detectors applied to synthetic aperture radar data

    NASA Astrophysics Data System (ADS)

    Ranney, Kenneth I.; Khatri, Hiralal; Nguyen, Lam H.

    2002-08-01

    The U.S. Army Research Laboratory has investigated the relative performance of three different target detection paradigms applied to foliage penetration (FOPEN) synthetic aperture radar (SAR) data. The three detectors - a quadratic polynomial discriminator (QPD), Bayesian neural network (BNN) and a support vector machine (SVM) - utilize a common collection of statistics (feature values) calculated from the fully polarimetric FOPEN data. We describe the parametric variations required as part of the algorithm optimizations, and we present the relative performance of the detectors in terms of probability of false alarm (Pfa) and probability of detection (Pd).

  9. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear, shift-invariant, and axial, with light propagation orthogonal to the system. We use a spectral image processing algorithm to simulate the radiometric properties of a digital camera, taking into consideration the transmittances of the light source, lenses, and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution of the point spread functions of the optical components; we use a Cooke triplet, the aperture, the light fall-off, and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion, and JPG compression. We reconstruct the noisy, blurred image by blending differently exposed images to reduce the photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then come the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
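
    Two of the pipeline stages mentioned above, Bayer sampling and gamma correction, can be sketched as follows; the RGGB layout and the gamma value are common conventions assumed here, not specifics from the paper:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an RGB image onto an RGGB Bayer pattern: each pixel keeps
    only the channel of its color-filter site."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G sites (even rows)
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G sites (odd rows)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B sites
    return mosaic

def gamma_correct(linear, gamma=2.2):
    """Standard display gamma encoding of linear intensities in [0, 1]."""
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

# A pure-red input survives only at the R sites of the mosaic.
red = np.zeros((4, 4, 3))
red[..., 0] = 1.0
m = bayer_mosaic(red)
```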

  10. Designing manufacturable filters for a 16-band plenoptic camera using differential evolution

    NASA Astrophysics Data System (ADS)

    Doster, Timothy; Olson, Colin C.; Fleet, Erin; Yetzbacher, Michael; Kanaev, Andrey; Lebow, Paul; Leathers, Robert

    2017-05-01

    A 16-band plenoptic camera allows for the rapid exchange of filter sets via a 4x4 filter array on the lens's front aperture. This ability to change out filters allows for an operator to quickly adapt to different locales or threat intelligence. Typically, such a system incorporates a default set of 16 equally spaced flat-topped filters. Knowing the operating theater or the likely targets of interest it becomes advantageous to tune the filters. We propose using a modified beta distribution to parameterize the different possible filters and differential evolution (DE) to search over the space of possible filter designs. The modified beta distribution allows us to jointly optimize the width, taper and wavelength center of each single- or multi-pass filter in the set over a number of evolutionary steps. Further, by constraining the function parameters we can develop solutions which are not just theoretical but manufacturable. We examine two independent tasks: general spectral sensing and target detection. In the general spectral sensing task we utilize the theory of compressive sensing (CS) and find filters that generate codings which minimize the CS reconstruction error based on a fixed spectral dictionary of endmembers. For the target detection task and a set of known targets, we train the filters to optimize the separation of the background and target signature. We compare our results to the default 16 flat-topped non-overlapping filter set which comes with the plenoptic camera and full hyperspectral resolution data which was previously acquired.
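
    The filter parameterization can be illustrated with a beta-function-like passband whose band edges and tapers are the search variables; the exact functional form below is a stand-in for the paper's modified beta distribution, and a DE optimizer (for instance SciPy's differential_evolution) would search over the (lo, hi, a, b) tuple of each filter:

```python
import numpy as np

def beta_filter(wavelengths, lo, hi, a, b):
    """Illustrative filter transmission profile built from a rescaled
    beta-function shape: `lo`/`hi` set the band edges and `a`/`b`
    control the taper on each side; transmission is normalized to a
    peak of 1 and is zero outside [lo, hi]."""
    t = np.clip((wavelengths - lo) / (hi - lo), 0.0, 1.0)
    shape = t ** (a - 1) * (1 - t) ** (b - 1)
    peak = shape.max()
    return shape / peak if peak > 0 else shape

# A single filter passing roughly 500-600 nm with symmetric tapers.
wl = np.linspace(400, 1000, 601)
f = beta_filter(wl, 500, 600, 3.0, 3.0)
```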

  11. Hyper Suprime-Cam: System design and verification of image quality

    NASA Astrophysics Data System (ADS)

    Miyazaki, Satoshi; Komiyama, Yutaka; Kawanomoto, Satoshi; Doi, Yoshiyuki; Furusawa, Hisanori; Hamana, Takashi; Hayashi, Yusuke; Ikeda, Hiroyuki; Kamata, Yukiko; Karoji, Hiroshi; Koike, Michitaro; Kurakami, Tomio; Miyama, Shoken; Morokuma, Tomoki; Nakata, Fumiaki; Namikawa, Kazuhito; Nakaya, Hidehiko; Nariai, Kyoji; Obuchi, Yoshiyuki; Oishi, Yukie; Okada, Norio; Okura, Yuki; Tait, Philip; Takata, Tadafumi; Tanaka, Yoko; Tanaka, Masayuki; Terai, Tsuyoshi; Tomono, Daigo; Uraguchi, Fumihiro; Usuda, Tomonori; Utsumi, Yousuke; Yamada, Yoshihiko; Yamanoi, Hitomi; Aihara, Hiroaki; Fujimori, Hiroki; Mineo, Sogo; Miyatake, Hironao; Oguri, Masamune; Uchida, Tomohisa; Tanaka, Manobu M.; Yasuda, Naoki; Takada, Masahiro; Murayama, Hitoshi; Nishizawa, Atsushi J.; Sugiyama, Naoshi; Chiba, Masashi; Futamase, Toshifumi; Wang, Shiang-Yu; Chen, Hsin-Yo; Ho, Paul T. P.; Liaw, Eric J. Y.; Chiu, Chi-Fang; Ho, Cheng-Lin; Lai, Tsang-Chih; Lee, Yao-Cheng; Jeng, Dun-Zen; Iwamura, Satoru; Armstrong, Robert; Bickerton, Steve; Bosch, James; Gunn, James E.; Lupton, Robert H.; Loomis, Craig; Price, Paul; Smith, Steward; Strauss, Michael A.; Turner, Edwin L.; Suzuki, Hisanori; Miyazaki, Yasuhito; Muramatsu, Masaharu; Yamamoto, Koei; Endo, Makoto; Ezaki, Yutaka; Ito, Noboru; Kawaguchi, Noboru; Sofuku, Satoshi; Taniike, Tomoaki; Akutsu, Kotaro; Dojo, Naoto; Kasumi, Kazuyuki; Matsuda, Toru; Imoto, Kohei; Miwa, Yoshinori; Suzuki, Masayuki; Takeshi, Kunio; Yokota, Hideo

    2018-01-01

    The Hyper Suprime-Cam (HSC) is an 870 megapixel prime focus optical imaging camera for the 8.2 m Subaru telescope. The wide-field corrector delivers sharp images of 0.2″ (FWHM) in the HSC-i band over the entire 1.5° diameter field of view. The collimation of the camera with respect to the optical axis of the primary mirror is done with hexapod actuators, the mechanical accuracy of which is a few microns. Analysis of the remaining wavefront error in off-focus stellar images reveals that the collimation of the optical components meets design specifications. While there is a flexure of mechanical components, it also is within the design specification. As a result, the camera achieves its seeing-limited imaging on Maunakea during most of the time; the median seeing over several years of observing is 0.67″ (FWHM) in the i band. The sensors use p-channel, fully depleted CCDs of 200 μm thickness (2048 × 4176 15 μm square pixels) and we employ 116 of them to pave the 50 cm diameter focal plane. The minimum interval between exposures is 34 s, including the time to read out arrays, to transfer data to the control computer, and to save them to the hard drive. HSC on Subaru uniquely features a combination of a large aperture, a wide field of view, sharp images and a high sensitivity especially at longer wavelengths, which makes the HSC one of the most powerful observing facilities in the world.

  12. Investigation of sparsity metrics for autofocusing in digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Fan, Xin; Healy, John J.; Hennelly, Bryan M.

    2017-05-01

    Digital holographic microscopy (DHM) is an optoelectronic technique that is made up of two parts: (i) the recording of the interference pattern of the diffraction pattern of an object and a known reference wavefield using a digital camera and (ii) the numerical reconstruction of the complex object wavefield using the recorded interferogram and a distance parameter as input. The latter is based on the simulation of optical propagation from the camera plane to a plane at any arbitrary distance from the camera. A key advantage of DHM over conventional microscopy is that both the phase and intensity information of the object can be recovered at any distance, using only one capture, and this facilitates the recording of scenes that may change dynamically and that may otherwise go in and out of focus. Autofocusing using traditional microscopy requires mechanical movement of the translation stage or the microscope objective, and multiple image captures that are then compared using some metric. Autofocusing in DHM is similar, except that the sequence of intensity images, to which the metric is applied, is generated numerically from a single capture. We recently investigated the application of a number of sparsity metrics for DHM autofocusing and in this paper we extend this work to include more such metrics, and apply them over a greater range of biological diatom cells and magnification/numerical apertures. We demonstrate for the first time that these metrics may be grouped together according to matching behavior following high pass filtering.
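
    One widely used sparsity-related focus metric in the DHM literature is the Tamura coefficient; the sketch below applies such a metric across a numerically generated focal stack to pick the in-focus plane (the abstract does not specify which metrics it evaluates, so this is an illustrative choice):

```python
import numpy as np

def tamura_coefficient(intensity):
    """Tamura coefficient sqrt(std/mean), a sparsity-related focus
    metric: sharper (sparser) intensity images score higher."""
    m = intensity.mean()
    return float(np.sqrt(intensity.std() / m)) if m > 0 else 0.0

def autofocus(stack):
    """Return the index of the reconstruction distance whose intensity
    image maximizes the metric, mimicking DHM autofocusing over a
    numerically propagated focal stack."""
    scores = [tamura_coefficient(img) for img in stack]
    return int(np.argmax(scores))

# A concentrated (sparse) frame beats a flat one.
flat = np.full((8, 8), 1.0)
sharp = np.zeros((8, 8))
sharp[4, 4] = 64.0
best = autofocus([flat, sharp])
```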

  13. LWIR NUC using an uncooled microbolometer camera

    NASA Astrophysics Data System (ADS)

    Laveigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian; McHugh, Steve

    2010-04-01

    Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector. Ideally, NUC will be performed in the same band in which the scene projector will be used. Cooled, large-format MWIR cameras are readily available and have been successfully used to perform NUC; however, cooled large-format LWIR cameras are not as common and are prohibitively expensive. Large-format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an IRSP. Santa Barbara Infrared, Inc. reports progress on a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. Camera instability, temporal response, and thermal resolution are the main difficulties. A discussion of processes developed to mitigate these issues follows.
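
    For background, the classic two-point NUC computes a per-pixel gain and offset from two calibration levels; the IRSP procedure in the abstract is more involved, but the core idea can be sketched as:

```python
import numpy as np

def two_point_nuc(raw, dark, bright, target_dark, target_bright):
    """Two-point non-uniformity correction: choose a per-pixel gain and
    offset so that each pixel maps its dark/bright calibration
    responses onto common target levels, flattening fixed-pattern
    non-uniformity."""
    gain = (target_bright - target_dark) / (bright - dark)
    offset = target_dark - gain * dark
    return gain * raw + offset

# Two pixels with different responsivities both map to the targets.
dark = np.array([[1.0, 2.0]])
bright = np.array([[9.0, 12.0]])
corrected = two_point_nuc(bright, dark, bright, 0.0, 100.0)
```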

  14. Imaging of spatially extended hot spots with coded apertures for intra-operative nuclear medicine applications

    NASA Astrophysics Data System (ADS)

    Kaissas, I.; Papadimitropoulos, C.; Potiriadis, C.; Karafasoulis, K.; Loukas, D.; Lambropoulos, C. P.

    2017-01-01

    Coded aperture imaging transcends planar imaging with conventional collimators in efficiency and Field of View (FOV). We present experimental results for the detection of 141 keV and 122 keV γ-photons emitted by uniformly extended 99mTc and 57Co hot-spots along with simulations of uniformly and normally extended 99mTc hot-spots. These results prove that the method can be used for intra-operative imaging of radio-traced sentinel nodes and thyroid remnants. The study is performed using a setup of two gamma cameras, each consisting of a coded-aperture (or mask) of Modified Uniformly Redundant Array (MURA) of rank 19 positioned on top of a CdTe detector. The detector pixel pitch is 350 μm and its active area is 4.4 × 4.4 cm2, while the mask element size is 1.7 mm. The detectable photon energy ranges from 15 keV up to 200 keV with an energy resolution of 3-4 keV FWHM. Triangulation is exploited to estimate the 3D spatial coordinates of the radioactive spots within the system FOV. Two extended sources, with uniform distributed activity (11 and 24 mm in diameter, respectively), positioned at 16 cm from the system and with 3 cm distance between their centers, can be resolved and localized with accuracy better than 5%. The results indicate that the estimated positions of spatially extended sources lay within their volume size and that neighboring sources, even with a low level of radioactivity, such as 30 MBq, can be clearly distinguished with an acquisition time about 3 seconds.
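
    The MURA masks used here follow the standard Gottesman-Fenimore construction for a prime rank p, built from quadratic residues; a direct sketch (rank 19 matches the abstract's masks):

```python
def is_quadratic_residue(i, p):
    """True if i is a nonzero quadratic residue modulo the prime p."""
    return any((x * x) % p == i for x in range(1, p))

def mura(p):
    """Modified Uniformly Redundant Array (Gottesman & Fenimore) of
    prime rank p: row 0 is closed, column 0 (below row 0) is open, and
    interior cells open where the residue characters of i and j agree."""
    C = [1 if is_quadratic_residue(i, p) else -1 for i in range(p)]
    A = [[0] * p for _ in range(p)]
    for i in range(p):
        for j in range(p):
            if i == 0:
                A[i][j] = 0
            elif j == 0:
                A[i][j] = 1
            else:
                A[i][j] = 1 if C[i] * C[j] == 1 else 0
    return A

mask = mura(19)
```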

  15. Free-Space Optical Communications Program at JPL

    NASA Technical Reports Server (NTRS)

    Hemmati, H.

    1999-01-01

    Conceptual design of a multi-functional optical instrument is underway for the X2000-Second Delivery Program. The transceiver will perform both free-space optical-communication and science imaging by sharing a common 10-cm aperture telescope.

  16. 3D for the people: multi-camera motion capture in the field with consumer-grade cameras and open source software

    PubMed Central

    Evangelista, Dennis J.; Ray, Dylan D.; Hedrick, Tyson L.

    2016-01-01

    ABSTRACT Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts. PMID:27444791
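
    The core 3D computation behind calibrated multi-camera tracking is triangulation; below is a minimal linear (DLT) sketch with two toy projection matrices, illustrative rather than the cited software's actual implementation:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation from two calibrated cameras: each
    observed pixel contributes two rows of a homogeneous system
    A X = 0, solved via SVD; the 3D point is the dehomogenized
    null vector."""
    def rows(P, xy):
        u, v = xy
        return [u * P[2] - P[0], v * P[2] - P[1]]
    A = np.array(rows(P1, x1) + rows(P2, x2))
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy pinhole cameras: identity pose, and a 1-unit baseline in x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.5, 0.2, 4.0])
x1 = point[:2] / point[2]                        # projection in camera 1
x2 = (point - [1.0, 0.0, 0.0])[:2] / point[2]    # projection in camera 2
est = triangulate(P1, P2, x1, x2)
```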

  17. Discovery of KPS-1b, a Transiting Hot-Jupiter, with an Amateur Telescope Setup (Abstract)

    NASA Astrophysics Data System (ADS)

    Benni, P.; Burdanov, A.; Krushinsky, V.; Sokov, E.

    2018-06-01

    (Abstract only) Using readily available amateur equipment, a wide-field telescope (Celestron RASA, 279 mm f/2.2) coupled with a SBIG ST-8300M camera was set up at a private residence in a fairly light polluted suburban town thirty miles outside of Boston, Massachusetts. This telescope participated in the Kourovka Planet Search (KPS) prototype survey, along with a MASTER-II Ural wide field telescope near Yekaterinburg, Russia. One goal was to determine whether higher resolution imaging (2 arcsec/pixel) with much lower sky coverage can practically detect exoplanet transits compared to the successful very wide-field exoplanet surveys (KELT, XO, WASP, HATnet, TrES, Qatar, etc.), which used arrays of small aperture telescopes coupled to CCDs.

  18. LLE review, volume 73. Quarterly report, October 1997--December 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-04-01

    This progress report contains discussion on the following topics: A high-bandwidth electrical-waveform generator based on aperture-coupled striplines for OMEGA pulse-shaping applications; sweep deflection circuit development using computer-aided circuit design for the OMEGA multichannel streak camera; D-{sup 3}He protons as a diagnostic for target {rho}R; growth rates of the ablative Rayleigh-Taylor instability in inertial confinement fusion; three-dimensional analysis of the power transfer between crossed laser beams; characterization of freestanding polymer films for application in 351-nm, high-peak-power laser systems; subsurface damage in microgrinding optical glasses; bound-abrasive polishers for optical glass; and color gamut of cholesteric liquid crystal films and flakes by standard colorimetry.

  19. A comprehensive experimental characterization of the iPIX gamma imager

    NASA Astrophysics Data System (ADS)

    Amgarou, K.; Paradiso, V.; Patoz, A.; Bonnet, F.; Handley, J.; Couturier, P.; Becker, F.; Menaa, N.

    2016-08-01

    The results of more than 280 different experiments aimed at exploring the main features and performance of a newly developed gamma imager, called iPIX, are summarized in this paper. iPIX is designed to quickly localize radioactive sources while estimating the ambient dose equivalent rate at the measurement point. It integrates a 1 mm thick CdTe detector directly bump-bonded to a Timepix chip, a tungsten coded-aperture mask, and a mini RGB camera. It also represents a major technological breakthrough in terms of lightness, compactness, usability, response sensitivity, and angular resolution. As an example of its key strengths, an 241Am source with a dose rate of only a few nSv/h can be localized in less than one minute.

  20. A Near-Infrared Spectrometer to Measure Zodiacal Light Absorption Spectrum

    NASA Technical Reports Server (NTRS)

    Kutyrev, A. S.; Arendt, R.; Dwek, E.; Kimble, R.; Moseley, S. H.; Rapchun, D.; Silverberg, R. F.

    2010-01-01

    We have developed a high throughput infrared spectrometer for zodiacal light Fraunhofer line measurements. The instrument is based on a cryogenic dual silicon Fabry-Perot etalon designed to achieve high signal-to-noise Fraunhofer line profile measurements. Very large aperture silicon Fabry-Perot etalons and fast camera optics make these measurements possible. The results of the absorption line profile measurements will provide a model-free measure of the zodiacal light intensity in the near infrared. Knowledge of the zodiacal light brightness is crucial for accurate subtraction of this foreground when measuring the extragalactic background light. We present the final design of the instrument and the first results of its performance.
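    The line-profile measurement rests on the etalon's transmission being a sharply peaked function of the round-trip phase. A minimal sketch of the ideal (lossless) Fabry-Perot Airy function, with an illustrative mirror reflectivity rather than the instrument's actual parameters:

```python
import numpy as np

def airy_transmission(delta, R):
    """Ideal Fabry-Perot transmission vs round-trip phase delta,
    for mirror (intensity) reflectivity R and no absorption."""
    F = 4 * R / (1 - R) ** 2              # coefficient of finesse
    return 1.0 / (1.0 + F * np.sin(delta / 2) ** 2)

R = 0.95                                   # illustrative reflectivity
print(airy_transmission(0.0, R))           # on resonance: T = 1.0
print(airy_transmission(np.pi, R))         # between orders: strongly suppressed
finesse = np.pi * np.sqrt(R) / (1 - R)     # reflective finesse, ~61 here
```

    Scanning the etalon gap sweeps delta across an absorption line, which is how the profile is sampled; the dual-etalon arrangement suppresses the neighboring transmission orders.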

  1. The cartography of Venus with Magellan data

    NASA Technical Reports Server (NTRS)

    Kirk, R. L.; Morgan, H. F.; Russell, J. F.

    1993-01-01

    Maps of Venus based on Magellan data are being compiled at 1:50,000,000, 1:5,000,000 and 1:1,500,000 scales. Topographic contour lines based on radar altimetry data are overprinted on the image maps, along with feature nomenclature. Map controls are based on existing knowledge of the spacecraft orbit; photogrammetric triangulation, a traditional basis for geodetic control for bodies where framing cameras were used, is not feasible with the radar images of Venus. Preliminary synthetic aperture radar (SAR) image maps have some data gaps and cosmetic inconsistencies, which will be corrected on final compilations. Eventual revision of geodetic controls and of the adopted Venusian spin-axis location will result in geometric adjustments, particularly on large-scale maps.

  2. Incorporating Scale-Dependent Fracture Stiffness for Improved Reservoir Performance Prediction

    NASA Astrophysics Data System (ADS)

    Crawford, B. R.; Tsenn, M. C.; Homburg, J. M.; Stehle, R. C.; Freysteinson, J. A.; Reese, W. C.

    2017-12-01

    We present a novel technique for predicting dynamic fracture network response to production-driven changes in effective stress, with the potential for optimizing depletion planning and improving recovery prediction in stress-sensitive naturally fractured reservoirs. A key component of the method is laboratory geomechanics testing of single fractures to develop a scaling relationship between fracture normal stiffness and initial mechanical aperture. The workflow proceeds as follows. Tensile, opening-mode fractures are created in a variety of low-matrix-permeability rocks, with initial unstressed apertures in the micrometer to millimeter range as determined from image analyses of X-ray CT scans. Subsequent hydrostatic compression of these fractured samples, with synchronous radial strain and flow measurement, indicates that both mechanical and hydraulic aperture reduction vary linearly with the natural logarithm of effective normal stress. These stress-sensitive single-fracture laboratory observations are then upscaled to networks whose fracture populations display the frequency-length and length-aperture scaling laws commonly exhibited by natural fracture arrays. Finally, functional relationships between reservoir pressure reduction and fracture network porosity, compressibility, and directional permeabilities generated by such discrete fracture network modeling are exported to the reservoir simulator for improved performance prediction of naturally fractured reservoirs.
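    The laboratory observation above (aperture closing linearly with the log of effective normal stress) can be sketched together with the parallel-plate cubic law. The coefficients below are hypothetical illustration values, not the study's data:

```python
import numpy as np

# Hypothetical coefficients for illustration only:
a0 = 200e-6      # mechanical aperture at the reference stress [m]
m = 25e-6        # closure per natural-log cycle of stress [m]
sigma0 = 1.0     # reference effective normal stress [MPa]

def aperture(sigma_mpa):
    """Aperture closing linearly with ln(stress), per the lab observation."""
    return a0 - m * np.log(sigma_mpa / sigma0)

def fracture_transmissivity(b):
    """Parallel-plate cubic law: transmissivity scales as b^3 / 12."""
    return b ** 3 / 12.0

for s in (1.0, 5.0, 25.0):
    b = aperture(s)
    print(f"{s:5.1f} MPa -> aperture {b * 1e6:6.1f} um, "
          f"T {fracture_transmissivity(b):.3e} m^3")
```

    The cubic dependence is what makes network permeability so sensitive to depletion-driven stress changes: a modest aperture loss per log cycle of stress produces a much larger fractional loss of transmissivity.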

  3. Design of optical system for binocular fundus camera.

    PubMed

    Wu, Jun; Lou, Shiliang; Xiao, Zhitao; Geng, Lei; Zhang, Fang; Wang, Wen; Liu, Mengjia

    2017-12-01

    A non-mydriatic optical system for a binocular fundus camera is presented. It can capture two images of the same fundus retinal region from different angles at the same time, and can be used for three-dimensional reconstruction of the fundus. It is composed of an imaging system and an illumination system. In the imaging system, the Gullstrand-Le Grand eye model is used to simulate the normal human eye, and a schematic eye model is used to test the influence of ametropia on imaging quality. An annular aperture and a black-dot board are added to the illumination system so that it eliminates stray light produced by corneal-reflected light and the ophthalmoscopic lens. Simulation results show that the MTF of each field of view at the cut-off frequency of 90 lp/mm is greater than 0.2, the distortion is -2.7%, the field curvature is less than 0.1 mm, and the Airy disc radius is 3.25 μm. This system has a strong ability of chromatic aberration correction and focusing, and can image the human fundus clearly over a diopter range of -10 D to +6 D (1 D = 1 m⁻¹).

  4. Nanosatellite optical downlink experiment: design, simulation, and prototyping

    NASA Astrophysics Data System (ADS)

    Clements, Emily; Aniceto, Raichelle; Barnes, Derek; Caplan, David; Clark, James; Portillo, Iñigo del; Haughwout, Christian; Khatsenko, Maxim; Kingsbury, Ryan; Lee, Myron; Morgan, Rachel; Twichell, Jonathan; Riesing, Kathleen; Yoon, Hyosang; Ziegler, Caleb; Cahoy, Kerri

    2016-11-01

    The nanosatellite optical downlink experiment (NODE) implements a free-space optical communications (lasercom) capability on a CubeSat platform that can support low Earth orbit (LEO) to ground downlink rates >10 Mbps. A primary goal of NODE is to leverage commercially available technologies to provide a scalable and cost-effective alternative to radio-frequency-based communications. The NODE transmitter uses a 200-mW 1550-nm master-oscillator power-amplifier design using power-efficient M-ary pulse position modulation. To facilitate pointing the 0.12-deg downlink beam, NODE augments spacecraft body pointing with a microelectromechanical fast steering mirror (FSM) and uses an 850-nm uplink beacon to an onboard CCD camera. The 30-cm aperture ground telescope uses an infrared camera and FSM for tracking to an avalanche photodiode detector-based receiver. Here, we describe our approach to transition prototype transmitter and receiver designs to a full end-to-end CubeSat-scale system. This includes link budget refinement, drive electronics miniaturization, packaging reduction, improvements to pointing and attitude estimation, implementation of modulation, coding, and interleaving, and ground station receiver design. We capture trades and technology development needs and outline plans for integrated system ground testing.
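    A back-of-the-envelope link budget illustrates why the 0.12-deg beam and 30-cm receive aperture quoted above close the link. The sketch below uses rough gain approximations and an assumed 1000 km slant range; it is not NODE's actual budget.

```python
import math

# Illustrative numbers (the slant range and gain models are assumptions):
p_tx_dbm = 10 * math.log10(200)        # 200 mW transmitter -> ~23 dBm
wavelength = 1550e-9                   # m
link_range = 1000e3                    # m, assumed LEO slant range
div_full = 0.12 * math.pi / 180        # rad, full-angle downlink divergence
rx_aperture = 0.30                     # m, ground telescope diameter

# Rough divergence-limited transmit gain, diffraction receive gain, and
# free-space path loss, all in dB:
g_tx_db = 10 * math.log10(16 / div_full ** 2)
g_rx_db = 20 * math.log10(math.pi * rx_aperture / wavelength)
fspl_db = 20 * math.log10(4 * math.pi * link_range / wavelength)

p_rx_dbm = p_tx_dbm + g_tx_db + g_rx_db - fspl_db
print(f"Received power ~ {p_rx_dbm:.1f} dBm "
      "(ideal: no pointing, optics, or atmospheric losses)")
```

    The result lands in the tens of nanowatts regime, comfortably above an avalanche photodiode's sensitivity at ~10 Mbps, which is the margin real budgets then spend on pointing and atmospheric losses.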

  5. Activation and control of microlens liquid arrays on functionalized polar electric crystal substrates by electro-wetting effect and temperature

    NASA Astrophysics Data System (ADS)

    Ferraro, Pietro; Grilli, Simonetta; Miccio, Lisa; Vespini, Veronica; Finizio, Andrea; De Nicola, Sergio

    2008-11-01

    In recent years a variety of liquid-based optical elements have been conceived, designed and fabricated, including for commercial products such as digital cameras and cellular phone cameras. The impressive development of microfluidic systems in conjunction with optics has led to the creation of a new field of investigation named optofluidics. Optofluidics, among other topics, deals with methods for realizing liquid micro-lenses, and a variety of liquid micro-lenses have been designed and realized using different configurations. We demonstrate that a lensing effect can be obtained in an open microfluidic system by using a thin layer of liquid on a polar electric crystal such as lithium niobate (LiNbO3). Electrowetting patterning on the LiNbO3 surface is obtained by the pyroelectric effect, in a simple but reliable electrode-less and circuit-less configuration: the electrodes are intrinsically embedded in the substrate. The material is functionalized by means of a micro-engineered electric field poling process. A lens array with variable focus has been demonstrated with a large number of lens elements (10x10) on a micrometric scale (single-lens aperture of 100 microns).

  6. Computational imaging with a balanced detector.

    PubMed

    Soldevila, F; Clemente, P; Tajahuerce, E; Uribe-Patarroyo, N; Andrés, P; Lancis, J

    2016-06-29

    Single-pixel cameras make it possible to obtain images in a wide range of challenging scenarios, including broad regions of the electromagnetic spectrum and through scattering media. However, several drawbacks remain that single-pixel architectures must address, such as acquisition speed and imaging in the presence of ambient light. In this work we introduce balanced detection in combination with simultaneous complementary illumination in a single-pixel camera. This approach makes it possible to acquire information even when the power of the parasitic signal is higher than that of the signal itself. Furthermore, this novel detection scheme increases both the frame rate and the signal-to-noise ratio of the system. By means of a fast digital micromirror device together with a low-numerical-aperture collecting system, we are able to produce a live-feed video with a resolution of 64 × 64 pixels at 5 Hz. With advanced undersampling techniques, such as compressive sensing, we can acquire information at rates of 25 Hz. By using this strategy, we foresee real-time biological imaging with large-area detectors in conditions where array sensors are unable to operate properly, such as infrared imaging and dealing with objects embedded in turbid media.
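    The complementary-illumination idea can be sketched in a few lines: each ±1 Hadamard pattern is displayed as a pair of complementary binary masks, and the balanced detector outputs their difference, which cancels common-mode ambient light exactly. A toy 8×8 reconstruction (this is a generic single-pixel model, not the authors' optical setup):

```python
import numpy as np

def sylvester_hadamard(order):
    """Sylvester-construction Hadamard matrix of size 2**order, entries +/-1."""
    H = np.array([[1]])
    for _ in range(order):
        H = np.block([[H, H], [H, -H]])
    return H

n = 8                                   # toy 8x8 scene -> 64 measurements
x = np.random.default_rng(0).random(n * n)
H = sylvester_hadamard(6)               # 64 x 64 measurement matrix

# Complementary masks P+ = (1+H)/2 and P- = (1-H)/2 are displayed on the DMD;
# the balanced detector reports the difference of the two bucket signals,
# which equals H @ x and cancels any common-mode ambient term.
ambient = 5.0
y = ((1 + H) / 2 @ x + ambient) - ((1 - H) / 2 @ x + ambient)
x_rec = H.T @ y / (n * n)               # H.T @ H = (n*n) * I for Hadamard H
print(np.allclose(x_rec, x))            # True
```

    Replacing the full inverse with a compressive-sensing solver over a subset of the rows of H is what buys the 5 Hz to 25 Hz speedup reported above.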

  7. Modification of the Miyake-Apple technique for simultaneous anterior and posterior video imaging of wet laboratory-based corneal surgery.

    PubMed

    Tan, Johnson C H; Meadows, Howard; Gupta, Aanchal; Yeung, Sonia N; Moloney, Gregory

    2014-03-01

    The aim of this study was to describe a modification of the Miyake-Apple posterior video analysis for the simultaneous visualization of the anterior and posterior corneal surfaces during wet laboratory-based deep anterior lamellar keratoplasty (DALK). A human donor corneoscleral button was affixed to a microscope slide and placed onto a custom-made mounting box. A big bubble DALK was performed on the cornea in the wet laboratory. An 11-diopter intraocular lens was positioned over the aperture of the back camera of an iPhone. This served to video record the posterior view of the corneoscleral button during the big bubble formation. An overhead operating microscope with an attached video camcorder recorded the anterior view during the surgery. The anterior and posterior views of the wet laboratory-based DALK surgery were simultaneously captured and edited using video editing software. The formation of the big bubble can be studied. This video recording camera system has the potential to act as a valuable research and teaching tool in corneal lamellar surgery, especially in the behavior of the big bubble formation in DALK.

  8. Computational imaging with a balanced detector

    NASA Astrophysics Data System (ADS)

    Soldevila, F.; Clemente, P.; Tajahuerce, E.; Uribe-Patarroyo, N.; Andrés, P.; Lancis, J.

    2016-06-01

    Single-pixel cameras make it possible to obtain images in a wide range of challenging scenarios, including broad regions of the electromagnetic spectrum and through scattering media. However, several drawbacks remain that single-pixel architectures must address, such as acquisition speed and imaging in the presence of ambient light. In this work we introduce balanced detection in combination with simultaneous complementary illumination in a single-pixel camera. This approach makes it possible to acquire information even when the power of the parasitic signal is higher than that of the signal itself. Furthermore, this novel detection scheme increases both the frame rate and the signal-to-noise ratio of the system. By means of a fast digital micromirror device together with a low-numerical-aperture collecting system, we are able to produce a live-feed video with a resolution of 64 × 64 pixels at 5 Hz. With advanced undersampling techniques, such as compressive sensing, we can acquire information at rates of 25 Hz. By using this strategy, we foresee real-time biological imaging with large-area detectors in conditions where array sensors are unable to operate properly, such as infrared imaging and dealing with objects embedded in turbid media.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leary, T.J.; Lamb, A.

    The Department of Energy's Office of Arms Control and Non-Proliferation (NN-20) has developed a suite of airborne remote sensing systems that simultaneously collect coincident data from a US Navy P-3 aircraft. The primary objective of the Airborne Multisensor Pod System (AMPS) Program is "to collect multisensor data that can be used for data research, both to reduce interpretation problems associated with data overload and to develop information products more complete than can be obtained from any single sensor." The sensors are housed in wing-mounted pods and include: a Ku-Band Synthetic Aperture Radar; a CASI Hyperspectral Imager; a Daedalus 3600 Airborne Multispectral Scanner; a Wild Heerbrugg RC-30 motion-compensated large-format camera; various high-resolution, light-intensified and thermal video cameras; and several experimental sensors (e.g. the Portable Hyperspectral Imager for Low-Light Spectroscopy (PHILLS)). Over the past year or so, the Coastal Marine Resource Assessment (CAMRA) group at the Florida Department of Environmental Protection's Marine Research Institute (FMRI) has been working with the Department of Energy through the Naval Research Laboratory to develop applications and products from existing data. Considerable effort has been spent identifying image format integration parameters. 2 refs., 3 figs., 2 tabs.

  10. Computational imaging with a balanced detector

    PubMed Central

    Soldevila, F.; Clemente, P.; Tajahuerce, E.; Uribe-Patarroyo, N.; Andrés, P.; Lancis, J.

    2016-01-01

    Single-pixel cameras make it possible to obtain images in a wide range of challenging scenarios, including broad regions of the electromagnetic spectrum and through scattering media. However, several drawbacks remain that single-pixel architectures must address, such as acquisition speed and imaging in the presence of ambient light. In this work we introduce balanced detection in combination with simultaneous complementary illumination in a single-pixel camera. This approach makes it possible to acquire information even when the power of the parasitic signal is higher than that of the signal itself. Furthermore, this novel detection scheme increases both the frame rate and the signal-to-noise ratio of the system. By means of a fast digital micromirror device together with a low-numerical-aperture collecting system, we are able to produce a live-feed video with a resolution of 64 × 64 pixels at 5 Hz. With advanced undersampling techniques, such as compressive sensing, we can acquire information at rates of 25 Hz. By using this strategy, we foresee real-time biological imaging with large-area detectors in conditions where array sensors are unable to operate properly, such as infrared imaging and dealing with objects embedded in turbid media. PMID:27353733

  11. Conceptual design of the CZMIL data acquisition system (DAS): integrating a new bathymetric lidar with a commercial spectrometer and metric camera for coastal mapping applications

    NASA Astrophysics Data System (ADS)

    Fuchs, Eran; Tuell, Grady

    2010-04-01

    The CZMIL system is a new generation airborne bathymetric and topographic remote sensing platform composed of an active lidar, passive hyperspectral imager, high resolution frame camera, navigation system, and storage media running on a Linux-based Gigabit Ethernet network. The lidar is a hybrid scanned-flash system employing a 10 kHz green laser and novel circular scanner, with a large aperture receiver (0.20 m) having multiple channels. A PMT-based segmented detector is used on one channel to support simultaneous topographic and bathymetric data collection, and multiple fields of view are measured to support bathymetric measurements. The measured laser returns are digitized at 1 GHz to produce the waveforms required for ranging measurements, and unique data compression and storage techniques are used to address the large data volume. Simulated results demonstrate CZMIL's capability to discriminate bottom and surface returns in very shallow water conditions without compromising performance in deep water. Simulated waveforms are compared with measured data from the SHOALS system and show promising expected results. The system's prototype is expected to be completed by the end of 2010, and ready for initial calibration tests in the spring of 2010.
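    The bathymetric ranging step described above reduces to converting the delay between surface and bottom returns in the digitized waveform into depth, accounting for the slower speed of light in water. A minimal sketch (the refractive index value is a generic assumption for green wavelengths, not a CZMIL calibration constant):

```python
C = 299_792_458.0      # speed of light in vacuum, m/s
N_WATER = 1.333        # assumed refractive index of water at green wavelengths

def water_depth(dt_surface_to_bottom_s):
    """Depth from the surface-to-bottom return delay in a bathymetric
    lidar waveform: d = c * dt / (2 * n_water), for a near-nadir beam."""
    return C * dt_surface_to_bottom_s / (2 * N_WATER)

# 1 GHz digitization -> 1 ns sample spacing, i.e. ~11 cm of depth per sample
print(f"{water_depth(1e-9) * 100:.1f} cm of water depth per 1-ns sample")
```

    This is why 1 GHz digitization matters for the very-shallow-water case: resolving surface and bottom returns separated by only a few samples is what sets the minimum measurable depth.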

  12. Invariant principles of speech motor control that are not language-specific.

    PubMed

    Chakraborty, Rahul

    2012-12-01

    Bilingual speakers must learn to modify their speech motor control mechanism based on the linguistic parameters and rules specified by the target language. This study examines if there are aspects of speech motor control which remain invariant regardless of the first (L1) and second (L2) language targets. Based on the age of academic exposure and proficiency in L2, 21 Bengali-English bilingual participants were classified into high (n = 11) and low (n = 10) L2 (English) proficiency groups. Using the Optotrak 3020 motion sensitive camera system, the lips and jaw movements were recorded while participants produced Bengali (L1) and English (L2) sentences. Based on kinematic analyses of the lip and jaw movements, two different variability measures (i.e., lip aperture and lower lip/jaw complex) were computed for English and Bengali sentences. Analyses demonstrated that the two groups of bilingual speakers produced lip aperture complexes (a higher order synergy) that were more consistent in co-ordination than were the lower lip/jaw complexes (a lower order synergy). Similar findings were reported earlier in monolingual English speakers by Smith and Zelaznik. Thus, this hierarchical organization may be viewed as a fundamental principle of speech motor control, since it is maintained even in bilingual speakers.

  13. Studies of Bagley Icefield during surge and Black Rapids Glacier, Alaska, using spaceborne SAR interferometry

    NASA Astrophysics Data System (ADS)

    Fatland, Dennis Robert

    1998-12-01

    This thesis presents studies of two temperate valley glaciers---Bering Glacier in the Chugach-St. Elias Mountains, South Central Alaska, and Black Rapids Glacier in the Alaska Range, Interior Alaska---using differential spaceborne radar interferometry. The first study was centered on the 1993--95 surge of Bering Glacier and the resultant ice dynamics on its accumulation area, the Bagley Icefield. The second study site was chosen for purposes of comparing the interferometry results with conventional field measurements, particularly camera survey data and airborne laser altimetry. A comprehensive suite of software was written to interferometrically process synthetic aperture radar (SAR) data in order to derive estimates of surface elevation and surface velocity on these subject glaciers. In addition to these results, the data revealed unexpected but fairly common concentric rings called 'phase bull's-eyes', image features typically 0.5 to 4 km in diameter located over the central part of various glaciers. These bull's-eyes led to a hypothetical model in which they were interpreted to indicate transitory instances of high subglacial water pressure that locally lift the glacier from its bed by several centimeters. This model is associated with previous findings about the nature of glacier bed hydrology and glacier surging. In addition to the dynamical analysis presented herein, this work is submitted as a contribution to the ongoing development of spaceborne radar interferometry as a glaciological tool.
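    The centimeter-scale uplift inferred from the phase bull's-eyes follows from the standard conversion between differential interferometric phase and line-of-sight displacement. A sketch, assuming a C-band sensor such as ERS (the wavelength is an assumption; the thesis does not restate it here):

```python
import math

WAVELENGTH_C_BAND = 0.0566   # m, ERS-class C-band SAR (assumed)

def los_displacement(phase_rad, wavelength=WAVELENGTH_C_BAND):
    """Line-of-sight displacement from differential interferometric phase:
    d = phi * lambda / (4 * pi). One full 2*pi fringe = lambda / 2."""
    return phase_rad * wavelength / (4 * math.pi)

print(f"One fringe = {los_displacement(2 * math.pi) * 100:.2f} cm LOS")
```

    A bull's-eye containing one or two closed fringes therefore corresponds to a few centimeters of vertical motion, consistent with the subglacial water-pressure interpretation above.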

  14. Cadastral Audit and Assessments Using Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Cunningham, K.; Walker, G.; Stahlke, E.; Wilson, R.

    2011-09-01

    Ground surveys and remote sensing are integral to establishing fair and equitable property valuations necessary for real property taxation. The International Association of Assessing Officers (IAAO) has embraced aerial and street-view imaging as part of its standards related to property tax assessments and audits. New technologies, including unmanned aerial systems (UAS) paired with imaging sensors, will become more common as local governments work to ensure their cadastre and tax rolls are both accurate and complete. Trends in mapping technology have seen an evolution in platforms from large, expensive manned aircraft to very small, inexpensive UAS. Traditional methods of photogrammetry have also given way to new equipment and sensors: digital cameras, infrared imagers, light detection and ranging (LiDAR) laser scanners, and now synthetic aperture radar (SAR). At the University of Alaska Fairbanks (UAF), we work extensively with unmanned aerial systems equipped with each of these newer sensors. UAF has significant experience flying unmanned systems in the US National Airspace, having begun in 1969 with scientific rockets and expanded to unmanned aircraft in 2003. Ongoing field experience allows UAF to partner effectively with outside organizations to test and develop leading-edge research in UAS and remote sensing. This presentation will discuss our research related to various sensors and payloads for mapping. We will also share our experience with UAS and optical systems for creating some of the first cadastral surveys in rural Alaska.

  15. Nanotip Carpets as Antireflection Surfaces

    NASA Technical Reports Server (NTRS)

    Bae, Youngsam; Mobasser, Sohrab; Manohara, Harish; Lee, Choonsup

    2008-01-01

    Carpet-like random arrays of metal-coated silicon nanotips have been shown to be effective as antireflection surfaces. Now undergoing development for incorporation into Sun sensors that would provide guidance for robotic exploratory vehicles on Mars, nanotip carpets of this type could also have many uses on Earth as antireflection surfaces in instruments that handle or detect ultraviolet, visible, or infrared light. In the original Sun-sensor application, what is required is an array of 50-micron-diameter apertures on what is otherwise an opaque, minimally reflective surface, as needed to implement a miniature multiple-pinhole camera. The process for fabrication of an antireflection nanotip carpet for this application (see Figure 1) includes, and goes somewhat beyond, the process described in "A New Process for Fabricating Random Silicon Nanotips" (NPO-40123), NASA Tech Briefs, Vol. 28, No. 1 (November 2004), page 62. In the first step, which is not part of the previously reported process, photolithography is performed to deposit etch masks to define the 50-micron apertures on a silicon substrate. In the second step, which is part of the previously reported process, the non-masked silicon area between the apertures is subjected to reactive ion etching (RIE) under a special combination of conditions that results in the growth of fluorine-based compounds in randomly distributed formations, known in the art as "polymer RIE grass," that have dimensions of the order of microns. The polymer RIE grass formations serve as microscopic etch masks during the next step, in which deep reactive ion etching (DRIE) is performed. What remains after DRIE is the carpet of nanotips: high-aspect-ratio peaks whose tips have radii of the order of nanometers. Next, the nanotip array is evaporatively coated with Cr/Au to enhance the absorption of light (more specifically, infrared light in the Sun-sensor application).
The photoresist etch masks protecting the apertures are then removed by dipping the substrate into acetone. Finally, for the Sun-sensor application, the back surface of the substrate is coated with a 57-nm-thick layer of Cr for attenuation of sunlight.

  16. SU-E-T-453: A Novel Daily QA System for Robotic Image Guided Radiosurgery with Variable Aperture Collimator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, L; Nelson, B

    Purpose: A novel end-to-end system using a CCD camera and a scintillator based phantom that is capable of measuring the beam-by-beam delivery accuracy of Robotic Radiosurgery has been developed and reported in our previous work. This work investigates its application to end-to-end type daily QA for Robotic Radiosurgery (Cyberknife) with the Variable Aperture Collimator (Iris). Methods: The phantom was first scanned with a CT scanner at 0.625 mm slice thickness and exported to the Cyberknife MultiPlan (v4.6) treatment planning system. An isocentric treatment plan was created consisting of ten beams of 25 Monitor Units each using Iris apertures of 7.5, 10, 15, 20, and 25 mm. The plan was delivered six times in two days on the Cyberknife G4 system with fiducial tracking on the four metal fiducials embedded in the phantom, with re-positioning between the measurements. The beam vectors (X, Y, Z) were measured and compared with the plan from the machine delivery file (XML file). The Iris apertures (FWHM) were measured from the beam flux map and compared with the commissioning data. Results: The average beam positioning accuracies of the six deliveries are 0.71 ± 0.40 mm, 0.72 ± 0.44 mm, 0.74 ± 0.42 mm, 0.70 ± 0.40 mm, 0.79 ± 0.44 mm and 0.69 ± 0.41 mm, respectively. Radiation beam width (FWHM) variations are within ±0.05 mm, and they agree with the commissioning data within 0.22 mm. The delivery time for the plan is about 7 minutes and the results are given instantly. Conclusion: The experimental results agree with the stated sub-millimeter delivery accuracy of the Cyberknife system. Beam FWHM variations comply with the 0.2 mm accuracy of the Iris collimator at SAD. The XRV-100 system has proven to be a powerful tool in performing end-to-end type tests for Robotic Image Guided Radiosurgery Daily QA.

  17. Motion coherence affects human perception and pursuit similarly.

    PubMed

    Beutter, B R; Stone, L S

    2000-01-01

    Pursuit and perception both require accurate information about the motion of objects. Recovering the motion of objects by integrating the motion of their components is a difficult visual task. Successful integration produces coherent global object motion, while a failure to integrate leaves the incoherent local motions of the components unlinked. We compared the ability of perception and pursuit to perform motion integration by measuring direction judgments and the concomitant eye-movement responses to line-figure parallelograms moving behind stationary rectangular apertures. The apertures were constructed such that only the line segments corresponding to the parallelogram's sides were visible; thus, recovering global motion required the integration of the local segment motion. We investigated several potential motion-integration rules by using stimuli with different object, vector-average, and line-segment terminator-motion directions. We used an oculometric decision rule to directly compare direction discrimination for pursuit and perception. For visible apertures, the percept was a coherent object, and both the pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was incoherently moving segments, and both the pursuit and perceptual performance were close to the terminator-motion prediction. Furthermore, both psychometric and oculometric direction thresholds were much higher for invisible apertures than for visible apertures. We constructed a model in which both perception and pursuit are driven by a shared motion-processing stage, with perception having an additional input from an independent static-processing stage. Model simulations were consistent with our perceptual and oculomotor data. Based on these results, we propose the use of pursuit as an objective and continuous measure of perceptual coherence. 
Our results support the view that pursuit and perception share a common motion-integration stage, perhaps within areas MT or MST.

  18. Reliability of dynamometric passive properties of the pelvic floor muscles in postmenopausal women with stress urinary incontinence.

    PubMed

    Morin, Mélanie; Gravel, Denis; Bourbonnais, Daniel; Dumoulin, Chantale; Ouellet, Stéphane

    2008-01-01

    The passive properties of the pelvic floor muscles (PFM) might play a role in stress urinary incontinence (SUI) pathophysiology. The aim was to investigate the test-retest reliability of the dynamometric passive properties of the PFM in postmenopausal women with SUI. Thirty-two SUI postmenopausal women attended two sessions 2 weeks apart; in each session, the measurements were repeated twice. The pelvic floor musculature was evaluated under four conditions: (1) forces recorded at minimal aperture (initial passive resistance); (2) passive resistance at maximal aperture; (3) five lengthening and shortening cycles, during which forces and passive elastic stiffness (PES) were evaluated at different vaginal apertures and hysteresis was calculated; and (4) the percentage of passive-resistance loss after 1 min of sustained stretching. The generalizability theory was used to calculate two reliability estimates, the dependability index (Phi) and the standard error of measurement (SEM), for one session involving one measurement or the mean of two measurements. Overall, the reliability of the passive properties was good, with dependability indices of 0.75-0.93. The SEMs for forces and PES were 0.24-0.67 N and 0.03-0.10 N/mm, respectively, for mean, maximal and 20-mm apertures, representing an error between 13% and 23%. Passive forces at minimal aperture showed lower reliability (Phi = 0.51-0.57) compared with other vaginal openings. The aperture at a common force of 0.5 N was the only parameter demonstrating poor reliability (Phi = 0.35). This new approach for assessing PFM passive properties showed enough reliability for its inclusion in the PFM assessment of postmenopausal women with SUI to be highly recommended. (c) 2008 Wiley-Liss, Inc.
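    The relationship between the two reliability estimates reported above follows the classical-test-theory identity SEM = SD × sqrt(1 − r), with the dependability index Phi standing in for the reliability coefficient. A sketch with hypothetical numbers (not the study's raw data):

```python
import math

def sem(between_subject_sd, reliability):
    """Standard error of measurement: SEM = SD * sqrt(1 - r),
    where r is a reliability (here, dependability Phi) coefficient."""
    return between_subject_sd * math.sqrt(1 - reliability)

# Hypothetical illustration values:
sd_force = 1.2        # N, between-subject SD of a passive-force measure
phi = 0.90            # dependability index
print(f"SEM = {sem(sd_force, phi):.2f} N")   # higher Phi -> smaller SEM
```

    This is why the minimal-aperture forces (Phi = 0.51-0.57) are flagged as less reliable: at a fixed between-subject SD, a lower Phi inflates the SEM and hence the percent error of an individual measurement.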

  19. Motion coherence affects human perception and pursuit similarly

    NASA Technical Reports Server (NTRS)

    Beutter, B. R.; Stone, L. S.

    2000-01-01

    Pursuit and perception both require accurate information about the motion of objects. Recovering the motion of objects by integrating the motion of their components is a difficult visual task. Successful integration produces coherent global object motion, while a failure to integrate leaves the incoherent local motions of the components unlinked. We compared the ability of perception and pursuit to perform motion integration by measuring direction judgments and the concomitant eye-movement responses to line-figure parallelograms moving behind stationary rectangular apertures. The apertures were constructed such that only the line segments corresponding to the parallelogram's sides were visible; thus, recovering global motion required the integration of the local segment motion. We investigated several potential motion-integration rules by using stimuli with different object, vector-average, and line-segment terminator-motion directions. We used an oculometric decision rule to directly compare direction discrimination for pursuit and perception. For visible apertures, the percept was a coherent object, and both the pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was incoherently moving segments, and both the pursuit and perceptual performance were close to the terminator-motion prediction. Furthermore, both psychometric and oculometric direction thresholds were much higher for invisible apertures than for visible apertures. We constructed a model in which both perception and pursuit are driven by a shared motion-processing stage, with perception having an additional input from an independent static-processing stage. Model simulations were consistent with our perceptual and oculomotor data. Based on these results, we propose the use of pursuit as an objective and continuous measure of perceptual coherence. 
Our results support the view that pursuit and perception share a common motion-integration stage, perhaps within areas MT or MST.
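
    The object-motion prediction for such aperture stimuli is commonly computed with an intersection-of-constraints (IOC) rule: each visible segment constrains only the velocity component along its normal, and the global velocity is the least-squares solution consistent with all constraints. A minimal sketch of that rule (not the authors' code; function and variable names are ours):

```python
def ioc_velocity(normals, normal_speeds):
    """Least-squares intersection-of-constraints solution in 2D.

    normals: unit normals (nx, ny) of the moving line segments.
    normal_speeds: measured speed of each segment along its normal.
    Solves min over v of sum_i (n_i . v - c_i)^2 via the 2x2 normal equations.
    """
    sxx = sum(nx * nx for nx, ny in normals)
    sxy = sum(nx * ny for nx, ny in normals)
    syy = sum(ny * ny for nx, ny in normals)
    bx = sum(nx * c for (nx, ny), c in zip(normals, normal_speeds))
    by = sum(ny * c for (nx, ny), c in zip(normals, normal_speeds))
    det = sxx * syy - sxy * sxy
    if abs(det) < 1e-12:
        raise ValueError("constraints do not determine a unique velocity")
    return ((syy * bx - sxy * by) / det, (sxx * by - sxy * bx) / det)
```

    With fewer than two non-parallel constraints the system is degenerate, which is the aperture problem itself: a single segment's motion is ambiguous.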

  20. Viewpoint matters: objective performance metrics for surgeon endoscope control during robot-assisted surgery.

    PubMed

    Jarc, Anthony M; Curet, Myriam J

    2017-03-01

    Effective visualization of the operative field is vital to surgical safety and education. However, additional metrics for visualization are needed to complement other common measures of surgeon proficiency, such as time or errors. Unlike other surgical modalities, robot-assisted minimally invasive surgery (RAMIS) enables data-driven feedback to trainees through measurement of camera adjustments. The purpose of this study was to validate and quantify the importance of novel camera metrics during RAMIS. New (n = 18), intermediate (n = 8), and experienced (n = 13) surgeons completed 25 virtual reality simulation exercises on the da Vinci Surgical System. Three camera metrics were computed for all exercises and compared to conventional efficiency measures. Both camera metrics and efficiency metrics showed construct validity (p < 0.05) across most exercises (camera movement frequency 23/25, camera movement duration 22/25, camera movement interval 19/25, overall score 24/25, completion time 25/25). Camera metrics differentiated new and experienced surgeons across all tasks, as did the efficiency metrics. Finally, camera metrics significantly (p < 0.05) correlated with completion time (camera movement frequency 21/25, camera movement duration 21/25, camera movement interval 20/25) and overall score (camera movement frequency 20/25, camera movement duration 19/25, camera movement interval 20/25) for most exercises. We demonstrate construct validity of novel camera metrics and correlation between camera metrics and efficiency metrics across many simulation exercises. We believe camera metrics could be used to improve RAMIS proficiency-based curricula.
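
    The three camera metrics are straightforward to compute from logged camera-movement events; the sketch below is our reconstruction of plausible definitions (the paper's exact formulas are not given in the abstract):

```python
def camera_metrics(events, task_duration):
    """Compute simple camera-adjustment metrics.

    events: list of (start, end) camera-movement times in seconds,
    sorted and non-overlapping; task_duration: total exercise time.
    """
    durations = [end - start for start, end in events]
    intervals = [nxt[0] - prev[1] for prev, nxt in zip(events, events[1:])]
    return {
        "movement_frequency": len(events) / task_duration,  # movements per second
        "mean_movement_duration": sum(durations) / len(durations) if durations else 0.0,
        "mean_movement_interval": sum(intervals) / len(intervals) if intervals else None,
    }
```

    For example, three movements totalling 4 s within a 20 s task give a frequency of 0.15 movements/s; a trainee who rarely adjusts the endoscope would show a low frequency and long intervals.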

  1. Recommended survey designs for occupancy modelling using motion-activated cameras: insights from empirical wildlife data

    PubMed Central

    Lewis, Jesse S.; Gerber, Brian D.

    2014-01-01

    Motion-activated cameras are a versatile tool that wildlife biologists can use for sampling wild animal populations to estimate species occurrence. Occupancy modelling provides a flexible framework for the analysis of these data, explicitly recognizing that, when a species occupies an area, the probability of detecting it is often less than one. Despite the number of studies using camera data in an occupancy framework, there is only limited guidance from the scientific literature about survey design trade-offs when using motion-activated cameras. A fuller understanding of these trade-offs will allow researchers to maximise available resources and determine whether the objectives of a monitoring program or research study are achievable. We use an empirical dataset collected from 40 cameras deployed across 160 km² of the Western Slope of Colorado, USA to explore how survey effort (number of cameras deployed and the length of sampling period) affects the accuracy and precision (i.e., error) of the occupancy estimate for ten mammal and three virtual species. We do this using a simulation approach where species occupancy and detection parameters were informed by empirical data from motion-activated cameras. A total of 54 survey designs were considered by varying combinations of sites (10–120 cameras) and occasions (20–120 survey days). Our findings demonstrate that increasing total sampling effort generally decreases error associated with the occupancy estimate, but changing the number of sites or sampling duration can have very different results, depending on whether a species is spatially common or rare (occupancy = ψ) and easy or hard to detect when available (detection probability = p). For rare species with a low probability of detection (i.e., raccoon and spotted skunk) the required survey effort includes maximizing the number of sites and the number of survey days, often to a level that may be logistically unrealistic for many studies. 
For common species with low detection (i.e., bobcat and coyote) the most efficient sampling approach was to increase the number of occasions (survey days). However, for common species that are moderately detectable (i.e., cottontail rabbit and mule deer), occupancy could reliably be estimated with comparatively low numbers of cameras over a short sampling period. We provide general guidelines for reliably estimating occupancy across a range of terrestrial species (rare to common: ψ = 0.175–0.970, and low to moderate detectability: p = 0.003–0.200) using motion-activated cameras. Wildlife researchers/managers with limited knowledge of the relative abundance and likelihood of detection of a particular species can apply these guidelines regardless of location. We emphasize the importance of prior biological knowledge, defined objectives and detailed planning (e.g., simulating different study-design scenarios) for designing effective monitoring programs and research studies. PMID:25210658
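
    The simulation logic behind such design studies can be sketched in a few lines: draw site occupancy with probability ψ, draw detections on each occasion with probability p, and compare an estimate against the truth. This is a toy illustration of the approach, not the authors' code; real occupancy analyses jointly estimate ψ and p by maximum likelihood, whereas the correction below assumes p is known:

```python
import random

def simulate_occupancy(n_sites, n_occasions, psi, p, seed=0):
    """Simulate detection histories and return simple occupancy estimates.

    Naive estimate = fraction of sites with >= 1 detection; the corrected
    estimate divides by P(detect at least once | occupied).
    """
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_sites):
        occupied = rng.random() < psi
        if occupied and any(rng.random() < p for _ in range(n_occasions)):
            detected += 1
    naive = detected / n_sites
    p_star = 1.0 - (1.0 - p) ** n_occasions  # P(>=1 detection | occupied)
    return naive, naive / p_star
```

    Running this over grids of (n_sites, n_occasions) for a given (ψ, p) reproduces the qualitative trade-off the paper describes: for hard-to-detect species, adding occasions shrinks the correction factor, while adding sites shrinks the sampling noise.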

  2. A hydrostatic leak test for water pipeline by using distributed optical fiber vibration sensing system

    NASA Astrophysics Data System (ADS)

    Wu, Huijuan; Sun, Zhenshi; Qian, Ya; Zhang, Tao; Rao, Yunjiang

    2015-07-01

    A hydrostatic leak test for a water pipeline with a distributed optical fiber vibration sensing (DOVS) system based on phase-sensitive OTDR technology is studied in this paper. By monitoring one end of a common communication optical fiber cable, which is laid on the inner wall of the pipe, we can detect and locate water leakages easily. Different apertures under different pressures are tested, and the results show that the DOVS responds well when the aperture is equal to or larger than 4 mm and the inner pressure reaches 0.2 MPa for a steel pipe with DN 91cm×EN 2cm.

  3. A parametric study of various synthetic aperture telescope configurations for coherent imaging applications

    NASA Technical Reports Server (NTRS)

    Harvey, James E.; Wissinger, Alan B.; Bunner, Alan N.

    1986-01-01

    The comparative advantages of synthetic aperture telescopes (SATs) of the segmented primary mirror and common secondary mirror type, on the one hand, and of those employing an array of independent telescopes, on the other, are discussed. The diffraction-limited optical performance of both redundant and nonredundant subaperture configurations is compared in terms of point spread function characteristics and encircled energy plots. Coherent imaging with afocal telescope SATs involves a pupil-mapping operation followed by a Fourier transform. A quantitative analysis of the off-axis optical performance degradation due to pupil-mapping errors is presented, together with the field-dependent effects of residual design aberrations of independent telescopes.

  4. Development of a low-cost sun sensor for nanosatellites

    NASA Astrophysics Data System (ADS)

    Antonello, Andrea; Olivieri, Lorenzo; Francesconi, Alessandro

    2018-03-01

    Sun sensors represent a common and reliable technology for attitude determination, employed in many space missions thanks to their limited size and weight. Typically, two-axis digital Sun sensors employ an array of active pixels arranged behind a small aperture; the position of the sunlight spot allows the direction of the Sun to be determined. With the advent of smaller vehicles such as CubeSats and Nanosats, there is a need to further reduce the size and weight of such devices; as a trade-off, this usually curtails performance. Nowadays, state-of-the-art Sun sensors for CubeSats have resolutions of about 0.5°, with fields of view in the ±45° to ±90° range, and off-the-shelf prices of several thousands of dollars. In this work we introduce a novel low-cost miniaturized Sun sensor, based on a commercial CMOS camera detector; its main feature is its reduced size with respect to state-of-the-art sensors developed from the same technology, making it employable on CubeSats. The sensor consists of a precisely machined pinhole with a 10 μm circular aperture, placed at a distance of 7 mm from the CMOS. The standoff distance and casing design allow for a maximum resolution of less than 0.03°, outperforming most of the products currently available for nano and pico platforms; furthermore, the nature of the technology allows for reduced size and lightweight characteristics. The design, development and laboratory tests of the sensor are introduced here, starting with the definition of the physical model, the geometrical layout and its theoretical resolution; a more accurate model was then developed in order to account for the geometrical deviations and deformations of the pinhole-projected light spot, as well as for background noise and disturbances to the electronics. 
Finally, the laboratory setup is presented along with the test campaigns: the results obtained are compared with the simulations, allowing for the validation of the theoretical model.
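
    The geometry behind the quoted resolution is simple pinhole trigonometry: the Sun angle follows from the light-spot displacement and the 7 mm standoff, and the angular resolution from the smallest resolvable spot displacement. A sketch under those assumptions (the centroiding factor and example pixel pitch are ours, not from the abstract):

```python
import math

STANDOFF_MM = 7.0  # pinhole-to-CMOS distance quoted in the abstract

def sun_angle_deg(spot_offset_mm, standoff_mm=STANDOFF_MM):
    """One-axis Sun angle from the light-spot displacement on the detector."""
    return math.degrees(math.atan2(spot_offset_mm, standoff_mm))

def angular_resolution_deg(pixel_pitch_mm, subpixel_factor=1.0,
                           standoff_mm=STANDOFF_MM):
    """Smallest resolvable angle near boresight; subpixel_factor > 1 models
    centroiding the spot to a fraction of a pixel (hypothetical)."""
    return math.degrees(math.atan2(pixel_pitch_mm / subpixel_factor, standoff_mm))
```

    For instance, an assumed 3.6 μm pixel pitch gives atan(0.0036/7) ≈ 0.029°, consistent with the quoted sub-0.03° figure; the actual detector pitch is not stated in the abstract.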

  5. Multi-Angle Snowflake Camera Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stuefer, Martin; Bailey, J.

    2016-07-01

    The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36º. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' fields of view within a 10º angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.

  6. Pressure letdown method and device for coal conversion systems

    NASA Technical Reports Server (NTRS)

    Kendal, J. M.; Walsh, J. V. (Inventor)

    1983-01-01

    In combination with a reactor for a coal utilization system, a pressure letdown device accepts, from the reactor, a polyphase fluid at an entrance pressure and an entrance velocity, and discharges the fluid from the device at a discharge pressure substantially lower than the entrance pressure and at a discharge temperature and a discharge velocity substantially equal to the entrance temperature and entrance velocity. The device is characterized by a series of pressure letdown stages including several symmetrical baffles disposed in coaxially nested alignment. In each baffle, several ports or apertures of uniform dimensions are defined. The number of ports or apertures in each baffle is unique with respect to the number defined in each of the other baffles. The mass flow rate for each port is a function of the area of the port, the pressure of the fluid applied to the port, and a common pressure ratio established across the ports.

  7. Chromatic energy filter and characterization of laser-accelerated proton beams for particle therapy

    NASA Astrophysics Data System (ADS)

    Hofmann, Ingo; Meyer-ter-Vehn, Jürgen; Yan, Xueqing; Al-Omari, Husam

    2012-07-01

    The application of laser-accelerated protons or ions for particle therapy has to cope with relatively large energy and angular spreads as well as possibly significant random fluctuations. We suggest a method for combined focusing and energy selection, which is an effective alternative to the commonly considered dispersive energy selection by magnetic dipoles. Our method is based on the chromatic effect of a magnetic solenoid (or any other energy-dependent focusing device) in combination with an aperture to select a certain energy width defined by the aperture radius. It is applied to an initial 6D phase space distribution of protons following the simulation output of a Radiation Pressure Acceleration model. Analytical formulas for the selection aperture and chromatic emittance are confirmed by simulation results using the TRACEWIN code. The energy selection is supported by properly placed scattering targets to remove the imprint of the chromatic effect on the beam and to enable well-controlled, shot-to-shot reproducible energy and transverse density profiles.

  8. Large deviation principle at work: Computation of the statistical properties of the exact one-point aperture mass

    NASA Astrophysics Data System (ADS)

    Reimberg, Paulo; Bernardeau, Francis

    2018-01-01

    We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to arbitrary functionals of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (Map), a common observable in cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced shear approximation (replacing the reduced shear with the shear itself). We are able to quantify precisely how this latter approximation affects the Map statistical properties. In particular, we derive the corrective term for the skewness of the Map and reconstruct its one-point PDF.
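
    For reference, the aperture mass is built from a compensated filter: Map(θ₀) = ∫ d²θ U(|θ|) κ(θ), with ∫ U(θ) θ dθ = 0 so that a constant convergence contributes nothing. The sketch below checks that compensation property numerically for the common polynomial filter family of Schneider et al.; this is our choice of filter for illustration, not the profiles used in the paper:

```python
import math

def U(x):
    """Compensated aperture-mass filter (polynomial family), x = theta / theta_ap."""
    if x >= 1.0:
        return 0.0
    return 9.0 / math.pi * (1.0 - x * x) * (1.0 / 3.0 - x * x)

def disk_integral(f, steps=100_000):
    """Midpoint rule for the integral over the unit disk: int_0^1 f(x) 2*pi*x dx."""
    h = 1.0 / steps
    return sum(f((i + 0.5) * h) * 2.0 * math.pi * (i + 0.5) * h
               for i in range(steps)) * h
```

    The filter changes sign at x = 1/√3, which is what makes the disk integral vanish; any constant mass sheet therefore drops out of Map.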

  9. Nanoantennas for enhancing and confining the magnetic optical field

    NASA Astrophysics Data System (ADS)

    Grosjean, Thierry; Mivelle, Mathieu; Baida, Fadi I.; Burr, Geoffrey W.; Fischer, Ulrich C.

    2011-05-01

    We propose different optical antenna structures for enhancing and confining the magnetic optical field. A common feature of these structures is concave corners in thin metal films as locations of the enhanced magnetic field. This proposal is inspired by Babinet's principle, as the concave edges are the complementary structures to convex metal corners, which are known to be locations of a strongly enhanced electric field. Bowtie antennas and bowtie apertures of appropriate size have been shown to exhibit resonances in the infrared frequency range with an especially strong enhancement of the electric field in the gap between two convex metal corners. We show by numerical calculations that the complementary structures, namely the complementary bowtie aperture (the diabolo antenna) and the complementary bowtie antenna (two closely spaced triangular apertures in a metal film with a narrow gap between two opposing concave corners), exhibit resonances with a strongly enhanced magnetic field at the narrow metal constriction between the concave corners. We suggest sub-wavelength circuits of concave and convex corners as building blocks of planar metamaterials.

  10. Automatic source camera identification using the intrinsic lens radial distortion

    NASA Astrophysics Data System (ADS)

    Choi, Kai San; Lam, Edmund Y.; Wong, Kenneth K. Y.

    2006-11-01

    Source camera identification refers to the task of matching digital images with the cameras that are responsible for producing these images. This is an important task in image forensics, which in turn is a critical procedure in law enforcement. Unfortunately, few digital cameras are equipped with the capability of producing watermarks for this purpose. In this paper, we demonstrate that it is possible to achieve a high rate of accuracy in the identification by noting the intrinsic lens radial distortion of each camera. To reduce manufacturing cost, the majority of digital cameras are equipped with lenses having rather spherical surfaces, whose inherent radial distortions serve as unique fingerprints in the images. We extract, for each image, parameters from aberration measurements, which are then used to train and test a support vector machine classifier. We conduct extensive experiments to evaluate the success rate of source camera identification with five cameras. The results show that this is a viable approach with high accuracy. Additionally, we also present results on how the error rates may change with images captured using various optical zoom levels, as zooming is commonly available in digital cameras.
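
    The fingerprint here is the lens's radial distortion: an undistorted radius r is imaged at roughly r(1 + k1·r² + k2·r⁴), and the estimated coefficients serve as per-camera features for the classifier. A toy sketch of estimating k1 alone from matched radii (illustrative only; the paper measures distortion from image content, and our least-squares setup assumes k2 = 0):

```python
def distort_radius(r, k1, k2=0.0):
    """Polynomial radial-distortion model: r_d = r * (1 + k1*r^2 + k2*r^4)."""
    return r * (1.0 + k1 * r * r + k2 * r ** 4)

def estimate_k1(pairs):
    """Least-squares estimate of k1 from (r_undistorted, r_distorted) pairs.

    With k2 = 0 the model is linear in k1: r_d - r = k1 * r^3, so the
    closed-form solution is sum((r_d - r) * r^3) / sum(r^6).
    """
    num = sum((rd - r) * r ** 3 for r, rd in pairs)
    den = sum(r ** 6 for r, _ in pairs)
    return num / den
```

    In a full pipeline, features like (k1, k2) estimated per image would be fed to the support vector machine classifier mentioned in the abstract.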

  11. Zernike-like systems in polygons and polygonal facets.

    PubMed

    Ferreira, Chelo; López, José L; Navarro, Rafael; Sinusía, Ester Pérez

    2015-07-20

    Zernike polynomials are commonly used to represent the wavefront phase on circular optical apertures, since they form a complete and orthonormal basis on the unit disk. In [Opt. Lett. 32, 74 (2007)] we introduced a new Zernike basis for elliptic and annular optical apertures based on an appropriate diffeomorphism between the unit disk and the ellipse or the annulus. Here, we present a generalization of this Zernike basis for a variety of important optical apertures, paying special attention to polygons and the polygonal facets present in segmented mirror telescopes. In contrast to ad hoc solutions, most of them based on the Gram-Schmidt orthonormalization method, here we consider a piecewise diffeomorphism that transforms the unit disk into the polygon under consideration. We use this mapping to define a Zernike-like orthonormal system over the polygon. We also consider ensembles of polygonal facets that are essential in the design of segmented mirror telescopes. This generalization, based on in-plane warping of the basis functions, provides a unique solution and, more importantly, it guarantees a reasonable level of invariance of the mathematical properties and the physical meaning of the initial basis functions. Both the general form and the explicit expressions for a typical example of a telescope optical aperture are provided.
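
    The orthonormality that makes Zernike polynomials attractive is easy to verify numerically on the unit disk, where the inner product is the disk-averaged product of two basis functions. A minimal Monte Carlo check for a few low-order terms (our normalization convention; the paper's generalized bases transform these via a diffeomorphism):

```python
import math
import random

def zernike(n_m, r, theta):
    """A few low-order Zernike polynomials, normalized so that the
    disk-averaged squared value is 1."""
    if n_m == (0, 0):
        return 1.0
    if n_m == (1, 1):
        return 2.0 * r * math.cos(theta)            # x-tilt
    if n_m == (2, 0):
        return math.sqrt(3.0) * (2.0 * r * r - 1.0) # defocus
    raise ValueError(n_m)

def inner_product(a, b, samples=100_000, seed=0):
    """Monte Carlo estimate of (1/pi) * integral over the unit disk of Za*Zb."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        r = math.sqrt(rng.random())          # uniform over disk area
        theta = 2.0 * math.pi * rng.random()
        total += zernike(a, r, theta) * zernike(b, r, theta)
    return total / samples
```

    Diagonal inner products come out near 1 and off-diagonal ones near 0; on a polygonal aperture the plain polynomials lose this property, which is exactly what the mapped basis in the paper restores.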

  12. Advances in stereomicroscopy

    NASA Astrophysics Data System (ADS)

    Schnitzler, H.; Zimmer, Klaus-Peter

    2008-09-01

    Similar to human binocular vision, stereomicroscopes comprise two optical paths under a convergence angle, providing a full perspective insight into the microscopic world. The numerical aperture of stereomicroscopes has continuously increased over the years, reaching the point where the lenses of the left and right perspective paths touch each other. This constraint appeared to be an upper limit for the resolution of stereomicroscopes, as the resolution of a stereomicroscope was deduced from the numerical apertures of the two equally sized perspective channels. We present the optical design and the advances in resolution of the world's first asymmetrical stereomicroscope, which is a technological breakthrough in many aspects of stereomicroscopy. This unique approach uses a large numerical aperture, and thus a previously unachievable lateral resolution, in one path, and a small aperture, which provides a high depth of field, in the other path ("Fusion Optics"). This new concept is a technical challenge for the optical design of the zoom system as well as for the common main objectives. Furthermore, the new concept makes use of the particular way in which perspective information from binocular vision is formed in the human brain. In conjunction with a research project at the University of Zurich, Leica Microsystems consolidated the functionality of this concept into a new generation of stereomicroscopes.

  13. Implementation of an axisymmetric drop shape apparatus using a Raspberry-Pi single-board computer and a web camera

    NASA Astrophysics Data System (ADS)

    Carlà, Marcello; Orlando, Antonio

    2018-07-01

    This paper describes the implementation of an axisymmetric drop shape apparatus for the measurement of the surface or interfacial tension of a hanging liquid drop, using only cheap resources such as a common web camera and a single-board microcomputer. The mechanics of the apparatus is composed of stubs of commonly available aluminium bar, with all other mechanical parts manufactured with an amateur 3D printer. All of the required software, whether for handling the camera and taking the images or for processing the drop images to obtain the drop profile and fit it with the Bashforth-Adams equation, is freely available under an open-source license. Despite the very limited cost of the whole setup, extensive testing has demonstrated an overall accuracy of ±0.2% or better.
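
    The fitting core of such an apparatus integrates the Bashforth-Adams (Young-Laplace) shape equations in arc length s: dx/ds = cos φ, dz/ds = sin φ, dφ/ds = 2/b + c·z − sin φ / x, with apex radius b and capillary constant c = Δρ·g/γ. A minimal Euler-integration sketch (not the authors' open-source code; variable names are ours):

```python
import math

def drop_profile(b, c, s_max, ds=1e-4):
    """Integrate the axisymmetric drop profile from the apex.

    b: radius of curvature at the apex; c: capillary constant (delta_rho*g/gamma).
    Returns the (x, z) endpoint after arc length s_max.
    """
    x, z, phi = 0.0, 0.0, 0.0
    s = 0.0
    while s < s_max:
        # At the apex sin(phi)/x -> dphi/ds, so the limit of dphi/ds is 1/b.
        dphi = 1.0 / b if x == 0.0 else 2.0 / b + c * z - math.sin(phi) / x
        x += ds * math.cos(phi)
        z += ds * math.sin(phi)
        phi += ds * dphi
        s += ds
    return x, z
```

    With c = 0 the profile is a sphere of radius b, a convenient sanity check; fitting the surface tension to a measured drop then amounts to adjusting b and c until the integrated shape matches the extracted contour.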

  14. Dynamical Modeling of NGC 6397: Simulated HST Imaging

    NASA Astrophysics Data System (ADS)

    Dull, J. D.; Cohn, H. N.; Lugger, P. M.; Slavin, S. D.; Murphy, B. W.

    1994-12-01

    The proximity of NGC 6397 (2.2 kpc) provides an ideal opportunity to test current dynamical models for globular clusters with the HST Wide-Field/Planetary Camera (WFPC2). We have used a Monte Carlo algorithm to generate ensembles of simulated Planetary Camera (PC) U-band images of NGC 6397 from evolving, multi-mass Fokker-Planck models. These images, which are based on the post-repair HST-PC point-spread function, are used to develop and test analysis methods for recovering structural information from actual HST imaging. We have considered a range of exposure times up to 2.4×10⁴ s, based on our proposed HST Cycle 5 observations. Our Fokker-Planck models include energy input from dynamically formed binaries. We have adopted a 20-group mass spectrum extending from 0.16 to 1.4 M_sun. We use theoretical luminosity functions for red giants and main sequence stars. Horizontal branch stars, blue stragglers, white dwarfs, and cataclysmic variables are also included. Simulated images are generated for cluster models at both maximal core collapse and at a post-collapse bounce. We are carrying out stellar photometry on these images using "DAOPHOT-assisted aperture photometry" software that we have developed. We are testing several techniques for analyzing the resulting star counts to determine the underlying cluster structure, including parametric model fits and nonparametric density estimation methods. Our simulated images also allow us to investigate the accuracy and completeness of methods for carrying out stellar photometry in HST Planetary Camera images of dense cluster cores.

  15. SOFIA science instruments: commissioning, upgrades and future opportunities

    NASA Astrophysics Data System (ADS)

    Smith, Erin C.; Miles, John W.; Helton, L. Andrew; Sankrit, Ravi; Andersson, B. G.; Becklin, Eric E.; De Buizer, James M.; Dowell, C. D.; Dunham, Edward W.; Güsten, Rolf; Harper, Doyal A.; Herter, Terry L.; Keller, Luke D.; Klein, Randolf; Krabbe, Alfred; Logsdon, Sarah; Marcum, Pamela M.; McLean, Ian S.; Reach, William T.; Richter, Matthew J.; Roellig, Thomas L.; Sandell, Göran; Savage, Maureen L.; Temi, Pasquale; Vacca, William D.; Vaillancourt, John E.; Van Cleve, Jeffrey E.; Young, Erick T.

    2014-07-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is the world's largest airborne observatory, featuring a 2.5 meter effective aperture telescope housed in the aft section of a Boeing 747SP aircraft. SOFIA's current instrument suite includes: FORCAST (Faint Object InfraRed CAmera for the SOFIA Telescope), a 5-40 μm dual band imager/grism spectrometer developed at Cornell University; HIPO (High-speed Imaging Photometer for Occultations), a 0.3-1.1 μm imager built by Lowell Observatory; GREAT (German Receiver for Astronomy at Terahertz Frequencies), a multichannel heterodyne spectrometer from 60-240 μm, developed by a consortium led by the Max Planck Institute for Radio Astronomy; FLITECAM (First Light Infrared Test Experiment CAMera), a 1-5 μm wide-field imager/grism spectrometer developed at UCLA; FIFI-LS (Far-Infrared Field-Imaging Line Spectrometer), a 42-200 μm IFU grating spectrograph completed by the University of Stuttgart; and EXES (Echelon-Cross-Echelle Spectrograph), a 5-28 μm high-resolution spectrometer designed at the University of Texas and being completed by UC Davis and NASA Ames Research Center. HAWC+ (High-resolution Airborne Wideband Camera) is a 50-240 μm imager that was originally developed at the University of Chicago as a first-generation instrument (HAWC), and is being upgraded at JPL to add polarimetry and new detectors developed at Goddard Space Flight Center (GSFC). SOFIA will continually update its instrument suite with new instrumentation, technology demonstration experiments and upgrades to the existing instrument suite. This paper details the current instrument capabilities and status, as well as the plans for future instrumentation.

  16. Photogrammetry research for FAST eleven-meter reflector panel surface shape measurement

    NASA Astrophysics Data System (ADS)

    Zhou, Rongwei; Zhu, Lichun; Li, Weimin; Hu, Jingwen; Zhai, Xuebing

    2010-10-01

    In order to design and manufacture the measuring equipment for the Five-hundred-meter Aperture Spherical Radio Telescope (FAST) active reflector, measurement of each reflector panel's surface shape was presented, static measurement of the whole neutral spherical network of nodes was performed, and real-time dynamic measurement of the cable network deformation was undertaken. In the implementation process of the FAST, reflector panel surface shape detection was completed before the eleven-meter reflector panels were installed. A binocular vision system was constructed based on the method of binocular stereo vision in machine vision, and the eleven-meter reflector panel surface shape was measured with the photogrammetry method. The cameras were calibrated with feature points. Under the linear camera model, a lighting spot array was used as the calibration pattern, and the intrinsic and extrinsic parameters were acquired. Images were collected with the two cameras for digital image processing and analysis, feature points were extracted with a characteristic-point detection algorithm, and those points were matched based on the epipolar constraint method. The three-dimensional reconstructed coordinates of the feature points were analyzed, and the reflector panel surface shape was established by curve and surface fitting. The error of the reflector panel surface shape was calculated to realize automatic measurement of the panel surface shape. The results show that the unit reflector panel surface inspection accuracy was 2.30 mm, within the standard deviation limit of 5.00 mm. Compared with the required reflector panel machining precision, photogrammetry offers fine precision and operational feasibility for eleven-meter reflector panel surface shape measurement for FAST.

  17. A New Approach for Combining Time-of-Flight and RGB Cameras Based on Depth-Dependent Planar Projective Transformations

    PubMed Central

    Salinas, Carlota; Fernández, Roemi; Montes, Héctor; Armada, Manuel

    2015-01-01

    Image registration for sensor fusion is a valuable technique for acquiring 3D and colour information of a scene. Nevertheless, this process normally relies on feature-matching techniques, which is a drawback for combining sensors that are not able to deliver common features. The combination of ToF and RGB cameras is an instance of that problem. Typically, the fusion of these sensors is based on the extrinsic parameter computation of the coordinate transformation between the two cameras. This leads to a loss of colour information because of the low resolution of the ToF camera, and sophisticated algorithms are required to minimize this issue. This work proposes a method for registering sensors with non-common features that avoids the loss of colour information. The depth information is used as a virtual feature for estimating a depth-dependent homography lookup table (Hlut). The homographies are computed within sets of ground control points of 104 images. Since the distance from the control points to the ToF camera is known, the working distance of each element in the Hlut is estimated. Finally, two series of experimental tests have been carried out in order to validate the capabilities of the proposed method. PMID:26404315
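
    The core idea of the Hlut can be sketched as interpolating between homographies calibrated at known depths and applying the result per pixel according to the ToF depth reading. A toy version (matrix entries and depths below are made up; element-wise linear interpolation is only an approximation of the depth dependence, and the paper builds its table directly from ground-control points):

```python
def apply_homography(H, x, y):
    """Map pixel (x, y) with a 3x3 homography H given as row-major nested lists."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def interpolate_hlut(hlut, depth):
    """hlut: list of (depth, H) sorted by depth; interpolate H element-wise."""
    for (d0, H0), (d1, H1) in zip(hlut, hlut[1:]):
        if d0 <= depth <= d1:
            t = (depth - d0) / (d1 - d0)
            return [[(1.0 - t) * H0[i][j] + t * H1[i][j] for j in range(3)]
                    for i in range(3)]
    raise ValueError("depth outside calibrated range")
```

    At run time each ToF pixel's depth selects (or interpolates) its homography, so colour can be sampled from the RGB image without a shared-feature matching step.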

  18. Ultrathin zoom telescopic objective.

    PubMed

    Li, Lei; Wang, Di; Liu, Chao; Wang, Qiong-Hua

    2016-08-08

    We report an ultrathin zoom telescopic objective that achieves continuous zoom and has a reduced, compact volume. The objective consists of an annular folded lens and three electrowetting liquid lenses. The annular folded lens undertakes the main part of the focal power of the lens system. Due to a multiple-fold design, the optical path is folded into a lens with a thickness of ~1.98 mm. The electrowetting liquid lenses constitute the zoom part. Based on the proposed objective, an ultrathin zoom telescopic camera is demonstrated. We analyze the properties of the proposed objective. The aperture of the proposed objective is ~15 mm. The total length of the system is ~18 mm, with a focal length tunable from ~48 mm to ~65 mm. Compared with a conventional zoom telescopic objective, the total length is greatly reduced.

  19. Co-Phasing the Large Binocular Telescope: Status and Performance of LBTI-PHASECam

    NASA Technical Reports Server (NTRS)

    Defrere, D.; Hinz, P.; Downey, E.; Ashby, D.; Bailey, V.; Brusa, G.; Christou, J.; Danchi, W. C.; Grenz, P.; Hill, J. M.; hide

    2014-01-01

    The Large Binocular Telescope Interferometer is a NASA-funded nulling and imaging instrument designed to coherently combine the two 8.4-m primary mirrors of the LBT for high-sensitivity, high-contrast, and high-resolution infrared imaging (1.5-13 micrometers). PHASECam is LBTI's near-infrared camera used to measure tip-tilt and phase variations between the two AO-corrected apertures and provide high-angular-resolution observations. We report on the status of the system and describe its on-sky performance measured during the first semester of 2014. With a spatial resolution equivalent to that of a 22.8-meter telescope and the light-gathering power of a single 11.8-meter mirror, the co-phased LBT can be considered a forerunner of the next-generation extremely large telescopes (ELTs).

  20. Christoph Scheiner and the optics of the eye. (German Title: Christoph Scheiner und die Optik des Auges)

    NASA Astrophysics Data System (ADS)

    Daxecker, Franz

    Some of Scheiner's discoveries and experiments are taken from the books «Oculus», (Innsbruck 1619) and «Rosa Ursina sive Sol» (Rome 1626-1630): determination of the radius of curvature of the cornea, discovery of the nasal exit of the visual nerve, increase in the curvature of the lens in case of accommodation, anatomy of the eye, light reaction of the pupil, contraction of the pupil during accommodation, Scheiner's test (double images caused by ametropia), stenopeic effect, crossing rays in the eye, aperture, description of the cataract treatment, refractive indices of various parts of the eye, eye model, visual pivot angle of the eye, proof of crossing rays on the retina, comparison of the camera obscura and the optics of the eye.

  1. Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Mather, John; Stockman, H. S.; Fisher, Richard R. (Technical Monitor)

    2000-01-01

    The Next Generation Space Telescope (NGST), planned for launch in 2009, will be an 8-m class radiatively cooled infrared telescope at the Lagrange point L2. It will cover the wavelength range from 0.6 to 28 microns with cameras and spectrometers, to observe the first luminous objects after the Big Bang, and the formation, growth, clustering, and evolution of galaxies, stars, and protoplanetary clouds, leading to a better understanding of our own Origins. It will seek evidence of the cosmic dark matter through its gravitational effects. With an aperture three times greater than that of the Hubble Space Telescope, it will provide extraordinary advances in capabilities and enable the discovery of many new phenomena. It is a joint project of NASA, ESA, and CSA, and scientific operations will be provided by the Space Telescope Science Institute.

  2. Surveillance Cameras in Schools: An Ethical Analysis

    ERIC Educational Resources Information Center

    Warnick, Bryan R.

    2007-01-01

    In this essay, Bryan R. Warnick responds to the increasing use of surveillance cameras in public schools by examining the ethical questions raised by their use. He explores the extent of a student's right to privacy in schools, stipulates how video surveillance is similar to and different from commonly accepted in-person surveillance practices,…

  3. An alternative approach to depth of field which avoids the blur circle and uses the pixel pitch

    NASA Astrophysics Data System (ADS)

    Schuster, Norbert

    2015-09-01

    Modern thermal imaging systems increasingly use uncooled detectors. High-volume applications work with detectors that have a reduced pixel count (typically between 200x150 and 640x480), which limits the application of modern image-treatment procedures like wavefront coding. On the other hand, uncooled detectors demand lenses with fast F-numbers near 1.0. What are the limits on resolution if the target to analyze changes its distance to the camera system? The aim of implementing lens arrangements without any focusing mechanism demands a deeper quantification of the Depth of Field problem. The proposed Depth of Field approach avoids the classic "accepted image blur circle". It is based on a camera-specific depth of focus which is transformed into object space by paraxial relations. The traditional Rayleigh criterion is based on the unaberrated Point Spread Function and delivers a first-order relation for the depth of focus; hence neither the actual lens resolution nor the detector impact is considered. The camera-specific depth of focus respects several camera properties: lens aberrations at the actual F-number, detector size, and pixel pitch. The through-focus MTF, considered at the detector's Nyquist frequency, is the basis of the camera-specific depth of focus; it has a nearly symmetric course around the position of sharp imaging. The camera-specific depth of focus is thus the axial distance in front of and behind the sharp image plane within which the through-focus MTF remains above 0.25. This camera-specific depth of focus is transferred into object space by paraxial relations. The result is a generally applicable Depth of Field diagram which can be applied to lenses realizing a lateral magnification range of -0.05…0. Easy-to-handle formulas relate the hyperfocal distance to the borders of the Depth of Field as a function of the sharp distance. These relations are in line with the classical Depth of Field theory.
    Thermal pictures, taken by different IR-camera cores, illustrate the new approach. The frequently requested graph "MTF versus distance" chooses half the Nyquist frequency as reference. The paraxial transfer of the through-focus MTF into object space distorts the MTF curve: a hard drop at distances closer than the sharp distance, a smooth drop at further distances. The formula of a general Diffraction-Limited Through-Focus MTF (DLTF) is derived, so that arbitrary detector-lens combinations can be discussed. Free variables in this analysis are the waveband, the aperture-based F-number (lens), and the pixel pitch (detector). The DLTF discussion provides physical limits and technical requirements. Detector development with pixel pitches smaller than the captured wavelength in the LWIR region poses a special challenge for optical design.
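The bookkeeping between the hyperfocal distance and the Depth of Field borders can be sketched with the classical formulas, using the pixel pitch in place of the accepted blur circle. This is a toy sketch consistent with classical theory, not the author's through-focus-MTF criterion; the 25 mm F/1.0 lens and 17 µm pitch are assumed example values.

```python
def hyperfocal(f, N, c):
    """Hyperfocal distance for focal length f, F-number N, and accepted
    blur c (here: the pixel pitch); all lengths in the same unit."""
    return f * f / (N * c) + f

def dof_limits(s, f, N, c):
    """Near and far borders of the depth of field for sharp distance s."""
    H = hyperfocal(f, N, c)
    near = s * (H - f) / (H + s - 2.0 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

# Hypothetical LWIR camera: f = 25 mm, F/1.0, 17 um pixel pitch (all in mm)
f, N, c = 25.0, 1.0, 0.017
H = hyperfocal(f, N, c)                    # ~36.8 m
near, far = dof_limits(10_000.0, f, N, c)  # borders when focused at 10 m
```

Focusing at the hyperfocal distance itself recovers the textbook result: the far border runs to infinity while the near border sits at H/2.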

  4. Data annotation, recording and mapping system for the US open skies aircraft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, B.W.; Goede, W.F.; Farmer, R.G.

    1996-11-01

    This paper discusses the system developed by Northrop Grumman for the Defense Nuclear Agency (DNA), US Air Force, and the On-Site Inspection Agency (OSIA) to comply with the data annotation and reporting provisions of the Open Skies Treaty. This system, called the Data Annotation, Recording and Mapping System (DARMS), has been installed on the US OC-135 and meets or exceeds all annotation requirements of the Open Skies Treaty. The Open Skies Treaty, which will enter into force in the near future, allows any of the 26 signatory countries to fly fixed-wing aircraft with imaging sensors over any of the other treaty participants, upon very short notice, and with no restricted flight areas. Sensor types presently allowed by the treaty are: optical framing and panoramic film cameras; video cameras ranging from analog PAL color television cameras to the more sophisticated digital monochrome and color line-scanning or framing cameras; infrared line scanners; and synthetic aperture radars. Each sensor type has specific performance parameters which are limited by the treaty, as well as specific annotation requirements which must be achieved upon full entry into force. DARMS supports US compliance with the Open Skies Treaty by means of three subsystems: the Data Annotation Subsystem (DAS), which annotates sensor media with data obtained from sensors and the aircraft's avionics system; the Data Recording System (DRS), which records all sensor and flight events on magnetic media for later use in generating Treaty-mandated mission reports; and the Dynamic Sensor Mapping Subsystem (DSMS), which provides observers and sensor operators with a real-time moving-map display of the progress of the mission, complete with instantaneous and cumulative sensor coverages. This paper describes DARMS and its subsystems in greater detail, along with the supporting avionics subsystems. 7 figs.

  5. Improved signal to noise ratio and sensitivity of an infrared imaging video bolometer on large helical device by using an infrared periscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Shwetang N., E-mail: pandya.shwetang@LHD.nifs.ac.jp; Sano, Ryuichi; Peterson, Byron J.

    An Infrared imaging Video Bolometer (IRVB) diagnostic is currently being used in the Large Helical Device (LHD) for studying the localization of radiation structures near the magnetic island and helical divertor X-points during plasma detachment and for 3D tomography. This research demands a high signal-to-noise ratio (SNR) and sensitivity to improve the temporal resolution for studying the evolution of radiation structures during plasma detachment, and a wide IRVB field of view (FoV) for tomography. Introduction of an infrared periscope allows a higher SNR and higher sensitivity, which in turn permits a twofold improvement in the temporal resolution of the diagnostic. Higher SNR along with a wide FoV is achieved simultaneously by reducing the separation of the IRVB detector (metal foil) from the bolometer's aperture and the LHD plasma. Altering the distances to meet the aforesaid requirements results in an increased separation between the foil and the IR camera, which degrades the diagnostic's sensitivity 1.5-fold. Using an infrared periscope to image the IRVB foil results in a 7.5-fold increase in the number of IR camera pixels imaging the foil. This improves the IRVB sensitivity, which depends on the square root of the number of IR camera pixels being averaged per bolometer channel. Despite the slower f-number (f/# = 1.35) and reduced transmission (τ0 = 89%, due to an increased number of lens elements) of the periscope, the diagnostic with an infrared periscope operational on LHD has improved in sensitivity and SNR by factors of 1.4 and 4.5, respectively, as compared to the original diagnostic without a periscope (i.e., the IRVB foil imaged directly by the IR camera through conventional optics). The bolometer's field of view has also doubled. The paper discusses these improvements in detail.

  6. Laser guide star pointing camera for ESO LGS Facilities

    NASA Astrophysics Data System (ADS)

    Bonaccini Calia, D.; Centrone, M.; Pedichini, F.; Ricciardi, A.; Cerruto, A.; Ambrosino, F.

    2014-08-01

    Every observatory using LGS-AO routinely experiences the long time needed to bring and acquire the laser guide star in the wavefront sensor field of view. This is mostly due to the difficulty of creating LGS pointing models, because of the opto-mechanical flexures and hysteresis in the launch and receiver telescope structures. The launch telescopes normally sit on the mechanical structure of the larger receiver telescope, and the LGS acquisition time is even longer for multiple-LGS systems. In this framework, optimizing the absolute pointing accuracy of LGS systems is relevant to boost the time efficiency of both science and technical observations. In this paper we show the rationale, the design and the feasibility tests of an LGS Pointing Camera (LPC), which has been conceived for the VLT Adaptive Optics Facility 4LGSF project. The LPC would assist in pointing the four LGS while the VLT is performing the initial active-optics cycles to adjust its own optics on a natural star target after a preset. The LPC minimizes the accuracy needed for LGS pointing-model calibrations, while allowing sub-arcsec LGS absolute pointing accuracy to be reached. This considerably reduces the LGS acquisition time and observation operation overheads. The LPC is a smart CCD camera, fed by a 150 mm diameter aperture Maksutov telescope mounted on the top ring of the VLT UT4, running Linux and acting as a server for the 4LGSF client. The smart camera is able to recognize the sky field within a few seconds using astrometric software, determining the absolute positions of the stars and the LGS. Upon request it returns the offsets to apply to the LGS to position them at the required sky coordinates. As a byproduct, once calibrated, the LPC can calculate upon request, for each LGS, its return flux, its FWHM and the uplink beam scattering levels.

  7. The need for separate operational and engineering user interfaces for command and control of airborne synthetic aperture radar systems

    NASA Astrophysics Data System (ADS)

    Klein, Laura M.; McNamara, Laura A.

    2017-05-01

    In this paper, we address the needed components to create usable engineering and operational user interfaces (UIs) for airborne Synthetic Aperture Radar (SAR) systems. As airborne SAR technology gains wider acceptance in the remote sensing and Intelligence, Surveillance, and Reconnaissance (ISR) communities, the need for effective and appropriate UIs to command and control these sensors has also increased. However, despite the growing demand for SAR in operational environments, the technology still faces an adoption roadblock, in large part due to the lack of effective UIs. It is common to find operational interfaces that have barely grown beyond the disparate tools engineers and technologists developed to demonstrate an initial concept or system. While sensor usability and utility are common requirements to engineers and operators, their objectives for interacting with the sensor are different. As such, the amount and type of information presented ought to be tailored to the specific application.

  8. Space based topographic mapping experiment using Seasat synthetic aperture radar and LANDSAT 3 return beam vidicon imagery

    NASA Technical Reports Server (NTRS)

    Mader, G. L.

    1981-01-01

    A technique for producing topographic information is described which is based on same-side/same-time viewing using a dissimilar combination of radar imagery and photographic images. Common geographic areas viewed from similar space reference locations produce scene elevation displacements in opposite directions, and proper use of this characteristic can yield the perspective information necessary for determining base-to-height ratios. These base-to-height ratios can in turn be used to produce a topographic map. A test area covering the Harrisburg, Pennsylvania region was observed by the synthetic aperture radar on the Seasat satellite and by the return beam vidicon on the LANDSAT-3 satellite. The techniques developed for the scaling, re-orientation and common registration of the two images are presented along with the topographic determination data. Topographic determination based exclusively on the image content is compared to the map information, which is used as a performance calibration base.

  9. A common-aperture X- and S-band four-function feedcone. [hornfeed design for antennas of Deep Space Network

    NASA Technical Reports Server (NTRS)

    Withington, J. R.; Williams, W. F.

    1982-01-01

    Williams and Withington (1979) have considered a prototype X-S-band feedhorn which enabled simultaneous X- and S-band reception from a Cassegrain antenna. This feedhorn has quite successfully demonstrated an alternate method to the standard Deep Space Network (DSN) system of multiple subreflectors and dichroic plate for dual-band reception. In connection with a Network Consolidation Program, involving centralized control of existing antennas and construction of new reflector antennas, a second-generation feedhorn/combiner was conceived to show that this common-aperture feedhorn system was capable of performing all necessary functions the DSN would be called upon to perform with existing and future X-S-band spacecraft. Attention is given to the feedhorn concept, the combiner concept, the first and the second generation of the horn, S- and X-band tuning, and planned capabilities. The feedhorn greatly extends the state of the art in DSN performance and will enhance DSN capabilities in the future.

  10. Patient Eye Examinations - Adults

    MedlinePlus


  11. Anatomy of the Eye

    MedlinePlus


  12. Analysis of limited-diffractive and limited-dispersive X-waves generated by finite radial waveguides

    NASA Astrophysics Data System (ADS)

    Fuscaldo, Walter; Pavone, Santi C.; Valerio, Guido; Galli, Alessandro; Albani, Matteo; Ettorre, Mauro

    2016-05-01

    In this work, we analyze the spatial and temporal features of electromagnetic X-waves propagating in free space and generated by planar radiating apertures. The performance of ideal X-waves is discussed and compared to practical cases where the important effects related to the finiteness of the radiating aperture and the wavenumber dispersion are taken into account. In particular, a practical device consisting of a radial waveguide loaded with radiating slots aligned along a spiral path is considered for the practical case in the millimeter-wave range. A common mathematical framework is defined for a precise comparison of the spatiotemporal properties and focusing capabilities of the generated X-wave. It is clearly shown that the fractional bandwidth of the radiating aperture has a key role in the longitudinal confinement of an X-wave in both ideal and practical cases. In addition, the finiteness of the radiating aperture as well as the wavenumber dispersion clearly affect both the transverse and the longitudinal profiles of the generated radiation as it travels beyond the depth-of-field of the generated X-wave. Nevertheless, the spatiotemporal properties of the X-wave are preserved even in this "dispersive-finite" case within a defined region and duration related to the nondiffractive range and fractional bandwidth of the spectral components of the generated X-wave. The proposed analysis may open new perspectives for the efficient generation of X-waves over finite radiating apertures at millimeter waves where the dispersive behavior of realistic devices is no longer negligible.

  13. Multi-frame image processing with panning cameras and moving subjects

    NASA Astrophysics Data System (ADS)

    Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric

    2014-06-01

    Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition to this, we evaluated algorithm efficacy with demonstrated benefits using field test video, which has been processed using our commercially available surveillance product. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.
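One common way to make multi-frame processing robust to panning, in the spirit of the techniques described above (the paper does not disclose its specific methods), is to register each frame to a reference before temporal averaging. A minimal sketch using FFT phase correlation for integer-pixel shifts:

```python
import numpy as np

def phase_correlation_shift(ref, img):
    """Estimate the integer (dy, dx) translation of img relative to ref."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(img)
    R = F1 * np.conj(F2)
    R /= np.abs(R) + 1e-12            # normalized cross-power spectrum
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]   # wrap to signed range
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return dy, dx

def register_and_average(frames):
    """Align every frame to the first one, then average temporally."""
    ref = frames[0]
    acc = np.zeros_like(ref, dtype=float)
    for frame in frames:
        dy, dx = phase_correlation_shift(ref, frame)
        acc += np.roll(frame, (dy, dx), axis=(0, 1))
    return acc / len(frames)

# Synthetic check: a frame circularly shifted by (3, 5) pixels realigns exactly
rng = np.random.default_rng(1)
ref = rng.standard_normal((32, 32))
shifted = np.roll(ref, (3, 5), axis=(0, 1))
avg = register_and_average([ref, shifted])
```

Real panning introduces new content at the frame edges rather than circular wrap, so a production version crops the overlap region; the sketch keeps the wrap for simplicity.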

  14. Plenoptic camera based on a liquid crystal microlens array

    NASA Astrophysics Data System (ADS)

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Xie, Changsheng

    2015-09-01

    A type of liquid crystal microlens array (LCMLA) whose focal length is tuned by the voltage signals applied between its top and bottom electrodes is fabricated, and its common optical focusing characteristics are tested. The relationship between the focal length and the applied voltage signals is given. The LCMLA is integrated with an image sensor and further coupled with a main lens so as to construct a plenoptic camera. Several raw images at different applied voltage signals are acquired and compared with the LCMLA-based plenoptic camera we constructed. Our experiments demonstrate that by utilizing an LCMLA in a plenoptic camera, the focused zone of the camera can be shifted effectively simply by changing the voltage signals applied between the electrodes of the LCMLA, which is equivalent to an extension of the depth of field.

  15. Contributed Review: Camera-limits for wide-field magnetic resonance imaging with a nitrogen-vacancy spin sensor

    NASA Astrophysics Data System (ADS)

    Wojciechowski, Adam M.; Karadas, Mürsel; Huck, Alexander; Osterkamp, Christian; Jankuhn, Steffen; Meijer, Jan; Jelezko, Fedor; Andersen, Ulrik L.

    2018-03-01

    Sensitive, real-time optical magnetometry with nitrogen-vacancy centers in diamond relies on accurate imaging of small (≪10⁻²) fractional fluorescence changes across the diamond sample. We discuss the limitations on magnetic field sensitivity resulting from the limited number of photoelectrons that a camera can record in a given time. Several types of camera sensors are analyzed, and the smallest measurable magnetic field change is estimated for each type. We show that most common sensors are of limited use in such applications, while certain highly specific cameras allow achieving nanotesla-level sensitivity within 1 s of combined exposure. Finally, we demonstrate the results obtained with a lock-in camera that paves the way for real-time, wide-field magnetometry at the nanotesla level and with micrometer resolution.
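The photoelectron-count limit the abstract refers to can be illustrated with the standard shot-noise estimate for ODMR magnetometry, δB ≈ Δν / (γ · C · √N): a sketch with assumed example numbers (1 MHz linewidth, 2% contrast, 10¹² photoelectrons), not the paper's analysis of specific sensors.

```python
import math

GAMMA_NV = 28.0e9   # NV electron gyromagnetic ratio, Hz per tesla

def min_field_change(linewidth_hz, contrast, photoelectrons):
    """Photon-shot-noise-limited smallest detectable field change (tesla)
    for an ODMR measurement with the given resonance linewidth, fractional
    contrast, and total recorded photoelectrons."""
    return linewidth_hz / (GAMMA_NV * contrast * math.sqrt(photoelectrons))

# Assumed example: 1 MHz linewidth, 2% contrast, 1e12 photoelectrons in the
# combined exposure -> roughly 1.8 nT, i.e. nanotesla-level as claimed
dB = min_field_change(1.0e6, 0.02, 1.0e12)
```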

  16. High-speed electromechanical shutter for imaging spectrographs

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet (Inventor)

    2005-01-01

    The present invention presents a high-speed electromechanical shutter which has at least two rotary beam choppers that are synchronized using a phase-locked loop electronic control to reduce the duty cycle. These choppers have blade means that can comprise discs or drums, each having about 60 (+/- 15) slots which are from about 0.3 to about 0.8 mm wide and about 5 to about 20 mm long (radially), evenly distributed throughout 360 deg, and a third rotary chopper, optically aligned, which has a small number of slots, for example 1 to 10 slots which are about 1 to about 2 mm wide and about 5 to about 20 mm long (radially). Further, the blade means include phase slots that allow the blade means to be phase locked using a closed-loop control circuit. In addition, in a preferred embodiment, the system also has a leaf shutter. Thus the invention preferably achieves a gate width of less than about 100 microseconds, using motors that operate at 3000 to 10,000 rpm, and with a phase jitter of less than about 1.5 microseconds, and further using an aperture with more than about 75% optical transmission with a clear aperture of about 0.8-10 mm. The system can be synchronized to external sources such as 0-6 kHz lasers, data acquisition systems, and cameras.

  17. High-speed electromechanical shutter for imaging spectrographs

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet (Inventor)

    2005-01-01

    The present invention presents a high-speed electromechanical shutter which has at least two rotary beam choppers that are synchronized using a phase-locked loop electronic control to reduce the duty cycle. These choppers have blade means that can comprise discs or drums, each having about 60 (+/- 15) slots which are from about 0.3 to about 0.8 mm wide and about 5 to about 20 mm long (radially), evenly distributed throughout 360°, and a third rotary chopper, optically aligned, which has a small number of slots, for example 1 to 10 slots which are about 1 to about 2 mm wide and about 5 to about 20 mm long (radially). Further, the blade means include phase slots that allow the blade means to be phase locked using a closed-loop control circuit. In addition, in a preferred embodiment, the system also has a leaf shutter. Thus the invention preferably achieves a gate width of less than about 100 microseconds, using motors that operate at 3000 to 10,000 rpm, and with a phase jitter of less than about 1.5 microseconds, and further using an aperture with more than about 75% optical transmission with a clear aperture of about 0.8-10 mm. The system can be synchronized to external sources such as 0-6 kHz lasers, data acquisition systems, and cameras.
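The sub-100-microsecond gate width claimed above can be sanity-checked with simple kinematics: the gate is the time a slot of given width takes to sweep past a fixed point at its radius on the spinning disc. The 30 mm slot radius below is an assumed value for illustration, not taken from the patent.

```python
import math

def gate_width_us(slot_width_mm, radius_mm, rpm):
    """Time (microseconds) for one slot to sweep past a fixed point
    located at the given radius on the spinning chopper blade."""
    tip_speed = 2.0 * math.pi * (radius_mm / 1000.0) * (rpm / 60.0)  # m/s
    return (slot_width_mm / 1000.0) / tip_speed * 1e6

# 0.5 mm slot at an assumed 30 mm radius, motor at 10,000 rpm -> ~16 us,
# comfortably under the ~100 us gate width stated in the abstract
gate = gate_width_us(0.5, 30.0, 10_000)
```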

  18. The analysis of the wavefront aberration caused by the gravity of the tunable-focus liquid-filled membrane lens

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Liu, Pengfei; Wei, Xiaona; Zhuang, Songlin; Yang, Bo

    2010-11-01

    The liquid lens is a novel optical device which can implement active zooming. With a liquid lens, a zoom camera can be designed with a more miniature size and simpler structure than before. The micro zoom system with a liquid lens has very wide potential application in fields in which the volume and weight of the system are critically limited, such as endoscopes, mobile phones, PDAs and so on. There are mainly three types of tunable-focus liquid lens: the liquid crystal lens, the electrowetting-based liquid lens and the liquid-filled membrane lens. Compared with the other two kinds of liquid lens, the liquid-filled membrane lens has the advantages of simple structure, flexible aperture and high zooming efficiency. However, its membrane surface has an initial shape deformation caused by gravity when the aperture of the lens is large, which leads to wavefront aberration and impairs the imaging quality. In this paper, the initial deformation of the lens caused by gravity was simulated based on the theory of Elastic Mechanics and calculated by the Finite Element Analysis method. The relationship between the diameter of the lens and the wavefront aberration caused by gravity was studied, and the optical path difference produced by different liquid densities was also analyzed.

  19. Scalable gamma-ray camera for wide-area search based on silicon photomultipliers array

    NASA Astrophysics Data System (ADS)

    Jeong, Manhee; Van, Benjamin; Wells, Byron T.; D'Aries, Lawrence J.; Hammig, Mark D.

    2018-03-01

    Portable coded-aperture imaging systems based on scintillators and semiconductors have found use in a variety of radiological applications. For stand-off detection of weakly emitting materials, large-volume detectors can facilitate the rapid localization of emitting materials. We describe a scalable coded-aperture imaging system based on 5.02 × 5.02 cm² CsI(Tl) scintillator modules, each partitioned into 4 × 4 × 20 mm³ pixels that are optically coupled to 12 × 12 pixel silicon photomultiplier (SiPM) arrays. The 144 pixels per module are read out with a resistor-based charge-division circuit that reduces the readout outputs from 144 to four signals per module, from which the interaction position and total deposited energy can be extracted. All 144 CsI(Tl) pixels are readily distinguishable, with an average energy resolution at 662 keV of 13.7% FWHM, a peak-to-valley ratio of 8.2, and a peak-to-Compton ratio of 2.9. The detector module is composed of a SiPM array coupled with a 2 cm thick scintillator and a modified uniformly redundant array mask. For the image reconstruction, cross-correlation and maximum-likelihood expectation maximization methods are used. The system shows a field of view of 45° and an angular resolution of 4.7° FWHM.
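Extracting position and energy from four charge-division outputs is classic Anger-logic arithmetic. The corner labeling below is an assumption for illustration; the paper's actual resistor network wiring is not specified in the abstract.

```python
def anger_position(a, b, c, d):
    """Interaction position and total energy from the four charge-division
    outputs. Corner naming is an assumption: a = top-left, b = top-right,
    c = bottom-right, d = bottom-left."""
    e = a + b + c + d                     # total deposited energy (arb. units)
    x = ((b + c) - (a + d)) / e           # -1 (left edge) .. +1 (right edge)
    y = ((a + b) - (c + d)) / e           # -1 (bottom edge) .. +1 (top edge)
    return x, y, e

x, y, e = anger_position(1.0, 1.0, 1.0, 1.0)   # equal signals -> centred hit
```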

  20. Understanding and predicting the dynamics of tokamak discharges during startup and rampdown

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, G. L.; Politzer, P. A.; Humphreys, D. A.

    Understanding the dynamics of plasma startup and termination is important for present tokamaks and for predictive modeling of future burning plasma devices such as ITER. We report on experiments in the DIII-D tokamak that explore the plasma startup and rampdown phases and on the benchmarking of transport models. Key issues have been examined such as plasma initiation and burnthrough with limited inductive voltage and achieving flattop and maximum burn within the technical limits of coil systems and their actuators while maintaining the desired q profile. Successful rampdown requires scenarios consistent with technical limits, including controlled H-L transitions, while avoiding vertical instabilities, additional Ohmic transformer flux consumption, and density limit disruptions. Discharges were typically initiated with an inductive electric field typical of ITER, 0.3 V/m, most with second harmonic electron cyclotron assist. A fast framing camera was used during breakdown and burnthrough of low-Z impurity charge states to study the formation physics. An improved 'large aperture' ITER startup scenario was developed, and aperture reduction in rampdown was found to be essential to avoid instabilities. Current evolution using neoclassical conductivity in the CORSICA code agrees with rampup experiments, but the prediction of the temperature and internal inductance evolution using the Coppi-Tang model for electron energy transport is not yet accurate enough to allow extrapolation to future devices.

  1. Photometric Studies of GEO Debris

    NASA Technical Reports Server (NTRS)

    Seitzer, Patrick; Cowardin, Heather M.; Barker, Edwin; Abercromby, Kira J.; Foreman, Gary; Horstman, Matt

    2009-01-01

    The photometric signature of a debris object can be useful in determining the physical characteristics of a piece of debris. We report on optical observations in multiple filters of debris at geosynchronous Earth orbit (GEO). Our sample is taken from GEO objects discovered in a survey with the University of Michigan's 0.6-m aperture Schmidt telescope MODEST (for Michigan Orbital DEbris Survey Telescope), and then followed up in real time with the SMARTS (Small and Medium Aperture Research Telescope System) 0.9-m at CTIO for orbits and photometry. Our goal is to determine six-parameter orbits and measure colors for all objects fainter than R = 15th magnitude that are discovered in the MODEST survey. At this magnitude the distribution of observed angular rates changes significantly from that of brighter objects. There are two objectives: 1. Estimate the orbital distribution of objects selected on the basis of two observational criteria: brightness (magnitude) and angular rates. 2. Obtain magnitudes and colors in standard astronomical filters (BVRI) for comparison with reflectance spectra of likely spacecraft materials. What is the faint debris likely to be? In this paper we report on the photometric results. For a sample of 50 objects, more than 90 calibrated sequences of R-B-V-I-R magnitudes have been obtained with the CTIO 0.9-m. For objects that do not show large brightness variations, the colors are largely redder than solar in both B-R and R-I. The width of the color distribution may be intrinsic to the nature of the surfaces, but it could also be that we are seeing irregularly shaped objects and measuring the colors at different times with just one telescope. For a smaller sample of objects we have observed with synchronized CCD cameras on the two telescopes. The CTIO 0.9-m observes in B, and MODEST in R. The CCD cameras are electronically linked together so that the start time and duration of observations are the same to better than 50 milliseconds.
Thus the B-R color is a true measure of the surface of the debris piece facing the telescopes for that observation. Any change in color reflects a real change in the debris surface.

  2. Accurate reconstruction of hyperspectral images from compressive sensing measurements

    NASA Astrophysics Data System (ADS)

    Greer, John B.; Flake, J. C.

    2013-05-01

    The emerging field of Compressive Sensing (CS) provides a new way to capture data by shifting the heaviest burden of data collection from the sensor to the computer on the user-end. This new means of sensing requires fewer measurements for a given amount of information than traditional sensors. We investigate the efficacy of CS for capturing HyperSpectral Imagery (HSI) remotely. We also introduce a new family of algorithms for constructing HSI from CS measurements with Split Bregman Iteration [Goldstein and Osher, 2009]. These algorithms combine spatial Total Variation (TV) with smoothing in the spectral dimension. We examine models for three different CS sensors: the Coded Aperture Snapshot Spectral Imager-Single Disperser (CASSI-SD) [Wagadarikar et al., 2008] and Dual Disperser (CASSI-DD) [Gehm et al., 2007] cameras, and a hypothetical random sensing model closer to CS theory, but not necessarily implementable with existing technology. We simulate the capture of remotely sensed images by applying the sensor forward models to well-known HSI scenes: an AVIRIS image of Cuprite, Nevada, and the HYMAP Urban image. To measure accuracy of the CS models, we compare the scenes constructed with our new algorithm to the original AVIRIS and HYMAP cubes. The results demonstrate the possibility of accurately sensing HSI remotely with significantly fewer measurements than standard hyperspectral cameras.
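The TV-regularized reconstruction named in the abstract can be illustrated in its simplest setting. The following is a minimal sketch of Split Bregman iteration for 1D total-variation denoising: a single spatial line rather than the paper's full 3D spatial-TV-plus-spectral-smoothing model, with all parameter values purely illustrative.

```python
import numpy as np

def tv_denoise_1d(f, mu=10.0, lam=1.0, n_iter=100):
    """Split Bregman iteration for 1D TV denoising:
    minimize (mu/2)||u - f||^2 + |D u|_1, D = forward difference."""
    n = len(f)
    D = np.zeros((n - 1, n))                 # forward-difference operator
    for i in range(n - 1):
        D[i, i], D[i, i + 1] = -1.0, 1.0
    A = mu * np.eye(n) + lam * D.T @ D       # system matrix for the u-update
    d = np.zeros(n - 1)                      # auxiliary variable, d ~ D u
    b = np.zeros(n - 1)                      # Bregman variable
    u = f.copy()
    for _ in range(n_iter):
        # Quadratic u-subproblem, then soft-shrinkage d-update, then Bregman update.
        u = np.linalg.solve(A, mu * f + lam * D.T @ (d - b))
        v = D @ u + b
        d = np.sign(v) * np.maximum(np.abs(v) - 1.0 / lam, 0.0)
        b = b + D @ u - d
    return u

rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])   # piecewise-constant signal
noisy = clean + 0.2 * rng.standard_normal(100)
denoised = tv_denoise_1d(noisy)
```

The same alternation (one quadratic solve, one shrinkage, one Bregman update per iteration) carries over to the hyperspectral cube, with the quadratic solve there involving the sensor forward model.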

  3. Visual field information in Nap-of-the-Earth flight by teleoperated Helmet-Mounted displays

    NASA Technical Reports Server (NTRS)

    Grunwald, Arthur J.; Kohn, S.; Merhav, S. J.

    1991-01-01

    The human ability to derive Control-Oriented Visual Field Information from teleoperated Helmet-Mounted Displays in Nap-of-the-Earth flight is investigated. The visual field with these types of displays originates from a Forward-Looking Infrared camera, gimbal-mounted at the front of the aircraft and slaved to the pilot's line of sight to obtain wide-angle visual coverage. Although these displays have proved effective in Apache and Cobra helicopter night operations, they demand very high pilot proficiency and workload. Experimental work presented in the paper has shown that part of the difficulty encountered in vehicular control with these displays can be attributed to the narrow viewing aperture and to phase lags in the head/camera slaving system. Both shortcomings impair visuo-vestibular coordination when voluntary head rotation is present. This can result in errors in estimating the Control-Oriented Visual Field Information vital to vehicular control, such as the vehicle yaw rate or the anticipated flight path, or can even lead to visuo-vestibular conflicts (motion sickness). Since, under these conditions, the pilot will tend to minimize head rotation, the full wide-angle coverage of the Helmet-Mounted Display provided by the line-of-sight slaving system is not always fully utilized.

  4. Characterization of the reference wave in a compact digital holographic camera.

    PubMed

    Park, I S; Middleton, R J C; Coggrave, C R; Ruiz, P D; Coupland, J M

    2018-01-01

    A hologram is a recording of the interference between an unknown object wave and a coherent reference wave. Providing the object and reference waves are sufficiently separated in some region of space and the reference beam is known, a high-fidelity reconstruction of the object wave is possible. In traditional optical holography, high-quality reconstruction is achieved by careful reillumination of the holographic plate with the exact same reference wave that was used at the recording stage. To reconstruct high-quality digital holograms the exact parameters of the reference wave must be known mathematically. This paper discusses a technique that obtains the mathematical parameters that characterize a strongly divergent reference wave that originates from a fiber source in a new compact digital holographic camera. This is a lensless design that is similar in principle to a Fourier hologram, but because of the large numerical aperture, the usual paraxial approximations cannot be applied and the Fourier relationship is inexact. To characterize the reference wave, recordings of quasi-planar object waves are made at various angles of incidence using a Dammann grating. An optimization process is then used to find the reference wave that reconstructs a stigmatic image of the object wave regardless of the angle of incidence.

  5. FPscope: a field-portable high-resolution microscope using a cellphone lens.

    PubMed

    Dong, Siyuan; Guo, Kaikai; Nanda, Pariksheet; Shiradkar, Radhika; Zheng, Guoan

    2014-10-01

    The large consumer market has made cellphone lens modules available at low cost and in high quality. In a conventional cellphone camera, the lens module is used to demagnify the scene onto the image plane of the camera, where the image sensor is located. In this work, we report a 3D-printed high-resolution Fourier ptychographic microscope, termed FPscope, which uses a cellphone lens in a reverse manner. In our platform, we replace the image sensor with sample specimens, and use the cellphone lens to project the magnified image to the detector. To supersede the diffraction limit of the lens module, we use an LED array to illuminate the sample from different incident angles and synthesize the acquired images using the Fourier ptychographic algorithm. As a demonstration, we use the reported platform to acquire high-resolution images of a resolution target and biological specimens, with a maximum synthetic numerical aperture (NA) of 0.5. We also show that the depth of focus of the reported platform is about 0.1 mm, orders of magnitude longer than that of a conventional microscope objective with a similar NA. The reported platform may enable healthcare access in low-resource settings. It can also be used to demonstrate the concept of computational optics for educational purposes.

  6. X-ray Measurements of Laser Irradiated Foam Filled Liners

    NASA Astrophysics Data System (ADS)

    Patankar, Siddharth; Mariscal, Derek; Goyon, Clement; Baker, Kevin; MacLaren, Stephan; Hammer, Jim; Baumann, Ted; Amendt, Peter; Menapace, Joseph; Berger, Bob; Afeyan, Bedros; Tabak, Max; Dixit, Sham; Kim, Sung Ho; Moody, John; Jones, Ogden

    2016-10-01

    Low-density foam liners are being investigated as sources of efficient x-rays. Understanding the laser-foam interaction is key to modeling and optimizing foam composition and density for x-ray production with reduced backscatter. We report on the experimental results of laser-irradiated foam liners filled with SiO2 and Ta2O5 foams at densities between 2 and 30 mg/cc. The foam liners consist of polyimide tubes filled with low-density foams and sealed with a gold foil at one end. The open end of the tube is driven with 250 J of 527 nm laser light in a 2 ns two-step pulse using the Jupiter Laser Facility at LLNL. A full-aperture backscatter system is used to diagnose the coupled energy and losses. A streaked x-ray camera and filtered x-ray pinhole cameras are used to measure laser penetration into the low-density foam for different mass densities. A HOPG crystal spectrometer is used to estimate a thermal electron temperature. Comparisons with beam propagation and x-ray emission simulations are presented. This work was performed under the auspices of the U.S. Department of Energy by the Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, with funding support from the Laboratory Directed Research and Development Program under project 15.

  7. A mobile device-based imaging spectrometer for environmental monitoring by attaching a lightweight small module to a commercial digital camera.

    PubMed

    Cai, Fuhong; Lu, Wen; Shi, Wuxiong; He, Sailing

    2017-11-15

    Spatially-explicit data are essential for remote sensing of ecological phenomena. Recent innovations in mobile device platforms have led to an upsurge in on-site rapid detection. For instance, CMOS chips in smartphones and digital cameras serve as excellent sensors for scientific research. In this paper, a mobile device-based imaging spectrometer module (weighing about 99 g) is developed and mounted on a single-lens reflex camera. Utilizing this lightweight module, together with commonly used photographic equipment, we demonstrate its utility through a series of on-site multispectral imaging experiments, including ocean (or lake) water-color sensing and plant reflectance measurement. Based on the experiments we obtain 3D spectral image cubes, which can be further analyzed for environmental monitoring. Moreover, our system can be applied to many kinds of cameras, e.g., aerial and underwater cameras. Therefore, any camera can be upgraded to an imaging spectrometer with the help of our miniaturized module. We believe it has the potential to become a versatile tool for on-site investigation in many applications.

  8. Single-snapshot 2D color measurement by plenoptic imaging system

    NASA Astrophysics Data System (ADS)

    Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana

    2014-03-01

    Plenoptic cameras enable capture of directional light ray information, thus allowing applications such as digital refocusing, depth estimation, or multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor high color fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision with this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing, and color estimation processing. The optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of displays and show that it achieves a color accuracy of ΔE < 0.01.

  9. Issues in implementing services for a wireless web-enabled digital camera

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Sampat, Nitin; Fisher, Yoram; Canosa, John; Noel, Nicholas

    2001-05-01

    The competition in the exploding digital photography market has caused vendors to explore new ways to increase their return on investment. A common view among industry analysts is that increasingly it will be services provided by these cameras, and not the cameras themselves, that will provide the revenue stream. These services will be coupled to e-appliance-based communities. In addition, the rapidly increasing need to upload images to the Internet for photofinishing services, as well as the need to download software upgrades to the camera, is driving many camera OEMs to evaluate the benefits of using the wireless web to extend their enterprise systems. Currently, creating a viable e-appliance such as a digital camera coupled with a wireless web service requires more than just a competency in product development. This paper will evaluate the system implications in the deployment of recurring revenue services and enterprise connectivity of a wireless, web-enabled digital camera. These include, among other things, an architectural design approach for services such as device management, synchronization, billing, connectivity, and security. Such an evaluation will assist, we hope, anyone designing or connecting a digital camera to enterprise systems.

  10. Standard design for National Ignition Facility x-ray streak and framing cameras.

    PubMed

    Kimbrough, J R; Bell, P M; Bradley, D K; Holder, J P; Kalantar, D K; MacPhee, A G; Telford, S

    2010-10-01

    The x-ray streak camera and x-ray framing camera for the National Ignition Facility were redesigned to improve electromagnetic pulse hardening, protect high voltage circuits from pressure transients, and maximize the use of common parts and operational software. Both instruments use the same PC104 based controller, interface, power supply, charge coupled device camera, protective hermetically sealed housing, and mechanical interfaces. Communication is over fiber optics with identical facility hardware for both instruments. Each has three triggers that can be either fiber optic or coax. High voltage protection consists of a vacuum sensor to enable the high voltage and pulsed microchannel plate phosphor voltage. In the streak camera, the high voltage is removed after the sweep. Both rely on the hardened aluminum box and a custom power supply to reduce electromagnetic pulse/electromagnetic interference (EMP/EMI) getting into the electronics. In addition, the streak camera has an EMP/EMI shield enclosing the front of the streak tube.

  11. Ranging Apparatus and Method Implementing Stereo Vision System

    NASA Technical Reports Server (NTRS)

    Li, Larry C. (Inventor); Cox, Brian J. (Inventor)

    1997-01-01

    A laser-directed ranging system for use in telerobotics applications and other applications involving physically handicapped individuals. The ranging system includes a left and right video camera mounted on a camera platform, and a remotely positioned operator. The position of the camera platform is controlled by three servo motors to orient the roll axis, pitch axis and yaw axis of the video cameras, based upon an operator input such as head motion. A laser is provided between the left and right video camera and is directed by the user to point to a target device. The images produced by the left and right video cameras are processed to eliminate all background images except for the spot created by the laser. This processing is performed by creating a digital image of the target prior to illumination by the laser, and then eliminating common pixels from the subsequent digital image which includes the laser spot. The horizontal disparity between the two processed images is calculated for use in a stereometric ranging analysis from which range is determined.
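For a rectified two-camera rig like the one described above, the stereometric ranging step follows from the horizontal disparity via the standard pinhole relation Z = f·B/d. A minimal sketch, with the focal length, baseline, and disparity values all hypothetical rather than taken from the patent:

```python
def stereo_range(disparity_px, focal_length_px, baseline_m):
    """Range from the horizontal disparity between matched points in a
    rectified stereo pair: Z = f * B / d (pinhole model, parallel axes)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_length_px * baseline_m / disparity_px

# Example: 800-px focal length, 12-cm baseline, 16-px disparity -> 6 m range.
z = stereo_range(16, 800, 0.12)
```

In the patent's pipeline the disparity input is the pixel offset of the isolated laser spot between the left and right processed images.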

  12. Light-Directed Ranging System Implementing Single Camera System for Telerobotics Applications

    NASA Technical Reports Server (NTRS)

    Wells, Dennis L. (Inventor); Li, Larry C. (Inventor); Cox, Brian J. (Inventor)

    1997-01-01

    A laser-directed ranging system has utility for use in various fields, such as telerobotics applications and other applications involving physically handicapped individuals. The ranging system includes a single video camera and a directional light source such as a laser mounted on a camera platform, and a remotely positioned operator. In one embodiment, the position of the camera platform is controlled by three servo motors to orient the roll axis, pitch axis and yaw axis of the video cameras, based upon an operator input such as head motion. The laser is offset vertically and horizontally from the camera, and the laser/camera platform is directed by the user to point the laser and the camera toward a target device. The image produced by the video camera is processed to eliminate all background images except for the spot created by the laser. This processing is performed by creating a digital image of the target prior to illumination by the laser, and then eliminating common pixels from the subsequent digital image which includes the laser spot. A reference point is defined at a point in the video frame, which may be located outside of the image area of the camera. The disparity between the digital image of the laser spot and the reference point is calculated for use in a ranging analysis to determine range to the target.

  13. LEO-ground scintillation measurements with the optical ground station Oberpfaffenhofen and SOTA/OPALS space terminals

    NASA Astrophysics Data System (ADS)

    Moll, Florian; Kolev, Dimitar; Abrahamson, Matthew; Schmidt, Christopher; Mata Calvo, Ramon; Fuchs, Christian

    2016-10-01

    The optical satellite-ground channel is turbulent and causes scintillation of the power received by a ground-based telescope. Measurements are important to quantify the effect and evaluate common theory. A telescope with a 40 cm primary mirror is used to measure the signals from the OPALS terminal on the International Space Station and the SOTA terminal on the SOCRATES satellite. The measurement instrument is a pupil camera, from which images are recorded and the intensity scintillation index, power scintillation index, probability density function of intensity, and intensity correlation width are derived. A preliminary analysis of measurements from three satellite passes is performed, presented, and discussed. The intensity scintillation index ranges from 0.25 to 0.03 within elevations of 26 to 66 deg. The power scintillation index varies from 0.08 to 0.006, and the correlation width of intensity between 11 and 3 cm. The measurements can be used to estimate the fluctuation dynamics to be expected for a future operational ground receiver. The measurements are compared to model calculations based on the HV5/7 profile. Good agreement is observed in parts of the intensity scintillation index. Agreement is poorer for the power scintillation index and intensity correlation width. The reason seems to be a reduction of aperture averaging in some sections of the measurements due to increased speckle size. Finally, topics for future work are identified to improve the measurement analysis and to investigate the origin of the observed behavior more deeply.
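The intensity and power scintillation indices reported above follow the standard normalized-variance definition, sigma_I^2 = <I^2>/<I>^2 - 1. A minimal sketch, using synthetic log-normal intensity samples (a common weak-turbulence model) as a stand-in for the measured pupil-camera data:

```python
import numpy as np

def scintillation_index(samples):
    """Scintillation index sigma^2 = <I^2>/<I>^2 - 1 of intensity samples."""
    s = np.asarray(samples, dtype=float)
    return np.mean(s ** 2) / np.mean(s) ** 2 - 1.0

rng = np.random.default_rng(1)
# Log-normal intensity with log-std 0.2; analytic index is exp(0.2^2) - 1 ~ 0.041.
intensity = rng.lognormal(mean=0.0, sigma=0.2, size=100_000)
si = scintillation_index(intensity)
```

Applied to per-pixel pupil-camera intensities this gives the intensity scintillation index; applied to the aperture-summed power it gives the (aperture-averaged) power scintillation index.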

  14. Phase and amplitude wave front sensing and reconstruction with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Nelson, William; Davis, Christopher C.

    2014-10-01

    A plenoptic camera is a camera that can retrieve the direction and intensity distribution of the light rays it collects, enabling multiple reconstruction functions such as refocusing at different depths and 3D microscopy. Its principle is to add a microlens array to a traditional high-resolution camera to form a semi-camera array that preserves redundant intensity distributions of the light field and facilitates back-tracing of rays through geometric knowledge of its optical components. Though designed to process incoherent images, we found that the plenoptic camera shows high potential in solving coherent illumination cases, such as sensing both the amplitude and phase information of a distorted laser beam. Based on our earlier introduction of a prototype modified plenoptic camera, we have developed the complete algorithm to reconstruct the wavefront of the incident light field. In this paper the algorithm and experimental results are demonstrated, and an improved version of this modified plenoptic camera is discussed. As a result, our modified plenoptic camera can serve as an advanced wavefront sensor compared with traditional Shack-Hartmann sensors in handling complicated cases such as coherent illumination in strong turbulence, where interference and discontinuity of wavefronts are common. Especially in wave propagation through atmospheric turbulence, this camera should provide a much more precise description of the light field, which would guide adaptive optics systems to make intelligent analyses and corrections.

  15. Absolute colorimetric characterization of a DSLR camera

    NASA Astrophysics Data System (ADS)

    Guarnera, Giuseppe Claudio; Bianco, Simone; Schettini, Raimondo

    2014-03-01

    A simple but effective technique for absolute colorimetric camera characterization is proposed. It offers a large dynamic range, requiring just a single off-the-shelf target and a commonly available controllable light source for the characterization. The characterization task is broken down into two modules, devoted respectively to absolute luminance estimation and to colorimetric characterization matrix estimation. The characterized camera can be effectively used as a tele-colorimeter, giving an absolute estimation of the XYZ data in cd/m^2. The user is only required to vary the f-number of the camera lens or the exposure time t to better exploit the sensor dynamic range. The estimated absolute tristimulus values closely match the values measured by a professional spectro-radiometer.
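The colorimetric characterization matrix module can be sketched as a least-squares fit from camera RGB to XYZ over a set of target patches. The matrix entries and patch data below are simulated stand-ins, not the paper's measurements:

```python
import numpy as np

# Hypothetical ground-truth 3x3 characterization matrix (illustrative numbers).
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])

rng = np.random.default_rng(0)
rgb = rng.uniform(0.05, 0.95, size=(24, 3))   # 24 patches, e.g. a color chart
xyz = rgb @ M_true.T                          # simulated colorimeter readings

# Least-squares fit of M in xyz ~ rgb @ M.T over all patches.
M_fit, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
M_est = M_fit.T
```

With noise-free simulated data the fit recovers the matrix exactly; in practice the fit is done against spectro-radiometer measurements and paired with the absolute-luminance module to scale the XYZ output.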

  16. The Mast Cameras and Mars Descent Imager (MARDI) for the 2009 Mars Science Laboratory

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Bell, J. F.; Cameron, J.; Dietrich, W. E.; Edgett, K. S.; Hallet, B.; Herkenhoff, K. E.; Lemmon, M. T.; Parker, T. J.; Sullivan, R. J.

    2005-01-01

    Based on operational experience gained during the Mars Exploration Rover (MER) mission, we proposed and were selected to conduct two related imaging experiments: (1) an investigation of the geology and short-term atmospheric vertical wind profile local to the Mars Science Laboratory (MSL) landing site using descent imaging, and (2) a broadly-based scientific investigation of the MSL locale employing visible and very near infrared imaging techniques from a pair of mast-mounted, high resolution cameras. Both instruments share a common electronics design, a design also employed for the MSL Mars Hand Lens Imager (MAHLI) [1]. The primary differences between the cameras are in the nature and number of mechanisms and the specific optics tailored to each camera's requirements.

  17. Security camera resolution measurements: Horizontal TV lines versus modulation transfer function measurements.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birch, Gabriel Carisle; Griffin, John Clark

    2015-01-01

    The horizontal television lines (HTVL) metric has been the primary quantity used by division 6000 related to camera resolution for high-consequence security systems. This document shows that HTVL measurements are fundamentally insufficient as a metric to determine camera resolution, and proposes a quantitative, standards-based methodology: measuring the camera system modulation transfer function (MTF), the most common and accepted metric of resolution in the optical science community. Because HTVL calculations are easily misinterpreted or poorly defined, we present several scenarios in which HTVL is frequently reported and discuss their problems. The MTF metric is discussed, and scenarios are presented with calculations showing the application of such a metric.
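One common route to a camera-system MTF is the normalized Fourier magnitude of a measured line-spread function (LSF). A sketch using a synthetic Gaussian LSF, whose analytic MTF is exp(-2*pi^2*sigma^2*f^2); the sampling values are illustrative, not from the report:

```python
import numpy as np

def mtf_from_lsf(lsf, dx):
    """MTF as the normalized magnitude of the Fourier transform of the
    line-spread function, sampled at spacing dx."""
    spec = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(len(lsf), d=dx)
    return freqs, spec / spec[0]

# Gaussian LSF of standard deviation sigma has MTF exp(-2 pi^2 sigma^2 f^2).
dx, sigma = 0.01, 0.05
x = np.arange(-2, 2, dx)
lsf = np.exp(-x ** 2 / (2 * sigma ** 2))
freqs, mtf = mtf_from_lsf(lsf, dx)
```

Unlike an HTVL figure, the resulting curve reports contrast transfer at every spatial frequency, so a resolution criterion (e.g. the frequency where MTF drops to a chosen threshold) can be stated unambiguously.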

  18. Effects of camera location on the reconstruction of 3D flare trajectory with two cameras

    NASA Astrophysics Data System (ADS)

    Özsaraç, Seçkin; Yeşilkaya, Muhammed

    2015-05-01

    Flares are used as valuable electronic warfare assets in the battle against infrared-guided missiles. The trajectory of the flare is one of the most important factors that determine the effectiveness of the countermeasure. Reconstruction of the three-dimensional (3D) position of a point seen by multiple cameras is a common problem. Camera placement, camera calibration, corresponding-pixel determination between the images of different cameras, and the triangulation algorithm all affect the performance of 3D position estimation. In this paper, we specifically investigate the effects of camera placement on flare trajectory estimation performance by simulation. First, the 3D trajectories of a flare and of the aircraft that dispenses it are generated with simple motion models. Then, we place two virtual ideal pinhole camera models at different locations. Assuming the cameras track the aircraft perfectly, the view vectors of the cameras are computed. Afterwards, using the view vector of each camera and the 3D position of the flare, the image-plane coordinates of the flare on both cameras are computed using the field of view (FOV) values. To increase the fidelity of the simulation, we use two sources of error. One models the uncertainties in the determination of the camera view vectors, i.e., the orientations of the cameras are measured with noise. The second noise source models the imperfections of corresponding-pixel determination of the flare between the two cameras. Finally, the 3D position of the flare is estimated from the corresponding pixel indices, the view vectors, and the FOV of the cameras by triangulation. All the processes mentioned so far are repeated for different relative camera placements so that the optimum estimation-error performance is found for the given aircraft and flare trajectories.
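The triangulation step described above — intersecting two (generally noisy, hence skew) view rays — can be sketched with the midpoint method, which returns the point closest to both rays. Camera positions and the target point below are made-up numbers:

```python
import numpy as np

def triangulate_midpoint(p1, d1, p2, d2):
    """Midpoint triangulation: the point closest to two viewing rays
    p_i + t_i * d_i (camera positions p_i, view direction vectors d_i)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Normal equations for t1, t2 minimizing ||(p1 + t1 d1) - (p2 + t2 d2)||.
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

target = np.array([100.0, 50.0, 20.0])
c1 = np.array([0.0, 0.0, 0.0])
c2 = np.array([30.0, 0.0, 0.0])
est = triangulate_midpoint(c1, target - c1, c2, target - c2)
```

With noise added to the view directions, the estimation error of this midpoint varies with the camera baseline and geometry, which is exactly the sensitivity the paper sweeps over camera placements.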

  19. An electrically tunable plenoptic camera using a liquid crystal microlens array.

    PubMed

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  20. An electrically tunable plenoptic camera using a liquid crystal microlens array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Yu; School of Automation, Huazhong University of Science and Technology, Wuhan 430074; Wuhan National Laboratory for Optoelectronics, Huazhong University of Science and Technology, Wuhan 430074

    2015-05-15

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  1. An electrically tunable plenoptic camera using a liquid crystal microlens array

    NASA Astrophysics Data System (ADS)

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  2. More than sixty origins of pantoporate pollen in angiosperms.

    PubMed

    Prieu, Charlotte; Sauquet, Hervé; Gouyon, Pierre-Henri; Albert, Béatrice

    2017-12-01

    Apertures in pollen grains are key structures of the wall, involved in pollen tube germination and exchanges with the environment. Aperture types in angiosperms are diverse, but pollen with one and three apertures (including monosulcate and tricolpate, respectively) are the two most common types. Here, we investigate the phylogenetic distribution in angiosperms of pollen with many round, scattered apertures called pantoporate pollen. We constructed a morphological data set with species producing pantoporate pollen and representative angiosperm species with other pollen types, sampled from every angiosperm order, with a total of 1260 species distributed in 330 families. This data set was analyzed with parsimony to characterize the phylogenetic distribution of pantoporate pollen in angiosperms. We show that pantoporate pollen is distributed throughout most of the angiosperm tree, including early diverging angiosperms, monocots, and eudicots. However, this pollen type is usually restricted to a few species in a given group, and is seldom fixed at large taxonomical scales, with a few notable exceptions. Pantoporate pollen evolved many times during angiosperm history, but the persistence of this morphology in the long term is infrequent. This distribution pattern could indicate conflicting short-term and long-term selective pressures, pantoporate pollen being selected in the short run, but eliminated in the long run. Biological hypotheses supporting this scenario are discussed, in the context of both theoretical and empirical data on pollen biology. © 2017 Botanical Society of America.

  3. Optical design of optical synthetic aperture telescope

    NASA Astrophysics Data System (ADS)

    Zhou, Chenghao; Wang, Zhile

    2018-03-01

    Optical synthetic aperture (OSA) is a promising solution for very high-resolution imaging while reducing volume and mass. In this paper, first, the configurations of OSA systems are analyzed, and the design methods for the two types (Fizeau and Michelson) of OSA systems are summarized and studied. Second, Fizeau and Michelson OSA prototype systems are designed in detail. In the Michelson configuration, the instrument is made of sub-telescopes distributed in the entrance pupil and combined by a common telescope via a phase delay line. The design of the Michelson configuration is more difficult than that of the Fizeau configuration. In the design of the Fizeau configuration, a two-mirror reflective system is designed according to third-order aberration theory. Then the primary mirror of the two-mirror system is replaced by the synthetic aperture. The whole system was simulated in Zemax to obtain the modulation transfer function (MTF). In the design of the Michelson configuration, the system is first divided into three parts: the afocal interferometric telescopes, the beam-combiner system, and the phase delay line. The three parts are designed separately and then combined in Zemax to obtain the MTF.

  4. Beamforming using subspace estimation from a diagonally averaged sample covariance.

    PubMed

    Quijano, Jorge E; Zurk, Lisa M

    2017-08-01

    The potential benefit of a large-aperture sonar array for high resolution target localization is often challenged by the lack of sufficient data required for adaptive beamforming. This paper introduces a Toeplitz-constrained estimator of the clairvoyant signal covariance matrix corresponding to multiple far-field targets embedded in background isotropic noise. The estimator is obtained by averaging along subdiagonals of the sample covariance matrix, followed by covariance extrapolation using the method of maximum entropy. The sample covariance is computed from limited data snapshots, a situation commonly encountered with large-aperture arrays in environments characterized by short periods of local stationarity. Eigenvectors computed from the Toeplitz-constrained covariance are used to construct signal-subspace projector matrices, which are shown to reduce background noise and improve detection of closely spaced targets when applied to subspace beamforming. Monte Carlo simulations corresponding to increasing array aperture suggest convergence of the proposed projector to the clairvoyant signal projector, thereby outperforming the classic projector obtained from the sample eigenvectors. Beamforming performance of the proposed method is analyzed using simulated data, as well as experimental data from the Shallow Water Array Performance experiment.
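
    The diagonal-averaging step at the core of this estimator is straightforward to sketch. The following is a minimal illustration (not the authors' code; the maximum-entropy extrapolation step is omitted), assuming a uniform line array and numpy/scipy:

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_average(R):
    """Average the sample covariance along its subdiagonals to obtain a
    Hermitian Toeplitz estimate (valid for far-field sources in isotropic
    noise on a uniform line array)."""
    N = R.shape[0]
    col = np.array([np.diag(R, -k).mean() for k in range(N)])
    return toeplitz(col)            # scipy defaults the row to conj(col)

def signal_subspace_projector(R, num_sources):
    """Projector onto the span of the dominant eigenvectors."""
    _, V = np.linalg.eigh(R)        # eigenvalues in ascending order
    Es = V[:, -num_sources:]
    return Es @ Es.conj().T

# Demo: one plane-wave source on an 8-element array, only 4 snapshots
# (the snapshot-starved regime the abstract describes).
rng = np.random.default_rng(0)
N, K = 8, 4
steer = np.exp(1j * np.pi * np.arange(N) * np.sin(0.3))
amps = rng.standard_normal(K) + 1j * rng.standard_normal(K)
noise = 0.1 * (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K)))
snaps = np.outer(steer, amps) + noise
R = snaps @ snaps.conj().T / K                  # sample covariance
P = signal_subspace_projector(toeplitz_average(R), num_sources=1)
```

The projector P can then be applied to the data before conventional or adaptive beamforming, as in the paper.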

  5. Improved high-throughput quantification of luminescent microplate assays using a common Western-blot imaging system.

    PubMed

    Hawkins, Liam J; Storey, Kenneth B

    2017-01-01

    Common Western-blot imaging systems have previously been adapted to measure signals from luminescent microplate assays. This can be a cost-saving measure, as Western-blot imaging systems are common laboratory equipment and can substitute for a dedicated luminometer if one is not otherwise available. One previously unrecognized limitation is that the signals captured by the cameras in these systems are not equal for all wells. Signals depend on the angle of incidence to the camera, and thus on the location of the well on the microplate. Here we show that:
•The position of a well on a microplate significantly affects the signal captured by a common Western-blot imaging system from a luminescent assay.
•The effect of well position can easily be corrected for.
•This method can be applied to commercially available luminescent assays, allowing for high-throughput quantification of a wide range of biological processes and biochemical reactions.
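
    A correction of this kind can be implemented as a simple per-well normalization. A minimal sketch (hypothetical numbers, not the published protocol): image a reference plate in which every well holds the same luminescent standard, derive per-well factors, and divide assay readings by them.

```python
import numpy as np

def well_correction_factors(uniform_plate):
    """Per-well correction factors from a reference plate in which
    every well contains the same luminescent standard."""
    return uniform_plate / uniform_plate.max()

def correct_plate(raw, factors):
    """Divide out the position-dependent attenuation."""
    return raw / factors

# Demo (invented numbers): a 96-well plate whose outer wells read
# lower purely because of their angle to the camera.
rows, cols = 8, 12
rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
dist = np.hypot(rr - (rows - 1) / 2, cc - (cols - 1) / 2)
attenuation = 1.0 - 0.03 * dist           # signal falls off toward edges
true_signal = 1000.0
reference = true_signal * attenuation     # uniform-standard plate
factors = well_correction_factors(reference)
corrected = correct_plate(reference, factors)
```

After correction, every well of the uniform plate reads the same value, which is the behavior the authors report for their method.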

  6. Spectrally resolved single-shot wavefront sensing of broadband high-harmonic sources

    NASA Astrophysics Data System (ADS)

    Freisem, L.; Jansen, G. S. M.; Rudolf, D.; Eikema, K. S. E.; Witte, S.

    2018-03-01

    Wavefront sensors are an important tool to characterize coherent beams of extreme ultraviolet radiation. However, conventional Hartmann-type sensors do not allow for independent wavefront characterization of different spectral components that may be present in a beam, which limits their applicability for intrinsically broadband high-harmonic generation (HHG) sources. Here we introduce a wavefront sensor that measures the wavefronts of all the harmonics in a HHG beam in a single camera exposure. By replacing the mask apertures with transmission gratings at different orientations, we simultaneously detect harmonic wavefronts and spectra, and obtain sensitivity to spatiotemporal structure such as pulse front tilt as well. We demonstrate the capabilities of the sensor through a parallel measurement of the wavefronts of 9 harmonics in a wavelength range between 25 and 49 nm, with up to λ/32 precision.

  7. A conceptual design study for a two-dimensional, electronically scanned thinned array radiometer

    NASA Technical Reports Server (NTRS)

    Mutton, Philip; Chromik, Christopher C.; Dixon, Iain; Statham, Richard B.; Stillwagen, Frederic H.; Vontheumer, Alfred E.; Sasamoto, Washito A.; Garn, Paul A.; Cosgrove, Patrick A.; Ganoe, George G.

    1993-01-01

    A conceptual design for the Two-Dimensional, Electronically Steered Thinned Array Radiometer (ESTAR) is described. This instrument is a synthetic aperture microwave radiometer that operates in the L-band frequency range for the measurement of soil moisture and ocean salinity. Two auxiliary instruments, an 8-12 micron scanning infrared radiometer and a 0.4-1.0 micron charge coupled device (CCD) video camera, are included to provide data for sea surface temperature measurements and spatial registration of targets, respectively. The science requirements were defined by Goddard Space Flight Center. The instrument and spacecraft configurations are described for missions using the Pegasus and Taurus launch vehicles. The analyses and design trades described include: estimations of size, mass, and power; instrument viewing coverage; mechanical design trades; structural and thermal analyses; data and communications performance assessments; and cost estimation.

  8. Advanced Sensors and Applications Study (ASAS)

    NASA Technical Reports Server (NTRS)

    Chism, S. B.; Hughes, C. L.

    1976-01-01

    The present EOD requirements for sensors in the space shuttle era are reported, with emphasis on those applications deemed important enough to warrant separate sections. The application areas developed are: (1) agriculture; (2) atmospheric corrections; (3) cartography; (4) coastal studies; (5) forestry; (6) geology; (7) hydrology; (8) land use; (9) oceanography; and (10) soil moisture. For each application area, the following aspects were covered: (1) specific goals and techniques; (2) individual sensor requirements, including types, bands, resolution, etc.; (3) definition of mission requirements, orbit types, coverages, etc.; and (4) discussion of anticipated problem areas and solutions. The remote sensors required for these application areas include: (1) camera systems; (2) multispectral scanners; (3) microwave scatterometers; (4) synthetic aperture radars; (5) microwave radiometers; and (6) vidicons. The emphasis in the remote sensor area was on the evaluation of present technology and its implications for future systems.

  9. System Integration of FastSPECT III, a Dedicated SPECT Rodent-Brain Imager Based on BazookaSPECT Detector Technology

    PubMed Central

    Miller, Brian W.; Furenlid, Lars R.; Moore, Stephen K.; Barber, H. Bradford; Nagarkar, Vivek V.; Barrett, Harrison H.

    2010-01-01

    FastSPECT III is a stationary, single-photon emission computed tomography (SPECT) imager designed specifically for imaging and studying neurological pathologies in rodent brain, including Alzheimer’s and Parkinson’s disease. Twenty independent BazookaSPECT [1] gamma-ray detectors acquire projections of a spherical field of view with pinholes selected for desired resolution and sensitivity. Each BazookaSPECT detector comprises a columnar CsI(Tl) scintillator, image intensifier, optical lens, and fast-frame-rate CCD camera. Data stream back to processing computers via firewire interfaces, and heavy use of graphics processing units (GPUs) ensures that each frame of data is processed in real time to extract the images of individual gamma-ray events. Details of the system design, imaging aperture fabrication methods, and preliminary projection images are presented. PMID:21218137

  10. Image simulation for electron energy loss spectroscopy

    DOE PAGES

    Oxley, Mark P.; Pennycook, Stephen J.

    2007-10-22

    In this paper, aberration correction of the probe-forming optics of the scanning transmission electron microscope has allowed the probe-forming aperture to be increased in size, resulting in probes on the order of 1 Å in diameter. The next generation of correctors promises even smaller probes. Improved spectrometer optics also offers the possibility of larger electron energy loss spectrometry detectors. The localization of images based on core-loss electron energy loss spectroscopy is examined as a function of both probe-forming aperture and detector size. The effective ionization is nonlocal in nature, and two common local approximations are compared to full nonlocal calculations. Finally, the effect of channelling of the electron probe within the sample is also discussed.

  11. Wigner analysis of three dimensional pupil with finite lateral aperture

    PubMed Central

    Chen, Hsi-Hsun; Oh, Se Baek; Zhai, Xiaomin; Tsai, Jui-Chang; Cao, Liang-Cai; Barbastathis, George; Luo, Yuan

    2015-01-01

    A three dimensional (3D) pupil is an optical element, most commonly implemented on a volume hologram, that processes the incident optical field in a 3D fashion. Here we analyze the diffraction properties of a 3D pupil with finite lateral aperture in the 4-f imaging system configuration, using the Wigner Distribution Function (WDF) formulation. Since the 3D imaging pupil is finite in both lateral and longitudinal directions, the WDF of the volume holographic 4-f imager theoretically predicts distinct Bragg diffraction patterns in phase space. These result in asymmetric profiles of the diffracted coherent point spread function between degenerate diffraction and Bragg diffraction, elucidating the fundamental performance of volume holographic imaging. Experimental measurements are also presented, confirming the theoretical predictions. PMID:25836443
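
    For reference, the WDF used in such phase-space analyses follows the standard definition, with x the spatial coordinate and u the spatial frequency of the field f:

```latex
W(x,u) = \int_{-\infty}^{\infty}
  f\!\left(x + \tfrac{x'}{2}\right)
  f^{*}\!\left(x - \tfrac{x'}{2}\right)
  e^{-i 2\pi u x'}\, dx'
```

A finite lateral aperture windows f in x, while the finite hologram thickness constrains the Bragg-matched region in u, which is why the phase-space support of the diffracted field becomes asymmetric.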

  12. Thermal damage study of beryllium windows used as vacuum barriers in synchrotron radiation beamlines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holdener, F.R.; Johnson, G.L.; Karpenko, V.P.

    An experimental study to investigate thermal-induced damage to SSRL-designed beryllium foil windows was performed at LLNL's Laser Welding Research Facility. The primary goal of this study was to determine the threshold at which thermal-stress-induced damage occurs in these commonly used vacuum barriers. An Nd:YAG pulsed laser with cylindrical optics and a carefully designed test cell provided a test environment that closely resembles the actual beamline conditions at SSRL. Tests performed on two beryllium window geometries, with different vertical aperture dimensions but equal foil thicknesses of 0.254 mm, resulted in two focused total-power thresholds at which incipient damage was determined. For a beam spot size similar to that of the Beamline-X Wiggler Line, onset of surface damage for a 5-mm by 25-mm aperture window was observed at 170 W after 174,000 laser pulses (1.2-ms pulse at 100 pps). A second window with double the vertical aperture dimension (10 mm by 25 mm) was observed to have surface cracking after 180,000 laser pulses with 85 W impinging its front surface. It failed after approximately 1,000,000 pulses. Another window of the same type (10 mm by 25 mm) received 2,160,000 laser pulses at 74.4 W, and subsequent metallographic sectioning revealed no signs of through-thickness damage. Comparison of windows with equal foil thicknesses and aperture dimensions has effectively identified the heat flux limit for incipient failure. The data show that halving the aperture's vertical dimension allows doubling the total incident power for equivalent onsets of thermal-induced damage.

  13. Stitching interferometry for ellipsoidal x-ray mirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yumoto, Hirokatsu, E-mail: yumoto@spring8.or.jp; Koyama, Takahisa; Matsuyama, Satoshi

    2016-05-15

    Ellipsoidal mirrors, which can efficiently produce a two-dimensional focusing beam with a single mirror, are superior x-ray focusing optics, especially when compared to elliptical-cylinder mirrors in the Kirkpatrick–Baez geometry. However, nano-focusing ellipsoidal mirrors are not commonly used for x-ray optics because achieving the accuracy required for the surface metrology of nano-focusing ellipsoidal mirrors is difficult due to their small radius of curvature along the short ellipsoidal axis. Here, we developed a surface metrology system for nano-focusing ellipsoidal mirrors using stitching interferometric techniques. The developed system simultaneously measures sub-aperture shapes with a microscopic interferometer and the tilt angles of the sub-aperture shapes with a large Fizeau interferometer. After correcting the systematic errors included in the sub-aperture shapes, the entire mirror shape is calculated by stitching the sub-aperture shapes based on the obtained relative angles between partially overlapped sub-apertures. In this study, we developed correction methods for systematic errors in sub-aperture shapes that originated from off-axis aberrations produced in the optics of the microscopic interferometer. The systematic errors on an ellipsoidal mirror were estimated by measuring a series of tilted plane substrates and the ellipsoidal substrate. From measurements of an ellipsoidal mirror with a 3.6-mm radius of curvature at the mirror center, we obtained a measurement repeatability of 0.51 nm (root-mean-square) in an assessment area of 0.5 mm × 99.18 mm. This value satisfies the requirements for surface metrology of nano-focusing x-ray mirrors. Thus, the developed metrology system should be applicable for fabricating nano-focusing ellipsoidal mirrors.

  14. Characterizing volcanic activity: Application of freely-available webcams

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Harrild, M.; Webley, P. W.

    2017-12-01

    In recent years, freely-available web-based cameras, or webcams, have become more readily available, allowing an increased level of monitoring at active volcanoes across the globe. While these cameras have been used extensively as qualitative tools, they provide a unique dataset for quantitative analyses of the changing behavior of a particular volcano within the camera's field of view. We focus on the multitude of these freely-available webcams and present a new algorithm to detect changes in volcanic activity using nighttime webcam data. Our approach uses a quick, efficient, and fully automated algorithm to identify changes in webcam data in near real-time, including techniques such as edge detection, Gaussian mixture models, and temporal/spatial statistical tests, which are applied to each target image. Often the image metadata (exposure, gain settings, aperture, focal length, etc.) are unknown, so we developed our algorithm to identify the number of volcanically incandescent pixels, together with the number of specific algorithm tests needed to detect thermal activity, rather than directly correlating webcam brightness to eruption temperatures. We compared our algorithm results to a manual analysis of webcam data for several volcanoes and determined a false detection rate of less than 3% for the automated approach. In our presentation, we describe the different tests integrated into our algorithm, lessons learned, and how we applied our method to several volcanoes across the North Pacific during its development and implementation. We will finish with a discussion of the global applicability of our approach and how to build a 24/7, 365-days-a-year tool that can be used as an additional data source for real-time analysis of volcanic activity.
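
    Because exposure and gain metadata are unknown, thresholds must come from each image's own statistics. A minimal sketch of that idea (illustrative only; the authors' algorithm also uses edge detection, Gaussian mixture models, and temporal/spatial tests):

```python
import numpy as np

def count_incandescent(frame, dark_percentile=90, k=5.0):
    """Count pixels glowing well above the night-sky background.
    The threshold is derived from the frame's own statistics because
    webcam exposure/gain metadata is usually unavailable."""
    background = np.percentile(frame, dark_percentile)
    noise = frame[frame <= background].std()
    return int(np.sum(frame > background + k * noise))

# Demo: a synthetic nighttime frame with a small incandescent source.
rng = np.random.default_rng(1)
frame = rng.normal(10.0, 2.0, size=(120, 160))   # dark, noisy sky
frame[40:44, 60:64] += 200.0                     # 16 glowing pixels
n_hot = count_incandescent(frame)
```

Tracking n_hot over successive frames, rather than absolute brightness, is what allows a detection without radiometric calibration.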

  15. A Real-Time Microwave Camera at K-Band (24 GHz)

    NASA Technical Reports Server (NTRS)

    Ghasr, M. T.; Abou-Khousa, M.; Baumgartner, M. A.; Zoughi, R.

    2009-01-01

    The nondestructive testing (NDT) community has been moving towards effective and robust inspection systems that can provide real-time information about the material, geometrical, structural, and mechanical characteristics of composite materials/structures. Moreover, in many applications it is desired to have an image of the structure, after which the information contained in the image is correlated to the above characteristics. Microwave signals penetrate inside dielectric composite structures, and their interaction with the interior of the structure renders critical information for NDT purposes. Subsequently, this information (i.e., magnitude and phase of the reflected signal) may be used to produce an image of the interior of the structure, revealing potential flaws or anomalies. Image processing and reconstruction techniques may also be incorporated to produce high-resolution images (e.g., synthetic-aperture, back-propagation, etc.). There are several different approaches to designing a real-time microwave camera system. One approach is based on the modulated scatterer technique (MST), which is used to tag the scattered electric field in a discrete two-dimensional (2D) spatial domain (e.g., a retina), resulting in the 2D magnitude and phase distribution of the scattered electric field that is required for producing an image of a material or structure under inspection. The ability to rapidly modulate resonant slot antennas in such a retina, along with using receivers with fast responses, provides for real-time image production capability. Design issues and criteria become more challenging at higher frequencies and for a relatively large retina size. This paper presents the basic design and challenges for a microwave camera with a retina size of 6" by 6" operating at a frequency of 24 GHz.

  16. Coaxial volumetric velocimetry

    NASA Astrophysics Data System (ADS)

    Schneiders, Jan F. G.; Scarano, Fulvio; Jux, Constantin; Sciacchitano, Andrea

    2018-06-01

    This study describes the working principles of the coaxial volumetric velocimeter (CVV) for wind tunnel measurements. The measurement system is derived from the concept of tomographic PIV in combination with recent developments of Lagrangian particle tracking. The main characteristic of the CVV is its small tomographic aperture and the coaxial arrangement between the illumination and imaging directions. The system consists of a multi-camera arrangement subtending only a few degrees of solid angle and a long focal depth. Contrary to established PIV practice, laser illumination is provided along the same direction as that of the camera views, reducing the optical access requirements to a single viewing direction. The laser light is expanded to illuminate the full field of view of the cameras. Such illumination and imaging conditions along a deep measurement volume dictate the use of tracer particles with a large scattering area. In the present work, helium-filled soap bubbles are used. The fundamental principles of the CVV in terms of dynamic velocity and spatial range are discussed. Maximum particle image density is shown to limit tracer particle seeding concentration and instantaneous spatial resolution. Time-averaged flow fields can be obtained at high spatial resolution by ensemble averaging. The use of the CVV for time-averaged measurements is demonstrated in two wind tunnel experiments. After comparing the CVV measurements with the potential flow in front of a sphere, the near-surface flow around a complex wind tunnel model of a cyclist is measured. The measurements yield the volumetric time-averaged velocity and vorticity field. The measurements of the streamlines in proximity of the surface give an indication of the skin-friction lines pattern, which is of use in the interpretation of the surface flow topology.
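
    The ensemble-averaging step, in which scattered particle velocities from many snapshots are binned onto a grid and averaged, can be sketched in one dimension (an illustration of the principle, not the authors' implementation):

```python
import numpy as np

def ensemble_average(x, u, edges):
    """Average scattered velocity samples u(x) into spatial bins."""
    nbins = len(edges) - 1
    idx = np.digitize(x, edges) - 1
    vsum = np.zeros(nbins)
    count = np.zeros(nbins)
    for i, ui in zip(idx, u):
        if 0 <= i < nbins:          # ignore samples outside the grid
            vsum[i] += ui
            count[i] += 1
    return np.where(count > 0, vsum / np.maximum(count, 1), np.nan)

# Demo: noisy tracer samples of a linear velocity profile u(x) = 2x.
rng = np.random.default_rng(4)
x = rng.uniform(0.0, 1.0, 20000)
u = 2.0 * x + rng.normal(0.0, 0.05, x.size)    # small measurement noise
edges = np.linspace(0.0, 1.0, 11)
u_bar = ensemble_average(x, u, edges)
centers = 0.5 * (edges[:-1] + edges[1:])
```

With enough samples per bin, the random tracking noise averages out, which is how the CVV recovers high spatial resolution in the time-averaged field despite sparse instantaneous seeding.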

  17. Toolkit for testing scientific CCD cameras

    NASA Astrophysics Data System (ADS)

    Uzycki, Janusz; Mankiewicz, Lech; Molak, Marcin; Wrochna, Grzegorz

    2006-03-01

    The CCD Toolkit (1) is a software tool for testing CCD cameras that allows one to measure important characteristics of a camera, such as readout noise, total gain, dark current, 'hot' pixels, useful area, etc. The application performs a statistical analysis of images saved in the FITS format, commonly used in astronomy. A graphical interface is based on the ROOT package, which offers high functionality and flexibility. The program was developed in a way that ensures future compatibility with different operating systems: Windows and Linux. The CCD Toolkit was created for the "Pi of the Sky" project collaboration (2).
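
    The readout-noise and gain measurements mentioned are conventionally obtained with the photon-transfer method, from two bias frames and two matched flat fields. A minimal sketch of that standard method on synthetic frames (not the CCD Toolkit's actual code):

```python
import numpy as np

def photon_transfer(b1, b2, f1, f2):
    """Gain (e-/ADU) and readout noise (e-) from two bias frames and
    two equally exposed flat fields (difference-frame method)."""
    signal = (f1.mean() + f2.mean()) - (b1.mean() + b2.mean())
    var_diff = np.var(f1 - f2) - np.var(b1 - b2)
    gain = signal / var_diff
    read_noise = gain * np.std(b1 - b2) / np.sqrt(2)
    return gain, read_noise

# Demo on synthetic frames: true gain 2.0 e-/ADU, read noise 5.0 e-.
rng = np.random.default_rng(2)
shape, gain_t, rn_t, level = (256, 256), 2.0, 5.0, 20000.0

def bias():
    return rng.normal(0.0, rn_t, shape) / gain_t + 100.0   # ADU, with offset

def flat():
    return rng.poisson(level, shape) / gain_t + bias()     # shot + read noise

g, rn = photon_transfer(bias(), bias(), flat(), flat())
```

Differencing the frame pairs cancels fixed-pattern non-uniformity, so only temporal (shot and read) noise enters the variance.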

  18. VizieR Online Data Catalog: Periods of 4-10 Myr old T Tauri members of Orion OB1 (Karim+, 2016)

    NASA Astrophysics Data System (ADS)

    Karim, M. T.; Stassun, K. G.; Briceno, C.; Vivas, A. K.; Raetz, S.; Mateu, C.; Downes, J. J.; Calvet, N.; Hernandez, J.; Neuhauser, R.; Mugrauer, M.; Takahashi, H.; Tachihara, K.; Chini, R.; Cruz-Dias, G. A.; Aarnio, A.; James, D. J.; Hackstein, M.

    2017-02-01

    The CIDA Variability Survey of Orion (CVSO) was carried out at the Llano del Hato National Astronomical Observatory in Venezuela, with the QUEST CCD mosaic camera (8000*8000 pixels) on the 1m (clear aperture) Schmidt telescope, with a plate scale of 1.02''/pixel and field of view of 5.4 deg2. This V-, RC-, and IC-band multi-epoch survey, covering ~180deg2 of the Orion OB1 association, spans a time baseline of 12yr, from 1998 December to 2011 February. The 25 Ori cluster was observed by the 0.6/0.9m Schmidt-type telescope at Jena Observatory (Germany), the two 5.9'' telescopes at Observatorio Cerro Armazones (OCA, Chile), and the 1.5m reflector at the Gunma Astronomical Observatory in Japan, over four observing campaigns during the years 2010-2013. The Jena Schmidt-type telescope was equipped with the optical Schmidt Telescope Camera (STK), with an e2v 42-10 2048*2048 detector, yielding a plate scale of 1.55''/pixel and a field of view of 53'*53', thus encompassing most of the cluster. The Jena 50s exposures, all taken through the R filter, were centered on 25 Ori. A total of 8506 individual exposures were obtained in 108 nights. The Gunma 1.5m reflector observations were carried out by obtaining 60s integrations in R with the Gunma Low-resolution Spectrograph and Imager (GLOWS), which has an e2v CCD55-30 1250*1152 pixel detector with a 0.6''/pixel scale, covering a field of view of 12.5'*11.5'. Observations were obtained during four nights in 2010. The Observatorio Cerro Armazones observations were done in the R band using the RoBoTT (Robotic Bochum TWin Telescope), which consists of twin Takahashi 150mm aperture apochromatic astrographs, each equipped with an Apogee U16M camera with a KAF-16803 4096*4096 pixel CCD, providing a 2.7°*2.7° field of view with 2.37''/pixel scale. The 60s exposures were centered on 25 Ori, spanning an area much larger than the cluster. OCA data were obtained during all YETI seasons.
During the nights of 2006 January 8-15, we used the 0.9m telescope with the 8000*8000 pixel MOSAIC imager at the Kitt Peak National Observatory (KPNO), Arizona, USA, to obtain IC-band time-series observations of several regions in the Orion OB1 association, including the 25 Ori cluster in the OB1a subassociation, and fields in the OB1b subassociation, under NOAO program 2005B-0529. (1 data file).

  19. Thermal infrared panoramic imaging sensor

    NASA Astrophysics Data System (ADS)

    Gutin, Mikhail; Tsui, Eddy K.; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-05-01

    Panoramic cameras offer true real-time, 360-degree coverage of the surrounding area, valuable for a variety of defense and security applications, including force protection, asset protection, asset control, security including port security, perimeter security, video surveillance, border control, airport security, coastguard operations, search and rescue, intrusion detection, and many others. Automatic detection, location, and tracking of targets outside the protected area ensures maximum protection and at the same time reduces the workload on personnel, increases reliability and confidence of target detection, and enables both man-in-the-loop and fully automated system operation. Thermal imaging provides the benefits of all-weather, 24-hour day/night operation with no downtime. In addition, thermal signatures of different target types facilitate better classification, beyond the limits set by the camera's spatial resolution. The useful range of catadioptric panoramic cameras is affected by their limited resolution. In many existing systems the resolution is optics-limited. Reflectors customarily used in catadioptric imagers introduce aberrations that may become significant at large camera apertures, such as those required in low-light and thermal imaging. Advantages of panoramic imagers with high image resolution include increased area coverage with fewer cameras, instantaneous full horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC), combines the strengths of improved, high-resolution panoramic optics with thermal imaging in the 8-14 micron spectral range, leveraged by intelligent video processing for automated detection, location, and tracking of moving targets.
The work in progress supports the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to serve in a wide range of homeland security applications, as well as serve the Army in tasks of improved situational awareness (SA) in defensive and offensive operations, and as a sensor node in tactical Intelligence Surveillance Reconnaissance (ISR). The novel ViperView™ high-resolution panoramic thermal imager is the heart of the APTIS system. It features an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640x480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS system include network communications, advanced power management, and wakeup capability. Recent developments include image processing, optical design being expanded into the visible spectral range, and wireless communications design. This paper describes the development status of the APTIS system.

  20. Calibration and accuracy analysis of a focused plenoptic camera

    NASA Astrophysics Data System (ADS)

    Zeller, N.; Quint, F.; Stilla, U.

    2014-08-01

    In this article we introduce new methods for the calibration of depth images from focused plenoptic cameras and validate the results. We start with a brief description of the concept of a focused plenoptic camera and how a depth map can be estimated from the recorded raw image. For this camera, an analytical expression for the depth accuracy is derived for the first time. In the main part of the paper, methods to calibrate a focused plenoptic camera are developed and evaluated. The optical imaging process is calibrated using a method already known from the calibration of traditional cameras. For the calibration of the depth map, two new model-based methods, which make use of the projection concept of the camera, are developed. These new methods are compared to a common curve-fitting approach based on Taylor series approximation. Both model-based methods show significant advantages compared to the curve-fitting method: they need fewer reference points for calibration and, moreover, supply a function that remains valid beyond the range of calibration. In addition, the depth map accuracy of the plenoptic camera was experimentally investigated for different focal lengths of the main lens and compared to the analytical evaluation.
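
    The drawback of curve fitting that the authors report can be illustrated with a toy example. All numbers below are invented for illustration and do not reproduce the paper's camera model: a polynomial (Taylor-like) fit tracks the reference points well inside the calibrated range but degrades once extrapolated beyond it, whereas a model-based calibration would remain valid.

```python
import numpy as np

# Hypothetical depth law: metric distance z versus raw virtual depth v
# (invented constants, loosely thin-lens shaped).
def true_depth(v):
    return 0.5 + 1.2 / (3.0 - v)

v_ref = np.linspace(0.5, 2.0, 8)        # calibration reference points
z_ref = true_depth(v_ref)

# Curve-fitting calibration: cubic polynomial in v.
coeffs = np.polyfit(v_ref, z_ref, deg=3)

err_inside = abs(np.polyval(coeffs, 1.25) - true_depth(1.25))   # interpolation
err_outside = abs(np.polyval(coeffs, 2.6) - true_depth(2.6))    # extrapolation
```

The extrapolation error is orders of magnitude larger than the interpolation error, mirroring the paper's argument for model-based depth calibration.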

  1. A New Era in Solar Thermal-IR Astronomy: the NSO Array Camera (NAC) on the McMath-Pierce Telescope

    NASA Astrophysics Data System (ADS)

    Ayres, T.; Penn, M.; Plymate, C.; Keller, C.

    2008-09-01

    The U.S. National Solar Observatory Array Camera (NAC) is a cryogenically cooled 1Kx1K InSb "Aladdin" array that recently became operational at the McMath-Pierce facility on Kitt Peak, a high dry site in the southwest U.S. (Arizona). The new camera is similar to those already incorporated into instruments on nighttime telescopes, and has unprecedented sensitivity, low noise, and excellent cosmetics compared with the Amber Engineering (AE) device it replaces. (The latter was scavenged from a commercial surveillance camera in the 1990s: only 256x256 format, high noise, and annoying flatfield structure.) The NAC focal plane is maintained at 30 K by a mechanical closed-cycle helium cooler, dispensing with the cumbersome pumped solid-N2 40 K system used previously with the AE camera. The NAC linearity has been verified for exposures as short as 1 ms, although latency in the data recording holds the maximum frame rate to about 8 Hz (in "streaming mode"). The camera is run in tandem with the Infrared Adaptive Optics (IRAO) system. Utilizing a 37-actuator deformable mirror, IRAO can, under moderate seeing conditions, correct the telescope image to the diffraction limit longward of 2.3 µm (if a suitable high-contrast target is available: the IR granulation has proven too bland to reliably track). IRAO also provides fine control over the solar image for spatial scanning in long-slit mode with the 14 m vertical "Main" spectrograph (MS). A 1'x1' area scan, with 0.5" steps orthogonal to the slit direction, requires less than half a minute, much shorter than p-mode and granulation evolution time scales. A recent engineering test run, in April 2008, utilized NAC/IRAO/MS to capture the fundamental (4.6 µm) and first-overtone (2.3 µm) rovibrational bands of CO, including maps of quiet regions, drift scans along the equatorial limbs (to measure the off-limb molecular emissions), and imaging of a fortuitous small sunspot pair, a final gasp, perhaps, of Cycle 23.
Future work with the NAC will emphasize pathfinding toward the next generation of IR imaging spectrometers for the Advanced Technology Solar Telescope, whose 4 m aperture finally will bring sorely needed high spatial resolution to daytime infrared astronomy. In the meantime, the NAC is available to qualified solar physicists from around the world to conduct forefront research in the 1-5 µm region on the venerable, but infrared-friendly, McMath-Pierce telescope.

  2. Object tracking using multiple camera video streams

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system involves overcoming effects of occlusions that could leave an object in partial or full view in one camera, when the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, both in each video stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.
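
    The per-camera step, detecting change between consecutive frames and localizing it, can be sketched as simple frame differencing (an illustrative stand-in for the heuristic tracker described; registration and perspective adjustment between the two cameras are omitted):

```python
import numpy as np

def detect_motion(prev, curr, thresh=25.0):
    """Centroid (row, col) of pixels that changed between two frames,
    or None when nothing moved."""
    mask = np.abs(curr.astype(float) - prev.astype(float)) > thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()

# Demo: a bright 4x4 object jumps from (10, 10) to (30, 30); the change
# mask covers both the vacated and the newly occupied pixels, so the
# centroid lands midway along the motion path.
prev = np.zeros((64, 64))
curr = np.zeros((64, 64))
prev[10:14, 10:14] = 200.0
curr[30:34, 30:34] = 200.0
centroid = detect_motion(prev, curr)
```

Running this per camera and matching detections across the time-synchronized streams is the kind of correlation the abstract describes.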

  3. Enhanced LWIR NUC using an uncooled microbolometer camera

    NASA Astrophysics Data System (ADS)

    LaVeigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian

    2011-06-01

Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector (IRSP), and the best NUC is performed in the band of interest for the sensor being tested. While cooled, large-format MWIR cameras are readily available and have been successfully used to perform NUC, similar cooled, large-format LWIR cameras are not as common and are prohibitively expensive. Large-format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an IRSP. Some of these challenges were discussed in a previous paper. Here, we report results from a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. Camera instability, temporal response, and thermal resolution were the main problems; these have been solved by the implementation of several compensation strategies, as well as hardware used to stabilize the camera. In addition, other processes have been developed to allow iterative improvement and to support changes of the post-NUC lookup table (LUT) without requiring re-collection of the pre-NUC data with the new LUT in use.
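The in-band correction the paper refines builds on the classic two-point NUC baseline, in which a per-pixel gain and offset are solved from frames of two uniform reference levels. A toy sketch of that baseline (the paper's iterative, drift-compensated procedure is considerably more involved; the fixed-pattern values here are simulated, not from the paper):

```python
import numpy as np

def two_point_nuc(cold, hot):
    """Solve per-pixel gain and offset from two uniform-source frames.

    Classic two-point NUC: after correction, every pixel reports the
    array-mean response at each reference level.
    """
    gain = (hot.mean() - cold.mean()) / (hot - cold)
    offset = cold.mean() - gain * cold
    return gain, offset

rng = np.random.default_rng(0)
true_gain = rng.uniform(0.8, 1.2, size=(4, 4))     # simulated fixed-pattern gain
true_offset = rng.uniform(-5.0, 5.0, size=(4, 4))  # simulated fixed-pattern offset

def raw_response(level):
    """Simulated uncorrected detector output viewing a uniform scene."""
    return true_gain * level + true_offset

gain, offset = two_point_nuc(raw_response(20.0), raw_response(80.0))
corrected = gain * raw_response(50.0) + offset
# The uniform mid-level scene is spatially flat after correction.
```

Because the detector here is exactly linear, the correction is perfect; real microbolometer behavior (drift, nonlinearity) is what forces the iterative strategies the paper describes.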

  4. Science, conservation, and camera traps

    USGS Publications Warehouse

    Nichols, James D.; Karanth, K. Ullas; O'Connell, Allan F.

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  5. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively.
The CACTI camera's ability to embed video volumes into images leads to exploration of other information within that video; namely, focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions. Geometrical-optics-related tradeoffs, such as the classic challenges of wide-field-of-view, high-resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke. Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position. We demonstrate this technique's capability to generate arbitrary point spread functions.

  6. Measuring the Orbital Period of the Moon Using a Digital Camera

    ERIC Educational Resources Information Center

    Hughes, Stephen W.

    2006-01-01

    A method of measuring the orbital velocity of the Moon around the Earth using a digital camera is described. Separate images of the Moon and stars taken 24 hours apart were loaded into Microsoft PowerPoint and the centre of the Moon marked on each image. Four stars common to both images were connected together to form a "home-made" constellation.…

  7. Radiometric calibration of wide-field camera system with an application in astronomy

    NASA Astrophysics Data System (ADS)

    Vítek, Stanislav; Nasyrova, Maria; Stehlíková, Veronika

    2017-09-01

Camera response function (CRF) is widely used for the description of the relationship between scene radiance and image brightness. The most common application of the CRF is High Dynamic Range (HDR) reconstruction of the radiance maps of imaged scenes from a set of frames with different exposures. The main goal of this work is to provide an overview of CRF estimation algorithms and compare their outputs with results obtained under laboratory conditions. These algorithms, typically designed for multimedia content, are unfortunately of little use with astronomical image data, mostly due to its nature (blur, noise, and long exposures). Therefore, we propose an optimization of selected methods for use in an astronomical imaging application. Results are experimentally verified on a wide-field camera system using a Digital Single Lens Reflex (DSLR) camera.
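For a rough sense of what CRF estimation involves, here is a toy recovery of a power-law response exponent from a noise-free exposure pair. This is only a sketch under a strong assumption (the CRF is a pure power law); full CRF recovery, e.g. the Debevec-Malik method, fits a general curve, and the astronomical complications the paper addresses (blur, noise, long exposures) are ignored:

```python
import numpy as np

def estimate_gamma(img1, img2, t1, t2):
    """Estimate the exponent of an assumed power-law CRF, p = c*(E*t)**gamma.

    For the same pixel at two exposure times, log(p2/p1) = gamma*log(t2/t1),
    so gamma falls out of a per-pixel ratio; the median adds mild robustness.
    """
    mask = (img1 > 1e-3) & (img2 > 1e-3)   # avoid log of near-zero pixels
    ratios = np.log(img2[mask] / img1[mask]) / np.log(t2 / t1)
    return float(np.median(ratios))

rng = np.random.default_rng(1)
radiance = rng.uniform(0.1, 1.0, size=(32, 32))  # synthetic scene radiance
gamma = 0.45                                     # assumed response exponent

img_short = (radiance * 0.01) ** gamma           # 10 ms exposure
img_long = (radiance * 0.04) ** gamma            # 40 ms exposure

est = estimate_gamma(img_short, img_long, 0.01, 0.04)
# In this noise-free toy the exponent is recovered exactly.
```

Noise, saturation, and blur are exactly what break this clean picture on real astronomical frames, which motivates the optimizations the paper proposes.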

  8. Processing the Viking lander camera data

    NASA Technical Reports Server (NTRS)

    Levinthal, E. C.; Tucker, R.; Green, W.; Jones, K. L.

    1977-01-01

    Over 1000 camera events were returned from the two Viking landers during the Primary Mission. A system was devised for processing camera data as they were received, in real time, from the Deep Space Network. This system provided a flexible choice of parameters for three computer-enhanced versions of the data for display or hard-copy generation. Software systems allowed all but 0.3% of the imagery scan lines received on earth to be placed correctly in the camera data record. A second-order processing system was developed which allowed extensive interactive image processing including computer-assisted photogrammetry, a variety of geometric and photometric transformations, mosaicking, and color balancing using six different filtered images of a common scene. These results have been completely cataloged and documented to produce an Experiment Data Record.

  9. Automatic Calibration of Stereo-Cameras Using Ordinary Chess-Board Patterns

    NASA Astrophysics Data System (ADS)

    Prokos, A.; Kalisperakis, I.; Petsa, E.; Karras, G.

    2012-07-01

    Automation of camera calibration is facilitated by recording coded 2D patterns. Our toolbox for automatic camera calibration using images of simple chess-board patterns is freely available on the Internet. But it is unsuitable for stereo-cameras whose calibration implies recovering camera geometry and their true-to-scale relative orientation. In contrast to all reported methods requiring additional specific coding to establish an object space coordinate system, a toolbox for automatic stereo-camera calibration relying on ordinary chess-board patterns is presented here. First, the camera calibration algorithm is applied to all image pairs of the pattern to extract nodes of known spacing, order them in rows and columns, and estimate two independent camera parameter sets. The actual node correspondences on stereo-pairs remain unknown. Image pairs of a textured 3D scene are exploited for finding the fundamental matrix of the stereo-camera by applying RANSAC to point matches established with the SIFT algorithm. A node is then selected near the centre of the left image; its match on the right image is assumed as the node closest to the corresponding epipolar line. This yields matches for all nodes (since these have already been ordered), which should also satisfy the 2D epipolar geometry. Measures for avoiding mismatching are taken. With automatically estimated initial orientation values, a bundle adjustment is performed constraining all pairs on a common (scaled) relative orientation. Ambiguities regarding the actual exterior orientations of the stereo-camera with respect to the pattern are irrelevant. Results from this automatic method show typical precisions not above 1/4 pixels for 640×480 web cameras.
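The node-matching step described above (selecting the right-image node nearest the epipolar line once the fundamental matrix is known) can be sketched as follows; the rectified-stereo F and the node coordinates here are illustrative, not from the paper:

```python
import numpy as np

def epipolar_match(F, x_left, candidates):
    """Return the index of the right-image node nearest the epipolar line.

    Mirrors the matching step above: with F estimated (RANSAC over SIFT
    matches), the match of a left-image node is taken as the candidate
    closest to its epipolar line F @ x. Points are homogeneous (x, y, 1).
    """
    line = F @ x_left                                 # a*x' + b*y' + c = 0
    dist = np.abs(candidates @ line) / np.hypot(line[0], line[1])
    return int(np.argmin(dist))

# Hypothetical rectified-stereo F: epipolar lines are horizontal, y' = y.
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
x_left = np.array([4.0, 4.0, 1.0])
candidates = np.array([[5.0, 2.0, 1.0],    # chess-board nodes detected
                       [7.0, 3.9, 1.0],    # in the right image
                       [1.0, 9.0, 1.0]])
best = epipolar_match(F, x_left, candidates)
# best picks the node whose y-coordinate is nearest 4.0
```

Because the nodes have already been ordered in rows and columns, one such match near the image centre propagates correspondences to all remaining nodes, as the abstract notes.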

  10. Off-Grid Direction of Arrival Estimation Based on Joint Spatial Sparsity for Distributed Sparse Linear Arrays

    PubMed Central

    Liang, Yujie; Ying, Rendong; Lu, Zhenqi; Liu, Peilin

    2014-01-01

In the design phase of sensor arrays for array signal processing, the estimation performance and system cost are largely determined by array aperture size. In this article, we address the problem of joint direction-of-arrival (DOA) estimation with distributed sparse linear arrays (SLAs) and propose an off-grid synchronous approach based on distributed compressed sensing to obtain a larger array aperture. We focus on the complex source distributions found in practical applications and classify the sources into common and innovation parts according to whether a source's signal impinges on all the SLAs or only on a specific one. For each SLA, we construct a corresponding virtual uniform linear array (ULA) to establish a random linear mapping between the signals observed by these two arrays. The signal ensembles, including the common/innovation sources for the different SLAs, are abstracted as a joint spatial sparsity model, and we use minimization of the concatenated atomic norm via semidefinite programming to solve the joint DOA estimation problem. Joint calculation over the signals observed by all the SLAs exploits the redundancy caused by the common sources and decreases the required array size. The numerical results illustrate the advantages of the proposed approach. PMID:25420150

  11. Design of an airborne lidar for stratospheric aerosol measurements

    NASA Technical Reports Server (NTRS)

    Evans, W. E.

    1977-01-01

    A modular, multiple-telescope receiving concept is developed to gain a relatively large receiver collection aperture without requiring extensive modifications to the aircraft. This concept, together with the choice of a specific photodetector, signal processing, and data recording system capable of maintaining approximately 1% precision over the required large signal amplitude range, is found to be common to all of the options. It is recommended that development of the lidar begin by more detailed definition of solutions to these important common signal detection and recording problems.

  12. CameraHRV: robust measurement of heart rate variability using a camera

    NASA Astrophysics Data System (ADS)

    Pai, Amruta; Veeraraghavan, Ashok; Sabharwal, Ashutosh

    2018-02-01

The inter-beat interval (the time period of the cardiac cycle) changes slightly with every heartbeat; this variation is measured as Heart Rate Variability (HRV). HRV is presumed to occur due to interactions between the parasympathetic and sympathetic nervous systems. Therefore, it is sometimes used as an indicator of the stress level of an individual. HRV also reveals some clinical information about cardiac health. Currently, HRV is accurately measured using contact devices such as a pulse oximeter. However, recent research in the field of non-contact imaging photoplethysmography (iPPG) has made vital sign measurements using just a video recording of any exposed skin (such as a person's face) possible. The current signal processing methods for extracting HRV using peak detection perform well for contact-based systems but perform poorly on iPPG signals. The main reason for this poor performance is that current methods are sensitive to the large noise sources often present in iPPG data. Further, current methods are not robust to the motion artifacts that are common in iPPG systems. We developed a new algorithm, CameraHRV, for robustly extracting HRV even at the low SNR common in iPPG recordings. CameraHRV combines spatial combination and frequency demodulation to obtain HRV from the instantaneous frequency of the iPPG signal. CameraHRV outperforms other current methods of HRV estimation. Ground truth data were obtained from an FDA-approved pulse oximeter for validation purposes. CameraHRV on iPPG data showed an error of 6 milliseconds for low-motion and varying-skin-tone scenarios, an improvement of 14%. In high-motion scenarios such as reading, watching, and talking, the error was 10 milliseconds.
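The frequency-demodulation idea behind CameraHRV, reading beat frequency off the phase of the analytic signal rather than from peak detection, can be illustrated on a synthetic pulse wave. This is a sketch only: real iPPG signals need the spatial combination and denoising the paper describes, and the clean cosine here stands in for a measured waveform:

```python
import numpy as np

def instantaneous_frequency(signal, fs):
    """Instantaneous frequency via the analytic signal (FFT-based Hilbert).

    The analytic signal is built by zeroing negative frequencies and
    doubling positive ones; its unwrapped phase derivative gives the
    instantaneous frequency in Hz.
    """
    n = len(signal)
    spectrum = np.fft.fft(signal)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(spectrum * h)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2.0 * np.pi)

fs = 100.0                            # camera frame rate, frames/s
t = np.arange(0, 10, 1 / fs)
ppg = np.cos(2 * np.pi * 1.2 * t)     # synthetic 72 bpm pulse wave

freq = instantaneous_frequency(ppg, fs)
mid = freq[100:-100]                  # discard edge samples
# mid hovers at the 1.2 Hz pulse frequency
```

On a real iPPG trace the sample-to-sample wobble of this instantaneous frequency is precisely the HRV signal of interest.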

  13. Optimization and verification of image reconstruction for a Compton camera towards application as an on-line monitor for particle therapy

    NASA Astrophysics Data System (ADS)

    Taya, T.; Kataoka, J.; Kishimoto, A.; Tagawa, L.; Mochizuki, S.; Toshito, T.; Kimura, M.; Nagao, Y.; Kurita, K.; Yamaguchi, M.; Kawachi, N.

    2017-07-01

Particle therapy is an advanced cancer therapy that uses a feature known as the Bragg peak, in which particle beams suddenly lose their energy near the end of their range. The Bragg peak enables particle beams to damage tumors effectively. To achieve precise therapy, the demand for accurate and quantitative imaging of the beam irradiation region or dosage during therapy has increased. The most common method of particle range verification is imaging of annihilation gamma rays by positron emission tomography. Not only 511-keV gamma rays but also prompt gamma rays are generated during therapy; therefore, the Compton camera is expected to be used as an on-line monitor for particle therapy, as it can image these gamma rays in real time. Proton therapy, one of the most common particle therapies, uses a proton beam of approximately 200 MeV, which has a range of ~25 cm in water. As gamma rays are emitted along the path of the proton beam, quantitative evaluation of the reconstructed images of diffuse sources becomes crucial, but it is far from being fully developed for Compton camera imaging at present. In this study, we first quantitatively evaluated reconstructed Compton camera images of uniformly distributed diffuse sources, and then confirmed that our Compton camera obtained 3% (1σ) and 5% (1σ) uniformity for line and plane sources, respectively. Based on this quantitative study, we demonstrated on-line gamma imaging during proton irradiation. Through these studies, we show that the Compton camera is suitable for future use as an on-line monitor for particle therapy.

  14. Continuous monitoring of Hawaiian volcanoes with thermal cameras

    USGS Publications Warehouse

    Patrick, Matthew R.; Orr, Tim R.; Antolik, Loren; Lee, Robert Lopaka; Kamibayashi, Kevan P.

    2014-01-01

    Continuously operating thermal cameras are becoming more common around the world for volcano monitoring, and offer distinct advantages over conventional visual webcams for observing volcanic activity. Thermal cameras can sometimes “see” through volcanic fume that obscures views to visual webcams and the naked eye, and often provide a much clearer view of the extent of high temperature areas and activity levels. We describe a thermal camera network recently installed by the Hawaiian Volcano Observatory to monitor Kīlauea’s summit and east rift zone eruptions (at Halema‘uma‘u and Pu‘u ‘Ō‘ō craters, respectively) and to keep watch on Mauna Loa’s summit caldera. The cameras are long-wave, temperature-calibrated models protected in custom enclosures, and often positioned on crater rims close to active vents. Images are transmitted back to the observatory in real-time, and numerous Matlab scripts manage the data and provide automated analyses and alarms. The cameras have greatly improved HVO’s observations of surface eruptive activity, which includes highly dynamic lava lake activity at Halema‘uma‘u, major disruptions to Pu‘u ‘Ō‘ō crater and several fissure eruptions.

  15. Are camera surveys useful for assessing recruitment in white-tailed deer?

    DOE PAGES

    Chitwood, M. Colter; Lashley, Marcus A.; Kilgo, John C.; ...

    2016-12-27

Camera surveys commonly are used by managers and hunters to estimate white-tailed deer Odocoileus virginianus density and demographic rates. Though studies have documented biases and inaccuracies in the camera survey methodology, camera traps remain popular due to ease of use, cost-effectiveness, and ability to survey large areas. Because recruitment is a key parameter in ungulate population dynamics, there is a growing need to test the effectiveness of camera surveys for assessing fawn recruitment. At Savannah River Site, South Carolina, we used six years of camera-based recruitment estimates (i.e. fawn:doe ratio) to predict concurrently collected annual radiotag-based survival estimates. The coefficient of determination (R²) was 0.445, indicating some support for the viability of cameras to reflect recruitment. Here, we added two years of data from Fort Bragg Military Installation, North Carolina, which improved R² to 0.621 without accounting for site-specific variability. We also evaluated the correlation between year-to-year changes in recruitment and survival using the Savannah River Site data; R² was 0.758, suggesting that camera-based recruitment could be useful as an indicator of the trend in survival. Because so few researchers concurrently estimate survival and camera-based recruitment, examining this relationship at larger spatial scales while controlling for numerous confounding variables remains difficult. We believe that future research should test the validity of our results in other areas with varying deer and camera densities, as site (e.g. presence of feral pigs Sus scrofa) and demographic (e.g. fawn age at time of camera survey) parameters may have a large influence on detectability. Until such biases are fully quantified, we urge researchers and managers to use caution when advocating the use of camera-based recruitment estimates.
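The coefficient of determination used throughout the study is the standard one. A minimal sketch with hypothetical survival estimates and camera-derived predictions (illustrative numbers, not the paper's data):

```python
import numpy as np

def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical annual radiotag survival vs. values predicted from
# camera-based fawn:doe ratios.
survival = [0.42, 0.55, 0.38, 0.61, 0.50, 0.47]
predicted = [0.45, 0.52, 0.40, 0.58, 0.49, 0.50]
r2 = r_squared(survival, predicted)
```

Values such as the paper's 0.445 and 0.621 come from fits of this form; an R² near 1 would indicate camera recruitment tracking survival closely.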

  16. Plenoptic particle image velocimetry with multiple plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Fahringer, Timothy W.; Thurow, Brian S.

    2018-07-01

Plenoptic particle image velocimetry was recently introduced as a viable three-dimensional, three-component velocimetry technique based on light field cameras. One of the main benefits of this technique is its single-camera configuration, which allows the technique to be applied in facilities with limited optical access. The main drawback of this configuration is decreased accuracy in the out-of-plane dimension. This work presents a solution: the addition of a second plenoptic camera in a stereo-like configuration. A framework for reconstructing volumes with multiple plenoptic cameras is presented, including the volumetric calibration and the reconstruction algorithms: integral refocusing, filtered refocusing, multiplicative refocusing, and MART. It is shown that the addition of a second camera improves the reconstruction quality and removes the ‘cigar’-like elongation associated with the single-camera system. In addition, it is found that adding a third camera provides minimal improvement. Further metrics of the reconstruction quality are quantified in terms of reconstruction algorithm, particle density, number of cameras, camera separation angle, voxel size, and the effect of common image noise sources. In addition, a synthetic Gaussian ring vortex is used to compare the accuracy of the single- and two-camera configurations. It was determined that the addition of a second camera reduces the RMSE velocity error from 1.0 to 0.1 voxels in depth and from 0.2 to 0.1 voxels in the lateral spatial directions. Finally, the technique is applied experimentally on a ring vortex, and comparisons are drawn among the four presented reconstruction algorithms: MART and multiplicative refocusing produced the cleanest vortex structure and had the least shot-to-shot variability; filtered refocusing is able to produce the desired structure, albeit with more noise and variability, while integral refocusing struggled to produce a coherent vortex ring.
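Of the reconstruction algorithms compared, MART updates each voxel multiplicatively from the ratio of measured to projected ray intensity. A tiny dense-matrix sketch of the iteration (real plenoptic PIV applies sparse weights over millions of voxels; the two-voxel system here is purely illustrative):

```python
import numpy as np

def mart(W, b, n_iter=50, mu=1.0):
    """Multiplicative ART: x_i <- x_i * (b_j / (W_j . x)) ** (mu * w_ji).

    W maps voxel intensities x to ray measurements b; each row is one
    ray's weights. Starting from a uniform positive guess, voxels are
    corrected ray by ray until projections match the measurements.
    """
    x = np.ones(W.shape[1])
    for _ in range(n_iter):
        for j in range(W.shape[0]):
            proj = W[j] @ x
            if proj > 0:
                x *= (b[j] / proj) ** (mu * W[j])
    return x

# Two-voxel toy: three "rays" with known weights view the volume.
truth = np.array([2.0, 3.0])
W = np.array([[1.0, 0.0],
              [0.6, 0.4],
              [0.0, 1.0]])
b = W @ truth
x = mart(W, b)
# x converges to the true voxel intensities [2, 3]
```

The multiplicative update keeps voxel intensities non-negative by construction, one reason MART is popular for tomographic PIV reconstruction.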

  17. Dark Spots on Titan

    NASA Image and Video Library

    2005-05-02

This recent image of Titan reveals more complex patterns of bright and dark regions on the surface, including a small, dark, circular feature, completely surrounded by brighter material. During the two most recent flybys of Titan, on March 31 and April 16, 2005, Cassini captured a number of images of the hemisphere of Titan that faces Saturn. The image at the left is taken from a mosaic of images obtained in March 2005 (see PIA06222) and shows the location of the more recently acquired image at the right. The new image shows intriguing details in the bright and dark patterns near an 80-kilometer-wide (50-mile) crater seen first by Cassini's synthetic aperture radar experiment during a Titan flyby in February 2005 (see PIA07368) and subsequently seen by the imaging science subsystem cameras as a dark spot (center of the image at the left). Interestingly, a smaller, roughly 20-kilometer-wide (12-mile), dark and circular feature can be seen within an irregularly-shaped, brighter ring, and is similar to the larger dark spot associated with the radar crater. However, the imaging cameras see only brightness variations, and without topographic information, the identity of this feature as an impact crater cannot be conclusively determined from this image. The visual and infrared mapping spectrometer -- which is sensitive to longer wavelengths, where Titan's atmospheric haze is less obscuring -- observed this area simultaneously with the imaging cameras, so those data, and perhaps future observations by Cassini's radar, may help to answer the question of this feature's origin. The new image at the right consists of five images that have been added together and enhanced to bring out surface detail and to reduce noise, although some camera artifacts remain.
These images were taken with the Cassini spacecraft narrow-angle camera using a filter sensitive to wavelengths of infrared light centered at 938 nanometers -- considered to be the imaging science subsystem's best spectral filter for observing the surface of Titan. This view was acquired from a distance of 33,000 kilometers (20,500 miles). The pixel scale of this image is 390 meters (0.2 miles) per pixel, although the actual resolution is likely to be several times larger. http://photojournal.jpl.nasa.gov/catalog/PIA06234

  18. Design of compact off-axis four-mirror anastigmatic system for space communications

    NASA Astrophysics Data System (ADS)

    Zhao, Fa-cai; Sun, Quan-she; Chen, Kun-feng; Zhu, Xing-bang; Wang, Shao-shui; Wang, Guo-quan; Zheng, Xiang-liang

    2013-08-01

The deployment of advanced hyperspectral imaging and other Earth-sensing instruments onboard Earth-observing satellites is driving the demand for high-data-rate communications. Space laser communications technology offers the potential for significantly increasing data return capability from space to Earth. Compared to current state-of-the-art radio frequency communications links, lasercom links operate at much higher carrier frequencies. The use of higher carrier frequencies implies a much smaller diffraction loss, which in turn results in a much higher efficiency in delivering the signal energy. Optical communications meet the required data rates with small, low-mass, low-power communications packages. The communications optical system assembly typically consists of a front-aperture, reflective or refractive telescope, with or without a solar rejection filter, aft optics, fine-pointing mirrors, and array detectors. An optical system used in space laser communications usually has a long focal length and a large aperture compared with common optical systems, so reflective optical systems are widely used. An unobstructed four-mirror anastigmatic telescope system is proposed, derived from the geometrical optics of common-axis three-mirror systems. The intermediate image lies between the secondary and tertiary mirrors. To fold the optical path, the fourth mirror was introduced as a plane reflector at the intermediate image. The design was analyzed, and a system with an effective aperture of 200 mm and a field of view of 1.0°×1.0° was designed; its total length and magnification are 700 mm and 20, respectively. The system has the advantages of large magnification, relatively short physical size, and loose manufacturing tolerances.

  19. Large optical glass blanks for the ELT generation

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Petzold, Uwe; Dietrich, Volker; Wittmer, Volker; Rexius, Olga

    2016-07-01

The upcoming extremely large telescope projects such as the E-ELT, TMT, and GMT require not only large amounts of mirror blank substrates but also sophisticated instrument setups. Common instrument components are atmospheric dispersion correctors, which compensate for the varying atmospheric path length depending on the telescope inclination angle. These elements usually consist of optical glass blanks that have to be large due to the increased size of the focal beam of the extremely large telescopes. SCHOTT has long experience in producing and delivering large optical glass blanks for astronomical applications up to 1 m, in homogeneity grades up to H3 quality. The most common optical glass available in large formats is SCHOTT N-BK7, but other glass types such as F2 or LLF1 can also be produced in formats up to 1 m. The extremely large telescope projects partly demand atmospheric dispersion components in sizes beyond 1 m, up to a range of 1.5 m diameter. The production of such large homogeneous optical glass blanks requires tight control of all process steps. To cover this demand, SCHOTT initiated a research project to improve the large optical blank production process steps from melting to annealing and measurement. Large optical glass blanks are measured in several sub-apertures that cover the total clear aperture of the application. With SCHOTT's new stitching software it is now possible to combine individual sub-aperture measurements into a total homogeneity map of the blank. In this presentation, first results are demonstrated.

  20. Fresnel Lenses for Wide-Aperture Optical Receivers

    NASA Technical Reports Server (NTRS)

    Hemmati, Hamid

    2004-01-01

Wide-aperture receivers for free-space optical communication systems would utilize Fresnel lenses instead of conventional telescope lenses, according to a proposal. Fresnel lenses weigh and cost much less than conventional lenses of equal aperture width. Plastic Fresnel lenses are commercially available in diameters up to 5 m, large enough to satisfy requirements for aperture widths of the order of meters for collecting sufficient light in typical long-distance free-space optical communication systems. Fresnel lenses are not yet suitable for high-quality diffraction-limited imaging, especially in polychromatic light. However, optical communication systems utilize monochromatic light, and there is no requirement for high-quality imaging; instead, the basic requirement for an optical receiver is to collect the incoming monochromatic light over a wide aperture and concentrate it onto a photodetector. Because of lens aberrations and diffraction, the light passing through any lens is focused to a blur circle rather than to a point. Calculations for some representative cases of wide-aperture non-diffraction-limited Fresnel lenses have shown that it should be possible to attain blur-circle diameters of less than 2 mm. Preferably, the blur-circle diameter should match the width of the photodetector. For most high-bandwidth communication applications, the required photodetector diameters would be about 1 mm. In a less preferable case in which the blur circle was wider than a single photodetector, it would be possible to occupy the blur circle with an array of photodetectors. As an alternative to using a single large Fresnel lens, one could use an array of somewhat smaller lenses to synthesize the equivalent aperture area. Such a configuration might be preferable in a case in which a single Fresnel lens of the requisite size would be impractical to manufacture and the blur circle could not be made small enough.
For example, one could construct a square array of four 5-m-diameter Fresnel lenses to obtain the same light-collecting area as that of a single 10-m-diameter lens. In that case (see figure), the light collected by each Fresnel lens could be collimated, the collimated beams from the four Fresnel lenses could be reflected onto a common off-axis paraboloidal reflector, and the paraboloidal reflector would focus the four beams onto a single photodetector. Alternatively, the detected signal from the detector behind each lens would be digitized before the signals are summed.
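The equal-aperture claim in the example is simple geometry: four 5-m lenses collect exactly the same area as one 10-m lens, since area scales with the square of the diameter.

```python
import math

def collecting_area(diameter_m):
    """Light-collecting area (m^2) of a circular aperture."""
    return math.pi * (diameter_m / 2.0) ** 2

four_small = 4 * collecting_area(5.0)   # square array of four 5-m lenses
one_large = collecting_area(10.0)       # single 10-m lens
# Both equal 25*pi, about 78.5 m^2: the array synthesizes the same aperture.
```

The same scaling is why doubling the lens diameter quadruples the collected signal power in a free-space optical link.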

  1. High resolution telescope including an array of elemental telescopes aligned along a common axis and supported on a space frame with a pivot at its geometric center

    DOEpatents

    Norbert, M.A.; Yale, O.

    1992-04-28

A large effective-aperture, low-cost optical telescope with diffraction-limited resolution enables ground-based observation of near-earth space objects. The telescope has a non-redundant, thinned-aperture array in a center-mount, single-structure space frame. It employs speckle interferometric imaging to achieve diffraction-limited resolution. The signal-to-noise ratio problem is mitigated by moving the wavelength of operation to the near-IR, and the image is sensed by a silicon CCD. The steerable, single-structure array presents a constant pupil. The center-mount, radar-like mount enables low-earth-orbit space objects to be tracked and increases the stiffness of the space frame. In the preferred embodiment, the array has elemental telescopes with subapertures of 2.1 m in a circle-of-nine configuration. The telescope array has an effective aperture of 12 m, which provides a diffraction-limited resolution of 0.02 arc seconds. Pathlength matching of the telescope array is maintained by an electro-optical system employing laser metrology. Speckle imaging relaxes the pathlength-matching tolerance by one order of magnitude as compared to phased arrays. Many features of the telescope contribute to substantial reduction in costs. These include eliminating the conventional protective dome and reducing on-site construction activities. The cost of the telescope scales with the first power of the aperture rather than its third power as in conventional telescopes. 15 figs.

  2. High resolution telescope including an array of elemental telescopes aligned along a common axis and supported on a space frame with a pivot at its geometric center

    DOEpatents

    Norbert, Massie A.; Yale, Oster

    1992-01-01

A large effective-aperture, low-cost optical telescope with diffraction-limited resolution enables ground-based observation of near-earth space objects. The telescope has a non-redundant, thinned-aperture array in a center-mount, single-structure space frame. It employs speckle interferometric imaging to achieve diffraction-limited resolution. The signal-to-noise ratio problem is mitigated by moving the wavelength of operation to the near-IR, and the image is sensed by a silicon CCD. The steerable, single-structure array presents a constant pupil. The center-mount, radar-like mount enables low-earth-orbit space objects to be tracked and increases the stiffness of the space frame. In the preferred embodiment, the array has elemental telescopes with subapertures of 2.1 m in a circle-of-nine configuration. The telescope array has an effective aperture of 12 m, which provides a diffraction-limited resolution of 0.02 arc seconds. Pathlength matching of the telescope array is maintained by an electro-optical system employing laser metrology. Speckle imaging relaxes the pathlength matching tolerance by one order of magnitude compared with phased arrays. Many features of the telescope contribute to a substantial reduction in cost, including elimination of the conventional protective dome and reduced on-site construction activity. The cost of the telescope scales with the first power of the aperture rather than its third power, as in conventional telescopes.
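The 0.02-arcsecond figure quoted for the 12-m effective aperture is consistent with the Rayleigh criterion θ = 1.22λ/D at a near-IR wavelength of roughly 1 µm (the specific wavelength is our assumption; the patent states only near-IR operation):

```python
import math

ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0  # ~206265 arcsec per radian

def rayleigh_resolution_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Diffraction-limited angular resolution, Rayleigh criterion."""
    return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD

theta = rayleigh_resolution_arcsec(1.0e-6, 12.0)  # assumed 1 um, 12-m array
print(round(theta, 3))  # ~0.021 arcsec
```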

  3. Freeform array projection

    NASA Astrophysics Data System (ADS)

    Michaelis, D.; Schreiber, P.; Li, C.; Bräuer, A.; Gross, H.

    2015-09-01

The concept of multichannel array projection is generalized in order to realize an ultraslim, highly efficient optical system for structured illumination with high lumen output, in which the Köhler illumination principle is additionally utilized and the source light is homogenized. The optical system consists of a multitude of neighboring optical channels. In each channel, two optical freeforms generate a real or virtual spatial light pattern, and the ray directions are furthermore modified to enable Köhler illumination of a subsequent projection lens. The internal light pattern may additionally be shaped by absorbing apertures or slides. The projection lens transfers the resulting light pattern to a target, where the total target distribution is produced by superposition of all individual channel output patterns. Without absorbing apertures, the optical system can be regarded as a generalization of a fly's-eye condenser for structured illumination; in this case the light pattern is generated exclusively by freeform light redistribution. The blurring effect that commonly occurs in freeform beamshaping is reduced by creating a virtual object light structure by means of the two freeform surfaces and imaging it toward the target. However, the remaining blurring suppresses very high spatial frequencies at the target. To create target features with very high spatial resolution, the absorbing apertures can be utilized; in this case freeform beamshaping can be used to enhance light transmission through the absorbing apertures. The freeform surfaces are designed by a generalized approach based on Cartesian oval representation.

  4. Optical stent inspection of surface texture and coating thickness

    NASA Astrophysics Data System (ADS)

    Bermudez, Carlos; Laguarta, Ferran; Cadevall, Cristina; Matilla, Aitor; Ibañez, Sergi; Artigas, Roger

    2017-02-01

Stent quality control is a critical process. Coronary stents have to be inspected 100% so that no defective stent is implanted into a human body. We have developed a high-numerical-aperture optical stent inspection system able to acquire both 2D and 3D images. Combining a rotational stage, an area camera with line-scan capability, and a triple illumination arrangement, unrolled sections of the outer, inner, and sidewall surfaces are obtained with high resolution. During stent inspection, surface roughness and coating thickness uniformity are of high interest. Due to the non-planar shape of the stent surface, the thickness values of the coating need to be corrected with the local slopes of the 3D surface. A theoretical model and a simulation are proposed, and a measurement with white light interferometry is shown. Confocal and spectroscopic reflectometry proved to be limited in this application due to stent surface roughness. Because of the high numerical aperture of the optical system, only certain parts of the stent are in focus, which is a problem for defect detection, specifically on the sidewalls. In order to obtain fully focused 2D images, an extended depth of field algorithm has been implemented. A comparison between pixel variance and Laplacian filtering is shown. To recover the stack image, two different methods are proposed: maximum projection and weighted intensity. Finally, we also discuss the implementation of the processing algorithms on both the CPU and GPU, targeting real-time 2-megapixel image acquisition at 50 frames per second.
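The extended depth-of-field step can be illustrated with a small sketch. This is not the authors' implementation; it is a generic maximum-projection fusion using a 4-neighbour Laplacian focus measure (one of the focus criteria the abstract compares), on an invented two-slice stack:

```python
import numpy as np

def laplacian_focus(img):
    """Per-pixel focus measure: magnitude of a 4-neighbour Laplacian."""
    lap = (-4.0 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return np.abs(lap)

def edf_maximum_projection(stack):
    """Fuse a focus stack: per pixel, keep the slice with the strongest focus."""
    focus = np.stack([laplacian_focus(s) for s in stack])
    best = np.argmax(focus, axis=0)          # index of sharpest slice per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols], best

# Synthetic 2-slice stack: slice 0 is sharp on the left, slice 1 on the right
board = (np.indices((8, 8)).sum(0) % 2).astype(float)
stack = np.zeros((2, 8, 8))
stack[0, :, :4] = board[:, :4]
stack[1, :, 4:] = board[:, 4:]
fused, best = edf_maximum_projection(stack)
print(best[4, 1], best[4, 6])  # slice 0 wins on the left, slice 1 on the right
```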

  5. Analysis of Low-Light and Night-Time Stereo-Pair Images for Photogrammetric Reconstruction

    NASA Astrophysics Data System (ADS)

    Santise, M.; Thoeni, K.; Roncella, R.; Diotri, F.; Giacomini, A.

    2018-05-01

Rockfalls and rockslides represent a significant risk to human lives and infrastructure because of the high levels of energy involved. Generally, these events occur under specific environmental conditions, such as temperature variations between day and night, that can contribute to triggering structural instabilities in the rock wall and the detachment of blocks and debris. Monitoring and geostructural characterization of the wall are required to reduce the potential hazard and to improve the management of risk at the bottom of the slopes affected by such phenomena. In this context, close-range photogrammetry is widely used for monitoring high-mountain terrain and rock walls in mine sites, allowing periodic surveys of rockfalls and wall movements. This work focuses on the analysis of low-light and night-time images from a fixed-base stereo-pair photogrammetry system. The aim is to study the reliability of images acquired during the night for producing digital surface models (DSMs) for change detection. The images are captured by a high-sensitivity DSLR camera using various settings accounting for different values of ISO, aperture, and exposure time. For each acquisition, the DSM is compared to a photogrammetric reference model produced from images captured under optimal illumination conditions. Results show that, at high ISO and constant aperture, extending the exposure time improves the quality of the point clouds in terms of completeness and accuracy of the photogrammetric models.
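Comparing a night-time DSM against the reference model typically reduces to cloud-to-cloud distances. The following is a minimal brute-force sketch with invented points (not the authors' processing chain, and suitable only for small clouds):

```python
import numpy as np

def cloud_to_cloud(a, b):
    """For each point in cloud a, distance to its nearest neighbour in b."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)  # (Na, Nb)
    return d.min(axis=1)

# Synthetic check: one displaced point shows up as a 1 m distance
ref = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
new = ref.copy()
new[0, 2] += 1.0   # one point moved 1 m vertically
print(cloud_to_cloud(new, ref))  # the moved point stands out at 1.0
```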

  6. Simulation of MEMS for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Mott, Brent; Kuhn, Jonathan; Broduer, Steve (Technical Monitor)

    2001-01-01

The NASA Goddard Space Flight Center (GSFC) is developing optical micro-electromechanical system (MEMS) components for potential application in Next Generation Space Telescope (NGST) science instruments. In this work, we present an overview of the electro-mechanical simulation of three MEMS components for NGST: a reflective micro-mirror array and a transmissive microshutter array for aperture control in a near-infrared (NIR) multi-object spectrometer, and a large-aperture MEMS Fabry-Perot tunable filter for a NIR wide-field camera. In all cases the device must operate at cryogenic temperatures with low power consumption and low, complementary metal-oxide-semiconductor (CMOS) compatible voltages. The goal of our simulation efforts is to adequately predict both the performance and the reliability of the devices during ground handling, launch, and operation, to prevent failures late in the development process and during flight. This goal requires detailed modeling and validation of complex electro-thermal-mechanical interactions and very large non-linear deformations, often involving surface contact. Various parameters such as spatial dimensions and device response are often difficult to measure reliably at these small scales. In addition, these devices are fabricated from a wide variety of materials, including surface micro-machined aluminum, reactive ion etched (RIE) silicon nitride, and deep reactive ion etched (DRIE) bulk single-crystal silicon. This broad set of conditions combines to present a formidable challenge for space flight qualification analysis. These simulations represent NASA/GSFC's first attempts at implementing a comprehensive strategy to address complex MEMS structures.

  7. The NIRCam Optical Telescope Simulator (NOTES)

    NASA Technical Reports Server (NTRS)

    Kubalak, David; Hakun, Claef; Greeley, Bradford; Eichorn, William; Leviton, Douglas; Guishard, Corina; Gong, Qian; Warner, Thomas; Bugby, David; Robinson, Frederick; hide

    2007-01-01

The Near Infra-Red Camera (NIRCam), the 0.6-5.0 micron imager and wavefront sensing instrument for the James Webb Space Telescope (JWST), will be used on orbit both as a science instrument and to tune the alignment of the telescope. The NIRCam Optical Telescope Element Simulator (NOTES) will be used during ground testing to provide an external stimulus to verify wavefront error, imaging characteristics, and wavefront sensing performance of this crucial instrument. NOTES is being designed and built by NASA Goddard Space Flight Center with the help of Swales Aerospace and Orbital Sciences Corporation. It is a single-point imaging system that uses an elliptical mirror to form an f/20 image of a point source. The point source will be fed via optical fibers from outside the vacuum chamber. A tip/tilt mirror is used to change the chief ray angle of the beam as it passes through the aperture stop and thus steer the image over NIRCam's field of view without moving the pupil or introducing field aberrations. Interchangeable aperture stop elements allow us to simulate perfect JWST wavefronts for wavefront error testing, or to introduce transmissive phase plates that simulate a misaligned JWST segmented mirror for wavefront sensing verification. NOTES will be maintained at an operating temperature of 80 K during testing using thermal switches, allowing it to operate within the same test chamber as the NIRCam instrument. We discuss NOTES' current design status and on-going development activities.

  8. Signal-to-noise ratio of Singer product apertures

    NASA Astrophysics Data System (ADS)

    Shutler, Paul M. E.; Byard, Kevin

    2017-09-01

    Formulae for the signal-to-noise ratio (SNR) of Singer product apertures are derived, allowing optimal Singer product apertures to be identified, and the CPU time required to decode them is quantified. This allows a systematic comparison to be made of the performance of Singer product apertures against both conventionally wrapped Singer apertures, and also conventional product apertures such as square uniformly redundant arrays. For very large images, equivalently for images at very high resolution, the SNR of Singer product apertures is asymptotically as good as the best conventional apertures, but Singer product apertures decode faster than any conventional aperture by at least a factor of ten for image sizes up to several megapixels. These theoretical predictions are verified using numerical simulations, demonstrating that coded aperture video is for the first time a realistic possibility.
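Singer and Singer product apertures are built from difference sets, and the decoding step is a cyclic cross-correlation with a balanced version of the mask. The sketch below uses a much simpler difference set (quadratic residues modulo a prime p ≡ 3 mod 4, not the paper's Singer construction) purely to show why correlation decoding returns a sharp point-spread function:

```python
import numpy as np

p = 11                                   # prime with p % 4 == 3
qr = {(i * i) % p for i in range(1, p)}  # quadratic residues mod p
mask = np.array([1.0 if i in qr else 0.0 for i in range(p)])
decoder = 2.0 * mask - 1.0               # balanced decoding array

# A point source recorded through the mask is just the mask pattern;
# cyclic cross-correlation with the decoder recovers the point.
recorded = mask
decoded = np.array([np.dot(recorded, np.roll(decoder, -s)) for s in range(p)])
print(decoded)  # peak of 5 at shift 0, flat -1 pedestal elsewhere
```

The flat sidelobes follow from the difference-set property: every nonzero cyclic shift of the mask overlaps it in exactly (p-3)/4 positions.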

  9. Real-time depth camera tracking with geometrically stable weight algorithm

    NASA Astrophysics Data System (ADS)

    Fu, Xingyin; Zhu, Feng; Qi, Feng; Wang, Mingming

    2017-03-01

We present an approach for real-time camera tracking with a depth stream. Existing methods are prone to drift in scenes without sufficient geometric information. First, we propose a new weighting method for the iterative closest point algorithm commonly used in real-time dense mapping and tracking systems. By detecting uncertainty in the pose and increasing the weight of points that constrain unstable transformations, our system achieves accurate and robust trajectory estimation. Our pipeline can be fully parallelized on the GPU and incorporated seamlessly into current real-time depth camera tracking systems. Second, we compare state-of-the-art weighting algorithms and propose a weight degradation algorithm based on the measurement characteristics of a consumer depth camera. Third, we use NVIDIA Kepler shuffle instructions during warp and block reduction to improve the efficiency of our system. Results on the public TUM RGB-D benchmark demonstrate that our camera tracking system achieves state-of-the-art results in both accuracy and efficiency.
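The geometric degeneracy being guarded against can be made concrete. In point-to-plane ICP, each correspondence contributes one row J_i = [p_i × n_i, n_i] to the 6-DoF normal equations; near-zero eigenvalues of Σ J_iᵀJ_i mark rigid-body motions the scene does not constrain. The weighting scheme itself is the paper's contribution and is not reproduced here; this sketch, with hypothetical data, shows only the detection step:

```python
import numpy as np

def constraint_spectrum(points, normals):
    """Eigenvalues of the 6x6 point-to-plane ICP normal matrix.

    Near-zero eigenvalues flag rigid-body motions that the observed
    geometry cannot constrain (the situation that causes drift)."""
    J = np.hstack([np.cross(points, normals), normals])  # (N, 6) rows
    H = J.T @ J
    return np.linalg.eigvalsh(H)

# A flat floor constrains only z-translation and x/y-rotation:
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, 200),
                       rng.uniform(-1, 1, 200),
                       np.zeros(200)])
nrm = np.tile([0.0, 0.0, 1.0], (200, 1))
eig = constraint_spectrum(pts, nrm)
print((eig < 1e-9).sum())  # 3 unconstrained degrees of freedom
```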

  10. Investigating the Suitability of Mirrorless Cameras in Terrestrial Photogrammetric Applications

    NASA Astrophysics Data System (ADS)

    Incekara, A. H.; Seker, D. Z.; Delen, A.; Acar, A.

    2017-11-01

Digital single-lens reflex (DSLR) cameras, commonly referred to as mirrored cameras, are preferred for terrestrial photogrammetric applications such as documentation of cultural heritage, archaeological excavations, and industrial measurements. Recently, digital cameras called mirrorless systems, which can be used with different lens combinations, have become available for similar applications. The main difference between the two camera types is the presence of the mirror mechanism, which changes how the incoming beam passes through the lens and reaches the sensor. In this study, two digital cameras, one with a mirror (Nikon D700) and one without (Sony a6000), were used in a close-range photogrammetric application on a rock surface at the Istanbul Technical University (ITU) Ayazaga Campus. The accuracy of the 3D models created from photographs taken with both cameras was compared using the differences between field and model coordinates obtained after alignment of the photographs. In addition, cross sections were created on the 3D models for both data sources; the maximum area difference between them is quite small because they almost overlap. The mirrored camera was more self-consistent with respect to changes in model coordinates for models created from photographs taken at different times with almost the same ground sample distance. As a result, it has been determined that mirrorless cameras, and point clouds produced from their photographs, can be used for terrestrial photogrammetric studies.
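Ground sample distance, mentioned above, ties pixel pitch, focal length, and object distance together. A small helper (the numbers are illustrative, not from the study):

```python
def ground_sample_distance(pixel_pitch_mm: float,
                           focal_length_mm: float,
                           range_m: float) -> float:
    """GSD in metres per pixel for a frame camera at a given object distance."""
    return pixel_pitch_mm / focal_length_mm * range_m

# e.g. hypothetical 0.006 mm pixels, 50 mm lens, 10 m to the rock face
print(ground_sample_distance(0.006, 50.0, 10.0))  # 0.0012 m = 1.2 mm/pixel
```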

  11. Multi-mode horn

    NASA Technical Reports Server (NTRS)

    Neilson, Jeffrey M. (Inventor)

    2002-01-01

    A horn has an input aperture and an output aperture, and comprises a conductive inner surface formed by rotating a curve about a central axis. The curve comprises a first arc having an input aperture end and a transition end, and a second arc having a transition end and an output aperture end. When rotated about the central axis, the first arc input aperture end forms an input aperture, and the second arc output aperture end forms an output aperture. The curve is then optimized to provide a mode conversion which maximizes the power transfer of input energy to the Gaussian mode at the output aperture.

  12. Identification of Active Galactic Nuclei through HST optical variability in the GOODS South field

    NASA Astrophysics Data System (ADS)

    Pouliasis, Ektoras; Georgantopoulos; Bonanos, A.; HCV Team

    2016-08-01

This work aims to identify AGN in the GOODS South deep field through optical variability. This method can easily identify low-luminosity AGN. In particular, we use z-band images obtained from the Hubble Space Telescope with the ACS/WFC camera over 5 epochs separated by ~45 days. Aperture photometry was performed using SExtractor to extract the lightcurves. Several variability indices, such as the median absolute deviation, excess variance, and sigma, were applied to automatically identify the variable sources. After removing artifacts, stars, and supernovae from the selected variable sample and keeping only those sources with a known photometric or spectroscopic redshift, the optical variability was compared to variability at other wavelengths (X-rays, mid-IR, radio). This multi-wavelength study provides important constraints on the structure and the properties of the AGN and their relation to their hosts. This work is part of the validation of the Hubble Catalog of Variables (HCV) project, which has been launched at the National Observatory of Athens by ESA and aims to identify all sources (point-like and extended) showing variability, based on the Hubble Source Catalog (HSC; Whitmore et al. 2015). HSC version 1 was released in February 2015 and includes 80 million sources imaged with the WFPC2, ACS/WFC, WFC3/UVIS and WFC3/IR cameras.
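Two of the variability indices named above are easy to state precisely. The lightcurves and error values below are invented for illustration; the HCV pipeline's actual selection cuts are not reproduced here:

```python
import numpy as np

def mad(flux):
    """Median absolute deviation: a robust spread estimate."""
    return np.median(np.abs(flux - np.median(flux)))

def excess_variance(flux, err):
    """Normalised variance in excess of the measurement errors."""
    return (np.var(flux) - np.mean(err ** 2)) / np.mean(flux) ** 2

err = np.full(5, 0.1)                                 # assumed flux errors
steady = np.array([10.0, 10.1, 9.9, 10.0, 10.05])
flaring = np.array([10.0, 10.1, 9.9, 16.0, 10.05])

# A steady source has excess variance consistent with zero (here slightly
# negative); a genuinely variable source has a clearly positive value.
print(excess_variance(steady, err), excess_variance(flaring, err))
```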

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanov, Oleg P.; Semin, Ilya A.; Potapov, Victor N.

Gamma-ray imaging is the most important way to identify unknown gamma-ray emitting objects in decommissioning, security, and accident response. Over the past two decades, systems for producing gamma-ray images under these conditions have become more or less portable, and in recent years they have become hand-held devices. This is very important, especially in emergency situations and in measurements made for safety reasons. We describe the first integrated hand-held instrument for emergency and security applications. The device is based on coded aperture image formation, the position-sensitive gamma-ray (X-ray) detector Medipix2 (detectors produced by X-ray Imaging Europe), and a tablet computer. The development was aimed at creating a very low-weight system with high angular resolution. We present some sample gamma-ray images taken with the camera. The main estimated parameters of the system are the following. The field of view of the video channel is ∼ 490 deg. The field of view of the gamma channel is ∼ 300 deg. The sensitivity of the system with a hexagonal mask for a Cs-137 source (Eg = 662 keV) is, in units of dose, D ∼ 100 mR. This is less than an order of magnitude worse than for heavy, non-hand-held systems (e.g., the Cartogam gamma camera by Canberra). The angular resolution of the gamma channel for Cs-137 sources (Eg = 662 keV) is about 1.20 deg. (authors)

  14. Full-field OCT: applications in ophthalmology

    NASA Astrophysics Data System (ADS)

    Grieve, Kate; Dubois, Arnaud; Paques, Michel; Le Gargasson, Jean-Francois; Boccara, Albert C.

    2005-04-01

    We present images of ocular tissues obtained using ultrahigh resolution full-field OCT. The experimental setup is based on the Linnik interferometer, illuminated by a tungsten halogen lamp. En face tomographic images are obtained in real-time without scanning by computing the difference of two phase-opposed interferometric images recorded by a high-resolution CCD camera. A spatial resolution of 0.7 μm × 0.9 μm (axial × transverse) is achieved thanks to the short source coherence length and the use of high numerical aperture microscope objectives. A detection sensitivity of 90 dB is obtained by means of image averaging and pixel binning. Whole unfixed eyes and unstained tissue samples (cornea, lens, retina, choroid and sclera) of ex vivo rat, mouse, rabbit and porcine ocular tissues were examined. The unprecedented resolution of our instrument allows cellular-level resolution in the cornea and retina, and visualization of individual fibers in the lens. Transcorneal lens imaging was possible in all animals, and in albino animals, transscleral retinal imaging was achieved. We also introduce our rapid acquisition full-field optical coherence tomography system designed to accommodate in vivo ophthalmologic imaging. The variations on the original system technology include the introduction of a xenon arc lamp as source, and rapid image acquisition performed by a high-speed CMOS camera, reducing acquisition time to 5 ms per frame.
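The phase-opposed differencing described above can be written down directly: with frames I(θ) = B + A·cos(φ + θ), the θ = 0 and θ = π frames differ by 2A·cos φ, so the incoherent background B cancels. (A full envelope |A| needs additional phase steps; this synthetic sketch shows only the two-frame difference.)

```python
import numpy as np

# Synthetic interferometric frames: I(theta) = B + A*cos(phi + theta)
rng = np.random.default_rng(1)
B = 100.0 * np.ones((4, 4))              # incoherent background
A = rng.uniform(0.0, 5.0, (4, 4))        # interference amplitude (OCT signal)
phi = rng.uniform(0, 2 * np.pi, (4, 4))  # interference phase

I_0 = B + A * np.cos(phi)
I_pi = B + A * np.cos(phi + np.pi)

signal = 0.5 * (I_0 - I_pi)              # = A*cos(phi): background rejected
print(np.allclose(signal, A * np.cos(phi)))  # True
```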

  15. Small form-factor VGA camera with variable focus by liquid lens

    NASA Astrophysics Data System (ADS)

    Oikarinen, Kari A.; Aikio, Mika

    2010-05-01

We present the design of a 24 mm long variable-focus lens for a 1/4" sensor. The chosen CMOS color sensor has VGA (640×480) resolution and a 5.6 μm pixel size. The lens utilizes one Varioptic Arctic 320 liquid lens, which has a voltage-controllable focal length due to the electrowetting effect. There are no mechanical moving parts. The principle of operation of the liquid lens is explained briefly. We discuss designing optical systems with this type of lens, including a modeling approach that allows entering a voltage value to modify the configuration of the liquid lens. The presented design consists only of spherical glass surfaces, chosen in order to decrease manufacturing costs and to provide more predictable performance by the better-established method. Fabrication tolerances are compensated by the adjustability of the liquid lens, further increasing the feasibility of manufacturing. The lens was manufactured and assembled into a demonstrator camera. It has an f-number of 2.5 and a 40-degree full field of view. The effective focal length varies around 6 millimeters as the liquid lens is adjusted. In simulations we achieved a focus distance controllable between 20 millimeters and infinity. The design differs from previous approaches by having the aperture stop in the middle of the system instead of in front.
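The 20 mm-to-infinity focusing range can be related to the thin-lens equation: with a fixed ~6 mm focal length the image plane would have to shift by millimetres, and the liquid lens instead absorbs that shift by changing its optical power. A rough single-thin-lens sketch (our simplification of the multi-element design; object distance taken positive in front of the lens):

```python
def image_distance_mm(f_mm: float, object_distance_mm: float) -> float:
    """Thin-lens image distance: 1/v = 1/f - 1/u, u positive in front."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

f = 6.0                                  # nominal EFL of the module
v_far = image_distance_mm(f, 1e9)        # object effectively at infinity
v_near = image_distance_mm(f, 20.0)      # closest focus, 20 mm
print(v_far, v_near)                     # ~6.0 mm vs ~8.57 mm
```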

  16. Keyhole imaging method for dynamic objects behind the occlusion area

    NASA Astrophysics Data System (ADS)

    Hao, Conghui; Chen, Xi; Dong, Liquan; Zhao, Yuejin; Liu, Ming; Kong, Lingqin; Hui, Mei; Liu, Xiaohua; Wu, Hong

    2018-01-01

A method of keyhole imaging based on a camera array is realized to obtain video of the scene behind a keyhole in a shielded space at a relatively long distance. We obtain multi-angle video by using a 2×2 CCD camera array to capture images behind the keyhole from four directions; the multi-angle video is saved as frame sequences. This paper presents a method of video frame alignment. In order to remove the non-target area outside the aperture, we use the Canny operator and morphological methods to detect the image edges and fill the images. Stitching of the four images is accomplished on the basis of a two-image stitching algorithm: the SIFT method is adopted for initial matching, and the RANSAC algorithm is then applied to eliminate incorrect matching points and obtain a homography matrix. A method of optimizing the transformation matrix is also proposed. Finally, a video image with a larger field of view behind the keyhole can be synthesized from the frame sequence in which every frame is stitched. The results show that the video is clear and natural and the brightness transitions are smooth. There are no obvious artificial stitching marks in the video, and the method can be applied in different engineering environments.
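The SIFT + RANSAC homography step is standard. Below is a compact, self-contained sketch of the RANSAC half using synthetic correspondences (feature detection is omitted, and this is a generic DLT/RANSAC estimator rather than the authors' optimized matrix refinement):

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct linear transform: H mapping src -> dst (>= 4 correspondences)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    H = vt[-1].reshape(3, 3)               # null-space vector of the system
    return H / H[2, 2]

def apply_h(H, pts):
    p = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return p[:, :2] / p[:, 2:3]

def ransac_homography(src, dst, iters=500, thresh=2.0, seed=0):
    """Fit H from putative matches while rejecting outliers."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), 4, replace=False)   # minimal sample
        H = homography_dlt(src[idx], dst[idx])
        err = np.linalg.norm(apply_h(H, src) - dst, axis=1)
        inliers = err < thresh
        if inliers.sum() > best.sum():
            best = inliers
    return homography_dlt(src[best], dst[best]), best  # refit on all inliers

# Synthetic check: 20 exact matches under a known H, plus 5 gross outliers
rng = np.random.default_rng(1)
H_true = np.array([[1.0, 0.02, 10.0], [0.01, 1.0, -5.0], [1e-4, 0.0, 1.0]])
src = rng.uniform(0, 100, (25, 2))
dst = apply_h(H_true, src)
dst[20:] += 50.0                         # corrupt the last five matches
H_est, inl = ransac_homography(src, dst)
print(inl.sum())  # 20
```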

  17. Digitizing an Analog Radiography Teaching File Under Time Constraint: Trade-Offs in Efficiency and Image Quality.

    PubMed

    Loehfelm, Thomas W; Prater, Adam B; Debebe, Tequam; Sekhar, Aarti K

    2017-02-01

We digitized the radiography teaching file at Black Lion Hospital (Addis Ababa, Ethiopia) during a recent trip, using a standard digital camera and a fluorescent light box. Our goal was to photograph every radiograph in the existing library while optimizing the final image size to the maximum resolution of a high-quality tablet computer, preserving the contrast resolution of the radiographs, and minimizing the total library file size. A secondary important goal was to minimize the cost and time required to take and process the images. Three workers were able to efficiently remove the radiographs from their storage folders, hang them on the light box, operate the camera, catalog the images, and repack the radiographs into their storage folders. Zoom, focal length, and film speed were fixed, while aperture and shutter speed were manually adjusted for each image, allowing for efficiency and flexibility in image acquisition. Keeping zoom and focal length fixed, which kept the view box at the same relative position in all images acquired during a single photography session, allowed unused space to be batch-cropped, saving considerable time in post-processing at the expense of final image resolution. We present an analysis of the trade-offs in workflow efficiency and final image quality, and demonstrate that a few people with minimal equipment can efficiently digitize a teaching file library.
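The aperture/shutter trade-off exploited above is captured by the exposure-value relation EV = log₂(N²/t): changing both by one stop in opposite directions leaves the exposure unchanged at fixed ISO. (The specific settings below are illustrative, not from the article.)

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """EV = log2(N^2 / t); equal EV means equal exposure at fixed ISO."""
    return math.log2(f_number ** 2 / shutter_s)

ev_a = exposure_value(8.0, 1.0 / 125.0)    # f/8 at 1/125 s
ev_b = exposure_value(5.657, 1.0 / 250.0)  # f/5.6 (exactly 4*sqrt(2)) at 1/250 s
print(round(ev_a, 2), round(ev_b, 2))      # both ~12.97
```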

  18. 67P/Churyumov-Gerasimenko: Activity between March and June 2014 as observed from Rosetta/OSIRIS

    NASA Astrophysics Data System (ADS)

    Tubiana, C.; Snodgrass, C.; Bertini, I.; Mottola, S.; Vincent, J.-B.; Lara, L.; Fornasier, S.; Knollenberg, J.; Thomas, N.; Fulle, M.; Agarwal, J.; Bodewits, D.; Ferri, F.; Güttler, C.; Gutierrez, P. J.; La Forgia, F.; Lowry, S.; Magrin, S.; Oklay, N.; Pajola, M.; Rodrigo, R.; Sierks, H.; A'Hearn, M. F.; Angrilli, F.; Barbieri, C.; Barucci, M. A.; Bertaux, J.-L.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; De Cecco, M.; Debei, S.; Groussin, O.; Hviid, S. F.; Ip, W.; Jorda, L.; Keller, H. U.; Koschny, D.; Kramm, R.; Kührt, E.; Küppers, M.; Lazzarin, M.; Lamy, P. L.; Lopez Moreno, J. J.; Marzari, F.; Michalik, H.; Naletto, G.; Rickman, H.; Sabau, L.; Wenzel, K.-P.

    2015-01-01

Aims: 67P/Churyumov-Gerasimenko is the target comet of ESA's Rosetta mission. After commissioning at the end of March 2014, the Optical, Spectroscopic, and Infrared Remote Imaging System (OSIRIS) onboard Rosetta started imaging the comet and its dust environment to investigate how they change and evolve while approaching the Sun. Methods: We focused our work on Narrow Angle Camera (NAC) orange images and Wide Angle Camera (WAC) red and visible-610 images acquired between 2014 March 23 and June 24, when the nucleus of 67P was unresolved and moving from approximately 4.3 AU to 3.8 AU inbound. During this period the 67P-Rosetta distance decreased from 5 million km to 120 thousand km. Results: Through aperture photometry, we investigated how the comet brightness varies with heliocentric distance. 67P was likely already weakly active at the end of March 2014, with excess flux above that expected for the nucleus. The comet's brightness was mostly constant during the three months of approach observations, apart from one outburst that occurred around April 30 and a second increase in flux after June 20. The coma was resolved in the profiles from mid-April onward. Analysis of the coma morphology suggests that most of the activity comes from a source towards the celestial north pole of the comet, but the outburst that occurred on April 30 released material in a different direction.
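The aperture photometry behind the lightcurves reduces, in its simplest form, to summing counts in a circular aperture and subtracting a background estimated in a surrounding annulus. A minimal sketch on a synthetic frame (the OSIRIS pipeline's actual calibration steps are not reproduced):

```python
import numpy as np

def aperture_photometry(img, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted flux in a circular aperture.

    Background is the median pixel value in the annulus r_in..r_out."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - cx, yy - cy)
    ap = r <= r_ap
    ann = (r >= r_in) & (r <= r_out)
    sky = np.median(img[ann])
    return img[ap].sum() - sky * ap.sum()

# Synthetic frame: flat sky of 10 counts plus a 500-count point source
img = np.full((51, 51), 10.0)
img[25, 25] += 500.0
print(aperture_photometry(img, 25, 25, r_ap=5, r_in=10, r_out=15))  # 500.0
```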

  19. An X-ray beam position monitor based on the photoluminescence of helium gas

    NASA Astrophysics Data System (ADS)

    Revesz, Peter; White, Jeffrey A.

    2005-03-01

A new method for white beam position monitoring for both bend-magnet and wiggler synchrotron X-ray radiation has been developed. This method utilizes visible luminescence generated as a result of ionization by the intense X-ray flux. In video beam position monitors (VBPMs), the luminescence of helium gas at atmospheric pressure is observed through a view port using a CCD camera next to the beam line. The beam position, profile, integrated intensity, and FWHM are calculated from the distribution of luminescence intensity in each captured image by custom software. Misalignment of upstream apertures changes the image profile, making VBPMs helpful for initial alignment of upstream beam line components. VBPMs can thus provide more information about the X-ray beam than most beam position monitors (BPMs). A beam position calibration procedure, employing a tilted plane-parallel glass plate placed in front of the camera lens, has also been developed. The accuracy of the VBPM system was measured during a bench-top experiment to be better than 1 μm. The He-luminescence-based VBPM system has been operative on three CHESS beam lines (F hard-bend and wiggler, A-line wiggler and G-line wiggler) for about a year. The beam positions are converted to analog voltages and used as feedback signals for beam stabilization. In our paper we discuss details of VBPM construction and describe further results of its performance.
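The per-image quantities the custom software extracts (position, FWHM) come from the luminescence intensity distribution. A one-dimensional sketch of the centroid/FWHM step on a synthetic Gaussian profile (the actual CHESS software and its calibration are not reproduced):

```python
import numpy as np

def beam_stats(profile, x):
    """Centroid and FWHM of a 1-D intensity profile (pixel-resolution FWHM)."""
    centroid = np.sum(x * profile) / np.sum(profile)
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    fwhm = x[above[-1]] - x[above[0]]     # coarse, grid-limited estimate
    return centroid, fwhm

# Gaussian test beam: FWHM should be ~2.355 * sigma
x = np.linspace(-50, 50, 1001)
sigma = 8.0
prof = np.exp(-x ** 2 / (2 * sigma ** 2))
c, w = beam_stats(prof, x)
print(c, w)
```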

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chitwood, M. Colter; Lashley, Marcus A.; Kilgo, John C.

Camera surveys are commonly used by managers and hunters to estimate white-tailed deer Odocoileus virginianus density and demographic rates. Though studies have documented biases and inaccuracies in the camera survey methodology, camera traps remain popular due to ease of use, cost-effectiveness, and the ability to survey large areas. Because recruitment is a key parameter in ungulate population dynamics, there is a growing need to test the effectiveness of camera surveys for assessing fawn recruitment. At Savannah River Site, South Carolina, we used six years of camera-based recruitment estimates (i.e. fawn:doe ratios) to predict concurrently collected annual radiotag-based survival estimates. The coefficient of determination (R) was 0.445, indicating some support for the viability of cameras to reflect recruitment. Here, we added two years of data from Fort Bragg Military Installation, North Carolina, which improved R to 0.621 without accounting for site-specific variability. We also evaluated the correlation between year-to-year changes in recruitment and survival using the Savannah River Site data; R was 0.758, suggesting that camera-based recruitment could be useful as an indicator of the trend in survival. Because so few researchers concurrently estimate survival and camera-based recruitment, examining this relationship at larger spatial scales while controlling for numerous confounding variables remains difficult. We believe that future research should test the validity of our results in other areas with varying deer and camera densities, as site (e.g. presence of feral pigs Sus scrofa) and demographic (e.g. fawn age at time of camera survey) parameters may have a large influence on detectability. Until such biases are fully quantified, we urge researchers and managers to use caution when advocating the use of camera-based recruitment estimates.
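The reported R values come from regressing radiotag survival on the camera-based fawn:doe ratio. The computation itself is short; the numbers below are invented for illustration and are not the Savannah River or Fort Bragg data:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination of the least-squares line y ~ a + b*x."""
    b, a = np.polyfit(x, y, 1)            # slope, intercept
    resid = y - (a + b * x)
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical camera recruitment (fawns per doe) vs. radiotag survival
recruitment = np.array([0.35, 0.50, 0.42, 0.61, 0.30, 0.55])
survival = np.array([0.22, 0.33, 0.30, 0.41, 0.20, 0.36])
print(r_squared(recruitment, survival))
```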

Top